US20110161843A1 - Internet browser and associated content definition supporting mixed two and three dimensional displays - Google Patents

Internet browser and associated content definition supporting mixed two and three dimensional displays

Info

Publication number
US20110161843A1
Authority
US
United States
Prior art keywords
display
content
dimensional
browser
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/982,140
Inventor
James D. Bennett
Jeyhan Karaoguz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Broadcom Corp filed Critical Broadcom Corp
Priority to US12/982,140
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KARAOGUZ, JEYHAN, BENNETT, JAMES D.
Publication of US20110161843A1
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT

Classifications

    • H04N13/361 Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • G03B35/24 Stereoscopic photography by simultaneous viewing using apertured or refractive resolving means on screens or between screen and eye
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G09G3/003 Control arrangements or circuits for visual indicators other than cathode-ray tubes, using specific devices to produce spatial visual effects
    • G09G3/20 Control arrangements or circuits for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/139 Format conversion, e.g. of frame-rate or size
    • H04N13/161 Encoding, multiplexing or demultiplexing different image signal components
    • H04N13/189 Recording image signals; Reproducing recorded image signals
    • H04N13/194 Transmission of image signals
    • H04N13/305 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays, using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N13/31 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays, using parallax barriers
    • H04N13/312 Image reproducers using parallax barriers, the parallax barriers being placed behind the display panel, e.g. between backlight and spatial light modulator [SLM]
    • H04N13/315 Image reproducers using parallax barriers, the parallax barriers being time-variant
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/351 Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking, for displaying simultaneously
    • H04N13/359 Switching between monoscopic and stereoscopic modes
    • H04N13/366 Image reproducers using viewer tracking
    • H04N13/383 Image reproducers using viewer tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H04N13/398 Synchronisation or control of image reproducers
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H04N21/4122 Peripherals receiving signals from specially adapted client devices: additional display device, e.g. video projector
    • H04N21/435 Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H04S7/303 Tracking of listener position or orientation
    • G02B6/00 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G09G2300/023 Display panel composed of stacked panels
    • G09G2320/028 Improving the quality of display appearance by changing the viewing angle properties, e.g. widening the viewing angle, adapting the viewing angle to the view direction
    • G09G2370/04 Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G5/003 Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G5/14 Display of multiple viewports
    • H04N2013/403 Privacy aspects, i.e. devices showing different images to different viewers, the images being monoscopic
    • H04N2013/405 Privacy aspects, i.e. devices showing different images to different viewers, the images being stereoscopic or three dimensional

Definitions

  • the present invention relates to web browsers.
  • Images may be generated for display in various forms.
  • television is a widely used telecommunication medium for transmitting and displaying images in monochromatic (“black and white”) or color form.
  • images are provided in analog form and are displayed by display devices in two dimensions.
  • images are being provided in digital form for display in two dimensions on display devices having improved resolution (e.g., “high definition” or “HD”).
  • Conventional displays may use a variety of techniques to achieve three-dimensional image viewing functionality.
  • various types of glasses have been developed that may be worn by users to view three-dimensional images displayed by a conventional display.
  • glasses include glasses that utilize color filters or polarized filters.
  • the lenses of the glasses pass two-dimensional images of differing perspective to the user's left and right eyes.
  • the images are combined in the visual center of the brain of the user to be perceived as a three-dimensional image.
  • synchronized left eye, right eye LCD (liquid crystal display) shutter glasses may be used with conventional two-dimensional displays to create a three-dimensional viewing illusion.
  • LCD display glasses are being used to display three-dimensional images to a user.
  • the lenses of the LCD display glasses include corresponding displays that provide images of differing perspective to the user's eyes, to be perceived by the user as three-dimensional.
  • a display may include a parallax barrier that has a layer of material with a series of precision slits. The parallax barrier is placed proximal to a display so that a user's eyes each see a different set of pixels to create a sense of depth through parallax.
  • Another type of display for viewing three-dimensional images is one that includes a lenticular lens.
  • a lenticular lens includes an array of magnifying lenses configured so that when viewed from slightly different angles, different images are magnified. Displays are being developed that use lenticular lenses to enable autostereoscopic images to be generated.
  • There are various types of display devices that are capable of displaying three-dimensional images, and further types are being developed.
  • Different types of displays that enable three-dimensional image viewing may have different capabilities and attributes, including having different depth resolutions, being configured for three-dimensional image viewing only, being switchable between two-dimensional image viewing and three-dimensional image viewing, and further capabilities and attributes.
  • Web browsers are applications that enable the retrieving, presenting, and traversing of information resources that are available on the World Wide Web (“the Web”).
  • Web browsers may be included in electronic devices such as desktop computers and handheld devices to enable users to interact with Web-based information resources. Examples of information resources that may be retrieved and presented by a web browser include web pages, images, and videos. Some of these information resources may include two-dimensional or three-dimensional content.
  • FIG. 1 shows a block diagram of a system that includes a web browser that supports mixed 2D (two-dimensional) and 3D (three-dimensional) displays, according to an exemplary embodiment.
  • FIG. 2 shows a block diagram of a web browser that supports mixed 2D and 3D displays interfaced with various display devices, according to an exemplary embodiment.
  • FIG. 3 shows a block diagram of examples of the web browser of FIG. 1 transmitting commands to a display device, according to embodiments.
  • FIG. 4A shows a block diagram of an electronic device that includes a browser architecture that supports mixed 2D and 3D displays, according to an exemplary embodiment.
  • FIG. 4B shows a block diagram of a display system that includes a 2D and 3D display-enabled browser architecture, according to an embodiment.
  • FIG. 5 shows a flowchart providing a process for enabling the display of 2D and 3D content using a web browser, according to an exemplary embodiment.
  • FIG. 6 shows a block diagram of a web browser configuring a display device for display of 2D and 3D content, according to an exemplary embodiment.
  • FIG. 7 shows a flowchart providing a process for using tag information to configure a screen for the display of 2D and 3D content, according to an exemplary embodiment.
  • FIGS. 8, 9, 10A, and 10B show examples of a screen displaying mixed 2D and 3D content in various screen regions, including tabs, frames, and objects, according to embodiments.
  • FIG. 11 shows a block diagram of a rendering engine configured to translate 3D content to 2D content and to translate a first type of 3D content to a second type of 3D content, according to an exemplary embodiment.
  • FIG. 12 shows a flowchart providing a process for determining display screen characteristics, according to an exemplary embodiment.
  • FIG. 13 shows a block diagram of storage that stores browser preferences, according to an exemplary embodiment.
  • FIG. 14 shows a block diagram of a display device having a light manipulator that enables display of 3D content by a screen, according to an exemplary embodiment.
  • FIG. 15 shows a block diagram of a display device having an adaptable light manipulator that enables the adaptable display of 3D content by a screen, according to an exemplary embodiment.
  • FIGS. 16 and 17 show block diagrams of examples of the display device of FIG. 15 , according to embodiments.
  • FIG. 18 shows a flowchart for generating three-dimensional images, according to an exemplary embodiment.
  • FIG. 19 shows a cross-sectional view of an example of a display system, according to an embodiment.
  • FIGS. 20 and 21 show views of example parallax barriers with non-blocking slits, according to embodiments.
  • FIG. 22 shows a view of a barrier element array configured to enable the simultaneous display of two-dimensional and three-dimensional images of various sizes and shapes, according to an exemplary embodiment.
  • FIG. 23 shows a view of the parallax barrier of FIG. 22 with differently oriented non-blocking slits, according to an exemplary embodiment.
  • FIG. 24 shows a display system providing two two-dimensional images that are correspondingly viewable by a first viewer and a second viewer, according to an exemplary embodiment.
  • FIG. 25 shows a flowchart for generating multiple three-dimensional images, according to an exemplary embodiment.
  • FIG. 26 shows a cross-sectional view of an example of the display system of FIG. 15 , according to an embodiment.
  • FIGS. 27 and 28 show views of a lenticular lens, according to an exemplary embodiment.
  • FIG. 29 shows a flowchart for generating multiple three-dimensional images using multiple light manipulator layers, according to an exemplary embodiment.
  • FIG. 30 shows a block diagram of a display system, according to an exemplary embodiment.
  • FIGS. 31 and 32 show cross-sectional views of a display system, according to an exemplary embodiment.
  • FIG. 33 shows a block diagram of a display system, according to an exemplary embodiment.
  • FIG. 34 shows a block diagram of a display environment, according to an exemplary embodiment.
  • FIG. 35 shows a block diagram of an example electronic device, according to an embodiment.
  • FIG. 36 shows a block diagram of a display system that supports mixed 2D, stereoscopic 3D and multi-view 3D displays, according to an exemplary embodiment.
  • references in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • Embodiments of the present invention relate to web browsers that enable the display of two- and three-dimensional content.
  • web browsers may be enabled to display web pages, images, video, content generated by browser scripts and applications, and further types of information resources that include 2D and/or 3D content.
  • a browser may be capable of processing a markup language document that defines one or more browser windows, frames, or tabs, within which to display web pages, images, and/or video content.
  • the markup language document may include elements (e.g., tags) that specify one or more parameters to be associated with the displayed regions and content.
  • the browser may determine parameters to be associated with displayed regions and content based on other factors, such as a type of content to be displayed, a filename for the content, configuration information stored at a media server for the content, etc.
  • the web browsers may generate configuration commands based on the determined parameters that cause display screens to be configured to display the 2D and/or 3D content.
  • the display devices may display 2D and 3D content provided by the web browsers.
  • the display devices may include one or more light manipulators, such as parallax barriers and/or lenticular lenses, to deliver 3D media content in the form of images or views to the eyes of the viewers.
  • Other types may include display devices with 3D display pixel constructs that may or may not employ such light manipulators.
  • light manipulators may be fixed or dynamically modified to change the manner in which the views are delivered.
  • embodiments enable light manipulators that are adaptable to accommodate a changing viewer sweet spot, switching between two-dimensional (2D), stereoscopic three-dimensional (3D), and multi-view 3D views, as well as the simultaneous display of 2D, stereoscopic 3D, and multi-view 3D content.
  • example features that may be dynamically modified include one or more of a number of slits in the parallax barriers, the dimensions of each slit, the spacing between the slits, and the orientation of the slits. Slits of the parallax barriers may also be turned on or off in relation to certain regions of the screen such that simultaneous mixed 2D, stereoscopic 3D, and multi-view 3D presentations can be accommodated.
  • a lenticular lens may be dynamically modified, such as by modifying a width of the lenticular lens, to modify delivered images.
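  • As a rough, non-normative sketch, the adaptable parallax-barrier and lenticular features listed above might be grouped per screen region as follows; all type and field names are invented for illustration and are not taken from the specification:

      // Illustrative only: the specification defines no programming interface;
      // these names are hypothetical.
      interface SlitRegionConfig {
        enabled: boolean;        // slits may be turned on or off per screen region
        slitCount: number;       // number of slits in the parallax barrier
        slitWidthPx: number;     // dimensions of each slit
        slitSpacingPx: number;   // spacing between the slits
        orientationDeg: number;  // orientation of the slits
      }

      interface LenticularConfig {
        lensWidthPx: number;     // width may be modified to change delivered views
      }

      // One entry per screen region supports simultaneous mixed
      // 2D, stereoscopic 3D, and multi-view 3D presentations.
      const regions = new Map<string, SlitRegionConfig>();
      regions.set("movieWindow", { enabled: true, slitCount: 480, slitWidthPx: 1,
                                   slitSpacingPx: 3, orientationDeg: 90 });
      regions.set("background",  { enabled: false, slitCount: 0, slitWidthPx: 0,
                                   slitSpacingPx: 0, orientationDeg: 0 }); // 2D region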
  • The following subsections describe numerous exemplary embodiments of the present invention. For instance, the next subsection describes embodiments for web browsers, followed by a subsection that describes embodiments for displaying content using a browser, a subsection that describes user input interface and web browser start up embodiments, a subsection that describes example display environments, and a subsection that describes example electronic devices. It is noted that the section/subsection headings are not intended to be limiting. Embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection.
  • Embodiments include a web browser that provides native support for the display of mixed content.
  • a web browser comprises a graphical user interface (GUI) in which video content can be displayed in a window of the browser.
  • Parameters (e.g., indicated via “tags” or by other configuration information) may be associated with the browser window.
  • the parameters can specify various display characteristics, such as one or more of: a type of video content to be displayed within the browser window (e.g., 2D, stereoscopic 3D, or a particular type of multi-view 3D), a desired orientation of the displayed video content, a brightness/contrast to be associated with the browser window, and/or a video resolution to be associated with the browser window.
  • the parameters to be associated with a browser window may be specified programmatically or determined dynamically at run-time.
  • the parameters may also be modified at run-time by a user through a user control interface provided by the web browser.
  • the web browser is further configured to cause one or more function calls to be placed to a graphics API (application programming interface), operating system, or device driver so that a window is opened on the display and the content is presented therein in a manner that is consistent with the associated parameters.
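  • A minimal sketch of such a call follows, assuming a hypothetical graphics API shape; the specification does not define these names or signatures:

      // Hypothetical window-opening call consistent with parsed parameters.
      type ContentFormat = "2D" | "3D-2" | "3D-4" | "3D-16";

      interface WindowParams {
        format: ContentFormat;                 // 2D, stereoscopic 3D, or multi-view 3D
        orientation?: number;                  // desired orientation, in degrees
        brightness?: number;                   // 0..1, for the browser window
        contrast?: number;                     // 0..1
        resolution?: { w: number; h: number }; // video resolution for the window
      }

      interface GraphicsApi {
        openWindow(region: { x: number; y: number; w: number; h: number },
                   params: WindowParams): number; // returns a window handle
      }

      // The browser causes the window to be opened and configured so the
      // content is presented consistently with the associated parameters.
      function presentIn(api: GraphicsApi, params: WindowParams): number {
        return api.openWindow({ x: 0, y: 0, w: 1280, h: 720 }, params);
      }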
  • FIG. 1 shows a block diagram of a system 100 , according to an exemplary embodiment.
  • system 100 includes a display device 102 , a document server 104 , a web browser 106 , and a network 116 .
  • System 100 is a system in which web browser 106 interfaces one or more users and network content with display device 102.
  • System 100 is described as follows.
  • System 100 may be implemented in one or more devices.
  • web browser 106 and display device 102 may be implemented in a common electronic device 112 that may be accessed by a user, such as a mobile computing device (e.g., a handheld computer, a laptop computer, a notebook computer, a tablet computer (e.g., an Apple iPad™), a netbook, etc.), a mobile phone (e.g., a cell phone, a smart phone), a mobile email device, some types of televisions, etc.
  • As shown in FIG. 1, web browser 106 may alternatively be implemented in an electronic device 110 that is separate from display device 102.
  • Web browser 106, also referred to as an “Internet browser” or “browser,” is an application for retrieving, presenting, and traversing network-based information resources.
  • web browser 106 may be implemented in software (e.g., computer programs and/or data) that runs on a device.
  • Web browser 106 may load an external information resource identified by a Uniform Resource Locator (URL), such as a web page, an image, a video, or other item of content.
  • Web browser 106 may display the loaded information in a window of the browser.
  • An information resource loaded by web browser 106 may reference further information resources, which may be loaded by web browser 106 for display.
  • An information resource may include hyperlinks that when displayed can be selected by a user to enable the user to navigate to the related information resources.
  • document server 104 may store one or more information resources, such as an information resource 114 .
  • Information resource 114 may be an XML document, an HTML document (e.g., a web page), an image file, a video, or other type of information resource.
  • a user of browser 106 may desire to view information resource 114 , and may interact with browser 106 to cause information resource 114 to be loaded by browser 106 .
  • the user may enter the URL of information resource 114 into browser 106 , or may select a hyperlink in a markup document that links to information resource 114 , to cause browser 106 to load information resource 114 .
  • browser 106 may generate request 118, such as an HTTP (hypertext transfer protocol) request (e.g., if the URL starts with “http:” or “https:”) or another type of request (e.g., FTP (file transfer protocol), etc.), which is transmitted from the device that includes browser 106.
  • Request 118 is directed to a location of information resource 114 according to the URL of information resource 114 .
  • Request 118 may be transmitted through a network 116 to be received by document server 104 .
  • network 116 may be any type of communication network, including a local area network (LAN), a wide area network (WAN), or a combination of communication networks, such as the Internet.
  • Document server 104 may be any suitable type of computer system capable of providing documents over a network, such as a server, etc.
  • In response to receiving request 118, document server 104 locates and identifies information resource 114, and transmits information resource 114 to browser 106 through network 116.
  • Browser 106 receives information resource 114 , and displays content of information resource 114 in a window in a screen of display device 102 .
  • web browser 106 includes mixed 2D/3D supporting logic 108 .
  • Mixed 2D/3D supporting logic 108 enables web browser 106 to support display of mixed 2D and 3D content, according to an exemplary embodiment.
  • logic 108 may enable web browser 106 to provide two- and three-dimensional content for display by display devices that are capable of separately displaying two-dimensional and three-dimensional content, display devices that are capable of simultaneously displaying two-dimensional and three-dimensional content, display devices that are capable of simultaneously displaying different types of three-dimensional content, as well as display devices that can adaptively change the display of two-dimensional and three-dimensional contents (e.g., by changing display screen regions).
  • 2D/3D supporting logic 108 may be capable of enabling browser 106 to render 2D and 3D content at display device 102 in a manner based on the contents of information resource 114 and/or based on one or more tags (e.g., HTML tags) or other configuration information associated with information resource 114 in a markup document that refers to information resource 114.
  • An HTML document is a type of markup document that includes a tree of HTML elements and other information (e.g., textual information, etc.) according to an HTML language format. Each HTML element can have attributes assigned. In HTML syntax, some elements may be written with associated tags to assign attributes to the elements.
  • An element may be written with a start tag and an end tag, with the content indicated in between the start tag and end tag.
  • a start tag includes the name of the element surrounded by angle brackets, and the corresponding end tag includes a slash character followed by the name of the element, which are both surrounded by angle brackets (not all elements necessarily include an end tag).
  • a paragraph may be indicated by a “p” element.
  • An example of a p element is shown as follows:
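      <p>This is a sample paragraph.</p>  <!-- sample text is representative -->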
  • tags may be included in markup documents to indicate types and characteristics of 2D and 3D content included or referenced therein.
  • a “3D” element may be defined to indicate particular content as three-dimensional.
  • An example 3D element is shown as follows:
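      <3D>three-dimensional content goes here</3D>  <!-- illustrative form; the element content shown is not from the specification -->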
  • attributes may be added to a 3D element to indicate the various types of 3D content.
  • a 3D-4 video file may be indicated by the 3D element as:
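      <3D type="3D-4" src="video1.mpg"></3D>  <!-- hypothetical attribute names ("type", "src"); per the description, "3D-4" denotes content with four camera views -->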
  • configuration information for display of content may be extracted from a web page according to the HTML language format (e.g., tags, attributes, etc.). Such configuration information may be provided in the form of tags as indicated above, or in further ways, as would become apparent to persons skilled in the relevant art(s) from the teachings herein. Furthermore, as described in further detail below, configuration information for content may be determined in other ways, including being determined by the filename of the content (e.g., by file extension), by configuration information stored at the file server that serves the content, and/or in further ways.
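  • For example, a browser might infer a display format from a content filename when no tag or server-side configuration is available; the extensions and mapping below are purely illustrative:

      // Hypothetical extension-to-format table; none of these extensions are
      // defined by the specification.
      type ContentFormat = "2D" | "3D-2" | "3D-4";

      function formatFromFilename(filename: string): ContentFormat {
        const ext = filename.toLowerCase().split(".").pop() ?? "";
        const table: Record<string, ContentFormat> = {
          "s3d": "3D-2",  // hypothetical stereoscopic extension
          "3d4": "3D-4",  // hypothetical four-view extension
          "jpg": "2D",
          "mpg": "2D",
        };
        return table[ext] ?? "2D"; // default to 2D when nothing is known
      }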
  • display device 102 may be one of a variety of display devices capable of displaying two-dimensional and/or three-dimensional content.
  • FIG. 2 shows a block diagram of a display system 200 , which is an exemplary embodiment of system 100 of FIG. 1 .
  • system 200 includes web browser 106 , a first display device 202 , and a second display device 204 .
  • web browser 106 includes mixed 2D/3D supporting logic 108 .
  • First display device 202 is a display device that is only capable of displaying two-dimensional content
  • second display device 204 is a display device that is capable of displaying two-dimensional content and three-dimensional content.
  • web browser 106 is capable of displaying content at first and second display devices 202 and 204 .
  • web browser 106 may be capable of providing content for display by first and second display devices 202 and 204 one at a time.
  • web browser 106 may be capable of providing content for display by first and second display devices 202 and 204 and/or other combinations and numbers of display devices simultaneously.
  • FIG. 3 shows a block diagram of web browser 106 interfacing with display device 102 of FIG. 1 , according to embodiments.
  • browser 106 can be interfaced with display device 102 through an API (application programming interface) 302 and a display driver 306 , through an operating system (OS) 304 and display driver 306 , and/or through display driver 306 .
  • FIG. 3 is described as follows.
  • API 302 is an interface implemented in software (e.g., computer program code or logic) for applications such as browser 106 that enables the applications to interact with other software and/or hardware. API 302 may be configured to perform graphics operations on graphics information received from the applications. API 302 may be implemented in a same device as browser 106 . API 302 may be a special purpose API, or may be a commercially available API, such as Microsoft DirectX® (e.g., Direct3D®), OpenGL®, or other 3D graphics API, which may be modified according to embodiments to receive commands and/or content from browser 106 . Further description of implementations of API 302 and other API implementations described herein is provided in pending U.S. patent application Ser. No. ______, titled “Application Programming Interface Supporting Mixed Two And Three Dimensional Displays,” filed on same date herewith, which is incorporated by reference herein in its entirety.
  • OS 304 is configured to interface users and applications with hardware, such as display device 102 .
  • OS 304 may be a commercially available or proprietary operating system.
  • OS 304 may be an operating system such as Microsoft Windows®, Apple Mac OS® X, Google Android™, or Linux®, which may be modified according to embodiments. Further description of implementations of OS 304 and other operating system implementations described herein is provided in pending U.S. patent application Ser. No. ______, titled “Operating System Supporting Mixed 2D, Stereoscopic 3D And Multi-View 3D Display,” filed on same date herewith, which is incorporated by reference herein in its entirety.
  • Display driver 306 may be implemented in software, and enables applications (e.g., higher-level application code) such as browser 106 to interact with display device 102 .
  • Display driver 306 may be implemented in a same device as browser 106 .
  • Multiple display drivers 306 may be present, and each display driver 306 is typically display device-specific, although some display drivers 306 may be capable of driving multiple types of display devices.
  • Each type of display device typically is controlled by its own display device-specific commands.
  • most applications communicate with display devices according to high-level device-generic commands.
  • Display driver 306 accepts the generic high-level commands (directly from browser 106 , or via API 302 and/or OS 304 ), and breaks them into a series of low-level display device-specific commands, as used by the particular display device.
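  • A simplified sketch of this translation step follows; the generic command shape and register names are invented, since actual low-level commands are specific to each display device:

      // A device-generic command broken into low-level, device-specific writes.
      interface GenericCommand {
        kind: "setRegionMode";
        region: { x: number; y: number; w: number; h: number };
        mode: "2D" | "3D-2" | "3D-4";
      }

      type ControlSignal = { register: string; value: number };

      function toDeviceCommands(cmd: GenericCommand): ControlSignal[] {
        // e.g., enable a parallax-barrier region and select a view count
        const views = cmd.mode === "2D" ? 1 : Number(cmd.mode.split("-")[1]);
        return [
          { register: "BARRIER_ENABLE", value: views > 1 ? 1 : 0 },
          { register: "VIEW_COUNT",     value: views },
          { register: "REGION_X",       value: cmd.region.x },
          { register: "REGION_Y",       value: cmd.region.y },
          { register: "REGION_W",       value: cmd.region.w },
          { register: "REGION_H",       value: cmd.region.h },
        ];
      }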
  • browser 106 may generate a command 308 associated with the display of 2D and/or 3D content that is received by API 302 .
  • API 302 passes command 308 to display driver 306 (in a modified or unmodified form).
  • Display driver 306 receives command 308 , and generates one or more control signals 314 received by display device 102 .
  • Control signal(s) 314 place(s) a screen of display device 102 in a display mode corresponding to command 308 .
  • browser 106 may stream content to display device 102 through API 302 and display driver 306 to be displayed in the screen configured according to command 308 .
  • the API 302 may be included in OS 304 (when present), or may be separate.
  • API 302 may communicate directly with display driver 306 as shown in FIG. 3 , or may communicate with display driver 306 through OS 304 .
  • browser 106 may generate a command 310 associated with the display of 2D and/or 3D content that is received by OS 304 .
  • OS 304 passes command 310 to display driver 306 (in a modified or unmodified form).
  • Display driver 306 receives command 310 , and generates control signal(s) 314 received by display device 102 .
  • Control signal(s) 314 place(s) a screen of display device 102 in a display mode corresponding to command 310 .
  • browser 106 may stream content to display device 102 through OS 304 and display driver 306 to be displayed in the screen configured according to command 310 .
  • browser 106 may generate a command 312 associated with the display of 2D and/or 3D content that is directly received by display driver 306 (e.g., does not pass through API 302 or OS 304 ).
  • Display driver 306 receives command 312 , and generates control signal(s) 314 received by display device 102 .
  • Control signal(s) 314 place(s) a screen of display device 102 in a display mode corresponding to command 312 .
  • browser 106 may stream content to display device 102 directly through display driver 306 to be displayed by the screen configured according to command 312 .
  • commands and content may be provided by browser 106 to display device 102 through one or more different intermediate components, which may include one or more of API 302 , OS 304 , and display driver 306 .
  • browser 106 may include one or more of API 302 and/or OS 304, or may be included in OS 304, in a similar manner as some commercially available operating systems that incorporate a web browser (e.g., Google Chrome OS™).
  • Web browser 106 may be implemented in various ways to interface users and network-based content with display devices that are capable of displaying two-dimensional content and/or three-dimensional content.
  • FIG. 4A shows a block diagram of an electronic device 412 that includes a browser architecture for a web browser 400 , according to an exemplary embodiment.
  • Device 412 may be any of the electronic devices mentioned herein as including a web browser (e.g., electronic devices 110 and 112 of FIG. 1 ), or may be an alternative device.
  • Browser 400 is configured to interface users and network-based content with display devices that are capable of displaying two-dimensional content and/or three-dimensional content.
  • browser 400 may be a proprietary web browser.
  • browser 400 may be a commercially available web browser that is modified to enable users and network-based content to be interfaced with display devices capable of displaying two-dimensional content and/or three-dimensional content.
  • web browsers such as Internet Explorer®, developed by Microsoft Corp. of Redmond, Wash., Mozilla Firefox®, developed by Mozilla Corp. of Mountain View, Calif., or Google Chrome®, developed by Google Inc. of Mountain View, Calif., may be modified according to embodiments.
  • browser 400 includes various browser portions, including a user interface 402 , a rendering engine 404 , one or more optional client applications 406 , a networking module 408 , and a code interpreter 410 . These features of browser 400 are described as follows.
  • User interface 402 is configured to display information to enable a person to interact with browser 400 .
  • user interface 402 may provide one or more graphical user interface (GUI) control elements that a user may interact with to use and/or configure browser 400 .
  • user interface 402 may provide an address bar into which a user may enter URLs of desired information resources, a back button, a forward button, a refresh button, a stop button, a home button, one or more additional/alternative buttons, one or more pull down menus (e.g., a list of bookmarks, etc.), etc.
  • Networking module 408 is a communications module for interfacing browser 400 with network-accessible entities, such as document server 104 shown in FIG. 1.
  • networking module 408 may be configured to generate network calls, such as HTTP requests (e.g., request 118 of FIG. 1 ) and/or other types of requests.
  • the calls may be transmitted over a network (e.g., network 116 of FIG. 1 ) to remote entities to retrieve information resources corresponding to a URL in an address bar provided by user interface 402 , a URL for an information resource referenced in a markup document loaded by browser 400 , or a hyperlink present in content displayed by browser 400 .
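  • In modern terms, the networking module's role for a single resource reduces to something like the following sketch, using the standard fetch API to issue the HTTP request:

      // Retrieve an information resource by URL; errors surface to the caller.
      async function loadResource(url: string): Promise<string> {
        const response = await fetch(url);  // e.g., an HTTP GET request
        if (!response.ok) {
          throw new Error(`HTTP ${response.status} for ${url}`);
        }
        return response.text();             // markup handed to the rendering engine
      }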
  • Rendering engine 404 is configured to display requested content in one or more browser windows.
  • rendering engine 404 may request and receive a markup document (also known as a “markup language document”), and may render the content included in or referenced by the markup document for display in a screen of a display device.
  • Rendering engine 404 can render displays of HTML (hypertext markup language) and XML (extensible markup language) documents, as well as image/video content.
  • rendering engine 404 may parse the document to generate a DOM (document object model) tree.
  • the DOM is a cross-platform and language-independent convention for representing objects in HTML and XML documents.
  • Rendering engine 404 may generate a render tree from the DOM tree.
  • Rendering engine 404 may perform a layout process to determine screen coordinates for each node of the render tree, and may traverse and “paint” each node of the render tree on the display screen in a browser window.
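  • The parse/DOM/render-tree/layout/paint pipeline just described can be summarized in a sketch; the stage implementations are declared without bodies because they are engine-specific:

      interface DomNode    { tag: string; attrs: Record<string, string>; children: DomNode[]; }
      interface RenderNode { dom: DomNode; x: number; y: number; w: number; h: number; children: RenderNode[]; }

      declare function parseMarkup(source: string): DomNode;      // markup -> DOM tree
      declare function buildRenderTree(dom: DomNode): RenderNode; // visible nodes only
      declare function layout(root: RenderNode, viewportWidth: number): void; // assign coordinates
      declare function paint(node: RenderNode): void;             // draw one node

      function render(source: string, viewportWidth: number): void {
        const dom  = parseMarkup(source);
        const tree = buildRenderTree(dom);
        layout(tree, viewportWidth);          // layout pass: screen coordinates per node
        (function walk(n: RenderNode): void { // traverse and "paint" each node
          paint(n);
          n.children.forEach(walk);
        })(tree);
      }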
  • client applications 406 may optionally be present. Each client application 406 may be interfaced with web browser 400 to add corresponding capabilities to web browser 400, if web browser 400 does not already have such capabilities.
  • a client application 406 may be a presentation component configured to enable web browser 400 to play video, to scan for viruses, to display additional file types, such as PDF (portable document format) files, etc.
  • client application 406 may include a media player, an Adobe® Flash® plug-in that enables animation, video, and interactivity for web pages, an Apple QuickTime® plug-in that enables various formats of digital video, images, sound, and interactivity for web pages, a Microsoft® Silverlight™ plug-in that enables multimedia, graphics, animation, and interactivity for web pages, etc.
  • Code interpreter 410 (also known as a “script engine”) is configured to interpret and execute script code referenced by markup documents.
  • code interpreter 410 may be configured to interpret and execute JavaScript® code. JavaScript® may be present to provide enhanced user interfaces and dynamic web pages.
  • Code interpreter 410 may interpret JavaScript® source code, and execute the interpreted code.
  • web browser 400 may include a compiled code execution module that is capable of executing compiled code, such as Java bytecode.
  • browser 400 may include a virtual machine configured as a Java runtime environment to run Java applets, which may provide interactive features to web pages, including complex graphics.
  • FIG. 4B shows a block diagram of a display system 480 that includes a 2D and 3D display-enabled browser architecture, according to an embodiment.
  • display system 480 includes a web browser 490 .
  • Browser 490 is an embodiment of browser 400 that is configured to interface users and network-accessible resources with display devices that are capable of displaying two-dimensional content and/or three-dimensional content.
  • the embodiment of browser 490 shown in FIG. 4B is provided for purposes of illustration, and is not intended to be limiting. In further embodiments, browser 490 may include fewer, additional, and/or alternative features than shown in FIG. 4B .
  • Display system 480 is an example of a display system that is capable of displaying mixed 2D and 3D content (e.g., via mixed 2D/3D supporting logic 108 ).
  • system 480 includes web browser 490, operating system kernel and kernel utilities with regional/3Dx support 432 (“OS 432”), one or more browser page and 2D/3Dx content servers 460 (“server 460”), first-third display circuitry 416a-416c, a 2D display 418a, a 3D display with 2D mode 418b, a regionally configurable 2D/3Dx display 418c, and a network 478.
  • Web browser 490 includes various browser portions, including a browser/rendering engine 442 , a 2D/3Dx UI (user interface) display 444 , a networking module 446 , a UI backend 448 , and one or more 2D/3Dx video and image client(s) 450 .
  • Browser/rendering engine 442 includes a parser 452 , a render tree preparation module 454 , and a rendered tree display 456 .
  • UI backend 448 includes 2D/3Dx support 458 .
  • Browser page and 2D/3Dx content server(s) 460 includes page content 462 , linked content file or files 464 , and a streaming server application 466 .
  • Page content 462 includes a hypertext content link 468 , a screen region location 470 , and an underlying screen configuration 472 .
  • Linked content file or files 464 includes a file A and screen configuration A 474 , and a file B and screen configuration B 476 .
  • OS 432 includes user input interfaces 420 , a 2D, 3Dx & mixed display driver interface 422 , shell operations 424 , 2D, 3Dx, mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426 , an API supporting regional 2D/3Dx 428 (“API 428 ”), and one or more communication interfaces 440 .
  • 2D, 3Dx and mixed display driver interface 422 includes 2D only driver variant 434, 3Dx only driver variant 436, and mixed 2D and 3Dx driver variant 438.
  • First-third display circuitry 416a-416c each includes a corresponding one of translation services 430a-430c. The features of system 480 are described as follows.
  • 2D display 418 a, 3D display with 2D mode 418 b, and regionally configurable 2D/3Dx display 418 c are example types of display devices that may display content provided by browser 490 .
  • One or more of displays 418 a - 418 c may be present.
  • 2D display 418 a is an example of 2D display device 202 of FIG. 2 , and is a display device that is only capable of displaying two-dimensional content.
  • 3Dx display with 2D mode 418 b is an example of 2D-3D display device 204 of FIG. 2 , and is a display device that is capable of displaying two-dimensional and three-dimensional content.
  • 3Dx display with 2D mode 418 b may be set in a 2D mode where 3Dx display with 2D mode 418 b can display 2D content in full screen, but not 3D content, and may be set in a 3D mode where 3Dx display with 2D mode 418 b can display 3D content in full screen, but not 2D content.
  • 3Dx display with 2D mode 418 b may be capable of displaying 3D content having multiple camera views (“multiview”)—a number of “x” views—such as 3D-4, having four camera views, 3D-16, having sixteen camera views, etc.
  • the additional camera views enable viewers to “view behind” displayed 3D content by moving their heads left-right, as further described elsewhere herein.
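  • The “3D-x” naming used here can be read as follows, where x is the number of camera views a screen or region delivers; the table itself is merely illustrative:

      type ViewConfig = { label: string; cameraViews: number };

      const viewConfigs: ViewConfig[] = [
        { label: "2D",    cameraViews: 1 },  // single view, no depth
        { label: "3D-2",  cameraViews: 2 },  // stereoscopic left/right pair
        { label: "3D-4",  cameraViews: 4 },  // some "view behind" head movement
        { label: "3D-16", cameraViews: 16 }, // wider range of viewing positions
      ];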
  • Regionally configurable 2D/3Dx display 418 c is an example of 2D-3D display device 204 of FIG. 2 , and is a display device that is capable of displaying two-dimensional and three-dimensional content simultaneously.
  • regionally configurable 2D/3Dx display 418 c may display 2D content in one or more regions of a display screen while simultaneously displaying 3D content in one or more other regions of the display screen.
  • regionally configurable 2D/3Dx display 418 c may be capable of displaying 3D content having multiple camera views.
  • Network 478 is an example of network 116 in FIG. 1, and browser page and 2D/3Dx content server 460 is an example of document server 104 in FIG. 1.
  • One or more browser page and 2D/3Dx content servers 460 may be present that are accessible to browser 490 over network 478 .
  • Browser page and 2D/3Dx content server 460 may include one or more information resources, such as markup documents (e.g., web pages, etc.), image files, video files, etc.
  • page content 462 is an example of markup document content.
  • Page content 462 may include text, page configuration information, references to other information resources, etc. For instance, as shown in FIG. 4B, page content 462 may include one or more hypertext content links 468, which are links displayed in a page generated from page content 462 and displayed by browser 490.
  • Hypertext content link 468 may be selected by a user to traverse to and display an information resource as a page element.
  • Screen region location 470 may be present to indicate a region in the displayed page in which a page element corresponding to hypertext content link 468 is to be displayed.
  • For instance, screen region location 470 may be used by a layout module of browser/rendering engine 442 to select a location for display of the corresponding content in a display screen.
  • Underlying screen configuration 472 may be present to indicate a screen display configuration for the displayed page, including desired 2D and/or 3D display characteristics of the screen.
  • Underlying screen configuration 472 may be included in a file that includes page content 462 (e.g., in the form of one or more tags), or may be separately stored in server 460.
  • Browser/rendering engine 442 may use information of underlying screen configuration 472 in a configuration request to configure a screen region for displaying the corresponding content.
  • Linked content file(s) 464 includes files that may be requested for display by browser 490 (e.g., in request 118 ), such as in response to a user selecting a hyperlink in a displayed page.
  • For instance, linked content file(s) 464 may include multiple files from which a file may be selected to be provided in a response to a user selecting a hyperlink.
  • As shown in FIG. 4B, linked content file(s) 464 may include a file A and screen configuration A 474, and a file B and screen configuration B 476.
  • File A and file B are alternative files to be provided to browser 490 in response to a request, where file A corresponds to a screen configuration A and file B corresponds to a different screen configuration B.
  • File A or file B may be provided by server 460 in response to a request based on characteristics of the display screen in which content of the file is to be displayed, based on a provided display frame size, based on communication link characteristics (e.g., as determined by link testing), and/or based on other criteria.
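  • The following is a minimal sketch of such server-side selection between alternative files; all names (ScreenConfig, selectLinkedFile, the file paths) are hypothetical illustrations rather than elements of this disclosure.

```typescript
// Hypothetical sketch: choose between alternative linked content files
// (file A / file B) based on a browser-reported screen configuration.
interface ScreenConfig {
  mode: "2D" | "3D";      // display mode requested for the screen region
  cameraViews?: number;   // e.g., 2 for 3D-2, 4 for 3D-4 multiview content
}

interface LinkedFile {
  path: string;
  config: ScreenConfig;   // screen configuration the file was authored for
}

const fileA: LinkedFile = { path: "/content/clip-3d4.vid", config: { mode: "3D", cameraViews: 4 } };
const fileB: LinkedFile = { path: "/content/clip-2d.vid", config: { mode: "2D" } };

// Return the candidate whose stored configuration matches the request,
// falling back to a 2D variant when no matching 3D file is available.
function selectLinkedFile(requested: ScreenConfig, candidates: LinkedFile[]): LinkedFile {
  const exact = candidates.find(
    (f) =>
      f.config.mode === requested.mode &&
      (requested.mode === "2D" || f.config.cameraViews === requested.cameraViews)
  );
  return exact ?? candidates.find((f) => f.config.mode === "2D") ?? candidates[0];
}

// A browser reporting a 3D-4 capable region receives file A.
console.log(selectLinkedFile({ mode: "3D", cameraViews: 4 }, [fileA, fileB]).path);
```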
  • Streaming server application 466 may be present in browser page and 2D/3Dx content server 460 to stream video content in response to a request from browser 490 to server 460 for video files.
  • OS 432 is an example of operating system 304 shown in FIG. 3 .
  • OS 432 interfaces applications, such as browser 490 , with displays 418 a - 418 c.
  • OS 432 may provide various forms of 2D/3Dx display support.
  • API supporting regional 2D/3Dx 428 is configured to interface one or more applications (e.g., browser 490 ) with OS 432 , and thereby interface the applications with a display device (e.g., one or more of displays 418 a - 418 c ) coupled to OS 432 .
  • API supporting regional 2D/3Dx 428 is configured to enable applications, such as browser 490 , to access various display functions, including enabling regional definition for 2D, 3D, and 3Dx content displayed by display screens and further display functions.
  • User input interfaces 420 are configured to receive user input to enable a person to interact with display system 480 , browser 490 , and content displayed by displays 418 a - 418 c. Further example embodiments for user input interfaces 420 are described elsewhere herein.
  • 2D, 3Dx & mixed display driver interface 422 enables applications, such as browser 490, that interface with OS 432 via API 428 to provide and control two- and/or three-dimensional content displayed at displays 418 a - 418 c.
  • 2D only driver variant 434 , 3Dx only driver variant 436 , and mixed 2D and 3Dx driver variant 438 are examples of display driver 306 of FIG. 3 .
  • 2D, 3Dx & mixed display driver interface 422 may forward commands (e.g., from browser 490 ) to 2D only driver variant 434 when 2D display 418 a is present, enabling only 2D-related commands to be processed.
  • 2D, 3Dx & mixed display driver interface 422 may forward commands to 3Dx only driver variant 436 when 3Dx display with 2D mode 418 b is present, enabling 2D or 3Dx related commands to be processed.
  • 2D, 3Dx & mixed display driver interface 422 may forward commands to mixed 2D and 3Dx driver variant 438 when regionally configurable 2D/3Dx display 418 c is present, enabling regional 2D or 3Dx related commands to be processed.
  • Shell operations 424 may be present in OS 432 to control and/or enable user configuration of environmental properties, such as the 2D and/or 3D display configuration of an environmental background, of desktop icons, of displayed windows, etc.
  • Shell operations 424 may be implemented in hardware, software, firmware, or any combination thereof, including as a shell operations module.
  • Mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426 may be present in OS 432 to provide for translation of received content (e.g., from an application such as browser 490 ) from a first dimensionality to a second dimensionality.
  • For example, translation services 426 may be configured to translate received 3D content to 2D content, such as when an application provides 3D content and 2D display 418 a is the target display (e.g., the target display is not capable of displaying 3D content).
  • Translation services 426 may also be configured to translate a first type of 3D content to a second type of 3D content, such as when an application provides regional 2D and/or 3D content and 3Dx display with 2D mode 418 b is the target display (e.g., the target display is not capable of displaying content regionally), and/or to translate 3D content having a first number "x" of camera views (e.g., 3D-8 content) to 3D content having a second number "y" of camera views (e.g., 3D-4 content) if the target display does not support "x" camera views.
  • Furthermore, translation services 426 may be configured to translate 2D content to 3D content, and/or may be able to perform other forms of content translations. Example embodiments for mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426 (e.g., translators) are described elsewhere herein.
  • Display circuitry 416 a - 416 c may have the form of hardware, software, firmware, or any combination thereof, such as the form of a graphics card, dedicated circuitry, etc.
  • Display circuitry 416 a - 416 c may be present to interface OS 432 with displays 418 a - 418 c, respectively.
  • Display circuitry 416 a - 416 c may receive content signals and control signals from OS 432 , and may be configured to generate drive signals to drive displays 418 a - 418 c, respectively. Examples of display circuitry (e.g., drive circuits) are described elsewhere herein.
  • Display circuitry 416 a - 416 c may each optionally include a corresponding one of translation services 430 a - 430 c.
  • When present, translation services 430 a - 430 c may perform translations of received content in a similar manner as mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426.
  • For example, translation services 430 a may translate received 3D content to 2D content for display by 2D display 418 a.
  • Translation services 430 b may translate received regionally configurable 2D and/or 3D content to non-regional 2D and/or 3D content for display by 3Dx display with 2D mode 418 b.
  • Translation services 430 b and 430 c may each translate unsupported types of 3D content to supported types of 3D content for display by 3Dx display with 2D mode 418 b and regionally configurable 2D/3Dx display 418 c, respectively.
  • Translation services 430 a - 430 c may also be configured to perform additional and/or alternative forms of content translations, in embodiments.
  • Browser 490 is configured to enable network-accessible content to be displayed in two- and three-dimensions at displays 418 a - 418 c.
  • 2D/3Dx UI display 444 is an example of user interface 402 shown in FIG. 4A .
  • 2D/3Dx UI display 444 may include an address bar, back/forward buttons, bookmarking, and/or further portions of the browser display (e.g., other than the main window displaying a requested page).
  • 2D/3Dx UI display 444 may include 2D & 3Dx counterparts, such as images or video streams (e.g., 2D/3Dx applet-like functionality).
  • Browser/Rendering Engine 442 is an example of rendering engine 404 of FIG. 4A .
  • Engine 442 processes HTML and manages the display of web page content, as well as 2D and 3D image and video (stream) file content.
  • For instance, parser 452 may parse a loaded HTML document to generate a DOM (document object model) tree, as described above.
  • Rendering tree preparation module 454 may generate a render tree from the DOM tree.
  • Module 454 may identify screen configurations to be applied to regions of the display screen based on the render tree, and may cause configuration requests to be generated based on the identified screen configuration to cause a configuration or reconfiguration of the screen in the regions.
  • Module 454 may include a layout module that performs a layout process to determine screen coordinates for each node of the render tree.
  • Module 454 may traverse and “paint” each node of the render tree in a browser window on the display screen, to generate render tree display 456 .
  • Networking module 446 is an example of networking module 408 shown in FIG. 4A .
  • Networking module 446 is platform independent, and interfaces with OS 432 to operate through communication interface(s) 440 of OS 432 via network protocols (e.g., HTTP requests, etc.).
  • UI Backend 448 is configured to draw basic widgets, such as drop down boxes, combo boxes, and windows.
  • UI Backend 448 may interface with API 428 of OS 432 to generate 2D and 3D image or video (e.g., streamed) elements.
  • UI Backend 448 may be platform independent.
  • 2D/3Dx video and image client(s) 450 are an example of client application(s) 406 of FIG. 4A .
  • 2D/3Dx video and image client(s) 450 may include plug-ins, add-ons, built-ins, external helper apps, etc., that provide functionality to browser 490.
  • For example, clients 450 may provide functionality for: (i) generating the control signals that are passed to OS 432 for configuring display screen regions in preparation for underlying video/image presentation; (ii) managing the retrieval of media content to be displayed; (iii) delivering the media content via OS 432 to one or more of displays 418 a - 418 c; and (iv) managing the presentation of such media content (e.g., enabling rewind, zoom, pause, etc.) (see the sketch of these responsibilities following this discussion).
  • Item (i) above may be performed by browser/rendering engine 442 according to HTML tag definitions, for example.
  • One or more others of items (ii)-(iv) may also be performed by engine 442.
  • Client(s) 450 can be integrated into engine 442, or may remain a plug-in, an add-on, a built-in, or a helper app, as shown in FIG. 4B.
  • Client(s) 450 may reside outside of browser 490, and launching and loading of an external client 450 may be performed by browser 490 within another external window.
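  • The following is an illustrative-only sketch of the four client responsibilities (i)-(iv) above, expressed as a plug-in interface; the names are hypothetical and do not reflect an actual browser API.

```typescript
// Hypothetical plug-in interface mirroring client responsibilities (i)-(iv).
interface MediaClient {
  // (i) generate control signals passed to the OS to configure a screen region
  configureRegion(
    region: { x: number; y: number; w: number; h: number },
    mode: "2D" | "3D",
    cameraViews?: number
  ): void;

  // (ii) manage retrieval of the media content to be displayed
  fetchContent(url: string): Promise<ArrayBuffer>;

  // (iii) deliver the media content via the OS to a target display
  deliver(content: ArrayBuffer, displayId: number): void;

  // (iv) manage presentation of the media content
  control(action: "rewind" | "zoom" | "pause" | "play"): void;
}
```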
  • Code interpreters (e.g., code interpreter 410 of FIG. 4A), such as a Java interpreter, may also be present in browser 490, and operate pursuant to code of client 450 to perform the same function as a compiled add-on.
  • Display system 480 and browser 490 shown in FIG. 4B are provided for purposes of illustration.
  • In other embodiments, display system 480 and browser 490 may include fewer, further, and/or alternative components, as would be known to persons skilled in the relevant art(s). Further embodiments regarding the features of display system 480 and browsers 400 and 490 are described in the following subsections.
  • FIG. 5 shows a flowchart 500 providing a process for displaying web page content, according to an exemplary embodiment.
  • Flowchart 500 may be performed by browsers described elsewhere herein, such as browser 400 of FIG. 4A or browser 490 of FIG. 4B .
  • FIG. 6 shows a block diagram of browser 400 interfaced with a display device 606 , according to an exemplary embodiment.
  • As shown in FIG. 6, browser 400 includes rendering engine 404, application client(s) 406, and code interpreter 410.
  • Rendering engine 404 includes mixed 2D/3D supporting logic 108.
  • Display driver 604 is an example of display driver 306 of FIG. 3 .
  • Device 412 of FIG. 4A is not shown in FIG. 6 for ease of illustration, but it is noted that browser 400 may be included in device 412 , and display device 606 may be included in or may be external to device 412 . Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 500 . Flowchart 500 is described as follows.
  • Flowchart 500 begins with step 502. In step 502, web page content is parsed.
  • For example, browser 400 may receive a markup document 608.
  • Markup document 608 may include HTML text that describes a web page.
  • Rendering engine 404 may parse markup document 608 .
  • For instance, rendering engine 404 may include parser 452 of FIG. 4B, which may be configured to parse HTML documents, such as markup document 608.
  • Parser 452 may receive the content of markup document 608 in 8K chunks or portions, and may begin parsing the underlying HTML text of markup document 608 on a chunk-by-chunk basis without waiting for all content to be received.
  • Alternatively, parser 452 may receive all of the content of markup document 608 before beginning parsing. Parser 452 may generate a document object model tree or other structure that identifies each of the elements of content included in or referenced by markup document 608. A first portion of the elements may relate to two-dimensional content, and a second portion of the elements may relate to three-dimensional content. Alternatively, all of the elements may relate to three-dimensional content (or two-dimensional content).
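  • A minimal sketch of such chunk-by-chunk parsing follows; the IncrementalParser class and its names are hypothetical and stand in for the parser's internal structure.

```typescript
// Hypothetical sketch of chunk-by-chunk parsing: markup text is fed to the
// parser in ~8K portions without waiting for the full document to arrive.
const CHUNK_SIZE = 8 * 1024;

class IncrementalParser {
  private buffer = "";

  // Accept one portion of the document and extend the DOM tree so far.
  feed(chunk: string): void {
    this.buffer += chunk;
    // ... tokenize this.buffer and add any completed elements to the tree ...
  }
}

// Deliver a document as a stream of 8K chunks.
async function* chunksOf(text: string): AsyncGenerator<string> {
  for (let i = 0; i < text.length; i += CHUNK_SIZE) {
    yield text.slice(i, i + CHUNK_SIZE);
  }
}

async function parseMarkup(stream: AsyncIterable<string>): Promise<void> {
  const parser = new IncrementalParser();
  for await (const chunk of stream) {
    parser.feed(chunk); // parse each chunk as it arrives
  }
}

parseMarkup(chunksOf("<html>...</html>"));
```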
  • In step 504, two-dimensional content to be displayed in a first region of the screen is identified.
  • For example, rendering engine 404 may identify a first object of markup document 608 that relates to two-dimensional content.
  • For instance, the first object may be identified by parser 452 encountering a hypertext link corresponding to the first object in markup document 608, or in another manner.
  • The first object may include any form of two-dimensional content, such as an image, a video, another web page, etc.
  • The first object may be identified in various ways.
  • For example, rendering engine 404 may handle further processing of the first object, a client application 406 may be selected to manage the processing of the first object (e.g., for a particular type of first object that the client application 406 is configured to process), or code interpreter 410 may interpret and execute the first object when the first object is an un-compiled script.
  • The first object may be identified based on an identifier for the first object (e.g., a filename) or a structure of the first object itself (e.g., file contents, such as header information).
  • For example, a MIME (multipurpose Internet mail extensions) type file extension to a filename for the first object provided in markup document 608 may be used to identify the first object, to identify that the first object includes 2D content, and to select rendering engine 404 or a particular client application 406 to process the first object.
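  • A sketch of such extension-based identification follows; the extension-to-handler table (including the ".s3d" 3D extension) is invented purely for illustration.

```typescript
// Hypothetical dispatch table: a MIME-associated file extension on an object
// referenced by the markup selects 2D vs. 3D handling and a processor.
type Dimensionality = "2D" | "3D";

const handlers: Record<string, { dim: Dimensionality; processor: string }> = {
  ".jpg": { dim: "2D", processor: "renderingEngine" },
  ".mpg": { dim: "2D", processor: "videoClient" },
  ".s3d": { dim: "3D", processor: "videoClient" }, // invented 3D extension
};

function identifyObject(filename: string): { dim: Dimensionality; processor: string } {
  const ext = filename.slice(filename.lastIndexOf("."));
  // Unknown extensions default to 2D handling by the rendering engine.
  return handlers[ext] ?? { dim: "2D", processor: "renderingEngine" };
}

console.log(identifyObject("clip.s3d")); // { dim: "3D", processor: "videoClient" }
```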
  • In another embodiment, the first object may be identified by a content server (e.g., content server 460 of FIG. 4B) from which the first object is requested.
  • For instance, one or more parameters may be present in markup document 608 that may be passed to the content server by rendering engine 404 in a request that can be used to select the first object to be returned in response to the request.
  • The tag(s) and/or other parameter(s) may indicate a screen configuration for a screen 620 of display device 606, a frame size to be generated by rendering engine 404, and/or other information.
  • The content server may use the tag(s) and/or other parameters to select the first object, and/or may use other information to select the first object, such as characteristics of the communication link between the file server and browser.
  • Referring to FIG. 4B, file A or file B may be selected by server 460 based on whether the browser screen configuration matches screen configuration A or screen configuration B stored at server 460, and the selected file is transmitted (e.g., an image file is transmitted, video is streamed, etc.) to browser 400 (e.g., as information resource 610 in FIG. 6).
  • In step 506, a first configuration request is communicated to at least attempt to cause a first configuration of the first region of the screen to support the two-dimensional content.
  • For example, rendering engine 404 (e.g., render tree preparation module 454 of FIG. 4B) may generate a command 612 that is a configuration request for a first region of screen 620 to support display of the identified 2D content.
  • Command 612 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604.
  • Display driver 604 receives command 612 , and generates control signal(s) 616 that are received by display device 606 .
  • Control signal(s) 616 place(s) a region of screen 620 in a 2D display mode for display of the identified 2D content.
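  • The flow of steps 506/510 might be sketched as follows; the request shape and the textual control-signal format are assumptions made for illustration, not a defined driver protocol.

```typescript
// Sketch: the rendering engine emits a region configuration request, and a
// display driver callback turns it into a control signal for the screen.
interface RegionConfigRequest {
  region: { x: number; y: number; w: number; h: number };
  mode: "2D" | "3D";
  cameraViews?: number; // only meaningful for 3D regions
}

function sendConfigRequest(
  req: RegionConfigRequest,
  driver: (signal: string) => void // stands in for the display driver
): void {
  // In practice the request may pass through an API and/or OS first.
  const { x, y, w, h } = req.region;
  driver(
    `SET_REGION ${x},${y},${w},${h} MODE=${req.mode}` +
      (req.cameraViews ? ` VIEWS=${req.cameraViews}` : "")
  );
}

// Configure a 2D region (step 506), then a 3D-4 region (step 510).
const log = (signal: string) => console.log("control signal:", signal);
sendConfigRequest({ region: { x: 0, y: 0, w: 640, h: 480 }, mode: "2D" }, log);
sendConfigRequest({ region: { x: 640, y: 0, w: 640, h: 480 }, mode: "3D", cameraViews: 4 }, log);
```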
  • In step 508, three-dimensional content to be displayed in a second region of the screen is identified.
  • For example, rendering engine 404 may identify a second object of markup document 608 that relates to three-dimensional content.
  • For instance, the second object may be identified by parser 452 encountering a hypertext link corresponding to the second object in markup document 608, or in another manner.
  • The second object may include any form of three-dimensional content, such as an image, a video, another web page, etc., and any type of three-dimensional content (e.g., stereoscopic 3D, 3D-2, 3D-4, etc.).
  • The second object may be identified in various ways.
  • For example, rendering engine 404 may handle further processing of the second object, a client application 406 may be selected to manage the processing of the second object (e.g., for a particular type of second object that the client application 406 is configured to process), or code interpreter 410 may interpret and execute a script of the second object.
  • The second object may be identified based on an identifier for the second object (e.g., a filename) or a structure of the second object itself (e.g., file contents, such as header information).
  • For example, a MIME (multipurpose Internet mail extensions) type file extension to a filename for the second object provided in markup document 608 may be used to identify the second object, to identify that the second object includes 3D content, and to select rendering engine 404 or a particular client application 406 to process the second object.
  • In another embodiment, the second object may be identified by a content server (e.g., content server 460 of FIG. 4B) from which the second object is requested.
  • For instance, one or more parameters may be present in markup document 608 that may be passed to the content server by rendering engine 404 in a request that can be used to select the second object to be returned in response to the request.
  • The tag(s) and/or other parameter(s) may indicate a screen configuration for a screen 620 of display device 606, a frame size to be generated by rendering engine 404, and/or other information.
  • The content server may use the tag(s) and/or other parameters to select the second object, and/or may use other information to select the second object, such as characteristics of the communication link between the file server and browser.
  • Referring to FIG. 4B, file A or file B may be selected by server 460 based on whether the browser screen configuration matches screen configuration A or screen configuration B stored at server 460, and the selected file is transmitted (e.g., an image file is transmitted, video is streamed, etc.) to browser 400 (e.g., as information resource 610 in FIG. 6).
  • In step 510, a second configuration request is communicated to at least attempt to cause a second configuration of the second region of the screen to support the three-dimensional content, the first configuration being different from the second configuration.
  • For example, rendering engine 404 (e.g., render tree preparation module 454 of FIG. 4B) may generate a command 614 that is a configuration request for a second region of screen 620 to support display of the identified 3D content.
  • Command 614 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604.
  • Display driver 604 receives command 614 , and generates control signal(s) 618 that are received by display device 606 .
  • Control signal(s) 618 place(s) a second region of screen 620 in a 3D display mode for display of the identified 3D content. If the second region of screen 620 is in a different display mode (e.g., in a 2D display mode, or a different 3D display mode), the second region of the screen 620 is reconfigured according to the second configuration request.
  • Rendering engine 404 may generate a render tree for each of the 2D and 3D content identified in steps 504 and 508, and may perform a layout process to determine screen coordinates (positional information) for each node of each render tree (e.g., using render tree preparation module 454 shown in FIG. 4B). Rendering engine 404 may traverse each node of each render tree for display on screen 620, and may generate graphical data representative of each render tree to paint each node. As shown in FIG. 6, rendering engine 404 may transmit 2D graphical data 622 corresponding to the identified 2D content, and 3D graphical data 624 corresponding to the identified 3D content.
  • Display driver 604 may receive 2D graphical data 622 and 3D graphical data 624 , and transmit corresponding processed 2D graphical data 626 and processed 3D graphical data 628 that are received by display device 606 .
  • Display device 606 may display the 2D content of processed 2D graphical data 626 in the first region of screen 620 , which is configured according to the first configuration request.
  • Likewise, display device 606 may display the 3D content of processed 3D graphical data 628 in the second region of screen 620, which is configured according to the second configuration request. In this manner, browser 400 enables simultaneous display of 2D and 3D content by a display screen.
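  • The render-tree traversal described above might be sketched as follows; the RenderNode shape and the routing of nodes to separate 2D/3D outputs are illustrative assumptions.

```typescript
// Sketch: each render tree node carries its dimensionality and layout
// coordinates, and painting routes graphical data to a 2D or 3D path
// (corresponding to graphical data 622 and 624, respectively).
interface RenderNode {
  dim: "2D" | "3D";
  x: number; y: number; w: number; h: number; // from the layout process
  children: RenderNode[];
}

function paint(
  node: RenderNode,
  emit2D: (n: RenderNode) => void,
  emit3D: (n: RenderNode) => void
): void {
  (node.dim === "2D" ? emit2D : emit3D)(node); // paint this node
  node.children.forEach((child) => paint(child, emit2D, emit3D));
}

const tree: RenderNode = {
  dim: "2D", x: 0, y: 0, w: 1280, h: 720,
  children: [{ dim: "3D", x: 640, y: 0, w: 640, h: 480, children: [] }],
};
paint(tree, (n) => console.log("2D data:", n), (n) => console.log("3D data:", n));
```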
  • In embodiments, tags may be included in markup document 608.
  • The tags may be used to define characteristics of the display of 2D and 3D content by a display device.
  • For example, the tags may be used to indicate one or more display properties of the displayed content, including indicating whether content is 2D or 3D, indicating a type of 3D content, etc.
  • FIG. 7 shows a flowchart 700 providing a process for using tags to configure the display of 2D and 3D content, according to an exemplary embodiment.
  • Flowchart 700 may be performed by browser embodiments described herein, such as browser 400 of FIG. 4A or browser 490 of FIG. 4B .
  • Flowchart 700 is described with respect to FIG. 6 for purposes of illustration. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 700 .
  • Flowchart 700 is described as follows.
  • Flowchart 700 begins with step 702 .
  • In step 702, first tag information associated with two-dimensional content is identified, the two-dimensional content being intended for both a left eye and a right eye of a viewer.
  • For example, rendering engine 404 of FIG. 6 may identify a first tagged object in markup document 608 that relates to two-dimensional content.
  • For two-dimensional content, the same images are delivered to the right and left eyes of a viewer so that the content is perceived as two-dimensional.
  • For instance, the first tagged object may be identified by parser 452 encountering a URL or other content identifier (e.g., a filename) that has associated tags in markup document 608, or in another manner.
  • The first tagged object may include any form of two-dimensional content, such as an image, a video, another web page, etc.
  • The tag information associated with the two-dimensional content may include any number of attributes.
  • For example, the tag information may indicate a screen configuration for screen 620 of display device 606, a frame size to be generated by rendering engine 404 for display of the 2D content, a type of the 2D content, a display brightness for the 2D content, a resolution for the 2D content (e.g., 720p, 1080p, etc.), and/or any other suitable information described elsewhere herein or otherwise known.
  • In step 704, second tag information associated with three-dimensional content is identified, the three-dimensional content having a first portion and a second portion, the first portion intended for the left eye of the viewer and the second portion intended for the right eye of the viewer, the first portion being a first camera view and the second portion being a second camera view.
  • For example, rendering engine 404 of FIG. 6 may identify a second tagged object in markup document 608 that relates to three-dimensional content.
  • For three-dimensional content, images of differing perspective are delivered to the right and left eyes of a viewer. The images are combined in the visual center of the brain of the viewer to be perceived as a three-dimensional image.
  • For instance, the second tagged object may be identified by parser 452 encountering a second URL or other content identifier (e.g., a filename) that has associated tags in markup document 608, or in another manner.
  • The second tagged object may include any form of three-dimensional content, such as an image, a video, another web page, etc.
  • The second tag information associated with the three-dimensional content may include any number of attributes.
  • For example, the second tag information may indicate a screen configuration for screen 620 of display device 606 for display of the 3D content, a frame size to be generated by rendering engine 404 for display of the 3D content, a type of the 3D content, a display brightness for the 3D content, a display resolution for the 3D content, and/or any other suitable information described elsewhere herein or otherwise known.
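  • Hypothetical markup illustrating such first and second tag information is shown below (embedded as a string); the tag and attribute names are invented for illustration and are not defined by this disclosure or by any markup standard.

```typescript
// Invented markup: one object tagged as 2D, one tagged as 3D-4 multiview.
const pageFragment = `
  <object src="photo.jpg" display="2D" resolution="1080p" brightness="70"></object>
  <object src="clip.s3d" display="3D" type="3D-4" frame="640x480" brightness="80"></object>
`;

// A parser might extract each object's attribute string as tag information.
const tagPattern = /<object\s+([^>]*)><\/object>/g;
for (const match of pageFragment.matchAll(tagPattern)) {
  console.log("tag information:", match[1]);
}
```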
  • In step 706, the presentation of the two-dimensional content is caused in a first region of a screen.
  • For example, rendering engine 404 may generate command 612 that is a configuration request for a first region of screen 620 to support display of the 2D content according to the first tag information identified in step 702.
  • command 612 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604 .
  • Display driver 604 receives command 612 , and generates control signal(s) 616 that are received by display device 606 .
  • Control signal(s) 616 place(s) a first region of screen 620 in a 2D display mode for display of the 2D content.
  • Rendering engine 404 may generate a render tree for the 2D content, and may perform a layout process to determine screen coordinates (positional information) for each node of the render tree (e.g., using render tree preparation module 454 shown in FIG. 4B ). Rendering engine 404 may traverse each node of the render tree, and may generate graphical data representative of the render tree to paint each node. As shown in FIG. 6 , rendering engine 404 may transmit 2D graphical data 622 corresponding to the 2D content. Display driver 604 may receive 2D graphical data 622 and transmit corresponding processed 2D graphical data 626 that is received by display device 606 . Display device 606 may display the 2D content of processed 2D graphical data 626 in the first region of screen 620 , which is configured according to the first configuration request.
  • In step 708, the presentation of the three-dimensional content is caused in a second region of the screen.
  • For example, rendering engine 404 may generate command 614 that is a configuration request for a second region of screen 620 to support display of the 3D content according to the second tag information identified in step 704.
  • Command 614 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604 .
  • Display driver 604 receives command 614 , and generates control signal(s) 618 that are received by display device 606 .
  • Control signal(s) 618 place(s) a second region of screen 620 in a 3D display mode for display of the 3D content.
  • Rendering engine 404 may generate a render tree for the 3D content, and may perform a layout process to determine screen coordinates for each node of the render tree. Rendering engine 404 may traverse each node of the render tree, and may generate graphical data representative of the render tree to paint each node. As shown in FIG. 6, rendering engine 404 may transmit 3D graphical data 624 corresponding to the 3D content. Display driver 604 may receive 3D graphical data 624 and transmit corresponding processed 3D graphical data 628 that is received by display device 606. Display device 606 may display the 3D content of processed 3D graphical data 628 in the second region of screen 620, which is configured according to the second configuration request. In this manner, browser 400 causes display of the 3D content on screen 620 of display device 606 simultaneously with the display of the 2D content on screen 620.
  • Accordingly, two-dimensional and three-dimensional content identified by browser 400 may be simultaneously displayed within corresponding regions of screen 620.
  • Any number of different types of two-dimensional and three-dimensional content (e.g., different resolutions, different numbers of image pairs, different stereoscopic depths, etc.) may be displayed in any number of regions of screen 620.
  • FIGS. 8 , 9 , 10 A, and 10 B show examples of screen 620 displaying content in various screen regions, including tabs, frames, and display objects, according to embodiments.
  • FIG. 8 shows screen 620 of FIG. 6 displaying a browser window 802 that includes multiple frames. Frames enable browsers to display two or more web pages or other media elements within the same browser window (e.g., side-by-side, etc.). Frames may be defined using “frameset” tags that define frames and their sizes.
  • FIG. 8 shows browser window 802 including a first frame 804 and a second frame 806 .
  • First frame 804 is configured for the display of two-dimensional content (e.g., according to step 506 of FIG. 5 or step 706 of FIG. 7), and second frame 806 is configured for the display of three-dimensional content (e.g., according to step 510 of FIG. 5 or step 708 of FIG. 7).
  • In the example of FIG. 8, first frame 804 and second frame 806 have approximately the same size, and are positioned side-by-side.
  • In other embodiments, first and second frames 804 and 806 may have different sizes, and may have different positions relative to each other (e.g., above and below, etc.).
  • Although first and second frames 804 and 806 are shown as having rectangular shapes in FIG. 8, in other embodiments first and second frames 804 and 806 may have other shapes. Note that any number of frames may be displayed in browser window 802 that respectively display two-dimensional or three-dimensional content.
  • FIG. 9 shows screen 620 displaying a browser window 902 that includes multiple tabs.
  • Tabs enable browsers to display two or more documents in the same browser window, one at a time. The tabs can be used as navigational widgets to switch between the displayed documents.
  • FIG. 9 shows browser window 902 including a first tab region 904 and a second tab region 906 .
  • First tab region 904 may be configured for the display of two-dimensional content (e.g., according to step 506 of FIG. 5 or step 706 of FIG. 7), and second tab region 906 may be configured for the display of three-dimensional content (e.g., according to step 508 of FIG. 5 or step 708 of FIG. 7).
  • As shown in FIG. 9, first and second tab regions 904 and 906 each have a corresponding tab extending upward that may be used to bring the respective region forward.
  • First tab region 904 is displayed over second tab region 906 , such that second tab region 906 is not visible (except for the tab of second tab region 906 ).
  • The tab of second tab region 906 may be selected (e.g., by mouse click, etc.) to bring second tab region 906 to the forefront to be displayed over first tab region 904, causing first tab region 904 to not be visible (except for the tab of first tab region 904).
  • Note that any number of tab regions may be present in browser window 902 that respectively display two-dimensional or three-dimensional content.
  • FIG. 10A shows screen 620 displaying browser window 902 of FIG. 9, with browser window 902 including tab regions 904 and 906.
  • First tab region 904 is displayed over second tab region 906, and a frame 1002 is displayed in tab region 904. Any number of frames may be displayed in a tab region.
  • As shown in FIG. 10A, an object 1004 is displayed that overlaps first tab region 904 and frame 1002.
  • Object 1004 may be a two-dimensional object (e.g., displayed according to step 506 of FIG. 5 or step 706 of FIG. 7 ) or a three-dimensional object (e.g., displayed according to step 510 of FIG. 5 or step 708 of FIG. 7 ).
  • Object 1004 may be a graphical object generated at least in part by client application 406 interacting with rendering engine 404.
  • For example, object 1004 may be generated based on a Flash® application, a Java applet, etc., that is executed by client application 406 (or by rendering engine 404).
  • Any number of two-dimensional and/or three-dimensional content objects similar to object 1004 may be displayed in browser window 902.
  • Although object 1004 is shown as having a round shape in FIG. 10A, in other embodiments object 1004 may have other shapes (e.g., rectangular, another polygonal shape, the shape of a person, an animal, an animated character, a product, etc.).
  • FIG. 10B shows another example of screen 620 displaying a browser window 1020 similar to browser window 902 of FIG. 10A, with browser window 1020 including tab regions 904 and 906, and with tab region 904 including first frame 1002 and a second frame 1006.
  • Frames 1002 and 1006 may each include two-dimensional content (e.g., displayed according to step 506 of FIG. 5 or step 706 of FIG. 7) or three-dimensional content (e.g., displayed according to step 510 of FIG. 5 or step 708 of FIG. 7). Any number of two-dimensional and/or three-dimensional content objects similar to frames 1002 and 1006 may be displayed in browser window 1020, having any shape.
  • Browser window 1020 includes various user interface elements providing controls for navigating the display of 2D and 3D content.
  • For example, browser window 1020 may include a navigation bar 1008, which may include various controls.
  • For instance, a user may interact with navigation bar 1008 to navigate to web pages by entering corresponding URLs in an address entry box. Such web pages may include 2D and/or 3D content for display in browser window 1020.
  • A user may interact with back and forward buttons in navigation bar 1008 to navigate to a previous resource or forward to a subsequent resource.
  • A user may interact with a refresh button of navigation bar 1008 to reload a current resource, and may interact with a stop button of navigation bar 1008 to cancel loading a resource.
  • Navigation bar 1008 shown in FIG. 10B is provided for purposes of illustration and is not intended to be limiting.
  • In other embodiments, navigation bar 1008 may include additional and/or alternative navigation elements, such as a search engine query entry box, a home button, etc.
  • Browser window 1020 also provides various browser controls for controlling the display of two-dimensional and three-dimensional content.
  • For example, browser window 1020 may include a 3D display control bar 1010.
  • 3D display control bar 1010 is positioned in a North position in browser window 1020 immediately below navigation bar 1008 , but in other embodiments may have other forms or positions (e.g., right side, left side, South position, etc.), and may be combined with other displayed bars.
  • 3D display control bar 1010 may have other forms, such as a widget, an icon, or other user interface element.
  • 3D display control bar 1010 enables a user to configure 3D display settings and/or preferences for browser window 1020 .
  • 3D display control bar 1010 may include a 2D-3D toggle button 1014 and/or a 3D options button 1016 .
  • 2D-3D toggle button 1014 may be selected (e.g., by clicking with a mouse pointer 1024 , by keystrokes, etc.) by a user to toggle between display of content in browser window 1020 in 2D form, or to enable 3D-enabled content to be displayed in 3D form.
  • 2D-3D toggle button 1014 may display the current 2D-3D setting (e.g., either 2D or 3D).
  • 3D options button 1016 may be selected by a user to set one or more 3D display settings/preferences for browser window 1020 .
  • For example, a user may select 3D options button 1016 to invoke a menu 1018 that lists one or more 3D display options that may be selected by the user.
  • As shown in FIG. 10B, menu 1018 includes a "set 3Dx" option (to select a 3D multiview display type), a set 3D intensity option (to set a 3D display depth), a linked defaults option, and an advertisements defaults option.
  • The linked defaults option enables a user to configure whether content invoked by clicking on a hyperlink in a web page displayed in browser window 1020 is displayed in 2D or 3D form.
  • For instance, a user can set as a default that all content generated by the same domain is displayed regionally in full. Any hypertext linked content (e.g., coming from another source) may be set, according to the linked defaults option, to be reduced to 2D or to be enabled to be displayed in 3D (e.g., of a particular 3D type). Thereafter, by clicking on content that has been reduced to 2D form, a restoration to 3D form may be performed. Content that was restored to full 3D may be clicked again to be reduced back to 2D form.
  • A user may use the advertisements defaults option to set whether advertisements are displayed in 2D form by default, or whether 3D-enabled advertisements may be displayed in 3D form.
  • For example, an advertiser may attempt to push strong 3D effect graphics/video/text to users of browser window 1020 to grab their attention. This may be overridden through setup with the advertisements defaults option, or through direct user interaction with the advertisement itself. For example, a right click on the advertisement may generate "reduce intensity/2D/3D/stop-pause" type options.
  • 3D display control bar 1010 shown in FIG. 10B is provided for purposes of illustration and is not intended to be limiting. In further embodiments, 3D display control bar 1010 may have other form or position, and may include additional and/or alternative 3D control elements.
  • In embodiments, tab regions may enable users to configure 2D-3D settings on a tab region-by-tab region basis.
  • For example, tab region 904 may include a 3D user interface element 1012 that enables 3D settings to be made for tab region 904.
  • 3D user interface element 1012 may be a button, an icon, or a widget, may invoke a menu, etc., with which a user may interact to configure 2D-3D settings for tab region 904.
  • Such settings may be similar to those described above with respect to 3D display control bar 1010 and/or may include further and/or alternative settings.
  • 3D user interface element 1012 is shown for purposes of illustration, and may have other forms and capabilities than described with respect to FIG. 10B.
  • In further embodiments, browser window 1020 may enable users to configure 2D-3D settings for frames and/or specific content items, on a frame-by-frame or content-by-content basis.
  • For example, a user may invoke a menu 1022 with respect to frame 1002 (e.g., by right clicking pointer 1024 in frame 1002) that provides one or more 2D-3D configuration options.
  • As shown in FIG. 10B, menu 1022 may include a toggle 2D-3D option (to toggle between display of content in 2D or 3D), a change 3Dx option (to change a 3D multiview display setting), an increase 3D intensity option, a reduce 3D intensity option, a pause option (to pause display of video content), etc.
  • Menu 1022 is shown for purposes of illustration, and may provide further and/or alternative 2D-3D display related options than those shown in FIG. 10B.
  • Browser window 1020 may also include a 3D status bar 1012.
  • 3D status bar 1012 is positioned in a South-most position in browser window 1020 , but in other embodiments may have other positions, and may be combined with other displayed bars.
  • 3D status bar 1012 may have other forms, such as a widget, an icon, or other user interface element.
  • 3D status bar 1012 displays a current 2D-3D setting status for browser window 1020 , and may optionally change the displayed 2D-3D setting status depending on the particular region (e.g., tab region, frame, content, etc.) over which pointer 1024 is hovered.
  • 3D status bar 1012 may show any suitable 3D status information, such as whether display of 2D or 3D content is enabled, a type of 3D multiview that is displayed (e.g., “3D-8”), 2D-3D settings for advertisements, a 3D intensity setting, and/or further display information.
  • 3D status bar 1012 shown in FIG. 10B is provided for purposes of illustration and is not intended to be limiting. In further embodiments, 3D status bar 1012 may have other form or position, and may include additional and/or alternative 3D status elements.
  • FIGS. 8 , 9 , 10 A, and 10 B are provided for purposes of illustration, and are not intended to be limiting.
  • In embodiments, display device 606 may support the display of both two-dimensional and three-dimensional content.
  • However, not all display devices 606 that support three-dimensional content support all types of three-dimensional content.
  • In an embodiment, browser 400 may be configured to translate unsupported types of content to supported types of content.
  • Alternatively, browser 400 may be interfaced with components that are configured to perform such translations. For instance, as shown in FIG. 4B, OS 432 includes translation services 426, and display circuitry 416 a - 416 c include respective translation services 430 a - 430 c.
  • In an embodiment, rendering engine 404 of browser 400 may be configured to translate types of content that are not supported by a display device to supported types of content.
  • FIG. 11 shows a block diagram of rendering engine 404 , according to an exemplary embodiment.
  • As shown in FIG. 11, rendering engine 404 includes a first translator 1102 and a second translator 1104.
  • In embodiments, rendering engine 404 may include one or both of first and second translators 1102 and 1104.
  • First translator 1102 may be present in rendering engine 404 to support display devices that do not support the display of three-dimensional content.
  • Second translator 1104 may be present in rendering engine 404 to support display devices that do not support the display of one or more types of three-dimensional content.
  • First translator 1102 is configured to translate received 3D data to 2D data for display by a display device.
  • For example, as shown in FIG. 11, three-dimensional graphical data 1106 associated with an information resource may be received by rendering engine 404.
  • Rendering engine 404 may determine that the information resource contains three-dimensional content in any manner, such as by a MIME file extension, by contents of a media file containing the data, by a tag associated with the information resource, etc.
  • If the target display device does not support three-dimensional content, first translator 1102 may translate three-dimensional graphical data 1106 of the information resource to two-dimensional graphical data 1108.
  • Two-dimensional graphical data 1108 may be transmitted to the display device (e.g., 2D display 418 a of FIG. 4B) to enable two-dimensional content to be displayed in a screen region based on two-dimensional graphical data 1108.
  • Furthermore, a display device that supports the display of three-dimensional data may not support all types of three-dimensional data (e.g., the display device does not support 3D graphics data having additional camera views other than initial first right and left views, does not support a number of camera views greater than 3D-4, etc.).
  • Second translator 1104 is configured to translate 3D data of an information resource of one or more unsupported 3D content types to 3D data of one or more supported 3D content types for display by a display device. For example, as shown in FIG. 11 , first-type three-dimensional graphical data 1110 associated with an information resource may be received.
  • Rendering engine 404 may determine that the first type of three-dimensional content is an unsupported type in any manner, such as by a MIME file extension, by contents of a media file containing the data, by a tag associated with the information resource, etc.
  • In such a case, second translator 1104 may translate first-type three-dimensional graphical data 1110 to second-type three-dimensional graphical data 1112.
  • Second-type three-dimensional graphical data 1112 is transmitted to display device 606 to enable the corresponding second type of three-dimensional content to be displayed in the region of screen 620 .
  • First translator 1102 may be configured in various ways to translate received 3D data to 2D data. For instance, in an embodiment, three-dimensional graphical data 1106 may be received as a stream of right image data and left image data. First translator 1102 may be configured to combine the right and left image data into two-dimensional image data that defines a stream of two-dimensional images that may be output as two-dimensional data 1108 . In another embodiment, first translator 1102 may be configured to select the right image data or the left image data to be output as two-dimensional data 1108 , while the other of the right image data or left image data is not used. In further embodiments, first translator 1102 may translate received 3D data to 2D data in other ways.
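  • A minimal sketch of such 3D-to-2D translation follows; the frame representation (one Float32Array per eye view) and function names are assumptions made for illustration.

```typescript
// Sketch of first-translator behavior: given a left/right image pair,
// either keep one eye's view or blend the pair into a single 2D frame.
type Frame = Float32Array;

function translate3Dto2D(
  left: Frame,
  right: Frame,
  strategy: "left-only" | "blend"
): Frame {
  if (strategy === "left-only") {
    return left; // use one view; the other view is discarded
  }
  const out = new Float32Array(left.length);
  for (let i = 0; i < left.length; i++) {
    out[i] = (left[i] + right[i]) / 2; // combine the two views
  }
  return out;
}
```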
  • Second translator 1104 may be configured in various ways to translate 3D data of a first 3D content type to 3D data of a second 3D content type. For instance, second translator 1104 may translate a first 3D multiview type (e.g., 3D-16) to a second 3D multiview type (e.g., 3D-4) or to a single 3D view. In such an embodiment, second translator 1104 may not pass extra left-right image pairs from first-type three-dimensional data 1110 to second-type three-dimensional data 1112 . In an embodiment, second translator 1104 (and/or first translator 1102 ) may use techniques of image scaling to modify an unsupported display resolution to a supported display resolution.
  • For example, second translator 1104 may use upsampling or interpolation to increase resolution, and may use subsampling or downsampling to decrease resolution.
  • In further embodiments, second translator 1104 may translate 3D data in other ways.
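  • The following sketch illustrates multiview reduction and simple resampling under assumed representations (an array of per-camera views, and 1D nearest-neighbor scaling standing in for full image scaling); it is not the patent's implementation.

```typescript
// Reduce a multiview frame (e.g., 3D-16) to fewer views (e.g., 3D-4) by
// keeping an evenly spaced subset of camera views and dropping the rest.
function reduceViews<T>(views: T[], target: number): T[] {
  const step = views.length / target;
  return Array.from({ length: target }, (_, i) => views[Math.floor(i * step)]);
}

// Nearest-neighbor resampling of one scanline, as a stand-in for scaling an
// unsupported display resolution to a supported one.
function resampleRow(row: number[], targetLen: number): number[] {
  return Array.from({ length: targetLen }, (_, i) =>
    row[Math.floor((i * row.length) / targetLen)]
  );
}

console.log(reduceViews(["v0", "v1", "v2", "v3", "v4", "v5", "v6", "v7"], 4)); // 3D-8 -> 3D-4
console.log(resampleRow([0, 10, 20, 30], 8)); // upsample a row from 4 to 8 samples
```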
  • In further embodiments, a translator may be present to translate 2D content to 3D content, such as when a user has a preference to view content as 3D content.
  • Various techniques may be used to convert 2D graphical data to 3D graphical data, as would be known to persons skilled in the relevant art(s).
  • FIG. 12 shows a flowchart 1200 providing a process for determining display screen characteristics, according to an exemplary embodiment.
  • Flowchart 1200 may be performed by browser 400 of FIG. 4A , browser 490 of FIG. 4B , etc.
  • In a first step of flowchart 1200, an indication of at least one characteristic of the screen is requested.
  • For example, browser 400 of FIG. 6 may transmit a screen characteristic request to display device 606.
  • The screen characteristic request may be transmitted through an API (e.g., API 302 of FIG. 3), an OS (e.g., OS 304 of FIG. 3), and/or a display driver (e.g., display driver 306 of FIG. 3), when present in a communication path between browser 400 and display device 606.
  • In a next step of flowchart 1200, a response to the request is received.
  • For example, display device 606 may transmit a response to the screen characteristic request that includes an indication of one or more characteristics of screen 620, including whether screen 620 supports display of 2D and/or 3D content, an indication of supported types of 3D content, an indication of a resolution of screen 620, whether screen 620 supports display of mixed 2D and 3D content, etc.
  • The response may be transmitted through the display driver, OS, and/or API, when present.
  • Browser 400 may receive the response, and rendering engine 404 may use the received response information to render 2D and/or 3D content that is supported by screen 620.
  • For example, first translator 1102 or second translator 1104 may be activated to translate an unsupported content type to a supported content type, and/or other actions may be taken.
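  • This request/response exchange might be sketched as follows; the ScreenCharacteristics shape and function names are hypothetical.

```typescript
// Sketch: query the screen's characteristics and adapt rendering to them.
interface ScreenCharacteristics {
  supports2D: boolean;
  supports3D: boolean;
  supportsMixed2D3D: boolean;
  supported3DTypes: string[]; // e.g., ["3D-2", "3D-4"]
  resolution: { w: number; h: number };
}

async function queryScreen(
  request: () => Promise<ScreenCharacteristics> // may traverse API/OS/driver
): Promise<ScreenCharacteristics> {
  const caps = await request();
  if (!caps.supports3D) {
    // activate a 3D-to-2D translator (e.g., first translator 1102)
  } else if (!caps.supported3DTypes.includes("3D-8")) {
    // activate a 3D-type translator (e.g., second translator 1104)
  }
  return caps;
}
```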
  • As described above, user input interfaces 420 in FIG. 4B receive user input to enable persons to interact with browser content displayed by a display device.
  • For example, a user may be enabled to interact with displayed controls of browser 400 (e.g., displayed in 2D/3Dx UI display 444), to select tabs to view different tab regions, to interact with displayed graphical items (e.g., windows, frames, objects, etc.), to modify (e.g., rotate, resize, etc.) displayed graphical items, etc.
  • For instance, browser 400 may provide a command-line interface (e.g., a URL address entry box), a GUI, and/or other browser interface with which the user can interact using user input interfaces 420.
  • In embodiments, user input interfaces 420 may enable users to interact with displayed controls of browser 400 to adjust three-dimensional characteristics of three-dimensional content displayed by browser 400 (e.g., rendered by rendering engine 404).
  • For example, user input interfaces 420 may enable three-dimensionality of displayed content to be turned on or off (e.g., to toggle between two-dimensionality and three-dimensionality).
  • User input interfaces 420 may enable a degree of three-dimensionality of displayed content to be modified (e.g., increased or decreased, such as by changing a depth of three-dimensionality, increasing or decreasing a number of supplied camera views, etc.), may enable three-dimensional objects to be rotated in three dimensions, and/or may enable further types of adjustment to three-dimensional characteristics of displayed three-dimensional content. Furthermore, user input interfaces 420 may enable other characteristics of displayed content to be modified, such as modifying contrast, brightness, etc.
  • The user may interact with user input interfaces 420 in various ways, including using a mouse/pointing device to move a displayed pointer/cursor.
  • For example, the pointer may be used to select control settings.
  • The pointer may also be used to "click and drag" objects to move them, to resize objects, to rotate objects, to select controls/settings, to open a pop-up menu, etc.
  • Alternatively, the user may interact with a keyboard, a thumb wheel or other wheel, a roller ball, a stick pointer, a touch sensitive display, any number of virtual interface elements (e.g., such as a keyboard or other user interface element displayed by screen 620), a voice recognition system, and/or other user interface elements described elsewhere herein or otherwise known to provide user input.
  • Furthermore, user input interfaces 420 may support a touch screen that is reactive to user finger touches to the screen to cause three-dimensional characteristics of displayed objects to be modified. For instance, particular motions of one or more fingers against the screen may cause object resizing, 3D rotation, movement in 3D, etc. (e.g., touching two fingers to the screen and dragging them together may be interpreted as "grabbing" a window and moving the window in 3D).
  • Users may have preferences with regard to a browser environment upon the browser being activated. Such preferences may include preferences with regard to display of three-dimensional content. For example, a user may desire for a browser to power up in a two-dimensional or three-dimensional display mode, and if a three-dimensional display mode is desired, the user may have particular three-dimensional display preferences (e.g., a preferred degree of displayed three-dimensionality). For instance, the user may desire for the various controls of the browser to be displayed in two or three dimensions, may desire all content to be displayed as two-dimensional or three-dimensional by default, may desire particular contents such as advertisements to be displayed as two-dimensional by default, etc.
  • Embodiments enable display preferences to be set by users, and to be used to configure the display environments of users upon device boot up, user login, browser activation, etc.
  • FIG. 13 shows a block diagram of storage 1302 that may be included in an electronic device (e.g., device 412 of FIG. 4A ) that includes browser 400 , according to an exemplary embodiment.
  • As shown in FIG. 13, storage 1302 stores user browser preferences 1304.
  • User browser preferences 1304 may indicate the user preferences that a user may have for a browser environment upon the browser being activated, including the browser preferences mentioned above and/or further preferences.
  • User preferences 1304 may be loaded at browser startup, and used (e.g., by rendering engine 404 , OS 304 or 432 , etc.) to enable the browser environment to be displayed as desired by a user.
  • Storage 1302 may include one or more non-volatile storage elements, such as non-volatile random access memory (RAM) devices (e.g., flash memory, electrically erasable programmable read-only memory, etc.), read only memory (ROM) devices, a hard disk drive, a CDROM (compact disc ROM), a DVD (digital video disc), etc.
  • User preferences 1304 may be associated with a user by being stored in a user account of the user, being stored in a cookie associated with the user, etc.
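  • A sketch of loading such stored preferences at browser startup follows; the preferences record's fields are assumptions chosen to mirror the examples above.

```typescript
// Hypothetical shape for stored user browser preferences, loaded at startup.
interface BrowserPreferences {
  startupMode: "2D" | "3D";   // power up in a 2D or 3D display mode
  depth3D: number;            // preferred degree of three-dimensionality
  adsDefault: "2D" | "3D";    // default dimensionality for advertisements
  linkedDefault: "2D" | "3D"; // default for hyperlink-invoked content
}

const defaults: BrowserPreferences = {
  startupMode: "2D",
  depth3D: 0,
  adsDefault: "2D",
  linkedDefault: "2D",
};

// Preferences may come from a user account record or a cookie; fall back to
// defaults for any field that is missing.
function loadPreferences(stored: string | null): BrowserPreferences {
  return stored ? { ...defaults, ...JSON.parse(stored) } : defaults;
}

console.log(loadPreferences('{"startupMode":"3D","depth3D":2}'));
```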
  • Embodiments described herein for browsers that support the display of two-dimensional and three-dimensional content may be implemented with respect to various types of display devices.
  • For example, some display screens are configured for displaying two-dimensional content, although they may display two-dimensional images that may be combined to form three-dimensional images when viewed through special glasses worn by users.
  • Some other types of display screens are capable of displaying two-dimensional content and three-dimensional content without the user having to wear special glasses, using techniques of autostereoscopy.
  • As described above, browser embodiments described herein may generate configuration requests/commands to configure regions of the display screen for display of content, and may provide the content for display in the configured regions.
  • Display drivers (e.g., display driver 306 of FIG. 3, driver variants 434, 436, and 438 of FIG. 4B, etc.) may receive the configuration requests/commands, and may generate control signals to cause the screen to be configured as indicated.
  • Furthermore, the display drivers may supply the content provided by the browsers to the display devices to be displayed on the screen.
  • Described as follows are example display devices, screens, and display drivers that receive the control signals, are configured accordingly, and receive and display the provided content.
  • In embodiments, display devices such as display device 606 may be implemented in various ways.
  • For example, display device 606 may be a television display (e.g., an LCD (liquid crystal display) television, a plasma television, etc.), a computer monitor, or any other type of display device.
  • Display device 606 may include any suitable type or combination of light and image generating devices, including an LCD screen, a plasma screen, an LED (light emitting diode) screen (e.g., an OLED (organic LED) screen), etc.
  • Furthermore, display device 606 may include any suitable type of light filtering device, such as a parallax barrier (e.g., an LCD filter, a mechanical filter (e.g., that incorporates individually controllable shutters), etc.) and/or a lenticular lens, and may be configured in any manner, including as a thin-film device (e.g., formed of a stack of thin film layers), etc.
  • Moreover, display device 606 may include any suitable light emitting device as backlighting, including a panel of LEDs or other light emitting elements.
  • FIG. 14 shows a block diagram of a display device 1400 , according to an exemplary embodiment.
  • display device 1400 includes a screen 1402 .
  • Display device 1400 is an example of display device 606 and screen 1402 is an example of screen 620 described above (e.g., with respect to FIG. 6 ).
  • Device 1400 receives one or more control signals 1406 (e.g., from browser 400 ) that are configured to place screen 1402 in a desired display mode (e.g., either a two-dimensional display mode or a three-dimensional display mode).
  • screen 1402 includes a light manipulator 1404 .
  • Light manipulator 1404 is configured to manipulate light that passes through light manipulator 1404 to enable three-dimensional images to be delivered to users in a viewing space.
  • control signal(s) 1406 may be configured to activate or deactivate light manipulator 1404 to place screen 1402 in a three-dimensional display mode or a two-dimensional display mode, respectively.
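  • The activate/deactivate behavior just described might be modeled as follows; this is an illustrative sketch only, with hypothetical names, not the patent's actual control-signal format.

```python
# Minimal sketch, assuming a boolean "active" flag on the light manipulator:
# activating it yields a three-dimensional mode, deactivating yields 2D.
# All class and field names are hypothetical.
from dataclasses import dataclass
from enum import Enum

class DisplayMode(Enum):
    TWO_D = "2d"
    THREE_D = "3d"

@dataclass
class LightManipulator:
    active: bool = False   # inactive -> light passes unmodified (2D)

@dataclass
class Screen:
    manipulator: LightManipulator

    def handle_control_signal(self, mode: DisplayMode) -> None:
        # A control signal selecting 3D activates the manipulator;
        # selecting 2D deactivates it.
        self.manipulator.active = (mode is DisplayMode.THREE_D)

screen = Screen(LightManipulator())
screen.handle_control_signal(DisplayMode.THREE_D)
assert screen.manipulator.active
```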
  • Examples of light manipulator 1404 include a parallax barrier and a lenticular lens.
  • light manipulator 1404 may be a parallax barrier that has a layer of material with a series of precision slits. The parallax barrier is placed proximal to a light emitting pixel array so that a user's eyes each see a different set of pixels to create a sense of depth through parallax.
  • light manipulator 1404 may be a lenticular lens that includes an array of magnifying lenses configured so that when viewed from slightly different angles, different images are magnified.
  • Such a lenticular lens may be used to deliver light from a different set of pixels of a pixel array to each of the user's eyes to create a sense of depth.
  • Embodiments are applicable to display devices that include such light manipulators, to display devices that include other types of light manipulators, and to display devices that include multiple light manipulators.
  • display device 1400 receives a content signal 1408 (e.g., from device 412 of FIG. 4A , or other electronic device).
  • Content signal 1408 includes two-dimensional or three-dimensional content for display by screen 1402 , depending on the particular display mode.
  • In display device 1400 , light manipulator 1404 is physically fixed, and is not adaptable. As such, when present, light manipulator 1404 (e.g., a fixed parallax barrier or a fixed lenticular lens) always delivers three-dimensional images of a particular type to a particular region in a viewing space, and is not adaptable to deliver other types of three-dimensional images and/or to deliver two- and/or three-dimensional images to multiple different regions of a viewing space.
  • FIG. 15 shows a block diagram of a display device 1500 that is adaptable, according to an exemplary embodiment.
  • display device 1500 includes a screen 1502 .
  • Display device 1500 is an example of display device 606 and screen 1502 is an example of screen 620 described above (e.g., with respect to FIG. 6 ).
  • screen 1502 includes an adaptable light manipulator 1504 .
  • Adaptable light manipulator 1504 is configured to manipulate light that passes through adaptable light manipulator 1504 to enable three-dimensional images to be delivered to users in a viewing space.
  • In contrast, adaptable light manipulator 1504 is adaptable: its configuration is not physically fixed.
  • adaptable light manipulator 1504 is adaptable to deliver multiple different types of three-dimensional images and/or to deliver three-dimensional images to different/moving regions of a viewing space. Furthermore, in an embodiment, different regions of adaptable light manipulator 1504 may be adaptable such that multiple two-dimensional and/or three-dimensional images may be simultaneously delivered by screen 1502 to the viewing space.
  • Device 1500 receives one or more control signals 1506 (e.g., from browser 400 ) that are configured to place screen 1502 in a desired display mode (e.g., either a two-dimensional display mode or a three-dimensional display mode), and/or to configure three-dimensional characteristics of any number and type as described above, such as configuring adaptable light manipulator 1504 to deliver different types of three-dimensional images, to deliver three-dimensional images to different/moving regions of a viewing space, and to deliver two-dimensional and/or three-dimensional images from any number of regions of screen 1502 to the viewing space.
  • display device 1500 receives a content signal 1508 (e.g., from device 412 of FIG. 4A , or other electronic device).
  • Content signal 1508 includes two-dimensional and/or three-dimensional content for display by screen 1502 , depending on the particular display mode and on the number of regions of screen 1502 that are delivering different two- or three-dimensional views to a viewing space.
  • Content signals 1408 and 1508 may include video content according to any suitable format.
  • content signals 1408 and 1508 may include video content delivered over an HDMI (High-Definition Multimedia Interface) interface, over a coaxial cable, as composite video, as S-Video, over a VGA (video graphics array) interface, etc.
  • control signals 1406 and 1506 may be provided to a display device separately from, or in the same signal stream as, the corresponding content signal 1408 or 1508 .
  • Exemplary embodiments for display devices 1400 and 1500 of FIGS. 14 and 15 are described as follows for purposes of illustration.
  • Display devices 1400 and 1500 may include parallax barriers as light manipulators 1404 and 1504 , respectively.
  • FIG. 16 shows a block diagram of a display system 1600 , which is an example of display device 606 , according to an embodiment.
  • system 1600 includes a display device driver circuit 1602 , an image generator 1612 , and a parallax barrier 1620 .
  • image generator 1612 includes a pixel array 1608 , and parallax barrier 1620 includes a barrier element array 1610 .
  • display driver circuit 1602 includes a pixel array driver circuit 1604 and a barrier array driver circuit 1606 .
  • Pixel array 1608 includes a two-dimensional array of pixels (e.g., arranged in a grid or other distribution). Pixel array 1608 is a self-illuminating or light-generating pixel array such that the pixels of pixel array 1608 each emit light included in light 1652 emitted from image generator 1612 . Each pixel may be a separately addressable light source (e.g., a pixel of a plasma display, an LCD display, an LED display such as an OLED display, or of other type of display). Each pixel of pixel array 1608 may be individually controllable to vary color and intensity. In an embodiment, each pixel of pixel array 1608 may include a plurality of sub-pixels that correspond to separate color channels, such as a trio of red, green, and blue sub-pixels included in each pixel.
  • Parallax barrier 1620 is positioned proximate to a surface of pixel array 1608 .
  • Barrier element array 1610 is a layer of parallax barrier 1620 that includes a plurality of barrier elements or blocking regions arranged in an array. Each barrier element of the array is configured to be selectively opaque or transparent. Combinations of barrier elements may be configured to be selectively opaque or transparent to enable various effects. For example, in one embodiment, each barrier element may have a round, square, or rectangular shape, and barrier element array 1610 may have any number of rows of barrier elements that extend a vertical length of barrier element array 1610 .
  • each barrier element may have a “band” shape that extends a vertical length of barrier element array 1610 , such that barrier element array 1610 includes a single horizontal row of barrier elements.
  • Each barrier element may include one or more of such bands, and different regions of the barrier element array may include barrier elements that include different numbers of such bands.
  • barrier elements do not need to have spacing between them because there is no need for drive signal routing in such space.
  • In a conventional LCD display, by contrast, a transistor-plus-capacitor circuit is typically placed at the corner of each pixel in the array, and drive signals for such transistors are routed between the LCD pixels (e.g., row-column control).
  • For a barrier element array, such local transistor control may not be necessary because barrier elements may not need to change as rapidly as display pixels (e.g., pixels of pixel array 1608 ).
  • one band or multiple adjacent bands may comprise a barrier element in a blocking state, followed by one band or multiple adjacent bands (e.g., two bands) that comprise a barrier element in a non-blocking state (a slit), and so on.
  • For example, five adjacent bands in the blocking state may combine to form a single black barrier element approximately 2.5 times the width of a two-band transparent slit, with no spaces therein.
  • barrier elements may be capable of being completely transparent or opaque, and in other embodiments, barrier elements may not be capable of being fully transparent or opaque.
  • barrier elements may be capable of being 95% transparent when considered to be “transparent” and may be capable of being 5% transparent when considered to be “opaque.”
  • Transparent and opaque as used herein are intended to encompass barrier elements being substantially transparent (e.g., greater than 75% transparent, including completely transparent) and substantially opaque (e.g., less than 25% transparent, including completely opaque), respectively.
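  • As a small illustration of these thresholds, a helper along the following lines (hypothetical, not from the patent) could classify a barrier element's state from its measured light transmission.

```python
# Illustrative helper for the "substantially transparent/opaque" thresholds
# given above (>75% transmission counts as transparent, <25% as opaque);
# values in between are left unclassified. Names are hypothetical.
def classify_barrier_element(transmission: float) -> str:
    """transmission: fraction of light passed, in the range 0.0..1.0."""
    if transmission > 0.75:
        return "transparent"
    if transmission < 0.25:
        return "opaque"
    return "intermediate"

# The 95%/5% example from above falls squarely into the two classes.
assert classify_barrier_element(0.95) == "transparent"
assert classify_barrier_element(0.05) == "opaque"
```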
  • Display driver circuit 1602 receives control signal 1622 and content signal 1624 .
  • content signal 1624 includes two-dimensional and/or three-dimensional content for display.
  • Control signal 1622 may be control signal 1406 of FIG. 14 (for a non-adaptable parallax barrier 1620 ) or may be control signal 1506 of FIG. 15 (for an adaptable parallax barrier 1620 ).
  • Control signal 1622 may be received from a display driver of an operating system (e.g., may be control signal 618 received from display driver 604 in FIG. 6 ).
  • Display driver circuit 1602 is configured to generate drive signals based on control signal 1622 and content signal 1624 to enable display system 1600 to display two-dimensional and three-dimensional images to users 1618 in viewing space 1670 .
  • pixel array driver circuit 1604 is configured to generate a drive signal 1614 that is received by pixel array 1608 (e.g., based on content signal 1624 and/or control signal 1622 ).
  • Drive signal 1614 may include one or more drive signals used to cause pixels of pixel array 1608 to emit light 1652 of particular desired colors and/or intensity.
  • Barrier array driver circuit 1606 is configured to generate a drive signal 1616 that is received by barrier element array 1610 (e.g., based on control signal 1622 ).
  • Drive signal 1616 may include one or more drive signals used to cause each of the barrier elements of barrier element array 1610 to be transparent or opaque.
  • barrier element array 1610 filters light 1652 to generate filtered light 1672 that includes one or more two-dimensional and/or three-dimensional images that may be viewed by users 1618 in viewing space 1670 .
  • Example further description of implementations of the display driver circuits described herein is provided in pending U.S. patent application Ser. No. ______, titled “Integrated Backlighting, Sub-Pixel and Display Driver Circuitry Supporting Adaptive 2D, Stereoscopic 3D and Multi-View 3D Displays,” filed on same date herewith, which is incorporated by reference herein in its entirety, although the driver circuits described herein are not limited to such implementations.
  • drive signal 1614 may control sets of pixels of pixel array 1608 to each emit light representative of a respective image, to provide a plurality of images.
  • Drive signal 1616 may control barrier elements of barrier element array 1610 to filter the light received from pixel array 1608 according to the provided images such that one or more of the images are received by users 1618 in two-dimensional form.
  • drive signal 1616 may select one or more sets of barrier elements of barrier element array 1610 to be transparent, to transmit one or more corresponding two-dimensional images or views to users 1618 .
  • drive signal 1616 may control sections of barrier element array 1610 to include opaque and transparent barrier elements to filter the light received from pixel array 1608 so that one or more pairs of images or views provided by pixel array 1608 are each received by users 1618 as a corresponding three-dimensional image or view.
  • drive signal 1616 may select parallel strips of barrier elements of barrier element array 1610 to be transparent to form slits that enable three-dimensional images to be received by users 1618 .
  • drive signal 1616 may be generated by barrier array driver circuit 1606 to configure one or more characteristics of barrier element array 1610 .
  • drive signal 1616 may be generated to form any number of parallel strips of barrier elements of barrier element array 1610 to be transparent, to modify the number and/or spacing of parallel strips of barrier elements of barrier element array 1610 that are transparent, to select and/or modify a width and/or a length (in barrier elements) of one or more strips of barrier elements of barrier element array 1610 that are transparent or opaque, to select and/or modify an orientation of one or more strips of barrier elements of barrier element array 1610 that are transparent, to select one or more areas of barrier element array 1610 to include all transparent or all opaque barrier elements, etc.
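  • One way to picture what such a configuration drive signal encodes is as a boolean mask over the barrier element array; the sketch below (illustrative names and parameters, not the patent's signal format) builds a mask with a chosen slit width, blocking-strip width, and strip orientation.

```python
# Minimal sketch of the kind of configuration drive signal 1616 might encode:
# a boolean barrier-element mask (True = transparent/non-blocking) with a
# chosen slit width, blocking-strip width, and orientation. All parameter
# names are illustrative assumptions, not taken from the patent.
def make_strip_mask(rows: int, cols: int, slit_width: int,
                    block_width: int, vertical: bool = True):
    """Return a rows x cols mask of alternating transparent/opaque strips."""
    period = slit_width + block_width
    mask = []
    for r in range(rows):
        row = []
        for c in range(cols):
            pos = c if vertical else r        # strip axis selects coordinate
            row.append((pos % period) < slit_width)
        mask.append(row)
    return mask

# Two-element-wide slits alternating with two-element-wide blocking strips,
# matching the 20-by-28 layout described below for FIG. 20.
mask = make_strip_mask(rows=20, cols=28, slit_width=2, block_width=2)
```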
  • FIG. 17 shows a block diagram of a display system 1700 , which is another example of display device 1500 of FIG. 15 , according to an embodiment.
  • system 1700 includes display device driver circuit 1602 , a pixel array 1722 , parallax barrier 1620 , and backlighting 1716 .
  • Parallax barrier 1620 includes barrier element array 1610 and backlighting 1716 includes a light element array 1736 .
  • display driver circuit 1602 includes a pixel array driver circuit 1728 , barrier array driver circuit 1606 , and a light source driver circuit 1730 .
  • Backlighting 1716 is a backlight panel that emits light 1738 .
  • Light element array 1736 (or “backlight array”) of backlighting 1716 includes a two-dimensional array of light sources. Such light sources may be arranged, for example, in a rectangular grid. Each light source in light element array 1736 is individually addressable and controllable to select an amount of light emitted thereby.
  • a single light source may comprise one or more light-emitting elements depending upon the implementation.
  • each light source in light element array 1736 comprises a single light-emitting diode (LED), although this example is not intended to be limiting. Further description of implementations of backlighting 1716 and other backlighting implementations described herein is provided in a pending U.S. patent application.
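  • An individually addressable backlight array of this kind might be modeled as follows; the class and method names are hypothetical, offered only to make the per-element and per-region control concrete.

```python
# Minimal sketch of an individually addressable backlight array as described
# above: a rectangular grid of light sources whose emitted intensity can be
# set per element or per rectangular region. Names are illustrative.
class BacklightArray:
    def __init__(self, rows: int, cols: int):
        self.intensity = [[0.0] * cols for _ in range(rows)]

    def set_element(self, r: int, c: int, level: float) -> None:
        """Set one light source's output (0.0 = off, 1.0 = full)."""
        self.intensity[r][c] = max(0.0, min(1.0, level))

    def set_region(self, r0: int, r1: int, c0: int, c1: int,
                   level: float) -> None:
        """Drive a rectangular region, e.g. to back one screen region."""
        for r in range(r0, r1):
            for c in range(c0, c1):
                self.set_element(r, c, level)

panel = BacklightArray(rows=8, cols=16)
panel.set_region(0, 8, 0, 8, 1.0)   # light only the left half of the panel
```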
  • Parallax barrier 1620 is positioned proximate to a surface of backlighting 1716 (e.g., a surface of the backlight panel).
  • barrier element array 1610 is a layer of parallax barrier 1620 that includes a plurality of barrier elements or blocking regions arranged in an array. Each barrier element of the array is configured to be selectively opaque or transparent.
  • Barrier element array 1610 filters light 1738 received from backlighting 1716 to generate filtered light 1740 .
  • Filtered light 1740 is configured to enable a two-dimensional image or a three-dimensional image (e.g., formed by a pair of two-dimensional images in filtered light 1672 ) to be formed based on images subsequently imposed on filtered light 1740 by pixel array 1722 .
  • pixel array 1722 of FIG. 17 includes a two-dimensional array of pixels (e.g., arranged in a grid or other distribution).
  • pixel array 1722 is not self-illuminating, and instead is a light filter that imposes images (e.g., in the form of color, grayscale, etc.) on filtered light 1740 from parallax barrier 1620 to generate filtered light 1672 to include one or more images.
  • Each pixel of pixel array 1722 may be a separately addressable filter (e.g., a pixel of a plasma display, an LCD display, an LED display, or of other type of display).
  • Each pixel of pixel array 1722 may be individually controllable to vary the color imposed on the corresponding light passing through, and/or to vary the intensity of the passed light in filtered light 1672 .
  • each pixel of pixel array 1722 may include a plurality of sub-pixels that correspond to separate color channels, such as a trio of red, green, and blue sub-pixels included in each pixel.
  • Display driver circuit 1602 of FIG. 17 is configured to generate drive signals based on control signal 1622 and/or content signal 1624 to enable display system 1700 to display two-dimensional and three-dimensional images to users 1618 in viewing space 1670 .
  • light source driver circuit 1730 within display driver circuit 1602 controls the amount of light emitted by each light source in light element array 1736 by generating a drive signal 1734 that is received by light element array 1736 (based on content signal 1624 and/or control signal 1622 ).
  • Drive signal 1734 may include one or more drive signals used to control the amount of light emitted by each light source in light element array 1736 to generate light 1738 .
  • barrier array driver circuit 1606 is configured to generate drive signal 1616 received by barrier element array 1610 (e.g., based on control signal 1622 ).
  • Drive signal 1616 may include one or more drive signals used to cause each of the barrier elements of barrier element array to be transparent or opaque, to filter light 1738 to generate filtered light 1740 .
  • Pixel array driver circuit 1728 is configured to generate a drive signal 1732 that is received by pixel array 1722 (e.g., based on content signal 1624 and/or control signal 1622 ).
  • Drive signal 1732 may include one or more drive signals used to cause pixels of pixel array 1722 to impose desired images (e.g., colors, grayscale, etc.) on filtered light 1740 as it passes through pixel array 1722 .
  • drive signal 1734 may control sets of light sources of light element array 1736 to emit light 1738 .
  • Drive signal 1616 may control barrier elements of barrier element array 1610 to filter light 1738 received from light element array 1736 so that filtered light 1740 enables two-dimensional and/or three-dimensional images to be formed.
  • Drive signal 1732 may control sets of pixels of pixel array 1722 to filter filtered light 1740 according to respective images, to provide a plurality of images.
  • drive signal 1616 may select one or more sets of the barrier elements of barrier element array 1610 to be transparent, to enable one or more corresponding two-dimensional images to be delivered to users 1618 .
  • drive signal 1616 may control sections of barrier element array 1610 to include opaque and transparent barrier elements to filter the light received from light element array 1736 so that one or more pairs of images provided by pixel array 1722 are each enabled to be received by users 1618 as a corresponding three-dimensional image.
  • drive signal 1616 may select parallel strips of barrier elements of barrier element array 1610 to be transparent to form slits that enable three-dimensional images to be received by users 1618 .
  • FIG. 18 shows a flowchart 1800 for generating images that are delivered to users in a viewing space, according to an exemplary embodiment.
  • Flowchart 1800 may be performed by system 1600 in FIG. 16 or system 1700 of FIG. 17 , for example.
  • Flowchart 1800 is described with respect to FIG. 19 , which shows a cross-sectional view of a display system 1900 .
  • Display system 1900 is an exemplary embodiment of system 1600 shown in FIG. 16 , and is shown for purposes of illustration.
  • system 1900 includes a pixel array 1902 and a barrier element array 1904 .
  • system 1900 may further include backlighting in a configuration similar to display system 1700 of FIG. 17 .
  • Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 1800 .
  • Flowchart 1800 is described as follows.
  • Flowchart 1800 begins with step 1802 .
  • In step 1802 , light is received at an array of barrier elements.
  • light 1652 is received at parallax barrier 1620 from pixel array 1608 .
  • Each pixel of pixel array 1608 may generate light that is received at parallax barrier 1620 .
  • parallax barrier 1620 may filter light 1652 from pixel array 1608 to generate a two-dimensional image or a three-dimensional image viewable in viewing space 1670 by users 1618 .
  • light 1738 may be received by parallax barrier 1620 from light element array 1736 .
  • In step 1804 , a first set of the barrier elements of the array of barrier elements is configured in the blocking state and a second set of the barrier elements of the array of barrier elements is configured in the non-blocking state to enable a viewer to be delivered a three-dimensional view.
  • Three-dimensional image content may be provided for viewing in viewing space 1670 .
  • barrier array driver circuit 1606 may generate drive signal 1616 to configure barrier element array 1610 to include transparent strips of barrier elements to enable a three-dimensional view to be formed.
  • barrier element array 1904 includes a plurality of barrier elements that are each either transparent (in a non-blocking state) or opaque (in a blocking state).
  • Barrier elements that are blocking are indicated as barrier elements 1910 a - 1910 f , and barrier elements that are non-blocking are indicated as barrier elements 1912 a - 1912 e . Further barrier elements may be included in barrier element array 1904 that are not visible in FIG. 19 .
  • Each of barrier elements 1910 a - 1910 f and 1912 a - 1912 e may include one or more barrier elements.
  • Barrier elements 1910 alternate with barrier elements 1912 in series in the order of barrier elements 1910 a, 1912 a, 1910 b, 1912 b, 1910 c, 1912 c, 1910 d, 1912 d, 1910 e, 1912 e, and 1910 f. In this manner, blocking barrier elements 1910 are alternated with non-blocking barrier elements 1912 to form a plurality of parallel non-blocking or transparent slits in barrier element array 1904 .
  • FIG. 20 shows a view of a parallax barrier 2000 with transparent slits, according to an exemplary embodiment.
  • Parallax barrier 2000 is an example of parallax barrier 1620 of FIGS. 16 and 17 .
  • parallax barrier 2000 includes barrier element array 2002 , which includes a plurality of barrier elements 2004 arranged in a two-dimensional array.
  • barrier element array 2002 includes a plurality of parallel strips of barrier elements 2004 that are selected to be non-blocking to form a plurality of parallel non-blocking strips (or “slits”) 2006 a - 2006 g.
  • non-blocking strips 2006 a - 2006 g are alternated with parallel blocking strips 2008 a - 2008 g of barrier elements 2004 that are selected to be blocking.
  • non-blocking strips 2006 a - 2006 g and blocking strips 2008 a - 2008 g each have a width (along the x-dimension) of two barrier elements 2004 , and have lengths that extend along the entire y-dimension (twenty barrier elements 2004 ) of barrier element array 2002 , although in other embodiments they may have alternative dimensions.
  • Non-blocking strips 2006 a - 2006 g and blocking strips 2008 a - 2008 g form a parallax barrier configuration for parallax barrier 2000 .
  • the spacing (and number) of parallel non-blocking strips 2006 in barrier element array 2002 may be selectable by choosing any number and combination of particular strips of barrier elements 2004 in barrier element array 2002 to be non-blocking, to be alternated with blocking strips 2008 , as desired. For example, hundreds, thousands, or even larger numbers of non-blocking strips 2006 and blocking strips 2008 may be present in parallax barrier 2000 .
  • FIG. 21 shows a parallax barrier 2100 that is another example of parallax barrier 1620 with parallel transparent slits, according to an embodiment.
  • parallax barrier 2100 includes a barrier element array 2112 , which includes a plurality of barrier elements 2114 arranged in a two-dimensional array (a 28 by 1 array).
  • Barrier elements 2114 have widths (along the x-dimension) similar to the widths of barrier elements 2004 in FIG. 20 , but have lengths that extend along the entire vertical length (y-dimension) of barrier element array 2112 . As shown in FIG. 21 , barrier element array 2112 includes parallel non-blocking strips 2006 a - 2006 g alternated with parallel blocking strips 2008 a - 2008 g .
  • parallel non-blocking strips 2006 a - 2006 g and parallel blocking strips 2008 a - 2008 g each have a width (along the x-dimension) of two barrier elements 2114 , and have lengths that extend along the entire y-dimension (one barrier element 2114 ) of barrier element array 2112 .
  • In step 1806 , the light is filtered at the array of barrier elements to form the three-dimensional view in a viewing space.
  • Barrier element array 1610 of parallax barrier 1620 is configured to filter light 1652 received from pixel array 1608 ( FIG. 16 ) or light 1738 received from light element array 1736 ( FIG. 17 ) according to whether barrier element array 1610 is transparent or non-blocking (e.g., in a two-dimensional mode) or includes parallel non-blocking strips (e.g., in a three-dimensional mode).
  • If one or more regions of barrier element array 1610 are transparent, those regions of barrier element array 1610 function as “all pass” filters to substantially pass all of light 1652 as filtered light 1672 to deliver one or more corresponding two-dimensional images generated by pixel array 1608 to viewing space 1670 , to be viewable as two-dimensional images in a similar fashion as a conventional display. If barrier element array 1610 includes one or more regions having parallel non-blocking strips (e.g., as shown for barrier element array 2002 in FIGS. 20 and 21 ), those regions of barrier element array 1610 pass a portion of light 1652 as filtered light 1672 to deliver one or more corresponding three-dimensional images to viewing space 1670 .
  • pixel array 1902 includes a plurality of pixels 1914 a - 1914 d and 1916 a - 1916 d. Pixels 1914 alternate with pixels 1916 , such that pixels 1914 a - 1914 d and 1916 a - 1916 d are arranged in series in the order of pixels 1914 a, 1916 a, 1914 b, 1916 b, 1914 c, 1916 c, 1914 d, and 1916 d. Further pixels may be included in pixel array 1902 that are not visible in FIG. 19 , including further pixels along the width dimension of pixel array 1902 (e.g., in the left-right directions) as well as pixels along a length dimension of pixel array 1902 (not visible in FIG. 19 ).
  • Each of pixels 1914 a - 1914 d and 1916 a - 1916 d generates light, which emanates from display surface 1924 of pixel array 1902 (e.g., generally upward in FIG. 19 ) towards barrier element array 1904 .
  • Some example indications of light emanating from pixels 1914 a - 1914 d and 1916 a - 1916 d are shown in FIG. 19 (as dotted lines), including light 1924 a and light 1918 a emanating from pixel 1914 a, light 1924 b, light 1918 b, and light 1924 c emanating from pixel 1914 b, etc.
  • light emanating from pixel array 1902 is filtered by barrier element array 1904 to form a plurality of images in a viewing space 1926 , including a first image 1906 a at a first location 1908 a and a second image 1906 b at a second location 1908 b.
  • a portion of the light emanating from pixel array 1902 is blocked by blocking barrier elements 1910
  • another portion of the light emanating from pixel array 1902 passes through non-blocking barrier elements 1912 , according to the filtering by barrier element array 1904 .
  • light 1924 a from pixel 1914 a is blocked by blocking barrier element 1910 a
  • light 1924 b and light 1924 c from pixel 1914 b are blocked by blocking barrier elements 1910 b and 1910 c, respectively.
  • light 1918 a from pixel 1914 a is passed by non-blocking barrier element 1912 a
  • light 1918 b from pixel 1914 b is passed by non-blocking barrier element 1912 b.
  • system 1900 shown in FIG. 19 is configured to form first and second images 1906 a and 1906 b at locations 1908 a and 1908 b, respectively, which are positioned at a distance 1928 from pixel array 1902 (as shown in FIG. 19 , further instances of first and second images 1906 a and 1906 b may be formed in viewing space 1926 according to system 1900 , in a repeating, alternating fashion).
  • pixel array 1902 includes a first set of pixels 1914 a - 1914 d and a second set of pixels 1916 a - 1916 d.
  • Pixels 1914 a - 1914 d correspond to first image 1906 a and pixels 1916 a - 1916 d correspond to second image 1906 b. Due to the spacing of pixels 1914 a - 1914 d and 1916 a - 1916 d in pixel array 1902 , and the geometry of non-blocking barrier elements 1912 in barrier element array 1904 , first and second images 1906 a and 1906 b are formed at locations 1908 a and 1908 b, respectively. As shown in FIG. 19 , light 1918 a - 1918 d from the first set of pixels 1914 a - 1914 d is focused at location 1908 a to form first image 1906 a at location 1908 a. Light 1920 a - 1920 d from the second set of pixels 1916 a - 1916 d is focused at location 1908 b to form second image 1906 b at location 1908 b.
  • FIG. 19 shows a slit spacing 1922 (center-to-center) of non-blocking barrier elements 1912 in barrier element array 1904 .
  • Spacing 1922 may be determined to select locations for parallel non-blocking slits to be formed in barrier element array 1904 for a particular image distance 1928 at which images are desired to be formed (for viewing by users). For example, in an embodiment, if a spacing of pixels 1914 a - 1914 d corresponding to an image is known, and a distance 1928 at which the image is desired to be displayed is known, the spacing 1922 between adjacent parallel non-blocking slits in barrier element array 1904 may be selected.
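  • Although the patent text does not give formulas, the standard textbook parallax-barrier geometry illustrates how these quantities constrain each other; the sketch below computes a barrier gap and slit pitch from an assumed pixel pitch, interocular distance, and viewing distance.

```python
# Worked example of standard parallax-barrier geometry, offered only as an
# illustration of how slit spacing 1922 and image distance 1928 constrain
# each other; these relations are textbook optics, not formulas stated in
# the patent. Assumes the barrier sits a gap g in front of the pixel plane
# and the viewer is a distance D in front of the barrier.
def barrier_geometry(pixel_pitch_mm: float, eye_sep_mm: float,
                     view_dist_mm: float):
    """Return (barrier gap g, slit pitch B) for a two-view barrier.

    Similar triangles about one slit give      g = P * D / E,
    and eye/slit/pixel collinearity gives      B = 2P * D / (D + g),
    where P is the pixel pitch and E the interocular distance.
    """
    g = pixel_pitch_mm * view_dist_mm / eye_sep_mm
    b = 2 * pixel_pitch_mm * view_dist_mm / (view_dist_mm + g)
    return g, b

g, b = barrier_geometry(pixel_pitch_mm=0.1, eye_sep_mm=65.0,
                        view_dist_mm=500.0)
print(f"barrier gap {g:.3f} mm, slit pitch {b:.4f} mm")
# barrier gap 0.769 mm, slit pitch 0.1997 mm (slightly under two pixel pitches)
```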
  • First and second images 1906 a and 1906 b are configured to be perceived by a user as a three-dimensional image or view.
  • a viewer may receive first image 1906 a at a first eye location and second image 1906 b at a second eye location, according to an exemplary embodiment.
  • First and second images 1906 a and 1906 b may be generated by first set of pixels 1914 a - 1914 d and second set of pixels 1916 a - 1916 d as images having slightly different perspectives from each other.
  • Images 1906 a and 1906 b are combined in the visual center of the brain of the viewer to be perceived as a three-dimensional image or view.
  • first and second images 1906 a and 1906 b may be formed by display system 1900 such that their centers are spaced apart by approximately the width between a user's pupils (e.g., an “interocular distance”).
  • In FIGS. 20 and 21 , the entire regions of parallax barriers 2000 and 2100 are filled with parallel non-blocking strips (as shown for barrier element array 2002 ), configuring them to deliver three-dimensional images to viewing space 1670 .
  • one or more regions of a parallax barrier may be filled with parallel non-blocking strips to deliver three-dimensional images, and one or more other regions of the parallax barrier may be transparent to deliver two-dimensional images.
  • different regions of a parallax barrier that have parallel non-blocking strips may have the parallel non-blocking strips oriented at different angles to deliver three-dimensional images to viewers that are oriented differently.
  • FIG. 22 shows a view of a parallax barrier 2200 configured to enable the simultaneous display of two-dimensional and three-dimensional images at different regions, according to exemplary embodiments.
  • Parallax barrier 2200 is similar to parallax barrier 2000 of FIG. 20 , having barrier element array 2002 including a plurality of barrier elements 2004 arranged in a two-dimensional array.
  • a first region 2202 of barrier element array 2002 includes a plurality of parallel non-blocking strips alternated with parallel blocking strips that together fill first region 2202 .
  • a second region 2204 of barrier element array 2002 is surrounded by first region 2202 .
  • Second region 2204 is a rectangular shaped region of barrier element array 2002 that includes a two-dimensional array of barrier elements 2004 that are non-blocking.
  • barrier element array 2002 is configured to enable a three-dimensional image to be generated by pixels of a pixel array that are adjacent to barrier elements of first region 2202 , and to enable a two-dimensional image to be generated by pixels of the pixel array that are adjacent to barrier elements inside of second region 2204 .
  • In an alternative configuration, first region 2202 may include all non-blocking barrier elements 2004 to pass a two-dimensional image, and second region 2204 may include parallel non-blocking strips alternated with parallel blocking strips to pass a three-dimensional image.
  • parallax barrier 2200 may have additional numbers, sizes, and arrangements of regions configured to pass different combinations of two-dimensional images and three-dimensional images.
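  • The mixed-region configuration of FIG. 22 can be pictured as a strip mask with a rectangular all-transparent window forced into it; the sketch below (illustrative only, inlining the same strip logic as the earlier sketch) builds such a mask.

```python
# Minimal sketch of the mixed-region barrier configuration of FIG. 22:
# a vertical strip pattern fills the whole array (3D region), and then a
# rectangular window is forced fully transparent to pass a 2D image.
# All names and parameters are illustrative assumptions.
def make_mixed_mask(rows, cols, slit_width, block_width, window):
    """window = (r0, r1, c0, c1): rectangle made all-transparent (2D)."""
    period = slit_width + block_width
    mask = [[(c % period) < slit_width for c in range(cols)]
            for r in range(rows)]
    r0, r1, c0, c1 = window
    for r in range(r0, r1):
        for c in range(c0, c1):
            mask[r][c] = True          # all-pass region -> 2D display
    return mask

mask = make_mixed_mask(rows=20, cols=28, slit_width=2, block_width=2,
                       window=(5, 15, 8, 20))
```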
  • FIG. 23 shows a view of a parallax barrier 2300 with transparent slits having different orientations, according to an exemplary embodiment.
  • Parallax barrier 2300 is similar to parallax barrier 2000 of FIG. 20 , having barrier element array 2002 including a plurality of barrier elements 2004 arranged in a two-dimensional array.
  • a first region 2310 (e.g., a bottom half) of barrier element array 2002 includes a first plurality of parallel strips of barrier elements 2004 that are selected to be non-blocking to form a first plurality of parallel non-blocking strips 2302 a - 2302 e (each having a width of two barrier elements 2004 ).
  • parallel non-blocking strips 2302 a - 2302 e are alternated with parallel blocking strips 2304 a - 2304 f of barrier elements 2004 (each having a width of three barrier elements 2004 ).
  • Parallel non-blocking strips 2302 a - 2302 e are oriented in a first direction (e.g., along a vertical axis).
  • a second region 2312 (e.g., a top half) of barrier element array 2002 includes a second plurality of parallel strips of barrier elements 2004 that are selected to be non-blocking to form a second plurality of parallel non-blocking strips 2306 a - 2306 d (each having a width of one barrier element 2004 ).
  • parallel non-blocking strips 2306 a - 2306 d are alternated with parallel blocking strips 2308 a - 2308 c of barrier elements 2004 (each having a width of two barrier elements 2004 ).
  • Parallel non-blocking strips 2306 a - 2306 d are oriented in a second direction (e.g., along a horizontal axis).
  • first and second pluralities of parallel non-blocking strips 2302 a - 2302 e and 2306 a - 2306 d are present in barrier element array 2002 that are oriented perpendicularly to each other.
  • the region of barrier element array 2002 that includes first plurality of parallel non-blocking strips 2302 a - 2302 e may be configured to deliver a three-dimensional image in a viewing space (as described above) to be viewable by a user whose body is oriented vertically (e.g., sitting upright or standing up).
  • the region of barrier element array 2002 that includes second plurality of parallel non-blocking strips 2306 a - 2306 d may be configured to deliver a three-dimensional image in a viewing space (as described above) to be viewable by a user whose body is oriented horizontally (e.g., lying down). In this manner, users who are oriented differently relative to each other can still each be provided with a corresponding three-dimensional image that accommodates their position.
  • barrier element array 1610 may be placed in a third configuration to deliver a two-dimensional view.
  • barrier array driver circuit 1606 may generate drive signal 1616 to configure each barrier element of barrier element array 1610 to be in the non-blocking state (transparent). If barrier element array 1610 is non-blocking, barrier element array 1610 functions as an “all pass” filter to substantially pass all of light 1652 ( FIG. 16 ) or light 1738 ( FIG. 17 ) as filtered light 1672 to deliver the two-dimensional image to viewing space 1670 , to be viewable as a two-dimensional image in a similar fashion as a conventional display.
  • display systems may be configured to generate multiple two-dimensional images or views for viewing by users in a viewing space.
  • FIG. 24 shows a display system 2400 configured to deliver two two-dimensional images, according to an embodiment.
  • Display system 2400 is configured similarly to display system 1900 of FIG. 19 .
  • display system 2400 includes pixel array 1902 and barrier element array 1904 , which generate first and second images 2402 a and 2402 b.
  • a first viewer 2404 a receives first image 2402 a at a first location and a second viewer 2404 b receives second image 2402 b at a second location, according to an exemplary embodiment.
  • first and second images 2402 a and 2402 b may be generated by first set of pixels 1914 a - 1914 d and second set of pixels 1916 a - 1916 d of pixel array 1902 .
  • first and second images 2402 a and 2402 b are each a two-dimensional image that may be viewed independently from each other.
  • image 2402 a and image 2402 b may be generated by display system 1900 from first media content and second media content, respectively, that are independent of each other.
  • Image 2402 a may be received by both eyes of first viewer 2404 a to be perceived by first viewer 2404 a as a first two-dimensional image
  • image 2402 b may be received by both eyes of second viewer 2404 b to be perceived by second viewer 2404 b as a second two-dimensional image.
  • first and second images 2402 a and 2402 b may be generated to have a spacing that enables them to be separately viewed by first and second viewers 2404 a and 2404 b .
  • display system 2400 of FIG. 24 can be configured to deliver a single three-dimensional view to a viewer (e.g., as shown in FIG. 19 for display system 1900 ), to deliver a pair of two-dimensional views to a pair of viewers (e.g., as shown in FIG. 24 ), or to deliver a pair of three-dimensional views to a pair of viewers (e.g., as described above).
  • Display system 2400 can be configured to switch between delivering views to one and two viewers by turning off or turning on, respectively, the display of media content by pixel array 1902 associated with one of the viewers (e.g., by turning off or on pixels 1916 associated with second image 2402 b ).
  • Display system 2400 can be configured to switch between delivering two-dimensional and three-dimensional views by providing the corresponding media content type at pixel array 1902 . Furthermore, display system 2400 may provide such capabilities when configured similarly to display system 1700 shown in FIG. 17 (e.g., including backlighting 1716 ).
  • display system 1900 may be configured to generate multiple three-dimensional images that include related image content (e.g., each three-dimensional image is a different viewpoint of a common scene), or that each include unrelated image content, for viewing by users in a viewing space.
  • Each of the three-dimensional images may correspond to a pair of images generated by pixels of the pixel array.
  • the barrier element array filters light from the pixel array to form the image pairs in a viewing space to be perceived by users as three-dimensional images.
  • FIG. 25 shows a flowchart 2500 for generating multiple three-dimensional images, according to an exemplary embodiment.
  • Flowchart 2500 is described with respect to FIG. 26 , which shows a cross-sectional view of a display system 2600 .
  • Display system 2600 is an exemplary embodiment of system 1600 shown in FIG. 16 , and is shown for purposes of illustration.
  • system 2600 includes a pixel array 2602 and a barrier element array 2604 .
  • System 2600 may also include display driver circuit 1602 of FIG. 16 , which is not shown in FIG. 26 for ease of illustration. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 2500 .
  • Flowchart 2500 is described as follows.
  • Flowchart 2500 begins with step 2502 .
  • In step 2502 , light is received from an array of pixels that includes a plurality of pairs of sets of pixels.
  • pixel array 2602 includes a first set of pixels 2614 a - 2614 d, a second set of pixels 2616 a - 2616 d, a third set of pixels 2618 a - 2618 d, and a fourth set of pixels 2620 a - 2620 d.
  • Each of pixels 2614 a - 2614 d, 2616 a - 2616 d, 2618 a - 2618 d, 2620 a - 2620 d generates light, which emanates from the surface of pixel array 2602 towards barrier element array 2604 .
  • Each set of pixels generates a corresponding image.
  • First set of pixels 2614 a - 2614 d and third set of pixels 2618 a - 2618 d are configured to generate images that combine to form a first three-dimensional image.
  • Second set of pixels 2616 a - 2616 d and fourth set of pixels 2620 a - 2620 d are configured to generate images that combine to form a second three-dimensional image.
  • Pixels of the four sets of pixels are alternated in pixel array 2602 in the order of pixel 2614 a, pixel 2616 a, pixel 2618 a, pixel 2620 a, pixel 2614 b, pixel 2616 b, etc. Further pixels may be included in each set of pixels in pixel array 2602 that are not visible in FIG. 26 , including hundreds, thousands, or millions of pixels in each set of pixels.
  • pixel array 2602 is segmented into a plurality of pairs of sets of pixels. For instance, in the example of FIG. 26 , pixel array 2602 is segmented into four sets of pixels.
  • the first set of pixels includes pixels 2614 a - 2614 d and the other pixels in the same columns,
  • the second set of pixels includes pixels 2616 a - 2616 d and the other pixels in the same columns, and so on for the third and fourth sets of pixels.
  • barrier element array 2604 includes barrier elements that are each either non-blocking or blocking. Barrier elements that are blocking are indicated as barrier elements 2610 a - 2610 f, and barrier elements that are non-blocking are indicated as barrier elements 2612 a - 2612 e. Further barrier elements may be included in barrier element array 2604 that are not visible in FIG. 26 , including hundreds, thousands, or millions of barrier elements, etc. Each of barrier elements 2610 a - 2610 f and 2612 a - 2612 e may include one or more barrier elements. Barrier elements 2610 alternate with barrier elements 2612 . In this manner, blocking barrier elements 2610 are alternated with non-blocking barrier elements 2612 to form a plurality of parallel non-blocking slits in barrier element array 2604 .
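  • The column ordering just described (set 1, set 2, set 3, set 4, repeating) amounts to interleaving four source views; the sketch below (hypothetical names, offered for illustration) interleaves two stereo pairs so that sets 1 and 3 carry the first three-dimensional image and sets 2 and 4 carry the second.

```python
# Minimal sketch of the column interleaving described for pixel array 2602:
# four source views interleaved column-by-column in the repeating order
# set 1, set 2, set 3, set 4. Views are 2D lists of pixel values; all names
# are illustrative assumptions, not from the patent.
def interleave_views(views):
    """views: list of equally sized 2D arrays; returns one wide 2D array."""
    rows, cols, n = len(views[0]), len(views[0][0]), len(views)
    out = [[None] * (cols * n) for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for i, view in enumerate(views):
                out[r][c * n + i] = view[r][c]
    return out

# Two stereo pairs: (left1, right1) -> first 3D image, (left2, right2) ->
# second. Passing them in the order [left1, left2, right1, right2] makes
# sets 1 and 3 the first pair and sets 2 and 4 the second, as in FIG. 26.
left1 = [[("L1", r, c) for c in range(4)] for r in range(2)]
right1 = [[("R1", r, c) for c in range(4)] for r in range(2)]
left2 = [[("L2", r, c) for c in range(4)] for r in range(2)]
right2 = [[("R2", r, c) for c in range(4)] for r in range(2)]
panel = interleave_views([left1, left2, right1, right2])
```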
  • In step 2506 , the light is filtered at the barrier element array to form a plurality of pairs of images in a viewing space corresponding to the plurality of pairs of sets of pixels, each pair of images of the plurality of pairs of images being configured to be perceived as a corresponding three-dimensional image of a plurality of three-dimensional images.
  • light emanating from pixel array 2602 is filtered by barrier element array 2604 to form a plurality of images in a viewing space 2626 . For instance, four images are formed in viewing space 2626 , including first-fourth images 2606 a - 2606 d.
  • Pixels 2614 a - 2614 d correspond to first image 2606 a
  • pixels 2616 a - 2616 d correspond to second image 2606 b
  • pixels 2618 a - 2618 d correspond to third image 2606 c
  • pixels 2620 a - 2620 d correspond to fourth image 2606 d.
  • light 2622 a - 2622 d from the first set of pixels 2614 a - 2614 d forms first image 2606 a , and light 2624 a - 2624 d from the third set of pixels 2618 a - 2618 d forms third image 2606 c , due to the filtering of the non-blocking slits (corresponding to non-blocking barrier elements 2612 a - 2612 e ).
  • Similarly, light from the second set of pixels 2616 a - 2616 d forms second image 2606 b , and light from the fourth set of pixels 2620 a - 2620 d forms fourth image 2606 d .
  • any pair of images of images 2606 a - 2606 d may be configured to be perceived as a three-dimensional image by a user in viewing space 2626 .
  • first and third images 2606 a and 2606 c may be configured to be perceived by a user as a first three-dimensional image, such that first image 2606 a is received at a first eye location and third image 2606 c is received at a second eye location of a user.
  • second and fourth images 2606 b and 2606 d may be configured to be perceived by a user as a second three-dimensional image, such that second image 2606 b is received at a first eye location and fourth image 2606 d is received at a second eye location of a user.
  • each three-dimensional image is generated by filtering light (using a barrier element array) corresponding to an image pair generated by a corresponding pair of sets of pixels of the pixel array, in a similar fashion as described with respect to FIG. 26 for two three-dimensional images.
  • pixel array 2602 may include fifth and sixth sets of pixels that generate fifth and sixth images, respectively, to be perceived by a user as a third three-dimensional image.
  • pixel array 2602 may include seventh and eighth sets of pixels that generate seventh and eighth images, respectively, to be perceived by a user as the fourth three-dimensional image.
  • the first and second three-dimensional images generated based on first and third images 2606 a and 2606 c and second and fourth images 2606 b and 2606 d, respectively, and any further three-dimensional images that may be generated may include related image content or may each include unrelated image content.
  • the first and second three-dimensional images (and any further three-dimensional images) may have been captured as different viewpoints of a common scene.
  • a user in viewing space 2626 that moves laterally to sequentially view the first and second three-dimensional images (and any further three-dimensional images) may perceive being able to partially or fully “view behind” objects of the common scene.
  • display devices 1400 and 1500 of FIGS. 14 and 15 may include one or more lenticular lenses as light manipulators 1404 and 1504 used to deliver three-dimensional images and/or two-dimensional images.
  • display systems 1600 and 1700 of FIGS. 16 and 17 may each include a sub-lens array of a lenticular lens in place of parallax barrier 1620 .
  • FIG. 27 shows a perspective view of a lenticular lens 2700 in accordance with an embodiment. As shown in FIG. 27 , lenticular lens 2700 includes a sub-lens array 2702 .
  • Sub-lens array 2702 includes a plurality of sub-lenses 2704 arranged in a two-dimensional array (e.g., arranged side-by-side in a row). Each sub-lens 2704 is shown in FIG. 27 as generally cylindrical in shape and having a substantially semi-circular cross-section, but in other embodiments may have other shapes. In FIG. 27 , sub-lens array 2702 is shown to include eight sub-lenses for illustrative purposes and is not intended to be limiting. For instance, sub-lens array 2702 may include any number (e.g., hundreds, thousands, etc.) of sub-lenses 2704 .
  • FIG. 28 shows a side view of lenticular lens 2700 , oriented as lenticular lens 2700 may be positioned in display system 1900 of FIG. 19 (in place of parallax barrier 1904 ) to deliver three-dimensional views.
  • light may be passed through lenticular lens 2700 in the direction of dotted arrow 2802 to be diverted.
  • lenticular lens 2700 may be fixed in size.
  • light manipulator 1404 of FIG. 14 may include lenticular lens 2700 when fixed in size.
  • lenticular lens 2700 may be adaptable.
  • light manipulator 1504 of FIG. 15 may include lenticular lens 2700 when adaptable.
  • lenticular lens 2700 may be made from an elastic material. Such a lenticular lens 2700 may be adapted in size in response to generated drive signals.
  • Display devices 1400 and 1500 may include multiple layers of light manipulators in embodiments. Multiple three-dimensional images may be displayed in a viewing space using multiple light manipulator layers, according to embodiments.
  • the multiple light manipulating layers may enable spatial separation of the images.
  • a display device that includes multiple light manipulator layers may be configured to display a first three-dimensional image in a first region of a viewing space (e.g., a left-side area), a second three-dimensional image in a second region of the viewing space (e.g., a central area), a third three-dimensional image in a third region of the viewing space (e.g., a right-side area), etc.
  • a display device may be configured to display any number of spatially separated three-dimensional images, as desired for a particular application (e.g., according to a number and spacing of viewers in the viewing space, etc.).
  • FIG. 29 shows a flowchart 2900 for generating multiple three-dimensional images using multiple light manipulator layers, according to an exemplary embodiment.
  • Flowchart 2900 is described with respect to FIG. 30 , which shows a cross-sectional view of a display system 3000 that includes multiple light manipulator layers, according to an exemplary embodiment.
  • system 3000 includes a display driver circuit 3002 , an image generator 1612 , a first light manipulator 3014 a, and a second light manipulator 3014 b.
  • image generator 1612 includes pixel array 1608 , first light manipulator 3014 a includes first light manipulator elements 3016 a , and second light manipulator 3014 b includes second light manipulator elements 3016 b .
  • display driver circuit 3002 includes a pixel array driver circuit 3004 and a light manipulator driver circuit 3006 .
  • In step 2902 , light is received from an array of pixels that includes a plurality of pairs of sets of pixels.
  • light 1652 is received at first light manipulator 3014 a from pixel array 1608 of image generator 1612 .
  • Pixel array driver circuit 3004 may generate driver signals based on content signal 1624 received by display driver circuit 3002 , and the driver signals may be received by pixel array 1608 to generate light 1652 .
  • Each pixel of pixel array 1608 may generate light that is received at first light manipulator 3014 a.
  • pixel array driver circuit 3004 may generate drive signal 1614 to cause pixel array 1608 to emit light 1652 containing a plurality of images corresponding to the sets of pixels.
  • first light manipulator 3014 a may be configured to manipulate light 1652 received from pixel array 1608 .
  • first light manipulator 3014 a includes light manipulator elements 3016 a configured to perform manipulating (e.g., filtering, diverting, etc.) of light 1652 to generate manipulated light 1672 .
  • Light manipulator elements 3016 a may optionally be configurable to adjust the manipulating performed by first light manipulator 3014 a.
  • First light manipulator 3014 a may perform filtering in a similar manner as a parallax barrier described above or in other manner.
  • first light manipulator 3014 a may include a lenticular lens that diverts light 1652 to perform light manipulating, generating manipulated light 1672 .
  • light manipulator driver circuit 3006 may generate drive signal 1616 a based on control signal 1622 received by display driver circuit 3002 to cause light manipulator elements 3016 a to manipulate light 1652 as desired.
  • In a subsequent step, the light manipulated by the first light manipulator is manipulated with a second light manipulator to form a plurality of pairs of images corresponding to the plurality of pairs of sets of pixels in a viewing space.
  • manipulated light 1672 is received by second light manipulator 3014 b to generate manipulated light 3008 that includes a plurality of three-dimensional images 3010 a - 3010 n formed in viewing space 1670 .
  • second light manipulator 3014 b includes light manipulator elements 3016 b configured to perform manipulating of manipulated light 1672 to generate manipulated light 3008 .
  • Light manipulator elements 3016 b may optionally be configurable to adjust the manipulating performed by second light manipulator 3014 b.
  • light manipulator driver circuit 3006 may generate drive signal 1616 b based on control signal 1622 to cause light manipulator elements 3016 b to manipulate manipulated light 1672 to generate manipulated light 3008 including three-dimensional images 3010 a - 3010 n as desired.
  • second light manipulator 3014 b may include a parallax barrier or a lenticular lens configured to manipulate manipulated light 1672 to generate manipulated light 3008 .
  • display system 3000 has a single viewing plane or surface (e.g., a plane or surface of pixel array 1608 , first light manipulator 3014 a, second light manipulator 3014 b ) that supports multiple viewers with media content in the form of three-dimensional images or views.
  • the single viewing plane of display system 3000 may provide a first three-dimensional view based on first three-dimensional media content to a first viewer, a second three-dimensional view based on second three-dimensional media content to a second viewer, and optionally further three-dimensional views based on further three-dimensional media content to further viewers.
  • First and second light manipulators 3014 a and 3014 b each cause three-dimensional media content to be presented to a corresponding viewer via a corresponding area of the single viewing plane, with each viewer being enabled to view corresponding media content without viewing media content directed to other viewers. Furthermore, the areas of the single viewing plane that provide the various three-dimensional views of media content overlap each other at least in part. In the embodiment of FIG. 30 , the areas may be the same area—an area of a display screen or surface of display system 3000 . As such, multiple three-dimensional views that are each viewable by a corresponding viewer may be delivered by a single display viewing plane.
  • Display system 3000 may be configured in various ways to generate multiple three-dimensional images according to flowchart 2900 , in embodiments. Furthermore, as described below, embodiments of display system 3000 may be configured to generate two-dimensional views, as well as any combination of one or more two-dimensional views simultaneously with one or more three-dimensional views.
  • FIG. 31 shows a cross-sectional view of a display system 3100 , according to an exemplary embodiment.
  • Display system 3100 is an example of system 3000 shown in FIG. 30 .
  • system 3100 includes a pixel array 3102 , a first barrier element array 3104 , and a second barrier element array 3106 .
  • System 3100 may also include display driver circuit 3002 of FIG. 30 , which is not shown in FIG. 31 for ease of illustration.
  • System 3100 is described as follows.
  • pixel array 3102 includes a first set of pixels 3114 a - 3114 c, a second set of pixels 3116 a - 3116 c, a third set of pixels 3118 a - 3118 c, and a fourth set of pixels 3120 a - 3120 c. Pixels of the four sets of pixels are alternated in pixel array 3102 in the order of pixel 3114 a, pixel 3116 a, pixel 3118 a, pixel 3120 a, pixel 3114 b, pixel 3116 b, etc. Further pixels may be included in each set of pixels in pixel array 3102 that are not visible in FIG. 31 , including hundreds, thousands, or millions of pixels in each set of pixels.
  • Each of pixels 3114 a - 3114 c, 3116 a - 3116 c, 3118 a - 3118 c, and 3120 a - 3120 c is configured to generate light, which emanates from the surface of pixel array 3102 towards first barrier element array 3104 .
  • Each set of pixels is configured to generate a corresponding image.
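  • The four-way alternation described above amounts to interleaving four source images column by column, as in the following illustrative sketch (assumed indexing, not recited herein):

        // Illustrative four-view column interleave for a pixel array such as 3102.
        // views[0..3] are the four source images, each given as an array of columns.
        function interleaveColumns(views: string[][]): string[] {
          const out: string[] = [];
          const columns = Math.min(...views.map(v => v.length));
          for (let c = 0; c < columns; c++) {
            // Emit one column from each set in turn: 3114, 3116, 3118, 3120, ...
            for (const view of views) out.push(view[c]);
          }
          return out;
        }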
  • FIG. 32 shows display system 3100 , where pixels of pixel array 3102 emit light.
  • Light from second set of pixels 3116 a - 3116 c and first set of pixels 3114 a - 3114 c is configured to generate third and fourth images 3206 c and 3206 d, respectively, which may be perceived together as a second three-dimensional image by a second viewer 2404 b.
  • Light from fourth set of pixels 3120 a - 3120 c and third set of pixels 3118 a - 3118 c is configured to generate first and second images 3206 a and 3206 b, respectively, which may be perceived together as a first three-dimensional image by a first viewer 2404 a.
  • the light emitted by the sets of pixels is filtered by first and second barrier element arrays 3104 and 3106 to generate the first and second three-dimensional images in respective desired regions of a viewing space 3202 adjacent to display system 3100.
  • First-fourth images 3206 a - 3206 d may be formed in viewing space 3202 at a distance from pixel array 3102 and at a lateral location of viewing space 3202 as determined by a configuration of display system 3100 of FIG. 31, including by a width and spacing of non-blocking slits in first barrier element array 3104, by a width and positioning of non-blocking slits in second barrier element array 3106, by a spacing between pixel array 3102 and first barrier element array 3104, and by a spacing between first and second barrier element arrays 3104 and 3106.
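  • For intuition only, the dependence of image position on these widths and spacings follows standard two-view parallax-barrier geometry; the relations below are textbook approximations, not formulas recited in this disclosure:

        // Illustrative two-view parallax-barrier design (assumed textbook model).
        // p: per-view pixel pitch (mm); e: viewer eye separation (mm);
        // d: viewing distance from the barrier (mm).
        function designTwoViewBarrier(p: number, e: number, d: number) {
          const gap = (p * d) / e;                   // pixel-plane-to-barrier spacing
          const slitPitch = (2 * p * d) / (d + gap); // slightly less than two pixel pitches
          return { gap, slitPitch };
        }

        // Example: 0.265 mm pixels, 65 mm eye separation, 600 mm viewing distance
        // yields a gap of about 2.45 mm and a slit pitch of about 0.528 mm.
        console.log(designTwoViewBarrier(0.265, 65, 600));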
  • system 3000 of FIG. 30 may be configured similarly to display system 1700 of FIG. 17 to deliver three-dimensional images and/or two-dimensional images.
  • system 3000 may include backlighting 1716 and pixel array 1722 separated by one or both of first and second light manipulators 3014 a and 3014 b.
  • FIG. 33 shows a block diagram of a display system 3300 , which is an example of display devices 1400 and 1500 shown in FIGS. 14 and 15 , according to an embodiment.
  • Display system 3300 is configured to display multiple three-dimensional images in a viewing space in a spatially separated manner. As shown in FIG. 33, system 3300 includes display driver circuit 3002, backlighting 1716, first light manipulator 3014 a, second light manipulator 3014 b, and pixel array 1722.
  • backlighting 1716 optionally includes light element array 1736
  • first light manipulator 3014 a includes first light manipulator elements 3016 a
  • second light manipulator 3014 b includes second light manipulator elements 3016 b.
  • display driver circuit 3002 receives control signal 1622 and content signal 1624 and includes light source driver circuit 1730 , light manipulator driver circuit 3006 , and pixel array driver circuit 1728 .
  • Light source driver circuit 1730, light manipulator driver circuit 3006, and pixel array driver circuit 1728 may generate drive signals to perform their respective functions based on control signal 1622 and/or content signal 1624. As shown in FIG. 33, first and second light manipulators 3014 a and 3014 b are positioned between backlighting 1716 and pixel array 1722. In another embodiment, pixel array 1722 may instead be located between first and second light manipulators 3014 a and 3014 b.
  • display driver circuit 1602 receives content signal 1624
  • display driver circuit 3002 receives content signal 1624
  • Content signal 1624 is an example of content signals 1408 and 1508 of FIGS. 14 and 15
  • Content signal 1624 includes two-dimensional and/or three-dimensional content for display by the respective display devices/systems.
  • display driver circuits 1602 and 3002 generate respective drive signals (e.g., pixel array drive signals) based on content signal 1624 to enable the content carried by content signal 1624 to be displayed.
  • light manipulators may be reconfigured to change the locations of delivered views based on changing viewer positions.
  • a position of a viewer may be determined/tracked so that a parallax barrier and/or light manipulator may be reconfigured to deliver views consistent with the changing position of the viewer.
  • a spacing, number, arrangement, and/or other characteristic of slits may be adapted according to the changing viewer position.
  • a size of the lenticular lens may be adapted (e.g., stretched, compressed) according to the changing viewer position.
  • a position of a viewer may be determined/tracked by determining a position of the viewer directly, or by determining a position of a device associated with the viewer (e.g., a device worn by the viewer, held by the viewer, sitting in the viewer's lap, in the viewer's pocket, sitting next to the viewer, etc.).
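  • One way to picture such reconfiguration, offered purely as an assumed model rather than anything recited herein, is to re-derive the barrier spacing and slit phase from the tracked position:

        // Illustrative barrier update from a tracked viewer position (assumed model).
        // p: per-view pixel pitch (mm); e: eye separation (mm);
        // viewerX: lateral offset from screen center (mm); viewerZ: distance (mm).
        function updateBarrier(p: number, e: number, viewerX: number, viewerZ: number) {
          const gap = (p * viewerZ) / e; // refresh layer spacing for the new distance
          // Shift the slit pattern so the sweet spot follows the viewer: a slit on the
          // line between a fixed pixel and a laterally moved eye shifts by the fraction
          // gap / (gap + viewerZ) of the viewer's lateral movement.
          const phaseShift = (viewerX * gap) / (gap + viewerZ);
          return { gap, phaseShift };
        }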
  • Examples of display environments for display embodiments described herein include environments having a single viewer, as well as environments having multiple viewers.
  • a single viewer interacts with an electronic device, mobile or stationary, to view and/or interact with mixed 2D and 3D content, such as a mobile or desktop computer, smart phone, television, or other mobile or stationary device.
  • this type of environment may include more than one viewer.
  • multiple viewers are enabled to interact with an electronic device, such as a television set (e.g., high-def, small screen, large screen, etc.), to view and/or interact with mixed 2D and 3D content in the form of television content, movies, video games, etc.
  • FIG. 34 shows a block diagram of a display environment 3400 , according to an exemplary embodiment.
  • first and second viewers 3406 a and 3406 b are present in display environment 3400 , and are enabled to interact with a display device 3402 to be delivered two-dimensional and/or three-dimensional media content.
  • Although two viewers 3406 are shown present in FIG. 34, in other embodiments, other numbers of viewers 3406 may be present in display environment 3400 that may interact with display device 3402 and may be delivered media content by display device 3402.
  • display environment 3400 includes display device 3402 , a first remote control 3404 a, a second remote control 3404 b, a first headset 3412 a, a second headset 3412 b, and viewers 3406 a and 3406 b.
  • Display device 3402 is an example of the display devices described above, and may be configured similarly to any display device described herein, including display device 606 .
  • Viewer 3406 a is delivered a view 3408 a by display device 3402
  • viewer 3406 b is delivered a view 3408 b by display device 3402 .
  • Views 3408 a and 3408 b may each be a two-dimensional view or a three-dimensional view.
  • view 3408 a may be delivered to viewer 3406 a, but not be visible by viewer 3406 b
  • view 3408 b may be delivered to viewer 3406 b, but not be visible by viewer 3406 a.
  • Remote control 3404 a is a device that viewer 3406 a may use to interact with display device 3402
  • remote control 3404 b is a device that viewer 3406 b may use to interact with display device 3402
  • viewer 3406 a may interact with a user interface of remote control 3404 a to generate a display control signal 3414 a
  • viewer 3406 b may interact with a user interface of remote control 3404 b to generate a display control signal 3414 b
  • Display control signals 3414 a and 3414 b may be transmitted to display device 3402 using wireless or wired communication links.
  • Display control signals 3414 a and 3414 b may be configured to select particular content desired to be viewed by viewers 3406 a and 3406 b, respectively.
  • display control signals 3414 a and 3414 b may select particular media content to be viewed (e.g., television channels, video games, DVD (digital video discs) content, video tape content, web content, etc.).
  • Display control signals 3414 a and 3414 b may select whether such media content is desired to be viewed in two-dimensional or three-dimensional form by viewers 3406 a and 3406 b, respectively.
  • Remote controls 3404 a and 3404 b may be television remote control devices, game controllers, smart phones, or other remote control type devices.
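  • A display control signal of this kind might be modeled as a small message; the field names below are hypothetical and used only for illustration:

        // Hypothetical shape of display control signals 3414 a/3414 b.
        interface DisplayControlSignal {
          viewerId: "3406a" | "3406b";
          content: { source: "channel" | "game" | "dvd" | "web"; id: string };
          preferredForm: "2D" | "3D"; // whether the viewer wants 2D or 3D display
        }

        const signal3414a: DisplayControlSignal = {
          viewerId: "3406a",
          content: { source: "channel", id: "7" },
          preferredForm: "3D",
        };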
  • Headsets 3412 a and 3412 b are worn by viewers 3406 a and 3406 b, respectively.
  • Headsets 3412 a and 3412 b each include one or two speakers (e.g., earphones) that enable viewers 3406 a and 3406 b to hear audio associated with the media content of views 3408 a and 3408 b.
  • Headsets 3412 a and 3412 b enable viewers 3406 a and 3406 b to hear audio of their respective media content without hearing audio associated with the media content of the other of viewers 3406 a and 3406 b.
  • Headsets 3412 a and 3412 b may each optionally include a microphone to enable viewers 3406 a and 3406 b to interact with display device 3402 using voice commands.
  • Display device 3402, headset 3412 a, and/or remote control 3404 a may operate to provide position information 3410 a regarding viewer 3406 a to display device 3402
  • display device 3402, headset 3412 b, and/or remote control 3404 b may operate to provide position information 3410 b regarding viewer 3406 b to display device 3402
  • Display device 3402 may use position information 3410 a and 3410 b to reconfigure one or more light manipulators (e.g., parallax barriers and/or lenticular lenses) of display device 3402 to enable views 3408 a and 3408 b to be delivered to viewers 3406 a and 3406 b, respectively, at various locations.
  • display device 3402, headset 3412 a, and/or remote control 3404 a may use positioning techniques to track the position of viewer 3406 a
  • display device 3402, headset 3412 b, and/or remote control 3404 b may use positioning techniques to track the position of viewer 3406 b.
  • Embodiments may be implemented in hardware, software, firmware, or any combination thereof.
  • browser 106, mixed 2D/3D supporting logic 108, API 302, operating system 304, display driver 306, browser 400, user interface 402, rendering engine 404, client application(s) 406, networking module 408, code interpreter 410, web browser 490, OS 432, browser/rendering engine 442, 2D/3Dx UI display 444, networking module 446, UI backend 448, client(s) 450, parser 452, render tree preparation module 454, rendered tree display 456, 2D/3Dx support 458, streaming server application 466, user input interfaces 420, 2D, 3Dx & mixed display driver interface 422, shell operations 424, 2D, 3Dx, mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426, API supporting regional 2D/3Dx, API
  • FIG. 35 shows a block diagram of an example implementation of an electronic device 3500 , according to an embodiment.
  • electronic device 3500 may include one or more of the elements shown in FIG. 35 .
  • electronic device 3500 may include one or more processors (also called central processing units, or CPUs), such as a processor 3504 .
  • Processor 3504 is connected to a communication infrastructure 3502 , such as a communication bus.
  • processor 3504 can simultaneously operate multiple computing threads.
  • Electronic device 3500 also includes a primary or main memory 3506 , such as random access memory (RAM).
  • Main memory 3506 has stored therein control logic 3528 A (computer software), and data.
  • Electronic device 3500 also includes one or more secondary storage devices 3510 .
  • Secondary storage devices 3510 include, for example, a hard disk drive 3512 and/or a removable storage device or drive 3514 , as well as other types of storage devices, such as memory cards and memory sticks.
  • electronic device 3500 may include an industry standard interface, such as a universal serial bus (USB) interface for interfacing with devices such as a memory stick.
  • Removable storage drive 3514 represents a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup, etc.
  • secondary storage devices 3510 may include an operating system 3532 (e.g., OS 304, OS 432, etc.) and a browser 3534 (e.g., browser 106, browser 400, browser 490, etc.).
  • Removable storage drive 3514 interacts with a removable storage unit 3516 .
  • Removable storage unit 3516 includes a computer useable or readable storage medium 3524 having stored therein computer software 3528 B (control logic) and/or data.
  • Removable storage unit 3516 represents a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, or any other computer data storage device.
  • Removable storage drive 3514 reads from and/or writes to removable storage unit 3516 in a well known manner.
  • Electronic device 3500 further includes a communication or network interface 3518 .
  • Communication interface 3518 enables the electronic device 3500 to communicate with remote devices.
  • communication interface 3518 allows electronic device 3500 to communicate over communication networks or mediums 3542 (representing a form of a computer useable or readable medium), such as LANs, WANs, the Internet, etc.
  • Network interface 3518 may interface with remote sites or networks via wired or wireless connections.
  • Control logic 3528 C may be transmitted to and from electronic device 3500 via the communication medium 3542 .
  • Any apparatus or manufacture comprising a computer useable or readable medium having control logic (software) stored therein is referred to herein as a computer program product or program storage device.
  • Devices in which embodiments may be implemented may include storage, such as storage drives, memory devices, and further types of computer-readable media.
  • Examples of such computer-readable storage media include a hard disk, a removable magnetic disk, a removable optical disk, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROM), and the like.
  • The terms “computer program medium” and “computer-readable medium” are used to generally refer to the hard disk associated with a hard disk drive, a removable magnetic disk, a removable optical disk (e.g., CDROMs, DVDs, etc.), zip disks, tapes, magnetic storage devices, MEMS (micro-electromechanical systems) storage, nanotechnology-based storage devices, as well as other media such as flash memory cards, digital video discs, RAM devices, ROM devices, and the like.
  • Such computer-readable storage media may store program modules that include computer program logic for browser 106 , mixed 2D/3D supporting logic 108 , API 302 , operating system 304 , display driver 306 , browser 400 , user interface 402 , rendering engine 404 , client application(s) 406 , networking module 408 , code interpreter 410 , web browser 490 , OS 432 , browser/rendering engine 442 , 2D/3Dx UI display 444 , networking module 446 , UI backend 448 , client(s) 450 , parser 452 , render tree preparation module 454 , rendered tree display 456 , 2D/3Dx support 458 , streaming server application 466 , user input interfaces 420 , 2D, 3Dx & mixed display driver interface 422 , shell operations 424 , 2D, 3Dx, mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426 , API supporting regional 2D/3Dx 428 ,
  • Embodiments of the invention are directed to computer program products comprising such logic (e.g., in the form of program code or software) stored on any computer useable medium (e.g., a computer readable storage medium).
  • Such program code, when executed in one or more processors, causes a device to operate as described herein.
  • the invention can work with software, hardware, and/or browser implementations other than those described herein. Any software, hardware, and browser implementations suitable for performing the functions described herein can be used.
  • electronic device 3500 may be implemented in association with a variety of types of display devices.
  • electronic device 3500 may be one of a variety of types of media devices, such as a stand-alone display (e.g., a television display such as a flat panel display, etc.), a computer, a game console, a set top box, a digital video recorder (DVR), another electronic device mentioned elsewhere herein, etc.
  • Media content that is delivered in two-dimensional or three-dimensional form according to embodiments described herein may be stored locally or received from remote locations. For instance, such media content may be locally stored for playback (e.g., replay TV, DVR) or may be stored in removable memory.
  • FIG. 35 shows a first media content 3530 A that is stored in hard disk drive 3512, a second media content 3530 B that is stored in storage medium 3524 of removable storage unit 3516, and a third media content 3530 C that may be remotely stored and received over communication medium 3542 by communication interface 3518.
  • Media content 3530 may be stored and/or received in these manners and/or in other ways.
  • FIG. 36 shows a block diagram of a display system 3600 that supports mixed 2D, stereoscopic 3D and multi-view 3D displays according to an exemplary embodiment.
  • Display system 3600 is another electronic device embodiment.
  • display system 3600 includes media input interfaces 3602 , host processing circuitry 3604 , user input devices 3606 , display processing circuitry 3608 , adaptable display driver circuitry 3610 , adaptable 2D, 3Dx and mixed display 3612 , and first-third interface circuitry 3614 - 3618 .
  • Host processing circuitry 3604 includes mixed 2D and 3Dx browser 490 (of FIG. 4B ), operating system 432 (of FIG. 4B ), and application programs 3622 .
  • Display processing circuitry 3608 includes 2D, 3Dx, mixed 2D and 3Dx, and mixed 3Dx and 3Dy translation services 3640 .
  • Media input interfaces 3602 includes one or more media input interfaces, wired or wireless, for receiving media, such as those described elsewhere herein.
  • media input interface 3602 may include an interface for receiving media content from a local media player device, such as a DVD player, a memory stick, a computer media player, etc., and may include commercially available (e.g., USB, HDMI, etc.) or proprietary interfaces for receiving local media content.
  • Media input interface 3602 may include an interface for receiving media content from a remote source, such as the Internet, satellite, cable, etc., and may include commercially available (e.g., WLAN, Data Over Cable Service Interface Specification (DOCSIS), etc.) or proprietary interfaces for receiving remote media content.
  • Host processing circuitry 3604 may include one or more integrated circuit chips and/or additional circuitry, which may be configured to execute software/firmware, including operating system 432 , browser 490 , and application programs 3622 .
  • User input devices 3606 includes one or more user input devices that a user may use to interact with display system 3600 . Examples of user input devices are described elsewhere herein, such as a keyboard, a mouse/pointer, etc.
  • Display processing circuitry 3608 may be included in host processing circuitry 3604 , or may be separate from host processing circuitry 3604 as shown in FIG. 36 .
  • display processing circuitry 3608 may include one or more processors (e.g., graphics processors), further circuitry and/or other hardware, software, firmware, or any combination thereof.
  • Display processing circuitry 3608 may be present to perform graphics processing tasks.
  • display processing circuitry 3608 may optionally include 2D, 3Dx, mixed 2D and 3Dx, and mixed 3Dx and 3Dy translation services 3640 to perform 2D/3D related translation services in addition or alternatively to translation services of OS 432 and/or browser 490 .
  • Adaptable display driver circuitry 3610 includes one or more display driver circuits for an adaptable display. Examples of adaptable display driver circuitry 3610 are described above, such as with regard to FIGS. 4B , 16 , 17 , 30 , and 33 .
  • Adaptable 2D, 3Dx and mixed display 3612 includes a display that is adaptable, and is capable of displaying 2D content, 3D content, and a mixture of 2D and/or 3D content. Examples of adaptable 2D, 3Dx and mixed display 3612 are described elsewhere herein.
  • First-third interface circuitry 3614 - 3618 is optional.
  • a communication infrastructure (e.g., a signal bus) 3634 may be present to couple signals of media input interfaces 3602, host processing circuitry 3604, user input devices 3606, display processing circuitry 3608, adaptable display driver circuitry 3610, and display 3612.
  • when display processing circuitry 3608, adaptable display driver circuitry 3610, and/or display 3612 are contained in a common housing/structure with host processing circuitry 3604 (e.g., in a handheld device, etc.), interface circuitry 3614 - 3618 need not be present.
  • when housed separately, interface circuitry 3614 - 3618 may be present to provide an interface.
  • host processing circuitry 3604 may be in a game console, a desktop computer tower, a home audio receiver, a set top box, etc.
  • display processing circuitry 3608, adaptable display driver circuitry 3610, and/or display 3612 may be included in a display device structure. In such case, interface circuitry 3614 - 3618 may be present.
  • first-third circuitry 3614 - 3618 may each include circuitry, such as receivers and/or transmitters (wired or wireless), for enabling communications between the respective one of display processing circuitry 3608 , adaptable display driver circuitry 3610 , and display 3612 , and the other components of system 3600 (e.g., host processing circuitry 3604 , etc.).
  • display system 3600 shown in FIG. 36 is provided for purposes of illustration, and is not intended to be limiting. In further embodiments, display system 3600 may include fewer, additional, and/or alternative features than shown in FIG. 36 .

Abstract

A browser architecture and associated content definition are provided that support display on a display screen of two-dimensional content and three-dimensional content. Web page content is received and parsed. Two-dimensional content to be displayed in a first region of the screen is identified. A first configuration request is communicated to cause a first configuration of the first region of the screen to support the two-dimensional content. Three-dimensional content to be displayed in a second region of the screen is identified. A second configuration request is communicated to cause a second configuration of the second region of the screen to support the three-dimensional content.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 61/291,818, filed on Dec. 31, 2009, which is incorporated by reference herein in its entirety; and
  • This application claims the benefit of U.S. Provisional Application No. 61/303,119, filed on Feb. 10, 2010, which is incorporated by reference herein in its entirety.
  • This application is also related to the following U.S. Patent Applications, each of which also claims the benefit of U.S. Provisional Patent Application Nos. 61/291,818 and 61/303,119 and each of which is incorporated by reference herein:
  • U.S. patent application Ser. No. 12/845,409, titled “Display With Adaptable Parallax Barrier,” filed Jul. 28, 2010;
  • U.S. patent application Ser. No. 12/845,440, titled “Adaptable Parallax Barrier Supporting Mixed 2D And Stereoscopic 3D Display Regions,” filed Jul. 28, 2010;
  • U.S. patent application Ser. No. 12/845,461, titled “Display Supporting Multiple Simultaneous 3D Views,” filed Jul. 28, 2010;
  • U.S. patent application Ser. No. 12/774,307, titled “Display with Elastic Light Manipulator,” filed May 5, 2010;
  • U.S. patent application Ser. No. ______, titled “Backlighting Array Supporting Adaptable Parallax Barrier,” filed on same date herewith;
  • U.S. patent application Ser. No. ______, titled “Operating System Supporting Mixed 2D, Stereoscopic 3D And Multi-View 3D Displays,” filed on same date herewith;
  • U.S. patent application Ser. No. ______, titled “Application Programming Interface Supporting Mixed Two And Three Dimensional Displays,” filed on same date herewith;
  • U.S. patent application Ser. No. ______, titled “Programming Architecture Supporting Mixed Two And Three Dimensional Displays,” filed on same date herewith; and
  • U.S. patent application Ser. No. ______, titled “Integrated Backlighting, Sub-Pixel and Display Driver Circuitry Supporting Adaptive 2D, Stereoscopic 3D and Multi-View 3D Displays,” filed on same date herewith.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to web browsers.
  • 2. Background Art
  • Images may be generated for display in various forms. For instance, television (TV) is a widely used telecommunication medium for transmitting and displaying images in monochromatic (“black and white”) or color form. Conventionally, images are provided in analog form and are displayed by display devices in two-dimensions. More recently, images are being provided in digital form for display in two-dimensions on display devices having improved resolution (e.g., “high definition” or “HD”). Even more recently, images capable of being displayed in three-dimensions are being generated.
  • Conventional displays may use a variety of techniques to achieve three-dimensional image viewing functionality. For example, various types of glasses have been developed that may be worn by users to view three-dimensional images displayed by a conventional display. Examples of such glasses include glasses that utilize color filters or polarized filters. In each case, the lenses of the glasses pass two-dimensional images of differing perspective to the user's left and right eyes. The images are combined in the visual center of the brain of the user to be perceived as a three-dimensional image. In another example, synchronized left eye, right eye LCD (liquid crystal display) shutter glasses may be used with conventional two-dimensional displays to create a three-dimensional viewing illusion. In still another example, LCD display glasses are being used to display three-dimensional images to a user. The lenses of the LCD display glasses include corresponding displays that provide images of differing perspective to the user's eyes, to be perceived by the user as three-dimensional.
  • Some displays are configured for viewing three-dimensional images without the user having to wear special glasses, such as by using techniques of autostereoscopy. For example, a display may include a parallax barrier that has a layer of material with a series of precision slits. The parallax barrier is placed proximal to a display so that a user's eyes each see a different set of pixels to create a sense of depth through parallax. Another type of display for viewing three-dimensional images is one that includes a lenticular lens. A lenticular lens includes an array of magnifying lenses configured so that when viewed from slightly different angles, different images are magnified. Displays are being developed that use lenticular lenses to enable autostereoscopic images to be generated.
  • As such, many types of display devices exist that are capable of displaying three-dimensional images, and further types are being developed. Different types of displays that enable three-dimensional image viewing may have different capabilities and attributes, including having different depth resolutions, being configured for three-dimensional image viewing only, being switchable between two-dimensional image viewing and three-dimensional image viewing, and further capabilities and attributes.
  • Web browsers are applications that enable the retrieving, presenting, and traversing of information resources that are available on the World Wide Web (“the Web”). Web browsers may be included in electronic devices such as desktop computers and handheld devices to enable users to interact with Web-based information resources. Examples of information resources that may be retrieved and presented by a web browser include web pages, images, and videos. Some of these information resources may include two-dimensional or three-dimensional content.
  • BRIEF SUMMARY OF THE INVENTION
  • Methods, systems, and apparatuses are described for a browser that enables display of network-accessible content by display devices that have two-dimensional and three-dimensional display capability, substantially as shown in and/or described herein in connection with at least one of the figures, as set forth more completely in the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
  • The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.
  • FIG. 1 shows a block diagram of a system that includes a web browser that supports mixed 2D (two-dimensional) and 3D (three-dimensional) displays, according to an exemplary embodiment.
  • FIG. 2 shows a block diagram of a web browser that supports mixed 2D and 3D displays interfaced with various display devices, according to an exemplary embodiment.
  • FIG. 3 shows a block diagram of examples of the web browser of FIG. 1 transmitting commands to a display device, according to embodiments.
  • FIG. 4A shows a block diagram of an electronic device that includes a browser architecture that supports mixed 2D and 3D displays, according to an exemplary embodiment.
  • FIG. 4B shows a block diagram of a display system that includes a 2D and 3D display enabled-browser architecture, according to an embodiment.
  • FIG. 5 shows a flowchart providing a process for enabling the display of 2D and 3D content using a web browser, according to an exemplary embodiment.
  • FIG. 6 shows a block diagram of a web browser configuring a display device for display of 2D and 3D content, according to an exemplary embodiment.
  • FIG. 7 shows a flowchart providing a process for using tag information to configure a screen for the display of 2D and 3D content, according to an exemplary embodiment.
  • FIGS. 8, 9, 10A, and 10B show examples of a screen displaying mixed 2D and 3D content in various screen regions, including tabs, frames, and objects, according to embodiments.
  • FIG. 11 shows a block diagram of a rendering engine configured to translate 3D content to 2D content and to translate a first type of 3D content to a second type of 3D content, according to an exemplary embodiment.
  • FIG. 12 shows a flowchart providing a process for determining display screen characteristics, according to an exemplary embodiment.
  • FIG. 13 shows a block diagram of storage that stores browser preferences, according to an exemplary embodiment.
  • FIG. 14 shows a block diagram of a display device having a light manipulator that enables display of 3D content by a screen, according to an exemplary embodiment.
  • FIG. 15 shows a block diagram of a display device having an adaptable light manipulator that enables the adaptable display of 3D content by a screen, according to an exemplary embodiment.
  • FIGS. 16 and 17 show block diagrams of examples of the display device of FIG. 15, according to embodiments.
  • FIG. 18 shows a flowchart for generating three-dimensional images, according to an exemplary embodiment.
  • FIG. 19 shows a cross-sectional view of an example of a display system, according to an embodiment.
  • FIGS. 20 and 21 show views of example parallax barriers with non-blocking slits, according to embodiments.
  • FIG. 22 shows a view of a barrier element array configured to enable the simultaneous display of two-dimensional and three-dimensional images of various sizes and shapes, according to an exemplary embodiment.
  • FIG. 23 shows a view of the parallax barrier of FIG. 22 with differently oriented non-blocking slits, according to an exemplary embodiment.
  • FIG. 24 shows a display system providing two two-dimensional images that are correspondingly viewable by a first viewer and a second viewer, according to an exemplary embodiment.
  • FIG. 25 shows a flowchart for generating multiple three-dimensional images, according to an exemplary embodiment.
  • FIG. 26 shows a cross-sectional view of an example of the display system of FIG. 15, according to an embodiment.
  • FIGS. 27 and 28 show views of a lenticular lens, according to an exemplary embodiment.
  • FIG. 29 shows a flowchart for generating multiple three-dimensional images using multiple light manipulator layers, according to an exemplary embodiment.
  • FIG. 30 shows a block diagram of a display system, according to an exemplary embodiment.
  • FIGS. 31 and 32 show cross-sectional views of a display system, according to an exemplary embodiment.
  • FIG. 33 shows a block diagram of a display system, according to an exemplary embodiment.
  • FIG. 34 shows a block diagram of a display environment, according to an exemplary embodiment.
  • FIG. 35 shows a block diagram of an example electronic device, according to an embodiment.
  • FIG. 36 shows a block diagram of a display system that supports mixed 2D, stereoscopic 3D and multi-view 3D displays, according to an exemplary embodiment.
  • The present invention will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
  • DETAILED DESCRIPTION OF THE INVENTION
  • I. Introduction
  • The present specification discloses one or more embodiments that incorporate the features of the invention. The disclosed embodiment(s) merely exemplify various aspects of the invention. The scope of the invention is not limited to the disclosed embodiment(s). The invention is defined by the claims appended hereto.
  • References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • Furthermore, it should be understood that spatial descriptions (e.g., “above,” “below,” “up,” “left,” “right,” “down,” “top,” “bottom,” “vertical,” “horizontal,” etc.) used herein are for purposes of illustration only, and that practical implementations of the structures described herein can be spatially arranged in any orientation or manner.
  • II. Exemplary Embodiments
  • Embodiments of the present invention relate to web browsers that enable the display of two- and three-dimensional content. For instance, such web browsers may be enabled to display web pages, images, video, content generated by browser scripts and applications, and further types of information resources that include 2D and/or 3D content. In one example, a browser may be capable of processing a markup language document that defines one or more browser windows, frames, or tabs, within which to display web pages, images, and/or video content. The markup language document may include elements (e.g., tags) that specify one or more parameters to be associated with the displayed regions and content. In further examples, the browser may determine parameters to be associated with displayed regions and content based on other factors, such as a type of content to be displayed, a filename for the content, configuration information stored at a media server for the content, etc. The web browsers may generate configuration commands based on the determined parameters that cause display screens to be configured to display the 2D and/or 3D content.
  • Numerous types of display devices may display 2D and 3D content provided by the web browsers. For example, the display devices may include one or more light manipulators, such as parallax barriers and/or lenticular lenses, to deliver 3D media content in the form of images or views to the eyes of the viewers. Other types may include display devices with 3D display pixel constructs that may or may not employ such light manipulators. When used, light manipulators may be fixed or dynamically modified to change the manner in which the views are delivered. For instance, embodiments enable light manipulators that are adaptable to accommodate a changing viewer sweet spot, switching between two-dimensional (2D), stereoscopic three-dimensional (3D), and multi-view 3D views, as well as the simultaneous display of 2D, stereoscopic 3D, and multi-view 3D content. With regard to parallax barriers, example features that may be dynamically modified include one or more of a number of slits in the parallax barriers, the dimensions of each slit, the spacing between the slits, and the orientation of the slits. Slits of the parallax barriers may also be turned on or off in relation to certain regions of the screen such that simultaneous mixed 2D, stereoscopic 3D, and multi-view 3D presentations can be accommodated. Similarly, a lenticular lens may be dynamically modified, such as by modifying a width of the lenticular lens, to modify delivered images.
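  • The dynamically modifiable barrier features listed above could be gathered into a per-region configuration record along the following lines; the record and field names are assumptions for illustration, not part of this disclosure:

        // Hypothetical adaptable parallax-barrier state covering the features above.
        interface BarrierRegionState {
          slitCount: number;
          slitWidthMm: number;
          slitSpacingMm: number;
          orientationDeg: 0 | 90; // e.g., vertical slits vs. slits rotated 90 degrees
          enabled: boolean;       // slits turned off over a region yield 2D there
        }

        // Distinct per-region states permit simultaneous mixed 2D/3D presentation.
        const barrierRegions: Record<string, BarrierRegionState> = {
          topHalf:    { slitCount: 0,    slitWidthMm: 0,    slitSpacingMm: 0,    orientationDeg: 0, enabled: false },
          bottomHalf: { slitCount: 1800, slitWidthMm: 0.18, slitSpacingMm: 0.53, orientationDeg: 0, enabled: true  },
        };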
  • The following subsections describe numerous exemplary embodiments of the present invention. For instance, the next subsection describes embodiments for web browsers, followed by a subsection that describes embodiments for displaying content using a browser, a subsection that describes user input interface and web browser start up embodiments, a subsection that describes example display environments, and a subsection that describes example electronic devices. It is noted that the section/subsection headings are not intended to be limiting. Embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection.
  • It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made to the embodiments described herein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the exemplary embodiments described herein.
  • A. Example Web Browser Embodiments
  • In embodiments, web browsers that provide native support for the display of mixed content are provided. For example, in one embodiment, a web browser comprises a graphical user interface (GUI) in which video content can be displayed in a window of the browser. Furthermore, one or more parameters (e.g., indicated via “tags” or by other configuration information) may be associated with the browser window and/or the displayed content. The parameters can specify various display characteristics, such as one or more of: a type of video content to be displayed within the browser window (e.g., 2D, stereoscopic 3D, or a particular type of multi-view 3D), a desired orientation of the displayed video content, a brightness/contrast to be associated with the browser window, and/or a video resolution to be associated with the browser window. The parameters to be associated with a browser window may be specified programmatically or determined dynamically at run-time. The parameters may also be modified at run-time by a user through a user control interface provided by the web browser. The web browser is further configured to cause one or more function calls to be placed to a graphics API (application programming interface), operating system, or device driver so that a window is opened on the display and the content is presented therein in a manner that is consistent with the associated parameters.
  • For instance, FIG. 1 shows a block diagram of a system 100, according to an exemplary embodiment. As shown in FIG. 1, system 100 includes a display device 102, a document server 104, a web browser 106, and a network 116. System 100 is a system in which web browser 106 interfaces one or more users and network content with display device 102. System 100 is described as follows.
  • System 100 may be implemented in one or more devices. For example, in one embodiment, web browser 106 and display device 102 may be implemented in a common electronic device 112 that may be accessed by a user, such as a mobile computing device (e.g., a handheld computer, a laptop computer, a notebook computer, a tablet computer (e.g., an Apple iPad™), a netbook, etc.), a mobile phone (e.g., a cell phone, a smart phone), a mobile email device, some types of televisions, etc. In another embodiment, as shown in FIG. 1, web browser 106 may be implemented in an electronic device 110 that is separate from display device 102. For instance, device 110 may be a home theater system receiver, a set-top box, a computer, a game console, or other such device, and display device 102 may be a display device that is coupled to device 110 in a wired or wireless fashion.
  • Web browser 106, also referred to as an “Internet browser” or “browser,” is an application for retrieving, presenting, and traversing network-based information resources. For instance, web browser 106 may be implemented in software (e.g., computer programs and/or data) that runs on a device. Web browser 106 may load an external information resource identified by a Uniform Resource Locator (URL), such as a web page, an image, a video, or other item of content. Web browser 106 may display the loaded information in a window of the browser. An information resource loaded by web browser 106 may reference further information resources, which may be loaded by web browser 106 for display. An information resource may include hyperlinks that when displayed can be selected by a user to enable the user to navigate to the related information resources.
  • For instance, as shown in FIG. 1, document server 104 may store one or more information resources, such as an information resource 114. Information resource 114 may be an XML document, an HTML document (e.g., a web page), an image file, a video, or other type of information resource. A user of browser 106 may desire to view information resource 114, and may interact with browser 106 to cause information resource 114 to be loaded by browser 106. For instance, the user may enter the URL of information resource 114 into browser 106, or may select a hyperlink in a markup document that links to information resource 114, to cause browser 106 to load information resource 114. In response to the user interaction with browser 106, browser 106 may generate request 118, such as an HTTP (hypertext transfer protocol) request (e.g., if the URL starts with “http:” or “https:”) or other type of request (e.g., FTP (file transfer protocol), etc.), which is transmitted from the device that includes browser 106. Request 118 is directed to a location of information resource 114 according to the URL of information resource 114.
  • Request 118 may be transmitted through a network 116 to be received by document server 104. For instance, network 116 may be any type of communication network, including a local area network (LAN), a wide area network (WAN), or a combination of communication networks, such as the Internet. Document server 104 may be any suitable type of computer system capable of providing documents over a network, such as a server, etc.
  • In response to receiving request 118, document server 104 locates and identifies information resource 114, and transmits information resource 114 to browser 106 through network 116. Browser 106 receives information resource 114, and displays content of information resource 114 in a window in a screen of display device 102.
  • As shown in FIG. 1, web browser 106 includes mixed 2D/3D supporting logic 108. Mixed 2D/3D supporting logic 108 enables web browser 106 to support display of mixed 2D and 3D content, according to an exemplary embodiment. For example, logic 108 may enable web browser 106 to provide two- and three-dimensional content for display by display devices that are capable of separately displaying two-dimensional and three-dimensional content, display devices that are capable of simultaneously displaying two-dimensional and three-dimensional content, display devices that are capable of simultaneously displaying different types of three-dimensional content, as well as display devices that can adaptively change the display of two-dimensional and three-dimensional contents (e.g., by changing display screen regions).
  • For example, with regard to information resource 114, 2D/3D supporting logic 108 may be capable of enabling browser 106 to render 2D and 3D content at display device 102 in a manner based on the contents of information resource 114 and/or based on one or more tags (e.g., HTML tags) or other configuration information associated with information resource 114 in a markup document that refers to information resource 114. An HTML document is a type of markup document that includes a tree of HTML elements and other information (e.g., textual information, etc.) according to an HTML language format. Each HTML element can have attributes assigned. In HTML syntax, some elements may be written with associated tags to assign attributes to the elements. An element may be written with a start tag and an end tag, with the content indicated in between the start tag and end tag. A start tag includes the name of the element surrounded by angle brackets, and the corresponding end tag includes a slash character followed by the name of the element, which are both surrounded by angle brackets (not all elements necessarily include an end tag). For instance, a paragraph may be indicated by a “p” element. An example of a p element is shown as follows:
      • <p>Paragraph text is included in here . . . </p>
        Furthermore, one or more attributes may be specified in the start tag. An attribute is defined in a start tag with a name of the attribute, an equal sign, and a value of the attribute (which may or may not be in quotes). For instance, the “abbr” element, which represents an abbreviation, expects a “title” attribute with its expansion. An example of the abbr element is shown as follows:
      • <abbr title=“Hyper Text Markup Language”>HTML</abbr>
  • In embodiments, tags may be included in markup documents to indicate types and characteristics of 2D and 3D content included or referenced therein. For instance, a “3D” element may be defined to indicate particular content as three-dimensional. An example 3D element is shown as follows:
      • <3D>mediafile.mpg</3D>
        where mediafile.mpg is an MPEG video file that is indicated by the 3D tag as containing 3D content. Similarly, a “2D” element may be defined to indicate particular content as two-dimensional. An example 2D element is shown as follows:
      • <2D>imagefile.jpg</2D>
        where imagefile.jpg is a JPG image file that is indicated by the 2D tag as containing 2D content. Further elements/tags may be defined to indicate various types of three-dimensional content, such as “3D-4” to indicate 3D multiview content with four camera views, “3D-8” to indicate 3D multiview content with eight camera views, “3D-HVGA” to indicate three-dimensional HVGA (half-size VGA), etc. It is noted that other characters than “3D” and “2D” may be used to indicate content as three-dimensional or two-dimensional.
  • Furthermore, attributes may be added to a 3D element to indicate the various types of 3D content. For instance, a 3D-4 video file may be indicated by the 3D element as:
      • <3D multiviewtype=“4”>mediafile2.mpg</3D>
        where the start 3D tag includes a “multiviewtype” attribute indicating a type of multiview 3D content. A 3D element may include any number of attributes to indicate one or more 3D display characteristics, such as stereoscopic depth (e.g., a “depth” attribute), brightness (e.g., a “brightness” attribute), a size of a region of a screen in which the 3D content is to be displayed (e.g., a “regionsize” attribute having parameters such as row, column, width, and height parameters), a resolution attribute (e.g., a “resolution” attribute), a window orientation attribute indicating whether image content is to be displayed vertically or rotated by 90 degrees or by another amount (e.g., an “orientation” attribute), a freeform window attribute indicating a non-rectangular shape for the displayed content (e.g., a “freeform” attribute), a display region type attribute (e.g., a frame, a tab, etc.), etc. Any of these and/or additional three-dimensional display characteristics may be defined by separate elements/tags, or as attributes of an element/tag. Furthermore, a single element may be used to determine whether content is two-dimensional or three-dimensional:
      • <content type=“3D”>image2file.jpg</content>
        where the element “content” has a “type” attribute which can have the value of “3D” to indicate 3D content, “2D” to indicate 2D content, and possibly further values (e.g., “3D-4” to indicate 3D-4 multiview content, etc.).
  • As such, configuration information for display of content may be extracted from a web page according to the HTML language format (e.g., tags, attributes, etc.). Such configuration information may be provided in the form of tags as indicated above, or in further ways, as would become apparent to persons skilled in the relevant art(s) from the teachings herein. Furthermore, as described in further detail below, configuration information for content may be determined in other ways, including being determined by the filename of the content (e.g., by file extension), by configuration information stored at the file server that serves the content, and/or in further ways.
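  • Drawing the tag, attribute, and filename conventions above together, a browser might extract per-item display configuration from markup roughly as sketched below; the regular expression, extension mapping, and fallback behavior are illustrative assumptions (attributes such as “multiviewtype” are ignored for brevity):

        // Simplified extraction of 2D/3D configuration from markup (illustrative).
        interface ContentConfig { url: string; type: string } // "2D", "3D", "3D-4", ...

        function extractConfigs(markup: string): ContentConfig[] {
          const configs: ContentConfig[] = [];
          // Matches elements such as <2D>file</2D>, <3D>file</3D>, <3D-4>file</3D-4>.
          const tagRe = /<(2D|3D(?:-\d+)?)\b[^>]*>([^<]+)<\/\1>/g;
          let m: RegExpExecArray | null;
          while ((m = tagRe.exec(markup)) !== null) {
            configs.push({ url: m[2].trim(), type: m[1] });
          }
          return configs;
        }

        // Hypothetical fallback when no tag is present: infer from file extension.
        function typeFromFilename(url: string): string {
          return /\.(mpo|s3d)$/i.test(url) ? "3D" : "2D"; // assumed extension mapping
        }

        const page = '<3D multiviewtype="4">mediafile2.mpg</3D><2D>imagefile.jpg</2D>';
        console.log(extractConfigs(page));
        // -> [ { url: "mediafile2.mpg", type: "3D" }, { url: "imagefile.jpg", type: "2D" } ]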
  • In embodiments, display device 102 may be one of a variety of display devices capable of displaying two-dimensional and/or three-dimensional content. For instance, FIG. 2 shows a block diagram of a display system 200, which is an exemplary embodiment of system 100 of FIG. 1. As shown in FIG. 2, system 200 includes web browser 106, a first display device 202, and a second display device 204. As shown in FIG. 2, web browser 106 includes mixed 2D/3D supporting logic 108. First display device 202 is a display device that is only capable of displaying two-dimensional content, and second display device 204 is a display device that is capable of displaying two-dimensional content and three-dimensional content. In the example of FIG. 2, via mixed 2D/3D supporting logic 108, web browser 106 is capable of displaying content at first and second display devices 202 and 204. In embodiments, web browser 106 may be capable of providing content for display by first and second display devices 202 and 204 one at a time. In another embodiment, web browser 106 may be capable of providing content for display by first and second display devices 202 and 204 and/or other combinations and numbers of display devices simultaneously.
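  • When the same content must be presented on displays of differing capability, a capability check of roughly the following form might decide whether to pass 3D content through, reduce its view count, or fall back to 2D; the names and logic here are assumed for illustration only:

        // Illustrative capability-based fallback (assumed logic).
        interface DisplayCaps { supports3D: boolean; maxViews: number }

        function chooseOutputMode(contentType: string, caps: DisplayCaps): string {
          if (!contentType.startsWith("3D")) return "2D";
          if (!caps.supports3D) return "2D"; // e.g., device 202: show a single view
          const requested = contentType === "3D" ? 2 : parseInt(contentType.slice(3), 10);
          return requested <= caps.maxViews ? contentType : `3D-${caps.maxViews}`;
        }

        console.log(chooseOutputMode("3D-8", { supports3D: true,  maxViews: 4 })); // "3D-4"
        console.log(chooseOutputMode("3D",   { supports3D: false, maxViews: 0 })); // "2D"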
  • Note that web browser 106 may be interfaced with display devices in various ways. For instance, FIG. 3 shows a block diagram of web browser 106 interfacing with display device 102 of FIG. 1, according to embodiments. As shown in FIG. 3, browser 106 can be interfaced with display device 102 through an API (application programming interface) 302 and a display driver 306, through an operating system (OS) 304 and display driver 306, and/or through display driver 306. FIG. 3 is described as follows.
  • API 302 is an interface implemented in software (e.g., computer program code or logic) for applications such as browser 106 that enables the applications to interact with other software and/or hardware. API 302 may be configured to perform graphics operations on graphics information received from the applications. API 302 may be implemented in a same device as browser 106. API 302 may be a special purpose API, or may be a commercially available API, such as Microsoft DirectX® (e.g., Direct3D®), OpenGL®, or other 3D graphics API, which may be modified according to embodiments to receive commands and/or content from browser 106. Further description of implementations of API 302 and other API implementations described herein is provided in pending U.S. patent application Ser. No. ______, titled “Application Programming Interface Supporting Mixed Two And Three Dimensional Displays,” filed on same date herewith, which is incorporated by reference herein in its entirety.
  • OS 304 is configured to interface users and applications with hardware, such as display device 102. OS 304 may be a commercially available or proprietary operating system. For instance, OS 304 may be an operating system such as Microsoft Windows®, Apple Mac OS® X, Google Android™, or Linux®, which may be modified according to embodiments. Further description of implementations of OS 304 and other operating system implementations described herein is provided in pending U.S. patent application Ser. No. ______, titled “Operating System Supporting Mixed 2D, Stereoscopic 3D And Multi-View 3D Displays,” filed on same date herewith, which is incorporated by reference herein in its entirety.
  • Display driver 306 may be implemented in software, and enables applications (e.g., higher-level application code) such as browser 106 to interact with display device 102. Display driver 306 may be implemented in a same device as browser 106. Multiple display drivers 306 may be present, and each display driver 306 is typically display device-specific, although some display drivers 306 may be capable of driving multiple types of display devices. Each type of display device typically is controlled by its own display device-specific commands. In contrast, most applications communicate with display devices according to high-level device-generic commands. Display driver 306 accepts the generic high-level commands (directly from browser 106, or via API 302 and/or OS 304), and breaks them into a series of low-level display device-specific commands, as used by the particular display device.
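  • The translation performed by display driver 306 might look, in spirit, like the sketch below, with strings standing in for device-specific register writes or control-signal steps; the command shape is an assumption, not an interface recited herein:

        // Illustrative translation of a generic command into device-specific steps.
        type GenericCommand = {
          region: { x: number; y: number; w: number; h: number };
          mode: "2D" | "3D2" | "3D4"; // 2D, stereoscopic 3D, or four-view 3D
        };

        function toDeviceCommands(cmd: GenericCommand): string[] {
          // A real driver would emit device-specific register writes; strings
          // stand in for them in this sketch.
          const ops = [`SET_REGION ${cmd.region.x} ${cmd.region.y} ${cmd.region.w} ${cmd.region.h}`];
          if (cmd.mode === "2D") ops.push("BARRIER_OFF");
          else ops.push("BARRIER_ON", `VIEWS ${cmd.mode === "3D2" ? 2 : 4}`);
          return ops;
        }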
  • As shown in FIG. 3, browser 106 may generate a command 308 associated with the display of 2D and/or 3D content that is received by API 302. API 302 passes command 308 to display driver 306 (in a modified or unmodified form). Display driver 306 receives command 308, and generates one or more control signals 314 received by display device 102. Control signal(s) 314 place(s) a screen of display device 102 in a display mode corresponding to command 308. Furthermore, browser 106 may stream content to display device 102 through API 302 and display driver 306 to be displayed in the screen configured according to command 308. Note that API 302 may be included in OS 304 (when present), or may be separate. Furthermore, API 302 may communicate directly with display driver 306 as shown in FIG. 3, or may communicate with display driver 306 through OS 304.
  • Alternatively, browser 106 may generate a command 310 associated with the display of 2D and/or 3D content that is received by OS 304. OS 304 passes command 310 to display driver 306 (in a modified or unmodified form). Display driver 306 receives command 310, and generates control signal(s) 314 received by display device 102. Control signal(s) 314 place(s) a screen of display device 102 in a display mode corresponding to command 310. Furthermore, browser 106 may stream content to display device 102 through OS 304 and display driver 306 to be displayed in the screen configured according to command 310.
  • In another example, browser 106 may generate a command 312 associated with the display of 2D and/or 3D content that is directly received by display driver 306 (e.g., does not pass through API 302 or OS 304). Display driver 306 receives command 312, and generates control signal(s) 314 received by display device 102. Control signal(s) 314 place(s) a screen of display device 102 in a display mode corresponding to command 312. Furthermore, browser 106 may stream content to display device 102 directly through display driver 306 to be displayed by the screen configured according to command 312.
  • As such, commands and content may be provided by browser 106 to display device 102 through one or more different intermediate components, which may include one or more of API 302, OS 304, and display driver 306. Note that in an embodiment, browser 106 may include one or more of API 302 and/or OS 304, or may itself be included in OS 304, in a similar manner as some commercially available operating systems that incorporate a web browser (e.g., Google Chrome OS™).
  • Web browser 106 may be implemented in various ways to interface users and network-based content with display devices that are capable of displaying two-dimensional content and/or three-dimensional content. For instance, FIG. 4A shows a block diagram of an electronic device 412 that includes a browser architecture for a web browser 400, according to an exemplary embodiment. Device 412 may be any of the electronic devices mentioned herein as including a web browser (e.g., electronic devices 110 and 112 of FIG. 1), or may be an alternative device. Browser 400 is configured to interface users and network-based content with display devices that are capable of displaying two-dimensional content and/or three-dimensional content. In an embodiment, browser 400 may be a proprietary web browser. In another embodiment, browser 400 may be a commercially available web browser that is modified to enable users and network-based content to be interfaced with display devices capable of displaying two-dimensional content and/or three-dimensional content. For instance, web browsers such as Internet Explorer®, developed by Microsoft Corp. of Redmond, Wash., Mozilla Firefox®, developed by Mozilla Corp. of Mountain View, Calif., or Google Chrome™, developed by Google Inc. of Mountain View, Calif., may be modified according to embodiments. As shown in FIG. 4A, browser 400 includes various browser portions, including a user interface 402, a rendering engine 404, one or more optional client applications 406, a networking module 408, and a code interpreter 410. These features of browser 400 are described as follows.
  • User interface 402 is configured to display information to enable a person to interact with browser 400. For instance, user interface 402 may provide one or more graphical user interface (GUI) control elements that a user may interact with to use and/or configure browser 400. Such control elements may include an address bar into which a user may enter URLs of desired information resources, a back button, a forward button, a refresh button, a stop button, a home button, one or more additional/alternative buttons, one or more pull down menus (e.g., a list of bookmarks, etc.), etc.
  • Networking module 408 is a communications module that interfaces browser 400 with network-accessible entities, such as document server 104 shown in FIG. 1. For instance, networking module 408 may be configured to generate network calls, such as HTTP requests (e.g., request 118 of FIG. 1) and/or other types of requests. The calls may be transmitted over a network (e.g., network 116 of FIG. 1) to remote entities to retrieve information resources corresponding to a URL in an address bar provided by user interface 402, a URL for an information resource referenced in a markup document loaded by browser 400, or a hyperlink present in content displayed by browser 400.
  • Rendering engine 404 is configured to display requested content in one or more browser windows. For example, rendering engine 404 may request and receive a markup document (also known as a “markup language document”), and may render the content included in or referenced by the markup document for display in a screen of a display device. Rendering engine 404 can render displays of HTML (hypertext markup language) and XML (extensible markup language) documents, as well as image/video content. In the case where the markup document is an XML or HTML document (e.g., a web page), rendering engine 404 may parse the document to generate a DOM (document object model) tree. The DOM is a cross-platform and language-independent convention for representing objects in HTML and XML documents. Rendering engine 404 may generate a render tree from the DOM tree. Rendering engine 404 may perform a layout process to determine screen coordinates for each node of the render tree, and may traverse and “paint” each node of the render tree on the display screen in a browser window.
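  • The following sketch outlines, in simplified TypeScript, the parse/DOM/render-tree/layout/paint pipeline described above. All types and functions are hypothetical placeholders for the corresponding stages of rendering engine 404.

```typescript
// Simplified sketch of the rendering pipeline; parsing is assumed to have
// already produced the DOM tree. All names here are hypothetical.
interface DomNode { tag: string; children: DomNode[]; }
interface RenderNode {
  dom: DomNode;
  rect?: { x: number; y: number; w: number; h: number };
  children: RenderNode[];
}

// Build the render tree from the DOM tree (a real engine would skip
// non-visual nodes such as <head> and apply styling here).
function buildRenderTree(dom: DomNode): RenderNode {
  return { dom, children: dom.children.map(buildRenderTree) };
}

// Layout: assign screen coordinates to every node (placeholder geometry);
// returns the next free vertical position.
function layout(node: RenderNode, x: number, y: number): number {
  node.rect = { x, y, w: 100, h: 20 };
  let childY = y + node.rect.h;
  for (const child of node.children) {
    childY = layout(child, x, childY);
  }
  return childY;
}

// Paint: traverse the laid-out tree and draw each node in the browser window.
function paint(node: RenderNode, draw: (n: RenderNode) => void): void {
  draw(node);
  node.children.forEach((c) => paint(c, draw));
}
```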
  • One or more client applications 406 may optionally be present. Each client application 406 may be interfaced with web browser 400 to add corresponding capabilities to web browser 400, if web browser 400 does not already have such capabilities. For example, a client application 406 may be a presentation component configured to enable web browser 400 to play video, to scan for viruses, to display additional file types, such as PDF (portable document format) files, etc. Examples of client application 406 may include a media player, an Adobe® Flash® plug-in that enables animation, video, and interactivity for web pages, an Apple QuickTime® plug-in that enables various formats of digital video, images, sound, and interactivity for web pages, a Microsoft® Silverlight™ plug-in that enables multimedia, graphics, animation, and interactivity for web pages, etc.
  • Code interpreter 410 (also known as a “script engine”) is configured to interpret and execute script code referenced by markup documents. For example, code interpreter 410 may be configured to interpret and execute JavaScript® code. JavaScript® may be present to provide enhanced user interfaces and dynamic web pages. Code interpreter 410 may interpret JavaScript® source code, and execute the interpreted code. Similarly to code interpreter 410, web browser 400 may include a compiled code execution module that is capable of executing compiled code, such as Java bytecode. For instance, browser 400 may include a virtual machine configured as a Java runtime environment to run Java applets, which may provide interactive features to web pages, including complex graphics.
  • As described above, browser 400 of FIG. 4A is configured to interface users and network-accessible resources with display devices that are capable of displaying two-dimensional content and/or three-dimensional content. Browser 400 may be configured in various ways to perform its functions, and various embodiments for browser 400 are described herein. For instance, FIG. 4B shows a block diagram of a display system 480 that includes a 2D and 3D display enabled-browser architecture, according to an embodiment. As shown in FIG. 4B, display system 480 includes a web browser 490. Browser 490 is an embodiment of browser 400 that is configured to interface users and network-accessible resources with display devices that are capable of displaying two-dimensional content and/or three-dimensional content. The embodiment of browser 490 shown in FIG. 4B is provided for purposes of illustration, and is not intended to be limiting. In further embodiments, browser 490 may include fewer, additional, and/or alternative features than shown in FIG. 4B.
  • Display system 480 is an example of a display system that is capable of displaying mixed 2D and 3D content (e.g., via mixed 2D/3D supporting logic 108). As shown in FIG. 4B, system 480 includes web browser 490, operating system kernel and kernel utilities with regional/3Dx support 432 (“OS 432”), one or more browser page and 2D/3Dx content servers 460 (“server 460”), first-third display circuitry 416 a-416 c, a 2D display 418 a, a 3Dx display with 2D mode 418 b, a regionally configurable 2D/3Dx display 418 c, and a network 478. Web browser 490 includes various browser portions, including a browser/rendering engine 442, a 2D/3Dx UI (user interface) display 444, a networking module 446, a UI backend 448, and one or more 2D/3Dx video and image client(s) 450. Browser/rendering engine 442 includes a parser 452, a render tree preparation module 454, and a render tree display 456. UI backend 448 includes 2D/3Dx support 458. Browser page and 2D/3Dx content server(s) 460 includes page content 462, linked content file or files 464, and a streaming server application 466. Page content 462 includes a hypertext content link 468, a screen region location 470, and an underlying screen configuration 472. Linked content file or files 464 includes a file A and screen configuration A 474, and a file B and screen configuration B 476. OS 432 includes user input interfaces 420, a 2D, 3Dx & mixed display driver interface 422, shell operations 424, 2D, 3Dx, mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426, an API supporting regional 2D/3Dx 428 (“API 428”), and one or more communication interfaces 440. 2D, 3Dx & mixed display driver interface 422 includes 2D only driver variant 434, 3Dx only driver variant 436, and mixed 2D and 3Dx driver variant 438. First-third display circuitry 416 a-416 c each includes a corresponding one of translation services 430 a-430 c. The features of system 480 are described as follows.
  • 2D display 418 a, 3Dx display with 2D mode 418 b, and regionally configurable 2D/3Dx display 418 c are example types of display devices that may display content provided by browser 490. One or more of displays 418 a-418 c may be present. 2D display 418 a is an example of 2D display device 202 of FIG. 2, and is a display device that is only capable of displaying two-dimensional content. 3Dx display with 2D mode 418 b is an example of 2D-3D display device 204 of FIG. 2, and is a display device that is capable of displaying two-dimensional and three-dimensional content. For instance, 3Dx display with 2D mode 418 b may be set in a 2D mode where 3Dx display with 2D mode 418 b can display 2D content in full screen, but not 3D content, and may be set in a 3D mode where 3Dx display with 2D mode 418 b can display 3D content in full screen, but not 2D content. Furthermore, 3Dx display with 2D mode 418 b may be capable of displaying 3D content having multiple camera views (“multiview”), that is, a number of “x” views, such as 3D-4, having four camera views, 3D-16, having sixteen camera views, etc. The additional camera views enable viewers to “view behind” displayed 3D content by moving their heads left-right, as further described elsewhere herein. Regionally configurable 2D/3Dx display 418 c is an example of 2D-3D display device 204 of FIG. 2, and is a display device that is capable of displaying two-dimensional and three-dimensional content simultaneously. For instance, regionally configurable 2D/3Dx display 418 c may display 2D content in one or more regions of a display screen while simultaneously displaying 3D content in one or more other regions of the display screen. Furthermore, regionally configurable 2D/3Dx display 418 c may be capable of displaying 3D content having multiple camera views.
  • Network 478 is an example of network 116 in FIG. 1, and browser page and 2D/3Dx content server 460 is an example of document server 104 in FIG. 1. One or more browser page and 2D/3Dx content servers 460 may be present that are accessible to browser 490 over network 478. Browser page and 2D/3Dx content server 460 may include one or more information resources, such as markup documents (e.g., web pages, etc.), image files, video files, etc. For example, page content 462 is an example of markup document content. Page content 462 may include text, page configuration information, references to other information resources, etc. For instance, as shown in FIG. 4B, page content 462 may include one or more hypertext content links 468, which are links displayed in a page generated from page content 462 and displayed by browser 490. Hypertext content link 468 may be selected by a user to traverse to and display an information resource as a page element. Screen region location 470 may be present to indicate a region in the displayed page in which a page element corresponding to hypertext content link 468 is to be displayed. For example, screen region location 470 may be used by a layout module of browser/rendering engine 442 to select a location for display of the corresponding content in a display screen. Underlying screen configuration 472 may be present to indicate a screen display configuration for the displayed page, including desired 2D and/or 3D display characteristics of the screen. For example, underlying screen configuration 472 may be included in a file that includes page content 462 (e.g., in the form of one or more tags), or may be separately stored in server 460. Browser/rendering engine 442 may use information of underlying screen configuration 472 in a configuration request to configure a screen region for displaying the corresponding content.
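  • By way of non-limiting illustration, the following sketch shows one hypothetical form page content 462 could take. The markup and attribute names (screenconfig, region, display) are invented for illustration and do not represent a tag syntax defined herein; they stand in for underlying screen configuration 472 and screen region location 470, with the anchor element playing the role of hypertext content link 468.

```typescript
// Hypothetical page content 462, held as a string as browser 490 might
// receive it over network 478. All tag/attribute names are invented.
const pageContent = `
<!-- screenconfig: plays the role of underlying screen configuration 472 -->
<html screenconfig="mixed-2D/3D-4">
  <body>
    <p>Ordinary 2D text content...</p>
    <!-- anchor: hypertext content link 468; region: screen region location 470 -->
    <a href="http://example.com/clip" region="x=640,y=0,w=640,h=360" display="3D-4">
      Watch the 3D clip
    </a>
  </body>
</html>`;
```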
  • Linked content file(s) 464 includes files that may be requested for display by browser 490 (e.g., in request 118), such as in response to a user selecting a hyperlink in a displayed page. In some cases, linked content file(s) 464 may include multiple files from which a file may be selected to be provided in a response to a user selecting a hyperlink. For instance, as shown in FIG. 4B, linked content file(s) 464 may include file A and screen configuration A 474, and file B and screen configuration B 476. File A and file B are alternative files to be provided to browser 490 in response to a request. File A corresponds to a screen configuration A, and file B corresponds to a different screen configuration B. For instance, file A or file B may be provided by server 460 in response to a request based on characteristics of a display screen in which content of the file is to be displayed, based on a provided display frame size, based on communication link characteristics (e.g., as determined by testing), and/or based on other criteria.
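  • As a minimal sketch of this selection, assuming a hypothetical request parameter that carries the browser's screen configuration, server 460 might choose between the alternative files as follows (all names invented for illustration):

```typescript
// Sketch of server-side selection between file A and file B of linked
// content file(s) 464, keyed by the screen configuration in the request.
interface LinkedFile { path: string; screenConfig: string; }

const linkedContent: LinkedFile[] = [
  { path: "clip_3d4.vid", screenConfig: "3D-4" }, // file A / screen configuration A
  { path: "clip_2d.vid", screenConfig: "2D" },    // file B / screen configuration B
];

function selectFile(requestedConfig: string): LinkedFile {
  // Prefer an exact configuration match; fall back to the 2D variant, which
  // any display can show (possibly after translation).
  return (
    linkedContent.find((f) => f.screenConfig === requestedConfig) ??
    linkedContent.find((f) => f.screenConfig === "2D")!
  );
}
```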
  • Streaming server application 466 may be present in browser page and 2D/3Dx content server 460 to stream video content in response to a request from browser 490 to server 460 for video files.
  • OS 432 is an example of operating system 304 shown in FIG. 3. OS 432 interfaces applications, such as browser 490, with displays 418 a-418 c. As indicated in FIG. 4B, OS 432 may provide various forms of 2D/3Dx display support. For instance, API supporting regional 2D/3Dx 428 is configured to interface one or more applications (e.g., browser 490) with OS 432, and thereby interface the applications with a display device (e.g., one or more of displays 418 a-418 c) coupled to OS 432. API supporting regional 2D/3Dx 428 is configured to enable applications, such as browser 490, to access various display functions, including regional definition of 2D, 3D, and 3Dx content displayed by display screens, among further display functions.
  • User input interfaces 420 are configured to receive user input to enable a person to interact with display system 480, browser 490, and content displayed by displays 418 a-418 c. Further example embodiments for user input interfaces 420 are described elsewhere herein.
  • 2D, 3Dx & mixed display driver interface 422 enables applications, such as browser 490, that interface with OS 432 via API 428 to provide and control two- and/or three-dimensional content displayed at displays 418 a-418 c. 2D only driver variant 434, 3Dx only driver variant 436, and mixed 2D and 3Dx driver variant 438 are examples of display driver 306 of FIG. 3. 2D, 3Dx & mixed display driver interface 422 may forward commands (e.g., from browser 490) to 2D only driver variant 434 when 2D display 418 a is present, enabling only 2D-related commands to be processed. 2D, 3Dx & mixed display driver interface 422 may forward commands to 3Dx only driver variant 436 when 3Dx display with 2D mode 418 b is present, enabling 2D or 3Dx related commands to be processed. 2D, 3Dx & mixed display driver interface 422 may forward commands to mixed 2D and 3Dx driver variant 438 when regionally configurable 2D/3Dx display 418 c is present, enabling regional 2D or 3Dx related commands to be processed.
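  • A minimal TypeScript sketch of this dispatch, with hypothetical driver-variant interfaces, might look as follows:

```typescript
// Sketch of 2D, 3Dx & mixed display driver interface 422 forwarding a
// command to the driver variant matching the attached display. All names
// are hypothetical; the dispatch mirrors the paragraph above.
type DisplayKind = "2D-only" | "3Dx-with-2D-mode" | "regional-2D/3Dx";
interface DriverVariant { handle(cmd: { mode: string }): void; }

function forwardCommand(
  display: DisplayKind,
  cmd: { mode: string },
  variants: { d2: DriverVariant; d3x: DriverVariant; mixed: DriverVariant },
): void {
  switch (display) {
    case "2D-only":          // 2D only driver variant 434: 2D commands only
      if (cmd.mode !== "2D") throw new Error("3D command not supported");
      variants.d2.handle(cmd);
      break;
    case "3Dx-with-2D-mode": // 3Dx only driver variant 436: full-screen 2D or 3Dx
      variants.d3x.handle(cmd);
      break;
    case "regional-2D/3Dx":  // mixed 2D and 3Dx driver variant 438: regional commands
      variants.mixed.handle(cmd);
      break;
  }
}
```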
  • Shell operations 424 may be present in OS 432 to control and/or enable user configuration of environmental properties, such as the 2D and/or 3D display configuration of an environmental background, of desktop icons, of displayed windows, etc. In embodiments, shell operations 424 may be implemented in hardware, software, firmware, or any combination thereof, including as a shell operations module.
  • Mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426 may be present in OS 432 to provide for translation of received content (e.g., from an application such as browser 490) from a first dimensionality to a second dimensionality. For instance, translation services 426 may be configured to translate received 3D content to 2D content, such as when an application provides 3D content, and 2D display 418 a is the target display (e.g., the target display is not capable of displaying 3D content). In another example, translation services 426 may be configured to translate a first type of 3D content to a second type of 3D content, such as when an application provides regional 2D and/or 3D content, and 3Dx display with 2D mode 418 b is the target display (e.g., the target display is not capable of displaying content regionally), and/or to translate 3D content having a first number “x” of camera views (e.g., 3D-8 content) to 3D content having a second number “y” of camera views (e.g., 3D-4 content), if the target display does not support “x” camera views. Still further, translation services 426 may be configured to translate 2D content to 3D content, and/or may be able to perform other forms of content translations. Example embodiments for mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426 (e.g., translators) are described elsewhere herein.
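  • One such translation, reducing content with “x” camera views to content with fewer views, might be sketched as follows. The frame representation is a hypothetical simplification; real translation services 426 would operate on encoded pixel data.

```typescript
// Sketch of a 3Dx-to-3Dy translation: keep an evenly spaced subset of the
// camera views. A frame here is simply an array of per-camera views.
type Frame3D = { views: Uint8Array[] }; // views.length === camera-view count

function reduceViews(frame: Frame3D, targetCount: number): Frame3D {
  const x = frame.views.length;
  if (targetCount >= x) return frame; // nothing to reduce
  const step = x / targetCount;
  const kept: Uint8Array[] = [];
  for (let i = 0; i < targetCount; i++) {
    kept.push(frame.views[Math.floor(i * step)]); // evenly spaced subset
  }
  return { views: kept };
}
// e.g., reduceViews(frame3d8, 4) keeps views 0, 2, 4, 6 of a 3D-8 frame.
```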
  • Further description regarding an operating system configured to interface applications with displays supporting two-dimensional and three-dimensional views, such as OS 432, is provided in pending U.S. patent application Ser. No. ______, titled “Operating System Supporting Mixed 2D, Stereoscopic 3D And Multi-View 3D Displays,” which is incorporated by reference herein in its entirety.
  • Display circuitry 416 a-416 c may have the form of hardware, software, firmware, or any combination thereof, such as a graphics card, other circuitry, etc. Display circuitry 416 a-416 c may be present to interface OS 432 with displays 418 a-418 c, respectively. Display circuitry 416 a-416 c may receive content signals and control signals from OS 432, and may be configured to generate drive signals to drive displays 418 a-418 c, respectively. Examples of display circuitry (e.g., drive circuits) are described elsewhere herein.
  • As shown in FIG. 4B, display circuitry 416 a-416 c may each optionally include a corresponding one of translation services 430 a-430 c. When present, translation services 430 a-430 c may perform translations of received content in a similar manner as mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426. For instance, translation services 430 a may translate received 3D content to 2D content for display by 2D display 418 a. Translation services 430 b may translate received regionally configurable 2D and/or 3D content to non-regional 2D and/or 3D content for display by 3Dx display with 2D mode display 418 b. Translation services 430 b and 430 c may each translate unsupported types of 3D content to supported types of 3D content for display by 3Dx display with 2D mode display 418 b and regionally configurable 2D/3Dx display 418 c, respectively. Translation services 430 a-430 c may also be configured to perform additional and/or alternative forms of content translations, in embodiments.
  • Browser 490 is configured to enable network-accessible content to be displayed in two- and three-dimensions at displays 418 a-418 c. 2D/3Dx UI display 444 is an example of user interface 402 shown in FIG. 4A. 2D/3Dx UI display 444 may include an address bar, back/forward buttons, bookmarking, and/or further portions of the browser display (e.g., other than the main window displaying a requested page). 2D/3Dx UI display 444 may include 2D & 3Dx counterparts, such as images or video streams (e.g., 2D/3Dx Applet-like functionality).
  • Browser/Rendering Engine 442 is an example of rendering engine 404 of FIG. 4A. Engine 442 processes HTML, and manages the display of web page content and of 2D and 3D image and video (stream) file content. For instance, parser 452 may parse a loaded HTML document to generate a DOM (document object model) tree, as described above. Render tree preparation module 454 may generate a render tree from the DOM tree. Module 454 may identify screen configurations to be applied to regions of the display screen based on the render tree, and may cause configuration requests to be generated based on the identified screen configurations to cause a configuration or reconfiguration of the screen in those regions. Module 454 may include a layout module that performs a layout process to determine screen coordinates for each node of the render tree. Module 454 may traverse and “paint” each node of the render tree in a browser window on the display screen, to generate render tree display 456.
  • Networking module 446 is an example of networking module 408 shown in FIG. 4A. In an embodiment, networking module 446 is platform independent, and interfaces with OS 432 to operate through communication interface(s) 440 of OS 432 via network protocols (e.g., HTTP requests, etc.).
  • UI Backend 448 is configured to draw basic widgets, such as drop down boxes, combo boxes, and windows. UI Backend 448 may interface with API 428 of OS 432 to generate 2D and 3D image or video (e.g., streamed) elements. UI Backend 448 may be platform independent.
  • 2D/3Dx video and image client(s) 450 are an example of client application(s) 406 of FIG. 4A. 2D/3Dx video and image client(s) 450 may include plug-ins, add-ons, built-ins, external helper apps, etc., that provide functionality to browser 490. In one mode of operation, clients 450 may provide the functionality for: (i) generating the control signals that are passed to OS 432 for configuring display screen regions in preparation for underlying video/image presentation; (ii) managing the retrieval of media content to be displayed; (iii) delivering the media content via OS 432 to one or more of displays 418 a-418 c; and (iv) managing the presentation of such media content (e.g., enabling rewind, zoom, pause, etc.). Alternatively, item (i) above may be performed by browser/rendering engine 442 according to HTML tag definitions, for example. Similarly, one or more others of items (ii)-(iv) may be performed by engine 442. In an embodiment, client(s) 450 can be integrated into engine 442, or may remain a plug-in, an add-on, a built-in, or a helper app, as shown in FIG. 4B. Client(s) 450 may reside outside of browser 490, and launching and loading of an external client 450 may be performed by browser 490 within another external window. Note that code interpreters (e.g., code interpreter 410 of FIG. 4A), such as a Java interpreter, may also be present in browser 490, and may operate pursuant to code of client 450 to perform a same function as a compiled add-on.
  • The embodiments of display system 480 and browser 490 shown in FIG. 4B are provided for purposes of illustration. In further embodiments, display system 480 and browser 490 may include fewer, further, and/or alternative components, as would be known to persons skilled in the relevant art(s). Further embodiments regarding the features of display system 480 and browsers 400 and 490 are described in the following subsections.
  • B. Exemplary Embodiments for Displaying Content Using a Browser
  • As described above, browser 400 may retrieve two-dimensional and three-dimensional content for display by display devices, including content associated with a web page. FIG. 5 shows a flowchart 500 providing a process for displaying web page content, according to an exemplary embodiment. Flowchart 500 may be performed by browsers described elsewhere herein, such as browser 400 of FIG. 4A or browser 490 of FIG. 4B. Flowchart 500 is described with respect to FIG. 6, which shows a block diagram of browser 400 interfaced with a display device 606, according to an exemplary embodiment. In the example of FIG. 6, browser 400 includes rendering engine 404, application client(s) 406, and code interpreter 410. Furthermore, rendering engine 404 includes mixed 2D/3D supporting logic 108. Display driver 604 is an example of display driver 306 of FIG. 3. Device 412 of FIG. 4A is not shown in FIG. 6 for ease of illustration, but it is noted that browser 400 may be included in device 412, and display device 606 may be included in or may be external to device 412. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 500. Flowchart 500 is described as follows.
  • Flowchart 500 begins with step 502. In step 502, web page content is parsed. For example, as shown in FIG. 6, browser 400 may receive a markup document 608. Markup document 608 may include HTML text that describes a web page. Rendering engine 404 may parse markup document 608. For instance, rendering engine 404 may include parser 452 of FIG. 4B, which may be configured to parse HTML documents, such as markup document 608. Parser 452 may receive the content of markup document 608 in 8K chunks or portions, and may begin parsing the underlying HTML text of markup document 608 on a chunk-by-chunk basis without waiting for all content to be received. Alternatively, parser 452 may receive all of the content of markup document 608 before beginning parsing. Parser 452 may generate a document object model tree or other structure that identifies each of the elements of content included in or referenced by markup document 608. A first portion of the elements may relate to two-dimensional content, and a second portion of the elements may relate to three-dimensional content. Alternatively, all of the elements may relate to three-dimensional content (or two-dimensional content).
  • In step 504, two-dimensional content to be displayed in a first region of the screen is identified. For example, rendering engine 404 may identify a first object of markup document 608 that relates to two-dimensional content. The first object may be identified by parser 452 encountering a hypertext link corresponding to the first object in markup document 608, or in another manner. The first object may include any form of two-dimensional content, such as an image, a video, another web page, etc. The first object may be identified in various ways. Upon identification, rendering engine 404 may handle further processing of the first object, a client application 406 may be selected to manage the processing of the first object (e.g., for a particular type of first object that the client application 406 is configured to process), or code interpreter 410 may interpret and execute the first object when the first object is an un-compiled script.
  • For instance, in an embodiment, the first object may be identified based on an identifier for the first object (e.g., a filename) or a structure of the first object itself (e.g., file contents, such as header information). For example, a MIME (multipurpose Internet mail extensions) type file extension to a filename for the first object provided in markup document 608 may be used to identify the first object, to identify that the first object includes 2D content, and to select rendering engine 404 or a particular client application 406 to process the first object. In another embodiment, the first object may be identified by a content server (e.g., content server 460 of FIG. 4B) from which the first object is requested. For instance, one or more parameters, such as tag information or other information, may be present in markup document 608 that may be passed to the content server by rendering engine 404 in a request that can be used to select the first object to be returned in response to the request. For instance, the tag(s) and/or other parameter(s) may indicate a screen configuration for a screen 620 of display device 606, a frame size to be generated by rendering engine 404, and/or other information. The content server may use the tag(s) and/or other parameters to select the first object, and/or may use other information to select the first object, such as characteristics of the communication link between the file server and browser. Referring to FIG. 4B, file A or file B may be selected by server 460 based on whether the browser screen configuration matches screen configuration A or screen configuration B stored at server 460, and the selected file is transmitted (e.g., an image file is transmitted, video is streamed, etc.) to browser 400 (e.g., as information resource 610 in FIG. 6).
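  • As a simplified sketch of the MIME-based identification described above, assuming invented file extensions for 3D content types (the patent does not define specific extensions), the mapping might look as follows:

```typescript
// Sketch of identifying an object's 2D/3D type and handler from its file
// extension. The extension-to-type table below is hypothetical.
type ContentDimension = "2D" | "3D";
interface Identified { dimension: ContentDimension; handler: "engine" | "client-app"; }

const byExtension: Record<string, Identified> = {
  jpg: { dimension: "2D", handler: "engine" },      // 2D image, rendered by engine
  mp4: { dimension: "2D", handler: "client-app" },  // 2D video, via a client app
  s3d: { dimension: "3D", handler: "client-app" },  // hypothetical stereo 3D type
  mv3d: { dimension: "3D", handler: "client-app" }, // hypothetical multiview type
};

function identify(filename: string): Identified | undefined {
  const ext = filename.split(".").pop()?.toLowerCase() ?? "";
  return byExtension[ext];
}
```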
  • In step 506, a first configuration request is communicated to at least attempt to cause a first configuration of the first region of the screen to support the two-dimensional content. For example, as shown in FIG. 6, rendering engine 404 (e.g., render tree preparation module 454 of FIG. 4B) may generate a command 612 that is a configuration request for a first region of screen 620 to support display of the 2D content identified in step 504. For instance, as described above with respect to FIG. 3, command 612 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604. Display driver 604 receives command 612, and generates control signal(s) 616 that are received by display device 606. Control signal(s) 616 place(s) a region of screen 620 in a 2D display mode for display of the identified 2D content.
  • In step 508, three-dimensional content to be displayed in a second region of the screen is identified. For example, in a similar fashion as described above with respect to step 504, rendering engine 404 may identify a second object of markup document 608 that relates to three-dimensional content. The second object may be identified by parser 452 encountering a hypertext link corresponding to the second object in markup document 608, or in another manner. The second object may include any form of three-dimensional content, such as an image, a video, another web page, etc., and any type of three-dimensional content (e.g., stereoscopic 3D, 3D-2, 3D-4, etc.). The second object may be identified in various ways. Upon identification, rendering engine 404 may handle further processing of the second object, a client application 406 may be selected to manage the processing of the second object (e.g., for a particular type of second object that the client application 406 is configured to process), or code interpreter 410 may interpret and execute a script of the second object.
  • For instance, in a similar manner as described above, the second object may be identified based on an identifier for the second object (e.g., a filename) or a structure of the second object itself (e.g., file contents, such as header information). For example, a MIME (multipurpose Internet mail extensions) type file extension to a filename for the second object provided in markup document 608 may be used to identify the second object, to identify that the second object includes 3D content, and to select rendering engine 404 or a particular client application 406 to process the second object. In another embodiment, the second object may be identified by a content server (e.g., content server 460 of FIG. 4B) from which the second object is requested. For instance, one or more parameters, such as tag information or other information, may be present in markup document 608 that may be passed to the content server by rendering engine 404 in a request that can be used to select the second object to be returned in response to the request. For instance, the tag(s) and/or other parameter(s) may indicate a screen configuration for a screen 620 of display device 606, a frame size to be generated by rendering engine 404, and/or other information. The content server may use the tag(s) and/or other parameters to select the second object, and/or may use other information to select the second object, such as characteristics of the communication link between the file server and browser. Referring to FIG. 4B, file A or file B may be selected by server 460 based on whether the browser screen configuration matches screen configuration A or screen configuration B stored at server 460, and the selected file is transmitted (e.g., an image file is transmitted, video is streamed, etc.) to browser 400 (e.g., as information resource 610 in FIG. 6).
  • In step 510, a second configuration request is communicated to at least attempt to cause a second configuration of the second region of the screen to support the three-dimensional content, the first configuration being different from the second configuration. For example, as shown in FIG. 6, rendering engine 404 (e.g., render tree preparation module 454 of FIG. 4B) may generate a second command 614 that is a second configuration request for a second region of screen 620 to support display of the 3D content identified in step 508. For instance, command 614 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604. Display driver 604 receives command 614, and generates control signal(s) 618 that are received by display device 606. Control signal(s) 618 place(s) a second region of screen 620 in a 3D display mode for display of the identified 3D content. If the second region of screen 620 is in a different display mode (e.g., in a 2D display mode, or a different 3D display mode), the second region of the screen 620 is reconfigured according to the second configuration request.
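  • The two configuration requests of steps 506 and 510 might be sketched as follows. The function issueConfigRequest is a hypothetical stand-in for the command path of FIG. 3 (directly to display driver 604, or via an API and/or OS); it is not an actual API defined herein.

```typescript
// Sketch: one configuration request per screen region, with different modes,
// issued before the corresponding content is streamed.
interface RegionConfigRequest {
  region: { x: number; y: number; width: number; height: number };
  mode: "2D" | "3D-2" | "3D-4" | "3D-8";
}

function issueConfigRequest(req: RegionConfigRequest): void {
  // In practice this would result in control signal(s) 616/618 at the display.
  console.log(`configure ${JSON.stringify(req.region)} as ${req.mode}`);
}

// Step 506: first region of screen 620 configured for 2D content.
issueConfigRequest({ region: { x: 0, y: 0, width: 640, height: 480 }, mode: "2D" });
// Step 510: second region configured (or reconfigured) for 3D content.
issueConfigRequest({ region: { x: 640, y: 0, width: 640, height: 480 }, mode: "3D-4" });
```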
  • To provide the content to the first and second regions of screen 620, rendering engine 404 may generate a render tree for each of the 2D and 3D content identified in steps 504 and 508, and may perform a layout process to determine screen coordinates (positional information) for each node of each render tree (e.g., using render tree preparation module 454 shown in FIG. 4B). Rendering engine 404 may traverse each node of each render tree for display on screen 620, and may generate graphical data representative of each render tree to paint each node. As shown in FIG. 6, rendering engine 404 may transmit 2D graphical data 622 corresponding to the identified 2D content, and 3D graphical data 624 corresponding to the identified 3D content. Display driver 604 may receive 2D graphical data 622 and 3D graphical data 624, and transmit corresponding processed 2D graphical data 626 and processed 3D graphical data 628 that are received by display device 606. Display device 606 may display the 2D content of processed 2D graphical data 626 in the first region of screen 620, which is configured according to the first configuration request. Furthermore, display device 606 may display the 3D content of processed 3D graphical data 628 in the second region of screen 620, which is configured according to the second configuration request. In this manner, browser 400 enables simultaneous display of 2D and 3D content by a display screen.
  • As described above, tags may be included in markup document 608. The tags may be used to define characteristics of the display of 2D and 3D content by a display device. For instance, the tags may be used to indicate one or more display properties of the displayed content, including indicating whether content is 2D or 3D, indicating a type of 3D content, etc. FIG. 7 shows a flowchart 700 providing a process for using tags to configure the display of 2D and 3D content, according to an exemplary embodiment. Flowchart 700 may be performed by browser embodiments described herein, such as browser 400 of FIG. 4A or browser 490 of FIG. 4B. Flowchart 700 is described with respect to FIG. 6 for purposes of illustration. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 700. Flowchart 700 is described as follows.
  • Flowchart 700 begins with step 702. In step 702, first tag information associated with two-dimensional content is identified, the two-dimensional content intended for both a left eye and a right eye of a viewer. For example, rendering engine 404 of FIG. 6 may identify a first tagged object in markup document 608 that relates to two-dimensional content. As described above, for two-dimensional content, the same images are delivered to the right and left eyes of a viewer so that the content is perceived as two-dimensional. The first tagged object may be identified by parser 452 encountering a URL or other content identifier (e.g., a filename) that has associated tags in markup document 608, or in another manner. The first tagged object may include any form of two-dimensional content, such as an image, a video, another web page, etc. The tag information associated with the two-dimensional content may include any number of attributes. The tag information may indicate a screen configuration for screen 620 of display device 606, a frame size to be generated by rendering engine 404 for display of the 2D content, a type of the 2D content, a display brightness for the 2D content, a resolution for the 2D content (e.g., 720p, 1080p, etc.), and/or any other suitable information described elsewhere herein or otherwise known.
  • In step 704, second tag information associated with three-dimensional content is identified, the three-dimensional content having a first portion and a second portion, the first portion intended for the left eye of the viewer and the second portion intended for the right eye of the viewer, the first portion being a first camera view and the second portion being a second camera view. For example, rendering engine 404 of FIG. 6 may identify a second tagged object in markup document 608 that relates to three-dimensional content. As described above, for three-dimensional content, images of differing perspective are delivered to the right and left eyes of a viewer. The images are combined in the visual center of the brain of the viewer to be perceived as a three-dimensional image. The second tagged object may be identified by parser 452 encountering a second URL or other content identifier (e.g., a filename) that has associated tags in markup document 608, or in another manner. The second tagged object may include any form of three-dimensional content, such as an image, a video, another web page, etc. The second tag information associated with the three-dimensional content may include any number of attributes. The second tag information may indicate a screen configuration for screen 620 of display device 606 for display of the 3D content, a frame size to be generated by rendering engine 404 for display of the 3D content, a type of the 3D content, a display brightness for the 3D content, a display resolution for the 3D content, and/or any other suitable information described elsewhere herein or otherwise known.
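  • By way of non-limiting illustration, first and second tag information might be expressed with markup attributes such as the following; the attribute names are invented, as the embodiments herein leave the tag vocabulary open:

```typescript
// Hypothetical tag information for steps 702 and 704, held as markup text.
const taggedMarkup = `
<!-- first tag information: 2D content delivered identically to both eyes -->
<img src="photo.jpg" display="2D" resolution="1080p" brightness="70" />
<!-- second tag information: 3D content with left/right camera views -->
<video src="clip.mv3d" display="3D-4" intensity="medium"
       framesize="640x360" resolution="720p" />`;
```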
  • In step 706, the presentation of the two-dimensional content is caused in a first region of a screen. For instance, rendering engine 404 may generate command 612 that is a configuration request for a first region of screen 620 to support display of the 2D content according to the first tag information identified in step 702. As described above with respect to FIG. 3, command 612 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604. Display driver 604 receives command 612, and generates control signal(s) 616 that are received by display device 606. Control signal(s) 616 place(s) a first region of screen 620 in a 2D display mode for display of the 2D content.
  • Rendering engine 404 may generate a render tree for the 2D content, and may perform a layout process to determine screen coordinates (positional information) for each node of the render tree (e.g., using render tree preparation module 454 shown in FIG. 4B). Rendering engine 404 may traverse each node of the render tree, and may generate graphical data representative of the render tree to paint each node. As shown in FIG. 6, rendering engine 404 may transmit 2D graphical data 622 corresponding to the 2D content. Display driver 604 may receive 2D graphical data 622 and transmit corresponding processed 2D graphical data 626 that is received by display device 606. Display device 606 may display the 2D content of processed 2D graphical data 626 in the first region of screen 620, which is configured according to the first configuration request.
  • In step 708, the presentation of the three-dimensional content is caused in a second region of the screen. For instance, rendering engine 404 may generate command 614 that is a configuration request for a second region of screen 620 to support display of the 3D content according to the second tag information identified in step 704. Command 614 may be transmitted from rendering engine 404 directly, or through an API and/or OS, to display driver 604. Display driver 604 receives command 614, and generates control signal(s) 618 that are received by display device 606. Control signal(s) 618 place(s) a second region of screen 620 in a 3D display mode for display of the 3D content.
  • Rendering engine 404 may generate a render tree for the 3D content, and may perform a layout process to determine screen coordinates for each node of the render tree. Rendering engine 404 may traverse each node of the render tree, and may generate graphical data representative of the render tree to paint each node. As shown in FIG. 6, rendering engine 404 may transmit 3D graphical data 624 corresponding to the 3D content. Display driver 604 may receive 3D graphical data 624 and transmit corresponding processed 3D graphical data 628 that is received by display device 606. Display device 606 may display the 3D content of processed 3D graphical data 628 in the second region of screen 620, which is configured according to the second configuration request. In this manner, browser 400 causes display of the 3D content on screen 620 of display device 606 simultaneously with the display of the 2D content on screen 620.
  • As such, according to flowcharts 500 and 700, two-dimensional and three-dimensional content identified by browser 400 may simultaneously be displayed within corresponding regions of screen 620. Furthermore, different types of three-dimensional content (e.g., different resolutions, different numbers of image pairs, different stereoscopic depths, etc.) are enabled to be individually or simultaneously displayed by browser 400. In embodiments, any number of different types of two-dimensional and three-dimensional content may be displayed in any number of regions of screen 620.
  • For instance, FIGS. 8, 9, 10A, and 10B show examples of screen 620 displaying content in various screen regions, including tabs, frames, and display objects, according to embodiments. FIG. 8 shows screen 620 of FIG. 6 displaying a browser window 802 that includes multiple frames. Frames enable browsers to display two or more web pages or other media elements within the same browser window (e.g., side-by-side, etc.). Frames may be defined using “frameset” tags that define frames and their sizes, as shown in the sketch following this paragraph. FIG. 8 shows browser window 802 including a first frame 804 and a second frame 806. First frame 804 is configured for the display of two-dimensional content (e.g., according to step 506 of FIG. 5 or step 706 of FIG. 7), and second frame 806 is configured for the display of three-dimensional content (e.g., according to step 510 of FIG. 5 or step 708 of FIG. 7). In the example of FIG. 8, first frame 804 and second frame 806 have approximately the same size, and are positioned side-by-side. In other embodiments, first and second frames 804 and 806 may have different sizes, and may have different positions relative to each other (e.g., above and below, etc.). Still further, although first and second frames 804 and 806 are shown as having rectangular shapes in FIG. 8, in other embodiments, first and second frames 804 and 806 may have other shapes. Note that any number of frames may be displayed in browser window 802 that respectively display two-dimensional or three-dimensional content.
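  • For illustration, a frameset definition corresponding to FIG. 8 might look as follows, assuming a hypothetical “display” attribute carrying the 2D/3D configuration of each frame:

```typescript
// Hypothetical frameset for FIG. 8: two equal side-by-side frames, the left
// configured for 2D content and the right for 3D content. The "display"
// attribute is invented for illustration.
const framesetMarkup = `
<frameset cols="50%,50%">
  <!-- first frame 804: 2D content -->
  <frame src="article.html" display="2D" />
  <!-- second frame 806: 3D content -->
  <frame src="clip.mv3d" display="3D-4" />
</frameset>`;
```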
  • In another example, FIG. 9 shows screen 620 displaying a browser window 902 that includes multiple tabs. Tabs enable browsers to display two or more documents in a same browser window one at a time. The tabs can be used as a navigational widget to switch the display of the documents. FIG. 9 shows browser window 902 including a first tab region 904 and a second tab region 906. First tab region 904 may be configured for the display of two-dimensional content (e.g., according to step 506 of FIG. 5 or step 706 of FIG. 7), and second tab region 906 may be configured for the display of three-dimensional content (e.g., according to step 510 of FIG. 5 or step 708 of FIG. 7). As shown in FIG. 9, first and second tab regions 904 and 906 each have a corresponding tab extending upward that may be used to bring the respective region forward. First tab region 904 is displayed over second tab region 906, such that second tab region 906 is not visible (except for the tab of second tab region 906). The tab of second tab region 906 may be selected (e.g., by mouse click, etc.) to bring second tab region 906 to the forefront to be displayed over first tab region 904, causing first tab region 904 to not be visible (except for the tab of first tab region 904). Note that any number of tab regions may be present in browser window 902 that respectively display two-dimensional or three-dimensional content.
  • In another example, FIG. 10A shows screen 620 displaying browser window 902 of FIG. 9, with browser window 902 including tab regions 904 and 906. First tab region 904 is displayed over second tab region 906, and a frame 1002 is displayed in tab region 904. Any number of frames may be displayed in a tab region. Furthermore, an object 1004 is displayed that overlaps first tab region 904 and frame 1002. Object 1004 may be a two-dimensional object (e.g., displayed according to step 506 of FIG. 5 or step 706 of FIG. 7) or a three-dimensional object (e.g., displayed according to step 510 of FIG. 5 or step 708 of FIG. 7). In an embodiment, object 1004 may be a graphical object generated at least in part by client application 406 interacting with rendering engine 404. For instance, object 1004 may be generated based on a Flash® application, a Java applet, etc., that is executed by client application 406 (or by rendering engine 404). Note that any number of two-dimensional and/or three-dimensional content objects similar to object 1004 may be displayed in browser window 902. Furthermore, although object 1004 is shown as having a round shape in FIG. 10A, in other embodiments, object 1004 may have other shapes (e.g., rectangular, other polygonal shape, shape of a person, an animal, an animated character, a product, etc.).
  • FIG. 10B shows another example of screen 620 displaying a browser window 1020 similar to browser window 902 of FIG. 10A, with browser window 1020 including tab regions 904 and 906, and with tab region 904 including first frame 1002 and a second frame 1006. Frames 1002 and 1006 may each include two-dimensional content (e.g., displayed according to step 506 of FIG. 5 or step 706 of FIG. 7) or three-dimensional content (e.g., displayed according to step 510 of FIG. 5 or step 708 of FIG. 7). Any number of two-dimensional and/or three-dimensional content objects similar to frames 1002 and 1006, having any shape, may be displayed in browser window 1020.
  • Browser window 1020 includes various user interface elements providing controls for navigating the display of 2D and 3D content. As shown in FIG. 10B, browser window 1020 (and any other browser windows described herein) may include a navigation bar 1008, which may include various controls. A user may interact with navigation bar 1008 to navigate to web pages by entering corresponding URLs in an address entry box. Such web pages may include 2D and/or 3D content for display in browser window 1020. A user may interact with back and forward buttons in navigation bar 1008 to navigate to a previous resource or forward to a subsequent resource. A user may interact with a refresh button of navigation bar 1008 to reload a current resource, and may interact with a stop button of navigation bar 1008 to cancel loading a resource. The example of navigation bar 1008 shown in FIG. 10B is provided for purposes of illustration and is not intended to be limiting. In further embodiments, navigation bar 1008 may include additional and/or alternative navigation elements, such as a search engine query entry box, a home button, etc.
  • Furthermore, as shown in FIG. 10B, browser window 1020 provides various browser controls for controlling the display of two-dimensional and three-dimensional content. For instance, as shown in FIG. 10B, browser window 1020 may include a 3D display control bar 1010. In the example of FIG. 10B, 3D display control bar 1010 is positioned in a North position in browser window 1020 immediately below navigation bar 1008, but in other embodiments may have other forms or positions (e.g., right side, left side, South position, etc.), and may be combined with other displayed bars. Furthermore, in other embodiments, 3D display control bar 1010 may have other forms, such as a widget, an icon, or other user interface element.
  • 3D display control bar 1010 enables a user to configure 3D display settings and/or preferences for browser window 1020. For instance, 3D display control bar 1010 may include a 2D-3D toggle button 1014 and/or a 3D options button 1016. 2D-3D toggle button 1014 may be selected (e.g., by clicking with a mouse pointer 1024, by keystrokes, etc.) by a user to toggle between display of content in browser window 1020 in 2D form, or to enable 3D-enabled content to be displayed in 3D form. 2D-3D toggle button 1014 may display the current 2D-3D setting (e.g., either 2D or 3D). 3D options button 1016 may be selected by a user to set one or more 3D display settings/preferences for browser window 1020. For example, in an embodiment, a user may select 3D options button 1016 to invoke a menu 1018 that lists one or more 3D display options that may be selected by the user. In the example of FIG. 10B, menu 1018 includes a “set 3Dx” option (to select a 3D multiview display type), a “set 3D intensity” option (to set a 3D display depth), a linked defaults option, and an advertisements defaults option. The linked defaults option enables a user to configure whether content invoked by clicking on a hyperlink in a web page displayed in browser window 1020 is displayed in 2D or 3D form. For instance, a user can set a default whereby all content generated by the same domain is displayed regionally in full. Any hypertext linked content (e.g., coming from another source) may be set, according to the linked defaults option, to be reduced to 2D or to be enabled to be displayed in 3D (e.g., of a particular 3D type). Thereafter, clicking on content that has been reduced to 2D form may restore it to 3D form, and content that was restored to full 3D may be clicked again to be reduced back to 2D form. A user may use the advertisements defaults option to set whether advertisements are displayed in 2D form by default, or whether 3D-enabled advertisements may be displayed in 3D form. For instance, an advertiser may attempt to push strong 3D effect graphics/video/text to users of browser window 1020 to grab their attention. This may be overridden through setup with the advertisements defaults option, or through direct user interaction with the advertisement itself. For example, a right click on the advertisement may generate “reduce intensity/2D/3D/stop-pause” type options.
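  • The settings exposed by 3D display control bar 1010 might be represented, purely as an illustrative sketch with invented field names and defaults, as follows:

```typescript
// Sketch of per-window 3D settings corresponding to the controls above.
// The structure and defaults are hypothetical.
interface Browser3DSettings {
  enable3D: boolean;                       // 2D-3D toggle button 1014
  multiviewType: "3D-2" | "3D-4" | "3D-8"; // "set 3Dx" option of menu 1018
  intensity: number;                       // "set 3D intensity" (depth), 0..100
  linkedDefault: "2D" | "3D";              // linked defaults option
  adsDefault: "2D" | "3D";                 // advertisements defaults option
}

const defaults: Browser3DSettings = {
  enable3D: true,
  multiviewType: "3D-4",
  intensity: 50,
  linkedDefault: "2D", // reduce hyperlinked third-party content to 2D
  adsDefault: "2D",    // override strong-3D advertisements by default
};
```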
  • The example of 3D display control bar 1010 shown in FIG. 10B is provided for purposes of illustration and is not intended to be limiting. In further embodiments, 3D display control bar 1010 may have other form or position, and may include additional and/or alternative 3D control elements.
  • Furthermore, tab regions may enable users to configure 2D-3D settings on a tab region-by-tab region basis. For instance, as shown in FIG. 10B, tab region 904 may include a 3D user interface element 1012 that enables 3D settings to be made for tab region 904. 3D user interface element 1012 may be a button, icon, or widget, may invoke a menu, etc., that a user may interact with to configure 2D-3D settings for tab region 904. Such settings may be similar to those described above with respect to 3D display control bar 1010 and/or may include further and/or alternative settings. 3D user interface element 1012 is shown for purposes of illustration, and may have other form and capabilities than described with respect to FIG. 10B.
  • Still further, browser window 1020 may enable users to configure 2D-3D settings on a frame-by-frame or content-by-content basis. For instance, as shown in FIG. 10B, a user may invoke a menu 1022 with respect to frame 1002 (e.g., by right clicking pointer 1024 in frame 1002) that provides one or more 2D-3D configuration options. For instance, as shown in FIG. 10B, menu 1022 may include a toggle 2D-3D option (to toggle between display of content in 2D or 3D), a change 3Dx option (to change a 3D multiview display setting), an increase 3D intensity option, a reduce 3D intensity option, a pause option (to pause display of video content), etc. Menu 1022 is shown for purposes of illustration, and may provide further and/or alternative 2D-3D display related options to those shown in FIG. 10B.
  • As shown in FIG. 10B, browser window 1020 may include a 3D status bar 1012. In the example of FIG. 10B, 3D status bar 1012 is positioned in a South-most position in browser window 1020, but in other embodiments may have other positions, and may be combined with other displayed bars. Furthermore, in other embodiments, 3D status bar 1012 may have other forms, such as a widget, an icon, or other user interface element. 3D status bar 1012 displays a current 2D-3D setting status for browser window 1020, and may optionally change the displayed 2D-3D setting status depending on the particular region (e.g., tab region, frame, content, etc.) over which pointer 1024 is hovered. 3D status bar 1012 may show any suitable 3D status information, such as whether display of 2D or 3D content is enabled, a type of 3D multiview that is displayed (e.g., “3D-8”), 2D-3D settings for advertisements, a 3D intensity setting, and/or further display information. The example of 3D status bar 1012 shown in FIG. 10B is provided for purposes of illustration and is not intended to be limiting. In further embodiments, 3D status bar 1012 may have other form or position, and may include additional and/or alternative 3D status elements.
  • It is noted that the examples of FIGS. 8, 9, 10A, and 10B are provided for purposes of illustration, and are not intended to be limiting. In the examples of FIGS. 8, 9, 10A, and 10B, it is assumed that display device 606 supports the display of both two-dimensional and three-dimensional content. However, it is noted that not all types of display device 606 may support both two-dimensional content and three-dimensional content. Furthermore, not all types of display device 606 that support three-dimensional content may support all types of three-dimensional content. As such, in embodiments, browser 400 may be configured to translate unsupported types of content to supported types of content. Additionally and/or alternatively, browser 400 may be interfaced with components that are configured to perform such translations. For instance, as shown in FIG. 4B, OS 432 includes translation services 426, and display circuitry 416 a-416 c include respective translation services 430 a-430 c.
• In an embodiment, rendering engine 404 of browser 400 may be configured to translate types of content that are not supported by a display device to supported types of content. FIG. 11 shows a block diagram of rendering engine 404, according to an exemplary embodiment. As shown in FIG. 11, rendering engine 404 includes a first translator 1102 and a second translator 1104. In embodiments, rendering engine 404 may include one or both of first and second translators 1102 and 1104. First translator 1102 may be present in rendering engine 404 to support display devices that do not support the display of three-dimensional content. Second translator 1104 may be present in rendering engine 404 to support display devices that do not support the display of one or more types of three-dimensional content.
• First translator 1102 is configured to translate received 3D data to 2D data for display by a display device. For example, as shown in FIG. 11, three-dimensional graphical data 1106 associated with an information resource may be received by rendering engine 404. Rendering engine 404 may determine that the information resource contains three-dimensional content in any manner, such as by a MIME file extension, by contents of a media file containing the data, by a tag associated with the information resource, etc. When a display device does not support the display of three-dimensional content, first translator 1102 may translate three-dimensional graphical data 1106 of the information resource to two-dimensional graphical data 1108. Two-dimensional graphical data 1108 may be transmitted to the display device (e.g., 2D display of FIG. 4B) to enable two-dimensional content to be displayed in a screen region based on two-dimensional graphical data 1108.
• Furthermore, a display device that supports the display of three-dimensional data may not support all types of three-dimensional data (e.g., the display device does not support 3D graphics data having additional camera views beyond the initial right and left views, does not support a number of camera views greater than 3D-4, etc.). Second translator 1104 is configured to translate 3D data of an information resource of one or more unsupported 3D content types to 3D data of one or more supported 3D content types for display by a display device. For example, as shown in FIG. 11, first-type three-dimensional graphical data 1110 associated with an information resource may be received. Rendering engine 404 may determine that the first-type three-dimensional content is an unsupported type in any manner, such as by a MIME file extension, by contents of a media file containing the data, by a tag associated with the information resource, etc. When a display device does not support the display of the first-type three-dimensional content, second translator 1104 may translate first-type three-dimensional graphical data 1110 to second-type three-dimensional graphical data 1112. Second-type three-dimensional graphical data 1112 is transmitted to display device 606 to enable the corresponding second type of three-dimensional content to be displayed in the region of screen 620.
  • First translator 1102 may be configured in various ways to translate received 3D data to 2D data. For instance, in an embodiment, three-dimensional graphical data 1106 may be received as a stream of right image data and left image data. First translator 1102 may be configured to combine the right and left image data into two-dimensional image data that defines a stream of two-dimensional images that may be output as two-dimensional data 1108. In another embodiment, first translator 1102 may be configured to select the right image data or the left image data to be output as two-dimensional data 1108, while the other of the right image data or left image data is not used. In further embodiments, first translator 1102 may translate received 3D data to 2D data in other ways.
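• For purposes of illustration only, the following Python sketch models these first-translator strategies over a stream of (left, right) image pairs; the function and mode names are assumptions, and the images are represented as nested lists of pixel values:

```python
# Hypothetical sketch of first-translator strategies: reduce a stereo
# (left/right) frame stream to a 2D stream. Names are illustrative, not
# drawn from the patent.

def translate_3d_to_2d(frames, mode="left"):
    """Yield 2D frames from a stream of (left, right) image pairs.

    mode="left"/"right": pass one eye's view through unchanged.
    mode="blend":        average the two views into a single image.
    """
    for left, right in frames:
        if mode == "left":
            yield left
        elif mode == "right":
            yield right
        else:  # "blend": per-pixel average of the two views
            yield [
                [(l + r) // 2 for l, r in zip(lrow, rrow)]
                for lrow, rrow in zip(left, right)
            ]
```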
• Second translator 1104 may be configured in various ways to translate 3D data of a first 3D content type to 3D data of a second 3D content type. For instance, second translator 1104 may translate a first 3D multiview type (e.g., 3D-16) to a second 3D multiview type (e.g., 3D-4) or to a single 3D view. In such an embodiment, second translator 1104 may not pass extra left-right image pairs from first-type three-dimensional data 1110 to second-type three-dimensional data 1112. In an embodiment, second translator 1104 (and/or first translator 1102) may use techniques of image scaling to convert an unsupported display resolution to a supported display resolution. For instance, second translator 1104 may use upsampling or interpolation to increase resolution, and may use subsampling or downsampling to decrease resolution. In further embodiments, second translator 1104 may translate 3D data in other ways. In still further embodiments, a translator may be present to translate 2D content to 3D content, such as when a user has a preference to view content as 3D content. Various techniques may be used to convert 2D graphical data to 3D graphical data, as would be known to persons skilled in the relevant art(s).
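• Similarly, the following sketch illustrates two second-translator strategies under the same assumptions: dropping camera views to reduce a multiview type (e.g., 3D-16 to 3D-4), and nearest-neighbor resampling as a simple stand-in for the interpolation or subsampling mentioned above:

```python
def reduce_multiview(views, supported_count):
    """Keep a supported number of camera views, e.g., 3D-16 -> 3D-4 keeps
    every fourth view; assumes views are ordered across the viewing zone."""
    step = max(1, len(views) // supported_count)
    return views[::step][:supported_count]

def rescale_row(row, new_width):
    """Nearest-neighbor resampling of one scanline, for up- or downsampling
    toward a supported display resolution."""
    old_width = len(row)
    return [row[(i * old_width) // new_width] for i in range(new_width)]
```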
  • Note that a determination may be made of whether a display device supports the display of two-dimensional content and/or three-dimensional content, and/or a determination may be made of further display device characteristics in various ways. For instance, FIG. 12 shows a flowchart 1200 providing a process for determining display screen characteristics, according to an exemplary embodiment. Flowchart 1200 may be performed by browser 400 of FIG. 4A, browser 490 of FIG. 4B, etc.
  • In step 1202, an indication of at least one characteristic of the screen is requested. For instance, browser 400 of FIG. 6 may transmit a screen characteristic request to display device 606. The screen characteristic request may be transmitted through an API (e.g., API 302 of FIG. 3), an OS (e.g., OS 304 of FIG. 3), and/or a display driver (e.g., display driver 306 of FIG. 3), when present in a communication path between browser 400 and display device 606.
• In step 1204, a response to the request is received. For example, display device 606 may transmit a response to the screen characteristic request that includes an indication of one or more characteristics of screen 620, including whether screen 620 supports display of 2D and/or 3D content, an indication of supported types of 3D content, an indication of a resolution of screen 620, whether screen 620 supports display of mixed 2D and 3D content, etc. The response may be transmitted through the display driver, OS, and/or API, when present. Browser 400 may receive the response, and rendering engine 404 may use the received response information to render 2D and/or 3D content that is supported by screen 620. For instance, based on the information included in the response, first translator 1102 or second translator 1104 may be activated to translate an unsupported content type to a supported content type, and/or other actions may be taken.
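• For purposes of illustration only, the following sketch walks through flowchart 1200: a characteristic request (step 1202) followed by acting on the reply (step 1204). The query_display callable and the reply's field names are assumptions; the patent does not define a concrete message format.

```python
# Hypothetical request/response exchange for flowchart 1200.

def negotiate_rendering(query_display, content_is_3d, content_views=2):
    caps = query_display()  # routed through an API/OS/display driver chain
    # Example reply: {"supports_3d": True, "supports_mixed": True,
    #                 "max_views": 4, "resolution": (1920, 1080)}
    if content_is_3d and not caps.get("supports_3d", False):
        return "first_translator"    # 3D content on a 2D-only screen
    if content_is_3d and content_views > caps.get("max_views", 2):
        return "second_translator"   # e.g., reduce 3D-16 to a supported type
    return "native"                  # screen supports the content as-is
```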
  • B. Example User Input Interface and Browser Start Up Embodiments
• As described above, user input interfaces 420 in FIG. 4B receive user input to enable persons to interact with browser content displayed by a display device. For example, via user input interface 420, a user may be enabled to interact with displayed controls of browser 400 (e.g., displayed in 2D/3Dx UI display 444), to select tabs to view different tab regions, to interact with displayed graphical items (e.g., windows, frames, objects, etc.), to modify (e.g., rotate, resize, etc.) displayed graphical items, etc. As described above, user interface 402 (FIG. 4A) of browser 400 may provide a command-line interface (e.g., a URL address entry box), a GUI, and/or other browser interface with which the user can interact using user input interfaces 420. In embodiments, user input interface 420 may enable users to interact with displayed controls of browser 400 to adjust three-dimensional characteristics of three-dimensional content displayed by browser 400 (e.g., rendered by rendering engine 404). For example, user input interface 420 may enable three-dimensionality of displayed content to be turned on or off (e.g., to toggle between two-dimensionality and three-dimensionality). User input interface 420 may enable a degree of three-dimensionality of displayed content to be modified (e.g., increased or decreased, such as by changing a depth of three-dimensionality, increasing or decreasing a number of supplied camera views, etc.), may enable three-dimensional objects to be rotated in three dimensions, and/or may enable further types of adjustment to three-dimensional characteristics of displayed three-dimensional content. Furthermore, user input interface 420 may enable other characteristics of displayed content to be modified, such as modifying contrast, brightness, etc.
• In embodiments, the user may interact with user input interface 420 in various ways, including using a mouse/pointing device to move a displayed pointer/cursor. The pointer may be used to select control settings. The pointer may be used to “click and drag” objects to move them, to resize objects, to rotate objects, to select controls/settings, to open a pop-up menu, etc. In other embodiments, the user may interact with a keyboard, a thumb wheel or other wheel, a roller ball, a stick pointer, a touch sensitive display, any number of virtual interface elements (e.g., such as a keyboard or other user interface element displayed by screen 620), a voice recognition system, and/or other user interface elements described elsewhere herein or otherwise known to provide user input. For instance, user input interface 420 may support a touch screen that is reactive to user finger touches to the screen to cause three-dimensional characteristics of displayed objects to be modified. For instance, particular motions of one or more fingers against the screen may cause object resizing, 3D rotation, movement in 3D, etc. (e.g., touching two fingers to the screen, and dragging them together, may be interpreted as “grabbing” a window and moving the window in 3D).
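• For purposes of illustration only, the following sketch suggests how such finger motions might be mapped to 3D window actions; the gesture vocabulary and names below are assumptions, not a scheme defined herein:

```python
# Hypothetical mapping of multi-touch input to 3D window actions.

def interpret_gesture(touch_points, drag_vector):
    """Return an (action, argument) pair for a simple touch gesture."""
    if len(touch_points) == 2:
        # Two fingers dragged together: "grab" a window and move it in 3D.
        return ("move_window_3d", drag_vector)
    if len(touch_points) == 1:
        # Single touch: select the object under the finger.
        return ("select_object", touch_points[0])
    return ("ignore", None)
```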
  • In embodiments, users may have preferences with regard to a browser environment upon the browser being activated. Such preferences may include preferences with regard to display of three-dimensional content. For example, a user may desire for a browser to power up in a two-dimensional or three-dimensional display mode, and if a three-dimensional display mode is desired, the user may have particular three-dimensional display preferences (e.g., a preferred degree of displayed three-dimensionality). For instance, the user may desire for the various controls of the browser to be displayed in two- or three-dimensions, may desire all content to be displayed as two-dimensional or three-dimensional by default, may desire particular contents such as advertisements to be displayed as two-dimensional by default, etc.
• Embodiments enable display preferences to be set by users, and to be used to configure the display environments of users upon device boot up, user login, browser activation, etc. For instance, FIG. 13 shows a block diagram of storage 1302 that may be included in an electronic device (e.g., device 412 of FIG. 4A) that includes browser 400, according to an exemplary embodiment. As shown in FIG. 13, storage 1302 stores user browser preferences 1304. User browser preferences 1304 may indicate the preferences that a user has for a browser environment upon the browser being activated, including the browser preferences mentioned above and/or further preferences. User preferences 1304 may be loaded at browser startup, and used (e.g., by rendering engine 404, OS 304 or 432, etc.) to enable the browser environment to be displayed as desired by the user. Storage 1302 may include one or more non-volatile storage elements, such as non-volatile random access memory (RAM) devices (e.g., flash memory, electrically erasable programmable read-only memory, etc.), read only memory (ROM) devices, a hard disk drive, a CDROM (compact disc ROM), a DVD (digital video disc), etc. User preferences 1304 may be associated with a user by being stored in a user account of the user, being stored in a cookie associated with the user, etc.
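• A minimal sketch of persisting and restoring such preferences follows, assuming a simple JSON file; the keys and file name are illustrative, and embodiments may equally store preferences in a user account or a cookie as noted above:

```python
import json

# Illustrative persistence of user browser preferences 1304; the keys
# and defaults below are assumptions, not defined by the patent.
DEFAULTS = {"startup_mode": "3D", "depth": 5, "ads_in_2d": True}

def save_preferences(prefs, path="user_browser_prefs.json"):
    with open(path, "w") as f:
        json.dump(prefs, f)

def load_preferences(path="user_browser_prefs.json"):
    try:
        with open(path) as f:
            return {**DEFAULTS, **json.load(f)}
    except FileNotFoundError:
        return dict(DEFAULTS)  # first run: fall back to defaults
```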
  • C. Example Display Device Screen Embodiments
• Embodiments described herein for browsers that support the display of two-dimensional and three-dimensional content may be implemented with respect to various types of display devices. For example, as described above, some display screens are configured for displaying two-dimensional content, although they may display two-dimensional images that may be combined to form three-dimensional images by special glasses worn by users. Some other types of display screens are capable of displaying two-dimensional content and three-dimensional content using techniques of autostereoscopy, without the users having to wear special glasses. As described above, browser embodiments described herein may generate configuration requests/commands to configure regions of the display screen for display of content, and may provide the content for display in the configured regions. Display drivers (e.g., display driver 306 of FIG. 3, driver variants 434, 436, and 438 of FIG. 4B, etc.) may receive the configuration requests/commands, and may generate control signals to cause the screen to be configured as indicated. Furthermore, the display drivers may supply the content provided by the browsers to the display devices to be displayed on the screen. Example display devices, screens, and display drivers are described as follows that receive the control signals, are configured accordingly, and receive and display the provided content.
• As described above, display devices, such as display device 606, may be implemented in various ways. For instance, display device 606 may be a television display (e.g., an LCD (liquid crystal display) television, a plasma television, etc.), a computer monitor, or any other type of display device. Display device 606 may include any suitable type or combination of light and image generating devices, including an LCD screen, a plasma screen, an LED (light emitting diode) screen (e.g., an OLED (organic LED) screen), etc. Furthermore, display device 606 may include any suitable type of light filtering device, such as a parallax barrier (e.g., an LCD filter, a mechanical filter (e.g., that incorporates individually controllable shutters), etc.) and/or a lenticular lens, and may be configured in any manner, including as a thin-film device (e.g., formed of a stack of thin film layers), etc. Furthermore, display device 606 may include any suitable light emitting device as backlighting, including a panel of LEDs or other light emitting elements.
• For instance, FIG. 14 shows a block diagram of a display device 1400, according to an exemplary embodiment. As shown in FIG. 14, display device 1400 includes a screen 1402. Display device 1400 is an example of display device 606 and screen 1402 is an example of screen 620 described above (e.g., with respect to FIG. 6). Device 1400 receives one or more control signals 1406 (e.g., from browser 400) that are configured to place screen 1402 in a desired display mode (e.g., either a two-dimensional display mode or a three-dimensional display mode). As shown in FIG. 14, screen 1402 includes a light manipulator 1404. Light manipulator 1404 is configured to manipulate light that passes through light manipulator 1404 to enable three-dimensional images to be delivered to users in a viewing space. For instance, control signal(s) 1406 may be configured to activate or deactivate light manipulator 1404 to place screen 1402 in a three-dimensional display mode or a two-dimensional display mode, respectively.
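• For purposes of illustration, a minimal sketch of building such a control signal follows; the field names are hypothetical, as the patent does not define a concrete signal format:

```python
def build_mode_control_signal(mode):
    """Sketch of control signal(s) 1406: a 3D mode activates the light
    manipulator, while a 2D mode deactivates it."""
    if mode not in ("2D", "3D"):
        raise ValueError("unsupported display mode: " + mode)
    return {"light_manipulator_active": mode == "3D"}
```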
• Examples of light manipulator 1404 include a parallax barrier and a lenticular lens. For instance, light manipulator 1404 may be a parallax barrier that has a layer of material with a series of precision slits. The parallax barrier is placed proximal to a light emitting pixel array so that a user's eyes each see a different set of pixels to create a sense of depth through parallax. In another embodiment, light manipulator 1404 may be a lenticular lens that includes an array of magnifying lenses configured so that when viewed from slightly different angles, different images are magnified. Such a lenticular lens may be used to deliver light from a different set of pixels of a pixel array to each of the user's eyes to create a sense of depth. Embodiments are applicable to display devices that include such light manipulators, that include other types of light manipulators, and that may include multiple light manipulators.
• As shown in FIG. 14, display device 1400 receives a content signal 1408 (e.g., from device 412 of FIG. 4A, or other electronic device). Content signal 1408 includes two-dimensional or three-dimensional content for display by screen 1402, depending on the particular display mode. In the embodiment of FIG. 14, light manipulator 1404 is physically fixed, and thus is not adaptable. As such, when present, light manipulator 1404 (e.g., a fixed parallax barrier or a fixed lenticular lens) always delivers three-dimensional images of a particular type to a particular region in a viewing space. As such, light manipulator 1404 is not adaptable to deliver other types of three-dimensional images and/or to deliver two- and/or three-dimensional images to multiple different regions of a viewing space.
• In contrast, FIG. 15 shows a block diagram of a display device 1500 that is adaptable, according to an exemplary embodiment. As shown in FIG. 15, display device 1500 includes a screen 1502. Display device 1500 is an example of display device 606 and screen 1502 is an example of screen 620 described above (e.g., with respect to FIG. 6). Furthermore, as shown in FIG. 15, screen 1502 includes an adaptable light manipulator 1504. Adaptable light manipulator 1504 is configured to manipulate light that passes through adaptable light manipulator 1504 to enable three-dimensional images to be delivered to users in a viewing space. Furthermore, adaptable light manipulator 1504 is adaptable, in that it is not physically fixed in configuration. As such, adaptable light manipulator 1504 is adaptable to deliver multiple different types of three-dimensional images and/or to deliver three-dimensional images to different/moving regions of a viewing space. Furthermore, in an embodiment, different regions of adaptable light manipulator 1504 may be adaptable such that multiple two-dimensional and/or three-dimensional images may be simultaneously delivered by screen 1502 to the viewing space.
  • Device 1500 receives one or more control signals 1506 (e.g., from browser 400) that are configured to place screen 1502 in a desired display mode (e.g., either a two-dimensional display mode or a three-dimensional display mode), and/or to configure three-dimensional characteristics of any number and type as described above, such as configuring adaptable light manipulator 1504 to deliver different types of three-dimensional images, to deliver three-dimensional images to different/moving regions of a viewing space, and to deliver two-dimensional and/or three-dimensional images from any number of regions of screen 1502 to the viewing space.
• As shown in FIG. 15, display device 1500 receives a content signal 1508 (e.g., from device 412 of FIG. 4A, or other electronic device). Content signal 1508 includes two-dimensional and/or three-dimensional content for display by screen 1502, depending on the particular display mode and on the number of regions of screen 1502 that are delivering different two- or three-dimensional views to a viewing space.
• Content signals 1408 and 1508 may include video content according to any suitable format. For example, content signals 1408 and 1508 may include video content delivered over an HDMI (High-Definition Multimedia Interface) interface, over a coaxial cable, as composite video, as S-Video, over a VGA (video graphics array) interface, etc. Note that control signals 1406 and 1506 may be provided to the display devices separately from, or in a same signal stream as, their corresponding one of content signals 1408 and 1508.
  • Exemplary embodiments for display devices 1400 and 1500 of FIGS. 14 and 15 are described as follows for purposes of illustration.
  • 1. Exemplary Embodiments Using Parallax Barriers
• Display devices 1400 and 1500 may include parallax barriers as light manipulators 1404 and 1504, respectively. For instance, FIG. 16 shows a block diagram of a display system 1600, which is an example of display device 606, according to an embodiment. As shown in FIG. 16, system 1600 includes a display device driver circuit 1602, an image generator 1612, and a parallax barrier 1620. As shown in FIG. 16, image generator 1612 includes a pixel array 1608, and parallax barrier 1620 includes a barrier element array 1610. Furthermore, as shown in FIG. 16, display driver circuit 1602 includes a pixel array driver circuit 1604 and a barrier array driver circuit 1606. These features of system 1600 are described as follows.
  • Pixel array 1608 includes a two-dimensional array of pixels (e.g., arranged in a grid or other distribution). Pixel array 1608 is a self-illuminating or light-generating pixel array such that the pixels of pixel array 1608 each emit light included in light 1652 emitted from image generator 1612. Each pixel may be a separately addressable light source (e.g., a pixel of a plasma display, an LCD display, an LED display such as an OLED display, or of other type of display). Each pixel of pixel array 1608 may be individually controllable to vary color and intensity. In an embodiment, each pixel of pixel array 1608 may include a plurality of sub-pixels that correspond to separate color channels, such as a trio of red, green, and blue sub-pixels included in each pixel.
• Parallax barrier 1620 is positioned proximate to a surface of pixel array 1608. Barrier element array 1610 is a layer of parallax barrier 1620 that includes a plurality of barrier elements or blocking regions arranged in an array. Each barrier element of the array is configured to be selectively opaque or transparent. Combinations of barrier elements may be configured to be selectively opaque or transparent to enable various effects. For example, in one embodiment, each barrier element may have a round, square, or rectangular shape, and barrier element array 1610 may have any number of rows of barrier elements that extend a vertical length of barrier element array 1610. In another embodiment, each barrier element may have a “band” shape that extends a vertical length of barrier element array 1610, such that barrier element array 1610 includes a single horizontal row of barrier elements. Each barrier element may include one or more of such bands, and different regions of barrier element array 1610 may include barrier elements that include different numbers of such bands.
• One advantage of such a configuration where barrier elements extend a vertical length of barrier element array 1610 is that such barrier elements do not need to have spacing between them, because there is no need for drive signal routing in such space. For instance, in a two-dimensional LCD array configuration, such as a TFT (thin film transistor) display, a transistor-plus-capacitor circuit is typically placed at the corner of a single pixel in the array, and drive signals for such transistors are routed between the LCD pixels (row-column control, for example). In a pixel configuration for a parallax barrier, local transistor control may not be necessary because barrier elements may not need to change as rapidly as display pixels (e.g., pixels of pixel array 1608). For a single row of vertical bands of barrier elements, drive signals may be routed to the top and/or bottom of the barrier elements. Because drive signal routing between rows is not needed in such a configuration, the vertical bands can be arranged side-by-side with little-to-no space in between. Thus, if the vertical bands are thin and oriented edge-to-edge, one band or multiple adjacent bands (e.g., five bands) may comprise a barrier element in a blocking state, followed by one band or multiple adjacent bands (e.g., two bands) that comprise a barrier element in a non-blocking state (a slit), and so on. In the example of five bands in a blocking state and two bands in a non-blocking state, the five bands may combine to form a single black barrier element of approximately 2.5 times the width of a single transparent slit, with no spaces therein.
  • It is noted that in some embodiments, barrier elements may be capable of being completely transparent or opaque, and in other embodiments, barrier elements may not be capable of being fully transparent or opaque. For instance, such barrier elements may be capable of being 95% transparent when considered to be “transparent” and may be capable of being 5% transparent when considered to be “opaque.” “Transparent” and “opaque” as used herein are intended to encompass barrier elements being substantially transparent (e.g., greater than 75% transparent, including completely transparent) and substantially opaque (e.g., less than 25% transparent, including completely opaque), respectively.
  • Display driver circuit 1602 receives control signal 1622 and content signal 1624. As described below, content signal 1624 includes two-dimensional and/or three-dimensional content for display. Control signal 1622 may be control signal 1406 of FIG. 14 (for a non-adaptable parallax barrier 1620) or may be control signal 1506 of FIG. 15 (for an adaptable parallax barrier 1620). Control signal 1622 may be received from a display driver of an operating system (e.g., may be control signal 618 received from display driver 604 in FIG. 6). Display driver circuit 1602 is configured to generate drive signals based on control signal 1622 and content signal 1624 to enable display system 1600 to display two-dimensional and three-dimensional images to users 1618 in viewing space 1670. For example, pixel array driver circuit 1604 is configured to generate a drive signal 1614 that is received by pixel array 1608 (e.g., based on content signal 1624 and/or control signal 1622). Drive signal 1614 may include one or more drive signals used to cause pixels of pixel array 1608 to emit light 1652 of particular desired colors and/or intensity. Barrier array driver circuit 1606 is configured to generate a drive signal 1616 that is received by barrier element array 1610 (e.g., based on control signal 1622). Drive signal 1616 may include one or more drive signals used to cause each of the barrier elements of barrier element array 1610 to be transparent or opaque. In this manner, barrier element array 1610 filters light 1652 to generate filtered light 1672 that includes one or more two-dimensional and/or three-dimensional images that may be viewed by users 1618 in viewing space 1670. Example further description of implementations of the display driver circuits described herein is provided in pending U.S. patent application Ser. No. ______, titled “Integrated Backlighting, Sub-Pixel and Display Driver Circuitry Supporting Adaptive 2D, Stereoscopic 3D and Multi-View 3D Displays,” filed on same date herewith, which is incorporated by reference herein in its entirety, although the driver circuits described herein are not limited to such implementations.
  • For example, drive signal 1614 may control sets of pixels of pixel array 1608 to each emit light representative of a respective image, to provide a plurality of images. Drive signal 1616 may control barrier elements of barrier element array 1610 to filter the light received from pixel array 1608 according to the provided images such that one or more of the images are received by users 1618 in two-dimensional form. For instance, drive signal 1616 may select one or more sets of barrier elements of barrier element array 1610 to be transparent, to transmit one or more corresponding two-dimensional images or views to users 1618. Furthermore, drive signal 1616 may control sections of barrier element array 1610 to include opaque and transparent barrier elements to filter the light received from pixel array 1608 so that one or more pairs of images or views provided by pixel array 1608 are each received by users 1618 as a corresponding three-dimensional image or view. For example, drive signal 1616 may select parallel strips of barrier elements of barrier element array 1610 to be transparent to form slits that enable three-dimensional images to be received by users 1618.
  • In embodiments, drive signal 1616 may be generated by barrier array driver circuit 1606 to configure one or more characteristics of barrier element array 1610. For example, drive signal 1616 may be generated to form any number of parallel strips of barrier elements of barrier element array 1610 to be transparent, to modify the number and/or spacing of parallel strips of barrier elements of barrier element array 1610 that are transparent, to select and/or modify a width and/or a length (in barrier elements) of one or more strips of barrier elements of barrier element array 1610 that are transparent or opaque, to select and/or modify an orientation of one or more strips of barrier elements of barrier element array 1610 that are transparent, to select one or more areas of barrier element array 1610 to include all transparent or all opaque barrier elements, etc.
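• For purposes of illustration only, the following Python sketch (with hypothetical names, not drawn from the patent) shows how a drive-signal generator such as barrier array driver circuit 1606 might compute an alternating pattern of transparent and opaque barrier element states for one row of barrier element array 1610:

```python
# Minimal sketch: True marks a transparent (non-blocking) barrier element,
# False an opaque (blocking) one; widths are in barrier elements. All
# names are illustrative assumptions.

def barrier_pattern(num_elements, slit_width, barrier_width, offset=0):
    """One row of barrier states: slits of slit_width alternating with
    blocking strips of barrier_width, optionally shifted by offset."""
    period = slit_width + barrier_width
    return [((i + offset) % period) < slit_width for i in range(num_elements)]

# For example, barrier_pattern(20, 2, 2) alternates two-element slits with
# two-element blocking strips, the geometry later shown in FIG. 20.
```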
• FIG. 17 shows a block diagram of a display system 1700, which is another example of display device 1500 of FIG. 15, according to an embodiment. As shown in FIG. 17, system 1700 includes display device driver circuit 1602, a pixel array 1722, parallax barrier 1620, and backlighting 1716. Parallax barrier 1620 includes barrier element array 1610, and backlighting 1716 includes a light element array 1736. Furthermore, display driver circuit 1602 includes a pixel array driver circuit 1728, barrier array driver circuit 1606, and a light source driver circuit 1730. These features of system 1700 are described as follows.
  • Backlighting 1716 is a backlight panel that emits light 1738. Light element array 1736 (or “backlight array”) of backlighting 1716 includes a two-dimensional array of light sources. Such light sources may be arranged, for example, in a rectangular grid. Each light source in light element array 1736 is individually addressable and controllable to select an amount of light emitted thereby. A single light source may comprise one or more light-emitting elements depending upon the implementation. In one embodiment, each light source in light element array 1736 comprises a single light-emitting diode (LED) although this example is not intended to be limiting. Further description of implementations of backlighting 1716 and other backlighting implementations described herein is provided in pending U.S. patent application Ser. No. ______, titled “Backlighting Array Supporting Adaptable Parallax Barrier,” filed on same date herewith, which is incorporated by reference herein in its entirety.
  • Parallax barrier 1620 is positioned proximate to a surface of backlighting 1716 (e.g., a surface of the backlight panel). As described above, barrier element array 1610 is a layer of parallax barrier 1620 that includes a plurality of barrier elements or blocking regions arranged in an array. Each barrier element of the array is configured to be selectively opaque or transparent. Barrier element array 1610 filters light 1738 received from backlighting 1716 to generate filtered light 1740. Filtered light 1740 is configured to enable a two-dimensional image or a three-dimensional image (e.g., formed by a pair of two-dimensional images in filtered light 1672) to be formed based on images subsequently imposed on filtered light 1740 by pixel array 1722.
  • Similarly to pixel array 1608 of FIG. 16, pixel array 1722 of FIG. 17 includes a two-dimensional array of pixels (e.g., arranged in a grid or other distribution). However, pixel array 1722 is not self-illuminating, and instead is a light filter that imposes images (e.g., in the form of color, grayscale, etc.) on filtered light 1740 from parallax barrier 1620 to generate filtered light 1672 to include one or more images. Each pixel of pixel array 1722 may be a separately addressable filter (e.g., a pixel of a plasma display, an LCD display, an LED display, or of other type of display). Each pixel of pixel array 1722 may be individually controllable to vary the color imposed on the corresponding light passing through, and/or to vary the intensity of the passed light in filtered light 1672. In an embodiment, each pixel of pixel array 1722 may include a plurality of sub-pixels that correspond to separate color channels, such as a trio of red, green, and blue sub-pixels included in each pixel.
  • Display driver circuit 1602 of FIG. 17 is configured to generate drive signals based on control signal 1622 and/or content signal 1624 to enable display system 1700 to display two-dimensional and three-dimensional images to users 1618 in viewing space 1670. For example, light source driver circuit 1730 within display driver circuit 1602 controls the amount of light emitted by each light source in light element array 1736 by generating a drive signal 1734 that is received by light element array 1736 (based on content signal 1624 and/or control signal 1622). Drive signal 1734 may include one or more drive signals used to control the amount of light emitted by each light source in light element array 1736 to generate light 1738. As described above, barrier array driver circuit 1606 is configured to generate drive signal 1616 received by barrier element array 1610 (e.g., based on control signal 1622). Drive signal 1616 may include one or more drive signals used to cause each of the barrier elements of barrier element array to be transparent or opaque, to filter light 1738 to generate filtered light 1740. Pixel array driver circuit 1728 is configured to generate a drive signal 1732 that is received by pixel array 1722 (e.g., based on content signal 1624 and/or control signal 1622). Drive signal 1732 may include one or more drive signals used to cause pixels of pixel array 1722 to impose desired images (e.g., colors, grayscale, etc.) on filtered light 1740 as it passes through pixel array 1722. In this manner, pixel array 1722 generates filtered light 1672 that includes one or more two-dimensional and/or three-dimensional images that may be viewed by users 1618 in viewing space 1670.
• For example, drive signal 1734 may control sets of light sources of light element array 1736 to emit light 1738. Drive signal 1616 may control barrier elements of barrier element array 1610 to filter light 1738 received from light element array 1736 to generate filtered light 1740 that enables two- and/or three-dimensionality. Drive signal 1732 may control sets of pixels of pixel array 1722 to impose respective images on filtered light 1740, to provide a plurality of images. For instance, drive signal 1616 may select one or more sets of the barrier elements of barrier element array 1610 to be transparent, to enable one or more corresponding two-dimensional images to be delivered to users 1618. Furthermore, drive signal 1616 may control sections of barrier element array 1610 to include opaque and transparent barrier elements to filter the light received from light element array 1736 so that one or more pairs of images provided by pixel array 1722 are each enabled to be received by users 1618 as a corresponding three-dimensional image. For example, drive signal 1616 may select parallel strips of barrier elements of barrier element array 1610 to be transparent to form slits that enable three-dimensional images to be received by users 1618.
  • FIG. 18 shows a flowchart 1800 for generating images that are delivered to users in a viewing space, according to an exemplary embodiment. Flowchart 1800 may be performed by system 1600 in FIG. 16 or system 1700 of FIG. 17, for example. Flowchart 1800 is described with respect to FIG. 19, which shows a cross-sectional view of a display system 1900. Display system 1900 is an exemplary embodiment of system 1600 shown in FIG. 16, and is shown for purposes of illustration. As shown in FIG. 19, system 1900 includes a pixel array 1902 and a barrier element array 1904. In another embodiment, system 1900 may further include backlighting in a configuration similar to display system 1700 of FIG. 17. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 1800. Flowchart 1800 is described as follows.
  • Flowchart 1800 begins with step 1802. In step 1802, light is received at an array of barrier elements. For example, as shown in FIG. 16, light 1652 is received at parallax barrier 1620 from pixel array 1608. Each pixel of pixel array 1608 may generate light that is received at parallax barrier 1620. Depending on the particular display mode of parallax barrier 1620, parallax barrier 1620 may filter light 1652 from pixel array 1608 to generate a two-dimensional image or a three-dimensional image viewable in viewing space 1670 by users 1618. As described above with respect to FIG. 17, alternatively, light 1738 may be received by parallax barrier 1620 from light element array 1736.
  • In step 1804, a first set of the barrier elements of the array of barrier elements is configured in the blocking state and a second set of the barrier elements of the array of barrier elements is configured in the non-blocking state to enable a viewer to be delivered a three-dimensional view. Three-dimensional image content may be provided for viewing in viewing space 1670. In such case, referring to FIG. 16 or 17, barrier array driver circuit 1606 may generate drive signal 1616 to configure barrier element array 1610 to include transparent strips of barrier elements to enable a three-dimensional view to be formed. For example, as shown in FIG. 19, barrier element array 1904 includes a plurality of barrier elements that are each either transparent (in a non-blocking state) or opaque (in a blocking state). Barrier elements that are blocking are indicated as barrier elements 1910 a-1910 f, and barrier elements that are non-blocking are indicated as barrier elements 1912 a-1912 e. Further barrier elements may be included in barrier element array 1904 that are not visible in FIG. 19. Each of barrier elements 1910 a-1910 f and 1912 a-1912 e may include one or more barrier elements. Barrier elements 1910 alternate with barrier elements 1912 in series in the order of barrier elements 1910 a, 1912 a, 1910 b, 1912 b, 1910 c, 1912 c, 1910 d, 1912 d, 1910 e, 1912 e, and 1910 f. In this manner, blocking barrier elements 1910 are alternated with non-blocking barrier elements 1912 to form a plurality of parallel non-blocking or transparent slits in barrier element array 1904.
• For instance, FIG. 20 shows a view of a parallax barrier 2000 with transparent slits, according to an exemplary embodiment. Parallax barrier 2000 is an example of parallax barrier 1620 of FIGS. 16 and 17. As shown in FIG. 20, parallax barrier 2000 includes barrier element array 2002, which includes a plurality of barrier elements 2004 arranged in a two-dimensional array. Furthermore, as shown in FIG. 20, barrier element array 2002 includes a plurality of parallel strips of barrier elements 2004 that are selected to be non-blocking to form a plurality of parallel non-blocking strips (or “slits”) 2006 a-2006 g. As shown in FIG. 20, parallel non-blocking strips 2006 a-2006 g (non-blocking slits) are alternated with parallel blocking strips 2008 a-2008 g of barrier elements 2004 that are selected to be blocking. In the example of FIG. 20, non-blocking strips 2006 a-2006 g and blocking strips 2008 a-2008 g each have a width (along the x-dimension) of two barrier elements 2004, and have lengths that extend along the entire y-dimension (twenty barrier elements 2004) of barrier element array 2002, although in other embodiments they may have alternative dimensions. Non-blocking strips 2006 a-2006 g and blocking strips 2008 a-2008 g form a parallax barrier configuration for parallax barrier 2000. The spacing (and number) of parallel non-blocking strips 2006 in barrier element array 2002 may be selectable by choosing any number and combination of particular strips of barrier elements 2004 in barrier element array 2002 to be non-blocking, to be alternated with blocking strips 2008, as desired. For example, hundreds, thousands, or even larger numbers of non-blocking strips 2006 and blocking strips 2008 may be present in parallax barrier 2000.
• FIG. 21 shows a parallax barrier 2100 that is another example of parallax barrier 1620 with parallel transparent slits, according to an embodiment. Similarly to parallax barrier 2000 of FIG. 20, parallax barrier 2100 includes a barrier element array 2112, which includes a plurality of barrier elements 2114 arranged in a two-dimensional array (a 28 by 1 array). Barrier elements 2114 have widths (along the x-dimension) similar to the widths of barrier elements 2004 in FIG. 20, but have lengths that extend along the entire vertical length (y-dimension) of barrier element array 2112. As shown in FIG. 21, barrier element array 2112 includes parallel non-blocking strips 2006 a-2006 g alternated with parallel blocking strips 2008 a-2008 g. In the example of FIG. 21, parallel non-blocking strips 2006 a-2006 g and parallel blocking strips 2008 a-2008 g each have a width (along the x-dimension) of two barrier elements 2114, and have lengths that extend along the entire y-dimension (one barrier element 2114) of barrier element array 2112.
• Referring back to FIG. 18, in step 1806, the light is filtered at the array of barrier elements to form the three-dimensional view in a viewing space. Barrier element array 1610 of parallax barrier 1620 is configured to filter light 1652 received from pixel array 1608 (FIG. 16) or light 1738 received from light element array 1736 (FIG. 17) according to whether barrier element array 1610 is transparent or non-blocking (e.g., in a two-dimensional mode) or includes parallel non-blocking strips (e.g., in a three-dimensional mode). If one or more regions of barrier element array 1610 are transparent, those regions of barrier element array 1610 function as “all pass” filters to substantially pass all of light 1652 as filtered light 1672 to deliver one or more corresponding two-dimensional images generated by pixel array 1608 to viewing space 1670, to be viewable as two-dimensional images in a similar fashion as a conventional display. If barrier element array 1610 includes one or more regions having parallel non-blocking strips (e.g., as shown for barrier element array 2002 in FIGS. 20 and 21), those regions of barrier element array 1610 pass a portion of light 1652 as filtered light 1672 to deliver one or more corresponding three-dimensional images to viewing space 1670.
  • For example, as shown in FIG. 19, pixel array 1902 includes a plurality of pixels 1914 a-1914 d and 1916 a-1916 d. Pixels 1914 alternate with pixels 1916, such that pixels 1914 a-1914 d and 1916 a-1916 d are arranged in series in the order of pixels 1914 a, 1916 a, 1914 b, 1916 b, 1914 c, 1916 c, 1914 d, and 1916 d. Further pixels may be included in pixel array 1902 that are not visible in FIG. 19, including further pixels along the width dimension of pixel array 1902 (e.g., in the left-right directions) as well as pixels along a length dimension of pixel array 1902 (not visible in FIG. 19). Each of pixels 1914 a-1914 d and 1916 a-1916 d generates light, which emanates from display surface 1924 of pixel array 1902 (e.g., generally upward in FIG. 19) towards barrier element array 1904. Some example indications of light emanating from pixels 1914 a-1914 d and 1916 a-1916 d are shown in FIG. 19 (as dotted lines), including light 1924 a and light 1918 a emanating from pixel 1914 a, light 1924 b, light 1918 b, and light 1924 c emanating from pixel 1914 b, etc.
  • Furthermore, light emanating from pixel array 1902 is filtered by barrier element array 1904 to form a plurality of images in a viewing space 1926, including a first image 1906 a at a first location 1908 a and a second image 1906 b at a second location 1908 b. A portion of the light emanating from pixel array 1902 is blocked by blocking barrier elements 1910, while another portion of the light emanating from pixel array 1902 passes through non-blocking barrier elements 1912, according to the filtering by barrier element array 1904. For instance, light 1924 a from pixel 1914 a is blocked by blocking barrier element 1910 a, and light 1924 b and light 1924 c from pixel 1914 b are blocked by blocking barrier elements 1910 b and 1910 c, respectively. In contrast, light 1918 a from pixel 1914 a is passed by non-blocking barrier element 1912 a and light 1918 b from pixel 1914 b is passed by non-blocking barrier element 1912 b.
  • By forming parallel non-blocking slits in a barrier element array, light from a pixel array can be filtered to form multiple images or views in a viewing space. For instance, system 1900 shown in FIG. 19 is configured to form first and second images 1906 a and 1906 b at locations 1908 a and 1908 b, respectively, which are positioned at a distance 1928 from pixel array 1902 (as shown in FIG. 19, further instances of first and second images 1906 a and 1906 b may be formed in viewing space 1926 according to system 1900, in a repeating, alternating fashion). As described above, pixel array 1902 includes a first set of pixels 1914 a-1914 d and a second set of pixels 1916 a-1916 d. Pixels 1914 a-1914 d correspond to first image 1906 a and pixels 1916 a-1916 d correspond to second image 1906 b. Due to the spacing of pixels 1914 a-1914 d and 1916 a-1916 d in pixel array 1902, and the geometry of non-blocking barrier elements 1912 in barrier element array 1904, first and second images 1906 a and 1906 b are formed at locations 1908 a and 1908 b, respectively. As shown in FIG. 19, light 1918 a-1918 d from the first set of pixels 1914 a-1914 d is focused at location 1908 a to form first image 1906 a at location 1908 a. Light 1920 a-1920 d from the second set of pixels 1916 a-1916 d is focused at location 1908 b to form second image 1906 b at location 1908 b.
  • FIG. 19 shows a slit spacing 1922 (center-to-center) of non-blocking barrier elements 1912 in barrier element array 1904. Spacing 1922 may be determined to select locations for parallel non-blocking slits to be formed in barrier element array 1904 for a particular image distance 1928 at which images are desired to be formed (for viewing by users). For example, in an embodiment, if a spacing of pixels 1914 a-1914 d corresponding to an image is known, and a distance 1928 at which the image is desired to be displayed is known, the spacing 1922 between adjacent parallel non-blocking slits in barrier element array 1904 may be selected.
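• As a worked illustration of this geometric selection, the following sketch applies the standard two-view similar-triangles relationships, which are stated here as an assumption for illustration since the patent describes the relationship only qualitatively:

```python
def barrier_geometry(pixel_pitch, eye_separation, viewing_distance):
    """Textbook two-view parallax barrier geometry (illustrative only).

    pixel_pitch:      spacing between adjacent left/right-view pixels
    eye_separation:   viewer interocular distance
    viewing_distance: desired image distance (cf. distance 1928)
    Returns (gap, slit_pitch): the pixel-to-barrier gap, and the
    center-to-center slit spacing (cf. spacing 1922), via similar triangles.
    """
    gap = pixel_pitch * viewing_distance / eye_separation
    slit_pitch = 2 * pixel_pitch * viewing_distance / (viewing_distance + gap)
    return gap, slit_pitch

# e.g., 0.1 mm pixel pitch, 65 mm eye separation, 600 mm viewing distance:
# gap ~ 0.92 mm, slit pitch ~ 0.1997 mm (slightly under two pixel pitches).
```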
• First and second images 1906 a and 1906 b are configured to be perceived by a user as a three-dimensional image or view. For example, a viewer may receive first image 1906 a at a first eye location and second image 1906 b at a second eye location, according to an exemplary embodiment. First and second images 1906 a and 1906 b may be generated by first set of pixels 1914 a-1914 d and second set of pixels 1916 a-1916 d as images having slightly different perspectives from each other. Images 1906 a and 1906 b are combined in the visual center of the brain of the viewer to be perceived as a three-dimensional image or view. In such an embodiment, first and second images 1906 a and 1906 b may be formed by display system 1900 such that their centers are spaced apart by approximately the width between a user's pupils (e.g., an “interocular distance”).
• Note that in the embodiments of FIGS. 20 and 21, the entire regions of parallax barriers 2000 and 2100 are filled with parallel non-blocking strips alternated with blocking strips (e.g., as shown for barrier element array 2002 in FIGS. 20 and 21), configuring them to deliver three-dimensional images to viewing space 1670. In further embodiments, one or more regions of a parallax barrier may be filled with parallel non-blocking strips to deliver three-dimensional images, and one or more other regions of the parallax barrier may be transparent to deliver two-dimensional images. Furthermore, different regions of a parallax barrier that have parallel non-blocking strips may have the parallel non-blocking strips oriented at different angles to deliver three-dimensional images to viewers that are oriented differently.
• For instance, FIG. 22 shows a view of a parallax barrier 2200 configured to enable the simultaneous display of two-dimensional and three-dimensional images at different regions, according to exemplary embodiments. Parallax barrier 2200 is similar to parallax barrier 2000 of FIG. 20, having barrier element array 2002 including a plurality of barrier elements 2004 arranged in a two-dimensional array. In FIG. 22, a first region 2202 of barrier element array 2002 includes a plurality of parallel non-blocking strips alternated with parallel blocking strips that together fill first region 2202. A second region 2204 of barrier element array 2002 is surrounded by first region 2202. Second region 2204 is a rectangular shaped region of barrier element array 2002 that includes a two-dimensional array of barrier elements 2004 that are non-blocking. Thus, in FIG. 22, barrier element array 2002 is configured to enable a three-dimensional image to be generated by pixels of a pixel array that are adjacent to barrier elements of first region 2202, and to enable a two-dimensional image to be generated by pixels of the pixel array that are adjacent to barrier elements inside of second region 2204. Note that alternatively, first region 2202 may include all non-blocking barrier elements 2004 to pass a two-dimensional image, and second region 2204 may include parallel non-blocking strips alternated with parallel blocking strips to pass a three-dimensional image. In further embodiments, parallax barrier 2200 may have additional numbers, sizes, and arrangements of regions configured to pass different combinations of two-dimensional images and three-dimensional images.
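• For purposes of illustration only, the following sketch (hypothetical names; a simplification of the FIG. 22 arrangement) computes a full two-dimensional grid of barrier states with a slit pattern everywhere except an all-transparent rectangular window reserved for 2D content:

```python
def regional_barrier(width, height, window, slit_width=2, barrier_width=2):
    """2D boolean grid of barrier states (True = transparent): a slit
    pattern fills the array except an all-transparent rectangular window
    (x0, y0, x1, y1), cf. second region 2204 inside first region 2202."""
    x0, y0, x1, y1 = window
    period = slit_width + barrier_width
    return [
        [True if x0 <= x < x1 and y0 <= y < y1 else (x % period) < slit_width
         for x in range(width)]
        for y in range(height)
    ]
```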
  • In another example, FIG. 23 shows a view of a parallax barrier 2300 with transparent slits having different orientations, according to an exemplary embodiment. Parallax barrier 2300 is similar to parallax barrier 2000 of FIG. 20, having barrier element array 2002 including a plurality of barrier elements 2004 arranged in a two-dimensional array. A first region 2310 (e.g., a bottom half) of barrier element array 2002 includes a first plurality of parallel strips of barrier elements 2004 that are selected to be non-blocking to form a first plurality of parallel non-blocking strips 2302 a-2302 e (each having a width of two barrier elements 2004). As shown in FIG. 23, parallel non-blocking strips 2302 a-2302 e are alternated with parallel blocking strips 2304 a-2304 f of barrier elements 2004 (each having a width of three barrier elements 2004). Parallel non-blocking strips 2302 a-2302 e are oriented in a first direction (e.g., along a vertical axis).
  • Furthermore, as shown in FIG. 23, a second region 2312 (e.g., a top half) of barrier element array 2002 includes a second plurality of parallel strips of barrier elements 2004 that are selected to be non-blocking to form a second plurality of parallel non-blocking strips 2306 a-2306 d (each having a width of one barrier element 2004). As shown in FIG. 23, parallel non-blocking strips 2306 a-2306 d are alternated with parallel blocking strips 2308 a-2308 c of barrier elements 2004 (each having a width of two barrier elements 2004). Parallel non-blocking strips 2306 a-2306 d are oriented in a second direction (e.g., along a horizontal axis).
  • As such, in FIG. 23, first and second pluralities of parallel non-blocking strips 2302 a-2302 e and 2306 a-2306 d are present in barrier element array 2002 that are oriented perpendicularly to each other. The region of barrier element array 2002 that includes first plurality of parallel non-blocking strips 2302 a-2302 e may be configured to deliver a three-dimensional image in a viewing space (as described above) to be viewable by a user whose body is oriented vertically (e.g., sitting upright or standing up). The region of barrier element array 2002 that includes second plurality of parallel non-blocking strips 2306 a-2306 d may be configured to deliver a three-dimensional image in a viewing space (as described above) to be viewable by a user whose body is oriented horizontally (e.g., laying down). In this manner, users who are oriented differently relative to each other can still each be provided with a corresponding three-dimensional image that accommodates their position.
• As described above, in an embodiment, display device 1500 of FIG. 15 may be configured to generate a two-dimensional image for viewing by users in a viewing space. For instance, referring to FIGS. 16 and 17, barrier element array 1610 may be configured into a third configuration to deliver a two-dimensional view. In the third configuration, barrier array driver circuit 1606 may generate drive signal 1616 to configure each barrier element of barrier element array 1610 to be in the non-blocking state (transparent). If barrier element array 1610 is non-blocking, barrier element array 1610 functions as an “all pass” filter to substantially pass all of light 1652 (FIG. 16) or light 1738 (FIG. 17) as filtered light 1672 to deliver the two-dimensional image to viewing space 1670, to be viewable as a two-dimensional image in a similar fashion as a conventional display.
• In embodiments, display systems may be configured to generate multiple two-dimensional images or views for viewing by users in a viewing space. For example, FIG. 24 shows a display system 2400 configured to deliver two two-dimensional images, according to an embodiment. Display system 2400 is configured similarly to display system 1900 of FIG. 19. As shown in FIG. 24, display system 2400 includes pixel array 1902 and barrier element array 1904, which generate first and second images 2402 a and 2402 b. As shown in FIG. 24, a first viewer 2404 a receives first image 2402 a at a first location and a second viewer 2404 b receives second image 2402 b at a second location, according to an exemplary embodiment. Similarly to the description provided above with respect to FIG. 19, first and second images 2402 a and 2402 b may be generated by first set of pixels 1914 a-1914 d and second set of pixels 1916 a-1916 d of pixel array 1902. However, rather than first and second images 2402 a and 2402 b being images of different perspective, first and second images 2402 a and 2402 b are each a two-dimensional image that may be viewed independently from each other. For instance, image 2402 a and image 2402 b may be generated by display system 2400 from first media content and second media content, respectively, that are independent of each other. Image 2402 a may be received by both eyes of first viewer 2404 a to be perceived by first viewer 2404 a as a first two-dimensional image, and image 2402 b may be received by both eyes of second viewer 2404 b to be perceived by second viewer 2404 b as a second two-dimensional image. Thus, first and second images 2402 a and 2402 b may be generated to have a spacing that enables them to be separately viewed by first and second viewers 2404 a and 2404 b.
• As such, display system 2400 of FIG. 24 can be configured to deliver a single three-dimensional view to a viewer (e.g., as shown in FIG. 19 for display system 1900), to deliver a pair of two-dimensional views to a pair of viewers (e.g., as shown in FIG. 24), or to deliver a pair of three-dimensional views to a pair of viewers (e.g., as described above). Display system 2400 can be configured to switch between delivering views to one and two viewers by turning off or turning on, respectively, the display of media content by pixel array 1902 associated with one of the viewers (e.g., by turning off or on pixels 1916 associated with second image 2402 b). Display system 2400 can be configured to switch between delivering two-dimensional and three-dimensional views by providing the corresponding media content type at pixel array 1902. Furthermore, display system 2400 may provide such capabilities when configured similarly to display system 1700 shown in FIG. 17 (e.g., including backlighting 1716).
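• For purposes of illustration, the following sketch (hypothetical names; a simplification that ignores barrier geometry and resolution loss) shows the column interleaving of two independent media contents onto a single pixel array, as used to deliver first and second images 2402 a and 2402 b:

```python
def interleave_two_views(image_a, image_b):
    """Column-interleave two equally sized images (nested row lists) so
    alternating pixel columns carry independent content for two viewers."""
    return [
        [a if x % 2 == 0 else b for x, (a, b) in enumerate(zip(row_a, row_b))]
        for row_a, row_b in zip(image_a, image_b)
    ]

# Turning one viewer's content off (cf. disabling pixels 1916) amounts to
# writing a blank value into that viewer's columns instead.
```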
  • In an embodiment, display system 1900 may be configured to generate multiple three-dimensional images that include related image content (e.g., each three-dimensional image is a different viewpoint of a common scene), or that each include unrelated image content, for viewing by users in a viewing space. Each of the three-dimensional images may correspond to a pair of images generated by pixels of the pixel array. The barrier element array filters light from the pixel array to form the image pairs in a viewing space to be perceived by users as three-dimensional images.
  • For instance, FIG. 25 shows a flowchart 2500 for generating multiple three-dimensional images, according to an exemplary embodiment. Flowchart 2500 is described with respect to FIG. 26, which shows a cross-sectional view of a display system 2600. Display system 2600 is an exemplary embodiment of system 1600 shown in FIG. 16, and is shown for purposes of illustration. As shown in FIG. 26, system 2600 includes a pixel array 2602 and a barrier element array 2604. System 2600 may also include display driver circuit 1602 of FIG. 16, which is not shown in FIG. 26 for ease of illustration. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowchart 2500. Flowchart 2500 is described as follows.
  • Flowchart 2500 begins with step 2502. In step 2502, light is received from an array of pixels that includes a plurality of pairs of sets of pixels. For instance, in the example of FIG. 26, pixel array 2602 includes a first set of pixels 2614 a-2614 d, a second set of pixels 2616 a-2616 d, a third set of pixels 2618 a-2618 d, and a fourth set of pixels 2620 a-2620 d. Each of pixels 2614 a-2614 d, 2616 a-2616 d, 2618 a-2618 d, 2620 a-2620 d generates light, which emanates from the surface of pixel array 2602 towards barrier element array 2604. Each set of pixels generates a corresponding image. First set of pixels 2614 a-2614 d and third set of pixels 2618 a-2618 d are configured to generate images that combine to form a first three-dimensional image. Second set of pixels 2616 a-2616 d and fourth set of pixels 2620 a-2620 d are configured to generate images that combine to form a second three-dimensional image. Pixels of the four sets of pixels are alternated in pixel array 2602 in the order of pixel 2614 a, pixel 2616 a, pixel 2618 a, pixel 2620 a, pixel 2614 b, pixel 2616 b, etc. Further pixels may be included in each set of pixels in pixel array 2602 that are not visible in FIG. 26, including hundreds, thousands, or millions of pixels in each set of pixels.
  • As described above, in the current embodiment, pixel array 2602 is segmented into a plurality of pairs of sets of pixels. For instance, in the example of FIG. 26, pixel array 2602 is segmented into four sets of pixels. The first set of pixels includes pixels 2614 a-2614 g and the other pixels in the same columns, the second set of pixels includes pixels 2616 a-2616 g and the other pixels in the same columns, the third set of pixels includes pixels 2618 a-2618 g and the other pixels in the same columns, and the fourth set of pixels includes pixels 2620 a-2620 g and the other pixels in the same columns.
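  • A brief C sketch of the four-way column interleave described above follows; the buffer layout and toy dimensions are assumptions made only for illustration.

```c
#include <stdint.h>
#include <stddef.h>

enum { ROWS = 4, COLS = 8, NUM_SETS = 4 };  /* two pairs of sets */

/* Interleave four source images (left/right of the first 3D image,
   left/right of the second 3D image) into the repeating column order
   2614a, 2616a, 2618a, 2620a, 2614b, ... described above. */
static void load_four_set_array(uint32_t dst[ROWS][COLS],
                                const uint32_t *src[NUM_SETS])
{
    /* Each src[i] is assumed to hold a ROWS x (COLS / NUM_SETS) image. */
    for (size_t r = 0; r < ROWS; r++) {
        for (size_t c = 0; c < COLS; c++) {
            size_t set = c % NUM_SETS;   /* which of the four pixel sets */
            size_t col = c / NUM_SETS;   /* column index within that set */
            dst[r][c] = src[set][r * (COLS / NUM_SETS) + col];
        }
    }
}
```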
  • In step 2504, a plurality of strips of barrier elements of a barrier element array is selected to be non-blocking to form a plurality of parallel non-blocking slits. As shown in FIG. 26, barrier element array 2604 includes barrier elements that are each either non-blocking or blocking. Barrier elements that are blocking are indicated as barrier elements 2610 a-2610 f, and barrier elements that are non-blocking are indicated as barrier elements 2612 a-2612 e. Further barrier elements may be included in barrier element array 2604 that are not visible in FIG. 26, including hundreds, thousands, or millions of barrier elements, etc. Each of barrier elements 2610 a-2610 f and 2612 a-2612 e may include one or more barrier elements. Barrier elements 2610 alternate with barrier elements 2612. In this manner, blocking barrier elements 2610 are alternated with non-blocking barrier elements 2612 to form a plurality of parallel non-blocking slits in barrier element array 2604.
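  • The following C sketch illustrates forming parallel non-blocking slits by alternating strip states, as in step 2504; the strip-level representation is an assumption, not this disclosure's drive interface.

```c
#include <stddef.h>

typedef enum { STRIP_BLOCKING, STRIP_TRANSPARENT } strip_state_t;

/* Program alternating blocking and non-blocking strips so that the
   non-blocking strips form parallel slits. period is the number of
   strips per repeat; slit_width is how many of those are transparent. */
static void configure_slits(strip_state_t *strips, size_t num_strips,
                            size_t period, size_t slit_width)
{
    for (size_t s = 0; s < num_strips; s++)
        strips[s] = (s % period) < slit_width ? STRIP_TRANSPARENT
                                              : STRIP_BLOCKING;
}
```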
  • In step 2506, the light is filtered at the barrier element array to form a plurality of pairs of images in a viewing space corresponding to the plurality of pairs of sets of pixels, each pair of images of the plurality of pairs of images being configured to be perceived as a corresponding three-dimensional image of a plurality of three-dimensional images. As shown in FIG. 26, light emanating from pixel array 2602 is filtered by barrier element array 2604 to form a plurality of images in a viewing space 2626. For instance, four images are formed in viewing space 2626, including first-fourth images 2606 a-2606 d. Pixels 2614 a-2614 d correspond to first image 2606 a, pixels 2616 a-2616 d correspond to second image 2606 b, pixels 2618 a-2618 d correspond to third image 2606 c, and pixels 2620 a-2620 d correspond to fourth image 2606 d. As shown in FIG. 26, light 2622 a-2622 d from the first set of pixels 2614 a-2614 d forms first image 2606 a, and light 2624 a-2624 d from the third set of pixels 2618 a-2618 d forms third image 2606 c, due to the filtering of the non-blocking slits (corresponding to non-blocking barrier elements 2612 a-2612 e) in barrier element array 2604. Although not indicated in FIG. 26 (for ease of illustration), in a similar fashion, light from the second set of pixels 2616 a-2616 d forms second image 2606 b, and light from the fourth set of pixels 2620 a-2620 d forms fourth image 2606 d.
  • In the embodiment of FIG. 26, any pair of images of images 2606 a-2606 d may be configured to be perceived as a three-dimensional image by a user in viewing space 2626. For instance, first and third images 2606 a and 2606 c may be configured to be perceived by a user as a first three-dimensional image, such that first image 2606 a is received at a first eye location and third image 2606 c is received at a second eye location of a user. Furthermore, second and fourth images 2606 b and 2606 d may be configured to be perceived by a user as a second three-dimensional image, such that second image 2606 b is received at a first eye location and fourth image 2606 d is received at a second eye location of a user.
  • In the example of FIG. 26, two three-dimensional images are provided by system 2600. In further embodiments, further numbers of three-dimensional images may be provided, including a third three-dimensional image, a fourth three-dimensional image, etc. In such case, each three-dimensional image is generated by filtering light (using a barrier element array) corresponding to an image pair generated by a corresponding pair of sets of pixels of the pixel array, in a similar fashion as described with respect to FIG. 26 for two three-dimensional images. For example, to provide three three-dimensional images, pixel array 2602 may include fifth and sixth sets of pixels that generate fifth and sixth images, respectively, to be perceived by a user as a third three-dimensional image. To provide a fourth three-dimensional image, pixel array 2602 may include seventh and eighth sets of pixels that generate seventh and eighth images, respectively, to be perceived by a user as the fourth three-dimensional image.
  • In FIG. 26, the first and second three-dimensional images generated based on first and third images 2606 a and 2606 c and second and fourth images 2606 b and 2606 d, respectively, and any further three-dimensional images that may be generated, may include related image content or may each include unrelated image content. For example, in an embodiment, the first and second three-dimensional images (and any further three-dimensional images) may have been captured as different viewpoints of a common scene. Thus, a user in viewing space 2626 that moves laterally to sequentially view the first and second three-dimensional images (and any further three-dimensional images) may perceive being able to partially or fully “view behind” objects of the common scene.
  • Further description regarding using a parallax barrier to deliver three-dimensional views, including adaptable versions of parallax barriers, is provided in pending U.S. patent application Ser. No. 12/845,409, titled “Display With Adaptable Parallax Barrier,” in pending U.S. patent application Ser. No. 12/845,440, titled “Adaptable Parallax Barrier Supporting Mixed 2D And Stereoscopic 3D Display Regions,” and in pending U.S. patent application Ser. No. 12/845,461, titled “Display Supporting Multiple Simultaneous 3D Views,” which are each incorporated by reference herein in their entireties.
  • 2. Exemplary Embodiments Using Lenticular Lenses
  • In embodiments, as described herein, display devices 1400 and 1500 of FIGS. 14 and 15 may include one or more lenticular lenses as light manipulators 1404 and 1504 used to deliver three-dimensional images and/or two-dimensional images. For instance, display systems 1600 and 1700 of FIGS. 16 and 17 may each include a sub-lens array of a lenticular lens in place of parallax barrier 1620. For example, FIG. 27 shows a perspective view of a lenticular lens 2700 in accordance with an embodiment. As shown in FIG. 27, lenticular lens 2700 includes a sub-lens array 2702. Sub-lens array 2702 includes a plurality of sub-lenses 2704 arranged in a two-dimensional array (e.g., arranged side-by-side in a row). Each sub-lens 2704 is shown in FIG. 27 as generally cylindrical in shape and having a substantially semi-circular cross-section, but in other embodiments may have other shapes. In FIG. 27, sub-lens array 2702 is shown to include eight sub-lenses for illustrative purposes and is not intended to be limiting. For instance, sub-lens array 2702 may include any number (e.g., hundreds, thousands, etc.) of sub-lenses 2704. FIG. 28 shows a side view of lenticular lens 2700, oriented as lenticular lens 2700 may be positioned in system 1900 of FIG. 19 (in place of parallax barrier 1904) for lenticular lens 2700 to deliver three-dimensional views. In FIG. 28, light may be passed through lenticular lens 2700 in the direction of dotted arrow 2802 to be diverted.
  • In one embodiment, lenticular lens 2700 may be fixed in size. For example, light manipulator 1404 of FIG. 14 may include lenticular lens 2700 when fixed in size. In another embodiment, lenticular lens 2700 may be adaptable. For instance, light manipulator 1504 of FIG. 15 may include lenticular lens 2700 when adaptable. For instance, in an embodiment, lenticular lens 2700 may be made from an elastic material. Such a lenticular lens 2700 may be adapted in size (e.g., stretched or compressed) in response to generated drive signals.
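  • As a rough illustration only, the following C sketch maps a desired sub-lens pitch to a normalized drive level, under the assumption (not stated in this disclosure) that an elastic lenticular lens stretches linearly between a rest pitch and a maximum pitch.

```c
/* Entirely hypothetical mapping from a desired sub-lens pitch to a
   normalized drive level for an elastic lenticular lens, assuming
   linear stretch between a rest pitch and a maximum pitch. */
static double lens_drive_level(double desired_pitch_mm,
                               double rest_pitch_mm,
                               double max_pitch_mm)
{
    if (desired_pitch_mm <= rest_pitch_mm)
        return 0.0;                       /* unstretched */
    if (desired_pitch_mm >= max_pitch_mm)
        return 1.0;                       /* fully stretched */
    return (desired_pitch_mm - rest_pitch_mm) /
           (max_pitch_mm - rest_pitch_mm);
}
```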
  • Further description regarding using a lenticular lens to deliver three-dimensional views, including adaptable versions of lenticular lenses, is provided in pending U.S. patent application Ser. No. 12/774,307, titled “Display with Elastic Light Manipulator,” which is incorporated by reference herein in its entirety.
  • 3. Exemplary Embodiments Using Multiple Light Manipulators
  • Display devices 1400 and 1500 may include multiple layers of light manipulators in embodiments. Multiple three-dimensional images may be displayed in a viewing space using multiple light manipulator layers, according to embodiments. In embodiments, the multiple light manipulating layers may enable spatial separation of the images. For instance, in such an embodiment, a display device that includes multiple light manipulator layers may be configured to display a first three-dimensional image in a first region of a viewing space (e.g., a left-side area), a second three-dimensional image in a second region of the viewing space (e.g., a central area), a third three-dimensional image in a third region of the viewing space (e.g., a right-side area), etc. In embodiments, a display device may be configured to display any number of spatially separated three-dimensional images, as desired for a particular application (e.g., according to a number and spacing of viewers in the viewing space, etc.).
  • For instance, FIG. 29 shows a flowchart 2900 for generating multiple three-dimensional images using multiple light manipulator layers, according to an exemplary embodiment. Flowchart 2900 is described with respect to FIG. 30, which shows a cross-sectional view of a display system 3000 that includes multiple light manipulator layers, according to an exemplary embodiment. As shown in FIG. 30, system 3000 includes a display driver circuit 3002, an image generator 1612, a first light manipulator 3014 a, and a second light manipulator 3014 b. As shown in FIG. 30, image generator 1612 includes pixel array 1608, first light manipulator 3014 a includes first light manipulator elements 3016 a, and second light manipulator 3014 b includes second light manipulator elements 3016 b. Furthermore, as shown in FIG. 30, display driver circuit 3002 includes a pixel array driver circuit 3004 and a light manipulator driver circuit 3006. Flowchart 2900 and system 3000 are described as follows.
  • Flowchart 2900 begins with step 2902. In step 2902, light is received from an array of pixels that includes a plurality of pairs of sets of pixels. For example, as shown in FIG. 30, light 1652 is received at first light manipulator 3014 a from pixel array 1608 of image generator 1612. Pixel array driver circuit 3004 may generate driver signals based on content signal 1624 received by display driver circuit 3002, and the driver signals may be received by pixel array 1608 to generate light 1652. Each pixel of pixel array 1608 may generate light that is received at first light manipulator 3014 a. In an embodiment, pixel array driver circuit 3004 may generate drive signal 1614 to cause pixel array 1608 to emit light 1652 containing a plurality of images corresponding to the sets of pixels.
  • In step 2904, the light from the array of pixels is manipulated with a first light manipulator. For example, first light manipulator 3014 a may be configured to manipulate light 1652 received from pixel array 1608. As shown in FIG. 30, first light manipulator 3014 a includes light manipulator elements 3016 a configured to perform manipulating (e.g., filtering, diverting, etc.) of light 1652 to generate manipulated light 1672. Light manipulator elements 3016 a may optionally be configurable to adjust the manipulating performed by first light manipulator 3014 a. First light manipulator 3014 a may perform filtering in a similar manner as a parallax barrier described above, or in another manner. In another embodiment, first light manipulator 3014 a may include a lenticular lens that diverts light 1652 to perform light manipulating, generating manipulated light 1672. In an embodiment, light manipulator driver circuit 3006 may generate drive signal 1616 a based on control signal 1622 received by display driver circuit 3002 to cause light manipulator elements 3016 a to manipulate light 1652 as desired.
  • In step 2906, the light manipulated by the first light manipulator is manipulated with a second light manipulator to form a plurality of pairs of images corresponding to the plurality of pairs of sets of pixels in a viewing space. For example, as shown in FIG. 30, manipulated light 1672 is received by second light manipulator 3014 b to generate manipulated light 3008 that includes a plurality of three-dimensional images 3010 a-3010 n formed in viewing space 1670. As shown in FIG. 30, second light manipulator 3014 b includes light manipulator elements 3016 b configured to perform manipulating of manipulated light 1672 to generate manipulated light 3008. Light manipulator elements 3016 b may optionally be configurable to adjust the manipulating performed by second light manipulator 3014 b. In an embodiment, light manipulator driver circuit 3006 may generate drive signal 1616 b based on control signal 1622 to cause light manipulator elements 3016 b to manipulate manipulated light 1672 to generate manipulated light 3008 including three-dimensional images 3010 a-3010 n as desired. In embodiments, second light manipulator 3014 b may include a parallax barrier or a lenticular lens configured to manipulate manipulated light 1672 to generate manipulated light 3008.
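  • The following C sketch gathers the two cascaded manipulator layers of FIG. 30 into a single configuration derived from one control signal; the types and fields are illustrative assumptions rather than an API defined by this disclosure.

```c
/* Illustrative configuration for the two cascaded manipulators of
   FIG. 30; types and fields are assumptions for this sketch. */
typedef enum { MANIP_PARALLAX_BARRIER, MANIP_LENTICULAR_LENS } manip_kind_t;

typedef struct {
    manip_kind_t kind;      /* barrier or lens */
    double       pitch_mm;  /* slit or sub-lens pitch */
    double       offset_mm; /* lateral phase of the pattern */
} manip_config_t;

typedef struct {
    manip_config_t first;   /* light manipulator 3014a (drive 1616a) */
    manip_config_t second;  /* light manipulator 3014b (drive 1616b) */
} dual_manip_config_t;

/* Both layers derived from one control signal, as in FIG. 30. */
static dual_manip_config_t make_dual_config(double first_pitch_mm,
                                            double second_pitch_mm)
{
    dual_manip_config_t cfg = {
        { MANIP_PARALLAX_BARRIER, first_pitch_mm,  0.0 },
        { MANIP_PARALLAX_BARRIER, second_pitch_mm, 0.0 },
    };
    return cfg;
}
```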
  • As such, display system 3000 has a single viewing plane or surface (e.g., a plane or surface of pixel array 1608, first light manipulator 3014 a, second light manipulator 3014 b) that supports multiple viewers with media content in the form of three-dimensional images or views. The single viewing plane of display system 3000 may provide a first three-dimensional view based on first three-dimensional media content to a first viewer, a second three-dimensional view based on second three-dimensional media content to a second viewer, and optionally further three-dimensional views based on further three-dimensional media content to further viewers. First and second light manipulators 3014 a and 3014 b each cause three-dimensional media content to be presented to a corresponding viewer via a corresponding area of the single viewing plane, with each viewer being enabled to view corresponding media content without viewing media content directed to other viewers. Furthermore, the areas of the single viewing plane that provide the various three-dimensional views of media content overlap each other at least in part. In the embodiment of FIG. 30, the areas may be the same area—an area of a display screen or surface of display system 3000. As such, multiple three-dimensional views that are each viewable by a corresponding viewer may be delivered by a single display viewing plane.
  • Display system 3000 may be configured in various ways to generate multiple three-dimensional images according to flowchart 2900, in embodiments. Furthermore, as described below, embodiments of display system 3000 may be configured to generate two-dimensional views, as well as any combination of one or more two-dimensional views simultaneously with one or more three-dimensional views.
  • For instance, in an embodiment, delivery of three-dimensional images may be performed in system 3000 using multiple parallax barriers. FIG. 31 shows a cross-sectional view of a display system 3100, according to an exemplary embodiment. Display system 3100 is an example of system 3000 shown in FIG. 30. As shown in FIG. 31, system 3100 includes a pixel array 3102, a first barrier element array 3104, and a second barrier element array 3106. System 3100 may also include display driver circuit 3002 of FIG. 30, which is not shown in FIG. 31 for ease of illustration. System 3100 is described as follows.
  • As shown in the example of FIG. 31, pixel array 3102 includes a first set of pixels 3114 a-3114 c, a second set of pixels 3116 a-3116 c, a third set of pixels 3118 a-3118 c, and a fourth set of pixels 3120 a-3120 c. Pixels of the four sets of pixels are alternated in pixel array 3102 in the order of pixel 3114 a, pixel 3116 a, pixel 3118 a, pixel 3120 a, pixel 3114 b, pixel 3116 b, etc. Further pixels may be included in each set of pixels in pixel array 3102 that are not visible in FIG. 31, including hundreds, thousands, or millions of pixels in each set of pixels.
  • Each of pixels 3114 a-3114 c, 3116 a-3116 c, 3118 a-3118 c, and 3120 a-3120 c is configured to generate light, which emanates from the surface of pixel array 3102 towards first barrier element array 3104. Each set of pixels is configured to generate a corresponding image. For example, FIG. 32 shows display system 3100, where pixels of pixel array 3102 emit light. Light from second set of pixels 3116 a-3116 c and first set of pixels 3114 a-3114 c is configured to generate third and fourth images 3206 c and 3206 d, respectively, which may be perceived together as a second three-dimensional image by a second viewer 2404 b. Light from fourth set of pixels 3120 a-3120 c and third set of pixels 3118 a-3118 c is configured to generate first and second images 3206 a and 3206 b, respectively, which may be perceived together as a first three-dimensional image by a first viewer 2404 a. The light emitted by the sets of pixels is filtered by first and second barrier element arrays 3104 and 3106 to generate the first and second three-dimensional images in respective desired regions of a viewing space 3202 adjacent to display system 3100.
  • First-fourth images 3206 a-3206 d may be formed in viewing space 3202 at a distance from pixel array 3102 and at a lateral location of viewing space 3202 as determined by a configuration of display system 3100 of FIG. 31, including a width and spacing of non-blocking slits in first barrier element array 3104, a width and positioning of non-blocking slits in second barrier element array 3106, a spacing between pixel array 3102 and first barrier element array 3104, and a spacing between first and second barrier element arrays 3104 and 3106.
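  • For orientation, the dependence of image position on these widths and spacings can be sketched with standard single-barrier, two-view pinhole geometry (an illustrative simplification, not this disclosure's dual-barrier derivation):

```latex
% Single-barrier, two-view geometry sketch: with per-view pixel pitch p,
% pixel-to-barrier gap g, and eye separation e, a slit acts as a pinhole
% that magnifies pixel offsets by D/g at viewing distance D, so
\[
  e = p\,\frac{D}{g}
  \;\;\Longrightarrow\;\;
  D = \frac{g\,e}{p},
  \qquad
  b = \frac{2\,p\,D}{D+g}
\]
% where b is the slit (barrier) pitch, slightly less than twice the pixel
% pitch so that the alternating views converge at the viewer's eyes.
```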
  • In an embodiment, system 3000 of FIG. 30 may be configured similarly to display system 1700 of FIG. 17 to deliver three-dimensional images and/or two-dimensional images. For instance, in embodiments, system 3000 may include backlighting 1716 and pixel array 1722 separated by one or both of first and second light manipulators 3014 a and 3014 b. For example, FIG. 33 shows a block diagram of a display system 3300, which is an example of display devices 1400 and 1500 shown in FIGS. 14 and 15, according to an embodiment. Display system 3300 is configured to display multiple three-dimensional images in a viewing space in a spatially separated manner. As shown in FIG. 33, system 3300 includes display driver circuit 3002, backlighting 1716, first light manipulator 3014 a, second light manipulator 3014 b, and pixel array 1722. As shown in FIG. 33, backlighting 1716 optionally includes light element array 1736, first light manipulator 3014 a includes first light manipulator elements 3016 a, and second light manipulator 3014 b includes second light manipulator elements 3016 b. Furthermore, as shown in FIG. 33, display driver circuit 3002 receives control signal 1622 and content signal 1624 and includes light source driver circuit 1730, light manipulator driver circuit 3006, and pixel array driver circuit 1728. Light source driver circuit 1730, light manipulator driver circuit 3006, and pixel array driver circuit 1728 may generate drive signals to perform their respective functions based on control signal 1622 and/or content signal 1624. As shown in FIG. 33, first and second light manipulators 3014 a and 3014 b are positioned between backlighting 1716 and pixel array 1722. In another embodiment, pixel array 1722 may instead be located between first and second light manipulators 3014 a and 3014 b.
  • As shown in FIGS. 16 and 17, display driver circuit 1602 receives content signal 1624, and as shown in FIGS. 30 and 33, display driver circuit 3002 receives content signal 1624. Content signal 1624 is an example of content signals 1408 and 1508 of FIGS. 14 and 15. Content signal 1624 includes two-dimensional and/or three-dimensional content for display by the respective display devices/systems. For instance, display driver circuits 1602 and 3002 generate respective drive signals (e.g., pixel array drive signals) based on content signal 1624 to enable the content carried by content signal 1624 to be displayed.
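  • A minimal C sketch of such region-by-region configuration follows; the region descriptor and the two hooks are assumptions introduced only to illustrate how a driver might map the 2D and 3D content regions carried by content signal 1624 onto manipulator settings.

```c
#include <stddef.h>
#include <stdbool.h>
#include <stdio.h>

/* Illustrative per-region descriptor for the 2D/3D content carried by
   content signal 1624; this struct and both hooks are assumptions. */
typedef struct {
    size_t first_col, last_col;  /* screen columns spanned by the region */
    bool   is_3d;                /* 3D regions need barrier slits */
} region_desc_t;

static void region_set_slits(size_t a, size_t b)    /* stub hook */
{
    printf("cols %zu-%zu: slit pattern (3D)\n", a, b);
}

static void region_set_all_pass(size_t a, size_t b) /* stub hook */
{
    printf("cols %zu-%zu: all pass (2D)\n", a, b);
}

/* Configure the light manipulator region-by-region so 2D and 3D content
   can be presented simultaneously on one screen. */
static void apply_regions(const region_desc_t *regions, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (regions[i].is_3d)
            region_set_slits(regions[i].first_col, regions[i].last_col);
        else
            region_set_all_pass(regions[i].first_col, regions[i].last_col);
    }
}
```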
  • D. Example Display Environments
  • As described above, light manipulators may be reconfigured to change the locations of delivered views based on changing viewer positions. As such, a position of a viewer may be determined/tracked so that a parallax barrier and/or light manipulator may be reconfigured to deliver views consistent with the changing position of the viewer. For instance, with regard to a parallax barrier, a spacing, number, arrangement, and/or other characteristic of slits may be adapted according to the changing viewer position. With regard to a lenticular lens, a size of the lenticular lens may be adapted (e.g., stretched, compressed) according to the changing viewer position. In embodiments, a position of a viewer may be determined/tracked by determining a position of the viewer directly, or by determining a position of a device associated with the viewer (e.g., a device worn by the viewer, held by the viewer, sitting in the viewer's lap, in the viewer's pocket, sitting next to the viewer, etc.).
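  • Under the same standard pinhole geometry sketched earlier (an assumption, not this disclosure's method), the slit-pattern re-phasing for a tracked viewer reduces to a one-line computation.

```c
/* Re-phase the slit pattern as a tracked viewer moves laterally: a
   viewer offset x at distance D from the barrier, with pixel-to-barrier
   gap g, needs the slits shifted by x * g / (D + g). */
static double slit_phase_shift_mm(double viewer_x_mm,
                                  double viewer_dist_mm,
                                  double gap_mm)
{
    return viewer_x_mm * gap_mm / (viewer_dist_mm + gap_mm);
}
```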
  • Examples of display environments for display embodiments described herein include environments having a single viewer, as well as environments having multiple viewers. For example, in one type of environment (e.g., an office, living room, etc.), a single viewer interacts with a mobile or stationary electronic device, such as a mobile or desktop computer, smart phone, or television, to view and/or interact with mixed 2D and 3D content. It is noted that this type of environment may include more than one viewer. In another type of environment (e.g., a living room, a home theatre room, etc.), multiple viewers are enabled to interact with an electronic device, such as a television set (e.g., high-def, small screen, large screen, etc.), to view and/or interact with mixed 2D and 3D content in the form of television content, movies, video games, etc.
  • For instance, FIG. 34 shows a block diagram of a display environment 3400, according to an exemplary embodiment. In the example of FIG. 34, first and second viewers 3406 a and 3406 b are present in display environment 3400, and are enabled to interact with a display device 3402 to be delivered two-dimensional and/or three-dimensional media content. Although two viewers 3406 are shown present in FIG. 34, in other embodiments, other numbers of viewers 3406 may be present in display environment 3400 to interact with display device 3402 and be delivered media content by display device 3402. As shown in FIG. 34, display environment 3400 includes display device 3402, a first remote control 3404 a, a second remote control 3404 b, a first headset 3412 a, a second headset 3412 b, and viewers 3406 a and 3406 b. Display device 3402 is an example of the display devices described above, and may be configured similarly to any display device described herein, including display device 606. Viewer 3406 a is delivered a view 3408 a by display device 3402, and viewer 3406 b is delivered a view 3408 b by display device 3402. Views 3408 a and 3408 b may each be a two-dimensional view or a three-dimensional view. Furthermore, in embodiments, view 3408 a may be delivered to viewer 3406 a, but not be visible to viewer 3406 b, and view 3408 b may be delivered to viewer 3406 b, but not be visible to viewer 3406 a.
  • Remote control 3404 a is a device that viewer 3406 a may use to interact with display device 3402, and remote control 3404 b is a device that viewer 3406 b may use to interact with display device 3402. For example, as shown in FIG. 34, viewer 3406 a may interact with a user interface of remote control 3404 a to generate a display control signal 3414 a, and viewer 3406 b may interact with a user interface of remote control 3404 b to generate a display control signal 3414 b. Display control signals 3414 a and 3414 b may be transmitted to display device 3402 using wireless or wired communication links. Display control signals 3414 a and 3414 b may be configured to select particular content desired to be viewed by viewers 3406 a and 3406 b, respectively. For example, display control signals 3414 a and 3414 b may select particular media content to be viewed (e.g., television channels, video games, DVD (digital video discs) content, video tape content, web content, etc.). Display control signals 3414 a and 3414 b may select whether such media content is desired to be viewed in two-dimensional or three-dimensional form by viewers 3406 a and 3406 b, respectively. Remote controls 3404 a and 3404 b may be television remote control devices, game controllers, smart phones, or other remote control type device.
  • Headsets 3412 a and 3412 b are worn by viewers 3406 a and 3406 b, respectively. Headsets 3412 a and 3412 b each include one or two speakers (e.g., earphones) that enable viewers 3406 a and 3406 b to hear audio associated with the media content of views 3408 a and 3408 b. Headsets 3412 a and 3412 b enable viewers 3406 a and 3406 b to hear audio of their respective media content without hearing audio associated with the media content of the other of viewers 3406 a and 3406 b. Headsets 3412 a and 3412 b may each optionally include a microphone to enable viewers 3406 a and 3406 b to interact with display device 3402 using voice commands.
  • Display device 3402, headset 3412 a, and/or remote control 3404 a may operate to provide position information 3410 a regarding viewer 3406 a to display device 3402, and display device 3402, headset 3412 b, and/or remote control 3404 b may operate to provide position information 3410 b regarding viewer 3406 b to display device 3402. Display device 3402 may use position information 3410 a and 3410 b to reconfigure one or more light manipulators (e.g., parallax barriers and/or lenticular lenses) of display device 3402 to enable views 3408 a and 3408 b to be delivered to viewers 3406 a and 3406 b, respectively, at various locations. For example, display device 3402, headset 3412 a, and/or remote control 3404 a may use positioning techniques to track the position of viewer 3406 a, and display device 3402, headset 3412 b, and/or remote control 3404 b may use positioning techniques to track the position of viewer 3406 b.
  • E. Example Electronic Device Implementations
  • Embodiments may be implemented in hardware, software, firmware, or any combination thereof. For example, browser 106, mixed 2D/3D supporting logic 108, API 302, operating system 304, display driver 306, browser 400, user interface 402, rendering engine 404, client application(s) 406, networking module 408, code interpreter 410, web browser 490, OS 432, browser/rendering engine 442, 2D/3Dx UI display 444, networking module 446, UI backend 448, client(s) 450, parser 452, render tree preparation module 454, rendered tree display 456, 2D/3Dx support 458, streaming server application 466, user input interfaces 420, 2D, 3Dx & mixed display driver interface 422, shell operations 424, 2D, 3Dx, mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426, API supporting regional 2D/3Dx 428, 2D only driver variant 434, 3Dx only driver variant 436, mixed 2D and 3Ds driver variant 438, translation services 430 a-430 c, display driver 604, first translator 1102, and/or second translator 1104 may be implemented as computer program code configured to be executed in one or more processors, and/or as circuit logic.
  • For instance, FIG. 35 shows a block diagram of an example implementation of an electronic device 3500, according to an embodiment. In embodiments, electronic device 3500 may include one or more of the elements shown in FIG. 35. As shown in the example of FIG. 35, electronic device 3500 may include one or more processors (also called central processing units, or CPUs), such as a processor 3504. Processor 3504 is connected to a communication infrastructure 3502, such as a communication bus. In some embodiments, processor 3504 can simultaneously operate multiple computing threads.
  • Electronic device 3500 also includes a primary or main memory 3506, such as random access memory (RAM). Main memory 3506 has stored therein control logic 3528A (computer software), and data.
  • Electronic device 3500 also includes one or more secondary storage devices 3510. Secondary storage devices 3510 include, for example, a hard disk drive 3512 and/or a removable storage device or drive 3514, as well as other types of storage devices, such as memory cards and memory sticks. For instance, electronic device 3500 may include an industry standard interface, such as a universal serial bus (USB) interface, for interfacing with devices such as a memory stick. Removable storage drive 3514 represents a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup, etc.
  • As shown in FIG. 35, secondary storage devices 3510 may include an operating system 3532 and a browser 3534. Embodiments for operating system 3532 (e.g., OS 304, OS 432, etc.) and for browser 3534 (e.g., browser 106, browser 400, browser 490, etc.) are described in detail above.
  • Removable storage drive 3514 interacts with a removable storage unit 3516. Removable storage unit 3516 includes a computer useable or readable storage medium 3524 having stored therein computer software 3528B (control logic) and/or data. Removable storage unit 3516 represents a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, or any other computer data storage device. Removable storage drive 3514 reads from and/or writes to removable storage unit 3516 in a well known manner.
  • Electronic device 3500 further includes a communication or network interface 3518. Communication interface 3518 enables the electronic device 3500 to communicate with remote devices. For example, communication interface 3518 allows electronic device 3500 to communicate over communication networks or mediums 3542 (representing a form of a computer useable or readable medium), such as LANs, WANs, the Internet, etc. Network interface 3518 may interface with remote sites or networks via wired or wireless connections.
  • Control logic 3528C may be transmitted to and from electronic device 3500 via the communication medium 3542.
  • Any apparatus or manufacture comprising a computer useable or readable medium having control logic (software) stored therein is referred to herein as a computer program product or program storage device. This includes, but is not limited to, electronic device 3500, main memory 3506, secondary storage devices 3510, and removable storage unit 3516. Such computer program products, having control logic stored therein that, when executed by one or more data processing devices, cause such data processing devices to operate as described herein, represent embodiments of the invention.
  • Devices in which embodiments may be implemented may include storage, such as storage drives, memory devices, and further types of computer-readable media. Examples of such computer-readable storage media include a hard disk, a removable magnetic disk, a removable optical disk, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROM), and the like. As used herein, the terms “computer program medium” and “computer-readable medium” are used to generally refer to the hard disk associated with a hard disk drive, a removable magnetic disk, a removable optical disk (e.g., CDROMs, DVDs, etc.), zip disks, tapes, magnetic storage devices, MEMS (micro-electromechanical systems) storage, nanotechnology-based storage devices, as well as other media such as flash memory cards, digital video discs, RAM devices, ROM devices, and the like. Such computer-readable storage media may store program modules that include computer program logic for browser 106, mixed 2D/3D supporting logic 108, API 302, operating system 304, display driver 306, browser 400, user interface 402, rendering engine 404, client application(s) 406, networking module 408, code interpreter 410, web browser 490, OS 432, browser/rendering engine 442, 2D/3Dx UI display 444, networking module 446, UI backend 448, client(s) 450, parser 452, render tree preparation module 454, rendered tree display 456, 2D/3Dx support 458, streaming server application 466, user input interfaces 420, 2D, 3Dx & mixed display driver interface 422, shell operations 424, 2D, 3Dx, mixed 2D and 3Dx, & mixed 3Dx and 3Dy translation services 426, API supporting regional 2D/3Dx 428, 2D only driver variant 434, 3Dx only driver variant 436, mixed 2D and 3Ds driver variant 438, translation services 430 a-430 c, display driver 604, first translator 1102, second translator 1104, flowchart 500, flowchart 700, flowchart 1200 (including any one or more steps of flowcharts 500, 700, and 1200), and/or further embodiments of the present invention described herein. Embodiments of the invention are directed to computer program products comprising such logic (e.g., in the form of program code or software) stored on any computer useable medium (e.g., a computer readable storage medium). Such program code, when executed in one or more processors, causes a device to operate as described herein.
  • The invention can work with software, hardware, and/or browser implementations other than those described herein. Any software, hardware, and browser implementations suitable for performing the functions described herein can be used.
  • As described herein, electronic device 3500 may be implemented in association with a variety of types of display devices. For instance, electronic device 3500 may be one of a variety of types of media devices, such as a stand-alone display (e.g., a television display such as flat panel display, etc.), a computer, a game console, a set top box, a digital video recorder (DVR), other electronic device mentioned elsewhere herein, etc. Media content that is delivered in two-dimensional or three-dimensional form according to embodiments described herein may be stored locally or received from remote locations. For instance, such media content may be locally stored for playback (replay TV, DVR), may be stored in removable memory (e.g. DVDs, memory sticks, etc.), may be received on wireless and/or wired pathways through a network such as a home network, through Internet download streaming, through a cable network, a satellite network, and/or a fiber network, etc. For instance, FIG. 35 shows a first media content 3530A that is stored in hard disk drive 3512, a second media content 3530B that is stored in storage medium 3524 of removable storage unit 3516, and a third media content 3530C that may be remotely stored and received over communication medium 3542 by communication interface 3518. Media content 3530 may be stored and/or received in these manners and/or in other ways.
  • FIG. 36 shows a block diagram of a display system 3600 that supports mixed 2D, stereoscopic 3D and multi-view 3D displays according to an exemplary embodiment. Display system 3600 is another electronic device embodiment. As shown in FIG. 36, display system 3600 includes media input interfaces 3602, host processing circuitry 3604, user input devices 3606, display processing circuitry 3608, adaptable display driver circuitry 3610, adaptable 2D, 3Dx and mixed display 3612, and first-third interface circuitry 3614-3618. Host processing circuitry 3604 includes mixed 2D and 3Dx browser 490 (of FIG. 4B), operating system 432 (of FIG. 4B), and application programs 3622. Display processing circuitry 3608 includes 2D, 3Dx, mixed 2D and 3Dx, and mixed 3Dx and 3Dy translation services 3640.
  • Media input interfaces 3602 includes one or more media input interfaces, wired or wireless, for receiving media, such as those described elsewhere herein. For instance, media input interface 3602 may include an interface for receiving media content from a local media player device, such as a DVD player, a memory stick, a computer media player, etc., and may include commercially available (e.g., USB, HDMI, etc.) or proprietary interfaces for receiving local media content. Media input interface 3602 may include an interface for receiving media content from a remote source (e.g., the Internet, satellite, cable, etc.), and may include commercially available (e.g., WLAN, Data Over Cable Service Interface Specification (DOCSIS), etc.) or proprietary interfaces for receiving remote media content.
  • Host processing circuitry 3604 may include one or more integrated circuit chips and/or additional circuitry, which may be configured to execute software/firmware, including operating system 432, browser 490, and application programs 3622.
  • User input devices 3606 includes one or more user input devices that a user may use to interact with display system 3600. Examples of user input devices are described elsewhere herein, such as a keyboard, a mouse/pointer, etc.
  • Display processing circuitry 3608 may be included in host processing circuitry 3604, or may be separate from host processing circuitry 3604 as shown in FIG. 36. For instance, display processing circuitry 3608 may include one or more processors (e.g., graphics processors), further circuitry and/or other hardware, software, firmware, or any combination thereof. Display processing circuitry 3608 may be present to perform graphics processing tasks. For instance, as shown in FIG. 36, display processing circuitry 3608 may optionally include 2D, 3Dx, mixed 2D and 3Dx, and mixed 3Dx and 3Dy translation services 3640 to perform 2D/3D related translation services in addition or alternatively to translation services of OS 432 and/or browser 490.
  • Adaptable display driver circuitry 3610 includes one or more display driver circuits for an adaptable display. Examples of adaptable display driver circuitry 3610 are described above, such as with regard to FIGS. 4B, 16, 17, 30, and 33.
  • Adaptable 2D, 3Dx and mixed display 3612 includes a display that is adaptable, and is capable of displaying 2D content, 3D content, and a mixture of 2D and/or 3D content. Examples of adaptable 2D, 3Dx and mixed display 3612 are described elsewhere herein.
  • First-third interface circuitry 3614-3618 is optional. For instance, as shown in FIG. 36, a communication infrastructure (e.g., a signal bus) 3634 may be present to couple signals of media input interfaces 3602, host processing circuitry 3604, user input devices 3606, display processing circuitry 3608, adaptable display driver circuitry 3610, and display 3612. In an embodiment, if display processing circuitry 3608, adaptable display driver circuitry 3610, and/or display 3612 are contained in a common housing/structure with host processing circuitry 3604 (e.g., in a handheld device), interface circuitry 3614-3618 may not need to be present. If display processing circuitry 3608, adaptable display driver circuitry 3610, and/or display 3612 are in a separate housing/structure from host processing circuitry 3604, corresponding interface circuitry 3614-3618 may be present to provide an interface. For instance, host processing circuitry 3604 may be in a game console, a desktop computer tower, a home audio receiver, a set top box, etc., and display processing circuitry 3608, adaptable display driver circuitry 3610, and/or display 3612 may be included in a display device structure. In such case, interface circuitry 3614-3618 may be present. When present, first-third interface circuitry 3614-3618 may each include circuitry, such as receivers and/or transmitters (wired or wireless), for enabling communications between the respective one of display processing circuitry 3608, adaptable display driver circuitry 3610, and display 3612, and the other components of system 3600 (e.g., host processing circuitry 3604, etc.).
  • Note that the embodiment of display system 3600 shown in FIG. 36 is provided for purposes of illustration, and is not intended to be limiting. In further embodiments, display system 3600 may include fewer, additional, and/or alternative features than shown in FIG. 36.
  • IV. Conclusion
  • While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims (20)

1. A method used by a browser architecture to support display on a screen of two-dimensional content and three-dimensional content, the browser architecture receiving web page content, the method comprising:
parsing the web page content;
identifying two-dimensional content to be displayed in a first region of the screen;
communicating a first configuration request to at least attempt to cause a first configuration of the first region of the screen to support the two-dimensional content;
identifying three-dimensional content to be displayed in a second region of the screen; and
communicating a second configuration request to at least attempt to cause a second configuration of the second region of the screen to support the three-dimensional content, the first configuration being different from the second configuration.
2. The method of claim 1, further comprising identifying at least part of the second configuration via interaction with a file containing the three-dimensional content.
3. The method of claim 1, further comprising identifying at least part of the second configuration via interaction with a server.
4. The method of claim 1, further comprising identifying at least part of the second configuration from the web page content.
5. The method of claim 4, wherein the web page content has a language format, and the identification of the at least part of the second configuration involves extracting configuration information via the language format.
6. A browser architecture that processes textually formatted content and supports display on a screen of two-dimensional content and three-dimensional content, the browser architecture comprising:
a first browser portion that identifies a two-dimensional element and a three-dimensional element by parsing the textually formatted content;
a second browser portion that identifies a screen configuration to be applied to a first region of the screen, the second browser portion delivering a configuration request based on the identification of the screen configuration to at least attempt to cause a reconfiguration of the screen within the first region; and
a third browser portion that manages retrieval and display of the three-dimensional element within the second region of the screen.
7. The browser architecture of claim 6, wherein the first browser portion comprises an engine, and the third browser portion comprises a three dimensional presentation component.
8. The browser architecture of claim 7, wherein the three dimensional presentation component comprises a media player.
9. The browser architecture of claim 8, wherein the media player is an application that assists the engine.
10. The browser architecture of claim 6, wherein the first browser portion and the second browser portion both comprise an engine.
11. A method used by a browser architecture, the method comprising:
identifying first tag information associated with two-dimensional content, the two-dimensional content intended for both a left eye and a right eye of a viewer;
identifying second tag information associated with three-dimensional content, the three-dimensional content having a first portion and a second portion, the first portion intended for the left eye of the viewer and the second portion intended for the right eye of the viewer, the first portion being a first camera view and the second portion being a second camera view;
causing the presentation of the two-dimensional content in a first region of a screen; and
causing the presentation of the three-dimensional content in a second region of the screen.
12. The method of claim 11, wherein the causing the presentation of the three-dimensional content of the screen comprises delivering a configuration request to attempt to cause a reconfiguration of a region of the screen to support the three-dimensional content.
13. The method of claim 11, wherein the second tag information comprises an external link to three-dimensional content.
14. The method of claim 11, wherein the first tag information comprises textual information.
15. The method of claim 11, further comprising causing display of three-dimensional browser control elements.
16. The method of claim 11, wherein said causing the presentation of the three-dimensional content in a second region of the screen comprises:
responding to the second tag information by causing a control signal to be generated and received by a display driver, the control signal enabling the three-dimensional content to be displayed in the second region of the screen.
17. The method of claim 11, wherein said causing the presentation of the three-dimensional content in a second region of the screen comprises:
loading an information resource that includes the three-dimensional content, the information resource including at least one of a three-dimensional web page, a three-dimensional image, or a three-dimensional video.
18. The method of claim 11, wherein said causing the presentation of the three-dimensional content in a second region of the screen comprises:
executing a compiled script to generate the three-dimensional content to be displayed in the second region.
19. The method of claim 11, wherein the first region is a first tab of a browser window displayed in the screen, and the second region is a second tab of the browser window, the method further comprising:
displaying the two-dimensional content in the first tab of the browser window; and
displaying the three-dimensional content in the second tab of the browser window.
20. The method of claim 11, wherein the first region is a first frame of a browser window displayed in the screen, and the second region is a second frame of the browser window, the method further comprising:
displaying the two-dimensional content in the first frame of the browser window; and
displaying the three-dimensional content in the second frame of the browser window.
US12/982,140 2009-12-31 2010-12-30 Internet browser and associated content definition supporting mixed two and three dimensional displays Abandoned US20110161843A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/982,140 US20110161843A1 (en) 2009-12-31 2010-12-30 Internet browser and associated content definition supporting mixed two and three dimensional displays

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US29181809P 2009-12-31 2009-12-31
US30311910P 2010-02-10 2010-02-10
US12/982,140 US20110161843A1 (en) 2009-12-31 2010-12-30 Internet browser and associated content definition supporting mixed two and three dimensional displays

Publications (1)

Publication Number Publication Date
US20110161843A1 true US20110161843A1 (en) 2011-06-30

Family

ID=43797724

Family Applications (27)

Application Number Title Priority Date Filing Date
US12/774,307 Active 2032-01-14 US8964013B2 (en) 2009-12-31 2010-05-05 Display with elastic light manipulator
US12/774,225 Abandoned US20110157322A1 (en) 2009-12-31 2010-05-05 Controlling a pixel array to support an adaptable light manipulator
US12/845,461 Active 2031-10-30 US8767050B2 (en) 2009-12-31 2010-07-28 Display supporting multiple simultaneous 3D views
US12/845,440 Abandoned US20110157697A1 (en) 2009-12-31 2010-07-28 Adaptable parallax barrier supporting mixed 2d and stereoscopic 3d display regions
US12/845,409 Abandoned US20110157696A1 (en) 2009-12-31 2010-07-28 Display with adaptable parallax barrier
US12/982,140 Abandoned US20110161843A1 (en) 2009-12-31 2010-12-30 Internet browser and associated content definition supporting mixed two and three dimensional displays
US12/982,031 Active 2032-12-14 US9019263B2 (en) 2009-12-31 2010-12-30 Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays
US12/982,309 Active 2033-05-02 US9204138B2 (en) 2009-12-31 2010-12-30 User controlled regional display of mixed two and three dimensional content
US12/982,053 Abandoned US20110157309A1 (en) 2009-12-31 2010-12-30 Hierarchical video compression supporting selective delivery of two-dimensional and three-dimensional video content
US12/982,248 Abandoned US20110157315A1 (en) 2009-12-31 2010-12-30 Interpolation of three-dimensional video content
US12/982,377 Abandoned US20110157327A1 (en) 2009-12-31 2010-12-30 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking
US12/982,199 Active 2032-09-27 US8988506B2 (en) 2009-12-31 2010-12-30 Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video
US12/982,273 Active 2032-08-13 US9979954B2 (en) 2009-12-31 2010-12-30 Eyewear with time shared viewing supporting delivery of differing content to multiple viewers
US12/982,047 Abandoned US20110157330A1 (en) 2009-12-31 2010-12-30 2d/3d projection system
US12/982,088 Active 2032-01-06 US9066092B2 (en) 2009-12-31 2010-12-30 Communication infrastructure including simultaneous video pathways for multi-viewer support
US12/982,362 Active 2031-02-05 US9049440B2 (en) 2009-12-31 2010-12-30 Independent viewer tailoring of same media source content via a common 2D-3D display
US12/982,156 Active 2035-11-09 US9654767B2 (en) 2009-12-31 2010-12-30 Programming architecture supporting mixed two and three dimensional displays
US12/982,330 Abandoned US20110157326A1 (en) 2009-12-31 2010-12-30 Multi-path and multi-source 3d content storage, retrieval, and delivery
US12/982,124 Active 2033-02-08 US9124885B2 (en) 2009-12-31 2010-12-30 Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays
US12/982,020 Abandoned US20110157257A1 (en) 2009-12-31 2010-12-30 Backlighting array supporting adaptable parallax barrier
US12/982,173 Active 2033-08-22 US9143770B2 (en) 2009-12-31 2010-12-30 Application programming interface supporting mixed two and three dimensional displays
US12/982,069 Active 2033-05-07 US8922545B2 (en) 2009-12-31 2010-12-30 Three-dimensional display system with adaptation based on viewing reference of viewer(s)
US12/982,212 Active 2032-04-05 US9013546B2 (en) 2009-12-31 2010-12-30 Adaptable media stream servicing two and three dimensional content
US12/982,062 Active 2032-06-13 US8687042B2 (en) 2009-12-31 2010-12-30 Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints
US14/504,095 Abandoned US20150015668A1 (en) 2009-12-31 2014-10-01 Three-dimensional display system with adaptation based on viewing reference of viewer(s)
US14/616,130 Abandoned US20150156473A1 (en) 2009-12-31 2015-02-06 Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video
US14/723,922 Abandoned US20150264341A1 (en) 2009-12-31 2015-05-28 Communication infrastructure including simultaneous video pathways for multi-viewer support


Country Status (5)

Country Link
US (27) US8964013B2 (en)
EP (4) EP2357630A1 (en)
CN (3) CN102183840A (en)
HK (1) HK1161754A1 (en)
TW (3) TWI467234B (en)

Cited By (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110080472A1 (en) * 2009-10-02 2011-04-07 Eric Gagneraud Autostereoscopic status display
US20110164115A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video
US20110164188A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US20110191328A1 (en) * 2010-02-03 2011-08-04 Vernon Todd H System and method for extracting representative media content from an online document
US20110234769A1 (en) * 2010-03-23 2011-09-29 Electronics And Telecommunications Research Institute Apparatus and method for displaying images in image system
CN102368244A (en) * 2011-09-08 2012-03-07 广州市动景计算机科技有限公司 Page content alignment method, device and mobile terminal browser
US20120069243A1 (en) * 2010-09-20 2012-03-22 Echostar Global B.V. Separate Display Surfaces for EPG and Program Content
US20120069001A1 (en) * 2010-09-17 2012-03-22 Fujifilm Corporation Electronic album generating apparatus, stereoscopic image pasting apparatus, and methods and programs for controlling operation of same
US20120140025A1 (en) * 2010-12-07 2012-06-07 At&T Intellectual Property I, L.P. Dynamic Modification of Video Content at a Set-Top Box Device
US20120200592A1 (en) * 2011-02-04 2012-08-09 Seiko Epson Corporation Control device for controlling image display device, head-mounted display device, image display system, control method for the image display device, and control method for the head-mounted display device
US20120229450A1 (en) * 2011-03-09 2012-09-13 Lg Electronics Inc. Mobile terminal and 3d object control method thereof
US20120249734A1 (en) * 2011-03-28 2012-10-04 Shunsuke Takayama Electronic apparatus and display control method
US20120268456A1 (en) * 2011-04-19 2012-10-25 Hidetoshi Yokoi Information processor, information processing method, and computer program product
US20130033583A1 (en) * 2011-06-28 2013-02-07 Lg Electronics Inc. Image display device and controlling method thereof
US8416217B1 (en) * 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
EP2581821A1 (en) * 2011-10-10 2013-04-17 LG Electronics Inc. Mobile terminal and controlling method thereof
US20130127841A1 (en) * 2011-11-18 2013-05-23 Samsung Electronics Co., Ltd. Three-dimensional (3d) image display method and apparatus for 3d imaging and displaying contents according to start or end of operation
US20130265297A1 (en) * 2012-04-06 2013-10-10 Motorola Mobility, Inc. Display of a Corrected Browser Projection of a Visual Guide for Placing a Three Dimensional Object in a Browser
US20130326425A1 (en) * 2012-06-05 2013-12-05 Apple Inc. Mapping application with 3d presentation
US20140036044A1 (en) * 2012-08-03 2014-02-06 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
US20140152648A1 (en) * 2012-11-30 2014-06-05 Legend3D, Inc. Three-dimensional annotation system and method
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US20140223462A1 (en) * 2012-12-04 2014-08-07 Christopher Allen Aimone System and method for enhancing content using brain-state data
US20140237536A1 (en) * 2011-10-13 2014-08-21 Samsung Electronics Co., Ltd. Method of displaying contents, method of synchronizing contents, and method and device for displaying broadcast contents
US8854531B2 (en) 2009-12-31 2014-10-07 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display
US20140316907A1 (en) * 2013-04-17 2014-10-23 Asaf NAIM Multilayered user interface for internet browser
US20140317537A1 (en) * 2011-12-22 2014-10-23 Tencent Technology (Shenzhen) Company Limited Browser based application program extension method and device
US20140354633A1 (en) * 2012-02-24 2014-12-04 Huawei Technologies Co., Ltd. Image processing method and image processing device
US20150035821A1 (en) * 2013-08-01 2015-02-05 Equldp Limited Stereoscopic online web content creation and rendering
US20150082180A1 (en) * 2013-09-17 2015-03-19 Amazon Technologies, Inc. Approaches for three-dimensional object display used in content navigation
US20150153940A1 (en) * 2011-04-14 2015-06-04 Mediatek Inc. Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
US20150156472A1 (en) * 2012-07-06 2015-06-04 Lg Electronics Inc. Terminal for increasing visual comfort sensation of 3d object and control method thereof
US20150170397A1 (en) * 2012-06-08 2015-06-18 Lg Electronics Inc. Rendering method of 3d web-page and terminal using the same
US20150205797A1 (en) * 2014-01-22 2015-07-23 Al Squared Identifying a set of related visible content elements in a markup language document
US9247286B2 (en) 2009-12-31 2016-01-26 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
WO2016021861A1 (en) * 2014-08-02 2016-02-11 Samsung Electronics Co., Ltd. Electronic device and user interaction method thereof
US20160041630A1 (en) * 2012-06-25 2016-02-11 Zspace, Inc. Operations in a Three Dimensional Display System
US9348495B2 (en) 2014-03-07 2016-05-24 Sony Corporation Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone
US9626798B2 (en) 2011-12-05 2017-04-18 At&T Intellectual Property I, L.P. System and method to digitally replace objects in images or video
US20170223409A1 (en) * 2014-09-30 2017-08-03 Orange Method and device for adapting the display of a video stream by a client
US20170285762A1 (en) * 2014-12-15 2017-10-05 Bayerische Motoren Werke Aktiengesellschaft Method for Controlling a Vehicle System
US9880019B2 (en) 2012-06-05 2018-01-30 Apple Inc. Generation of intersection information by a mapping service
US9886794B2 (en) 2012-06-05 2018-02-06 Apple Inc. Problem reporting in maps
US9903732B2 (en) 2012-06-05 2018-02-27 Apple Inc. Providing navigation instructions while device is in locked mode
US20180143757A1 (en) * 2016-11-18 2018-05-24 Zspace, Inc. 3D User Interface
US9997069B2 (en) 2012-06-05 2018-06-12 Apple Inc. Context-aware voice guidance
US10006505B2 (en) 2012-06-05 2018-06-26 Apple Inc. Rendering road signs during navigation
US10018478B2 (en) 2012-06-05 2018-07-10 Apple Inc. Voice instructions during navigation
US10067634B2 (en) 2013-09-17 2018-09-04 Amazon Technologies, Inc. Approaches for three-dimensional object display
US10127715B2 (en) * 2016-11-18 2018-11-13 Zspace, Inc. 3D user interface—non-native stereoscopic image conversion
US10176633B2 (en) 2012-06-05 2019-01-08 Apple Inc. Integrated mapping and navigation application
US20190026004A1 (en) * 2017-07-18 2019-01-24 Chicago Labs, LLC Three Dimensional Icons for Computer Applications
US10271043B2 (en) 2016-11-18 2019-04-23 Zspace, Inc. 3D user interface—360-degree visualization of 2D webpage content
CN109725819A (en) * 2018-12-25 2019-05-07 努比亚技术有限公司 Interface display method, device, double screen dual system terminal and readable storage medium storing program for executing
US10298974B2 (en) * 2014-08-05 2019-05-21 Uc Mobile Co., Ltd. Method and device for presenting content data from network
US10318104B2 (en) 2012-06-05 2019-06-11 Apple Inc. Navigation application with adaptive instruction text
US10366523B2 (en) 2012-06-05 2019-07-30 Apple Inc. Method, system and apparatus for providing visual feedback of a map view change
US10523922B2 (en) * 2018-04-06 2019-12-31 Zspace, Inc. Identifying replacement 3D images for 2D images via ranking criteria
US10523921B2 (en) * 2018-04-06 2019-12-31 Zspace, Inc. Replacing 2D images with 3D images
US10579238B2 (en) 2016-05-13 2020-03-03 Sap Se Flexible screen layout across multiple platforms
US10649611B2 (en) 2016-05-13 2020-05-12 Sap Se Object pages in multi application user interface
US10650416B1 (en) * 2017-02-17 2020-05-12 Sprint Communications Company L.P. Live production interface and response testing
US10691880B2 (en) * 2016-03-29 2020-06-23 Microsoft Technology Licensing, Llc Ink in an electronic document
US10802324B2 (en) 2017-03-14 2020-10-13 Boe Technology Group Co., Ltd. Double vision display method and device
US10942983B2 (en) 2015-10-16 2021-03-09 F4 Interactive web device with customizable display
US11119811B2 (en) * 2015-07-15 2021-09-14 F4 Interactive device for displaying web page data in three dimensions
US11321103B2 (en) * 2017-06-16 2022-05-03 Microsoft Technology Licensing, Llc Generating user interface containers

Families Citing this family (448)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9015736B2 (en) * 2005-12-29 2015-04-21 Rovi Guides, Inc. Systems and methods for episode tracking in an interactive media environment
EP2023812B1 (en) 2006-05-19 2016-01-27 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
FR2906899B1 (en) * 2006-10-05 2009-01-16 Essilor Int DISPLAY DEVICE FOR STEREOSCOPIC VISUALIZATION.
JP2008106185A (en) * 2006-10-27 2008-05-08 Shin Etsu Chem Co Ltd Method for adhering thermally conductive silicone composition, primer for adhesion of thermally conductive silicone composition and method for production of adhesion composite of thermally conductive silicone composition
US8570423B2 (en) * 2009-01-28 2013-10-29 Hewlett-Packard Development Company, L.P. Systems for performing visual collaboration between remotely situated participants
EP2256620A1 (en) * 2009-05-29 2010-12-01 Koninklijke Philips Electronics N.V. Picture selection method for modular lighting system
US8125418B2 (en) * 2009-06-26 2012-02-28 Global Oled Technology Llc Passive-matrix chiplet drivers for displays
WO2011021894A2 (en) * 2009-08-20 2011-02-24 Lg Electronics Inc. Image display apparatus and method for operating the same
JP5187639B2 (en) * 2009-08-28 2013-04-24 独立行政法人情報通信研究機構 3D display
US9420250B2 (en) * 2009-10-07 2016-08-16 Robert Laganiere Video analytics method and system
CN102474632A (en) * 2009-12-08 2012-05-23 美国博通公司 Method and system for handling multiple 3-d video formats
US20110143769A1 (en) * 2009-12-16 2011-06-16 Microsoft Corporation Dual display mobile communication device
WO2011075825A1 (en) 2009-12-21 2011-06-30 Kik Interactive, Inc. Systems and methods for accessing and controlling media stored remotely
US8684531B2 (en) * 2009-12-28 2014-04-01 Vision3D Technologies, Llc Stereoscopic display device projecting parallax image and adjusting amount of parallax
US20110187839A1 (en) * 2010-02-01 2011-08-04 VIZIO Inc. Frame based three-dimensional encoding method
US20110202845A1 (en) * 2010-02-17 2011-08-18 Anthony Jon Mountjoy System and method for generating and distributing three dimensional interactive content
US20110205336A1 (en) * 2010-02-23 2011-08-25 Panasonic Corporation Three-dimensional image reproducing apparatus
DE102010009737A1 (en) * 2010-03-01 2011-09-01 Institut für Rundfunktechnik GmbH Method and arrangement for reproducing 3D image content
JP5462672B2 (en) * 2010-03-16 2014-04-02 株式会社ジャパンディスプレイ Display device and electronic device
US8634873B2 (en) * 2010-03-17 2014-01-21 Microsoft Corporation Mobile communication device having multiple, interchangeable second devices
KR20110109565A (en) * 2010-03-31 2011-10-06 삼성전자주식회사 Backlight unit, 3d display having the same and method of making 3d image
US10448083B2 (en) * 2010-04-06 2019-10-15 Comcast Cable Communications, Llc Streaming and rendering of 3-dimensional video
KR20110115806A (en) * 2010-04-16 2011-10-24 삼성전자주식회사 Display apparatus and 3d glasses, and display system including the same
CN102449534B (en) * 2010-04-21 2014-07-02 松下电器产业株式会社 Three-dimensional video display device and three-dimensional video display method
US8667533B2 (en) * 2010-04-22 2014-03-04 Microsoft Corporation Customizing streaming content presentation
US9271052B2 (en) * 2010-05-10 2016-02-23 Comcast Cable Communications, Llc Grid encoded media asset data
US9030536B2 (en) 2010-06-04 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
JP5510097B2 (en) * 2010-06-16 2014-06-04 ソニー株式会社 Signal transmission method, signal transmission device, and signal reception device
US10089937B2 (en) * 2010-06-21 2018-10-02 Microsoft Technology Licensing, Llc Spatial and temporal multiplexing display
US9225975B2 (en) 2010-06-21 2015-12-29 Microsoft Technology Licensing, Llc Optimization of a multi-view display
KR20110139497A (en) * 2010-06-23 2011-12-29 삼성전자주식회사 Display apparatus and method for displaying thereof
JP2012013980A (en) * 2010-07-01 2012-01-19 Sony Corp Stereoscopic display device and display drive circuit
US9049426B2 (en) * 2010-07-07 2015-06-02 At&T Intellectual Property I, Lp Apparatus and method for distributing three dimensional media content
US8670070B2 (en) * 2010-07-15 2014-03-11 Broadcom Corporation Method and system for achieving better picture quality in various zoom modes
US9032470B2 (en) 2010-07-20 2015-05-12 At&T Intellectual Property I, Lp Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9232274B2 (en) 2010-07-20 2016-01-05 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
JP2012034138A (en) * 2010-07-29 2012-02-16 Toshiba Corp Signal processing apparatus and signal processing method
KR20120020627A (en) * 2010-08-30 2012-03-08 삼성전자주식회사 Apparatus and method for image processing using 3d image format
TW201227684A (en) * 2010-09-01 2012-07-01 Seereal Technologies Sa Backplane device
JP5058316B2 (en) * 2010-09-03 2012-10-24 株式会社東芝 Electronic device, image processing method, and image processing program
US20120057007A1 (en) * 2010-09-03 2012-03-08 Satoshi Ishiguro Simplified Visual Screening Check on Television
JP5364666B2 (en) * 2010-09-13 2013-12-11 株式会社東芝 Stereoscopic image display apparatus, method and program
AU2011305445B2 (en) 2010-09-24 2017-03-16 The Board Of Trustees Of The Leland Stanford Junior University Direct capture, amplification and sequencing of target DNA using immobilized primers
WO2012050036A1 (en) * 2010-10-13 2012-04-19 シャープ株式会社 Display device
KR20120046937A (en) * 2010-11-03 2012-05-11 삼성전자주식회사 Method and apparatus for providing 3d effect in video device
US10157526B2 (en) 2010-11-05 2018-12-18 Razberi Technologies, Inc. System and method for a security system
US11082665B2 (en) 2010-11-05 2021-08-03 Razberi Secure Technologies, Llc System and method for a security system
US8922658B2 (en) * 2010-11-05 2014-12-30 Tom Galvin Network video recorder system
KR101670927B1 (en) * 2010-11-05 2016-11-01 삼성전자주식회사 Display apparatus and method
US10477158B2 (en) 2010-11-05 2019-11-12 Razberi Technologies, Inc. System and method for a security system
US9860490B2 (en) 2010-11-05 2018-01-02 Tom Galvin Network video recorder system
US9218115B2 (en) 2010-12-02 2015-12-22 Lg Electronics Inc. Input device and image display apparatus including the same
KR20120065774A (en) * 2010-12-13 2012-06-21 삼성전자주식회사 Audio providing apparatus, audio receiver and method for providing audio
KR101734285B1 (en) * 2010-12-14 2017-05-11 엘지전자 주식회사 Video processing apparatus of mobile terminal and method thereof
US8963694B2 (en) * 2010-12-17 2015-02-24 Sony Corporation System and method for remote controlled device selection based on device position data and orientation data of a user
US20120154559A1 (en) * 2010-12-21 2012-06-21 Voss Shane D Generate Media
US9386294B2 (en) * 2011-01-05 2016-07-05 Google Technology Holdings LLC Method and apparatus for 3DTV image adjustment
US20120178380A1 (en) * 2011-01-07 2012-07-12 Microsoft Corporation Wireless Communication Techniques
US8643684B2 (en) * 2011-01-18 2014-02-04 Disney Enterprises, Inc. Multi-layer plenoptic displays that combine multiple emissive and light modulating planes
TW201232280A (en) * 2011-01-20 2012-08-01 Hon Hai Prec Ind Co Ltd System and method for sharing desktop information
KR20120088467A (en) * 2011-01-31 2012-08-08 삼성전자주식회사 Method and apparatus for displaying partial 3d image in 2d image disaply area
JP5632764B2 (en) * 2011-02-02 2014-11-26 セイコーインスツル株式会社 Stereoscopic image display device
US20120202187A1 (en) * 2011-02-03 2012-08-09 Shadowbox Comics, Llc Method for distribution and display of sequential graphic art
US8724467B2 (en) 2011-02-04 2014-05-13 Cisco Technology, Inc. System and method for managing congestion in a network environment
TWI569041B (en) 2011-02-14 2017-02-01 半導體能源研究所股份有限公司 Display device
US8630247B2 (en) * 2011-02-15 2014-01-14 Cisco Technology, Inc. System and method for managing tracking area identity lists in a mobile network environment
US9035860B2 (en) 2011-02-16 2015-05-19 Semiconductor Energy Laboratory Co., Ltd. Display device
KR101899178B1 (en) 2011-02-16 2018-09-14 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Display device
US9443455B2 (en) 2011-02-25 2016-09-13 Semiconductor Energy Laboratory Co., Ltd. Display device having a plurality of pixels
US9558687B2 (en) 2011-03-11 2017-01-31 Semiconductor Energy Laboratory Co., Ltd. Display device and method for driving the same
US9578299B2 (en) 2011-03-14 2017-02-21 Qualcomm Incorporated Stereoscopic conversion for shader based graphics content
JP5766479B2 (en) * 2011-03-25 2015-08-19 京セラ株式会社 Electronic device, control method, and control program
JP5730091B2 (en) * 2011-03-25 2015-06-03 株式会社ジャパンディスプレイ Display panel, display device and electronic device
JP2012205285A (en) * 2011-03-28 2012-10-22 Sony Corp Video signal processing apparatus and video signal processing method
WO2012138539A2 (en) * 2011-04-08 2012-10-11 The Regents Of The University Of California Interactive system for collecting, displaying, and ranking items based on quantitative and textual input from multiple participants
JP5162000B2 (en) * 2011-04-19 2013-03-13 株式会社東芝 Information processing apparatus, information processing method, and program
JP5161999B2 (en) * 2011-04-19 2013-03-13 株式会社東芝 Electronic device, display control method, and display control program
KR101569602B1 (en) * 2011-05-05 2015-11-16 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Lenticular directional display
US20120287115A1 (en) * 2011-05-10 2012-11-15 Ding Junjie Method for generating image frames
KR20120126458A (en) * 2011-05-11 2012-11-21 엘지전자 주식회사 Method for processing broadcasting signal and display device thereof
WO2012156778A1 (en) * 2011-05-13 2012-11-22 Sony Ericsson Mobile Communications Ab Adjusting parallax barriers
US8913104B2 (en) * 2011-05-24 2014-12-16 Bose Corporation Audio synchronization for two dimensional and three dimensional video signals
US9420259B2 (en) * 2011-05-24 2016-08-16 Comcast Cable Communications, Llc Dynamic distribution of three-dimensional content
JP6050941B2 (en) * 2011-05-26 2016-12-21 サターン ライセンシング エルエルシーSaturn Licensing LLC Display device and method, and program
US9442562B2 (en) * 2011-05-27 2016-09-13 Dolby Laboratories Licensing Corporation Systems and methods of image processing that adjust for viewer position, screen size and viewing distance
US9084068B2 (en) * 2011-05-30 2015-07-14 Sony Corporation Sensor-based placement of sound in video recording
CN103262551B (en) * 2011-06-01 2015-12-09 松下电器产业株式会社 Image processor, dispensing device, image processing system, image treatment method, sending method and integrated circuit
JP2012253543A (en) * 2011-06-02 2012-12-20 Seiko Epson Corp Display device, control method of display device, and program
JP5770018B2 (en) * 2011-06-03 2015-08-26 任天堂株式会社 Display control program, display control apparatus, display control method, and display control system
US9420268B2 (en) 2011-06-23 2016-08-16 Lg Electronics Inc. Apparatus and method for displaying 3-dimensional image
CN103621073B (en) * 2011-06-24 2016-06-22 汤姆逊许可公司 Transmit the method and apparatus of three-dimensional content
US9602766B2 (en) 2011-06-24 2017-03-21 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US9445046B2 (en) 2011-06-24 2016-09-13 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
US20130265300A1 (en) * 2011-07-03 2013-10-10 Neorai Vardi Computer device in form of wearable glasses and user interface thereof
JP2013015779A (en) * 2011-07-06 2013-01-24 Sony Corp Display control device, display control method, and computer program
US8988411B2 (en) 2011-07-08 2015-03-24 Semiconductor Energy Laboratory Co., Ltd. Display device
US9137522B2 (en) * 2011-07-11 2015-09-15 Realtek Semiconductor Corp. Device and method for 3-D display control
US9294752B2 (en) * 2011-07-13 2016-03-22 Google Technology Holdings LLC Dual mode user interface system and method for 3D video
US8928708B2 (en) 2011-07-15 2015-01-06 Semiconductor Energy Laboratory Co., Ltd. Display device and method for driving the display device
US8587635B2 (en) 2011-07-15 2013-11-19 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
KR101926477B1 (en) * 2011-07-18 2018-12-11 삼성전자 주식회사 Contents play method and apparatus
KR20130010834A (en) * 2011-07-19 2013-01-29 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Display device
JP2013038454A (en) * 2011-08-03 2013-02-21 Sony Corp Image processor, method, and program
JP2013038504A (en) 2011-08-04 2013-02-21 Sony Corp Imaging device, image processing method and program
JP5815326B2 (en) * 2011-08-12 2015-11-17 ルネサスエレクトロニクス株式会社 Video decoding device and image display device
EP2745462B1 (en) * 2011-08-18 2021-10-20 Pfaqutruma Research LLC Systems and methods of virtual world interaction
US10659724B2 (en) * 2011-08-24 2020-05-19 Ati Technologies Ulc Method and apparatus for providing dropped picture image processing
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
JP2013050538A (en) 2011-08-30 2013-03-14 Sony Corp Display device and electronic apparatus
JP2013050539A (en) * 2011-08-30 2013-03-14 Sony Corp Display device and electronic apparatus
JP2013050537A (en) * 2011-08-30 2013-03-14 Sony Corp Display device and electronic apparatus
US20130050596A1 (en) * 2011-08-30 2013-02-28 Industrial Technology Research Institute Auto-stereoscopic display and method for fabricating the same
WO2013032221A1 (en) * 2011-08-31 2013-03-07 엘지전자 주식회사 Digital broadcast signal processing method and device
US8872813B2 (en) 2011-09-02 2014-10-28 Adobe Systems Incorporated Parallax image authoring and viewing in digital media
DE112012003931T5 (en) 2011-09-21 2014-07-10 Magna Electronics, Inc. Image processing system for a motor vehicle with image data transmission and power supply via a coaxial cable
CN102510503B (en) * 2011-09-30 2015-06-03 深圳超多维光电子有限公司 Stereoscopic display method and stereoscopic display equipment
JP5715539B2 (en) * 2011-10-06 2015-05-07 株式会社ジャパンディスプレイ Display device and electronic device
KR20130037861A (en) * 2011-10-07 2013-04-17 삼성디스플레이 주식회사 Display apparatus and method of displaying three dimensional image using the same
GB2495725B (en) * 2011-10-18 2014-10-01 Sony Comp Entertainment Europe Image transfer apparatus and method
JP5149435B1 (en) * 2011-11-04 2013-02-20 株式会社東芝 Video processing apparatus and video processing method
CA2794898C (en) 2011-11-10 2019-10-29 Victor Yang Method of rendering and manipulating anatomical images on mobile computing device
KR101887058B1 (en) * 2011-11-11 2018-08-09 엘지전자 주식회사 A process for processing a three-dimensional image and a method for controlling electric power of the same
WO2013073428A1 (en) * 2011-11-15 2013-05-23 シャープ株式会社 Display device
US9942580B2 (en) * 2011-11-18 2018-04-10 At&T Intellectual Property I, L.P. System and method for automatically selecting encoding/decoding for streaming media
US8660362B2 (en) * 2011-11-21 2014-02-25 Microsoft Corporation Combined depth filtering and super resolution
WO2013081985A1 (en) 2011-11-28 2013-06-06 Magna Electronics, Inc. Vision system for vehicle
DE102011055967B4 (en) * 2011-12-02 2016-03-10 Seereal Technologies S.A. Measuring method and device for carrying out the measuring method
CN103163650A (en) * 2011-12-08 2013-06-19 武汉天马微电子有限公司 Naked eye three-dimensional (3D) grating structure
US20130156090A1 (en) * 2011-12-14 2013-06-20 Ati Technologies Ulc Method and apparatus for enabling multiuser use
US9042266B2 (en) * 2011-12-21 2015-05-26 Kik Interactive, Inc. Methods and apparatus for initializing a network connection for an output device
CN202995143U (en) * 2011-12-29 2013-06-12 三星电子株式会社 Glasses device and display device
US9392251B2 (en) 2011-12-29 2016-07-12 Samsung Electronics Co., Ltd. Display apparatus, glasses apparatus and method for controlling depth
EP2611176A3 (en) * 2011-12-29 2015-11-18 Samsung Electronics Co., Ltd. Display apparatus and controlling method thereof
TWI467235B (en) * 2012-02-06 2015-01-01 Innocom Tech Shenzhen Co Ltd Three-dimensional (3d) display and displaying method thereof
US11164394B2 (en) 2012-02-24 2021-11-02 Matterport, Inc. Employing three-dimensional (3D) data predicted from two-dimensional (2D) images using neural networks for 3D modeling applications and other applications
US9324190B2 (en) 2012-02-24 2016-04-26 Matterport, Inc. Capturing and aligning three-dimensional scenes
US10848731B2 (en) 2012-02-24 2020-11-24 Matterport, Inc. Capturing and aligning panoramic image and depth data
KR20130098023A (en) * 2012-02-27 2013-09-04 한국전자통신연구원 Apparatus and method for displaying an image on 3-dimentional display based on multi-layer parallax barrier
JP5942477B2 (en) * 2012-02-29 2016-06-29 富士ゼロックス株式会社 Setting device and program
EP2637416A1 (en) * 2012-03-06 2013-09-11 Alcatel Lucent A system and method for optimized streaming of variable multi-viewpoint media
CN104145234A (en) * 2012-03-07 2014-11-12 索尼公司 Information processing device, information processing method, and program
JP5762998B2 (en) * 2012-03-07 2015-08-12 株式会社ジャパンディスプレイ Display device and electronic device
JP5779124B2 (en) * 2012-03-13 2015-09-16 株式会社ジャパンディスプレイ Display device and electronic device
JP5806150B2 (en) * 2012-03-13 2015-11-10 株式会社ジャパンディスプレイ Display device
US9280042B2 (en) * 2012-03-16 2016-03-08 City University Of Hong Kong Automatic switching of a multi-mode projector display screen for displaying three-dimensional and two-dimensional images
BR112014001749B1 (en) 2012-03-16 2022-08-16 Tencent Technology (Shenzhen) Company Limited OFFLINE DOWNLOAD METHOD AND SYSTEM
CN102650741B (en) * 2012-03-16 2014-06-11 京东方科技集团股份有限公司 Light splitting device, manufacturing method thereof and 3D (Three-Dimensional) display device
US9733707B2 (en) 2012-03-22 2017-08-15 Honeywell International Inc. Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system
US9308439B2 (en) * 2012-04-10 2016-04-12 Bally Gaming, Inc. Controlling three-dimensional presentation of wagering game content
WO2013153418A1 (en) * 2012-04-12 2013-10-17 Sony Mobile Communications Ab Improved 3d image display system
KR101923150B1 (en) * 2012-04-16 2018-11-29 삼성디스플레이 주식회사 Display apparatus and method of displaying three dimensional image using the same
CN102645959A (en) * 2012-04-16 2012-08-22 上海颖杰计算机系统设备有限公司 3D (Three Dimensional) integrated computer
WO2013158322A1 (en) * 2012-04-18 2013-10-24 The Regents Of The University Of California Simultaneous 2d and 3d images on a display
EP2653906B1 (en) 2012-04-20 2022-08-24 Dolby Laboratories Licensing Corporation A system for delivering stereoscopic images
CN103379362B (en) * 2012-04-24 2017-07-07 腾讯科技(深圳)有限公司 VOD method and system
US9201495B2 (en) * 2012-04-24 2015-12-01 Mobitv, Inc. Control of perspective in multi-dimensional media
US9707892B2 (en) * 2012-04-25 2017-07-18 Gentex Corporation Multi-focus optical system
US20130290867A1 (en) * 2012-04-27 2013-10-31 Litera Technologies, LLC Systems and Methods For Providing Dynamic and Interactive Viewing and Control of Applications
KR20130123599A (en) * 2012-05-03 2013-11-13 한국과학기술원 Speed dependent automatic dimming technique
CN103457960B 2012-05-15 2018-03-09 腾讯科技(深圳)有限公司 Method and system for loading files in a web game
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
EP2856760B1 (en) * 2012-06-01 2018-09-05 Koninklijke Philips N.V. Autostereoscopic display device and driving method
US9201270B2 (en) * 2012-06-01 2015-12-01 Leia Inc. Directional backlight with a modulation layer
US8570651B1 (en) * 2012-06-04 2013-10-29 Hae-Yong Choi Both side screen for combined use of 2D/3D images
JP6046923B2 (en) * 2012-06-07 2016-12-21 キヤノン株式会社 Image coding apparatus, image coding method, and program
US9800862B2 (en) * 2012-06-12 2017-10-24 The Board Of Trustees Of The University Of Illinois System and methods for visualizing information
WO2014000129A1 (en) * 2012-06-30 2014-01-03 Intel Corporation 3d graphical user interface
US20140022241A1 (en) * 2012-07-18 2014-01-23 Electronics And Telecommunications Research Institute Display apparatus and method based on symmetrically spb
US10353718B2 (en) * 2012-07-23 2019-07-16 Vmware, Inc. Providing access to a remote application via a web client
US8959176B2 (en) 2012-07-31 2015-02-17 Apple Inc. Streaming common media content to multiple devices
US9491784B2 (en) * 2012-07-31 2016-11-08 Apple Inc. Streaming common media content to multiple devices
CA2822217A1 (en) 2012-08-02 2014-02-02 Iwatchlife Inc. Method and system for anonymous video analytics processing
US9786281B1 (en) * 2012-08-02 2017-10-10 Amazon Technologies, Inc. Household agent learning
US9423871B2 (en) * 2012-08-07 2016-08-23 Honeywell International Inc. System and method for reducing the effects of inadvertent touch on a touch screen controller
KR101994295B1 (en) * 2012-08-08 2019-06-28 삼성전자주식회사 Terminal and method for generating live image in terminal
US9225972B2 (en) 2012-08-10 2015-12-29 Pixtronix, Inc. Three dimensional (3D) image generation using electromechanical display elements
US9198209B2 (en) 2012-08-21 2015-11-24 Cisco Technology, Inc. Providing integrated end-to-end architecture that includes quality of service transport for tunneled traffic
TWI509289B (en) * 2012-08-27 2015-11-21 Innocom Tech Shenzhen Co Ltd Stereoscopic display apparatus and image display method thereof
CN103631021B (en) * 2012-08-27 2016-06-15 群康科技(深圳)有限公司 3 d display device and image display method thereof
KR20140028780A (en) * 2012-08-30 2014-03-10 삼성디스플레이 주식회사 Display apparatus and method of displaying three dimensional image using the same
US9811878B1 (en) * 2012-09-04 2017-11-07 Amazon Technologies, Inc. Dynamic processing of image borders
US10171540B2 (en) * 2012-09-07 2019-01-01 High Sec Labs Ltd Method and apparatus for streaming video security
US20150138444A1 (en) * 2012-09-14 2015-05-21 Masayuki Hirabayashi Video display apparatus and terminal device
US9179232B2 (en) * 2012-09-17 2015-11-03 Nokia Technologies Oy Method and apparatus for associating audio objects with content and geo-location
JP5837009B2 (en) * 2012-09-26 2015-12-24 キヤノン株式会社 Display device and control method thereof
CN104104934B (en) * 2012-10-04 2019-02-19 陈笛 Glasses-free multi-viewer three-dimensional display assembly and method
JP5928286B2 (en) * 2012-10-05 2016-06-01 富士ゼロックス株式会社 Information processing apparatus and program
MX2015004575A (en) * 2012-10-10 2016-07-06 Broadcast 3Dtv Inc System for distributing auto-stereoscopic images.
US20140104242A1 (en) * 2012-10-12 2014-04-17 Nvidia Corporation System and method for concurrent display of a video signal on a plurality of display devices
CN102917265A (en) * 2012-10-25 2013-02-06 深圳创维-Rgb电子有限公司 Information browsing method and system based on network television
US9235103B2 (en) * 2012-10-25 2016-01-12 Au Optronics Corporation 3D liquid crystal display comprising four electrodes alternately arrange between a first and second substrate
US9161018B2 (en) * 2012-10-26 2015-10-13 Christopher L. UHL Methods and systems for synthesizing stereoscopic images
TWI452345B (en) * 2012-10-26 2014-09-11 Au Optronics Corp Three dimensions display device and displaying method thereof
JP2014092744A (en) * 2012-11-06 2014-05-19 Japan Display Inc Stereoscopic display device
US9674510B2 (en) * 2012-11-21 2017-06-06 Elwha Llc Pulsed projection system for 3D video
CN102981343B (en) * 2012-11-21 2015-01-07 京东方科技集团股份有限公司 Convertible lens and preparation method thereof, as well as two-dimensional and three-dimensional display surface substrate and display device
CN104516168B (en) * 2012-11-21 2018-05-08 京东方科技集团股份有限公司 Convertible lens and preparation method thereof, 2 d-3 d display base plate and display device
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9128580B2 (en) 2012-12-07 2015-09-08 Honeywell International Inc. System and method for interacting with a touch screen interface utilizing an intelligent stencil mask
US20140165209A1 (en) * 2012-12-11 2014-06-12 Verizon Patent And Licensing Inc. Digital content delivery platform for multiple retailers
US9047054B1 (en) * 2012-12-20 2015-06-02 Audible, Inc. User location-based management of content presentation
US9497448B2 (en) * 2012-12-31 2016-11-15 Lg Display Co., Ltd. Image processing method of transparent display apparatus and apparatus thereof
TWI531213B (en) * 2013-01-18 2016-04-21 國立成功大學 Image conversion method and module for naked-eye 3d display
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
CN105392423B (en) 2013-02-01 2018-08-17 凯内蒂科尔股份有限公司 The motion tracking system of real-time adaptive motion compensation in biomedical imaging
WO2014129134A1 (en) * 2013-02-19 2014-08-28 パナソニック株式会社 Image display device
TWI502247B (en) * 2013-02-26 2015-10-01 Chunghwa Picture Tubes Ltd Autostereoscopic display device and display method thereof
US8712217B1 (en) * 2013-03-01 2014-04-29 Comcast Cable Communications, Llc Methods and systems for time-shifting content
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US20140267601A1 (en) * 2013-03-14 2014-09-18 Corel Corporation System and method for efficient editing of 3d video
US20140268324A1 (en) * 2013-03-18 2014-09-18 3-D Virtual Lens Technologies, Llc Method of displaying 3d images from 2d source images using a barrier grid
CN103236074B (en) * 2013-03-25 2015-12-23 深圳超多维光电子有限公司 A kind of 2D/3D image processing method and device
US10110647B2 (en) * 2013-03-28 2018-10-23 Qualcomm Incorporated Method and apparatus for altering bandwidth consumption
KR101981530B1 (en) 2013-03-29 2019-05-23 엘지디스플레이 주식회사 Stereoscopic image display device and method for driving the same
CN103235415B (en) * 2013-04-01 2015-12-23 昆山龙腾光电有限公司 Based on the multi-view free stereoscopic displayer of grating
KR101970577B1 (en) * 2013-04-09 2019-04-19 엘지디스플레이 주식회사 Stereoscopic display device and eye-tracking method thereof
US20140328505A1 (en) * 2013-05-02 2014-11-06 Microsoft Corporation Sound field adaptation based upon user tracking
CN103293689B (en) * 2013-05-31 2015-05-13 京东方科技集团股份有限公司 Method capable of switching between different display modes and display device
KR20140142863A (en) * 2013-06-05 2014-12-15 한국전자통신연구원 Apparatus and method for providing graphic editors
TWI510813B (en) * 2013-06-18 2015-12-01 Zhangjiagang Kangde Xin Optronics Material Co Ltd A liquid crystal parallax barrier device that displays three-dimensional images in both directions
CN104238185B (en) * 2013-06-19 2017-04-12 扬升照明股份有限公司 Light source module, display device and light source module drive method
CN103309639A (en) * 2013-06-21 2013-09-18 广东威创视讯科技股份有限公司 Method and device based on split screen display of three-dimensional scene
US10003789B2 (en) 2013-06-24 2018-06-19 The Regents Of The University Of California Practical two-frame 3D+2D TV
CN103365657B (en) * 2013-06-28 2019-03-15 北京智谷睿拓技术服务有限公司 Display control method, device and the display equipment including the device
TWI495904B (en) * 2013-07-12 2015-08-11 Vision Technology Co Ltd C Field sequential color lcd and method for generating 3d images by matching a software optical grating
US9418469B1 (en) 2013-07-19 2016-08-16 Outward, Inc. Generating video content
JP2015025968A (en) * 2013-07-26 2015-02-05 ソニー株式会社 Presentation medium and display device
TWI489148B (en) * 2013-08-23 2015-06-21 Au Optronics Corp Stereoscopic display and the driving method
TWI505243B (en) * 2013-09-10 2015-10-21 Zhangjiagang Kangde Xin Optronics Material Co Ltd A device that can display 2D and 3D images at the same time
KR101856568B1 (en) * 2013-09-16 2018-06-19 삼성전자주식회사 Multi view image display apparatus and controlling method thereof
US9392355B1 (en) * 2013-09-19 2016-07-12 Voyetra Turtle Beach, Inc. Gaming headset with voice scrambling for private in-game conversations
US9591295B2 (en) * 2013-09-24 2017-03-07 Amazon Technologies, Inc. Approaches for simulating three-dimensional views
WO2015054235A1 (en) * 2013-10-07 2015-04-16 Vid Scale, Inc. User adaptive 3d video rendering and delivery
CN103508999B (en) * 2013-10-12 2015-05-13 浙江海正药业股份有限公司 Maxacalcitol synthesizing intermediate and preparation method and application thereof
US10652525B2 (en) 2013-10-31 2020-05-12 3Di Llc Quad view display system
US9883173B2 (en) 2013-12-25 2018-01-30 3Di Llc Stereoscopic display
US10116914B2 (en) * 2013-10-31 2018-10-30 3Di Llc Stereoscopic display
US9986228B2 (en) 2016-03-24 2018-05-29 3Di Llc Trackable glasses system that provides multiple views of a shared display
US11343487B2 (en) 2013-10-31 2022-05-24 David Woods Trackable glasses system for perspective views of a display
JP6411862B2 (en) * 2013-11-15 2018-10-24 パナソニック株式会社 File generation method and file generation apparatus
KR20150057064A (en) * 2013-11-18 2015-05-28 엘지전자 주식회사 Electronic device and control method thereof
US20150138184A1 (en) * 2013-11-20 2015-05-21 Apple Inc. Spatially interactive computing device
TWI511112B (en) * 2013-11-27 2015-12-01 Acer Inc Image display method and display system
CN103605211B (en) * 2013-11-27 2016-04-20 南京大学 Tablet non-auxiliary stereo display device and method
KR20150065056A (en) * 2013-12-04 2015-06-12 삼성디스플레이 주식회사 Image display apparatus
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
US20150189256A1 (en) * 2013-12-16 2015-07-02 Christian Stroetmann Autostereoscopic multi-layer display and control approaches
CN103676302B * 2013-12-31 2016-04-06 京东方科技集团股份有限公司 Realize array base plate, display device and method that 2D/3D display switches
US10303242B2 (en) 2014-01-06 2019-05-28 Avegant Corp. Media chair apparatus, system, and method
US10409079B2 (en) 2014-01-06 2019-09-10 Avegant Corp. Apparatus, system, and method for displaying an image using a plate
JP6467680B2 (en) * 2014-01-10 2019-02-13 パナソニックIpマネジメント株式会社 File generation method and file generation apparatus
EP3097689B1 (en) 2014-01-23 2019-12-25 Telefonaktiebolaget LM Ericsson (publ) Multi-view display control for channel selection
US9182605B2 (en) * 2014-01-29 2015-11-10 Emine Goulanian Front-projection autostereoscopic 3D display system
US10554962B2 (en) 2014-02-07 2020-02-04 Samsung Electronics Co., Ltd. Multi-layer high transparency display for light field generation
US10565925B2 (en) 2014-02-07 2020-02-18 Samsung Electronics Co., Ltd. Full color display with intrinsic transparency
US10375365B2 (en) 2014-02-07 2019-08-06 Samsung Electronics Co., Ltd. Projection system with enhanced color and contrast
US10453371B2 (en) 2014-02-07 2019-10-22 Samsung Electronics Co., Ltd. Multi-layer display with color and contrast enhancement
CN103792672B (en) * 2014-02-14 2016-03-23 成都京东方光电科技有限公司 Stereo display assembly, liquid crystal panel and display device
CN104853008B (en) * 2014-02-17 2020-05-19 北京三星通信技术研究有限公司 Portable device and method capable of switching between two-dimensional display and three-dimensional display
KR101678389B1 (en) * 2014-02-28 2016-11-22 엔트릭스 주식회사 Method for providing media data based on cloud computing, apparatus and system
CN103903548B (en) * 2014-03-07 2016-03-02 京东方科技集团股份有限公司 A kind of driving method of display panel and drive system
CN106572810A (en) 2014-03-24 2017-04-19 凯内蒂科尔股份有限公司 Systems, methods, and devices for removing prospective motion correction from medical imaging scans
US9373306B2 (en) * 2014-03-25 2016-06-21 Intel Corporation Direct viewer projection
KR102175813B1 (en) * 2014-04-18 2020-11-09 삼성디스플레이 주식회사 Three dimensional image display device and method of processing image
US20150334367A1 (en) * 2014-05-13 2015-11-19 Nagravision S.A. Techniques for displaying three dimensional objects
US9838756B2 (en) * 2014-05-20 2017-12-05 Electronics And Telecommunications Research Institute Method and apparatus for providing three-dimensional territorial broadcasting based on non real time service
KR102204830B1 (en) * 2014-05-20 2021-01-19 한국전자통신연구원 Method and apparatus for providing three-dimensional territorial brordcasting based on non real time service
CN104023223B (en) * 2014-05-29 2016-03-02 京东方科技集团股份有限公司 Display control method, Apparatus and system
CN104090365A (en) * 2014-06-18 2014-10-08 京东方科技集团股份有限公司 Shutter glasses, display device, display system and display method
US10613585B2 (en) * 2014-06-19 2020-04-07 Samsung Electronics Co., Ltd. Transparent display apparatus, group play system using transparent display apparatus and performance methods thereof
GB2527548A (en) * 2014-06-25 2015-12-30 Sharp Kk Variable barrier pitch correction
KR102221676B1 (en) * 2014-07-02 2021-03-02 삼성전자주식회사 Method, User terminal and Audio System for the speaker location and level control using the magnetic field
CN104155769A (en) * 2014-07-15 2014-11-19 深圳市亿思达显示科技有限公司 2D/3D co-fusion display device and advertizing device
CN104090818A (en) * 2014-07-16 2014-10-08 北京智谷睿拓技术服务有限公司 Information processing method, device and system
TWI556624B (en) * 2014-07-18 2016-11-01 友達光電股份有限公司 Image displaying method and image dispaly device
CN104252058B (en) * 2014-07-18 2017-06-20 京东方科技集团股份有限公司 Grating control method and device, grating, display panel and 3D display devices
EP3188660A4 (en) 2014-07-23 2018-05-16 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
WO2016017695A1 (en) * 2014-07-30 2016-02-04 オリンパス株式会社 Image processing device
KR102366677B1 (en) * 2014-08-02 2022-02-23 삼성전자주식회사 Apparatus and Method for User Interaction thereof
JP6327062B2 (en) * 2014-08-25 2018-05-23 オムロン株式会社 Display device
US9925980B2 (en) 2014-09-17 2018-03-27 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
US10257494B2 (en) 2014-09-22 2019-04-09 Samsung Electronics Co., Ltd. Reconstruction of three-dimensional video
CN107079147B (en) 2014-09-25 2019-08-16 皇家飞利浦有限公司 Display equipment with outbound course control and the backlight for this display equipment
FR3026852B1 (en) * 2014-10-03 2016-12-02 Thales Sa SEMI-TRANSPARENT SCREEN DISPLAY SYSTEM SHARED BY TWO OBSERVERS
US10506295B2 (en) * 2014-10-09 2019-12-10 Disney Enterprises, Inc. Systems and methods for delivering secondary content to viewers
KR102266064B1 (en) * 2014-10-15 2021-06-18 삼성디스플레이 주식회사 Method of driving display panel, display panel driving apparatus and display apparatus having the display panel driving apparatus
US20160119685A1 (en) * 2014-10-21 2016-04-28 Samsung Electronics Co., Ltd. Display method and display device
CN104361622B (en) * 2014-10-31 2018-06-19 福建星网视易信息系统有限公司 A kind of interface method for drafting and device
CN104461440B (en) * 2014-12-31 2018-01-02 上海天马有机发光显示技术有限公司 Rendering intent, rendering device and display device
EP3243093A4 (en) 2015-01-10 2018-09-19 LEIA Inc. Diffraction grating-based backlighting having controlled diffractive coupling efficiency
WO2016111706A1 (en) 2015-01-10 2016-07-14 Leia Inc. Polarization-mixing light guide and multibeam grating-based backlighting using same
CN107209406B (en) 2015-01-10 2021-07-27 镭亚股份有限公司 Two-dimensional/three-dimensional (2D/3D) switchable display backlight and electronic display
WO2016118107A1 (en) 2015-01-19 2016-07-28 Leia Inc. Unidirectional grating-based backlighting employing a reflective island
KR20160089600A (en) * 2015-01-19 2016-07-28 삼성디스플레이 주식회사 Display device
US9690110B2 (en) * 2015-01-21 2017-06-27 Apple Inc. Fine-coarse autostereoscopic display
WO2016122679A1 (en) * 2015-01-28 2016-08-04 Leia Inc. Three-dimensional (3d) electronic display
US9973725B2 (en) * 2015-02-02 2018-05-15 Continental Teves Ag & Co. Ohg Modular television system
JP6359990B2 (en) * 2015-02-24 2018-07-18 株式会社ジャパンディスプレイ Display device and display method
JP6359989B2 (en) * 2015-02-24 2018-07-18 株式会社ジャパンディスプレイ Display device and display method
TWI554788B (en) * 2015-03-04 2016-10-21 友達光電股份有限公司 Display device
KR102321364B1 (en) * 2015-03-05 2021-11-03 삼성전자주식회사 Method for synthesizing a 3d backgroud content and device thereof
EP3271761B1 (en) 2015-03-16 2021-04-21 LEIA Inc. Unidirectional grating-based backlighting employing an angularly selective reflective layer
JP6411257B2 (en) * 2015-03-19 2018-10-24 株式会社ジャパンディスプレイ Display device and control method thereof
US9823474B2 (en) 2015-04-02 2017-11-21 Avegant Corp. System, apparatus, and method for displaying an image with a wider field of view
US9995857B2 (en) 2015-04-03 2018-06-12 Avegant Corp. System, apparatus, and method for displaying an image using focal modulation
US9846309B2 (en) * 2015-04-17 2017-12-19 Dongseo University Technology Headquarters Depth-priority integral imaging display method using nonuniform dynamic mask array
KR102329108B1 (en) 2015-04-23 2021-11-18 레이아 인코포레이티드 Dual light guide grating-based backlight and electronic display using same
US10360617B2 (en) 2015-04-24 2019-07-23 Walmart Apollo, Llc Automated shopping apparatus and method in response to consumption
US9705936B2 (en) * 2015-04-24 2017-07-11 Mersive Technologies, Inc. System and method for interactive and real-time visualization of distributed media
KR102239156B1 (en) 2015-05-09 2021-04-12 레이아 인코포레이티드 Color-scanning grating-based backlight and electronic display using same
CN104834104B (en) * 2015-05-25 2017-05-24 京东方科技集团股份有限公司 2D/3D switchable display panel, and display method and display device thereof
ES2819239T3 (en) 2015-05-30 2021-04-15 Leia Inc Vehicle display system
US10904091B2 (en) 2015-06-03 2021-01-26 Avago Technologies International Sales Pte. Limited System for network-based reallocation of functions
CN104883559A (en) * 2015-06-06 2015-09-02 深圳市虚拟现实科技有限公司 Video playing method and video playing device
CN104851394B (en) * 2015-06-10 2017-11-28 京东方科技集团股份有限公司 A kind of display device and display methods
CN104849870B (en) * 2015-06-12 2018-01-09 京东方科技集团股份有限公司 Display panel and display device
CN107810631A (en) * 2015-06-16 2018-03-16 Lg电子株式会社 Broadcast singal dispensing device, broadcast receiver, broadcast singal sending method and broadcast signal received method
US9846310B2 (en) * 2015-06-22 2017-12-19 Innolux Corporation 3D image display device with improved depth ranges
GB2540376A (en) * 2015-07-14 2017-01-18 Sharp Kk Parallax barrier with independently controllable regions
GB2540377A (en) 2015-07-14 2017-01-18 Sharp Kk Parallax barrier with independently controllable regions
WO2017015056A1 (en) * 2015-07-17 2017-01-26 Abl Ip Holding Llc Arrangements for software configurable lighting device
EP3325401A1 (en) 2015-07-17 2018-05-30 ABL IP Holding LLC Systems and methods to provide configuration data to a software configurable lighting device
EP3325400A1 (en) 2015-07-17 2018-05-30 ABL IP Holding LLC Software configurable lighting device
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10079000B2 (en) 2015-08-12 2018-09-18 Microsoft Technology Licensing, Llc Reducing display degradation
CN105100783B (en) * 2015-08-19 2018-03-23 京东方科技集团股份有限公司 3D display device and 3D display method
US10186188B2 (en) * 2015-09-23 2019-01-22 Motorola Solutions, Inc. Multi-angle simultaneous view light-emitting diode display
EP3148188A1 (en) * 2015-09-24 2017-03-29 Airbus Operations GmbH Virtual windows for airborne vehicles
CN106254845B * 2015-10-20 2017-08-25 深圳超多维光电子有限公司 A kind of method of naked-eye stereoscopic display, device and electronic equipment
CN105306866A (en) * 2015-10-27 2016-02-03 青岛海信电器股份有限公司 Frame rate conversion method and device
RU2720660C2 (en) * 2015-11-10 2020-05-12 Конинклейке Филипс Н.В. Display device and a display device control method
US11079931B2 (en) 2015-11-13 2021-08-03 Harman International Industries, Incorporated User interface for in-vehicle system
US20170148488A1 (en) * 2015-11-20 2017-05-25 Mediatek Inc. Video data processing system and associated method for analyzing and summarizing recorded video data
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10144419B2 (en) 2015-11-23 2018-12-04 Magna Electronics Inc. Vehicle dynamic control system for emergency handling
US9711128B2 (en) 2015-12-04 2017-07-18 Opentv, Inc. Combined audio for multiple content presentation
CA3011552A1 (en) 2016-01-19 2017-07-27 Walmart Apollo, Llc Consumable item ordering system
US10373544B1 (en) 2016-01-29 2019-08-06 Leia, Inc. Transformation from tiled to composite images
CN108885352B (en) * 2016-01-29 2021-11-23 奇跃公司 Display of three-dimensional images
WO2017156622A1 (en) * 2016-03-13 2017-09-21 Rising Sun Productions Limited Head-mounted audiovisual capture device
US10063917B2 (en) 2016-03-16 2018-08-28 Sorenson Media Inc. Fingerprint layouts for content fingerprinting
US10200428B1 (en) * 2016-03-30 2019-02-05 Amazon Technologies, Inc. Unicast routing of a media stream to subscribers
US10185787B1 (en) * 2016-04-06 2019-01-22 Bentley Systems, Incorporated Tool for accurate onsite model visualization that facilitates environment interaction
US10256277B2 (en) * 2016-04-11 2019-04-09 Abl Ip Holding Llc Luminaire utilizing a transparent organic light emitting device display
WO2017188955A1 (en) * 2016-04-28 2017-11-02 Hewlett-Packard Development Company, L.P. Digital display devices
TWI626475B (en) * 2016-06-08 2018-06-11 國立交通大學 Stereoscopic display screen and stereoscopic display system
KR102483042B1 (en) 2016-06-17 2022-12-29 디티에스, 인코포레이티드 Distance panning using near/far rendering
CN105842865B (en) * 2016-06-21 2018-01-30 成都工业学院 A kind of slim grating 3D display device based on slit grating
CN106257321B (en) * 2016-06-28 2021-11-30 京东方科技集团股份有限公司 3D head-up display system and method
US20180035236A1 (en) * 2016-07-28 2018-02-01 Leonardo Basterra Audio System with Binaural Elements and Method of Use with Perspective Switching
US10235010B2 (en) 2016-07-28 2019-03-19 Canon Kabushiki Kaisha Information processing apparatus configured to generate an audio signal corresponding to a virtual viewpoint image, information processing system, information processing method, and non-transitory computer-readable storage medium
US10089063B2 (en) 2016-08-10 2018-10-02 Qualcomm Incorporated Multimedia device for processing spatialized audio based on movement
US10154253B2 (en) * 2016-08-29 2018-12-11 Disney Enterprises, Inc. Multi-view displays using images encoded with orbital angular momentum (OAM) on a pixel or image basis
WO2018044711A1 (en) * 2016-08-31 2018-03-08 Wal-Mart Stores, Inc. Systems and methods of enabling retail shopping while disabling components based on location
US10621898B2 (en) * 2016-11-23 2020-04-14 Pure Depth Limited Multi-layer display system for vehicle dash or the like
GB2556910A (en) * 2016-11-25 2018-06-13 Nokia Technologies Oy Virtual reality display
US10170060B2 (en) * 2016-12-27 2019-01-01 Facebook Technologies, Llc Interlaced liquid crystal display panel and backlight used in a head mounted display
US10856016B2 (en) 2016-12-31 2020-12-01 Turner Broadcasting System, Inc. Publishing disparate live media output streams in mixed mode based on user selection
US10645462B2 (en) 2016-12-31 2020-05-05 Turner Broadcasting System, Inc. Dynamic channel versioning in a broadcast air chain
US10992973B2 (en) 2016-12-31 2021-04-27 Turner Broadcasting System, Inc. Publishing a plurality of disparate live media output stream manifests using live input streams and pre-encoded media assets
US10694231B2 (en) 2016-12-31 2020-06-23 Turner Broadcasting System, Inc. Dynamic channel versioning in a broadcast air chain based on user preferences
US10965967B2 (en) 2016-12-31 2021-03-30 Turner Broadcasting System, Inc. Publishing a disparate per-client live media output stream based on dynamic insertion of targeted non-programming content and customized programming content
US11038932B2 (en) 2016-12-31 2021-06-15 Turner Broadcasting System, Inc. System for establishing a shared media session for one or more client devices
US10075753B2 (en) 2016-12-31 2018-09-11 Turner Broadcasting System, Inc. Dynamic scheduling and channel creation based on user selection
US11051061B2 (en) 2016-12-31 2021-06-29 Turner Broadcasting System, Inc. Publishing a disparate live media output stream using pre-encoded media assets
US11051074B2 (en) 2016-12-31 2021-06-29 Turner Broadcasting System, Inc. Publishing disparate live media output streams using live input streams
US11134309B2 (en) 2016-12-31 2021-09-28 Turner Broadcasting System, Inc. Creation of channels using pre-encoded media assets
US10425700B2 (en) 2016-12-31 2019-09-24 Turner Broadcasting System, Inc. Dynamic scheduling and channel creation based on real-time or near-real-time content context analysis
US11109086B2 (en) 2016-12-31 2021-08-31 Turner Broadcasting System, Inc. Publishing disparate live media output streams in mixed mode
US11503352B2 (en) 2016-12-31 2022-11-15 Turner Broadcasting System, Inc. Dynamic scheduling and channel creation based on external data
CN108287679A (en) * 2017-01-10 2018-07-17 中兴通讯股份有限公司 Display characteristic parameter adjustment method and terminal
CN106710531B (en) * 2017-01-19 2019-11-05 深圳市华星光电技术有限公司 Backlight control circuit and electronic device
US11044464B2 (en) * 2017-02-09 2021-06-22 Fyusion, Inc. Dynamic content modification of image and video based multi-view interactive digital media representations
US10210833B2 (en) * 2017-03-31 2019-02-19 Panasonic Liquid Crystal Display Co., Ltd. Display device
US10078135B1 (en) * 2017-04-25 2018-09-18 Intel Corporation Identifying a physical distance using audio channels
WO2018213101A1 (en) 2017-05-14 2018-11-22 Leia Inc. Multiview backlight, display, and method employing active emitters
US10375375B2 (en) 2017-05-15 2019-08-06 Lg Electronics Inc. Method of providing fixed region information or offset region information for subtitle in virtual reality system and device for controlling the same
FR3066672B1 (en) * 2017-05-19 2020-05-22 Sagemcom Broadband Sas Method for communicating an immersive video
US11245964B2 (en) 2017-05-25 2022-02-08 Turner Broadcasting System, Inc. Management and delivery of over-the-top services over different content-streaming systems
CN116666814A (en) 2017-05-30 2023-08-29 奇跃公司 Power supply assembly with fan assembly for electronic device
CN107146573B (en) * 2017-06-26 2020-05-01 上海天马有机发光显示技术有限公司 Display panel, display method thereof and display device
EP3422151A1 (en) * 2017-06-30 2019-01-02 Nokia Technologies Oy Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality
JP7398962B2 (en) 2017-07-28 2023-12-15 マジック リープ, インコーポレイテッド Fan assembly for displaying images
CN107396087B (en) * 2017-07-31 2019-03-12 京东方科技集团股份有限公司 Naked eye three-dimensional display device and its control method
US10692279B2 (en) * 2017-07-31 2020-06-23 Quantum Spatial, Inc. Systems and methods for facilitating making partial selections of multidimensional information while maintaining a multidimensional structure
US11049218B2 (en) 2017-08-11 2021-06-29 Samsung Electronics Company, Ltd. Seamless image stitching
US10515397B2 (en) * 2017-09-08 2019-12-24 Uptown Network LLC System and method for facilitating virtual gift giving
CN107707901B (en) * 2017-09-30 2019-10-25 深圳超多维科技有限公司 Display method, apparatus and device for a naked-eye 3D display screen
CN108205411A (en) * 2017-09-30 2018-06-26 中兴通讯股份有限公司 Display switching method and apparatus, and terminal
US10777057B1 (en) * 2017-11-30 2020-09-15 Amazon Technologies, Inc. Premises security system with audio simulating occupancy
US10212532B1 (en) 2017-12-13 2019-02-19 At&T Intellectual Property I, L.P. Immersive media with media device
EP3503579B1 (en) * 2017-12-20 2022-03-23 Nokia Technologies Oy Multi-camera device
US11132842B2 (en) * 2017-12-22 2021-09-28 Unity IPR ApS Method and system for synchronizing a plurality of augmented reality devices to a virtual reality device
JP2019154008A (en) * 2018-03-06 2019-09-12 シャープ株式会社 Stereoscopic image display device, method for displaying liquid crystal display, and program for liquid crystal display
CN108469682A (en) * 2018-03-30 2018-08-31 京东方科技集团股份有限公司 Three-dimensional display apparatus and three-dimensional display method thereof
CN108490703B (en) * 2018-04-03 2021-10-15 京东方科技集团股份有限公司 Display system and display control method thereof
US11025892B1 (en) 2018-04-04 2021-06-01 James Andrew Aman System and method for simultaneously providing public and private images
KR102622714B1 (en) 2018-04-08 2024-01-08 디티에스, 인코포레이티드 Ambisonic depth extraction
CN112567759B (en) * 2018-04-11 2023-09-29 阿尔卡鲁兹公司 Digital media system supporting multiple features regarding virtual reality content
US10999573B2 (en) * 2018-04-25 2021-05-04 Raxium, Inc. Partial light field display architecture
CN112005289B (en) 2018-04-26 2023-07-18 株式会社半导体能源研究所 Display device and electronic apparatus
EP3579584A1 (en) 2018-06-07 2019-12-11 Nokia Technologies Oy Controlling rendering of a spatial audio scene
US10600246B2 (en) * 2018-06-15 2020-03-24 Microsoft Technology Licensing, Llc Pinning virtual reality passthrough regions to real-world locations
KR102506873B1 (en) * 2018-07-18 2023-03-08 현대자동차주식회사 Vehicle cluster having a three-dimensional effect, system having the same and method providing a three-dimensional scene thereof
US11276360B2 (en) * 2018-07-27 2022-03-15 Kyocera Corporation Display device and mobile body
US10762394B2 (en) 2018-07-31 2020-09-01 Intel Corporation System and method for 3D blob classification and transmission
US11212506B2 (en) 2018-07-31 2021-12-28 Intel Corporation Reduced rendering of six-degree of freedom video
US11178373B2 (en) 2018-07-31 2021-11-16 Intel Corporation Adaptive resolution of point cloud and viewpoint prediction for video streaming in computing environments
US10887574B2 (en) 2018-07-31 2021-01-05 Intel Corporation Selective packing of patches for immersive video
US10893299B2 (en) 2018-07-31 2021-01-12 Intel Corporation Surface normal vector processing mechanism
US10757324B2 (en) 2018-08-03 2020-08-25 Semiconductor Components Industries, Llc Transform processors for gradually switching between image transforms
US11057631B2 (en) 2018-10-10 2021-07-06 Intel Corporation Point cloud coding standard conformance definition in computing environments
CN109192136B (en) * 2018-10-25 2020-12-22 京东方科技集团股份有限公司 Display substrate, light field display device and driving method thereof
US11727859B2 (en) 2018-10-25 2023-08-15 Boe Technology Group Co., Ltd. Display panel and display device
US10880534B2 (en) * 2018-11-09 2020-12-29 Korea Electronics Technology Institute Electronic device and method for tiled video multi-channel playback
KR102023905B1 (en) * 2018-11-09 2019-11-04 전자부품연구원 Electronic device and method for multi-channel reproduction of tiled image
US10699673B2 (en) * 2018-11-19 2020-06-30 Facebook Technologies, Llc Apparatus, systems, and methods for local dimming in brightness-controlled environments
CN109598254B (en) * 2018-12-17 2019-11-26 海南大学 Group-oriented combined optimization method for spatial representation
US10880606B2 (en) 2018-12-21 2020-12-29 Turner Broadcasting System, Inc. Disparate live media output stream playout and broadcast distribution
US11082734B2 (en) 2018-12-21 2021-08-03 Turner Broadcasting System, Inc. Publishing a disparate live media output stream that complies with distribution format regulations
US10873774B2 (en) 2018-12-22 2020-12-22 Turner Broadcasting System, Inc. Publishing a disparate live media output stream manifest that includes one or more media segments corresponding to key events
US10854171B2 (en) * 2018-12-31 2020-12-01 Samsung Electronics Co., Ltd. Multi-user personal display system and applications thereof
EP3687166A1 (en) * 2019-01-23 2020-07-29 Ultra-D Coöperatief U.A. Interoperable 3d image content handling
CN109686303B (en) * 2019-01-28 2021-09-17 厦门天马微电子有限公司 Organic light-emitting display panel, organic light-emitting display device and compensation method
JP7317517B2 (en) * 2019-02-12 2023-07-31 株式会社ジャパンディスプレイ Display device
US10932080B2 (en) 2019-02-14 2021-02-23 Microsoft Technology Licensing, Llc Multi-sensor object tracking for modifying audio
CN110007475A (en) * 2019-04-17 2019-07-12 万维云视(上海)数码科技有限公司 Method and apparatus for compensating eyesight using virtual depth
US10504453B1 (en) 2019-04-18 2019-12-10 Apple Inc. Displays with adjustable direct-lit backlight units
US10964275B2 (en) 2019-04-18 2021-03-30 Apple Inc. Displays with adjustable direct-lit backlight units and adaptive processing
US10571744B1 (en) 2019-04-18 2020-02-25 Apple Inc. Displays with adjustable direct-lit backlight units and power consumption compensation
EP3938884A4 (en) * 2019-04-29 2022-11-09 Hewlett-Packard Development Company, L.P. Wireless configuration of display attribute
CN110262051B (en) * 2019-07-26 2023-12-29 成都工业学院 Retroreflective stereoscopic display device based on directional light source
EP3779612A1 (en) * 2019-08-16 2021-02-17 The Swatch Group Research and Development Ltd Method for broadcasting a message to the wearer of a watch
CN112394845B (en) * 2019-08-19 2024-03-01 北京小米移动软件有限公司 Distance sensor module, display device, electronic equipment and distance detection method
US11335095B1 (en) * 2019-08-27 2022-05-17 Gopro, Inc. Systems and methods for characterizing visual content
KR20220054850A (en) 2019-09-03 2022-05-03 라이트 필드 랩 인코포레이티드 Lightfield display system for gaming environments
CN111415629B (en) * 2020-04-28 2022-02-22 Tcl华星光电技术有限公司 Display device driving method and display device
US11750795B2 (en) 2020-05-12 2023-09-05 Apple Inc. Displays with viewer tracking
US11936844B1 (en) 2020-08-11 2024-03-19 Apple Inc. Pre-processing in a display pipeline
CN112505942B (en) * 2021-02-03 2021-04-20 成都工业学院 Multi-resolution stereoscopic display device based on rear projection light source
CN113992885B (en) * 2021-09-22 2023-03-21 联想(北京)有限公司 Data synchronization method and device
NL2030325B1 (en) * 2021-12-28 2023-07-03 Dimenco Holding B V Scaling of three-dimensional content for an autostereoscopic display device
KR20230112485A (en) * 2022-01-20 2023-07-27 엘지전자 주식회사 Display device and operating method thereof
CN114936002A (en) * 2022-06-10 2022-08-23 斑马网络技术有限公司 Interface display method and apparatus, and vehicle

Family Cites Families (164)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS56109649A (en) 1980-02-05 1981-08-31 Matsushita Electric Ind Co Ltd Ultrasonic diagnosing device
JPH05122733A (en) * 1991-10-28 1993-05-18 Nippon Hoso Kyokai <Nhk> Three-dimensional picture display device
US5493427A (en) * 1993-05-25 1996-02-20 Sharp Kabushiki Kaisha Three-dimensional display unit with a variable lens
JPH10232626A (en) * 1997-02-20 1998-09-02 Canon Inc Stereoscopic image display device
US6590605B1 (en) 1998-10-14 2003-07-08 Dimension Technologies, Inc. Autostereoscopic display
US6533420B1 (en) 1999-01-22 2003-03-18 Dimension Technologies, Inc. Apparatus and method for generating and projecting autostereoscopic images
US6591306B1 (en) * 1999-04-01 2003-07-08 Nec Corporation IP network access for portable devices
US8271336B2 (en) 1999-11-22 2012-09-18 Accenture Global Services Gmbh Increased visibility during order management in a network-based supply chain environment
US7389214B1 (en) 2000-05-01 2008-06-17 Accenture, Llp Category analysis in a market management
US6856581B1 (en) 2000-10-31 2005-02-15 International Business Machines Corporation Batteryless, oscillatorless, binary time cell usable as an horological device with associated programming methods and devices
WO2002037471A2 (en) 2000-11-03 2002-05-10 Zoesis, Inc. Interactive character system
DE10103922A1 (en) 2001-01-30 2002-08-01 Physoptics Opto Electronic Gmb Interactive data viewing and operating system
US20020194604A1 (en) 2001-06-19 2002-12-19 Sanchez Elizabeth C. Interactive television virtual shopping cart
JP2003322824A (en) * 2002-02-26 2003-11-14 Namco Ltd Stereoscopic video display device and electronic apparatus
JP3738843B2 (en) 2002-06-11 2006-01-25 ソニー株式会社 Image detection apparatus, image detection method, and image detection program
JP2004072202A (en) 2002-08-01 2004-03-04 Ktfreetel Co Ltd Separate billing method of communication utility charge and apparatus therefor
US20080008202A1 (en) 2002-10-31 2008-01-10 Terrell William C Router with routing processors and methods for virtualization
US7769668B2 (en) 2002-12-09 2010-08-03 Sam Balabon System and method for facilitating trading of financial instruments
US8270810B2 (en) 2002-12-11 2012-09-18 Broadcom Corporation Method and system for advertisement insertion and playback for STB with PVR functionality
US8799366B2 (en) 2002-12-11 2014-08-05 Broadcom Corporation Migration of stored media through a media exchange network
CA2457602A1 (en) 2003-02-19 2004-08-19 Impatica Inc. Method of synchronizing streams of real time data
US8438601B2 (en) 2003-07-02 2013-05-07 Rovi Solutions Corporation Resource management for a networked personal video recording system
US7557876B2 (en) * 2003-07-25 2009-07-07 Nitto Denko Corporation Anisotropic fluorescent thin crystal film and backlight system and liquid crystal display incorporating the same
GB0326005D0 (en) 2003-11-07 2003-12-10 Koninkl Philips Electronics Nv Waveguide for autostereoscopic display
WO2005057248A2 (en) 2003-12-04 2005-06-23 New York University Eye tracked foveal display by controlled illumination
US8154686B2 (en) 2004-01-20 2012-04-10 Sharp Kabushiki Kaisha Directional backlight, a multiple view display and a multi-direction display
KR100786862B1 (en) 2004-11-30 2007-12-20 삼성에스디아이 주식회사 Barrier device, three dimensional image display using the same and method thereof
EP1838899A2 (en) 2004-11-30 2007-10-03 Agoura Technologies Inc. Applications and fabrication techniques for large scale wire grid polarizers
JP2008523689A (en) 2004-12-10 2008-07-03 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Wireless video streaming and prioritized streaming using single layer coding
CN101107644B (en) * 2005-01-18 2010-11-24 皇家飞利浦电子股份有限公司 Multi-view display device
JP4600317B2 (en) 2005-03-31 2010-12-15 カシオ計算機株式会社 Illumination device that emits at least two illumination lights having directivity and display device using the same
KR100732961B1 (en) 2005-04-01 2007-06-27 경희대학교 산학협력단 Method and apparatus for multiview scalable image encoding and decoding
ES2860754T3 (en) * 2005-04-29 2021-10-05 Koninklijke Philips Nv A stereoscopic display apparatus
KR100661241B1 (en) * 2005-05-16 2006-12-22 엘지전자 주식회사 Fabrication method of optical sheet
GB2426351A (en) 2005-05-19 2006-11-22 Sharp Kk A dual view display
KR100813961B1 (en) * 2005-06-14 2008-03-14 삼성전자주식회사 Method and apparatus for transmitting and receiving of video, and transport stream structure thereof
JP5091857B2 (en) 2005-06-30 2012-12-05 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ System control method
KR100647517B1 (en) 2005-08-26 2006-11-23 (주)마스터이미지 Cell type parallax-barrier and stereoscopic image display apparatus using the same
JP5112326B2 (en) 2005-11-02 2013-01-09 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Optical system for 3D display
US20070110035A1 (en) 2005-11-14 2007-05-17 Broadcom Corporation, A California Corporation Network nodes cooperatively routing traffic flow amongst wired and wireless networks
JP5121136B2 (en) 2005-11-28 2013-01-16 株式会社ジャパンディスプレイウェスト Image display device, electronic device, portable device, and image display method
KR100739067B1 (en) 2005-11-30 2007-07-12 삼성에스디아이 주식회사 Three-dimensional display device
EP3599764A3 (en) * 2005-12-20 2020-07-08 Koninklijke Philips N.V. Autostereoscopic display device
US20070153122A1 (en) 2005-12-30 2007-07-05 Ayite Nii A Apparatus and method for simultaneous multiple video channel viewing
WO2007095476A2 (en) 2006-02-10 2007-08-23 Colorlink, Inc. Multi-functional active matrix liquid crystal displays
US20070225994A1 (en) 2006-03-17 2007-09-27 Moore Barrett H Method for Providing Private Civil Security Services Bundled with Second Party Products
US8368749B2 (en) * 2006-03-27 2013-02-05 Ge Inspection Technologies Lp Article inspection apparatus
US8466954B2 (en) 2006-04-03 2013-06-18 Sony Computer Entertainment Inc. Screen sharing method and apparatus
KR100893616B1 (en) * 2006-04-17 2009-04-20 삼성모바일디스플레이주식회사 Electronic imaging device, 2d/3d image display device and the driving method thereof
TWI378747B (en) * 2006-08-18 2012-12-01 Ind Tech Res Inst Flexible electronic assembly
US20110090413A1 (en) * 2006-08-18 2011-04-21 Industrial Technology Research Institute 3-dimensional image display
US7844547B2 (en) 2006-08-21 2010-11-30 Carl Raymond Amos Uncle gem IV, universal automatic instant money, data and precious metal and stone transfer machine
US8587638B2 (en) 2006-09-25 2013-11-19 Nokia Corporation Supporting a 3D presentation
JP4669482B2 (en) * 2006-09-29 2011-04-13 セイコーエプソン株式会社 Display device, image processing method, and electronic apparatus
US20080086391A1 (en) 2006-10-05 2008-04-10 Kurt Maynard Impromptu asset tracking
US8645176B2 (en) 2006-10-05 2014-02-04 Trimble Navigation Limited Utilizing historical data in an asset management environment
US20080086685A1 (en) * 2006-10-05 2008-04-10 James Janky Method for delivering tailored asset information to a device
US7640223B2 (en) 2006-11-16 2009-12-29 University Of Tennessee Research Foundation Method of organizing and presenting data in a table using stutter peak rule
US7586681B2 (en) 2006-11-29 2009-09-08 Honeywell International Inc. Directional display
US20100066850A1 (en) 2006-11-30 2010-03-18 Westar Display Technologies, Inc. Motion artifact measurement for display devices
JP4285532B2 (en) 2006-12-01 2009-06-24 ソニー株式会社 Backlight control device, backlight control method, and liquid crystal display device
US8248462B2 (en) * 2006-12-15 2012-08-21 The Board Of Trustees Of The University Of Illinois Dynamic parallax barrier autostereoscopic display system and method
JP4686795B2 (en) * 2006-12-27 2011-05-25 富士フイルム株式会社 Image generating apparatus and image reproducing apparatus
US7924456B1 (en) 2007-01-12 2011-04-12 Broadbus Technologies, Inc. Data distribution and buffering
CN101013559A (en) 2007-01-30 2007-08-08 京东方科技集团股份有限公司 LED brightness control circuit and backlight of LCD
JP4255032B2 (en) 2007-03-15 2009-04-15 富士通テン株式会社 Display device and display method
US7917853B2 (en) 2007-03-21 2011-03-29 At&T Intellectual Property I, L.P. System and method of presenting media content
US8269822B2 (en) 2007-04-03 2012-09-18 Sony Computer Entertainment America, LLC Display viewing system and methods for optimizing display view based on active tracking
US8600932B2 (en) 2007-05-07 2013-12-03 Trimble Navigation Limited Telematic asset microfluidic analysis
GB0709134D0 (en) * 2007-05-11 2007-06-20 Surman Philip Multi-user autostereoscopic display
GB0709411D0 (en) 2007-05-16 2007-06-27 Barco Nv Methods and systems for stereoscopic imaging
TWI466093B (en) 2007-06-26 2014-12-21 Apple Inc Management techniques for video playback
KR101400285B1 (en) 2007-08-03 2014-05-30 삼성전자주식회사 Front light unit and flat display apparatus employing the same
US7911442B2 (en) 2007-08-27 2011-03-22 Au Optronics Corporation Dynamic color gamut of LED backlight
KR101362647B1 (en) 2007-09-07 2014-02-12 삼성전자주식회사 System and method for generating and playing three dimensional image file including two dimensional image
US7881976B2 (en) * 2007-09-27 2011-02-01 Virgin Mobile Usa, L.P. Apparatus, methods and systems for discounted referral and recommendation of electronic content
GB2453323A (en) * 2007-10-01 2009-04-08 Sharp Kk Flexible backlight arrangement and display
TWI354115B (en) * 2007-10-05 2011-12-11 Ind Tech Res Inst Three-dimensional display apparatus
US8416247B2 (en) * 2007-10-09 2013-04-09 Sony Computer Entertainment America Inc. Increasing the number of advertising impressions in an interactive environment
US8031175B2 (en) 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
JP4956520B2 (en) 2007-11-13 2012-06-20 ミツミ電機株式会社 Backlight device and liquid crystal display device using the same
US8121191B1 (en) 2007-11-13 2012-02-21 Harmonic Inc. AVC to SVC transcoder
KR101439845B1 (en) 2007-11-16 2014-09-12 삼성전자주식회사 Digital image processing apparatus
JP2011504710A (en) 2007-11-21 2011-02-10 ジェスチャー テック,インコーポレイテッド Media preferences
WO2009067676A1 (en) 2007-11-21 2009-05-28 Gesturetek, Inc. Device access control
US20090138280A1 (en) 2007-11-26 2009-05-28 The General Electric Company Multi-stepped default display protocols
JP5236938B2 (en) 2007-12-03 2013-07-17 パナソニック株式会社 Digital broadcast receiving apparatus, semiconductor integrated circuit, and digital broadcast receiving method
TWI365302B (en) * 2007-12-31 2012-06-01 Ind Tech Res Inst Stereo image display with switch function between horizontal display and vertical display
US8339333B2 (en) 2008-01-02 2012-12-25 3M Innovative Properties Company Methods of reducing perceived image crosstalk in a multiview display
CN101939998A (en) 2008-02-08 2011-01-05 皇家飞利浦电子股份有限公司 Autostereoscopic display device
KR101451565B1 (en) 2008-02-13 2014-10-16 삼성전자 주식회사 Autostereoscopic display system
JP5642347B2 (en) 2008-03-07 2014-12-17 ミツミ電機株式会社 LCD backlight device
KR101488199B1 (en) * 2008-03-12 2015-01-30 삼성전자주식회사 Method and apparatus for processing and reproducing image, and computer readable medium thereof
US20090237564A1 (en) 2008-03-18 2009-09-24 Invism, Inc. Interactive immersive virtual reality and simulation
US20090244266A1 (en) 2008-03-26 2009-10-01 Thomas Carl Brigham Enhanced Three Dimensional Television
JP4925354B2 (en) 2008-03-31 2012-04-25 富士フイルム株式会社 Image processing apparatus, image display apparatus, imaging apparatus, and image processing method
US20090282429A1 (en) * 2008-05-07 2009-11-12 Sony Ericsson Mobile Communications Ab Viewer tracking for displaying three dimensional views
DE102008001644B4 (en) 2008-05-08 2010-03-04 Seereal Technologies S.A. Device for displaying three-dimensional images
US20090295791A1 (en) 2008-05-29 2009-12-03 Microsoft Corporation Three-dimensional environment created from video
CN101291415B (en) 2008-05-30 2010-07-21 华为终端有限公司 Method, apparatus and system for three-dimensional video communication
US20090319625A1 (en) 2008-06-20 2009-12-24 Alcatel Lucent Interactivity in a digital public signage network architecture
TWI401658B (en) 2008-07-18 2013-07-11 Hannstar Display Corp Gate line driving circuit of lcd panel
JP5127633B2 (en) * 2008-08-25 2013-01-23 三菱電機株式会社 Content playback apparatus and method
US20100070987A1 (en) 2008-09-12 2010-03-18 At&T Intellectual Property I, L.P. Mining viewer responses to multimedia content
JP2010074557A (en) 2008-09-18 2010-04-02 Toshiba Corp Television receiver
KR101609890B1 (en) 2008-09-18 2016-04-06 파나소닉 아이피 매니지먼트 가부시키가이샤 Image decoding device, image encoding device, image decoding method, image encoding method, and program
KR101497511B1 (en) * 2008-09-19 2015-03-02 삼성전자주식회사 Apparatus for multiplexing two-dimensional and three-dimensional image and video
KR20100033067A (en) 2008-09-19 2010-03-29 삼성전자주식회사 Image display apparatus and method for both 2d and 3d image
MX2010002097A (en) 2008-09-30 2010-08-02 Panasonic Corp Recording medium, reproduction device, system LSI, reproduction method, spectacles, and display device associated with 3D video.
US20100107184A1 (en) 2008-10-23 2010-04-29 Peter Rae Shintani TV with eye detection
US8752087B2 (en) 2008-11-07 2014-06-10 At&T Intellectual Property I, L.P. System and method for dynamically constructing personalized contextual video programs
CN102224737B (en) 2008-11-24 2014-12-03 皇家飞利浦电子股份有限公司 Combining 3D video and auxiliary data
US8103608B2 (en) 2008-11-26 2012-01-24 Microsoft Corporation Reference model for data-driven analytics
US20100128112A1 (en) 2008-11-26 2010-05-27 Samsung Electronics Co., Ltd Immersive display system for interacting with three-dimensional content
US20100135640A1 (en) 2008-12-03 2010-06-03 Dell Products L.P. System and Method for Storing and Displaying 3-D Video Content
US8913105B2 (en) 2009-01-07 2014-12-16 Thomson Licensing Joint depth estimation
WO2010095440A1 (en) 2009-02-20 2010-08-26 パナソニック株式会社 Recording medium, reproduction device, and integrated circuit
WO2010095381A1 (en) 2009-02-20 2010-08-26 パナソニック株式会社 Recording medium, reproduction device, and integrated circuit
US9565397B2 (en) 2009-02-26 2017-02-07 Akamai Technologies, Inc. Deterministically skewing transmission of content streams
US20100225576A1 (en) 2009-03-03 2010-09-09 Horizon Semiconductors Ltd. Three-dimensional interactive system and method
US8477175B2 (en) 2009-03-09 2013-07-02 Cisco Technology, Inc. System and method for providing three dimensional imaging in a network environment
US20100231511A1 (en) 2009-03-10 2010-09-16 David L. Henty Interactive media system with multi-directional remote control and dual mode camera
CN102356638A (en) 2009-03-16 2012-02-15 Lg电子株式会社 A method of displaying three-dimensional image data and an apparatus of processing three-dimensional image data
US20100247080A1 (en) * 2009-03-27 2010-09-30 Kug-Jin Yun Method and apparatus for creating and consuming multiview image media file
JP5695819B2 (en) 2009-03-30 2015-04-08 日立マクセル株式会社 TV operation method
WO2010117315A1 (en) 2009-04-09 2010-10-14 Telefonaktiebolaget Lm Ericsson (Publ) Media container file management
CA2760158C (en) 2009-04-26 2016-08-02 Nike International Ltd. GPS features and functionality in an athletic watch system
US8532310B2 (en) 2010-03-30 2013-09-10 Bose Corporation Frequency-dependent ANR reference sound compression
US8315405B2 (en) 2009-04-28 2012-11-20 Bose Corporation Coordinated ANR reference sound compression
US20100280959A1 (en) 2009-05-01 2010-11-04 Darrel Stone Real-time sourcing of service providers
CN101983400B (en) 2009-05-15 2013-07-17 株式会社东芝 Image display device
US8788676B2 (en) 2009-05-22 2014-07-22 Motorola Mobility Llc Method and system for controlling data transmission to or from a mobile device
US8704958B2 (en) 2009-06-01 2014-04-22 Lg Electronics Inc. Image display device and operation method thereof
US9237296B2 (en) 2009-06-01 2016-01-12 Lg Electronics Inc. Image display apparatus and operating method thereof
WO2010143820A2 (en) 2009-06-08 2010-12-16 엘지전자 주식회사 Device and method for providing a three-dimensional PIP image
US20100309290A1 (en) 2009-06-08 2010-12-09 Stephen Brooks Myers System for capture and display of stereoscopic content
US8411746B2 (en) 2009-06-12 2013-04-02 Qualcomm Incorporated Multiview video coding over MPEG-2 systems
US20100321465A1 (en) 2009-06-19 2010-12-23 Dominique A Behrens Pa Method, System and Computer Program Product for Mobile Telepresence Interactions
KR20120088664A (en) 2009-08-07 2012-08-08 리얼디 인크. Stereoscopic flat panel display with updated blanking intervals
US8976871B2 (en) 2009-09-16 2015-03-10 Qualcomm Incorporated Media extractor tracks for file format track selection
US8446462B2 (en) 2009-10-15 2013-05-21 At&T Intellectual Property I, L.P. Method and system for time-multiplexed shared display
US20110093882A1 (en) 2009-10-21 2011-04-21 Candelore Brant L Parental control through the HDMI interface
KR101600818B1 (en) 2009-11-06 2016-03-09 삼성디스플레이 주식회사 Three-dimensional optical module and display device including the same
US8705624B2 (en) 2009-11-24 2014-04-22 STMicroelectronics International N. V. Parallel decoding for scalable video coding
US8335763B2 (en) 2009-12-04 2012-12-18 Microsoft Corporation Concurrently presented data subfeeds
US8462197B2 (en) 2009-12-17 2013-06-11 Motorola Mobility Llc 3D video transforming device
US20110153362A1 (en) 2009-12-17 2011-06-23 Valin David A Method and mechanism for identifying protecting, requesting, assisting and managing information
US8964013B2 (en) 2009-12-31 2015-02-24 Broadcom Corporation Display with elastic light manipulator
US8823782B2 (en) 2009-12-31 2014-09-02 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US8854531B2 (en) 2009-12-31 2014-10-07 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display
US9247286B2 (en) 2009-12-31 2016-01-26 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US20110199469A1 (en) * 2010-02-15 2011-08-18 Gallagher Andrew C Detection and display of stereo images
US8384774B2 (en) 2010-02-15 2013-02-26 Eastman Kodak Company Glasses for viewing stereo images
KR101356248B1 (en) 2010-02-19 2014-01-29 엘지디스플레이 주식회사 Image display device
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US8964298B2 (en) 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
KR101324412B1 (en) 2010-05-06 2013-11-01 엘지디스플레이 주식회사 Stereoscopic image display and driving method thereof
WO2011142141A1 (en) 2010-05-13 2011-11-17 パナソニック株式会社 Display device and image viewing system
KR101255711B1 (en) 2010-07-02 2013-04-17 엘지디스플레이 주식회사 3d image display device and driving method thereof
US8605136B2 (en) 2010-08-10 2013-12-10 Sony Corporation 2D to 3D user interface content data conversion
US8363928B1 (en) 2010-12-24 2013-01-29 Trimble Navigation Ltd. General orientation positioning system
WO2012132797A1 (en) 2011-03-31 2012-10-04 富士フイルム株式会社 Image capturing device and image capturing method
US9424667B2 (en) * 2011-11-21 2016-08-23 Schlumberger Technology Corporation Interface for controlling and improving drilling operations

Patent Citations (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4829365A (en) * 1986-03-07 1989-05-09 Dimension Technologies, Inc. Autostereoscopic display with illuminating lines, light valve and mask
US20030154261A1 (en) * 1994-10-17 2003-08-14 The Regents Of The University Of California, A Corporation Of The State Of California Distributed hypermedia method and system for automatically invoking external application providing interaction and display of embedded objects within a hypermedia document
US20040036763A1 (en) * 1994-11-14 2004-02-26 Swift David C. Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments
US5615046A (en) * 1995-01-23 1997-03-25 Cyber Scientific Inc. Stereoscopic viewing system
US6094216A (en) * 1995-05-22 2000-07-25 Canon Kabushiki Kaisha Stereoscopic image display method, and stereoscopic image display apparatus using the method
US6909555B2 (en) * 1995-06-07 2005-06-21 Jacob N. Wohlstadter Three dimensional imaging system
US20040141237A1 (en) * 1995-06-07 2004-07-22 Wohlstadter Jacob N. Three dimensional imaging system
US5945965A (en) * 1995-06-29 1999-08-31 Canon Kabushiki Kaisha Stereoscopic image display method
US5959597A (en) * 1995-09-28 1999-09-28 Sony Corporation Image/audio reproducing system
US7123213B2 (en) * 1995-10-05 2006-10-17 Semiconductor Energy Laboratory Co., Ltd. Three dimensional display unit and display method
US6049424A (en) * 1995-11-15 2000-04-11 Sanyo Electric Co., Ltd. Three dimensional display device
US7190518B1 (en) * 1996-01-22 2007-03-13 3Ality, Inc. Systems for and methods of three dimensional viewing
US7038698B1 (en) * 1996-02-08 2006-05-02 Palm Charles S 3D stereo browser for the internet
US6023277A (en) * 1996-07-03 2000-02-08 Canon Kabushiki Kaisha Display control apparatus and method
US5855425A (en) * 1996-07-19 1999-01-05 Sanyo Electric Co., Ltd. Stereoscopic display
US5969850A (en) * 1996-09-27 1999-10-19 Sharp Kabushiki Kaisha Spatial light modulator, directional display and directional light source
US5990975A (en) * 1996-11-22 1999-11-23 Acer Peripherals, Inc. Dual screen displaying device
US6285368B1 (en) * 1997-02-10 2001-09-04 Canon Kabushiki Kaisha Image display system and image display apparatus and information processing apparatus in the system
US7030903B2 (en) * 1997-02-20 2006-04-18 Canon Kabushiki Kaisha Image display system, information processing apparatus, and method of controlling the same
US6188442B1 (en) * 1997-08-01 2001-02-13 International Business Machines Corporation Multiviewer display system for television monitors
US6710920B1 (en) * 1998-03-27 2004-03-23 Sanyo Electric Co., Ltd Stereoscopic display
US6144375A (en) * 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity
US6697687B1 (en) * 1998-11-09 2004-02-24 Hitachi, Ltd. Image display apparatus having audio output control means in accordance with image signal type
US20030012425A1 (en) * 1998-11-12 2003-01-16 Canon Kabushiki Kaisha Viewpoint position detection apparatus and method, and stereoscopic image display system
US20020171666A1 (en) * 1999-02-19 2002-11-21 Takaaki Endo Image processing apparatus for interpolating and generating images from an arbitrary view point
US20080184301A1 (en) * 1999-10-29 2008-07-31 Boylan Peter C Interactive television system with programming-related links
US20020010798A1 (en) * 2000-04-20 2002-01-24 Israel Ben-Shaul Differentiated content and application delivery via internet
US20030103165A1 (en) * 2000-05-19 2003-06-05 Werner Bullinger System for operating a consumer electronics appliance
US20050078108A1 (en) * 2000-06-12 2005-04-14 Swift David C. Electronic stereoscopic media delivery system
US20020037037A1 (en) * 2000-09-22 2002-03-28 Philips Electronics North America Corporation Preferred transmission/streaming order of fine-granular scalability
US20020167862A1 (en) * 2001-04-03 2002-11-14 Carlo Tomasi Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device
US20040252187A1 (en) * 2001-09-10 2004-12-16 Alden Ray M. Processes and apparatuses for efficient multiple program and 3D display
US20060256136A1 (en) * 2001-10-01 2006-11-16 Adobe Systems Incorporated, A Delaware Corporation Compositing two-dimensional and three-dimensional image layers
US20030137506A1 (en) * 2001-11-30 2003-07-24 Daniel Efran Image-based rendering for 3D viewing
US20030223499A1 (en) * 2002-04-09 2003-12-04 Nicholas Routhier Process and system for encoding and playback of stereoscopic video sequences
US20050248561A1 (en) * 2002-04-25 2005-11-10 Norio Ito Multimedia information generation method and multimedia information reproduction device
US20050259147A1 (en) * 2002-07-16 2005-11-24 Nam Jeho Apparatus and method for adapting 2d and 3d stereoscopic video signal
US20040027452A1 (en) * 2002-08-07 2004-02-12 Yun Kug Jin Method and apparatus for multiplexing multi-view three-dimensional moving picture
US20040041747A1 (en) * 2002-08-27 2004-03-04 Nec Corporation 3D image/2D image switching display apparatus and portable terminal device
US20040239231A1 (en) * 2002-10-30 2004-12-02 Keisuke Miyagawa Display device and electronic equipment
US20040109093A1 (en) * 2002-12-05 2004-06-10 Small-Stryker Aaron Tug Method and apparatus for simultaneous television video presentation and separate viewing of different broadcasts
US20080284844A1 (en) * 2003-02-05 2008-11-20 Graham John Woodgate Switchable Lens
US20040164292A1 (en) * 2003-02-21 2004-08-26 Yeh-Jiun Tung Transflective display having an OLED backlight
US20070097208A1 (en) * 2003-05-28 2007-05-03 Satoshi Takemoto Stereoscopic image display apparatus, text data processing apparatus, program, and storing medium
US20050073472A1 (en) * 2003-07-26 2005-04-07 Samsung Electronics Co., Ltd. Method of removing Moire pattern in 3D image display apparatus using complete parallax
US20070097103A1 (en) * 2003-09-11 2007-05-03 Shoji Yoshioka Portable display device
US20070085814A1 (en) * 2003-09-20 2007-04-19 Koninklijke Philips Electronics N.V. Image display device
US20080273242A1 (en) * 2003-09-30 2008-11-06 Graham John Woodgate Directional Display Apparatus
US20050128353A1 (en) * 2003-12-16 2005-06-16 Young Bruce A. System and method for using second remote control device for sub-picture control in television receiver
US7091471B2 (en) * 2004-03-15 2006-08-15 Agilent Technologies, Inc. Using eye detection for providing control and power management of electronic devices
US20050237487A1 (en) * 2004-04-23 2005-10-27 Chang Nelson L A Color wheel assembly for stereoscopic imaging
US7440193B2 (en) * 2004-04-30 2008-10-21 Gunasekaran R Alfred Wide-angle variable focal length lens system
US20060050785A1 (en) * 2004-09-09 2006-03-09 Nucore Technology Inc. Inserting a high resolution still image into a lower resolution video stream
US20090058845A1 (en) * 2004-10-20 2009-03-05 Yasuhiro Fukuda Display device
US20070296874A1 (en) * 2004-10-20 2007-12-27 Fujitsu Ten Limited Display Device,Method of Adjusting the Image Quality of the Display Device, Device for Adjusting the Image Quality and Device for Adjusting the Contrast
US20060087556A1 (en) * 2004-10-21 2006-04-27 Kazunari Era Stereoscopic image display device
US20060109242A1 (en) * 2004-11-19 2006-05-25 Simpkins Daniel S User interface for impaired users
US20060139490A1 (en) * 2004-12-15 2006-06-29 Fekkes Wilhelmus F Synchronizing audio with delayed video
US20060139448A1 (en) * 2004-12-29 2006-06-29 Samsung Electronics Co., Ltd. 3D displays with flexible switching capability of 2D/3D viewing modes
US20080192112A1 (en) * 2005-03-18 2008-08-14 Ntt Data Sanyo System Corporation Stereoscopic Image Display Apparatus, Stereoscopic Image Displaying Method And Computer Program Product
US20070139371A1 (en) * 2005-04-04 2007-06-21 Harsham Bret A Control system and method for differentiating multiple users utilizing multi-view display devices
US20080191964A1 (en) * 2005-04-22 2008-08-14 Koninklijke Philips Electronics, N.V. Auto-Stereoscopic Display With Mixed Mode For Concurrent Display of Two- and Three-Dimensional Images
US20090102915A1 (en) * 2005-04-25 2009-04-23 Svyatoslav Ivanovich Arsenich Stereoprojection system
US20080246757A1 (en) * 2005-04-25 2008-10-09 Masahiro Ito 3D Image Generation and Display System
US20060244918A1 (en) * 2005-04-27 2006-11-02 Actuality Systems, Inc. Minimized-thickness angular scanner of electromagnetic radiation
US20060256302A1 (en) * 2005-05-13 2006-11-16 Microsoft Corporation Three-dimensional (3D) image projection
US20060271791A1 (en) * 2005-05-27 2006-11-30 Sbc Knowledge Ventures, L.P. Method and system for biometric based access control of media content presentation devices
US20090051759A1 (en) * 2005-05-27 2009-02-26 Adkins Sean M Equipment and methods for the synchronization of stereoscopic projection displays
US20070096125A1 (en) * 2005-06-24 2007-05-03 Uwe Vogel Illumination device
US20070002041A1 (en) * 2005-07-02 2007-01-04 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding video data to implement local three-dimensional video
US20070008406A1 (en) * 2005-07-08 2007-01-11 Samsung Electronics Co., Ltd. High resolution 2D-3D switchable autostereoscopic display apparatus
US20070008620A1 (en) * 2005-07-11 2007-01-11 Samsung Electronics Co., Ltd. Switchable autostereoscopic display
US20070052807A1 (en) * 2005-09-07 2007-03-08 Fuji Xerox Co., Ltd. System and method for user monitoring interface of 3-D video streams from multiple cameras
US20070072674A1 (en) * 2005-09-12 2007-03-29 Nintendo Co., Ltd. Information processing program
US7511774B2 (en) * 2005-11-30 2009-03-31 Samsung Mobile Display Co., Ltd. Three-dimensional display device
US20080259233A1 (en) * 2005-12-20 2008-10-23 Koninklijke Philips Electronics, N.V. Autostereoscopic Display Device
US20070146267A1 (en) * 2005-12-22 2007-06-28 Lg.Philips Lcd Co., Ltd. Display device and method of driving the same
US20070147827A1 (en) * 2005-12-28 2007-06-28 Arnold Sheynman Methods and apparatus for wireless stereo video streaming
US20070153916A1 (en) * 2005-12-30 2007-07-05 Sharp Laboratories Of America, Inc. Wireless video transmission system
US20070162392A1 (en) * 2006-01-12 2007-07-12 Microsoft Corporation Management of Streaming Content
US7359105B2 (en) * 2006-02-07 2008-04-15 Sharp Kabushiki Kaisha Spatial light modulator and a display device
US20090010264A1 (en) * 2006-03-21 2009-01-08 Huawei Technologies Co., Ltd. Method and System for Ensuring QoS and SLA Server
US20080133122A1 (en) * 2006-03-29 2008-06-05 Sanyo Electric Co., Ltd. Multiple visual display device and vehicle-mounted navigation system
US20080043096A1 (en) * 2006-04-04 2008-02-21 Anthony Vetro Method and System for Decoding and Displaying 3D Light Fields
US20070258140A1 (en) * 2006-05-04 2007-11-08 Samsung Electronics Co., Ltd. Multiview autostereoscopic display
US20070270218A1 (en) * 2006-05-08 2007-11-22 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20080025390A1 (en) * 2006-07-25 2008-01-31 Fang Shi Adaptive video frame interpolation
US20080037120A1 (en) * 2006-08-08 2008-02-14 Samsung Electronics Co., Ltd High resolution 2d/3d switchable display apparatus
US20080043644A1 (en) * 2006-08-18 2008-02-21 Microsoft Corporation Techniques to perform rate matching for multimedia conference calls
US20080126557A1 (en) * 2006-09-08 2008-05-29 Tetsuro Motoyama System, method, and computer program product using an SNMP implementation to obtain vendor information from remote devices
US20080068329A1 (en) * 2006-09-15 2008-03-20 Samsung Electronics Co., Ltd. Multi-view autostereoscopic display with improved resolution
US20080165176A1 (en) * 2006-09-28 2008-07-10 Charles Jens Archer Method of Video Display and Multiplayer Gaming
US20080150853A1 (en) * 2006-12-22 2008-06-26 Hong Kong Applied Science and Technology Research Institute Company Limited Backlight device and liquid crystal display incorporating the backlight device
US20080168129A1 (en) * 2007-01-08 2008-07-10 Jeffrey Robbin Pairing a Media Server and a Media Client
US20080303832A1 (en) * 2007-06-11 2008-12-11 Samsung Electronics Co., Ltd. Method of generating two-dimensional/three-dimensional convertible stereoscopic image bitstream and method and apparatus for displaying the same
US20090002178A1 (en) * 2007-06-29 2009-01-01 Microsoft Corporation Dynamic mood sensing
US20090052164A1 (en) * 2007-08-24 2009-02-26 Masako Kashiwagi Directional backlight, display apparatus, and stereoscopic display apparatus
US20090115783A1 (en) * 2007-11-02 2009-05-07 Dimension Technologies, Inc. 3d optical illusions from off-axis displays
US20110050687A1 (en) * 2008-04-04 2011-03-03 Denis Vladimirovich Alyshev Presentation of Objects in Stereoscopic 3D Displays
US8209396B1 (en) * 2008-12-10 2012-06-26 Howcast Media, Inc. Video player

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
IEEE 100: The Authoritative Dictionary of IEEE Standards Terms, Seventh Edition, IEEE 100-2000, pp. 1270-1287, entry for "Web page". *
IEEE 100: The Authoritative Dictionary of IEEE Standards Terms, Seventh Edition, IEEE 100-2000, pp. 349-411, entry for "engine". *
Wikipedia entry on "Scripting language," retrieved Aug. 16, 2012. *

Cited By (140)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8884926B1 (en) 2002-11-04 2014-11-11 Neonode Inc. Light-based finger gesture user interface
US8810551B2 (en) 2002-11-04 2014-08-19 Neonode Inc. Finger gesture user interface
US9262074B2 (en) 2002-11-04 2016-02-16 Neonode, Inc. Finger gesture user interface
US20130093727A1 (en) * 2002-11-04 2013-04-18 Neonode, Inc. Light-based finger gesture user interface
US8416217B1 (en) * 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US20110080472A1 (en) * 2009-10-02 2011-04-07 Eric Gagneraud Autostereoscopic status display
US8854531B2 (en) 2009-12-31 2014-10-07 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display
US9019263B2 (en) 2009-12-31 2015-04-28 Broadcom Corporation Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays
US8964013B2 (en) 2009-12-31 2015-02-24 Broadcom Corporation Display with elastic light manipulator
US8922545B2 (en) 2009-12-31 2014-12-30 Broadcom Corporation Three-dimensional display system with adaptation based on viewing reference of viewer(s)
US8988506B2 (en) 2009-12-31 2015-03-24 Broadcom Corporation Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video
US9979954B2 (en) 2009-12-31 2018-05-22 Avago Technologies General Ip (Singapore) Pte. Ltd. Eyewear with time shared viewing supporting delivery of differing content to multiple viewers
US9013546B2 (en) 2009-12-31 2015-04-21 Broadcom Corporation Adaptable media stream servicing two and three dimensional content
US20110164115A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video
US9654767B2 (en) 2009-12-31 2017-05-16 Avago Technologies General Ip (Singapore) Pte. Ltd. Programming architecture supporting mixed two and three dimensional displays
US20110164188A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US9049440B2 (en) 2009-12-31 2015-06-02 Broadcom Corporation Independent viewer tailoring of same media source content via a common 2D-3D display
US9066092B2 (en) 2009-12-31 2015-06-23 Broadcom Corporation Communication infrastructure including simultaneous video pathways for multi-viewer support
US9247286B2 (en) 2009-12-31 2016-01-26 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US9204138B2 (en) 2009-12-31 2015-12-01 Broadcom Corporation User controlled regional display of mixed two and three dimensional content
US8687042B2 (en) 2009-12-31 2014-04-01 Broadcom Corporation Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints
US8823782B2 (en) 2009-12-31 2014-09-02 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US8767050B2 (en) 2009-12-31 2014-07-01 Broadcom Corporation Display supporting multiple simultaneous 3D views
US9124885B2 (en) 2009-12-31 2015-09-01 Broadcom Corporation Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays
US9143770B2 (en) 2009-12-31 2015-09-22 Broadcom Corporation Application programming interface supporting mixed two and three dimensional displays
US20110191328A1 (en) * 2010-02-03 2011-08-04 Vernon Todd H System and method for extracting representative media content from an online document
US20110234769A1 (en) * 2010-03-23 2011-09-29 Electronics And Telecommunications Research Institute Apparatus and method for displaying images in image system
US8988423B2 (en) * 2010-09-17 2015-03-24 Fujifilm Corporation Electronic album generating apparatus, stereoscopic image pasting apparatus, and methods and programs for controlling operation of same
US20120069001A1 (en) * 2010-09-17 2012-03-22 Fujifilm Corporation Electronic album generating apparatus, stereoscopic image pasting apparatus, and methods and programs for controlling operation of same
US9325965B2 (en) * 2010-09-20 2016-04-26 Echostar Technologies L.L.C. Separate display surfaces for EPG and program content
US20120069243A1 (en) * 2010-09-20 2012-03-22 Echostar Global B.V. Separate Display Surfaces for EPG and Program Content
US9172943B2 (en) * 2010-12-07 2015-10-27 At&T Intellectual Property I, L.P. Dynamic modification of video content at a set-top box device
US20120140025A1 (en) * 2010-12-07 2012-06-07 At&T Intellectual Property I, L.P. Dynamic Modification of Video Content at a Set-Top Box Device
US10083639B2 (en) * 2011-02-04 2018-09-25 Seiko Epson Corporation Control device for controlling image display device, head-mounted display device, image display system, control method for the image display device, and control method for the head-mounted display device
US20120200592A1 (en) * 2011-02-04 2012-08-09 Seiko Epson Corporation Control device for controlling image display device, head-mounted display device, image display system, control method for the image display device, and control method for the head-mounted display device
US8970629B2 (en) * 2011-03-09 2015-03-03 Lg Electronics Inc. Mobile terminal and 3D object control method thereof
US20120229450A1 (en) * 2011-03-09 2012-09-13 Lg Electronics Inc. Mobile terminal and 3d object control method thereof
US8941719B2 (en) * 2011-03-28 2015-01-27 Kabushiki Kaisha Toshiba Electronic apparatus and display control method
US20120249734A1 (en) * 2011-03-28 2012-10-04 Shunsuke Takayama Electronic apparatus and display control method
US20150153940A1 (en) * 2011-04-14 2015-06-04 Mediatek Inc. Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
US9367218B2 (en) * 2011-04-14 2016-06-14 Mediatek Inc. Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
US20120268456A1 (en) * 2011-04-19 2012-10-25 Hidetoshi Yokoi Information processor, information processing method, and computer program product
US9294760B2 (en) * 2011-06-28 2016-03-22 Lg Electronics Inc. Image display device and controlling method thereof
US20130033583A1 (en) * 2011-06-28 2013-02-07 Lg Electronics Inc. Image display device and controlling method thereof
CN102368244A (en) * 2011-09-08 2012-03-07 广州市动景计算机科技有限公司 Page content alignment method, device and mobile terminal browser
KR101813035B1 (en) * 2011-10-10 2017-12-28 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9013474B2 (en) 2011-10-10 2015-04-21 Lg Electronics Inc. Mobile terminal and controlling method thereof
EP2581821A1 (en) * 2011-10-10 2013-04-17 LG Electronics Inc. Mobile terminal and controlling method thereof
US20140237536A1 (en) * 2011-10-13 2014-08-21 Samsung Electronics Co., Ltd. Method of displaying contents, method of synchronizing contents, and method and device for displaying broadcast contents
US20130127841A1 (en) * 2011-11-18 2013-05-23 Samsung Electronics Co., Ltd. Three-dimensional (3d) image display method and apparatus for 3d imaging and displaying contents according to start or end of operation
US9626798B2 (en) 2011-12-05 2017-04-18 At&T Intellectual Property I, L.P. System and method to digitally replace objects in images or video
US10580219B2 (en) 2011-12-05 2020-03-03 At&T Intellectual Property I, L.P. System and method to digitally replace objects in images or video
US10249093B2 (en) 2011-12-05 2019-04-02 At&T Intellectual Property I, L.P. System and method to digitally replace objects in images or video
US20140317537A1 (en) * 2011-12-22 2014-10-23 Tencent Technology (Shenzhen) Company Limited Browser based application program extension method and device
US20140354633A1 (en) * 2012-02-24 2014-12-04 Huawei Technologies Co., Ltd. Image processing method and image processing device
US20130265297A1 (en) * 2012-04-06 2013-10-10 Motorola Mobility, Inc. Display of a Corrected Browser Projection of a Visual Guide for Placing a Three Dimensional Object in a Browser
US10006505B2 (en) 2012-06-05 2018-06-26 Apple Inc. Rendering road signs during navigation
US10018478B2 (en) 2012-06-05 2018-07-10 Apple Inc. Voice instructions during navigation
US10323701B2 (en) 2012-06-05 2019-06-18 Apple Inc. Rendering road signs during navigation
US20130326425A1 (en) * 2012-06-05 2013-12-05 Apple Inc. Mapping application with 3d presentation
US10366523B2 (en) 2012-06-05 2019-07-30 Apple Inc. Method, system and apparatus for providing visual feedback of a map view change
US11956609B2 (en) 2012-06-05 2024-04-09 Apple Inc. Context-aware voice guidance
US10508926B2 (en) 2012-06-05 2019-12-17 Apple Inc. Providing navigation instructions while device is in locked mode
US10176633B2 (en) 2012-06-05 2019-01-08 Apple Inc. Integrated mapping and navigation application
US9367959B2 (en) * 2012-06-05 2016-06-14 Apple Inc. Mapping application with 3D presentation
US10156455B2 (en) 2012-06-05 2018-12-18 Apple Inc. Context-aware voice guidance
US11727641B2 (en) 2012-06-05 2023-08-15 Apple Inc. Problem reporting in maps
US10718625B2 (en) 2012-06-05 2020-07-21 Apple Inc. Voice instructions during navigation
US10732003B2 (en) 2012-06-05 2020-08-04 Apple Inc. Voice instructions during navigation
US11290820B2 (en) 2012-06-05 2022-03-29 Apple Inc. Voice instructions during navigation
US11082773B2 (en) 2012-06-05 2021-08-03 Apple Inc. Context-aware voice guidance
US9997069B2 (en) 2012-06-05 2018-06-12 Apple Inc. Context-aware voice guidance
US11055912B2 (en) 2012-06-05 2021-07-06 Apple Inc. Problem reporting in maps
US10911872B2 (en) 2012-06-05 2021-02-02 Apple Inc. Context-aware voice guidance
US10318104B2 (en) 2012-06-05 2019-06-11 Apple Inc. Navigation application with adaptive instruction text
US9903732B2 (en) 2012-06-05 2018-02-27 Apple Inc. Providing navigation instructions while device is in locked mode
US9880019B2 (en) 2012-06-05 2018-01-30 Apple Inc. Generation of intersection information by a mapping service
US9886794B2 (en) 2012-06-05 2018-02-06 Apple Inc. Problem reporting in maps
US9773338B2 (en) * 2012-06-08 2017-09-26 Lg Electronics Inc. Rendering method of 3D web-page and terminal using the same
US20150170397A1 (en) * 2012-06-08 2015-06-18 Lg Electronics Inc. Rendering method of 3D web-page and terminal using the same
US9829996B2 (en) * 2012-06-25 2017-11-28 Zspace, Inc. Operations in a three dimensional display system
US20160041630A1 (en) * 2012-06-25 2016-02-11 Zspace, Inc. Operations in a Three Dimensional Display System
US20150156472A1 (en) * 2012-07-06 2015-06-04 Lg Electronics Inc. Terminal for increasing visual comfort sensation of 3D object and control method thereof
US9674501B2 (en) * 2012-07-06 2017-06-06 Lg Electronics Inc. Terminal for increasing visual comfort sensation of 3D object and control method thereof
US20140362196A1 (en) * 2012-08-03 2014-12-11 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
US20140036044A1 (en) * 2012-08-03 2014-02-06 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
US8842169B2 (en) * 2012-08-03 2014-09-23 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
US20140152648A1 (en) * 2012-11-30 2014-06-05 Legend3D, Inc. Three-dimensional annotation system and method
US9547937B2 (en) * 2012-11-30 2017-01-17 Legend3D, Inc. Three-dimensional annotation system and method
US11743527B2 (en) 2012-12-04 2023-08-29 Interaxon Inc. System and method for enhancing content using brain-state data
US20140223462A1 (en) * 2012-12-04 2014-08-07 Christopher Allen Aimone System and method for enhancing content using brain-state data
US11259066B2 (en) 2012-12-04 2022-02-22 Interaxon Inc. System and method for enhancing content using brain-state data
US10405025B2 (en) 2012-12-04 2019-09-03 Interaxon Inc. System and method for enhancing content using brain-state data
US10009644B2 (en) * 2012-12-04 2018-06-26 Interaxon Inc System and method for enhancing content using brain-state data
US10856032B2 (en) 2012-12-04 2020-12-01 Interaxon Inc. System and method for enhancing content using brain-state data
US20140316907A1 (en) * 2013-04-17 2014-10-23 Asaf NAIM Multilayered user interface for internet browser
US9678929B2 (en) * 2013-08-01 2017-06-13 Equldo Limited Stereoscopic online web content creation and rendering
US10878177B2 (en) * 2013-08-01 2020-12-29 Equldo Limited Techniques for stereoscopic online web content creation and rendering
US20150035821A1 (en) * 2013-08-01 2015-02-05 Equldo Limited Stereoscopic online web content creation and rendering
US20190171695A1 (en) * 2013-08-01 2019-06-06 Dimitrios Andriotis Techniques for stereoscopic online web content creation and rendering
US10592064B2 (en) * 2013-09-17 2020-03-17 Amazon Technologies, Inc. Approaches for three-dimensional object display used in content navigation
US20150082180A1 (en) * 2013-09-17 2015-03-19 Amazon Technologies, Inc. Approaches for three-dimensional object display used in content navigation
US10067634B2 (en) 2013-09-17 2018-09-04 Amazon Technologies, Inc. Approaches for three-dimensional object display
US20150205797A1 (en) * 2014-01-22 2015-07-23 Al Squared Identifying a set of related visible content elements in a markup language document
US9785623B2 (en) * 2014-01-22 2017-10-10 Freedom Scientific, Inc. Identifying a set of related visible content elements in a markup language document
US9348495B2 (en) 2014-03-07 2016-05-24 Sony Corporation Control of large screen display using wireless portable computer and facilitating selection of audio on a headphone
US11102543B2 (en) 2014-03-07 2021-08-24 Sony Corporation Control of large screen display using wireless portable computer to pan and zoom on large screen display
US10809894B2 (en) 2014-08-02 2020-10-20 Samsung Electronics Co., Ltd. Electronic device for displaying object or information in three-dimensional (3D) form and user interaction method thereof
WO2016021861A1 (en) * 2014-08-02 2016-02-11 Samsung Electronics Co., Ltd. Electronic device and user interaction method thereof
US10298974B2 (en) * 2014-08-05 2019-05-21 Uc Mobile Co., Ltd. Method and device for presenting content data from network
US10631046B2 (en) * 2014-09-30 2020-04-21 Orange Method and device for adapting the display of a video stream by a client
US20170223409A1 (en) * 2014-09-30 2017-08-03 Orange Method and device for adapting the display of a video stream by a client
US20170285762A1 (en) * 2014-12-15 2017-10-05 Bayerische Motoren Werke Aktiengesellschaft Method for Controlling a Vehicle System
US10528146B2 (en) * 2014-12-15 2020-01-07 Bayerische Motoren Werke Aktiengesellschaft Method for controlling a vehicle system
US11119811B2 (en) * 2015-07-15 2021-09-14 F4 Interactive device for displaying web page data in three dimensions
US10942983B2 (en) 2015-10-16 2021-03-09 F4 Interactive web device with customizable display
US10691880B2 (en) * 2016-03-29 2020-06-23 Microsoft Technology Licensing, Llc Ink in an electronic document
US10649611B2 (en) 2016-05-13 2020-05-12 Sap Se Object pages in multi application user interface
US10579238B2 (en) 2016-05-13 2020-03-03 Sap Se Flexible screen layout across multiple platforms
US11003305B2 (en) * 2016-11-18 2021-05-11 Zspace, Inc. 3D user interface
US10587871B2 (en) 2016-11-18 2020-03-10 Zspace, Inc. 3D User Interface—360-degree visualization of 2D webpage content
US10127715B2 (en) * 2016-11-18 2018-11-13 Zspace, Inc. 3D user interface—non-native stereoscopic image conversion
US20180143757A1 (en) * 2016-11-18 2018-05-24 Zspace, Inc. 3D User Interface
US10863168B2 (en) * 2016-11-18 2020-12-08 Zspace, Inc. 3D user interface—360-degree visualization of 2D webpage content
US20190043247A1 (en) * 2016-11-18 2019-02-07 Zspace, Inc. 3D User Interface - Non-native Stereoscopic Image Conversion
US20190230346A1 (en) * 2016-11-18 2019-07-25 Zspace, Inc. 3D User Interface - 360-degree Visualization of 2D Webpage Content
US10271043B2 (en) 2016-11-18 2019-04-23 Zspace, Inc. 3D user interface—360-degree visualization of 2D webpage content
US10623713B2 (en) * 2016-11-18 2020-04-14 Zspace, Inc. 3D user interface—non-native stereoscopic image conversion
US20200099923A1 (en) * 2016-11-18 2020-03-26 Zspace, Inc. 3D User Interface - 360-degree Visualization of 2D Webpage Content
US10650416B1 (en) * 2017-02-17 2020-05-12 Sprint Communications Company L.P. Live production interface and response testing
US10802324B2 (en) 2017-03-14 2020-10-13 Boe Technology Group Co., Ltd. Double vision display method and device
US11321103B2 (en) * 2017-06-16 2022-05-03 Microsoft Technology Licensing, Llc Generating user interface containers
US20220382566A1 (en) * 2017-06-16 2022-12-01 Microsoft Technology Licensing, Llc Generating User Interface Containers
US20190026004A1 (en) * 2017-07-18 2019-01-24 Chicago Labs, LLC Three Dimensional Icons for Computer Applications
US10701346B2 (en) 2018-04-06 2020-06-30 Zspace, Inc. Replacing 2D images with 3D images
US10523922B2 (en) * 2018-04-06 2019-12-31 Zspace, Inc. Identifying replacement 3D images for 2D images via ranking criteria
US10523921B2 (en) * 2018-04-06 2019-12-31 Zspace, Inc. Replacing 2D images with 3D images
US10701347B2 (en) 2018-04-06 2020-06-30 Zspace, Inc. Identifying replacement 3D images for 2D images via ranking criteria
CN109725819A (en) * 2018-12-25 2019-05-07 努比亚技术有限公司 Interface display method and device, dual-screen dual-system terminal, and readable storage medium

Also Published As

Publication number Publication date
HK1161754A1 (en) 2012-08-03
US9019263B2 (en) 2015-04-28
US20110157264A1 (en) 2011-06-30
US9204138B2 (en) 2015-12-01
CN102183841A (en) 2011-09-14
US9143770B2 (en) 2015-09-22
TW201142356A (en) 2011-12-01
US9979954B2 (en) 2018-05-22
US8988506B2 (en) 2015-03-24
US20110157168A1 (en) 2011-06-30
US20110157170A1 (en) 2011-06-30
CN102183841B (en) 2014-04-02
US20110157169A1 (en) 2011-06-30
EP2346021B1 (en) 2014-11-19
US8767050B2 (en) 2014-07-01
US9654767B2 (en) 2017-05-16
TW201137399A (en) 2011-11-01
US20110164115A1 (en) 2011-07-07
US8964013B2 (en) 2015-02-24
US20110157309A1 (en) 2011-06-30
US20110164034A1 (en) 2011-07-07
US20110157322A1 (en) 2011-06-30
US8922545B2 (en) 2014-12-30
US9124885B2 (en) 2015-09-01
US9049440B2 (en) 2015-06-02
US20110157339A1 (en) 2011-06-30
US20110157257A1 (en) 2011-06-30
CN102215408A (en) 2011-10-12
TWI467234B (en) 2015-01-01
US20110157330A1 (en) 2011-06-30
US20110157471A1 (en) 2011-06-30
EP2357508A1 (en) 2011-08-17
US20110157336A1 (en) 2011-06-30
TW201142357A (en) 2011-12-01
US20110169930A1 (en) 2011-07-14
US20110157327A1 (en) 2011-06-30
US20110157326A1 (en) 2011-06-30
US20110157697A1 (en) 2011-06-30
US20110157696A1 (en) 2011-06-30
EP2346021A1 (en) 2011-07-20
US9013546B2 (en) 2015-04-21
US20110157172A1 (en) 2011-06-30
US20110169913A1 (en) 2011-07-14
US20150015668A1 (en) 2015-01-15
US20110164111A1 (en) 2011-07-07
US8687042B2 (en) 2014-04-01
US20110157315A1 (en) 2011-06-30
EP2357630A1 (en) 2011-08-17
CN102183840A (en) 2011-09-14
US20150156473A1 (en) 2015-06-04
US9066092B2 (en) 2015-06-23
US20150264341A1 (en) 2015-09-17
EP2357631A1 (en) 2011-08-17
US20110157167A1 (en) 2011-06-30

Similar Documents

Publication Publication Date Title
US20110161843A1 (en) Internet browser and associated content definition supporting mixed two and three dimensional displays
US10587871B2 (en) 3D User Interface—360-degree visualization of 2D webpage content
US11175818B2 (en) Method and apparatus for controlling display of video content
EP2480960B1 (en) Apparatus and method for grid navigation
US20140040949A1 (en) User control interface for interactive digital television
US8918737B2 (en) Zoom display navigation
US11003305B2 (en) 3D user interface
US20150074735A1 (en) Method and Apparatus for Rendering Video Content Including Secondary Digital Content
KR20110102359A (en) Extending 2d graphics in a 3d gui
CN105979339A (en) Window display method and client
CN112463269B (en) User interface display method and display equipment
US10623713B2 (en) 3D user interface—non-native stereoscopic image conversion
CN110971955A (en) Page processing method and device, electronic equipment and storage medium
US11962743B2 (en) 3D display system and 3D display method
US20220345679A1 (en) 3D display system and 3D display method
JP2013540378A (en) Setting the Z-axis position of the graphic surface of the 3D video display
TWM628625U (en) 3D display system
WO2016179214A1 (en) Method and apparatus for control video content on a display

Legal Events

Date Code Title Description
AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119