US20090315884A1 - Method and apparatus for outputting and displaying image data - Google Patents

Info

Publication number
US20090315884A1
Authority
US
United States
Prior art keywords
image data
perspective image
perspective
additional information
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/479,978
Inventor
Dae-jong LEE
Hyun-kwon Chung
Kil-soo Jung
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US12/479,978
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHUNG, HYUN-KWON, JUNG, KIL-SOO, LEE, DAE-JONG
Publication of US20090315884A1
Legal status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00Image coding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration by the use of histogram techniques
    • G06T5/92
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
    • G11B27/322Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/122Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/133Equalising the characteristics of different image components, e.g. their average brightness or colour balance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/139Format conversion, e.g. of frame-rate or size
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/158Switching image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/161Encoding, multiplexing or demultiplexing different image signal components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/178Metadata, e.g. disparity information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/261Image signal generators with monoscopic-to-stereoscopic image conversion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/339Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spatial multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/341Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/356Image reproducers having separate monoscopic and stereoscopic modes
    • H04N13/359Switching between monoscopic and stereoscopic modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/361Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/597Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • H04N5/145Movement estimation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20208High dynamic range [HDR] image processing
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0626Adjustment of display parameters for control of overall brightness
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/156Mixing image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/286Image signal generators having separate monoscopic and stereoscopic modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/005Aspects relating to the "3D+depth" image format

Definitions

  • Aspects of the present invention relate to methods and apparatuses to output and display image data.
  • An image may be generated as a 3D image.
  • In that case, a base image and a depth image are stored in a storage space.
  • A user reproduces 3D images by processing the base image and the depth image using an image outputting device that supports reproduction of 3D images.
  • However, images are rarely created as 3D images; in most cases, images are created as 2D images.
  • Accordingly, technologies to reproduce 3D images by processing such 2D images are in use.
  • Aspects of the present invention provide methods and apparatuses to output and to display three-dimensional (3D) images.
  • According to an aspect of the present invention, there is provided a method of outputting 3D image data, the method including: generating first-perspective image data and second-perspective image data to display the 3D image by converting the same two-dimensional (2D) image data; generating additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and outputting the first-perspective image data, the second-perspective image data, and the additional information.
  • The additional information may include pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
  • The first-perspective image data may be left-perspective image data, the second-perspective image data may be right-perspective image data, and the additional information may include perspective information indicating that the first-perspective image data is the left-perspective image data and the second-perspective image data is the right-perspective image data.
  • According to another aspect of the present invention, there is provided a method of displaying 3D image data, the method including: receiving first-perspective image data and second-perspective image data, which are generated by converting the same 2D image data and are used to display a 3D image; receiving additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and displaying the first-perspective image data and the second-perspective image data based on the additional information.
  • The additional information may include pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
  • The first-perspective image data may be left-perspective image data, the second-perspective image data may be right-perspective image data, and the additional information may include perspective information indicating that the first-perspective image data is the left-perspective image data and the second-perspective image data is the right-perspective image data.
  • The first-perspective image data may be stored in a first buffer of a plurality of buffers classified according to a predetermined standard, and the second-perspective image data may be stored in a second buffer of the plurality of buffers.
  • The first-perspective image data and the second-perspective image data may be repeatedly displayed a predetermined number of times.
  • When the first-perspective image data and the second-perspective image data have been repeatedly displayed the predetermined number of times, they may be deleted from the first and second buffers, respectively.
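The buffering behavior described above can be sketched as follows. The class name, the buffer representation, and the default repeat count are illustrative assumptions; the patent only specifies that each perspective is held in its own buffer, displayed a predetermined number of times, and then deleted.

```python
class PerspectiveBuffers:
    """Illustrative sketch (not the patented implementation): hold a
    left/right image pair in separate buffers, display it a predetermined
    number of times, then delete it from both buffers."""

    def __init__(self, repeat_count=3):
        self.left_buffer = None    # first buffer: left-perspective data
        self.right_buffer = None   # second buffer: right-perspective data
        self.repeat_count = repeat_count
        self.displayed = 0

    def store_pair(self, left_data, right_data):
        self.left_buffer = left_data
        self.right_buffer = right_data
        self.displayed = 0

    def display_once(self):
        """Return the pair to display, or None when the buffers are empty.
        After the predetermined number of repetitions, free the buffers."""
        if self.left_buffer is None:
            return None
        pair = (self.left_buffer, self.right_buffer)
        self.displayed += 1
        if self.displayed >= self.repeat_count:
            self.left_buffer = None   # deleted from the first buffer
            self.right_buffer = None  # deleted from the second buffer
        return pair
```

For example, with `repeat_count=3` the same pair is returned three times and a fourth call returns `None`, signaling that the next pair should be loaded.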
  • According to another aspect of the present invention, there is provided an image data outputting device including: a generating unit to generate first-perspective image data and second-perspective image data to display a 3D image by converting the same 2D image data; an adding unit to add additional information, indicating a relationship between the first-perspective image data and the second-perspective image data, to the first-perspective image data and/or the second-perspective image data; and an outputting unit to output the first-perspective image data and the second-perspective image data.
  • According to another aspect of the present invention, there is provided an image data displaying device including: a receiving unit to receive first-perspective image data and second-perspective image data, which are generated by converting the same 2D image data and are used to display a 3D image, and to receive additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and a displaying unit to display the first-perspective image data and the second-perspective image data based on the additional information.
  • According to another aspect of the present invention, there is provided a method of outputting three-dimensional (3D) image data, the method including: generating first-perspective image data and second-perspective image data to display a 3D image by converting the same two-dimensional (2D) image data; generating additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and displaying the first-perspective image data and the second-perspective image data based on the additional information.
  • According to another aspect of the present invention, there is provided an image data outputting device including: a generating unit to generate first-perspective image data and second-perspective image data to display a three-dimensional (3D) image by converting the same two-dimensional (2D) image data; and a displaying unit to display the first-perspective image data and the second-perspective image data based on additional information indicating a relationship between the first-perspective image data and the second-perspective image data.
  • According to another aspect of the present invention, there is provided an image data outputting system including: an image data outputting device having a generating unit to generate first-perspective image data and second-perspective image data to display a three-dimensional (3D) image by converting the same two-dimensional (2D) image data, an adding unit to add additional information, indicating a relationship between the first-perspective image data and the second-perspective image data, to the first-perspective image data and/or the second-perspective image data, and an outputting unit to output the first-perspective image data and the second-perspective image data; and an image data displaying device having a receiving unit to receive the first-perspective image data, the second-perspective image data, and the additional information, and a displaying unit to display the first-perspective image data and the second-perspective image data based on the additional information.
  • According to another aspect of the present invention, there is provided an image data outputting device including: a generating unit to generate first-perspective image data and second-perspective image data to display a three-dimensional (3D) image by converting the same two-dimensional (2D) image data.
  • According to another aspect of the present invention, there is provided a method of outputting three-dimensional (3D) image data, the method including: generating additional information indicating a relationship between first-perspective image data and second-perspective image data that are generated from the same two-dimensional (2D) image data to display a 3D image.
  • FIG. 1 is a block diagram of an image data outputting device according to an embodiment of the present invention;
  • FIG. 2A is a diagram showing a detailed configuration of the image data outputting device shown in FIG. 1;
  • FIG. 2B is a diagram showing an example wherein an image data outputting unit according to an embodiment of the present invention outputs 3D image data to which additional information is added;
  • FIG. 2C is a diagram showing another example wherein the image data outputting unit outputs 3D image data to which additional information is added;
  • FIG. 3 is a block diagram of an image data displaying device according to an embodiment of the present invention;
  • FIG. 4 is a block diagram of an image data displaying device according to another embodiment of the present invention;
  • FIG. 5 is a diagram showing an example of displaying 3D image data by using the image data displaying device according to an embodiment of the present invention;
  • FIGS. 6A to 6C are diagrams of a storage unit within the image data displaying device according to an embodiment of the present invention;
  • FIG. 7 is a flowchart of a method of outputting image data according to an embodiment of the present invention; and
  • FIG. 8 is a flowchart of a method of displaying image data according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of an image data outputting device 100 according to an embodiment of the present invention.
  • Referring to FIG. 1, the image data outputting device 100 includes a generating unit 110, an adding unit 120, and an outputting unit 130.
  • The image data outputting device 100 can be a transmission device, a data recorder that records data on a recording medium, a workstation, a desktop computer, a notebook computer, etc., and can be implemented using one or more computers and/or processors whose functions are executed using software/firmware.
  • The generating unit 110 generates three-dimensional (3D) image data by converting two-dimensional (2D) image data into 3D image data.
  • 2D image data is image data used to reproduce 2D images, and 3D image data is image data used to reproduce 3D images.
  • The generating unit 110 converts 2D image data and generates left-perspective image data and right-perspective image data.
  • The left-perspective image data is image data for a left eye of a viewer, and the right-perspective image data is image data for a right eye of the viewer.
  • The generating unit 110 may also convert 2D image data and generate three or more pieces of image data in different perspectives.
  • In the description below, it is assumed that a stereoscopic image is being reproduced, and descriptions will focus on left-perspective image data, though it is understood that aspects of the present invention are not limited thereto. That is, aspects of the present invention, as described below, can also be applied to right-perspective image data, and to a case where a multi-perspective image is reproduced in three or more perspectives.
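The patent does not fix a particular 2D-to-3D conversion algorithm for the generating unit. One common illustrative approach, shown here purely as an assumption, shifts pixels horizontally by a per-pixel disparity derived from an estimated depth map (how the depth map is obtained is out of scope here):

```python
def generate_stereo_pair(row, depth, max_disparity=2):
    """Toy sketch of 2D-to-stereo conversion on a single scanline;
    not the patented method, just one conventional technique.

    row   -- list of pixel values (one scanline of the 2D image)
    depth -- list of depth values in [0.0, 1.0], same length as row
             (1.0 = nearest to the viewer)

    Nearer pixels are shifted further apart between the two views,
    which is what creates the binocular disparity of a 3D image.
    """
    width = len(row)
    left = row[:]   # left view: pixels shifted right by disparity
    right = row[:]  # right view: pixels shifted left by disparity
    for x in range(width):
        d = int(round(depth[x] * max_disparity))
        if 0 <= x + d < width:
            left[x + d] = row[x]
        if 0 <= x - d < width:
            right[x - d] = row[x]
    return left, right
```

With a uniformly zero depth map the two views are identical to the source row (no disparity, i.e. a flat image); a nonzero depth map produces the left/right offset.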
  • The adding unit 120 adds additional information to at least one piece of the generated 3D image data.
  • The adding unit 120 may add additional information to each piece of the generated 3D image data.
  • The additional information indicates relationships among the 3D image data generated by the generating unit 110 converting the same 2D image data, and may be in any format.
  • Pair information and/or perspective information may be included in the additional information.
  • Pair information indicates that left-perspective image data and right-perspective image data are a pair when they are generated by the generating unit 110 converting the same 2D image data.
  • Perspective information indicates whether an item of the generated 3D image data is left-perspective image data or right-perspective image data.
  • The adding unit 120 may record the pair information and/or the perspective information in a bit string comprising a plurality of bits.
  • For example, the adding unit 120 may record the pair information in the upper 4 bits of an 8-bit field in a predetermined region of the 3D image data, and the perspective information in the lower 4 bits, though it is understood that aspects of the present invention are not limited thereto.
  • Alternatively, the adding unit 120 may record the pair information in the lower 4 bits of the 8-bit field, and the perspective information in the upper 4 bits.
  • Pair information indicates which 2D image data was used to generate the 3D image data, and a plurality of pieces of 3D image data generated by converting the same 2D image data may have the same pair information.
  • Types of pair information may vary according to various embodiments. For example, according to an embodiment of the present invention, pair information may be classified into 16 types according to sequences of 2D image data used to generate 3D image data.
  • In this case, pair information of 3D image data generated by the generating unit 110 converting first 2D image data may have a value “0000,” and pair information of 3D image data generated by the generating unit 110 converting second 2D image data may have a value “0001.”
  • Likewise, pair information of 3D image data generated by the generating unit 110 converting sixteenth 2D image data may have a value “1111,” and pair information of 3D image data generated by the generating unit 110 converting seventeenth 2D image data may have a value “0000” again, though it is understood that other embodiments are not limited thereto.
  • Alternatively, the pair information may be classified into two types depending on whether the 2D image data used to generate the 3D image data is even-numbered or odd-numbered.
  • In this case, 3D image data generated by the generating unit 110 converting even-numbered 2D image data may have pair information with a value “0000,” and 3D image data generated by the generating unit 110 converting odd-numbered 2D image data may have pair information with a value “1111.”
  • Perspective information may vary according to a number of pairs of 3D image data to be generated by the generating unit 110 converting 2D image data. If stereoscopic images are to be reproduced, perspective information may be limited to two types. For example, perspective information of left-perspective image data may have a value “0000,” and perspective information of right-perspective image data may have a value “1111.” Thus, additional information regarding left-perspective image data generated by the generating unit 110 converting even-numbered 2D image data may include a bit string having a value “00000000,” and additional information regarding right-perspective image data generated by the generating unit 110 converting the same 2D image data may include a bit string having a value “00001111.” In this regard, additional information regarding left-perspective image data generated by the generating unit 110 converting odd-numbered 2D image data may include a bit string having a value “11110000,” and additional information regarding right-perspective image data generated by the generating unit 110 converting the same 2D image data may include a bit string having a value “11111111.”
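The 8-bit layout described above (pair information in the upper 4 bits, perspective information in the lower 4 bits, using the even/odd pair-coding variant) can be sketched as follows; the function and constant names are chosen here for illustration:

```python
# 4-bit codes from the even/odd embodiment described in the text.
PAIR_EVEN = 0b0000   # generated from even-numbered 2D image data
PAIR_ODD = 0b1111    # generated from odd-numbered 2D image data
PERSP_LEFT = 0b0000  # left-perspective image data
PERSP_RIGHT = 0b1111 # right-perspective image data

def pack_additional_info(pair_bits, perspective_bits):
    """Pack pair info into the upper 4 bits and perspective info
    into the lower 4 bits of a single byte."""
    return ((pair_bits & 0xF) << 4) | (perspective_bits & 0xF)

def unpack_additional_info(byte):
    """Recover (pair_bits, perspective_bits) from a single byte."""
    return (byte >> 4) & 0xF, byte & 0xF
```

For example, left-perspective data from an even-numbered source packs to `0b00000000` and right-perspective data from an odd-numbered source packs to `0b11111111`, matching the bit strings listed above.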
  • The outputting unit 130 transmits the 3D image data, to which the additional information is added by the adding unit 120, to an image data displaying device 300 described below.
  • As described above, the generating unit 110 in the image data outputting device 100 generates 3D image data by converting 2D image data and generates additional information indicating relationships among the 3D image data.
  • However, the generating unit 110 may not generate the additional information itself. Instead, the generating unit 110 or the adding unit 120 may obtain additional information generated in advance and/or externally, and transmit the additional information to the image data displaying device 300 together with the 3D image data.
  • In this case, the image data outputting device 100 may further include a receiving unit (not shown).
  • The receiving unit receives metadata including the additional information from an external server via a network system and/or from a broadcasting server that provides a cable broadcasting service via a cable network, though it is understood that aspects of the present invention are not limited thereto.
  • For example, the receiving unit may receive the metadata from an external storage device (such as a server) via any wired and/or wireless connection (such as USB, Bluetooth, infrared, etc.).
  • Alternatively, the image data outputting device 100 may further include a reading unit (not shown).
  • In this case, 2D image data and/or 3D image data generated by the generating unit 110 converting 2D image data is recorded on a recording medium.
  • Metadata including the additional information may also be recorded on the recording medium.
  • The reading unit may read the recording medium, obtain the metadata therefrom, and transmit the metadata to the image data displaying device 300 together with the 3D image data.
  • FIG. 2A is a diagram showing a detailed configuration of the image data outputting device 100 shown in FIG. 1.
  • Referring to FIG. 2A, a decoder 210 decodes input 2D image data.
  • The decoded 2D image data is transmitted to the generating unit 110.
  • The generating unit 110 generates a 3D image data pair, which is to be used to generate 3D images, by converting the input 2D image data.
  • The adding unit 120 adds additional information, which indicates relationships among a plurality of 3D image data generated by the generating unit 110 converting the same 2D image data, to at least one of the 3D image data.
  • The additional information may include pair information and/or perspective information.
  • The adding unit 120 includes a perspective determining unit 122 and an information recording unit 124.
  • The perspective determining unit 122 determines perspectives of a generated 3D image data pair. That is, the perspective determining unit 122 determines which of the 3D image data items is left-perspective image data and which is right-perspective image data.
  • The information recording unit 124 adds pair information, which indicates that the left-perspective image data and the right-perspective image data of a 3D image data pair were generated by using the same 2D image data, and adds perspective information to each of the image data.
  • Pair information and perspective information may be generated as metadata, though it is understood that aspects of the present invention are not limited thereto.
  • For example, the pair information and/or the perspective information may be added by being recorded in a predetermined region within the 3D image data.
  • The outputting unit 130 outputs the 3D image data pair to which the pair information and the perspective information are added by the information recording unit 124. If the outputting unit 130 outputs 3D image data at a transmitting rate of 60 Hz, it takes 1/30 of a second to completely output a 3D image data pair.
  • FIG. 2B is a diagram showing an example wherein an image data outputting unit according to an embodiment of the present invention outputs 3D image data to which additional information 231 is added.
  • the additional information 231 is output as metadata regarding 3D image data 232 and 233 .
  • an output sequence may vary according to embodiments.
  • the output sequence shown in FIG. 2B is the additional information 231 , left-perspective image data 232 , and right-perspective image data 233 .
  • FIG. 2C is a diagram showing another example wherein the image data outputting unit outputs 3D image data to which additional information is added.
  • additional pieces of information 242 and 244 are recorded in predetermined regions of 3D image data 241 and 243 , respectively.
  • the additional information 242 regarding left-perspective image data 241 is recorded in the left-perspective image data 241
  • the additional information 244 regarding right-perspective image data 243 is recorded in the right-perspective image data 243 .
  • a sequence of outputting 3D image data may vary according to embodiments.
  • the 3D image data is output in the sequence of the left-perspective image data 241 followed by the right-perspective image data 243 .
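The difference between the two layouts of FIGS. 2B and 2C can be sketched as simple record streams. The field names below are assumptions for illustration only, not taken from the patent.

```python
# Illustrative sketch of the two output layouts. The field names are
# assumptions for this example, not taken from the patent.

# FIG. 2B: additional information 231 is output first as separate metadata,
# followed by left-perspective image data 232 and right-perspective
# image data 233.
stream_2b = [
    {"type": "metadata", "additional_info": {"pair": 0, "left_first": True}},
    {"type": "frame", "perspective": "left"},    # 232
    {"type": "frame", "perspective": "right"},   # 233
]

# FIG. 2C: additional information 242/244 is recorded in a predetermined
# region of each piece of image data 241/243 itself.
stream_2c = [
    {"type": "frame", "perspective": "left", "additional_info": {"pair": 0}},   # 241
    {"type": "frame", "perspective": "right", "additional_info": {"pair": 0}},  # 243
]

# In layout 2B only the leading metadata record carries additional
# information; in layout 2C every frame carries its own copy.
assert "additional_info" not in stream_2b[1]
assert all("additional_info" in rec for rec in stream_2c)
```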
  • FIG. 3 is a block diagram of the image data displaying device 300 according to an embodiment of the present invention.
  • the image data displaying device 300 includes a receiving unit 310 , an extracting unit 320 , and a displaying unit 330 .
  • the image data displaying device 300 can be a workstation, a desktop computer, a notebook computer, a portable multimedia player, a television, a set-top box, a reproducing device that reads the image data from a medium, etc., and can be implemented using one or more computers and/or processors whose functions are executed using software/firmware.
  • the receiving unit 310 receives 3D image data from the image data outputting device 100 .
  • the received 3D image data includes additional information indicating relationships among 3D image data generated by converting the same 2D image data.
  • the additional information includes pair information and/or perspective information.
  • Pair information is information indicating that left-perspective image data and right-perspective image data generated by the generating unit 110 converting the same 2D image data are a 3D image data pair.
  • Left-perspective image data and right-perspective image data having the same pair information can be determined as a 3D image data pair.
  • Perspective information is information indicating whether 3D image data is left-perspective image data or right-perspective image data.
  • the extracting unit 320 extracts the additional information from the received 3D image data.
  • the displaying unit 330 displays the 3D image data based on the additional information. More particularly, whether to display 3D image data is determined based on the pair information and the perspective information. If it is determined to display the 3D image data, a 3D image data display sequence is determined, and the displaying unit 330 displays the 3D image data according to the determined sequence. An example of displaying 3D image data will be described in detail later with reference to FIG. 5 .
  • FIG. 4 is a block diagram of an image data displaying device 400 according to another embodiment of the present invention.
  • the image data displaying device 400 includes a receiving unit 410 , an extracting unit 420 , a control unit 430 , a storage unit 440 , and a displaying unit 450 .
  • operations of the receiving unit 410 , the extracting unit 420 , and the displaying unit 450 are similar to those of the receiving unit 310 , the extracting unit 320 , and the displaying unit 330 shown in FIG. 3 , and thus detailed descriptions thereof will be omitted here.
  • the control unit 430 controls the image data displaying device 400 to store, to delete, and/or to display 3D image data, based on additional information.
  • the storage unit 440 includes a plurality of buffers 442 through 448 , which are defined according to a predetermined standard. The predetermined standard may vary according to embodiments. For example, the buffers 442 through 448 may be defined based on the perspective information and/or the pair information that are added to the 3D image data.
  • the storage unit 440 which is controlled by the control unit 430 , stores 3D image data in one or more of the plurality of buffers, transmits stored 3D image data to the displaying unit 450 , and/or deletes stored 3D image data.
  • the receiving unit 410 receives 3D image data from the image data outputting device 100 .
  • the received 3D image data is image data generated by the generating unit 110 converting 2D image data, and may include additional information.
  • the additional information is information indicating relationships among 3D image data generated by the generating unit 110 converting the same 2D image data, and may include pair information and/or perspective information.
  • the pair information and the perspective information are indicated by using 8-bit strings within the 3D image data, wherein the upper 4 bits of an 8-bit string indicate the pair information and the lower 4 bits of the 8-bit string indicate the perspective information.
  • the pair information may indicate whether the 3D image data is generated by the generating unit 110 converting even-numbered 2D image data or odd-numbered 2D image data. For example, if a value of the upper 4-bits is “0000,” the 3D image data is generated by the generating unit 110 converting even-numbered 2D image data. In contrast, if the value of the upper 4-bits is “1111,” the 3D image data is generated by the generating unit 110 converting odd-numbered 2D image data.
  • the perspective information may indicate whether 3D image data is left-perspective image data or right-perspective image data. For example, if a value of the lower 4-bits is “0000,” the 3D image data is left-perspective image data. If the value of the lower 4-bits is “1111,” the 3D image data is right-perspective image data.
  • the extracting unit 420 extracts the pair information and the perspective information from the 3D image data. For example, as described above, if the extracted pair information and perspective information are "00000000," the 3D image data is left-perspective image data generated by converting even-numbered 2D image data. Hereinafter, a 3D image data pair generated by converting even-numbered 2D image data will be referred to as a first pair, and a 3D image data pair generated by converting odd-numbered 2D image data will be referred to as a second pair.
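The 8-bit layout described above can be sketched as follows. This is an illustrative packing under the stated convention (upper 4 bits for pair information, lower 4 bits for perspective information); the function names are not part of the patent.

```python
# Illustrative packing of the 8-bit additional information: the upper
# 4 bits carry pair information ("0000" = converted from even-numbered
# 2D image data, "1111" = odd-numbered) and the lower 4 bits carry
# perspective information ("0000" = left, "1111" = right).

PAIR_EVEN, PAIR_ODD = 0b0000, 0b1111
LEFT, RIGHT = 0b0000, 0b1111

def encode_additional_info(odd_pair, right_perspective):
    """Pack pair and perspective information into a single byte."""
    upper = PAIR_ODD if odd_pair else PAIR_EVEN
    lower = RIGHT if right_perspective else LEFT
    return (upper << 4) | lower

def decode_additional_info(byte):
    """Return (odd_pair, right_perspective) decoded from a single byte."""
    return (byte >> 4) == PAIR_ODD, (byte & 0x0F) == RIGHT

# "00000000": left-perspective image data of the first (even) pair
assert decode_additional_info(0b00000000) == (False, False)
assert encode_additional_info(True, True) == 0b11111111
```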
  • the control unit 430 selects a buffer, to which 3D image data is to be stored, based on the pair information and the perspective information.
  • the storage unit 440 comprises four buffers 442 , 444 , 446 , and 448 .
  • a first buffer 442 stores left-perspective image data of the first pair
  • a second buffer 444 stores right-perspective image data of the first pair
  • a third buffer 446 stores left-perspective image data of the second pair
  • a fourth buffer 448 stores right-perspective image data of the second pair.
  • 3D image data having additional information indicated by the 8-bit string “00000000” is stored in the first buffer 442 .
  • aspects of the present invention are not limited to the four buffers 442 through 448 classified by the pair information and the perspective information. According to other aspects, more or fewer buffers may be provided according to other classification schemes. For example, only two buffers may be provided, classified by either the pair information or the perspective information.
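The buffer classification above can be sketched as a lookup keyed on one pair bit and one perspective bit. The buffer labels follow the reference numerals used above, while the function name is an illustrative assumption.

```python
# Illustrative selection of one of the four buffers 442/444/446/448
# from one pair bit and one perspective bit. The descriptions are
# paraphrases of the text above; the function name is an assumption.

BUFFERS = {
    (0, 0): "first buffer 442: left-perspective data of the first pair",
    (0, 1): "second buffer 444: right-perspective data of the first pair",
    (1, 0): "third buffer 446: left-perspective data of the second pair",
    (1, 1): "fourth buffer 448: right-perspective data of the second pair",
}

def select_buffer(pair_bit, perspective_bit):
    """Pick the buffer in which a piece of 3D image data is stored."""
    return BUFFERS[(pair_bit, perspective_bit)]

# Additional information "00000000" maps to the first buffer 442.
assert select_buffer(0, 0).startswith("first buffer 442")
assert select_buffer(1, 1).startswith("fourth buffer 448")
```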
  • the control unit 430 controls the storage unit 440 such that the 3D image data stored in the first buffer 442 is transmitted to the displaying unit 450 .
  • the displaying unit 450 may display the 3D image data in a sequence of displaying left-perspective image data of a pair and successively displaying right-perspective image data of the same pair or vice versa.
  • the control unit 430 controls the displaying unit 450 and the storage unit 440 such that the 3D image data is repeatedly displayed a predetermined number of times and is then deleted from the corresponding buffer 442 , 444 , 446 , or 448 . Displaying and buffering the 3D image data will be described in detail later with reference to FIGS. 5, 6A, and 6B.
  • the displaying unit 450 displays the transmitted 3D image data.
  • a conventional image data outputting device generates 3D image data by converting 2D image data and transmits the 3D image data to an image data displaying device without adding additional information, such as pair information and/or perspective information.
  • The image data displaying device then alternately displays left-perspective image data and right-perspective image data in the sequence in which the 3D image data is received.
  • a user watches left-perspective image data via his or her left eye and watches right-perspective image data via his or her right eye by using an auxiliary device to watch 3D images (e.g., 3D glasses or goggles).
  • a conventional method enables a user to watch 3D images in the case where 3D image data is sequentially transmitted and displayed.
  • However, if 3D image data is not sequentially transmitted, or if some pieces of 3D image data are not transmitted and/or displayed due, for example, to a power failure, a user cannot watch the 3D images.
  • For example, suppose that power supplied to an image data displaying device temporarily fails, and that left-perspective image data is received again when the power supply to the image data displaying device is restored.
  • In the conventional method, the 3D image data does not include perspective information, and thus the image data displaying device cannot determine whether the newly received 3D image data is left-perspective image data or right-perspective image data. Therefore, without separate synchronization, the image data displaying device may determine left-perspective image data to be right-perspective image data.
  • In this case, a user watches left-perspective image data via his or her right eye and right-perspective image data via his or her left eye, and thus the 3D effect cannot be sufficiently experienced.
  • the image data displaying device 300 , 400 uses perspective information included in 3D image data such that left-perspective image data is viewed via the left eye and right-perspective image data is viewed via the right eye. Thus, a user can watch 3D images without distortion.
  • pair information is not included in conventional 3D image data.
  • Accordingly, a conventional image data displaying device may not be able to display both left-perspective image data and right-perspective image data, or may only partially display a 3D image data pair and then display a next pair. In this case, the 3D effect cannot be sufficiently experienced by a user due to overlapping of images.
  • the image data displaying device 300 , 400 according to aspects of the present invention guarantees the display of both left-perspective image data and right-perspective image data by using pair information included in 3D image data. Thus, a user can watch natural 3D images.
  • FIG. 5 is a diagram showing an example of displaying 3D image data by using the image data displaying device 400 according to an embodiment of the present invention.
  • 3D image data output from the image data outputting device 100 according to an embodiment of the present invention are shown in chronological order in the lower part of FIG. 5 .
  • Since the image data outputting device 100 outputs 3D image data at a rate of 60 Hz, the unit of time is set to 1/60 of a second.
  • 3D image data includes pair information and perspective information.
  • pair information is indicated by an upper bit between two bits included in the 3D image data
  • perspective information is indicated by a lower bit.
  • 3D image data where a bit indicating pair information (i.e., the upper bit) is 0 is generated by using even-numbered 2D image data
  • 3D image data where a bit indicating pair information is 1 is generated by using odd-numbered 2D image data.
  • 3D image data where a bit indicating perspective information (i.e., the lower bit) is 0 is left-perspective image data
  • 3D image data where a bit indicating perspective information is 1 is right-perspective image data.
  • 3D image data 501 is output. In the 3D image data 501 , both a bit indicating pair information and a bit indicating perspective information are 0. Thus, it is clear that the 3D image data 501 is left-perspective image data generated by using even-numbered 2D image data.
  • 3D image data 502 is output. In the 3D image data 502 , a bit indicating pair information is 0, and a bit indicating perspective information is 1. Thus, it is clear that the 3D image data 502 is right-perspective image data generated by using even-numbered 2D image data. Furthermore, it is clear that the 3D image data 501 and the 3D image data 502 are paired image data, as the 3D image data 501 and the 3D image data 502 have the same pair information, have different perspective information, and are transmitted successively.
  • 3D image data 503 is output.
  • In the 3D image data 503 , a bit indicating pair information is 1, and a bit indicating perspective information is 0. Therefore, it is clear that the 3D image data 503 is left-perspective image data generated by using odd-numbered 2D image data. Furthermore, since the pair information of the 3D image data 502 and 503 differs, it is clear that the 3D image data 502 and the 3D image data 503 are not paired image data.
  • 3D image data displayed by the image data displaying device 300 , 400 according to aspects of the present invention are shown in chronological order in the upper part of FIG. 5 .
  • the 3D image data 501 is received from the image data outputting device 100 , and the received 3D image data 501 is stored in the storage unit 440 .
  • the 3D image data 502 is received from the image data outputting device 100 , and the received 3D image data 502 is stored in the storage unit 440 .
  • the 3D image data 503 is received from the image data outputting device 100 , and the received 3D image data 503 is stored in the storage unit 440 .
  • the image data displaying device 400 displays stored left-perspective image data (that is, the 3D image data 501 ).
  • no 3D image data is received from the image data outputting device 100 , as the 3D image data in the present embodiment is received every 1/60 of a second.
  • the image data displaying device 400 displays stored right-perspective image data (that is, the 3D image data 502 ).
  • In a conventional method, received image data has to be displayed continuously even at a point of time when no image data is received from the image data outputting device 100 . Therefore, a 3D image data pair is displayed every 1/30 of a second.
  • In contrast, stored 3D image data is used in the present embodiment, and thus an image data pair can be displayed every 1/60 of a second.
  • 3D image data 504 is received from the image data outputting device 100 , and the received 3D image data 504 is stored in the storage unit 440 .
  • the image data displaying device 400 displays the 3D image data 501 . Since the 3D image data 501 is repeatedly displayed twice, the 3D image data 501 is deleted from the storage unit 440 .
  • no 3D image data is received from the image data outputting device 100 .
  • the image data displaying device 400 displays the 3D image data 502 . Since the 3D image data 502 is repeatedly displayed twice, the 3D image data 502 is deleted from the storage unit 440 .
  • the image data displaying device 400 displays one image data every 1/60 of a second. Accordingly, in the case of displaying 2D image data, it takes 1/60 of a second to display a scene. However, in the case of displaying 3D image data, it takes 2/60 of a second to display a scene, because a user can watch a scene only after two pieces of 3D image data are displayed. However, images appear more natural to a user when a scene is displayed every 1/60 of a second. Thus, 3D images displayed by a conventional image data displaying device do not seem as natural to a user as those displayed according to aspects of the present invention.
  • the image data displaying device 300 , 400 classifies and stores received 3D image data by using a plurality of buffers, and repeatedly displays 3D image data a predetermined number of times by using stored 3D image data.
  • a 3D image data pair can be displayed every 1/60 of a second. Therefore, a user can watch natural 3D images.
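The classify-store-repeat behavior summarized above can be sketched as a minimal simulation. It assumes, as in FIG. 5, that each frame is displayed twice, interleaved with its pair partner, and deleted afterwards; one buffer slot per perspective is used here purely to keep the example short.

```python
# A minimal simulation of the buffering scheme: each received frame of a
# pair is stored, displayed REPEATS times interleaved with its partner
# (L, R, L, R, ...), and deleted from its buffer once fully displayed.
# This sketch uses one buffer slot per perspective instead of the four
# numbered buffers, purely to keep the example short.

REPEATS = 2  # each frame is displayed twice, as in FIG. 5

def display_and_flush(pairs):
    """pairs: list of (left_frame, right_frame) tuples.
    Returns the sequence of displayed frames."""
    displayed = []
    for left, right in pairs:
        buffers = {"L": left, "R": right}      # store the pair
        counts = {"L": 0, "R": 0}
        while buffers:                         # alternate left and right
            for side in ("L", "R"):
                if side in buffers:
                    displayed.append(buffers[side])
                    counts[side] += 1
                    if counts[side] == REPEATS:
                        del buffers[side]      # delete after REPEATS displays
    return displayed

# Pairs (501, 502) and (503, 504) yield the interleaved sequence of FIG. 5.
assert display_and_flush([(501, 502), (503, 504)]) == [
    501, 502, 501, 502, 503, 504, 503, 504]
```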
  • FIGS. 6A and 6B are diagrams of the storage unit 440 of the image data displaying device 400 according to an embodiment of the present invention.
  • the storage unit 440 of the image data displaying device 400 includes the first buffer 442 , the second buffer 444 , the third buffer 446 , and the fourth buffer 448 .
  • the first buffer 442 stores left-perspective image data of the first pair
  • the second buffer 444 stores right-perspective image data of the first pair
  • the third buffer 446 stores left-perspective image data of the second pair
  • the fourth buffer 448 stores right-perspective image data of the second pair.
  • 3D image data where a bit indicating pair information is 0 is 3D image data of the first pair
  • 3D image data where a bit indicating pair information is 1 is 3D image data of the second pair.
  • the 3D image data 501 is received from the image data outputting device 100 , and the received 3D image data 501 is stored in the storage unit 440 .
  • a bit indicating pair information is 0, and a bit indicating perspective information is also 0.
  • the 3D image data 501 is stored in the first buffer 442 .
  • no data is received from the image data outputting device 100 .
  • the 3D image data 502 is received from the image data outputting device 100 , and the received 3D image data 502 is stored in the storage unit 440 .
  • a bit indicating pair information is 0, and a bit indicating perspective information is 1.
  • the 3D image data 502 is stored in the second buffer 444 .
  • no data is received from the image data outputting device 100 .
  • the 3D image data 503 is received from the image data outputting device 100 , and the received 3D image data 503 is stored in the storage unit 440 .
  • a bit indicating pair information is 1, and a bit indicating perspective information is 0.
  • the 3D image data 503 is stored in the third buffer 446 .
  • the 3D image data 501 stored in the first buffer 442 is output to the displaying unit 450 .
  • the 3D image data 502 stored in the second buffer 444 is output to the displaying unit 450 .
  • the 3D image data 504 is received from the image data outputting device 100 , and the received 3D image data 504 is stored in the storage unit 440 .
  • a bit indicating pair information is 1, and a bit indicating perspective information is 1.
  • the 3D image data 504 is stored in the fourth buffer 448 .
  • the 3D image data 501 stored in the first buffer 442 is output to the displaying unit 450 .
  • Since the 3D image data 501 stored in the first buffer 442 has been displayed twice, the 3D image data 501 is deleted from the first buffer 442 .
  • no data is received from the image data outputting device 100 .
  • the 3D image data 502 stored in the second buffer 444 is output to the displaying unit 450 . Since the 3D image data 502 stored in the second buffer 444 is displayed twice, the 3D image data 502 is deleted from the second buffer 444 .
  • the 3D image data 505 is received from the image data outputting device 100 , and the received 3D image data 505 is stored in the storage unit 440 .
  • a bit indicating pair information is 0, and a bit indicating perspective information is also 0.
  • the 3D image data 505 is stored in the first buffer 442 .
  • the 3D image data 503 stored in the third buffer 446 is output to the displaying unit 450 .
  • the 3D image data 504 stored in the fourth buffer 448 is output to the displaying unit 450 .
  • the 3D image data 506 is received from the image data outputting device 100 , and the received 3D image data 506 is stored in the storage unit 440 .
  • a bit indicating pair information is 0, and a bit indicating perspective information is 1.
  • the 3D image data 506 is stored in the second buffer 444 .
  • the 3D image data 503 stored in the third buffer 446 is output to the displaying unit 450 .
  • Since the 3D image data 503 stored in the third buffer 446 has been displayed twice, the 3D image data 503 is deleted from the third buffer 446 .
  • no data is received from the image data outputting device 100 .
  • the 3D image data 504 stored in the fourth buffer 448 is output to the displaying unit 450 . Since the 3D image data 504 stored in the fourth buffer 448 is displayed twice, the 3D image data 504 is deleted from the fourth buffer 448 .
  • an image data outputting rate of the image data outputting device 100 may be less than or greater than 60 Hz.
  • the image data displaying device 400 may still be controlled to display a 3D image data pair every 1/60 of a second by adjusting the number of times the 3D image data is repeatedly displayed (e.g., adjusting the number of times to be greater than two).
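The rate adjustment described above can be expressed as a simple calculation. The formula below is an interpretation of this passage, not stated explicitly in the patent: if frames arrive more slowly than the display rate, each frame is repeated proportionally more times.

```python
# A sketch of the repeat-count adjustment: if frames arrive at
# `arrival_rate` frames per second while the display runs at
# `display_rate` frames per second, each frame is repeated
# display_rate / arrival_rate times so that an image can still be
# shown in every display slot. This formula is an interpretation of
# the passage above, not stated explicitly in the patent.

def repeat_count(display_rate, arrival_rate):
    if arrival_rate <= 0 or display_rate % arrival_rate:
        raise ValueError("display rate must be a positive multiple of arrival rate")
    return display_rate // arrival_rate

assert repeat_count(60, 30) == 2   # the FIG. 5 case: each frame shown twice
assert repeat_count(60, 15) == 4   # slower outputting device: more repeats
```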
  • In the present embodiment, both left-perspective image data 501 and right-perspective image data 502 of a same pair are stored, and the stored 3D image data 501 and 502 are displayed as soon as left-perspective image data 503 of a next pair is received.
  • Alternatively, the stored left-perspective image data 501 may be displayed only after both the left-perspective image data 503 and the right-perspective image data 504 of the next pair are received (i.e., after 2/60 of a second).
  • FIG. 7 is a flowchart of a method of outputting image data according to an embodiment of the present invention.
  • first-perspective image data and second-perspective image data to display a 3D image are generated in operation S 710 .
  • the first-perspective image data and the second-perspective image data are generated by converting the same 2D image data.
  • Additional information indicating relationships between the first-perspective image data and the second-perspective image data generated in operation S 710 is generated, and the generated additional information is added to the first-perspective image data and/or the second-perspective image data in operation S 720 .
  • the additional information may be added to both the first-perspective image data and the second-perspective image data, or just one of the first-perspective image data and the second-perspective image data.
  • the additional information may include pair information that indicates that the first-perspective image data and the second-perspective image data are an image data pair generated by converting the same 2D image data.
  • the additional information may additionally or alternatively include perspective information that indicates whether each of the first-perspective image data and the second-perspective image data is left-perspective image data or right-perspective image data.
  • the pair information and the perspective information may be recorded within the first-perspective image data and the second-perspective image data.
  • the pair information may be recorded by using the upper 4 bits of an 8-bit string in a predetermined region of the first-perspective image data and the second-perspective image data, and the perspective information may be recorded by using the lower 4 bits of the 8-bit string.
  • the first-perspective image data and the second-perspective image data, to which additional information is added, are output in operation S 730 .
  • FIG. 8 is a flowchart of a method of displaying image data according to an embodiment of the present invention.
  • first-perspective image data and second-perspective image data to display a 3D image are received in operation S 810 .
  • the first-perspective image data and the second-perspective image data are generated by using the same 2D image data.
  • the first-perspective image data and the second-perspective image data include additional information.
  • the additional information is information indicating relationships among 3D image data generated by converting the same 2D image data, and may include pair information and/or perspective information.
  • Pair information is information indicating that first-perspective image data and second-perspective image data, generated by converting the same 2D image data, are paired image data.
  • Perspective information is information indicating whether each of the first-perspective image data and the second-perspective image data is left-perspective image data or right-perspective image data. If the first-perspective image data is left-perspective image data, the second-perspective image data is right-perspective image data. If the first-perspective image data is right-perspective image data, the second-perspective image data is left-perspective image data.
  • the additional information is obtained from the first-perspective image data and the second-perspective image data in operation S 820 .
  • the additional information may either be transmitted after being included in the first-perspective image data and/or the second-perspective image data, or be separately transmitted. In the case where the additional information is separately transmitted, the additional information may be separately received from an external server or an image data outputting device 100 .
  • the first-perspective image data and the second-perspective image data are displayed based on the additional information in operation S 830 .
  • displaying the first-perspective image data will be described in detail under the assumption that the additional information includes pair information and perspective information.
  • the received first-perspective image data is stored based on the pair information and perspective information.
  • the first-perspective image data may be stored in one of a first buffer 442 in which left-perspective image data of a first pair is stored, a second buffer 444 in which right-perspective image data of the first pair is stored, a third buffer 446 in which left-perspective image data of a second pair is stored, and a fourth buffer 448 in which right-perspective image data of the second pair is stored, as illustrated in FIG. 4 .
  • When the first-perspective image data and the second-perspective image data are both stored in the buffers, the first-perspective image data and the second-perspective image data are sequentially displayed. At this point, the first-perspective image data and the second-perspective image data are repeatedly displayed a predetermined number of times, and are deleted from the buffers after being repeatedly displayed the predetermined number of times.
  • aspects of the present invention can also be written as computer programs and can be implemented in general-use or specific-use digital computers that execute the programs using a computer-readable recording medium.
  • Examples of the computer-readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs and DVDs).
  • aspects of the present invention may also be realized as a data signal embodied in a carrier wave and comprising a program readable by a computer and transmittable over the Internet.

Abstract

A method of outputting three-dimensional (3D) images, the method including: generating first-perspective image data and second-perspective image data to display the 3D image by converting a same two-dimensional (2D) image data; adding additional information indicating a relationship between the first-perspective image data and the second-perspective image data to the first-perspective image data and/or the second-perspective image data; and outputting the first-perspective image data and the second-perspective image data.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of Korean Patent Application No. 10-2008-0092417, filed Sep. 19, 2008, in the Korean Intellectual Property Office, and the benefit of U.S. Provisional Patent Application No. 61/075,184, filed Jun. 24, 2008, in the U.S. Patent and Trademark Office, the disclosures of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • Aspects of the present invention relate to methods and apparatuses to output and display image data.
  • 2. Description of the Related Art
  • Recently, dramatic developments in image processing technologies have resulted in users demanding more realistic images. In particular, the number of users who previously could only watch two-dimensional (2D) images on screens, but who now demand realistic three-dimensional (3D) images, is continuously increasing. Thus, technologies to provide 3D images are in demand.
  • According to one method of providing 3D images, an image may be created as a 3D image from the start. Specifically, a base image and a depth image are stored in a storage space. A user then reproduces 3D images by processing the base image and the depth image using an image outputting device that supports reproduction of 3D images. However, images are rarely created as 3D images in this manner. That is, in most cases, images are created as 2D images. Thus, technologies to reproduce 3D images by processing such 2D images are being used.
  • SUMMARY OF THE INVENTION
  • Aspects of the present invention provide methods and apparatuses to output and to display three-dimensional (3D) images.
  • According to an aspect of the present invention, there is provided a method of outputting 3D image data, the method including: generating first-perspective image data and second-perspective image data to display the 3D image by converting a same two-dimensional (2D) image data; generating additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and outputting the first-perspective image data, the second-perspective image data, and the additional information.
  • According to an aspect of the present invention, the additional information may include pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
  • According to an aspect of the present invention, the first-perspective image data may be left-perspective image data, and the second-perspective image data may be right-perspective image data.
  • According to an aspect of the present invention, the additional information may include perspective information indicating that the first-perspective image data is the left-perspective image data, and the second-perspective image data is the right-perspective image data.
  • According to another aspect of the present invention, there is provided a method of displaying 3D image data, the method including: receiving first-perspective image data and second-perspective image data, which are generated by converting a same 2D image data and are to display a 3D image; receiving additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and displaying the first-perspective image data and the second-perspective image data based on the additional information.
  • According to an aspect of the present invention, the additional information may include pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
  • According to an aspect of the present invention, the first-perspective image data may be left-perspective image data, and the second-perspective image data may be right-perspective image data.
  • According to an aspect of the present invention, the additional information may include perspective information indicating that the first-perspective image data is the left-perspective image data, and the second-perspective image data is the right-perspective image data.
  • According to an aspect of the present invention, the first-perspective image data may be stored in a first buffer, of a plurality of buffers classified according to a predetermined standard, and the second-perspective image data may be stored in a second buffer, of the plurality of buffers.
  • According to an aspect of the present invention, the first-perspective image data and the second-perspective image data may be repeatedly displayed a predetermined number of times.
  • According to an aspect of the present invention, when the first-perspective image data and the second-perspective image data have been repeatedly displayed the predetermined number of times, the first-perspective image data and the second-perspective image data may be deleted from the first and second buffers, respectively.
  • According to another aspect of the present invention, there is provided an image data outputting device including: a generating unit to generate first-perspective image data and second-perspective image data to display a 3D image by converting a same 2D image data; an adding unit to add additional information indicating a relationship between the first-perspective image data and the second-perspective image data to the first-perspective image data and/or the second-perspective image data; and an outputting unit to output the first-perspective image data and the second-perspective image data.
  • According to another aspect of the present invention, there is provided an image data displaying device including: a receiving unit to receive first-perspective image data and second-perspective image data, which are generated by converting a same 2D image data and are to display a 3D image, and to receive additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and a displaying unit to display the first-perspective image data and the second-perspective image data based on the additional information.
  • According to yet another aspect of the present invention, there is provided a method of outputting three-dimensional (3D) image data, the method including: generating first-perspective image data and second-perspective image data to display a 3D image by converting a same two-dimensional (2D) image data; generating additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and displaying the first-perspective image data and the second-perspective image data based on the additional information.
  • According to still another aspect of the present invention, there is provided an image data outputting device including: a generating unit to generate first-perspective image data and second-perspective image data to display a three-dimensional (3D) image by converting a same two-dimensional (2D) image data; and a displaying unit to display the first-perspective image data and the second-perspective image data based on additional information indicating a relationship between the first-perspective image data and the second-perspective image data.
  • According to another aspect of the present invention, there is provided an image data outputting system, including: an image data outputting device including: a generating unit to generate first-perspective image data and second-perspective image data to display a three-dimensional (3D) image by converting a same two-dimensional (2D) image data, an adding unit to add additional information indicating a relationship between the first-perspective image data and the second-perspective image data to the first-perspective image data and/or the second-perspective image data, and an outputting unit to output the first-perspective image data and the second-perspective image data; and an image data displaying device including: a receiving unit to receive the first-perspective image data, the second-perspective image data, and the additional information, and a displaying unit to display the first-perspective image data and the second-perspective image data based on the additional information.
  • According to another aspect of the present invention, there is provided a method of outputting three-dimensional (3D) image data, the method including: generating additional information indicating a relationship between first-perspective image data and second-perspective image data that are generated from a same two-dimensional (2D) image data to display a 3D image.
  • Additional aspects and/or advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and/or other aspects and advantages of the invention will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
  • FIG. 1 is a block diagram of an image data outputting device according to an embodiment of the present invention;
  • FIG. 2A is a diagram showing a detailed configuration of the image data outputting device shown in FIG. 1;
  • FIG. 2B is a diagram showing an example wherein an image data outputting unit according to an embodiment of the present invention outputs 3D image data to which additional information is added;
  • FIG. 2C is a diagram showing another example wherein the image data outputting unit outputs 3D image data to which additional information is added;
  • FIG. 3 is a block diagram of the image data displaying device according to an embodiment of the present invention;
  • FIG. 4 is a block diagram of an image data displaying device according to another embodiment of the present invention;
  • FIG. 5 is a diagram showing an example of displaying 3D image data by using the image data displaying device according to an embodiment of the present invention;
  • FIGS. 6A to 6C are diagrams of a storage unit within the image data displaying device according to an embodiment of the present invention;
  • FIG. 7 is a flowchart of a method of outputting image data according to an embodiment of the present invention; and
  • FIG. 8 is a flowchart of a method of displaying image data according to an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Reference will now be made in detail to the present embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The embodiments are described below in order to explain the present invention by referring to the figures.
  • FIG. 1 is a block diagram of an image data outputting device 100 according to an embodiment of the present invention. Referring to FIG. 1, the image data outputting device 100 includes a generating unit 110, an adding unit 120, and an outputting unit 130. While not required, the image data outputting device 100 can be a transmission device, a data recorder that records the data on a recording medium, a workstation, a desktop computer, a notebook computer, etc., and can be implemented using one or more computers and/or processors whose functions are executed using software/firmware.
  • The generating unit 110 generates three-dimensional (3D) image data by converting two-dimensional (2D) image data into 3D image data. Here, 2D image data is image data used to reproduce 2D images, whereas 3D image data is image data used to reproduce 3D images. Specifically, when reproducing a stereoscopic image, the generating unit 110 converts 2D image data and generates left-perspective image data and right-perspective image data. According to the present embodiment, the left-perspective image data is image data for a left eye of a viewer, and the right-perspective image data is image data for a right eye of the viewer. When reproducing a multi-perspective image in three or more perspectives, the generating unit 110 converts 2D image data and generates three or more pieces of image data in different perspectives. For convenience of explanation, it will be assumed that a stereoscopic image is being reproduced and descriptions thereof will focus on left-perspective image data, though it is understood that aspects of the present invention are not limited thereto. That is, aspects of the present invention, as described below, can also be applied to right-perspective image data, and to a case where a multi-perspective image is reproduced in three or more perspectives.
  • The adding unit 120 adds additional information to at least one piece of generated 3D image data. The adding unit 120 may add additional information to each piece of generated 3D image data. Here, additional information is information indicating relationships among 3D image data generated by the generating unit 110 converting the same 2D image data, and may be in any format. For example, pair information and/or perspective information may be included in additional information. Pair information is information indicating that left-perspective image data and right-perspective image data are a pair when the left-perspective image data and the right-perspective image data are generated by the generating unit 110 converting the same 2D image data. Perspective information is information indicating whether an item of the generated 3D image data is left-perspective image data or right-perspective image data. The adding unit 120 may record the pair information and/or the perspective information as a plurality of bit strings. For example, the adding unit 120 may record the pair information in upper 4-bits of 8-bits existing in a predetermined region of 3D image data, and/or may record the perspective information in lower 4-bits of the 8-bits, though it is understood that aspects of the present invention are not limited thereto. For example, the adding unit 120 may record the pair information in the lower 4-bits of 8-bits existing in a predetermined region of 3D image data, and/or may record the perspective information in the upper 4-bits of the 8-bits.
  • Pair information is information to indicate 2D image data used to generate 3D image data, and a plurality of pieces of 3D image data generated by converting the same 2D image data may have the same pair information. Types of pair information may vary according to various embodiments. For example, according to an embodiment of the present invention, pair information may be classified into 16 types according to sequences of 2D image data used to generate 3D image data. In this case, pair information of 3D image data generated by the generating unit 110 converting first 2D image data may have a value “0000,” and pair information of 3D image data generated by the generating unit 110 converting second 2D image data may have a value “0001.” In this regard, pair information of 3D image data generated by the generating unit 110 converting sixteenth 2D image data may have a value “1111,” and pair information of 3D image data generated by the generating unit 110 converting seventeenth 2D image data may have a value “0000” again, though it is understood that other embodiments are not limited thereto. For example, according to another embodiment of the present invention, the pair information may be classified into two types depending on whether 2D image data used to generate 3D image data are even-numbered 2D image data or odd-numbered 2D image data. In this case, 3D image data generated by the generating unit 110 converting even-numbered 2D image data may have a value “0000,” and 3D image data generated by the generating unit 110 converting odd-numbered 2D image data may have a value “1111.”
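  • The 16-type scheme described above can be sketched as a small helper that cycles a 4-bit pair code with the sequence number of the source 2D image data, wrapping after the sixteenth image. The function name is an illustrative assumption, not taken from this disclosure:

```python
def pair_code(sequence_number):
    """Return the 4-bit pair information for the Nth 2D image data (1-indexed).

    The code cycles "0000" through "1111" and wraps after 16 images,
    matching the example values given in the text.
    """
    return format((sequence_number - 1) % 16, "04b")

# Values from the text: first -> "0000", second -> "0001",
# sixteenth -> "1111", seventeenth wraps back to "0000".
assert pair_code(1) == "0000"
assert pair_code(2) == "0001"
assert pair_code(16) == "1111"
assert pair_code(17) == "0000"
```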
  • Perspective information may vary according to a number of pairs of 3D image data to be generated by the generating unit 110 converting 2D image data. If stereoscopic images are to be reproduced, perspective information may be limited to two types. For example, perspective information of left-perspective image data may have a value “0000,” and perspective information of right-perspective image data may have a value “1111.” Thus, additional information regarding left-perspective image data generated by the generating unit 110 converting even-numbered 2D image data may include a bit string having a value “00000000,” and additional information regarding right-perspective image data generated by the generating unit 110 converting the same 2D image data may include a bit string having a value “00001111.” In this regard, additional information regarding left-perspective image data generated by the generating unit 110 converting odd-numbered 2D image data may include a bit string having a value “11110000,” and additional information regarding right-perspective image data generated by the generating unit 110 converting the same 2D image data may include a bit string having a value “11111111.”
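  • The 8-bit layout described above (pair information in the upper 4 bits, perspective information in the lower 4 bits) can be sketched as follows. The constant and function names are illustrative assumptions; only the bit values come from the text:

```python
PAIR_EVEN = 0b0000    # generated from even-numbered 2D image data
PAIR_ODD = 0b1111     # generated from odd-numbered 2D image data
PERSP_LEFT = 0b0000   # left-perspective image data
PERSP_RIGHT = 0b1111  # right-perspective image data

def pack_additional_info(pair, perspective):
    """Pack pair info into the upper 4 bits and perspective info into the lower 4 bits."""
    return ((pair & 0xF) << 4) | (perspective & 0xF)

def unpack_additional_info(byte):
    """Return (pair, perspective) from one additional-information byte."""
    return (byte >> 4) & 0xF, byte & 0xF

# The four byte values enumerated in the text:
assert pack_additional_info(PAIR_EVEN, PERSP_LEFT) == 0b00000000
assert pack_additional_info(PAIR_EVEN, PERSP_RIGHT) == 0b00001111
assert pack_additional_info(PAIR_ODD, PERSP_LEFT) == 0b11110000
assert pack_additional_info(PAIR_ODD, PERSP_RIGHT) == 0b11111111
```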
  • The outputting unit 130 transmits the 3D image data, to which the additional information is added by the adding unit 120, to an image data displaying device 300 described below.
  • In the embodiment described above with reference to FIG. 1, the generating unit 110 in the image data outputting device 100 generates 3D image data by converting 2D image data and generates additional information indicating relationships among the 3D image data. However, according to another embodiment of the present invention, the generating unit 110 may not generate the additional information. Instead, the generating unit 110 or the adding unit 120 may obtain additional information generated in advance and/or externally and transmit the additional information to the image data displaying device 300 together with 3D image data.
  • According to an embodiment of the present invention, the image data outputting device 100 may further include a receiving unit (not shown). The receiving unit receives metadata including additional information from an external server via a network system and/or from a broadcasting server that provides a cable broadcasting service via a cable network, though it is understood that aspects of the present invention are not limited thereto. For example, according to other aspects, the receiving unit may receive the metadata from an external storage device (such as a server) via any wired and/or wireless connection (such as USB, Bluetooth, Infrared, etc.). According to another embodiment of the present invention, the image data outputting device 100 may further include a reading unit (not shown). In this case, 2D image data and/or 3D image data generated by the generating unit 110 converting 2D image data is recorded in a recording medium. Furthermore, metadata including additional information may also be recorded on the recording medium. The reading unit may read the recording medium, obtain metadata therefrom, and transmit the metadata to the image data displaying device 300 together with the 3D image data.
  • FIG. 2A is a diagram showing a detailed configuration of the image data outputting device 100 shown in FIG. 1. Referring to FIG. 2A, a decoder 210 decodes input 2D image data. The decoded 2D image data is transmitted to the generating unit 110. The generating unit 110 generates a 3D image data pair, which is to be used to generate 3D images, by converting the input 2D image data. The adding unit 120 adds additional information that indicates relationships among a plurality of 3D image data generated by the generating unit 110 converting the same 2D image data, to at least one of the 3D image data. As described above, the additional information may include pair information and/or perspective information.
  • According to the present embodiment, the adding unit 120 includes a perspective determining unit 122 and an information recording unit 124. The perspective determining unit 122 determines perspectives of a generated 3D image data pair. That is, the perspective determining unit 122 determines which of 3D image data items is left-perspective image data and which is right-perspective image data.
  • The information recording unit 124 adds pair information that indicates left-perspective image data and right-perspective image data from among a 3D image data pair generated by using the same 2D image data, and adds perspective information to each of the image data. Pair information and perspective information may be generated as metadata. However, aspects of the present invention are not limited thereto. For example, the pair information and/or the perspective information may be added by being recorded in a predetermined region within 3D image data.
  • The outputting unit 130 outputs a 3D image data pair to which the pair information and the perspective information are added by the information recording unit 124. If the outputting unit 130 outputs 3D image data at a transmitting rate of 60 Hz, it takes 1/30 of a second to completely output a 3D image data pair.
  • FIG. 2B is a diagram showing an example wherein an image data outputting unit according to an embodiment of the present invention outputs 3D image data to which additional information 231 is added. Referring to FIG. 2B, the additional information 231 is output as metadata regarding 3D image data 232 and 233. In this case, an output sequence may vary according to embodiments. The output sequence shown in FIG. 2B is the additional information 231, left-perspective image data 232, and right-perspective image data 233.
  • FIG. 2C is a diagram showing another example wherein the image data outputting unit outputs 3D image data to which additional information is added. Referring to FIG. 2C, additional pieces of information 242 and 244 are recorded in predetermined regions of 3D image data 241 and 243, respectively. In other words, the additional information 242 regarding left-perspective image data 241 is recorded in the left-perspective image data 241, and the additional information 244 regarding right-perspective image data 243 is recorded in the right-perspective image data 243. At this point, a sequence of outputting 3D image data may vary according to embodiments. In FIG. 2C, the 3D image data is output in the sequence of the left-perspective image data 241 followed by the right-perspective image data 243.
  • FIG. 3 is a block diagram of the image data displaying device 300 according to an embodiment of the present invention. Referring to FIG. 3, the image data displaying device 300 includes a receiving unit 310, an extracting unit 320, and a displaying unit 330. While not required, the image data displaying device 300 can be a workstation, a desktop computer, a notebook computer, a portable multimedia player, a television, a set-top box, a reproducing device that reads the image data from a medium, etc., and can be implemented using one or more computers and/or processors whose functions are executed using software/firmware.
  • The receiving unit 310 receives 3D image data from the image data outputting device 100. As described above, the received 3D image data includes additional information indicating relationships among 3D image data generated by converting the same 2D image data. The additional information includes pair information and/or perspective information. Pair information is information indicating that left-perspective image data and right-perspective image data generated by the generating unit 110 converting the same 2D image data are a 3D image data pair. Left-perspective image data and right-perspective image data having the same pair information can be determined as a 3D image data pair. Perspective information is information indicating whether 3D image data is left-perspective image data or right-perspective image data.
  • The extracting unit 320 extracts the additional information from the received 3D image data. The displaying unit 330 displays the 3D image data based on the additional information. More particularly, whether to display 3D image data is determined based on the pair information and the perspective information. If it is determined to display the 3D image data, a 3D image data display sequence is determined, and the displaying unit 330 displays the 3D image data according to the determined sequence. An example of displaying 3D image data will be described in detail later with reference to FIG. 5.
  • FIG. 4 is a block diagram of an image data displaying device 400 according to another embodiment of the present invention. Referring to FIG. 4, the image data displaying device 400 includes a receiving unit 410, an extracting unit 420, a control unit 430, a storage unit 440, and a displaying unit 450. Here, operations of the receiving unit 410, the extracting unit 420, and the displaying unit 450 are similar to those of the receiving unit 310, the extracting unit 320, and the displaying unit 330 shown in FIG. 3, and thus detailed descriptions thereof will be omitted here.
  • The control unit 430 controls the image data displaying device 400 to store, to delete, and/or to display 3D image data, based on additional information. The storage unit 440 includes a plurality of buffers 442 through 448, which are defined according to a predetermined standard. The predetermined standard may vary according to embodiments. For example, the buffers 442 through 448 may be defined based on the perspective information and/or the pair information that are added to the 3D image data. The storage unit 440, which is controlled by the control unit 430, stores 3D image data in one or more of the plurality of buffers, transmits stored 3D image data to the displaying unit 450, and/or deletes stored 3D image data.
  • Hereinafter, operations of the image data displaying unit 450 will be described. As described above, the receiving unit 410 receives 3D image data from the image data outputting device 100. The received 3D image data is image data generated by the generating unit 110 converting 2D image data, and may include additional information. The additional information is information indicating relationships among 3D image data generated by the generating unit 110 converting the same 2D image data, and may include pair information and/or perspective information.
  • As an example, it is assumed that the pair information and the perspective information are indicated by using 8-bit strings within the 3D image data, wherein upper 4-bits of an 8-bit string indicates the pair information and lower 4-bits of the 8-bit string indicates the perspective information. The pair information may indicate whether the 3D image data is generated by the generating unit 110 converting even-numbered 2D image data or odd-numbered 2D image data. For example, if a value of the upper 4-bits is “0000,” the 3D image data is generated by the generating unit 110 converting even-numbered 2D image data. In contrast, if the value of the upper 4-bits is “1111,” the 3D image data is generated by the generating unit 110 converting odd-numbered 2D image data. The perspective information may indicate whether 3D image data is left-perspective image data or right-perspective image data. For example, if a value of the lower 4-bits is “0000,” the 3D image data is left-perspective image data. If the value of the lower 4-bits is “1111,” the 3D image data is right-perspective image data.
  • The extracting unit 420 extracts the pair information and the perspective information from the 3D image data. For example, as described above, if the extracted pair information and perspective information is “00000000,” the 3D image data is left-perspective image data generated by converting even-numbered 2D image data. Hereinafter, a 3D image data pair generated by converting even-numbered 2D image data will be referred to as a first pair, and a 3D image data pair generated by converting odd-numbered 2D image data will be referred to as a second pair. The control unit 430 selects a buffer, in which the 3D image data is to be stored, based on the pair information and the perspective information. The storage unit 440 comprises four buffers 442, 444, 446, and 448. A first buffer 442 stores left-perspective image data of the first pair, and a second buffer 444 stores right-perspective image data of the first pair. A third buffer 446 stores left-perspective image data of the second pair, and a fourth buffer 448 stores right-perspective image data of the second pair. Thus, 3D image data having additional information indicated by the 8-bit string “00000000” is stored in the first buffer 442. However, it is understood that aspects of the present invention are not limited to the four buffers 442 through 448 classified by the pair information and the perspective information. According to other aspects, more or fewer buffers may be provided according to other classification schemes. For example, only two buffers may be provided, classified by the pair information or the perspective information.
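  • The buffer selection just described can be sketched as follows. The buffer indexing is an illustrative assumption consistent with the four buffers 442 through 448:

```python
def select_buffer(additional_info):
    """Map an 8-bit additional-information value to a buffer index.

    0 - left image of the first (even-numbered) pair
    1 - right image of the first (even-numbered) pair
    2 - left image of the second (odd-numbered) pair
    3 - right image of the second (odd-numbered) pair
    """
    pair = (additional_info >> 4) & 0xF   # upper 4 bits: pair information
    perspective = additional_info & 0xF   # lower 4 bits: perspective information
    is_second_pair = (pair == 0b1111)     # generated from odd-numbered 2D data
    is_right = (perspective == 0b1111)
    return (2 if is_second_pair else 0) + (1 if is_right else 0)

# "00000000" is stored in the first buffer, as in the text:
assert select_buffer(0b00000000) == 0
```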
  • Thereafter, the control unit 430 controls the storage unit 440 such that the 3D image data stored in the first buffer 442 is transmitted to the displaying unit 450. The displaying unit 450 may display the 3D image data in a sequence of displaying left-perspective image data of a pair and successively displaying right-perspective image data of the same pair or vice versa.
  • The control unit 430 controls the display unit 450 and the storage unit 440 such that the 3D image data is repeatedly displayed a predetermined number of times and is deleted from a buffer 442, 444, 446, or 448. Displaying and buffering the 3D image data will be described in detail later with reference to FIGS. 5, 6A, and 6B. The displaying unit 450 displays the transmitted 3D image data.
  • A conventional image data outputting device generates 3D image data by converting 2D image data and transmits the 3D image data to an image data displaying device without adding additional information, such as pair information and/or perspective information. An image data displaying device alternately displays left-perspective image data and right-perspective image data in a sequence that the 3D image data is received. Here, a user watches left-perspective image data via his or her left eye and watches right-perspective image data via his or her right eye by using an auxiliary device to watch 3D images (e.g., 3D glasses or goggles).
  • Even a conventional method enables a user to watch 3D images in the case where 3D image data is sequentially transmitted and displayed. However, in the case where 3D image data is not sequentially transmitted or some pieces of 3D image data are not transmitted and/or displayed due, for example, to a power failure, a user cannot watch the 3D images. For example, it is assumed that, while a user is watching left-perspective image data via his or her left eye, power supplied to an image data displaying device temporarily fails, and left-perspective image data is received again when power supply to the image data displaying device is recovered. In this case, 3D image data does not include perspective information, and thus the image data displaying device cannot determine whether the newly received 3D image data is left-perspective image data or right-perspective image data. Therefore, without separate synchronization, the image data displaying device may determine left-perspective image data as right-perspective image data. In this case, a user watches left-perspective image data via his or her right eye and right-perspective image data via his or her left eye, and thus the 3D effect cannot be sufficiently experienced. However, the image data displaying device 300, 400 according to aspects of the present invention uses perspective information included in 3D image data such that left-perspective image data is viewed via the left eye and right-perspective image data is viewed via the right eye. Thus, a user can watch 3D images without distortion.
  • Furthermore, pair information is not included in conventional 3D image data. Thus, a conventional image data displaying device may not be able to display both the left-perspective image data and the right-perspective image data of a pair, and may only partially display a 3D image data pair before displaying a next pair. In this case, the 3D effect cannot be sufficiently experienced by a user due to overlapping of images. However, the image data displaying device 300, 400 according to aspects of the present invention guarantees the display of both left-perspective image data and right-perspective image data by using pair information included in 3D image data. Thus, a user can watch natural 3D images.
  • FIG. 5 is a diagram showing an example of displaying 3D image data by using the image data displaying device 400 according to an embodiment of the present invention. Referring to FIG. 5, 3D image data output from the image data outputting device 100 according to an embodiment of the present invention are shown in chronological order in the lower part of FIG. 5. Here, if the image data outputting device 100 outputs 3D image data at a rate of 60 Hz, the unit of time is set to 1/60 of a second.
  • 3D image data according to aspects of the present invention includes pair information and perspective information. In the present example, pair information is indicated by an upper bit between two bits included in the 3D image data, and perspective information is indicated by a lower bit. Specifically, 3D image data where a bit indicating pair information (i.e., the upper bit) is 0 is generated by using even-numbered 2D image data, whereas 3D image data where a bit indicating pair information is 1 is generated by using odd-numbered 2D image data. Furthermore, 3D image data where a bit indicating perspective information (i.e., the lower bit) is 0 is left-perspective image data, whereas 3D image data where a bit indicating perspective information is 1 is right-perspective image data.
  • At 1/60 of a second, 3D image data 501 is output. In the 3D image data 501, both a bit indicating pair information and a bit indicating perspective information are 0. Thus, it is clear that the 3D image data 501 is left-perspective image data generated by using even-numbered 2D image data. At 2/60 of a second, 3D image data 502 is output. In the 3D image data 502, a bit indicating pair information is 0, and a bit indicating perspective information is 1. Thus, it is clear that the 3D image data 502 is right-perspective image data generated by using even-numbered 2D image data. Furthermore, it is clear that the 3D image data 501 and the 3D image data 502 are paired image data, as the 3D image data 501 and the 3D image data 502 have the same pair information, have different perspective information, and are transmitted successively.
  • At 3/60 of a second, 3D image data 503 is output. In the 3D image data 503, a bit indicating pair information is 1, and a bit indicating perspective information is 0. Therefore, it is clear that the 3D image data 503 is left-perspective image data generated by using odd-numbered 2D image data. Furthermore, since pair information of the 3D image data 502 and 503 are different from each other, it is clear that the 3D image data 502 and the 3D image data 503 are not paired image data.
  • 3D image data displayed by the image data displaying device 300, 400 according to aspects of the present invention are shown in chronological order in the upper part of FIG. 5. At 1/60 of a second, the 3D image data 501 is received from the image data outputting device 100, and the received 3D image data 501 is stored in the storage unit 440. At 2/60 of a second, the 3D image data 502 is received from the image data outputting device 100, and the received 3D image data 502 is stored in the storage unit 440. At 3/60 of a second, the 3D image data 503 is received from the image data outputting device 100, and the received 3D image data 503 is stored in the storage unit 440. At this point, the image data displaying device 400 displays stored left-perspective image data (that is, the 3D image data 501).
  • At 3.5/60 of a second, no 3D image data is received from the image data outputting device 100, as the 3D image data in the present embodiment is received every 1/60 of a second. Here, the image data displaying device 400 displays stored right-perspective image data (that is, the 3D image data 502). In the prior art, the most recently received image data must continue to be displayed at points in time when no image data is received from the image data outputting device 100. Therefore, a 3D image data pair is displayed every 1/30 of a second. However, stored 3D image data is used in the present embodiment, and thus an image data pair can be displayed every 1/60 of a second.
  • At 4/60 of a second, 3D image data 504 is received from the image data outputting device 100, and the received 3D image data 504 is stored in the storage unit 440. At this point, the image data displaying device 400 displays the 3D image data 501. Since the 3D image data 501 is repeatedly displayed twice, the 3D image data 501 is deleted from the storage unit 440. At 4.5/60 of a second, no 3D image data is received from the image data outputting device 100. At this point, the image data displaying device 400 displays the 3D image data 502. Since the 3D image data 502 is repeatedly displayed twice, the 3D image data 502 is deleted from the storage unit 440.
  • In the prior art, when a conventional image data outputting device outputs image data every 1/60 of a second, the image data displaying device 400 displays one piece of image data every 1/60 of a second. Accordingly, in the case of displaying 2D image data, it takes 1/60 of a second to display a scene. However, in the case of displaying 3D image data, it takes 2/60 of a second to display a scene, because a user can watch a scene only after two pieces of 3D image data are displayed. However, images appear more natural to a user when a scene is displayed every 1/60 of a second. Thus, 3D images displayed by the conventional image data displaying device do not seem natural to a user, compared to those displayed according to aspects of the present invention. That is, the image data displaying device 300, 400 according to aspects of the present invention classifies and stores received 3D image data by using a plurality of buffers, and repeatedly displays the stored 3D image data a predetermined number of times. Thus, a 3D image data pair can be displayed every 1/60 of a second, and a user can watch natural 3D images.
  • FIGS. 6A and 6B are diagrams of the storage unit 440 of the image data displaying device 400 according to an embodiment of the present invention. Referring to FIGS. 6A and 6B, the storage unit 440 of the image data displaying device 400 includes the first buffer 442, the second buffer 444, the third buffer 446, and the fourth buffer 448.
  • The first buffer 442 stores left-perspective image data of the first pair, and the second buffer 444 stores right-perspective image data of the first pair. The third buffer 446 stores left-perspective image data of the second pair, and the fourth buffer 448 stores right-perspective image data of the second pair. In FIGS. 6A and 6B, 3D image data where a bit indicating pair information is 0 is 3D image data of the first pair, and 3D image data where a bit indicating pair information is 1 is 3D image data of the second pair.
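The buffer selection described above reduces to a two-bit index. The following sketch assumes the buffers are addressed 0 through 3 (first buffer 442 = index 0, fourth buffer 448 = index 3); the function name is hypothetical.

```python
def buffer_index(pair_bit, perspective_bit):
    """Map (pair information, perspective information) to one of four buffers:
    0 -> first buffer (442): pair 0, left    1 -> second buffer (444): pair 0, right
    2 -> third buffer (446): pair 1, left    3 -> fourth buffer (448): pair 1, right"""
    return pair_bit * 2 + perspective_bit
```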
  • Referring to FIGS. 5 and 6A, operations of the storage unit 440 at a time frame between 1/60 of a second and 2.5/60 of a second will now be described. At 1/60 of a second, the 3D image data 501 is received from the image data outputting device 100, and the received 3D image data 501 is stored in the storage unit 440. In the 3D image data 501, a bit indicating pair information is 0, and a bit indicating perspective information is also 0. Thus, the 3D image data 501 is stored in the first buffer 442. At 1.5/60 of a second, no data is received from the image data outputting device 100. At 2/60 of a second, the 3D image data 502 is received from the image data outputting device 100, and the received 3D image data 502 is stored in the storage unit 440. In the 3D image data 502, a bit indicating pair information is 0, and a bit indicating perspective information is 1. Thus, the 3D image data 502 is stored in the second buffer 444. At 2.5/60 of a second, no data is received from the image data outputting device 100.
  • Referring to FIGS. 5 and 6B, operations of the storage unit 440 at a time frame between 3/60 of a second and 4.5/60 of a second will now be described. At 3/60 of a second, the 3D image data 503 is received from the image data outputting device 100, and the received 3D image data 503 is stored in the storage unit 440. In the 3D image data 503, a bit indicating pair information is 1, and a bit indicating perspective information is 0. Thus, the 3D image data 503 is stored in the third buffer 446. At this point, the 3D image data 501 stored in the first buffer 442 is output to the displaying unit 450. At 3.5/60 of a second, no data is received from the image data outputting device 100. At this point, the 3D image data 502 stored in the second buffer 444 is output to the displaying unit 450. At 4/60 of a second, the 3D image data 504 is received from the image data outputting device 100, and the received 3D image data 504 is stored in the storage unit 440. In the 3D image data 504, a bit indicating pair information is 1, and a bit indicating perspective information is 1. Thus, the 3D image data 504 is stored in the fourth buffer 448. At this point, the 3D image data 501 stored in the first buffer 442 is output to the displaying unit 450. Since the 3D image data 501 stored in the first buffer 442 is displayed twice, the 3D image data 501 is deleted from the first buffer 442. At 4.5/60 of a second, no data is received from the image data outputting device 100. At this point, the 3D image data 502 stored in the second buffer 444 is output to the displaying unit 450. Since the 3D image data 502 stored in the second buffer 444 is displayed twice, the 3D image data 502 is deleted from the second buffer 444.
  • Referring to FIGS. 5 and 6C, operations of the storage unit 440 at a time frame between 5/60 of a second and 6.5/60 of a second will now be described. At 5/60 of a second, the 3D image data 505 is received from the image data outputting device 100, and the received 3D image data 505 is stored in the storage unit 440. In the 3D image data 505, a bit indicating pair information is 0, and a bit indicating perspective information is also 0. Thus, the 3D image data 505 is stored in the first buffer 442. At this point, the 3D image data 503 stored in the third buffer 446 is output to the displaying unit 450. At 5.5/60 of a second, no data is received from the image data outputting device 100. At this point, the 3D image data 504 stored in the fourth buffer 448 is output to the displaying unit 450. At 6/60 of a second, the 3D image data 506 is received from the image data outputting device 100, and the received 3D image data 506 is stored in the storage unit 440. In the 3D image data 506, a bit indicating pair information is 0, and a bit indicating perspective information is 1. Thus, the 3D image data 506 is stored in the second buffer 444. At this point, the 3D image data 503 stored in the third buffer 446 is output to the displaying unit 450. Since the 3D image data 503 stored in the third buffer 446 is displayed twice, the 3D image data 503 is deleted from the third buffer 446. At 6.5/60 of a second, no data is received from the image data outputting device 100. At this point, the 3D image data 504 stored in the fourth buffer 448 is output to the displaying unit 450. Since the 3D image data 504 stored in the fourth buffer 448 is displayed twice, the 3D image data 504 is deleted from the fourth buffer 448.
  • In FIGS. 6A and 6B, one item of 3D image data 501 or 502 is displayed every 1/120 of a second by using the stored 3D image data 501 and 502. Thus, a 3D image data pair 501 and 502 is displayed every 1/60 of a second. According to other embodiments, an image data outputting rate of the image data outputting device 100 may be less than or greater than 60 Hz. For example, if the image data outputting rate is less than 60 Hz, the image data displaying device 400 may still be controlled to display a 3D image data pair every 1/60 of a second by adjusting the number of times 3D image data is repeatedly displayed (i.e., adjusting the number of times to be greater than two).
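The repeat-count adjustment above is simple rate arithmetic. As a sketch, assuming a fixed 120 Hz display (as in the 1/120-of-a-second example) and a source rate that divides it evenly, the number of times each stored frame must be shown is:

```python
def repeat_count(output_rate_hz, display_rate_hz=120):
    """Hypothetical helper: how many times each received 3D image frame must be
    repeatedly displayed so the display runs continuously at display_rate_hz
    when the outputting device supplies frames at output_rate_hz.
    Assumes output_rate_hz evenly divides display_rate_hz."""
    return display_rate_hz // output_rate_hz
```

For a 60 Hz source this gives two repeats per frame, matching the embodiment; a slower source would need a repeat count greater than two, as the paragraph notes.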
  • Furthermore, in FIGS. 6A and 6B, both left-perspective image data 501 and 503 and right-perspective image data 502 and 504 of a same pair are stored, and stored 3D image data 501 and 502 is displayed as soon as left-perspective image data 503 of a next pair is received. However, it is understood that aspects of the present invention are not limited thereto. For example, according to other aspects, stored left-perspective image data 501 may be displayed from when a pair of left-perspective image data 501 and right-perspective image data 502 is received (2/60 of a second).
  • FIG. 7 is a flowchart of a method of outputting image data according to an embodiment of the present invention. Referring to FIG. 7, first-perspective image data and second-perspective image data to display a 3D image are generated in operation S710. The first-perspective image data and the second-perspective image data are generated by converting the same 2D image data. Additional information indicating relationships between the generated first-perspective image data and the generated second-perspective image data is generated, and the generated additional information is added to the first-perspective image data and/or the second-perspective image data in operation S720. That is, the additional information may be added to both the first-perspective image data and the second-perspective image data, or to just one of them. The additional information may include pair information that indicates that the first-perspective image data and the second-perspective image data are an image data pair generated by converting the same 2D image data. Furthermore, the additional information may additionally or alternatively include perspective information that indicates whether each of the first-perspective image data and the second-perspective image data is left-perspective image data or right-perspective image data. Moreover, the pair information and the perspective information may be recorded within the first-perspective image data and the second-perspective image data. For example, the pair information may be recorded by using the upper 4 bits of an 8-bit value in a predetermined region of the first-perspective image data and the second-perspective image data, and the perspective information may be recorded by using the lower 4 bits of the 8-bit value. The first-perspective image data and the second-perspective image data, to which the additional information is added, are output in operation S730.
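The 4-bit/4-bit packing example above can be sketched in a few lines. The function names are hypothetical; only the nibble layout (pair information in the upper 4 bits, perspective information in the lower 4 bits of one byte) comes from the text.

```python
def pack_additional_info(pair, perspective):
    """Pack pair information into the upper 4 bits and perspective
    information into the lower 4 bits of a single byte."""
    return ((pair & 0xF) << 4) | (perspective & 0xF)

def unpack_additional_info(byte):
    """Recover (pair, perspective) from the packed byte."""
    return (byte >> 4) & 0xF, byte & 0xF
```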
  • FIG. 8 is a flowchart of a method of displaying image data according to an embodiment of the present invention. Referring to FIG. 8, first-perspective image data and second-perspective image data to display a 3D image are received in operation S810. The first-perspective image data and the second-perspective image data are generated by using the same 2D image data. According to aspects of the present invention, the first-perspective image data and the second-perspective image data include additional information. The additional information is information indicating relationships among 3D image data generated by converting the same 2D image data, and may include pair information and/or perspective information. Pair information is information indicating that first-perspective image data and second-perspective image data, generated by converting the same 2D image data, are paired image data. Perspective information is information indicating whether each of the first-perspective image data and the second-perspective image data is left-perspective image data or right-perspective image data. If the first-perspective image data is left-perspective image data, the second-perspective image data is right-perspective image data. If the first-perspective image data is right-perspective image data, the second-perspective image data is left-perspective image data.
  • The additional information is obtained from the first-perspective image data and the second-perspective image data in operation S820. The additional information may either be transmitted after being included in the first-perspective image data and/or the second-perspective image data, or be separately transmitted. In the case where the additional information is separately transmitted, the additional information may be separately received from an external server or an image data outputting device 100.
  • The first-perspective image data and the second-perspective image data are displayed based on the additional information in operation S830. Hereinafter, displaying the first-perspective image data will be described in detail under an assumption that the additional information includes pair information and perspective information. First, the received first-perspective image data is stored based on the pair information and perspective information. For example, the first-perspective image data may be stored in a first buffer 442 in which left-perspective image data of a first pair is stored, a second buffer 444 in which right-perspective image data of the first pair is stored, a third buffer 446 in which left-perspective image data of a second pair is stored, and a fourth buffer 448 in which right-perspective image data of the second pair is stored, as illustrated in FIG. 4. When the first-perspective image data and the second-perspective image data are both stored in the buffers, the first-perspective image data and the second-perspective image data are sequentially displayed. At this point, the first-perspective image data and the second-perspective image data are repeatedly displayed a predetermined number of times, and are deleted from the buffers after being repeatedly displayed the predetermined number of times.
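The store-display-delete cycle of operation S830 can be modeled as a toy class. This is a sketch under our own assumptions (frames keyed by their pair and perspective bits, a fixed repeat count of two as in the embodiment); it is not the patented implementation.

```python
class FrameStore:
    """Toy model of the buffered display in operation S830: each frame is
    stored by (pair, perspective), displayed `repeats` times, then deleted."""

    def __init__(self, repeats=2):
        self.buffers = {}  # (pair_bit, perspective_bit) -> frame data
        self.shown = {}    # (pair_bit, perspective_bit) -> display count
        self.repeats = repeats

    def store(self, pair, perspective, frame):
        self.buffers[(pair, perspective)] = frame
        self.shown[(pair, perspective)] = 0

    def display(self, pair, perspective):
        key = (pair, perspective)
        frame = self.buffers[key]
        self.shown[key] += 1
        if self.shown[key] >= self.repeats:
            # Deleted from the buffer after being displayed the
            # predetermined number of times, as in the text.
            del self.buffers[key], self.shown[key]
        return frame
```

For example, a left-perspective frame of the first pair stored under (0, 0) survives the first display and is evicted after the second, freeing the buffer for the next pair.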
  • While not restricted thereto, aspects of the present invention can also be written as computer programs and can be implemented in general-use or specific-use digital computers that execute the programs using a computer-readable recording medium. Examples of the computer-readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, or DVDs). Aspects of the present invention may also be realized as a data signal embodied in a carrier wave and comprising a program readable by a computer and transmittable over the Internet.
  • Although a few embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in this embodiment without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.

Claims (32)

1. A method of outputting three-dimensional (3D) image data, the method comprising:
generating first-perspective image data and second-perspective image data to display a 3D image by converting a same two-dimensional (2D) image data;
generating additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and
outputting the first-perspective image data, the second-perspective image data, and the additional information.
2. The method as claimed in claim 1, wherein the additional information comprises pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
3. The method as claimed in claim 1, wherein the first-perspective image data is left-perspective image data, and the second-perspective image data is right-perspective image data.
4. The method as claimed in claim 3, wherein the additional information comprises perspective information indicating that the first-perspective image data is the left-perspective image data, and/or that the second-perspective image data is the right-perspective image data.
5. The method as claimed in claim 1, wherein the generating of the additional information comprises:
generating the additional information; and
adding the generated additional information to the first-perspective image data and/or the second-perspective image data.
6. The method as claimed in claim 1, wherein the generating of the additional information comprises generating the additional information as a bit string of the first-perspective image data and/or the second-perspective image data.
7. A method of displaying image data, the method comprising:
receiving first-perspective image data and second-perspective image data, which are generated by converting a same two-dimensional (2D) image data and are to display a three-dimensional (3D) image;
receiving additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and
displaying the first-perspective image data and the second-perspective image data based on the additional information.
8. The method as claimed in claim 7, wherein the additional information comprises pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
9. The method as claimed in claim 7, wherein the first-perspective image data is left-perspective image data, and the second-perspective image data is right-perspective image data.
10. The method as claimed in claim 9, wherein the additional information comprises perspective information indicating that the first-perspective image data is the left-perspective image data, and/or that the second-perspective image data is the right-perspective image data.
11. The method as claimed in claim 7, further comprising:
storing the received first-perspective image data and the received second-perspective image data.
12. The method as claimed in claim 11, wherein the first-perspective image data is stored in a first buffer, of a plurality of buffers classified according to a predetermined standard, and the second-perspective image data is stored in a second buffer, of the plurality of buffers.
13. The method as claimed in claim 12, wherein the plurality of buffers comprises:
the first buffer corresponding to left-perspective image data of a first pair of received image data;
the second buffer corresponding to right-perspective image data of the first pair;
a third buffer corresponding to the left-perspective image data of a second pair of received image data; and
a fourth buffer corresponding to the right-perspective image data of the second pair.
14. The method as claimed in claim 7, wherein the displaying of the first-perspective image data and the second-perspective image data comprises repeatedly displaying the first-perspective image data and the second-perspective image data for a predetermined number of times.
15. The method as claimed in claim 14, further comprising:
storing the received first-perspective image data in a first buffer, of a plurality of buffers classified according to a predetermined standard, and storing the received second-perspective image data in a second buffer, of the plurality of buffers; and
when the first-perspective image data and the second-perspective image data have been repeatedly displayed for the predetermined number of times, deleting the first-perspective image data and the second-perspective image data from the first and second buffers, respectively.
16. The method as claimed in claim 15, wherein the first-perspective image data is received every 1/N of a second, and the first-perspective image data is displayed every 1/Y of a second, where N and Y are integers and N is less than Y.
17. The method as claimed in claim 7, wherein the receiving of the additional information comprises extracting the additional information from the first-perspective image data and/or the second-perspective image data.
18. An image data outputting device comprising:
a generating unit to generate first-perspective image data and second-perspective image data to display a three-dimensional (3D) image by converting a same two-dimensional (2D) image data;
an adding unit to add additional information indicating a relationship between the first-perspective image data and the second-perspective image data to the first-perspective image data and/or the second-perspective image data; and
an outputting unit to output the first-perspective image data and the second-perspective image data.
19. The image data outputting device as claimed in claim 18, wherein the additional information comprises pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
20. The image data outputting device as claimed in claim 18, wherein the first-perspective image data is left-perspective image data, and the second-perspective image data is right-perspective image data.
21. The image data outputting device as claimed in claim 19, wherein the additional information comprises perspective information indicating that the first-perspective image data is left-perspective image data and the second-perspective image data is right-perspective image data.
22. An image data displaying device comprising:
a receiving unit to receive first-perspective image data and second-perspective image data, which are generated by converting a same two-dimensional (2D) image data and are to display a three-dimensional (3D) image, and to receive additional information indicating a relationship between the first-perspective image data and the second-perspective image data; and
a displaying unit to display the first-perspective image data and the second-perspective image data based on the additional information.
23. The image data displaying device as claimed in claim 22, wherein the additional information comprises pair information indicating that the first-perspective image data and the second-perspective image data are paired image data generated by converting the same 2D image data.
24. The image data displaying device as claimed in claim 22, wherein the first-perspective image data is left-perspective image data, and the second-perspective image data is right-perspective image data.
25. The image data displaying device as claimed in claim 24, wherein the additional information comprises perspective information indicating that the first-perspective image data is the left-perspective image data, and the second-perspective image data is the right-perspective image data.
26. The image data displaying device as claimed in claim 22, further comprising:
a storage unit comprising a plurality of buffers classified according to a predetermined standard; and
a control unit to control the image data displaying device based on the additional information such that the first-perspective image data is stored in a first buffer, of the plurality of buffers, and the second-perspective image data is stored in a second buffer, of the plurality of buffers.
27. The image data displaying device as claimed in claim 26, wherein the control unit controls the displaying unit to repeatedly display the first-perspective image data and the second-perspective image data for a predetermined number of times.
28. The image data displaying device as claimed in claim 27, wherein the control unit further controls the image data displaying device such that the first-perspective image data and the second-perspective image data are deleted from the first and second buffers, respectively, in response to the displaying unit repeatedly displaying the first-perspective image data and the second-perspective image data for the predetermined number of times.
29. The image data displaying device as claimed in claim 26, wherein the receiving unit receives perspective image data every 1/N of a second, and the displaying unit displays the received perspective image data every 1/Y of a second, where N and Y are integers and N is less than Y.
30. The image data displaying device as claimed in claim 22, wherein the receiving unit comprises an extracting unit to extract the additional information from the first-perspective image data and/or the second-perspective image data.
31. A computer readable recording medium having recorded thereon a computer program for executing the method of claim 1 by at least one computer.
32. A computer readable recording medium having recorded thereon a computer program for executing the method of claim 7 by at least one computer.
US12/479,978 2008-06-24 2009-06-08 Method and apparatus for outputting and displaying image data Abandoned US20090315884A1 (en)


Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US7518408P 2008-06-24 2008-06-24
KR1020080092417A KR20100002033A (en) 2008-06-24 2008-09-19 Method and apparatus for output of image data and method and apparatus for displaying of image data
KR10-2008-0092417 2008-09-19
US12/479,978 US20090315884A1 (en) 2008-06-24 2009-06-08 Method and apparatus for outputting and displaying image data

Publications (1)

Publication Number Publication Date
US20090315884A1 true US20090315884A1 (en) 2009-12-24

Family

ID=41812276

Family Applications (6)

Application Number Title Priority Date Filing Date
US12/479,978 Abandoned US20090315884A1 (en) 2008-06-24 2009-06-08 Method and apparatus for outputting and displaying image data
US12/489,758 Abandoned US20090317061A1 (en) 2008-06-24 2009-06-23 Image generating method and apparatus and image processing method and apparatus
US12/489,726 Abandoned US20090315979A1 (en) 2008-06-24 2009-06-23 Method and apparatus for processing 3d video image
US12/490,589 Abandoned US20090315977A1 (en) 2008-06-24 2009-06-24 Method and apparatus for processing three dimensional video data
US12/556,699 Expired - Fee Related US8488869B2 (en) 2008-06-24 2009-09-10 Image processing method and apparatus
US12/564,201 Abandoned US20100103168A1 (en) 2008-06-24 2009-09-22 Methods and apparatuses for processing and displaying image


Country Status (7)

Country Link
US (6) US20090315884A1 (en)
EP (4) EP2292019A4 (en)
JP (4) JP2011525743A (en)
KR (9) KR20100002032A (en)
CN (4) CN102077600A (en)
MY (1) MY159672A (en)
WO (4) WO2009157668A2 (en)


Families Citing this family (201)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
KR101588877B1 (en) 2008-05-20 2016-01-26 펠리칸 이매징 코포레이션 Capturing and processing of images using monolithic camera array with heterogeneous imagers
US8866920B2 (en) 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US20100045779A1 (en) * 2008-08-20 2010-02-25 Samsung Electronics Co., Ltd. Three-dimensional video apparatus and method of providing on screen display applied thereto
JP2010062695A (en) * 2008-09-02 2010-03-18 Sony Corp Image processing apparatus, image processing method, and program
JP2010088092A (en) * 2008-09-02 2010-04-15 Panasonic Corp Three-dimensional video transmission system, video display device and video output device
US9264690B2 (en) * 2008-09-07 2016-02-16 Dolby Laboratories Licensing Corporation Conversion of interleaved data sets, including chroma correction and/or correction of checkerboard interleaved formatted 3D images
KR20110097879A (en) * 2008-11-24 2011-08-31 코닌클리케 필립스 일렉트로닉스 엔.브이. Combining 3d video and auxiliary data
US8599242B2 (en) * 2008-12-02 2013-12-03 Lg Electronics Inc. Method for displaying 3D caption and 3D display apparatus for implementing the same
EP2380357B2 (en) * 2008-12-19 2020-03-18 Koninklijke Philips N.V. Method and device for overlaying 3d graphics over 3d video
CN102576412B (en) * 2009-01-13 2014-11-05 华为技术有限公司 Method and system for image processing to classify an object in an image
KR20100112940A (en) * 2009-04-10 2010-10-20 엘지전자 주식회사 A method for processing data and a receiving system
TW201119353A (en) 2009-06-24 2011-06-01 Dolby Lab Licensing Corp Perceptual depth placement for 3D objects
CN102498720B (en) 2009-06-24 2015-09-02 杜比实验室特许公司 The method of captions and/or figure lamination is embedded in 3D or multi-view video data
US9479766B2 (en) * 2009-07-10 2016-10-25 Dolby Laboratories Licensing Corporation Modifying images for a 3-dimensional display mode
JP2011029849A (en) * 2009-07-23 2011-02-10 Sony Corp Receiving device, communication system, method of combining caption with stereoscopic image, program, and data structure
RU2554465C2 (en) * 2009-07-27 2015-06-27 Koninklijke Philips Electronics N.V. Combination of 3d video and auxiliary data
KR101056281B1 (en) * 2009-08-03 2011-08-11 Samsung Mobile Display Co., Ltd. Organic electroluminescent display and driving method thereof
KR20110013693A (en) * 2009-08-03 2011-02-10 Samsung Mobile Display Co., Ltd. Organic light emitting display and driving method thereof
JP5444955B2 (en) 2009-08-31 2014-03-19 Sony Corporation Stereoscopic image display system, parallax conversion device, parallax conversion method, and program
US8614737B2 (en) * 2009-09-11 2013-12-24 Disney Enterprises, Inc. System and method for three-dimensional video capture workflow for dynamic rendering
US20110063298A1 (en) * 2009-09-15 2011-03-17 Samir Hulyalkar Method and system for rendering 3d graphics based on 3d display capabilities
US8988495B2 (en) * 2009-11-03 2015-03-24 LG Electronics Inc. Image display apparatus, method for controlling the image display apparatus, and image display system
JP2011109398A (en) * 2009-11-17 2011-06-02 Sony Corp Image transmission method, image receiving method, image transmission device, image receiving device, and image transmission system
EP2502115A4 (en) 2009-11-20 2013-11-06 Pelican Imaging Corp Capturing and processing of images using monolithic camera array with heterogeneous imagers
JP5502436B2 (en) * 2009-11-27 2014-05-28 Panasonic Corporation Video signal processing device
US20110134217A1 (en) * 2009-12-08 2011-06-09 Darren Neuman Method and system for scaling 3d video
TWI491243B (en) * 2009-12-21 2015-07-01 Chunghwa Picture Tubes Ltd Image processing method
JP2011139261A (en) * 2009-12-28 2011-07-14 Sony Corp Image processing device, image processing method, and program
JP2013517677A (en) * 2010-01-13 2013-05-16 Thomson Licensing System and method for compositing 3D text with 3D content
JPWO2011086653A1 (en) * 2010-01-14 2013-05-16 Panasonic Corporation Video output device, video display system
CN102714747A (en) * 2010-01-21 2012-10-03 通用仪表公司 Stereoscopic video graphics overlay
KR101801017B1 (en) * 2010-02-09 2017-11-24 Koninklijke Philips N.V. 3d video format detection
US9025933B2 (en) * 2010-02-12 2015-05-05 Sony Corporation Information processing device, information processing method, playback device, playback method, program and recording medium
JP2011166666A (en) * 2010-02-15 2011-08-25 Sony Corp Image processor, image processing method, and program
KR101445777B1 (en) * 2010-02-19 2014-11-04 Samsung Electronics Co., Ltd. Reproducing apparatus and control method thereof
JP5873813B2 (en) * 2010-02-19 2016-03-01 Thomson Licensing Stereo logo insertion
WO2011105992A1 (en) * 2010-02-24 2011-09-01 Thomson Licensing Subtitling for stereoscopic images
KR20110098420A (en) * 2010-02-26 2011-09-01 Samsung Electronics Co., Ltd. Display device and driving method thereof
US20110216083A1 (en) * 2010-03-03 2011-09-08 Vizio, Inc. System, method and apparatus for controlling brightness of a device
EP2543191B1 (en) * 2010-03-05 2020-05-27 Google Technology Holdings LLC Method and apparatus for converting two-dimensional video content for insertion into three-dimensional video content
US9426441B2 (en) * 2010-03-08 2016-08-23 Dolby Laboratories Licensing Corporation Methods for carrying and transmitting 3D z-norm attributes in digital TV closed captioning
US8830300B2 (en) * 2010-03-11 2014-09-09 Dolby Laboratories Licensing Corporation Multiscalar stereo video format conversion
US8878913B2 (en) * 2010-03-12 2014-11-04 Sony Corporation Extended command stream for closed caption disparity
JP2011217361A (en) * 2010-03-18 2011-10-27 Panasonic Corp Device and method of reproducing stereoscopic image and integrated circuit
JP5390016B2 (en) * 2010-03-24 2014-01-15 Panasonic Corporation Video processing device
KR20110107151A (en) * 2010-03-24 2011-09-30 Samsung Electronics Co., Ltd. Method and apparatus for processing 3d image in mobile terminal
JP5526929B2 (en) * 2010-03-30 2014-06-18 Sony Corporation Image processing apparatus, image processing method, and program
EP2559006A4 (en) * 2010-04-12 2015-10-28 Fortem Solutions Inc Camera projection meshes
WO2011129631A2 (en) * 2010-04-14 2011-10-20 Samsung Electronics Co., Ltd. Method and apparatus for generating a broadcast bit stream for digital broadcasting with captions, and method and apparatus for receiving a broadcast bit stream for digital broadcasting with captions
US9237366B2 (en) * 2010-04-16 2016-01-12 Google Technology Holdings LLC Method and apparatus for distribution of 3D television program materials
US20110255003A1 (en) * 2010-04-16 2011-10-20 The Directv Group, Inc. Method and apparatus for presenting on-screen graphics in a frame-compatible 3d format
KR101697184B1 (en) 2010-04-20 2017-01-17 Samsung Electronics Co., Ltd. Apparatus and method for generating mesh, and apparatus and method for processing image
US9414042B2 (en) * 2010-05-05 2016-08-09 Google Technology Holdings LLC Program guide graphics and video in window for 3DTV
KR20120119927A (en) * 2010-05-11 2012-11-01 Samsung Electronics Co., Ltd. 3-dimension glasses and system for wireless power transmission
EP2569935B1 (en) 2010-05-12 2016-12-28 Pelican Imaging Corporation Architectures for imager arrays and array cameras
KR101082234B1 (en) * 2010-05-13 2011-11-09 Samsung Mobile Display Co., Ltd. Organic light emitting display device and driving method thereof
JP2011249895A (en) * 2010-05-24 2011-12-08 Panasonic Corp Signal processing system and signal processing apparatus
US20110292038A1 (en) * 2010-05-27 2011-12-01 Sony Computer Entertainment America, LLC 3d video conversion
KR101699875B1 (en) * 2010-06-03 2017-01-25 LG Display Co., Ltd. Apparatus and method for three-dimension liquid crystal display device
US9030536B2 (en) 2010-06-04 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
JP5682149B2 (en) * 2010-06-10 2015-03-11 Sony Corporation Stereo image data transmitting apparatus, stereo image data transmitting method, stereo image data receiving apparatus, and stereo image data receiving method
US8402502B2 (en) * 2010-06-16 2013-03-19 At&T Intellectual Property I, L.P. Method and apparatus for presenting media content
US9053562B1 (en) 2010-06-24 2015-06-09 Gregory S. Rabin Two dimensional to three dimensional moving image converter
US8640182B2 (en) 2010-06-30 2014-01-28 At&T Intellectual Property I, L.P. Method for detecting a viewing apparatus
US9787974B2 (en) 2010-06-30 2017-10-10 At&T Intellectual Property I, L.P. Method and apparatus for delivering media content
US8593574B2 (en) 2010-06-30 2013-11-26 At&T Intellectual Property I, L.P. Apparatus and method for providing dimensional media content based on detected display capability
KR101645404B1 (en) 2010-07-06 2016-08-04 Samsung Display Co., Ltd. Organic Light Emitting Display
US8918831B2 (en) * 2010-07-06 2014-12-23 At&T Intellectual Property I, Lp Method and apparatus for managing a presentation of media content
JP5609336B2 (en) 2010-07-07 2014-10-22 Sony Corporation Image data transmitting apparatus, image data transmitting method, image data receiving apparatus, image data receiving method, and image data transmitting / receiving system
KR101279660B1 (en) * 2010-07-07 2013-06-27 LG Display Co., Ltd. 3d image display device and driving method thereof
US9049426B2 (en) 2010-07-07 2015-06-02 At&T Intellectual Property I, Lp Apparatus and method for distributing three dimensional media content
US8848038B2 (en) * 2010-07-09 2014-09-30 Lg Electronics Inc. Method and device for converting 3D images
US9032470B2 (en) 2010-07-20 2015-05-12 At&T Intellectual Property I, Lp Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9560406B2 (en) 2010-07-20 2017-01-31 At&T Intellectual Property I, L.P. Method and apparatus for adapting a presentation of media content
US9232274B2 (en) 2010-07-20 2016-01-05 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
IT1401367B1 (en) * 2010-07-28 2013-07-18 Sisvel Technology Srl Method for combining reference images with a three-dimensional content
US8994716B2 (en) 2010-08-02 2015-03-31 At&T Intellectual Property I, Lp Apparatus and method for providing media content
WO2012017687A1 (en) * 2010-08-05 2012-02-09 Panasonic Corporation Image reproduction device
KR101674688B1 (en) * 2010-08-12 2016-11-09 LG Electronics Inc. A method for displaying a stereoscopic image and stereoscopic image playing device
JP2012044625A (en) * 2010-08-23 2012-03-01 Sony Corp Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device and stereoscopic image data reception method
US8438502B2 (en) 2010-08-25 2013-05-07 At&T Intellectual Property I, L.P. Apparatus for controlling three-dimensional images
KR101218815B1 (en) * 2010-08-26 2013-01-21 주식회사 티스마트 3D user interface processing method and set-top box using the same
US8994792B2 (en) * 2010-08-27 2015-03-31 Broadcom Corporation Method and system for creating a 3D video from a monoscopic 2D video and corresponding depth information
JP5058316B2 (en) * 2010-09-03 2012-10-24 Toshiba Corporation Electronic device, image processing method, and image processing program
EP2426931A1 (en) * 2010-09-06 2012-03-07 Advanced Digital Broadcast S.A. A method and a system for determining a video frame type
WO2012031406A1 (en) * 2010-09-10 2012-03-15 青岛海信信芯科技有限公司 Display method and equipment for 3d tv interface
WO2012044272A1 (en) * 2010-09-29 2012-04-05 Thomson Licensing Automatically switching between three dimensional and two dimensional contents for display
JP2012094111A (en) * 2010-09-29 2012-05-17 Sony Corp Image processing device, image processing method and program
US8941724B2 (en) * 2010-10-01 2015-01-27 Hitachi Maxell Ltd. Receiver
US8947511B2 (en) 2010-10-01 2015-02-03 At&T Intellectual Property I, L.P. Apparatus and method for presenting three-dimensional media content
JP5543892B2 (en) * 2010-10-01 2014-07-09 Hitachi Consumer Electronics Co., Ltd. Reproduction device, reproduction method, display device, and display method
TWI420151B (en) * 2010-10-07 2013-12-21 Innolux Corp Display method
KR101232086B1 (en) * 2010-10-08 2013-02-08 LG Display Co., Ltd. Liquid crystal display and local dimming control method thereof
US20120092327A1 (en) * 2010-10-14 2012-04-19 Sony Corporation Overlaying graphical assets onto viewing plane of 3d glasses per metadata accompanying 3d image
JP5550520B2 (en) * 2010-10-20 2014-07-16 Hitachi Consumer Electronics Co., Ltd. Playback apparatus and playback method
KR20120047055A (en) * 2010-11-03 2012-05-11 Samsung Electronics Co., Ltd. Display apparatus and method for providing graphic image
CN102469319A (en) * 2010-11-10 2012-05-23 Konka Group Co., Ltd. Three-dimensional menu generation method and three-dimensional display device
JP5789960B2 (en) * 2010-11-18 2015-10-07 Seiko Epson Corporation Display device, display device control method, and program
JP5786315B2 (en) * 2010-11-24 2015-09-30 Seiko Epson Corporation Display device, display device control method, and program
CN101980545B (en) * 2010-11-29 2012-08-01 深圳市九洲电器有限公司 Method for automatically detecting 3DTV video program format
CN101984671B (en) * 2010-11-29 2013-04-17 深圳市九洲电器有限公司 Method for synthesizing video images and interface graphics by 3DTV receiving system
US8878950B2 (en) 2010-12-14 2014-11-04 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using super-resolution processes
JP2012129845A (en) * 2010-12-16 2012-07-05 Jvc Kenwood Corp Image processing device
JP5611807B2 (en) * 2010-12-27 2014-10-22 NEC Personal Computers, Ltd. Video display device
US8600151B2 (en) 2011-01-03 2013-12-03 Apple Inc. Producing stereoscopic image
KR101814798B1 (en) * 2011-01-26 2018-01-04 Samsung Electronics Co., Ltd. Apparatus for processing three dimension image and method for the same
CN105554551A (en) * 2011-03-02 2016-05-04 Huawei Technologies Co., Ltd. Method and device for acquiring three-dimensional (3D) format description information
CN102157012B (en) * 2011-03-23 2012-11-28 深圳超多维光电子有限公司 Method for three-dimensionally rendering a scene, graphic image processing device, equipment and system
WO2012145191A1 (en) 2011-04-15 2012-10-26 Dolby Laboratories Licensing Corporation Systems and methods for rendering 3d images independent of display size and viewing distance
KR20120119173A (en) * 2011-04-20 2012-10-30 Samsung Electronics Co., Ltd. 3d image processing apparatus and method for adjusting three-dimensional effect thereof
JP2012231254A (en) * 2011-04-25 2012-11-22 Toshiba Corp Stereoscopic image generating apparatus and stereoscopic image generating method
JP2014519741A (en) 2011-05-11 2014-08-14 Pelican Imaging Corporation System and method for transmitting and receiving array camera image data
US9602766B2 (en) 2011-06-24 2017-03-21 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US8947497B2 (en) 2011-06-24 2015-02-03 At&T Intellectual Property I, Lp Apparatus and method for managing telepresence sessions
US9445046B2 (en) 2011-06-24 2016-09-13 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
CN102231829B (en) * 2011-06-27 2014-12-17 深圳超多维光电子有限公司 Display format identification method and device for video files, and video player
US9294752B2 (en) 2011-07-13 2016-03-22 Google Technology Holdings LLC Dual mode user interface system and method for 3D video
US8587635B2 (en) 2011-07-15 2013-11-19 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
US20130235155A1 (en) * 2011-08-18 2013-09-12 Beijing Goland Tech Co., Ltd. Method of converting 2d into 3d based on image motion information
CN103002297A (en) * 2011-09-16 2013-03-27 Novatek Microelectronics Corp. Method and device for generating dynamic depth values
WO2013043761A1 (en) 2011-09-19 2013-03-28 Pelican Imaging Corporation Determining depth from multiple views of a scene that include aliasing using hypothesized fusion
US8952996B2 (en) * 2011-09-27 2015-02-10 Delta Electronics, Inc. Image display system
WO2013049699A1 (en) 2011-09-28 2013-04-04 Pelican Imaging Corporation Systems and methods for encoding and decoding light field image files
US8813109B2 (en) 2011-10-21 2014-08-19 The Nielsen Company (Us), Llc Methods and apparatus to identify exposure to 3D media presentations
US8687470B2 (en) 2011-10-24 2014-04-01 Lsi Corporation Optical disk playback device with three-dimensional playback functionality
JP5289538B2 (en) * 2011-11-11 2013-09-11 Toshiba Corporation Electronic device, display control method and program
CN102413350B (en) * 2011-11-30 2014-04-16 Sichuan Changhong Electric Co., Ltd. Method for processing Blu-ray 3D (three-dimensional) video
FR2983673A1 (en) * 2011-12-02 2013-06-07 Binocle Correction method for alternate projection of stereoscopic images
US8713590B2 (en) 2012-02-21 2014-04-29 The Nielsen Company (Us), Llc Methods and apparatus to identify exposure to 3D media presentations
EP2817955B1 (en) 2012-02-21 2018-04-11 FotoNation Cayman Limited Systems and methods for the manipulation of captured light field image data
US8479226B1 (en) * 2012-02-21 2013-07-02 The Nielsen Company (Us), Llc Methods and apparatus to identify exposure to 3D media presentations
US10445398B2 (en) * 2012-03-01 2019-10-15 Sony Corporation Asset management during production of media
EP2836657B1 (en) 2012-04-10 2017-12-06 Dirtt Environmental Solutions, Ltd. Tamper evident wall cladding system
US9100635B2 (en) 2012-06-28 2015-08-04 Pelican Imaging Corporation Systems and methods for detecting defective camera arrays and optic arrays
US20140002674A1 (en) 2012-06-30 2014-01-02 Pelican Imaging Corporation Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors
SG11201500910RA (en) 2012-08-21 2015-03-30 Pelican Imaging Corp Systems and methods for parallax detection and correction in images captured using array cameras
US20140055632A1 (en) 2012-08-23 2014-02-27 Pelican Imaging Corporation Feature based high resolution motion estimation from low resolution images captured using an array source
KR20140039649A (en) 2012-09-24 2014-04-02 Samsung Electronics Co., Ltd. Multi view image generating method and multi view image display apparatus
KR20140049834A (en) * 2012-10-18 2014-04-28 Samsung Electronics Co., Ltd. Broadcast receiving apparatus and method of controlling the same, and user terminal device and method of providing the screen
US9143711B2 (en) 2012-11-13 2015-09-22 Pelican Imaging Corporation Systems and methods for array camera focal plane control
KR102394716B1 (en) * 2012-11-27 2022-05-06 Dolby Laboratories Licensing Corporation Method for encoding and decoding image using depth information, and device and image system using same
KR101430985B1 (en) * 2013-02-20 2014-09-18 주식회사 카몬 System and method for providing multi-dimensional content
US9462164B2 (en) 2013-02-21 2016-10-04 Pelican Imaging Corporation Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information
WO2014133974A1 (en) 2013-02-24 2014-09-04 Pelican Imaging Corporation Thin form computational and modular array cameras
WO2014138697A1 (en) 2013-03-08 2014-09-12 Pelican Imaging Corporation Systems and methods for high dynamic range imaging using array cameras
US8866912B2 (en) 2013-03-10 2014-10-21 Pelican Imaging Corporation System and methods for calibration of an array camera using a single captured image
WO2014164550A2 (en) 2013-03-13 2014-10-09 Pelican Imaging Corporation System and methods for calibration of an array camera
US9888194B2 (en) 2013-03-13 2018-02-06 Fotonation Cayman Limited Array camera architecture implementing quantum film image sensors
US9578259B2 (en) 2013-03-14 2017-02-21 Fotonation Cayman Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US9992021B1 (en) 2013-03-14 2018-06-05 GoTenna, Inc. System and method for private and point-to-point communication between computing devices
US9100586B2 (en) 2013-03-14 2015-08-04 Pelican Imaging Corporation Systems and methods for photometric normalization in array cameras
US9445003B1 (en) 2013-03-15 2016-09-13 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
WO2014145856A1 (en) 2013-03-15 2014-09-18 Pelican Imaging Corporation Systems and methods for stereo imaging with camera arrays
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
CN104079941B (en) * 2013-03-27 2017-08-25 ZTE Corporation Depth information decoding method and device, and video processing playback equipment
CN104469338B (en) * 2013-09-25 2016-08-17 Lenovo (Beijing) Co., Ltd. Control method and device
US9898856B2 (en) 2013-09-27 2018-02-20 Fotonation Cayman Limited Systems and methods for depth-assisted perspective distortion correction
US10491916B2 (en) * 2013-10-01 2019-11-26 Advanced Micro Devices, Inc. Exploiting camera depth information for video encoding
JP2015119464A (en) * 2013-11-12 2015-06-25 Seiko Epson Corporation Display device and control method of the same
WO2015074078A1 (en) 2013-11-18 2015-05-21 Pelican Imaging Corporation Estimating depth from projected texture using camera arrays
US9426361B2 (en) 2013-11-26 2016-08-23 Pelican Imaging Corporation Array camera configurations incorporating multiple constituent array cameras
WO2015134996A1 (en) 2014-03-07 2015-09-11 Pelican Imaging Corporation System and methods for depth regularization and semiautomatic interactive matting using rgb-d images
CN104143308B (en) 2014-07-24 2016-09-07 BOE Technology Group Co., Ltd. Three-dimensional image display method and device
US10228751B2 (en) 2014-08-06 2019-03-12 Apple Inc. Low power mode
US9647489B2 (en) 2014-08-26 2017-05-09 Apple Inc. Brownout avoidance
WO2016054089A1 (en) 2014-09-29 2016-04-07 Pelican Imaging Corporation Systems and methods for dynamic calibration of array cameras
US10231033B1 (en) * 2014-09-30 2019-03-12 Apple Inc. Synchronizing out-of-band content with a media stream
US10708391B1 (en) 2014-09-30 2020-07-07 Apple Inc. Delivery of apps in a media stream
CN105095895B (en) * 2015-04-23 2018-09-25 GRG Banking Equipment Co., Ltd. Self-correcting recognition method for a valuable-document identification device
CN105376546A (en) * 2015-11-09 2016-03-02 中科创达软件股份有限公司 2D-to-3D method, device and mobile terminal
CN105472374A (en) * 2015-11-19 2016-04-06 广州华多网络科技有限公司 3D live video realization method, apparatus, and system
US20170150138A1 (en) * 2015-11-25 2017-05-25 Atheer, Inc. Method and apparatus for selective mono/stereo visual display
US20170150137A1 (en) * 2015-11-25 2017-05-25 Atheer, Inc. Method and apparatus for selective mono/stereo visual display
CN105872519B (en) * 2016-04-13 2018-03-27 万云数码媒体有限公司 2D-plus-depth 3D image lateral storage method based on RGB compression
US10433025B2 (en) * 2016-05-10 2019-10-01 Jaunt Inc. Virtual reality resource scheduling of process in a cloud-based virtual reality processing system
CN106101681A (en) * 2016-06-21 2016-11-09 Qingdao Hisense Electric Co., Ltd. Three-dimensional image display processing method, signal input device and television terminal
CN106982367A (en) * 2017-03-31 2017-07-25 Lenovo (Beijing) Co., Ltd. Video transmission method and device
US10038500B1 (en) * 2017-05-11 2018-07-31 Qualcomm Incorporated Visible light communication
US10735707B2 (en) * 2017-08-15 2020-08-04 International Business Machines Corporation Generating three-dimensional imagery
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
CN107589989A (en) 2017-09-14 2018-01-16 MStar Semiconductor, Inc. Display device based on Android platform and image display method thereof
US11363133B1 (en) 2017-12-20 2022-06-14 Apple Inc. Battery health-based power management
US10817307B1 (en) 2017-12-20 2020-10-27 Apple Inc. API behavior modification based on power source health
EP3644604A1 (en) * 2018-10-23 2020-04-29 Koninklijke Philips N.V. Image generating apparatus and method therefor
CN109257585B (en) * 2018-10-25 2021-04-06 BOE Technology Group Co., Ltd. Brightness correction device and method, display device, display system and method
CN109274949A (en) * 2018-10-30 2019-01-25 BOE Technology Group Co., Ltd. Video image processing method and device, and display device
CN112188181B (en) * 2019-07-02 2023-07-04 Coretronic Corporation Image display device, stereoscopic image processing circuit and synchronization signal correction method thereof
KR102646521B1 (en) 2019-09-17 2024-03-21 Intrinsic Innovation LLC Surface modeling system and method using polarization cue
MX2022004163A (en) 2019-10-07 2022-07-19 Boston Polarimetrics Inc Systems and methods for surface normals sensing with polarization.
KR20230116068A (en) 2019-11-30 2023-08-03 Boston Polarimetrics, Inc. System and method for segmenting transparent objects using polarization signals
KR102241615B1 (en) * 2020-01-15 2021-04-19 Korea Advanced Institute of Science and Technology (KAIST) Method to identify video titles using metadata in video webpage source code, and apparatuses performing the same
JP7462769B2 (en) 2020-01-29 2024-04-05 Intrinsic Innovation LLC System and method for characterizing an object pose detection and measurement system
KR20220133973A (en) 2020-01-30 2022-10-05 Intrinsic Innovation LLC Systems and methods for synthesizing data to train statistical models for different imaging modalities, including polarized images
WO2021243088A1 (en) 2020-05-27 2021-12-02 Boston Polarimetrics, Inc. Multi-aperture polarization optical systems using beam splitters
CN112004162B (en) * 2020-09-08 2022-06-21 宁波视睿迪光电有限公司 Online 3D content playing system and method
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers

Citations (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4523226A (en) * 1982-01-27 1985-06-11 Stereographics Corporation Stereoscopic television system
US5058992A (en) * 1988-09-07 1991-10-22 Toppan Printing Co., Ltd. Method for producing a display with a diffraction grating pattern and a display produced by the method
US5132812A (en) * 1989-10-16 1992-07-21 Toppan Printing Co., Ltd. Method of manufacturing display having diffraction grating patterns
US5262879A (en) * 1988-07-18 1993-11-16 Dimensional Arts. Inc. Holographic image conversion method for making a controlled holographic grating
US5291317A (en) * 1990-07-12 1994-03-01 Applied Holographics Corporation Holographic diffraction grating patterns and methods for creating the same
US5808664A (en) * 1994-07-14 1998-09-15 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US5986781A (en) * 1996-10-28 1999-11-16 Pacific Holographics, Inc. Apparatus and method for generating diffractive element using liquid crystal display
US20030095177A1 (en) * 2001-11-21 2003-05-22 Kug-Jin Yun 3D stereoscopic/multiview video processing system and its method
US20040008893A1 (en) * 2002-07-10 2004-01-15 Nec Corporation Stereoscopic image encoding and decoding device
US20040036763A1 (en) * 1994-11-14 2004-02-26 Swift David C. Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments
US20040066846A1 (en) * 2002-10-07 2004-04-08 Kugjin Yun Data processing system for stereoscopic 3-dimensional video based on MPEG-4 and method thereof
US20040145655A1 (en) * 2002-12-02 2004-07-29 Seijiro Tomita Stereoscopic video image display apparatus and stereoscopic video signal processing circuit
US20040201888A1 (en) * 2003-04-08 2004-10-14 Shoji Hagita Image pickup device and stereoscopic image generation device
US6839663B1 (en) * 1999-09-30 2005-01-04 Texas Tech University Haptic rendering of volumetric soft-bodies objects
US20050030301A1 (en) * 2001-12-14 2005-02-10 Ocuity Limited Control of optical switching apparatus
US20050046700A1 (en) * 2003-08-25 2005-03-03 Ive Bracke Device and method for performing multiple view imaging by means of a plurality of video processing devices
US20050053276A1 (en) * 2003-07-15 2005-03-10 Stmicroelectronics S.R.I. Method of obtaining a depth map from a digital image
US20050147166A1 (en) * 2003-12-12 2005-07-07 Shojiro Shibata Decoding device, electronic apparatus, computer, decoding method, program, and recording medium
US6968568B1 (en) * 1999-12-20 2005-11-22 International Business Machines Corporation Methods and apparatus of disseminating broadcast information to a handheld device
US20050259959A1 (en) * 2004-05-19 2005-11-24 Kabushiki Kaisha Toshiba Media data play apparatus and system
US20050259147A1 (en) * 2002-07-16 2005-11-24 Nam Jeho Apparatus and method for adapting 2d and 3d stereoscopic video signal
US20060117071A1 (en) * 2004-11-29 2006-06-01 Samsung Electronics Co., Ltd. Recording apparatus including a plurality of data blocks having different sizes, file managing method using the recording apparatus, and printing apparatus including the recording apparatus
US20060288081A1 (en) * 2005-05-26 2006-12-21 Samsung Electronics Co., Ltd. Information storage medium including application for obtaining metadata and apparatus and method of obtaining metadata
US20070046776A1 (en) * 2005-08-29 2007-03-01 Hiroichi Yamaguchi Stereoscopic display device and control method therefor
US20070081587A1 (en) * 2005-09-27 2007-04-12 Raveendran Vijayalakshmi R Content driven transcoder that orchestrates multimedia transcoding using content information
US20070120972A1 (en) * 2005-11-28 2007-05-31 Samsung Electronics Co., Ltd. Apparatus and method for processing 3D video signal
US20070189760A1 (en) * 2006-02-14 2007-08-16 Lg Electronics Inc. Display device for storing various sets of configuration data and method for controlling the same
US20070280546A1 (en) * 2006-05-11 2007-12-06 Jae Do Kwak Mobile communication terminal and method for displaying an image
US20080018731A1 (en) * 2004-03-08 2008-01-24 Kazunari Era Stereoscopic Parameter Embedding Apparatus and Stereoscopic Image Reproducer
US20080198218A1 (en) * 2006-11-03 2008-08-21 Quanta Computer Inc. Stereoscopic image format transformation method applied to display system
US20080285863A1 (en) * 2007-05-14 2008-11-20 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding multi-view image
US7613344B2 (en) * 2003-12-08 2009-11-03 Electronics And Telecommunications Research Institute System and method for encoding and decoding an image using bitstream map and recording medium thereof
US20090315981A1 (en) * 2008-06-24 2009-12-24 Samsung Electronics Co., Ltd. Image processing method and apparatus
US7720857B2 (en) * 2003-08-29 2010-05-18 Sap Ag Method and system for providing an invisible attractor in a predetermined sector, which attracts a subset of entities depending on an entity type
US20100165077A1 (en) * 2005-10-19 2010-07-01 Peng Yin Multi-View Video Coding Using Scalable Video Coding
US20100182403A1 (en) * 2006-09-04 2010-07-22 Enhanced Chip Technology Inc. File format for encoded stereoscopic image/video data
US7826709B2 (en) * 2002-04-12 2010-11-02 Mitsubishi Denki Kabushiki Kaisha Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method
US7893908B2 (en) * 2006-05-11 2011-02-22 Nec Display Solutions, Ltd. Liquid crystal display device and liquid crystal panel drive method
US7953315B2 (en) * 2006-05-22 2011-05-31 Broadcom Corporation Adaptive video processing circuitry and player using sub-frame metadata
US7986283B2 (en) * 2007-01-02 2011-07-26 Samsung Mobile Display Co., Ltd. Multi-dimensional image selectable display device
US8054329B2 (en) * 2005-07-08 2011-11-08 Samsung Electronics Co., Ltd. High resolution 2D-3D switchable autostereoscopic display apparatus
US8077117B2 (en) * 2007-04-17 2011-12-13 Samsung Mobile Display Co., Ltd. Electronic display device and method thereof

Family Cites Families (81)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4667228A (en) * 1983-10-14 1987-05-19 Canon Kabushiki Kaisha Image signal processing apparatus
JPS63116593A (en) * 1986-11-04 1988-05-20 Matsushita Electric Ind Co Ltd Stereoscopic picture reproducing device
JP3081675B2 (en) * 1991-07-24 2000-08-28 Olympus Optical Co., Ltd. Image recording device and image reproducing device
US5740274A (en) * 1991-09-12 1998-04-14 Fuji Photo Film Co., Ltd. Method for recognizing object images and learning method for neural networks
US6011581A (en) * 1992-11-16 2000-01-04 Reveo, Inc. Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments
US6084978A (en) * 1993-12-16 2000-07-04 Eastman Kodak Company Hierarchical storage and display of digital images used in constructing three-dimensional image hard copy
CN1113320C (en) * 1994-02-01 2003-07-02 三洋电机株式会社 Method of converting two-dimensional images into three-dimensional images
US5739844A (en) * 1994-02-04 1998-04-14 Sanyo Electric Co. Ltd. Method of converting two-dimensional image into three-dimensional image
US5684890A (en) * 1994-02-28 1997-11-04 Nec Corporation Three-dimensional reference image segmenting method and apparatus
US6104828A (en) * 1994-03-24 2000-08-15 Kabushiki Kaisha Topcon Ophthalmologic image processor
DE69528946T2 (en) * 1994-09-22 2003-10-02 Sanyo Electric Co Process for converting two-dimensional images into three-dimensional images
JPH09116931A (en) * 1995-10-18 1997-05-02 Sanyo Electric Co Ltd Method for identifying left and right video images for time division stereoscopic video signal
US5917940A (en) * 1996-01-23 1999-06-29 Nec Corporation Three dimensional reference image segmenting method and device and object discrimination system
JPH09322199A (en) * 1996-05-29 1997-12-12 Olympus Optical Co Ltd Stereoscopic video display device
JPH10224822A (en) * 1997-01-31 1998-08-21 Sony Corp Video display method and display device
JPH10313417A (en) * 1997-03-12 1998-11-24 Seiko Epson Corp Digital gamma correction circuit, liquid crystal display device using the same and electronic device
DE19806547C2 (en) * 1997-04-30 2001-01-25 Hewlett Packard Co System and method for generating stereoscopic display signals from a single computer graphics pipeline
JPH11113028A (en) * 1997-09-30 1999-04-23 Toshiba Corp Three-dimension video image display device
EP2252071A3 (en) * 1997-12-05 2017-04-12 Dynamic Digital Depth Research Pty. Ltd. Improved image conversion and encoding techniques
US6850631B1 (en) * 1998-02-20 2005-02-01 Oki Electric Industry Co., Ltd. Photographing device, iris input device and iris image input method
JP4149037B2 (en) * 1998-06-04 2008-09-10 オリンパス株式会社 Video system
US6704042B2 (en) * 1998-12-10 2004-03-09 Canon Kabushiki Kaisha Video processing apparatus, control method therefor, and storage medium
JP2000298246A (en) * 1999-02-12 2000-10-24 Canon Inc Device and method for display, and storage medium
JP2000275575A (en) * 1999-03-24 2000-10-06 Sharp Corp Stereoscopic video display device
KR100334722B1 (en) * 1999-06-05 2002-05-04 강호석 Method and the apparatus for generating stereoscopic image using MPEG data
JP2001012946A (en) * 1999-06-30 2001-01-19 Toshiba Corp Dynamic image processor and processing method
EP1243141B1 (en) * 1999-12-14 2011-10-19 Scientific-Atlanta, LLC System and method for adaptive decoding of a video signal with coordinated resource allocation
US20020009137A1 (en) * 2000-02-01 2002-01-24 Nelson John E. Three-dimensional video broadcasting system
US7215809B2 (en) * 2000-04-04 2007-05-08 Sony Corporation Three-dimensional image producing method and apparatus therefor
JP2001320693A (en) * 2000-05-12 2001-11-16 Sony Corp Service providing device and method, reception terminal and method, service providing system
US6765568B2 (en) * 2000-06-12 2004-07-20 Vrex, Inc. Electronic stereoscopic media delivery system
US6762755B2 (en) * 2000-10-16 2004-07-13 Pixel Science, Inc. Method and apparatus for creating and displaying interactive three dimensional computer images
JP3667620B2 (en) * 2000-10-16 2005-07-06 株式会社アイ・オー・データ機器 Stereo image capturing adapter, stereo image capturing camera, and stereo image processing apparatus
GB0100563D0 (en) * 2001-01-09 2001-02-21 Pace Micro Tech Plc Dynamic adjustment of on-screen displays to cope with different widescreen signalling types
US6678323B2 (en) * 2001-01-24 2004-01-13 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry Through Communications Research Centre Bandwidth reduction for stereoscopic imagery and video signals
KR20020096203A (en) * 2001-06-18 2002-12-31 (주)디지털국영 The Method for Enlarging or Reducing Stereoscopic Images
JP2003157292A (en) * 2001-11-20 2003-05-30 Nec Corp System and method for managing layout of product
US20040218269A1 (en) * 2002-01-14 2004-11-04 Divelbiss Adam W. General purpose stereoscopic 3D format conversion system and method
US7319720B2 (en) * 2002-01-28 2008-01-15 Microsoft Corporation Stereoscopic video
JP2003284099A (en) * 2002-03-22 2003-10-03 Olympus Optical Co Ltd Video information signal recording medium and video display apparatus
US6771274B2 (en) * 2002-03-27 2004-08-03 Sony Corporation Graphics and video integration with alpha and video blending
CA2380105A1 (en) * 2002-04-09 2003-10-09 Nicholas Routhier Process and system for encoding and playback of stereoscopic video sequences
JP4652389B2 (en) * 2002-04-12 2011-03-16 三菱電機株式会社 Metadata processing method
WO2003092303A1 (en) * 2002-04-25 2003-11-06 Sharp Kabushiki Kaisha Multimedia information generation method and multimedia information reproduction device
JP4183499B2 (en) * 2002-12-16 2008-11-19 三洋電機株式会社 Video file processing method and video processing method
JP2004246066A (en) * 2003-02-13 2004-09-02 Fujitsu Ltd Virtual environment creating method
JP2004274125A (en) * 2003-03-05 2004-09-30 Sony Corp Image processing apparatus and method
JP4677175B2 (en) * 2003-03-24 2011-04-27 シャープ株式会社 Image processing apparatus, image pickup system, image display system, image pickup display system, image processing program, and computer-readable recording medium recording image processing program
EP1617684A4 (en) * 2003-04-17 2009-06-03 Sharp Kk 3-dimensional image creation device, 3-dimensional image reproduction device, 3-dimensional image processing device, 3-dimensional image processing program, and recording medium containing the program
KR100556826B1 (en) * 2003-04-17 2006-03-10 한국전자통신연구원 System and Method of Internet Broadcasting for MPEG4 based Stereoscopic Video
JP2004357156A (en) * 2003-05-30 2004-12-16 Sharp Corp Video reception apparatus and video playback apparatus
JP2005026800A (en) * 2003-06-30 2005-01-27 Konica Minolta Photo Imaging Inc Image processing method, imaging apparatus, image processing apparatus, and image recording apparatus
KR100544677B1 (en) * 2003-12-26 2006-01-23 한국전자통신연구원 Apparatus and method for the 3D object tracking using multi-view and depth cameras
KR100543219B1 (en) * 2004-05-24 2006-01-20 한국과학기술연구원 Method for generating haptic vector field and 3d-height map in 2d-image
JP4227076B2 (en) * 2004-05-24 2009-02-18 株式会社東芝 Display device for displaying stereoscopic image and display method for displaying stereoscopic image
KR100708838B1 (en) * 2004-06-30 2007-04-17 삼성에스디아이 주식회사 Stereoscopic display device and driving method thereof
JP2006041811A (en) * 2004-07-26 2006-02-09 Kddi Corp Free visual point picture streaming method
KR20040077596A (en) * 2004-07-28 2004-09-04 손귀연 Stereoscopic Image Display Device Based on Flat Panel Display
WO2006028151A1 (en) * 2004-09-08 2006-03-16 Nippon Telegraph And Telephone Corporation 3d displaying method, device and program
KR100656575B1 (en) 2004-12-31 2006-12-11 광운대학교 산학협력단 Three-dimensional display device
TWI261099B (en) * 2005-02-17 2006-09-01 Au Optronics Corp Backlight modules
KR100828358B1 (en) * 2005-06-14 2008-05-08 삼성전자주식회사 Method and apparatus for converting display mode of video, and computer readable medium thereof
US7404645B2 (en) * 2005-06-20 2008-07-29 Digital Display Innovations, Llc Image and light source modulation for a digital display system
CA2553473A1 (en) * 2005-07-26 2007-01-26 Wa James Tam Generating a depth map from a tw0-dimensional source image for stereoscopic and multiview imaging
KR100780701B1 (en) 2006-03-28 2007-11-30 (주)오픈브이알 Apparatus automatically creating three dimension image and method therefore
KR20070098364A (en) * 2006-03-31 2007-10-05 (주)엔브이엘소프트 Apparatus and method for coding and saving a 3d moving image
US20070294737A1 (en) * 2006-06-16 2007-12-20 Sbc Knowledge Ventures, L.P. Internet Protocol Television (IPTV) stream management within a home viewing network
CA2884702C (en) * 2006-06-23 2018-06-05 Samuel Zhou Methods and systems for converting 2d motion pictures for stereoscopic 3d exhibition
KR100761022B1 (en) * 2006-08-14 2007-09-21 광주과학기술원 Haptic rendering method based on depth image, device therefor, and haptic broadcasting system using them
EP1901474B1 (en) 2006-09-13 2011-11-30 Stmicroelectronics Sa System for synchronizing modules in an integrated circuit in mesochronous clock domains
CN101523924B (en) * 2006-09-28 2011-07-06 皇家飞利浦电子股份有限公司 3D menu display
JP2010510558A (en) * 2006-10-11 2010-04-02 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Creating 3D graphics data
JP4755565B2 (en) * 2006-10-17 2011-08-24 シャープ株式会社 Stereoscopic image processing device
KR101362941B1 (en) * 2006-11-01 2014-02-17 한국전자통신연구원 Method and Apparatus for decoding metadata used for playing stereoscopic contents
JP5008677B2 (en) * 2006-11-29 2012-08-22 パナソニック株式会社 Video / audio device network and signal reproduction method
CA2627999C (en) * 2007-04-03 2011-11-15 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry Through The Communications Research Centre Canada Generation of a depth map from a monoscopic color image for rendering stereoscopic still and video images
US8213711B2 (en) * 2007-04-03 2012-07-03 Her Majesty The Queen In Right Of Canada As Represented By The Minister Of Industry, Through The Communications Research Centre Canada Method and graphical user interface for modifying depth maps
JP4564512B2 (en) 2007-04-16 2010-10-20 富士通株式会社 Display device, display program, and display method
JP4462288B2 (en) * 2007-05-16 2010-05-12 株式会社日立製作所 Video display device and three-dimensional video display device using the same
EP2198625A4 (en) * 2007-10-10 2013-11-06 Korea Electronics Telecomm Metadata structure for storing and playing stereoscopic data, and method for storing stereoscopic content file using this metadata
US8482654B2 (en) * 2008-10-24 2013-07-09 Reald Inc. Stereoscopic image format with depth information

Patent Citations (44)

Publication number Priority date Publication date Assignee Title
US4523226A (en) * 1982-01-27 1985-06-11 Stereographics Corporation Stereoscopic television system
US5262879A (en) * 1988-07-18 1993-11-16 Dimensional Arts. Inc. Holographic image conversion method for making a controlled holographic grating
US5058992A (en) * 1988-09-07 1991-10-22 Toppan Printing Co., Ltd. Method for producing a display with a diffraction grating pattern and a display produced by the method
US5132812A (en) * 1989-10-16 1992-07-21 Toppan Printing Co., Ltd. Method of manufacturing display having diffraction grating patterns
US5291317A (en) * 1990-07-12 1994-03-01 Applied Holographics Corporation Holographic diffraction grating patterns and methods for creating the same
US5808664A (en) * 1994-07-14 1998-09-15 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US20040036763A1 (en) * 1994-11-14 2004-02-26 Swift David C. Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments
US5986781A (en) * 1996-10-28 1999-11-16 Pacific Holographics, Inc. Apparatus and method for generating diffractive element using liquid crystal display
US6839663B1 (en) * 1999-09-30 2005-01-04 Texas Tech University Haptic rendering of volumetric soft-bodies objects
US6968568B1 (en) * 1999-12-20 2005-11-22 International Business Machines Corporation Methods and apparatus of disseminating broadcast information to a handheld device
US20030095177A1 (en) * 2001-11-21 2003-05-22 Kug-Jin Yun 3D stereoscopic/multiview video processing system and its method
US8111758B2 (en) * 2001-11-21 2012-02-07 Electronics And Telecommunications Research Institute 3D stereoscopic/multiview video processing system and its method
US20050030301A1 (en) * 2001-12-14 2005-02-10 Ocuity Limited Control of optical switching apparatus
US7826709B2 (en) * 2002-04-12 2010-11-02 Mitsubishi Denki Kabushiki Kaisha Metadata editing apparatus, metadata reproduction apparatus, metadata delivery apparatus, metadata search apparatus, metadata re-generation condition setting apparatus, metadata delivery method and hint information description method
US20040008893A1 (en) * 2002-07-10 2004-01-15 Nec Corporation Stereoscopic image encoding and decoding device
US20050259147A1 (en) * 2002-07-16 2005-11-24 Nam Jeho Apparatus and method for adapting 2d and 3d stereoscopic video signal
US20040066846A1 (en) * 2002-10-07 2004-04-08 Kugjin Yun Data processing system for stereoscopic 3-dimensional video based on MPEG-4 and method thereof
US20040145655A1 (en) * 2002-12-02 2004-07-29 Seijiro Tomita Stereoscopic video image display apparatus and stereoscopic video signal processing circuit
US20040201888A1 (en) * 2003-04-08 2004-10-14 Shoji Hagita Image pickup device and stereoscopic image generation device
US20050053276A1 (en) * 2003-07-15 2005-03-10 Stmicroelectronics S.R.I. Method of obtaining a depth map from a digital image
US20050046700A1 (en) * 2003-08-25 2005-03-03 Ive Bracke Device and method for performing multiple view imaging by means of a plurality of video processing devices
US7720857B2 (en) * 2003-08-29 2010-05-18 Sap Ag Method and system for providing an invisible attractor in a predetermined sector, which attracts a subset of entities depending on an entity type
US7613344B2 (en) * 2003-12-08 2009-11-03 Electronics And Telecommunications Research Institute System and method for encoding and decoding an image using bitstream map and recording medium thereof
US20050147166A1 (en) * 2003-12-12 2005-07-07 Shojiro Shibata Decoding device, electronic apparatus, computer, decoding method, program, and recording medium
US20080018731A1 (en) * 2004-03-08 2008-01-24 Kazunari Era Steroscopic Parameter Embedding Apparatus and Steroscopic Image Reproducer
US20050259959A1 (en) * 2004-05-19 2005-11-24 Kabushiki Kaisha Toshiba Media data play apparatus and system
US20060117071A1 (en) * 2004-11-29 2006-06-01 Samsung Electronics Co., Ltd. Recording apparatus including a plurality of data blocks having different sizes, file managing method using the recording apparatus, and printing apparatus including the recording apparatus
US20060288081A1 (en) * 2005-05-26 2006-12-21 Samsung Electronics Co., Ltd. Information storage medium including application for obtaining metadata and apparatus and method of obtaining metadata
US8054329B2 (en) * 2005-07-08 2011-11-08 Samsung Electronics Co., Ltd. High resolution 2D-3D switchable autostereoscopic display apparatus
US20070046776A1 (en) * 2005-08-29 2007-03-01 Hiroichi Yamaguchi Stereoscopic display device and control method therefor
US20070081587A1 (en) * 2005-09-27 2007-04-12 Raveendran Vijayalakshmi R Content driven transcoder that orchestrates multimedia transcoding using content information
US20100165077A1 (en) * 2005-10-19 2010-07-01 Peng Yin Multi-View Video Coding Using Scalable Video Coding
US20070120972A1 (en) * 2005-11-28 2007-05-31 Samsung Electronics Co., Ltd. Apparatus and method for processing 3D video signal
US20070189760A1 (en) * 2006-02-14 2007-08-16 Lg Electronics Inc. Display device for storing various sets of configuration data and method for controlling the same
US7840132B2 (en) * 2006-02-14 2010-11-23 Lg Electronics Inc. Display device for storing various sets of configuration data and method for controlling the same
US20070280546A1 (en) * 2006-05-11 2007-12-06 Jae Do Kwak Mobile communication terminal and method for displaying an image
US7893908B2 (en) * 2006-05-11 2011-02-22 Nec Display Solutions, Ltd. Liquid crystal display device and liquid crystal panel drive method
US7953315B2 (en) * 2006-05-22 2011-05-31 Broadcom Corporation Adaptive video processing circuitry and player using sub-frame metadata
US20100182403A1 (en) * 2006-09-04 2010-07-22 Enhanced Chip Technology Inc. File format for encoded stereoscopic image/video data
US20080198218A1 (en) * 2006-11-03 2008-08-21 Quanta Computer Inc. Stereoscopic image format transformation method applied to display system
US7986283B2 (en) * 2007-01-02 2011-07-26 Samsung Mobile Display Co., Ltd. Multi-dimensional image selectable display device
US8077117B2 (en) * 2007-04-17 2011-12-13 Samsung Mobile Display Co., Ltd. Electronic display device and method thereof
US20080285863A1 (en) * 2007-05-14 2008-11-20 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding multi-view image
US20090315981A1 (en) * 2008-06-24 2009-12-24 Samsung Electronics Co., Ltd. Image processing method and apparatus

Non-Patent Citations (1)

Title
Woods, Andrew; "3D on your PC," http://www.andrewwoods3d.com/3D-PC/, (June 11, 2005) (accessed March 2, 2012) *

Cited By (12)

Publication number Priority date Publication date Assignee Title
US8194986B2 (en) 2008-08-19 2012-06-05 Digimarc Corporation Methods and systems for content processing
US8503791B2 (en) 2008-08-19 2013-08-06 Digimarc Corporation Methods and systems for content processing
US8520979B2 (en) 2008-08-19 2013-08-27 Digimarc Corporation Methods and systems for content processing
US8606021B2 (en) 2008-08-19 2013-12-10 Digimarc Corporation Methods and systems for content processing
US9104915B2 (en) 2008-08-19 2015-08-11 Digimarc Corporation Methods and systems for content processing
US20110228058A1 (en) * 2010-03-17 2011-09-22 Yasunari Hatasawa Reproducing device, reproduction control method and program
US8730310B2 (en) * 2010-03-17 2014-05-20 Sony Corporation Reproducing device, reproduction control method and program
EP2515293A3 (en) * 2011-04-19 2016-10-05 Lg Electronics Inc. Image display apparatus and method for operating the same
US20150109411A1 (en) * 2012-04-26 2015-04-23 Electronics And Telecommunications Research Institute Image playback apparatus for 3dtv and method performed by the apparatus
CN103543953A (en) * 2013-11-08 2014-01-29 深圳市汉普电子技术开发有限公司 Method for playing 3D film source without 3D identification and touch device
US11770513B1 (en) * 2022-07-13 2023-09-26 Rovi Guides, Inc. Systems and methods for reducing a number of focal planes used to display three-dimensional objects
US20240022699A1 (en) * 2022-07-13 2024-01-18 Rovi Guides, Inc. Systems and methods for reducing a number of focal planes used to display three-dimensional objects

Also Published As

Publication number Publication date
CN102067613B (en) 2016-04-13
WO2009157708A3 (en) 2010-04-15
WO2009157668A2 (en) 2009-12-30
JP2011526103A (en) 2011-09-29
CN102077600A (en) 2011-05-25
US20100104219A1 (en) 2010-04-29
MY159672A (en) 2017-01-13
CN102067615A (en) 2011-05-18
KR20100002038A (en) 2010-01-06
JP2011525746A (en) 2011-09-22
US20090315977A1 (en) 2009-12-24
US8488869B2 (en) 2013-07-16
EP2279625A2 (en) 2011-02-02
CN102067614B (en) 2014-06-11
WO2009157701A3 (en) 2010-04-15
EP2289248A2 (en) 2011-03-02
EP2289247A4 (en) 2014-05-28
US20090317061A1 (en) 2009-12-24
JP2011525745A (en) 2011-09-22
KR20100002032A (en) 2010-01-06
KR101539935B1 (en) 2015-07-28
KR20100002031A (en) 2010-01-06
CN102067614A (en) 2011-05-18
US20090315979A1 (en) 2009-12-24
JP5547725B2 (en) 2014-07-16
EP2289248A4 (en) 2014-07-02
JP2011525743A (en) 2011-09-22
KR20100002037A (en) 2010-01-06
EP2292019A4 (en) 2014-04-30
WO2009157701A2 (en) 2009-12-30
WO2009157714A2 (en) 2009-12-30
CN102067613A (en) 2011-05-18
KR20100002035A (en) 2010-01-06
KR20100002036A (en) 2010-01-06
KR20100002049A (en) 2010-01-06
CN102067615B (en) 2015-02-25
KR20100002033A (en) 2010-01-06
WO2009157708A2 (en) 2009-12-30
EP2289247A2 (en) 2011-03-02
KR20100002048A (en) 2010-01-06
WO2009157668A3 (en) 2010-03-25
WO2009157714A3 (en) 2010-03-25
EP2279625A4 (en) 2013-07-03
US20100103168A1 (en) 2010-04-29
EP2292019A2 (en) 2011-03-09

Similar Documents

Publication Publication Date Title
US20090315884A1 (en) Method and apparatus for outputting and displaying image data
US10158841B2 (en) Method and device for overlaying 3D graphics over 3D video
US11557015B2 (en) System and method of data transfer in-band in video via optically encoded images
JP6027071B2 (en) 3D video player with flexible output
JP5022443B2 (en) Method of decoding metadata used for playback of stereoscopic video content
US20080303832A1 (en) Method of generating two-dimensional/three-dimensional convertible stereoscopic image bitstream and method and apparatus for displaying the same
US20190082139A1 (en) Image playback device, display device, and transmission device
CN101938663B (en) Recording/reproducing apparatus
US20080062069A1 (en) Personal Video Display Device
US20100103165A1 (en) Image decoding method, image outputting method, and image decoding and outputting apparatuses
KR100744594B1 (en) Content reproduce system, reproduce device, reproduce method, and distribution server
JP4737213B2 (en) Information processing device
US8542241B2 (en) Stereoscopic content auto-judging mechanism
CN105260131A (en) Content treatment device and content treatment method
RU2522040C2 (en) Reproducing method and apparatus, data structure, recording medium, recording apparatus, recording method and programme
JP2010192971A (en) Selected-area encoded video data distributing method, encoded video data decoding method, distribution server, reproduction terminal, program, and recording medium
US20110115881A1 (en) Data structure, reproducing apparatus, reproducing method, and program
KR20080108882A (en) Method for generating the 2d/3d display-convertible stereoscopic image bitstream, and method and apparatus for displaying the 2d/3d display-convertible stereoscopic image bitstream
KR100687207B1 (en) Image transmitting apparatus and image receiving apparatus
KR20110035810A (en) Method and apparatus for transmitting uncompressed three-dimensional video data via digital data interface, method and apparatus for receiving uncompressed three-dimensional video data via digital data interface
JP2013098878A (en) Image converter, image display, television receiver, image conversion method, and computer program
JP2011114641A (en) Recording device, reproducer, method for recording and reproducing stereoscopic video, recording method and reproducing method

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, DAE-JONG;CHUNG, HYUN-KWON;JUNG, KIL-SOO;REEL/FRAME:022800/0732

Effective date: 20090406

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION