US20070269136A1 - Method and device for generating 3d images - Google Patents

Info

Publication number
US20070269136A1
US20070269136A1
Authority
US
United States
Prior art keywords
image
images
sequence
similarity
image sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/835,104
Inventor
Rolf-Dieter Naske
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NewSight Corp
Original Assignee
NewSight Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NewSight Corp filed Critical NewSight Corp
Priority to US11/835,104 priority Critical patent/US20070269136A1/en
Publication of US20070269136A1 publication Critical patent/US20070269136A1/en
Assigned to PRENTICE CAPITAL MANAGEMENT, LP reassignment PRENTICE CAPITAL MANAGEMENT, LP SECURITY AGREEMENT Assignors: NEWSIGHT CORPORATION

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/261Image signal generators with monoscopic-to-stereoscopic image conversion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/356Image reproducers having separate monoscopic and stereoscopic modes
    • H04N13/359Switching between monoscopic and stereoscopic modes

Abstract

A method and a device for generating 3D images, according to which an image of a second sequence of images is generated in addition to an image of a first sequence of 2D images, at an interval that can be determined via an approximation variable (α). A measure of similarity (dk) between successive images of the first sequence is determined and compared with threshold values (δ0, δ1, δ2), and the approximation variable (α) is modified depending thereon in such a manner that the stereo base width does not become unnaturally large. A phase analyzer (12) is used to determine a prevailing direction of movement in successive images of the first sequence, and a phase selector (16) is used to allocate the images of the first and second sequences to a left-hand or right-hand viewing channel depending on that direction of movement.

Description

  • The invention relates to a method and a device for the generation of 3-D images.
  • Three-dimensional imaging is often used to analyze objects, particularly in the fields of medicine and science. Various methods with which television pictures in particular can be produced in three dimensions have also been developed for general consumer applications.
  • Among said methods, there is a basic distinction between sequential image transmission, in which the images for the right eye and the left eye are transmitted alternately one after the other or saved to a storage medium, and parallel transmission, in which the images are transmitted on two separate channels.
  • One particular disadvantage of sequential image transmission in connection with conventional television systems is the fact that the refresh rate is reduced to 25 images per second for each eye. This creates an unpleasant flickering for the viewer. Of course, this limitation does not occur when the image sequences are each transmitted on their own channel (left or right). However, problems may still arise with synchronizing both channels and due to the requirements placed on the receiver, which must be able to receive and process two channels simultaneously. This is not possible for most systems generally available on the market.
  • Signal transmission and processing will likely be entirely digital in future television systems. In such systems, every image is broken down into individual pixels which are transmitted in digitized format. In order to reduce the bandwidth required for this process, the appropriate compression methods are used; however, these create problems for stereo transmission.
  • For example, using block coding methods with a reasonable rate of compression, it is generally not possible to reconstruct every individual line of an image precisely. In addition, interframe coding techniques, such as MPEG-2, do not allow one to transmit or save stereo images in a sequential image format, because image information from one image is still contained in another image, creating the so-called “crosstalk effect”, which makes clear separation of the right image from the left impossible.
  • Other methods for generating a three-dimensional image sequence from a two-dimensional image sequence are disclosed in DE 35 30 610 and EP 0 665 697. An autostereoscopic system with an interpolation of images is disclosed in EP 0 520 179, whereas problems of recognizing motion areas in image sequences are discussed in "Huang: Image Sequence Analysis" (published by Springer-Verlag).
  • Therefore, the problem addressed by the invention is to create a method and a device of the type specified in the introduction with which it is possible to generate 3-D images with a very natural three-dimensional image impression even when using the transmission and/or compression methods described in the introduction.
  • This problem has been solved with a method according to claim 1 and a device according to claim 10.
  • The dependent claims contain further advantageous embodiments of the invention.
  • Additional details, features, and advantages of the invention may be seen from the following description of a preferred embodiment with reference to the drawings. They show:
  • FIG. 1 a schematic block diagram of circuitry according to the invention;
  • FIG. 2 a graphical representation of an actual image sequence and of a scanned image sequence;
  • FIG. 3 a-c schematic representations of phase control in sequential images; and
  • FIG. 4 a schematic block diagram of one imaging application of the invented device.
  • The basic components of a device according to the invention and their interconnections are schematically represented in FIG. 1. The system comprises a first input E1, through which the two-dimensional images generated by a camera and transmitted across a transmission path are directed to an A/D converter 10 and digitized. The digitized images are then directed to an image storage device 11 and a phase selector 16. The images saved in the image storage device 11 are analyzed by a phase analyzer 12, whose input is connected to the image storage device 11 and whose output is connected to the phase selector 16. In addition, a long-term storage device 13 is connected to the image storage device 11 in order to store images from that device; its output is connected to an image generator 15. Furthermore, the image generator 15 is also connected to a further output of the image storage device 11 and to a motion analyzer 14, to which images from the image storage device 11 are directed. In addition, the device comprises a second input E2 for manual motion control connected to the image generator 15, as well as a third input E3 for manual phase control connected to the phase selector 16. The right and left stereo images BL, BR are provided at two outputs of the phase selector 16, which are connected to a first and a second output A1, A2 of the device.
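  • As a purely illustrative aid (not part of the original disclosure), the block diagram of FIG. 1 can be mirrored in software roughly as follows; the Python sketch below only sets up the data flow (frame store, approximation variable, phase indicator), and all class, attribute and parameter names are assumptions made for this illustration:
      from collections import deque
      import numpy as np

      class Fig1Pipeline:
          """Software analogue of FIG. 1 (illustrative sketch only)."""

          def __init__(self, K=5, alpha=2.1):
              self.frames = deque(maxlen=K)     # image storage device 11: the last K frames
              self.long_term = deque(maxlen=K)  # long-term storage device 13
              self.alpha = alpha                # approximation variable used by image generator 15
              self.shift = "left"               # "shift" indicator kept by the phase selector 16

          def push(self, frame_2d):
              # A digitized 2-D frame from the A/D converter 10 enters the store;
              # frames[0] plays the role of x^0, frames[k] the role of x^k.
              self.frames.appendleft(np.asarray(frame_2d, dtype=np.float64))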
  • A second image sequence is generated by this device based on a (first) image sequence recorded in two dimensions. Together with the first image sequence, the second sequence makes it possible to view the originally two-dimensional images in three dimensions when the first and second image sequences are supplied to the left and right eyes. The second image sequence is derived, as described below, from image information resulting from the motion in the first image sequence. The following definitions apply:
  • xij is a digitized image at time t with horizontal resolution I and vertical resolution J. The scan rate is Δt, so that the following formula is derived for an image scanned at time k and saved in the image storage device 11:
    $x^{k} := x_{ij}(t - \Delta t \cdot k)$
  • The most recent K images are located in the image storage device 11 of length K. 0 ≦ α ≦ K is a real number representing the temporal position among the stored images xk at which a (synthetic) image of the second image sequence is generated (the "approximation variable"). In addition, BL represents the given displayed left image and BR the given displayed right image.
  • It is assumed for now that a fixed value is given to α. The images xk in the image storage device 11 are viewed as sample values (scanned image sequence according to curve b in FIG. 2) of a continuous function (actual image sequence according to curve a in FIG. 2). Various methods of approximation may be applied to this function. The following explanations relating to FIG. 2 refer to a linear spline approximation. However, other methods of approximation may be used as appropriate; for example, higher-level or polynomial approximations.
  • FIG. 2 shows an image sequence in two-dimensional (I/J-) space. The second image sequence is calculated by the image generator 15 as follows: First, αU is calculated as the largest whole number which is smaller than α. Next, αO is calculated as the smallest whole number which is larger than α. So:
    $B_L := x^{0}$
    $B_R := x^{\alpha_O}(\alpha - \alpha_U) + x^{\alpha_U}(1 - \alpha + \alpha_U)$
    where the image sequence BL for a left viewing channel (left eye) is formed by the given actual images of the first image sequence x0, x1, etc., and the (second) image sequence BR is calculated by approximation for a right viewing channel (right eye).
  • This calculation is performed separately by the image generator 15 for all of the pixels xij in a selected color space (RGB or YUV); that is:
    $B_R := b_{ij}(Y,U,V) :=$
    $\bigl(x^{\alpha_O}_{ij}(Y)(\alpha-\alpha_U) + x^{\alpha_U}_{ij}(Y)(1-\alpha+\alpha_U),$
    $\ x^{\alpha_O}_{ij}(U)(\alpha-\alpha_U) + x^{\alpha_U}_{ij}(U)(1-\alpha+\alpha_U),$
    $\ x^{\alpha_O}_{ij}(V)(\alpha-\alpha_U) + x^{\alpha_U}_{ij}(V)(1-\alpha+\alpha_U)\bigr)$.
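  • For illustration only (not part of the original disclosure), the interpolation above can be sketched in Python/NumPy as follows; the function name and the array layout (frames indexed so that frames[k] corresponds to x^k, with a trailing colour-channel axis) are assumptions of this sketch:
      import numpy as np

      def synthesize_right_image(frames, alpha):
          """Linear spline approximation of B_R (illustrative sketch).

          frames[k] is the stored image x^k as an array of shape (I, J, channels)
          in RGB or YUV; frames[0] is the current image x^0.  alpha is the
          approximation variable with 0 <= alpha <= K.
          """
          a_u = int(np.floor(alpha))  # alpha_U: for non-integer alpha, the largest whole number below alpha
          a_o = int(np.ceil(alpha))   # alpha_O: the smallest whole number above alpha
          x_u = frames[a_u].astype(np.float64)
          x_o = frames[a_o].astype(np.float64)
          # B_R := x^{alpha_O} (alpha - alpha_U) + x^{alpha_U} (1 - alpha + alpha_U),
          # applied separately to every pixel and every colour channel.
          return x_o * (alpha - a_u) + x_u * (1.0 - alpha + a_u)

      # The left image is simply the current original frame: B_L = frames[0].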
  • In addition, automatic phase control is performed by the phase analyzer 12 to determine movements in sequential images of the first image sequence. It is assumed that jm := J/2 is the horizontal midpoint of an image, so that $x^{0}_{i j_m}$ with $0 \le i \le I$ is the middle column of the image x0 at time t. Furthermore, M < jm is a selected whole number. Then:
    $x^{0s}_{ij} := x^{0}_{ij}$ with $0 \le i \le I$ and $j_m - M \le j \le j_m + M$
    will be defined as a scanned image, shown as vertical stripes in FIG. 3 a. Said image comprises 2M+1 columns around the horizontal midpoint jm of the image x0. Now, N is a fixed number with N > M, so:
    $x^{1s}_{ij}$ with $0 \le i \le I$ and $j_m - N \le j \le j_m + N$
    is defined as the search region (see FIG. 3 b) in image x1, in which the partial image with the greatest similarity to the scanned image $x^{0s}_{ij}$ is sought.
  • dl is the similarity of the scanned image to a partial image of equal size from the search region at a displacement position l, where −N ≦ l ≦ +N.
  • If cross-correlation is chosen as the measure of similarity, dl is obtained for the displacement position l as:
    Formula (1): $d_l := 1 - \frac{\sum_{i=0}^{I}\sum_{j=j_m-M}^{j_m+M} x_{ij}^{0} \cdot x_{i,j-l}^{1}}{\sqrt{\sum_{i=0}^{I}\sum_{j=j_m-M}^{j_m+M} (x_{ij}^{0})^{2} \cdot \sum_{i=0}^{I}\sum_{j=j_m-M}^{j_m+M} (x_{i,j-l}^{1})^{2}}}$
  • Here, the value of l ranges from −N to +N, where l represents the given displacement position of a partial image in the search region.
  • As an alternative to cross-correlation, a Euclidean distance or an absolute amount may also be chosen as a measure of similarity.
  • Thus, with this method, as indicated in FIGS. 3 a and b, the scanned image xs (FIG. 3 a) runs like a scanner across the search region (FIG. 3 b) of the image x1 (previous image) and looks for the region with the greatest similarity to the scanned image, the similarity dl being calculated for every displacement position l.
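  • The scanning just described may be illustrated by the following sketch (again hypothetical Python/NumPy, not taken from the patent); it extracts the 2M+1 middle columns of x0, evaluates Formula (1) for every displacement position l within the search region of x1 and returns the index lmin of the smallest dl:
      import numpy as np

      def phase_analysis(x0, x1, M=2, N=25):
          """Illustrative sketch of the phase analyzer 12 for one colour channel.

          x0, x1 : 2-D arrays (I rows, J columns) holding the current image x^0
                   and the previous image x^1.
          """
          I, J = x0.shape
          j_m = J // 2                                 # horizontal midpoint j_m := J/2
          stripe0 = x0[:, j_m - M:j_m + M + 1]         # scanned image x^{0s} (2M+1 columns)

          d = {}
          for l in range(-N, N + 1):
              # partial image of equal size at displacement position l (columns j - l)
              stripe1 = x1[:, j_m - M - l:j_m + M + 1 - l]
              num = np.sum(stripe0 * stripe1)
              den = np.sqrt(np.sum(stripe0 ** 2) * np.sum(stripe1 ** 2))
              d[l] = 1.0 - num / den                   # measure of similarity per Formula (1)

          l_min = min(d, key=d.get)                    # displacement with the smallest d_l
          return l_min, d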
  • In addition, a whole number ε is defined, which may be called the moment of inertia and with which blurring is defined according to FIG. 3 c. This is used to allow for camera movement which should not be considered displacement of the image. The value of ε lies approximately in the range −1 ≦ ε ≦ +1.
  • This analysis is performed substantially as follows. First, all measures of similarity dl for −N ≦ l ≦ +N are calculated by the phase analyzer 12. Next, the measure of similarity dmin with the smallest value is chosen (dmin := min dl) and the index lmin of this measure of similarity is determined. The values lmin and ε are compared by the phase selector 16, and the phase selector 16 switches as a function of the result of the comparison as follows:
  • If lmin < −ε, this means that the region of greatest similarity in the search region is displaced to the left, and thus the predominant direction of movement in sequential images x1, x0 of the first image sequence is from left to right. This may result from the movement of an object in the images from left to right or from the panning of the camera from right to left. In this case, the left image is BL := x0 (i.e., the given image of the first image sequence) and a calculated synthetic image (second image sequence) is selected for the right image BR. In addition, a "shift" indicator is set to "left" in the phase selector 16. If lmin > +ε, this means that the region of greatest similarity in the search region is displaced to the right, and thus the predominant direction of movement in sequential images x1, x0 of the first image sequence is from right to left. This may result from the movement of an object in the images from right to left or from the panning of the camera from left to right. In this case, a calculated synthetic image (second image sequence) is selected for the left image BL, and the right image is BR := x0 (i.e., the given image of the first image sequence). In addition, the "shift" indicator is set to "right".
  • If |lmin| ≦ ε and the indicator is set to "right", then a calculated synthetic image (second image sequence) is selected for the left image BL, and the right image is BR := x0 (i.e., the given image of the first image sequence).
  • Finally, if |lmin| ≦ ε and the indicator is set to "left", then the left image is BL := x0 and a calculated synthetic image (second image sequence) is selected for the right image BR.
  • The next image is then accessed and the same process is repeated for this image, beginning with the calculation of the minimum value of the measure of similarity dmin.
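  • The switching rules just described can be condensed into a small sketch (hypothetical Python; the function signature and the string values of the "shift" indicator are assumptions of this illustration):
      def select_phase(l_min, epsilon, shift, original, synthetic):
          """Illustrative sketch of the phase selector 16.

          Returns (B_L, B_R, new_shift), where original is the current image x^0
          of the first sequence and synthetic is the calculated image of the
          second sequence.
          """
          if l_min < -epsilon:
              # predominant motion from left to right: the original image goes to the left channel
              return original, synthetic, "left"
          if l_min > epsilon:
              # predominant motion from right to left: the original image goes to the right channel
              return synthetic, original, "right"
          # |l_min| <= epsilon: keep the assignment indicated by the "shift" flag
          if shift == "right":
              return synthetic, original, "right"
          return original, synthetic, "left"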
  • This automatic phase control or selection may also be switched off and, for example, replaced by manual switching using a keyboard via the device's third input.
  • Furthermore, the embodiment shown in FIG. 1 comprises the motion analyzer 14, which uses dynamic motion control or motion calculation to prevent the stereo base from becoming too large when there are large movements. In addition, this ensures that a certain minimum width of the stereo base is maintained during very slow movements before it disappears in images without any motion. The long-term storage device 13, from which images are accessed and used as images of the second image sequence when the movements are too slow, has been provided for this last purpose.
  • The measure of similarity dk at time tk is defined as follows:
    Formula (2): $d_k := 1 - \frac{\sum_{i=0}^{I}\sum_{j=0}^{J} x_{ij}^{k} \cdot x_{ij}^{k+1}}{\sqrt{\sum_{i=0}^{I}\sum_{j=0}^{J} (x_{ij}^{k})^{2} \cdot \sum_{i=0}^{I}\sum_{j=0}^{J} (x_{ij}^{k+1})^{2}}}$
  • Therefore, this measure of similarity is a function of the extent to which the entire contents of the next image in an image sequence differ from the contents of the previous image, and thus represents a measure of the speed of motion in the images.
  • Threshold values δ0, δ1, δ2 are defined for the analysis of said measure of similarity, where in the ideal case the measure of similarity dk = 0 for an unchanged (constant) image at time tk in comparison to the previous image at time tk+1. However, because there is always a certain amount of background noise during digitization, it should be assumed that dk > 0 even for an unchanged image.
  • A Euclidean distance or an absolute amount may of course be chosen for the calculation instead of the cross-correlation described. The individual color values of the selected color space (RGB or YUV) must always be processed separately.
  • To analyze the value of the measure of similarity dk(k=0, 1, . . . K), it is first stored in the motion analyzer 14 and then compared to the threshold values.
  • If dk ≦ δ0, this means that the movements in the sequential images are very slow or nil. In this case, the transfer of the values of xk to the long-term storage device 13 is stopped so that images will be available which have a sufficient motion differential. In addition, images stored in the long-term storage device are used to generate the second image sequence in order to maintain the minimum stereo base width.
  • If dk > δ0, the value of the approximation variable α will change as a function of the size of the measure of similarity dk relative to the threshold values δ0, δ1, δ2, as follows.
  • If δ0 < dk < δ2 and dk − dk−1 ≦ −δ1, and as long as α < K−1, then the approximation variable is set to α := α + s.
  • If δ0 < dk < δ2 and dk − dk−1 > δ1, and as long as α ≧ 2, then the approximation variable is set to α := α − s.
  • The character s denotes a step width, which is preferably 0.1 but can have other values as well.
  • If δ0 < dk < δ2 and −δ1 < dk − dk−1 < δ1, then the approximation variable remains at α := α because the motion velocity is substantially constant. In this case, no adjustment is necessary.
  • Finally, if δ2<dk, this means that the movement is very fast and the stereo base width would be too large. In this case, the approximation variable is set at α:=1/dk.
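  • As an illustration of this dynamic motion control (not part of the original disclosure; the function layout is an assumption and the default values are the preferred values given further below), Formula (2) and the threshold rules above can be sketched as:
      import numpy as np

      def similarity_dk(x_k, x_k1):
          # Measure of similarity d_k between two whole images per Formula (2).
          num = np.sum(x_k * x_k1)
          den = np.sqrt(np.sum(x_k ** 2) * np.sum(x_k1 ** 2))
          return 1.0 - num / den

      def update_alpha(alpha, d_k, d_k_prev, K=5,
                       delta0=0.05, delta1=0.6, delta2=0.8, s=0.1):
          """Illustrative sketch of the approximation-variable update (motion analyzer 14)."""
          if d_k <= delta0:
              # hardly any motion: alpha unchanged; images from the long-term
              # storage device 13 are used to keep a minimum stereo base width
              return alpha
          if delta0 < d_k < delta2:
              if d_k - d_k_prev <= -delta1 and alpha < K - 1:
                  return alpha + s      # motion slowing down: widen the stereo base
              if d_k - d_k_prev > delta1 and alpha >= 2:
                  return alpha - s      # motion speeding up: narrow the stereo base
              return alpha              # motion velocity substantially constant
          # very fast motion (d_k above delta2): limit the stereo base width
          return 1.0 / d_k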
  • This dynamic motion control can also be switched off like the automatic phase control and replaced by manual entry; for example, using a keyboard via the device's second input.
  • The method described will preferably be implemented using a data processing program on a computer, in particular a digital image processing system for the generation of a three-dimensional depiction of television pictures transmitted or stored in a two-dimensional format.
  • In the following, a preferred example with specific values shall be given for the above embodiment. In the case of the known PAL standard, the horizontal resolution is I=576 and the vertical resolution is J=768, whereas for the NTSC standard, I=480 and J=640 are prescribed.
  • Generally it is sufficient to store the last five images in the image storage device 11, which means K ≦ 5. As an initial value, the approximation variable is set to α0 := 2.1. For an adequate analysis of motion in sequential images, the value of M is set to 1 or 2. The value of N should be chosen such that even in the case of fast motion the scanned image still lies within the search region. For this, a value of N with 20 ≦ N ≦ 30 (especially N := 25) is adequate. However, the value of N can also be chosen so that the search region comprises the complete original image, i.e. N := J/2.
  • For defining the blurring, a value of ε:=1 is proposed whereas for evaluating the measure of similarity the following values for the threshold values are preferably chosen: δ0:=0.05, δ1:=0.6 and δ2:=0.8.
  • With an embodiment realized with these values, a very natural three-dimensional reproduction could be obtained for image sequences whose contents move in very different ways.
  • Finally, FIG. 4 shows a block diagram of a device (stereo decoder or stereo viewer) for the generation and depiction of 3-D images which are calculated based on a sequence of 2-D images transmitted over a transmission path or accessed from a storage medium.
  • The device comprises a first input 21, to which the 2-D images transmitted across a transmission path and demodulated or decompressed according to known techniques are connected. In addition, there is a second input 22, which is connected to a DVD player, a video recorder, or another source of images, for example.
  • Both of these inputs are connected to the invented device 23 according to FIG. 1, with which 3-D images are calculated based on the sequence of 2-D images according to the detailed explanation above. The outputs A1, A2 of this device, to which a sequence of left or right images BL, BR is connected, are connected to a stereo storage device 24, 25, in which the images are stored for each channel.
  • Finally, different driver levels can be selected via a third input 26 by activating a selector switch 27, by means of which a corresponding image generator is controlled.
  • For example, a driver 28 for simulator goggles 29, a driver 30 for an autostereoscopic monitor 31, and a driver 32 for a stereo projector 33 are shown here.
  • This device is preferably designed as a component of a digital image processing system for the generation of a three-dimensional depiction of television pictures transmitted or stored in two dimensions.
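  • A minimal sketch of the output stage of FIG. 4 is given below (hypothetical Python; the dictionary keys and driver callables are assumptions standing in for the drivers 28, 30 and 32):
      def route_stereo_pair(b_l, b_r, selected, drivers):
          """Illustrative sketch of the selector switch 27: the stereo pair read from
          the channel stores 24/25 is handed to the driver stage chosen via input 26."""
          # drivers might map, e.g., "goggles" -> driver 28, "monitor" -> driver 30,
          # "projector" -> driver 32; each value is a callable taking (B_L, B_R).
          drivers[selected](b_l, b_r)

      # Example with a stand-in driver:
      # route_stereo_pair(b_l, b_r, "monitor", {"monitor": lambda l, r: print(l.shape, r.shape)})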

Claims (12)

1-14. (canceled)
15. A method for generating 3-D images from a first sequence of 2-D images, comprising the following steps:
(a) comparing a pair of sequential images in the first image sequence to determine a measure of similarity (dk) between the pair of sequential images;
(b) comparing the measure of similarity (dk) to predetermined threshold values (δ0, δ1, δ2);
(c) if δ1<dk<δ2, (i) setting an approximation variable (α) within a predetermined range to maintain a minimum stereo base width and to prevent the stereo base from becoming too large, and (ii) generating a synthetic image for a second image sequence from the pair of images of the first image sequence by interpolating between the pair of images of the first image sequence based on the approximation variable;
(d) if dk≦δ0, setting a temporally previous image of the first image sequence as an image of the second image sequence; and
(e) assigning images of the first and second image sequence to a left and right viewing channel, respectively.
16. A method as set forth in claim 15, wherein if δ1<dk<δ2 and dk−dk−1≦−δ1 and as long as α≦k−1, the approximation variable is set to α:=α+s, and if δ1<dk<δ2 and dk−dk−1≧δ1 and as long as α>2, the approximation variable is set to α:=α−s.
17. A method as set forth in claim 15, wherein the measure of similarity (dk) is calculated by determining a Euclidean distance or an absolute value.
18. A method as set forth in claim 15, wherein the images of the second image sequence are calculated by linear spline approximation or a higher-level or polynomial approximation.
19. A method as set forth in claim 15, wherein in determining the predominant direction of motion, a vertical mid-region of a current image (x0) of the first image sequence is compared to different vertical regions of a previous image (x1) of this sequence and it is determined whether the vertical region of the previous image with the greatest similarity to the mid-region of the current image is situated left or right of center.
20. A method as set forth in claim 15, wherein in determining the predominant direction of motion, a second measure of similarity (dl) between the image regions is calculated by determination of a Euclidean distance or an absolute value.
21. A method as set forth in claim 15, wherein a blurring region (ε) with which small movements can be suppressed is established around the mid-region of the current image (x0).
22. A system for generating 3-D images from a first sequence of 2-D images, comprising
a first input for receiving the first image sequence;
a motion analyzer connected to the first input for comparing a pair of sequential images in the first image sequence to determine a measure of similarity (dk) between the pair of sequential images and for setting an approximation variable (α) that determines a stereo base based on the measure of similarity;
an image generator connected to the motion analyzer for generating a synthesized image of a second sequence of images by interpolating between the pair of images of the first image sequence using the approximation variable;
first and second outputs for transmitting the first and second image sequences to left and right viewing channels; and
a phase selector connected to the image generator for assigning images received from the phase analyzer to one of the left and the right viewing channels.
23. A system as set forth in claim 16, comprising a phase analyzer interconnected between the image generator and the phase selector for determining a predominant direction of motion in sequential images of the first image sequence, wherein the phase selector assigns images to one of the left and right viewing channels based on the predominant direction of motion determined by the phase analyzer.
24. A system as set forth in claim 16, comprising an analog-to-digital converter for converting analog image data received at the first input to digital image data.
25. A system as set forth in claim 18, comprising one or more of (a) a second input connected to the image generator for receiving manual motion control data; and (b) a third input connected to the phase selector for receiving manual phase control data.
US11/835,104 2000-04-01 2007-08-07 Method and device for generating 3d images Abandoned US20070269136A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/835,104 US20070269136A1 (en) 2000-04-01 2007-08-07 Method and device for generating 3d images

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
DE10016074.3 2000-04-01
DE10016074A DE10016074B4 (en) 2000-04-01 2000-04-01 Method and device for generating 3D images
PCT/EP2001/003707 WO2001076258A2 (en) 2000-04-01 2001-04-01 Generation of a sequence of stereoscopic images from a sequence of 2d images
US10/240,556 US7254264B2 (en) 2000-04-01 2001-04-01 Method and device for generating 3D images
US11/835,104 US20070269136A1 (en) 2000-04-01 2007-08-07 Method and device for generating 3d images

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/EP2001/003707 Continuation WO2001076258A2 (en) 2000-04-01 2001-04-01 Generation of a sequence of stereoscopic images from a sequence of 2d images
US10/240,556 Continuation US7254264B2 (en) 2000-04-01 2001-04-01 Method and device for generating 3D images

Publications (1)

Publication Number Publication Date
US20070269136A1 true US20070269136A1 (en) 2007-11-22

Family

ID=7637136

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/240,556 Expired - Fee Related US7254264B2 (en) 2000-04-01 2001-04-01 Method and device for generating 3D images
US11/835,104 Abandoned US20070269136A1 (en) 2000-04-01 2007-08-07 Method and device for generating 3d images

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/240,556 Expired - Fee Related US7254264B2 (en) 2000-04-01 2001-04-01 Method and device for generating 3D images

Country Status (8)

Country Link
US (2) US7254264B2 (en)
EP (1) EP1305956A2 (en)
JP (1) JP4843753B2 (en)
KR (1) KR100838351B1 (en)
AU (1) AU783045B2 (en)
CA (1) CA2404966A1 (en)
DE (1) DE10016074B4 (en)
WO (1) WO2001076258A2 (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7907793B1 (en) 2001-05-04 2011-03-15 Legend Films Inc. Image sequence depth enhancement system and method
US8396328B2 (en) 2001-05-04 2013-03-12 Legend3D, Inc. Minimal artifact image sequence depth enhancement system and method
US6773399B2 (en) * 2001-10-20 2004-08-10 Zonare Medical Systems, Inc. Block-switching in ultrasound imaging
US7254265B2 (en) 2000-04-01 2007-08-07 Newsight Corporation Methods and systems for 2D/3D image conversion and optimization
US8401336B2 (en) 2001-05-04 2013-03-19 Legend3D, Inc. System and method for rapid image sequence depth enhancement with augmented computer-generated elements
US9286941B2 (en) 2001-05-04 2016-03-15 Legend3D, Inc. Image sequence enhancement and motion picture project management system
DE10128530A1 (en) 2001-06-13 2002-12-19 Basf Ag Water-dilutable concentrate giving long-life low electrical conductivity cooling systems for fuel cell systems in e.g. vehicles is based on alkylene glycols and also contains azole derivatives
EP1451775A1 (en) * 2001-11-24 2004-09-01 TDV Technologies Corp. Generation of a stereo image sequence from a 2d image sequence
US7620070B1 (en) 2003-06-24 2009-11-17 Nvidia Corporation Packet processing with re-insertion into network interface circuitry
WO2008011888A1 (en) * 2006-07-24 2008-01-31 Seefront Gmbh Autostereoscopic system
US20090219383A1 (en) * 2007-12-21 2009-09-03 Charles Gregory Passmore Image depth augmentation system and method
KR101605314B1 (en) 2009-07-06 2016-03-22 삼성전자 주식회사 Image processing apparatus and image processing method
US8553072B2 (en) * 2010-11-23 2013-10-08 Circa3D, Llc Blanking inter-frame transitions of a 3D signal
US9983685B2 (en) 2011-01-17 2018-05-29 Mediatek Inc. Electronic apparatuses and methods for providing a man-machine interface (MMI)
US8670023B2 (en) * 2011-01-17 2014-03-11 Mediatek Inc. Apparatuses and methods for providing a 3D man-machine interface (MMI)
US8730232B2 (en) 2011-02-01 2014-05-20 Legend3D, Inc. Director-style based 2D to 3D movie conversion system and method
US9241147B2 (en) 2013-05-01 2016-01-19 Legend3D, Inc. External depth map transformation method for conversion of two-dimensional images to stereoscopic images
US9407904B2 (en) 2013-05-01 2016-08-02 Legend3D, Inc. Method for creating 3D virtual reality from 2D images
US9282321B2 (en) 2011-02-17 2016-03-08 Legend3D, Inc. 3D model multi-reviewer system
US9113130B2 (en) 2012-02-06 2015-08-18 Legend3D, Inc. Multi-stage production pipeline system
US9288476B2 (en) 2011-02-17 2016-03-15 Legend3D, Inc. System and method for real-time depth modification of stereo images of a virtual reality environment
US9007404B2 (en) 2013-03-15 2015-04-14 Legend3D, Inc. Tilt-based look around effect image enhancement method
US9438878B2 (en) 2013-05-01 2016-09-06 Legend3D, Inc. Method of converting 2D video to 3D video using 3D object models
WO2015191767A1 (en) * 2014-06-10 2015-12-17 Bitanimate, Inc. Stereoscopic depth adjustment and focus point adjustment
US9609307B1 (en) 2015-09-17 2017-03-28 Legend3D, Inc. Method of converting 2D video to 3D video using machine learning
US10735707B2 (en) 2017-08-15 2020-08-04 International Business Machines Corporation Generating three-dimensional imagery
US11069074B2 (en) * 2018-04-23 2021-07-20 Cognex Corporation Systems and methods for improved 3-D data reconstruction from stereo-temporal image sequences

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2141605B (en) * 1983-05-09 1987-06-10 Geshwind David M Method for colorizing black and white film footage
DE3530610A1 (en) * 1985-08-27 1987-03-05 Inst Rundfunktechnik Gmbh Method for producing stereoscopic image sequences
AU652051B2 (en) * 1991-06-27 1994-08-11 Eastman Kodak Company Electronically interpolated integral photography system
JPH07226958A (en) * 1994-02-16 1995-08-22 Sanyo Electric Co Ltd Stereoscopic video image display system
JP3214688B2 (en) * 1994-02-01 2001-10-02 三洋電機株式会社 Method for converting 2D image to 3D image and 3D image signal generation device
JP2594235B2 (en) * 1994-02-01 1997-03-26 三洋電機株式会社 Method for converting 2D image to 3D image and 3D image signal generation device
JP2975837B2 (en) * 1994-02-25 1999-11-10 三洋電機株式会社 Method for converting a part of a two-dimensional image to a three-dimensional image
JPH07222203A (en) * 1994-02-04 1995-08-18 Sanyo Electric Co Ltd Three-dimensional video cassette converting system
JP2951230B2 (en) * 1994-09-22 1999-09-20 三洋電機株式会社 Method for generating 3D image from 2D image

Patent Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4781435A (en) * 1985-09-19 1988-11-01 Deutsche Forschungs- Und Versuchsanstalt Fur Luft- Und Raumfahrt E.V. Method for the stereoscopic representation of image scenes with relative movement between imaging sensor and recorded scene
US4925294A (en) * 1986-12-17 1990-05-15 Geshwind David M Method to convert two dimensional motion pictures for three-dimensional systems
US6456432B1 (en) * 1990-06-11 2002-09-24 Reveo, Inc. Stereoscopic 3-d viewing system with portable electro-optical viewing glasses and shutter-state control signal transmitter having multiple modes of operation for stereoscopic viewing of 3-d images displayed in different stereoscopic image formats
US6392689B1 (en) * 1991-02-21 2002-05-21 Eugene Dolgoff System for displaying moving images pseudostereoscopically
US5680474A (en) * 1992-10-27 1997-10-21 Canon Kabushiki Kaisha Corresponding point extraction method for a plurality of images
US5588067A (en) * 1993-02-19 1996-12-24 Peterson; Fred M. Motion detection and image acquisition apparatus and method of detecting the motion of and acquiring an image of an object
US5510832A (en) * 1993-12-01 1996-04-23 Medi-Vision Technologies, Inc. Synthesized stereoscopic imaging system and method
US5717415A (en) * 1994-02-01 1998-02-10 Sanyo Electric Co., Ltd. Display system with 2D/3D image conversion where left and right eye images have a delay and luminance difference base upon a horizontal component of a motion vector
US5739844A (en) * 1994-02-04 1998-04-14 Sanyo Electric Co. Ltd. Method of converting two-dimensional image into three-dimensional image
US5808664A (en) * 1994-07-14 1998-09-15 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US5682437A (en) * 1994-09-22 1997-10-28 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US5673081A (en) * 1994-11-22 1997-09-30 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US5777666A (en) * 1995-04-17 1998-07-07 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US5768415A (en) * 1995-09-08 1998-06-16 Lucent Technologies Inc. Apparatus and methods for performing electronic scene analysis and enhancement
US5953054A (en) * 1996-05-31 1999-09-14 Geo-3D Inc. Method and system for producing stereoscopic 3-dimensional images
US6445833B1 (en) * 1996-07-18 2002-09-03 Sanyo Electric Co., Ltd Device and method for converting two-dimensional video into three-dimensional video
US6108005A (en) * 1996-08-30 2000-08-22 Space Corporation Method for producing a synthesized stereoscopic image
US6215516B1 (en) * 1997-07-07 2001-04-10 Reveo, Inc. Method and apparatus for monoscopic to stereoscopic image conversion
US5969766A (en) * 1997-07-19 1999-10-19 Daewoo Electronics Co., Ltd Method and apparatus for contour motion estimating a binary image by using a weighted block match algorithm
US20020191841A1 (en) * 1997-09-02 2002-12-19 Dynamic Digital Depth Research Pty Ltd Image processing method and apparatus
US6496598B1 (en) * 1997-09-02 2002-12-17 Dynamic Digital Depth Research Pty. Ltd. Image processing method and apparatus
US6584219B1 (en) * 1997-09-18 2003-06-24 Sanyo Electric Co., Ltd. 2D/3D image conversion system
US6314211B1 (en) * 1997-12-30 2001-11-06 Samsung Electronics Co., Ltd. Apparatus and method for converting two-dimensional image sequence into three-dimensional image using conversion of motion disparity into horizontal disparity and post-processing method during generation of three-dimensional image
US6215590B1 (en) * 1998-02-09 2001-04-10 Kabushiki Kaisha Toshiba Stereoscopic image display apparatus
US6141440A (en) * 1998-06-04 2000-10-31 Canon Kabushiki Kaisha Disparity measurement with variably sized interrogation regions
US6757423B1 (en) * 1999-02-19 2004-06-29 Barnes-Jewish Hospital Methods of processing tagged MRI data indicative of tissue motion including 4-D LV tissue tracking
US6377625B1 (en) * 1999-06-05 2002-04-23 Soft4D Co., Ltd. Method and apparatus for generating steroscopic image using MPEG data
US20020122585A1 (en) * 2000-06-12 2002-09-05 Swift David C. Electronic stereoscopic media delivery system
US20020041703A1 (en) * 2000-07-19 2002-04-11 Fox Simon Richard Image processing and encoding techniques
US20020126202A1 (en) * 2001-03-09 2002-09-12 Koninklijke Philips Electronics N.V. Apparatus

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9986254B1 (en) 2007-04-12 2018-05-29 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US10129557B2 (en) 2007-04-12 2018-11-13 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US9706217B2 (en) 2007-04-12 2017-07-11 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US9838705B2 (en) 2007-04-12 2017-12-05 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US9445116B2 (en) 2007-04-12 2016-09-13 Thomson Licensing Tiling in video encoding and decoding
US9219923B2 (en) 2007-04-12 2015-12-22 Thomson Licensing Tiling in video encoding and decoding
US9232235B2 (en) 2007-04-12 2016-01-05 Thomson Licensing Tiling in video encoding and decoding
US10764596B2 (en) 2007-04-12 2020-09-01 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US10432958B2 (en) 2007-04-12 2019-10-01 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US10298948B2 (en) 2007-04-12 2019-05-21 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US9185384B2 (en) 2007-04-12 2015-11-10 Thomson Licensing Tiling in video encoding and decoding
US9973771B2 (en) 2007-04-12 2018-05-15 Dolby Laboratories Licensing Corporation Tiling in video encoding and decoding
US20100046635A1 (en) * 2007-04-12 2010-02-25 Purvin Bibhas Pandit Tiling in video decoding and encoding
US8780998B2 (en) 2007-04-12 2014-07-15 Thomson Licensing Tiling in video decoding and encoding
US9036714B2 (en) 2009-01-26 2015-05-19 Thomson Licensing Frame packing for video coding
US9420310B2 (en) 2009-01-26 2016-08-16 Thomson Licensing Frame packing for video coding
US9215445B2 (en) 2010-01-29 2015-12-15 Thomson Licensing Block-based interleaving
RU2765424C2 (en) * 2017-06-29 2022-01-31 Конинклейке Филипс Н.В. Equipment and method for imaging

Also Published As

Publication number Publication date
EP1305956A2 (en) 2003-05-02
KR100838351B1 (en) 2008-06-16
US7254264B2 (en) 2007-08-07
WO2001076258A3 (en) 2002-09-12
JP2004504736A (en) 2004-02-12
AU783045B2 (en) 2005-09-22
US20030098907A1 (en) 2003-05-29
WO2001076258A2 (en) 2001-10-11
CA2404966A1 (en) 2001-10-11
AU5224001A (en) 2001-10-15
KR20040010040A (en) 2004-01-31
DE10016074B4 (en) 2004-09-30
JP4843753B2 (en) 2011-12-21
DE10016074A1 (en) 2001-10-04

Similar Documents

Publication Publication Date Title
US20070269136A1 (en) Method and device for generating 3d images
US7254265B2 (en) Methods and systems for 2D/3D image conversion and optimization
US6314211B1 (en) Apparatus and method for converting two-dimensional image sequence into three-dimensional image using conversion of motion disparity into horizontal disparity and post-processing method during generation of three-dimensional image
US5739844A (en) Method of converting two-dimensional image into three-dimensional image
KR100334722B1 (en) Method and the apparatus for generating stereoscopic image using MPEG data
US5682437A (en) Method of converting two-dimensional images into three-dimensional images
US5777666A (en) Method of converting two-dimensional images into three-dimensional images
US6496598B1 (en) Image processing method and apparatus
KR100358021B1 (en) Method of converting 2D image into 3D image and stereoscopic image display system
US5610662A (en) Method and apparatus for reducing conversion artifacts
KR100720722B1 (en) Intermediate vector interpolation method and 3D display apparatus
KR20040071145A (en) Generation of a stereo image sequence from a 2d image sequence
US20100177239A1 (en) Method of and apparatus for frame rate conversion
US7197075B2 (en) Method and system for video sequence real-time motion compensated temporal upsampling
EP1646228B1 (en) Image processing apparatus and method
US20080239144A1 (en) Frame rate conversion device and image display apparatus
KR100263936B1 (en) Apparatus and method for converting 2-d video sequence to 3-d video using virtual stereoscopic video
KR19990057668A (en) 3D image conversion device and method of 2D continuous image using transformation of motion parallax into horizontal parallax
RU2287858C2 (en) Creation of sequence of stereoscopic images from sequence of two-dimensional images
EP0675643B1 (en) Method and apparatus for reducing conversion artefacts
KR100230447B1 (en) 3-dimension image conversion apparatus and method of 2-dimension continuous image
JP3580450B2 (en) Stereoscopic device and stereoscopic method
JP2846843B2 (en) How to convert 2D video to 3D video
KR100416557B1 (en) Method for reproducing partial interlaced image and apparatus thereof
JPH01202093A (en) Stereoscopic television transmission system

Legal Events

Date Code Title Description
AS Assignment

Owner name: PRENTICE CAPITAL MANAGEMENT, LP, NEW YORK

Free format text: SECURITY AGREEMENT;ASSIGNOR:NEWSIGHT CORPORATION;REEL/FRAME:020339/0259

Effective date: 20071220

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION