US20110025830A1 - Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation - Google Patents

Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation

Info

Publication number
US20110025830A1
Authority
US
United States
Prior art keywords
image
scene
depth map
images
generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/842,257
Inventor
Michael McNamer
Patrick Mauney
Tassos Markas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
3DMedia Corp
Original Assignee
3DMedia Corp
Application filed by 3DMedia Corp
Priority to US12/842,257
Publication of US20110025830A1
Status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/56 Accessories
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/02 Stereoscopic photography by sequential recording
    • G03B35/06 Stereoscopic photography by sequential recording with axial movement of lens or gate between exposures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/211 Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof

Definitions

  • Embodiments of the present invention are applicable pre-capture, while composing the picture, or alternatively can be used post-capture to create or enhance the depth of objects in an eventual stereoscopic image, optionally in conjunction with the technology of the first part of the description.
  • Used in conjunction with that technology, the approach described here enables selective artistic enhancements by the user; in a stand-alone sense, it can be the means of creating a relative depth map for the picture, allowing the user to create a depth effect only for the objects he or she feels are of import.
  • Once the depth map has been created or altered, rendering of the stereoscopic image pair may occur.
  • In any stereoscopic image, there is an overlapping field of view from the left and right eyes that defines the stereoscopic image.
  • At the convergence point, the disparity of an object between the two views will be zero, i.e., no parallax. This defines the "screen point" when viewing the stereoscopic pair.
  • Objects in front of the screen and behind the screen will have increasing amounts of parallax disparity as the distance from the screen increases (negative parallax for objects in front of the screen, positive parallax for objects behind the screen).
  • The central point of the overlapping field of view on the screen plane (zero parallax depth) of the two eyes in stereoscopic viewing defines a circle that passes through each eye, with a radius, R, equal to the distance to the convergence point.
  • From this geometry, the angle, θ, between the vectors from the central convergence point to each of the two eyes can be measured. Examples for varying convergence points are described herein below.
  • FIG. 6 illustrates schematic diagrams showing an example of close and medium-distance convergence points according to embodiments of the present invention.
  • The convergence point is chosen as the center pixel of the image on the screen plane. It should be noted that this may be an imaginary point, as the center pixel of the image may not be at a depth that is on the screen plane, and hence, the depth of that center pixel can be approximated.
  • This screen-plane depth value, D_screen, is approximated to be 10-30% behind the near end depth of field distance for the final captured image, and may be approximated as:
  • D_screen ≈ [H * D_focus / (H + D_focus)] * Screen * scale
  • where H is the hyperfocal distance (derived below), D_focus is the focus distance of the lens for the final capture of the image, Screen is a value between 1.1 and 1.3 representing the placement of the screen plane behind the near end depth of field, and scale represents any scaled adjustment of that depth by the user utilizing the touchscreen interface.
  • The angle, θ, is dependent upon the estimated distance of focus and the modeled stereo baseline, B, of the image pair to be created. Hence, from the chord relation on the viewing circle of radius R, θ may be estimated as follows: θ ≈ 2 * arcsin(B / (2 * R)).
  • FIG. 7 illustrates a schematic diagram showing a translational offset determination technique according to embodiments of the present invention.
  • The X axis (horizontal) displacement, S, is calculated using the angle of view, V, for the capture.
  • The angle of view is given by the following equation:
  • V = 2 * tan⁻¹(W / (2 * F))
  • where W is the width of the image format and F is the focal length.
  • Depth D_p has been approximated for each pixel in the image, and is available from the depth map. It should be noted that the calculations that follow for a given pixel depth, D_p, may be imperfect, since each pixel is not centrally located between the two eye views; however, the approximation is sufficient for the goal of producing a stereoscopic effect. Hence, knowing V and the depth, D_p, of a given pixel, the approximate width of the field of view (WoV) may be represented as follows: WoV ≈ 2 * D_p * tan(V / 2).
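  • As a non-limiting sketch of these geometric relations (in Python; the helper names are illustrative, and the final pixel-offset step assumes S is the modeled baseline B expressed as a fraction of WoV and scaled to pixel units, a detail the text leaves open):

      import math

      def angle_of_view(W, F):
          # V = 2 * atan(W / (2F)); W = image format width, F = focal length (same units).
          return 2.0 * math.atan(W / (2.0 * F))

      def field_of_view_width(V, depth):
          # WoV ~ 2 * depth * tan(V / 2): scene width visible at pixel depth D_p.
          return 2.0 * depth * math.tan(V / 2.0)

      def convergence_angle(B, R):
          # Chord relation on the viewing circle: theta = 2 * arcsin(B / (2R)),
          # with R the distance to the convergence point (e.g., D_screen).
          return 2.0 * math.asin(B / (2.0 * R))

      def horizontal_offset(B, wov, image_width_px):
          # Assumed pixel-space form of S: baseline as a fraction of WoV.
          return B / wov * image_width_px

      # Example with illustrative numbers: 36 mm wide format, 50 mm lens.
      V = angle_of_view(36.0, 50.0)          # ~0.69 rad (~39.6 degrees)
      WoV = field_of_view_width(V, 2000.0)   # ~1440 mm visible at 2 m depth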
  • A perspective projective transform can be defined to generate a right eye image from the single "left eye" image.
  • A projective perspective transform is defined as having an aspect of translation (defined by S), rotation in the x/y plane (which will be zero for this case), rotation in the y/z plane (again, zero for this case), and rotation in the x/z plane, which will be defined by the angle θ.
  • For a pixel at (x_p, y_p) with its depth D_p taken as the z coordinate, the transform may be written as:
  • Dx_p = x_p * cos θ + D_p * sin θ + S
  • Dy_p = y_p
  • Dz_p = -x_p * sin θ + D_p * cos θ
  • (Dx_p, Dy_p, Dz_p) are 3D coordinate points resulting from the transform that can be projected onto a two dimensional image plane, which may be defined as follows:
  • x_p′ = (Dx_p - Ex) * (Ez / Dz_p)
  • y_p′ = (Dy_p - Ey) * (Ez / Dz_p)
  • Ex, Ey, and Ez are the coordinates of the viewer relative to the screen, and can be estimated for a given target display device. Ex and Ey can be assumed to be, but are not limited to, 0.
  • The pixels defined by (x_p′, y_p′) make up the right image view for the new stereoscopic image pair.
  • FIG. 8 illustrates a schematic diagram showing pixel repositioning via perspective projection with translation according to embodiments of the present invention.
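  • A compact sketch of this per-pixel transform and projection (Python with NumPy; array and parameter names are illustrative, and the rotation/translation form is the one reconstructed above):

      import numpy as np

      def project_right_eye(depth, S, theta, Ez, Ex=0.0, Ey=0.0):
          # depth: 2D array of per-pixel depths D_p from the depth map.
          h, w = depth.shape
          y, x = np.mgrid[0:h, 0:w].astype(float)
          # Translation by S plus rotation by theta in the x/z plane.
          Dx = x * np.cos(theta) + depth * np.sin(theta) + S
          Dy = y
          Dz = -x * np.sin(theta) + depth * np.cos(theta)
          # Perspective projection toward a viewer at (Ex, Ey, Ez).
          xp = (Dx - Ex) * (Ez / Dz)
          yp = (Dy - Ey) * (Ez / Dz)
          return xp, yp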
  • A simple exemplary pixel fill-in process that may be utilized in the present invention assumes a linear gradient between points on each horizontal row in the image. For points on the same row, n, without defined pixel values between two defined points (x_i, y_n) and (x_j, y_n), the fill-in process first determines the distance between them, d = x_j - x_i, and then assigns each undefined pixel x_k between them the linearly interpolated value P(x_k) ≈ P(x_i) + ((x_k - x_i) / d) * (P(x_j) - P(x_i)).
  • This process may repeat for each line in the image following the perspective projective transformation.
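  • A minimal sketch of this row-wise fill (Python with NumPy; names illustrative):

      import numpy as np

      def fill_row_linear(row, defined_mask):
          # row: pixel values for one horizontal line after the transform;
          # defined_mask: True where a projected pixel value landed.
          xs = np.flatnonzero(defined_mask)
          out = row.astype(float).copy()
          for xi, xj in zip(xs[:-1], xs[1:]):
              d = xj - xi
              if d > 1:
                  # Linear gradient between the two defined neighbors.
                  t = np.arange(1, d) / d
                  out[xi + 1:xj] = (1.0 - t) * row[xi] + t * row[xj]
          return out

      # Repeated for each line of the image after the projective transform.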
  • the resultant image may be combined with the initial image capture to create a stereo image pair that may be rendered for 3D viewing via stereo registration and display.
  • Other, more complex and potentially more accurate pixel fill-in processes may be utilized.
  • Embodiments in accordance with the present invention may be implemented by a digital still camera, a video camera, a mobile phone, a smart phone, and the like.
  • FIG. 9 and the following discussion are intended to provide a brief, general description of a suitable operating environment 900 in which various aspects of the disclosed subject matter may be implemented. While the invention is described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices, those skilled in the art will recognize that the disclosed subject matter can also be implemented in combination with other program modules and/or as a combination of hardware and software.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the operating environment 900 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the subject matter disclosed herein.
  • Other well known computer systems, environments, and/or configurations that may be suitable for use with the invention include but are not limited to, personal computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include the above systems or devices, and the like.
  • an exemplary environment 900 for implementing various aspects of the subject matter disclosed herein includes a computer 902 .
  • the computer 902 includes a processing unit 904 , a system memory 906 , and a system bus 908 .
  • the system bus 908 couples system components including, but not limited to, the system memory 906 to the processing unit 904 .
  • the processing unit 904 can be any of various available processors. Dual microprocessors and other multiprocessor architectures also can be employed as the processing unit 904 .
  • the system bus 908 can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 8-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), Universal Serial Bus (USB), Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), and Small Computer Systems Interface (SCSI).
  • the system memory 906 includes volatile memory 910 and nonvolatile memory 912 .
  • the basic input/output system (BIOS) containing the basic routines to transfer information between elements within the computer 902 , such as during start-up, is stored in nonvolatile memory 912 .
  • nonvolatile memory 912 can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable ROM (EEPROM), or flash memory.
  • Volatile memory 910 includes random access memory (RAM), which acts as external cache memory.
  • RAM is available in many forms such as synchronous RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
  • Disk storage 914 includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-100 drive, flash memory card, or memory stick.
  • disk storage 914 can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), CD recordable drive (CD-R Drive), CD rewritable drive (CD-RW Drive) or a digital versatile disk ROM drive (DVD-ROM).
  • To facilitate connection of disk storage 914 to the system bus 908, a removable or non-removable interface is typically used, such as interface 916.
  • FIG. 9 describes software that acts as an intermediary between users and the basic computer resources described in suitable operating environment 900 .
  • Such software includes an operating system 918 .
  • Operating system 918, which can be stored on disk storage 914, acts to control and allocate resources of the computer system 902.
  • System applications 920 take advantage of the management of resources by operating system 918 through program modules 922 and program data 924 stored either in system memory 906 or on disk storage 914 . It is to be appreciated that the subject matter disclosed herein can be implemented with various operating systems or combinations of operating systems.
  • Input devices 926 include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like.
  • These and other input devices connect to the processing unit 904 through the system bus 908 via interface port(s) 928 .
  • Interface port(s) 928 include, for example, a serial port, a parallel port, a game port, and a universal serial bus (USB).
  • Output device(s) 930 use some of the same type of ports as input device(s) 926 .
  • a USB port may be used to provide input to computer 902 and to output information from computer 902 to an output device 930 .
  • Output adapter 932 is provided to illustrate that there are some output devices 930 like monitors, speakers, and printers among other output devices 930 that require special adapters.
  • the output adapters 932 include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device 930 and the system bus 908 . It should be noted that other devices and/or systems of devices provide both input and output capabilities such as remote computer(s) 934 .
  • Computer 902 can operate in a networked environment using logical connections to one or more remote computers, such as remote computer(s) 934 .
  • the remote computer(s) 934 can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor based appliance, a peer device or other common network node and the like, and typically includes many or all of the elements described relative to computer 902 .
  • only a memory storage device 936 is illustrated with remote computer(s) 934 .
  • Remote computer(s) 934 is logically connected to computer 902 through a network interface 938 and then physically connected via communication connection 940 .
  • Network interface 938 encompasses communication networks such as local-area networks (LAN) and wide-area networks (WAN).
  • LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5 and the like.
  • WAN technologies include, but are not limited to, point-to-point links, circuit switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet switching networks, and Digital Subscriber Lines (DSL).
  • Communication connection(s) 940 refers to the hardware/software employed to connect the network interface 938 to the bus 908 . While communication connection 940 is shown for illustrative clarity inside computer 902 , it can also be external to computer 902 .
  • the hardware/software necessary for connection to the network interface 938 includes, for exemplary purposes only, internal and external technologies such as, modems including regular telephone grade modems, cable modems and DSL modems, ISDN adapters, and Ethernet cards.
  • the various techniques described herein may be implemented with hardware or software or, where appropriate, with a combination of both.
  • the methods and apparatus of the disclosed embodiments, or certain aspects or portions thereof may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
  • the computer will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device and at least one output device.
  • One or more programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system.
  • the program(s) can be implemented in assembly or machine language, if desired.
  • the language may be a compiled or interpreted language, and combined with hardware implementations.
  • the described methods and apparatus may also be embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as an EPROM, a gate array, a programmable logic device (PLD), a client computer, a video recorder or the like, the machine becomes an apparatus for practicing the invention.
  • the program code When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates to perform the processing of the present invention.

Abstract

Methods, systems, and computer program products for generating stereoscopic content via depth map creation are disclosed herein. According to one aspect, a method includes receiving a plurality of images of a scene captured at different focal planes. The method can also include identifying a plurality of portions of the scene in each captured image. Further, the method can include determining an in-focus depth of each portion based on the captured images for generating a depth map for the scene. Further, the method can include identifying the captured image in which the intended subject is found to be in focus as one image of a stereoscopic image pair, and generating the other image of the pair based on that image and the depth map.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. provisional patent application No. 61/230,138, filed Jul. 31, 2009, the disclosure of which is incorporated herein by reference in its entirety. The disclosures of the following U.S. provisional patent applications, commonly owned and simultaneously filed Jul. 31, 2009, are all incorporated by reference in their entirety: U.S. provisional patent application No. 61/230,131; and U.S. provisional patent application No. 61/230,133.
  • TECHNICAL FIELD
  • The subject matter disclosed herein relates to generating three-dimensional images. In particular, the subject matter disclosed herein relates to methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation.
  • BACKGROUND
  • Stereoscopic, or three-dimensional, imagery is based on the principle of human vision. Two separate detectors detect the same object or objects in a scene from slightly different angles and project them onto two planes. The resulting images are transferred to a processor which combines them and gives the perception of the third dimension, i.e. depth, to a scene.
  • Many techniques of viewing stereoscopic images have been developed and include the use of colored or polarizing filters to separate the two images, temporal selection by successive transmission of images using a shutter arrangement, or physical separation of the images in the viewer and projecting them separately to each eye. In addition, display devices have been developed recently that are well-suited for displaying stereoscopic images. For example, such display devices include digital cameras, personal computers, digital picture frames, high-definition televisions (HDTVs), and the like.
  • The use of digital image capture devices, such as digital still cameras, digital camcorders (or video cameras), and phones with built-in cameras, for use in capturing digital images has become widespread and popular. Because images captured using these devices are in a digital format, the images can be easily distributed and edited. For example, the digital images can be easily distributed over networks, such as the Internet. In addition, the digital images can be edited by use of suitable software on the image capture device or a personal computer.
  • Digital images captured using conventional image capture devices are two-dimensional. It is desirable to provide methods and systems for using conventional devices for generating three-dimensional images. In addition, it is desirable to provide methods and systems for aiding users of image capture devices to select appropriate image capture positions for capturing two-dimensional images for use in generating three-dimensional images. Further, it is desirable to provide methods and systems for altering the depth perceived in three-dimensional images.
  • SUMMARY
  • Methods, systems, and computer program products for generating stereoscopic content via depth map creation are disclosed herein. According to one aspect, a method includes receiving a plurality of images of a scene captured at different focal planes. The method can also include identifying a plurality of portions of the scene in each captured image. Further, the method can include determining an in-focus depth of each portion based on the captured images for generating a depth map for the scene. Further, the method can include identifying the captured image in which the intended subject is found to be in focus as one image of a stereoscopic image pair, and generating the other image of the pair based on that image and the depth map.
  • According to another aspect, a method for generating a stereoscopic image pair by altering a depth map can include receiving an image of a scene. The method can also include receiving a depth map associated with at least one captured image of the scene. The depth map can define depths for each of a plurality of portions of at least one captured image. Further, the method can include receiving user input for changing, in the depth map, the depth of at least one portion of at least one captured image. The method can also include generating a stereoscopic image pair of the scene based on the received image of the scene and the changed depth map.
  • According to an aspect, a system for generating a three-dimensional image of a scene is disclosed. The system may include at least one computer processor and memory configured to: receive a plurality of images of a scene captured at different focal planes; identify a plurality of portions of the scene in each captured image; determine an in-focus depth of each portion based on the captured images for generating a depth map for the scene; identify the captured image where the intended subject is found to be in focus as being one of the images of a stereoscopic image pair; and generate the other image of the stereoscopic image pair based on the identified captured image and the depth map.
  • According to another aspect, the computer processor and memory are configured to: scan a plurality of focal planes ranging from zero to infinity; and capture a plurality of images, each at a different focal plane.
  • According to another aspect, the system includes an image capture device for capturing the plurality of images.
  • According to another aspect, the image capture device comprises at least one of a digital still camera, a video camera, a mobile phone, and a smart phone.
  • According to another aspect, the computer processor and memory are configured to: filter the portions of the scene for generating a filtered image; apply thresholded edge detection to the filtered image; and determine whether each filtered portion is in focus based on the applied thresholded edge detection.
  • According to another aspect, the computer processor and memory are configured to: identify at least one object in each captured image; and generate a depth map for the at least one object.
  • According to another aspect, the at least one object is a target subject. The computer processor and memory are configured to determine one of the captured images having the highest contrast based on the target subject.
  • According to another aspect, the computer processor and memory are configured to generate the other image of the stereoscopic pair based on translation and perspective projection.
  • According to another aspect, the computer processor and memory are configured to generate a three-dimensional image of the scene using the stereoscopic image pairs.
  • According to another aspect, the computer processor and memory are configured to implement one or more of registration, rectification, color correction, matching edges of the pair of images, transformation, depth adjustment, motion detection, and removal of moving objects.
  • According to another aspect, the computer processor and memory are configured to display the three-dimensional image on a suitable three-dimensional image display.
  • According to another aspect, the computer processor and memory are configured to display the three-dimensional image on one of a digital still camera, a computer, a video camera, a digital picture frame, a set-top box, and a high-definition television.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing summary, as well as the following detailed description of preferred embodiments, is better understood when read in conjunction with the appended drawings. For the purposes of illustration, there is shown in the drawings exemplary embodiments; however, the invention is not limited to the specific methods and instrumentalities disclosed. In the drawings:
  • FIG. 1 is a block diagram of an exemplary device for creating three-dimensional images of a scene according to embodiments of the present invention;
  • FIG. 2 is a flow chart of an exemplary method for generating a stereoscopic image pair of a scene using a depth map and the device shown in FIG. 1, alone or together with any other suitable device described herein, in accordance with embodiments of the present invention;
  • FIGS. 3A and 3B are a flow chart of an exemplary method of a sharpness/focus analysis procedure in accordance with embodiments of the present invention;
  • FIG. 4 is a schematic diagram of an image-capture, "focus scan" procedure, which facilitates later conversion to stereoscopic images, and an associated table according to embodiments of the present invention;
  • FIG. 5 illustrates several exemplary images related to sharpness/focus analysis with optional image segmentation according to embodiments of the present invention;
  • FIG. 6 illustrates schematic diagrams showing close and medium-distance convergence points according to embodiments of the present invention;
  • FIG. 7 is a schematic diagram showing a translational offset determination technique according to embodiments of the present invention;
  • FIG. 8 is a schematic diagram showing pixel repositioning via perspective projection with translation according to embodiments of the present invention; and
  • FIG. 9 illustrates an exemplary environment for implementing various aspects of the subject matter disclosed herein.
  • DETAILED DESCRIPTION
  • The subject matter of the present invention is described with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or elements similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the term “step” may be used herein to connote different aspects of methods employed, the term should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.
  • The present invention includes various embodiments for the creation and/or alteration of a depth map for an image using a digital still camera or other suitable device as described herein. Using the depth map for the image, a stereoscopic image pair and its associated depth map may be rendered. These processes may be implemented by a device such as a digital camera or any other suitable image processing device.
  • FIG. 1 illustrates a block diagram of an exemplary device 100 for generating three-dimensional images or a stereoscopic image pair of a scene using a depth map according to embodiments of the present invention. In this example, device 100 is a digital camera capable of capturing several consecutive, still digital images of a scene. In another example, the device 100 may be a video camera capable of capturing a video sequence including multiple still images of a scene. The device may generate a stereoscopic image pair using a depth map as described in further detail herein. A user of the device 100 may position the camera in different positions for capturing images of different perspective views of a scene. The captured images may be suitably stored, analyzed and processed for generating three-dimensional images using a depth map as described herein. For example, subsequent to capturing the images of the different perspective views of the scene, the device 100, alone or in combination with a computer, may use the images for generating a three-dimensional image of the scene and for displaying the three-dimensional image to the user.
  • Referring to FIG. 1, the device 100 includes a sensor array 102 of charge coupled device (CCD) sensors or CMOS sensors which may be exposed to a scene through a lens and exposure control mechanism as understood by those of skill in the art. The device 100 may also include analog and digital circuitry such as, but not limited to, a memory 104 for storing program instruction sequences that control the device 100, together with a CPU 106, in accordance with embodiments of the present invention. The CPU 106 executes the program instruction sequences so as to cause the device 100 to expose the sensor array 102 to a scene and derive a digital image corresponding to the scene. The digital image may be stored in the memory 104. All or a portion of the memory 104 may be removable, so as to facilitate transfer of the digital image to other devices such as a computer 108. Further, the device 100 may be provided with an input/output (I/O) interface 110 so as to facilitate transfer of digital image even if the memory 104 is not removable. The device 100 may also include a display 112 controllable by the CPU 106 and operable to display the images for viewing by a user.
  • The memory 104 and the CPU 106 may be operable together to implement an image generator function 114 for generating three-dimensional images of a scene using a depth map in accordance with embodiments of the present invention. The image generator function 114 may generate a three-dimensional image of a scene using two or more images of the scene captured by the device 100. FIG. 2 illustrates a flow chart of an exemplary method for generating a stereoscopic image pair of a scene using a depth map and the device 100, alone or together with any other suitable device, in accordance with embodiments of the present invention. Referring to FIG. 2, the method includes receiving 200 a plurality of images of a scene captured at different focal points. For example, all or a portion of a focal plane from zero to infinity may be scanned and all images captured during the scanning process may be stored. The sensor array 102 may be used for capturing still images of the scene.
  • The method includes identifying 202 a plurality of portions of the scene in each captured image. For example, objects in each captured image can be identified and segmented to concentrate focus analysis on specific objects in the scene. A focus map, as described in more detail herein, may be generated and used for approximating the depth of image segments. Using the focus map, an in-focus depth of each portion may be determined 204 based on the captured images for generating a depth map for the scene.
  • The method uses the image where the intended subject is found to be in focus by the camera (as per normal camera focus operation) as the first image of the stereoscopic pair. The other image of the stereoscopic image pair is then generated 206 based on the first image and the depth map.
  • Generating a Stereoscopic Image Pair of a Scene Using a Depth Map
  • A method in accordance with embodiments of the present invention for generating a stereoscopic image pair of a scene using a depth map may be applied during image capture and may utilize camera, focus, and optics information for estimating the depth of each pixel in the image scene. The technique utilizes the concept of depth of field (or similarly, the circle of confusion) and relies upon fast capture and evaluation of a plurality of images while adjusting the lens focus from near field to infinity, before refocusing to capture the intended focused image. FIGS. 3A and 3B are a flow chart of an exemplary method of a sharpness/focus analysis procedure in accordance with embodiments of the present invention. Referring to FIGS. 3A and 3B, the method may begin when the camera enters a stereoscopic mode (step 300).
  • The method of FIGS. 3A and 3B includes scanning the entire focal plane from zero to infinity and storing all images during the scanning process (step 302). For example, when the user activates the focus process for the camera (e.g., by pressing the shutter button half-way or fully), the camera may immediately begin to capture multiple images across the full range of focus for the lens (termed a focus scan, herein), as shown in the example of FIG. 4. As indicated in the table shown in FIG. 4, each image capture at a given increment of the focus distance of the lens may result in a specific Depth of Field (area of image sharpness that encompasses a range of distance from the user) for the scene, with the distance of the sharply focused objects from the user increasing as the focus distance of the lens increases. Once the focus distance reaches the hyperfocal distance of the lens/camera combination, the far end of the Depth of Field will be "infinite," indicating that all objects beyond the near end of the depth will be sharply in focus. To reduce complexity of the evaluation of this plurality of images, if necessary, each may be down-scaled to a reduced resolution before subsequent processing.
  • The method of FIGS. 3A and 3B may optionally include executing image segmentation to break the image into multiple objects (step 304). For example, each captured image from this sequence may be divided into N×M blocks, and each block (or the full image) may be high-pass filtered. An edge detection operation may be performed on each of the resultant images, and each image may be converted to black and white, using a threshold of T (e.g., T>=0.75*max(filtered image)). Finally, a black and white fill operation may be performed to fill the areas between edges, and an "in-focus" pixel map for each image may be obtained. Optionally, objects in the image may be segmented to concentrate the focus analysis on specific objects in the scene. An example is illustrated in FIG. 5, which shows several exemplary images related to sharpness/focus analysis with optional image segmentation according to embodiments of the present invention.
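  • A minimal sketch of this per-image analysis (Python with NumPy/SciPy; the specific high-pass and edge kernels are assumptions, since the text does not name them):

      import numpy as np
      from scipy import ndimage

      def in_focus_map(img, thresh_frac=0.75):
          # High-pass filter the image (here: subtract a Gaussian low-pass).
          highpass = img - ndimage.gaussian_filter(img, sigma=3)
          # Edge detection on the filtered image (gradient magnitude).
          gx = ndimage.sobel(highpass, axis=1)
          gy = ndimage.sobel(highpass, axis=0)
          edges = np.hypot(gx, gy)
          # Convert to black and white using T >= 0.75 * max(filtered image).
          bw = edges >= thresh_frac * edges.max()
          # Fill the areas between edges to obtain the "in-focus" pixel map.
          return ndimage.binary_fill_holes(bw)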
  • If object segmentation is performed, each N×M block may be further subdivided into n×m sized sub-blocks corresponding to portions of a given segmented object (step 306). In each sub-block, the images for which the pixels are deemed by the procedure above to be “in-focus” may be analyzed for those pixels to identify in which of the candidate images the local contrast is at its highest level (step 308). This process can continue hierarchically for smaller sub-blocks as needed. The nearest focus distance at which a given pixel is deemed “in focus,” the farthest distance at which it is “in focus,” and the distance at which it is optimally “in focus,” as indicated by the highest local contrast for that pixel, may be recorded in a “focus map.”
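  • The per-pixel bookkeeping for the focus map may be sketched as follows (Python; local contrast is assumed to be supplied as one contrast map per capture, and pixels never found in focus are left as infinite sentinels):

      import numpy as np

      def build_focus_map(in_focus_maps, contrast_maps, focus_distances):
          # in_focus_maps: K boolean maps, one per capture of the focus scan.
          # contrast_maps: K local-contrast maps; focus_distances: K lens distances.
          stack = np.stack(in_focus_maps)                    # (K, H, W)
          dists = np.asarray(focus_distances, float)[:, None, None]
          d_s = np.where(stack, dists, np.inf).min(axis=0)   # nearest "in focus"
          d_l = np.where(stack, dists, -np.inf).max(axis=0)  # farthest "in focus"
          # Optimum: the in-focus capture with the highest local contrast.
          c = np.where(stack, np.stack(contrast_maps), -np.inf)
          d_c = np.asarray(focus_distances, float)[c.argmax(axis=0)]
          return d_s, d_l, d_c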
  • Given the focus map for the pixels in an image, an approximate depth for those pixels can be calculated. For a given combination of image (camera) format circle of confusion, c, f-stop (aperture), N, and focal length, F, the hyperfocal distance (the nearest distance at which the depth of field extends to infinity) of the combination can be approximated as follows:
  • H ≈ F^2 / (N * c).
  • In turn, the near field depth of field (Dn) for an image for a given focus distance, d, can be approximated as follows:
  • D n H * d ( H + d )
  • (for moderate to large d), and the far field DOF (Df) as follows:
  • $D_f \approx \frac{H \cdot d}{H - d}$
  • for d<H. For d>=H, the far end depth of field becomes infinite, and only the near end depth of field value is informative.
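  • These two relationships translate directly into code; the minimal sketch below keeps all lengths in the same unit. For example, hyperfocal(50, 8, 0.03) (a 50 mm lens at f/8 with c = 0.03 mm) gives an H of roughly 10.4 m:

```python
def hyperfocal(F, N, c):
    """H ~= F^2 / (N * c), for focal length F and circle of confusion c
    in the same unit (e.g., mm) and f-stop N."""
    return F * F / (N * c)

def depth_of_field(H, d):
    """Near (Dn) and far (Df) depth-of-field limits at focus distance d;
    the far limit becomes infinite once d reaches the hyperfocal H."""
    Dn = H * d / (H + d)
    Df = H * d / (H - d) if d < H else float("inf")
    return Dn, Df
```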
  • Using the values in the focus map, these relationships can be combined to build a depth map for the captured image (step 310). For example, for a given pixel (P) the focus map contains the value for the shortest focus distance at which the pixel is in focus, ds(P), the longest distance, dl(P), and the optimum contrast distance, dc(P). Using these values, one can approximate that the closest possible distance for the pixel is given by the following equation:
  • $D_{ns}(P) \approx \frac{H \cdot d_s(P)}{H + d_s(P)}$,
  • and the furthest distance (again, remembering that for a given focus distance, di, if di>=H, the associated value of Df will be infinite) is given by the following equation:
  • $D_{fl}(P) \approx \frac{H \cdot d_l(P)}{H - d_l(P)}$,
  • and the optimum distance lies between the value given by the equation,
  • $D_{nc}(P) \approx \frac{H \cdot d_c(P)}{H + d_c(P)}$,
  • and the equation,
  • $D_{fc}(P) \approx \frac{H \cdot d_c(P)}{H - d_c(P)}$.
  • Further, it is known that for the focus distances ds(P) and dl(P),
  • $D_{fs}(P) \approx \frac{H \cdot d_s(P)}{H - d_s(P)}$
  • and
  • $D_{nl}(P) \approx \frac{H \cdot d_l(P)}{H + d_l(P)}$.
  • Given these values, a depth for each pixel, Dp, can be approximated as follows:
  • $$D_p = \begin{cases}
[\max(D_{ns}(P), D_{nl}(P), D_{nc}(P)) + \min(D_{fs}(P), D_{fl}(P), D_{fc}(P))]/2, & D_{nl}(P) < D_{fc}(P) \text{ and } D_{fs}(P) > D_{nc}(P) \\
[\max(D_{ns}(P), D_{nl}(P), D_{nc}(P)) + \min(D_{fl}(P), D_{fc}(P))]/2, & D_{nl}(P) < D_{fc}(P) \text{ and } D_{fs}(P) \le D_{nc}(P) \\
[\max(D_{ns}(P), D_{nc}(P)) + \min(D_{fs}(P), D_{fl}(P), D_{fc}(P))]/2, & D_{nl}(P) \ge D_{fc}(P) \text{ and } D_{fs}(P) > D_{nc}(P) \\
[\max(D_{ns}(P), D_{nl}(P), D_{nc}(P)) + \min(D_{fl}(P), D_{fc}(P))]/2, & D_{nl}(P) \ge D_{fc}(P) \text{ and } D_{fs}(P) \le D_{nc}(P)
\end{cases}$$
  • if any of the Df(P) values are non-infinite. In the case that all Df(P) values are infinite, Dp is instead approximated as

  • $D_p = [\max(D_{ns}(P), D_{nc}(P)) + \min(D_{nl}(P), D_{nc}(P))]/2$.
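  • The conditional above reduces to a short scalar function; the sketch below takes the six depth-of-field bounds for one pixel (computed from ds(P), dl(P), and dc(P) via the relationships above, with infinite far-field values represented as float infinities):

```python
import math

def pixel_depth(Dns, Dfs, Dnl, Dfl, Dnc, Dfc):
    """Approximate depth Dp for one pixel from its depth-of-field bounds."""
    if all(math.isinf(v) for v in (Dfs, Dfl, Dfc)):
        # All far-field values infinite: fall back to the near-field bounds
        return (max(Dns, Dnc) + min(Dnl, Dnc)) / 2
    if Dnl < Dfc:
        if Dfs > Dnc:
            return (max(Dns, Dnl, Dnc) + min(Dfs, Dfl, Dfc)) / 2
        return (max(Dns, Dnl, Dnc) + min(Dfl, Dfc)) / 2
    if Dfs > Dnc:
        return (max(Dns, Dnc) + min(Dfs, Dfl, Dfc)) / 2
    return (max(Dns, Dnl, Dnc) + min(Dfl, Dfc)) / 2
```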
  • The method of FIGS. 3A and 3B includes assigning the left eye image to be the image where the intended subject is found to be in focus by the camera (step 312). Based on the depth map and the left eye image, the right eye image may be generated by translation and perspective projection (step 314). A dual-image process may also be implemented (step 316). The selected left and right eye images may be labeled as a stereoscopic image pair (step 318).
  • Altering a Depth Map for Generating a Stereoscopic Image Pair
  • A method in accordance with embodiments of the present invention for altering a depth map for generating a stereoscopic image pair may be applicable either pre- or post-capture. Touchscreen technology may be used in this method. Touchscreen technology has become increasingly common, and with it, applications such as touchscreen user-directed focus for digital cameras (encompassing both digital still camera and cellphone camera units) have emerged. Using this technology, a touchscreen interface may be used for specifying the depth of objects in a two-dimensional image capture. Either pre- or post-capture, the image field may be displayed in the live view LCD window, which also functions as a touchscreen interface. A user may touch and highlight the window area for which he or she wishes to change the depth, and subsequently use a right/left (or similar) brushing gesture to indicate an increased or decreased (respectively) depth of the object(s) at the point of the touchscreen highlight. Alternatively, depth can be specified by a user by use of any suitable input device or component, such as, for example, a keyboard, a mouse, or the like.
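  • A minimal sketch of the depth-map edit itself appears below; how the highlighted region and brushing gesture are decoded is device-specific, so the region mask, direction flag, and step size here are all illustrative assumptions:

```python
import numpy as np

def adjust_depth(depth_map, region_mask, direction, step=0.1):
    """Scale the depth of a user-highlighted region of the depth map;
    direction is +1 for a rightward brush (increase depth) and -1 for
    a leftward brush (decrease depth)."""
    out = np.array(depth_map, dtype=float, copy=True)
    out[region_mask] *= 1.0 + step * direction
    return out
```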
  • Embodiments of the present invention are applicable pre-capture, while composing the picture, or alternatively can be used post-capture to create or enhance the depth of objects in an eventual stereoscopic image, optionally in conjunction with the depth map creation techniques described above. Used together with those techniques, the depth-altering method can provide selective artistic enhancements by the user; used stand-alone, it can be the means of creating a relative depth map for the picture, allowing the user to create a depth effect only for the objects he or she feels are of import.
  • Once an image view and depth map are available using the techniques above, rendering of the stereoscopic image pair may occur.
  • For any stereoscopic image, there is an overlapping field of view from the left and right eyes that defines the stereoscopic image. At the point of convergence of the eyes, the disparity of an object between the two views will be zero, i.e. no parallax. This defines the “screen point” when viewing the stereoscopic pair. Objects in front of the screen and behind the screen will have increasing amounts of parallax disparity as the distance from the screen increases (negative parallax for objects in front of the screen, positive parallax for objects behind the screen).
  • In stereoscopic viewing, the central point of the overlapping field of view of the two eyes on the screen plane (zero parallax depth) defines a circle that passes through each eye, with a radius, R, equal to the distance to the convergence point. Moreover, the angle, θ, between the vectors from the central convergence point to each of the two eyes can be measured. Examples for varying convergence points are described herein below.
  • Medium distance convergence gives a relatively small angular change, while close convergence gives a relatively large angular change. FIG. 6 illustrates schematic diagrams showing an example of close and medium-distance convergence points according to embodiments of the present invention.
  • The convergence point is chosen as the center pixel of the image on the screen plane. It should be noted that this may be an imaginary point, as the center pixel of the image may not be at a depth that is on the screen plane, and hence, the depth of that center pixel can be approximated. This value (Dfocus) is approximated to be 10-30% behind the near end depth of field distance for the final captured image, and is approximated by the equation:
  • $D_{focus} \approx \mathrm{Screen} \cdot \mathrm{scale} \cdot \frac{H \cdot d_{focus}}{H + d_{focus}}$,
  • where dfocus is the focus distance of the lens for the final capture of the image, “Screen” is a value between 1.1 and 1.3, representing the placement of the screen plane behind the near end depth of field, and “scale” represents any scaled adjustment of that depth by the user utilizing the touchscreen interface.
  • The angle, θ, is dependent upon the estimated distance of focus and the modeled stereo baseline of the image pair to be created. Hence, θ may be estimated as follows:
  • $\theta = 2 \sin^{-1}\!\left(\frac{\mathrm{Baseline}}{2 \cdot D_{focus}}\right)$
  • for Dfocus calculated in centimeters. Typically, θ would be modeled as at most 2 degrees.
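  • Both quantities are straightforward to compute; the sketch below assumes centimeters throughout, and a 6.5 cm modeled baseline converged at 200 cm yields roughly 1.9 degrees, consistent with the 2-degree ceiling noted above:

```python
import math

def screen_plane_depth(H, d_focus, screen=1.2, scale=1.0):
    """Dfocus ~= Screen * scale * H * d / (H + d), with "Screen" in
    [1.1, 1.3] and "scale" any user depth adjustment."""
    return screen * scale * H * d_focus / (H + d_focus)

def convergence_angle_deg(baseline_cm, d_focus_cm):
    """theta = 2 * asin(Baseline / (2 * Dfocus)), returned in degrees."""
    return math.degrees(2.0 * math.asin(baseline_cm / (2.0 * d_focus_cm)))
```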
  • In addition to the rotational element in the Z plane, there can also be an X axis translational shift between views. Since no toe-in should occur for the image captures, as would be the case for operation of the eyes, there can be horizontal (X axis) displacement at the screen plane for the two images at the time of capture. For example, FIG. 7 illustrates a schematic diagram showing a translational offset determination technique according to embodiments of the present invention. For a given pixel, P, at a depth Dp, the X axis (horizontal) displacement, S, is calculated using the angle of view, V, for the capture. The angle of view is given by the following equation:
  • $V = 2 \tan^{-1}\!\left(\frac{W}{2 \cdot F}\right)$
  • for the width of the image sensor, W, and the focal length, F.
  • Depth Dp has been approximated for each pixel in the image, and is available from the depth map. It should be noted that the calculations that follow for a given pixel depth, Dp, may be imperfect, since each pixel is not centrally located between the two eye views; however, the approximation is sufficient for the goal of producing a stereoscopic effect. Hence, knowing V and the depth, Dp, of a given pixel, the approximate width of the field of view (WoV) may be represented as follows:
  • $WoV = \frac{D_p \cdot W}{F}$.
  • Hence, if the stereo baseline is estimated, the translational offset in pixels, S, for displacement on the X axis to the left (assuming, without loss of generality, that the right image is generated from the left) is given by the following equation:
  • $S = \frac{P_W}{WoV} \cdot \mathrm{StereoBaseline} = \frac{2 \cdot P_W \cdot F}{W} \cdot \sin\frac{\theta}{2}$,
  • for PW, the image width in pixels. Since W, F, and PW are camera-specific quantities, the only quantity left to specify is the modeled convergence angle, θ, which as noted is typically 1-2 degrees.
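  • In code, the offset is a one-liner; the sensor width and focal length must share a unit, and the angle is the modeled θ from above:

```python
import math

def pixel_shift(P_W, W, F, theta_deg):
    """Horizontal offset S in pixels: S = 2 * P_W * F * sin(theta/2) / W."""
    return 2.0 * P_W * F * math.sin(math.radians(theta_deg) / 2.0) / W
```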
  • For each pixel, p, in the image, knowing its (xp, yp) coordinates, pixel depth Dp, pixel X-axis displacement S, and the angle θ, a perspective projective transform can be defined to generate a right eye image from the single “left eye” image. A perspective projective transform is defined as having an aspect of translation (defined by S), rotation in the x/y plane (which will be zero for this case), rotation in the y/z plane (again, zero for this case), and rotation in the x/z plane, which will be defined by the angle θ. For example, the transform may be defined as follows:
  • $$\begin{bmatrix} Dx_p \\ Dy_p \\ Dz_p \end{bmatrix} = \begin{bmatrix} \cos(-\theta) & 0 & -\sin(-\theta) \\ 0 & 1 & 0 \\ \sin(-\theta) & 0 & \cos(-\theta) \end{bmatrix} \times \begin{bmatrix} x_p - S \\ y_p \\ D_p \end{bmatrix},$$
  • where (Dxp, Dyp, Dzp) are 3D coordinate points resulting from the transform that can be projected onto a two dimensional image plane, which may be defined as follows:
  • $$x_p' = (Dx_p - E_x) \cdot \frac{E_z}{Dz_p}, \qquad y_p' = (Dy_p - E_y) \cdot \frac{E_z}{Dz_p},$$
  • where Ex, Ey, and Ez are the coordinates of the viewer relative to the screen, and can be estimated for a given target display device. Ex and Ey can be assumed to be, but are not limited to, 0. The pixels defined by (xp′, yp′) make up the right image view for the new stereoscopic image pair.
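  • A per-pixel sketch of this transform and projection follows; it returns the transformed depth Dzp as well, since that value resolves collisions when two pixels project to the same coordinates (see the next paragraph):

```python
import math
import numpy as np

def project_pixel(x_p, y_p, D_p, S, theta_deg, Ez, Ex=0.0, Ey=0.0):
    """Rotate the translated point (x_p - S, y_p, D_p) by -theta about
    the y axis, then project onto the image plane for a viewer at
    (Ex, Ey, Ez)."""
    t = math.radians(-theta_deg)
    R = np.array([[math.cos(t), 0.0, -math.sin(t)],
                  [0.0,         1.0,  0.0],
                  [math.sin(t), 0.0,  math.cos(t)]])
    Dx, Dy, Dz = R @ np.array([x_p - S, y_p, D_p])
    x_new = (Dx - Ex) * (Ez / Dz)   # projection onto the 2D image plane
    y_new = (Dy - Ey) * (Ez / Dz)
    return x_new, y_new, Dz         # Dz kept for occlusion resolution
```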
  • Following the calculation of (xp′, yp′) for each pixel, some pixels may map to the same coordinates. The choice of which is in view is made by using the Dzp values of the two pixels, after the initial transform but prior to the projection onto two-dimensional image space, with the lowest value displayed. An example of the pixel manipulations that occur in the course of the transform is shown in FIG. 8, which illustrates a schematic diagram showing pixel repositioning via perspective projection with translation according to embodiments of the present invention.
  • Similarly, there may be points in the image for which no pixel maps. This can be addressed with pixel fill-in and/or cropping. A simple exemplary pixel fill-in process that may be utilized in the present invention assumes a linear gradient between points on each horizontal row in the image. For points on the same row, n, without defined pixel values between two defined points (xi, yn) and (xj, yn), the fill-in process first determines the distance, which may be defined as follows:

  • $d = j - i - 1$,
  • and then proceeds to determine an interpolated gradient between the two pixel positions to fill in the missing values. For simplicity of implementation, the interpolation may be performed on a power of two, meaning that the interpolation will produce 1, 2, 4, 8, 16, etc. pixels as needed between the two defined pixels. Pixel regions that are not a power of two are mapped to the closest power of two, and either pixel repetition or truncation of the sequence is applied to fit. As an example, if j=14 and i=6, then d=7, and the intermediate pixel gradient is calculated as follows:
  • $p_1 = \frac{7}{8}(x_6, y_n) + \frac{1}{8}(x_{14}, y_n)$
  • $p_2 = \frac{6}{8}(x_6, y_n) + \frac{2}{8}(x_{14}, y_n)$
  • $p_3 = \frac{5}{8}(x_6, y_n) + \frac{3}{8}(x_{14}, y_n)$
  • $p_4 = \frac{4}{8}(x_6, y_n) + \frac{4}{8}(x_{14}, y_n)$
  • $p_5 = \frac{3}{8}(x_6, y_n) + \frac{5}{8}(x_{14}, y_n)$
  • $p_6 = \frac{2}{8}(x_6, y_n) + \frac{6}{8}(x_{14}, y_n)$
  • $p_7 = \frac{1}{8}(x_6, y_n) + \frac{7}{8}(x_{14}, y_n)$
  • $p_8 = (x_{14}, y_n)$.
  • Since only 7 values are needed, p8 would go unused in this case, such that the following assignments can be made:
      • (x7, yn)=p1
      • (x8, yn)=p2
      • (x9, yn)=p3
      • (x10, yn)=p4
      • (x11, yn)=p5
      • (x12, yn)=p6
      • (x13, yn)=p7.
  • This process may repeat for each line in the image following the perspective projective transformation. The resultant image may be combined with the initial image capture to create a stereo image pair that may be rendered for 3D viewing via stereo registration and display. Other, more complex and potentially more accurate pixel fill-in processes may be utilized.
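  • A compact sketch of this simple fill-in is given below for a single gap on one row; it rounds the gap up to the next power of two so that truncation always suffices (mapping to the closest power with pixel repetition, as mentioned above, is an equivalent variant), and the worked example of i=6, j=14 reproduces p1 through p7:

```python
def fill_row_gap(row, i, j):
    """Fill the undefined pixels between defined columns i and j of one
    image row with a power-of-two linear gradient."""
    d = j - i - 1                          # number of missing pixels
    if d <= 0:
        return row
    steps = 1 << (d - 1).bit_length()      # e.g., d = 7 -> 8 interpolants
    grad = [((steps - k) * row[i] + k * row[j]) / steps
            for k in range(1, steps + 1)]  # p1 .. p_steps
    row[i + 1:j] = grad[:d]                # truncate the unused tail (p8 here)
    return row
```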
  • Embodiments in accordance with the present invention may be implemented by a digital still camera, a video camera, a mobile phone, a smart phone, and the like. In order to provide additional context for various aspects of the disclosed invention, FIG. 9 and the following discussion are intended to provide a brief, general description of a suitable operating environment 900 in which various aspects of the disclosed subject matter may be implemented. While the invention is described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices, those skilled in the art will recognize that the disclosed subject matter can also be implemented in combination with other program modules and/or as a combination of hardware and software.
  • Generally, however, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular data types. The operating environment 900 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the subject matter disclosed herein. Other well-known computer systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include the above systems or devices, and the like.
  • With reference to FIG. 9, an exemplary environment 900 for implementing various aspects of the subject matter disclosed herein includes a computer 902. The computer 902 includes a processing unit 904, a system memory 906, and a system bus 908. The system bus 908 couples system components including, but not limited to, the system memory 906 to the processing unit 904. The processing unit 904 can be any of various available processors. Dual microprocessors and other multiprocessor architectures also can be employed as the processing unit 904.
  • The system bus 908 can be any of several types of bus structure(s) including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus using any variety of available bus architectures including, but not limited to, 8-bit bus, Industrial Standard Architecture (ISA), Micro-Channel Architecture (MCA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), Universal Serial Bus (USB), Advanced Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), and Small Computer Systems Interface (SCSI).
  • The system memory 906 includes volatile memory 910 and nonvolatile memory 912. The basic input/output system (BIOS), containing the basic routines to transfer information between elements within the computer 902, such as during start-up, is stored in nonvolatile memory 912. By way of illustration, and not limitation, nonvolatile memory 912 can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable ROM (EEPROM), or flash memory. Volatile memory 910 includes random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms such as synchronous RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
  • Computer 902 also includes removable/nonremovable, volatile/nonvolatile computer storage media. FIG. 9 illustrates, for example a disk storage 914. Disk storage 914 includes, but is not limited to, devices like a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-100 drive, flash memory card, or memory stick. In addition, disk storage 914 can include storage media separately or in combination with other storage media including, but not limited to, an optical disk drive such as a compact disk ROM device (CD-ROM), CD recordable drive (CD-R Drive), CD rewritable drive (CD-RW Drive) or a digital versatile disk ROM drive (DVD-ROM). To facilitate connection of the disk storage devices 914 to the system bus 908, a removable or non-removable interface is typically used such as interface 916.
  • It is to be appreciated that FIG. 9 describes software that acts as an intermediary between users and the basic computer resources described in suitable operating environment 900. Such software includes an operating system 918. Operating system 918, which can be stored on disk storage 914, acts to control and allocate resources of the computer system 902. System applications 920 take advantage of the management of resources by operating system 918 through program modules 922 and program data 924 stored either in system memory 906 or on disk storage 914. It is to be appreciated that the subject matter disclosed herein can be implemented with various operating systems or combinations of operating systems.
  • A user enters commands or information into the computer 902 through input device(s) 926. Input devices 926 include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touch pad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processing unit 904 through the system bus 908 via interface port(s) 928. Interface port(s) 928 include, for example, a serial port, a parallel port, a game port, and a universal serial bus (USB). Output device(s) 930 use some of the same types of ports as input device(s) 926. Thus, for example, a USB port may be used to provide input to computer 902 and to output information from computer 902 to an output device 930. Output adapter 932 is provided to illustrate that there are some output devices 930 like monitors, speakers, and printers among other output devices 930 that require special adapters. The output adapters 932 include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device 930 and the system bus 908. It should be noted that other devices and/or systems of devices provide both input and output capabilities such as remote computer(s) 934.
  • Computer 902 can operate in a networked environment using logical connections to one or more remote computers, such as remote computer(s) 934. The remote computer(s) 934 can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor-based appliance, a peer device or other common network node and the like, and typically includes many or all of the elements described relative to computer 902. For purposes of brevity, only a memory storage device 936 is illustrated with remote computer(s) 934. Remote computer(s) 934 is logically connected to computer 902 through a network interface 938 and then physically connected via communication connection 940. Network interface 938 encompasses communication networks such as local-area networks (LAN) and wide-area networks (WAN). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5 and the like. WAN technologies include, but are not limited to, point-to-point links, circuit switching networks like Integrated Services Digital Networks (ISDN) and variations thereon, packet switching networks, and Digital Subscriber Lines (DSL).
  • Communication connection(s) 940 refers to the hardware/software employed to connect the network interface 938 to the bus 908. While communication connection 940 is shown for illustrative clarity inside computer 902, it can also be external to computer 902. The hardware/software necessary for connection to the network interface 938 includes, for exemplary purposes only, internal and external technologies such as, modems including regular telephone grade modems, cable modems and DSL modems, ISDN adapters, and Ethernet cards.
  • The various techniques described herein may be implemented with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the disclosed embodiments, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention. In the case of program code execution on programmable computers, the computer will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device and at least one output device. One or more programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
  • The described methods and apparatus may also be embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as an EPROM, a gate array, a programmable logic device (PLD), a client computer, a video recorder or the like, the machine becomes an apparatus for practicing the invention. When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates to perform the processing of the present invention.
  • While the embodiments have been described in connection with the preferred embodiments of the various figures, it is to be understood that other similar embodiments may be used or modifications and additions may be made to the described embodiment for performing the same function without deviating therefrom. Therefore, the disclosed embodiments should not be limited to any single embodiment, but rather should be construed in breadth and scope in accordance with the appended claims.

Claims (22)

1. A method for generating a stereoscopic image pair of a scene using a depth map, the method comprising:
receiving a plurality of images of a scene captured at different focal planes;
identifying a plurality of portions of the scene in each captured image;
determining an in-focus depth of each portion based on the captured images for generating a depth map for the scene;
identifying the captured image where the intended subject is found to be in focus as being one of the images of a stereoscopic image pair; and
generating the other image of the stereoscopic image pair based on the identified captured image and the depth map.
2. The method of claim 1 further comprising:
scanning a plurality of focal planes ranging from zero to infinity; and
capturing a plurality of images, each at a different focal plane.
3. The method of claim 2 comprising using an image capture device for capturing the plurality of images.
4. The method of claim 3 wherein the image capture device comprises at least one of a digital still camera, a video camera, a mobile phone, and a smart phone.
5. The method of claim 1 further comprising:
for each captured image:
filtering the portions of the scene for generating a filtered image;
applying thresholded edge detection to the filtered image; and
determining whether each filtered portion is in focus based on the applied thresholded edge detection.
6. The method of claim 5 further comprising:
identifying any in-focus objects in each captured image; and
generating a depth map value for each object.
7. The method of claim 6, wherein an object that is determined to be in focus for a sequence of images that is a subset of the full set of captured images is a target subject, and
wherein identifying one of the subset of images having a predetermined contrast comprises determining which of the subset of images has the highest local contrast based on the target subject.
8. The method of claim 1 wherein generating the other image of the stereoscopic image pair comprises generating the other image of the stereoscopic pair based on translation and perspective projection.
9. The method of claim 1 further comprising generating a three-dimensional image of the scene using the stereoscopic image pair.
10. The method of claim 9 wherein generating a three-dimensional image comprises one or more of registration, rectification, color correction, matching edges of the pair of images, transformation, depth adjustment, motion detection, and removal of moving objects.
11. The method of claim 9 further comprising displaying the three-dimensional image on a suitable three-dimensional image display.
12. The method of claim 11 wherein displaying the three-dimensional image comprises displaying the three-dimensional image on one of a digital still camera, a computer, a video camera, a digital picture frame, a set-top box, and a high-definition television.
13. A system for generating a three-dimensional image of a scene, the system comprising:
at least one computer processor and memory configured to:
receive a plurality of images of a scene captured at different focal planes;
identify a plurality of portions of the scene in each captured image;
determine an in-focus depth of each portion based on the captured images for generating a depth map for the scene;
identify the captured image where the intended subject is found to be in focus as being one of the images of a stereoscopic image pair; and
generate the other image of the stereoscopic image pair based on the identified captured image and the depth map.
14. A computer-readable storage medium having stored thereon computer executable instructions for performing the following steps:
receiving a plurality of images of a scene captured at different focal planes;
identifying a plurality of portions of the scene in each captured image;
determining an in-focus depth of each portion based on the captured images for generating a depth map for the scene;
identifying the captured image where the intended subject is found to be in focus as being one of the images of a stereoscopic image pair; and
generating the other image of the stereoscopic image pair based on the identified captured image and the depth map.
15. A method for generating a stereoscopic image pair by altering a depth map, the method comprising:
receiving an image of a scene;
receiving a depth map associated with at least one captured image of the scene, wherein the depth map defines depths for each of a plurality of portions of the at least one captured image;
receiving user input for changing, in the depth map, a depth of at least one portion of the at least one captured image; and
generating a stereoscopic image pair of the scene based on the received image of the scene and the changed depth map.
16. The method of claim 15 wherein receiving an image of a scene comprises receiving a plurality of images of a scene captured at different focal planes.
17. The method of claim 15 wherein receiving user input for changing a depth comprises receiving user input on a touchscreen display.
18. The method of claim 15 further comprising:
using the received image as one of the images of the stereoscopic image pair; and
generating the other image of the stereoscopic image pair based on the received image.
19. The method of claim 18 wherein generating the other image of the stereoscopic image pair comprises:
defining a perspective projective transform; and
using the perspective projective transform for determining pixel values of the other image of the stereoscopic image pair.
20. The method of claim 19 further comprising:
determining points in the other image of the stereoscopic image pair where no pixel maps; and
using one of a pixel fill-in technique and a cropping technique for generating pixel values where no pixel maps.
21. A system for generating a three-dimensional image of a scene, the system comprising:
at least one computer processor and memory configured to:
receive an image of a scene;
receive a depth map associated with at least one captured image of the scene, wherein the depth map defines depths for each of a plurality of portions of the at least one captured image;
receive user input for changing, in the depth map, a depth of at least one portion of the at least one captured image; and
generate a stereoscopic image pair of the scene based on the received image of the scene and the changed depth map.
22. A computer-readable storage medium having stored thereon computer executable instructions for performing the following steps:
receiving an image of a scene;
receiving a depth map associated with at least one captured image of the scene, wherein the depth map defines depths for each of a plurality of portions of the at least one captured image;
receiving user input for changing, in the depth map, a depth of at least one portion of the at least one captured image; and
generating a stereoscopic image pair of the scene based on the received image of the scene and the changed depth map.
US12/842,257 2009-07-31 2010-07-23 Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation Abandoned US20110025830A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/842,257 US20110025830A1 (en) 2009-07-31 2010-07-23 Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US23013109P 2009-07-31 2009-07-31
US23013309P 2009-07-31 2009-07-31
US23013809P 2009-07-31 2009-07-31
US12/842,257 US20110025830A1 (en) 2009-07-31 2010-07-23 Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation

Publications (1)

Publication Number Publication Date
US20110025830A1 true US20110025830A1 (en) 2011-02-03

Family

ID=43526625

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/842,171 Expired - Fee Related US8436893B2 (en) 2009-07-31 2010-07-23 Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3D) images
US12/842,257 Abandoned US20110025830A1 (en) 2009-07-31 2010-07-23 Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
US13/865,312 Expired - Fee Related US8810635B2 (en) 2009-07-31 2013-04-18 Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional images

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/842,171 Expired - Fee Related US8436893B2 (en) 2009-07-31 2010-07-23 Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3D) images

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/865,312 Expired - Fee Related US8810635B2 (en) 2009-07-31 2013-04-18 Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional images

Country Status (1)

Country Link
US (3) US8436893B2 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110025825A1 (en) * 2009-07-31 2011-02-03 3Dmedia Corporation Methods, systems, and computer-readable storage media for creating three-dimensional (3d) images of a scene
US20110025829A1 (en) * 2009-07-31 2011-02-03 3Dmedia Corporation Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3d) images
US20120082369A1 (en) * 2010-09-30 2012-04-05 Casio Computer Co., Ltd. Image composition apparatus, image retrieval method, and storage medium storing program
US20120110565A1 (en) * 2010-10-29 2012-05-03 Intuit Inc. Chained data processing and application utilization
US8274552B2 (en) 2010-12-27 2012-09-25 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US20130044254A1 (en) * 2011-08-18 2013-02-21 Meir Tzur Image capture for later refocusing or focus-manipulation
CN103049933A (en) * 2011-10-17 2013-04-17 联咏科技股份有限公司 Image processing device and method thereof
US20130107020A1 (en) * 2010-06-30 2013-05-02 Fujifilm Corporation Image capture device, non-transitory computer-readable storage medium, image capture method
US20130162780A1 (en) * 2010-09-22 2013-06-27 Fujifilm Corporation Stereoscopic imaging device and shading correction method
CN104169970A (en) * 2012-04-18 2014-11-26 索尼公司 Method and optical system for determining a depth map of an image
WO2015016619A1 (en) * 2013-07-31 2015-02-05 Samsung Electronics Co., Ltd. Electronic apparatus, method of controlling the same, and image reproducing apparatus and method
EP2852143A1 (en) 2013-09-18 2015-03-25 Nokia Corporation Creating a cinemagraph
US20150085076A1 (en) * 2013-09-24 2015-03-26 Amazon Techologies, Inc. Approaches for simulating three-dimensional views
US20150103192A1 (en) * 2013-10-14 2015-04-16 Qualcomm Incorporated Refocusable images
US9137520B2 (en) 2011-09-08 2015-09-15 Samsung Display Co., Ltd. Stereoscopic image display device and method of displaying stereoscopic image
US9185388B2 (en) 2010-11-03 2015-11-10 3Dmedia Corporation Methods, systems, and computer program products for creating three-dimensional video sequences
US9344701B2 (en) 2010-07-23 2016-05-17 3Dmedia Corporation Methods, systems, and computer-readable storage media for identifying a rough depth map in a scene and for determining a stereo-base distance for three-dimensional (3D) content creation
US9437038B1 (en) 2013-09-26 2016-09-06 Amazon Technologies, Inc. Simulating three-dimensional views using depth relationships among planes of content
US20160309141A1 (en) * 2015-04-15 2016-10-20 The Lightco Inc. Methods and apparatus for generating a sharp image
US20180088923A1 (en) * 2012-07-27 2018-03-29 Huawei Device Co., Ltd. Method, User Equipment, and Application Server for Downloading Application
US10096115B2 (en) 2014-04-11 2018-10-09 Blackberry Limited Building a depth map using movement of one camera
US10134150B2 (en) * 2010-08-10 2018-11-20 Monotype Imaging Inc. Displaying graphics in multi-view scenes
US10200671B2 (en) 2010-12-27 2019-02-05 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US10203752B2 (en) * 2014-11-07 2019-02-12 Eye Labs, LLC Head-mounted devices having variable focal depths
US10237528B2 (en) 2013-03-14 2019-03-19 Qualcomm Incorporated System and method for real time 2D to 3D conversion of a video in a digital camera
US10441214B2 (en) 2015-01-29 2019-10-15 Kali Care, Inc. Monitoring adherence to a medication regimen using a sensor
US10537468B2 (en) 2012-10-23 2020-01-21 Kali Care, Inc. Portable management and monitoring system for eye drop medication regiment
US11044458B2 (en) 2009-07-31 2021-06-22 3Dmedia Corporation Methods, systems, and computer-readable storage media for generating three-dimensional (3D) images of a scene

Families Citing this family (166)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8866920B2 (en) 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US8902321B2 (en) 2008-05-20 2014-12-02 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US8908995B2 (en) 2009-01-12 2014-12-09 Intermec Ip Corp. Semi-automatic dimensioning with imager on a portable device
TWI422213B (en) * 2009-07-29 2014-01-01 Mstar Semiconductor Inc Image detection apparatus and method thereof
JPWO2011039947A1 (en) * 2009-10-01 2013-02-21 日本電気株式会社 IMAGING DEVICE, ITS CONTROL METHOD, IMAGING SYSTEM, AND PROGRAM
TWI394097B (en) * 2009-10-12 2013-04-21 Nat Univ Tsing Hua Detecting method and system for moving object
JP5756119B2 (en) * 2009-11-18 2015-07-29 トムソン ライセンシングThomson Licensing Method and system for 3D content distribution with flexible parallax selection
EP2502115A4 (en) 2009-11-20 2013-11-06 Pelican Imaging Corp Capturing and processing of images using monolithic camera array with heterogeneous imagers
JP2011176800A (en) * 2010-01-28 2011-09-08 Toshiba Corp Image processing apparatus, 3d display apparatus, and image processing method
GB2478157A (en) * 2010-02-26 2011-08-31 Sony Corp Method and apparatus for cutting between a first and second image sequence in a stereoscopic video
KR101824672B1 (en) 2010-05-12 2018-02-05 포토네이션 케이맨 리미티드 Architectures for imager arrays and array cameras
JP5449550B2 (en) * 2010-06-22 2014-03-19 富士フイルム株式会社 Stereoscopic image display device, stereoscopic image display method, stereoscopic image display program, and recording medium
TW201200959A (en) * 2010-06-29 2012-01-01 Fujifilm Corp One-eyed stereo photographic device
US8388146B2 (en) * 2010-08-01 2013-03-05 T-Mobile Usa, Inc. Anamorphic projection device
WO2012023168A1 (en) * 2010-08-19 2012-02-23 パナソニック株式会社 Stereoscopic image capturing device, and stereoscopic image capturing method
KR20120020627A (en) * 2010-08-30 2012-03-08 삼성전자주식회사 Apparatus and method for image processing using 3d image format
KR101708306B1 (en) * 2010-09-13 2017-02-20 엘지전자 주식회사 Mobile twrminal and 3d image convergence method thereof
KR101723235B1 (en) * 2010-10-04 2017-04-04 삼성전자주식회사 Apparatus and method for attenuating three dimensional effect of stereoscopic image
WO2012056685A1 (en) * 2010-10-27 2012-05-03 パナソニック株式会社 3d image processing device, 3d imaging device, and 3d image processing method
JP5594067B2 (en) * 2010-11-02 2014-09-24 ソニー株式会社 Image processing apparatus and image processing method
FR2967324B1 (en) * 2010-11-05 2016-11-04 Transvideo METHOD AND DEVICE FOR CONTROLLING THE PHASING BETWEEN STEREOSCOPIC CAMERAS
JP5614268B2 (en) * 2010-12-09 2014-10-29 ソニー株式会社 Image processing apparatus, image processing method, and program
US8878950B2 (en) 2010-12-14 2014-11-04 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using super-resolution processes
JP2012175533A (en) * 2011-02-23 2012-09-10 Sanyo Electric Co Ltd Electronic apparatus
US9398210B2 (en) * 2011-02-24 2016-07-19 Digimarc Corporation Methods and systems for dealing with perspective distortion in connection with smartphone cameras
JP6057136B2 (en) 2011-03-18 2017-01-11 ソニー株式会社 Image processing apparatus and image processing method
US9549122B2 (en) 2011-03-30 2017-01-17 Nec Corporation Imaging apparatus, photographing guide displaying method for imaging apparatus, and non-transitory computer readable medium
EP2509324A1 (en) * 2011-04-08 2012-10-10 Thomson Licensing Method and apparatus for analyzing stereoscopic or multi-view images
JP5874192B2 (en) * 2011-04-11 2016-03-02 ソニー株式会社 Image processing apparatus, image processing method, and program
US8963998B2 (en) * 2011-04-15 2015-02-24 Tektronix, Inc. Full reference system for predicting subjective quality of three-dimensional video
US20120280975A1 (en) * 2011-05-03 2012-11-08 Stephen Alan Jeffryes Poly-view Three Dimensional Monitor
JP2014519741A (en) 2011-05-11 2014-08-14 ペリカン イメージング コーポレイション System and method for transmitting and receiving array camera image data
EP2726930A4 (en) 2011-06-28 2015-03-04 Pelican Imaging Corp Optical arrangements for use with an array camera
US20130265459A1 (en) 2011-06-28 2013-10-10 Pelican Imaging Corporation Optical arrangements for use with an array camera
JP5721197B2 (en) * 2011-06-29 2015-05-20 Necソリューションイノベータ株式会社 Three-dimensional feature data generation device, three-dimensional feature data generation method, and three-dimensional feature data generation program
JP5899684B2 (en) * 2011-07-11 2016-04-06 ソニー株式会社 Image processing apparatus, image processing method, and program
US9191649B2 (en) 2011-08-12 2015-11-17 Qualcomm Incorporated Systems and methods to capture a stereoscopic image pair
TWI449408B (en) * 2011-08-31 2014-08-11 Altek Corp Method and apparatus for capturing three-dimensional image and apparatus for displaying three-dimensional image
BR112014004062A2 (en) * 2011-08-31 2017-03-07 Sony Corp coding and decoding devices and methods
WO2013043761A1 (en) 2011-09-19 2013-03-28 Pelican Imaging Corporation Determining depth from multiple views of a scene that include aliasing using hypothesized fusion
JP6032564B2 (en) * 2011-09-22 2016-11-30 パナソニックIpマネジメント株式会社 Stereoscopic imaging device and stereoscopic imaging method
US20130076872A1 (en) * 2011-09-23 2013-03-28 Himax Technologies Limited System and Method of Detecting and Correcting an Improper Rendering Condition in Stereoscopic Images
EP2761534B1 (en) 2011-09-28 2020-11-18 FotoNation Limited Systems for encoding light field image files
US8937646B1 (en) * 2011-10-05 2015-01-20 Amazon Technologies, Inc. Stereo imaging using disparate imaging devices
US20130107008A1 (en) * 2011-10-31 2013-05-02 Nokia Corporation Method, apparatus and computer program product for capturing images
JP5912441B2 (en) * 2011-11-15 2016-04-27 任天堂株式会社 Imaging program, imaging apparatus, imaging system, and image display method
JP5768684B2 (en) * 2011-11-29 2015-08-26 富士通株式会社 Stereo image generation apparatus, stereo image generation method, and computer program for stereo image generation
EP2789159A4 (en) 2011-12-07 2015-06-17 Intel Corp Guided image capture
KR101697512B1 (en) * 2011-12-15 2017-01-19 한국전자통신연구원 Image registration device and method thereof
EP2626771B1 (en) * 2012-02-09 2018-01-10 Samsung Electronics Co., Ltd Display apparatus and method for controlling a camera mounted on a display apparatus
US9412206B2 (en) 2012-02-21 2016-08-09 Pelican Imaging Corporation Systems and methods for the manipulation of captured light field image data
US9357204B2 (en) * 2012-03-19 2016-05-31 Fittingbox Method for constructing images of a pair of glasses
US9031316B2 (en) * 2012-04-05 2015-05-12 Mediatek Singapore Pte. Ltd. Method for identifying view order of image frames of stereo image pair according to image characteristics and related machine readable medium thereof
US9210392B2 (en) 2012-05-01 2015-12-08 Pelican Imaging Coporation Camera modules patterned with pi filter groups
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US10007858B2 (en) * 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
WO2014005123A1 (en) 2012-06-28 2014-01-03 Pelican Imaging Corporation Systems and methods for detecting defective camera arrays, optic arrays, and sensors
US20140002674A1 (en) 2012-06-30 2014-01-02 Pelican Imaging Corporation Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
CN107346061B (en) 2012-08-21 2020-04-24 快图有限公司 System and method for parallax detection and correction in images captured using an array camera
WO2014032020A2 (en) 2012-08-23 2014-02-27 Pelican Imaging Corporation Feature based high resolution motion estimation from low resolution images captured using an array source
US20140063057A1 (en) * 2012-08-31 2014-03-06 Nokia Corporation System for guiding users in crowdsourced video services
US9214013B2 (en) 2012-09-14 2015-12-15 Pelican Imaging Corporation Systems and methods for correcting user identified artifacts in light field images
CN104685860A (en) 2012-09-28 2015-06-03 派力肯影像公司 Generating images from light fields utilizing virtual viewpoints
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US9148651B2 (en) * 2012-10-05 2015-09-29 Blackberry Limited Methods and devices for generating a stereoscopic image
US20140104413A1 (en) 2012-10-16 2014-04-17 Hand Held Products, Inc. Integrated dimensioning and weighing system
US9143711B2 (en) 2012-11-13 2015-09-22 Pelican Imaging Corporation Systems and methods for array camera focal plane control
CN104813230A (en) * 2012-11-30 2015-07-29 汤姆逊许可公司 Method and system for capturing a 3d image using single camera
EP2750392A1 (en) * 2012-12-27 2014-07-02 ST-Ericsson SA Visually-assisted stereo acquisition from a single camera
KR101509869B1 (en) * 2012-12-31 2015-04-07 현대자동차주식회사 System and method for interlocking display
US9462164B2 (en) 2013-02-21 2016-10-04 Pelican Imaging Corporation Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information
US9253380B2 (en) 2013-02-24 2016-02-02 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
US9638883B1 (en) 2013-03-04 2017-05-02 Fotonation Cayman Limited Passive alignment of array camera modules constructed from lens stack arrays and sensors based upon alignment information obtained during manufacture of array camera modules using an active alignment process
US9774789B2 (en) 2013-03-08 2017-09-26 Fotonation Cayman Limited Systems and methods for high dynamic range imaging using array cameras
US8866912B2 (en) 2013-03-10 2014-10-21 Pelican Imaging Corporation System and methods for calibration of an array camera using a single captured image
WO2014164550A2 (en) 2013-03-13 2014-10-09 Pelican Imaging Corporation System and methods for calibration of an array camera
US9888194B2 (en) 2013-03-13 2018-02-06 Fotonation Cayman Limited Array camera architecture implementing quantum film image sensors
US9106784B2 (en) 2013-03-13 2015-08-11 Pelican Imaging Corporation Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
WO2014165244A1 (en) 2013-03-13 2014-10-09 Pelican Imaging Corporation Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
US9080856B2 (en) 2013-03-13 2015-07-14 Intermec Ip Corp. Systems and methods for enhancing dimensioning, for example volume dimensioning
US9578259B2 (en) 2013-03-14 2017-02-21 Fotonation Cayman Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
WO2014153098A1 (en) 2013-03-14 2014-09-25 Pelican Imaging Corporation Photmetric normalization in array cameras
US9497370B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Array camera architecture implementing quantum dot color filters
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
WO2014145856A1 (en) 2013-03-15 2014-09-18 Pelican Imaging Corporation Systems and methods for stereo imaging with camera arrays
US9633442B2 (en) 2013-03-15 2017-04-25 Fotonation Cayman Limited Array cameras including an array camera module augmented with a separate camera
US9654761B1 (en) * 2013-03-15 2017-05-16 Google Inc. Computer vision algorithm for capturing and refocusing imagery
US9445003B1 (en) 2013-03-15 2016-09-13 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
WO2014158113A2 (en) * 2013-03-28 2014-10-02 Koymen Kadir A sliding system
KR102025535B1 (en) * 2013-04-08 2019-09-26 시마진 미디어 엘티디 Distance estimation using multi-camera device
TWI502271B (en) * 2013-04-15 2015-10-01 Htc Corp Controlling method and electronic apparatus
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
JP2015060053A (en) * 2013-09-18 2015-03-30 株式会社東芝 Solid-state imaging device, control device, and control program
WO2015048694A2 (en) 2013-09-27 2015-04-02 Pelican Imaging Corporation Systems and methods for depth-assisted perspective distortion correction
US10095917B2 (en) 2013-11-04 2018-10-09 Facebook, Inc. Systems and methods for facial representation
WO2015070105A1 (en) 2013-11-07 2015-05-14 Pelican Imaging Corporation Methods of manufacturing array camera modules incorporating independently aligned lens stacks
US10119808B2 (en) 2013-11-18 2018-11-06 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
EP3075140B1 (en) 2013-11-26 2018-06-13 FotoNation Cayman Limited Array camera configurations incorporating multiple constituent array cameras
TWI520098B (en) * 2014-01-28 2016-02-01 聚晶半導體股份有限公司 Image capturing device and method for detecting image deformation thereof
US10089740B2 (en) 2014-03-07 2018-10-02 Fotonation Limited System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
US9715113B2 (en) 2014-03-18 2017-07-25 Seiko Epson Corporation Head-mounted display device, control method for head-mounted display device, and computer program
US9247117B2 (en) 2014-04-07 2016-01-26 Pelican Imaging Corporation Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array
US9521319B2 (en) 2014-06-18 2016-12-13 Pelican Imaging Corporation Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor
US9912934B2 (en) * 2014-07-21 2018-03-06 Moaath Alrajab Determining three dimensional information using a single camera
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
KR102145542B1 (en) 2014-08-14 2020-08-18 삼성전자주식회사 Image photographing apparatus, image photographing system for photographing using a plurality of image photographing apparatuses and methods for photographing image thereof
CN113256730B (en) 2014-09-29 2023-09-05 快图有限公司 System and method for dynamic calibration of an array camera
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc System and method for picking validation
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
KR102281184B1 (en) * 2014-11-20 2021-07-23 삼성전자주식회사 Method and apparatus for calibrating image
US20160165135A1 (en) * 2014-12-05 2016-06-09 Samsung Electronics Co., Ltd Image photographing apparatus, method of photographing image and non-transitory recordable medium
GB2533098B (en) * 2014-12-09 2016-12-14 Ibm Automated management of confidential data in cloud environments
US9998655B2 (en) * 2014-12-23 2018-06-12 Quallcomm Incorporated Visualization for viewing-guidance during dataset-generation
US10074178B2 (en) * 2015-01-30 2018-09-11 Dental Imaging Technologies Corporation Intra-oral image acquisition alignment
US10200666B2 (en) * 2015-03-04 2019-02-05 Dolby Laboratories Licensing Corporation Coherent motion estimation for stereoscopic video
US9942474B2 (en) 2015-04-17 2018-04-10 Fotonation Cayman Limited Systems and methods for performing high speed video capture and depth estimation using array cameras
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
EP3118576B1 (en) 2015-07-15 2018-09-12 Hand Held Products, Inc. Mobile dimensioning device with dynamic accuracy compatible with nist standard
US10094650B2 (en) 2015-07-16 2018-10-09 Hand Held Products, Inc. Dimensioning and imaging items
US20170017301A1 (en) 2015-07-16 2017-01-19 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
EP3873088A1 (en) * 2015-09-08 2021-09-01 SZ DJI Technology Co., Ltd. System and method for supporting three-dimensional display in first person view (fpv)
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10002435B2 (en) * 2016-01-29 2018-06-19 Google Llc Detecting motion in images
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
CN106254855B (en) * 2016-08-25 2017-12-05 锐马(福建)电气制造有限公司 A kind of three-dimensional modeling method and system based on zoom ranging
CN106331683B (en) * 2016-08-25 2017-12-22 锐马(福建)电气制造有限公司 A kind of object dimensional method for reconstructing and its system
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optic ally-perceptible geometric elements
TR201700549A2 (en) * 2017-01-13 2017-09-21 Master Teknik Tasarim Makina Sanayi Ve Ticaret Ltd Sirketi 3-Dimensional Monitoring of Very Distant Stars and Planets in Space
WO2018143153A1 (en) * 2017-02-03 2018-08-09 三井住友建設株式会社 Position measurement device and position measurement method
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
EP3496387A1 (en) 2017-12-05 2019-06-12 Koninklijke Philips N.V. Apparatus and method of image capture
GB2569546B (en) * 2017-12-19 2020-10-14 Sony Interactive Entertainment Inc Determining pixel values using reference images
CN109961503A (en) * 2017-12-25 2019-07-02 国民技术股份有限公司 A kind of image processing method and device, terminal and computer readable storage medium
WO2019144289A1 (en) * 2018-01-23 2019-08-01 SZ DJI Technology Co., Ltd. Systems and methods for calibrating an optical system of a movable object
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc System and method for validating physical-item security
JP7164968B2 (en) * 2018-05-07 2022-11-02 キヤノン株式会社 IMAGE PROCESSING DEVICE, CONTROL METHOD AND PROGRAM OF IMAGE PROCESSING DEVICE
JP7431527B2 (en) * 2019-08-07 2024-02-15 キヤノン株式会社 Depth information generation device, imaging device, depth information generation method, image processing device, image processing method, and program
MX2022003020A (en) 2019-09-17 2022-06-14 Boston Polarimetrics Inc Systems and methods for surface modeling using polarization cues.
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
EP4042366A4 (en) 2019-10-07 2023-11-15 Boston Polarimetrics, Inc. Systems and methods for augmentation of sensor systems and imaging systems with polarization
KR20230116068A (en) 2019-11-30 2023-08-03 보스턴 폴라리메트릭스, 인크. System and method for segmenting transparent objects using polarization signals
CN115552486A (en) 2020-01-29 2022-12-30 因思创新有限责任公司 System and method for characterizing an object pose detection and measurement system
KR20220133973A (en) 2020-01-30 2022-10-05 인트린식 이노베이션 엘엘씨 Systems and methods for synthesizing data to train statistical models for different imaging modalities, including polarized images
US11822643B2 (en) 2020-02-07 2023-11-21 BicDroid Inc. Method and system for creating quarantined workspaces through controlled interaction between a host and virtual guests
WO2021243088A1 (en) 2020-05-27 2021-12-02 Boston Polarimetrics, Inc. Multi-aperture polarization optical systems using beam splitters
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers

Family Cites Families (132)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5548667A (en) * 1991-05-24 1996-08-20 Sony Corporation Image processing system and method thereof in which three dimensional shape is reproduced from two dimensional image data
GB9201006D0 (en) * 1992-01-17 1992-03-11 Philip Electronic And Associat Classifying faces
US5530774A (en) * 1994-03-25 1996-06-25 Eastman Kodak Company Generation of depth image through interpolation and extrapolation of intermediate images derived from stereo image pair using disparity vector fields
US5768404A (en) 1994-04-13 1998-06-16 Matsushita Electric Industrial Co., Ltd. Motion and disparity estimation method, image synthesis method, and apparatus for implementing same methods
JPH08227097A (en) * 1995-02-21 1996-09-03 Ricoh Co Ltd Camera apparatus
US6879341B1 (en) * 1997-07-15 2005-04-12 Silverbrook Research Pty Ltd Digital camera system containing a VLIW vector processor
EP2252071A3 (en) 1997-12-05 2017-04-12 Dynamic Digital Depth Research Pty. Ltd. Improved image conversion and encoding techniques
US7116323B2 (en) 1998-05-27 2006-10-03 In-Three, Inc. Method of hidden surface reconstruction for creating accurate three-dimensional images converted from two-dimensional images
US7116324B2 (en) * 1998-05-27 2006-10-03 In-Three, Inc. Method for minimizing visual artifacts converting two-dimensional motion pictures into three-dimensional motion pictures
US6269175B1 (en) 1998-08-28 2001-07-31 Sarnoff Corporation Method and apparatus for enhancing regions of aligned images using flow estimation
JP2000251090A (en) 1999-03-01 2000-09-14 Sony Computer Entertainment Inc Drawing device, and method for representing depth of field by the drawing device
GB9912438D0 (en) 1999-05-27 1999-07-28 United Bristol Healthcare Nhs Method and apparatus for displaying volumetric data
AUPQ416699A0 (en) 1999-11-19 1999-12-16 Dynamic Digital Depth Research Pty Ltd Depth map compression technique
US7356082B1 (en) * 1999-11-29 2008-04-08 Sony Corporation Video/audio signal processing method and video-audio signal processing apparatus
EP1240540B1 (en) 1999-12-13 2011-02-23 The Trustees of Columbia University in the City of New York Rectified catadioptric stereo sensors
US6980690B1 (en) * 2000-01-20 2005-12-27 Canon Kabushiki Kaisha Image processing apparatus
US7373017B2 (en) 2005-10-04 2008-05-13 Sony Corporation System and method for capturing adjacent images by utilizing a panorama mode
US6978051B2 (en) 2000-03-06 2005-12-20 Sony Corporation System and method for capturing adjacent images by utilizing a panorama mode
KR100908989B1 (en) * 2000-04-04 2009-07-22 소니 가부시끼 가이샤 Stereoscopic image creation method and apparatus
US7027642B2 (en) * 2000-04-28 2006-04-11 Orametrix, Inc. Methods for registration of three-dimensional frames to create three-dimensional virtual models of objects
US6701005B1 (en) * 2000-04-29 2004-03-02 Cognex Corporation Method and apparatus for three-dimensional object segmentation
US7224357B2 (en) * 2000-05-03 2007-05-29 University Of Southern California Three-dimensional modeling based on photographic images
US6606406B1 (en) * 2000-05-04 2003-08-12 Microsoft Corporation System and method for progressive stereo matching of digital images
CA2327894A1 (en) * 2000-12-07 2002-06-07 Clearview Geophysics Inc. Method and system for complete 3d object and area digitizing
US7155049B2 (en) * 2001-01-11 2006-12-26 Trestle Acquisition Corp. System for creating microscopic digital montage images
DE10116056B4 (en) 2001-03-30 2005-09-08 Karl Storz Gmbh & Co. Kg Endoscopic visualization device with different image systems
JP2003141562A (en) 2001-10-29 2003-05-16 Sony Corp Image processing apparatus and method for nonplanar image, storage medium, and computer program
US7046840B2 (en) 2001-11-09 2006-05-16 Arcsoft, Inc. 3-D reconstruction engine
EP1451775A1 (en) 2001-11-24 2004-09-01 TDV Technologies Corp. Generation of a stereo image sequence from a 2d image sequence
DE10201523A1 (en) * 2002-01-17 2003-07-31 Bosch Gmbh Robert Method and device for masking detection in image sensor systems
JP2003244727A (en) * 2002-02-13 2003-08-29 Pentax Corp Stereoscopic image pickup system
CA2380105A1 (en) 2002-04-09 2003-10-09 Nicholas Routhier Process and system for encoding and playback of stereoscopic video sequences
US7081892B2 (en) 2002-04-09 2006-07-25 Sony Computer Entertainment America Inc. Image with depth of field using z-buffer image data and alpha blending
US7203356B2 (en) * 2002-04-11 2007-04-10 Canesta, Inc. Subject segmentation and tracking using 3D sensing technology for video compression in multimedia applications
US7224382B2 (en) * 2002-04-12 2007-05-29 Image Masters, Inc. Immersive imaging system
US7489812B2 (en) 2002-06-07 2009-02-10 Dynamic Digital Depth Research Pty Ltd. Conversion and encoding techniques
IL150131A (en) 2002-06-10 2007-03-08 Rafael Advanced Defense Sys Method for converting a sequence of monoscopic images to a sequence of stereoscopic images
AU2003272202A1 (en) 2002-06-21 2004-01-06 The Trustees Of Columbia University In The City Of New York Systems and methods for de-blurring motion blurred images
US7400782B2 (en) 2002-08-28 2008-07-15 Arcsoft, Inc. Image warping correction in forming 360 degree panoramic images
JP4216021B2 (en) 2002-08-30 2009-01-28 富士重工業株式会社 Intruder detection device
WO2004021151A2 (en) * 2002-08-30 2004-03-11 Orasee Corp. Multi-dimensional image system for digital image input and output
US7466336B2 (en) * 2002-09-05 2008-12-16 Eastman Kodak Company Camera and method for composing multi-perspective images
JP4072674B2 (en) * 2002-09-06 2008-04-09 ソニー株式会社 Image processing apparatus and method, recording medium, and program
JP5036132B2 (en) * 2002-11-21 2012-09-26 ビジョン サード イメージング,インコーポレイテッド Critical alignment of parallax images for autostereoscopic display
AU2002952874A0 (en) 2002-11-25 2002-12-12 Dynamic Digital Depth Research Pty Ltd 3D image synthesis from depth encoded source view
US20050191048A1 (en) * 2003-03-03 2005-09-01 Mission3D Llc Stereoscopic universal camera apparatus
JP4490074B2 (en) 2003-04-17 2010-06-23 ソニー株式会社 Stereoscopic image processing apparatus, stereoscopic image display apparatus, stereoscopic image providing method, and stereoscopic image processing system
GB0310504D0 (en) 2003-05-07 2003-06-11 Canon Europa Nv Photographing apparatus,device and method for obtaining images to be used for creating three-dimensional model
US8896725B2 (en) 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
GB2405764A (en) 2003-09-04 2005-03-09 Sharp Kk Guided capture or selection of stereoscopic image pairs.
JP3991020B2 (en) * 2003-09-30 2007-10-17 キヤノン株式会社 Image display method and image display system
US7409105B2 (en) 2003-10-22 2008-08-05 Arcsoft, Inc. Panoramic maker engine for a low profile system
GB2408901B (en) * 2003-12-05 2006-07-12 Motorola Inc A decoder for a wireless communication device
GB0329312D0 (en) 2003-12-18 2004-01-21 Univ Durham Mapping perceived depth to regions of interest in stereoscopic images
ATE382919T1 (en) * 2004-02-17 2008-01-15 Koninkl Philips Electronics Nv CREATE A DEPTH MAP
WO2005086078A1 (en) 2004-03-02 2005-09-15 Sarnoff Corporation Method and apparatus for classifying an object
FR2868168B1 (en) * 2004-03-26 2006-09-15 Cnes Epic Fine matching of stereoscopic images and dedicated instrument with a low stereoscopic coefficient
US8049776B2 (en) 2004-04-12 2011-11-01 Angstrom, Inc. Three-dimensional camcorder
US7512883B2 (en) * 2004-06-30 2009-03-31 Microsoft Corporation Portable solution for automatic camera management
EP1613060A1 (en) * 2004-07-02 2006-01-04 Sony Ericsson Mobile Communications AB Capturing a sequence of images
JP4581512B2 (en) * 2004-07-02 2010-11-17 オムロン株式会社 Three-dimensional image processing apparatus, optical axis adjustment method, and optical axis adjustment support method
US7515759B2 (en) 2004-07-14 2009-04-07 Sharp Laboratories Of America, Inc. 3D video coding using sub-sequences
KR100601958B1 (en) 2004-07-15 2006-07-14 삼성전자주식회사 Method for estimating disparity for 3D object recognition
KR20070064319A (en) 2004-08-06 2007-06-20 유니버시티 오브 워싱톤 Variable fixation viewing distance scanned light displays
CN101027900A (en) 2004-09-24 2007-08-29 皇家飞利浦电子股份有限公司 System and method for the production of composite images comprising or using one or more cameras for providing overlapping images
JP2006113807A (en) * 2004-10-14 2006-04-27 Canon Inc Image processor and image processing program for multi-eye-point image
US7212665B2 (en) * 2004-11-05 2007-05-01 Honda Motor Co. Human pose estimation with data driven belief propagation
WO2006062325A1 (en) 2004-12-06 2006-06-15 Electronics And Telecommunications Research Institute Apparatus for correcting image distortion of stereo-camera and method thereof
WO2006084385A1 (en) 2005-02-11 2006-08-17 Macdonald Dettwiler & Associates Inc. 3d imaging system
US8082120B2 (en) * 2005-03-11 2011-12-20 Creaform Inc. Hand-held self-referenced apparatus for three-dimensional scanning
US20060210111A1 (en) 2005-03-16 2006-09-21 Dixon Cleveland Systems and methods for eye-operated three-dimensional object location
US7760962B2 (en) * 2005-03-30 2010-07-20 Casio Computer Co., Ltd. Image capture apparatus which synthesizes a plurality of images obtained by shooting a subject from different directions, to produce an image in which the influence of glare from a light is reduced
US7643062B2 (en) * 2005-06-08 2010-01-05 Hewlett-Packard Development Company, L.P. Method and system for deblurring an image based on motion tracking
KR100653965B1 (en) 2005-08-01 2006-12-05 심재용 3D stereoscopic image processing device for a portable telephone using different camera sensors
KR100667810B1 (en) 2005-08-31 2007-01-11 삼성전자주식회사 Apparatus for controlling depth of 3d picture and method therefor
DE602006017940D1 (en) 2005-09-09 2010-12-16 Olympus Medical Systems Corp Medical stereo observation system
US20070064098A1 (en) * 2005-09-19 2007-03-22 Available For Licensing Systems and methods for 3D rendering
EP1994753A2 (en) 2005-09-26 2008-11-26 Koninklijke Philips Electronics N.V. Method and device for tracking a movement of an object or of a person
JP4356689B2 (en) 2005-12-08 2009-11-04 ソニー株式会社 Camera system, camera control device, panorama image creation method, and computer program
US20070165942A1 (en) * 2006-01-18 2007-07-19 Eastman Kodak Company Method for rectifying stereoscopic display systems
KR100679054B1 (en) 2006-02-15 2007-02-06 삼성전자주식회사 Apparatus and method for displaying three-dimensional image
GB0608841D0 (en) * 2006-05-04 2006-06-14 Isis Innovation Scanner system and method for scanning
US7573489B2 (en) 2006-06-01 2009-08-11 Industrial Light & Magic Infilling for 2D to 3D image conversion
US7573475B2 (en) 2006-06-01 2009-08-11 Industrial Light & Magic 2D to 3D image conversion
US7705970B2 (en) * 2006-06-05 2010-04-27 The Regents Of The University Of Colorado Method and system for optical imaging and ranging
US7538876B2 (en) * 2006-06-12 2009-05-26 The Boeing Company Efficient and accurate alignment of stereoscopic displays
US8189100B2 (en) * 2006-07-25 2012-05-29 Qualcomm Incorporated Mobile device with dual digital camera sensors and methods of using the same
US20080158345A1 (en) * 2006-09-11 2008-07-03 3Ality Digital Systems, Llc 3d augmentation of traditional photography
US7857455B2 (en) 2006-10-18 2010-12-28 Reald Inc. Combining P and S rays for bright stereoscopic projection
JP5366824B2 (en) 2006-12-19 2013-12-11 コーニンクレッカ フィリップス エヌ ヴェ Method and system for converting 2D video to 3D video
EP1939955A3 (en) 2006-12-27 2015-12-23 CSEM Centre Suisse d'Electronique et de Microtechnique SA - Recherche et Développement Optical device and system and method for fabricating the device
US8184926B2 (en) 2007-02-28 2012-05-22 Microsoft Corporation Image deblurring with blurred/noisy image pairs
GB0708676D0 (en) * 2007-05-04 2007-06-13 Imec Inter Uni Micro Electr A Method for real-time/on-line performing of multi view multimedia applications
US20100220932A1 (en) 2007-06-20 2010-09-02 Dong-Qing Zhang System and method for stereo matching of images
US7817187B2 (en) 2007-06-27 2010-10-19 Aptina Imaging Corporation Image blur correction using a secondary camera
CA2693666A1 (en) * 2007-07-12 2009-01-15 Izzat H. Izzat System and method for three-dimensional object reconstruction from two-dimensional images
US20090061381A1 (en) * 2007-09-05 2009-03-05 Duane Milford Durbin Systems and methods for 3D previewing
JP2009133753A (en) 2007-11-30 2009-06-18 Toshiba Corp Image processing device and its method
GB2455316B (en) 2007-12-04 2012-08-15 Sony Corp Image processing apparatus and method
KR100912715B1 (en) * 2007-12-17 2009-08-19 한국전자통신연구원 Method and apparatus of digital photogrammetry by integrated modeling for different types of sensors
KR101419979B1 (en) * 2008-01-29 2014-07-16 톰슨 라이센싱 Method and system for converting 2d image data to stereoscopic image data
GB2458927B (en) 2008-04-02 2012-11-14 Eykona Technologies Ltd 3D Imaging system
US20100097444A1 (en) * 2008-10-16 2010-04-22 Peter Lablans Camera System for Creating an Image From a Plurality of Images
US8350892B2 (en) * 2008-05-20 2013-01-08 Sony Corporation Image pickup apparatus, image pickup method, playback control apparatus, playback control method, and program
US8830341B2 (en) 2008-05-22 2014-09-09 Nvidia Corporation Selection of an optimum image in burst mode in a digital camera
KR101490689B1 (en) 2008-05-27 2015-02-06 삼성전자주식회사 Method and apparatus for generating a stereoscopic image datastream using a camera parameter, and method and apparatus for reconstructing a stereoscopic image using the same
ZA200903757B (en) 2008-07-29 2010-04-28 Eskom Holdings Pty Ltd Stray-flux detection and analysis system
US8300089B2 (en) * 2008-08-14 2012-10-30 Reald Inc. Stereoscopic depth mapping
CN102124745A (en) 2008-08-26 2011-07-13 升级芯片技术公司 Apparatus and method for converting 2D image signals into 3D image signals
WO2010052741A1 (en) 2008-11-07 2010-05-14 Telecom Italia S.P.A. Method and system for producing multi-view 3d visual contents
US8471895B2 (en) * 2008-11-25 2013-06-25 Paul S. Banks Systems and methods of high resolution three-dimensional imaging
US8405742B2 (en) 2008-12-30 2013-03-26 Massachusetts Institute Of Technology Processing images having different focus
US9098926B2 (en) 2009-02-06 2015-08-04 The Hong Kong University Of Science And Technology Generating three-dimensional façade models from images
GB2467932A (en) * 2009-02-19 2010-08-25 Sony Corp Image processing device and method
CN105681633B (en) 2009-03-19 2019-01-18 数字光学公司 Dual sensor camera and its method
US8743176B2 (en) 2009-05-20 2014-06-03 Advanced Scientific Concepts, Inc. 3-dimensional hybrid camera and production system
US9124874B2 (en) 2009-06-05 2015-09-01 Qualcomm Incorporated Encoding of three-dimensional conversion information with two-dimensional video sequence
KR101325292B1 (en) 2009-06-16 2013-11-08 인텔 코오퍼레이션 Camera applications in a handheld device
TWI411870B (en) * 2009-07-21 2013-10-11 Teco Elec & Machinery Co Ltd Stereo image generating method and system
US9380292B2 (en) 2009-07-31 2016-06-28 3Dmedia Corporation Methods, systems, and computer-readable storage media for generating three-dimensional (3D) images of a scene
US8436893B2 (en) 2009-07-31 2013-05-07 3Dmedia Corporation Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3D) images
WO2011014420A1 (en) 2009-07-31 2011-02-03 3Dmedia Corporation Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3d) images
WO2011014421A2 (en) 2009-07-31 2011-02-03 3Dmedia Corporation Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
US8508580B2 (en) * 2009-07-31 2013-08-13 3Dmedia Corporation Methods, systems, and computer-readable storage media for creating three-dimensional (3D) images of a scene
EP2462536A4 (en) 2009-08-04 2014-10-22 Shenzhen Tcl New Technology Systems and methods for three-dimensional video generation
WO2011028837A2 (en) * 2009-09-01 2011-03-10 Prime Focus Vfx Services Ii Inc. System and process for transforming two-dimensional images into three-dimensional images
US8908958B2 (en) * 2009-09-03 2014-12-09 Ron Kimmel Devices and methods of generating three dimensional (3D) colored models
KR101214536B1 (en) * 2010-01-12 2013-01-10 삼성전자주식회사 Method for performing out-focus using depth information and camera using the same
US9344701B2 (en) 2010-07-23 2016-05-17 3Dmedia Corporation Methods, systems, and computer-readable storage media for identifying a rough depth map in a scene and for determining a stereo-base distance for three-dimensional (3D) content creation
MY160768A (en) 2010-08-06 2017-03-15 Tasly Pharmaceutical Group Co Use of salvia miltiorrhiza composition in preparing drugs for secondary prevention of coronary heart disease
WO2012061549A2 (en) * 2010-11-03 2012-05-10 3Dmedia Corporation Methods, systems, and computer program products for creating three-dimensional video sequences
US8274552B2 (en) 2010-12-27 2012-09-25 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods

Patent Citations (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3503316A (en) * 1965-11-09 1970-03-31 Toppan Printing Co Ltd Apparatus for producing stereoscopic pictures
US3953869A (en) * 1974-09-24 1976-04-27 Dimensional Development Corporation Stereoscopic photography apparatus
US4661986A (en) * 1983-06-27 1987-04-28 Rca Corporation Depth-of-focus imaging process method
US4956705A (en) * 1989-03-10 1990-09-11 Dimensional Visions Group Electronic method and apparatus for stereoscopic photography
US5043806A (en) * 1989-07-26 1991-08-27 L'etat Francais Represente Par Le Ministre Des P.T.T. Method of processing and transmitting over a "MAC" type channel a sequence of pairs of stereoscopic television images
US5151609A (en) * 1989-08-02 1992-09-29 Hitachi, Ltd. Method of detecting solid shape of object with autofocusing and image detection at each focus level
US4980762A (en) * 1989-10-13 1990-12-25 Massachusetts Institute Of Technology Method and apparatus for image processing to obtain three dimensional motion and depth
US5369735A (en) * 1990-03-30 1994-11-29 New Microtime Inc. Method for controlling a 3D patch-driven special effects system
US5305092A (en) * 1991-09-03 1994-04-19 Hitachi, Ltd. Apparatus for obtaining three-dimensional volume data of an object
US5603687A (en) * 1992-10-28 1997-02-18 Oktas General Partnership Asymmetric stereo-optic endoscope
US5613048A (en) * 1993-08-03 1997-03-18 Apple Computer, Inc. Three-dimensional image synthesis using view interpolation
US5444479A (en) * 1993-09-02 1995-08-22 Vision Iii Imaging, Inc. Single camera autostereoscopic imaging apparatus
US6324347B1 (en) * 1993-11-05 2001-11-27 Vision Iii Imaging, Inc. Autostereoscopic imaging apparatus and method using a parallax scanning lens aperture
US5991551A (en) * 1993-11-05 1999-11-23 Vision Iii Imaging, Inc. Autostereoscopic imaging apparatus and method using a parallax scanning lens aperture
US5678089A (en) * 1993-11-05 1997-10-14 Vision Iii Imaging, Inc. Autostereoscopic imaging apparatus and method using a parallax scanning lens aperture
US5511153A (en) * 1994-01-18 1996-04-23 Massachusetts Institute Of Technology Method and apparatus for three-dimensional, textured models from plural video images
US5963247A (en) * 1994-05-31 1999-10-05 Banitt; Shmuel Visual display systems and a system for producing recordings for visualization thereon and methods therefor
US5719954A (en) * 1994-06-07 1998-02-17 Matsushita Electric Industrial Co., Ltd. Stereo matching method and disparity measuring method
US5734743A (en) * 1994-07-12 1998-03-31 Canon Kabushiki Kaisha Image processing method and apparatus for block-based corresponding point extraction
US5808664A (en) * 1994-07-14 1998-09-15 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US6031538A (en) * 1994-08-30 2000-02-29 Thomson Broadband Systems Method for the generation of synthetic images
US5682437A (en) * 1994-09-22 1997-10-28 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US6414709B1 (en) * 1994-11-03 2002-07-02 Synthonics Incorporated Methods and apparatus for zooming during capture and reproduction of 3-dimensional images
US5673081A (en) * 1994-11-22 1997-09-30 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US6384859B1 (en) * 1995-03-29 2002-05-07 Sanyo Electric Co., Ltd. Methods for creating an image for a three-dimensional display, for calculating depth information and for image processing using the depth information
US5777666A (en) * 1995-04-17 1998-07-07 Sanyo Electric Co., Ltd. Method of converting two-dimensional images into three-dimensional images
US5748199A (en) * 1995-12-20 1998-05-05 Synthonics Incorporated Method and apparatus for converting a two dimensional motion picture into a three dimensional motion picture
US5953054A (en) * 1996-05-31 1999-09-14 Geo-3D Inc. Method and system for producing stereoscopic 3-dimensional images
US5874988A (en) * 1996-07-08 1999-02-23 Da Vinci Systems, Inc. System and methods for automated color correction
US6075905A (en) * 1996-07-17 2000-06-13 Sarnoff Corporation Method and apparatus for mosaic image construction
US6445833B1 (en) * 1996-07-18 2002-09-03 Sanyo Electric Co., Ltd. Device and method for converting two-dimensional video into three-dimensional video
US5652616A (en) * 1996-08-06 1997-07-29 General Instrument Corporation Of Delaware Optimal disparity estimation for stereoscopic video coding
US6064759A (en) * 1996-11-08 2000-05-16 Buckley; B. Shawn Computer aided inspection machine
US6215516B1 (en) * 1997-07-07 2001-04-10 Reveo, Inc. Method and apparatus for monoscopic to stereoscopic image conversion
US6747610B1 (en) * 1997-07-22 2004-06-08 Sanyo Electric Co., Ltd. Stereoscopic image display apparatus capable of selectively displaying desired stereoscopic image
US6018349A (en) * 1997-08-01 2000-01-25 Microsoft Corporation Patch-based alignment method and apparatus for construction of image mosaics
US6496598B1 (en) * 1997-09-02 2002-12-17 Dynamic Digital Depth Research Pty. Ltd. Image processing method and apparatus
US20020191841A1 (en) * 1997-09-02 2002-12-19 Dynamic Digital Depth Research Pty Ltd Image processing method and apparatus
US6584219B1 (en) * 1997-09-18 2003-06-24 Sanyo Electric Co., Ltd. 2D/3D image conversion system
US5883695A (en) * 1997-09-19 1999-03-16 Paul; Eddie Method and apparatus for producing stereoscopic images with single sensor
US6434278B1 (en) * 1997-09-23 2002-08-13 Enroute, Inc. Generating three-dimensional models of objects defined by two-dimensional image data
US6047078A (en) * 1997-10-03 2000-04-04 Digital Equipment Corporation Method for extracting a three-dimensional model using appearance-based constrained structure from motion
US6314211B1 (en) * 1997-12-30 2001-11-06 Samsung Electronics Co., Ltd. Apparatus and method for converting two-dimensional image sequence into three-dimensional image using conversion of motion disparity into horizontal disparity and post-processing method during generation of three-dimensional image
US6094215A (en) * 1998-01-06 2000-07-25 Intel Corporation Method of determining relative camera orientation position to create 3-D visual images
US6385334B1 (en) * 1998-03-12 2002-05-07 Fuji Jukogyo Kabushiki Kaisha System and method for adjusting stereo camera
US6240198B1 (en) * 1998-04-13 2001-05-29 Compaq Computer Corporation Method for figure tracking using 2-D registration
US6269172B1 (en) * 1998-04-13 2001-07-31 Compaq Computer Corporation Method for tracking the motion of a 3-D figure
US6686926B1 (en) * 1998-05-27 2004-02-03 In-Three, Inc. Image processing system and method for converting two-dimensional images into three-dimensional images
US6246412B1 (en) * 1998-06-18 2001-06-12 Microsoft Corporation Interactive construction and refinement of 3D models from multiple panoramic images
US6023588A (en) * 1998-09-28 2000-02-08 Eastman Kodak Company Method and apparatus for capturing panoramic images with range data
US6750904B1 (en) * 1998-10-31 2004-06-15 International Business Machines Corporation Camera system for three dimensional images and video
US6278460B1 (en) * 1998-12-15 2001-08-21 Point Cloud, Inc. Creating a three-dimensional model from two-dimensional images
US6661913B1 (en) * 1999-05-05 2003-12-09 Microsoft Corporation System and method for determining structure and motion using multiple sets of images from different projection models for object modeling
US6760488B1 (en) * 1999-07-12 2004-07-06 Carnegie Mellon University System and method for generating a three-dimensional model from a two-dimensional image sequence
US6556704B1 (en) * 1999-08-25 2003-04-29 Eastman Kodak Company Method for forming a depth image from digital image data
US6512892B1 (en) * 1999-09-15 2003-01-28 Sharp Kabushiki Kaisha 3D camera
US6798406B1 (en) * 1999-09-15 2004-09-28 Sharp Kabushiki Kaisha Stereo images with comfortable perceived depth
US6862364B1 (en) * 1999-10-27 2005-03-01 Canon Kabushiki Kaisha Stereo image processing for radiography
US6970591B1 (en) * 1999-11-25 2005-11-29 Canon Kabushiki Kaisha Image processing apparatus
US6677981B1 (en) * 1999-12-31 2004-01-13 Stmicroelectronics, Inc. Motion play-back of still pictures comprising a panoramic view for simulating perspective
US20030030636A1 (en) * 2000-03-31 2003-02-13 Olympus Optical Co., Ltd. 3D image data publishing method and 3D image production system
US6611268B1 (en) * 2000-05-30 2003-08-26 Microsoft Corporation System and process for generating 3D video textures using video-based rendering techniques
US6381302B1 (en) * 2000-07-05 2002-04-30 Canon Kabushiki Kaisha Computer assisted 2D adjustment of stereo X-ray images
US6559846B1 (en) * 2000-07-07 2003-05-06 Microsoft Corporation System and process for viewing panoramic video
US6967659B1 (en) * 2000-08-25 2005-11-22 Advanced Micro Devices, Inc. Circuitry and systems for performing two-dimensional motion compensation using a three-dimensional pipeline and methods of operating the same
US6677982B1 (en) * 2000-10-11 2004-01-13 Eastman Kodak Company Method for three dimensional spatial panorama formation
US6927769B2 (en) * 2000-11-21 2005-08-09 Vrex, Inc. Stereoscopic image processing on a computer system
US20040080661A1 (en) * 2000-12-22 2004-04-29 Sven-Ake Afsenius Camera that combines the best focused parts from different exposures to an image
US20020158873A1 (en) * 2001-01-26 2002-10-31 Todd Williamson Real-time virtual viewpoint in simulated reality environment
US20020106120A1 (en) * 2001-01-31 2002-08-08 Nicole Brandenburg Method of analyzing in real time the correspondence of image characteristics in corresponding video images
US20020190991A1 (en) * 2001-05-16 2002-12-19 Daniel Efran 3-D instant replay system and method
US20030002870A1 (en) * 2001-06-27 2003-01-02 Baron John M. System for and method of auto focus indications
US6947059B2 (en) * 2001-08-10 2005-09-20 Micoy Corporation Stereoscopic panoramic image capture device
US20040218269A1 (en) * 2002-01-14 2004-11-04 Divelbiss Adam W. General purpose stereoscopic 3D format conversion system and method
US20030151659A1 (en) * 2002-02-13 2003-08-14 Pentax Corporation Camera for generating a stereoscopic pair of images
US20030152264A1 (en) * 2002-02-13 2003-08-14 Perkins Christopher H. Method and system for processing stereoscopic images
US20040100565A1 (en) * 2002-11-22 2004-05-27 Eastman Kodak Company Method and system for generating images used in extended range panorama composition
US20040136571A1 (en) * 2002-12-11 2004-07-15 Eastman Kodak Company Three dimensional images
US20050041123A1 (en) * 2003-08-20 2005-02-24 Sbc Knowledge Ventures, L.P. Digital image capturing system and method
US20050201612A1 (en) * 2004-03-04 2005-09-15 Samsung Electronics Co.,Ltd. Method and apparatus for detecting people using stereo camera
US20070285419A1 (en) * 2004-07-30 2007-12-13 Dor Givon System and method for 3d space-dimension based image processing
US20070024614A1 (en) * 2005-07-26 2007-02-01 Tam Wa J Generating a depth map from a two-dimensional source image for stereoscopic and multiview imaging
US20090073164A1 (en) * 2005-10-28 2009-03-19 Wells Barton S Automatic compositing of 3d objects in a still frame or series of frames
US20070297784A1 (en) * 2006-06-22 2007-12-27 Sony Corporation Method of and apparatus for generating a depth map utilized in autofocusing
US20090116732A1 (en) * 2006-06-23 2009-05-07 Samuel Zhou Methods and systems for converting 2d motion pictures for stereoscopic 3d exhibition
US20080031327A1 (en) * 2006-08-01 2008-02-07 Haohong Wang Real-time capturing and generating stereo images and videos with a monoscopic low power mobile device
US7847854B2 (en) * 2006-09-27 2010-12-07 Fujitsu Semiconductor Limited Imaging apparatus with AF optical zoom
US20080080852A1 (en) * 2006-10-03 2008-04-03 National Taiwan University Single lens auto focus system for stereo image generation and method thereof
US7616885B2 (en) * 2006-10-03 2009-11-10 National Taiwan University Single lens auto focus system for stereo image generation and method thereof
US20080150945A1 (en) * 2006-12-22 2008-06-26 Haohong Wang Complexity-adaptive 2d-to-3d video sequence conversion
US8223194B2 (en) * 2007-01-30 2012-07-17 Samsung Electronics Co., Ltd. Image processing method and apparatus
US20080232680A1 (en) * 2007-03-19 2008-09-25 Alexander Berestov Two dimensional/three dimensional digital information acquisition and display device
US20080247670A1 (en) * 2007-04-03 2008-10-09 Wa James Tam Generation of a depth map from a monoscopic color image for rendering stereoscopic still and video images
US20100080448A1 (en) * 2007-04-03 2010-04-01 Wa James Tam Method and graphical user interface for modifying depth maps
US20090010507A1 (en) * 2007-07-02 2009-01-08 Zheng Jason Geng System and method for generating a 3d model of anatomical structure using a plurality of 2d images
US20090167923A1 (en) * 2007-12-27 2009-07-02 Ati Technologies Ulc Method and apparatus with depth map generation
US20090167930A1 (en) * 2007-12-27 2009-07-02 Ati Technologies Ulc Method and apparatus with fast camera auto focus
US20090169057A1 (en) * 2007-12-28 2009-07-02 Industrial Technology Research Institute Method for producing image with depth by using 2d images
US8248410B2 (en) * 2008-12-09 2012-08-21 Seiko Epson Corporation Synthesizing detailed depth maps from images

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8436893B2 (en) 2009-07-31 2013-05-07 3Dmedia Corporation Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3D) images
US20110025829A1 (en) * 2009-07-31 2011-02-03 3Dmedia Corporation Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3d) images
US8810635B2 (en) 2009-07-31 2014-08-19 3Dmedia Corporation Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional images
US8508580B2 (en) 2009-07-31 2013-08-13 3Dmedia Corporation Methods, systems, and computer-readable storage media for creating three-dimensional (3D) images of a scene
US20110025825A1 (en) * 2009-07-31 2011-02-03 3Dmedia Corporation Methods, systems, and computer-readable storage media for creating three-dimensional (3d) images of a scene
US11044458B2 (en) 2009-07-31 2021-06-22 3Dmedia Corporation Methods, systems, and computer-readable storage media for generating three-dimensional (3D) images of a scene
US20130107020A1 (en) * 2010-06-30 2013-05-02 Fujifilm Corporation Image capture device, non-transitory computer-readable storage medium, image capture method
US9344701B2 (en) 2010-07-23 2016-05-17 3Dmedia Corporation Methods, systems, and computer-readable storage media for identifying a rough depth map in a scene and for determining a stereo-base distance for three-dimensional (3D) content creation
US10134150B2 (en) * 2010-08-10 2018-11-20 Monotype Imaging Inc. Displaying graphics in multi-view scenes
US9369693B2 (en) * 2010-09-22 2016-06-14 Fujifilm Corporation Stereoscopic imaging device and shading correction method
US20130162780A1 (en) * 2010-09-22 2013-06-27 Fujifilm Corporation Stereoscopic imaging device and shading correction method
US8995750B2 (en) * 2010-09-30 2015-03-31 Casio Computer Co., Ltd. Image composition apparatus, image retrieval method, and storage medium storing program
US20120082369A1 (en) * 2010-09-30 2012-04-05 Casio Computer Co., Ltd. Image composition apparatus, image retrieval method, and storage medium storing program
US20120110565A1 (en) * 2010-10-29 2012-05-03 Intuit Inc. Chained data processing and application utilization
US9185388B2 (en) 2010-11-03 2015-11-10 3Dmedia Corporation Methods, systems, and computer program products for creating three-dimensional video sequences
US10200671B2 (en) 2010-12-27 2019-02-05 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US10911737B2 (en) 2010-12-27 2021-02-02 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US8441520B2 (en) 2010-12-27 2013-05-14 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US11388385B2 (en) * 2010-12-27 2022-07-12 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US8274552B2 (en) 2010-12-27 2012-09-25 3Dmedia Corporation Primary and auxiliary image capture devices for image processing and related methods
US9501834B2 (en) * 2011-08-18 2016-11-22 Qualcomm Technologies, Inc. Image capture for later refocusing or focus-manipulation
US20130044254A1 (en) * 2011-08-18 2013-02-21 Meir Tzur Image capture for later refocusing or focus-manipulation
US9137520B2 (en) 2011-09-08 2015-09-15 Samsung Display Co., Ltd. Stereoscopic image display device and method of displaying stereoscopic image
CN103049933A (en) * 2011-10-17 2013-04-17 联咏科技股份有限公司 Image processing device and method thereof
US20130093850A1 (en) * 2011-10-17 2013-04-18 Novatek Microelectronics Corp. Image processing apparatus and method thereof
CN104169970A (en) * 2012-04-18 2014-11-26 索尼公司 Method and optical system for determining a depth map of an image
US20180088923A1 (en) * 2012-07-27 2018-03-29 Huawei Device Co., Ltd. Method, User Equipment, and Application Server for Downloading Application
US10537468B2 (en) 2012-10-23 2020-01-21 Kali Care, Inc. Portable management and monitoring system for eye drop medication regimen
US10237528B2 (en) 2013-03-14 2019-03-19 Qualcomm Incorporated System and method for real time 2D to 3D conversion of a video in a digital camera
US10237491B2 (en) 2013-07-31 2019-03-19 Samsung Electronics Co., Ltd. Electronic apparatus, method of controlling the same, for capturing, storing, and reproducing multifocal images
WO2015016619A1 (en) * 2013-07-31 2015-02-05 Samsung Electronics Co., Ltd. Electronic apparatus, method of controlling the same, and image reproducing apparatus and method
EP2852143A1 (en) 2013-09-18 2015-03-25 Nokia Corporation Creating a cinemagraph
US9576387B2 (en) 2013-09-18 2017-02-21 Nokia Corporation Creating a cinemagraph
US9591295B2 (en) * 2013-09-24 2017-03-07 Amazon Technologies, Inc. Approaches for simulating three-dimensional views
US20150085076A1 (en) * 2013-09-24 2015-03-26 Amazon Technologies, Inc. Approaches for simulating three-dimensional views
US9437038B1 (en) 2013-09-26 2016-09-06 Amazon Technologies, Inc. Simulating three-dimensional views using depth relationships among planes of content
US9973677B2 (en) * 2013-10-14 2018-05-15 Qualcomm Incorporated Refocusable images
US20150103192A1 (en) * 2013-10-14 2015-04-16 Qualcomm Incorporated Refocusable images
US10096115B2 (en) 2014-04-11 2018-10-09 Blackberry Limited Building a depth map using movement of one camera
US10203752B2 (en) * 2014-11-07 2019-02-12 Eye Labs, LLC Head-mounted devices having variable focal depths
US10441214B2 (en) 2015-01-29 2019-10-15 Kali Care, Inc. Monitoring adherence to a medication regimen using a sensor
US9824427B2 (en) * 2015-04-15 2017-11-21 Light Labs Inc. Methods and apparatus for generating a sharp image
US20160309141A1 (en) * 2015-04-15 2016-10-20 The Lightco Inc. Methods and apparatus for generating a sharp image

Also Published As

Publication number Publication date
US20140009586A1 (en) 2014-01-09
US8436893B2 (en) 2013-05-07
US20110025829A1 (en) 2011-02-03
US8810635B2 (en) 2014-08-19

Similar Documents

Publication Publication Date Title
US20110025830A1 (en) Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
US20210314547A1 (en) Methods, systems, and computer-readable storage media for generating three-dimensional (3d) images of a scene
US9635348B2 (en) Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional images
WO2011014421A2 (en) Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
US9344701B2 (en) Methods, systems, and computer-readable storage media for identifying a rough depth map in a scene and for determining a stereo-base distance for three-dimensional (3D) content creation
US10425638B2 (en) Equipment and method for promptly performing calibration and verification of intrinsic and extrinsic parameters of a plurality of image capturing elements installed on electronic device
US8508580B2 (en) Methods, systems, and computer-readable storage media for creating three-dimensional (3D) images of a scene
US9544574B2 (en) Selecting camera pairs for stereoscopic imaging
KR20170106325A (en) Method and apparatus for multiple technology depth map acquisition and fusion
EP3154251A1 (en) Application programming interface for multi-aperture imaging systems
JP2011166264A (en) Image processing apparatus, imaging device and image processing method, and program
EP3281400A1 (en) Automated generation of panning shots
KR20170005009A (en) Generation and use of a 3d radon image
CN101577795A (en) Method and device for realizing real-time viewing of panoramic picture
TW200816800A (en) Single lens auto focus system for stereo image generation and method thereof
US20140085422A1 (en) Image processing method and device
JP5614268B2 (en) Image processing apparatus, image processing method, and program
JP2008276301A (en) Image processing apparatus, method and program
US20130083169A1 (en) Image capturing apparatus, image processing apparatus, image processing method and program
KR20150047604A (en) Method for description of object points of the object space and connection for its implementation
US20130076868A1 (en) Stereoscopic imaging apparatus, face detection apparatus and methods of controlling operation of same
JP3088852B2 (en) 3D image input device
CN115314698A (en) Stereoscopic shooting and displaying device and method
JP2012060512A (en) Multi-eye imaging apparatus and program
JP2011108253A (en) Image processing device, method and program

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION