US20060072020A1 - Rotating scan camera - Google Patents

Rotating scan camera

Info

Publication number
US20060072020A1
Authority
US
United States
Prior art keywords
camera
drum
sensor
image
recording
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US10/954,585
Other versions
US7791638B2 (en
Inventor
David McCutchen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Immersive Licensing Inc
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US10/954,585 priority Critical patent/US7791638B2/en
Assigned to IMMERSIVE MEDIA CO. reassignment IMMERSIVE MEDIA CO. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MCCUTCHEN, DAVID J.
Publication of US20060072020A1 publication Critical patent/US20060072020A1/en
Application granted granted Critical
Publication of US7791638B2 publication Critical patent/US7791638B2/en
Assigned to IMMERSIVE VENTURES INC. reassignment IMMERSIVE VENTURES INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IMC360 COMPANY
Assigned to IMC360 COMPANY reassignment IMC360 COMPANY CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: IMMERSIVE MEDIA COMPANY
Assigned to IMMERSIVE LICENSING, INC. reassignment IMMERSIVE LICENSING, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: IMMERSIVE VENTURES INC.
Expired - Fee Related legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G03B17/02 Bodies (Details of cameras or camera bodies; Accessories therefor)
    • G03B19/18 Motion-picture cameras
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • G03B37/02 Panoramic or wide-screen photography with scanning movement of lens or cameras
    • H04N13/189 Recording image signals; Reproducing recorded image signals
    • H04N13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N23/50 Constructional details of cameras or camera modules comprising electronic image sensors
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N5/772 Interface circuits between a recording apparatus and a television camera, the recording apparatus and the television camera being placed in the same enclosure

Definitions

  • This invention generally relates to a panoramic image reproduction system, and in particular to the use of rotary scanning elements in a camera design, with multiple sensors being used to establish stereoscopic viewpoints, adjust their convergence, and to increase the density of information being scanned.
  • CMOS Complementary Metal Oxide Semiconductor
  • Linear scan sensors are also being used to increase resolution for specific applications. These sensors have a single row of photosensitive elements, designed for monochrome or color readout, and sensitive either to visible light or to some other part of the spectrum such as infrared or ultraviolet. The image is made by successive readouts of the sensor as it moves across the image, building up the final resolution. This is also sometimes referred to as a slit-scan or line-scan sensor. Digital scans of this type are already found in such devices as fax machines and flatbed scanners. Line-scan cameras are often used in industrial processes to obtain clear pictures of objects in motion.
  • Examples of these cameras are the Wintriss Engineering OPSIS 5150ALC for monochrome images, which uses a monochrome CCD linear sensor with 5150 pixels and onboard image processing, and the DALSA CT-E4-2048W, which uses a color CCD linear sensor with 2048 pixels.
  • These cameras are fixed in place, and look at objects moving past them. Unlike those from area-scan sensors, the resulting images do not suffer from motion blur.
  • Scan-back digital cameras such as the Better Light Model 8000 camera back are also used for high-quality digital still photography.
  • the use of a slow scan from a linear sensor allows for maximum resolution to be built up when photographing artwork, documents or other subjects.
  • Scanning sensors are also used in so-called “push-broom” satellite surveillance applications, where, as the satellite sweeps across the surface, a succession of readouts of the scanner creates a very large and detailed overall image.
  • Existing panoramic slit scan cameras have also been made that use a moving slit to expose a strip of film, such as the Widelux™ camera for stills.
  • Jehle's “360 degree IR Surveillance with Panoramic Display” uses an infrared scanner also aligned radially to the axis of rotation.
  • Federau's “Line-Scan Panoramic Camera” (U.S. Pat. No. 4,532,544) also describes a radial axis for a television camera.
  • Globus et al's “Panoramic Camera” (U.S. Pat. No. 4,241,985), marketed as the Globuscope™, employs a slit and a radial optical axis to expose a film strip.
  • Schonherr U.S. Pat. No. 5,305,035 describes a design for slots that are intended to “seal off” the edges of a light-sensitive cylindrical recording surface when used with a rotating objective drum, thus preventing extraneous light at the edges of the recording, or “light safety”.
  • the drum of Schoenherr et al has a lens in its center which focuses light from a wide input image window on one side onto a narrow image output window on the other side, so that as the drum rotates, the light passes through this second window in a scanning fashion onto the fixed cylindrical recording surface outside of the drum. Because the recording surface is separate and outside of the drum, this device has a horizontal field of view that is inherently less than 180 degrees.
  • Keast, et al discloses a rotating camera which in one embodiment has two cameras with axes not radial to the axis of rotation.
  • Keast depends upon the addition of counter-rotation relay optics between the lens and the sensor. This adds significant weight and complexity to the device, and also requires that a perfectly consistent geometrical relationship be consistently maintained between the input lens and the sensor along a long optical path, even while the whole apparatus is in rapid motion and presumably being carried in the field in the manner of a regular camera.
  • the described counter-rotation optics are also inherently impractical, because as the overall apparatus rotates, at the point where the lens is opposite the sensor, the mirror surfaces and their optical supports need to completely disappear in order to avoid an interruption in the image.
  • Keast et al describes the use of two cameras, purely for establishing stereo viewpoints, and the counter-rotation relay optics described preclude more than two cameras from being used. In addition, the described method suggests that there would be significant light interference between the light from the two cameras.
  • the prior art is largely directed toward the production of panoramic still photographs, and the art featuring film-based recording in particular is unsuitable for motion picture photography, because of the need to put a large and heavy film transport and storage mechanism into rapid motion.
  • the prior art for panoramic scanning is primarily directed toward a cylindrical panoramic format that omits a substantial portion of the surrounding scene, and spherical imaging to date has mainly involved multiple views that must be seamed together prior to viewing, or scanning approaches that produce interruptions in the field of view.
  • FIG. 1 shows an oblique view of the preferred embodiment of the rotating camera, with a hemispheric field of view for each of two sensors in a rotating camera drum with conventional fisheye lenses and streamlining elements.
  • FIG. 2 shows a simplified oblique view of a basic embodiment featuring a rotating drum, the motor and carrying handle for it, and two sensors in offset positions, showing an aperture slit for each sensor.
  • FIG. 3 shows a top view of the rotating drum of a basic embodiment, showing the sensor and lens components of the subcamera assemblies, along with the optical axis arrangements characteristic of the system.
  • FIG. 4 shows a closeup schematic top view of the preferred embodiment using a fisheye lens with a streamlining covering for the drum surface.
  • FIG. 5 is a top view of the camera drum in the preferred embodiment, showing the action of the spinning drum's sensors where an object in the field of view is scanned from different points of view, allowing for stereoscopic effects, with the addition of a rangefinding device as part of the scanning drum.
  • FIG. 6 shows two equirectangular stereoscopic views produced by scans of the preferred full spherical field of view according to the present invention.
  • FIG. 7 is a schematic view showing the application of lateral distortion to portions of a recorded image to achieve stereoscopic convergence at a given distance, such as would be produced by delays in the timing of the recording of scans.
  • FIG. 8 is a schematic view showing a preferred embodiment wherein lateral distortion is applied to a region of interest within the overall image to achieve stereoscopic convergence.
  • FIG. 9 shows a block diagram of the various components of the entire system in the preferred embodiment.
  • FIG. 10 shows a top view of an alternate arrangement for increasing the overall resolution, here with four sensors in place in a rotating drum, spaced around its circumference.
  • FIG. 11 shows a schematic diagram of the adjacent recording of the scans from different sensors, using various amounts of delay for the signal from each sensor.
  • FIG. 12 shows a closeup schematic top view of a conventional lens for the sensor, and ways to streamline the rapidly rotating drum surface.
  • FIG. 13 shows an oblique schematic view of an alternate external film-based recorder for camera image information.
  • FIG. 14 shows an oblique view of an alternate embodiment of the camera drum using a film-based recording system.
  • a rotary scanning camera of a novel design offers improved performance in capturing a panoramic surrounding scene.
  • One or more line-scan sensors are used in a sweeping recording of a surrounding scene.
  • the scan can be of a partial rotation or of one or more full rotations. Unlike most of the prior art, these sensors each have an optical axis that does not intersect the axis of rotation.
  • a rotating drum holds two or more line-scan devices for scanning the appearance of a surrounding field of view from stereoscopically separated viewpoints, with each optical axis on approximately a tangent line to the cylinder. This creates a pair of offset sweeping scans that record objects in the surrounding view from one, then the other, point of view, thereby creating an effective parallax separation in the recorded images. This parallax separation is the same for every point in the rotation of the drum.
  • FIG. 1 shows an oblique view of the preferred embodiment of the rotating camera, with a hemispheric field of view for each of two sensors 6 and 10 in a rotating camera drum 2 with streamlining elements such as at 40 .
  • a fisheye lens 38 with an optical axis 16 is shown for sensor 6 , producing a scan 92 with a vertical field of view which extends 180 degrees.
  • additional scans are produced, such as the second scan 94 .
  • the vertical boundary of a scan's field of view is at 100 .
  • the signals from these scans, as well as any necessary input signals, pass through a slip ring connection 56 and through a connecting cable 58 , shown here exiting at the base of a handle 52 .
  • FIG. 2 shows a simplified oblique view of a rotatable scan camera according to the present invention.
  • the camera comprises a rotatable drum 2 , actuated by a motor 4 .
  • This drum has a central axis of rotation 46 and a perimeter, which is the surface of the drum farthest from the axis of rotation.
  • Two linear sensors, a Right-hand sensor 6 and a Left-hand sensor 10 are disposed at the circumference of the rotatable drum 2 , each sensor having an aperture slit 8 , 12 .
  • Each linear sensor has an associated lens through which light passes according to a central optical axis and forms a focused image on an image plane orthogonal to the optical axis.
  • the linear sensor's active surface is located in this image plane.
  • Each sensor also has associated electronics for the input and output of signals.
  • the sensor, optics and electronics form a subcamera assembly within the rotatable drum, which rotates in the direction shown by arrow 14.
  • For recording a surrounding scene in motion, this drum should rotate very quickly, and the scans should be as rapid as possible.
  • a preferred embodiment for recording motion would be a 5-inch diameter drum rotating at 3600 RPM, with two sensors, with a complete scanned image produced by each sensor every rotation, for an effective frame rate per sensor of 60 frames per second.
  • the resolution of the image is constrained only by the sensitivity of the sensor and its readout rate, and the ability of the data recorder to keep up. To achieve a pleasing level of resolution when viewing a closeup of a part of the image, as will commonly be done, the resolution of the overall image should be as large as possible.
  • a preferred resolution for the scanned image frame produced by each sensor would be at least 2,000 by 4,000 pixels, which represents the rotation of a 2,000-pixel sensor sampled 4,000 times during each rotation. If this 8 million-pixel image frame is produced 60 times per second, this means 480 million pixels produced per imager per second, times two channels, or close to a billion pixels per second. This level of resolution, together with the stereoscopic qualities of the image, represents an extremely high level of realism.
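The data-rate arithmetic above can be checked with a few lines (a sketch; the variable names are illustrative, the figures come from the text):

```python
# Back-of-envelope check of the data rates described above.
rpm = 3600
frames_per_second = rpm / 60            # one full scanned image per rotation
pixels_per_frame = 2000 * 4000          # 2000-pixel sensor sampled 4000 times
pixels_per_sensor_per_s = pixels_per_frame * frames_per_second
total = pixels_per_sensor_per_s * 2     # two sensors (channels)

print(frames_per_second)        # 60.0 frames per second per sensor
print(pixels_per_sensor_per_s)  # 480000000.0 pixels per imager per second
print(total)                    # 960000000.0 -> "close to a billion" per second
```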
  • a storage buffer should be used to store and add together multiple redundant scans.
  • an optimal equirectangular format might be 6000×3000 pixels. This can be produced, for example, by six cameras, each with a line-scan sensor of 3000 pixels, that each sweep one-sixth of a sphere for each exposure period, performing 1000 scans of 3000 pixels each.
  • With faster pixel readout rates, more scans can be performed, and if they are timed to coincide with the position of earlier scans, then the results can be stored and added together to build up a better reading of the available light.
  • an operating rate of 30 MHz for each camera would enable ten scans to be added together during the progress of an exposure period, representing a ten times improvement in exposure over a single pass. Due to the nature of the trailing of scans at the beginning and end of an exposure period, scans from adjacent cameras should be added together at these points to make a smooth transition.
  • the scan depth increase can be used both to build up light sensitivity and to improve the dynamic range of the image, because of the larger number of samplings being applied to each pixel.
  • Another example also shows how multiple scans from a sensor can be added together for greater light sensitivity.
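The build-up of exposure by summing coincident scans can be sketched as follows (the 3000-pixel line, light level, noise model, and ten-scan count are illustrative assumptions, not values from the patent):

```python
import numpy as np

# Sketch of building up exposure by summing redundant scans of the same
# angular position.
rng = np.random.default_rng(0)
scene_column = np.full(3000, 20.0)        # "true" light hitting one scan line

def noisy_scan(scene):
    """One fast, underexposed readout: signal plus read noise."""
    return scene + rng.normal(0.0, 10.0, scene.shape)

# Add together ten scans timed to coincide with the same position.
buffer = np.zeros_like(scene_column)
for _ in range(10):
    buffer += noisy_scan(scene_column)

# Signal grows by N while the noise grows only by sqrt(N), so the averaged
# buffer tracks the scene much more closely than any single pass.
single = noisy_scan(scene_column)
err_accumulated = abs(buffer / 10 - scene_column).mean()
err_single = abs(single - scene_column).mean()
print(err_accumulated < err_single)
```

The same accumulation also widens the effective dynamic range, since each output pixel is built from many independent samplings.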
  • FIG. 3 shows a top view of the rotating drum 2 , showing the preferred tangential optical axes 16 and 18 of the Right and Left sensors 6 and 10 , and their associated lenses at 20 and 22 , respectively.
  • a sensor such as at 6 is usually wider than the aperture slit 8 ; this means that effectively, because of the space constraints of the interior of the drum, an optical axis 16 cannot be actually on the tangent of the drum, but must be located a short distance away.
  • What is shown here is an optimal arrangement, with two roughly parallel axes 16 , 18 with apertures located slightly forward of the tangent point 24 .
  • Other axis arrangements can be used, except for a radial axis 30 intersecting the center point 26 . Examples are a convergent axis such as is shown at 34 , or a divergent axis such as at 70 .
  • a true tangent axis is shown at 32 .
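These axis arrangements can be distinguished by the perpendicular distance from the axis of rotation to the optical axis: zero for the excluded radial axis, the full drum radius for a true tangent axis. A minimal sketch (the coordinate conventions and helper name are illustrative):

```python
import math

# Sketch: classify an optical-axis arrangement by the perpendicular distance
# from the drum's center (the axis of rotation, at the origin) to the axis.

def axis_offset_from_center(aperture_xy, heading_deg):
    """Distance from the origin to the line through aperture_xy with the given heading."""
    px, py = aperture_xy
    dx = math.cos(math.radians(heading_deg))
    dy = math.sin(math.radians(heading_deg))
    return abs(px * dy - py * dx)     # |p x d| with d a unit vector

r = 2.5  # drum radius in inches (the 5-inch-diameter drum of the text)

# Radial axis: aperture at (r, 0) aimed straight outward -> intersects the center.
print(round(axis_offset_from_center((r, 0.0), 0.0), 6))    # 0.0
# True tangent axis: aperture at (r, 0) aimed along the rim -> offset by the radius.
print(round(axis_offset_from_center((r, 0.0), 90.0), 6))   # 2.5
```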
  • FIG. 4 shows the preferred embodiment of a conventional fisheye lens 38 for a sensor 6 , and an especially wide streamlining front element 40 conforming to the drum surface 42 that takes into account the wide horizontal field of view characteristic of this type of lens.
  • the lens is very narrow and flat, with an arrangement of elements like a slice taken through the middle of a conventional lens.
  • This flat lens has the advantage of the least surface interruption, but lacks the light-gathering abilities of a more conventional round lens, which is the preferred embodiment.
  • a flat fisheye line-scan lens for a slit aperture, such as those produced by Coastal Optical, may have an f-number of f/8,
  • while a conventional round fisheye lens could have an f-number of f/2.8, for nearly eight times more light delivered to the sensor.
  • FIG. 5 shows the action of the spinning drum's sensors in the preferred embodiment, where an object in the field of view is scanned from one, then another, point of view.
  • Five optical axes for the spinning sensor 10 on the drum 2 are shown at 60, 62, 64, 66, and 68, illustrating how the scans of the sensors, especially for very near objects 70, simultaneously sweep different areas 72 and 74.
  • objects are scanned at different times, from different directions.
  • a comparable area 76 of the near object 70 is shown as being scanned by sensor 6 according to the optical axis 78 at the first position 80 , but the same target area is scanned at the fourth position 66 for the sensor 10 .
  • This scanning of objects from different points of view creates the parallax separations within the image that are the basis of any stereoscopic effect.
  • the lateral location within the scanned image is a function of time, because it is the point in time at which the image appeared in the aperture.
  • the scanned area on the object is shown for the Right 6 sensor at 82 and the Left 10 sensor at 84, as well as the area 86 of the object 70 where the images from the two sensors overlap.
  • The apparent stereoscopic separation produced by more than one point of view is called parallax, and the amount of parallax separation in the image from two fixed but separated sensors is determined by the distance between the camera and the object being viewed. Close objects, as seen against a distant background, will appear to be more laterally offset from each other than distant objects will. The distance to objects can be inferred if their images from both viewpoints are identified and measured, but this can be unreliable and calls for substantial processing power. A preferred way is to determine this distance to a target object directly through the addition of a rangefinding element to the rotating camera drum. This rangefinder is shown as being made up of an emitter element 88 and a sensor element 90.
  • This rangefinder, either a miniature radar, laser, or infrared-based unit, takes readings either continuously or at intervals, to determine a distance profile of the distances to the objects in the surrounding field of view during the course of a rotation.
  • This distance profile is used to control an adjustment of parallax separation called convergence, in a convergence profile.
  • the convergence profile can be used to minimize or exaggerate the effects of parallax. If minimized, then closer objects will appear more continuous, and the effect will be of less separation between the stereo viewpoints. If exaggerated, then objects in the distance will be easier to see in stereoscopic depth, as if the stereo viewpoints were widely separated in space.
  • the convergence profile continuously applies the distance readings from the rangefinder to dynamically shift the lateral placement of the vertical scan lines within the final image buffer used for readout.
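A convergence profile of this kind can be sketched as a mapping from the rangefinder's distance readings to per-column lateral shifts (the 127 mm baseline, 4000-column rotation, and small-angle disparity model are illustrative assumptions, not values from the patent):

```python
import numpy as np

# Sketch: convert a rangefinder distance profile (one reading per scan column)
# into per-column lateral shifts for the trailing sensor's scan lines.

def convergence_profile(distances_m, baseline_m=0.127, cols=4000):
    """Angular disparity ~ baseline / distance; express it in scan columns."""
    cols_per_radian = cols / (2 * np.pi)          # one rotation spans all columns
    disparity_rad = baseline_m / np.asarray(distances_m, dtype=float)
    return disparity_rad * cols_per_radian

# Closer objects call for a larger corrective shift than distant ones.
shifts = convergence_profile([1.0, 4.0, 100.0])
print(shifts.round(2))
```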
  • a target on a nearby object 70 is seen according to two optical axes from the two sensors, first at 78 for sensor 6 and then at 66 for sensor 10 . This makes 6 the leading sensor, producing the leading scan lines, and 10 the trailing sensor, producing the trailing scan lines.
  • FIG. 6 shows examples of two equirectangular stereoscopic views 102 and 104 produced by scans of a full spherical field of view from two sensors in the preferred embodiment.
  • the scene here is of a carnival midway, with a swing ride sending riders overhead.
  • This type of image is the result of a linear fisheye sensor with a hemispheric vertical field of view being rotated through a full 360-degree rotation.
  • the two stereoscopic views can be considered as a “left-eye view” and a “right-eye view”.
  • the optical axes of the multiple sensors in the camera should be in the same plane and essentially parallel, facing toward the same side of the drum, for the effect of convergence on objects at infinity.
  • the appearance of an object can be laterally shifted in the recording or playback in either direction according to the distance to the desired object, preferably as directly measured by a rangefinder.
  • FIG. 7 is a schematic view showing the application of a convergence profile resulting in lateral shifting of portions of a recorded image to adjust stereoscopic convergence according to distance. This shifting is the result of adjustments in the timing of scans. What is shown here are the timings of the recordings of the scans, relative to an uncorrected timebase of recording which proceeds from left to right within the frame of information 106. All of the scan information proceeds through a FIFO delay buffer with a certain default delay. Adding to or subtracting from this default delay allows for shifts either forward or backward in time. An uncorrected region is shown at 108.
  • a region where the scans are shifted to the right (forward in time) is at 110 , and a region where they are shifted to the left (backward in time) is at 112 .
  • One view may be affected in this way, or both may be used for a greater range of corrections.
  • the data from earlier or later scan lines usually needs to be taken into account and interpolated to produce the final image readout.
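The delay-based lateral shifting, with interpolation between adjacent scan lines, might be sketched as follows (the array shapes and the uniform one-column shift are illustrative):

```python
import numpy as np

# Sketch of the FIFO-delay shifting described above: each column of the frame
# (one scan line per column) is moved forward or backward in time, i.e. right
# or left in the image, with linear interpolation between neighboring scan
# lines for fractional shifts.

def shift_columns(frame, shifts):
    """frame: H x W array, one scan line per column; shifts: offset per column."""
    h, w = frame.shape
    out = np.empty_like(frame, dtype=float)
    for x in range(w):
        src = x - shifts[x]                    # where this column's data comes from
        x0 = int(np.floor(src))
        t = src - x0
        a = frame[:, min(max(x0, 0), w - 1)]
        b = frame[:, min(max(x0 + 1, 0), w - 1)]
        out[:, x] = (1 - t) * a + t * b        # blend adjacent scan lines
    return out

frame = np.tile(np.arange(8.0), (4, 1))        # tiny frame: value == column index
shifted = shift_columns(frame, np.full(8, 1.0))
print(shifted[0])   # [0. 0. 1. 2. 3. 4. 5. 6.]
```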
  • When the final image is viewed, either as a mono or stereo image, the preferred embodiment employs the extraction of a smaller movable window of interest as needed. This presents a more natural view of a portion of the overall scene, while allowing the freedom to look around in any direction. This extraction of a region of interest can be from either a live or prerecorded image stream.
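Extraction of such a movable window of interest from an equirectangular frame can be sketched as below (the function name, nearest-pixel sampling, and output size are illustrative simplifications):

```python
import numpy as np

# Sketch of extracting a movable window of interest from an equirectangular
# frame, specified by a center direction (yaw, pitch) and an angular size.

def extract_window(equi, yaw_deg, pitch_deg, w_deg, h_deg, out_w=64, out_h=32):
    H, W = equi.shape[:2]
    yaws = yaw_deg + np.linspace(-w_deg / 2, w_deg / 2, out_w)
    pitches = pitch_deg + np.linspace(h_deg / 2, -h_deg / 2, out_h)
    # Equirectangular mapping: yaw covers the full width, pitch +90..-90 the height.
    cols = ((yaws % 360.0) / 360.0 * W).astype(int) % W
    rows = np.clip(((90.0 - pitches) / 180.0 * H).astype(int), 0, H - 1)
    return equi[np.ix_(rows, cols)]

pano = np.arange(180 * 360).reshape(180, 360)   # toy panorama, 1 pixel per degree
view = extract_window(pano, yaw_deg=0.0, pitch_deg=0.0, w_deg=60.0, h_deg=30.0)
print(view.shape)   # (32, 64)
```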
  • FIG. 8 is a schematic view showing the preferred application of lateral shifting for convergence in a region of interest 114 within the overall image 106 in order to align objects atop one another.
  • this image transformation can be done only on the portion of the whole frame within or immediately adjacent to the region of interest 114 .
  • An advance in the recording of trailing scan line 66 will shift its apparent position closer toward the position of 64 or 62, simulating the appearance of target 76 as being at a much greater distance and having less parallax.
  • the application of delay in the recording of leading scan line 78 will make target 76 seem even closer.
  • the record of these shifts in the course of an overall rotation of the camera is the convergence profile. Advances in the relative position of scan lines can also be done if they are stored in a buffer first, and then usually read after a delay, or in special cases even sooner.
  • FIG. 6 shows examples of two equirectangular stereoscopic views 102 and 104 produced by scans of a full spherical field of view from two sensors in the preferred embodiment.
  • the scene here is of a carnival midway, with a swing ride sending riders overhead.
  • This type of image is the result of a linear fisheye sensor with a hemispheric vertical field of view being rotated through a full 360-degree rotation.
  • the two stereoscopic views can be considered as a “left-eye view” and a “right-eye view”.
  • the optical axes of the multiple sensors in the camera should be in the same plane and essentially parallel, facing toward the same side of the drum, for the effect of convergence on objects at infinity.
  • the appearance of an object can be laterally shifted in the recording or playback in either direction according to the distance to the desired object, preferably as directly measured by a rangefinder.
  • FIG. 7 is a schematic view showing of the application of a convergence profile resulting in lateral shifting of portions of a recorded image to adjust stereoscopic convergence according to distance. This shifting is the result of adjustments in the timing of scans. What is shown here are the timings of the recordings of the scans, relative to an uncorrected timebase of recording which proceeds from left to right within the frame of information 106 . All of the scan information proceeds through a FIFO delay buffer with a certain default delay. Adding to or subtracting from this default delay allows for shifts either forward or backward in time. An uncorrected region is shown at 108 .
  • a region where the scans are shifted to the right (forward in time) is at 110 , and a region where they are shifted to the left (backward in time) is at 112 .
  • One view may be affected in this way, or both may be used for a greater range of corrections.
  • the data from earlier or later scan lines usually needs to be taken into account and interpolated to produce the final image readout.
  • the preferred embodiment When the final image is viewed, either as a mono or stereo image, the preferred embodiment employs the extraction of a smaller movable window of interest as needed. This presents a more natural view of a portion of the overall scene, while allowing the freedom to look around in any direction. This extraction of a region of interest can be from either a live or prerecorded image stream.
  • FIG. 8 is a schematic view showing the preferred application of lateral shifting for convergence in a region of interest 114 within the overall image 106 in order to align objects atop one another.
  • this image transformation can be done only on the portion of the whole frame within or immediately adjacent to the region of interest 114 .
  • An example of a portion of the scans shifted forward in time is at 116 , and scans shifted backward in time are at 118 .
  • An automatic convergence profile can be generated by the output of a rangefinder, but that profile can be modified to converge at different distances by manual control override as necessary to focus on specific objects.
  • the preferred recording device for picture information would be a digital recorder, such as a streaming tape or disk-based digital recorder.
  • Application of image compression to the picture information prior to use reduces the bandwidth necessary for any digital storage, transmission, or playback.
  • FIG. 9 shows a block diagram of the various components of the entire system in the preferred embodiment.
  • the Right 6 and Left 10 Linear Sensors have equivalent associated components such as optics 20 and 22 and optical axes 16 and 18 , respectively, forming two subcamera systems within the boundaries of the camera drum 120 .
  • Each sensor has its associated power and drive electronics 122 which are coupled to a central Power source 124 and Control electronics 126 , which are related to the motion control regulation 128 for the motor 4 .
  • the sensor outputs an electronic signal representing raw digital image data 130 . This data goes through image compression 132 and any necessary I/O formatting 134 . Then the image signal exits the drum through rotary Slip Rings 136 .
  • a separate rangefinder 138 on the drum outputs distance data 140 for controlling stereoscopic convergence.
  • the image data from the sensors then is processed with the appropriate delays 142 for alignment of their scans together in time, plus any convergence adjustments.
  • the data from both channels may be multiplexed together 144 prior to being sent to the recorder 146 , which preferably should be digital.
  • the image data can be transmitted 150 prior to decompression 152 as part of a display process featuring the extraction of a movable Region of Interest 154 which appears on a Display 156 .
  • Alternate embodiments of the present invention are directed toward the solution of specific problems. For example, line-scan sensors at the present time do not have a fast enough readout rate to produce the above horizontal resolution if the drum spins around very rapidly. One solution to this would be to multiplex several sensors to fill in greater resolution.
  • FIG. 10 shows an alternative embodiment of the present invention, comprising four sensors, 10 , 158 , 160 and 162 , in place in a rotatable drum 2 .
  • these four sensors 10 , 158 , 160 and 162 could be used to build up the proper resolution.
  • Each of the four sensors has the same angle of optical axis 164 , 166 , 168 , and 170 relative to the drum 2 , according to their associated lenses at 174 , 176 , 178 , and 180 , and the four sensors are equidistantly spaced around the circumference of the drum.
  • each sensor would scan an object in the surrounding view at a slightly different time. By the addition of digital delays, these scans can be recorded adjacently, to produce an apparently continuous image.
  • FIG. 11 shows a schematic diagram of how a composite image is assembled from the data coming from the sensors, showing how sensor spacing about the circumference of the rotatable drum 2 is taken into account when recording signals from the sensors.
  • At 182 is a scan signal from the first sensor 10
  • at 184 is the signal from the second sensor 158 , taken at the times when they are at roughly the same point on the rotating drum relative to the surrounding scene.
  • By adding an appropriate amount of fixed delay 186 these scans of a given portion of the surrounding field of view are recorded adjacently at 188 and 190 , respectively. Amounts of fixed delay for each sensor compensate for the time differences between when a scan took place and when it is recorded. It is preferable to have the sensor spacing be as close together as possible, to minimize any visible discontinuities due to these time differences between the adjacent scans as they are recorded in the final image.
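The fixed-delay interleaving just described can be sketched as follows (an illustrative model with assumed names, not the patent's implementation): each sensor's stream of scans is offset by a fixed whole number of scan periods so that scans of the same portion of the surrounding scene are recorded adjacently in the composite image.

```python
def interleave_with_delays(streams, delays):
    """streams[k][t] is the scan column read from sensor k at scan
    time t; delays[k] is the fixed delay (in whole scan periods)
    compensating for sensor k's position around the drum.
    Returns the composite sequence of adjacently recorded columns."""
    n = len(streams)
    composite = []
    t = 0
    # stop when any delayed stream runs out of scans
    while all(t + delays[k] < len(streams[k]) for k in range(n)):
        for k in range(n):
            composite.append(streams[k][t + delays[k]])
        t += 1
    return composite
```

With two sensors and a delay of 2 scan periods on the second, for example, `interleave_with_delays([a, b], [0, 2])` records `a[0]` next to `b[2]`, then `a[1]` next to `b[3]`, and so on.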
  • FIG. 12 shows an alternate embodiment for streamlining and soundproofing the camera. If a clear enclosure 192 is used to encase and soundproof the spinning drum 2 , then the optical axis 16 of a sensor will exit the enclosure at a very oblique angle, producing distortion as it passes through the enclosure, and any seams in the clear enclosure will appear as interruptions in the overall recorded field of view. There is also the potential for problems from internal reflections from the enclosure surface into the lenses. Thus an enclosure is not the preferred solution to this problem.
  • the large front surface of a conventional lens 194 would ordinarily create an interruption on the drum surface 42 that would produce noise, but the front of the lens can be made to conform to the surface of the drum by a streamlining front element 196.
  • An optimum design for a lens might be a hybrid, with a Gradium™ front element, where the differing refraction index characteristic of this type of glass runs laterally across the front of the lens, in a direction at a right angle to the line of the sensor.
  • the slant could be reversed for opposite sides of the drum, again to match the surface.
  • An alternate covering would be more conventional glass for streamlining, but this has the risk of introducing optical distortion in some form.
  • One applicable form of recorder is a film strip recorder.
  • High-quality film has the advantage of offering higher density storage of image data, in an essentially analog form, at a lower cost than conventional digital streaming media, especially for large quantities of images.
  • FIG. 13 shows an oblique schematic view of an alternative external film-based recorder for image information from the camera.
  • the film apparatus described here is located external to the camera drum itself.
  • the dimensions of the film strip thus need have no physical relation to the primary drum apparatus.
  • the only communication between the two is through electrical transmission of the signals from the sensors shown here coming in on a line 216 .
  • the image is formed for the film strip recorder by means that turn the electrical signals representing the scans back into light in a linear display 218 .
  • Amplification of the signals 220 can be applied if necessary to produce the proper image, which is focused and reduced through optics 222 before it is recorded onto the continuously moving film at an exposure point 224 .
  • the supply reel for the film is at 226
  • the takeup reel is at 228 . Because the film strip is in constant motion, a continuous image is built up through the recording process. Any divisions in this recorded image on film, such as frame divisions, should be based on the film sprocket holes outside the image, or other measurements of distance, and not be interruptions to the image.
  • the image on the film can be digitized at another time if desired, such as for playback or transmission of the image.
  • FIG. 14 shows an oblique view of an alternate embodiment of the camera drum using a film-based recording system.
  • the entire apparatus must rotate according to an axis of rotation 230 .
  • the supply reel 232 is in an upper portion of the drum 234 , and the film 236 goes down into a middle portion 238 where it is exposed at 240 by a slit aperture according to the optical axis 16 .
  • the film proceeds down 242 to the lower portion 244 where it is wound onto the takeup reel 246 .
  • a similar arrangement of parts would be needed for a second point of view, such as for stereoscopic recording. This arrangement has the advantage of balancing the weight of the supply and takeup reels around a central axis of rotation 214.
  • Film as a recording medium may be chosen because certain films may be more sensitive than a digital sensor, and yield better images.
  • this approach has several disadvantages. Foremost among them are the size and weight of the apparatus required, and the difficulty of putting it into rapid rotation for scanning.
  • the rotating scan camera of this invention can be used to record a panoramic field of view with increased resolution and stereoscopic separation throughout the recorded image.
  • a single oblique sensor will also produce a usable panoramic scanned image, although without the stereoscopic features of a dual scan.
  • Lesser resolution sensors can be used for more inexpensive solutions or for situations where the bandwidth for recording or playback is limited.
  • optical axes shown here as tangential can also be created within a larger drum.
  • the axes are still parallel and in the same plane, but the apertures are located farther forward from the tangent point.
  • Oblique but not tangential axes can also be used, either convergent or divergent relative to the tangential axis.
  • the optical axes in these cases still do not intersect at any point behind their lenses and within the drum, unlike the radial axis arrangement characteristic of the prior art.
  • the sensors can be monochrome, color, or have another spectral sensitivity such as ultraviolet or infrared.
  • An example of a color linear sensor is the Eastman Kodak KLI-2113, a trilinear RGB sensor with a 2098×3 resolution, an image area of 29.4×0.24 mm, a sensitivity of 14 bits, and a readout rate of 60 MHz.
  • Other CCD and CMOS line scan sensors have other parameters. Generally speaking, at the present time, CCD sensors are more sensitive than CMOS sensors and allow for exposure of an entire vertical scan at once, but require more complex power sources.
  • CMOS sensors can incorporate many types of logic elements on the chips themselves. This enables on-chip functions such as digitizing, compression, enhancement, and pattern recognition.
  • the line scan sensor should be as sensitive to light as possible, as well as having a very fast readout rate.
  • the fast readout rate enables many individual vertical scans to be read from the sensor while it is in motion, and the sensitivity of the sensor enables a usable image to be produced even during the extremely short exposure time represented by such rapid scans.
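The trade-off stated here can be made concrete. Assuming, purely for illustration, a uniform line readout rate and a uniform rotation speed, the number of scan columns available per revolution, which is the horizontal resolution of one panoramic frame, is simply the line rate divided by the frame rate:

```python
def scans_per_revolution(line_rate_hz, frames_per_second):
    """Horizontal resolution of one panoramic frame: how many line
    readouts fit into a single drum revolution (one frame)."""
    return line_rate_hz / frames_per_second

# e.g. a sensor limited to 30,000 line reads per second, spun at
# 30 revolutions (frames) per second, yields 1000 scans per frame
```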
  • PixelVision and Scientific Imaging Technologies of Beaverton, Oreg. have developed especially sensitive back-illuminated CCDs for astronomical use.
  • if multiple sensors must be used to build up the proper resolution, then they should be carefully spaced around the periphery of the drum while taking into account the desired resolution. For example, suppose four sensors, each limited to a readout speed of only 1000 scans during the course of a revolution, are used to build up an overall resolution of 4000 scans total. If the four sensors are spaced within a span of 90 degrees on the drum and exactly 22.5 degrees apart, then every 62.5 scans (1000 × (22.5/360)) each sensor would reach the same position as the next one in line. The first sensor, in other words, would advance in the course of the revolution, and after 62.5 scans the next in line would reach the starting position of the first.
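The spacing arithmetic in this example can be written out directly (an illustrative check; the function name is an assumption):

```python
def scans_between_sensors(scans_per_rev, spacing_degrees):
    """Scan periods elapsed before one sensor reaches the drum
    position of the next sensor in line."""
    return scans_per_rev * spacing_degrees / 360.0

# four sensors 22.5 degrees apart, each reading 1000 scans per
# revolution: 1000 * (22.5 / 360) = 62.5 scans between sensors
```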
  • Each sensor should have the same type of lens, to enable the matching of recorded images. Since the horizontal field of view of the line sensor is negligible, the important consideration for the lens is its vertical field of view.
  • a fisheye lens, characterized by having a field of view of close to 180 degrees, is used to achieve an essentially spherical recording of the surrounding view as the drum rotates.
  • the lens should be small, of high quality, and fast, which means that it transmits light efficiently. Good quality lenses of this type are being made by Coastal Optical. Estelle's “Wide-angle Photographic Lens System and Photographic Camera” (U.S. Pat. No. 5,502,597) describes an especially compact and efficient design for a wide-angle lens.
  • In order to ensure recording of the area directly above the rotating camera, while creating a “blind spot” directly underneath it to hide a camera operator or mount, it may be preferable in practice to construct the subcamera assembly so as to raise the axis of the lens by up to 10 degrees from the horizontal plane of the drum.
  • the motor assembly should be very smooth and quiet, and able to hold to a precise rotational speed.
  • Various types of electronically controlled motors have been developed to meet these characteristics.
  • a recent line of direct-drive brushless dc motors from Thomson Airpax Mechatronics Group are 66% smaller and 30% more efficient than other industry-standard motors.
  • a smaller size for the motor would aid in the portability of the camera system by making it lighter in weight overall.
  • Sinusoidal motion control is preferred for the smoothest possible regulation of speed.
  • An 1800 RPM motor would produce an effective frame speed of 30 fps. This happens to be the type of motor used for the head drum in a VHS tape machine.
  • Rotary couplings such as slip rings are used to transfer electrical signals to and from the electronic elements in the rotating drum.
  • Litton Poly-Scientific's EC3848, for example, is a high-speed slip ring assembly of up to 10 lines for rotational speeds up to 10,000 RPM. Examples of the type of signals that would go into the drum are power and timing signals for the sensors, while the readout from the sensors would come out along one or more lines as required. Some sensors have multi-tap readouts to increase effective resolution and data speed.
  • the DALSA CT-E4-2048W, for example, uses the IT-PI-2048 sensor, which has four simultaneous readout channels at speeds of 20 MHz to 25 MHz each, for an effective readout rate for the sensor of up to 100 MHz. The number of rings in the coupling depends on the number of electrical channels needed to maintain proper connections.
  • Rangefinding sensors come in a variety of configurations.
  • the miniature radar-on-a-chip invented at Lawrence Livermore Laboratory in 1995 is one example of a low-cost sensor that could be used to obtain readings of the distance to various objects during the course of a scan.
  • Active Infrared Autofocus devices such as are found on still cameras could also be used, especially to give distance readings within a range of around 20 feet (6 m).
  • LIDAR (Light Detection And Ranging) devices can also be used for scanning a surrounding scene, but the devices to date tend to have too slow a sampling rate to yield a detailed record of the depth of an overall scene.
  • compression should preferably be applied to the data prior to its transmission from the drum.
  • Various methods for image compression are currently available, such as Discrete Cosine Transform (DCT) and the Motion Picture Experts Group (MPEG) family of motion picture compression.
  • a preferred embodiment for compression in the present invention would be wavelet compression.
  • the final data rate for the output would be determined by the desired level of quality in the image, as well as the optimum data rate of the recorder.
  • the transmission can be in the form of a wired or, preferably, a wireless connection.
  • the preferred embodiment makes use of one of the new high-quality high-bandwidth standards. These include Fibre Channel (ANSI x3.230), which is made for fiber optics and has an effective speed limit of 400 mbps (million bits per second), the same speed as for the IEEE-1394 “Firewire” interface.
  • the Serial Digital Interface (SDI, a.k.a. SMPTE 292M) is another such high-bandwidth standard.
  • the Gigabit Ethernet standard has a maximum speed of 1776 mbps.
  • An even faster data speed is represented by the 10 Gbps Ethernet standard now being developed by the 10 Gigabit Ethernet Alliance, whose members include Intel, Extreme Networks, Nortel, and World Wide Packets. This speed would handle the full-bandwidth data stream at the preferred resolution and frame speed described above.
  • the image data coming from each sensor represents a vertical column.
  • the columns are usually put together directly to make the final image in a digital buffer.
  • a video recorder is to be used for recording this type of image, then the recording method is modified, since video images are made up of rows, which are read out horizontally, rather than columns.
  • an image made up of columns is first stored in a buffer, then read out in rows, usually while a second buffer is being filled with the columns of a new frame.
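The double-buffered column-to-row conversion described here can be sketched as follows (assumed class and method names; a minimal illustration rather than a production design):

```python
import numpy as np

class ColumnToRowConverter:
    """Accumulate vertical scan columns into a frame buffer, then
    read the completed frame out in horizontal rows for a video
    recorder, while a fresh buffer fills with the next frame."""

    def __init__(self, height, width):
        self.height, self.width = height, width
        self.filling = np.zeros((height, width))  # buffer being filled
        self.ready = None                         # last completed frame
        self.col = 0

    def push_column(self, column):
        self.filling[:, self.col] = column
        self.col += 1
        if self.col == self.width:        # frame complete:
            self.ready = self.filling     # hand it off for row readout
            self.filling = np.zeros((self.height, self.width))
            self.col = 0

    def read_rows(self):
        """Yield the completed frame's rows in video scan order."""
        for row in self.ready:
            yield row
```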
  • a readout of an HDTV video image of approximately 1920 by 1080 pixels (which can be identified as High-level, High-profile ADTV), is preferred, as a way of preserving maximum image quality within existing video standards.
  • HDTV digital recorders such as the Panasonic AJ-HD2700 D-5 VCR (for a high-bandwidth SDI signal) or even the low-cost Panasonic PVHD 1000 D-VHS VCR (for a compressed signal) are examples of recorders that could be used for this purpose.
  • the image can be formed for the film by a linear display.
  • Many forms of displays and microdisplays may be suitable, depending on the brightness, size and image characteristics required. These displays include an LED or laser diode array, or a linear CRT, LCD, DLP (Digital Light Processing) or plasma display, with reduction and focusing optics as necessary to transfer the light onto the film.
  • a linear color plasma display has the advantage of apparently continuous adjacent pixels.
  • a plasma display such as used in the Philips Flat TV, for example, has pixels measuring 1.08 mm square; therefore a 2098-pixel linear sensor requires an array 226.584 cm (89.2 inches) long, which requires substantial reduction optics.
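The length figure quoted above follows from simple arithmetic (an illustrative check of the numbers in the text):

```python
pixels = 2098      # linear sensor resolution, per the text
pitch_mm = 1.08    # plasma display pixel pitch, per the text

length_cm = pixels * pitch_mm / 10.0  # 2265.84 mm -> 226.584 cm
length_in = length_cm / 2.54          # about 89.2 inches
```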
  • An alternate form of arrangement would be to split the display into a multiplicity of sections, which are combined to make a continuous recorded image. This can make use of any multi-tap readouts of the sensor, such as those featured in the DALSA camera mentioned above, where each of the four parallel channels coming from the sensor can be made to control its own display array.
  • the Cine V Solitaire Image Recorder (U.S. Pat. No. 4,754,334) is an example of an advanced film recorder for digital data.
  • the extracted signals from the sensors can be sent to other locations for use via electronic transmission, for either a live or a prerecorded image. It is preferred that a movable region of interest be extracted from within the overall image, either live or prerecorded. If taken from the live image, then if desired, the rest of the image can be discarded. This has the advantage of reducing the overall requirements for transmission or processing.

Abstract

A scanning camera with a rotating drum has one or more sensors characterized by a non-radial optical axis. With two sensors on opposite sides of the drum and facing in substantially the same direction, stereoscopic recording of a panorama is accomplished as the drum rotates. An adjustment of convergence between stereoscopic viewpoints is described that improves the viewing and interpretation of stereoscopic images. Rapid rotation of the scanning camera produces panoramic motion picture recording, with the final frame speed dependent on the sensitivity and speed of the sensor, the resolution desired, and the capabilities of the recording device. The preferred embodiment employs rotating fisheye lenses for a substantially full-sphere field of view. Additional sensors in the same arrangement are used to increase resolution and light sensitivity through multiplexed or additive recording of the image data. Recording image information using film, either internal or external to the camera drum, is also described as a cost-effective alternative to digital media storage.

Description

    BACKGROUND
  • 1. Field of the Invention
  • This invention generally relates to a panoramic image reproduction system, and in particular to the use of rotary scanning elements in a camera design, with multiple sensors being used to establish stereoscopic viewpoints, adjust their convergence, and to increase the density of information being scanned.
  • 2. Description of the Prior Art
  • Cameras for capturing still or motion pictures of the external world have usually used an area array sensor, such as a 640×480 Charge-Coupled Device (CCD) for NTSC video images. These area sensors use a rectangular or square area of light-sensitive elements and expose them all at once. New forms of Complementary Metal Oxide Semiconductor (CMOS) sensors for cameras are also being made in a variety of resolutions.
  • Linear scan sensors, either CMOS or CCD, are also being used to increase resolution for specific applications. These sensors have a single row of light-sensitive elements, designed for monochrome or color readout, and sensitive to either visible light or some other part of the spectrum such as infrared or ultraviolet. The image is made by successive readouts of the sensor as it moves across the image, building up the final resolution. This is also sometimes referred to as a slit scan or line scan sensor. Digital scans of this type are already found in such devices as fax machines and flatbed scanners. Line scan cameras are often used in industrial processes to obtain clear pictures of objects in motion. Examples of these cameras are the Wintriss Engineering OPSIS 5150ALC for monochrome images, which uses a monochrome CCD linear sensor with 5150 pixels and onboard image processing, and the DALSA CT-E4-2048W, which uses a color CCD linear sensor with 2048 pixels. Typically these cameras are fixed in place, and look at objects moving past them. Unlike area-scan sensors, the resulting images do not suffer from motion blur.
  • Scan-back digital cameras, such as the Better Light Model 8000 camera back are also doing high-quality digital still photography. The use of a slow scan from a linear sensor allows for maximum resolution to be built up when photographing artwork, documents or other subjects.
  • Scanning sensors are also used in so-called “push-broom” satellite surveillance applications, where as the satellite sweeps across the surface, a succession of readouts of the scanner creates a very large and detailed overall image. Existing panoramic slit scan cameras have also been made that use a moving slit to expose a strip of film, such as the Widelux™ camera for stills.
  • Various patents describe scanning cameras. All typically feature a rotation of the scanner to create the image according to an optical axis radial to the axis of rotation. Keller's “Panoramic Camera” (U.S. Pat. No. 5,659,804), for example, describes rotation of a scanner where the nodal point of the optics for the scanner is located radially according to the axis of rotation. Oxaal's “Method and Apparatus for Producing a 360° Spherical Visual Data Set” (U.S. Pat. No. 5,903,782) also involves a radial optical axis, this time for a “fisheye” lens. This is the approach used in the high-resolution digital PanoScan™ still camera. Jehle's “360 degree IR Surveillance with Panoramic Display” (U.S. Pat. No. 4,977,323) uses an infrared scanner also aligned radially to the axis of rotation. Federau's “Line-Scan Panoramic Camera” (U.S. Pat. No. 4,532,544) also describes a radial axis for a television camera. And Globus et al's “Panoramic Camera” (U.S. Pat. No. 4,241,985), marketed as the Globuscope™, employs a slit and a radial optical axis to expose a film strip.
  • Schonherr (U.S. Pat. No. 5,305,035) describes a design for slots that are intended to “seal off” the edges of a light-sensitive cylindrical recording surface when used with a rotating objective drum, thus preventing extraneous light at the edges of the recording, or “light safety”. The drum of Schoenherr et al has a lens in its center which focuses light from a wide input image window on one side onto a narrow image output window on the other side, so that as the drum rotates, the light passes through this second window in a scanning fashion onto the fixed cylindrical recording surface outside of the drum. Because the recording surface is separate and outside of the drum, this device has a horizontal field of view that is inherently less than 180 degrees. Other film strip-based scanning cameras have also been made, going back to the cumbersome Cirkut camera of the nineteenth century. Contemporary examples include the Widelux, which exposes a film strip in 120-degree increments of rotation, and the Hulcherama camera which exposes a continuous strip of film in any amount of rotation, using gear motors to advance the film in time with the sweep of the lens. Woltz's “Panoramic Motion Picture Camera and Method” (U.S. Pat. No. 4,602,857) describes a rotating motion picture film camera. All of these cameras share the same orientation of the optical axis as being radial to the axis of rotation.
  • Keast, et al (U.S. Pat. No. 5,721,585) discloses a rotating camera which in one embodiment has two cameras with axes not radial to the axis of rotation. However, Keast depends upon the addition of counter-rotation relay optics between the lens and the sensor. This adds significant weight and complexity to the device, and also requires that a perfectly consistent geometrical relationship be consistently maintained between the input lens and the sensor along a long optical path, even while the whole apparatus is in rapid motion and presumably being carried in the field in the manner of a regular camera. The described counter-rotation optics are also inherently impractical, because as the overall apparatus rotates, at the point where the lens is opposite the sensor, the mirror surfaces and their optical supports need to completely disappear in order to avoid an interruption in the image.
  • What these solutions lack is an effective way to record not only the appearance of the real world in motion through the slit, but also to use that information to create a stereoscopic image of the entire field of view. Without that stereoscopic information, the most realistic picture of a surrounding scene is not possible, as well as other applications such as rangefinding that depend on parallax offsets between different points of view. The reason for this is that existing rotary solutions assume an optical axis for the light path that is radial to the axis of the rotating array. This restriction precludes the use of other axes in a rotating array for stereoscopic effects or increased resolution. If one attempted to create a stereoscopic view with two panoramic scans taken with such cameras, by taking them from two laterally offset points in space, the stereo illusion would be strong only for the area directly orthogonal to the axis connecting the two points, but would fall off to zero as you approached that axis in the course of looking around. This is described in McMillan & Bishop's “Plenoptic Modeling: An Image-Based Rendering System” in SIGGRAPH 95 Conference Proceedings pp. 39-46. An overview of stereoscopic imaging systems, including relevant US and foreign patents, can be found in Michael Starks' summary article “Stereoscopic Imaging Technology” from 3DTV Corporation (http://www.3dmagic.com).
  • In a pair of stereoscopic panoramas, the lateral offsets between the objects in them represent their distances from the camera. However, these offsets can be dynamically removed by digital or other means, on a localized basis, to produce a more continuous image, with the amounts of correction also reflecting a depth map of the scene. This dynamic correction is in contrast with the static approaches of the prior art. Sheiman et al (U.S. Pat. No. 4,235,515), for example, discloses a fixed, physical viewing method using prisms in sheets to establish apparent dual images for stereo viewing, and is directed at creating a fixed setting that removes the overall lateral distortion between the images.
  • The use of multiple cameras can not only establish two stereoscopic viewpoints, but also be used to increase resolution and light sensitivity in the recording process. Keast et al describes the use of two cameras, purely for establishing stereo viewpoints, and the counter-rotation relay optics described preclude more than two cameras from being used. In addition, the described method suggests that there would be significant light interference between the light from the two cameras.
  • The present invention will address all of these shortcomings in the prior art. The present application is based in part on the disclosure document no. 466771 “Rotating Scan Camera with Oblique Axis” filed Dec. 22, 1999, and is a continuation-in-part of application Ser. No. 09/505,601 “Rotating Scan Camera” filed Feb. 17, 2000.
  • OBJECTS AND ADVANTAGES OF THE INVENTION
  • A. It is an object of the present invention to provide a method, and an apparatus for applying that method, for stereoscopic imaging of a surrounding scene by means of separated views having equal parallax separation throughout the scene.
  • B. It is also an object of the present invention to provide a method, and an apparatus for applying that method, for increasing the resolution of the recorded image by the use of multiple line scan sensors. Unlike the prior art, which is directed toward one or two sensors and optical paths, this use of multiple sensors enables better resolution in the final image and the use of sensors that would otherwise be unable to give good results in a single scan.
  • C. It is also an object of the present invention to provide a method, and an apparatus for applying that method, for increasing the light sensitivity of the recording device by the storage and addition of redundant scans from line scan sensors, unlike the prior art, which is directed toward a single pass recording of a given subject.
  • D. It is also the object of the present invention to provide a method, and an apparatus for applying that method, for recording panoramic images in full motion, by means of a simplified recording mechanism. The prior art is largely directed toward the production of panoramic still photographs, and the art featuring film-based recording in particular is unsuitable for motion picture photography, because of the need to put a large and heavy film transport and storage mechanism into rapid motion.
  • E. It is also an object of the present invention to provide means for recording a fully spherical field of view in a seamless manner. The prior art for panoramic scanning is primarily directed toward a cylindrical panoramic format that omits a substantial portion of the surrounding scene, and spherical imaging to date has mainly involved multiple views that must be seamed together prior to viewing, or scanning approaches that produce interruptions in the field of view.
  • F. It is also an object of the invention to dynamically adjust the convergence of a panoramic stereoscopic view in order to focus viewing on objects at selected distances. No prior art addresses this problem.
  • G. It is also an object of the present invention to provide a convenient and cost-effective method and apparatus for recording high-quality scan image data on film.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an oblique view of the preferred embodiment of the rotating camera, with a hemispheric field of view for each of two sensors in a rotating camera drum with conventional fisheye lenses and streamlining elements.
  • FIG. 2 shows a simplified oblique view of a basic embodiment featuring a rotating drum, the motor and carrying handle for it, and two sensors in offset positions, showing an aperture slit for each sensor.
  • FIG. 3 shows a top view of the rotating drum of a basic embodiment, showing the sensor and lens components of the subcamera assemblies, along with the optical axis arrangements characteristic of the system.
  • FIG. 4 shows a closeup schematic top view of the preferred embodiment using a fisheye lens with a streamlining covering for the drum surface.
  • FIG. 5 is a top view of the camera drum in the preferred embodiment, showing the action of the spinning drum's sensors where an object in the field of view is scanned from different points of view, allowing for stereoscopic effects, with the addition of a rangefinding device as part of the scanning drum.
  • FIG. 6 shows two equirectangular stereoscopic views produced by scans of the preferred full spherical field of view according to the present invention.
  • FIG. 7 is a schematic view showing the application of lateral distortion to portions of a recorded image to achieve stereoscopic convergence at a given distance, such as would be produced by delays in the timing of the recording of scans.
  • FIG. 8 is a schematic view showing a preferred embodiment wherein lateral distortion is applied to a region of interest within the overall image to achieve stereoscopic convergence.
  • FIG. 9 shows a block diagram of the various components of the entire system in the preferred embodiment.
  • FIG. 10 shows a top view of an alternate arrangement for increasing the overall resolution, here with four sensors in place in a rotating drum, spaced around its circumference.
  • FIG. 11 shows a schematic diagram of the adjacent recording of the scans from different sensors, using various amounts of delay for the signal from each sensor.
  • FIG. 12 shows a closeup schematic top view of a conventional lens for the sensor, and ways to streamline the rapidly rotating drum surface.
  • FIG. 13 shows an oblique schematic view of an alternate external film-based recorder for camera image information.
  • FIG. 14 shows an oblique view of an alternate embodiment of the camera drum using a film-based recording system.
  • DETAILED DESCRIPTION
  • A rotary scanning camera of a novel design offers improved performance in capturing a panoramic surrounding scene. One or more line-scan sensors are used in a sweeping recording of a surrounding scene. The scan can be of a partial rotation or of one or more full rotations. Unlike most of the prior art, these sensors each have an optical axis that does not intersect the axis of rotation.
  • In the preferred embodiment, a rotating drum holds two or more line-scan devices for scanning the appearance of a surrounding field of view from stereoscopically separated viewpoints, with each optical axis on approximately a tangent line to the cylinder. This creates a pair of offset sweeping scans that record objects in the surrounding view from one, then the other, point of view, thereby creating an effective parallax separation in the recorded images. This parallax separation is the same for every point in the rotation of the drum.
  • FIG. 1 shows an oblique view of the preferred embodiment of the rotating camera, with a hemispheric field of view for each of two sensors 6 and 10 in a rotating camera drum 2 with streamlining elements such as at 40. A fisheye lens 38 with an optical axis 16 is shown for sensor 6, producing a scan 92 with a vertical field of view which extends 180 degrees. As the drum 2 rotates 14 around an axis of rotation 46, driven by a motor 4, additional scans are produced, such as the second scan 94. The same is true for the scans 96 and 98 as recorded by the sensor 10. The vertical boundary of a scan's field of view is at 100. The signals from these scans, as well as any necessary input signals, pass through a slip ring connection 56 and through a connecting cable 58, shown here exiting at the base of a handle 52.
  • FIG. 2 shows a simplified oblique view of a rotatable scan camera according to the present invention. The camera comprises a rotatable drum 2, actuated by a motor 4. This drum has a central axis of rotation 46 and a perimeter, which is the surface of the drum farthest from the axis of rotation. Two linear sensors, a Right-hand sensor 6 and a Left-hand sensor 10, are disposed at the circumference of the rotatable drum 2, each sensor having an aperture slit 8, 12. Each linear sensor has an associated lens through which light passes according to a central optical axis and forms a focused image on an image plane orthogonal to the optical axis. The linear sensor's active surface is located in this image plane. Each sensor also has associated electronics for the input and output of signals. The sensor, optics and electronics form a subcamera assembly within the rotatable drum, which rotates in a direction shown by the arrow, 14, preferably clockwise as seen from above.
  • For recording a surrounding scene in motion, this drum should be rotating very quickly, and the scans should be as rapid as possible. A preferred embodiment for recording motion would be a 5-inch diameter drum rotating at 3600 RPM, with two sensors, with a complete scanned image produced by each sensor every rotation, for an effective frame rate per sensor of 60 frames per second. The resolution of the image is constrained only by the sensitivity of the sensor and its readout rate, and the ability of the data recorder to keep up. To achieve a pleasing level of resolution when viewing a closeup of a part of the image, as will commonly be done, the resolution of the overall image should be as large as possible. A preferred resolution for the scanned image frame produced by each sensor would be at least 2,000 by 4,000 pixels, which represents the rotation of a 2,000-pixel sensor sampled 4,000 times during each rotation. If this 8 million-pixel image frame is produced 60 times per second, this means 480 million pixels produced per imager per second, times two channels, or close to a billion pixels per second. This level of resolution, together with the stereoscopic qualities of the image, represents an extremely high level of realism.
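The data-rate arithmetic above can be checked with a short calculation (a sketch in Python; the variable names are illustrative, and the figures are those given in the text):

```python
# Data-rate arithmetic for the preferred motion embodiment: a drum at
# 3600 RPM with two sensors, one full scanned frame per sensor per rotation.
rpm = 3600
frames_per_second = rpm / 60.0            # one frame per rotation -> 60 fps
sensor_pixels = 2000                      # pixels along the line sensor
samples_per_rotation = 4000               # scans taken during one rotation

pixels_per_frame = sensor_pixels * samples_per_rotation
pixels_per_sensor_per_s = pixels_per_frame * frames_per_second
total_pixels_per_s = pixels_per_sensor_per_s * 2   # two stereo channels

print(pixels_per_frame)        # 8000000  (the 8 million-pixel frame)
print(total_pixels_per_s)      # 960000000.0  (~1 Gpixel/s)
```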
  • One disadvantage of the use of line scan sensors is their relative insensitivity to light. If one is used to scan a scene, the light from any given spot can be recorded only in the brief time that the sensor sweeps past that spot. To help solve this problem, a storage buffer should be used to store and add together multiple redundant scans. For example, an optimal equirectangular format might be 6000×3000 pixels. This can be produced, for example, by six cameras, each with a 3000-pixel line scan sensor sweeping one-sixth of the sphere per exposure period and performing 1000 scans of 3000 pixels each. This represents a total pixel rate of 18 Mpixels/sec or 18 MHz, and means that the rotation needed to capture a circular or spherical frame would be six times slower than a single camera would require. With faster pixel readout rates more scans can be performed, and if they are timed to coincide with the position of earlier scans, the results can be stored and added together to build up a better reading of the available light. For example, an operating rate of 30 MHz for each camera would enable ten scans to be added together during the progress of an exposure period, a tenfold improvement in exposure over a single pass. Because scans trail off at the beginning and end of an exposure period, scans from adjacent cameras should be added together at these points to make a smooth transition. The scan depth increase can be used both to build up light sensitivity and to improve the dynamic range of the image, because of the larger number of samplings being applied to each pixel.
  • For motion to be recorded in this manner, even more scans would be needed. At 30 frames per second the above camera would have to perform 30,000 scans per second, and the overall pixel readout rate of the final image would be 540 Mhz (540 Mpixels/sec).
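The storage-and-addition of redundant scans described above can be sketched as follows. This is a minimal illustration, not the patented apparatus: `accumulate_scans` is a hypothetical helper, and the noisy scan data is simulated.

```python
import random

def accumulate_scans(scans):
    """Add together redundant line scans taken at the same drum position.

    Each scan is a list of per-pixel readings; summing n aligned scans
    gives roughly n times the effective exposure of a single pass."""
    acc = [0] * len(scans[0])
    for scan in scans:
        for i, v in enumerate(scan):
            acc[i] += v
    return acc

# Ten redundant scans of a 5-pixel line, as in the 30 MHz example above
# (ten scans added per exposure period for a tenfold exposure improvement).
random.seed(0)
true_line = [10, 20, 30, 40, 50]
scans = [[v + random.randint(-2, 2) for v in true_line] for _ in range(10)]
summed = accumulate_scans(scans)
print(summed)   # each entry is ~10x the single-scan reading, noise averaged down
```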
  • The examples above show how multiple scans from a sensor can be added together for greater light sensitivity.
  • FIG. 3 shows a top view of the rotating drum 2, showing the preferred tangential optical axes 16 and 18 of the Right and Left sensors 6 and 10, and their associated lenses at 20 and 22, respectively. A sensor such as at 6 is usually wider than the aperture slit 8; this means that effectively, because of the space constraints of the interior of the drum, an optical axis 16 cannot be actually on the tangent of the drum, but must be located a short distance away. What is shown here is an optimal arrangement, with two roughly parallel axes 16, 18 with apertures located slightly forward of the tangent point 24. Other axis arrangements can be used, except for a radial axis 30 intersecting the center point 26. Examples are a convergent axis such as is shown at 34, or a divergent axis such as at 70. A true tangent axis is shown at 32.
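The distinction among tangential, oblique, and radial axes can be expressed as the perpendicular offset of the optical axis line from the axis of rotation: zero for the excluded radial case, the full drum radius for a true tangent. A small geometric sketch with hypothetical coordinates and an illustrative helper:

```python
import math

def axis_offset_from_center(aperture_xy, direction_deg):
    """Perpendicular distance from the axis of rotation (the origin) to
    the line of an optical axis passing through `aperture_xy` in the
    direction `direction_deg`. A radial axis gives 0; a true tangent
    axis gives the full drum radius."""
    x, y = aperture_xy
    dx = math.cos(math.radians(direction_deg))
    dy = math.sin(math.radians(direction_deg))
    return abs(x * dy - y * dx)   # 2D cross product = point-to-line distance

r = 0.0635   # hypothetical 2.5-inch drum radius, in metres
# Aperture at the rightmost point of the drum, (r, 0):
print(round(axis_offset_from_center((r, 0), 90), 6))    # 0.0635 (tangent axis)
print(round(axis_offset_from_center((r, 0), 180), 6))   # 0.0 (radial, excluded)
```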
  • FIG. 4 shows the preferred embodiment of a conventional fisheye lens 38 for a sensor 6, and an especially wide streamlining front element 40 conforming to the drum surface 42 that takes into account the wide horizontal field of view characteristic of this type of lens. For a specialized line-scan lens, the lens is very narrow and flat, with an arrangement of elements like a slice taken through the middle of a conventional lens. This flat lens has the advantage of the least surface interruption, but lacks the light-gathering abilities of a more conventional round lens, which is the preferred embodiment. While a flat fisheye line-scan lens for a slit aperture, such as those produced by Coastal Optical, may have an f-number of f8, a conventional round fisheye lens could have an f-number of f2.8, for nearly eight times more light delivered to the sensor.
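The light advantage quoted above follows from aperture area scaling with the inverse square of the f-number; a quick check of the f/8 versus f/2.8 comparison:

```python
# Light gathered scales with aperture area, i.e. with 1/N**2 for f-number N.
# Comparing the flat f/8 line-scan lens to a conventional round f/2.8 fisheye:
flat_f_number = 8.0
round_f_number = 2.8
light_ratio = (flat_f_number / round_f_number) ** 2
print(round(light_ratio, 2))   # 8.16 -> "nearly eight times more light"
```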
  • FIG. 5 shows the action of the spinning drum's sensors in the preferred embodiment where an object in the field of view is scanned from one, then another, point of view. Five optical axes for the spinning sensor 10 on the drum 2 are shown at 60, 62, 64, 66, and 68, and how the scans of the sensors, especially for very near objects 70, simultaneously sweep different areas 72 and 74. Thus objects are scanned at different times, from different directions. For instance, a comparable area 76 of the near object 70 is shown as being scanned by sensor 6 according to the optical axis 78 at the first position 80, but the same target area is scanned at the fourth position 66 for the sensor 10. This scanning of objects from different points of view creates the parallax separations within the image that are the basis of any stereoscopic effect. The lateral location within the scanned image is a function of time, because it is the point in time at which the image appeared in the aperture. The scanned area on the object is shown for the Right 6 sensor at 82 and the Left 10 sensor at 84, as well as the area 86 of the object 70 where the images from the two sensors overlap.
  • The apparent stereoscopic separation produced by more than one point of view is called parallax, and the amount of parallax separation in the image from two fixed but separated sensors is determined by the distance between the camera and the object being viewed. Close objects, as seen against a distant background, will appear to be more laterally offset from each other than distant objects will. The distance to objects can be inferred if their images from both viewpoints are identified and measured, but this can be unreliable and calls for substantial processing power. A preferred way is to determine this distance to a target object directly through the addition of a rangefinding element to the rotating camera drum. This rangefinder is shown as being made up of an emitter element 88 and a sensor element 90. This rangefinder, either a miniature radar, laser, or infrared-based unit, takes readings either continuously or at intervals, to determine a distance profile of the distances to the objects in the surrounding field of view during the course of a rotation. This distance profile is used to control an adjustment of parallax separation called convergence, in a convergence profile. The convergence profile can be used to minimize or exaggerate the effects of parallax. If minimized, then closer objects will appear more continuous, and the effect will be of less separation between the stereo viewpoints. If exaggerated, then objects in the distance will be easier to see in stereoscopic depth, as if the stereo viewpoints were widely separated in space.
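The relation between object distance and parallax can be illustrated with simple two-viewpoint geometry. A sketch, assuming the two tangential viewpoints are separated by roughly the drum diameter; `parallax_angle_deg` is an illustrative helper, not part of the apparatus:

```python
import math

def parallax_angle_deg(baseline_m, distance_m):
    """Angular parallax between two viewpoints `baseline_m` apart, looking
    at an object `distance_m` away (simple symmetric stereo geometry)."""
    return math.degrees(2 * math.atan((baseline_m / 2) / distance_m))

# Hypothetical figure: a 5-inch drum gives roughly a 0.127 m baseline
# between the two tangential viewpoints.
baseline = 0.127
for d_m in (0.5, 2.0, 10.0):
    # Close objects show far more angular parallax than distant ones.
    print(f"{d_m:5.1f} m -> {parallax_angle_deg(baseline, d_m):.2f} deg")
```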
  • The convergence profile continuously applies the distance readings from the rangefinder to dynamically shift the lateral placement of the vertical scan lines within the final image buffer used for readout. For example, again in FIG. 5, as sensors 6 and 10 rotate, a target on a nearby object 70 is seen according to two optical axes from the two sensors, first at 78 for sensor 6 and then at 66 for sensor 10. This makes 6 the leading sensor, producing the leading scan lines, and 10 the trailing sensor, producing the trailing scan lines. Therefore if the readings of a rangefinder 90 detect the close proximity of the object 70, then the application of delay to the recording of trailing scan line 66 will shift its apparent position closer toward the position of 64 or 62, simulating the appearance of target 76 as being at a much greater distance and having less parallax. Conversely, the application of delay in the recording of leading scan line 78 will make target 76 seem even closer. The record of these shifts in the course of an overall rotation of the camera is the convergence profile. Advances in the relative position of scan lines can also be done if they are stored in a buffer first, and then usually read after a delay, or in special cases even sooner.
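A minimal sketch of the delay-based convergence adjustment described above. `convergence_delay` and `shift_columns` are hypothetical helpers that map rangefinder distances to scan-period delays and re-time the trailing channel's columns accordingly; a real system would interpolate between neighboring scan lines rather than duplicate them:

```python
def convergence_delay(distance_m, near_m=0.5, max_delay=8):
    """Map a rangefinder distance reading to a delay in scan periods:
    the closer the object, the more delay is applied to the trailing
    channel, pushing its scan lines toward the leading channel's."""
    if distance_m <= near_m:
        return max_delay
    return round(max_delay * near_m / distance_m)

def shift_columns(frame, delays):
    """Re-time the trailing channel: each output column is read from the
    column `delays[i]` scan periods earlier (a stand-in for the
    interpolation the text describes)."""
    return [frame[max(0, i - delays[i])] for i in range(len(frame))]

# A hypothetical six-column frame with a close object at columns 2-3:
frame = ["a", "b", "c", "d", "e", "f"]
distances = [50, 50, 0.5, 0.5, 50, 50]     # metres, from the rangefinder
delays = [convergence_delay(d, max_delay=2) for d in distances]
print(shift_columns(frame, delays))        # ['a', 'b', 'a', 'b', 'e', 'f']
```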
  • FIG. 6 shows examples of two equirectangular stereoscopic views 102 and 104 produced by scans of a full spherical field of view from two sensors in the preferred embodiment. The scene here is of a carnival midway, with a swing ride sending riders overhead. This type of image is the result of a linear fisheye sensor with a hemispheric vertical field of view being rotated through a full 360-degree rotation. The two stereoscopic views can be considered as a “left-eye view” and a “right-eye view”.
  • For simplicity, the optical axes of the multiple sensors in the camera should be in the same plane and essentially parallel, facing toward the same side of the drum, for the effect of convergence on objects at infinity. By the addition of delays or advances in one or both channels of the stereo image according to a convergence profile, the appearance of an object can be laterally shifted in the recording or playback in either direction according to the distance to the desired object, preferably as directly measured by a rangefinder.
  • FIG. 7 is a schematic view showing the application of a convergence profile resulting in lateral shifting of portions of a recorded image to adjust stereoscopic convergence according to distance. This shifting is the result of adjustments in the timing of scans. What is shown here are the timings of the recordings of the scans, relative to an uncorrected timebase of recording which proceeds from left to right within the frame of information 106. All of the scan information proceeds through a FIFO delay buffer with a certain default delay. Adding to or subtracting from this default delay allows for shifts either forward or backward in time. An uncorrected region is shown at 108. A region where the scans are shifted to the right (forward in time) is at 110, and a region where they are shifted to the left (backward in time) is at 112. One view may be affected in this way, or both may be used for a greater range of corrections. As the position of the image information is shifted, the data from earlier or later scan lines usually needs to be taken into account and interpolated to produce the final image readout.
  • When the final image is viewed, either as a mono or stereo image, the preferred embodiment employs the extraction of a smaller movable window of interest as needed. This presents a more natural view of a portion of the overall scene, while allowing the freedom to look around in any direction. This extraction of a region of interest can be from either a live or prerecorded image stream.
  • FIG. 8 is a schematic view showing the preferred application of lateral shifting for convergence in a region of interest 114 within the overall image 106 in order to align objects atop one another. To conserve computational steps in this essentially digital process, this image transformation can be done only on the portion of the whole frame within or immediately adjacent to the region of interest 114. An example of a portion of the scans shifted forward in time is at 116, and scans shifted backward in time are at 118. An automatic convergence profile can be generated by the output of a rangefinder, but that profile can be modified to converge at different distances by manual control override as necessary to focus on specific objects.
  • The preferred recording device for picture information would be a digital recorder, such as a streaming tape- or disk-based digital recorder. Application of image compression to the picture information prior to use reduces the bandwidth necessary for any digital storage, transmission, or playback.
  • FIG. 9 shows a block diagram of the various components of the entire system in the preferred embodiment. The Right 6 and Left 10 Linear Sensors have equivalent associated components such as optics 20 and 22 and optical axes 16 and 18, respectively, forming two subcamera systems within the boundaries of the camera drum 120. Each sensor has its associated power and drive electronics 122 which are coupled to a central Power source 124 and Control electronics 126, which are related to the motion control regulation 128 for the motor 4. The sensor outputs an electronic signal representing raw digital image data 130. This data goes through image compression 132 and any necessary I/O formatting 134. Then the image signal exits the drum through rotary Slip Rings 136. A separate rangefinder 138 on the drum outputs distance data 140 for controlling stereoscopic convergence. The image data from the sensors then is processed with the appropriate delays 142 for alignment of their scans together in time, plus any convergence adjustments. To conserve the number of recorders needed, the data from both channels may be multiplexed together 144 prior to being sent to the recorder 146, which preferably should be digital.
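The multiplexing step can be sketched as a simple interleave of the two channels' scan lines. `multiplex` and `demultiplex` here are illustrative stand-ins for the block labeled 144, not the patented implementation:

```python
def multiplex(right_frame, left_frame):
    """Interleave the two channels' scan lines into one tagged stream so
    that a single recorder can capture both."""
    stream = []
    for r, l in zip(right_frame, left_frame):
        stream.append(("R", r))
        stream.append(("L", l))
    return stream

def demultiplex(stream):
    """Recover the two channels from the tagged stream for playback."""
    right = [line for ch, line in stream if ch == "R"]
    left = [line for ch, line in stream if ch == "L"]
    return right, left

stream = multiplex(["r0", "r1"], ["l0", "l1"])
print(stream)   # [('R', 'r0'), ('L', 'l0'), ('R', 'r1'), ('L', 'l1')]
print(demultiplex(stream) == (["r0", "r1"], ["l0", "l1"]))  # True
```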
  • During the Playback process 148, the image data can be transmitted 150 prior to decompression 152 as part of a display process featuring the extraction of a movable Region of Interest 154 which appears on a Display 156.
  • Alternate embodiments of the present invention are directed toward the solution of specific problems. For example, line-scan sensors at the present time do not have a fast enough readout rate to produce the above horizontal resolution if the drum spins around very rapidly. One solution to this would be to multiplex several sensors to fill in greater resolution.
  • FIG. 10 shows an alternative embodiment of the present invention, comprising four sensors, 10, 158, 160 and 162, in place in a rotatable drum 2. If the requirement is for 4,000 scans during one revolution, which represents one frame, and the fastest that could be read from an individual sensor is 1,000 readings per frame, then these four sensors 10, 158, 160 and 162 could be used to build up the proper resolution. Each of the four sensors has the same angle of optical axis 164, 166, 168, and 170 relative to the drum 2, according to their associated lenses at 174, 176, 178, and 180, and the four sensors are equidistantly spaced around the circumference of the drum. In operation, with the drum rotating, each sensor would scan an object in the surrounding view at a slightly different time. By the addition of digital delays, these scans can be recorded adjacently, to produce an apparently continuous image.
  • FIG. 11 shows a schematic diagram of how a composite image is assembled from the data coming from the sensors, showing how sensor spacing about the circumference of the rotatable drum 2 is taken into account when recording signals from the sensors. At 182 is a scan signal from the first sensor 10, and at 184 is the signal from the second sensor 158, taken at the times when they are at roughly the same point on the rotating drum relative to the surrounding scene. By adding an appropriate amount of fixed delay 186, these scans of a given portion of the surrounding field of view are recorded adjacently at 188 and 190, respectively. Amounts of fixed delay for each sensor compensate for the time differences between when a scan took place and when it is recorded. It is preferable to have the sensor spacing be as close together as possible, to minimize any visible discontinuities due to these time differences between the adjacent scans as they are recorded in the final image.
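The fixed per-sensor delay described above is simply the time the drum takes to rotate through the angular spacing between sensors, after which the delayed scans can be recorded adjacently. A sketch with illustrative helper names (not the patented circuit):

```python
def sensor_delay_s(sensor_index, n_sensors, rpm):
    """Fixed recording delay for sensor k of n equally spaced sensors:
    the time the drum takes to carry sensor k into the position that
    sensor 0 occupied when it made its scan."""
    rotation_period = 60.0 / rpm
    return sensor_index * rotation_period / n_sensors

def interleave(scans_per_sensor):
    """Record the delayed scans adjacently: four sensors each reading
    1000 lines per rotation yield a continuous 4000-line image."""
    out = []
    for group in zip(*scans_per_sensor):
        out.extend(group)
    return out

# Four equally spaced sensors at 3600 RPM: fixed delays of ~4.17 ms apart.
delays_ms = [round(sensor_delay_s(k, 4, 3600) * 1000, 2) for k in range(4)]
print(delays_ms)                                      # [0.0, 4.17, 8.33, 12.5]
print(interleave([[1, 5], [2, 6], [3, 7], [4, 8]]))   # [1, 2, 3, 4, 5, 6, 7, 8]
```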
  • FIG. 12 shows an alternate embodiment for streamlining and soundproofing the camera. If a clear enclosure 192 is used to encase and soundproof the spinning drum 2, then the optical axis 16 of a sensor will exit the enclosure at a very oblique angle, producing distortion as it passes through the enclosure, and any seams in the clear enclosure will appear as interruptions in the overall recorded field of view. There is also the potential for problems from internal reflections from the enclosure surface into the lenses. Thus an enclosure is not the preferred solution to this problem.
  • The large front surface of a conventional lens 194 would ordinarily create an interruption on the drum surface 42 that would produce noise, but the front of the lens can be made to conform to the surface of the drum by a streamlining front element 196. An optimum design for a lens might be a hybrid, with a Gradium™ front element, where the differing refraction index characteristic of this type of glass runs laterally across the front of the lens, in a direction at a right angle to the line of the sensor. This produces a front element 196 that has the optical characteristics of a flat element, but which is thicker on one side than the other, to match the contour of the drum surface 42. The slant could be reversed for opposite sides of the drum, again to match the surface. An alternate covering would be more conventional glass for streamlining, but this has the risk of introducing optical distortion in some form.
  • For large amounts of image data, unusual recorders may be used. One applicable form of recorder is a film strip recorder. High-quality film has the advantage of offering higher density storage of image data, in an essentially analog form, at a lower cost than conventional digital streaming media, especially for large quantities of images.
  • FIG. 13 shows an oblique schematic view of an alternative external film-based recorder for image information from the camera. Unlike conventional panoramic cameras, the film apparatus described here is located external to the camera drum itself. The dimensions of the film strip thus need have no physical relation to the primary drum apparatus. The only communication between the two is through electrical transmission of the signals from the sensors shown here coming in on a line 216. The image is formed for the film strip recorder by means that turn the electrical signals representing the scans back into light in a linear display 218. Amplification of the signals 220 can be applied if necessary to produce the proper image, which is focused and reduced through optics 222 before it is recorded onto the continuously moving film at an exposure point 224. The supply reel for the film is at 226, and the takeup reel is at 228. Because the film strip is in constant motion, a continuous image is built up through the recording process. Any divisions in this recorded image on film, such as frame divisions, should be based on the film sprocket holes outside the image, or other measurements of distance, and not be interruptions to the image. The image on the film can be digitized at another time if desired, such as for playback or transmission of the image.
  • FIG. 14 shows an oblique view of an alternate embodiment of the camera drum using a film-based recording system. The entire apparatus must rotate according to an axis of rotation 230. The supply reel 232 is in an upper portion of the drum 234, and the film 236 goes down into a middle portion 238 where it is exposed at 240 by a slit aperture according to the optical axis 16. The film proceeds down 242 to the lower portion 244 where it is wound onto the takeup reel 246. A similar arrangement of parts would be required for a second point of view, such as for stereoscopic recording. This arrangement has the advantage of offering balance of the weight of the supply and takeup reels around a central axis of rotation 214. This allows rapid rotation without undue vibration and instability. Film as a recording medium may be chosen because certain films may be more sensitive than a digital sensor, and yield better images. However, this approach has several disadvantages. Foremost among them are the size and weight of the apparatus required, and the difficulty of putting it into rapid rotation for scanning.
  • OPERATION, RAMIFICATIONS AND SCOPE
  • Accordingly the reader will see that the rotating scan camera of this invention can be used to record a panoramic field of view with increased resolution and stereoscopic separation throughout the recorded image.
  • Although the description above contains many specificities, these should not be construed as limiting the scope of the invention but as merely providing illustrations of some of the presently preferred embodiments of this invention.
  • For example, a single oblique sensor will also produce a usable panoramic scanned image, although without the stereoscopic features of a dual scan. Lower-resolution sensors can be used for less expensive implementations, or for situations where the bandwidth for recording or playback is limited.
  • The arrangement of optical axes shown here as tangential can also be created within a larger drum. In this case, the axes are still parallel and in the same plane, but the apertures are located farther forward from the tangent point. Oblique but not tangential axes can also be used, either convergent or divergent relative to the tangential axis. The optical axes in these cases still do not intersect at any point behind their lenses and within the drum, unlike the radial axis arrangement characteristic of the prior art.
  • The sensors can be monochrome, color, or of another spectral sensitivity such as ultraviolet or infrared. An example of a color linear sensor is the Eastman Kodak KLI-2113, a trilinear RGB sensor with a resolution of 2098×3 pixels, an image area of 29.4×0.24 mm, a bit depth of 14 bits, and a readout rate of 60 MHz. Other CCD and CMOS line scan sensors have other parameters. Generally speaking, at the present time, CCD sensors are more sensitive than CMOS sensors and allow exposure of an entire vertical scan at once, but require more complex power sources. CMOS sensors, on the other hand, can incorporate many types of logic elements on the chips themselves, enabling on-chip functions such as digitizing, compression, enhancement, and pattern recognition.
  • Preferably the line scan sensor should be as sensitive to light as possible, as well as having a very fast readout rate. The fast readout rate enables many individual vertical scans to be read from the sensor while it is in motion, and the sensitivity of the sensor enables a usable image to be produced even during the extremely short exposure time represented by such rapid scans. PixelVision and Scientific Imaging Technologies of Beaverton, Oreg. have developed especially sensitive back-illuminated CCDs for astronomical use.
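The relationship between readout rate and achievable scan density can be sketched numerically. The following is an illustrative calculation only (not part of the patent's disclosure); the function name and the 30 rev/s figure are assumptions chosen to match the frame speed discussed elsewhere in the text.

```python
def scans_per_revolution(pixel_rate_hz, pixels_per_line, revs_per_second):
    """How many vertical scans a line sensor can deliver in one drum
    revolution, given its pixel readout rate and the drum's rotation speed."""
    line_rate = pixel_rate_hz / pixels_per_line   # complete lines (scans) per second
    return line_rate / revs_per_second

# A 2098-pixel sensor read out at 60 MHz, with the drum spinning 30 rev/s:
n = scans_per_revolution(60e6, 2098, 30)
# roughly 950 scans per revolution -- illustrating why several staggered
# sensors may be needed to reach a 4000-scan panorama
```

This kind of estimate motivates the multi-sensor arrangement described next.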
  • If multiple sensors must be used to build up the proper resolution, then they should be carefully spaced around the periphery of the drum, taking into account the desired resolution. For example, four sensors, each limited to a readout speed of only 1000 scans during the course of a revolution, can be used to build up an overall resolution of 4000 scans total. If the four sensors are spaced within a span of 90 degrees on the drum, exactly 22.5 degrees apart, then every 62.5 scans (1000×(22.5/360)) each sensor would reach the position of the next one in line. The first sensor, in other words, would advance in the course of the revolution, and after 62.5 scans the next in line would reach the starting position of the first. What is needed, however, is not a mere duplication of the first reading, but another reading taken slightly ahead of or behind this point, to help fill in the gaps between the readings of a single sensor. If the intention is to fill in the gap going backward, then the spacing between the sensors is increased by a slight amount representing one increment of the desired final scan resolution, in this case 4000 scans. For a 12.7 cm (5-inch) diameter drum, a preferred size because of the amount of parallax separation it produces, the circumference is 398.98 mm, which at 4000 scans per revolution works out to almost exactly 10 scans/mm as the drum rotates. The added increment of spacing for each sensor would thus be approximately one-tenth of a millimeter.
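The staggered-spacing rule above can be expressed as a short calculation. This is an illustrative sketch of the arithmetic described in the text, not a prescribed implementation; the function name and return convention are assumptions.

```python
import math

def sensor_offsets_mm(drum_diameter_mm, num_sensors, spacing_deg, total_scans):
    """Peripheral positions (mm along the drum circumference) for staggered
    line sensors: each sensor sits at its even angular spacing plus one extra
    final-scan increment per sensor, so the interleaved readings fill the
    gaps left by a single sensor."""
    circumference = math.pi * drum_diameter_mm       # 12.7 cm drum -> ~398.98 mm
    scan_pitch = circumference / total_scans         # ~0.1 mm per final scan
    offsets = []
    for i in range(num_sensors):
        nominal = i * spacing_deg / 360.0 * circumference  # even 22.5-degree spacing
        offsets.append(nominal + i * scan_pitch)           # plus one increment each
    return offsets

offsets = sensor_offsets_mm(127.0, 4, 22.5, 4000)
# scan pitch is ~0.0997 mm, i.e. almost exactly 10 scans/mm, matching the text
```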
  • Each sensor should have the same type of lens, to enable the matching of recorded images. Since the horizontal field of view of the line sensor is negligible, the important consideration for the lens is its vertical field of view. In the preferred embodiment, a fisheye lens, characterized by a field of view of close to 180 degrees, is used to achieve an essentially spherical recording of the surrounding view as the drum rotates. The lens should be small, of high quality, and fast, meaning that it transmits light efficiently. Good-quality lenses of this type are made by Coastal Optical. Estelle's “Wide-angle Photographic Lens System and Photographic Camera” (U.S. Pat. No. 5,502,597) describes an especially compact and efficient design for a wide-angle lens. To ensure recording of the area directly above the rotating camera, while creating a “blind spot” directly underneath it to hide a camera operator or mount, it may be preferable in practice to construct the subcamera assembly so as to raise the axis of the lens by up to 10 degrees from the horizontal plane of the drum.
  • As the vertical angle of view increases, an exposure difference becomes apparent in different segments of the line sensor. For a global scan of a spherical field of view, for instance, this exposure difference arises from the different sweep speeds near the poles versus at the equator. Another way to illustrate this is to observe the difference in rotational speed of a point on a globe near the poles compared to a point on the equator. To adjust for this, a fixed signal compensation function can be applied to the signals from different sections of the linear sensor.
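One plausible form for such a fixed compensation function, assuming the sweep speed (and hence the correction) follows the cosine of latitude, is sketched below. This is an illustration only; the patent does not specify the profile, and the function name, cap value, and direction of correction are assumptions.

```python
import math

def latitude_gain(num_pixels, max_gain=8.0):
    """Per-pixel gain profile for a vertical line sensor spanning pole to
    pole: a pixel at latitude phi sweeps the scene at cos(phi) times the
    equatorial rate, so an inverse-cosine gain evens out the sections.
    max_gain caps the correction near the poles, where cos(phi) -> 0."""
    gains = []
    for i in range(num_pixels):
        phi = (i / (num_pixels - 1) - 0.5) * math.pi   # -90 to +90 degrees
        gains.append(min(1.0 / max(math.cos(phi), 1e-6), max_gain))
    return gains

profile = latitude_gain(101)   # unity gain at the equator, capped at the poles
```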
  • The motor assembly should be very smooth and quiet, and able to hold a precise rotational speed. Various types of electronically controlled motors have been developed to meet these requirements. A recent line of direct-drive brushless DC motors from the Thomson Airpax Mechatronics Group is 66% smaller and 30% more efficient than other industry-standard motors; a smaller motor would aid the portability of the camera system by making it lighter overall. Sinusoidal motion control is preferred for the smoothest possible regulation of speed. An 1800 RPM motor would produce an effective frame speed of 30 fps; this happens to be the type of motor used for the head drum in a VHS tape machine.
  • Rotary couplings such as slip rings are used to transfer electrical signals to and from the electronic elements in the rotating drum. Litton Poly-Scientific's EC3848, for example, is a high-speed slip ring assembly of up to 10 lines for rotational speeds up to 10,000 RPM. Examples of the signals that would go into the drum are power and timing signals for the sensors, while the readout from the sensors would come out along one or more lines as required. Some sensors have multi-tap readouts to increase effective resolution and data speed. The DALSA CT-E4-2048W, for example, uses the IT-PI-2048 sensor, which has four simultaneous readout channels at speeds of 20 MHz to 25 MHz each, for an effective readout rate for the sensor of up to 100 MHz. The number of rings in the coupling depends on the number of electrical channels needed to maintain proper connections.
  • Rangefinding sensors come in a variety of configurations. The miniature radar-on-a-chip invented at Lawrence Livermore Laboratory in 1995 is one example of a low-cost sensor that could be used to obtain readings of the distance to various objects during the course of a scan. Active infrared autofocus devices such as are found on still cameras could also be used, especially to give distance readings within a range of around 20 feet (6 m). LIDAR (Light Detection And Ranging) devices can also be used for scanning a surrounding scene, but the devices to date tend to have too slow a sampling rate to yield a detailed record of the depth of an overall scene.
  • For more convenient use of the image data from the sensors, compression should preferably be applied to the data prior to its transmission from the drum. Various methods for image compression are currently available, such as the Discrete Cosine Transform (DCT) and the Moving Picture Experts Group (MPEG) family of motion picture compression. A preferred embodiment for compression in the present invention would be wavelet compression. The final data rate for the output would be determined by the desired level of quality in the image, as well as the optimum data rate of the recorder.
  • For digital readings from the sensors, a variety of data transfer protocols could be used for sending the data for external recording, transmission, or processing. The transmission can be over a wired or, preferably, a wireless connection. The preferred embodiment makes use of one of the new high-quality, high-bandwidth standards. These include Fibre Channel (ANSI X3.230), which is made for fiber optics and has an effective speed limit of 400 Mbps (million bits per second), the same speed as the IEEE-1394 “Firewire” interface. The Serial Digital Interface (SDI, a.k.a. SMPTE 292M) has a maximum speed of 1200 Mbps, and Gigabit Ethernet, 1000 Mbps. The Gigabit Video Interface (GVIF) standard has a maximum speed of 1776 Mbps. An even faster data speed is represented by the 10 Gbps Ethernet standard now being developed by the 10 Gigabit Ethernet Alliance, whose members include Intel, Extreme Networks, Nortel, and World Wide Packets. This speed would handle the full-bandwidth data stream at the preferred resolution and frame speed described above.
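Why only the fastest of these standards suffices can be checked with a back-of-the-envelope estimate. This sketch assumes the 4000-scan, 2098-pixel RGB figures used earlier in the text and an 8-bit-per-channel depth (an assumption; the sensor itself is described as 14-bit):

```python
def stream_rate_gbps(scans_per_rev, pixels_per_scan, channels, bits, fps, sensors=1):
    """Uncompressed output data rate in gigabits per second for a panoramic
    frame of scans_per_rev vertical scans, repeated fps times per second."""
    bits_per_frame = scans_per_rev * pixels_per_scan * channels * bits
    return bits_per_frame * fps * sensors / 1e9

rate = stream_rate_gbps(4000, 2098, 3, 8, 30)   # one RGB line sensor
# about 6 Gbps uncompressed -- beyond Fibre Channel, SDI, or Gigabit
# Ethernet, but within the reach of 10 Gbps Ethernet
```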
  • The image data coming from each sensor represents a vertical column, and the columns are usually put together directly to make the final image in a digital buffer. If a video recorder is to be used for recording this type of image, however, the recording method must be modified, since video images are made up of rows, read out horizontally, rather than columns. To use this type of recorder, an image made up of columns is first stored in a buffer and then read out in rows, usually while a second buffer is being filled with the columns of a new frame. A readout of an HDTV video image of approximately 1920 by 1080 pixels (which can be identified as High-level, High-profile ADTV) is preferred, as a way of preserving maximum image quality within existing video standards. HDTV digital recorders such as the Panasonic AJ-HD2700 D-5 VCR (for a high-bandwidth SDI signal) or even the low-cost Panasonic PVHD 1000 D-VHS VCR (for a compressed signal) are examples of recorders that could be used for this purpose. Present video standards in general, however, are not of high enough resolution for adequate recording of spherical scenes.
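The column-to-row reordering described above amounts to a buffer transpose. The following is a minimal sketch of that step (the double-buffering and the function name are illustrative, not specified by the patent):

```python
def columns_to_rows(columns):
    """Reorder a frame captured as vertical scan columns into the row order a
    video recorder expects: columns[c][r] is pixel r of scan c, and the
    result is indexed rows[r][c]. In practice one such buffer is read out in
    rows while a second fills with the columns of the next frame."""
    num_rows = len(columns[0])
    return [[col[r] for col in columns] for r in range(num_rows)]

frame = columns_to_rows([[1, 2], [3, 4], [5, 6]])  # 3 scans of 2 pixels each
# frame == [[1, 3, 5], [2, 4, 6]] -> two horizontal rows of 3 pixels
```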
  • For a film strip recorder for the image information, the image can be formed for the film by a linear display. Many forms of displays and microdisplays may be suitable, depending on the brightness, size, and image characteristics required. These include an LED or laser diode array, or a linear CRT, LCD, DLP (Digital Light Processing), or plasma display, with reduction and focusing optics as necessary to transfer the light onto the film. A linear color plasma display has the advantage of apparently continuous adjacent pixels. However, a plasma display such as that used in the Philips Flat TV has pixels measuring 1.08 mm square; a 2098-pixel linear sensor would therefore require an array 226.584 cm (89.2 inches) long, which requires substantial reduction optics. An alternative arrangement would be to split the display into a multiplicity of sections, which are combined to make a continuous recorded image. This can make use of any multi-tap readouts of the sensor, such as those featured in the DALSA camera mentioned above, where each of the four parallel channels coming from the sensor can be made to control its own display array. The Cine V Solitaire Image Recorder (U.S. Pat. No. 4,754,334) is an example of an advanced film recorder for digital data.
  • The extracted signals from the sensors can be sent to other locations for use via electronic transmission, for either a live or a prerecorded image. It is preferred that a movable region of interest be extracted from within the overall image, either live or prerecorded. If taken from the live image, the rest of the image can be discarded if desired, reducing the overall requirements for transmission or processing.
  • Thus the scope of the invention should be determined by the appended claims and their legal equivalents, rather than by the examples given.
  • Listed Parts in Drawings
  • FIG. 1
    • 2. Cylindrical drum
    • 4. Motor for rotatable drum
    • 8. Aperture for Right sensor
    • 92. Hemispherical linear scan from right sensor
    • 94. Second hemispherical scan from right sensor
    • 96. Hemispherical linear scan from left sensor
    • 98. Second hemispherical scan from left sensor
    • 100. Vertical boundary of hemispherical FOV
      FIG. 2
    • 6. Right-hand sensor
    • 10. Left-hand sensor for stereoscopic scan
    • 12. Aperture for Left sensor
    • 14. Clockwise direction of rotation
      FIG. 3
    • 16. Semi-tangent optical axis for Right sensor
    • 18. Semi-tangent optical axis for Left sensor
    • 20. Right lens
    • 22. Left lens
    • 24. Tangent point
    • 26. Center of rotation
    • 28. Drum surface
    • 30. Radial axis
    • 32. Tangential axis
    • 34. Convergent axis
    • 36. Divergent axis
      FIG. 4
    • 38. Fisheye lens
    • 40. Wider front
    • 42. Drum surface
      FIG. 5
    • 60. First rotary position of Left sensor
    • 62. Second rotary position
    • 64. Third rotary position
    • 66. Fourth rotary position
    • 68. Fifth rotary position
    • 70. Near object
    • 72. Area of first sweeping scan
    • 74. Area of second sweeping scan
    • 76. Target area of near object
    • 78. Axis for first position of Right sensor
    • 80. First position of Right sensor
    • 82. Scanned area on object from Right sensor
    • 84. Scanned area on object from Left sensor
    • 86. Area of overlapping scans
    • 88. Rangefinder emitter
    • 90. Rangefinder sensor
      FIG. 6
    • 102. First equirectangular view
    • 104. Second equirectangular view
      FIG. 7
    • 106. Frame of information
    • 108. Uncorrected scans
    • 110. Scans shifted to right (delay forward in time)
    • 112. Scans shifted to left (delay backward in time)
      FIG. 8
    • 114. Region of Interest
    • 116. Selected scans shifted forward in time
    • 118. Selected scans shifted backward in time
      FIG. 9
    • 120. Boundary of Drum
    • 122. Power and Drive for sensor
    • 124. Central Power
    • 126. Central Control
    • 128. Motion Control
    • 130. Raw digital image data
    • 132. Image compression
    • 134. I/O formatting
    • 136. Slip rings
    • 138. Rangefinder
    • 140. Distance Data
    • 142. Delays
    • 144. Multiplexing
    • 146. Recorder
    • 148. Playback
    • 150. Data Transmission
    • 152. Decompression
    • 154. Region of Interest Extraction
    • 156. Display of ROI
      FIG. 10
    • 158. Extra interpolated sensor # 2
    • 160. Extra interpolated sensor #3
    • 162. Extra interpolated sensor # 4
    • 164. Basic optical axis for sensor #1 (Left)
    • 166. Extra optical axis for #2
    • 168. Extra optical axis for #3
    • 170. Extra optical axis for #4
    • 172. Rotational distance between sensors 1 and 4
    • 174. Lens for sensor # 1
    • 176. Lens for sensor # 2
    • 178. Lens for sensor #3
    • 180. Lens for sensor # 4
      FIG. 11
    • 182. Scan from #1
    • 184. Scan from #2
    • 186. Delay for #1
    • 188. Recorded position for scan # 1
    • 190. Recorded position for scan # 2
      FIG. 12
    • 216. Transmission line for sensor information
    • 218. Linear display
    • 220. Amplification and processing of signal for display
    • 222. Reduction optics
    • 224. Film recording
    • 226. Supply reel
    • 228. Takeup reel
      FIG. 13
    • 230. Axis of rotation
    • 232. Supply reel for film camera drum
    • 234. Upper portion of drum
    • 236. Film in upper portion
    • 238. Middle portion of film camera drum
    • 240. Film in lower portion
    • 242. Exposure point
    • 244. Lower portion of film camera drum
    • 246. Takeup reel for film camera drum

Claims (16)

1. A camera for recording a panoramic field of view in a scanning fashion, comprising:
i a rotatable drum having an axis of rotation and a perimeter; and
ii an even number of subcameras disposed at the perimeter of the drum, each subcamera comprising:
a lens having an optical axis substantially tangential to the perimeter of the drum and an image plane, said optical axis being not radial to said axis of rotation;
a linear sensor having a sensor area which intersects said optical axis and which lies within the image plane; and
transmission means for transferring scan line readings from the linear sensor;
iii storage means for storing the scan line readings from the sensors;
iv processing means for adjusting the positioning of the scan line readings in a final composite image; and
v recording means for recording the final composite image.
2. The camera of claim 1, wherein the subcameras are disposed substantially on opposite sides of said drum, and provide in the course of the regularly repeated rotation of the drum, a repeated viewing of objects in the surrounding environment from two points of view with parallax separation.
3. The camera of claim 1, wherein each lens has a vertical field of view of at least 160 degrees.
4. The camera of claim 1, wherein the rotation of the rotatable drum is 360 degrees, and repeated in regular periods.
5. The camera of claim 1, wherein the electrical readings of the linear sensor, representing pictorial information, are grouped to form images representing a portion of a rotation.
6. The camera of claim 1, wherein image compression is applied to the pictorial information.
7. The camera of claim 1, wherein processing means are used to extract a movable region of interest from the pictorial information.
8. The camera of claim 1, wherein the images represent images with stereoscopic separation.
9. The camera of claim 8, wherein both stereoscopic images are viewed together as a stereo pair.
10. The camera of claim 8, wherein the convergence of the stereoscopic images is controlled by lateral distortion of one of the images relative to the other.
11. The camera of claim 10, wherein a rangefinding sensor on the drum is used to determine the amount of lateral distortion applied.
12. The camera of claim 1, wherein said recording means include a film recorder, and said line scan readings are first converted into light before being recorded on a strip of photographic film.
13. A method for recording a stereoscopic image of a wide field of view, up to a complete sphere, including the steps of:
i. aligning at least two line scan devices within the field of view such that their optical axes are in the same plane, separated, approximately parallel, and pointed in the same direction;
ii. rotating said line scan devices simultaneously about an axis of rotation, the axis of rotation being approximately perpendicular to said plane and disposed equidistant to and between said line scan devices;
iii. sampling the output of said line scan devices during said rotation, to produce scans from each sensor; and
iv. processing said scans so as to assemble a composite image having stereoscopic separation throughout the image.
14. The method of claim 13, with the additional steps of placing duplicates of said line scan devices in rotated positions around said optical axis, and adjusting the timing of the recording of the additional scans produced by said duplicates so that they appear interleaved with the scans produced by the original line scan sensors.
15. The method of claim 13, with the additional step of adjusting the convergence of the stereoscopic image by digital delays of the scans according to a convergence profile.
16. The method of claim 15, with the convergence profile being determined by the input of a rangefinding device also mounted and rotated along with the sensors.
US10/954,585 2004-09-29 2004-09-29 Rotating scan camera Expired - Fee Related US7791638B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/954,585 US7791638B2 (en) 2004-09-29 2004-09-29 Rotating scan camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/954,585 US7791638B2 (en) 2004-09-29 2004-09-29 Rotating scan camera

Publications (2)

Publication Number Publication Date
US20060072020A1 true US20060072020A1 (en) 2006-04-06
US7791638B2 US7791638B2 (en) 2010-09-07

Family

ID=36125122

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/954,585 Expired - Fee Related US7791638B2 (en) 2004-09-29 2004-09-29 Rotating scan camera

Country Status (1)

Country Link
US (1) US7791638B2 (en)

Cited By (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070014347A1 (en) * 2005-04-07 2007-01-18 Prechtl Eric F Stereoscopic wide field of view imaging system
US20080043014A1 (en) * 2004-12-28 2008-02-21 Japan Science And Technology Agency 3D Image Display Method
US20080111881A1 (en) * 2006-11-09 2008-05-15 Innovative Signal Analysis, Inc. Imaging system
US20080211902A1 (en) * 2007-03-02 2008-09-04 Fujifilm Corporation Imaging device
US20080246864A1 (en) * 2007-04-07 2008-10-09 Andrew Jon Selin Electronic Imaging Carousel
EP2141932A1 (en) * 2008-06-30 2010-01-06 France Telecom 3D rendering method and system
US20110114728A1 (en) * 2009-11-18 2011-05-19 Hand Held Products, Inc. Optical reader having improved back-illuminated image sensor
US20110132086A1 (en) * 2009-12-09 2011-06-09 Emcom Technology Inc. Waterproof sensing device
US20110228115A1 (en) * 2010-03-16 2011-09-22 Microsoft Corporation Large Format Digital Camera
US8108147B1 (en) * 2009-02-06 2012-01-31 The United States Of America As Represented By The Secretary Of The Navy Apparatus and method for automatic omni-directional visual motion-based collision avoidance
US20120162360A1 (en) * 2009-10-02 2012-06-28 Kabushiki Kaisha Topcon Wide-Angle Image Pickup Unit And Measuring Device
CN102566245A (en) * 2010-12-10 2012-07-11 财团法人工业技术研究院 Method and system for establishing ring field image
US20120268563A1 (en) * 2011-04-22 2012-10-25 Microsoft Corporation Augmented auditory perception for the visually impaired
WO2013029128A1 (en) * 2010-09-02 2013-03-07 As Tv Produçôes Cinematográficas Ltda Equipment, system and method for mobile video monitoring with panoramic capture, transmission and instant storage
US20130089301A1 (en) * 2011-10-06 2013-04-11 Chi-cheng Ju Method and apparatus for processing video frames image with image registration information involved therein
US20130201296A1 (en) * 2011-07-26 2013-08-08 Mitchell Weiss Multi-camera head
US8754929B1 (en) * 2011-05-23 2014-06-17 John Prince Real time vergence control for 3D video capture and display
US20140168357A1 (en) * 2012-12-13 2014-06-19 Microsoft Corporation Displacing image on imager in multi-lens cameras
CN103907341A (en) * 2011-11-07 2014-07-02 索尼电脑娱乐公司 Image generation device, and image generation method
US20140267598A1 (en) * 2013-03-14 2014-09-18 360Brandvision, Inc. Apparatus and method for holographic poster display
CN104253937A (en) * 2014-10-16 2014-12-31 宁波通视电子科技有限公司 Omnibearing camera
US20150022643A1 (en) * 2013-07-19 2015-01-22 Google Inc. Asymmetric Sensor Array for Capturing Images
WO2015030308A1 (en) * 2013-09-02 2015-03-05 Lg Electronics Inc. Digital device and method of controlling therefor
US9020336B1 (en) * 2012-05-29 2015-04-28 Cordin Company, Inc. Digital streak camera with rotating mirror
US20150204556A1 (en) * 2013-05-17 2015-07-23 Panasonic Intellectual Property Corporation Of America Thermal image sensor and user interface
US9121701B2 (en) 2011-08-23 2015-09-01 Bae Systems Information And Electronic Systems Integration Inc. Fiber optically coupled laser rangefinder for use in a gimbal systems
US20150312477A1 (en) * 2014-04-29 2015-10-29 Nokia Corporation Method and apparatus for extendable field of view rendering
US9185391B1 (en) * 2014-06-17 2015-11-10 Actality, Inc. Adjustable parallax distance, wide field of view, stereoscopic imaging system
US20150341617A1 (en) * 2014-05-20 2015-11-26 Nextvr Inc. Methods and apparatus including or for use with one or more cameras
US20160088280A1 (en) * 2014-09-22 2016-03-24 Samsung Electronics Company, Ltd. Camera system for three-dimensional video
US9430923B2 (en) 2009-11-30 2016-08-30 Innovative Signal Analysis, Inc. Moving object detection, tracking, and displaying systems
US9588407B1 (en) * 2016-04-06 2017-03-07 Gopro, Inc. Invertible timer mount for camera
US9667848B2 (en) * 2015-04-22 2017-05-30 Qualcomm Incorporated Tiltable camera module
US9674435B1 (en) * 2016-07-06 2017-06-06 Lawrence Maxwell Monari Virtual reality platforms for capturing content for virtual reality displays
US9729788B2 (en) 2011-11-07 2017-08-08 Sony Corporation Image generation apparatus and image generation method
US9894272B2 (en) 2011-11-07 2018-02-13 Sony Interactive Entertainment Inc. Image generation apparatus and image generation method
US9965856B2 (en) 2013-10-22 2018-05-08 Seegrid Corporation Ranging cameras using a common substrate
US10139819B2 (en) 2014-08-22 2018-11-27 Innovative Signal Analysis, Inc. Video enabled inspection using unmanned aerial vehicles
US10284776B2 (en) 2011-11-07 2019-05-07 Sony Interactive Entertainment Inc. Image generation apparatus and image generation method
US10591969B2 (en) 2013-10-25 2020-03-17 Google Technology Holdings LLC Sensor-based near-field communication authentication
RU2720581C1 (en) * 2019-07-19 2020-05-12 Вячеслав Михайлович Смелков Panoramic television surveillance computer system device
RU2721381C1 (en) * 2019-08-12 2020-05-19 Вячеслав Михайлович Смелков High-resolution panorama television surveillance computer system device
RU2723645C1 (en) * 2019-12-13 2020-06-17 Вячеслав Михайлович Смелков High-resolution panorama television surveillance computer system device
RU2723640C1 (en) * 2019-12-09 2020-06-17 Вячеслав Михайлович Смелков High-resolution panorama television surveillance computer system device
US10708519B2 (en) * 2016-01-15 2020-07-07 Fachhochschule Nordwestschweiz Fhnw Stereo image capturing system having two identical panorama image capturing units arranged at a common support structure
RU2727920C1 (en) * 2020-01-27 2020-07-27 Вячеслав Михайлович Смелков Panoramic television surveillance computer system with selective image scaling
RU2730177C1 (en) * 2020-01-16 2020-08-19 Вячеслав Михайлович Смелков Television system with selective image scaling (versions)
USD894256S1 (en) 2018-08-31 2020-08-25 Gopro, Inc. Camera mount
WO2020205051A1 (en) * 2019-04-03 2020-10-08 Digital Check Corp Image stitching from multiple line scanners
CN111787302A (en) * 2020-06-29 2020-10-16 湖南傲英创视信息科技有限公司 Stereoscopic panoramic live broadcast shooting system based on line scanning camera
USD905786S1 (en) 2018-08-31 2020-12-22 Gopro, Inc. Camera mount
RU2743571C1 (en) * 2020-06-23 2021-02-20 Вячеслав Михайлович Смелков Computing system device for panoramic video surveillance with selective image scaling
US10928711B2 (en) 2018-08-07 2021-02-23 Gopro, Inc. Camera and camera mount
CN112594508A (en) * 2020-11-20 2021-04-02 重庆电子工程职业学院 Campus security monitoring system
US11006028B2 (en) * 2016-12-27 2021-05-11 Yasushi Ikei Image capturing device
US11009779B1 (en) * 2020-05-18 2021-05-18 Titus Gadwin Watts Camera to capture slow to high speed footage
RU2748754C1 (en) * 2020-09-28 2021-05-31 Вячеслав Михайлович Смелков Selective zoom tv system (options)
US11049218B2 (en) 2017-08-11 2021-06-29 Samsung Electronics Company, Ltd. Seamless image stitching
RU2755494C1 (en) * 2021-01-25 2021-09-16 Вячеслав Михайлович Смелков Method for generating video signal in television and computer system for monitoring industrial products having shape of circular ring
RU2755809C1 (en) * 2021-01-12 2021-09-21 Вячеслав Михайлович Смелков Device of a computer system for panoramic television surveillance with increased resolution
RU2756234C1 (en) * 2021-03-01 2021-09-28 Вячеслав Михайлович Смелков Device of a computer system for panoramic television surveillance with selective image scaling
WO2021221980A1 (en) * 2020-04-27 2021-11-04 Ouster, Inc. Stereoscopic image capturing systems
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
US11474254B2 (en) 2017-11-07 2022-10-18 Piaggio Fast Forward Inc. Multi-axes scanning system from single-axis scanner
US11553115B1 (en) * 2021-06-16 2023-01-10 Sony Interactive Entertainment LLC Movable camera
US11601605B2 (en) * 2019-11-22 2023-03-07 Thermal Imaging Radar, LLC Thermal imaging camera device
USD991318S1 (en) 2020-08-14 2023-07-04 Gopro, Inc. Camera
USD997232S1 (en) 2019-09-17 2023-08-29 Gopro, Inc. Camera
USD1024165S1 (en) 2023-05-17 2024-04-23 Gopro, Inc. Camera

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201205180A (en) * 2010-07-27 2012-02-01 Hon Hai Prec Ind Co Ltd Camera
US9503606B2 (en) 2011-09-28 2016-11-22 Semiconductor Components Industries, Llc Time-delay-and-integrate image sensors having variable integration times
US8605161B2 (en) 2012-03-12 2013-12-10 Raytheon Company Intra-frame optical-stabilization with intentional inter-frame scene motion
KR101889225B1 (en) 2016-09-06 2018-08-16 주식회사 에스360브이알 Method of obtaining stereoscopic panoramic images, playing the same and stereoscopic panoramic camera
US10346950B2 (en) 2016-10-05 2019-07-09 Hidden Path Entertainment, Inc. System and method of capturing and rendering a stereoscopic panorama using a depth buffer
US10530997B2 (en) 2017-07-13 2020-01-07 Zillow Group, Inc. Connecting and using building interior data acquired from mobile devices
US10375306B2 (en) 2017-07-13 2019-08-06 Zillow Group, Inc. Capture and use of building interior data from mobile devices
US10643386B2 (en) 2018-04-11 2020-05-05 Zillow Group, Inc. Presenting image transition sequences between viewing locations
CA3058602C (en) 2018-10-11 2023-01-24 Zillow Group, Inc. Automated mapping information generation from inter-connected images
US10708507B1 (en) 2018-10-11 2020-07-07 Zillow Group, Inc. Automated control of image acquisition via use of acquisition device sensors
US10809066B2 (en) 2018-10-11 2020-10-20 Zillow Group, Inc. Automated mapping information generation from inter-connected images
CN109640044B (en) * 2018-12-19 2020-07-14 上海百涛电子系统工程有限公司 Video monitoring system
US11243656B2 (en) 2019-08-28 2022-02-08 Zillow, Inc. Automated tools for generating mapping information for buildings
US11164368B2 (en) 2019-10-07 2021-11-02 Zillow, Inc. Providing simulated lighting information for three-dimensional building models
US11164361B2 (en) 2019-10-28 2021-11-02 Zillow, Inc. Generating floor maps for buildings from automated analysis of visual data of the buildings' interiors
US11676344B2 (en) 2019-11-12 2023-06-13 MFTB Holdco, Inc. Presenting building information using building models
US10825247B1 (en) 2019-11-12 2020-11-03 Zillow Group, Inc. Presenting integrated building information using three-dimensional building models
US11405549B2 (en) 2020-06-05 2022-08-02 Zillow, Inc. Automated generation on mobile devices of panorama images for building locations and subsequent use
US11514674B2 (en) 2020-09-04 2022-11-29 Zillow, Inc. Automated analysis of image contents to determine the acquisition location of the image
US11592969B2 (en) 2020-10-13 2023-02-28 MFTB Holdco, Inc. Automated tools for generating building mapping information
US11481925B1 (en) 2020-11-23 2022-10-25 Zillow, Inc. Automated determination of image acquisition locations in building interiors using determined room shapes
US11252329B1 (en) 2021-01-08 2022-02-15 Zillow, Inc. Automated determination of image acquisition locations in building interiors using multiple data capture devices
CA3142154A1 (en) 2021-01-08 2022-07-08 Zillow, Inc. Automated determination of image acquisition locations in building interiors using multiple data capture devices
US11836973B2 (en) 2021-02-25 2023-12-05 MFTB Holdco, Inc. Automated direction of capturing in-room information for use in usability assessment of buildings
US11790648B2 (en) 2021-02-25 2023-10-17 MFTB Holdco, Inc. Automated usability assessment of buildings using visual data of captured in-room images
US11501492B1 (en) 2021-07-27 2022-11-15 Zillow, Inc. Automated room shape determination using visual data of multiple captured in-room images
US11842464B2 (en) 2021-09-22 2023-12-12 MFTB Holdco, Inc. Automated exchange and use of attribute information between building images of multiple types
US11830135B1 (en) 2022-07-13 2023-11-28 MFTB Holdco, Inc. Automated building identification using floor plans and acquired building images

Patent Citations (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3591269A (en) * 1968-07-30 1971-07-06 Itek Corp Conical scan panoramic camera
US4977323A (en) * 1973-08-16 1990-12-11 The United States Of America As Represented By The Secretary Of The Navy 360 degree infrared surveillance with panoramic display
US4235515A (en) * 1978-06-29 1980-11-25 Rudell Elliot A Stereoscopic viewing system
US4241985A (en) * 1978-11-27 1980-12-30 Globus Richard D Panoramic camera
US4602857A (en) * 1982-12-23 1986-07-29 James H. Carmel Panoramic motion picture camera and method
US4532544A (en) * 1983-06-28 1985-07-30 Gregor Federau Line-scan panoramic camera
US4754334A (en) * 1987-01-08 1988-06-28 Management Graphics, Inc. Image recorder having automatic alignment method and apparatus
US5097325A (en) * 1990-12-17 1992-03-17 Eol3 Company, Inc. Circular scanning system for an integrated camera and panoramic catadioptric display
US5379067A (en) * 1992-04-03 1995-01-03 Sony Corporation CCD linear sensor and method of reading-out charges therefrom
US5305035A (en) * 1992-08-08 1994-04-19 Kamerawerke Noble Gmbh Panoramic camera with objective drum
US5502597A (en) * 1994-02-02 1996-03-26 Eastman Kodak Company Wide-angle photographic lens system and a photographic camera
US5659804A (en) * 1994-10-11 1997-08-19 Keller; James Mcneel Panoramic camera
US5758199A (en) * 1994-10-11 1998-05-26 Keller; James Mcneel Panoramic camera
US5724585A (en) * 1995-10-06 1998-03-03 International Business Machines Corporation Method for processing an application termination initiated from a pre-initialized computer language execution environment
US5903782A (en) * 1995-11-15 1999-05-11 Oxaal; Ford Method and apparatus for producing a three-hundred and sixty degree spherical visual data set
US5790183A (en) * 1996-04-05 1998-08-04 Kerbyson; Gerald M. High-resolution panoramic television surveillance system with synoptic wide-angle field of view
US6031541A (en) * 1996-08-05 2000-02-29 International Business Machines Corporation Method and apparatus for viewing panoramic three dimensional scenes
US5721585A (en) * 1996-08-08 1998-02-24 Keast; Jeffrey D. Digital video panoramic image capture and display system
US6144406A (en) * 1996-12-24 2000-11-07 Hydro-Quebec Electronic panoramic camera
US6922211B2 (en) * 1998-09-04 2005-07-26 Videotec S.R.L. Scanning and cleaning device for explosion-proof casing for monitoring apparatus such as surveillance television camera operating in explosive environment
US6665003B1 (en) * 1998-09-17 2003-12-16 Yissum Research Development Company Of The Hebrew University Of Jerusalem System and method for generating and displaying panoramic images and movies
US7129971B2 (en) * 2000-02-16 2006-10-31 Immersive Media Company Rotating scan self-cleaning camera
US7525567B2 (en) * 2000-02-16 2009-04-28 Immersive Media Company Recording a stereoscopic image of a wide field of view
US6791598B1 (en) * 2000-03-17 2004-09-14 International Business Machines Corporation Methods and apparatus for information capture and steroscopic display of panoramic images
US6404146B1 (en) * 2001-01-31 2002-06-11 Innovision Corporation Method and system for providing two-dimensional color convergence correction

Cited By (120)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080043014A1 (en) * 2004-12-28 2008-02-21 Japan Science And Technology Agency 3D Image Display Method
US7961182B2 (en) * 2004-12-28 2011-06-14 Japan Science And Technology Agency Stereoscopic image display method
US20070014347A1 (en) * 2005-04-07 2007-01-18 Prechtl Eric F Stereoscopic wide field of view imaging system
US20070024701A1 (en) * 2005-04-07 2007-02-01 Prechtl Eric F Stereoscopic wide field of view imaging system
US20070126863A1 (en) * 2005-04-07 2007-06-07 Prechtl Eric F Stereoscopic wide field of view imaging system
US8004558B2 (en) 2005-04-07 2011-08-23 Axis Engineering Technologies, Inc. Stereoscopic wide field of view imaging system
US7982777B2 (en) 2005-04-07 2011-07-19 Axis Engineering Technologies, Inc. Stereoscopic wide field of view imaging system
US20080111881A1 (en) * 2006-11-09 2008-05-15 Innovative Signal Analysis, Inc. Imaging system
US20100073475A1 (en) * 2006-11-09 2010-03-25 Innovative Signal Analysis, Inc. Moving object detection
US9413956B2 (en) 2006-11-09 2016-08-09 Innovative Signal Analysis, Inc. System for extending a field-of-view of an image acquisition device
US8670020B2 (en) 2006-11-09 2014-03-11 Innovative Systems Analysis, Inc. Multi-dimensional staring lens system
US8803972B2 (en) 2006-11-09 2014-08-12 Innovative Signal Analysis, Inc. Moving object detection
US8072482B2 (en) * 2006-11-09 2011-12-06 Innovative Signal Analysis, Inc. Imaging system having a rotatable image-directing device
US8792002B2 (en) 2006-11-09 2014-07-29 Innovative Signal Analysis, Inc. System for extending a field-of-view of an image acquisition device
US20080211902A1 (en) * 2007-03-02 2008-09-04 Fujifilm Corporation Imaging device
US20080246864A1 (en) * 2007-04-07 2008-10-09 Andrew Jon Selin Electronic Imaging Carousel
EP2141932A1 (en) * 2008-06-30 2010-01-06 France Telecom 3D rendering method and system
US8108147B1 (en) * 2009-02-06 2012-01-31 The United States Of America As Represented By The Secretary Of The Navy Apparatus and method for automatic omni-directional visual motion-based collision avoidance
US9733080B2 (en) * 2009-10-02 2017-08-15 Kabushiki Kaisha Topcon Wide-angle image pickup unit and measuring device
US20120162360A1 (en) * 2009-10-02 2012-06-28 Kabushiki Kaisha Topcon Wide-Angle Image Pickup Unit And Measuring Device
US20110114728A1 (en) * 2009-11-18 2011-05-19 Hand Held Products, Inc. Optical reader having improved back-illuminated image sensor
US8464952B2 (en) 2009-11-18 2013-06-18 Hand Held Products, Inc. Optical reader having improved back-illuminated image sensor
US10510231B2 (en) 2009-11-30 2019-12-17 Innovative Signal Analysis, Inc. Moving object detection, tracking, and displaying systems
US9430923B2 (en) 2009-11-30 2016-08-30 Innovative Signal Analysis, Inc. Moving object detection, tracking, and displaying systems
US8399873B2 (en) * 2009-12-09 2013-03-19 Emcom Technology Inc. Rotatable waterproof sensing device
US20110132086A1 (en) * 2009-12-09 2011-06-09 Emcom Technology Inc. Waterproof sensing device
US20110228115A1 (en) * 2010-03-16 2011-09-22 Microsoft Corporation Large Format Digital Camera
WO2013029128A1 (en) * 2010-09-02 2013-03-07 As Tv Produçôes Cinematográficas Ltda Equipment, system and method for mobile video monitoring with panoramic capture, transmission and instant storage
CN102566245A (en) * 2010-12-10 2012-07-11 财团法人工业技术研究院 Method and system for establishing ring field image
US20120268563A1 (en) * 2011-04-22 2012-10-25 Microsoft Corporation Augmented auditory perception for the visually impaired
US8797386B2 (en) * 2011-04-22 2014-08-05 Microsoft Corporation Augmented auditory perception for the visually impaired
US8754929B1 (en) * 2011-05-23 2014-06-17 John Prince Real time vergence control for 3D video capture and display
US20130201296A1 (en) * 2011-07-26 2013-08-08 Mitchell Weiss Multi-camera head
US9121701B2 (en) 2011-08-23 2015-09-01 Bae Systems Information And Electronic Systems Integration Inc. Fiber optically coupled laser rangefinder for use in a gimbal systems
US20130089301A1 (en) * 2011-10-06 2013-04-11 Chi-cheng Ju Method and apparatus for processing video frames image with image registration information involved therein
US9729788B2 (en) 2011-11-07 2017-08-08 Sony Corporation Image generation apparatus and image generation method
EP2779620A4 (en) * 2011-11-07 2015-06-24 Sony Computer Entertainment Inc Image generation device, and image generation method
US9894272B2 (en) 2011-11-07 2018-02-13 Sony Interactive Entertainment Inc. Image generation apparatus and image generation method
CN103907341A (en) * 2011-11-07 2014-07-02 索尼电脑娱乐公司 Image generation device, and image generation method
US10284776B2 (en) 2011-11-07 2019-05-07 Sony Interactive Entertainment Inc. Image generation apparatus and image generation method
US9560274B2 (en) 2011-11-07 2017-01-31 Sony Corporation Image generation apparatus and image generation method
US9020336B1 (en) * 2012-05-29 2015-04-28 Cordin Company, Inc. Digital streak camera with rotating mirror
US20140168357A1 (en) * 2012-12-13 2014-06-19 Microsoft Corporation Displacing image on imager in multi-lens cameras
US9094540B2 (en) * 2012-12-13 2015-07-28 Microsoft Technology Licensing, Llc Displacing image on imager in multi-lens cameras
US20140267598A1 (en) * 2013-03-14 2014-09-18 360Brandvision, Inc. Apparatus and method for holographic poster display
JP2015222260A (en) 2013-05-17 2015-12-10 Panasonic Intellectual Property Corporation of America Thermal image sensor and air conditioner
US11320162B2 (en) 2013-05-17 2022-05-03 Panasonic Intellectual Property Corporation Of America Thermal image sensor and user interface
US10641509B2 (en) 2013-05-17 2020-05-05 Panasonic Intellectual Property Corporation Of America Thermal image sensor and user interface
US9939164B2 (en) * 2013-05-17 2018-04-10 Panasonic Intellectual Property Corporation Of America Thermal image sensor and user interface
US20150204556A1 (en) * 2013-05-17 2015-07-23 Panasonic Intellectual Property Corporation Of America Thermal image sensor and user interface
US20150022643A1 (en) * 2013-07-19 2015-01-22 Google Inc. Asymmetric Sensor Array for Capturing Images
CN105308953A (en) * 2013-07-19 2016-02-03 Google Technology Holdings LLC Asymmetric sensor array for capturing images
WO2015030308A1 (en) * 2013-09-02 2015-03-05 Lg Electronics Inc. Digital device and method of controlling therefor
US9965856B2 (en) 2013-10-22 2018-05-08 Seegrid Corporation Ranging cameras using a common substrate
US11429174B2 (en) 2013-10-25 2022-08-30 Google Llc Sensor-based near-field communication authentication
US10591969B2 (en) 2013-10-25 2020-03-17 Google Technology Holdings LLC Sensor-based near-field communication authentication
US9930253B2 (en) * 2014-04-29 2018-03-27 Nokia Technologies Oy Method and apparatus for extendable field of view rendering
US20150312477A1 (en) * 2014-04-29 2015-10-29 Nokia Corporation Method and apparatus for extendable field of view rendering
US10447994B2 (en) * 2014-05-20 2019-10-15 Nextvr Inc. Methods and apparatus including or for use with one or more cameras
US20190082165A1 (en) * 2014-05-20 2019-03-14 Nextvr Inc. Methods and apparatus including or for use with one or more cameras
US10027948B2 (en) * 2014-05-20 2018-07-17 Nextvr Inc. Methods and apparatus including or for use with one or more cameras
CN106464785A (en) * 2014-05-20 2017-02-22 奈克斯特Vr股份有限公司 Methods and apparatus including or for use with one or more cameras
US20150341617A1 (en) * 2014-05-20 2015-11-26 Nextvr Inc. Methods and apparatus including or for use with one or more cameras
US20170155888A1 (en) * 2014-06-17 2017-06-01 Actality, Inc. Systems and Methods for Transferring a Clip of Video Data to a User Facility
US9838668B2 (en) * 2014-06-17 2017-12-05 Actality, Inc. Systems and methods for transferring a clip of video data to a user facility
US9185391B1 (en) * 2014-06-17 2015-11-10 Actality, Inc. Adjustable parallax distance, wide field of view, stereoscopic imaging system
US9578309B2 (en) 2014-06-17 2017-02-21 Actality, Inc. Adjustable parallax distance, wide field of view, stereoscopic imaging system
US10139819B2 (en) 2014-08-22 2018-11-27 Innovative Signal Analysis, Inc. Video enabled inspection using unmanned aerial vehicles
KR20170052676A (en) * 2014-09-22 2017-05-12 Samsung Electronics Co., Ltd. Camera system for three-dimensional video
US20160088287A1 (en) * 2014-09-22 2016-03-24 Samsung Electronics Company, Ltd. Image stitching for three-dimensional video
EP3198865A4 (en) * 2014-09-22 2017-09-27 Samsung Electronics Co., Ltd. Camera system for three-dimensional video
US10750153B2 (en) * 2014-09-22 2020-08-18 Samsung Electronics Company, Ltd. Camera system for three-dimensional video
US10257494B2 (en) * 2014-09-22 2019-04-09 Samsung Electronics Co., Ltd. Reconstruction of three-dimensional video
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
US10313656B2 (en) * 2014-09-22 2019-06-04 Samsung Electronics Company Ltd. Image stitching for three-dimensional video
US20160088285A1 (en) * 2014-09-22 2016-03-24 Samsung Electronics Company, Ltd. Reconstruction of three-dimensional video
US20160088280A1 (en) * 2014-09-22 2016-03-24 Samsung Electronics Company, Ltd. Camera system for three-dimensional video
US10547825B2 (en) * 2014-09-22 2020-01-28 Samsung Electronics Company, Ltd. Transmission of three-dimensional video
US20160088282A1 (en) * 2014-09-22 2016-03-24 Samsung Electronics Company, Ltd. Transmission of three-dimensional video
KR101885780B1 (en) * 2014-09-22 2018-08-06 Samsung Electronics Co., Ltd. Camera system for three-dimensional video
CN104253937A (en) * 2014-10-16 2014-12-31 宁波通视电子科技有限公司 Omnibearing camera
US9667848B2 (en) * 2015-04-22 2017-05-30 Qualcomm Incorporated Tiltable camera module
US10708519B2 (en) * 2016-01-15 2020-07-07 Fachhochschule Nordwestschweiz Fhnw Stereo image capturing system having two identical panorama image capturing units arranged at a common support structure
US9588407B1 (en) * 2016-04-06 2017-03-07 Gopro, Inc. Invertible timer mount for camera
US9674435B1 (en) * 2016-07-06 2017-06-06 Lawrence Maxwell Monari Virtual reality platforms for capturing content for virtual reality displays
US11006028B2 (en) * 2016-12-27 2021-05-11 Yasushi Ikei Image capturing device
US11049218B2 (en) 2017-08-11 2021-06-29 Samsung Electronics Company, Ltd. Seamless image stitching
US11474254B2 (en) 2017-11-07 2022-10-18 Piaggio Fast Forward Inc. Multi-axes scanning system from single-axis scanner
US11662651B2 (en) 2018-08-07 2023-05-30 Gopro, Inc. Camera and camera mount
US10928711B2 (en) 2018-08-07 2021-02-23 Gopro, Inc. Camera and camera mount
USD905786S1 (en) 2018-08-31 2020-12-22 Gopro, Inc. Camera mount
USD1023115S1 (en) 2018-08-31 2024-04-16 Gopro, Inc. Camera mount
USD894256S1 (en) 2018-08-31 2020-08-25 Gopro, Inc. Camera mount
USD989165S1 (en) 2018-08-31 2023-06-13 Gopro, Inc. Camera mount
WO2020205051A1 (en) * 2019-04-03 2020-10-08 Digital Check Corp Image stitching from multiple line scanners
US11688048B2 (en) 2019-04-03 2023-06-27 Digital Check Corp. Image stitching from multiple line scanners
EP3949370A4 (en) * 2019-04-03 2022-12-14 Digital Check Corp Image stitching from multiple line scanners
RU2720581C1 (en) * 2019-07-19 2020-05-12 Вячеслав Михайлович Смелков Panoramic television surveillance computer system device
RU2721381C1 (en) * 2019-08-12 2020-05-19 Вячеслав Михайлович Смелков High-resolution panorama television surveillance computer system device
USD997232S1 (en) 2019-09-17 2023-08-29 Gopro, Inc. Camera
US11601605B2 (en) * 2019-11-22 2023-03-07 Thermal Imaging Radar, LLC Thermal imaging camera device
RU2723640C1 (en) * 2019-12-09 2020-06-17 Вячеслав Михайлович Смелков High-resolution panorama television surveillance computer system device
RU2723645C1 (en) * 2019-12-13 2020-06-17 Вячеслав Михайлович Смелков High-resolution panorama television surveillance computer system device
RU2730177C1 (en) * 2020-01-16 2020-08-19 Вячеслав Михайлович Смелков Television system with selective image scaling (versions)
RU2727920C9 (en) * 2020-01-27 2020-08-06 Вячеслав Михайлович Смелков Panoramic television surveillance computer system with selective image scaling
RU2727920C1 (en) * 2020-01-27 2020-07-27 Вячеслав Михайлович Смелков Panoramic television surveillance computer system with selective image scaling
WO2021221980A1 (en) * 2020-04-27 2021-11-04 Ouster, Inc. Stereoscopic image capturing systems
US11695911B2 (en) 2020-04-27 2023-07-04 Ouster, Inc. Stereoscopic image capturing systems
US11009779B1 (en) * 2020-05-18 2021-05-18 Titus Gadwin Watts Camera to capture slow to high speed footage
RU2743571C1 (en) * 2020-06-23 2021-02-20 Вячеслав Михайлович Смелков Computing system device for panoramic video surveillance with selective image scaling
CN111787302A (en) * 2020-06-29 2020-10-16 湖南傲英创视信息科技有限公司 Stereoscopic panoramic live broadcast shooting system based on line scanning camera
USD1004676S1 (en) 2020-08-14 2023-11-14 Gopro, Inc. Camera
USD991318S1 (en) 2020-08-14 2023-07-04 Gopro, Inc. Camera
RU2748754C1 (en) * 2020-09-28 2021-05-31 Вячеслав Михайлович Смелков Selective zoom tv system (options)
CN112594508A (en) * 2020-11-20 2021-04-02 重庆电子工程职业学院 Campus security monitoring system
RU2755809C1 (en) * 2021-01-12 2021-09-21 Вячеслав Михайлович Смелков Device of a computer system for panoramic television surveillance with increased resolution
RU2755494C1 (en) * 2021-01-25 2021-09-16 Вячеслав Михайлович Смелков Method for generating video signal in television and computer system for monitoring industrial products having shape of circular ring
RU2756234C1 (en) * 2021-03-01 2021-09-28 Вячеслав Михайлович Смелков Device of a computer system for panoramic television surveillance with selective image scaling
US11553115B1 (en) * 2021-06-16 2023-01-10 Sony Interactive Entertainment LLC Movable camera
USD1024165S1 (en) 2023-05-17 2024-04-23 Gopro, Inc. Camera

Also Published As

Publication number Publication date
US7791638B2 (en) 2010-09-07

Similar Documents

Publication Publication Date Title
US7791638B2 (en) Rotating scan camera
US7525567B2 (en) Recording a stereoscopic image of a wide field of view
US7486324B2 (en) Presenting panoramic images with geometric transformation
CN102077575B (en) Zoom by multiple image capture
US5721585A (en) Digital video panoramic image capture and display system
US8390729B2 (en) Method and apparatus for providing a video image having multiple focal lengths
US8059185B2 (en) Photographing apparatus, image display method, computer program and storage medium for acquiring a photographed image in a wide range
US6545701B2 (en) Panoramic digital camera system and method
EP1178352A1 (en) Method of and apparatus for presenting panoramic images at a local receiver, and a corresponding computer program
US20020190991A1 (en) 3-D instant replay system and method
US20050231590A1 (en) Three-dimensional image-capturing apparatus
Konrad et al. SpinVR: Towards live-streaming 3D virtual reality video
WO2012002430A1 (en) Stereoscopic image capture device
US4646156A (en) High-speed video camera and method of high-speed imaging using a beam splitter
US20050254817A1 (en) Autostereoscopic electronic camera
US20050041095A1 (en) Stereoscopic image acquisition device
JPH1146317A (en) Video camera
JPH0630446A (en) Stereoscopic image recorder
JP2003158684A (en) Digital camera
FR2636799A1 (en) Camera for taking fast motion television pictures, and system for reproducing images using such a camera
JPH0591450A (en) Video camera device

Legal Events

Date Code Title Description
AS Assignment

Owner name: IMMERSIVE MEDIA CO., OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MCCUTCHEN, DAVID J.;REEL/FRAME:015579/0683

Effective date: 20050118

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: IMC360 COMPANY, OREGON

Free format text: CHANGE OF NAME;ASSIGNOR:IMMERSIVE MEDIA COMPANY;REEL/FRAME:026899/0208

Effective date: 20110331

Owner name: IMMERSIVE VENTURES INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IMC360 COMPANY;REEL/FRAME:026898/0664

Effective date: 20110913

REMI Maintenance fee reminder mailed
FPAY Fee payment

Year of fee payment: 4

SULP Surcharge for late payment
AS Assignment

Owner name: IMMERSIVE LICENSING, INC., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IMMERSIVE VENTURES INC.;REEL/FRAME:035530/0180

Effective date: 20150410

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.)

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552)

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20220907