WO2012089895A1 - Rolling shutter compensation in camera array - Google Patents


Info

Publication number
WO2012089895A1
Authority
WO
WIPO (PCT)
Prior art date
Application number
PCT/FI2010/051103
Other languages
French (fr)
Inventor
Tommi Ilmonen
Original Assignee
Multitouch Oy
Priority date
Filing date
Publication date
Application filed by Multitouch Oy filed Critical Multitouch Oy
Priority to PCT/FI2010/051103 priority Critical patent/WO2012089895A1/en
Publication of WO2012089895A1 publication Critical patent/WO2012089895A1/en


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 Control of the SSIS exposure
    • H04N25/53 Control of the integration time
    • H04N25/531 Control of the integration time by controlling rolling shutters in CMOS SSIS


Abstract

Rolling shutter effects are compensated in a camera array of rolling shutter cameras (101) that are configured to capture overlapping or adjacent sub-images (102), which sub-images (102) collectively form a full image. A synchronizing circuitry (103) controls the adjacent cameras (101) to capture images continuously, taking into account the rolling shutter, such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras (101).

Description

ROLLING SHUTTER COMPENSATION IN CAMERA ARRAY
FIELD OF THE INVENTION
The present invention generally relates to rolling shutter compensation in a camera array.
BACKGROUND OF THE INVENTION
Camera arrays are used in various applications such as optical touch detection systems, super-resolution camera systems and panorama camera systems. Some cameras produce each frame as a snapshot such that all the vertical and horizontal lines of the frame correspond to the very same exposure period. However, there are cameras such as complementary metal-oxide-semiconductor (CMOS) cameras which inherently have a so-called rolling shutter or line scan nature in which the image frames are scanned out line by line. If the object and camera move with respect to one another, the object becomes correspondingly distorted. For instance, horizontal panning makes vertical objects appear slanted. This effect may be digitally compensated to an extent by computationally straightening the slant in image frames. The slanting is not, however, particularly disturbing under normal circumstances when a single camera is used. In a camera array, on the other hand, the different cameras may relatively easily cause very surprising and disturbing effects in which some parts of an image object are substituted by a surrounding part of the image object.
The camera arrays are also technically complex systems, as individual camera units have to be made to work together.
It is an object of the invention to avoid or at least mitigate the aforementioned problems, to produce further technical advancements and advantages, or at least to produce new alternatives to existing technology.
SUMMARY
According to a first aspect of the invention there is provided an apparatus comprising:
an array of rolling shutter cameras that are configured to capture overlapping or adjacent sub-images, which sub-images collectively form a full image; and
a synchronizing circuitry configured to control the adjacent cameras to capture images continuously taking into account the rolling shutter such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras.
The adjacent cameras may be oriented in a common scan direction. The synchronizing circuitry may be configured to command each of the adjacent cameras to start exposing an image frame in a cascade with such a delay in the scan direction that lines are scanned by the adjacent cameras so that the full image is formed in a consistent progression. The synchronizing circuitry may comprise a field-programmable gate array (FPGA) that is communicatively connected with at least two of the cameras.
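The cascaded start-of-exposure described above can be sketched as a simple trigger schedule. This is an illustrative sketch only; the function and parameter names (`lines_per_camera`, `line_time_us`) are assumptions, not taken from the patent:

```python
def cascade_trigger_times(num_cameras, lines_per_camera, line_time_us):
    """Trigger offsets (microseconds) for cameras oriented in a common
    scan direction: camera i+1 starts its scan exactly when camera i
    finishes its last line, so the full image is scanned out with a
    substantially constant delay between consecutive lines."""
    scan_out_us = lines_per_camera * line_time_us
    return [i * scan_out_us for i in range(num_cameras)]

# Three cameras, 480 lines each, 20 us per line:
print(cascade_trigger_times(3, 480, 20.0))  # → [0.0, 9600.0, 19200.0]
```

In a hardware realization, the synchronizing circuitry (e.g. an FPGA) would emit trigger pulses at these offsets on its output ports.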
The synchronizing circuitry may comprise a single field-programmable gate array (FPGA) that is communicatively connected with at least two of the cameras. The single field-programmable gate array may be communicatively connected with all of the cameras.
The synchronizing circuitry may comprise a plurality of output ports configured to control respective trigger ports of each of the cameras.
The apparatus may further comprise a data bus communicatively connected with the synchronizing circuitry. The synchronizing circuitry may be configured to receive image data from one or more of the cameras at a time and to pass the received image data to the data bus without prior buffering on an external buffer circuitry.
The apparatus may enable a significantly simplified construction in which separate memory buffers may be omitted without necessitating the use of very fast data buses.
The cameras may be infra-red cameras. The apparatus may further comprise a touch detection circuitry configured to detect touching of a touching surface based on the full image. It may be particularly advantageous to compensate for rolling shutter in an apparatus detecting touching, where continuous images may be of major importance.
According to a second aspect of the invention there is provided a method, comprising:
capturing overlapping or adjacent sub-images with an array of rolling shutter cameras;
forming a full image of the captured overlapping or adjacent sub-images; and
controlling said adjacent cameras of the array to capture images continuously taking into account the rolling shutter such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras.
According to a third aspect of the invention there is provided a computer program configured to cause when executed by a computer a method according to the second aspect of the invention.
According to a fourth aspect of the invention there is provided a computer readable memory medium embodied with a computer program which when executed by a computer causes a computer to perform a method according to the second aspect of the invention.
According to a fifth aspect of the invention there is provided a computer program product comprising a non-transitory computer readable medium having computer executable program code stored thereon, which when executed by at least one processor causes an apparatus at least to perform a method according to the second aspect of the invention.
Various embodiments of the present invention have been illustrated only with reference to certain aspects of the invention. It should be appreciated that corresponding embodiments may apply to other aspects and embodiments as well to produce further non-limiting examples.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be described, by way of example only, with reference to the accompanying drawings, in which:
Fig. 1 shows a block diagram of a system according to an embodiment of the invention;
Fig. 2 shows a simplified block diagram of the structure of a control unit shown in Fig. 1 according to an embodiment of the invention;
Fig. 3a shows an example of distribution of a full image into a plurality of sub-images;
Fig. 3b shows a timing chart that illustrates how scan-out periods of adjacent rolling shutter cameras are synchronized in one example embodiment;
Fig. 3c shows a schematic flow chart according to an embodiment of the invention; and
Fig. 4 shows a system to illustrate further details regarding possible circuitries suited e.g. for operation as shown in Figs. 3a, 3b and/or 3c.
DETAILED DESCRIPTION
In the following description, like numbers denote like elements.
Fig. 1 shows a block diagram of a system 100 according to an embodiment of the invention. The system 100 comprises a plurality of cameras 101 arranged in an array (here only three shown), sub-images or sub-image areas 102 of the cameras 101, a control unit 103 that controls the operation of the cameras 101, and a data bus 104 connected with the control unit 103. Unless otherwise stated, the terms sub-image and sub-image area are used interchangeably.
In Fig. 1, the sub-images 102 are adjacent such that the adjacent sub-images form a continuous full image without overlap between different sub-images. In some alternative embodiments, the sub-images 102 do overlap: this is particularly the case when more accurate images are desired by imaging a common region by two or more overlapping sub-images 102, which technique is also referred to as super-resolution imaging.
The cameras 101 are rolling shutter cameras such as complementary metal-oxide-semiconductor (CMOS) cameras that produce the image line by line (of image pixels) with a minute delay between each line. The cameras typically produce square or rectangular sub-images 102. The array typically conforms to a regular matrix, but other alignments are also provided for in other embodiments. If the cameras have common or individually differing overlaps, it is advantageous to account for this by storing a measure of the overlap and by adjusting the timing at which each of the cameras 101 is triggered, so that the sub-images 102 are formed on each camera 101 such that at any moment of time, the lines being scanned have a substantially linear correspondence to the full image and thus to the object that is being imaged. In other words, assuming that the rolling shutter direction is left to right, the lines are scanned left to right over each of the cameras 101 line by line, and the instantaneously scanned lines remain aligned with respect to the imaged object. If two cameras 101 overlap in the rolling shutter direction, then for some progression over the overlap, both of these cameras 101 are scanned out line by line with corresponding lines. If the object moves in relation to the camera during the scan-out, each line corresponding to the object still images the very same part of the object and no tearing should occur: instead, the full image may then show a slanted outline for the object. The principle outlined in this paragraph is explained in more detail with reference to Figs. 3a, 3b and 3c.
Fig. 2 shows a simplified block diagram of the structure of the control unit 103. The control unit 103 may be based on, for example, a general purpose computer supplied with suitable software and/or on a particularly adapted computing device. While it is possible to implement the control unit 103 as a purely hardware-based device, typically it is more economical and faster to produce by making use of software.
In Fig. 2, the control unit 103 is drawn to comprise a memory 201 that comprises a work memory 202, a non-volatile memory 203 that is configured to store software 204, and settings 206 needed e.g. for manual or automatic calibration of the system 100. The software 204 may comprise any one or more of the following items: operating system, device drivers, display presentation application, hypertext markup language parser, image processing software, and drivers for different external equipment that may be connected to the system such as printers, further displays, further interactive systems 100, audio systems, and external IR illumination equipment (not shown).
The control unit 103 further comprises a processor 207 (such as a field-programmable gate array, FPGA) configured to control the operation of the control unit 103 according to the software 204 by executing computer executable program code contained by the software in the work memory 202. Alternatively, the control unit 103 may be configured to execute the software in place in the non-volatile memory, in which case the work memory may not be necessary. The control unit 103 further comprises an input/output unit (I/O) 208 for exchanging signals with other elements of the system 100 and optionally also with external equipment. The I/O 208 may comprise e.g. any one or more of a universal serial bus port, a local area network port, an ISA bus, a PCI express port, an IR port, a Bluetooth element, and a parallel port. Alternatively to being configured capable of communicating with external equipment, the system 100 may be provided with a transferable memory reception unit 209 such as a CD-ROM or DVD-ROM drive, memory card reader or memory stick reader which enables replacing part of the non-volatile memory.
It is appreciated that while the control unit may consist of one separate unit, the control unit 103 may alternatively be integrated with any other element or comprise two or more discrete elements each for one or more of the aforementioned acts.
Fig. 3a shows an example of distribution of a full image into a plurality of sub-images 102, also referred to as sub-regions. The sub-regions 102 are drawn to form an 8 x 4 matrix and denoted with reference signs 1-1 to 4-8. Also a rolling shutter direction or scanning direction 300 is shown, here from left to right, implying that camera scan-lines are aligned vertically.
Fig. 3b shows a timing chart that illustrates how scan-out periods of adjacent rolling shutter cameras 101 are synchronized in one example embodiment. Fig. 3b shows the timing of some main events for three adjacent cameras 101 in the rolling shutter direction 300. Fig. 3b is not drawn to scale. However, Fig. 3b clearly shows how the scan-out periods continuously progress as a function of time. After the eighth camera 101 on each row (in the Fig. 3a embodiment), the first camera 101 would be triggered again so that its scan-out period starts from where the scan-out period of the eighth camera 101 ends.
The cameras 101 on other rows are preferably synchronized with the first row so that the cameras 101 on each column of the matrix shown in Fig. 3a are triggered simultaneously.
Fig. 3b is based on the assumption that the sub-images or regions 102 are adjacent and non-overlapping. If there is overlap, the timing is preferably adjusted for the cameras 101 so that such lines that overlap in the full image are scanned out substantially simultaneously from all cameras against the rolling shutter direction 300. In Fig. 3b the frame-capture signal is sent at a moment of time t1. Actual frame capture starts at a moment of time t2, with the time between the moments t1 and t2 (if any) being spent in preparing the image capture. Image exposure takes place between moments of time t2 and t3, with the image scan-out happening between moments of time t3 and t4.
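The t1-t4 phases and the overlap adjustment can be combined into one schedule sketch. The names `prep_us` and `expose_us` stand in for the preparation and exposure times; all parameters are illustrative assumptions rather than values from the patent:

```python
def frame_schedule(num_cameras, lines, overlap_lines, line_us, prep_us, expose_us):
    """Per-camera (t1, t2, t3, t4) tuples: trigger sent, exposure start,
    scan-out start, scan-out end. Adjacent cameras are offset by the
    non-overlapping line count, so lines that overlap in the full image
    are scanned out at substantially the same moments."""
    step_us = (lines - overlap_lines) * line_us  # offset between scan-outs
    schedule = []
    for i in range(num_cameras):
        t1 = i * step_us              # frame-capture signal sent
        t2 = t1 + prep_us             # preparation between t1 and t2
        t3 = t2 + expose_us           # exposure between t2 and t3
        t4 = t3 + lines * line_us     # scan-out between t3 and t4
        schedule.append((t1, t2, t3, t4))
    return schedule
```

With `overlap_lines = 0` this reduces to the adjacent, non-overlapping case of Fig. 3b, where one camera's scan-out ends exactly as the next one's begins.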
Fig. 3c shows a schematic flow chart according to an embodiment of the invention. In step 301, the camera units of the first column (in the embodiment of Fig. 3a) are activated for capturing first sub-images 102, which in the Fig. 3a example means sub-regions 1-1, 2-1, 3-1 and 4-1.
Next, in step 302, the following sub-regions are selected and triggered like the first sub-regions in step 301, advancing in the rolling shutter direction 300. It is understood that more (or fewer) than two sub-regions may be processed simultaneously. Moreover, the sub-regions need not be processed exactly simultaneously; the exact timing of parallel processing of different sub-regions may be partly random, or there may be a given time offset such as 1/10 or 1/100 of the scan-out period. Such offsetting may help in feeding all the image data to the data bus 104 e.g. as interlaced bursts. Of course, in some other embodiments the scanning may proceed along columns rather than rows. Regardless of whether the scan direction of the cameras is horizontal or vertical, the operation is preferably advanced from one sub-region 102 to the following in the rolling shutter direction 300 in a continuous manner, without a disturbingly different interval between neighboring (outermost) lines of two cameras 101. Therefore, there should be no holes or other discontinuities in images formed by combining plural adjacent sub-images, unlike with current systems where rolling shutter images may suffer from temporal disruptions.
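Steps 301-303 amount to stepping through the columns of the Fig. 3a matrix in the rolling shutter direction. A minimal sketch, using the Fig. 3a row-column labels but with a hypothetical function name:

```python
def column_trigger_order(rows, cols):
    """All rows of one column are triggered together; columns advance
    one at a time in the rolling shutter direction (steps 301-303)."""
    return [[f"{r}-{c}" for r in range(1, rows + 1)]
            for c in range(1, cols + 1)]

order = column_trigger_order(4, 8)
print(order[0])  # → ['1-1', '2-1', '3-1', '4-1']
```

The first step matches the first-column sub-regions activated in step 301; the loop terminating after the last column corresponds to the check of step 303.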
It is then checked 303 whether all the sub-regions 102 have been captured already. If not, the process returns to step 302; otherwise the process advances to step 304. In step 304, the full image is formed from all the different sub-images 102. When forming the full image, the mutual alignment of the different sub-images 102 is accounted for. For instance, stored alignment data (e.g. information that identifies overlapping portions of different sub-images 102) is used in one embodiment to determine how the full image is put together. In another embodiment, however, the mutual alignment of different sub-images 102 is determined entirely or in part based on the image information of the different sub-images 102. This determining of mutual alignment may be implemented e.g. by searching for the mutual alignment with the least difference between adjacent pixels of two sub-images along the adjoining edge of one of the sub-images.
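The image-based alignment search mentioned last, picking the mutual alignment with the least difference along the adjoining edge, could look roughly as follows. Treating the edge as a 1-D pixel strip and using a plain mean absolute difference are simplifying assumptions for illustration:

```python
def best_edge_offset(edge_a, edge_b, max_shift):
    """Return the shift of edge_b (in pixels) that minimises the mean
    absolute difference against edge_a over the positions where the
    two edges overlap."""
    n = len(edge_a)
    best_cost, best_shift = float("inf"), 0
    for shift in range(-max_shift, max_shift + 1):
        diffs = [abs(edge_a[i] - edge_b[i + shift])
                 for i in range(n) if 0 <= i + shift < n]
        cost = sum(diffs) / len(diffs)   # normalise by overlap length
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift

# edge_b is edge_a shifted down by one pixel:
print(best_edge_offset([0, 0, 10, 20, 30, 0, 0],
                       [9, 0, 0, 10, 20, 30, 0], 2))  # → 1
```

A 2-D version would slide one sub-image's border region against the other's in both axes, but the principle of minimising the difference along the adjoining edge is the same.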
Fig. 4 shows a system 410 to illustrate further details regarding possible circuitries suited e.g. for operation as shown in Figs. 3a, 3b and/or 3c. Fig. 4 also makes clear how the circuitry may be greatly simplified. This simplification takes advantage of the rolling shutter nature of the cameras 101, whereby the cameras 101 produce image pixels for different parts of the image at slightly different moments of time.
In the system 410, some of the cameras 101 are laid onto one or more circuit boards into an array. In the array, one or more cameras 101 are connected as one common camera unit 420.
The common camera unit 420 comprises a field-programmable gate array (FPGA) 422 that is communicatively connected with each of the cameras 101 of the common camera unit 420. The FPGAs 422 are configured to synchronise the cameras 101 so that the image formed by adjacent cameras 101 is formed continuously, taking into account the rolling shutter. That is, in Fig. 4, first the top-most cameras 101 start exposing a first sub-image 102. The FPGAs 422 of the top-most camera units 420 start substantially simultaneously scanning out these sub-images 102 from the first (top-most) cameras 101. The FPGAs 422 also trigger the second (downward in Fig. 4) cameras 101 to start exposing second sub-images 102 so that the FPGA 422 scans out the second sub-images continuously after the first sub-images, i.e. so that there is no significant delay after scanning out the first image and before scanning out the second image. The cameras 101 are thus scanned out row by row, column by column, or in some other order, along the rolling shutter direction 300.
The timing of the scan-out operations enables the FPGA 422 in each camera unit 420 to receive image data from its different cameras 101 as a substantially continuous data stream while the different cameras 101 of that camera unit are being scanned out. In some embodiments, such as that in Fig. 4, there will be a break between scanning out image data from the cameras 101 of one camera unit 420 while the scanning out progresses through the other camera unit in the rolling shutter direction 300.
The system 410 further comprises one or more common camera unit data buses 430 through which the FPGAs 422 pass the data stream for subsequent processing. The common camera unit data buses 430 are data channels that are configured capable of transporting image data from all of the FPGAs 422 connected thereto. When the cameras 101 are not overlapping but are arranged in a matrix e.g. as shown in Fig. 3, there is only one FPGA 422 transmitting image data through the data bus 430 at a time. In Fig. 4, each data bus should simultaneously convey the image data of two cameras 101. In an alternative embodiment, there is one FPGA 422 that is configured to control all the cameras 101 on each one or two columns (as in Fig. 3) or rows. Alternatively, a single FPGA 422 may be configured to control all the cameras 101.
An interfacing unit 440 is connected to each of the common camera unit data buses and configured to pass on all the image data and necessary metadata. The metadata comprises e.g. an identification of the camera 101 from which the image data in question comes. The metadata is typically provided by the FPGA 422.
The interfacing unit 440 may comprise a coordinating processor 442 (such as an FPGA circuitry, central processing unit, digital signal processor or the like), a data input port 444 and possibly a control output port 446 for controlling the FPGAs 422.
The interfacing unit further comprises a data output port 448 which may comprise a relatively small buffer memory 4482 e.g. to allow retransmissions should data be corrupted over a connection 450 between the system 410 and an auxiliary device that receives the image data (such as a computer, not shown).
When data is scanned out from a camera 101, the associated FPGA 422 receives and forwards that data, potentially together with an identification of the camera 101, and the interfacing unit 440 further passes on the image data it receives from the different FPGAs 422. All of this may take place without any need to buffer and re-buffer the scanned-out image data. Fig. 4 exemplifies some factors related to the design of the system 410. The fewer FPGAs 422 there are, the more data ports are needed at each FPGA 422 and the more complex the wiring may become, but the overall costs may still be reduced by reducing the number of the FPGAs 422. On the other hand, the fewer FPGAs 422 there are per simultaneously scanned cameras 101, the faster the data bus or buses 430 have to be. In one embodiment, the system 410 is implemented so that the data bus 430 is preferably fast enough to relay image data from all the simultaneously scanned cameras 101, or if that is not possible, there is the smallest possible number of data buses 430 that suffices for this task. Then, a single FPGA 422 is provided per data bus, and common camera units are formed of all the cameras 101 that shall communicate via each data bus 430.
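The bandwidth trade-off in the preceding paragraph can be put into rough numbers. The resolution and frame-rate figures below are purely illustrative assumptions, not values from the patent:

```python
def required_bus_rate_mbit(simultaneous_cameras, width, height, fps, bits_per_pixel=8):
    """Lower bound (Mbit/s) on the throughput one data bus needs in order
    to relay, without intermediate buffering, the data of all cameras
    that are scanned out simultaneously over it."""
    per_camera_bps = width * height * fps * bits_per_pixel
    return simultaneous_cameras * per_camera_bps / 1e6

# Two 640x480 cameras at 30 fps with 8-bit pixels sharing one bus:
print(required_bus_rate_mbit(2, 640, 480, 30))  # → 147.456
```

Halving the number of FPGAs per bus doubles `simultaneous_cameras` and hence the required bus rate, which is the cost-versus-speed trade-off the text describes.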
The common camera units may altogether lack memory buffers, whereas in a normal implementation, each of the FPGAs 422 would be associated with a memory buffer that is large enough to store at least one entire image frame. Thus, the system 410 can be simplified with the advantages that manufacturing and maintenance become cheaper and faster and debugging of possible problems is also improved in comparison to an alternative implementation in which the memory buffers are provided. Such an alternative implementation may be more advantageous e.g. when fast data buses are not desired for any reason.
It should be appreciated that in this document, the words "comprise", "include" and "contain" are each used as open-ended expressions with no intended exclusivity. Moreover, the term "light" here is interchangeable with "radiation". While infrared light has on occasions been referred to, this terming is merely for convenience of explanation; the term "light" is not intended to imply suitability for perception by means of a human eye.
The foregoing description has provided, by way of non-limiting examples of particular implementations and embodiments of the invention, a full and informative description of the best mode presently contemplated by the inventors for carrying out the invention. It is however clear to a person skilled in the art that the invention is not restricted to details of the embodiments presented above, but that it can be implemented in other embodiments using equivalent means without deviating from the characteristics of the invention.
Furthermore, some of the features of the above-disclosed embodiments of this invention may be used to advantage without the corresponding use of other features. As such, the foregoing description shall be considered as merely illustrative of the principles of the present invention, and not in limitation thereof. Hence, the scope of the invention is only restricted by the appended patent claims.

Claims

1. An apparatus (100) comprising:
an array of rolling shutter cameras (101) that are configured to capture overlapping or adjacent sub-images (102), which sub-images collectively form a full image; characterized by:
a synchronizing circuitry (103) configured to control the adjacent cameras (101) to capture images continuously taking into account the rolling shutter such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras.
2. The apparatus of claim 1, characterized in that the adjacent cameras (101) are oriented in a common scan direction (300).
3. The apparatus of claim 1 or 2, characterized in that the synchronizing circuitry (103) is configured to command each of the adjacent cameras (101) to start exposing an image frame in a cascade with such a delay in the scan direction that lines are scanned by the adjacent cameras (101) so that the full image is formed in a consistent progression.
4. The apparatus of any one of the preceding claims, characterized in that the synchronizing circuitry (103) comprises a field-programmable gate array that is communicatively connected with at least two of the cameras (101).
5. The apparatus of any one of claims 1 to 3, characterized in that the synchronizing circuitry (103) comprises a single field-programmable gate array that is communicatively connected with at least two of the cameras (101).
6. The apparatus of claim 5, characterized in that the single field-programmable gate array is communicatively connected with all of the cameras (101).
7. The apparatus of any one of the preceding claims, characterized in that the synchronizing circuitry (103) comprises a plurality of output ports configured to control respective trigger ports of each of the cameras (101).
8. The apparatus of any one of the preceding claims, characterized in that the apparatus further comprises a data bus communicatively connected with the synchronizing circuitry (103).
9. The apparatus of claim 8, characterized in that the synchronizing circuitry (103) is configured to receive image data from one or more of the cameras (101) at a time and to pass the received image data to the data bus without prior buffering on an external buffer circuitry.
10. A method, comprising:
capturing (301, 302, 303) overlapping or adjacent sub-images (102) with an array of rolling shutter cameras (101); and
forming (304) a full image of the captured overlapping or adjacent sub-images (102); characterized by:
controlling said adjacent cameras (101) of the array to capture images continuously taking into account the rolling shutter such that line scan is performed with a substantially constant delay between each of a plurality of lines scanned out from the adjacent cameras (101).
11. The method of claim 10, characterized by commanding each of the adjacent cameras (101) to start exposing an image frame in a cascade with such a delay in the scan direction that lines are scanned by the adjacent cameras (101) so that the full image is formed in a consistent progression.
12. The method of claim 10 or 11, characterized by using a synchronizing circuitry (103) for the controlling of at least two of the adjacent cameras (101).
13. The method of claim 12, characterized in that the synchronizing circuitry (103) comprises a single field-programmable gate array that is communicatively connected with at least two of the cameras (101).
14. The method of claim 13, characterized in that the single field-programmable gate array controls all of the cameras (101).
15. A computer program configured to cause, when executed by a computer, a method according to any one of claims 10 to 14.
PCT/FI2010/051103 2010-12-31 2010-12-31 Rolling shutter compensation in camera array WO2012089895A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2010/051103 WO2012089895A1 (en) 2010-12-31 2010-12-31 Rolling shutter compensation in camera array


Publications (1)

Publication Number Publication Date
WO2012089895A1 true WO2012089895A1 (en) 2012-07-05

Family

ID=46382359


Country Status (1)

Country Link
WO (1) WO2012089895A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60172888A (en) * 1984-02-17 1985-09-06 Matsushita Electric Ind Co Ltd Wide angle television camera
US6937270B1 (en) * 1999-05-03 2005-08-30 Omnivision Technologies, Inc. Analog video monitoring system using a plurality of phase locked CMOS image sensors
US20070030342A1 (en) * 2004-07-21 2007-02-08 Bennett Wilburn Apparatus and method for capturing a scene using staggered triggering of dense camera arrays
WO2007045714A1 (en) * 2005-10-21 2007-04-26 Nokia Corporation A method and a device for reducing motion distortion in digital imaging
US20090201361A1 (en) * 2008-02-08 2009-08-13 Google Inc. Panoramic Camera With Multiple Image Sensors Using Timed Shutters

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581531A (en) * 2012-07-20 2014-02-12 万里科技股份有限公司 Device for expanding and controlling signals of camera shutter release cable through isolated switches
EP2946336B1 (en) * 2013-01-15 2023-06-21 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
US10764517B2 (en) 2013-01-15 2020-09-01 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
EP2946336A2 (en) * 2013-01-15 2015-11-25 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
US10200638B2 (en) 2013-01-15 2019-02-05 Mobileye Vision Technologies Ltd. Stereo assist with rolling shutters
US10048472B2 (en) 2013-10-18 2018-08-14 Light Labs Inc. Methods and apparatus for implementing and/or using a camera device
US10205862B2 (en) 2013-10-18 2019-02-12 Light Labs Inc. Methods and apparatus relating to a camera including multiple optical chains
US9955082B2 (en) 2013-10-18 2018-04-24 Light Labs Inc. Methods and apparatus for capturing images using optical chains and/or for using captured images
US10274706B2 (en) 2013-10-18 2019-04-30 Light Labs Inc. Image capture control methods and apparatus
US10120159B2 (en) 2013-10-18 2018-11-06 Light Labs Inc. Methods and apparatus for supporting zoom operations
WO2015058157A1 (en) 2013-10-18 2015-04-23 The Lightco Inc. Image capture control methods and apparatus
US10009530B2 (en) 2013-10-18 2018-06-26 Light Labs Inc. Methods and apparatus for synchronized image capture using camera modules with different focal lengths
US10038860B2 (en) 2013-10-18 2018-07-31 Light Labs Inc. Methods and apparatus for controlling sensors to capture images in a synchronized manner
EP3058713A4 (en) * 2013-10-18 2017-11-15 The Lightco Inc. Image capture control methods and apparatus
US10931866B2 (en) 2014-01-05 2021-02-23 Light Labs Inc. Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture
WO2015150619A1 (en) * 2014-04-03 2015-10-08 Nokia Technologies Oy Apparatus, method and computer program for obtaining images
US9998638B2 (en) 2014-12-17 2018-06-12 Light Labs Inc. Methods and apparatus for implementing and using camera devices
US10091447B2 (en) 2015-04-17 2018-10-02 Light Labs Inc. Methods and apparatus for synchronizing readout of multiple image sensors
US9857584B2 (en) 2015-04-17 2018-01-02 Light Labs Inc. Camera device methods, apparatus and components
US9967535B2 (en) 2015-04-17 2018-05-08 Light Labs Inc. Methods and apparatus for reducing noise in images
US10129483B2 (en) 2015-06-23 2018-11-13 Light Labs Inc. Methods and apparatus for implementing zoom using one or more moveable camera modules
US10491806B2 (en) 2015-08-03 2019-11-26 Light Labs Inc. Camera device control related methods and apparatus
US10365480B2 (en) 2015-08-27 2019-07-30 Light Labs Inc. Methods and apparatus for implementing and/or using camera devices with one or more light redirection devices
US10051182B2 (en) 2015-10-05 2018-08-14 Light Labs Inc. Methods and apparatus for compensating for motion and/or changing light conditions during image capture
US10516834B2 (en) 2015-10-06 2019-12-24 Light Labs Inc. Methods and apparatus for facilitating selective blurring of one or more image portions
US10225445B2 (en) 2015-12-18 2019-03-05 Light Labs Inc. Methods and apparatus for providing a camera lens or viewing point indicator
US10003738B2 (en) 2015-12-18 2018-06-19 Light Labs Inc. Methods and apparatus for detecting and/or indicating a blocked sensor or camera module
US20170289482A1 (en) * 2016-03-30 2017-10-05 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus
US10044963B2 (en) * 2016-03-30 2018-08-07 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus
US10670858B2 (en) 2017-05-21 2020-06-02 Light Labs Inc. Methods and apparatus for maintaining and accurately determining the position of a moveable element

Similar Documents

Publication Publication Date Title
WO2012089895A1 (en) Rolling shutter compensation in camera array
US10237506B2 (en) Image adjustment apparatus and image sensor for synchronous image and asynchronous image
TWI523516B (en) Video wall
RU2570354C2 (en) Real-time image capture and display
US7525576B2 (en) Method and apparatus for panning and tilting a camera
US10645258B2 (en) Multi-camera system, method of controlling a multi-camera system, and camera
US9270907B2 (en) Radiation imaging apparatus, control method for radiation imaging apparatus, and storage medium
US20130235149A1 (en) Image capturing apparatus
US7619195B2 (en) Imaging device driver, imaging device driving method, and image signal processor
US20170046843A1 (en) Method, Apparatus and System for Detecting Location of Laser Point on Screen
JP7466229B2 (en) Pressed part inspection device and press part inspection method
JP5824278B2 (en) Image processing device
US7705910B2 (en) Photographic device for obtaining a plurality of images at a time by rolling shutter method
JP2008244649A (en) Motion detection imaging device
US9075482B2 (en) Optical touch display
JP2011019058A (en) Image processor for robot system and robot system including the same
US20060214087A1 (en) Imaging device and method, and imaging controlling apparatus and method
JP2013048333A (en) Image processor, image processing method and image processing system
US20230228691A1 (en) Smart synchronization method of a web inspection system
US10212405B2 (en) Control apparatus and method
US20060215050A1 (en) Driving controlling method for image sensing device, and imaging device
US10389962B2 (en) Image pickup apparatus and method utilizing the same line rate for upscaling and outputting image
JP2014175931A (en) Photographing system, imaging apparatus, and control method therefor
JP6602162B2 (en) Image processing apparatus, image processing method, and program
JP4320780B2 (en) Imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 10861460; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 10861460; Country of ref document: EP; Kind code of ref document: A1)