US20140055664A1 - Imaging device - Google Patents

Imaging device

Info

Publication number
US20140055664A1
Authority
US
United States
Prior art keywords
pixels
optical
capture device
light
image
Prior art date
Legal status
Abandoned
Application number
US14/112,799
Inventor
Michihiro Yamagata
Tsuguhiro Korenaga
Norihiro Imamura
Takamasa Ando
Current Assignee
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Corp
Priority date
Filing date
Publication date
Application filed by Panasonic Corp filed Critical Panasonic Corp
Assigned to PANASONIC CORPORATION reassignment PANASONIC CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ANDO, TAKAMASA, IMAMURA, NORIHIRO, KORENAGA, TSUGUHIRO, YAMAGATA, MICHIHIRO
Publication of US20140055664A1
Assigned to PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. reassignment PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PANASONIC CORPORATION
Priority to US15/361,779 (published as US10247866B2)
Assigned to PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. reassignment PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD. CORRECTIVE ASSIGNMENT TO CORRECT THE ERRONEOUSLY FILED APPLICATION NUMBERS 13/384239, 13/498734, 14/116681 AND 14/301144 PREVIOUSLY RECORDED ON REEL 034194 FRAME 0143. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: PANASONIC CORPORATION

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/30Polarising elements
    • G02B5/3025Polarisers, i.e. arrangements capable of producing a definite output polarisation state from an unpolarised input state
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00Filters or other obturators specially adapted for photographic purposes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/04Optical or mechanical part supplementary adjustable parts
    • G01J1/0407Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings
    • G01J1/0411Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings using focussing or collimating elements, i.e. lenses or mirrors; Aberration correction
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/04Optical or mechanical part supplementary adjustable parts
    • G01J1/0407Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings
    • G01J1/0429Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings using polarisation elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/42Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/4228Photometry, e.g. photographic exposure meter using electric radiation detectors arrangements with two or more detectors, e.g. for sensitivity compensation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J4/00Measuring polarisation of light
    • G01J4/04Polarimeters using electric detection means
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/22Telecentric objectives or lens systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/0006Arrays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • the present application relates to an image capture device such as a camera and more particularly relates to an image capture device which captures an image using polarized light.
  • When light is reflected from an object, the polarization property of the light changes. That is why the light reflected from the object has its polarization property determined by various pieces of information including the surface roughness, reflectance, and birefringence of the object and the orientation of the reflective surface. Thus, if light with a polarization property is separated, detected and converted into an electrical signal, those pieces of information can be obtained.
  • A shooting system of that type is generally used to separately image light reflected from the surface of a tissue and light reflected from inside the tissue, in cameras for medical and beauty treatment purposes such as endoscopes and skin checkers.
  • In one known technique, a polarizer is arranged in front of the lens of an image capturing camera and a shooting session is carried out with the polarizer rotated according to the shooting condition. According to such a method, however, shooting sessions need to be performed a number of times with the polarizer rotated, thus taking a long time to get an image.
  • Patent Document No. 1 discloses a device in which polarization filters are provided for some of the pixels of a solid-state image sensor. By making light with a polarization property incident on some pixels and by performing image processing on an image obtained from those pixels on which the light with a polarization property has been incident and an image obtained from the other pixels, an image from which the influence of reflection from the surface of the subject has been reduced can be obtained.
  • Patent Document No. 2 discloses an image capture device in which an optical system is formed by an array of two lenses and an image is shot through each of those lenses using light with a different polarization direction, thereby detecting whether a road surface is dry or wet.
  • a non-limiting exemplary embodiment of the present application provides an image capture device which can obtain a plurality of images based on multiple light beams in mutually different polarization states using a simplified or general configuration.
  • An image capture device includes: a lens optical system; an image sensor on which light that has passed through the lens optical system is incident and which includes at least a plurality of first pixels and a plurality of second pixels; and an array of optical elements which is arranged between the lens optical system and the image sensor and which includes a plurality of optical elements, each having a lens surface.
  • the lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis.
  • Each of the optical elements that form the array of optical elements makes the light that has passed through the first optical region incident on the plurality of first pixels and also makes the light that has passed through the second optical region incident on the plurality of second pixels.
  • An image capture device can capture multiple images based on light beams in mutually different polarization states while using a single lens optical system.
  • a biometric image capture device which may be used effectively to check the skin state, for example, is realized.
  • FIG. 1 Illustrates a configuration for a first embodiment of an image capture device according to the present invention.
  • FIG. 2 A front view of a split polarizer according to the first embodiment.
  • FIG. 3 A perspective view of an array of optical elements according to the first embodiment.
  • FIG. 4 A schematic enlarged cross-sectional view illustrating the array of optical elements and an image sensor according to the first embodiment and their surrounding.
  • FIG. 5 A front view of a split polarizer according to a second embodiment.
  • FIG. 6 A schematic perspective view illustrating an array of optical elements and an image sensor according to the second embodiment and their surrounding.
  • FIG. 7 Illustrates light rays to be incident on the image capturing plane according to the second embodiment.
  • FIG. 8 Illustrates a configuration according to a third embodiment.
  • FIG. 9 ( a ) is a front view of a split polarizer according to the third embodiment and ( b ) is a front view illustrating another example of the split polarizer.
  • FIG. 10 Illustrates a configuration according to a fourth embodiment.
  • FIGS. 11 ( a ) and ( b ) are respectively a cross-sectional view and a front view of a liquid crystal element according to the fourth embodiment.
  • FIG. 12 A perspective view of an array of optical elements according to a fifth embodiment.
  • FIGS. 13 ( a ) and ( b ) illustrate how light rays are incident on an image sensor according to the fifth embodiment.
  • FIG. 14 Illustrates a general configuration for a skin checker according to a sixth embodiment of the present invention.
  • FIG. 15 ( a ) is a side view of an optical element Sa to be arranged in the vicinity of the stop of a lens optical system according to the sixth embodiment, ( b - 1 ) is a front view of split color filters Sc, and ( b - 2 ) is a front view of a split polarizer Sp.
  • The present inventors examined the image capture devices disclosed in Patent Documents Nos. 1 and 2. As a result, we reached the following conclusion. Specifically, the device disclosed in Patent Document No. 1 needs to use a dedicated image sensor with a polarization filter. However, since such an image sensor is not commercially available as a standard product, it must be manufactured as a dedicated product. Particularly if such dedicated products are manufactured in small quantities, the manufacturing cost is high. On top of that, in that case, the arrangement of the polarizer cannot be changed appropriately according to the shooting situation.
  • On the other hand, in the device disclosed in Patent Document No. 2, the optical system is implemented as a lens array on the image sensor.
  • In such an arrangement, the effective diameter of each single optical system needs to be less than half the size of the image capturing area, which limits the degree of freedom of the optical design. That is why it is difficult to form an optical system of which the resolution is sufficiently high.
  • An aspect of the present invention can be outlined as follows.
  • An image capture device includes: a lens optical system; an image sensor on which light that has passed through the lens optical system is incident and which includes at least a plurality of first pixels and a plurality of second pixels; and an array of optical elements which is arranged between the lens optical system and the image sensor and which includes a plurality of optical elements, each having a lens surface.
  • the lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis.
  • Each of the optical elements that form the array of optical elements makes the light that has passed through the first optical region incident on the plurality of first pixels and also makes the light that has passed through the second optical region incident on the plurality of second pixels.
  • the image sensor may be a monochrome image sensor.
  • the lens optical system may be an image-space telecentric optical system.
  • the lens optical system may include a split polarizer having first and second polarizing portions which are located in the first and second regions, respectively.
  • the optical elements that form the array of optical elements may together form a lenticular lens.
  • a number of the first pixels and a number of the second pixels may be arranged in a first direction and the first pixels arranged in the first direction and the second pixels arranged in the first direction may alternate with each other in a second direction that intersects with the first direction at right angles, thus forming an image capturing plane.
  • the lens optical system may further have a third optical region which transmits mostly light vibrating in the direction of a third polarization axis and a fourth optical region which transmits mostly light vibrating in the direction of a fourth polarization axis.
  • the split polarizer may further have third and fourth polarizing portions which are located in the third and fourth regions, respectively.
  • the optical elements that form the array of optical elements may together form a micro lens array.
  • the image capture device may further include a polarization direction changing section which changes the direction of at least one of the first and second polarization axes of the first and second optical regions.
  • the lens optical system may include a split polarizer with at least three polarizing portions, two adjacent ones of which have polarization axes in mutually different directions.
  • the image capture device may further include a drive mechanism which drives the split polarizer so that any two adjacent ones of the at least three polarizing portions of the split polarizer are located in the first and second regions.
  • the split polarizer may include a common transparent electrode, two divided transparent electrodes which are located in the first and second optical regions, respectively, a liquid crystal layer which is interposed between the common transparent electrode and the two divided transparent electrodes, and a control section which applies mutually different voltages to the two divided transparent electrodes.
  • the image capture device may perform shooting sessions multiple times with the voltage changed.
  • the plurality of first pixels may include a number of 1A pixels with filters having a first spectral transmittance characteristic, a number of 2A pixels with filters having a second spectral transmittance characteristic, a number of 3A pixels with filters having a third spectral transmittance characteristic, and a number of 4A pixels with filters having a fourth spectral transmittance characteristic.
  • the plurality of second pixels may include a number of 1B pixels with filters having the first spectral transmittance characteristic, a number of 2B pixels with filters having the second spectral transmittance characteristic, a number of 3B pixels with filters having the third spectral transmittance characteristic, and a number of 4B pixels with filters having the fourth spectral transmittance characteristic.
  • the array of optical elements may include: a plurality of first optical elements which makes light that has passed through the first optical region incident on the 1A and 3A pixels and which also makes light that has passed through the second region incident on the 2B and 4B pixels; and a plurality of second optical elements which makes light that has passed through the first region incident on the 2A and 4A pixels and which also makes light that has passed through the second region incident on the 1B and 3B pixels.
  • the 1A, 2B, 3A and 4B pixels that form a single set may be adjacent to each other and may be arranged at the four vertices of a quadrangle.
  • the filters having the first spectral transmittance characteristic and the filters having the second spectral transmittance characteristic may transmit light falling within the wavelength range of the color green.
  • the filters having the third spectral transmittance characteristic may transmit light falling within the wavelength range of the color red.
  • the filters having the fourth spectral transmittance characteristic may transmit light falling within the wavelength range of the color blue.
  • 1A, 2B, 3A and 4B pixels that form a single set may be arranged in a Bayer arrangement pattern.
  • the plurality of first optical elements and the plurality of second optical elements may form a lenticular lens.
  • the lens optical system may further include a stop, and the first and second optical regions may be located in the vicinity of the stop.
  • An image capture device includes: a lens optical system; an image sensor which includes a plurality of first pixels with filters having a first spectral transmittance characteristic, a plurality of second pixels with filters having a second spectral transmittance characteristic, a plurality of third pixels with filters having a third spectral transmittance characteristic, and a plurality of fourth pixels with filters having a fourth spectral transmittance characteristic and in which a first row where the first and second pixels are arranged alternately in a first direction and a second row where the third and fourth pixels are arranged alternately in the first direction alternate in a second direction, thereby forming an image capturing plane, wherein light that has passed through the lens optical system is incident on the first, second, third and fourth pixels; and an array of optical elements which is arranged between the lens optical system and the image sensor.
  • the lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis.
  • the first and second optical regions are arranged in the second direction.
  • the array of optical elements includes a plurality of optical elements, each of which makes the light that has been transmitted through the lens optical system incident on every four pixels, which are the first, second, third and fourth pixels that are arranged adjacent to each other in the first and second directions.
  • the plurality of optical elements forms a number of columns, each of which is arranged linearly in the second direction. On two columns that are adjacent to each other in the first direction, each optical element on one of the two columns is shifted from an associated optical element on the other column in the second direction by a length corresponding to a half of one arrangement period of the optical elements.
  • a biometric image capturing system includes: an image capture device according to any of the embodiments described above; and a light source which irradiates an object with polarized light.
  • the lens optical system of the image capture device may include split color filters to be arranged in the first to fourth optical regions.
  • the split color filters may transmit light falling within the same wavelength range in two of those first through fourth optical regions.
  • the polarization axes of the split polarizer may be in mutually different directions.
  • the directions of the polarization axes of the split polarizer may intersect with each other at substantially right angles.
  • the biometric image capturing system may further include a control section which controls the light source and the image capture device.
  • the control section may control the image capture device so that the image capture device captures multiple images synchronously with flickering of the light source.
  • the image capture device may perform arithmetic processing between the multiple images to generate another image.
  • the biometric image capturing system may further include a display section which displays an image that has been shot by the image capture device.
  • the image capture device may further include a signal processing section.
  • the signal processing section may generate an image signal by inverting horizontally the image shot and may output the image signal to the display section.
  • FIG. 1 is a schematic representation illustrating a first embodiment of an image capture device according to the present invention.
  • the image capture device A of this embodiment includes a lens optical system L, of which the optical axis is identified by V 0 , an array of optical elements K which is arranged in the vicinity of the focal point of the lens optical system L, an image sensor N, and a signal processing section C.
  • the lens optical system L includes a stop S and an objective lens L 1 which images light that has passed through the stop S onto the image sensor.
  • the lens optical system L has a first optical region D 1 and a second optical region D 2 .
  • the first and second optical regions D 1 and D 2 are located in the vicinity of the stop S.
  • the region obtained by combining these first and second optical regions D 1 and D 2 together has a circular shape corresponding to the aperture of the stop S on a cross section which intersects with the optical axis V 0 at right angles.
  • the boundary between the first and second optical regions D 1 and D 2 includes the optical axis V 0 and is located on a plane which is parallel to the horizontal direction.
  • the first optical region D 1 of the lens optical system L is configured to transmit mostly light vibrating in the direction of a first polarization axis
  • the second optical region D 2 is configured to transmit mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis.
  • the lens optical system L includes a split polarizer Sp which is located in the first and second optical regions D 1 and D 2 .
  • FIG. 2 is a front view of the split polarizer Sp.
  • the split polarizer Sp divides the aperture of the stop S into two regions by a line that includes the optical axis V 0 of the lens optical system L and that is parallel to the horizontal direction of the image capture device, and includes a first polarizing portion Sp 1 located in the first optical region D 1 and a second polarizing portion Sp 2 located in the second optical region.
  • the first and second polarizing portions Sp 1 and Sp 2 are implemented as respective polarizers.
  • a so-called “PVA iodine stretched film” which is made by dyeing polyvinyl alcohol with iodine and stretching it into a film may be used.
  • the first and second polarizing sections Sp 1 and Sp 2 have first and second polarization axes, respectively, and the directions of the first and second polarization axes are different from each other.
  • the direction of the first polarization axis may be the vertical direction of the image capture device
  • the direction of the second polarization axis may be the horizontal direction of the image capture device.
  • the light beam B 1 enters the first polarizing portion Sp 1 of the split polarizer Sp and the light beam B 2 enters the second polarizing portion Sp 2 of the split polarizer Sp.
  • If the light entering the stop S includes linearly polarized light rays which are polarized in arbitrary directions, then only linearly polarized light rays vibrating in the direction of the first polarization axis are transmitted through the first polarizing portion Sp 1 and only linearly polarized light rays vibrating in the direction of the second polarization axis are transmitted through the second polarizing portion Sp 2.
  • the light beams B 1 and B 2 are converged by the objective lens L 1 and incident on the array of optical elements K.
  • FIG. 3 is a perspective view of the array of optical elements K, which includes a plurality of optical elements M, each having a lens face.
  • the lens face of each optical element M is a cylindrical face.
  • the optical elements M are arranged vertically so that their cylindrical faces run in the horizontal direction. In this manner, these optical elements M form a lenticular lens.
  • FIG. 4 is an enlarged view of the array of optical elements K and image sensor N shown in FIG. 1 .
  • the array of optical elements K, implemented as a lenticular lens, is arranged so that its side with the optical elements M faces the image sensor N.
  • the array of optical elements K is arranged in the vicinity of the focal point of the lens optical system L and is located at a predetermined distance from the image sensor N.
  • the image sensor N includes a plurality of first pixels P 1 and a plurality of second pixels P 2 , which are arranged on the image capturing plane Ni.
  • a number of those first pixels P 1 are arranged in the horizontal direction (i.e., the first direction), as are a number of those second pixels P 2.
  • those first pixels P 1 and second pixels P 2 are arranged alternately in the vertical direction (i.e., the second direction).
  • each and every one of the first pixels P 1 and second pixels P 2 has the same shape on the image capturing plane Ni.
  • each of the first pixels P 1 and second pixels P 2 may have the same rectangular shape and may have the same area, too.
  • the image sensor N may include a plurality of micro lenses Ms, which are arranged on the image capturing plane Ni so as to cover the surface of the respective pixels.
  • the position at which the array of optical elements K is arranged may be determined by reference to the focal point of the objective lens L 1 , for example.
  • One period in the vertical direction of the cylindrical faces of the array of optical elements K corresponds to two of the pixels arranged on the image capturing plane Ni.
  • the boundary between two adjacent cylindrical faces of the array of optical elements K is level in the horizontal direction with the boundary between two adjacent micro lenses Ms of the image sensor N. That is to say, the array of optical elements K and the image sensor N are arranged so that each single optical element M of the array of optical elements K corresponds to two rows of pixels on the image capturing plane Ni.
  • Each optical element M has the function of selectively determining the outgoing direction of an incoming light ray according to its angle of incidence.
  • Each optical element M makes most of the light beam B 1 that has been transmitted through the first optical region D 1 incident onto the first pixels P 1 on the image capturing plane Ni and also makes most of the light beam B 2 that has been transmitted through the second optical region D 2 incident onto the second pixels P 2 on the image capturing plane Ni. This can be done by adjusting the refractive index of the lenticular lens used as the array of optical elements K, the radius of curvature of the optical elements M, and the distance from the image capturing plane Ni.
  • the image sensor N photoelectrically converts the incident light and transmits an image signal Q 0 to the signal processing section C. Based on the image signal Q 0 , the signal processing section C generates image signals Q 1 and Q 2 corresponding to the first pixels P 1 and the second pixels P 2 , respectively.
  • the image signal Q 1 represents an image that has been produced by the light beam transmitted through the first optical region D 1
  • the second image signal Q 2 represents an image that has been produced by the light beam transmitted through the second optical region D 2 . Since the first and second optical regions D 1 and D 2 transmit light beams vibrating in the directions of the first and second polarization axes, respectively, two images represented by two linearly polarized light components with mutually different polarization directions can be obtained.
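  • As an illustration of the processing described above, the following is a minimal sketch of how the signal processing section C could split the raw image signal Q 0 into the two polarization images Q 1 and Q 2. It assumes, as suggested by FIG. 4, that rows of first pixels P 1 and rows of second pixels P 2 simply alternate vertically; which row parity corresponds to P 1 is an assumption, not something stated in the text.

```python
import numpy as np

# Minimal sketch (an assumed implementation, not the patent's actual
# signal processing section C): de-interleaving the raw frame Q0 into
# the two polarization images Q1 and Q2 of the first embodiment.

def split_polarization_images(q0: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """q0: (H, W) raw frame with alternating P1/P2 rows.
    Returns (Q1, Q2), each of height H//2."""
    q1 = q0[0::2, :]   # rows assumed to receive light from region D1
    q2 = q0[1::2, :]   # rows assumed to receive light from region D2
    return q1, q2

# Example: q1, q2 = split_polarization_images(raw_frame)
```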
  • For example, an image that clearly represents a scene under the water can be obtained with the reflection from the surface of the water suppressed, or an image that clearly represents even a center line on a wet road surface can be obtained.
  • images including those pieces of information about the surface roughness, reflectance and birefringence of an object and the orientation of the reflective surface can be obtained.
  • the image capture device of this embodiment can obtain two images represented by light beams with mutually different polarization properties at a time while using a general purpose image sensor. Since such images can be obtained with the split polarizer which is arranged in the vicinity of the stop, a practical resolution can be maintained without increasing the size of the image capture device too much.
  • the lens optical system L of this embodiment may be an image-space telecentric optical system.
  • In that case, principal rays of light beams entering at different angles of view can be incident on the array of optical elements at an angle of incidence of nearly zero degrees, so crosstalk (i.e., incidence of light rays that should be incident on the first pixels P 1 onto the second pixels P 2, or incidence of light rays that should be incident on the second pixels P 2 onto the first pixels P 1) can be reduced.
  • the stop S is a region through which a bundle of rays with every angle of view passes. That is why by inserting a plane, of which the optical property controls the polarization property, to the vicinity of the stop S, the polarization property of a bundle of rays with any angle of view can be controlled in the same way.
  • the split polarizer Sp may be arranged in the vicinity of the stop S. By arranging the split polarizer Sp in the optical regions D 1 and D 2 that are located in the vicinity of the stop, a polarization property corresponding to the number of the divided regions can be given to the bundle of rays.
  • the split polarizer Sp is arranged at such a position that the light that has passed through the stop S can enter the split polarizer Sp directly (i.e., without passing through any other optical member).
  • the split polarizer Sp may be arranged closer to the object than the stop S is. In that case, the light that has passed through the split polarizer Sp may enter the stop S directly (i.e., without passing through any other optical member).
  • the angle of incidence of a light ray at the focal point of the optical system is determined unequivocally by the position of the light ray that passed through the stop S.
  • the array of optical elements K has the function of selectively determining the outgoing direction of an incoming light ray according to its angle of incidence. That is why the bundle of rays can be distributed onto pixels on the image capturing plane Ni so as to correspond to the optical regions D 1 and D 2 which are divided in the vicinity of the stop S.
  • the angle of incidence of a light ray at the focal point of the optical system is determined unequivocally by the position of the light ray that passed through the stop S and the angle of view.
  • In the image capture device of this second embodiment, the lens optical system has first through fourth optical regions and micro lenses are arranged as the array of optical elements, unlike the image capture device of the first embodiment.
  • the following description of this embodiment will be focused on these differences from the first embodiment.
  • the lens optical system L has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis, a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis, a third optical region which transmits mostly light vibrating in the direction of a third polarization axis, and a fourth optical region which transmits mostly light vibrating in the direction of a fourth polarization axis.
  • FIG. 5 illustrates an example of a split polarizer Sp to be arranged in these four optical regions.
  • the split polarizer Sp shown in FIG. 5 is viewed from the object side.
  • the split polarizer Sp has first, second, third and fourth polarizing portions Sp 1, Sp 2, Sp 3 and Sp 4 which are located in the first, second, third and fourth optical regions D 1, D 2, D 3 and D 4, respectively.
  • the boundary between the first and second optical regions D 1 and D 2 and the boundary between the third and fourth optical regions D 3 and D 4 are located on a plane which includes the optical axis V 0 of the lens optical system L and which is parallel to the horizontal direction of the image capture device.
  • the boundary between the first and fourth optical regions D 1 and D 4 and the boundary between the second and fourth optical regions D 2 and D 4 are located on a plane which includes the optical axis V 0 of the lens optical system L and which is parallel to the vertical direction of the image capture device.
  • the direction of the third polarization axis may be either different from the directions of the first and second polarization axes or the same as the direction of the first or second polarization axis.
  • the direction of the fourth polarization axis may be either different from, or the same as, the directions of the first and second polarization axes. That is to say, any two of the first, second, third and fourth polarizing portions Sp 1 , Sp 2 , Sp 3 and Sp 4 need to have mutually different polarization directions.
  • the directions of the first through fourth polarization axes may be three directions that define angles of 45, 90 and 135 degrees with respect to one direction, for example.
  • FIG. 6 is a partially cutaway perspective view of the array of optical elements K and image sensor N.
  • the optical elements M of the array of optical elements K are micro lenses, and the lens surface is a spherical one.
  • the optical elements M are arranged periodically in vertical and horizontal directions, thereby forming a micro lens array.
  • the image sensor N is arranged so as to face the array of optical elements K.
  • Each of the pixels on the image capturing plane Ni of the image sensor N is provided with a micro lens Ms.
  • One period of the optical elements M of the array of optical elements K is set to be twice as long as one period of the micro lenses Ms of the image sensor N both horizontally and vertically. That is why a single optical element M of the array of micro lenses that form the array of optical elements K is associated with four pixels on the image capturing plane Ni.
  • FIG. 7 illustrates relations between pixels arranged on the image capturing plane of the image sensor N and light rays that have been transmitted through the four optical regions of the lens optical system L.
  • the image sensor N includes a plurality of first pixels P 1 , a plurality of second pixels P 2 , a plurality of third pixels P 3 and a plurality of fourth pixels P 4 , all of which are arranged on the image capturing plane Ni.
  • the second and third pixels P 2 and P 3 are arranged horizontally alternately
  • the first and fourth pixels P 1 and P 4 are arranged horizontally alternately.
  • the rows on which the second and third pixels P 2 and P 3 are arranged and the rows on which the first and fourth pixels P 1 and P 4 are arranged alternate so that the first and second pixels P 1 and P 2 are vertically adjacent to each other.
  • the first, second, third and fourth pixels P 1, P 2, P 3 and P 4 are arranged so as to be adjacent to each other in the row and column directions, and each set of those four pixels corresponds to a single optical element M of the micro lens array.
  • a light ray that has been transmitted through the first polarizing portion Sp 1 in the first optical region D 1 is converged by the lens optical system L and then incident on the first pixel P 1 through an optical element M of the array of optical elements K.
  • light rays that have been transmitted through the second, third and fourth polarizing portions Sp 2 , Sp 3 and Sp 4 in the second, third and fourth optical regions D 2 , D 3 and D 4 are incident on the second, third and fourth pixels P 2 , P 3 and P 4 , respectively. That is to say, light rays that have been transmitted through each optical region are incident on the same kind of pixels which are located every other row in the horizontal direction and every other column in the vertical direction on the image capturing plane Ni.
  • the image sensor N photoelectrically converts the incident light on a pixel-by-pixel basis and outputs the signals thus obtained to the signal processing section C, which processes the signals obtained from the first, second, third and fourth pixels P 1, P 2, P 3 and P 4 for each kind of pixel, thereby generating image signals. Specifically, by processing the signals obtained from a number of first pixels P 1, the signal processing section C generates an image signal Q 1. In the same way, by processing signals obtained from a number of second pixels P 2, signals obtained from a number of third pixels P 3, and signals obtained from a number of fourth pixels P 4, the signal processing section C generates image signals Q 2, Q 3 and Q 4, respectively.
  • the image signals Q 1 , Q 2 , Q 3 and Q 4 thus obtained represent Images #1, #2, #3 and #4 of the same scene which have been shot at the same time through a single lens system.
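  • As an illustration, the following is a minimal sketch of how the four image signals could be extracted from the raw frame, under the assumption (suggested by FIG. 7 but not stated in detail here) that each optical element M covers a 2x2 group of pixels with P 2 and P 3 on one row and P 1 and P 4 on the next; the exact positions assigned to each kind of pixel are a guess.

```python
import numpy as np

# Minimal sketch (an assumption about the pixel layout, not the patent's
# stated implementation): extracting the four image signals Q1..Q4 of the
# second embodiment from the raw frame captured by image sensor N.

def split_four_images(q0: np.ndarray):
    """q0: (H, W) raw frame. Returns (Q1, Q2, Q3, Q4), each (H//2, W//2)."""
    q2 = q0[0::2, 0::2]   # assumed P2 position within each 2x2 group
    q3 = q0[0::2, 1::2]   # assumed P3 position
    q1 = q0[1::2, 0::2]   # assumed P1 position
    q4 = q0[1::2, 1::2]   # assumed P4 position
    return q1, q2, q3, q4

# Example: q1, q2, q3, q4 = split_four_images(raw_frame)
```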
  • these Images #1, #2, #3 and #4 have been generated based on light beams in mutually different polarization states. That is why these Images #1, #2, #3 and #4 include various pieces of information about the surface roughness, reflectance and birefringence of an object and the orientation of the reflective surface due to their difference in polarization property.
  • images in four different polarization states can be shot by performing a shooting session only once.
  • FIG. 8 is a schematic representation illustrating an image capture device according to this third embodiment.
  • the image capture device of this embodiment further includes a polarization direction changing section which changes the direction of at least one of the first and second polarization axes of the first and second optical regions, which is a major difference from the image capture device of the first embodiment.
  • the split polarizer Sp is a switching type split polarizer Sp which can change the directions of the first and second polarization axes of the first and second optical regions, and includes, as the polarization direction changing section, a drive mechanism U to change the direction of the polarization axes and a control section V which controls the operation of the drive mechanism U.
  • the switching type split polarizer Sp of this embodiment has at least three polarizing portions, two adjacent ones of which have polarization axes in mutually different directions.
  • FIG. 9( a ) illustrates an example of such a split polarizer Sp.
  • the split polarizer Sp shown in FIG. 9 has first through eighth polarizing portions Sp 1 through Sp 8 , which all have a fan shape and which are arranged around the center of rotation S 0 .
  • the polarization axes of the first through eighth polarizing portions Sp 1 through Sp 8 are defined so that, at least between two adjacent polarizing portions, the polarization axes are different from each other across the boundary between them, for example.
  • the drive mechanism U rotates the split polarizer Sp about the center of rotation S 0 and stops rotating the split polarizer Sp at a position where the boundary between adjacent polarizing portions overlaps with the optical axis V 0 of the lens optical system L.
  • two polarizing portions with mutually different polarization axis directions can be arranged in the first and second optical regions D 1 and D 2 .
  • Since the polarizing portions to be arranged in the first and second optical regions D 1 and D 2 can be selected from the first through eighth polarizing portions Sp 1 through Sp 8, the polarization axis directions in the first and second optical regions D 1 and D 2 can be selected from a predetermined set of combinations.
  • the polarization axis directions in the first and second optical regions D 1 and D 2 can be switched.
  • polarized images can be shot adaptively to an even broader range of shooting environments.
  • the switching type split polarizer does not have to have the configuration shown in FIG. 9( a ) but may also be modified in various manners.
  • first through seventh polarizing portions Sp 1 through Sp 7 may be arranged linearly as shown in FIG. 9( b ) and the drive mechanism U may move the polarizing portions in the direction in which they are arranged.
  • Although the drive mechanism U and the control section V can change the polarization axis directions of the first and second optical regions D 1 and D 2 according to this embodiment, one of these two polarization axis directions does not have to be changed.
  • That is to say, the polarizing portion arranged in one of the first and second optical regions D 1 and D 2 may be fixed, and only the polarizing portion arranged in the other region may be switched by the drive mechanism.
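  • As an illustration of how the rotation position selects the pair of polarization axes located in the regions D 1 and D 2, the following is a minimal sketch; the axis angles assigned to Sp 1 through Sp 8 are hypothetical examples, since the embodiment only requires that adjacent portions have mutually different axes.

```python
# Minimal sketch (not from the patent text): selecting which pair of
# polarization-axis directions ends up in regions D1/D2 when the
# fan-shaped split polarizer of FIG. 9(a) is rotated so that one
# boundary lies on the optical axis. The axis angles below are
# hypothetical placeholders.

AXIS_DEG = [0, 45, 90, 135, 0, 45, 90, 135]  # hypothetical axes of Sp1..Sp8

def regions_for_boundary(boundary_index: int) -> tuple[int, int]:
    """Return the (D1, D2) polarization-axis directions (in degrees)
    obtained by stopping the wheel with the given boundary on the
    optical axis. Boundary k lies between portion k and portion k+1."""
    d1 = AXIS_DEG[boundary_index % len(AXIS_DEG)]
    d2 = AXIS_DEG[(boundary_index + 1) % len(AXIS_DEG)]
    return d1, d2

if __name__ == "__main__":
    for k in range(len(AXIS_DEG)):
        d1, d2 = regions_for_boundary(k)
        print(f"boundary {k}: D1 axis = {d1} deg, D2 axis = {d2} deg")
```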
  • a fourth embodiment of an image capture device will be described.
  • the image capture device of this embodiment can also change the directions of the first and second polarization axes of the first and second optical regions, which is a major difference from the image capture device of the first embodiment.
  • In this embodiment, the split polarizer includes a liquid crystal element and a control section which functions as the polarization direction changing section.
  • FIG. 10 illustrates a configuration for an image capture device according to this embodiment.
  • the image capture device shown in FIG. 10 includes a liquid crystal element W and a control section V as a split polarizer.
  • FIG. 11( a ) is a cross-sectional view generally illustrating the structure of the liquid crystal element W and FIG. 11( b ) is a front view thereof.
  • the liquid crystal element W includes a common transparent electrode EC, a liquid crystal layer LC, divided transparent electrodes ED 1 , ED 2 and a polarizing plate PL.
  • the common transparent electrode EC is arranged on a glass substrate H 1 with an alignment film T 1 , thus forming a substrate SB 1 .
  • the divided transparent electrodes ED 1 and ED 2 are arranged on a glass substrate H 2 with an alignment film T 2, thus forming a substrate SB 2.
  • On the other side of the substrate SB 2, opposite from the divided transparent electrodes ED 1 and ED 2, arranged is the polarizing plate PL, which has a polarization axis and which transmits light vibrating in the direction of the polarization axis.
  • the alignment direction of the alignment film T 2 agrees with the polarization axis of the polarizing plate PL.
  • the liquid crystal layer LC is interposed between the two substrates SB 1 and SB 2 that are bonded together with a seal member J.
  • the divided transparent electrodes ED 1 and ED 2 are arranged so that their boundary agrees with the horizontal direction that passes through the optical axis V 0 of the lens optical system L. In this manner, the divided transparent electrodes ED 1 and ED 2 are arranged in the first and second optical regions D 1 and D 2 , respectively.
  • the control section V applies a voltage between the common transparent electrode EC and the divided transparent electrodes ED 1, ED 2.
  • the liquid crystal layer LC has an optical rotatory characteristic and comes to have an angle of optical rotation according to the voltage applied between the common transparent electrode EC and the divided transparent electrodes ED 1 , ED 2 .
  • the liquid crystal layer LC may have an angle of optical rotation of 90 or 180 degrees according to the voltage applied. That is why if the voltages applied to the divided transparent electrodes ED 1 and ED 2 are different, then the liquid crystal layer LC interposed between the common transparent electrode EC and the divided transparent electrode ED 1 and the liquid crystal layer LC interposed between the common transparent electrode EC and the divided transparent electrode ED 2 come to have different angles of rotation of light.
  • the light beam that has entered this liquid crystal element W has its polarization direction rotated due to the optical rotatory characteristic of the liquid crystal layer LC and then is incident on the polarizing plate PL.
  • the angle of optical rotation (i.e., the angle of rotation of the polarization axis) is determined by the voltage applied to the liquid crystal layer LC.
  • the polarizing plate PL transmits only the component of the light transmitted through the liquid crystal layer LC that is linearly polarized parallel to the polarization axis of the polarizing plate PL.
  • Therefore, the polarization directions of the light rays going out of the liquid crystal element W and incident on the lens optical system L are different between the divided transparent electrodes ED 1 and ED 2, i.e., between the first and second optical regions D 1 and D 2. That is to say, by making the linearly polarized light rays transmitted through the divided transparent electrodes ED 1 and ED 2 have mutually different polarization axis directions and by regulating the voltage applied, their polarization directions can be changed substantially.
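  • As an idealized illustration (not taken from the patent text), the following sketch models the transmitted intensity through one region of the liquid crystal element: the liquid crystal layer rotates the incoming polarization by an angle that depends on the applied voltage, and the polarizing plate PL then passes only the component parallel to its axis (Malus's law). The voltage-to-rotation mapping used below is a hypothetical placeholder, not the device's actual response.

```python
import math

# Minimal idealized sketch: intensity after the liquid crystal layer LC
# plus the polarizing plate PL in one of the optical regions D1/D2.

def rotation_deg(voltage: float) -> float:
    """Hypothetical voltage-to-rotation mapping: 0 V -> 90 deg, 5 V -> 0 deg."""
    return 90.0 * max(0.0, 1.0 - voltage / 5.0)

def transmitted_intensity(i0: float, pol_in_deg: float,
                          voltage: float, pl_axis_deg: float) -> float:
    """Malus's law applied to the rotated polarization direction."""
    theta = (pol_in_deg + rotation_deg(voltage)) - pl_axis_deg
    return i0 * math.cos(math.radians(theta)) ** 2

if __name__ == "__main__":
    # Example: vertically polarized input, PL axis horizontal.
    for v in (0.0, 2.5, 5.0):
        print(v, transmitted_intensity(1.0, 90.0, v, 0.0))
```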
  • two images represented by light beams in mutually different polarization states can be obtained at a time.
  • the angle of optical rotation of the liquid crystal layer LC can be adjusted by regulating the applied voltage, the polarization condition of the light representing the image can be changed according to the shooting environment. Consequently, the image capture device of this embodiment can cope with an even broader range of shooting environments.
  • the switching operation can be done at high speed. Therefore, it is possible to perform a shooting session under a predetermined polarization condition and then perform another shooting session a short time later under a different polarization condition. For example, if an organism needs to be shot in three or more different polarization states, the number of images obtained can be twice as large as the number of shooting sessions by performing the shooting sessions a number of times at short intervals.
  • Although the liquid crystal element includes two divided transparent electrodes in the embodiment described above, the liquid crystal element may also include four divided transparent electrodes to be arranged in four optical regions as in the second embodiment described above. In that case, micro lenses may be used as the array of optical elements as in the second embodiment described above. Then, four images represented by light beams in four different polarization states can be obtained.
  • In the image capture device of this fifth embodiment, the image sensor is a color image sensor in which color filters are arranged over the pixels in a Bayer arrangement, and the array of optical elements K is a lenticular lens with a different shape from its counterpart of the first embodiment; these are major differences from the image capture device of the first embodiment.
  • the following description of this fifth embodiment will be focused on these differences from the first embodiment.
  • In a Bayer arrangement, red and blue pixels are present only on either odd- or even-numbered columns and on either odd- or even-numbered rows. That is why if the array of optical elements K has the same structure as its counterpart of the first embodiment (i.e., is implemented as a lenticular lens), information about the color blue is missing from one of the two images represented by light beams that have been transmitted through the first and second optical regions D 1 and D 2 and information about the color red is missing from the other image.
  • FIG. 12 is a perspective view illustrating the array of optical elements K according to this embodiment as viewed from the image side.
  • This array of optical elements K includes a plurality of optical elements M 1 and M 2 .
  • a number of cylindrical lenses, each running horizontally (i.e., in the first direction), are arranged vertically (i.e., in the second direction) linearly.
  • Each set of those optical elements M 1 , M 2 forms a column running vertically, and columns of the optical elements M 1 and columns of the optical elements M 2 are alternately arranged horizontally.
  • each optical element on one column is vertically shifted from an associated optical element on the other column by a length corresponding to a half of one vertical arrangement period.
  • Each of these optical elements M 1 and M 2 is associated with four pixels with red, blue and green filters, which are arranged in a Bayer arrangement pattern to form the image capturing plane of the image sensor, and makes the light that has been transmitted through the lens optical system L incident on four pixels that face the optical element. That is to say, the cylindrical surface that is the lens surface of each optical element M 1 , M 2 has one period corresponding to two pixels of the image sensor N both vertically and horizontally. That is why in two horizontally adjacent columns of optical elements M 1 and M 2 , each optical element on one column is vertically shifted by one pixel from an associated optical element on the other column.
  • As in the first embodiment, thanks to the action of the lenticular lens consisting of these optical elements M 1 and M 2, light rays that have been transmitted through the first and second optical regions D 1 and D 2 are incident on mutually different pixels.
  • An optical element on one column of optical elements M 1 is vertically shifted by a half period from an associated optical element on an adjacent column of optical elements M 2 . That is why the light rays coming from the first and second optical regions D 1 and D 2 are incident on the pixels of the image sensor with odd- and even-numbered rows changed every two pixels.
  • the optical elements M 1 lead the light rays coming from the first optical region D 1 onto green (G1) pixels P 1 A and red (R) pixels P 3 A and also lead the light rays coming from the second optical region D 2 onto green (G2) pixels P 2 B and blue (B) pixels P 4 B.
  • the optical elements M 2 lead the light rays coming from the first optical region D 1 onto green (G2) pixels P 2 A and blue (B) pixels P 4 A and also lead the light rays coming from the second optical region D 2 onto green (G1) pixels P 1 B and red (R) pixels P 3 B.
  • signals of two missing ones of the four pixels associated with each column of optical elements M 1 may be interpolated with signals of two pixels associated with an adjacent column of optical elements M 2 .
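  • As an illustration of this interpolation, the following is a minimal sketch of reconstructing the image produced by the light from the first optical region D 1. The 2x2 block layout, the assignment of M 1 and M 2 columns to even and odd block columns, and the neglect of the half-period vertical shift between adjacent columns are all simplifying assumptions, not details taken from the patent.

```python
import numpy as np

# Minimal sketch (an assumption about the data layout, not the patent's
# actual signal processing): in the fifth embodiment, under "M1" columns
# the D1 light lands on the G1 and R pixels, while under adjacent "M2"
# columns it lands on the G2 and B pixels. Missing colour samples in each
# 2x2 block are borrowed from the horizontally adjacent block.

def reconstruct_d1(raw: np.ndarray) -> np.ndarray:
    """raw: (H, W) Bayer mosaic, H and W multiples of 2, assumed 2x2 blocks
    [[G1, R], [B, G2]]. Returns an (H//2, W//2, 3) RGB estimate for D1."""
    g1 = raw[0::2, 0::2].astype(float)
    r  = raw[0::2, 1::2].astype(float)
    b  = raw[1::2, 0::2].astype(float)
    g2 = raw[1::2, 1::2].astype(float)

    h, w = g1.shape
    out = np.zeros((h, w, 3))
    for x in range(w):
        nb = min(x + 1, w - 1) if x % 2 == 0 else x - 1  # adjacent block column
        if x % 2 == 0:   # hypothetical M1 column: G1 and R carry D1 light
            out[:, x, 0] = r[:, x]
            out[:, x, 1] = g1[:, x]
            out[:, x, 2] = b[:, nb]      # blue borrowed from the M2 neighbour
        else:            # hypothetical M2 column: G2 and B carry D1 light
            out[:, x, 0] = r[:, nb]      # red borrowed from the M1 neighbour
            out[:, x, 1] = g2[:, x]
            out[:, x, 2] = b[:, x]
    return out

# Example: rgb_d1 = reconstruct_d1(np.random.randint(0, 255, (8, 8)))
```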
  • pixels with green filters may be vertically adjacent to each other in each set of four pixels.
  • Although the image sensor is supposed to include pixels with red, blue and green filters in the embodiment described above, the image sensor may also include pixels with filters in the complementary colors of these colors. For example, in the image sensor, red, blue, green and white filters, red, blue, green and yellow filters, or any other appropriate combination of filters may be provided for each set of four pixels.
  • FIG. 14 illustrates a general configuration for a skin checker as a sixth embodiment of the present invention.
  • the skin checker of this embodiment includes a light source Ls to illuminate an object Ob, an image capture device A, a display section Y to display an image shot, and a control section which controls all of these.
  • the lens optical system includes an optical element Sa including a split polarizer and split color filters, which is a difference from the image capture device of the second embodiment.
  • FIG. 15 illustrates a configuration for the optical element Sa to be arranged in the vicinity of the stop of the lens optical system.
  • FIG. 15( a ) is a side view.
  • the optical element Sa includes the split polarizer Sp that has already been described for the second embodiment and split color filters Sc adjacent to the split polarizer Sp.
  • FIG. 15( b - 1 ) is a front view of the split color filters Sc.
  • There are four optical regions D 1 to D 4 which are arranged around the optical axis V 0 .
  • a polarizer is arranged at the light source Ls to make the light source Ls emit mainly light with a predetermined polarization direction.
  • Light reflected from the surface of the skin largely retains the polarization of the illumination, whereas light scattered back from inside the skin is depolarized. The skin observation may be carried out with the polarization directions of the illumination source and the shooting optical system matched to each other by taking advantage of such a characteristic.
  • the shooting session may be carried out with the polarization directions of the illumination source and the shooting optical system left different from each other.
  • the optical region D 3 of the split polarizer Sp corresponding to the optical region D 3 of the split color filters Sc which mainly transmits light falling within the color blue wavelength range has such a polarization property as to mainly transmit polarized light, of which the polarization axis is substantially parallel to that of polarized light mainly emitted from the light source Ls.
  • the optical region D 4 of the split polarizer Sp corresponding to the optical region D 4 which mainly transmits light falling within the color blue wavelength range has such a polarization property as to mainly transmit polarized light, of which the polarization axis intersects at substantially right angles with that of polarized light mainly emitted from the light source Ls.
  • the optical regions D 1 and D 2 of the split polarizer Sp corresponding to the optical regions D 1 and D 2 of the split color filters Sc that mainly transmit light falling within the colors green and red wavelength ranges, respectively, have such a polarization property as to mainly transmit polarized light, of which the polarization axis defines a tilt angle of 45 degrees with respect to the polarization axis of the polarized light mainly emitted from the light source Ls.
  • the polarization axis of the optical regions D 1 and D 2 does not have to define a tilt angle of 45 degrees with respect to the polarization axis of the light emitted from the light source Ls. Rather the direction of the polarization axis of the optical regions D 1 and D 2 may be adjusted appropriately depending on how the region of interest is irradiated with the light. For example, in a general shooting situation, there should be not only the light source of the skin checker but also other kinds of environmental light such as a room light and sunlight. That is why if the direct reflection of such environmental light is reduced by appropriately adjusting the polarization axis direction of the shooting optical system, the shooting condition can be further improved.
  • the image that has been shot by the image capture device A is subjected to appropriate image processing by the signal processing section C and then presented on the display section Y.
  • when the signal processing section C performs horizontal inversion processing on the image, a mirror image of the object Ob is presented on the display section Y with its right and left portions inverted, so the display section can function as a mirror.
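  • The following is a minimal sketch, not part of the patent, of what this horizontal inversion amounts to in software; the array shapes and the use of NumPy are assumptions.

```python
import numpy as np

def mirror_for_display(image: np.ndarray) -> np.ndarray:
    """Horizontally invert a captured frame (H x W or H x W x 3 array) so that
    the display section Y shows the object Ob the way a mirror would."""
    return image[:, ::-1]  # reverse the left-right (column) order

# Hypothetical usage: 'captured' stands in for a frame from the image capture device A.
captured = np.zeros((480, 640, 3), dtype=np.uint8)
mirrored = mirror_for_display(captured)
```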
  • the skin checker of this embodiment can be used as a sort of electronic mirror that can display spots, wrinkles and so on effectively.
  • the skin checker suitably performs such a display operation just like a mirror does, because the user can then recognize his or her spots or wrinkles intuitively while doing makeup or skincare.
  • a skin checker which allows the user to observe his or her own skin spots, wrinkles and texture efficiently and which can obtain a color skin image at the same time is realized.
  • the image capture device of this embodiment can appropriately adjust the directions of the polarization axes of the respective optical regions D 1 through D 4 of the split polarizer Sp, and therefore, can arrange the light source Ls appropriately with the influence of environmental light taken into account and set the polarization directions of the split polarizer Sp according to the position of the light source before actually starting to carry out shooting. As a result, an image can be shot with the influence of environmental light further reduced.
  • the influence of the environmental light can be reduced by carrying out shooting synchronously with flickering of the light source Ls.
  • the difference between an image captured with the light source Ls turned ON and an image captured with the light source Ls turned OFF may be calculated and spots or wrinkles may be checked using the differential image.
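  • As a rough sketch of the differential processing described above (the frame names, data type and clipping choice are assumptions, not taken from the patent):

```python
import numpy as np

def differential_image(frame_on: np.ndarray, frame_off: np.ndarray) -> np.ndarray:
    """Subtract the frame captured with the light source Ls turned OFF from the
    frame captured with Ls turned ON, which suppresses the contribution of
    environmental light that is present in both frames."""
    diff = frame_on.astype(np.int32) - frame_off.astype(np.int32)
    return np.clip(diff, 0, 255).astype(np.uint8)  # keep a displayable 8-bit range
```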
  • the optical regions D 3 and D 4 are supposed to transmit a light beam falling within the color blue wavelength range.
  • the wavelength range of the light beam to be transmitted through the two optical regions does not have to be the color blue wavelength range.
  • two optical regions may transmit a light beam falling within the color green wavelength range and the other two optical regions may transmit a light beam falling within the color blue wavelength range and a light beam falling within the color red wavelength range, respectively.
  • Such a configuration may be adopted if, considering the spectral distribution of the light source Ls and the spectral sensitivity of the photodiode, it is more advantageous to use a light beam falling within the color green wavelength range rather than a light beam falling within the color blue wavelength range in order to observe skin spots, wrinkles or texture.
  • the lens optical system L is supposed to be a single lens.
  • the lens optical system may include a compound lens which is a combination of multiple lenses.
  • the optical system can be designed with an increased degree of freedom, and an image with a high resolution can be obtained, which is beneficial.
  • the lens optical system may have image-space telecentricity.
  • the incoming light can also be split into multiple light rays just as intended by appropriately adjusting one period of the array of optical elements (such as a lenticular lens or a micro lens array) arranged in front of the image sensor according to the angle of emittance of an off-axis principal ray of the lens optical system.
  • the image capture device is supposed to include a signal processing section C.
  • an image capture device according to the present invention does not have to include the signal processing section C.
  • the output signal of the image sensor may be transmitted to an external device such as a personal computer so that the arithmetic processing that should have been done by the signal processing section C is carried out by the external device instead.
  • the present invention may be implemented as a system including the image capture device with the lens optical system L, the array of optical elements K and the image sensor N and an external signal processor.
  • An image capture device according to the present disclosure can be used effectively as an industrial camera such as a product inspection camera, a surveillance camera, or an image input camera for information terminals or robots.
  • the image capture device of the present disclosure may also be used in a digital still camera or a digital camcorder.

Abstract

An image capture device according to the present disclosure includes: a lens optical system L; an image sensor N on which light that has passed through the lens optical system L is incident and which includes at least a plurality of first pixels and a plurality of second pixels; and an array of optical elements K which is arranged between the lens optical system and the image sensor. The lens optical system has a first optical region D1 which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region D2 which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis. The array of optical elements makes the light that has passed through the first optical region D1 incident on the plurality of first pixels and also makes the light that has passed through the second optical region D2 incident on the plurality of second pixels.

Description

    TECHNICAL FIELD
  • The present application relates to an image capture device such as a camera and more particularly relates to an image capture device which captures an image using polarized light.
  • BACKGROUND ART
  • When light is reflected from the surface of an object, the polarization property of the light changes. That is why the light reflected from the object has its polarization property determined by various pieces of information including the surface roughness, reflectance, and birefringence of the object and the orientation of the reflective surface. Thus, if light with a polarization property is separated, detected and converted into an electrical signal, those pieces of information can be obtained.
  • In a camera for medical and beauty treatment purposes such as an endoscope or a skin checker, a shooting system of that type is generally used as a means for separately imaging light that has been reflected from the surface of a tissue and light that has been reflected from inside the tissue.
  • In order to detect light with a polarization property, according to a conventional method, a polarizer is arranged in front of the lens of an image capture camera and a shooting session is carried out with the polarizer rotated according to the shooting condition. With such a method, however, shooting sessions need to be performed a number of times with the polarizer rotated, thus taking a long time to get an image.
  • Thus, to overcome such a problem, some people proposed a method for getting an image using light with a different polarization property by arranging, in advance, a polarizer having an axis of polarization in a predetermined direction on the surface of an image sensor. For example, Patent Document No. 1 discloses a device in which polarization filters are provided for some of the pixels of a solid-state image sensor. By making light with a polarization property incident on some pixels and by performing image processing on an image obtained from those pixels on which the light with a polarization property has been incident and an image obtained from other pixels, an image, from which the influence of reflection from the surface of the subject has been reduced, can be obtained.
  • Meanwhile, Patent Document No. 2 discloses an image capture device in which an optical system is formed by an array of two lenses and an image is shot through each of those lenses using light with a different polarization direction, thereby detecting the state of a dry or wet road surface.
  • CITATION LIST Patent Literature
    • Patent Document No. 1: Japanese Laid-Open Patent Publication No. 2006-254331
    • Patent Document No. 2: Japanese Laid-Open Patent Publication No. 2010-25915
    SUMMARY OF INVENTION Technical Problem
  • However, there is a growing demand for an image capture device which can obtain a plurality of images based on multiple light beams in mutually different polarization states using a more simplified or more general configuration than the conventional ones.
  • A non-limiting exemplary embodiment of the present application provides an image capture device which can obtain a plurality of images based on multiple light beams in mutually different polarization states using a simplified or general configuration.
  • Solution to Problem
  • An image capture device according to an aspect of the present invention includes: a lens optical system; an image sensor on which light that has passed through the lens optical system is incident and which includes at least a plurality of first pixels and a plurality of second pixels; and an array of optical elements which is arranged between the lens optical system and the image sensor and which includes a plurality of optical elements, each having a lens surface. The lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis. Each of the optical elements that form the array of optical elements makes the light that has passed through the first optical region incident on the plurality of first pixels and also makes the light that has passed through the second optical region incident on the plurality of second pixels.
  • Advantageous Effects of Invention
  • An image capture device according to an aspect of the present invention can capture multiple images based on light beams in mutually different polarization states while using a single lens optical system.
  • In addition, by adopting the image capture device of the present invention, a biometric image capture device, which may be used effectively to check the skin state, for example, is realized.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 Illustrates a configuration for a first embodiment of an image capture device according to the present invention.
  • FIG. 2 A front view of a split polarizer according to the first embodiment.
  • FIG. 3 A perspective view of an array of optical elements according to the first embodiment.
  • FIG. 4 A schematic enlarged cross-sectional view illustrating the array of optical elements and an image sensor according to the first embodiment and their surrounding.
  • FIG. 5 A front view of a split polarizer according to a second embodiment.
  • FIG. 6 A schematic perspective view illustrating an array of optical elements and an image sensor according to the second embodiment and their surrounding.
  • FIG. 7 Illustrates light rays to be incident on the image capturing plane according to the second embodiment.
  • FIG. 8 Illustrates a configuration according to a third embodiment.
  • FIG. 9 (a) is a front view of a split polarizer according to the third embodiment and (b) is a front view illustrating another example of the split polarizer.
  • FIG. 10 Illustrates a configuration according to a fourth embodiment.
  • FIGS. 11 (a) and (b) are respectively a cross-sectional view and a front view of a liquid crystal element according to the fourth embodiment.
  • FIG. 12 A perspective view of an array of optical elements according to a fifth embodiment.
  • FIGS. 13 (a) and (b) illustrate how light rays are incident on an image sensor according to the fifth embodiment.
  • FIG. 14 Illustrates a general configuration for a skin checker according to a sixth embodiment of the present invention.
  • FIG. 15 (a) is a side view of an optical element Sa to be arranged in the vicinity of the stop of a lens optical system according to the sixth embodiment, (b-1) is a front view of split color filters Sc, and (b-2) is a front view of a split polarizer Sp.
  • DESCRIPTION OF EMBODIMENTS
  • The present inventors studied the image capture devices disclosed in Patent Documents Nos. 1 and 2. As a result, we reached the following conclusion. Specifically, the device disclosed in Patent Document No. 1 needs to use a dedicated image sensor with a polarization filter. However, since such an image sensor is not commercially available, it must be manufactured as a dedicated product. Particularly if such dedicated products are manufactured in small quantities, the manufacturing cost is high. On top of that, in that case, the arrangement of the polarizer cannot be changed appropriately according to the shooting situation.
  • On the other hand, in the device disclosed in Patent Document No. 2, an optical system is implemented as a lens array on the image sensor. According to such an arrangement, the effective diameter of a single optical system needs to be less than a half of the size of the image capturing area, thus limiting the degree of freedom of the optical design. That is why it is difficult to form an optical system, of which the resolution is high enough to obtain an image.
  • In order to overcome these problems, the present inventors invented a novel image capture device which can obtain an image by using light with a polarization property. An aspect of the present invention can be outlined as follows.
  • An image capture device according to an aspect of the present invention includes: a lens optical system; an image sensor on which light that has passed through the lens optical system is incident and which includes at least a plurality of first pixels and a plurality of second pixels; and an array of optical elements which is arranged between the lens optical system and the image sensor and which includes a plurality of optical elements, each having a lens surface. The lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis. Each of the optical elements that form the array of optical elements makes the light that has passed through the first optical region incident on the plurality of first pixels and also makes the light that has passed through the second optical region incident on the plurality of second pixels.
  • The image sensor may be a monochrome image sensor.
  • The lens optical system may be an image-space telecentric optical system.
  • The lens optical system may include a split polarizer having first and second polarizing portions which are located in the first and second regions, respectively.
  • The optical elements that form the array of optical elements may be a lenticular lens.
  • In the image sensor, a number of the first pixels and a number of the second pixels may be arranged in a first direction and the first pixels arranged in the first direction and the second pixels arranged in the first direction may alternate with each other in a second direction that intersects with the first direction at right angles, thus forming an image capturing plane.
  • The lens optical system may further have a third optical region which transmits mostly light vibrating in the direction of a third polarization axis and a fourth optical region which transmits mostly light vibrating in the direction of a fourth polarization axis. The split polarizer may further have third and fourth polarizing portions which are located in the third and fourth regions, respectively.
  • The optical elements that form the array of optical elements may be a micro lens array.
  • The image capture device may further include a polarization direction changing section which changes the direction of at least one of the first and second polarization axes of the first and second optical regions.
  • The lens optical system may include a split polarizer with at least three polarizing portions, two adjacent ones of which have polarization axes in mutually different directions. The image capture device may further include a drive mechanism which drives the split polarizer so that any two adjacent ones of the at least three polarizing portions of the split polarizer are located in the first and second regions.
  • The split polarizer may include a common transparent electrode, two divided transparent electrodes which are located in the first and second optical regions, respectively, a liquid crystal layer which is interposed between the common transparent electrode and the two divided transparent electrodes, and a control section which applies mutually different voltages to the two divided transparent electrodes.
  • The image capture device may perform shooting sessions multiple times with the voltage changed.
  • The plurality of first pixels may include a number of 1A pixels with filters having a first spectral transmittance characteristic, a number of 2A pixels with filters having a second spectral transmittance characteristic, a number of 3A pixels with filters having a third spectral transmittance characteristic, and a number of 4A pixels with filters having a fourth spectral transmittance characteristic. The plurality of second pixels may include a number of 1B pixels with filters having the first spectral transmittance characteristic, a number of 2B pixels with filters having the second spectral transmittance characteristic, a number of 3B pixels with filters having the third spectral transmittance characteristic, and a number of 4B pixels with filters having the fourth spectral transmittance characteristic. The array of optical elements may include: a plurality of first optical elements which makes light that has passed through the first optical region incident on the 1A and 3A pixels and which also makes light that has passed through the second region incident on the 2B and 4B pixels; and a plurality of second optical elements which makes light that has passed through the first region incident on the 2A and 4A pixels and which also makes light that has passed through the second region incident on the 1B and 3B pixels.
  • On the image capturing plane of the image sensor, the 1A, 2B, 3A and 4B pixels that form a single set may be adjacent to each other and may be arranged at the four vertices of a quadrangle.
  • The filters having the first spectral transmittance characteristic and the filters having the second spectral transmittance characteristic may transmit light falling within the wavelength range of the color green. The filters having the third spectral transmittance characteristic may transmit light falling within the wavelength range of the color red. The filters having the fourth spectral transmittance characteristic may transmit light falling within the wavelength range of the color blue. And 1A, 2B, 3A and 4B pixels that form a single set may be arranged in a Bayer arrangement pattern.
  • The plurality of first optical elements and the plurality of second optical elements may form a lenticular lens.
  • The lens optical system may further include a stop, and the first and second optical regions may be located in the vicinity of the stop.
  • An image capture device according to another aspect of the present invention includes: a lens optical system; an image sensor which includes a plurality of first pixels with filters having a first spectral transmittance characteristic, a plurality of second pixels with filters having a second spectral transmittance characteristic, a plurality of third pixels with filters having a third spectral transmittance characteristic, and a plurality of fourth pixels with filters having a fourth spectral transmittance characteristic and in which a first row where the first and second pixels are arranged alternately in a first direction and a second row where the third and fourth pixels are arranged alternately in the first direction alternate in a second direction, thereby forming an image capturing plane, wherein light that has passed through the lens optical system is incident on the first, second, third and fourth pixels; and an array of optical elements which is arranged between the lens optical system and the image sensor. The lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis. The first and second optical regions are arranged in the second direction. On the image capturing plane, the array of optical elements includes a plurality of optical elements, each of which makes the light that has been transmitted through the lens optical system incident on every four pixels, which are the first, second, third and fourth pixels that are arranged adjacent to each other in the first and second directions. The plurality of optical elements forms a number of columns, each of which is arranged linearly in the second direction. On two columns that are adjacent to each other in the first direction, each optical element on one of the two columns is shifted from an associated optical element on the other column in the second direction by a length corresponding to a half of one arrangement period of the optical elements.
  • A biometric image capturing system according to an aspect of the present invention includes: an image capture device according to any of the embodiments described above; and a light source which irradiates an object with polarized light.
  • The lens optical system of the image capture device may include split color filters to be arranged in the first to fourth optical regions. The split color filters may transmit light falling within the same wavelength range in two of those first through fourth optical regions. And in the two optical regions, the polarization axes of the split polarizer may be in mutually different directions.
  • In the two optical regions, the directions of the polarization axes of the split polarizer may intersect with each other at substantially right angles.
  • The biometric image capturing system may further include a control section which controls the light source and the image capture device. The control section may control the image capture device so that the image capture device captures multiple images synchronously with flickering of the light source. And the image capture device may perform arithmetic processing between the multiple images to generate another image.
  • The biometric image capturing system may further include a display section which displays an image that has been shot by the image capture device. The image capture device may further include a signal processing section. The signal processing section may generate an image signal by inverting horizontally the image shot and may output the image signal to the display section.
  • Hereinafter, embodiments of an image capture device according to the present invention will be described with reference to the accompanying drawings.
  • Embodiment 1
  • FIG. 1 is a schematic representation illustrating a first embodiment of an image capture device according to the present invention. The image capture device A of this embodiment includes a lens optical system L, of which the optical axis is identified by V0, an array of optical elements K which is arranged in the vicinity of the focal point of the lens optical system L, an image sensor N, and a signal processing section C.
  • In this embodiment, the lens optical system L includes a stop S and an objective lens L1 which images light that has passed through the stop S onto the image sensor. The lens optical system L has a first optical region D1 and a second optical region D2. The first and second optical regions D1 and D2 are located in the vicinity of the stop S. The region obtained by combining these first and second optical regions D1 and D2 together has a circular shape corresponding to the aperture of the stop S on a cross section which intersects with the optical axis V0 at right angles. The boundary between the first and second optical regions D1 and D2 includes the optical axis V0 and is located on a plane which is parallel to the horizontal direction.
  • The first optical region D1 of the lens optical system L is configured to transmit mostly light vibrating in the direction of a first polarization axis, and the second optical region D2 is configured to transmit mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis.
  • In this embodiment, the lens optical system L includes a split polarizer Sp which is located in the first and second optical regions D1 and D2. FIG. 2 is a front view of the split polarizer Sp. The split polarizer Sp divides the aperture of the stop S into two regions by a line that includes the optical axis V0 of the lens optical system L and that is parallel to the horizontal direction of the image capture device, and includes a first polarizing portion Sp1 located in the first optical region D1 and a second polarizing portion Sp2 located in the second optical region D2. The first and second polarizing portions Sp1 and Sp2 are implemented as respective polarizers. As each of those polarizers, a so-called "PVA iodine stretched film" which is made by dyeing polyvinyl alcohol with iodine and stretching it into a film may be used.
  • The first and second polarizing portions Sp1 and Sp2 have first and second polarization axes, respectively, and the directions of the first and second polarization axes are different from each other. For example, the direction of the first polarization axis may be the vertical direction of the image capture device, and the direction of the second polarization axis may be the horizontal direction of the image capture device.
  • As shown in FIG. 1, of the light entering the stop S, the light beam B1 enters the first polarizing portion Sp1 of the split polarizer Sp and the light beam B2 enters the second polarizing portion Sp2 of the split polarizer Sp. If the light entering the stop S includes linearly polarized light rays which are polarized in an arbitrary direction, then only linearly polarized light rays vibrating in the direction of the first polarization axis are transmitted through the first polarizing portion Sp1 and only linearly polarized light rays vibrating in the direction of the second polarization axis are transmitted through the second polarizing portion Sp2. The light beams B1 and B2 are converged by the objective lens L1 and incident on the array of optical elements K.
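  • For reference, the intensity selection performed by each polarizing portion follows Malus's law of elementary optics (a standard relation, not stated in the patent itself): if a linearly polarized ray of intensity $I_0$ arrives with its polarization direction at an angle $\theta$ to the polarization axis of a portion, the transmitted intensity is

$$ I = I_0 \cos^2 \theta , $$

so the first polarizing portion Sp1 passes the component of the incoming light along the first polarization axis and the second polarizing portion Sp2 passes the component along the second polarization axis.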
  • FIG. 3 is a perspective view of the array of optical elements K, which includes a plurality of optical elements M, each having a lens face. In this embodiment, the lens face of each optical element M is a cylindrical face. In this array of optical elements K, the optical elements M are arranged vertically so that their cylindrical faces run in the horizontal direction. In this manner, these optical elements M form a lenticular lens.
  • FIG. 4 is an enlarged view of the array of optical elements K and image sensor N shown in FIG. 1. The array of optical elements K, implemented as a lenticular lens, is arranged so that its side with the optical elements M faces the image sensor N. As shown in FIG. 1, the array of optical elements K is arranged in the vicinity of the focal point of the lens optical system L and is located at a predetermined distance from the image sensor N. The image sensor N includes a plurality of first pixels P1 and a plurality of second pixels P2, which are arranged on the image capturing plane Ni. A number of those first pixels P1 are arranged in the horizontal direction (i.e., the first direction), so are a number of those second pixels P2. As shown in FIG. 4, those first pixels P1 and second pixels P2 are arranged alternately in the vertical direction (i.e., the second direction).
  • In this embodiment, each and every one of the first pixels P1 and second pixels P2 has the same shape on the image capturing plane Ni. For example, each of the first pixels P1 and second pixels P2 may have the same rectangular shape and may have the same area, too.
  • The image sensor N may include a plurality of micro lenses Ms, which are arranged on the image capturing plane Ni so as to cover the surface of the respective pixels. The position at which the array of optical elements K is arranged may be determined by reference to the focal point of the objective lens L1, for example. One period in the vertical direction of the cylindrical faces of the array of optical elements K corresponds to two of the pixels arranged on the image capturing plane Ni.
  • As shown in FIG. 4, the boundary between two adjacent cylindrical faces of the array of optical elements K is level in the horizontal direction with the boundary between two adjacent micro lenses Ms of the image sensor N. That is to say, the array of optical elements K and the image sensor N are arranged so that each single optical element M of the array of optical elements K corresponds to two rows of pixels on the image capturing plane Ni. Each optical element M has the function of selectively determining the outgoing direction of an incoming light ray according to its angle of incidence. Specifically, each optical element M makes most of the light beam B1 that has been transmitted through the first optical region D1 incident onto the first pixels P1 on the image capturing plane Ni and also makes most of the light beam B2 that has been transmitted through the second optical region D2 incident onto the second pixels P2 on the image capturing plane Ni. This can be done by adjusting the refractive index of the lenticular lens used as the array of optical elements K, the radius of curvature of the optical elements M, and the distance from the image capturing plane Ni.
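  • As a rough design sketch (based on elementary thin-lens formulas and stated here only as an assumption, since the patent gives no numerical prescription): a plano-convex cylindrical element of radius of curvature $R$ and refractive index $n$ in air has a focal length of approximately

$$ f \approx \frac{R}{n - 1} , $$

and choosing the gap $d$ between the lens faces and the image capturing plane Ni so that $d \approx f$ makes each optical element M form an image of the divided stop (the regions D1 and D2) on its pair of pixel rows, which is the separation effect described above.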
  • The image sensor N photoelectrically converts the incident light and transmits an image signal Q0 to the signal processing section C. Based on the image signal Q0, the signal processing section C generates image signals Q1 and Q2 corresponding to the first pixels P1 and the second pixels P2, respectively.
  • The image signal Q1 represents an image that has been produced by the light beam transmitted through the first optical region D1, while the image signal Q2 represents an image that has been produced by the light beam transmitted through the second optical region D2. Since the first and second optical regions D1 and D2 transmit light beams vibrating in the directions of the first and second polarization axes, respectively, two images represented by two linearly polarized light components with mutually different polarization directions can be obtained.
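  • A minimal sketch of the kind of demultiplexing the signal processing section C performs in this embodiment, assuming the raw frame is a 2-D array in which rows of first pixels P1 and rows of second pixels P2 alternate vertically (the row parity and all names are assumptions):

```python
import numpy as np

def split_polarization_rows(raw: np.ndarray):
    """Separate the interleaved sensor rows into the two image signals.

    raw : 2-D array read out from the image sensor N, in which rows of first
          pixels P1 and rows of second pixels P2 alternate vertically.
    Returns (q1, q2): q1 collects the P1 rows (light from the first optical
    region D1) and q2 the P2 rows (light from the second optical region D2).
    """
    q1 = raw[0::2, :]  # assume even-numbered rows hold the first pixels P1
    q2 = raw[1::2, :]  # assume odd-numbered rows hold the second pixels P2
    return q1, q2
```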
  • These two images obtained in this manner have been shot in a single session through the single lens optical system. Since the same object has been shot substantially at the same time from the same angle, there is no significant difference between the two images except that they are represented by light beams in mutually different polarization states. However, as a light beam coming from the object has a polarization property which is determined by various kinds of information about the surface roughness, reflectance and birefringence of the object and the orientation of the reflective surface, those pieces of information are more enhanced in one of the two images than in the other. As a result, an image representing a scene under the water clearly can be obtained with the reflection from the surface of the water suppressed, or an image representing even a center line on a wet road surface clearly can also be obtained. On top of that, by processing the two image signals by various known image processing techniques, images including those pieces of information can be obtained.
  • As can be seen, the image capture device of this embodiment can obtain two images represented by light beams with mutually different polarization properties at a time by using a general purpose image sensor. Since such images can be obtained by the split polarizer which is arranged in the vicinity of the stop, a practical resolution can be maintained without increasing the size of the image capture device too much.
  • Optionally, the lens optical system L of this embodiment may be an image-space telecentric optical system. In that case, principal rays of light beams entering at different angles of view can also be incident on the array of optical elements at an angle of incidence of nearly zero degrees. As a result, crosstalk (i.e., incidence of light rays that should have been incident on the first pixels P1 on the second pixels P2 or incidence of light rays that should have been incident on the second pixels P2 on the first pixels P1) can be reduced over the entire image sensor.
  • The stop S is a region through which a bundle of rays with every angle of view passes. That is why by inserting, in the vicinity of the stop S, a plane of which the optical property controls the polarization property, the polarization property of a bundle of rays with any angle of view can be controlled in the same way. Specifically, in this embodiment, the split polarizer Sp may be arranged in the vicinity of the stop S. By arranging the split polarizer Sp in the optical regions D1 and D2 that are located in the vicinity of the stop, a polarization property corresponding to the number of the divided regions can be given to the bundle of rays.
  • In FIG. 1, the split polarizer Sp is arranged at such a position that the light that has passed through the stop S can enter the split polarizer Sp directly (i.e., without passing through any other optical member). Optionally, the split polarizer Sp may be arranged closer to the object than the stop S is. In that case, the light that has passed through the split polarizer Sp may enter the stop S directly (i.e., without passing through any other optical member). In the case of an image-space telecentric optical system, the angle of incidence of a light ray at the focal point of the optical system is determined unequivocally by the position of the light ray that passed through the stop S. Also, the array of optical elements K has the function of selectively determining the outgoing direction of an incoming light ray according to its angle of incidence. That is why the bundle of rays can be distributed onto pixels on the image capturing plane Ni so as to correspond to the optical regions D1 and D2 which are divided in the vicinity of the stop S.
  • On the other hand, in the case of an image-space non-telecentric optical system, the angle of incidence of a light ray at the focal point of the optical system is determined unequivocally by the position of the light ray that passed through the stop S and the angle of view.
  • Embodiment 2
  • A second embodiment of an image capture device according to the present invention will be described. In the image capture device of this embodiment, the lens optical system has first through fourth optical regions and micro lenses are arranged as an array of optical elements, unlike the image capture device of the first embodiment. Thus, the following description of this embodiment will be focused on these differences from the first embodiment.
  • In this embodiment, the lens optical system L has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis, a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis, a third optical region which transmits mostly light vibrating in the direction of a third polarization axis, and a fourth optical region which transmits mostly light vibrating in the direction of a fourth polarization axis. FIG. 5 illustrates an example of a split polarizer Sp to be arranged in these four optical regions. The split polarizer Sp shown in FIG. 5 is viewed from the object side. The split polarizer Sp has first, second, third and fourth polarizing portions Sp1, Sp2, Sp3 and Sp4 which are located in the first, second, third and fourth optical regions D1, D2, D3 and D4, respectively.
  • The boundary between the first and second optical regions D1 and D2 and the boundary between the third and fourth optical regions D3 and D4 are located on a plane which includes the optical axis V0 of the lens optical system L and which is parallel to the horizontal direction of the image capture device. On the other hand, the boundary between the first and fourth optical regions D1 and D4 and the boundary between the second and third optical regions D2 and D3 are located on a plane which includes the optical axis V0 of the lens optical system L and which is parallel to the vertical direction of the image capture device.
  • The direction of the third polarization axis may be either different from the directions of the first and second polarization axes or the same as the direction of the first or second polarization axis. Likewise, the direction of the fourth polarization axis may be either different from, or the same as, the directions of the first and second polarization axes. That is to say, it is sufficient that at least two of the first, second, third and fourth polarizing portions Sp1, Sp2, Sp3 and Sp4 have mutually different polarization directions. Also, if the directions of the first through fourth polarization axes are different from each other, then the directions of the first through fourth polarization axes may be three directions that define angles of 45, 90 and 135 degrees with respect to one direction, for example.
  • FIG. 6 is a partially cutaway perspective view of the array of optical elements K and image sensor N. In this embodiment, the optical elements M of the array of optical elements K are micro lenses, and the lens surface is a spherical one. The optical elements M are arranged periodically in vertical and horizontal directions, thereby forming a micro lens array. The image sensor N is arranged so as to face the array of optical elements K. Each of the pixels on the image capturing plane Ni of the image sensor N is provided with a micro lens Ms. One period of the optical elements M of the array of optical elements K is set to be twice as long as one period of the micro lenses Ms of the image sensor N both horizontally and vertically. That is why a single optical element M of the array of micro lenses that form the array of optical elements K is associated with four pixels on the image capturing plane Ni.
  • FIG. 7 illustrates relations between pixels arranged on the image capturing plane of the image sensor N and light rays that have been transmitted through the four optical regions of the lens optical system L. The image sensor N includes a plurality of first pixels P1, a plurality of second pixels P2, a plurality of third pixels P3 and a plurality of fourth pixels P4, all of which are arranged on the image capturing plane Ni. As shown in FIG. 7, on the image capturing plane Ni, the second and third pixels P2 and P3 are arranged horizontally alternately, and the first and fourth pixels P1 and P4 are arranged horizontally alternately. The rows on which the second and third pixels P2 and P3 are arranged and the rows on which the first and fourth pixels P1 and P4 are arranged alternate so that the first and second pixels P1 and P2 are vertically adjacent to each other. Thus, the first, second, third and fourth pixels P1, P2, P3 and P4 are arranged so as to be adjacent to each other in the row and column directions, and each set of those four pixels corresponds to a single optical element M of the micro lens array.
  • A light ray that has been transmitted through the first polarizing portion Sp1 in the first optical region D1 is converged by the lens optical system L and then incident on the first pixel P1 through an optical element M of the array of optical elements K. In the same way, light rays that have been transmitted through the second, third and fourth polarizing portions Sp2, Sp3 and Sp4 in the second, third and fourth optical regions D2, D3 and D4 are incident on the second, third and fourth pixels P2, P3 and P4, respectively. That is to say, the light rays that have been transmitted through each optical region are incident on the same kind of pixels, which are located at every other pixel position both horizontally and vertically on the image capturing plane Ni.
  • The image sensor N photoelectrically converts the incident light on a pixel-by-pixel basis and outputs a signal thus obtained to the signal processing section C, which processes the signals obtained from the first, second, third and fourth pixels P1, P2, P3 and P4 separately for each kind of pixel, thereby generating image signals. Specifically, by processing the signals obtained from a number of first pixels P1, the signal processing section C generates an image signal Q1. In the same way, by processing signals obtained from a number of second pixels P2, signals obtained from a number of third pixels P3, and signals obtained from a number of fourth pixels P4, the signal processing section C generates image signals Q2, Q3 and Q4, respectively.
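  • A comparable sketch for this embodiment, assuming each optical element M covers a 2x2 pixel block in which P2 and P3 sit on the upper row and P1 and P4 on the lower row, as in FIG. 7 (the exact offsets are assumptions; swap them if the actual layout differs):

```python
import numpy as np

def split_four_polarizations(raw: np.ndarray):
    """Extract the four sub-images Q1..Q4 from one interleaved sensor frame by
    taking every other pixel both horizontally and vertically."""
    q2 = raw[0::2, 0::2]  # second pixels P2 -> light from optical region D2
    q3 = raw[0::2, 1::2]  # third pixels P3  -> light from optical region D3
    q1 = raw[1::2, 0::2]  # first pixels P1  -> light from optical region D1
    q4 = raw[1::2, 1::2]  # fourth pixels P4 -> light from optical region D4
    return q1, q2, q3, q4
```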
  • The image signals Q1, Q2, Q3 and Q4 thus obtained represent Images #1, #2, #3 and #4 of the same scene which have been shot at the same time through a single lens system. However, these Images #1, #2, #3 and #4 have been generated based on light beams in mutually different polarization states. That is why these Images #1, #2, #3 and #4 include various pieces of information about the surface roughness, reflectance and birefringence of an object and the orientation of the reflective surface due to their difference in polarization property. In this manner, according to this embodiment, images in four different polarization states can be shot by performing a shooting session only once.
  • Embodiment 3
  • A third embodiment of an image capture device according to the present invention will be described. FIG. 8 is a schematic representation illustrating an image capture device according to this third embodiment. The image capture device of this embodiment further includes a polarization direction changing section which changes the direction of at least one of the first and second polarization axes of the first and second optical regions, which is a major difference from the image capture device of the first embodiment. More specifically, in this embodiment, the split polarizer Sp is a switching type split polarizer Sp which can change the directions of the first and second polarization axes of the first and second optical regions, and includes, as the polarization direction changing section, a drive mechanism U to change the direction of the polarization axes and a control section V which controls the operation of the drive mechanism U. Thus, the following description of this third embodiment will be focused on these differences from the first embodiment.
  • The switching type split polarizer Sp of this embodiment has at least three polarizing portions, two adjacent ones of which have polarization axes in mutually different directions. FIG. 9( a) illustrates an example of such a split polarizer Sp. The split polarizer Sp shown in FIG. 9 has first through eighth polarizing portions Sp1 through Sp8, which all have a fan shape and which are arranged around the center of rotation S0. The polarization axes of the first through eighth polarizing portions Sp1 through Sp8 are defined so that the polarization axes are different from each other at least between adjacent polarizing portions with respect to the boundary between them, for example.
  • In accordance with a signal supplied from the control section V, the drive mechanism U rotates the split polarizer Sp around the center of rotation S0 and stops rotating the split polarizer Sp at a position where the boundary between adjacent polarizing portions overlaps with the optical axis V0 of the lens optical system L. In this manner, two polarizing portions with mutually different polarization axis directions can be arranged in the first and second optical regions D1 and D2. In addition, since the polarizing portions to be arranged in the first and second optical regions D1 and D2 can be selected from the first through eighth polarizing portions Sp1 through Sp8, the polarization axis directions in the first and second optical regions D1 and D2 can be selected arbitrarily from a predetermined combination.
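  • The selection of a pair of polarizing portions can be pictured as choosing a rotation angle; the sketch below assumes eight equal 45-degree sectors and a zero position in which the Sp1/Sp2 boundary lies on the optical axis V0, neither of which is specified in the patent.

```python
SECTOR_ANGLE_DEG = 360.0 / 8  # eight fan-shaped portions -> 45 degrees per sector

def rotation_for_boundary(boundary_index: int) -> float:
    """Return the rotation angle (in degrees) that brings the boundary between
    portion Sp[boundary_index] and Sp[boundary_index + 1] onto the optical axis,
    so that those two portions occupy the optical regions D1 and D2.
    boundary_index = 1 corresponds to the Sp1/Sp2 boundary (the assumed zero position).
    """
    return ((boundary_index - 1) % 8) * SECTOR_ANGLE_DEG

# e.g. rotation_for_boundary(3) rotates the split polarizer by 90 degrees,
# placing the Sp3/Sp4 boundary on the optical axis.
```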
  • According to such a configuration, by selecting arbitrary polarizing portions according to the condition on which the object needs to be shot, the polarization axis directions in the first and second optical regions D1 and D2 can be switched. As a result, polarized images can be shot adaptively to an even broader range of shooting environments.
  • The switching type split polarizer does not have to have the configuration shown in FIG. 9( a) but may also be modified in various manners. For example, first through seventh polarizing portions Sp1 through Sp7 may be arranged linearly as shown in FIG. 9( b) and the drive mechanism U may move the polarizing portions in the direction in which they are arranged. Also, although the drive mechanism U and the control section V can change the polarization axis directions of both the first and second optical regions D1 and D2 according to this embodiment, one of these two polarization axis directions does not have to be changed. Specifically, the polarizing portion arranged in one of the first and second optical regions D1 and D2 may be fixed, with the drive mechanism switching only the polarizing portion arranged in the other region.
  • Embodiment 4
  • A fourth embodiment of an image capture device according to the present invention will be described. The image capture device of this embodiment can also change the directions of the first and second polarization axes of the first and second optical regions, which is a major difference from the image capture device of the first embodiment. More specifically, in this embodiment, the split polarizer is composed of a liquid crystal element and a control section which functions as the polarization direction changing section. Thus, the following description of this fourth embodiment will be focused on these differences from the first embodiment.
  • FIG. 10 illustrates a configuration for an image capture device according to this embodiment. The image capture device shown in FIG. 10 includes a liquid crystal element W and a control section V as a split polarizer.
  • FIG. 11( a) is a cross-sectional view generally illustrating the structure of the liquid crystal element W and FIG. 11( b) is a front view thereof. The liquid crystal element W includes a common transparent electrode EC, a liquid crystal layer LC, divided transparent electrodes ED1, ED2 and a polarizing plate PL.
  • The common transparent electrode EC is arranged on a glass substrate H1 with an alignment film T1, thus forming a substrate SB1. On the other hand, the divided transparent electrodes ED1 and ED2 are arranged on a glass substrate H2 with an alignment film T2, thus forming a substrate SB2. The polarizing plate PL, which has a polarization axis and transmits light vibrating in the direction of that axis, is arranged on the side of the substrate SB2 opposite from the divided transparent electrodes ED1 and ED2. The alignment direction of the alignment film T2 agrees with the polarization axis of the polarizing plate PL. The liquid crystal layer LC is interposed between the two substrates SB1 and SB2, which are bonded together with a seal member J.
  • As shown in FIG. 11( b), the divided transparent electrodes ED1 and ED2 are arranged so that their boundary agrees with the horizontal direction that passes through the optical axis V0 of the lens optical system L. In this manner, the divided transparent electrodes ED1 and ED2 are arranged in the first and second optical regions D1 and D2, respectively. The control section V applies a voltage between the common transparent electrode EC and the divided transparent electrodes ED1, ED2.
  • The liquid crystal layer LC has an optical rotatory characteristic and comes to have an angle of optical rotation according to the voltage applied between the common transparent electrode EC and the divided transparent electrodes ED1, ED2. For example, the liquid crystal layer LC may have an angle of optical rotation of 90 or 180 degrees according to the voltage applied. That is why if the voltages applied to the divided transparent electrodes ED1 and ED2 are different, then the liquid crystal layer LC interposed between the common transparent electrode EC and the divided transparent electrode ED1 and the liquid crystal layer LC interposed between the common transparent electrode EC and the divided transparent electrode ED2 come to have mutually different angles of optical rotation.
  • The light beam that has entered this liquid crystal element W has its polarization direction rotated due to the optical rotatory characteristic of the liquid crystal layer LC and then is incident on the polarizing plate PL. In this case, the angle of optical rotation, i.e., the angle of rotation of the polarization axis, varies according to the voltage applied by the control section V to the divided transparent electrodes ED1 and ED2, as described above. The polarizing plate PL transmits only the component of the light transmitted through the liquid crystal layer LC that is a linearly polarized light ray parallel to the polarization axis of the polarizing plate PL. As a result, only a linearly polarized light ray that has been rotated through the angle of optical rotation determined by the voltages applied to the divided transparent electrodes ED1 and ED2 and that vibrates in the same direction as the polarization axis of the polarizing plate PL is transmitted through the liquid crystal element W and eventually detected at the image sensor N. That is why the polarization direction of the light going out of the liquid crystal element W is the same, no matter whether the light has been transmitted through the divided transparent electrode ED1 or the divided transparent electrode ED2. However, since the light rays passing through the two electrodes are rotated through different angles by the liquid crystal layer LC, the polarization directions of the incoming light that is selected by the liquid crystal element W and passed on to the lens optical system L are different between the divided transparent electrodes ED1 and ED2, i.e., between the first and second optical regions D1 and D2. That is to say, the element effectively transmits linearly polarized light rays with mutually different polarization axis directions in the two regions, and by regulating the applied voltage, those directions can be changed substantially.
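  • To restate that selection quantitatively (a standard description of an optically rotating cell followed by an analyzer; the symbols are not taken from the patent): if the liquid crystal layer above the divided transparent electrode ED$_i$ rotates the polarization by an angle $\rho_i(V_i)$ that depends on the applied voltage $V_i$, and the polarizing plate PL has its axis at an angle $\alpha$, the combination transmits the incoming linearly polarized component oriented at

$$ \theta_i = \alpha - \rho_i(V_i), \qquad i = 1, 2, $$

so driving ED1 and ED2 with different voltages makes the two optical regions D1 and D2 select mutually different incoming polarization directions, even though all the light leaving the element is aligned with the axis of PL.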
  • As can be seen, according to this embodiment, two images represented by light beams in mutually different polarization states can be obtained at a time. In addition, since the angle of optical rotation of the liquid crystal layer LC can be adjusted by regulating the applied voltage, the polarization condition of the light representing the image can be changed according to the shooting environment. Consequently, the image capture device of this embodiment can cope with an even broader range of shooting environments.
  • In addition, since the polarization axis of the split polarizer can be switched without using any mechanical driving section, the switching operation can get done at high speeds. Therefore, it is possible to perform a shooting session under a predetermined polarization condition and then perform another shooting session a short time later under a different polarization condition. For example, if an organism needs to be shot in three or more different polarization states, shooting sessions may be performed a number of times at short intervals, and the number of images obtained is then twice the number of shooting sessions.
  • Although the liquid crystal element includes two divided transparent electrodes in the embodiment described above, the liquid crystal element may also include four divided transparent electrodes to be arranged in four optical regions as in the second embodiment described above. In that case, micro lenses may be used as the array of optical elements as in the second embodiment described above. Then, four images represented by light beams in four different polarization states can be obtained.
  • Embodiment 5
  • A fifth embodiment of an image capture device according to the present invention will be described. In the image capture device of this embodiment, the image sensor is a color image sensor with an arrangement of pixels on which color filters are arranged in a Bayer arrangement, and the array of optical elements K is a lenticular lens with a different shape from its counterpart of the first embodiment, which are major differences from the image capture device of the first embodiment. Thus, the following description of this fifth embodiment will be focused on these differences from the first embodiment.
  • In a color image sensor with a Bayer arrangement, pixels are arranged to form a tetragonal lattice, and pixels with green color filters (having first and second spectral transmittance characteristics), among those pixels, are arranged so as to be diagonally adjacent to each other and account for approximately a half of all the pixels. On the other hand, pixels with red and blue color filters (having third and fourth spectral transmittance characteristics) are arranged evenly at a density that is a half as high as that of the green pixels. More specifically, although there are green pixels on each row and each column (i.e., both on odd- and even-numbered columns and on odd- and even-numbered rows), red and blue pixels are present on either odd- or even-numbered columns and on either odd- or even-numbered rows. That is why if the array of optical elements K has the same structure as the counterpart of the first embodiment (i.e., implemented as a lenticular lens), information about the color blue is missing from one of the two images represented by light beams that have been transmitted through the first and second optical regions D1 and D2 and information about the color red is missing from the other image.
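  • A small sketch of the Bayer layout being described, only to make the pixel densities explicit (the letters and the choice of which diagonal carries green are conventional assumptions, not taken from the patent):

```python
import numpy as np

def bayer_mask(height: int, width: int) -> np.ndarray:
    """Build an H x W array of color letters for a Bayer arrangement: green on
    the two diagonally adjacent positions of every 2x2 block, red and blue on
    the remaining two positions."""
    mask = np.empty((height, width), dtype="<U1")
    mask[0::2, 0::2] = "G"  # green (first spectral transmittance characteristic)
    mask[1::2, 1::2] = "G"  # green (second spectral transmittance characteristic)
    mask[0::2, 1::2] = "R"  # red (third spectral transmittance characteristic)
    mask[1::2, 0::2] = "B"  # blue (fourth spectral transmittance characteristic)
    return mask

m = bayer_mask(4, 4)
# Green occupies half of all pixels; red and blue occupy a quarter each.
assert (m == "G").mean() == 0.5 and (m == "R").mean() == 0.25 and (m == "B").mean() == 0.25
```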
  • Thus, to achieve the same effect as in the first embodiment even when such a color image sensor with a Bayer arrangement is used, the shape of the array of optical elements K is modified according to this embodiment. FIG. 12 is a perspective view illustrating the array of optical elements K according to this embodiment as viewed from the image side. This array of optical elements K includes a plurality of optical elements M1 and M2. In each set of optical elements M1, M2, a number of cylindrical lenses, each running horizontally (i.e., in the first direction), are arranged vertically (i.e., in the second direction) linearly. Each set of those optical elements M1, M2 forms a column running vertically, and columns of the optical elements M1 and columns of the optical elements M2 are alternately arranged horizontally. In a column of optical elements M1 and a column of optical elements M2 which are horizontally adjacent to each other, each optical element on one column is vertically shifted from an associated optical element on the other column by a length corresponding to a half of one vertical arrangement period.
  • Each of these optical elements M1 and M2 is associated with four pixels with red, blue and green filters, which are arranged in a Bayer arrangement pattern to form the image capturing plane of the image sensor, and makes the light that has been transmitted through the lens optical system L incident on four pixels that face the optical element. That is to say, the cylindrical surface that is the lens surface of each optical element M1, M2 has one period corresponding to two pixels of the image sensor N both vertically and horizontally. That is why in two horizontally adjacent columns of optical elements M1 and M2, each optical element on one column is vertically shifted by one pixel from an associated optical element on the other column.
  • As in the first embodiment, thanks to the action of the lenticular lens consisting of these optical elements M1 and M2, light rays that have been transmitted through the first and second optical regions D1 and D2 are incident on mutually different pixels. An optical element on one column of optical elements M1 is vertically shifted by a half period from an associated optical element on an adjacent column of optical elements M2. That is why the rows (odd- or even-numbered) that receive the light rays coming from the first and second optical regions D1 and D2 are swapped every two pixels in the horizontal direction.
  • FIGS. 13(a) and 13(b) are schematic representations illustrating light rays incident on the image capturing plane Ni of the image sensor N of this embodiment. For the sake of simplicity, the pixels to which the light rays transmitted through the first optical region D1 are led are shown in FIG. 13(a), and the pixels to which the light rays transmitted through the second optical region D2 are led are shown in FIG. 13(b).
  • As shown in these drawings, on each column of optical elements M1, the optical elements M1 lead the light rays coming from the first optical region D1 onto green (G1) pixels P1A and red (R) pixels P3A and also lead the light rays coming from the second optical region D2 onto green (G2) pixels P2B and blue (B) pixels P4B. On the other hand, on each column of optical elements M2, the optical elements M2 lead the light rays coming from the first optical region D1 onto green (G2) pixels P2A and blue (B) pixels P4A and also lead the light rays coming from the second optical region D2 onto green (G1) pixels P1B and red (R) pixels P3B.
  • The signal processing section C receives signals from the pixels of the image sensor N on which the light rays coming from the first optical region D1 have been incident (as shown in FIG. 13(a)) and signals from the pixels on which the light rays coming from the second optical region D2 have been incident (as shown in FIG. 13(b)) and processes those two groups of signals separately from each other, thereby generating two images. The signals received from the pixels on which the light rays coming from the first optical region D1 have been incident (as shown in FIG. 13(a)) and the signals received from the pixels on which the light rays coming from the second optical region D2 have been incident (as shown in FIG. 13(b)) each include signals received from the red, blue and green pixels. As a result, color images generated by light rays in mutually different polarization states can be obtained.
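The routing of FIGS. 13(a) and 13(b) and the separation performed by the signal processing section C can be sketched as follows. This is a minimal illustration, not the actual implementation: the choice of which pixel-column parity corresponds to the M1 columns, which row parity carries the G1/R pixels, and the use of NaN to mark pixels belonging to the other image are assumptions made here for clarity.

```python
import numpy as np

def region_mask(rows, cols):
    """Label each pixel with the optical region (1 or 2) whose light reaches it.
    Element columns are two pixels wide; even element columns are taken to be
    M1 and odd ones M2 (an assumed parity).  Under M1 the even (G1/R) rows
    receive light from D1 and the odd (G2/B) rows from D2; under M2 the
    assignment is swapped because of the half-period vertical shift."""
    r = np.arange(rows)[:, None]
    c = np.arange(cols)[None, :]
    is_m2 = (c // 2) % 2              # 0 on M1 element columns, 1 on M2 columns
    return np.where((r % 2) == is_m2, 1, 2)

def demultiplex(raw):
    """Split the raw mosaic into the two sparse mosaics of FIGS. 13(a) and 13(b);
    pixels that belong to the other image are marked with NaN."""
    raw = np.asarray(raw, dtype=float)
    mask = region_mask(*raw.shape)
    image_d1 = np.where(mask == 1, raw, np.nan)
    image_d2 = np.where(mask == 2, raw, np.nan)
    return image_d1, image_d2
```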
  • As can be seen from FIGS. 13(a) and 13(b), the light rays led by the columns of optical elements M1 are not incident on the green (G2) pixels P2A and blue (B) pixels P4A among the pixels on which the light rays coming from the first optical region D1 are incident (see FIG. 13(a)). In the same way, the light rays led by the columns of optical elements M2 are not incident on the green (G1) pixels P1A and red (R) pixels P3A. That is why when the signal processing section C processes the signals received from the pixels on which the light rays coming from the first optical region D1 have been incident as shown in FIG. 13(a), signals of two missing ones of the four pixels associated with each column of optical elements M1 may be interpolated with signals of two pixels associated with an adjacent column of optical elements M2. In the same way, when the signal processing section C processes the signals received from the pixels on which the light rays coming from the second optical region D2 have been incident as shown in FIG. 13(b), signals of two missing ones of the four pixels associated with each column of optical elements M1 may be interpolated with signals of two pixels associated with an adjacent column of optical elements M2.
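The interpolation mentioned above can then be sketched as a simple averaging of same-color samples taken from the horizontally adjacent element columns (two pixels away, which in a Bayer layout carry the same color filter). Averaging the left and right neighbours and letting the image border wrap around are assumptions; the embodiment only states that the missing signals may be interpolated from an adjacent column. The input is a sparse mosaic such as one returned by `demultiplex` above.

```python
import numpy as np

def fill_from_adjacent_columns(sparse):
    """Fill the NaN entries of a demultiplexed mosaic with the mean of the
    samples two pixels to the left and right, i.e. from the neighbouring
    element columns of the other type (same row, same color filter).
    np.roll wraps around at the borders, a simplification accepted here."""
    left = np.roll(sparse, 2, axis=1)
    right = np.roll(sparse, -2, axis=1)
    candidates = np.nanmean(np.stack([left, right]), axis=0)
    out = sparse.copy()
    missing = np.isnan(out)
    out[missing] = candidates[missing]
    return out
```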
  • Even though the image sensor is supposed to be a color image sensor with a Bayer arrangement in the embodiment described above, pixels with green filters may be vertically adjacent to each other in each set of four pixels. Also, although the image sensor is supposed to include pixels with red, blue and green filters in the embodiment described above, the image sensor may also include pixels with filters in the complementary colors of these colors. For example, in the image sensor, red, blue, green and white filters, red, blue, green and yellow filters, or any other appropriate combination of filters may be provided for each set of four pixels.
  • Embodiment 6
  • An embodiment of a skin checker will be described as an exemplary biometric image capturing system that uses an image capture device according to the present invention. FIG. 14 illustrates a general configuration for a skin checker as a sixth embodiment of the present invention. The skin checker of this embodiment includes a light source Ls to illuminate an object Ob, an image capture device A, a display section Y to display an image shot, and a control section which controls all of these.
  • In the image capture device A, the lens optical system includes an optical element Sa including a split polarizer and split color filters, which is a difference from the image capture device of the second embodiment. FIG. 15 illustrates a configuration for the optical element Sa to be arranged in the vicinity of the stop of the lens optical system. FIG. 15(a) is a side view. The optical element Sa includes the split polarizer Sp that has already been described for the second embodiment and split color filters Sc adjacent to the split polarizer Sp. FIG. 15(b-1) is a front view of the split color filters Sc. There are four optical regions D1 to D4 which are arranged around the optical axis V0. A filter which mainly transmits light falling within the color red wavelength range is arranged in the optical region D1. A filter which mainly transmits light falling within the color green wavelength range is arranged in the optical region D2. And a filter which mainly transmits light falling within the color blue wavelength range is arranged in the optical regions D3 and D4. FIG. 15(b-2) is a front view of the split polarizer Sp. There are four optical regions D1 to D4 which are arranged around the optical axis V0. The polarized light transmission characteristics of these regions can be adjusted independently of each other by control means (not shown). For example, the split polarizer Sp may be configured so that polarizers with different polarized light transmission characteristics (such as the directions of their polarization axes) are readily attachable to and removable from these optical regions D1 to D4.
  • A polarizer is arranged at the light source Ls to make the light source Ls emit mainly light with a predetermined polarization direction.
  • The light reflected from skin is a mixture of components of the light that has been reflected from the surface of the skin and components of the light that has been reflected from inside of the skin and that is affected by scattering. Of these two kinds of light, the light reflected from the surface of the skin comes back while keeping the polarization direction of the light source. On the other hand, the light that has been reflected from inside of the skin and affected by scattering no longer keeps the polarization direction of the light source.
  • In observing skin, if the surface wrinkles or texture of the skin needs to be checked, then the skin observation may be carried out with the polarization directions of the illumination source and the shooting optical system matched to each other by taking advantage of such a characteristic. On the other hand, if spots under the skin surface need to be checked, then the shooting session may be carried out with the polarization directions of the illumination source and the shooting optical system left different from each other.
  • In this case, when spots are going to be observed, the shorter the wavelength range, the better. That is why it is recommended that the spots be observed with light falling within the color blue wavelength range. Likewise, wrinkles and surface texture can also be observed better with light falling within the color blue wavelength range. That is why the object Ob is suitably shot with light falling within the color blue wavelength range and having two different pieces of polarization information. Thus, in the split polarizer Sp shown in FIG. 15, the optical region D3 of the split polarizer Sp corresponding to the optical region D3 of the split color filters Sc which mainly transmits light falling within the color blue wavelength range has such a polarization property as to mainly transmit polarized light, of which the polarization axis is substantially parallel to that of polarized light mainly emitted from the light source Ls. Likewise, the optical region D4 of the split polarizer Sp corresponding to the optical region D4 which mainly transmits light falling within the color blue wavelength range has such a polarization property as to mainly transmit polarized light, of which the polarization axis intersects at substantially right angles with that of polarized light mainly emitted from the light source Ls. On the other hand, the optical regions D1 and D2 of the split polarizer Sp corresponding to the optical regions D1 and D2 of the split color filters Sc that mainly transmit light falling within the colors green and red wavelength ranges, respectively, have such a polarization property as to mainly transmit polarized light, of which the polarization axis defines a tilt angle of 45 degrees with respect to the polarization axis of the polarized light mainly emitted from the light source Ls.
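For reference, the region-by-region assignment just described can be summarized as a small table; the numeric angles (0°, 45° and 90°, measured from the polarization axis of the light source Ls) are merely a convenient parameterization of "substantially parallel", "45 degrees" and "substantially right angles".

```python
# Illustrative summary of the split color filters Sc and split polarizer Sp
# of FIG. 15 (angles relative to the polarization axis of the light source Ls).
REGIONS = {
    "D1": {"color": "red",   "pol_axis_deg": 45},  # mixed surface/subsurface light
    "D2": {"color": "green", "pol_axis_deg": 45},  # mixed surface/subsurface light
    "D3": {"color": "blue",  "pol_axis_deg": 0},   # parallel: wrinkles and texture
    "D4": {"color": "blue",  "pol_axis_deg": 90},  # crossed: spots under the surface
}
```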
  • Consequently, an image which allows the viewer to observe the skin wrinkles and texture easily can be obtained based on the light that has been transmitted through the optical region D3 of the split polarizer Sp. On the other hand, an image which allows the viewer to observe the skin spots more easily can be obtained based on the light that has been transmitted through the optical region D4 of the split polarizer Sp.
  • Furthermore, components of light reflected from the surface of the skin and components of light reflected from inside of the skin and affected by scattering can be both obtained based on the light that has been transmitted through the optical regions D1 and D2 of the split polarizer Sp.
  • It should be noted that the polarization axis of the optical regions D1 and D2 does not have to define a tilt angle of 45 degrees with respect to the polarization axis of the light emitted from the light source Ls. Rather, the direction of the polarization axis of the optical regions D1 and D2 may be adjusted appropriately depending on how the region of interest is irradiated with the light. For example, in a general shooting situation, there will be not only the light source of the skin checker but also other kinds of environmental light such as room light and sunlight. That is why if the direct reflection of such environmental light is reduced by appropriately adjusting the polarization axis direction of the shooting optical system, the shooting condition can be further improved.
  • By synthesizing together the images produced by the light beams that have been transmitted through the optical regions D1 and D2 and that fall within the colors green and red wavelength ranges and the image produced by the light beams that have been transmitted through the optical regions D3 and D4 and that fall within the color blue wavelength range, a color observed image representing the skin can be obtained. These images are synthesized together by the signal processing section C of the image capture device A (see FIG. 1).
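A minimal sketch of that synthesis, assuming the region images have already been extracted, registered on a common pixel grid and normalized to the [0, 1] range; picking the D3 image (rather than D4) as the blue plane and the absence of any exposure equalization between regions are simplifying assumptions.

```python
import numpy as np

def synthesize_color(red_from_d1, green_from_d2, blue_from_d3):
    """Stack the red, green and blue region images into one RGB frame,
    as the signal processing section C is described as doing."""
    rgb = np.stack([red_from_d1, green_from_d2, blue_from_d3], axis=-1)
    return np.clip(rgb, 0.0, 1.0)
```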
  • The image that has been shot by the image capture device A is subjected to appropriate image processing by the signal processing section C and then presented on the display section Y. In this case, if the signal processing section C performs horizontally inverting processing on the image, a mirror image of the object Ob is presented on the display section Y with its right and left portions inverted. By displaying such an inverted image when a person who is the object is observing his or her own skin, the display section can function as a mirror. As a result, the skin checker of this embodiment can be used as a sort of electronic mirror that can display spots, wrinkles and so on effectively. The skin checker suitably performs such a display operation just like a mirror does, because the user can then recognize his or her spots or wrinkles intuitively while doing makeup or skincare.
  • As described above, by adopting the configuration of this embodiment, a skin checker which allows the user to observe his or her own skin spots, wrinkles and texture efficiently and which can obtain a color skin image at the same time is realized.
  • The image capture device of this embodiment can appropriately adjust the directions of the polarization axes of the respective optical regions D1 through D4 of the split polarizer Sp. Therefore, the light source Ls can be arranged appropriately with the influence of environmental light taken into account, and the polarization directions of the split polarizer Sp can be set according to the position of the light source before shooting actually starts. As a result, an image can be shot with the influence of environmental light further reduced.
  • Also, in a shooting environment that is seriously affected by environmental light, the influence of the environmental light can be reduced by carrying out shooting synchronously with flickering of the light source Ls. For example, in that case, the difference between an image captured with the light source Ls turned ON and an image captured with the light source Ls turned OFF may be calculated, and spots or wrinkles may be checked using the differential image.
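That differential operation can be sketched as a per-pixel subtraction of the two synchronized frames; clamping negative residuals to zero is an assumption, made here because only the light contributed by Ls is of interest.

```python
import numpy as np

def ambient_rejected(frame_on, frame_off):
    """Subtract the frame captured with the light source Ls OFF from the
    frame captured with it ON; ambient light common to both frames cancels,
    leaving (approximately) only the illumination contributed by Ls."""
    diff = frame_on.astype(float) - frame_off.astype(float)
    return np.clip(diff, 0.0, None)
```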
  • In the embodiment described above, of the four optical regions D1 to D4 of the split color filters Sc, the optical regions D3 and D4 are supposed to transmit a light beam falling within the color blue wavelength range. However, the wavelength range of the light beam to be transmitted through the two optical regions does not have to be the color blue wavelength range. For example, two optical regions may transmit a light beam falling within the color green wavelength range, and the other two optical regions may transmit a light beam falling within the color blue wavelength range and a light beam falling within the color red wavelength range, respectively. Such a configuration may be adopted if, considering the spectral distribution of the light source Ls and the spectral sensitivity of the photodiodes, it is more advantageous to use a light beam falling within the color green wavelength range rather than a light beam falling within the color blue wavelength range in order to observe skin spots, wrinkles or texture.
  • Other Embodiments
  • In the embodiments described above, the lens optical system L is supposed to be a single lens. However, the lens optical system may include a compound lens which is a combination of multiple lenses. By using such a compound lens, the optical system can be designed with an increased degree of freedom, and an image with a high resolution can be obtained, which is beneficial.
  • Optionally, to allow the array of optical elements to split incoming light into multiple light rays as intended, the lens optical system may have image-space telecentricity. However, even if the lens optical system does not have image-space telecentricity, the incoming light can also be split into multiple light rays just as intended by appropriately adjusting one period of the array of optical elements (such as a lenticular lens or a micro lens array) arranged in front of the image sensor according to the angle of emittance of an off-axis principal ray of the lens optical system.
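As a hedged illustration of that adjustment (the formula below is a standard pinhole-model approximation, not one given in this disclosure): if the exit pupil sits at a distance L from the image plane and the element array at a small gap d in front of the pixels, the chief ray reaching image height h crosses the array plane at height h·(L − d)/L, so the array period is scaled by (L − d)/L relative to the pixel-matched period.

```python
def adjusted_period(pixel_pitch, pixels_per_element, exit_pupil_distance, gap):
    """Approximate period of the lenticular lens / micro lens array for a
    lens optical system that is not image-space telecentric (pinhole model).
    With exit_pupil_distance -> infinity this reduces to the telecentric
    value pixel_pitch * pixels_per_element."""
    nominal = pixel_pitch * pixels_per_element
    return nominal * (exit_pupil_distance - gap) / exit_pupil_distance

# Example with assumed numbers: 1.5 um pixels, 2 pixels per period,
# exit pupil 5 mm from the sensor, 3 um gap between the array and the pixels.
print(adjusted_period(1.5e-6, 2, 5e-3, 3e-6))   # ~2.998 um, slightly under 3 um
```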
  • In the embodiment described above, the image capture device is supposed to include a signal processing section C. However, an image capture device according to the present invention does not have to include the signal processing section C. In that case, the output signal of the image sensor may be transmitted to an external device such as a personal computer so that the arithmetic processing that should have been done by the signal processing section C is carried out by the external device instead. That is to say, the present invention may be implemented as a system including the image capture device with the lens optical system L, the array of optical elements K and the image sensor N and an external signal processor.
  • INDUSTRIAL APPLICABILITY
  • An image capture device according to the present disclosure can be used effectively as an industrial camera such as a product inspecting camera, a surveillance camera, or an image input camera for information terminals or robots. In addition, the image capture device of the present disclosure may also be used in a digital still camera or a digital camcorder.
  • REFERENCE SIGNS LIST
    • A image capture device
    • L lens optical system
    • L1 objective lens
    • Ls light source
    • V0 optical axis of lens optical system L
    • D1, D2, D3, D4 optical region
    • S stop
    • Sp split polarizer
    • K array of optical elements
    • M, M1, M2 optical element
    • N image sensor
    • Ni image capturing plane
    • Ms micro lens
    • Ob object
    • P1 to P4 pixel
    • C signal processing section
    • V control section
    • U drive mechanism
    • W liquid crystal element
    • EC common transparent electrode
    • ED1, ED2 divided transparent electrode
    • LC liquid crystal layer
    • PL polarizer
    • SB1, SB2 substrate
    • H1, H2 glass substrate
    • J seal member
    • T1, T2 alignment film
    • P1A to P4A pixel
    • P1B to P4B pixel
    • Y display section

Claims (23)

1. An image capture device comprising:
a lens optical system;
an image sensor on which light that has passed through the lens optical system is incident and which includes at least a plurality of first pixels and a plurality of second pixels; and
an array of optical elements which is arranged between the lens optical system and the image sensor and which includes a plurality of optical elements, each having a lens surface,
wherein the lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis, and
each of the optical elements that form the array of optical elements makes the light that has passed through the first optical region incident on the plurality of first pixels and also makes the light that has passed through the second optical region incident on the plurality of second pixels.
2. The image capture device of claim 1, wherein the image sensor is a monochrome image sensor.
3. The image capture device of claim 1, wherein the lens optical system is an image-space telecentric optical system.
4. The image capture device of claim 1, wherein the lens optical system includes a split polarizer having first and second polarizing portions which are located in the first and second regions, respectively.
5. The image capture device of claim 1, wherein the optical elements that form the array of optical elements are a lenticular lens.
6. The image capture device of claim 5, wherein in the image sensor, a number of the first pixels and a number of the second pixels are arranged in a first direction and the first pixels arranged in the first direction and the second pixels arranged in the first direction alternate with each other in a second direction that intersects with the first direction at right angles, thus forming an image capturing plane.
7. The image capture device of claim 4, wherein the lens optical system further has a third optical region which transmits mostly light vibrating in the direction of a third polarization axis and a fourth optical region which transmits mostly light vibrating in the direction of a fourth polarization axis, and
the split polarizer further has third and fourth polarizing portions which are located in the third and fourth regions, respectively.
8. The image capture device of claim 7, wherein the optical elements that form the array of optical elements are a micro lens array.
9. The image capture device of claim 1, further comprising a polarization direction changing section which changes the direction of at least one of the first and second polarization axes of the first and second optical regions.
10. The image capture device of claim 1, wherein the lens optical system includes a split polarizer with at least three polarizing portions, two adjacent ones of which have polarization axes in mutually different directions, and
the image capture device further includes a drive mechanism which drives the split polarizer so that any two adjacent ones of the at least three polarizing portions of the split polarizer are located in the first and second regions.
11. The image capture device of claim 10, wherein the split polarizer includes a common transparent electrode, two divided transparent electrodes which are located in the first and second optical regions, respectively, a liquid crystal layer which is interposed between the common transparent electrode and the two divided transparent electrodes, and a control section which applies mutually different voltages to the two divided transparent electrodes.
12. (canceled)
13. The image capture device of claim 1, wherein the plurality of first pixels includes a number of 1A pixels with filters having a first spectral transmittance characteristic, a number of 2A pixels with filters having a second spectral transmittance characteristic, a number of 3A pixels with filters having a third spectral transmittance characteristic, and a number of 4A pixels with filters having a fourth spectral transmittance characteristic,
the plurality of second pixels includes a number of 1B pixels with filters having the first spectral transmittance characteristic, a number of 2B pixels with filters having the second spectral transmittance characteristic, a number of 3B pixels with filters having the third spectral transmittance characteristic, and a number of 4B pixels with filters having the fourth spectral transmittance characteristic, and
the array of optical elements includes:
a plurality of first optical elements which makes light that has passed through the first optical region incident on the 1A and 3A pixels and which also makes light that has passed through the second region incident on the 2B and 4B pixels; and
a plurality of second optical elements which makes light that has passed through the first region incident on the 2A and 4A pixels and which also makes light that has passed through the second region incident on the 1B and 3B pixels.
14. The image capture device of claim 13, wherein on the image capturing plane of the image sensor, the 1A, 2B, 3A and 4B pixels that form a single set are adjacent to each other and are arranged at the four vertices of a quadrangle,
wherein the filters having the first spectral transmittance characteristic and the filters having the second spectral transmittance characteristic transmit light falling within the wavelength range of the color green,
the filters having the third spectral transmittance characteristic transmit light falling within the wavelength range of the color red,
the filters having the fourth spectral transmittance characteristic transmit light falling within the wavelength range of the color blue, and
the 1A, 2B, 3A and 4B pixels that form a single set are arranged in a Bayer arrangement pattern.
15. (canceled)
16. The image capture device of claim 13, wherein the plurality of first optical elements and the plurality of second optical elements form a lenticular lens.
17. The image capture device of claim 1, wherein the lens optical system further includes a stop, and the first and second optical regions are located in the vicinity of the stop.
18. An image capture device comprising:
a lens optical system;
an image sensor which includes a plurality of first pixels with filters having a first spectral transmittance characteristic, a plurality of second pixels with filters having a second spectral transmittance characteristic, a plurality of third pixels with filters having a third spectral transmittance characteristic, and a plurality of fourth pixels with filters having a fourth spectral transmittance characteristic and in which a first row where the first and second pixels are arranged alternately in a first direction and a second row where the third and fourth pixels are arranged alternately in the first direction alternate in a second direction, thereby forming an image capturing plane, wherein light that has passed through the lens optical system is incident on the first, second, third and fourth pixels; and
an array of optical elements which is arranged between the lens optical system and the image sensor,
wherein the lens optical system has a first optical region which transmits mostly light vibrating in the direction of a first polarization axis and a second optical region which transmits mostly light vibrating in the direction of a second polarization axis that is different from the direction of the first polarization axis, the first and second optical regions being arranged in the second direction,
on the image capturing plane, the array of optical elements includes a plurality of optical elements, each of which makes the light that has been transmitted through the lens optical system incident on every four pixels, which are the first, second, third and fourth pixels that are arranged adjacent to each other in the first and second directions, and
the plurality of optical elements forms a number of columns, each of which is arranged linearly in the second direction, wherein on two columns that are adjacent to each other in the first direction, each optical element on one of the two columns is shifted from an associated optical element on the other column in the second direction by a length corresponding to a half of one arrangement period of the optical elements.
19. A biometric image capturing system comprising:
the image capture device of claim 7; and
a light source which irradiates an object with polarized light.
20. The biometric image capturing system of claim 19, wherein the lens optical system of the image capture device includes split color filters to be arranged in the first to fourth optical regions,
the split color filters transmit light falling within the same wavelength range in two of those first through fourth optical regions, and
in the two optical regions, the polarization axes of the split polarizer are in mutually different directions.
21. (canceled)
22. The biometric image capturing system of claim 19, further comprising a control section which controls the light source and the image capture device,
wherein the control section controls the image capture device so that the image capture device captures multiple images synchronously with flickering of the light source, and
the image capture device performs arithmetic processing between the multiple images to generate another image.
23. The biometric image capturing system of claim 19, further comprising a display section which displays an image that has been shot by the image capture device,
wherein the image capture device further includes a signal processing section, and
the signal processing section generates an image signal by inverting horizontally the image shot and outputs the image signal to the display section.
US14/112,799 2012-02-02 2013-02-01 Imaging device Abandoned US20140055664A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/361,779 US10247866B2 (en) 2012-02-02 2016-11-28 Imaging device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2012021026 2012-02-02
JP2012-011026 2012-02-02
PCT/JP2013/000563 WO2013114888A1 (en) 2012-02-02 2013-02-01 Imaging device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/000563 A-371-Of-International WO2013114888A1 (en) 2012-02-02 2013-02-01 Imaging device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/361,779 Continuation US10247866B2 (en) 2012-02-02 2016-11-28 Imaging device

Publications (1)

Publication Number Publication Date
US20140055664A1 true US20140055664A1 (en) 2014-02-27

Family

ID=48904932

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/112,799 Abandoned US20140055664A1 (en) 2012-02-02 2013-02-01 Imaging device
US15/361,779 Active US10247866B2 (en) 2012-02-02 2016-11-28 Imaging device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/361,779 Active US10247866B2 (en) 2012-02-02 2016-11-28 Imaging device

Country Status (3)

Country Link
US (2) US20140055664A1 (en)
JP (1) JP5906464B2 (en)
WO (1) WO2013114888A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9057896B1 (en) * 2012-12-20 2015-06-16 Amazon Technologies, Inc. Orientation variable polarization
US20150381871A1 (en) * 2014-06-25 2015-12-31 Canon Kabushiki Kaisha Image capturing apparatus
WO2016169862A1 (en) * 2015-04-20 2016-10-27 Rodenstock Gmbh Method for calibrating a polarization axis measuring device and method for determining polarization axes of spectacle lenses for a polarization axis measuring device
EP3255886A3 (en) * 2016-06-07 2018-02-28 Goodrich Corporation Imaging systems and methods
US20180336655A1 (en) * 2016-02-29 2018-11-22 Fujitsu Frontech Limited Imaging device and imaging method
US10247866B2 (en) 2012-02-02 2019-04-02 Panasonic Intellectual Property Management Co., Ltd. Imaging device
WO2020004837A1 (en) * 2018-06-25 2020-01-02 사이정보통신(주) Automatic polarization control device and method
EP3627133A1 (en) * 2018-09-20 2020-03-25 MEI S.r.l. Polarizing filter, apparatus and method for determining an orientation of a lens polarization axis of a polarized lens
EP3796081A1 (en) * 2019-09-18 2021-03-24 Kabushiki Kaisha Toshiba Optical imaging apparatus, robot hand, moving body, and lidar apparatus
CN114822245A (en) * 2022-03-31 2022-07-29 联想(北京)有限公司 Electronic device

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013051519A (en) 2011-08-30 2013-03-14 Sony Corp Information processing apparatus, information processing method, program, and information processing system
JP6671872B2 (en) * 2015-06-22 2020-03-25 キヤノン株式会社 Adapter device, imaging device, and imaging system
JPWO2020085149A1 (en) * 2018-10-22 2021-09-16 富士フイルム株式会社 Imaging device and imaging method
JP7261192B2 (en) * 2020-02-27 2023-04-19 富士フイルム株式会社 LENS DEVICE, IMAGING DEVICE, IMAGING METHOD, AND IMAGING PROGRAM
CN111447423A (en) * 2020-03-25 2020-07-24 浙江大华技术股份有限公司 Image sensor, imaging apparatus, and image processing method

Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5076687A (en) * 1990-08-28 1991-12-31 Massachusetts Institute Of Technology Optical ranging apparatus
US20010033326A1 (en) * 1999-02-25 2001-10-25 Goldstein Michael D. Optical device
US20020054208A1 (en) * 1999-02-25 2002-05-09 Envision Advanced Medical Systems Ltd. Optical device
US20020154215A1 (en) * 1999-02-25 2002-10-24 Envision Advance Medical Systems Ltd. Optical device
US20060209292A1 (en) * 2004-09-14 2006-09-21 Dowski Edward R Jr Low height imaging system and associated methods
US20080291311A1 (en) * 2007-04-11 2008-11-27 Nikon Corporation Image pickup device, focus detection device, image pickup apparatus, method for manufacturing image pickup device, method for manufacturing focus detection device, and method for manufacturing image pickup apparatus
US20090167922A1 (en) * 2005-01-18 2009-07-02 Perlman Stephen G Apparatus and method for capturing still images and video using coded lens imaging techniques
US20090278966A1 (en) * 2006-09-14 2009-11-12 Nikon Corporation Image sensor and imaging apparatus
US20100171854A1 (en) * 2009-01-08 2010-07-08 Sony Corporation Solid-state imaging device
US20100303344A1 (en) * 2008-07-08 2010-12-02 Satoshi Sato Method, apparatus and program for image processing and method and apparatus for image synthesizing
US20110033177A1 (en) * 2009-08-06 2011-02-10 Yoshihiko Kuroki Imaging Device and Video Recording/Reproducing System
US20110316983A1 (en) * 2010-01-05 2011-12-29 Panasonic Corporation Three-dimensional image capture device
US20120112037A1 (en) * 2010-05-11 2012-05-10 Panasonic Corporation Three-dimensional imaging device
US20120212587A1 (en) * 2011-02-17 2012-08-23 Sony Corporation Imaging apparatus, image processing method, and program
US20130027557A1 (en) * 2011-07-29 2013-01-31 Ricoh Company, Ltd. Detection apparatus and method
US20130063569A1 (en) * 2010-05-28 2013-03-14 Sony Corporation Image-capturing apparatus and image-capturing method
US20130070146A1 (en) * 2011-03-07 2013-03-21 Panasonic Corporation Image pickup device and rangefinder device
US20130075585A1 (en) * 2011-09-27 2013-03-28 Kabushiki Kaisha Toshiba Solid imaging device
US20130083172A1 (en) * 2011-09-30 2013-04-04 Sony Corporation Imaging apparatus and imaging method
US20130136306A1 (en) * 2010-07-01 2013-05-30 Xue Li Object identification device
US20130141634A1 (en) * 2011-06-27 2013-06-06 Tsuguhiro Korenaga Imaging device
US20130145608A1 (en) * 2010-12-10 2013-06-13 Panasonic Corporation Method for designing and method for manufacturing diffraction-grating lens
US20130188023A1 (en) * 2012-01-23 2013-07-25 Omnivision Technologies, Inc. Image sensor with optical filters having alternating polarization for 3d imaging
US20130215299A1 (en) * 2011-06-23 2013-08-22 Panasonic Corporation Imaging apparatus
US20130235256A1 (en) * 2010-11-16 2013-09-12 Kenichi Kodama Multiband camera, and multiband image capturing method
US20130270421A1 (en) * 2011-09-02 2013-10-17 Panasonic Corporation Polarization image sensor and endoscope
US20130293704A1 (en) * 2011-11-30 2013-11-07 Panasonic Corporation Imaging apparatus
US20130329042A1 (en) * 2011-04-27 2013-12-12 Panasonic Corporation Image pick-up device, image pick-up system equipped with image pick-up device, and image pick-up method
US20130341493A1 (en) * 2011-11-30 2013-12-26 Panasonic Corporation Imaging device and imaging system
US20140028825A1 (en) * 2012-07-25 2014-01-30 Panasonic Corporation Imaging-observation apparatus
US20140055661A1 (en) * 2012-02-03 2014-02-27 Panasonic Corporation Imaging device and imaging system
US20140071247A1 (en) * 2012-02-03 2014-03-13 Panasonic Corporation Image pick-up device and distance measuring device
US20140092227A1 (en) * 2012-05-22 2014-04-03 Panasonic Corporation Image capturing processor and endoscope
US8768102B1 (en) * 2011-02-09 2014-07-01 Lytro, Inc. Downsampling light field images
US20150156478A1 (en) * 2012-08-06 2015-06-04 Fujifilm Corporation Imaging device
US20150212294A1 (en) * 2013-07-30 2015-07-30 Panasonic Intellectual Property Management Co., Ltd. Imaging apparatus, imaging system that includes imaging apparatus, electron mirror system that includes imaging apparatus, and ranging apparatus that includes imaging apparatus
US9239514B2 (en) * 2011-03-28 2016-01-19 Sony Corporation Imaging apparatus and electronic device for producing stereoscopic images

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0835928A (en) * 1994-07-20 1996-02-06 Unitec Res Kk Imaging apparatus
JP2004048702A (en) * 2002-05-17 2004-02-12 Canon Inc Stereoscopic image display device and stereoscopic image display system
JP2006254331A (en) 2005-03-14 2006-09-21 Fuji Photo Film Co Ltd Image processing method for detecting reflected optical component of object and apparatus executing this method
US7723662B2 (en) 2005-10-07 2010-05-25 The Board Of Trustees Of The Leland Stanford Junior University Microscopy arrangements and approaches
JP4857962B2 (en) * 2006-07-05 2012-01-18 株式会社ニコン Imaging device
JP4984300B2 (en) * 2007-03-30 2012-07-25 富士フイルム株式会社 Image correction apparatus, image correction method, and program
JP5610254B2 (en) 2008-06-18 2014-10-22 株式会社リコー Imaging apparatus and road surface state determination method
JP5472584B2 (en) 2008-11-21 2014-04-16 ソニー株式会社 Imaging device
JP2010154171A (en) * 2008-12-25 2010-07-08 Olympus Imaging Corp Imaging apparatus and imaging method
JP5433381B2 (en) * 2009-01-28 2014-03-05 合同会社IP Bridge1号 Intraoral measurement device and intraoral measurement method
JP2011038827A (en) * 2009-08-07 2011-02-24 Kitami Institute Of Technology Road surface condition detection method and road surface condition detector
JP5358368B2 (en) * 2009-09-18 2013-12-04 富士フイルム株式会社 Endoscope system
JP5289371B2 (en) * 2010-03-30 2013-09-11 富士フイルム株式会社 Endoscope device and processor device in endoscope system
JP5587057B2 (en) * 2010-06-29 2014-09-10 富士フイルム株式会社 Polarized image measurement device and polarized image measurement display system
JP5650055B2 (en) * 2011-05-27 2015-01-07 富士フイルム株式会社 Imaging device
WO2013114888A1 (en) 2012-02-02 2013-08-08 パナソニック株式会社 Imaging device

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10247866B2 (en) 2012-02-02 2019-04-02 Panasonic Intellectual Property Management Co., Ltd. Imaging device
US9057896B1 (en) * 2012-12-20 2015-06-16 Amazon Technologies, Inc. Orientation variable polarization
US20150381871A1 (en) * 2014-06-25 2015-12-31 Canon Kabushiki Kaisha Image capturing apparatus
US9690112B2 (en) * 2014-06-25 2017-06-27 Canon Kabushiki Kaisha Image capturing apparatus
WO2016169862A1 (en) * 2015-04-20 2016-10-27 Rodenstock Gmbh Method for calibrating a polarization axis measuring device and method for determining polarization axes of spectacle lenses for a polarization axis measuring device
US10161828B2 (en) 2015-04-20 2018-12-25 Rodenstock Gmbh Method for calibrating a polarisation axis measuring device and method for determining polarisation axes of spectacle lenses for a polarisation axis measuring device
US10970802B2 (en) 2016-02-29 2021-04-06 Fujitsu Frontech Limited Imaging device and imaging method selecting a pixel having a lowest brightness
US20180336655A1 (en) * 2016-02-29 2018-11-22 Fujitsu Frontech Limited Imaging device and imaging method
EP3255886A3 (en) * 2016-06-07 2018-02-28 Goodrich Corporation Imaging systems and methods
US10048413B2 (en) 2016-06-07 2018-08-14 Goodrich Corporation Imaging systems and methods
WO2020004837A1 (en) * 2018-06-25 2020-01-02 사이정보통신(주) Automatic polarization control device and method
US11880046B2 (en) 2018-06-25 2024-01-23 Sai Technologies Corp. Automatic polarization control device and method
EP3627133A1 (en) * 2018-09-20 2020-03-25 MEI S.r.l. Polarizing filter, apparatus and method for determining an orientation of a lens polarization axis of a polarized lens
WO2020058022A1 (en) * 2018-09-20 2020-03-26 Mei S.R.L. Polarizing filter
US11835416B2 (en) 2018-09-20 2023-12-05 Mei S.R.L. Polarizing filter
EP3796081A1 (en) * 2019-09-18 2021-03-24 Kabushiki Kaisha Toshiba Optical imaging apparatus, robot hand, moving body, and lidar apparatus
CN114822245A (en) * 2022-03-31 2022-07-29 联想(北京)有限公司 Electronic device

Also Published As

Publication number Publication date
US10247866B2 (en) 2019-04-02
JPWO2013114888A1 (en) 2015-05-11
WO2013114888A1 (en) 2013-08-08
JP5906464B2 (en) 2016-04-20
US20170075050A1 (en) 2017-03-16

Similar Documents

Publication Publication Date Title
US10247866B2 (en) Imaging device
US9658463B2 (en) Imaging device and imaging system
US10606031B2 (en) Imaging apparatus, imaging system that includes imaging apparatus, electron mirror system that includes imaging apparatus, and ranging apparatus that includes imaging apparatus
US11496693B2 (en) Endoscope system with high dynamic range image capture using image sensor with polarization
US9253414B2 (en) Imaging-observation apparatus
US20110292258A1 (en) Two sensor imaging systems
JP5053468B2 (en) Stereoscopic image capturing apparatus and endoscope
US10264953B2 (en) Imaging apparatus
JP4971532B1 (en) Stereoscopic image capturing apparatus and endoscope
CN103052914A (en) Three-dimensional image pickup apparatus
JP5740559B2 (en) Image processing apparatus and endoscope
US8279269B2 (en) Mobile information kiosk with a three-dimensional imaging effect
JP5891403B2 (en) Imaging device
JP2004187248A (en) Television camera apparatus for photographing skin
JP2015166723A (en) Imaging device and imaging system

Legal Events

Date Code Title Description
AS Assignment

Owner name: PANASONIC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YAMAGATA, MICHIHIRO;KORENAGA, TSUGUHIRO;IMAMURA, NORIHIRO;AND OTHERS;SIGNING DATES FROM 20130827 TO 20130828;REEL/FRAME:032261/0293

AS Assignment

Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:034194/0143

Effective date: 20141110

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT CO., LTD., JAPAN

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ERRONEOUSLY FILED APPLICATION NUMBERS 13/384239, 13/498734, 14/116681 AND 14/301144 PREVIOUSLY RECORDED ON REEL 034194 FRAME 0143. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:PANASONIC CORPORATION;REEL/FRAME:056788/0362

Effective date: 20141110