WO2006102201A1 - Imaging systems with pixelated spatial light modulators - Google Patents

Imaging systems with pixelated spatial light modulators

Info

Publication number
WO2006102201A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging system
spatial light
light modulator
microscope
control unit
Prior art date
Application number
PCT/US2006/009958
Other languages
French (fr)
Other versions
WO2006102201A8 (en)
Inventor
Wade Thomas Cathey, Jr.
Carol Jean Cogswell
Edward Raymond Dowski, Jr.
Original Assignee
Cdm Optics, Inc.
Priority date
Filing date
Publication date
Application filed by Cdm Optics, Inc.
Publication of WO2006102201A1
Publication of WO2006102201A8

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0025: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for optical correction, e.g. distorsion, aberration
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00: Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/06: Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the phase of light
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00: Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/02: Details
    • G01J3/0205: Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows
    • G01J3/0229: Optical elements not provided otherwise, e.g. optical manifolds, diffusers, windows using masks, aperture plates, spatial light modulators or spatial filters, e.g. reflective filters

Definitions

  • Limited depth of field may present an asset, a liability, or both, in various applications of optics. In imaging systems, for example, optics are used to produce an image of an object. These optics may limit the depth of field of the imaging system so that only a specific part of the object is in focus. Certain applications such as microscopy and cinematography may, at times, advantageously utilize reduced depth of field to emphasize an image at one plane of focus, thereby de-emphasizing objects that are out of focus. At other times and in other applications, increased depth of field is desirable.
  • Depth of field may be controlled by utilizing insertable filters; however, insertion and withdrawal of such filters from an optical path can present opportunities for mechanical damage, contamination, and/or introduction of undesirable aberrations (e.g., misfocus) within the optical path. Additionally, in some applications, aberrations may occur as a function of the object being imaged. For example, in microscopy, certain samples themselves produce aberrations (e.g., spherical aberrations) in images formed by microscopes; correcting such aberrations is difficult since they are sample dependent.
  • an improvement to an imaging system including a pixelated spatial light modulator for selectively modifying optical attributes of an image generated by the imaging system.
  • a microscope includes a control unit, and a pixelated spatial light modulator that is responsive to the control unit, for selectively modifying optical attributes of an image generated by the microscope.
  • an imaging system includes a pixelated spatial light modulator for selectively modifying optical attributes (such as overall focus, wavefront coding, depth of field, correction for specimen-induced spherical aberration, and correction for aberration induced by an objective lens of the system) of images.
  • the spatial light modulator may connect with a control unit that controls phase delays imparted by pixels of the spatial light modulator, and may assert stored signal patterns to configure the spatial light modulator to change the optical attributes.
  • the control unit may also include an image processor and image storage, to post process the images, for example.
  • an attachment system adjusts optical attributes of a microscope.
  • the attachment system includes a pixelated spatial light modulator and a control unit.
  • the spatial light modulator responds to signals from the control unit to selectively modify phase of a wavefront of the microscope, to modify the optical attributes.
  • a microscopy method provides a control unit that stores a plurality of signal patterns, and applies one of the plurality of signal patterns to a pixelated spatial light modulator of an imaging system to adjust an optical attribute of the imaging system.
  • an imaging method includes receiving a user input at a user interface of a control unit that stores signal patterns, and applying one of the signal patterns to a pixelated spatial light modulator of an imaging system.
  • the spatial light modulator adjusts one of (a) a focal length, (b) an optical tilt and (c) a depth of field of the imaging system in response to the user input.
  • an imaging method includes applying first and second sets of signals from a control unit to a pixelated spatial light modulator.
  • the first and second sets of signals impart first and second optical tilts, respectively, to phase modulation patterns within an imaging system.
  • the method captures two images, each image using one of the first and second optical tilts, and processes the images to extract depth information.
  • the processing may, for example, extract a three dimensional depth map.
  • the method may also include utilizing the depth information to calculate a spherical aberration correction.
  • an imaging method provides information of a depth of an object, calculates a signal pattern for a pixelated spatial light modulator based on the information, and applies the signal pattern to the spatial light modulator to correct spherical aberration caused by the subject.
  • a software product includes instructions stored in computer readable media.
  • the instructions when executed by a computer, perform steps for adjusting optical attributes of an imaging system.
  • the instructions include instructions for receiving a user input at a user interface of a control unit that stores signal patterns, and instructions for applying one of the signal patterns to a pixelated spatial light modulator of an imaging system.
  • the spatial light modulator adjusts one of (a) a focal length, (b) an optical tilt and (c) a depth of field of the imaging system in response to the user input.
  • a computer readable medium contains a data structure that stores signal patterns for a pixelated spatial light modulator of an imaging system. At least one of the signal patterns configures the spatial light modulator to introduce one of a focal length change, an optical tilt, and a depth of field change in the imaging system.
  • FIG. 1 shows a schematic diagram of an imaging system with a pixelated spatial light modulator, in accord with one embodiment.
  • FIG. 2 shows a schematic diagram of a differential interference contrast imaging system with a pixelated spatial light modulator, in accord with one embodiment.
  • FIG. 3 and FIG. 4 show schematic diagrams of exemplary pixelated spatial light modulators.
  • FIG. 5 shows a flow chart illustrating a process for operating an imaging system that includes a pixelated spatial light modulator with a control unit that stores signal patterns.
  • FIG. 6 shows a flow chart illustrating a process for extracting depth information of a sample, in accord with an embodiment.
  • FIG. 7 shows a flow chart illustrating a process for utilizing the imaging system of FIG. 1 to correct for sample-induced spherical aberration.
  • FIG. 8 shows a schematic diagram illustrating an embodiment of a pixelated spatial light modulator suitable for use in imaging systems in accordance with the present disclosure.
  • FIG. 9 shows a schematic diagram illustrating an exemplary situation in which objects at varying distances may be imaged using one imaging system employing a pixelated spatial light modulator.
  • FIGS. 10A and 10B are schematic diagrams illustrating a side view and a top view, respectively, of a vehicle having imaging systems with pixelated spatial light modulators.
  • FIG. 1 shows a schematic diagram of an imaging system 10(1) with a pixelated spatial light modulator ("SLM") 60(1).
  • electromagnetic energy (e.g., light rays) 30 from an object 20(1) enters a microscope objective 40(1) that has optics (e.g., lenses) 50(1) and 50(2), and propagates to a back aperture plane 45(1) of objective 40(1).
  • Lenses 50(1) and 50(2) are exemplary only; it will be appreciated that an imaging system objective may have more or fewer lenses and/or other optical components.
  • SLM 60(1) at or near back aperture plane 45(1), or a conjugate plane thereof, modulates phase and/or intensity of electromagnetic energy 30 passing through back aperture plane 45(1).
  • SLM 60(1) includes an array of SLM pixels (not shown); each pixel may be individually controlled to vary phase of a wavefront passing therethrough (in an embodiment, amplitude of the wavefront is not intentionally varied, although it is understood that incidental amplitude variation may occur due to scattering or absorption from surfaces and structure).
  • a processing and control unit 80(1) controls each pixel in SLM 60(1), by signals from control unit 80(1) to SLM 60(1), through a communication path 70(1).
  • Control unit 80(1) includes a processor 81 and a user interface 82 that receives input from a user.
  • Processor 81 may operate under the control of software 84, for example; or internal firmware may be used to provide like function.
  • control unit 80(1) includes SLM signal patterns, stored in a data structure of signal pattern storage 85, to vary optical properties of SLM 60(1).
  • Each SLM signal pattern assigns a signal (e.g., a voltage) to each pixel in SLM 60(1); each such signal pattern may thus provide specific optical attribute changes such as discussed below.
  • a lens 90(1) focuses energy 30 at an image plane 100(1).
  • imaging system 10(1) optionally includes an image detector 110(1) that detects electromagnetic energy 30(1) to form image data.
  • a communication path 120(1) may send the image data to control unit 80(1), which stores image data in image storage 86.
  • An image processor 88 within control unit 80(1) may be used to post-process stored image data; or, image post processing may be done by processor 81 under control of software 84, for example.
  • Image processing parameters may also be stored in the data structure of signal pattern storage 85, so that parameters required for post processing of images formed with certain signal patterns (e.g., parameters required to remove blurs introduced by wavefront coding) may be associated with the signal patterns.
  • SLM 60(1) enables control of optical attributes of imaging system 10(1). For example, if SLM 60(1) appropriately changes phase of electromagnetic energy 30 across back aperture plane 45(1), the optical effect created is equivalent to changing a focal length of objective 40(1). This alters a front conjugate focal distance d1 of objective 40(1), changing a plane within object 20(1) that is in focus at image plane 100(1). Thus, a user of imaging system 10(1) may control signals from control unit 80(1) to implement phase changes in electromagnetic energy 30, affecting overall focus of electromagnetic energy 30 at image plane 100(1), without physically moving either object 20(1) or a component of imaging system 10(1).
  • software 84 of control unit 80(1) configures control unit 80(1) to selectively apply one of a plurality of SLM signal patterns that are stored in signal pattern storage 85.
  • Each signal pattern corresponds to a specific focus change to be implemented by SLM 60(1), so that a user need only select one signal pattern (using, for example, user interface 82) to achieve a desired focus. Altering focus without physically moving the sample may be useful, for example, in imaging systems wherein the mechanical complexity, mechanical positioning errors and delays associated with mechanical focusing are undesirable.
  • SLM 60(1) may alter phase of a wavefront formed by electromagnetic energy 30(1) to change other optical attributes.
  • Optical tilt is for example one such attribute.
  • Another such attribute may be phase alteration ("wavefront coding") that makes an optical transfer function ("OTF") of an optical imaging system substantially invariant to misfocus- related aberrations as compared to an OTF of a corresponding optical imaging system without the phase alteration.
  • Wavefront coding may, for example, extend the depth of field of an imaging system without a change of aperture (i.e., without "stopping down" or blocking an aperture). Useful information about wavefront coding may be found, for example, in U.S. Patent No. 5,748,371, which is incorporated herein by reference.
  • pre-set SLM signal patterns include one or more signal patterns that correspond with specific phase alterations to be implemented by SLM 60(1), so that a user need only select a signal pattern (using, for example, user interface 82) to achieve a desired wavefront coding effect.
  • a user may enter commands into user interface 82 to control signals from control unit 80(1) such that, for example: (a) SLM 60(1) does not modulate phase and imaging system 10(1) performs as a standard imaging system; (b) SLM 60(1) implements wavefront coding and imaging system 10(1) has extended depth of field; or (c) SLM 60(1) customizes the depth of field, for example to adjust depth of field to the total depth of a sample.
  • Wavefront coding using SLM 60(1) may, for example, eliminate a need to insert and retract wavefront coding filters from an optical path of imaging system 10(1), thus minimizing exposure of optical components to handling damage and/or contamination. Nonetheless, imaging system 10(1) using SLM 60 may also use wavefront coding filters and/or other phase filters, for example near to plane 45(1) (or a conjugate plane thereof), to provide a first optical attribute (e.g., a focus change, a spherical aberration correction or wavefront coding) while SLM 60(1) provides a second optical attribute.
  • imaging system 10(1) may be useful, for example, when imaging system 10(1) is designed without mechanical focusing capability (e.g., a phase filter may provide a coarse focus correction while SLM 60(1) provides a fine focus correction, a spherical aberration correction and/or wavefront coding).
  • Signal pattern storage 85 may store signal patterns that correspond to SLM phase alterations which minimize spherical aberration. For example, a user may select a signal pattern (using, for example, user interface 82) to change performance of imaging system 10(1) so as to minimize spherical aberration. This may be particularly useful for thick specimens that otherwise introduce spherical aberration. Spherical aberration (and/or other misfocus related aberrations) introduced by objective lenses or other optical elements of imaging system 10(1) may also be minimized by utilizing SLM 60(1).
  • One application that may benefit from extended depth of field and/or reduced focusing time is drug discovery, in which microscope slides with many small sample chambers are evaluated. Each chamber includes cells that may vary in thickness or depth within the chamber. Evaluating such slides requires high resolution imaging of fluorescent dye markers at high magnification. However, the dye markers are susceptible to photobleaching; that is, the fluorescence of the dye markers degrades with exposure to light. Microscope objectives that achieve the required high magnification generally have shallow depth of field. Drug discovery practice using a conventional microscope entails focusing up and down within each chamber to collect multiple images; when done mechanically, the focusing step may be slow enough to result in photobleaching before image acquisition is complete.
  • Microscopy of certain live organisms may also benefit from the reduced focusing time and/or extended depth of field available from imaging system 10(1), for similar reasons as drug discovery applications.
  • High light intensities, which may be present during microscopy, can damage certain organisms.
  • An image generation process that captures fewer images (because a sample is in focus across a greater depth of field), and/or changes focus quickly, reduces such damage.
  • Microscopes often include multiple objective lenses mounted in a turret; a user rotates the turret to change an objective lens in use at a particular time.
  • imaging system 10(1) may include a back aperture plane 45(1) located outside of a turret, so that a single SLM 60(1) can be located at back aperture plane 45(1), instead of equipping an imaging system 10(1) with an SLM 60(1) for each objective lens.
  • relay optics may re-image back aperture plane 45(1) to another location within an imaging system; placing SLM 60(1) at an image of a back aperture plane has essentially the same effect as its placement at back aperture plane 45(1). Placement of SLM 60(1) at back aperture plane 45(1) or at an image of plane 45(1) within imaging system 10(1) is ideal, but other placements of SLM 60(1) may be used (e.g., placement near plane 45(1), or an image thereof, may have nearly equivalent optical effect).
  • FIG. 2 shows a differential interference contrast ("DIC") microscope 10(2) with a pixelated spatial light modulator 60(2).
  • Electromagnetic energy (not shown in FIG. 2) from an object 20(2) passes through a DIC apparatus 130 including a controllable analyzer 140.
  • a processing and control unit 80(2) controls analyzer 140 and may operate similar to control unit 80(1), FIG. 1.
  • a relay lens 95 focuses the electromagnetic energy from DIC 130 to an image 45(2) of an aperture plane of microscope 10(2).
  • An SLM 60(2) at or near plane 45(2) modulates phase and/or intensity of the electromagnetic energy passing therethrough.
  • Processing and control unit 80(2) controls each pixel in SLM 60(2), by signals from control unit 80(2) to SLM 60(2) through a communication path 70(2), for example.
  • control unit 80(2) uses information (e.g., polarization state information) from controllable analyzer 140 to post-process image data.
  • a lens 90(2) focuses the electromagnetic energy to form an image plane 100(2); an optional image detector 110(2) detects the electromagnetic energy at plane 100(2).
  • a communication path 120(2) (e.g., a bus) may be used to send image data to control unit 80(2), which may store and/or post process the image data.
  • FIG. 3 and FIG. 4 illustrate pixelated spatial light modulators 60(2) and 60(3) that may be used, for example, as SLM 60(1), FIG. 1.
  • SLM 60(2) includes an array of hexagonal pixels 62, as shown.
  • SLM 60(2) may be, for example, model Hex-127P available from Meadowlark Optics, Frederick, CO.
  • Circle 61 represents the location of an image of an aperture of imaging system 10(1) with respect to SLM 60(2).
  • Pixelated spatial light modulator 60(3) includes an array of rectangular pixels 63, as shown.
  • Circle 64 represents the location of an image of an aperture of imaging system 10(1) with respect to SLM 60(3).
  • Imaging systems 10(1) and 10(2) are examples of imaging systems with pixelated spatial light modulators. It is appreciated that other types of imaging systems may utilize pixelated spatial light modulators in a similar manner to that illustrated in FIG. 1 and FIG. 2.
  • FIG. 5 illustrates a process 200 for operating an imaging system that has an SLM with a control unit that stores signal patterns (e.g., imaging system 10(1) having control unit 80(1) with signal pattern storage 85). Certain steps of process 200 may be performed alone; others may be utilized in combination with other steps of process 200, as explained below. Certain steps of process 200 are optional, the use of each step depending on user preferences, or requirements of a given application. Arrows connecting steps of process 200 illustrate that steps or step combinations may be utilized in order according to the application. Steps of process 200 may be performed by a processor running under the control of software (e.g., processor 81 running under the control of software 84, FIG. 1).
  • Step 205 begins with an optional substep of receiving user input; alternatively, step 205 may be performed solely under software control, such as when a system implementing process 200 is powered on, or as part of an automated sequence.
  • step 205 "clears" the SLM by applying a signal pattern to the SLM such that no phase change is introduced in electromagnetic energy passing therethrough, so the SLM has essentially no effect in the imaging system optics.
  • control unit 80(1) applies a signal pattern to SLM 60(1) via signal line 70(1).
  • Step 210 applies a new SLM signal pattern (or modifies a currently applied SLM signal pattern) to change phase of electromagnetic energy passing therethrough, to alter focus of the imaging system. That is, step 210 may simply implement a stored signal pattern that implements the appropriate phase modification, or it may add a set of stored signal pattern modifications to a signal pattern currently being applied to the SLM, and implement a resulting signal pattern.
  • control unit 80(1) applies a signal pattern to SLM 60(1) via signal line 70(1) to apply a particular phase pattern to the wavefront of system 10(1).
  • Step 215 applies or modifies an SLM signal pattern to change phase of electromagnetic energy passing therethrough to implement wavefront coding, e.g., to introduce a controlled blur to the resulting image, and to modify depth of field. Such changes may extend depth of field or may reduce it, as required by a user of the imaging system.
  • control unit 80(1) applies a signal pattern to SLM 60(1) via signal line 70(1) to encode the wavefront of system 10(1).
  • Step 215 is followed by a step 225 that implements signal processing to remove the controlled blur, so that the system has a depth of field as modified by step 215 but forms a sharp final image when the blur is removed.
  • image processing parameters for reducing a specific blur may be associated, in a signal pattern data structure, with the signal patterns that code an SLM to create the blur.
  • association may facilitate simultaneous implementation by a processing and control unit (e.g., unit 80(1)) of a wavefront coding signal pattern by an SLM along with appropriate post processing to remove blur introduced by the wavefront coding.
  • Step 220 applies or modifies an SLM signal pattern to change phase to implement wavefront coding to correct for spherical aberration caused by a sample being viewed; step 220 is also followed by step 225 to remove a controlled blur introduced by the wavefront coding.
  • Step 230 applies or modifies an SLM signal pattern to change phase to implement wavefront coding for task based application imaging. For example, certain task based applications benefit from extended or reduced depth of field, yet the application may not require removal of the wavefront coding induced controlled blur from images formed thereby. Thus, step 230 may be implemented without a corresponding step 225 to remove such a blur.
  • Step 240 applies or modifies an SLM signal pattern to change phase of electromagnetic energy passing therethrough to apply an optical tilt.
  • the optical tilt lets a viewer see the object through the imaging system from a slightly different angle than without the tilt; optical tilt can therefore be utilized to extract depth information of a sample (see FIG. 6).
  • Step 245 captures an image, for example by triggering a shutter release that exposes film, or by triggering image capture by a detector (e.g., image detector 110(1), FIG. 1).
  • Step 250 alters other attributes of an imaging system that may be controllable by a control unit of the imaging system; for example, illumination, polarization, apertures, movement of sliders (e.g., filters or fixed phase masks), mechanical focusing, position of a microscope stage, and/or selection of a specimen being viewed.
  • an imaging system may be configured to store sequences of any of the steps of process 200 to facilitate rapid execution.
  • a "change sample" sequence may include multiple steps 250 to select a new specimen for viewing, to set illumination, polarization, sliders, objective lens and mechanical focus to "baseline” conditions, then may include step 205 to "clear” the effect of the SLM to a "no phase change” condition.
  • a user of an imaging system first utilizes step 250 to set illumination, polarization, apertures, sliders and mechanical focus to baseline settings, and to select a low magnification objective lens and a specimen for viewing.
  • the user utilizes step 205 to clear the SLM.
  • the user begins to examine the specimen and decides to increase depth of field; thus step 215 modifies the SLM signal pattern to change phase to implement wavefront coding; step 225 may automatically follow, implementing signal processing to remove wavefront coding induced blur. If the user becomes interested in a particular feature of the specimen, he may utilize step 250 to select a higher magnification objective lens and to move a microscope stage.
  • The user may then utilize step 215 to implement wavefront coding that reduces depth of field (and implement appropriate processing), and may utilize step 210 repeatedly, modifying the SLM signal pattern to change focus until the desired feature is in focus.
  • If the user wishes to capture a set of images taken at various depths of the sample, he may initiate a sequence in which the imaging system automatically and sequentially utilizes step 210 to set focus at a first depth of the sample, step 245 to capture an image, step 210 to set focus at a second depth of the sample, step 245 to capture an image, etc., until a desired number of images is obtained.
  • FIG. 6 illustrates a process 300 for extracting depth information of a sample (e.g., using imaging system 10(1)).
  • Process 300 is, for example, implemented by processing and control unit 80(1), FIG. 1.
  • In step 310, an SLM in an imaging system (e.g., SLM 60(1) in imaging system 10(1)) imparts an optical tilt in a first direction, forming a first image through the exit pupil of the imaging system.
  • Step 320 captures the image formed in step 310 (utilizing, for example, image detector 110(1) and storing the captured image in image storage 86).
  • In step 330, the SLM imparts an optical tilt in a second direction, forming a second image through the exit pupil.
  • Step 340 captures the image formed in step 330.
  • the first and second images are a stereo pair; unlike other optical arrangements that may generate a stereo pair using separate apertures or split apertures, each of the first and second images of the stereo pair generated in steps 310 - 340 utilizes the entire exit pupil.
  • Step 350 processes the stereo pair of images to extract depth information of objects that are identifiable in each of the first and second stored images (an illustrative sketch of such processing appears after this list).
  • Step 350 may be performed by image processor 88, for example under the control of software 84 utilizing image storage 86.
  • the depth information may take the form of a three dimensional depth map, for example. The availability of depth information may be particularly useful when viewing thick specimens, because imaging system 10(1) may utilize the depth information to correct for sample-induced spherical aberration, as discussed above with respect to FIG. 1.
  • FIG. 7 illustrates a process 301 for utilizing imaging system 10(1) to correct sample-induced spherical aberration.
  • Step 360 provides depth information of a sample; step 360 may be performed for example by process 300, or step 360 may be performed by providing depth information obtained about the sample by other means.
  • Step 370 of process 301 calculates a signal pattern required for SLM 60(1) to impart a phase alteration to correct sample-induced spherical aberration.
  • Step 380 of process 301 applies the signal pattern to the SLM (e.g., control unit 80(1) applies the signal pattern to SLM 60(1) through communication path 70(1)).
  • FIG. 8 illustrates a pixelated spatial light modulator 60(4) that may be used, for example, as SLM 60(1), FIG. 1.
  • SLM 60(4) includes an array of hexagonal pixels 62', as shown. Circle 65 represents the location of an image of an aperture of imaging system 10(1) with respect to SLM 60(4). Edges of pixels 62' may exhibit phase anomalies when differing voltages are applied to adjacent pixels. If allowed to interact with electromagnetic energy passing through the SLM, the phase anomalies may cause unsatisfactory imaging performance.
  • An opaque mask 68 covers the edges of pixels 62' so that electromagnetic energy passing through the pixel edges does not continue through an imaging system that utilizes SLM 60(4) (or, if the electromagnetic energy impinges on SLM 60(4) from a side that has mask 68, it may be blocked from reaching the pixel edges).
  • FIG. 9 illustrates objects at varying distances from an imaging system 400 with a pixelated SLM.
  • Imaging system 400 may be, for example, a cell phone.
  • System 400 may image, at different times, any of a nearby object 410 at about a distance d2 from the system, a somewhat distant object 420 at about a distance d3 from the system, and a very distant object 430 at about a distance d4 from the system.
  • system 400 may include a processing and control unit (such as unit 80(1) described in connection with FIG. 1) that includes a processor, a user interface, software, signal pattern storage for the SLM, image storage, and an image processor.
  • a user of system 400 may select from stored SLM signal patterns to adjust depth of field for a given scene. For example, the user may select a stored SLM signal pattern that extends depth of field when imaging either of objects 410 or 420, but select a different SLM signal pattern that does not extend depth of field when imaging object 430.
  • System 400 may also generate a modified signal pattern by adding signal modifications (that change focus or change depth of field) to a signal pattern currently being applied.
  • System 400 may also automatically modify processing of a detected image to remove blur introduced when wavefront coding is utilized.
  • the capabilities of adding or adjusting wavefront coding, and modifying processing to remove wavefront coding induced blur, may be coordinated with automatic exposure and/or focusing capabilities of system 400 such that wavefront coding is only utilized when needed (e.g., to mitigate computation time and/or power consumption that may be associated with processing of images obtained with wavefront coding).
  • FIG. 10A is a side view, and FIG. 10B is a top view, of a vehicle 450 having imaging systems 460, 470, 480(1) and 480(2), each with a pixelated SLM. Dashed lines adjacent to each of systems 460, 470, 480(1) and 480(2) indicate approximately a maximum field of view of each system.
  • Imaging system 460 images a field of view behind vehicle 450, and is sometimes called a "back up camera.”
  • Imaging system 470 images a field of view within vehicle 450 such that a driver of vehicle 450 may visually monitor the interior of the vehicle (e.g., to watch children, passengers, or other objects within the vehicle) without turning around.
  • Imaging systems 480(1) and 480(2) image fields of view in locations that are just behind a location of a driver, and just to the left and right of vehicle 450, respectively; such locations are sometimes called "blind spots." Images from imaging systems 460, 470, 480(1) and 480(2) may be displayed on screens visible to a driver of vehicle 450.
  • Vehicle 450 may include a processing and control unit (like unit 80(1) described in connection with FIG. 1) that serves each of systems 460, 470, 480(1) and 480(2).
  • the processing and control unit within vehicle 450 may include a processor, a user interface, software, signal pattern storage for each SLM, image storage, and an image processor.
  • SLMs within each of systems 460, 470, 480(1) and 480(2) may be utilized by a driver (or a passenger) of vehicle 450 in a similar manner to SLMs in previously-discussed imaging systems, to adjust a depth of field of the corresponding imaging system.
  • the driver or passenger may wish to see an image with depth of field that is larger or smaller than a current depth of field.
  • Such an image may be generated by utilizing stored SLM signal patterns that implement wavefront coding to adjust depth of field for any of systems 460, 470, 480(1) and 480(2) (and, optionally, corresponding image processing) to produce the desired imaging characteristics.
  • Imaging systems 460, 470, 480(1) and 480(2) may be considered examples of imaging systems utilized for security purposes; it is appreciated that other security system applications may utilize imaging systems with pixelated SLMs to adjust depth of field to a desired range. Such systems may include processing and control units like unit 80(1) described in connection with FIG. 1 that utilize stored SLM signal patterns to implement wavefront coding to adjust depth of field for each of systems 460, 470, 480(1) and 480(2) (and, optionally, corresponding image processing) to produce the desired imaging characteristics.
  • Changes described above, and others, may be made in the imaging systems with pixelated spatial light modulators described herein without departing from the scope hereof.
  • imaging systems that use a pixelated spatial light modulator to change optical attributes may include, or be integrated into, articles such as stereo microscopes, surgical microscopes, confocal microscopes, ophthalmoscopes, endoscopes, vehicles, toys, security systems and/or articles of sporting goods.
  • a pixelated spatial light modulator may be configured, through application of stored signal patterns, to enhance operation of an imaging system operating in modes such as bright field, dark field, fluorescence, phase contrast, and DIC.
  • SLM 60(1), control unit 80(1) and communication path 70(1) may be manufactured as an attachment set suitable for integration with a conventional microscope, a stereo microscope, a surgical microscope, a confocal microscope, an ophthalmoscope, and/or an endoscope.
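The process-300 bullets above describe capturing a stereo pair by imparting opposite optical tilts with the SLM and then processing the pair (step 350) to extract depth information. The sketch below, referenced from step 350 above, is illustrative only: it assumes simple horizontal block matching and a linear disparity-to-depth relation; none of the names, window sizes, or scale factors come from the patent.

```python
# Illustrative only: estimate a coarse disparity map from the two tilt-generated
# images by block matching; relative depth is then taken as proportional to
# disparity. Block size, search range, and the depth scale are assumptions.
import numpy as np


def disparity_map(img_a: np.ndarray, img_b: np.ndarray,
                  block: int = 8, max_shift: int = 16) -> np.ndarray:
    """Horizontal block-matching disparity between two same-size grayscale images."""
    h, w = img_a.shape
    disp = np.zeros((h // block, w // block))
    for bi in range(h // block):
        for bj in range(w // block):
            ref = img_a[bi * block:(bi + 1) * block, bj * block:(bj + 1) * block]
            best_shift, best_err = 0, np.inf
            for s in range(-max_shift, max_shift + 1):
                j0 = bj * block + s
                if j0 < 0 or j0 + block > w:
                    continue
                cand = img_b[bi * block:(bi + 1) * block, j0:j0 + block]
                err = np.sum((ref.astype(float) - cand.astype(float)) ** 2)
                if err < best_err:
                    best_shift, best_err = s, err
            disp[bi, bj] = best_shift
    return disp


# A relative depth map could then be formed as depth_scale * disparity_map(a, b),
# with depth_scale calibrated for the particular tilt angles (an assumption here).
```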

Abstract

An imaging system includes a pixelated spatial light modulator ('SLM') to modify optical attributes (such as overall focus, wavefront coding, depth of field, and corrections for aberrations) of images. The SLM connects with a control unit that controls pixels of the SLM, and that asserts stored signal patterns to configure the SLM to change the optical attributes. The control unit may also include an image processor and image storage. An imaging method includes a control unit applying a signal pattern to a pixelated spatial light modulator in an imaging system to adjust an optical attribute. Another imaging method includes using a pixelated spatial light modulator to sequentially impart first and second optical tilts within an imaging system, capture an image using each optical tilt, and process the images to extract depth information such as, for example, a three dimensional depth map.

Description

IMAGING SYSTEMS WITH PIXELATED SPATIAL LIGHT
MODULATORS
RELATED APPLICATIONS
[0001] This application claims the benefit of priority of U.S. Provisional Patent Application No. 60/663,271 filed 18 March 2005, the disclosure of which is incorporated herein by reference.
BACKGROUND
[0002] Limited depth of field may present an asset, a liability, or both, in various applications of optics. In imaging systems, for example, optics are used to produce an image of an object. These optics may limit the depth of field of the imaging system so that only a specific part of the object is in focus. Certain applications such as microscopy and cinematography may, at times, advantageously utilize reduced depth of field to emphasize an image at one plane of focus, thereby de-emphasizing objects that are out of focus. At other times and in other applications, increased depth of field is desirable.
[0003] Depth of field (and correction of certain types of aberrations) may be controlled by utilizing insertable filters; however, insertion and withdrawal of such filters from an optical path can present opportunities for mechanical damage, contamination, and/or introduction of undesirable aberrations (e.g., misfocus) within the optical path. Additionally, in some applications, aberrations may occur as a function of the object being imaged. For example, in microscopy, certain samples themselves produce aberrations (e.g., spherical aberrations) in images formed by microscopes; correcting such aberrations is difficult since they are sample dependent. [0004] Focus changes using liquid crystal lenses have been reported, for example by Ye et al., "Liquid-crystal lens with a focal length that is variable in a wide range" in Applied Optics, Vol. 43, No. 35. However, liquid crystal lenses only provide simple curvatures and/or phase retardation. Mechanical focusing elements may add complexity and cost to an imaging system. Moreover, certain applications require acquisition of multiple images, with each image using a different focus; use of mechanical focusing elements may introduce mechanical positioning errors and/or time delays between the multiple images.
SUMMARY
[0005] In one embodiment, an improvement to an imaging system is provided, including a pixelated spatial light modulator for selectively modifying optical attributes of an image generated by the imaging system. [0006] In one embodiment, a microscope includes a control unit, and a pixelated spatial light modulator that is responsive to the control unit, for selectively modifying optical attributes of an image generated by the microscope.
[0007] In one embodiment, an imaging system includes a pixelated spatial light modulator for selectively modifying optical attributes (such as overall focus, wavefront coding, depth of field, correction for specimen-induced spherical aberration, and correction for aberration induced by an objective lens of the system) of images. The spatial light modulator may connect with a control unit that controls phase delays imparted by pixels of the spatial light modulator, and may assert stored signal patterns to configure the spatial light modulator to change the optical attributes. The control unit may also include an image processor and image storage, to post process the images, for example.
[0008] In one embodiment, an attachment system adjusts optical attributes of a microscope. The attachment system includes a pixelated spatial light modulator and a control unit. The spatial light modulator responds to signals from the control unit to selectively modify phase of a wavefront of the microscope, to modify the optical attributes.
[0009] In one embodiment, a microscopy method provides a control unit that stores a plurality of signal patterns, and applies one of the plurality of signal patterns to a pixelated spatial light modulator of an imaging system to adjust an optical attribute of the imaging system.
[0010] In one embodiment, an imaging method includes receiving a user input at a user interface of a control unit that stores signal patterns, and applying one of the signal patterns to a pixelated spatial light modulator of an imaging system. The spatial light modulator adjusts one of (a) a focal length, (b) an optical tilt and (c) a depth of field of the imaging system in response to the user input.
[0011] In one embodiment, an imaging method includes applying first and second sets of signals from a control unit to a pixelated spatial light modulator. The first and second sets of signals impart first and second optical tilts, respectively, to phase modulation patterns within an imaging system. The method captures two images, each image using one of the first and second optical tilts, and processes the images to extract depth information. The processing may, for example, extract a three dimensional depth map. The method may also include utilizing the depth information to calculate a spherical aberration correction. [0012] In one embodiment, an imaging method provides information of a depth of an object, calculates a signal pattern for a pixelated spatial light modulator based on the information, and applies the signal pattern to the spatial light modulator to correct spherical aberration caused by the subject.
[0013] In one embodiment, a software product includes instructions stored in computer readable media. The instructions, when executed by a computer, perform steps for adjusting optical attributes of an imaging system. The instructions include instructions for receiving a user input at a user interface of a control unit that stores signal patterns, and instructions for applying one of the signal patterns to a pixelated spatial light modulator of an imaging system. The spatial light modulator adjusts one of (a) a focal length, (b) an optical tilt and (c) a depth of field of the imaging system in response to the user input.
[0014] In one embodiment, a computer readable medium contains a data structure that stores signal patterns for a pixelated spatial light modulator of an imaging system. At least one of the signal patterns configures the spatial light modulator to introduce one of a focal length change, an optical tilt, and a depth of field change in the imaging system.
BRIEF DESCRIPTION OF DRAWINGS
[0015] FIG. 1 shows a schematic diagram of an imaging system with a pixelated spatial light modulator, in accord with one embodiment. [0016] FIG. 2 shows a schematic diagram of a differential interference contrast imaging system with a pixelated spatial light modulator, in accord with one embodiment.
[0017] FIG. 3 and FIG. 4 show schematic diagrams of exemplary pixelated spatial light modulators. [0018] FIG. 5 shows a flow chart illustrating a process for operating an imaging system that includes a pixelated spatial light modulator with a control unit that stores signal patterns. [0019] FIG. 6 shows a flow chart illustrating a process for extracting depth information of a sample, in accord with an embodiment.
[0020] FIG. 7 shows a flow chart illustrating a process for utilizing the imaging system of FIG. 1 to correct for sample-induced spherical aberration. [0021] FIG. 8 shows a schematic diagram illustrating an embodiment of a pixelated spatial light modulator suitable for use in imaging systems in accordance with the present disclosure.
[0022] FIG. 9 shows a schematic diagram illustrating an exemplary situation in which objects at varying distances may be imaged using one imaging system employing a pixelated spatial light modulator.
[0023] FIGS. 10A and 10B are schematic diagrams illustrating a side view and a top view, respectively, of a vehicle having imaging systems with pixelated spatial light modulators.
DETAILED DESCRIPTION OF DRAWINGS
[0024] FIG. 1 shows a schematic diagram of an imaging system 10(1) with a pixelated spatial light modulator ("SLM") 60(1). It should be noted that the presently described embodiment of a microscope is but one example of an imaging system incorporating techniques for controlling depth of field and aberration effects in accordance with the present disclosure. The microscope embodiment is thus exemplary, and not limiting, as other embodiments are also useful, as will be described in detail at an appropriate point in the following discussion.
[0025] Continuing to refer to FIG. 1, electromagnetic energy (e.g., light rays) 30 from an object 20(1) enters a microscope objective 40(1) that has optics (e.g., lenses) 50(1) and 50(2), and propagates to a back aperture plane 45(1) of objective 40(1). Lenses 50(1) and 50(2) are exemplary only; it will be appreciated that an imaging system objective may have more or fewer lenses and/or other optical components. SLM 60(1) at or near back aperture plane 45(1), or a conjugate plane thereof, modulates phase and/or intensity of electromagnetic energy 30 passing through back aperture plane 45(1). [0026] SLM 60(1) includes an array of SLM pixels (not shown); each pixel may be individually controlled to vary phase of a wavefront passing therethrough (in an embodiment, amplitude of the wavefront is not intentionally varied, although it is understood that incidental amplitude variation may occur due to scattering or absorption from surfaces and structure). A processing and control unit 80(1) controls each pixel in SLM 60(1), by signals from control unit 80(1) to SLM 60(1), through a communication path 70(1). Control unit 80(1) includes a processor 81 and a user interface 82 that receives input from a user. Processor 81 may operate under the control of software 84, for example; or internal firmware may be used to provide like function.
[0027] In an embodiment, control unit 80(1) includes SLM signal patterns, stored in a data structure of signal pattern storage 85, to vary optical properties of SLM 60(1). Each SLM signal pattern assigns a signal (e.g., a voltage) to each pixel in SLM 60(1); each such signal pattern may thus provide specific optical attribute changes such as discussed below. After electromagnetic energy 30 passes through SLM 60(1), a lens 90(1) focuses energy 30 at an image plane 100(1). At image plane 100(1), imaging system 10(1) optionally includes an image detector 110(1) that detects electromagnetic energy 30(1) to form image data. A communication path 120(1) may send the image data to control unit 80(1), which stores image data in image storage 86. An image processor 88 within control unit 80(1) may be used to post-process stored image data; or, image post processing may be done by processor 81 under control of software 84, for example. Image processing parameters may also be stored in the data structure of signal pattern storage 85, so that parameters required for post processing of images formed with certain signal patterns (e.g., parameters required to remove blurs introduced by wavefront coding) may be associated with the signal patterns.
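As a hypothetical sketch of the signal pattern storage described in the paragraph above, the structure below associates a per-pixel drive level with each pixel address and optionally attaches the image processing parameters needed to post-process images formed with that pattern; all class, field, and pattern names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of stored signal patterns and their associated
# post-processing parameters; names and fields are assumptions only.
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

PixelAddress = Tuple[int, int]  # (X, Y) address of one SLM pixel


@dataclass
class SignalPattern:
    name: str                                  # e.g. "clear", "focus_plus", "wfc_cubic"
    drive_levels: Dict[PixelAddress, float]    # per-pixel drive signal (e.g. a voltage)
    processing_params: Optional[dict] = None   # e.g. deblurring parameters for this pattern


@dataclass
class SignalPatternStorage:
    patterns: Dict[str, SignalPattern] = field(default_factory=dict)

    def add(self, pattern: SignalPattern) -> None:
        self.patterns[pattern.name] = pattern

    def get(self, name: str) -> SignalPattern:
        return self.patterns[name]
```

A control unit might then look up a pattern by name in response to a user-interface selection, write its drive levels to the modulator over the communication path, and hand its processing parameters to the image processor.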
[0028] SLM 60(1) enables control of optical attributes of imaging system 10(1). For example, if SLM 60(1) appropriately changes phase of electromagnetic energy 30 across back aperture plane 45(1), the optical effect created is equivalent to changing a focal length of objective 40(1). This alters a front conjugate focal distance d1 of objective 40(1), changing a plane within object 20(1) that is in focus at image plane 100(1). Thus, a user of imaging system 10(1) may control signals from control unit 80(1) to implement phase changes in electromagnetic energy 30, affecting overall focus of electromagnetic energy 30 at image plane 100(1), without physically moving either object 20(1) or a component of imaging system 10(1).
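The paragraph above notes that an appropriate phase change across the back aperture is equivalent to changing the objective's focal length. Below is a minimal, hypothetical sketch of how such a quadratic, defocus-like phase map might be computed for a square pixel grid; the function name, units, and scaling are assumptions and are not taken from the patent.

```python
# Illustrative only: a quadratic phase profile across the back aperture acts
# like a small change of focal power. Units, scaling, and names are assumptions.
import numpy as np


def defocus_phase(n_pixels: int, aperture_radius: float,
                  delta_power: float, wavelength: float) -> np.ndarray:
    """Wrapped phase map (radians) approximating an added optical power (1/m)."""
    coords = np.linspace(-aperture_radius, aperture_radius, n_pixels)
    x, y = np.meshgrid(coords, coords)
    r2 = x ** 2 + y ** 2
    phase = -np.pi * delta_power * r2 / wavelength   # thin-lens phase of a weak lens
    phase[r2 > aperture_radius ** 2] = 0.0           # ignore area outside the aperture
    return np.mod(phase, 2 * np.pi)                  # wrap to one wave for the SLM


# Example: roughly half a diopter of added power over a 5 mm aperture at 550 nm.
phase_map = defocus_phase(n_pixels=128, aperture_radius=2.5e-3,
                          delta_power=0.5, wavelength=550e-9)
```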
[0029] In one embodiment, software 84 of control unit 80(1) configures control unit 80(1) to selectively apply one of a plurality of SLM signal patterns that are stored in signal pattern storage 85. Each signal pattern corresponds to a specific focus change to be implemented by SLM 60(1), so that a user need only select one signal pattern (using, for example, user interface 82) to achieve a desired focus. Altering focus without physically moving the sample may be useful, for example, in imaging systems wherein the mechanical complexity, mechanical positioning errors and delays associated with mechanical focusing are undesirable.
[0030] In addition to producing focal length changes, SLM 60(1) may alter phase of a wavefront formed by electromagnetic energy 30(1) to change other optical attributes. Optical tilt is for example one such attribute. Another such attribute may be phase alteration ("wavefront coding") that makes an optical transfer function ("OTF") of an optical imaging system substantially invariant to misfocus-related aberrations as compared to an OTF of a corresponding optical imaging system without the phase alteration. Wavefront coding may, for example, extend the depth of field of an imaging system without a change of aperture (i.e., without "stopping down" or blocking an aperture). Useful information about wavefront coding may be found, for example, in U.S. Patent No. 5,748,371, which is incorporated herein by reference.
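One widely discussed form of wavefront coding uses a separable cubic phase profile. The sketch below is a hypothetical illustration of sampling such a profile onto SLM pixels and quantizing it to drive levels; the strength alpha, the quantization, and the assumption of a linear phase-to-drive response are illustrative, not the specific pattern of this patent.

```python
# Illustrative only: a separable cubic phase profile, one well-known wavefront
# coding form; parameters and the linear phase-to-drive mapping are assumptions.
import numpy as np


def cubic_phase(n_pixels: int, alpha: float) -> np.ndarray:
    """phi(x, y) = alpha * (x**3 + y**3) over normalized pupil coordinates [-1, 1]."""
    u = np.linspace(-1.0, 1.0, n_pixels)
    x, y = np.meshgrid(u, u)
    return alpha * (x ** 3 + y ** 3)


def phase_to_drive(phase: np.ndarray, levels: int = 256) -> np.ndarray:
    """Quantize a wrapped phase map to discrete SLM drive levels."""
    wrapped = np.mod(phase, 2 * np.pi)
    return np.round(wrapped / (2 * np.pi) * (levels - 1)).astype(np.uint8)


drive_pattern = phase_to_drive(cubic_phase(n_pixels=128, alpha=20 * np.pi))
```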
[0031] In one embodiment, pre-set SLM signal patterns include one or more signal patterns that correspond with specific phase alterations to be implemented by SLM 60(1), so that a user need only select a signal pattern (using, for example, user interface 82) to achieve a desired wavefront coding effect. Thus, a user may enter commands into user interface 82 to control signals from control unit 80(1) such that, for example: (a) SLM 60(1) does not modulate phase and imaging system 10(1) performs as a standard imaging system; (b) SLM 60(1) implements wavefront coding and imaging system 10(1) has extended depth of field; or (c) SLM 60(1) customizes the depth of field, for example to adjust depth of field to the total depth of a sample.
[0032] Wavefront coding using SLM 60(1) may, for example, eliminate a need to insert and retract wavefront coding filters from an optical path of imaging system 10(1), thus minimizing exposure of optical components to handling damage and/or contamination. Nonetheless, imaging system 10(1) using SLM 60 may also use wavefront coding filters and/or other phase filters, for example near to plane 45(1) (or a conjugate plane thereof), to provide a first optical attribute (e.g., a focus change, a spherical aberration correction or wavefront coding) while SLM 60(1) provides a second optical attribute. Using both fixed filters and SLM 60(1) in imaging system 10(1) may be useful, for example, when imaging system 10(1) is designed without mechanical focusing capability (e.g., a phase filter may provide a coarse focus correction while SLM 60(1) provides a fine focus correction, a spherical aberration correction and/or wavefront coding).
[0033] Signal pattern storage 85 may store signal patterns that correspond to SLM phase alterations which minimize spherical aberration. For example, a user may select a signal pattern (using, for example, user interface 82) to change performance of imaging system 10(1) so as to minimize spherical aberration. This may be particularly useful for thick specimens that otherwise introduce spherical aberration. Spherical aberration (and/or other misfocus related aberrations) introduced by objective lenses or other optical elements of imaging system 10(1) may also be minimized by utilizing SLM 60(1).
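As a hypothetical illustration of a stored pattern of the kind described above, the sketch below builds a compensating phase for primary spherical aberration using the rotationally symmetric Zernike term; the coefficient, and how it would be derived from specimen thickness or depth, are assumptions rather than the patent's method.

```python
# Illustrative only: a compensating phase for primary spherical aberration,
# written with the Zernike term Z(4,0). Coefficient derivation is an assumption.
import numpy as np


def spherical_correction_phase(n_pixels: int, w040_waves: float) -> np.ndarray:
    """Phase map (radians) that opposes w040_waves of primary spherical aberration."""
    u = np.linspace(-1.0, 1.0, n_pixels)
    x, y = np.meshgrid(u, u)
    rho2 = x ** 2 + y ** 2
    z40 = 6 * rho2 ** 2 - 6 * rho2 + 1      # Zernike polynomial Z(4,0)
    phase = -2 * np.pi * w040_waves * z40   # negative sign to cancel the aberration
    phase[rho2 > 1.0] = 0.0                 # zero outside the unit pupil
    return phase
```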
[0034] One application that may benefit from extended depth of field and/or reduced focusing time is drug discovery, in which microscope slides with many small sample chambers are evaluated. Each chamber includes cells that may vary in thickness or depth within the chamber. Evaluating such slides requires high resolution imaging of fluorescent dye markers at high magnification. However, the dye markers are susceptible to photobleaching; that is, the fluorescence of the dye markers degrades with exposure to light. Microscope objectives that achieve the required high magnification generally have shallow depth of field. Drug discovery practice using a conventional microscope entails focusing up and down within each chamber to collect multiple images; when done mechanically, the focusing step may be slow enough to result in photobleaching before image acquisition is complete. Use of a microscope with extended depth of field (i.e., due to the use of wavefront coding and/or SLM 60(1)), and which can refocus more rapidly when needed, can speed up the image generation process so that photobleaching is reduced, thereby increasing image quality.
[0035] Microscopy of certain live organisms may also benefit from the reduced focusing time and/or extended depth of field available from imaging system 10(1), for similar reasons as drug discovery applications. High light intensities, which may be present during microscopy, can damage certain organisms. An image generation process that captures fewer images (because a sample is in focus across a greater depth of field), and/or changes focus quickly, reduces such damage.
[0036] Microscopes often include multiple objective lenses mounted in a turret; a user rotates the turret to change an objective lens in use at a particular time. It will be appreciated that imaging system 10(1) may include a back aperture plane 45(1) located outside of a turret, so that a single SLM 60(1) can be located at back aperture plane 45(1), instead of equipping an imaging system 10(1) with an SLM 60(1) for each objective lens. Additionally, there may be other locations in imaging system 10(1), besides back aperture plane 45(1), where SLM 60(1) may be placed to provide the same function as shown in FIG. 1. For example, relay optics may re-image back aperture plane 45(1) to another location within an imaging system; placing SLM 60(1) at an image of a back aperture plane has essentially the same effect as its placement at back aperture plane 45(1). Placement of SLM 60(1) at back aperture plane 45(1) or at an image of plane 45(1) within imaging system 10(1) is ideal, but other placements of SLM 60(1) may be used (e.g., placement near plane 45(1), or an image thereof, may have nearly equivalent optical effect).
[0037] FIG. 2 shows a differential interference contrast ("DIC") microscope 10(2) with a pixelated spatial light modulator 60(2). Electromagnetic energy (not shown in FIG. 2) from an object 20(2) passes through a DIC apparatus 130 including a controllable analyzer 140. A processing and control unit 80(2) controls analyzer 140 and may operate similar to control unit 80(1), FIG. 1. A relay lens 95 focuses the electromagnetic energy from DIC 130 to an image 45(2) of an aperture plane of microscope 10(2). An SLM 60(2) at or near plane 45(2) modulates phase and/or intensity of the electromagnetic energy passing therethrough.
Processing and control unit 80(2) controls each pixel in SLM 60(2), by signals from control unit 80(2) to SLM 60(2) through a communication path 70(2), for example. In one embodiment, control unit 80(2) uses information (e.g., polarization state information) from controllable analyzer 140 to post-process image data. A lens 90(2) focuses the electromagnetic energy to form an image plane 100(2); an optional image detector 110(2) detects the electromagnetic energy at plane 100(2). A communication path 120(2) (e.g., a bus) may be used to send image data to control unit 80(2), which may store and/or post process the image data.
[0038] FIG. 3 and FIG. 4 illustrate pixelated spatial light modulators 60(2) and 60(3) that may be used, for example, as SLM 60(1), FIG. 1. SLM 60(2) includes an array of hexagonal pixels 62, as shown. SLM 60(2) may be, for example, model Hex-127P available from Meadowlark Optics, Frederick, CO. Circle 61 represents the location of an image of an aperture of imaging system 10(1) with respect to SLM 60(2). Also shown in FIG. 3 are X-addresses 71 and Y-addresses 72 of a coordinate system that defines specific pixels for application of signals thereto (as described in connection with Table 1 below). Pixelated spatial light modulator 60(3) includes an array of rectangular pixels 63, as shown. Circle 64 represents the location of an image of an aperture of imaging system 10(1) with respect to SLM 60(3). [0039] Imaging systems 10(1) and 10(2) are examples of imaging systems with pixelated spatial light modulators. It is appreciated that other types of imaging systems may utilize pixelated spatial light modulators in a similar manner to that illustrated in FIG. 1 and FIG. 2.
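Table 1 itself is not reproduced in this excerpt. Purely as an illustration of the X- and Y-addressing of pixels mentioned above, the sketch below assigns a value to each addressed pixel whose center falls within the imaged aperture circle; the grid size, geometry, and names are assumptions.

```python
# Illustrative only: build a {(X, Y): value} signal pattern for the pixels whose
# centers fall inside the imaged aperture circle. Geometry is an assumption.
import numpy as np


def pattern_for_aperture(x_addresses, y_addresses, radius_px, value_fn):
    """Return {(x, y): value_fn(dx, dy)} for addressed pixels inside the aperture."""
    cx, cy = np.mean(list(x_addresses)), np.mean(list(y_addresses))
    pattern = {}
    for x in x_addresses:
        for y in y_addresses:
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius_px ** 2:
                pattern[(x, y)] = value_fn(x - cx, y - cy)
    return pattern


# Example: a flat ("clear") pattern over a hypothetical 13 x 13 address grid.
clear_pattern = pattern_for_aperture(range(13), range(13), 6, lambda dx, dy: 0.0)
```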
[0040] FIG. 5 illustrates a process 200 for operating an imaging system that has an SLM with a control unit that stores signal patterns (e.g., imaging system 10(1) having control unit 80(1) with signal pattern storage 85). Certain steps of process 200 may be performed alone; others may be utilized in combination with other steps of process 200, as explained below. Certain steps of process 200 are optional, the use of each step depending on user preferences or requirements of a given application. Arrows connecting steps of process 200 illustrate that steps or step combinations may be utilized in an order suited to the application. Steps of process 200 may be performed by a processor running under the control of software (e.g., processor 81 running under the control of software 84, FIG. 1).
[0041] Step 205 begins with an optional substep of receiving user input; alternatively, step 205 may be performed solely under software control, such as when a system implementing process 200 is powered on, or as part of an automated sequence. The optional substep, denoted "[Receive user input / ]" in step 205 and in other steps of process 200, indicates that the user command is optional; each such step may be initiated by a user command or may be performed automatically. Continuing, step 205 "clears" the SLM by applying a signal pattern to the SLM such that no phase change is introduced in electromagnetic energy passing therethrough, so the SLM has essentially no effect in the imaging system optics. In an example of step 205, control unit 80(1) applies a signal pattern to SLM 60(1) via signal line 70(1). Step 210 applies a new SLM signal pattern (or modifies a currently applied SLM signal pattern) to change phase of electromagnetic energy passing therethrough, to alter focus of the imaging system. That is, step 210 may simply apply a stored signal pattern that provides the appropriate phase modification, or it may add a set of stored signal pattern modifications to a signal pattern currently being applied to the SLM, and implement the resulting signal pattern. In an example of step 210, control unit 80(1) applies a signal pattern to SLM 60(1) via signal line 70(1) to apply a particular phase pattern to the wavefront of system 10(1).
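The following sketch illustrates steps 205 and 210 in simplified form; the quadratic phase term used to shift focus, and the pattern dimensions, are assumptions for illustration and are not taken from the signal patterns actually stored by control unit 80(1):

```python
import numpy as np

def clear_pattern(shape):
    """Step 205: a pattern introducing no phase change."""
    return np.zeros(shape)

def defocus_modification(shape, strength):
    """A stored modification: quadratic phase across the normalized pupil,
    which shifts the plane of best focus."""
    ny, nx = shape
    y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
    return strength * (x**2 + y**2)

# Step 205: clear the SLM; step 210: add a focus-shifting modification to the
# currently applied pattern and assert the result.
current = clear_pattern((24, 12))
current = current + defocus_modification((24, 12), strength=1.5)
```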
[0042] Step 215 applies or modifies an SLM signal pattern to change phase of electromagnetic energy passing therethrough to implement wavefront coding, e.g., to introduce a controlled blur to the resulting image, and to modify depth of field. Such changes may extend depth of field or may reduce it, as required by a user of the imaging system. In an example of step 215, control unit 80(1) applies a signal pattern to SLM 60(1) via signal line 70(1) to encode the wavefront of system 10(1). Step 215 is followed by a step 225 that implements signal processing to remove the controlled blur, so that the system has a depth of field as modified by step 215 but forms a sharp final image when the blur is removed. As discussed in connection with FIG. 1, image processing parameters for reducing a specific blur may be associated, in a signal pattern data structure, with the signal patterns that code an SLM to create the blur. Such association may facilitate simultaneous implementation by a processing and control unit (e.g., unit 80(1)) of a wavefront coding signal pattern by an SLM along with appropriate post processing to remove blur introduced by the wavefront coding.
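One way the association between a wavefront-coding signal pattern and its matched post-processing might be represented, and one common way (assumed here, not prescribed by this description) to remove a known controlled blur, is sketched below:

```python
import numpy as np

def wiener_deblur(image, psf, noise_to_signal=0.01):
    """Remove a known, controlled blur from a detected image (step 225)."""
    H = np.fft.fft2(psf, s=image.shape)            # transfer function of the blur
    W = np.conj(H) / (np.abs(H)**2 + noise_to_signal)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * W))

# Hypothetical signal-pattern record: the SLM drive pattern and the matched
# processing parameters stored together, as described in the text.
pattern_record = {
    "name": "extended_dof",
    "slm_pattern": np.zeros((24, 12)),             # placeholder drive levels
    "processing": {"psf": np.ones((5, 5)) / 25.0,  # placeholder PSF of the blur
                   "noise_to_signal": 0.01},
}

detected = np.random.rand(64, 64)                  # stands in for detector data
restored = wiener_deblur(detected,
                         pattern_record["processing"]["psf"],
                         pattern_record["processing"]["noise_to_signal"])
```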
[0043] Step 220 applies or modifies an SLM signal pattern to change phase to implement wavefront coding to correct for spherical aberration caused by a sample being viewed; step 220 is also followed by step 225 to remove a controlled blur introduced by the wavefront coding. Step 230 applies or modifies an SLM signal pattern to change phase to implement wavefront coding for task-based imaging applications. For example, certain task-based applications benefit from extended or reduced depth of field, yet the application may not require removal of the controlled blur induced by wavefront coding from images formed thereby. Thus, step 230 may be implemented without a corresponding step 225 to remove such a blur.
[0044] Step 240 applies or modifies an SLM signal pattern to change phase of electromagnetic energy passing therethrough to apply an optical tilt. The optical tilt lets a viewer see the object through the imaging system from a slightly different angle than without the tilt; optical tilt can therefore be utilized to extract depth information of a sample (see FIG. 6). Step 245 captures an image, for example by triggering a shutter release that exposes film, or by triggering image capture by a detector (e.g., image detector 110(1), FIG. 1). Step 250 alters other attributes of an imaging system that may be controllable by a control unit of the imaging system; for example, illumination, polarization, apertures, movement of sliders (e.g., filters or fixed phase masks), mechanical focusing, position of a microscope stage, and/or selection of a specimen being viewed.
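A minimal sketch of the optical tilt of step 240, modeled as a linear phase ramp across the normalized pupil (the ramp coefficients and pattern dimensions are illustrative only):

```python
import numpy as np

def tilt_pattern(shape, tilt_x, tilt_y):
    """Linear phase ramp over the normalized pupil; larger coefficients view
    the object from a larger angle."""
    ny, nx = shape
    y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
    return tilt_x * x + tilt_y * y

# Opposite tilts, as used later to form a stereo pair (see FIG. 6).
left_view = tilt_pattern((24, 12), tilt_x=+2.0, tilt_y=0.0)
right_view = tilt_pattern((24, 12), tilt_x=-2.0, tilt_y=0.0)
```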
[0045] It is appreciated that an imaging system may be configured to store sequences of any of the steps of process 200 to facilitate rapid execution. For example, when an imaging system implementing process 200 is a microscope, a "change sample" sequence may include multiple steps 250 to select a new specimen for viewing, to set illumination, polarization, sliders, objective lens and mechanical focus to "baseline" conditions, then may include step 205 to "clear" the effect of the SLM to a "no phase change" condition. Other examples of step sequences are described in processes 300 and 301 below.
[0046] In one example of process 200, a user of an imaging system (e.g., either of imaging systems 10(1) or 10(2)) first utilizes step 250 to set illumination, polarization, apertures, sliders and mechanical focus to baseline settings, and to select a low magnification objective lens and a specimen for viewing. The user utilizes step 205 to clear the SLM. The user begins to examine the specimen and decides to increase depth of field; thus step 215 modifies the SLM signal pattern to change phase to implement wavefront coding; step 225 may automatically follow, implementing signal processing to remove wavefront coding induced blur. If the user becomes interested in a particular feature of the specimen, he may utilize step 250 to select a higher magnification objective lens and to move a microscope stage. If the user then wishes to examine the feature at a particular depth, he may utilize steps 220 and 225 to implement wavefront coding to reduce depth of field (and implement appropriate processing) and utilize step 210 repeatedly, modifying the SLM signal pattern to change focus until the desired feature is in focus. Finally, if the user wishes to capture a set of images taken at various depths of the sample, he initiates a sequence in which the imaging system automatically and sequentially utilizes step 210 to set focus at a first depth of the sample, step 245 to capture an image, step 210 to set focus at a second depth of the sample, step 245 to capture an image, and so on until a desired number of images is obtained.
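The final image-capture sequence of this example might be organized as a simple loop; the function names below are placeholders for the control-unit operations of steps 210 and 245:

```python
def acquire_focus_stack(apply_focus_pattern, capture_image, depths):
    """Step 210 sets focus via the SLM; step 245 captures an image at each depth."""
    stack = []
    for depth in depths:
        apply_focus_pattern(depth)       # step 210: apply a stored focus pattern
        stack.append(capture_image())    # step 245: trigger the detector
    return stack

# Example usage with trivial stand-ins for the hardware operations.
images = acquire_focus_stack(lambda depth: None, lambda: "frame", depths=[0, 5, 10, 15])
```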
[0047] FIG. 6 illustrates a process 300 for extracting depth information of a sample (e.g., using imaging system 10(1)). Process 300 is, for example, implemented by processing and control unit 80, FIG. 1. In step 310, an SLM in an imaging system (e.g., SLM 60(1) in imaging system 10(1)) imparts an optical tilt in a first direction, forming a first image through an exit pupil of an objective lens of the imaging system (e.g., objective 40(1)). Step 320 captures the image formed in step 310 (utilizing, for example, image detector 110(1) and storing the captured image in image storage 86). In step 330, the SLM imparts an optical tilt in a second direction, forming a second image through the exit pupil. Step 340 captures the image formed in step 330. The first and second images are a stereo pair; unlike other optical arrangements that may generate a stereo pair using separate apertures or split apertures, each of the first and second images of the stereo pair generated in steps 310 - 340 utilizes the entire exit pupil. Step 350 processes the stereo pair of images to extract depth information of objects that are identifiable in each of the first and second stored images. Step 350 may be performed by image processor 88, for example under the control of software 84 utilizing image storage 86. The depth information may take the form of a three dimensional depth map, for example. The availability of depth information may be particularly useful when viewing thick specimens, because imaging system 10(1) may utilize the depth information to correct for sample-induced spherical aberration, as discussed above with respect to FIG. 1.
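Step 350 may be implemented in many ways; the sketch below uses simple block matching (an assumption, not a requirement of process 300) to convert the stereo pair into a disparity map that is approximately proportional to depth for small tilt angles:

```python
import numpy as np

def disparity_map(left, right, block=5, max_shift=8):
    """Per-pixel horizontal disparity by sum-of-squared-difference block matching."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w))
    for yy in range(half, h - half):
        for xx in range(half + max_shift, w - half - max_shift):
            ref = left[yy - half:yy + half + 1, xx - half:xx + half + 1]
            errs = [np.sum((ref - right[yy - half:yy + half + 1,
                                        xx - half + s:xx + half + 1 + s])**2)
                    for s in range(-max_shift, max_shift + 1)]
            disp[yy, xx] = np.argmin(errs) - max_shift
    return disp

# Synthetic stereo pair with a known shift; a calibration factor (assumed here)
# converts disparity to physical depth.
left = np.random.rand(32, 48)
right = np.roll(left, 2, axis=1)
depth_map = 0.1 * disparity_map(left, right)
```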
[0048] FIG. 7 illustrates a process 301 for utilizing imaging system 10(1) to correct sample-induced spherical aberration. Step 360 provides depth information of a sample; step 360 may be performed for example by process 300, or step 360 may be performed by providing depth information obtained about the sample by other means. Step 370 of process 301 calculates a signal pattern required for SLM 60(1) to impart a phase alteration to correct sample-induced spherical aberration. Step 380 of process 301 applies the signal pattern to the SLM (e.g., control unit 80 applies the signal pattern to SLM 60(1) through communication path 70(1)).

[0049] FIG. 8 illustrates a pixelated spatial light modulator 60(4) that may be used, for example, as SLM 60(1), FIG. 1. SLM 60(4) includes an array of hexagonal pixels 62', as shown. Circle 65 represents the location of an image of an aperture of imaging system 10(1) with respect to SLM 60(4). Edges of pixels 62' may exhibit phase anomalies when differing voltages are applied to adjacent pixels. If allowed to interact with electromagnetic energy passing through the SLM, the phase anomalies may cause unsatisfactory imaging performance. An opaque mask 68 covers the edges of pixels 62' so that electromagnetic energy passing through the pixel edges does not continue through an imaging system that utilizes SLM 60(4) (or, if the electromagnetic energy impinges on SLM 60(4) from a side that has mask 68, it may be blocked from reaching the pixel edges). It is appreciated that mask 68 may be applied to either side or both sides of SLM 60(4), and that an area of mask 68 may be optimized by reducing its extent at each pixel edge to block electromagnetic energy that interacts with phase anomalies.

[0050] FIG. 9 illustrates objects at varying distances from an imaging system 400 with a pixelated SLM. FIG. 9 is not drawn to scale. Imaging system 400 may be, for example, a cell phone. System 400 may image, at different times, any of a nearby object 410 at about a distance d2 from the system, a somewhat distant object 420 at about a distance d3 from the system, and a very distant object 430 at about a distance d4 from the system. It is appreciated that each of objects 410, 420 and 430 has depth in relation to system 400; that is, each object is not located at a single plane at any of the distances d2, d3 and d4 from system 400. Therefore, it is appreciated that imaging of objects 410, 420 and/or 430 may benefit from use of an SLM in imaging system 400 to implement wavefront coding optics to vary depth of field. For example, system 400 may include a processing and control unit (such as unit 80(1) described in connection with FIG. 1) that includes a processor, a user interface, software, signal pattern storage for the SLM, image storage, and an image processor.

[0051] A user of system 400 may select from stored SLM signal patterns to adjust depth of field for a given scene. For example, the user may select a stored SLM signal pattern that extends depth of field when imaging either of objects 410 or 420, but select a different SLM signal pattern that does not extend depth of field when imaging object 430.
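Referring back to process 301 (FIG. 7), the sketch below illustrates one way step 370 might compute a correction pattern from depth information; the primary-spherical Zernike term and the depth scaling coefficient are assumptions chosen only for illustration:

```python
import numpy as np

def spherical_correction_pattern(shape, depth_um, coeff_per_um=0.002):
    """Phase pattern opposing primary spherical aberration that grows with
    imaging depth (an assumed linear scaling)."""
    ny, nx = shape
    y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
    rho2 = np.clip(x**2 + y**2, 0.0, 1.0)                   # squared pupil radius
    zern_sa = np.sqrt(5.0) * (6 * rho2**2 - 6 * rho2 + 1)   # primary spherical term
    return -coeff_per_um * depth_um * zern_sa               # negative sign: a correction

# Step 360 provides depth (e.g., from process 300); steps 370/380 compute and
# apply the pattern.
pattern = spherical_correction_pattern((24, 12), depth_um=40.0)
```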
[0052] An example of a signal pattern for a 128-pixel SLM (such as that shown in FIG. 8) that implements extended depth of field is shown in Table 1. The coordinate system used in Table 1 corresponds to that shown with respect to SLM 60(4) in FIG. 3, wherein Y-coordinates vary from +/-12 and X-coordinates vary from +/-6, with the center pixel having the coordinate (X, Y) = (0, 0).
Table 1. Example Signal Pattern (per-pixel signal values are reproduced as image pages in the original publication)
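The specific values of Table 1 are published as image pages; the sketch below merely shows how a signal pattern might be generated programmatically over the same coordinate range (X from -6 to +6, Y from -12 to +12, center pixel at (0, 0)). The cubic phase form is an assumption commonly associated with extended depth of field and is not claimed to reproduce the table's values:

```python
def edf_signal_pattern(alpha=1.0):
    """Map each (X, Y) pixel address in the Table 1 coordinate range to a drive
    value derived from a cubic phase profile. In a real hexagonal array only a
    subset of these rectangular addresses is populated."""
    pattern = {}
    for x in range(-6, 7):
        for y in range(-12, 13):
            xn, yn = x / 6.0, y / 12.0        # normalize to the pupil extent
            pattern[(x, y)] = alpha * (xn**3 + yn**3)
    return pattern

values = edf_signal_pattern()
print(values[(0, 0)])                          # center pixel: zero phase offset
```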
[0053] System 400 may also generate a modified signal pattern by adding signal modifications (that change focus or change depth of field) to a signal pattern currently being applied. System 400 may also automatically modify processing of a detected image to remove blur introduced when wavefront coding is utilized. The capabilities of adding or adjusting wavefront coding, and modifying processing to remove wavefront coding induced blur, may be coordinated with automatic exposure and/or focusing capabilities of system 400 such that wavefront coding is only utilized when needed (e.g., to mitigate computation time and/or power consumption that may be associated with processing of images obtained with wavefront coding).

[0054] FIG. 10A is a side view, and FIG. 10B is a top view, of a vehicle 450 having imaging systems 460, 470, 480(1) and 480(2), each with a pixelated SLM. Dashed lines adjacent to each of systems 460, 470, 480(1) and 480(2) indicate approximately a maximum field of view of each system. Imaging system 460 images a field of view behind vehicle 450, and is sometimes called a "back up camera." Imaging system 470 images a field of view within vehicle 450 such that a driver of vehicle 450 may visually monitor the interior of the vehicle (e.g., to watch children, passengers, or other objects within the vehicle) without turning around. Imaging systems 480(1) and 480(2) image fields of view just behind the location of a driver, and just to the left and right of vehicle 450, respectively; these locations are sometimes called "blind spots." Images from imaging systems 460, 470, 480(1) and 480(2) may be displayed on screens visible by a driver of vehicle 450. Vehicle 450 may include a processing and control unit (like unit 80(1) described in connection with FIG. 1) that serves each of systems 460, 470, 480(1) and 480(2). For example, the processing and control unit within vehicle 450 may include a processor, a user interface, software, signal pattern storage for each SLM, image storage, and an image processor.
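Referring back to system 400 (FIG. 9), the coordination described above might reduce, in simplified form, to selecting a stored pattern from an estimated subject distance and scheduling matched deblurring only when wavefront coding is applied; the thresholds and names below are illustrative assumptions:

```python
def choose_pattern(distance_m, far_limit_m=5.0):
    """Nearby and mid-range subjects get an extended-depth-of-field pattern;
    very distant subjects get a 'clear' pattern so no deblurring is needed."""
    return "extended_dof" if distance_m < far_limit_m else "clear"

def capture_with_coding(distance_m, stored_patterns):
    """Pick a stored SLM pattern and note whether matched deblurring must run."""
    name = choose_pattern(distance_m)
    slm_pattern = stored_patterns[name]      # would be asserted to the SLM
    needs_deblur = (name != "clear")         # post-process only when coding is used
    return slm_pattern, needs_deblur

# Example with placeholder patterns.
patterns = {"extended_dof": "EDOF pattern", "clear": "clear pattern"}
print(capture_with_coding(0.8, patterns))    # near subject: coding plus deblur
print(capture_with_coding(30.0, patterns))   # distant subject: no coding
```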
[0055] SLMs within each of systems 460, 470, 480(1) and 480(2) may be utilized by a driver (or a passenger) of vehicle 450 in a similar manner to SLMs in previously-discussed imaging systems, to adjust a depth of field of the corresponding imaging system. For example, the driver or passenger may wish to see an image with depth of field that is larger or smaller than a current depth of field. Such an image may be generated by utilizing stored SLM signal patterns that implement wavefront coding to adjust depth of field for any of systems 460, 470, 480(1) and 480(2) (and, optionally, corresponding image processing) to produce the desired imaging characteristics.
[0056] Imaging systems 460, 470, 480(1) and 480(2) may be considered examples of imaging systems utilized for security purposes; it is appreciated that other security system applications may utilize imaging systems with pixelated SLMs to adjust depth of field to a desired range. Such systems may include processing and control units, like unit 80(1) described in connection with FIG. 1, that utilize stored SLM signal patterns to implement wavefront coding to adjust depth of field for each of systems 460, 470, 480(1) and 480(2) (and, optionally, corresponding image processing) to produce the desired imaging characteristics.

[0057] Changes described above, and others, may be made in the imaging systems with pixelated spatial light modulators described herein without departing from the scope hereof. For example, other imaging systems that use a pixelated spatial light modulator to change optical attributes may include, or be integrated into, articles such as stereo microscopes, surgical microscopes, confocal microscopes, ophthalmoscopes, endoscopes, vehicles, toys, security systems and/or articles of sporting goods. A pixelated spatial light modulator may be configured, through application of stored signal patterns, to enhance operation of an imaging system operating in modes such as bright field, dark field, fluorescence, phase contrast, and DIC. In another example, SLM 60(1), control unit 80(1) and communication path 70(1) may be manufactured as an attachment set suitable for integration with a conventional microscope, a stereo microscope, a surgical microscope, a confocal microscope, an ophthalmoscope, and/or an endoscope. It should thus be noted that the matter contained in the above description or shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense. The following claims are intended to cover all generic and specific features described herein, as well as all statements of the scope of the present method and system which, as a matter of language, might be said to fall therebetween.

Claims

What is claimed is:
1. In an imaging system, the improvement comprising a pixelated spatial light modulator for selectively modifying optical attributes of an image generated by the imaging system.
2. In the imaging system of claim 1, the further improvement wherein the pixelated spatial light modulator connects with a control unit that controls phase delays imparted by pixels of the pixelated spatial light modulator.
3. In the imaging system of claim 2, the further improvement wherein the control unit is configured to assert one or more stored signal patterns, each signal pattern comprising signals that, when asserted by the control unit, configure the pixelated spatial light modulator to change at least one optical attribute of the imaging system.
4. In the imaging system of claim 3, the further improvement wherein the optical attribute comprises one or more of a focal length of the imaging system, optical tilt, wavefront coding and a depth of field of the imaging system.
5. In the imaging system of claim 3, the further improvement wherein the optical attribute comprises a depth of field of the imaging system.
6. In the imaging system of claim 2, the further improvement wherein the control unit comprises an image processor and image storage.
7. In the imaging system of claim 6, the further improvement wherein the control unit applies a stored signal pattern to the pixelated spatial light modulator to implement wavefront coding and then processes data from a detector to remove blurring introduced by the wavefront coding.
8. In the imaging system of claim 1, the further improvement comprising a phase filter for modifying phase of a wavefront of the imaging system.
9. In the imaging system of claim 1, the further improvement wherein the spatial light modulator comprises an opaque mask disposed adjacent to pixel boundaries of the spatial light modulator.
10. In the imaging system of claim 1, the further improvement wherein the imaging system is integrated into one of a cell phone, a vehicle, a toy, a sporting goods article, and a security system.
11. A microscope, comprising a control unit and a pixelated spatial light modulator, responsive to the control unit, for selectively modifying optical attributes of an image generated by the microscope.
12. The microscope of claim 11, wherein the pixelated spatial light modulator is located near one of a back aperture plane of an objective lens and an image of the back aperture plane.
13. The microscope of claim 11, wherein the microscope comprises a plurality of objective lenses configured such that an objective lens in use has a back aperture plane located at the pixelated spatial light modulator.
14. The microscope of claim 13, the objective lenses configured in a turret of the microscope.
15. The microscope of claim 11, wherein the control unit controls phase delays imparted by pixels of the pixelated spatial light modulator.
16. The microscope of claim 15, wherein the control unit is configured to assert one or more stored signal patterns, each signal pattern comprising signals that, when asserted by the control unit, configure the pixelated spatial light modulator to change at least one optical attribute of the microscope.
17. The microscope of claim 16, wherein the optical attribute comprises one or more of an overall focus of the microscope, optical tilt, wavefront coding, a depth of field of the microscope, a correction for specimen-induced spherical aberration, and a correction for aberration induced by an objective lens of the microscope.
18. The microscope of claim 16, wherein the optical attribute comprises a depth of field of the microscope, the depth of field changing without changing an aperture of the microscope.
19. The microscope of claim 15, the control unit comprising an image processor and image storage.
20. The microscope of claim 15, the control unit configured such that when a stored signal pattern that configures the spatial light modulator for wavefront coding is asserted, image processing removes blurring introduced by the wavefront coding.
21. The microscope of claim 11, further comprising a phase filter for modifying phase of a wavefront of the microscope.
22. The microscope of claim 11, the spatial light modulator comprising an opaque mask disposed adjacent to pixel boundaries of the spatial light modulator.
23. The microscope of claim 11, forming one of a stereo microscope, a surgical microscope, a confocal microscope, an ophthalmoscope or an endoscope.
24. The microscope of claim 11, operating in a mode that is one of bright field, dark field, fluorescence, phase contrast, and differential interference contrast.
25. An attachment system that adjusts optical attributes of a microscope, comprising: a pixelated spatial light modulator and a control unit, the pixelated spatial light modulator responsive to signals of the control unit to selectively modify phase of a wavefront of the microscope, to achieve an adjustment of the optical attributes.
26. An imaging method, comprising: receiving a user input at a user interface of a control unit that stores a plurality of signal patterns; and applying one of the plurality of signal patterns to a pixelated spatial light modulator of an imaging system, to adjust one of (a) a focal length, (b) an optical tilt and (c) a depth of field of the imaging system in response to the user input.
27. The method of claim 26, further comprising modifying a currently applied signal pattern, such that a resulting change of phase induced by the spatial light modulator changes one of a focal length of the imaging system, an optical tilt and a depth of field of the imaging system.
28. The method of claim 26, the imaging system comprising a microscope, further comprising modifying a currently applied signal pattern, such that a resulting change of phase induced by the spatial light modulator provides one of a correction for specimen-induced spherical aberration and a correction for aberration induced by an objective lens of the microscope.
29. The method of claim 26, the step of applying comprising implementing wavefront coding.
30. The method of claim 29, further comprising implementing image processing that removes a blur introduced by the wavefront coding.
31. An imaging method, comprising: applying a first set of signals from a control unit to a pixelated spatial light modulator in an imaging system, such that the pixelated spatial light modulator imparts a first optical tilt to a phase modulation pattern of the imaging system; capturing a first image from the imaging system; applying a second set of signals from the control unit to the pixelated spatial light modulator, such that the pixelated spatial light modulator imparts a second optical tilt to the phase modulation pattern; capturing a second image from the imaging system; and processing the first and second images to extract depth information.
32. The method of claim 31, the processing step comprising extracting a three dimensional depth map.
33. The method of claim 31, further comprising the step of utilizing the depth information to calculate a spherical aberration correction.
34. An imaging method, comprising: providing information of a depth of an object, calculating a signal pattern for a pixelated spatial light modulator based on the information, and applying the signal pattern, so calculated, to the pixelated spatial light modulator to correct spherical aberration caused by the object.
35. A software product, comprising instructions stored in computer readable media, wherein the instructions, when executed by a computer, perform steps for adjusting optical attributes of an imaging system, comprising: instructions for receiving a user input at a user interface of a control unit that stores a plurality of signal patterns; and instructions for applying one of the plurality of signal patterns to a pixelated spatial light modulator of an imaging system, to adjust one of (a) a focal length, (b) an optical tilt and (c) a depth of field of the imaging system in response to the user input.
36. Software product of claim 35, further comprising instructions for modifying a currently applied signal pattern, such that a resulting change of phase induced by the spatial light modulator changes one of a focal length of the imaging system, an optical tilt and a depth of field of the imaging system.
37. Software product of claim 35, the imaging system comprising a microscope, the step of applying comprising modifying a currently applied signal pattern, such that a resulting change of phase induced by the spatial light modulator provides one of a correction for specimen-induced spherical aberration and a correction for aberration induced by an objective lens of the microscope.
38. Software product of claim 35, the step of applying comprising implementing wavefront coding.
39. Software product of claim 38, further comprising instructions for implementing image processing that removes a blur introduced by the wavefront coding.
40. A computer readable medium containing a data structure that stores signal patterns for a pixelated spatial light modulator of an imaging system, at least one of the signal patterns operable to configure the spatial light modulator to introduce one of a focal length change, an optical tilt, and a depth of field change in the imaging system.
41. Computer readable medium of claim 40, at least one of the signal patterns operable to configure the spatial light modulator to implement wavefront coding in the imaging system.
42. Computer readable medium of claim 41, the data structure further storing image processing parameters for removing a blur, introduced by the wavefront coding, from image data of the imaging system.
PCT/US2006/009958 2005-03-18 2006-03-20 Imaging systems with pixelated spatial light modulators WO2006102201A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US66327105P 2005-03-18 2005-03-18
US60/663,271 2005-03-18

Publications (2)

Publication Number Publication Date
WO2006102201A1 true WO2006102201A1 (en) 2006-09-28
WO2006102201A8 WO2006102201A8 (en) 2007-03-29

Family

ID=36572248

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/009958 WO2006102201A1 (en) 2005-03-18 2006-03-20 Imaging systems with pixelated spatial light modulators

Country Status (1)

Country Link
WO (1) WO2006102201A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5115335A (en) * 1990-06-29 1992-05-19 The United States Of America As Represented By The Secretary Of The Air Force Electrooptic fabry-perot pixels for phase-dominant spatial light modulators
US5291012A (en) * 1991-04-30 1994-03-01 Isao Shimizu High resolution optical microscope and irradiation spot beam-forming mask
EP0550886A2 (en) * 1991-12-24 1993-07-14 Research Development Corporation Of Japan Aberration correction method and aberration correction apparatus
WO1996024085A1 (en) * 1995-02-03 1996-08-08 The Regents Of The University Of Colorado Extended depth of field optical systems
US5666197A (en) * 1996-08-21 1997-09-09 Polaroid Corporation Apparatus and methods employing phase control and analysis of evanescent illumination for imaging and metrology of subwavelength lateral surface topography
US6025951A (en) * 1996-11-27 2000-02-15 National Optics Institute Light modulating microdevice and method
WO2001035155A1 (en) * 1999-11-08 2001-05-17 Wavefront Analysis Inc. System and method for recovering phase information of a wave front
US20010006429A1 (en) * 1999-12-24 2001-07-05 Philips Corporation Optical wavefront modifier
US20030063384A1 (en) * 2001-08-31 2003-04-03 Dowski Edward Raymond Wavefront coding optics
WO2003073153A1 (en) * 2002-02-27 2003-09-04 Cdm Optics, Inc. Optimized image processing for wavefront coded imaging systems

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7944467B2 (en) 2003-12-01 2011-05-17 Omnivision Technologies, Inc. Task-based imaging systems
US8144208B2 (en) 2003-12-01 2012-03-27 Omnivision Technologies, Inc. Task-based imaging systems
US8760516B2 (en) 2003-12-01 2014-06-24 Omnivision Technologies, Inc. Task-based imaging systems
US8558873B2 (en) 2010-06-16 2013-10-15 Microsoft Corporation Use of wavefront coding to create a depth image
WO2013130077A1 (en) * 2012-02-29 2013-09-06 Agilent Technologies, Inc. Software defined microscope
US20140368904A1 (en) * 2012-02-29 2014-12-18 Agilent Technologies, Inc. Software Defined Microscope
WO2018136251A1 (en) * 2017-01-19 2018-07-26 Oculus Vr, Llc Focal surface display
US10330936B2 (en) 2017-01-19 2019-06-25 Facebook Technologies, Llc Focal surface display
US10558049B2 (en) 2017-01-19 2020-02-11 Facebook Technologies, Llc Focal surface display
DE102017125453B3 (en) 2017-10-30 2019-02-21 Carl Zeiss Meditec Ag Surgical microscope and procedures performed with the surgical microscope
US10440256B2 (en) 2017-10-30 2019-10-08 Carl Zeiss Meditec Ag Surgical microscope and method implemented with the surgical microscope

Also Published As

Publication number Publication date
WO2006102201A8 (en) 2007-03-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06738938

Country of ref document: EP

Kind code of ref document: A1