WO2007084943A2 - Methods and apparatus for locating the eye in three dimensions - Google Patents

Methods and apparatus for locating the eye in three dimensions

Info

Publication number
WO2007084943A2
Authority
WO
WIPO (PCT)
Prior art keywords
eye
light
electronic imaging
eyes
reflected
Application number
PCT/US2007/060689
Other languages
French (fr)
Other versions
WO2007084943A3 (en)
Inventor
Don Yancey
Jean-Christophe Roulet
Original Assignee
Don Yancey
Jean-Christophe Roulet
Application filed by Don Yancey, Jean-Christophe Roulet
Publication of WO2007084943A2 publication Critical patent/WO2007084943A2/en
Publication of WO2007084943A3 publication Critical patent/WO2007084943A3/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113: Objective types for determining or recording eye movement
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14: Arrangements specially adapted for eye photography
    • A61B3/15: Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection; with means for relaxing
    • A61B3/152: Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection, for aligning

Definitions

  • An electronic imaging device is arranged with respect to the lens so that the left-eye and right-eye focused light impinges upon the electronic imaging device.
  • the electronic imaging device generates left-eye position data and right-eye position data based on locations on the electronic imaging device where the left-eye and right-eye focused lights impinge, respectively.
  • a data processor receives the left-eye position data, the right-eye position data, and the mirror position (x-axis) data, and uses this information to calculate three-dimensional locations of the left and right eyes.
  • Vertex distance, if known, can be substituted for the x-axis mirror position data.
  • Vertex distance, along with angle data from the electronic imaging device, can be used to calculate three-dimensional locations of the left and right eyes.
  • two cameras are used to determine the three-dimensional location of a single eye.
  • Such a device includes first and second light sources, each arranged to project light onto an eye target area to produce first and second reflected light, respectively.
  • First and second lenses are arranged with respect to the first and second light sources and the eye target area so that the first and second reflected light pass through the first and second lenses, respectively. This produces first and second focused light.
  • the first and second light sources are arranged to be coaxial with the first and second lenses, respectively.
  • First and second electronic imaging devices are arranged with respect to the first and second lenses so that the first and second focused lights impinge upon the first and second electronic imaging devices, respectively.
  • the first and second electronic imaging devices generate first position data and second position data based on locations on the first and second electronic imaging devices where the first and second focused lights impinge, respectively.
  • Known data are the locations of the cameras with respect to each other and with respect to the center point of the nose pads.
  • a data processor receives the first position data and the second position data. Based on these data, along with the known locations of each camera, the processor calculates a three-dimensional location of the eye with respect to a known point, such as the center point of the nose pads or the known location of the cameras.
  • Yet another embodiment of the present invention utilizes four cameras (again, "camera” being used to mean a light sensor device with a lens) and IR illumination to determine the three-dimensional location of each of two eyes.
  • Such an apparatus includes first and second right-eye light sources, each of which is arranged to project light onto a right-eye pupil to produce first and second right-eye reflected light, respectively.
  • Corresponding first and second left-eye light sources are arranged to produce first and second left-eye reflected light, respectively.
  • First and second right-eye lenses are arranged with respect to the first and second right-eye light sources and the right-eye target area so that the first and second right-eye reflected light pass through the first and second right-eye lenses, respectively. This produces first and second right-eye focused light.
  • the first and second right-eye light sources to illuminate the eye are arranged to be coaxial with the first and second right-eye lenses.
  • Corresponding first and second left-eye lenses arranged to produce first and second left-eye focused light. Again, the first and second left-eye light sources are coaxial with the first and second left-eye lenses.
  • First and second right-eye electronic imaging devices are arranged with respect to the first and second right-eye lenses so that the first and second focused right-eye light impinges upon the first and second right-eye electronic imaging devices, respectively.
  • the first and second right-eye electronic imaging devices generate first and second right-eye position data based on locations on the first and second right-eye electronic imaging devices where the first and second right-eye focused lights impinge, respectively.
  • the first and second left-eye electronic imaging devices generate first and second left-eye position data based on where the first and second left-eye focused lights impinge, respectively.
  • a data processor receives the first and second left-eye position data and the first and second right-eye position data. The data processor then uses these data to calculate three-dimensional locations of the left and right eyes with respect to the known point.
  • Figure 1 is an illustration of a top view of a prior art device
  • Figures 2A and 2B are side and perspective views of an embodiment of the present invention.
  • Figure 3 is a top view of an embodiment of the invention utilizing two cameras to locate two eyes;
  • Figure 4 is a perspective view of the embodiment of Figure 3
  • Figure 5 is a perspective view of the embodiment of Figure 3 utilizing moving cameras
  • Figure 6 illustrates one possible configuration of multiple emitters, a beam splitter, a lens, and a sensor
  • Figure 7 is a schematic illustration of the embodiment of Figure 3, identifying points used in an explanation of the geometry of the invention.
  • Figure 8 is a schematic illustration of the embodiment of Figure 3 demonstrating how the geometry of the device facilitates determination of the three-dimensional location of the eyes;
  • Figure 9A is a schematic view of an embodiment utilizing a single camera to detect the location of two eyes in conjunction with another device to measure the separation of the eyes along the x-axis;
  • Figure 9B is aligned with Figure 9A to show moving mirrors used to determine the x-axis dimensions of the two eyes;
  • Figure 10 is a schematic illustration of the geometry underlying the embodiment of Figures 9A and 9B;
  • Figure 11 illustrates an embodiment utilizing four cameras to determine the three-dimensional location of two eyes
  • Figure 12 illustrates the geometry underlying the embodiment of Figure 11
  • Figure 13 illustrates a portion of an embodiment in which a single camera is moved between two locations to identify the three-dimensional location of a single eye
  • Figure 14 illustrates an embodiment of the present invention implemented in connection with one eyepiece of a pair of binoculars
  • Figure 15 illustrates the interrelationship between the sensor, beam splitter, emitter, lens, and eye of the subject.
  • In Figure 2A, a number of elements are added to those of the prior art device illustrated in Figure 1.
  • lens 102 collimates light 104 from target 101 so that eyes 105, 106 see a target at optical infinity and lines of vision 105a, 106a, respectively, are parallel to each other.
  • Occluder 107 selectively occludes either right eye 105 or left eye 106.
  • Figure 2A illustrates a first view from the side.
  • Two emitters 110 and 111 are arranged at a level below the fixation target 101 and light source 103.
  • a lens 101A lies between the fixation target and beam splitter 113. Lens 101A makes the fixation target appear at optical infinity for the subject.
  • the two emitters can lie in a common horizontal plane, and therefore only one is visible in the side view of Figure 2A.
  • the emitters 110 and 111 emit light that reflects off of beam splitters 112 and 113.
  • the emitters 110, 111 and the beam splitters 112 and 113 are positioned so that light from emitter 110 impinges upon the right eye 105, whereas light from emitter 111 impinges upon left eye 106.
  • Below beam splitter 112 lies lens 114 and a sensor or imaging device 115.
  • the sensor can be a charge-coupled device (CCD) or other technology. It is, in any event, a device that converts received light intensity upon various portions of an area into a signal that can be interpreted to determine where light is impinging upon such area. These signals can be interpreted by a microprocessor 116.
  • the emitters 110 and 111 are positioned with respect to the first beam splitter 112 and lens 114 so that the light emitted by each emitter is coaxial with lens 114. This arrangement is further illustrated in connection with the perspective view of Figure 2B. Such coaxial arrangement means that corneal reflections or bright pupils will be reflected back through the lens 114 and onto the sensor 115.
  • a single emitter can be used if such emitter is of sufficiently wide angle so as to illuminate both eyes.
  • coaxial it is meant that the illumination beam of the emitter is coincident with the sensor's field of view. Stated another way, the illumination is coaxial with an imaginary line running between the sensor and the pupil of the eye being observed.
  • the fixation target 101 can be fogged or otherwise adjusted so as to inhibit accommodation. So that the fixation target appears at optical infinity, or any other distance, lens 101A is employed. Two fixation targets could also be used instead of the single fixation target 101, with one fixation target for each eye. With such an arrangement, the two targets must be properly collimated for each eye.
  • light from emitters 110 and 111 reflect sequentially off of beam splitters 112 and 113 to impinge upon right eye 105 and left eye 106, respectively.
  • the light then reflects off of eyes 105 and 106 and returns to reflect off of beam splitter 113.
  • the reflected light from eyes 105 and 106 then passes through beam splitter 112.
  • the reflected light then passes through lens 114 and is focused on a surface of sensor 115.
  • Emitters 110 and 111, beam splitters 112 and 113, lens 114, and sensor 115 are positioned in a fixed relationship with one another. Accordingly, the only variation as to the points of the surface of the sensor 115 upon which the reflected light originating from emitters 110 and 111 impinges is due to a variation in position of the eyes 105 and 106.
  • by using microprocessor 116 to interpret the locations of the points of reflected light from eyes 105 and 106 that impinge on sensor 115, two lines along which right and left eyes 105 and 106 are located, and the angles of such lines, can be found.
  • another embodiment utilizes a combination of a sensor and two emitters for each eye. This embodiment allows for the fast and accurate location of two eyes in three-dimensional space, with the further ability to provide IPD and ½ IPD dimensions.
  • two separate sensors or electronic imaging devices 128 and 129 are provided.
  • illustrated along with sensors 128 and 129 are the schematically represented light source and optics 150 and 151 and eyes 107, 106, respectively.
  • Each of sensors 128 and 129 is located at an appropriate focal length from the optics in associated elements 150 and 151, respectively.
  • Light source and optics 150 generates light that impinges upon the general areas where right eye 107 and left eye 106 are expected to be located during three-dimensional acquisition of the eyes. These areas where the eyes are expected to be located are the left-eye and right-eye target areas.
  • One option is for there to be a continuous beam that encompasses both of eyes 106 and 107.
  • Another option is to provide two separate beams, one projected in the direction of each of eyes 106 and 107. In the illustration of Figure 3, two separate light beams are provided. The lateral limits 160 of the beam projected on right eye 107 and the lateral limits 170 of the beam projected on left eye 106 appear in Figure 3.
  • the light projected onto eyes 106 and 107 will be reflected by the eyes, some of which will be reflected back to light source and optics 150.
  • Projected light 170 and 160 enters pupils of each of eyes 106 and 107, so that eyes 106, 107 reflect the projected light 170, 160 directly back toward light source and optics 150.
  • This reflected light appears as light beam 161 from right eye 107 and light beam 171 from left eye 106.
  • because sensor 128 is positioned at an appropriate focal length of the optics in light source and optics 150, the reflected light beams 161 and 171 will impinge on sensor 128 as light beams 162 and 172, respectively. Accordingly, there will be two bright spots of light on the two-dimensional surface of sensor 128. The two-dimensional location of each of such bright spots is represented in the signals generated by sensor 128 and received by processor 116.
  • Light source and optics 151 is constructed and operates in relation to sensor 129 in the same way that light source and optics 150 interacts with sensor 128.
  • the light projected from light source and optics 151 onto left eye 106 and right eye 107 is not illustrated, nor is the light reflected back by the eyes so as not to clutter the illustration.
  • the light reflected back by right eye 107 is focused on sensor 129 as light beam 192.
  • the light reflected back by left eye 106 is focused on sensor 129 as light beam 182.
  • the position of each of the sensors 128, 129 and light source and optics 150 and 151 is fixed. Therefore the only variability as to the location of the points of light on the sensors 128, 129 is the positions of the eyes 106 and 107.
  • each point uniquely identifies a line along which the eye that produced such point by "bright pupil" reflection must be located in the form of a unique pixel location and subsequent angle of impinging ray.
  • Each of the two points of light on sensor 129 created by light beams 182 and 192 similarly defines a line along which the eye that produces such point by reflection must be located.
  • the processor 116 is connected to each of the sensors 128 and 129 to receive the data generated thereby, representative of the location of the two points of light on each of sensors 128 and 129. Processor 116 therefore performs the analysis to define the three-dimensional location of each of the eyes 106 and 107.
  • Processor 116, having calculated the position of each eye in three-dimensional space, can then calculate the IPD. If the device is further provided with a nose rest, a reference line 130 can be established, aligned with the patient's nose.
  • the reference line 130 has a fixed and known position with respect to light sources and optics 150 and 151 as well as sensors 128 and 129. With this additional data, the individual ½ IPDs associated with the left and right eyes can also be calculated.
  • Figure 4 is a schematic representation of the principle underlying the present embodiment. In addition to the various light paths illustrated in Figure 3 are reflected light beam 191 from right eye 107 to light source and optics 151 as well as reflected light beam 181 from left eye 106 to light source and optics 151.
  • the reflected and focused light beam 192 from right eye 107 impinges upon sensor 129 at point 130 and the reflected and focused light beam 162 impinges upon sensor 128 at point 132.
  • the reflected and focused light beam 182 from left eye 106 impinges upon sensor 129 at point 131 and the reflected and focused light beam 172 impinges upon sensor 128 at point 133.
  • the location of points 130 and 132 serve to define the three-dimensional location of right eye 107.
  • Points 131 and 133 serve to define the three-dimensional location of left eye 106. It is understood that various combinations of lines, sides of various triangles, subtended angles, and supplementary angles can be used to calculate the three-dimensional location of each eye.
  • each of the elements of the device itself lies in a fixed relationship with each other element.
  • each of the sensors 128 and 129 with associated optics and illumination is movable. This movement can be achieved by use of stepper motors or other known mechanism for mechanical control.
  • the stepper motors can be under the control of processor 116.
  • each of sensors 128 and 129 is moveable to be coincident with the optical axis of eyes 107 and 106, respectively.
  • points 131 and 132 can be located in the two-dimensional center of sensors 129 and 128, respectively.
  • the elements can be arranged so that light beams that impinge on the sensors at points 131 and 132 follow a path that is normal to the surface of sensors 129 and 128, respectively. This facilitates various measurements such as imaging structures of the eye, measuring corneal curvature, and imaging the eyes "straight-on" for undistorted display of the external eye. While each of the sensors is illustrated as being movable along both lateral and vertical axes, it is also possible that the sensors would be movable in only one of such axes or in a third axis orthogonal to the first two axes.
  • a "dark pupil" could also be used, with illumination of the eye so that the eye's iris/pupil is imaged onto a sensor.
  • the "dark pupil" method is equivalent to the methods presently being described, except that the sensor's data processor requires software to recognize the "dark pupil" pixel location.
  • Light sources or emitters 201 and 202 are arranged with respect to a beam splitter 210 to project light onto beam splitter 210.
  • Each of emitters 201 and 202 can be of a variety that generates infrared light, but is not required to be.
  • the light that reflects off of beam splitter 210 passes through lens 211.
  • Single lens 211 could be replaced by a lens group.
  • Emitters 201, 202, beam splitter 210, and lens 211 are arranged so that the light originating from emitter 201 impinges upon right eye 107, and the light originating from emitter 202 impinges upon left eye 106.
  • the reflected beams 161 from right eye 107 and 171 from left eye 106 are not illustrated in Figure 6, but each would pass through lens 211 and beam splitter 210 to produce beams 162 and 172, respectively.
  • Sensor 128 is positioned at an appropriate focal length of lens 211 in order to properly image eye. Additionally, each of emitters 201 and 202 is positioned so as to be coaxial with sensor 128 and lens 211. Modification of the basic structure is possible and such variations are known to one of skill in the art. Among other variations, the lens can be placed between the beam splitter and the sensor. The two emitters can be replaced with a single emitter if such single emitter produces a beam that is sufficiently broad so as to impinge upon both eyes. The emitters can be of a variety that generates infrared light of approximately 920 nm, but need not be so.
  • the light source and optics, as a unit, must project light upon an area where one or both eyes of the patient are expected to be found, and be arranged with respect to a sensor so that the reflected light is focused to a spot on a two-dimensional surface of the sensor in such a way that the location of the spot defines a unique line along which the eye producing the reflection must lie, and whose angle is uniquely defined, as will shortly be illustrated.
  • the interrelationship between such two lines defines a unique three-dimensional location for the eye.
  • Figures 7 and 8 illustrate the geometry that underlies the translation of data received from sensors 128 and 129 into the three-dimensional locations of eyes 106 and 107.
  • Points A and B are known points at which the cameras are positioned, and the known length AB is used in the calculations.
  • the reference line 130 exists in a known relationship with lenses 224 and 225, emitters 220 and 221, beam splitters 222 and 223, and sensors 128 and 129.
  • Each set of emitter, beam splitter, and lens in Figure 7 is arranged so that the light produced by the emitter is coaxial with the associated lens.
  • a right eye line of vision 230 extends directly forward from right eye 107 because the fixation target viewed by the subject is located at optical infinity.
  • the fixation target could instead be set at a near distance so that the eyes' lines of vision converge, but for simplicity it is assumed the fixation target is at optical infinity and the eyes' lines of vision are parallel to each other.
  • a line of vision 231 extends from left eye 106 in parallel with line of vision 230.
  • Point 133 on sensor 128 uniquely defines light path 171 in three-dimensional space, given the known relationship between sensor 128 and lens 224.
  • spot 131 uniquely identifies light path 181.
  • points 132 and 130 on sensors 128 and 129 define lines 161 and 191, respectively.
  • Lines 161 and 191 then uniquely define point D in three-dimensional space.
  • Points A and B are defined by the known position of lenses 224 and 225, respectively.
  • Reference line 130 is defined by the known position of the nose rest. Given this information, the three-dimensional locations of points A and B are known, as is the plane defined by reference line 130, and points C and D can be calculated.
  • Figure 8 illustrates the geometric interrelationship of these known elements.
  • Angles α and β are defined by points 133 and 131, respectively.
  • Length AB is known, as both points are defined by the position of lenses 224 and 225, respectively.
  • With length AB and angles α and β known, the three-dimensional location of point C can be derived. In similar fashion, the three-dimensional location of point D can be derived from known AB and points 130 and 132.
  • CD defines IPD for the patient. DE defines ½ IPD for right eye 107, and CE defines ½ IPD for left eye 106. Knowing AB and C, it is also possible to derive G. Correspondingly, knowing AB and D, one can derive F. CG therefore defines the distance of left eye 106 from the plane of AB, representing the z-dimensional position of left eye 106 or its vertex distance. Similarly, the z-dimension or vertex distance of right eye 107 can also be derived. (A numerical sketch of this triangulation appears at the end of this section.)
  • a moving mirror device is first used to establish an x-dimension distance between the pupils, or IPD.
  • An emitter 240 generates a beam of light that reflects off of a beam splitter 241 toward a moving mirror 242.
  • Moving mirror 242 is constructed to be controllably movable along an axis that will cause it to pass in front of the expected positions of the right and left eyes 107 and 106 of the patient. It is not critical that the path of moving mirror 242 be precisely aligned with eyes 106 and 107 on the y-axis, so long as some of the light reflected onto the eyes by mirror 242 is reflected back to mirror 242.
  • Sensor 243 is aligned with the path of mirror 242.
  • Mirror 242 begins at its rest position where it is shown in a solid line in Figure 9A. In use, mirror 242 moves along the x axis while light from emitter 240 reflects off of beam splitter 241 and onto moving mirror 242. Any light reflected back from moving mirror 242 will pass through beam splitter 241 and impinge on sensor 243. To measure IPD, moving mirror 242 moves from its rest position along a path parallel to the x axis in front of right eye 107 and left eye 106. As moving mirror 242 passes near either of the eyes, some of the light originating from emitter 240 and reflected by moving mirror 242 onto the eye will be reflected back. This light reflected back by either of the eyes will ultimately impinge on and be detected by sensor 243.
  • the ordinate of the plot of Figure 9A is the intensity of light impinging on sensor 243.
  • the intensity is at a baseline value.
  • the intensity of light impinging on sensor 243 increases as some of the light originating from emitter 240 and reflecting off moving mirror 242 is reflected back by right eye 107.
  • the intensity of the reflected light reaches a maximum value, beyond which the intensity returns to the baseline value.
  • the process repeats as moving mirror 242 approaches and passes by left eye 106.
  • the locations of, and hence distance between, the right eye 107 and left eye 106 is discernible by the two peaks of light intensity as measured by sensor 243.
  • the first data regarding the location of the eyes in three dimensions is established, namely the distance between the eyes along the x axis, or the IPD. In the illustration of Figure 9A, this corresponds to the distance between points B and C.
  • light is projected onto right eye 107 and left eye 106.
  • the light is projected by way of an emitter and beam splitter located in a known relationship with the lens 250, so long as the light produced is coaxial with lens 250.
  • the light that reflects off of the eyes 106 and 107 is then focused at two distinct points D and E, respectively, on a surface of sensor 260.
  • Sensor 260 is located at an appropriate focal length of lens 250 in relation to imaging eyes.
  • the location of spot D on the two-dimensional surface of sensor 260 defines a unique line in three-dimensional space along which left eye 106 must be located.
  • the same known relationship, in view of the location of spot E on the two-dimensional surface of sensor 260, defines a unique line in three-dimensional space along which right eye 107 must be located.
  • Point A is known in light of the known position of lens 250, which also lies on the centerline of light sensor 260.
  • the locations of points D and E are known by interpreting data output by sensor 260, which interpretation can be performed by a processor such as processor 116 illustrated in connection with previous embodiments.
  • the known relationship between sensor 260 and lens 250 means that each potential location of spots D and E corresponds to specified angles α and β. Angle γ is therefore known (γ = 180° - α - β, as the sum of the angles of a triangle must equal 180°).
  • the known relationship between points A, D, and E serves to define two lines that encompass line segments AC and AB. The location of points B and C along these lines is not known from the position of points A, D, and E, however.
  • length BC is known. This length, and the fact that line segment BC lies in a plane parallel to the plane of the surface of sensor 260, allows for the derivation of the unique positions of points B and C along the previously defined lines using elementary geometry. With the three-dimensional locations of points A through E established, the locations of points F and G can then be derived. Length CF therefore defines the vertex distance for right eye 107, and length BG defines the vertex distance for left eye 106. If the device is provided with a nose rest having a known position in relation to the other elements of the device, plane 130 is known. The position of plane 130 with respect to points B and C serves to define the two ½ IPD distances.
  • two lenses and two sensors, that is, two cameras, are provided for each eye, for a total of four lenses and four sensors, that is, a total of four cameras.
  • Four sensors 311-314 are arranged at the appropriate focal lengths of four lenses 301-304, respectively. Each is positioned so that an optical path exists between an area where one eye of a patient is expected to be found and one of sensors 311-314, passing through lenses 301-304, respectively. While the particular arrangement illustrated in Figure 11 has all of the sensors and lenses inboard of the anticipated position of the eyes 106 and 107, this is not a requirement.
  • the lens/sensor sets could be positioned on opposite sides of each eye, or above and below the level of the eyes.
  • Lines 321, 322 represent the paths of light that are reflected back from right eye 107 from such emitters, which pass through lenses 301 and 302 and impinge upon sensors 311 and 312, respectively.
  • Lines 323, 324 represent the paths of light that are reflected back from left eye 106 from such emitters, which pass through lenses 303 and 304 and impinge upon sensors 313 and 314, respectively.
  • Each of these paths of light 321-324 returned from one of the eyes is focused into one of spots 331-334 on one of sensors 311-314, respectively.
  • the location of each of spots 331-334 on the two-dimensional surface of sensors 311-314 uniquely defines a line in three-dimensional space along which the eye from which such light was reflected must lie, as previously described. As there are two such lines associated with each eye, there exists only one location of each eye that lies along both lines associated with that eye. This serves to uniquely identify the location of both eyes in three-dimensional space.
  • Figure 12 illustrates the geometric interrelationship of these known elements.
  • Angles α and β are defined by points 331 and 332, respectively.
  • Length AB is known, as both points are defined by the position of lenses 301 and 302, respectively.
  • With length AB and angles α and β known, the three-dimensional location of point C can be derived.
  • the three-dimensional location of point F can be derived from known DE and points 333 and 334.
  • CF defines IPD for the patient.
  • CG defines ½ IPD for right eye 107
  • FG defines ½ IPD for left eye 106.
  • FI therefore defines the distance of left eye 106 from the plane of AB, representing the z-dimensional position of left eye 106 or its vertex distance. Similarly, the z-dimension or vertex distance of right eye 107 can also be derived.
  • the two sets of sensors, lenses, and associated emitters provided for each eye can be replaced by a single such set.
  • the single set takes a first measurement at a first position, provides the output from the sensor to a processor, and then moves to a second position and takes a second reading, the data associated with which is also provided to the processor.
  • This is illustrated in Figure 13, using elements corresponding to those of Figure 11.
  • Lenses 301 and 302 are replaced by single lens 301.
  • Sensors 311 and 312 are replaced by single sensor 311.
  • the location of spot 331 is determined and, as explained in connection with the previous embodiment, line 321 is defined.
  • the combination of lens 301 and sensor 311 is then moved to the second position indicated, corresponding to the position of lens 302 and sensor 312 of Figure 11.
  • the location of spot 332 is determined and line 322 is defined.
  • One application of the present invention is that of automatically centering the optics of binoculars and other optical devices.
  • As the optical axis of the eye of a user moves away from the optical axis of the optics through which such eye is viewing, a portion of the image is removed from view. This is referred to as vignetting.
  • Automatically centering the optics with the user's eyes would serve to prevent vignetting and to ensure comfortable viewing.
  • Such centering would amount to adjustment of the optics along the x and y axes with respect to the user's eyes. Considered from the perspective of the user, this would mean moving the optics left/right and up/down.
  • Automatic centering of the optics with respect to the eyes can be done in a continuous and automatic manner so that optical axis of each set of optics in the binoculars is automatically and continuously aligned with the optical axis of the associated eyes. This enables the viewer to concentrate on the image instead of constantly correcting the position of the device with respect to the user's eyes. This increases viewing clarity as well as viewing comfort. Particularly in the context of low-illumination viewing, continuous automatic centering offers a considerable benefit.
  • lenses 401 and 402 are elements of an adjustable diopter lens. These may be lenses in the eyepiece of an optical device, such as binoculars. Lenses 401 and 402 are movable as a unit, which will be referred to as an eyepiece. This movement may be achieved by a stepper motor under control of a processor. The movement would be possible in the x and y axes, or up/down and left/right from the perspective of a user during use.
  • Associated with each eyepiece with lenses 401 and 402 are sensors 420 and 421, lenses 410 and 411, and apertures or windows 430 and 431. Sensors 420 and 421 are positioned at the appropriate focal length of lenses 410 and 411, respectively. Also present are emitters 450 and 451 and further optical elements such as beam splitters 460 and 461 that provide for light to be cast upon the user's eye 403. The light is cast in such a way as to be coaxial with lenses 410, 411 and associated sensors 420, 421. The light returns after reflecting from eye 403 along lines 440 and 441.
  • Each set of sensor, lens, beam splitter and emitter is considered to be a camera for the purposes of this and other embodiments.
  • the emitter may be an IR emitter.
  • Each camera is tilted at an angle θ (420a) with respect to a plane parallel to the eyepiece lenses so as to be directed toward the user's eye. This angle is incorporated into the calculation of the position of eye 403.
  • Each combination of an eyepiece and two cameras can be movable as a single unit so as to be properly positioned to be centered and aligned with eye 403.
  • the wide beam of infrared illumination is illustrated by the divergent limits of the IR illumination exiting each of the windows 430 and 431 and directed toward the eye 403.
  • the beams 440 and 441 represent the bright pupils reflected back toward the cameras
  • the point that reflected light beam 440 impinges upon the two-dimensional surface of sensor 420 uniquely defines line 440 in three-dimensional space along which eye 403 must lie, in light of the known spatial relationship between lens 410 and sensor 420.
  • the two lines 440 and 441 uniquely define the location of eye 403 in three dimensions.
  • a processor that receives as input the output of sensors 420 and 421 can perform the calculations necessary to make such determination of the location of eye 403 and control stepper motors or other mechanical devices to maintain the eyepiece with lenses 401 and 402 in alignment with eye 403.
  • the components illustrated in Figure 14 could be duplicated so that each of two eyepieces could be independently controlled.
  • The maximum distance, usually expressed in millimeters, that an optical device can be held from the eye with the full field of view comfortably observed is known as eye relief.
  • In some devices the eye relief is so small that insufficient space is provided for the arrangement of elements as illustrated in Figure 14.
  • One solution to this problem is to have sensors 420, 421 and their associated emitters project and receive light through lenses 401 and 402 of the eyepiece itself. This would likely entail larger eyepieces and a larger field of vision for the eye.
  • the movable eyepieces can be linked to associated optics using prisms or other well known methods.
  • the eyepieces could be manually adjusted to within a few millimeters of alignment, and then the mechanism described above would provide fine adjustment of interpupillary distance (auto IPD) to achieve fine alignment between the eyepieces and eyes. Adjustment of as little as +/- 5 mm in the X and Y axes would likely be sufficient to obtain fine adjustment for the IPD of most users, which averages 64-65 mm.
  • eyepiece lenses can additionally be adjusted to control the eye's actual distance from the eyepiece lenses.
  • a built-in autorefractor of a type known in the art can be used to adjust the diopter value of the eye piece lenses to assure the sharpest image on the eye's retina.
  • FIG 15 illustrates an arrangement of elements that may be used in connection with the embodiment of Figure 14 or other embodiments discussed above.
  • Emitter 501 projects light that may be in the infrared range. The light from a single emitter 501 reflects off of beam splitter 502 and passes through lens 503 toward an eye 504.
  • Emitter 501 may be positioned to be inside the focal point of lens 503 to provide a sufficiently wide beam.
  • Lens 503 is positioned with respect to sensor 531 so that eye 504 is properly focused on the sensor. The same applies to lenses 410 and 411 and sensors 420 and 421 of Figure 14.
  • the wide beam of illumination of eye 504 is indicated by outer boundaries of such light 510 and 511.
  • the bright pupil reflection from eye 504 appears as 521, which passes back through lens 503, through beam splitter 502, and impinges upon a sensor 531. Because of the optics design, the reflected bright pupil, or alternatively the dark pupil, appears as a small spot on sensor 531. The location of such small spot on sensor 531 uniquely identifies a line along which eye 504 must be located.
  • lens center lines correspond to the center lines of the associated sensors.
  • The advantages of the above application are quick determination of the location of the eye along the x-, y-, and z-axes, so that there is no searching for the eye and speedy acquisition of the eye is achieved, that is, alignment of the device's optical axis with the eye's optical axis.
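As a concrete illustration of the triangulation geometry of Figures 7, 8, and 12, the following sketch locates each eye from two cameras and then derives the IPD, the two ½ IPDs, and the vertex distances. It is a minimal reconstruction, not code from the patent: it assumes the cameras lie on a common x axis at known points A and B, that each bright-pupil spot has already been converted into a ray angle measured from line AB, and that all numbers (camera separation, angles, nose-rest position) are hypothetical values chosen for illustration.

```python
import math

def locate_eye(ax, bx, alpha_deg, beta_deg):
    """Locate one eye in the x-z plane from two cameras at known
    x-positions ax and bx on line AB (the plane of AB is z = 0).

    alpha_deg is the ray angle at camera A, measured from AB toward the
    eye; beta_deg is the corresponding angle at camera B. Both angles
    come from the bright-pupil spot locations on the two sensors.
    Returns (x, z); z is the eye's distance from the plane of AB,
    that is, its vertex distance.
    """
    alpha, beta = math.radians(alpha_deg), math.radians(beta_deg)
    gamma = math.pi - alpha - beta              # apex angle at the eye
    # Law of sines: the side from A to the eye is opposite angle beta.
    a_to_eye = (bx - ax) * math.sin(beta) / math.sin(gamma)
    return ax + a_to_eye * math.cos(alpha), a_to_eye * math.sin(alpha)

# Hypothetical setup: cameras 200 mm apart, nose rest midway at x = 100.
left_eye = locate_eye(0.0, 200.0, 49.6, 31.2)     # point C, ~(68, 80)
right_eye = locate_eye(0.0, 200.0, 31.2, 49.6)    # point D, ~(132, 80)

ipd = right_eye[0] - left_eye[0]                  # length CD, ~64 mm
half_ipd_left = 100.0 - left_eye[0]               # length CE
half_ipd_right = right_eye[0] - 100.0             # length DE
vertex_left, vertex_right = left_eye[1], right_eye[1]   # ~80 mm each
```

Because the angles of a triangle sum to 180°, the apex angle at the eye is fixed once α and β are measured, and the law of sines then fixes the distance from either camera to the eye.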

Abstract

In a method and apparatus for locating the position of an eye in three dimensions, an emitter projects light so that it illuminates the eye whose location is to be determined. A bright pupil reflection from the eye travels back and is focused by a lens so that it impinges upon a sensor. Using a beam splitter or other arrangement, the light is cast in such a way that the field of view of the lens is coincident with the light projected by the emitter. In light of a known relationship between the lens and sensor, a unique line is established along which the reflecting eye must lie. If two distinct lines are established, by the same or another camera, a unique location in three dimensions for the eye can be defined. Alternatively, a single line can be combined with another known feature of the position of the eye.

Description

Methods and Apparatus for Locating the Eye in Three Dimensions BACKGROUND OF THE INVENTION Field of the Invention
The present invention is in the field of devices used to locate the position of a subject's eye for purposes of examination or for subject's use of an optical instrument.
Description of the Related Art
In examinations of the eye it is essential first to locate the eye and align examination instruments with the eye. With proper training and practical experience this is relatively simple using manual instruments such as an ophthalmoscope. When the instrumentation is automatic, locating and alignment of optical instrumentation with the eye becomes more problematic. That is, automatic instrumentation such as autorefractors and fundus cameras must be accurately located in X, Y and Z axes to function properly.
For present purposes, the X axis is considered to be a horizontal axis passing through or in front of both eyes of a patient. The Y axis is a vertical axis passing between the eyes. The Z axis is a horizontal axis passing between the eyes. For the purpose of these definitions, the terms "vertical" and "horizontal" are used with the assumption that the patient is sitting or standing upright, facing forward. This position of the patient is for establishing reference axes only, and does not represent a requirement of operation of the device.
One example of such instrumentation is an autorefractor. This is a device that requires accurate alignment along not only the X and Y axes, but the Z axis as well. The distance from an eye to a known point along the Z axis will be referred to as the vertex distance.
One known method for establishing vertex distance is to project points of light onto the cornea and use reflections of such light to adjust vertex distance to a given value indicated by the returned reflections. Another method is to fix the patient in a chin- forehead rest so as to approximately locate the vertex distance at some desired value. Still another method is to use a focusing method to put the features of the eye into sharp focus so as to obtain and adjust the vertex distance. Still other well-known methods may be used to find and adjust the vertex distance. Many of these methods are not easily amenable to automation, however, because of complexity and consequent difficulties in implementation.
In addition to the vertex distance, eye location on the X and Y axes must also be determined. The distance between the patient's pupils along the X axis is known as the interpupillary distance, or IPD. For any given patient, the IPD can be decomposed into two ½ IPD measurements, each representing the distance between a respective one of the two pupils and the bridge of the patient's nose. In a case of perfect left-right symmetry, the two ½ IPDs would be equal, but this is often not the case.
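A minimal numeric illustration of this decomposition, using hypothetical pupil coordinates rather than measured data:

```python
# Pupil x-coordinates in millimeters, with the bridge of the nose at x = 0.
left_pupil_x, right_pupil_x = -31.0, 33.0   # deliberately asymmetric

ipd = right_pupil_x - left_pupil_x          # 64.0 mm
half_ipd_left = abs(left_pupil_x)           # 31.0 mm: left ½ IPD
half_ipd_right = abs(right_pupil_x)         # 33.0 mm: right ½ IPD
assert ipd == half_ipd_left + half_ipd_right
```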
Known methods include manual as well as automatically calculated location, utilizing a closed circuit video monitor to obtain an image of the eyes and use this image to adjust the X-Y location of instrumentation to obtain the desired alignment with the eye. More automated approaches include determining X-Y eye location by establishing corneal or pupil reflections that impinge upon a quad photodetector. The data generated by the photodetector is used to control stepper motors or other actuators to move the optics into the indicated eye center.
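The quad-photodetector approach lends itself to a simple feedback loop. The sketch below is one plausible implementation, not taken from the patent or any cited device: the four quadrant intensities are combined into normalized x and y error signals, which are then scaled into stepper-motor step counts; the gain and intensity values are hypothetical.

```python
def centering_error(q_tl, q_tr, q_bl, q_br):
    """Normalized x/y offset of a reflection spot on a quad photodetector.

    Arguments are the four quadrant intensities (top-left, top-right,
    bottom-left, bottom-right). Both errors are zero when the spot is
    centered on the detector.
    """
    total = q_tl + q_tr + q_bl + q_br
    err_x = ((q_tr + q_br) - (q_tl + q_bl)) / total   # right minus left
    err_y = ((q_tl + q_tr) - (q_bl + q_br)) / total   # top minus bottom
    return err_x, err_y

# Proportional drive toward center; gain is in motor steps per unit error.
GAIN_STEPS = 200
err_x, err_y = centering_error(0.20, 0.35, 0.15, 0.30)
steps_x, steps_y = round(GAIN_STEPS * err_x), round(GAIN_STEPS * err_y)
```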
One example of a known device is illustrated in Figure 1, a standard corneal reflection device for obtaining interpupillary distance. A fixation target 101 is positioned at the focal point of lens 102 and illuminated by light source 103 producing visible light. This produces plane waves 104 seen by the subject's right eye 105 and left eye 106 as an illuminated target at optical infinity. Adjustable wire occluders 107 are movable so that they can be selectively positioned to block the corneal reflection of each of the eyes 105 and 106. Reflections from the corneas of eyes 105 and 106, together with the positions of the occluder wires, are used to measure IPD.
In this embodiment both eyes see a target at optical infinity. Accordingly, the lines of sight 105a and 106a of the eyes 105 and 106, respectively, are parallel. Parallel lines of sight inhibit the phenomenon of accommodation. Accommodation usually is not desired when examining the eyes, especially when examining the eyes for refractive errors, in which case the eye must not be accommodated but must be looking at a distant object or at optical infinity.
This and other embodiments known in the art have had a number of drawbacks. These include the poor signal-to-noise ratio of corneal reflections, which makes practical electronic implementation unreliable. Using the bright pupil method increases the signal-to-noise ratio, but finding the peak intensity of the bright pupil, which is not a sharp point, does not provide a precisely defined location. Using quad detectors is excellent for finding alignment, but does not provide other measurements such as vertex distance. Moreover, these prior art methods in many cases rely on moving parts to find alignment points, so that finding the location of the eye is neither speedy nor simple.
SUMMARY OF THE INVENTION
The object of the present invention is to find the location of the eye or eyes of a subject in three dimensions and thus to enable precise optical alignment of automatic instrumentation with the optics of the eyes, either monocularly or binocularly, so as to facilitate accurate examination and measurement of the eyes.
Therefore, an object of this invention is to obtain the x, y, and z coordinates of each eye as well as IPD and/or ½ IPD measurements together with vertex distance. This locates the eye in three dimensions, which in turn allows for proper location of optical instruments with respect to the eye. The examples will refer to refractor units as that instrument, but the present invention is not limited to such application. Indeed, also described is an application for binoculars, such application providing the benefits of automatic centering to prevent vignetting and increase viewing comfort.
To achieve these and other advantages, I describe an apparatus for determining a location of each of a left and a right eye in three dimensions. The apparatus may include, among other elements, first and second left-eye light sources, preferentially IR light, each arranged to project light onto a left-eye area of an eye target area to produce first and second left-eye reflected light (from the eye's pupil, the "bright pupil"), respectively. Similarly, first and second right-eye light sources are arranged to project light onto a right-eye area of an eye target area to produce first and second right-eye reflected light, respectively.
A first lens is arranged with respect to the first left-eye light source, the first right-eye light source, the left-eye area, and the right-eye area so that the first left-eye reflected light and the first right-eye reflected light pass through the first lens to produce first left- eye focused light and first right-eye focused light. The first left-eye light source and the first right-eye light source are positioned so as to be coaxial with the first lens.
Correspondingly, a second lens is arranged with respect to the second left-eye light source, the second right-eye light source, the left-eye area, and the right-eye area so that the second left-eye reflected light and the second right-eye reflected light pass through the second lens to produce second left-eye focused light and second right-eye focused light, respectively. The second left-eye light source and the second right-eye light source are positioned so as to be coaxial with the second lens.
A first electronic imaging device (sensor) is arranged with respect to the first lens so that the first left-eye and first right-eye focused lights impinge upon the first electronic imaging device. The first electronic imaging device produces as output first left-eye and first right-eye position data based on locations on the first electronic imaging device where the first left-eye and first right-eye focused lights impinge. Similarly, a second electronic imaging device is arranged with respect to the second lens so that the second left-eye and second right-eye focused lights impinge upon the second electronic imaging device, the second electronic imaging device producing as output second left-eye and second right-eye position data based on locations on the second electronic imaging device where the second left-eye and second right-eye focused lights impinge.
In the description, "camera" means a light sensor device with a lens. "Electronic imaging device" refers to the light sensor device. These cameras may have an IR light source for illuminating the eye, and this light source may be integrated into the camera. Alternatively, the IR light source may be external to the camera. These cameras, elements of cameras, and IR light sources for illumination are illustrated in the accompanying drawings. In some drawings, for simplicity, illumination of the eye is not shown. Because of the short focal lengths of the optics of the cameras, that is, of the lens/electronic imaging device, the field of vision of the camera can be made wide angle, and the reflected light from the eye (the "bright pupil") appears as a small spot of light on the sensor. Accordingly, the small spot of light impinges on just a few pixels of the sensor, so that pixel locations and the position angle (of the eye being imaged) can be readily determined.
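By way of illustration only, the pixel-to-angle conversion just described can be sketched as follows, assuming a simple pinhole-camera model. All names and values here (`spot_to_angles`, `pixel_pitch_mm`, `focal_length_mm`) are assumptions of the sketch, not limitations of the invention:

```python
import math

def spot_to_angles(px, py, cx, cy, pixel_pitch_mm, focal_length_mm):
    """Convert a bright-pupil spot's pixel coordinates to position angles.

    Assumes a pinhole model: the spot at (px, py), offset from the sensor's
    optical center (cx, cy), subtends an angle set by the lens focal length.
    """
    dx_mm = (px - cx) * pixel_pitch_mm            # horizontal offset on the sensor
    dy_mm = (py - cy) * pixel_pitch_mm            # vertical offset on the sensor
    theta_x = math.atan2(dx_mm, focal_length_mm)  # horizontal position angle
    theta_y = math.atan2(dy_mm, focal_length_mm)  # vertical position angle
    return theta_x, theta_y

# Example: a spot 40 pixels right of center, 6 um pixels, 8 mm focal length.
print(spot_to_angles(px=360, py=240, cx=320, cy=240,
                     pixel_pitch_mm=0.006, focal_length_mm=8.0))
```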
Note that a variation of the method using IR light for illumination instead employs illumination, visible or IR, to image a "dark pupil" onto a sensor. This is similar to using a "bright pupil," except that the eye, including the iris, is illuminated. Then the eye, that is, the iris with a "dark pupil," is imaged onto the sensor with the "dark pupil" located on a few pixels of the sensor.
Finally, a data processor receives as inputs the first and second left-eye position data, that is, the position of "bright pupil" that has been imaged on the sensors, and the first and second right-eye position data, and based on known relative positions of the first and second electronic imaging devices, the data processor calculates three-dimensional locations of the left and right eyes.
The embodiment described above utilizes two cameras. As another object of the present invention, an embodiment utilizing one camera, that is, one lens and one sensor, and two mirrors is also described. Such embodiment includes at least one movable mirror, together with a position controller for the movable mirror. The position controller is designed so as to generate x-axis mirror position data.
At least one light source is arranged to project light to reflect off the movable mirror onto an eye target area. The eye target area includes a left-eye area and a right-eye area. This produces left-eye reflected light and right-eye reflected light.
A lens is arranged with respect to the light source, the left-eye area, and the right-eye area so that the left-eye and right-eye reflected light pass through the lens to produce left-eye focused light and right-eye focused light, respectively. The light source is arranged to be coaxial with the lens, that is, the field of view of lens is coincident with the illumination beam projected by the light source.
An electronic imaging device is arranged with respect to the lens so that the left-eye and right-eye focused light impinges upon the electronic imaging device. The electronic imaging device generates left-eye position data and right-eye position data based on locations on the electronic imaging device where the left-eye and right-eye focused lights impinge, respectively.
Finally, a data processor receives the left-eye position data, the right-eye position data, and the mirror position (x-axis) data, and uses this information to calculate three-dimensional locations of the left and right eyes. Note that vertex distance, if known, can be substituted for the x-axis position data. In other words, vertex distance, along with angle data from the electronic imaging device, can be used to calculate the three-dimensional locations of the left and right eyes. In another embodiment, two cameras are used to determine the three-dimensional location of a single eye. Such a device includes first and second light sources, each arranged to project light onto an eye target area to produce first and second reflected light, respectively.
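To illustrate the substitution just described, the following minimal sketch recovers the x and y coordinates of an eye from a known vertex distance and the position angles derived from a sensor spot. The axis conventions and names are illustrative assumptions, not part of the described apparatus:

```python
import math

def eye_location_from_vertex_distance(theta_x, theta_y, vertex_distance_mm):
    """Given the position angles of a pupil spot and a known vertex distance z,
    recover the eye's x and y coordinates relative to the camera axis.
    Illustrative sketch only; axis conventions are assumed."""
    x = vertex_distance_mm * math.tan(theta_x)
    y = vertex_distance_mm * math.tan(theta_y)
    return x, y, vertex_distance_mm

# Example: a spot at 4 degrees horizontal, 1 degree vertical, 14 mm vertex distance.
print(eye_location_from_vertex_distance(math.radians(4), math.radians(1), 14.0))
```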
First and second lenses are arranged with respect to the first and second light sources and the eye target area so that the first and second reflected light pass through the first and second lenses, respectively. This produces first and second focused light. The first and second light sources are arranged to be coaxial with the first and second lenses, respectively.
First and second electronic imaging devices are arranged with respect to the first and second lenses so that the first and second focused lights impinge upon the first and second electronic imaging devices, respectively. The first and second electronic imaging devices generate first position data and second position data based on locations on the first and second electronic imaging devices where the first and second focused lights impinge, respectively. The known data are the location of each camera with respect to the other and with respect to the center point of the nose pads.
Finally, a data processor receives the first position data and the second position data. Based on these data, along with the known locations of each camera, the processor calculates a three-dimensional location of the eye with respect to a known point, such as the center point of the nose pads or the known location of the cameras.
Yet another embodiment of the present invention utilizes four cameras (again, "camera" being used to mean a light sensor device with a lens) and IR illumination to determine the three-dimensional location of each of two eyes. Such an apparatus includes first and second right-eye light sources, each of which is arranged to project light onto a right-eye pupil to produce first and second right-eye reflected light, respectively. Corresponding first and second left-eye light sources are arranged to produce first and second left-eye reflected light, respectively.
First and second right-eye lenses are arranged with respect to the first and second right-eye light sources and the right-eye eye target area so that the first and second right-eye reflected light pass through the first and second right-eye lenses, respectively. This produces first and second right-eye focused light. The first and second right-eye light sources that illuminate the eye are arranged to be coaxial with the first and second right-eye lenses. Corresponding first and second left-eye lenses are arranged to produce first and second left-eye focused light. Again, the first and second left-eye light sources are coaxial with the first and second left-eye lenses.
First and second right-eye electronic imaging devices are arranged with respect to the first and second right-eye lenses so that the first and second focused right-eye light impinges upon the first and second right-eye electronic imaging devices, respectively. The first and second right-eye electronic imaging devices generate first and second right-eye position data based on locations on the first and second right-eye electronic imaging devices where the first and second right-eye focused lights impinge, respectively. Corresponding first and second left-eye electronic imaging devices are arranged with respect to the first and second left-eye lenses. The first and second left-eye electronic imaging devices generate first and second left-eye position data based on where the first and second left-eye focused lights impinge, respectively.
Finally, a data processor receives the first and second left-eye position data and the first and second right-eye position data. The data processor then uses these data to calculate the three-dimensional locations of the left and right eyes with respect to the known point.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be described in connection with the attached drawings, in which:
Figure 1 is an illustration of a top view of a prior art device;
Figures 2A and 2B are side and perspective views of an embodiment of the present invention;
Figure 3 is a top view of an embodiment of the invention utilizing two cameras to locate two eyes;
Figure 4 is a perspective view of the embodiment of Figure 3;

Figure 5 is a perspective view of the embodiment of Figure 3 utilizing moving cameras;
Figure 6 illustrates one possible configuration of multiple emitters, a beam splitter, a lens, and a sensor;
Figure 7 is a schematic illustration of the embodiment of Figure 3, identifying points used in an explanation of the geometry of the invention;
Figure 8 is a schematic illustration of the embodiment of Figure 3 demonstrating how the geometry of the device facilitates determination of the three-dimensional location of the eyes;
Figure 9A is a schematic view of an embodiment utilizing a single camera to detect the location of two eyes in conjunction with another device to measure the separation of the eyes along the x-axis;
Figure 9B is aligned with Figure 9A to show moving mirrors used to determine the x-axis dimensions of the two eyes;
Figure 10 is a schematic illustration of the geometry underlying the embodiment of Figures 9A and 9B;
Figure 11 illustrates an embodiment utilizing four cameras to determine the three-dimensional location of two eyes;
Figure 12 illustrates the geometry underlying the embodiment of Figure 11;
Figure 13 illustrates a portion of an embodiment in which a single camera is moved between two locations to identify the three-dimensional location of a single eye;
Figure 14 illustrates an embodiment of the present invention implemented in connection with one eyepiece of a pair of binoculars; and
Figure 15 illustrates the interrelationship between the sensor, beam splitter, emitter, lens, and eye of the subject.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
In a first embodiment illustrated in Figure 2A, a number of elements are added to those of the prior art device illustrated in Figure 1. In Figure 1, lens 102 collimates light 104 of target 101 so that eyes 105, 106 see the target at optical infinity and lines of vision 105a, 106a, respectively, are parallel to each other. Occluder 107 selectively occludes either right eye 105 or left eye 106. As opposed to the top view of Figure 1, Figure 2A illustrates a first view from the side. Two emitters 110 and 111 are arranged at a level below the fixation target 101 and light source 103. A lens 101A lies between the fixation target and beam splitter 113. Lens 101A makes the fixation target appear at optical infinity for the subject. The two emitters can lie in a common horizontal plane, and therefore only one is visible in the side view of Figure 2A. The emitters 110 and 111 emit light that reflects off of beam splitters 112 and 113. The emitters 110, 111 and the beam splitters 112 and 113 are positioned so that light from emitter 110 impinges upon the right eye 105, whereas light from emitter 111 impinges upon left eye 106. Below beam splitter 112 lie lens 114 and a sensor or imaging device 115. The sensor can be a charge-coupled device (CCD) or other technology. It is, in any event, a device that converts the intensity of light received upon various portions of an area into a signal that can be interpreted to determine where light is impinging upon such area. These signals can be interpreted by a microprocessor 116.
For purposes of description, the words "ray," "line," and "beam" may be used interchangeably in describing embodiments and geometric figures.
The emitters 110 and 111 are positioned with respect to the first beam splitter 112 and lens 114 so that the light emitted by each emitter is coaxial with lens 114. This arrangement is further illustrated in connection with the perspective view of Figure 2B. Such a coaxial arrangement means that corneal reflections or bright pupils will be reflected back through the lens 114 and onto the sensor 115.
As one alternative to the implementation illustrated in Figures 2A and 2B, a single emitter can be used if such emitter is of sufficiently wide angle so as to illuminate both eyes. By "coaxial" it is meant that the illumination beam of the emitter is coincident with the sensor's field of view. Stated another way, the illumination is coaxial with an imaginary line running between the sensor and the pupil of the eye being observed.
The fixation target 101 can be fogged or otherwise adjusted so as to inhibit accommodation. Lens 101A is employed so that the fixation target appears at optical infinity, or at any other distance. Two fixation targets could also be used instead of the single fixation target 101, with one fixation target for each eye. With such an arrangement, the two targets must be properly collimated for each eye.
With the device of Figures 2A and 2B, light from emitters 110 and 111 reflects sequentially off of beam splitters 112 and 113 to impinge upon right eye 105 and left eye 106, respectively. The light then reflects off of eyes 105 and 106 and returns to reflect off of beam splitter 113. Having bounced off of beam splitter 113, the reflected light from eyes 105 and 106 then passes through beam splitter 112. The reflected light then passes through lens 114 and is focused on a surface of sensor 115.
Emitters 110 and 111, beam splitters 112 and 113, lens 114, and sensor 115 are positioned in a fixed relationship with one another. Accordingly, the only variation as to the points on the surface of the sensor 115 upon which the reflected light originating from emitters 110 and 111 impinges is due to a variation in the position of the eyes 105 and 106. By using microprocessor 116 to interpret the locations of such points of reflected light from eyes 105 and 106 that impinge on sensor 115, two lines along which right and left eyes 105 and 106 are located, and the angles of such lines, can be found.
While the embodiment illustrated in Figures 2A and 2B is shown with the line between the fixation target and the eyes being direct, and the path between the emitters, eyes, and sensor being "folded" by way of reflection, this is not necessarily the case. The path between the eyes and sensor can be direct, with the path between the eyes and fixation target being folded.
Illustrated in Figure 3, another embodiment utilizes a combination of a sensor and two emitters for each eye. This embodiment allows for the fast and accurate location of two eyes in three-dimensional space, with the further ability to provide IPD and ½ IPD dimensions.
In this embodiment two separate sensors or electronic imaging devices 128 and 129 are provided. Associated with sensors 128 and 129 are the schematically represented light source and optics 150 and 151 and eyes 107, 106, respectively. Each of sensors 128 and 129 is located at an appropriate focal length from the optics in associated elements 150 and 151, respectively. Light source and optics 150 generates light that impinges upon the general areas where right eye 107 and left eye 106 are expected to be located during three-dimensional acquisition of the eyes. These areas where the eyes are expected to be located are the left-eye and right-eye target areas.
One option is for there to be a continuous beam that encompasses both of eyes 106 and 107. Another option is to provide two separate beams, one projected in the direction of each of eyes 106 and 107. In the illustration of Figure 3, two separate light beams are provided. The lateral limits 160 of the beam projected on right eye 107 and the lateral limits 170 of the beam projected on left eye 106 appear in Figure 3.
The light projected onto eyes 106 and 107 will be reflected by the eyes, and some of it will be reflected back to light source and optics 150. Projected light 170 and 160 enters the pupils of each of eyes 106 and 107, so that eyes 106, 107 reflect the projected light 170, 160 directly back toward light source and optics 150. This reflected light appears as light beam 161 from right eye 107 and light beam 171 from left eye 106. Given that sensor 128 is positioned at an appropriate focal length of the optics in light source and optics 150, the reflected light beams 161 and 171 will impinge on sensor 128 as light beams 162 and 172, respectively. Accordingly, there will be two bright spots of light on the two-dimensional surface of sensor 128. The two-dimensional location of each of such bright spots is represented in the signals generated by sensor 128 and received by processor 116.
Light source and optics 151 is constructed and operates in relation to sensor 129 in the same way that light source and optics 150 interacts with sensor 128. The light projected from light source and optics 151 onto left eye 106 and right eye 107 is not illustrated, nor is the light reflected back by the eyes so as not to clutter the illustration. The light reflected back by right eye 107 is focused on sensor 129 as light beam 192. The light reflected back by left eye 106 is focused on sensor 129 as light beam 182. In this arrangement, the position of each of the sensors 128, 129 and light source and optics 150 and 151 is fixed. Therefore the only variability as to the location of the points of light on the sensors 128, 129 is the positions of the eyes 106 and 107.
Considering sensor 128, two points of light will be present on the two-dimensional surface of the sensor, one caused by light beam 162 after reflecting from right eye 107 and the other caused by light beam 172 after reflecting from left eye 106. Each point uniquely identifies, in the form of a unique pixel location and the consequent angle of the impinging ray, a line along which the eye that produced such point by "bright pupil" reflection must be located. Each of the two points of light on sensor 129 created by light beams 182 and 192 similarly defines a line along which the eye that produces such point by reflection must be located. Therefore, for each possible location of right eye 107, there exists a unique combination of points, one on the two-dimensional surface of sensor 128 created by light beam 162 and the other on the two-dimensional surface of sensor 129 created by light beam 192. The same applies to left eye 106, with the reflected light beam 172 impinging on sensor 128 and light beam 182 impinging on sensor 129.
The processor 116 is connected to each of the sensors 128 and 129 to receive the data generated thereby, representative of the location of the two points of light on each of sensors 128 and 129. Processor 116 therefore performs the analysis to define the three-dimensional location of each of the eyes 106 and 107.
Processor 116, having calculated the position of each eye in three-dimensional space, can then calculate the IPD. If the device is further provided with a nose rest, a reference line
130 can be established, aligned with the patient's nose. The reference line 130 has a fixed and known position with respect to light sources and optics 150 and 151 as well as sensors 128 and 129. With this additional data, the individual ½ IPDs associated with the left and right eyes can also be calculated.
Modifications of the arrangement of the elements are possible and within the knowledge of one of skill in the art. The principle of this embodiment is illustrated in Figure 4, which is a schematic representation of the principle underlying the present embodiment. In addition to the various light paths illustrated in Figure 3 are reflected light beam 191 from right eye 107 to light source and optics 151 as well as reflected light beam 181 from left eye 106 to light source and optics 151.
The reflected and focused light beam 192 from right eye 107 impinges upon sensor 129 at point 130 and the reflected and focused light beam 162 impinges upon sensor 128 at point 132. The reflected and focused light beam 182 from left eye 106 impinges upon sensor 129 at point 131 and the reflected and focused light beam 172 impinges upon sensor 128 at point 133. Along with the known distance between light source and optics 150 and 151 (later described in detail), and in light of the fixed and known relationship between light source and optics 150 and 151 and sensors 128 and 129, the location of points 130 and 132 serves to define the three-dimensional location of right eye 107. Similarly, the location of points
131 and 133 serves to define the three-dimensional location of left eye 106. It is understood that various combinations of lines, sides of various triangles, subtended angles, and supplementary angles can be used to calculate the three-dimensional location of each eye.
In the embodiment described above, each of the elements of the device itself lies in a fixed relationship with each other element. In an alternative embodiment illustrated in Figure 5, each of the sensors 128 and 129 with associated optics and illumination is movable. This movement can be achieved by use of stepper motors or other known mechanism for mechanical control. The stepper motors can be under the control of processor 116.
In this embodiment, each of sensors 128 and 129 is moveable to be coincident with the optical axis of eyes 107 and 106, respectively. As a result, points 131 and 132 can be located in the two-dimensional center of sensors 129 and 128, respectively. The elements can be arranged so that light beams that impinge on the sensors at points 131 and 132 follow a path that is normal to the surface of sensors 129 and 128, respectively. This facilitates various measurements such as imaging structures of the eye, measuring corneal curvature, and imaging the eyes "straight-on" for undistorted display of the external eye. While each of the sensors is illustrated as being movable along both lateral and vertical axes, it is also possible that the sensors would be movable in only one of such axes or in a third axis orthogonal to the first two axes.
Note that in all embodiments, instead of using "bright pupil" illumination methods whereby the illumination is coincident with a camera's field of view, "dark pupil" could be used with illumination for the eye so that the eye's iris/pupil is imaged onto a sensor. The "dark pupil" is an equivalent method to the methods presently being described, except the sensor's data processor requires software to recognize the "dark pupil" pixel location.
One possible implementation of light source and optics 150 and 151, shown schematically in Figures 3, 4, and 5, is illustrated in Figure 6. Light sources or emitters 201 and 202 are arranged with respect to a beam splitter 210 to project light onto beam splitter 210. Each of emitters 201 and 202 can be of a variety that generates infrared light, but is not required to be. The light that reflects off of beam splitter 210 passes through lens 211. Single lens 211 could be replaced by a lens group. Emitters 201, 202, beam splitter 210, and lens 211 are arranged so that the light originating from emitter 201 impinges upon right eye 107, and the light originating from emitter 202 impinges upon left eye 106. The reflected beams 161 from right eye 107 and 171 from left eye 106 are not illustrated in Figure 6, but each would pass through lens 211 and beam splitter 210 to produce beams 162 and 172, respectively.
Sensor 128 is positioned at an appropriate focal length of lens 211 in order to properly image the eye. Additionally, each of emitters 201 and 202 is positioned so as to be coaxial with sensor 128 and lens 211. Modification of the basic structure is possible, and such variations are known to one of skill in the art. Among other variations, the lens can be placed between the beam splitter and the sensor. The two emitters can be replaced with a single emitter if such single emitter produces a beam that is sufficiently broad so as to impinge upon both eyes. The emitters can be of a variety that generates infrared light of approximately 920 nm, but need not be so. The light source and optics, as a unit, must project light upon an area where one or both eyes of the patient are expected to be found, and must be arranged with respect to a sensor so that the reflected light is focused to a spot on a two-dimensional surface of the sensor in such a way that the location of the spot defines a unique line along which the eye producing the reflection must lie, the angle of which line is uniquely defined, as will shortly be illustrated. When two such lines are defined because of two different light sources, the interrelationship between the two lines defines a unique three-dimensional location for the eye.
Figures 7 and 8 illustrate the geometry that underlies the translation of data received from sensors 128 and 129 into the three-dimensional locations of eyes 106 and 107. Points A and B are the known points at which the cameras are positioned, and the known length AB is used in the calculations. The reference line 130 exists in a known relationship with lenses 224 and 225, emitters 220 and 221, beam splitters 222 and 223, and sensors 128 and 129. Each set of emitter, beam splitter, and lens in Figure 7 is arranged so that the light produced by the emitter is coaxial with the associated lens. A right eye line of vision 230 extends directly forward from right eye 107 because the fixation target viewed by the subject is located at optical infinity. (The fixation target could be set at a nearby point so that the eyes' lines of vision converge, but for simplicity it is assumed that the fixation target is at optical infinity and the eyes' lines of vision are parallel to each other.) A line of vision 231 extends from left eye 106 in parallel with line of vision 230.
Point 133 on sensor 128 uniquely defines light path 171 in three-dimensional space given the known relationship between sensor 128 and lens 224. Similarly, spot 131 uniquely identifies light path 181. Taken together with the unique angles of each of lines 171 and 181 and the known length AB, light paths 171 and 181 uniquely identify point C in three-dimensional space. In a corresponding manner, points 132 and 130 on sensors 128 and 129 define lines 161 and 191, respectively. Lines 161 and 191 then uniquely define point D in three-dimensional space. Points A and B are defined by the known positions of lenses 224 and 225, respectively. Reference line 130 is defined by the known position of the nose rest. Given this information, the three-dimensional locations of points A and B are known, as is the plane defined by reference line 130, and points C and D can be calculated.
Figure 8 illustrates the geometric interrelationship of these known elements. Angles α and β are defined by points 133 and 131, respectively. In light of the known parallel relationship between the surfaces of sensors 128 and 129 and the reference line interconnecting points A and B, α = α′ and β = β′. Length AB is known, as both points are defined by the positions of lenses 224 and 225, respectively. With length AB and angles α′ and β′ known, the three-dimensional location of point C can be derived. In similar fashion, the three-dimensional location of point D can be derived from known AB and points 130 and 132.
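By way of illustration, the derivation of point C from the baseline AB and the base angles α′ and β′ is classic two-ray triangulation by the law of sines. The following sketch works in the plane containing A, B, and C; the function name and coordinate conventions are assumptions of the sketch, not part of the described apparatus:

```python
import math

def triangulate(A, B, alpha, beta):
    """Locate point C from baseline endpoints A, B (as (x, z) pairs) and the
    base angles alpha (at A) and beta (at B), each measured between the
    baseline and the ray toward C. A 2D sketch of the geometry of Figure 8."""
    ab = math.dist(A, B)                          # known baseline length AB
    gamma = math.pi - alpha - beta                # apex angle at C
    ac = ab * math.sin(beta) / math.sin(gamma)    # law of sines: side AC
    ux = (B[0] - A[0]) / ab                       # unit vector along AB
    uz = (B[1] - A[1]) / ab
    # Walk distance AC from A along a ray rotated by alpha from the baseline.
    cx = A[0] + ac * (ux * math.cos(alpha) - uz * math.sin(alpha))
    cz = A[1] + ac * (ux * math.sin(alpha) + uz * math.cos(alpha))
    return (cx, cz)

# Example: cameras 60 mm apart; rays at 70 and 80 degrees to the baseline.
A, B = (0.0, 0.0), (60.0, 0.0)
print(triangulate(A, B, math.radians(70), math.radians(80)))  # (x, z) of the eye
```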
CD defines the IPD for the patient. DE defines the ½ IPD for right eye 107, and CE defines the ½ IPD for left eye 106. Knowing AB and C, it is also possible to derive G. Correspondingly, knowing AB and D, one can derive F. CG therefore defines the distance of left eye 106 from the plane of AB, representing the z-dimensional position of left eye 106, or its vertex distance. Similarly, the z-dimension or vertex distance of right eye 107 can also be derived.
Accordingly, using the known positions of lenses 224 and 225, sensors 128 and 129, and spots 130-133, it is possible to determine the IPD, the three-dimensional location of the left eye, the three-dimensional location of the right eye, the vertex distance of the left eye, and the vertex distance of the right eye. With the additional data of reference line 130 from having the equipment include a nose rest, it is further possible to determine the right-eye ½ IPD and left-eye ½ IPD. All of this can be determined in a very short time and with high precision and accuracy.
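Once C and D have been triangulated, these remaining quantities follow by simple arithmetic. A minimal sketch, continuing the conventions of the previous example; the `nose_x` coordinate for reference line 130 is a hypothetical name introduced here for illustration:

```python
def eye_metrics(C, D, nose_x):
    """Given triangulated eye positions C (left) and D (right) as (x, z)
    pairs in the plane of baseline AB (z = 0), derive IPD, the two 1/2 IPDs,
    and the vertex distances. Conventions and names are assumed."""
    ipd = abs(D[0] - C[0])                 # CD: full interpupillary distance
    half_ipd_left = abs(C[0] - nose_x)     # CE: left-eye 1/2 IPD
    half_ipd_right = abs(D[0] - nose_x)    # DE: right-eye 1/2 IPD
    vertex_left = C[1]                     # CG: distance from the plane of AB
    vertex_right = D[1]                    # DF: distance from the plane of AB
    return ipd, half_ipd_left, half_ipd_right, vertex_left, vertex_right

# Example: left eye at (-2, 111), right eye at (62, 109), nose line at x = 30.
print(eye_metrics((-2.0, 111.0), (62.0, 109.0), nose_x=30.0))
```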
Figures 9A, 9B, and 10 show another embodiment based on principles similar to those outlined in connection with the previous embodiments. In this embodiment, reflected light, optics, and a sensor are used in conjunction with another device used to determine IPD.
In Figure 9A a moving mirror device is first used to establish an x-dimension distance between the pupils, or IPD. An emitter 240 generates a beam of light that reflects off of a beam splitter 241 toward a moving mirror 242. Moving mirror 242 is constructed to be controllably movable along an axis that will cause it to pass in front of the expected positions of the right and left eyes 107 and 106 of the patient. It is not critical that the path of moving mirror 242 be precisely aligned with eyes 106 and 107 on the y-axis, so long as some of the light reflected onto the eyes by mirror 242 is reflected back to mirror 242. Sensor 243 is aligned with the path of mirror 242.
Mirror 242 begins at its rest position, where it is shown in a solid line in Figure 9A. In use, mirror 242 moves along the x-axis while light from emitter 240 reflects off of beam splitter 241 and onto moving mirror 242. Any light reflected back from moving mirror 242 will pass through beam splitter 241 and impinge on sensor 243. To measure IPD, moving mirror 242 moves from its rest position along a path parallel to the x-axis in front of right eye 107 and left eye 106. As moving mirror 242 passes near either of the eyes, some of the light originating from emitter 240 and reflected by moving mirror 242 onto the eye will be reflected back. This light reflected back by either of the eyes will ultimately impinge on and be detected by sensor 243.
For any given transit in front of either eye, the point at which moving mirror 242 passes nearest the eye is also the point where a maximum amount of light impinging on the eye will be reflected back by the eye. Accordingly, as moving mirror 242 approaches right eye 107 or left eye 106, the intensity of light detected by sensor 243 will increase to a maximum representative of the point where moving mirror 242 passes closest to optical alignment with such eye. The intensity will then fall off again as the moving mirror passes beyond this point. This principle is illustrated in Figure 9B, which is aligned with Figure 9A. The abscissa of the plot of Figure 9B is the displacement of moving mirror 242 along its path, and is aligned with the illustration of such path in Figure 9A. The ordinate of the plot of Figure 9B is the intensity of light impinging on sensor 243. As moving mirror 242 travels from its original, rest position, the intensity is at a baseline value. As moving mirror 242 approaches right eye 107, the intensity of light impinging on sensor 243 increases as some of the light originating from emitter 240 and reflecting off moving mirror 242 is reflected back by right eye 107. As moving mirror 242 passes closest to optical alignment with right eye 107, the intensity of the reflected light reaches a maximum value, beyond which the intensity returns to the baseline value. The process repeats as moving mirror 242 approaches and passes by left eye 106. As indicated by the correspondence between the plot of Figure 9B and the transit path of moving mirror 242, the locations of, and hence the distance between, the right eye 107 and left eye 106 are discernible from the two peaks of light intensity as measured by sensor 243.
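In software, extracting the two eye positions from such a scan amounts to simple peak detection over the recorded intensity trace. A minimal sketch, assuming a clean two-peak trace sampled against mirror displacement; real data would call for smoothing and thresholding, and all names here are illustrative:

```python
def pupil_positions_from_scan(positions_mm, intensities):
    """Find the two mirror displacements where reflected intensity peaks,
    one per eye; their separation is the IPD."""
    # Local maxima of the intensity trace (interior samples only).
    peak_idx = [i for i in range(1, len(intensities) - 1)
                if intensities[i - 1] < intensities[i] >= intensities[i + 1]]
    # Keep the two strongest peaks; each marks optical alignment with one eye.
    peak_idx.sort(key=lambda i: intensities[i], reverse=True)
    xs = sorted(positions_mm[i] for i in peak_idx[:2])
    return xs[0], xs[1], xs[1] - xs[0]   # right-eye x, left-eye x, IPD

# Example: synthetic scan with intensity peaks near x = 30 mm and x = 95 mm.
xs = list(range(0, 121, 5))
trace = [1 + 9 * (abs(x - 30) < 5) + 9 * (abs(x - 95) < 5) for x in xs]
print(pupil_positions_from_scan(xs, trace))   # (30, 95, 65)
```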
In this way, the first data regarding the location of the eyes in three dimensions are established, namely the distance between the eyes along the x-axis, or the IPD. In the illustration of Figure 9A, this corresponds to the distance between points B and C.
Next, and in a manner similar to the embodiments already discussed in detail, light is projected onto right eye 107 and left eye 106. The light is projected by way of an emitter and beam splitter located in a known relationship with the lens 250, such that the light produced is coaxial with lens 250. The light that reflects off of eyes 106 and 107 is then focused at two distinct points D and E, respectively, on a surface of sensor 260. Sensor 260 is located at an appropriate focal length of lens 250 for imaging the eyes.
In light of the known relationship between lens 250 and sensor 260, the location of spot D on the two-dimensional surface of sensor 260 defines a unique line in three dimensional space along which left eye 106 must be located. Correspondingly, the same known relationship in view of the location of spot E on the two dimensional surface of sensor 260 defines a unique line in three dimensional space along which right eye 107 must be located.
In previous embodiments, two lines representing reflections from two different light sources for each eye were used to determine the location of each eye in three dimensional space. In the present embodiment there exists only one such line associated with each eye. However, the identification of these two unique lines in conjunction with the IPD as determined using the moving mirror arrangement serves to establish the location of each eye in three dimensional space.
The geometry and trigonometry underlying this determination are illustrated in Figure 10. Point A is known in light of the known position of lens 250, which is also on the centerline of the light sensor 260. The locations of points D and E are known by interpreting data output by sensor 260, which interpretation can be performed by a processor such as processor 116 illustrated in connection with previous embodiments. The known relationship between sensor 260 and lens 250 means that each potential location of spots D and E corresponds to specified angles β and γ. Angle α is therefore known (180° − β − γ, as the sum of the angles of a triangle must equal 180°). The known relationship between points A, D, and E serves to define two lines that encompass line segments AC and AB. The locations of points B and C along these lines are not known from the positions of points A, D, and E, however.
From the moving mirror apparatus, length BC is known. This length, and the fact that line segment BC lies in a plane parallel to the plane of the surface of sensor 260, allow for the derivation of the unique positions of points B and C along the previously defined lines by using elementary geometry. With the three-dimensional locations of points A-E established, the locations of points F and G can then be derived. Length CF therefore defines the vertex distance for right eye 107, and length BG defines the vertex distance for left eye 106. If the device is provided with a nose rest having a known position in relation to the other elements of the device, plane 130 is known. The position of plane 130 with respect to points B and C serves to define the two ½ IPD distances.
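The elementary geometry can be made concrete as follows. In a sketch with assumed conventions (signed ray angles measured from the camera's optical axis, with segment BC parallel to the sensor plane, per Figure 10), the known IPD fixes the single depth at which the two rays are laterally separated by exactly that distance:

```python
import math

def depth_from_known_ipd(theta_left, theta_right, ipd_mm):
    """Given the two signed ray angles (left and right eye, measured from the
    optical axis) and the IPD measured by the moving mirror, solve for the
    common depth z at which the eyes' lateral separation equals the IPD.
    Illustrative sketch; assumes BC is parallel to the sensor plane."""
    z = ipd_mm / abs(math.tan(theta_left) - math.tan(theta_right))
    x_left = z * math.tan(theta_left)     # point B along its ray
    x_right = z * math.tan(theta_right)   # point C along its ray
    return (x_left, z), (x_right, z)      # B and C in the camera frame

# Example: rays at -8 and +10 degrees, 64 mm IPD.
print(depth_from_known_ipd(math.radians(-8), math.radians(10), 64.0))
```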
Using this embodiment, only one lens and only one sensor are required. So long as the IPD can be determined by other means, the three-dimensional location of each eye, and hence the ½ IPDs and vertex distances, can be known. While the moving mirror arrangement is described above, any other mechanism for measuring the distance between the eyes can be used in conjunction with the determination of triangle ADE to determine the location of the eyes. Moreover, it is also possible to measure vertex distances BG and CF instead of the IPD or BC. By determining the vertex distance for right eye 107, or the length of CF in Figure 10, it is also possible to determine the location of point C. The same is true of left eye 106 and length BG.
In the embodiment of Figure 11, two lenses and two sensors, that is, two cameras, are provided for each eye, for a total of four lenses and four sensors, or four cameras. Four sensors 311-314 are arranged at the appropriate focal lengths of four lenses 301-304, respectively. Each is positioned so that an optical path exists between an area where one eye of a patient is expected to be found and one of sensors 311-314, passing through lenses 301-304, respectively. While the particular arrangement illustrated in Figure 11 has all of the sensors and lenses inboard of the anticipated positions of the eyes 106 and 107, this is not a requirement. The lens/sensor sets could be positioned on opposite sides of each eye, or above and below the level of the eyes. An advantage of two cameras per eye is reduced illumination requirements as well as greatly reduced space, so that the implementation becomes ultra-compact.
Present but not illustrated are emitters and other optical elements, such as beam splitters, that allow for light to be cast upon each of the eyes along a path between the eye and one of the associated sensors. Lines 321, 322 represent the paths of light from such emitters that are reflected back from right eye 107, which pass through lenses 301 and 302 and impinge upon sensors 311 and 312, respectively. Lines 323, 324 represent the paths of light from such emitters that are reflected back from left eye 106, which pass through lenses 303 and 304 and impinge upon sensors 313 and 314, respectively. Each of these paths of light 321-324 returned from one of the eyes is focused into one of spots 331-334 on one of sensors 311-314, respectively. In light of the known relationship between the positions of lenses 301-304 and sensors 311-314, the location of each of spots 331-334 on the two-dimensional surface of sensors 311-314 uniquely defines a line in three-dimensional space along which the eye from which such light was reflected must lie, as previously described. As there are two such lines associated with each eye, there exists only one location of each eye that lies along both lines associated with that eye. This serves to uniquely identify the location of both eyes in three-dimensional space.
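In a practical implementation, measurement noise means that the two lines recovered for an eye will rarely intersect exactly. A natural estimator, assumed here for illustration rather than prescribed by the foregoing description, is the midpoint of the shortest segment between the two rays:

```python
import numpy as np

def ray_midpoint(p1, d1, p2, d2):
    """Best-estimate eye location as the midpoint of the shortest segment
    between two rays, each given as an origin p and a direction d (the
    direction need not be unit length). Illustrative sketch only."""
    p1, d1, p2, d2 = map(np.asarray, (p1, d1, p2, d2))
    # Solve for t1, t2 minimizing |(p1 + t1 d1) - (p2 + t2 d2)|.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b                 # zero only for parallel rays
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0

# Example: two rays from lenses 30 mm apart converging near one eye.
print(ray_midpoint([0, 0, 0], [0.2, 0.0, 1.0],
                   [30, 0, 0], [-0.1, 0.0, 1.0]))   # approx. [20, 0, 100]
```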
By further providing a nose rest that lies in a known relationship with lenses 301-304 and sensors 311-314, not only the IPD but also each of the ½ IPDs can be derived.
Figure 12 illustrates the geometric interrelationship of these known elements. Angles α and β are defined by points 331 and 332, respectively. In light of the known parallel relationship between the surfaces of sensors 311 and 312 and the reference line interconnecting points A and B, α = α′ and β = β′. Length AB is known, as both points are defined by the positions of lenses 301 and 302, respectively. With length AB and angles α′ and β′ known, the three-dimensional location of point C can be derived. In similar fashion, the three-dimensional location of point F can be derived from known DE and points 333 and 334.
CF defines the IPD for the patient. CG defines the ½ IPD for right eye 107, and FG defines the ½ IPD for left eye 106.
Knowing AB and C, it is also possible to derive H. Correspondingly, knowing DE and F, one can derive I. FI therefore defines the distance of left eye 106 from the plane of AB, representing the z-dimensional position of left eye 106, or its vertex distance. Similarly, the z-dimension or vertex distance of right eye 107 can also be derived.
Accordingly, using the known positions of lenses 301-304, sensors 311-314, and spots 331-334, it is possible to determine the IPD, the three-dimensional location of the left eye, the three-dimensional location of the right eye, the vertex distance of the left eye, and the vertex distance of the right eye. With the additional data of reference line 130 from having the equipment include a nose rest, it is further possible to determine the right-eye ½ IPD and left-eye ½ IPD. All of this can be determined in a very short time and with high precision and accuracy.
As an alternative to the embodiment of Figures 11 and 12, the two sets of sensors, lenses, and associated emitters provided for each eye can be replaced by a single such set. To provide the two lines defined in three dimensional space to uniquely locate the position of the eye, the single set takes a first measurement at a first position, provides the output from the sensor to a processor, and then moves to a second position and takes a second reading, the data associated with which is also provided to the processor. This is illustrated in Figure 13, using elements corresponding to those of Figure 11. Lenses 301 and 302 are replaced by single lens 301. Sensors 311 and 312 are replaced by single sensor 311. In a first capture, the location of spot 331 is determined and, as explained in connection with the previous embodiment, line 321 is defined. The combination of lens 301 and sensor 311 is then moved to the second position indicated, corresponding to the position of lens 302 and sensor 312 of Figure 11. In a second capture, the location of spot 332 is determined and line 322 is defined.
One application of the present invention is automatically centering the optics of binoculars and other optical devices. As the optical axis of the eye of a user moves away from the optical axis of the optics through which such eye is viewing, a portion of the image is removed from view. This is referred to as vignetting. Automatically centering the optics with respect to the user's eyes serves to prevent vignetting and to ensure comfortable viewing. Such centering amounts to adjustment of the optics along the x and y axes with respect to the user's eyes. Considered from the perspective of the user, this means moving the optics left/right and up/down.
Automatic centering of the optics with respect to the eyes can be done in a continuous and automatic manner so that optical axis of each set of optics in the binoculars is automatically and continuously aligned with the optical axis of the associated eyes. This enables the viewer to concentrate on the image instead of constantly correcting the position of the device with respect to the user's eyes. This increases viewing clarity as well as viewing comfort. Particularly in the context of low-illumination viewing, continuous automatic centering offers a considerable benefit.
In Figure 14, lenses 401 and 402 are elements of an adjustable diopter lens. These may be lenses in the eyepiece of an optical device, such as binoculars. Lenses 401 and 402 are movable as a unit, which will be referred to as an eyepiece. This movement may be achieved by a stepper motor under control of a processor. The movement would be possible in the x and y axes, or up/down and left/right from the perspective of a user during use.
Associated with each eyepiece with lenses 401 and 402 are sensors 420 and 421, lenses 410 and 411, and apertures or windows 430 and 431. Sensors 420 and 421 are positioned at the appropriate focal length of lenses 410 and 411, respectively. Also present are emitters 450 and 451 and further optical elements such as beam splitters 460 and 461 that provide for light to be cast upon the user's eye 403. The light is cast in such a way as to be coaxial with lenses 410, 411 and associated sensors 420, 421. The light returns after reflecting from eye 403 along lines 440 and 441.
Each set of sensor, lens, beam splitter, and emitter is considered to be a camera for the purposes of this and other embodiments. The emitter may be an IR emitter. Each camera is tilted at an angle α 420a with respect to a plane parallel to the eyepiece lenses so as to be directed toward the user's eye. This angle is incorporated into the calculation of the position of eye 403. Each combination of an eyepiece and two cameras can be movable as a single unit so as to be properly positioned, centered, and aligned with eye 403. The wide beam of infrared illumination is illustrated by the divergent limits of the IR illumination exiting each of the windows 430 and 431 and directed toward the eye 403. The beams 440 and 441 represent the bright pupils reflected back toward the cameras.
As described in detail in connection with previous embodiments, the point at which reflected light beam 440 impinges upon the two-dimensional surface of sensor 420 uniquely defines line 440 in three-dimensional space along which eye 403 must lie, in light of the known spatial relationship between lens 410 and sensor 420. The same applies for line 441 in connection with lens 411 and sensor 421. Together, the two lines 440 and 441 uniquely define the location of eye 403 in three dimensions. A processor that receives as input the output of sensors 420 and 421 can perform the calculations necessary to make such a determination of the location of eye 403 and control stepper motors or other mechanical devices to maintain the eyepiece with lenses 401 and 402 in alignment with eye 403. In a binocular device, the components illustrated in Figure 14 could be duplicated so that each of two eyepieces could be independently controlled.
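One way to realize such continuous alignment is a simple closed loop: on each cycle, triangulate the eye as above, compare its x and y coordinates with the eyepiece axis, and command a small correction. The following sketch is purely illustrative; the step size, deadband, and stepper interface are assumptions, not part of the described apparatus:

```python
def centering_step(eye_xy, eyepiece_xy, step_mm=0.1, deadband_mm=0.2):
    """One iteration of an auto-centering loop: compare the triangulated eye
    position with the eyepiece axis and return a small x/y correction to send
    to the steppers. All parameters here are illustrative assumptions."""
    moves = []
    for axis in (0, 1):                   # 0 = x (left/right), 1 = y (up/down)
        error = eye_xy[axis] - eyepiece_xy[axis]
        if abs(error) > deadband_mm:      # ignore jitter inside the deadband
            moves.append(step_mm if error > 0 else -step_mm)
        else:
            moves.append(0.0)
    return moves                          # dx, dy in millimeters

# Example: eye sits 0.5 mm right of and 0.1 mm above the eyepiece axis.
print(centering_step(eye_xy=(0.5, 0.1), eyepiece_xy=(0.0, 0.0)))  # [0.1, 0.0]
```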
The maximum distance, usually expressed in millimeters, that an optical device can be held from the eye with the full field of view comfortably observed is known as eye relief. In some devices, the eye relief is so small that insufficient space is provided for the arrangement of elements as illustrated in Figure 14. One solution to this problem is to have sensors 420, 421 and their associated emitters project and receive light through lenses 401 and 402 of the eyepiece itself. This would likely entail larger eyepieces and a larger field of vision for the eye. The movable eyepieces can be linked to associated optics using prisms or other well known methods.
It is possible that the eyepieces could be manually adjusted to within a few millimeters of alignment, and then the mechanism described above would provide fine adjustment of interpupillary distance (auto IPD) to achieve fine alignment between the eyepieces and eyes. Adjustment of as little as +/- 5 mm on the X and Y axes would likely be sufficient to obtain fine adjustment for the IPD of most users, which averages 64-65 mm.
As described in connection with previous embodiments, it is possible to determine the Z axis location of the eye, or vertex distance. Accordingly, eyepiece lenses can additionally be adjusted to control the eye's actual distance from the eyepiece lenses. Moreover, a built-in autorefractor of a type known in the art can be used to adjust the diopter value of the eye piece lenses to assure the sharpest image on the eye's retina.
Figure 15 illustrates an arrangement of elements that may be used in connection with the embodiment of Figure 14 or other embodiments discussed above. Emitter 501 projects light that may be in the infrared range. The light from a single emitter 501 reflects off of beam splitter 502 and passes through lens 503 toward an eye 504. Emitter 501 may be positioned to be inside the focal point of lens 503 to provide a sufficiently wide beam. Lens 503 is positioned with respect to sensor 531 so that eye 504 is properly focused on the sensor. The same applies to lenses 410 and 411 and sensors 420 and 421 of Figure 14. The wide beam of illumination of eye 504 is indicated by outer boundaries of such light 510 and 511. The bright pupil reflection from eye 504 appears as 521, which passes back through lens 503, through beam splitter 502, and impinges upon a sensor 531. Because of the optics design, the reflected bright pupil, or alternatively the dark pupil, appears as a small spot on sensor 531. The location of such small spot on sensor 531 uniquely identifies a line along which eye 504 must be located.
In each of Figures 14 and 15, lens center lines correspond to the center lines of the associated sensors.
In addition to ultra-compactness, simplicity, and low cost, the advantages of the above application include quickly determining the location of the eye on the x-, y-, and z-axes, so that there is no searching for the eye and speedy acquisition of the eye is achieved, that is, alignment of the device's optical axis with the eye's optical axis.
While the present invention has been described in connection with particular embodiments, the invention is defined by the attached claims, the scope of which is not limited to the particular implementations of such invention as described above.

Claims

1. An apparatus for determining a location of each of a left and right eye in three dimensions, comprising:
first and second left-eye light sources, each arranged to project light onto a left-eye area of an eye target area to produce first and second left-eye reflected light, respectively;
first and second right-eye light sources, each arranged to project light onto a right-eye area of an eye target area to produce first and second right-eye reflected light, respectively;
a first lens arranged with respect to the first left-eye light source, the first right-eye light source, the left-eye area, and the right-eye area so that the first left-eye reflected light and the first right-eye reflected light pass through the first lens to produce first left-eye focused light and first right-eye focused light, the first left-eye light source and the first right-eye light source being positioned so as to be coaxial with the first lens;
a second lens arranged with respect to the second left-eye light source, the second right-eye light source, the left-eye area, and the right-eye area so that the second left-eye reflected light and the second right-eye reflected light pass through the second lens to produce second left-eye focused light and second right-eye focused light, respectively, the second left-eye light source and the second right-eye light source being positioned so as to be coaxial with the second lens;
a first electronic imaging device arranged with respect to the first lens so that the first left-eye and first right-eye focused lights impinge upon the first electronic imaging device, the first electronic imaging device producing as output first left-eye and first right-eye position data based on locations on the first electronic imaging device where the first left-eye and first right-eye focused lights impinge;
a second electronic imaging device arranged with respect to the second lens so that the second left-eye and second right-eye focused lights impinge upon the second electronic imaging device, the second electronic imaging device producing as output second left-eye and second right-eye position data based on locations on the second electronic imaging device where the second left-eye and second right-eye focused lights impinge; and
a data processor receiving as inputs the first and second left-eye position data and the first and second right-eye position data, and based on known relative positions of the first and second electronic imaging devices calculating three-dimensional locations of the left and right eyes.
2. The apparatus of claim 1, further comprising a nose pad positioned in a known relationship with the first and second electronic imaging devices, wherein the data processor further calculates a distance between the nose pad and each of the left and right eyes.
3. The apparatus of claim 1, further comprising at least one first beam splitter, wherein each of the left-eye and right-eye light sources is arranged so that the light passing from the left-eye and right-eye light sources reflects from the at least one first beam splitter before impinging on the left-eye and right-eye areas, and so that the left-eye and right-eye reflected light passes through the at least one first beam splitter.
4. The apparatus of claim 3, further comprising at least one second beam splitter arranged so that light from the left-eye and right-eye light sources reflected off of the at least one first beam splitter reflects off of the at least one second beam splitter before impinging on the left-eye and right-eye areas, and so that the left-eye and right-eye reflected light reflects off of the at least one second beam splitter before passing through the at least one first beam splitter.
5. The apparatus of claim 1, further comprising a first and second beam splitters, wherein each of the left-eye and right-eye light sources is arranged so that the light passing from the left-eye and right-eye light sources reflects off of the beam splitter before impinging on the left-eye and right-eye areas, and so that the left-eye and right-eye reflected light reflects off of the second beam splitter before passing through the first beam splitter.
6. An apparatus for determining a location of each of a left and right eye, comprising:
at least one movable mirror;
a position controller for the at least one movable mirror, the position controller producing as output mirror position data;
at least one light source arranged to project light to reflect off of the at least one movable mirror onto an eye target area, the eye target area comprising a left-eye area and a right-eye area, to produce left-eye reflected light and right-eye reflected light, respectively;
a lens arranged with respect to the at least one light source, the left-eye area, and the right-eye area so that the left-eye and right-eye reflected light pass through the lens to produce left-eye focused light and right-eye focused light, respectively, the at least one light source being arranged to be coaxial with the lens;
an electronic imaging device arranged with respect to the lens so that the left-eye and right-eye focused light impinges upon the electronic imaging device, the electronic imaging device producing as output left-eye position data and right-eye position data based on locations on the electronic imaging device where the left-eye and right-eye focused lights impinge, respectively; and
a data processor receiving as inputs the left-eye position data, the right-eye position data, and the mirror position data, and calculating three-dimensional locations of the left and right eyes.
7. The apparatus of claim 6, wherein the means for measuring IPD comprises:
at least one movable mirror movable along a fixed axis;
a position controller for the at least one movable mirror, the position controller producing as output mirror position data;
at least one second light source arranged to project light to reflect off of the at least one movable mirror onto an eye target area, the eye target area comprising a left-eye area and a right-eye area; and
a second electronic imaging device positioned to receive the light from the at least one second light source after said light has reflected back from one of the left-eye area and the right-eye area, the second electronic imaging device producing as output reflected light intensity data based on the light reflected from one of the left-eye area and the right-eye area;
wherein the processor uses the mirror position data and reflected light intensity data to determine the IPD.
8. An apparatus for determining a location of a target eye with respect to a known point, comprising:
first and second light sources, each arranged to project light onto an eye target area to produce first and second reflected light, respectively;
first and second lenses arranged with respect to the first and second light sources and the eye target area so that the first and second reflected light pass through the first and second lenses, respectively, to produce first and second focused light, the first and second light sources being coaxial with the first and second lenses, respectively;
first and second electronic imaging devices arranged with respect to the first and second lenses so that the first and second focused lights impinge upon the first and second electronic imaging devices, respectively, the first and second electronic imaging devices generating as output first position data and second position data based on locations on the first and second electronic imaging devices where the first and second focused lights impinge, respectively; and
a data processor receiving as inputs the first position data and the second position data, and calculating a three-dimensional location of the eye with respect to the known point.
9. The apparatus of claim 8, wherein the known point is aligned with a nose of a patient.
10. An apparatus for determining a location of left and right eyes with respect to a known point, comprising:
first and second right-eye light sources, each arranged to project light onto a right-eye eye target area to produce first and second right-eye reflected light, respectively;
first and second left-eye light sources, each arranged to project light onto a left-eye eye target area to produce first and second left-eye reflected light, respectively;
first and second right-eye lenses arranged with respect to the first and second right-eye light sources and the right-eye eye target area so that the first and second right-eye reflected light pass through the first and second right-eye lenses, respectively, to produce first and second right-eye focused light, the first and second right-eye light sources being coaxial with the first and second right-eye lenses;
first and second left-eye lenses arranged with respect to the first and second left-eye light sources and the left-eye eye target area so that the first and second left-eye reflected light pass through the first and second left-eye lenses, respectively, to produce first and second left-eye focused light, the first and second left-eye light sources being coaxial with the first and second left-eye lenses;
first and second right-eye electronic imaging devices arranged with respect to the first and second right-eye lenses so that the first and second focused right-eye light impinges upon the first and second right-eye electronic imaging devices, respectively, the first and second right-eye electronic imaging devices generating as output first and second right-eye position data based on locations on the first and second right-eye electronic imaging devices where the first and second right-eye focused lights impinge, respectively;
first and second left-eye electronic imaging devices arranged with respect to the first and second left-eye lenses so that the first and second focused left-eye light impinges upon the first and second left-eye electronic imaging devices, respectively, the first and second left-eye electronic imaging devices generating as output first and second left-eye position data based on locations on the first and second left-eye electronic imaging devices where the first and second left-eye focused lights impinge, respectively; and
a data processor receiving as inputs the first and second left-eye position data and the first and second right-eye position data, and calculating three-dimensional locations of the left and right eyes with respect to the known point.
11. The apparatus of claim 10, wherein the known point is aligned with a nose of a patient.
12. An apparatus for determining a location of one eye in three dimensions, comprising: first and second cameras, each of the cameras comprising: a lens; a sensor; and an illumination source, the illumination source comprising an emitter with a beam splitter arranged so that the illumination source is coincident with a field of view of the lens and sensor; and a processor connected to receive data output by each of the sensors; wherein a position of the first camera in relation to the second camera is known so as to create a known reference length, in order that rays of light originating from the illumination source and reflected from the eye impinge on the sensor of each of the first and second cameras, providing the data to the processor, the processor being constructed and arranged to interpret the data from the sensors to determine ray angles so as to calculate a three-dimensional location of the eye.
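The "ray angles" of claim 12 can be recovered from the sensor coordinates under a simple pinhole-camera model: the spot's offset from the optical center, converted by the pixel pitch and combined with the focal length, fixes the direction of the reflected ray in the camera frame. A minimal sketch follows; the pinhole model and all parameter names are assumptions for illustration only.

```python
import numpy as np

def pixel_to_ray(u, v, cx, cy, pixel_pitch, focal_length):
    """Convert the sensor location (u, v) of the reflected spot into a
    unit ray direction in the camera frame (pinhole model)."""
    # Offset of the impingement point from the optical center (cx, cy),
    # converted from pixels to the same physical units as focal_length.
    x = (u - cx) * pixel_pitch
    y = (v - cy) * pixel_pitch
    # The image is inverted through the lens, so flip the lateral
    # components; the ray points out of the camera toward the eye.
    ray = np.array([-x, -y, focal_length])
    return ray / np.linalg.norm(ray)
```

Rays produced this way for each camera, expressed in a common frame using the known inter-camera geometry, are exactly the inputs a triangulation such as the sketch after claim 8 would consume.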
13. A device comprising first and second of the apparatuses of claim 12, each said apparatus being constructed and arranged to determine a three-dimensional location of a respective one of two eyes of a user, wherein a position of the first apparatus in relation to the second apparatus is known, and a position of a nose of the user is known in relation to each of the first and second cameras, in order to calculate a position of each eye, an interpupillary distance (IPD), and ½ IPDs, as well as the three-dimensional location of each said eye.
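By way of example for claim 13: once each apparatus has produced a 3D eye position in a common frame that also contains the known nose position, the IPD is the separation of the two eyes, and the two half-IPDs can be taken as the horizontal offsets of each eye from the nose reference, following the X-axis convention established earlier in the document. A minimal sketch, assuming NumPy arrays in a shared coordinate frame; treating the half-IPDs as X-axis offsets is an assumption here, not a definition from the claims.

```python
import numpy as np

def ipd_metrics(left_eye, right_eye, nose_ref):
    # Full interpupillary distance: straight-line separation of the eyes.
    ipd = float(np.linalg.norm(right_eye - left_eye))
    # Half-IPDs taken as horizontal (X-axis) offsets from the nose
    # reference point -- an illustrative convention, per the lead-in.
    half_left = abs(float(left_eye[0] - nose_ref[0]))
    half_right = abs(float(right_eye[0] - nose_ref[0]))
    return ipd, half_left, half_right
```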
14. A method for determining a location of each of two eyes in three dimensions comprising steps of: projecting light from at least one light source to impinge on each of the two eyes; focusing light reflected back from each of the two eyes onto at least one electronic imaging device using at least one lens; and determining at least one line along which a respective one of the two eyes must lie using a known relationship between the light source, the lens, and the electronic imaging device.
15. The method of claim 14, wherein said at least one electronic imaging device comprises first and second electronic imaging devices, each of the first and second electronic imaging devices receiving said light reflected from each of the two eyes, the determining step comprising determining two lines along which a given one of the two eyes must lie based upon a point on each of the first and second electronic imaging devices upon which the light reflected from the given eye impinges, and determining the position of the given eye based on the two lines.
16. The method of claim 15, wherein the at least one lens comprises first and second lenses arranged to focus the reflected light on the first and second electronic imaging devices, respectively.
17. The method of claim 16, wherein each of the first and second lenses receives the light reflected back from each of the two eyes and focuses the reflected light on a respective one of the first and second electronic imaging devices.
18. The method of claim 14, wherein the at least one electronic imaging device comprises first through fourth electronic imaging devices, each of the first and second electronic imaging devices receiving said light reflected from a first of the two eyes, and each of the third and fourth electronic imaging devices receiving said light reflected from a second of the two eyes; the determining step comprising determining two lines along which the first eye must lie based upon a point on each of the first and second electronic imaging devices upon which the light reflected from the first eye impinges, and determining the position of the first eye based on the two lines; and the determining step further comprising determining two lines along which the second eye must lie based upon a point on each of the third and fourth electronic imaging devices upon which the light reflected from the second eye impinges, and determining the position of the second eye based on the two lines.
19. The method of claim 18, wherein the at least one lens comprises first through fourth lenses arranged to focus the reflected light on the first through fourth electronic imaging devices, respectively.
20. The method of claim 14, wherein said at least one electronic imaging device consists of only one electronic imaging device receiving said light reflected from each of the two eyes, the determining step comprising determining two lines along which the two eyes must lie based upon two points on the electronic imaging device upon which the light reflected from the respective two eyes impinges; the method further comprising measuring a distance between the two eyes, and calculating the three-dimensional position of each of the eyes based upon the two lines and the distance between the two eyes.
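Claim 20's single-imager variant yields only two lines, one per eye, so depth is fixed by the separately measured inter-eye distance rather than by a second view. Under the simplifying assumption that both eyes sit at the same range from the lens center, the range follows in closed form, as in the sketch below; that equal-range assumption and the function name are illustrative, not taken from the specification.

```python
import numpy as np

def locate_eyes_single_imager(lens_center, d_left, d_right, eye_distance):
    """Each eye lies on a ray lens_center + t * d. Assuming a common
    range t for both eyes, |t * (d_left - d_right)| = eye_distance,
    which can be solved directly for t."""
    d_left = d_left / np.linalg.norm(d_left)
    d_right = d_right / np.linalg.norm(d_right)
    t = eye_distance / np.linalg.norm(d_left - d_right)
    return lens_center + t * d_left, lens_center + t * d_right
```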
PCT/US2007/060689 2006-01-19 2007-01-18 Methods and apparatus for locating the eye in three dimensions WO2007084943A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US76057906P 2006-01-19 2006-01-19
US60/760,579 2006-01-19
US81216306P 2006-06-09 2006-06-09
US60/812,163 2006-06-09

Publications (2)

Publication Number Publication Date
WO2007084943A2 true WO2007084943A2 (en) 2007-07-26
WO2007084943A3 WO2007084943A3 (en) 2007-12-13

Family

ID=38288386

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2007/060689 WO2007084943A2 (en) 2006-01-19 2007-01-18 Methods and apparatus for locating the eye in three dimensions
PCT/US2007/070555 WO2007146714A2 (en) 2007-06-06 Methods and apparatus for locating the eye in three dimensions

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2007/070555 WO2007146714A2 (en) 2007-06-06 Methods and apparatus for locating the eye in three dimensions

Country Status (1)

Country Link
WO (2) WO2007084943A2 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3495897A (en) * 1966-08-04 1970-02-17 Temkine & Cie Lunetiers Device for measuring the pupillary distance
US5596377A (en) * 1993-08-31 1997-01-21 Nidek Co., Ltd. Ophthalmic apparatus having three dimensional calculating means
US6027216A (en) * 1997-10-21 2000-02-22 The Johns Hopkins University School Of Medicine Eye fixation monitor and tracker
US20050024586A1 (en) * 2001-02-09 2005-02-03 Sensomotoric Instruments Gmbh Multidimensional eye tracking and position measurement system for diagnosis and treatment of the eye

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2441382A3 (en) * 2007-07-30 2013-03-27 Lein Applied Diagnostics Limited Optical alignment apparatus
WO2013034803A1 (en) * 2011-09-06 2013-03-14 Icare Finland Oy Ophthalmic apparatus and method for measuring an eye
CN103889316A (en) * 2011-09-06 2014-06-25 伊卡尔芬兰有限公司 Ophthalmic apparatus and method for measuring an eye
US9332901B2 (en) 2011-09-06 2016-05-10 Icare Finland Oy Ophthalmic apparatus and method for measuring an eye
AU2012306231B2 (en) * 2011-09-06 2016-10-20 Icare Finland Oy Ophthalmic apparatus and method for measuring an eye
CN103889316B (en) * 2011-09-06 2017-07-18 伊卡尔芬兰有限公司 Ophthalmologic apparatus and method for measuring eyes
US11252399B2 (en) 2015-05-28 2022-02-15 Microsoft Technology Licensing, Llc Determining inter-pupillary distance

Also Published As

Publication number Publication date
WO2007146714A2 (en) 2007-12-21
WO2007146714A3 (en) 2008-12-24
WO2007084943A3 (en) 2007-12-13

Similar Documents

Publication Publication Date Title
US7025459B2 (en) Ocular fundus auto imager
US9339186B2 (en) Method and apparatus for enhanced eye measurements
JP5970682B2 (en) Eyeball measuring device and eyeball measuring method
US7316480B2 (en) Eye refractive power measurement apparatus
US20140375951A1 (en) System and method for the non-contacting measurements of the eye
US7416301B2 (en) Eye refractive power measurement apparatus
JP4684700B2 (en) Ophthalmic optical characteristic measuring device
US9107611B2 (en) Optical coherence tomography device capable of controlling measuring position
CN112244757A (en) Ophthalmologic measurement system and method
JP4630126B2 (en) Ophthalmic optical characteristic measuring device
JP3649839B2 (en) Ophthalmic equipment
US20230404402A1 (en) Ophthalmic apparatus and alignment method
WO2007084943A2 (en) Methods and apparatus for locating the eye in three dimensions
JPH0646995A (en) Eye refractometer
JP3617705B2 (en) Corneal endothelial cell imaging device
JP2023029537A (en) Ophthalmologic apparatus
JP2812421B2 (en) Corneal cell imaging device
JP3576656B2 (en) Alignment detection device for ophthalmic instruments
CN113440098A (en) Full-automatic human eye visual inspection device and method
JP2020168266A (en) Ophthalmologic light interference tomographic device and control method of ophthalmologic light interference tomographic device
JP2020127588A (en) Ophthalmologic apparatus
WO2022030202A1 (en) Ophthalmic device and ophthalmic device control program
WO2023145638A1 (en) Ophthalmic device and ophthalmic program
JP2022114614A (en) Ophthalmologic apparatus and control method thereof, and program
JP2630956B2 (en) Automatic eye refractive power measuring device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase Ref country code: DE
122 Ep: pct application non-entry in european phase Ref document number: 07701256 Country of ref document: EP Kind code of ref document: A2