WO2007026158A1 - Imaging apparatus, portable image capture device and method of assembling composite images from component images - Google Patents

Imaging apparatus, portable image capture device and method of assembling composite images from component images Download PDF

Info

Publication number
WO2007026158A1
WO2007026158A1 (application PCT/GB2006/003238; GB2006003238W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
capture device
imaging apparatus
image capture
operable
Prior art date
Application number
PCT/GB2006/003238
Other languages
French (fr)
Inventor
Peter John Bryanston-Cross
Clive Burrows
Luis DIAZ SANTANA HARO
Frederick William II Fitzke
Jon Holmes
Edward James Judd
Panagiotis Kantartzis
Panagiotis Liatsis
Toyin Oshinowo
Diana Katharine Taulbut
Richard Taylor
Brenda H. Timmerman
Jiangjian ZHONG
Original Assignee
Keeler Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Keeler Limited filed Critical Keeler Limited
Publication of WO2007026158A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/1208 Multiple lens hand-held instruments
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 Arrangements specially adapted for eye photography
    • A61B3/145 Arrangements specially adapted for eye photography by video means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • TITLE: IMAGING APPARATUS, PORTABLE IMAGE CAPTURE DEVICE AND METHOD OF ASSEMBLING COMPOSITE IMAGES FROM COMPONENT IMAGES
  • This invention relates to imaging apparatus comprising a portable image capture device and image processing means operable to assemble component images captured by the image capture device into a composite image, to a portable image capture device, in particular a device for capturing images of portions of a fundus of an eye, for use in such apparatus, and to a method of assembling a composite image from component images.
  • Imaging apparatus comprising a portable image capture device operable to capture first and second component images of respective first and second portions of an image subject, and image processing means operable to assemble the first and second component images into a composite image of at least a portion of the image subject, is known.
  • US Patent No. 6,454,410 for example, is concerned with a system for mosaicing images of an eye, the system comprising a slitlamp biomicroscope to provide first and second adjacent images of the eye, a camera coupled to the slitlamp biomicroscope to capture the first and second images and a data processor coupled to the camera to process the images into a mosaic representation of the eye.
  • the image processing means of such known imaging apparatus must carry out considerable image processing of the component images to identify to what extent, if any, the component images overlap, before overlapping the component images by any identified overlap to form the composite image. If no overlap of the component images is identified, the image processing means cannot assemble them into the composite image.
  • imaging apparatus comprising a portable image capture device operable to capture first and second component images of respective first and second portions of an image subject, and image processing means operable to assemble the first and second component images into a composite image of at least a portion of the image subject, wherein the imaging apparatus further comprises sensor means operable to determine a displacement of the portable image capture device between capture of the first and second component images and the image processing means is operable to derive from an output of the sensor means an indication of offset of the first and second portions of the image subject relative to one another or to a datum, and thereby to assemble the first and second component images into the composite image.
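  • By way of illustration only (this sketch is not part of the patent text), the assembly step described above might look as follows in Python, assuming the sensor output has already been converted into a pixel offset (dx, dy); all names are hypothetical:

```python
import numpy as np

def assemble_pair(first, second, dx, dy):
    """Place the second component image on a composite canvas at a
    (dx, dy) pixel offset from the first, the offset being derived from
    the sensor output rather than from image correlation alone.
    Works whether the portions overlap or are entirely discrete."""
    h, w = first.shape[:2]
    top, left = min(0, dy), min(0, dx)
    height = max(h, dy + h) - top
    width = max(w, dx + w) - left
    canvas = np.zeros((height, width) + first.shape[2:], first.dtype)
    canvas[-top:-top + h, -left:-left + w] = first
    canvas[dy - top:dy - top + h, dx - left:dx - left + w] = second
    return canvas
```

  • In this sketch, overlapping pixels are simply overwritten by the later image; blending of overlaps is discussed later in the description.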
  • image subject means a scene, an object or a portion of an object.
  • the offset of the first and second portions of the image subject relative to one another or to the datum may be the vertical and/or horizontal displacement of the portions from each other or the datum and/or the rotational orientation of the portions relative to each other or to the datum.
  • the term “portable image capture device” encompasses both hand held and head mounted devices, or devices supported by a user in any other way.
  • the imaging apparatus of the invention can assemble component images of portions of an image subject into wider field of view composite images of the image subject more quickly than known imaging apparatus because the use of the output of the sensor means to derive the indication of the offset of the image portions enables the amount of image processing involved in aligning and registering the component images to be reduced.
  • the image processing means is preferably operable to shift (and, if necessary, rotate) one of the images by the offset relative to the other image so as to juxtapose the first and second images, such that the composite image consists of two discrete portions of the image subject. This would not be possible using the known imaging apparatus.
  • the image processing means is preferably operable to shift (and, if necessary, rotate) one of the images by the offset relative to the other image so as to combine the first and second images, such that the composite image consists of one continuous portion of the image subject.
  • the portable image capture device is operable to capture at least one subsequent component image of a subsequent portion of the image subject
  • the sensor means is operable to determine a displacement of the portable image capture device between capture of the or each subsequent component image and another component image
  • the image processing means is operable to derive from the output of the sensor means an indication of offset of the or each subsequent portion of the image subject relative to another portion of the image subject, and thereby to assemble the or each subsequent component image into the composite image.
  • the component images may include images of discrete portions and of overlapping portions of the image subject, in which case the image processing means is operable to assemble the composite image by juxtaposing and combining the component images as appropriate.
  • said another component image is the first component image.
  • the portable image capture device preferably includes a digital image sensor operable to capture image data in the form of pixel values, such as a charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) array, and digital memory means operable to store the image data captured by the digital image sensor.
  • a digital image sensor operable to capture image data in the form of pixel values, such as a charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) array
  • digital memory means operable to store the image data captured by the digital image sensor.
  • the portable image capture device may advantageously be provided with display means for displaying an image in a field of view of the image capture device.
  • the portable image capture device may advantageously further be provided with trigger means operable to cause the image capture device to capture an image in the field of view of the device.
  • a user of the imaging apparatus can use the display means to ensure that a portion of interest of the image subject is in the field of view of the image capture device before operating the trigger means to cause the image capture device to capture an image of the portion of interest.
  • the trigger means may advantageously further be operable to cause the image capture device to capture a succession of component images at known time intervals.
  • a user of the imaging apparatus can move the image capture device relative to the image subject such that each of the succession of component images is of a portion of the image subject that overlaps another portion of the image subject, such that the resulting composite image is of one continuous portion, or even the whole, of the image subject.
  • the image processing means is operable to create a composite image which comprises a mosaic of said component images.
  • although the sensor means may be external to the image capture device, for example comprising an array of proximity detectors mounted in fixed positions in a room in which the device is to be used, the sensor means preferably forms part of the image capture device.
  • the sensor means is responsive to changes in orientation of the image capture device to produce an output from which said indication of offset is derivable.
  • the sensor means comprises a velocity sensor operable to determine linear and/or angular velocities of the image capture device.
  • the velocity sensor is preferably one of a plurality of such sensors mounted at spaced apart locations and/or differing orientations on the image capture device to enable changes in orientation of the latter to be determined.
  • the plurality of velocity sensors are preferably arranged so as to enable angular velocities of the image capture device to be measured about any of three mutually orthogonal axes that pass through the device.
  • the image capture device may advantageously include an automatic focussing system to enable focussed images to be obtained at any distance in a range of possible distances from the image capture device to the image subject.
  • the image capture device is preferably operable, at the same time as capturing the second or any subsequent component image, to store displacement data representative of the displacement of the image capture device between capture of that component image and the first component image.
  • the image processing means can then use the displacement data associated with each component image to assemble the composite image from the component images.
  • the indication of offset of the portions of the image subject derived from the displacement data can be used by the image processing means to accelerate the operation of identifying the overlap of the portions of the image subject.
  • the indication of offset of the portions of the image subject derived from the displacement data can be used by the image processing means to bypass the operation of attempting to identify the (nonexistent) overlap of the image subject.
  • the trigger means may advantageously further be operable to cause the image capture device to capture a succession of component images in response to predetermined displacements of the image capture device from an initial position and/or orientation of the device in which the trigger means was first operated.
  • This feature means that a user of the imaging apparatus need not concern himself that the succession of component images captured by the image capture device are of overlapping portions of the image subject, because the predetermined displacements of the image capture device from the initial position and/or orientation can be chosen to ensure that the succession of component images are of overlapping portions of the image subject.
  • the image capture device is preferably operable, at the same time as capturing a component image, to store velocity data representative of the rate of change of the displacement of the image capture device at the time of capturing the component image.
  • the image capture device and/or image processing means can discard any component image that was captured when the device was moving at a velocity greater than a predetermined rate of change of displacement, because the image may be blurred or the displacement data inaccurate due to the movement experienced by the sensor lying out of its range of operation.
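  • A minimal sketch of this discard test, assuming angular speeds in degrees per second and a hypothetical threshold (the patent specifies only "a predetermined rate of change of displacement"):

```python
MAX_ANGULAR_SPEED = 40.0  # degrees per second; illustrative value only

def keep_component_image(angular_speeds, limit=MAX_ANGULAR_SPEED):
    """Return False (discard) if the device was rotating faster than the
    limit about any axis at capture time, since the image may be blurred
    or the displacement data inaccurate."""
    return all(abs(speed) <= limit for speed in angular_speeds)
```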
  • the image capture device and/or image processing means may advantageously be operable to divide a component image into a plurality of individual regions, to compare pixel values in neighbouring regions, thereby to identify individual regions of the component image that are unsuitable for assembly into the composite image, and to discard the identified individual regions.
  • the image capture device and/or image processing means is/are operable to compare colours represented by pixel values in neighbouring regions, thereby to identify the individual regions that are unsuitable for assembly into the composite image.
  • the image capture device is preferably operable to compare colours represented by pixel values in individual regions with a range of colours that are expected to be encountered in images of the particular type of image subject, thereby to identify the individual regions that are unsuitable for assembly into the composite image.
  • the image capture device is an instrument for use in ophthalmology and is operable to compare colours represented by pixel values in individual regions with a range of colours that are expected to be encountered in images of the fundus of an eye.
  • the image capture device and/or image processing means is further operable to determine whether an individual region identified as unsuitable for assembly into the composite image is greater in area than a threshold area, and to discard the identified individual region only if it is determined to be greater in area than the threshold area.
  • the imaging apparatus can distinguish to some extent between regions of the component image that are too dark or too bright because of incorrect illumination of the image subject, and regions of the component image that correspond to features of the image subject that are, for example, more reflective than their surroundings, and therefore properly appear as a bright region in the component image. It is undesirable to discard such regions from the component images.
  • many of the component images of portions of the fundus of an eye captured by the image capture device include a large bright reflection caused by the cornea of the eye.
  • the imaging apparatus is thus operable to discard regions of the captured images corresponding to such corneal reflections, whilst retaining small bright or dark regions corresponding to features of the eye such as blood vessels.
  • the image processing means can nevertheless align and register the remaining regions of the component images to form the composite image, using the displacement data associated with the component images.
  • portions of the image subject which are omitted from one component image may nevertheless appear in another component image, so that the portions in question may still appear in the composite image.
  • the image processing means may advantageously further be operable to identify whether a user's movements of the image capture device follow a pattern, and if such a pattern is identified, to use the pattern to predict the approximate displacement of the image capture device from an initial position and/or orientation on the basis of previous displacements of the image capture device from the initial position and/or orientation.
  • the image processing means may advantageously be operable to use common features in component images to position and orientate the component images relative to one another, for example by correlation, in which case the ability to predict the approximate displacement of the image capture device, and hence the approximate offset of one portion of the image subject relative another portion of the image subject, considerably reduces the amount of processing that must be carried out by the image processing means, because the image processing means can process those component images, or even those portions of a component image, that are most likely to have features in common with a particular component image.
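  • As an illustration of how the predicted displacement reduces processing, here is a sketch of correlation restricted to a small search window around the sensor-predicted offset (grayscale images and all names are assumptions):

```python
import numpy as np

def refine_offset(ref, new, predicted, search=8):
    """Correlate `new` against `ref` only at offsets within +/- `search`
    pixels of the sensor-predicted offset, rather than over every
    possible alignment."""
    px, py = predicted
    best, best_score = predicted, -np.inf
    h, w = new.shape
    for dy in range(py - search, py + search + 1):
        for dx in range(px - search, px + search + 1):
            # Overlap of the two images at this trial offset.
            y0, x0 = max(0, dy), max(0, dx)
            y1 = min(ref.shape[0], dy + h)
            x1 = min(ref.shape[1], dx + w)
            if y1 - y0 < 16 or x1 - x0 < 16:
                continue  # too little overlap to score reliably
            a = ref[y0:y1, x0:x1].astype(float)
            b = new[y0 - dy:y1 - dy, x0 - dx:x1 - dx].astype(float)
            a -= a.mean()
            b -= b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            if denom == 0:
                continue
            score = (a * b).sum() / denom  # normalised cross-correlation
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best
```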
  • the image capture device is preferably provided with illumination means operable to illuminate the image subject or a portion of the image subject of which an image is to be captured by the image capture device.
  • the illumination means may advantageously comprise an incandescent filament bulb and/or one or more light-emitting diodes (LEDs).
  • At least part of the image processing means is separate from the image capture device. Consequently, the need for the image capture device to be portable does not place any constraint on the size or weight of the image processing means.
  • This allows the portable image capture device to be smaller and lighter than it would otherwise be, because at least part of the image processing means does not form a part of the portable image capture device.
  • the image capture device includes a rechargeable battery and wireless transmission means operable to transmit image data and displacement data to the image processing means.
  • the sensor means may advantageously be operable, if no movement of the device has occurred in a predetermined time, to cause the device to enter a low power consumption mode.
  • the absence of any movement of the device during a prolonged period, for example one minute, is indicative that the device is not being used and has been laid down without being turned off.
  • the low power consumption mode would typically involve turning off any illumination means, image capture and wireless transmission means of the device.
  • the apparatus may advantageously further comprise a charging cradle that has an electrical connection for connection to the image capture device to recharge the battery of the image capture device.
  • the cradle may advantageously include a further electrical connection for connection to the image capture device to transfer data between the image capture device and the cradle.
  • the cradle includes wireless reception means operable to receive image data from the image capture device.
  • the cradle would be connected to the image processing means by a cable for transmission of image data received from the image capture device to the image processing means.
  • the image capture device includes calibration means for calibrating the sensor means of the image capture device when the latter is placed in the cradle. Since the device is stationary when in the cradle, placing the image capture device in the cradle ensures the device is in a position in which no significant velocity should be detected thus facilitating calibration.
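  • A minimal sketch of such a calibration, assuming the raw sensor outputs are sampled while the device rests in the cradle (names hypothetical):

```python
import numpy as np

def calibrate_zero_rate(stationary_samples):
    """Average each gyroscope's output while the device is known to be
    stationary in the cradle, giving a zero-rate bias to subtract from
    subsequent readings before they are integrated."""
    samples = np.asarray(stationary_samples, dtype=float)  # (n, 3 axes)
    return samples.mean(axis=0)

# Usage: bias = calibrate_zero_rate(raw); rate = reading - bias
```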
  • the image capture device includes preliminary image processing means for identifying images which, on the basis of the associated velocity data, will be blurred, determining the autofocus and displacement data for each component image, linking autofocus and displacement data to their respective component images, and discarding individual regions of component images identified as being unsuitable for assembly into the composite image, prior to said data being supplied to the image processing means.
  • the image capture device is an instrument for use in ophthalmology, for example a direct ophthalmoscope.
  • the image capture device is preferably a handheld device.
  • the imaging apparatus preferably includes analysing means for comparing the autofocus measurements obtained during the inspection of the retina of the eye with a set of expected data for an eye to determine whether the eye has any defects associated with an unusual variation of the distance of the retina from the cornea (e.g. a detached retina).
  • the image capture device preferably includes a user interface that allows a user of the device to transmit "yes" and "no" responses to prompts transmitted to the device by the image processing means.
  • Such prompts might, for example, be questions whether the user wishes to capture further component images or whether a composite image should be stored.
  • a portable image capture device for imaging apparatus in accordance with the first aspect of the invention.
  • a method of assembling a composite image from component images comprising the steps of capturing first and second component images of respective first and second portions of an image subject using imaging apparatus including a portable image capture device comprising sensor means operable to determine a displacement of the portable image capture device between capture of the first and second component images, deriving from an output of the sensor means an offset of the first and second portions of the image subject relative to one another or to a datum, and assembling the first and second component images into the composite image using the offset of the first and second portions of the image subject.
  • Figure 1 is a schematic diagram of imaging apparatus in accordance with the first aspect of the invention.
  • Figure 2 is a schematic diagram of a handheld direct ophthalmoscope of the imaging apparatus of Figure 1;
  • Figure 3 is a functional block diagram of the ophthalmoscope of Figures 1 and 2;
  • Figure 4 is a functional block diagram of the personal computer of Figure 1;
  • Figure 5 is a functional block diagram of the cradle of Figure 1;
  • Figure 6 is a representation of the velocity sensors of the ophthalmoscope of Figures 1 and 2 and the three mutually orthogonal axes about which the velocity sensors are operable to measure rotational velocity of the ophthalmoscope;
  • Figure 7 is a flow chart of the sequence of operations carried out by the ophthalmoscope when capturing component images
  • Figures 8a and 8b are representations of an image of a portion of the fundus of an eye before and after, respectively, discarding individual regions of the image that are unsuitable for assembly into a composite image;
  • Figure 9 is a representation of a composite image of a portion of a fundus of an eye assembled by the method of the third aspect of the invention.
  • the imaging apparatus of Figure 1 comprises an image capture device in the form of a portable handheld, battery powered direct ophthalmoscope 100, image processing means in the form of a programmed personal computer 200 and a cradle 300 in which the ophthalmoscope 100 can be placed to recharge a rechargeable battery of the ophthalmoscope.
  • image processing means in the form of a programmed personal computer 200
  • cradle 300 in which the ophthalmoscope 100 can be placed to recharge a rechargeable battery of the ophthalmoscope.
  • the components of the ophthalmoscope 100 are described in detail with reference to Figure 2.
  • the programmed personal computer 200 is of standard configuration and includes a processor 210, a monitor 212 and memory 214.
  • the memory 214 includes both volatile memory and non- volatile memory in the form of a hard drive.
  • the volatile memory is used for execution of an image processing program and the hard drive is used to store the image processing program and image data generated by execution of the image processing program.
  • the components of the processor 210 and memory 214 are described in detail with reference to Figure 4.
  • the ophthalmoscope 100 comprises a housing 110 containing a beamsplitter 112, a corrective lens system 114, an illumination mirror 116, an aperture/graticule/colour filter assembly 118, an illumination lens 120, an illumination point source 122, an auto focus system 124, a CCD array lens 126, a CCD contrast enhancement filter 128, a CCD array 130, an orientation sensing system 132, a microprocessor 134, an encoder 136, a wireless transmitter 138 and a rechargeable battery 140.
  • the exterior of the housing 110 is provided with a user interface denoted generally by reference numeral 142, a set of battery charging contacts 144 and a set of data transfer contacts 146.
  • the housing 110 is adapted to prevent light from entering inappropriate areas of the interior of the ophthalmoscope and has an internal surface of a suitable colour and texture to limit unwanted reflections.
  • the housing 110 comprises a handle portion 148 which is topped by an elongate body portion 150 that extends substantially horizontally with the ophthalmoscope in its normally intended attitude of use.
  • the ophthalmoscope is positioned with the front end of the body portion 150 facing the eye to be examined (denoted by reference numeral 152 in Figure 2). An image of that eye can be directly viewed by the user (whose eye has been denoted by the reference numeral 154).
  • the body portion 150 of the housing contains a set of illumination optics, constituted by the illumination point source 122, aperture/graticule/colour filter assembly 118, illumination lens 120 and illumination mirror 116 for projecting a beam 156 of illuminating light into the eye 152.
  • the body portion 150 of the housing also contains a set of viewing optics, constituted by the corrective lens system 114 and beam splitter 112 through which an image of the eye 152 can be viewed.
  • the illumination and imaging optics are, in essence, identical with the illumination and imaging optics of known designs of direct ophthalmoscope, such as the "Professional" ophthalmoscope made by Keeler Limited of the United Kingdom.
  • the illumination point source 122 takes the form of an incandescent bulb connected to an on board power supply for supplying a variable voltage to the incandescent bulb in order to vary the intensity of white light emitted by the latter. Situated above the source 122 is the illumination lens 120 which causes light from the source 122 to illuminate uniformly a variable aperture in the aperture/graticule/colour filter assembly 118.
  • the assembly 118 has been shown in a simplified form, but does include a wheel that, as with known direct ophthalmoscopes, provides a blue or green filter to give a "red-free" illumination for contrast enhancement, graticules to provide a "target" image within the light beam, and differently sized apertures to produce light beams of different diameters.
  • This wheel is rotatably mounted in the handle.
  • the wheel can be rotated by means of a control forming part of the user interface denoted generally by reference numeral 142 to bring any selected one of the apertures or colour filters into registry with the illumination lens 120 so as to provide control of the size of the spot of illuminating light projected into the eye 152.
  • the illumination mirror 116 of the illumination optics is mounted in the body portion 150 of the housing in a position spaced from the viewing axis 158 of the viewing optics. Light from the illumination source 122 is reflected off the illumination mirror 116 in the direction of the eye 152 to be examined, with the path 156 of the light slightly angled towards the viewing axis 158.
  • the corrective lens system 114 consists of an array of corrective lenses arranged in a circular pattern and hence referred to as a "lenswheel", for compensating for the combined refractive error of the eye 154 of the user and eye 152 to be examined in a manner familiar from known ophthalmoscopes.
  • the beamsplitter 112 comprises a partially silvered mirror, which allows a portion of the light reflected through the corrective lens system along the viewing axis 158 from the eye 152 to be examined to reach the eye 154 of the user, thereby allowing the eye 154 of the user to view an image of the retina of the eye 152 to be examined, whilst also reflecting an image of the retina of the eye 152 along an image capture axis 160 and into the CCD array 130 via the autofocus system 124, CCD array lens 126 and CCD contrast enhancement filter 128.
  • the autofocus system 124 focuses the image of the retina of the eye 152 onto the CCD array 130 and is needed to compensate for the change in focal length of the eye 152 experienced depending on the viewing angle into the eye 152.
  • the autofocus system is provided for this function.
  • the autofocus system 124 makes use of a known frequency detection technique whereby the sharper the edges in an image are, the more in-focus an image is.
  • the autofocus system uses known electromechanical means to carry out a "dithering" (i.e. small displacement, high frequency, rapidly oscillating) movement in addition to its corrective repositioning.
  • the "dithering" motion is used in conjunction with frequency detection feedback to determine which direction the autofocus system should move (i.e. whether it is moving in or out of focus as it oscillates back and forth). Once the correct direction has been determined during the "dithering" motion, it will move rapidly to a point that provides for an in-focus image, again in conjunction with frequency detection feedback.
  • the frequency detection process is carried out by the microprocessor 134.
  • a sensor (not shown) is used to determine the rotary position of the lenswheel in order to determine which corrective lens is engaged at any one time. This information is used by the autofocus system so that any movement of the lenswheel causes a corresponding adjustment by the autofocus system.
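  • A sketch of the dithering decision, using gradient energy as a stand-in for the frequency-based sharpness measure; the `capture` callable, which returns an image at a given focus position, is hypothetical:

```python
import numpy as np

def sharpness(image):
    """Edge-based focus metric: the sharper the edges, the more
    high-frequency content, so larger gradient energy means better
    focus."""
    gy, gx = np.gradient(image.astype(float))
    return float((gx * gx + gy * gy).sum())

def dither_direction(position, capture, step=1):
    """Dither the focusing element either side of its current position
    and compare the focus metric to decide which way to move."""
    ahead = sharpness(capture(position + step))
    behind = sharpness(capture(position - step))
    return step if ahead > behind else -step
```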
  • the CCD array 130 is operable to capture a succession of images of at least part of the retina of the eye 152 to be examined.
  • the CCD array is covered by the contrast enhancement filter 128, whose spectral characteristics alter the spectral profile of the light striking the surface of the CCD array in such a way that certain wavelength ranges are reduced in intensity in order to improve image contrast.
  • One effect of the filter 128 that improves the contrast of an image is the reduction in scattered light within a particular wavelength range reflected from the retina.
  • Another effect is the removal of longer-wave infrared light, which may otherwise cause blurred images by penetrating into the material of the CCD array to such an extent that the light crosses pixel boundaries.
  • the orientation sensing system 132 comprises an array of velocity sensors, which are described in detail in relation to Figure 6 and are sensitive to changes in orientation of the ophthalmoscope in the form of rotations about any of three mutually perpendicular axes.
  • the output of the orientation sensing system 132 is therefore representative of changes in orientation of the ophthalmoscope.
  • This output is sent to the microprocessor 134 and combined with the image data captured following any such changes in orientation of the ophthalmoscope.
  • This combined data is supplied to the encoder 136 and wireless transmitter 138 of the ophthalmoscope, which transmits the encoded combined data to the cradle 300 via a wireless link (denoted in Figure 1 by reference numeral 162).
  • the aperture/graticule/colour filter assembly 118 is, as with known designs of direct ophthalmoscope, operable to project a graticule onto the retina of the eye 152 to be examined.
  • FIG. 3 shows the electrical and electronic subsystems of the ophthalmoscope 100, which are an optical subsystem 164, an illumination subsystem 166, a communication subsystem 168, an imaging subsystem 170 and a power supply subsystem 172.
  • the optical subsystem 164 comprises the aperture/graticule/colour filter assembly (denoted by reference numeral 118 in Figure 2), which is made up of selectable apertures 118a, selectable graticules 118b and red-free colour filter 118c, together with corrective lens system 114, beamsplitter 112, autofocus system 124 and CCD array lens 126.
  • the illumination subsystem 166 comprises the illumination point source 122 and illumination controller 142a, which controls the brightness of the point source 122.
  • the communication subsystem 168 comprises the wireless transmitter 138 and data transfer contacts 146, together with a buffer memory 174 and a data port 176 adapted to receive a removable memory card 178.
  • the imaging subsystem 170 comprises the image capture trigger 142b, CCD array 130, CCD contrast enhancement filter 128 and microprocessor 134, which is operable to carry out image pre-processing on captured images.
  • the power supply subsystem 172 comprises the charging contacts 144, rechargeable battery 140 and a sleep mode controller 180. The power supply subsystem 172 supplies power to the subsystems 164, 166, 168 and 170.
  • the charging contacts 144 make electrical contact with corresponding contacts (denoted by reference numeral 344 in Figure 5) on the cradle 300 and thus enable the re-chargeable battery 140 to be re-charged.
  • the power supply subsystem 172 supplies power to the illumination source 122 through the illumination controller 142a, which is controlled via the user interface 142 of the ophthalmoscope and varies the amount of power supplied to the illumination source 122 (and hence the brightness of the latter).
  • the sleep mode controller 180 is connected to the output of the orientation sensing system 132.
  • the sleep mode controller monitors the output of the orientation sensing system 132 and, if the ophthalmoscope has been stationary for more than a predetermined period of time, switches the ophthalmoscope to a power saving mode by deactivating the supply of power to the illumination source 122, imaging subsystem 170 and the communication subsystem 168.
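  • A sketch of this logic, with the timeout taken from the one-minute example given below (names hypothetical):

```python
import time

class SleepModeController:
    """Switch the device to a low-power mode after a period without
    detected motion; any motion resets the timer."""

    def __init__(self, timeout=60.0):  # seconds; adjustable by the user
        self.timeout = timeout
        self.last_motion = time.monotonic()

    def mode(self, moving):
        now = time.monotonic()
        if moving:
            self.last_motion = now
        if now - self.last_motion > self.timeout:
            return "low-power"  # illumination, imaging and radio off
        return "active"
```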
  • the image capture trigger 142b of the imaging subsystem 170 is operable by the user to cause the microprocessor 134 to begin combining image data from the CCD array 130 with orientation data from the orientation sensing system 132.
  • the image data captured by the CCD array 130 is processed by the microprocessor 134 before being combined with the orientation data to identify captured images that are unsuitable for assembly into a composite image. Such unsuitable images are discarded.
  • the computer 200 could perform the same pre-processing (as well as the remaining image processing/construction involved in preparing the composite image or images) since the computer 200 can receive image and orientation data directly from the ophthalmoscope by means of the memory card 178. Use of the memory card to transfer image and orientation data directly to the computer 200 is described in greater detail below.
  • a microprocessor located in the cradle 300 could perform the preprocessing of the image data instead of the microprocessor 134 of the ophthalmoscope.
  • the sleep mode controller 180 deactivates the ophthalmoscope if the latter is not moved for a period of at least one minute. This period is adjustable via the user interface 142 of the ophthalmoscope. As soon as the ophthalmoscope is moved again, for example by being picked up, this movement is detected by the orientation sensing system 132 and power is supplied as normal to all subsystems of the ophthalmoscope.
  • Figure 3 also illustrates at 182 an option of feedback control of the illumination source 122 by the imaging subsystem 170.
  • the imaging subsystem 170 can increase or decrease the illumination provided by the illumination source 122 dependent upon whether the images captured by the CCD array 130 are under exposed or over exposed, whilst differentiating between areas of interest in the images and the saturated areas typical of, for example, a glare spot due to corneal reflection.
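  • A sketch of such feedback control, with illustrative thresholds (the patent does not specify values); saturated pixels are excluded so that a glare spot does not drive the exposure decision:

```python
import numpy as np

def adjust_illumination(image, power, step=0.05):
    """Nudge the illumination power up or down according to the mean
    brightness of the non-saturated pixels of an 8-bit image."""
    pixels = image.astype(float)
    useful = pixels[pixels < 250]      # ignore saturated glare regions
    if useful.size == 0:
        return max(0.0, power - step)  # everything saturated: dim
    mean = useful.mean()
    if mean < 80:                      # under-exposed: brighten
        return min(1.0, power + step)
    if mean > 180:                     # over-exposed: dim
        return max(0.0, power - step)
    return power
```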
  • the memory card 178 provides removable storage of image data captured by the CCD array 130, and also provides an extended buffer memory in the event of the ophthalmoscope being unable to transmit data to the cradle 300, for example because the ophthalmoscope has been used in an environment where RF noise interrupts the wireless link 162 with the cradle, or where the ophthalmoscope is used at a greater distance from the cradle than the range of the wireless transmitter 138.
  • the memory card 178 can be removed from the ophthalmoscope and connected to a card reader 216 connected to the computer 200 to transfer data from the ophthalmoscope to the computer if, for any reason, wireless transmission of data is not possible.
  • light 156 from the illumination source 122 passes through the variable aperture/graticule/colour filter assembly 118 and the illumination lens 120 and is reflected by the mirror 116 into the eye 152 to be examined.
  • Light reflected from the eye 152 passes through the corrective lens system 114 to the beamsplitter 112, which directs a portion 158 of the light reflected from the eye 152 to the eye 154 of the user, and the remainder 160 of the light to the CCD array 130 via the autofocus system 124 and CCD contrast enhancement filter 128.
  • Operation of the trigger 142b is sensed by the microprocessor 134, which begins to store image data captured by the CCD array and to encode associated orientation data into each image. This combined image and orientation data is transmitted to the encoder 136.
  • the image data is representative of a portion of the eye 152 to be examined at which the ophthalmoscope is pointed when the trigger 142b is operated, and the orientation data is representative of a displacement of the ophthalmoscope from an initial orientation when the trigger 142b is first operated.
  • the computer 200 includes a cradle link subsystem 218, an image processing subsystem 220 and a user interface subsystem 222.
  • the cradle link subsystem 218 includes a cable port 224 and a buffer memory 226.
  • the image processing subsystem 220 includes relative positioning 228, correlation 230, blending 232 and global registration 236 software components.
  • the user interface subsystem 222 includes feedback 238, patient records 240, parameter setting 242, correlation 244, image viewing 246 and system diagnostics 248 software components.
  • the cable port 224 of the cradle link subsystem 218 is connected to a cable port 324 of a computer link subsystem 318 of the cradle 300 by a cable.
  • the computer 200 further includes a temporary memory 250 via which data is transmitted from the cradle link subsystem 218 to the image processing subsystem 220, and quasi-permanent image storage 252 and permanent image storage 254. Data transmitted to the computer 200 via the cradle link subsystem 218 is held in the temporary memory 250.
  • the buffer memory 226 of the cradle link subsystem 218 provides buffering for occasions when there are delays in the image processing and/or user interface subsystems accepting data from the cradle link subsystem.
  • the image processing subsystem carries out the image processing operations of relative positioning (i.e. positioning the component images relative to one another using the associated orientation data), correlation, blending and global registration, using the software components 228, 230, 232 and 236 respectively.
  • the image processing subsystem 220 is further operable, if the orientation data associated with an image indicate that it overlaps a previously received image and correlation of the images does not identify the overlap, to correlate scaled up and scaled down versions of the image with the previously received image so as to identify whether the magnification of the image is different from that of the previously received image.
  • Such a change of magnification can occur as a result of movement of the ophthalmoscope relative to the cornea of the eye 152, because of the curved shape of the cornea.
  • the scaled version of the image is assembled into the composite image.
  • This process of scaling up or scaling down an image is likely to be necessary where the corrective lens system 114 of the ophthalmoscope is altered and at the start of a sequence of captured images, because the distance from the ophthalmoscope to the retina is likely to change slightly between capture of a first sequence of component images and a subsequent sequence of component images.
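  • A sketch of the scale search, assuming a `correlate` callable that scores an alignment (such as the normalised cross-correlation shown earlier); grayscale images and nearest-neighbour resampling keep the example short:

```python
import numpy as np

def best_scale(ref, new, offset, correlate, scales=(0.95, 1.0, 1.05)):
    """Correlate scaled-up and scaled-down versions of the new image with
    the previously received image and keep the scale that matches best,
    to detect a change of magnification."""
    best, best_score = 1.0, -np.inf
    for s in scales:
        h = max(1, int(round(new.shape[0] * s)))
        w = max(1, int(round(new.shape[1] * s)))
        ys = (np.arange(h) / s).astype(int).clip(0, new.shape[0] - 1)
        xs = (np.arange(w) / s).astype(int).clip(0, new.shape[1] - 1)
        scaled = new[np.ix_(ys, xs)]  # nearest-neighbour resample
        score = correlate(ref, scaled, offset)
        if score > best_score:
            best_score, best = score, s
    return best
```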
  • the image processing subsystem 220 is still further operable to warp the composite images that are assembled to counteract the curvature of the retina and enable the retina to be rendered accurately as a flat surface on the monitor 212 of the computer.
  • the field of view of the ophthalmoscope is relatively narrow, such that the degree of warping required for a particular image is relatively constant across the image.
  • the warping is carried out by applying differing amounts of warping to overlapping images and correlating the warped images to identify the amount of warping for each image that gives the best correlation with its neighbouring images. This gives the most accurate rendition of the retina as a flat surface.
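  • The warp search can be sketched in the same spirit; `apply_warp` and `correlate` are hypothetical callables standing in for the warping and correlation steps:

```python
def best_warp(ref, new, warp_strengths, apply_warp, correlate):
    """Apply differing amounts of warping to the new image and keep the
    strength whose warped image correlates best with its neighbour."""
    scored = [(correlate(ref, apply_warp(new, k)), k) for k in warp_strengths]
    return max(scored)[1]  # strength with the highest correlation score
```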
  • the user interface subsystem 222 enables the user to receive information from the computer. Such information might for example include a warning generated by the computer in response to determination that the autofocus settings generated by the autofocus system of the ophthalmoscope do not fall within a range of expected settings for a healthy eye, so as to alert the user to the possibility of a detached retina in the eye 152.
  • the user interface subsystem also enables the user to view completed composite images, to alter system settings such as the number of times an image is captured of any given portion of the retina of the eye 152 (i.e. the extent to which oversampling takes place), and to communicate with external systems such as a patient record database.
  • the patient records software component 240 enables the user interface subsystem to organise patient records and to store composite images of eyes as part of such patient records.
  • the system diagnostics software component 248 enables the user interface subsystem to carry out a diagnostic procedure to determine whether the ophthalmoscope is functioning correctly, and to alert the user if a fault is detected.
  • the cradle 300 comprises a support surface 310, which is of complementary shape to the base of the handle portion 148 of the housing 110 of the ophthalmoscope, and has charging contacts 344 for engagement with the charging contacts 144 of the ophthalmoscope, such that the rechargeable battery 140 of the ophthalmoscope is recharged when the ophthalmoscope is placed in the cradle, and data transfer contacts 346 for engagement with the data transfer contacts 146 of the ophthalmoscope such that data can be transmitted between the cradle 300 and the ophthalmoscope 100.
  • the cradle 300 further comprises a wireless link subsystem 312, a microprocessor 314, a display subsystem 316, the computer link subsystem 318 and a power supply subsystem 320.
  • the wireless link subsystem 312 includes a buffer memory 322, a radio frequency transceiver 326 and a decoder 328, the radio frequency transceiver 326 being adapted to receive encoded image and orientation data from the transmitter 138 of the ophthalmoscope, and the decoder 328 being adapted to decode the image and orientation data.
  • the transceiver 326 of the cradle also enables additional data and commands (for example diagnostic data and image storage commands) to be transmitted between the ophthalmoscope and the cradle.
  • the buffer memory 322 provides an initial store for data received from the ophthalmoscope.
  • the microprocessor 314 is operable to control the subsystems of the cradle and, if necessary, to carry out pre-processing of the image data, such as discarding images that are blurred because the ophthalmoscope was moving too fast as the image was captured.
  • the display subsystem 316 includes a display screen 330 that can display information such as whether the cradle is charging the battery of the ophthalmoscope, whether the ophthalmoscope is within wireless transmission range of the cradle, and whether transmission of data between the cradle and the ophthalmoscope is in progress.
  • the computer link subsystem 318 includes the cable port 324 and buffer memory 332.
  • the computer link subsystem 318 is connected to the cable port 224 of the cradle link subsystem 218 of the computer 200 by a cable.
  • the power supply subsystem 320 comprises a mains supply inlet 334 for connection to mains supply outlet 340, a transformer 336 and a battery charging controller 338.
  • a rectifier (not shown) is used in conventional fashion to provide DC power to the other subsystems of the cradle and to the rechargeable battery 140 of the ophthalmoscope 100.
  • Image data and orientation data received by the transceiver 326 of the cradle are transmitted via the cable port 324 by a cable to the cable port 224 of the computer 200.
  • Algorithms using, for example, averaging routines are used to blend the overlaid images together such that boundaries between individual component images are eliminated or at least reduced in visual impact.
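  • A sketch of averaging-based blending: each component image is accumulated onto the canvas together with a per-pixel count, and dividing the two averages the overlaps so that seams are softened. Offsets are assumed to have been mapped to non-negative canvas coordinates already:

```python
import numpy as np

def accumulate(canvas_sum, canvas_count, image, dx, dy):
    """Add one component image to the running sum and count."""
    h, w = image.shape[:2]
    canvas_sum[dy:dy + h, dx:dx + w] += image.astype(float)
    canvas_count[dy:dy + h, dx:dx + w] += 1

def composite(canvas_sum, canvas_count):
    """Average the accumulated images; empty pixels stay zero."""
    return canvas_sum / np.maximum(canvas_count, 1)
```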
  • When those portions of the eye 152 are in the field of view of the ophthalmoscope, as seen by the eye 154 of the user through the ophthalmoscope, the user operates the trigger 142b; the ophthalmoscope transmits the combined image and displacement data to the cradle 300, and the computer 200 incorporates the new composite image of the missing portion of the eye into the composite image displayed on the monitor 212.
  • the orientation sensing system 132 comprises first, second and third angular rotation sensors 132a, 132b and 132c respectively.
  • the angular rotation sensors are ADXRS150 micro-machined gyroscopes made by Analog Devices of the United States of America. Each sensor is operable to measure rotation about an axis that passes through the sensor.
  • the sensors 132a, 132b and 132c are arranged such that the axis about which each measures rotation is perpendicular to the axes of the other two sensors.
  • the first sensor 132a is arranged such that when the ophthalmoscope 100 is in its attitude of normal use, i.e. substantially upright, the axis 190 about which the sensor measures rotation is a horizontal line parallel to a line connecting the eyes 152 and 154.
  • the second sensor 132b is arranged such that the axis 192 about which the sensor measures rotation is another horizontal line perpendicular to the axis 190.
  • the third sensor 132c is arranged such that the axis 194 about which the sensor measures rotation is a vertical line perpendicular to the axes 190 and 192.
  • the axis 190 will be referred to as the roll axis
  • the axis 192 will be referred to as the pitch axis
  • the axis 194 will be referred to as the yaw axis.
  • the output signal of the relevant sensor, for example the first sensor 132a for the roll axis, which is related to the angular velocity of the sensor about the axis, is integrated with respect to time from a starting point.
  • the starting point is the vertical position of the ophthalmoscope when it is placed in the cradle 300 in the correct manner. If it is assumed that in this position the ophthalmoscope is vertical, then the yaw axis can be assumed to be vertical, from which it follows that the pitch and roll axes are horizontal.
  • the direction in which the ophthalmoscope is facing can be considered to be a vector combination of the roll, pitch and yaw axes, with the vector projecting in varying amounts along each axis depending on the direction in which the ophthalmoscope is facing.
  • the angular position around any of the three axes can be used to describe how far along the two other axes the vector combination projects. For a given amount of rotation about a particular axis, it is possible to calculate using trigonometric methods, how far along the other two axes the vector projects. Assuming that the ophthalmoscope has been calibrated such that the yaw axis is vertical, the values of components resolved along the axes of the projecting vector are used to obtain the indication of the offset of the portions of the image subject in the following manner.
  • the view of the retina of the eye 152 would appear to move up or down to a user of the ophthalmoscope and the value of the vector component along the yaw axis would be used to obtain the indication of the offset.
  • the view of the retina of the eye 152 would appear to move left or right to the user of the ophthalmoscope and the value of the vector component along the pitch axis would be used to obtain the indication of the offset.
  • the view of the retina of the eye 152 would appear to rotate clockwise or anticlockwise to the user of the ophthalmoscope and the measured rotation of the ophthalmoscope about the roll axis would be used to obtain the indication of the offset using trigonometry.
  • the voltage output of each sensor is converted to an angular velocity and the change in orientation over a particular period of time is calculated by integration.
  • the integration process is accomplished by sampling the voltage output of each sensor at 1 ms intervals and summing the sampled voltage outputs to obtain the angular displacement of the ophthalmoscope from an initial orientation every 1 ms.
  • the angular displacement of the ophthalmoscope from the initial orientation is stored for the particular 1 ms interval during which a component image is captured.
  • the change in angular position of the ophthalmoscope about the roll axis corresponds to the difference in angular orientation of a later composite image relative to a composite image captured with the ophthalmoscope in the initial orientation. It is a simple matter to rotate the later composite image relative to the first composite image to remove the difference in orientation.
  • the change in angular position of the ophthalmoscope about the pitch and yaw axes corresponds to the difference in horizontal and vertical position of the later component image relative to the component image captured with the ophthalmoscope in the initial orientation.
  • the angle of rotation of the ophthalmoscope about each of the pitch and yaw axes is converted to a number of pixels by which the later component image is to be displaced vertically and/or horizontally relative to the first component image.
  • the CCD array 130 of the ophthalmoscope comprises 640 pixels by 480 pixels and has a field of view of 8 degrees. Each degree of rotation of the ophthalmoscope about the yaw axis therefore corresponds to a horizontal displacement of the later component image relative to the first component image of 80 pixels.
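  • These two steps, integration of the sampled rates and conversion of the resulting angles into pixel displacements, can be sketched as follows; rates are assumed to be in degrees per second after bias correction, and the names are hypothetical:

```python
import numpy as np

SAMPLE_INTERVAL = 0.001       # the 1 ms sampling interval
PIXELS_PER_DEGREE = 640 / 8   # 640-pixel array over an 8 degree field = 80

def angular_displacement(rates_dps):
    """Integrate sampled angular velocities by summing rate * dt, giving
    the rotation from the initial orientation at every 1 ms step."""
    return np.cumsum(np.asarray(rates_dps, dtype=float) * SAMPLE_INTERVAL)

def pixel_offset(yaw_degrees, pitch_degrees):
    """Convert rotations about the yaw and pitch axes into the horizontal
    and vertical pixel displacement of a later component image relative
    to the first."""
    return yaw_degrees * PIXELS_PER_DEGREE, pitch_degrees * PIXELS_PER_DEGREE
```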
  • When the ophthalmoscope is used to capture component images of the retina of an eye, the images must be captured through the pupil of the eye.
  • the ophthalmoscope is therefore used as close to the eye as possible so that each image captured is of as large a portion of the retina of the eye as possible, and as large a region of the retina as possible can be imaged.
  • the displacements of the ophthalmoscope between capture of the first component image and later component images are predominantly rotational rather than linear displacements of the ophthalmoscope, because such linear displacements would either (in the case of linear displacement along the roll axis) increase the distance between the ophthalmoscope and the pupil of the eye, so reducing the size of the portion of the retina captured by each image, or (in the case of linear displacement along the pitch or yaw axes) bring the ophthalmoscope out of alignment with the pupil, so that the retina can no longer be imaged through the pupil.
  • the ophthalmoscope of the invention therefore does not require the linear motion sensors that might be required in other imaging applications within the scope of the present invention.
  • Figure 7 shows the initial image processing steps that are carried out by the ophthalmoscope in response to operation of the trigger 142b.
  • the ophthalmoscope is switched on, for example as a result of the user picking up the ophthalmoscope, and operation moves to step 712.
  • the ophthalmoscope waits until the trigger 142b is operated, and operation moves to step 714, at which the ophthalmoscope captures image data representative of a component image, and motion data representative of any motion of the ophthalmoscope between operation of the trigger and capture of the image, after which operation moves to step 716.
  • the ophthalmoscope determines from the motion data associated with the image data whether the image is blurred due to movement of the ophthalmoscope during capture of the image. If the image is blurred, the image is discarded and operation returns to step 714, as shown by path 716a, and a new component image is captured. If the image is not blurred, operation proceeds to step 718, as shown by path 716b, at which non-useful regions of the first component image are discarded.
  • the method of identifying and discarding non-useful regions of the first component image is as follows.
  • the first component image is divided into individual regions by identifying the edges of such regions, where the colours represented by the pixel values of the image change abruptly. For each individual region the area of the region is determined. If the area of the region is smaller than a threshold area, indicating that the region corresponds to a feature of the retina rather than an over- or underexposed region of the component image, the region is retained in the component image.
  • the colours represented by the pixel values of each region are compared with a predetermined range of colours expected to be encountered in an image of the retina of an eye and with the colours represented by the pixel values of the neighbouring regions. If the colours of the region do not match any of the predetermined colours and differ by more than a threshold amount from the colours of the neighbouring regions, which indicates that the region corresponds to an over- or underexposed region of the component image, the region is discarded from the component image. Otherwise the region is retained in the component image.
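  • A sketch of this retain-or-discard rule for a single region, with illustrative numeric values (the patent gives only qualitative thresholds):

```python
import numpy as np

FUNDUS_MIN, FUNDUS_MAX = 40, 220  # assumed expected-intensity band
AREA_THRESHOLD = 500              # pixels; assumed value

def region_is_retained(region_pixels, neighbour_mean, tolerance=60):
    """Retain a small region (likely a genuine retinal feature), or a
    larger one whose colours lie in the expected fundus range or sit
    close to its neighbours; otherwise treat it as over- or
    under-exposed and discard it."""
    pixels = np.asarray(region_pixels, dtype=float)
    if pixels.size < AREA_THRESHOLD:
        return True  # small feature: always retained
    mean = pixels.mean()
    in_expected_range = FUNDUS_MIN <= mean <= FUNDUS_MAX
    near_neighbours = abs(mean - neighbour_mean) <= tolerance
    return in_expected_range or near_neighbours
```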
  • Operation then passes to step 720, at which the ophthalmoscope determines whether the first component image is in focus. If the image is not in focus, the image is discarded and operation proceeds to step 722, as shown by path 720a, at which the autofocus system of the ophthalmoscope adjusts the focus setting, before operation returns to step 714. If the image is in focus, operation proceeds to steps 724 and 726 at the same time, as shown by paths 720b and 720c. At step 724 the focus setting of the autofocus system is stored, then operation returns to step 714. At step 726 the ophthalmoscope determines whether the image is a first component image in the sense that all previous component images have been discarded at step 716 or 720.
  • If the image is such a first image, operation proceeds to step 728, as shown by path 726a, at which the image data and motion data are stored. If the image is not such a first image, but a subsequent image, operation proceeds to step 730, as shown by path 726b, at which the motion data associated with the subsequent image are used to determine the indication of the offset of the subsequent image relative to the first image, and the image data and data representative of the offset are transmitted to the computer to enable the image data to be correlated with previously captured image data. Operation then proceeds to step 732, at which the ophthalmoscope determines whether the trigger is still operated. If the trigger is still operated, operation returns to step 714, as shown by path 732a.
  • Figure 8a shows a component image 800 of a portion of a retina of an eye captured by the ophthalmoscope 100.
  • the image consists of an underexposed outer region 810, a correctly exposed middle region 820 and an overexposed inner region 830.
  • the underexposed region 810 corresponds to that portion of the retina in the field of view of the ophthalmoscope that is not illuminated sufficiently by the illumination source of the ophthalmoscope.
  • the overexposed region 830 is caused by reflection of light from the illumination source by the cornea of the eye.
  • the correctly exposed region 820 is suitable for use in a composite image of the retina of the eye.
  • Figure 8b shows the component image 800 after excision of the underexposed region 810 and overexposed region 830, to leave the correctly exposed region 820 corresponding to an annular portion of the retina.
  • Figure 9 shows a composite image 900 of a portion of the retina.
  • the composite image is made up of a plurality of annular portions, three of which are denoted by reference numerals 910, 920 and 930, the annular portions being similar to that shown in Figure 8b.
  • reference numeral 940 denotes a region of the composite image representing a continuous portion of the retina, which has been assembled by capturing a sufficient quantity of overlapping images.
  • the peripheral region surrounding region 940 could be rendered continuous simply by capturing more component images of the corresponding region of the retina.

Abstract

Imaging apparatus comprising a portable image capture device (100) operable to capture component images of portions of an image subject (152), and image processing means (200) operable to assemble the component images into a composite image of at least a portion of the image subject (152), wherein the imaging apparatus further comprises sensor means (132) operable to determine a displacement of the portable image capture device (100) between capture of the component images and the image processing means (200) is operable to derive from an output of the sensor means (132) an indication of offset of the portions of the image subject (152) relative to one another or to a datum, and thereby to assemble the component images into the composite image. Also an image capture device (100) for such imaging apparatus and a method of assembling a composite image from component images using such imaging apparatus.

Description

TITLE: IMAGING APPARATUS, PORTABLE IMAGE CAPTURE DEVICE AND
METHOD OF ASSEMBLING COMPOSITE IMAGES FROM COMPONENT IMAGES
Field of the Invention
This invention relates to imaging apparatus comprising a portable image capture device and image processing means operable to assemble component images captured by the image capture device into a composite image, to a portable image capture device, in particular a device for capturing images of portions of a fundus of an eye, for use in such apparatus, and to a method of assembling a composite image from component images.
Background to the Invention
Imaging apparatus comprising a portable image capture device operable to capture first and second component images of respective first and second portions of an image subject, and image processing means operable to assemble the first and second component images into a composite image of at least a portion of the image subject, is known.
US Patent No. 6,454,410, for example, is concerned with a system for mosaicing images of an eye, the system comprising a slitlamp biomicroscope to provide first and second adjacent images of the eye, a camera coupled to the slitlamp biomicroscope to capture the first and second images and a data processor coupled to the camera to process the images into a mosaic representation of the eye.
In order to assemble the component images into the composite image, the image processing means of such known imaging apparatus must carry out considerable image processing of the component images to identify to what extent, if any, the component images overlap, before overlapping the component images by any identified overlap to form the composite image. If no overlap of the component images is identified, the image processing means cannot assemble them into the composite image.
Summary of the Invention
According to a first aspect of the invention there is provided imaging apparatus comprising a portable image capture device operable to capture first and second component images of respective first and second portions of an image subject, and image processing means operable to assemble the first and second component images into a composite image of at least a portion of the image subject, wherein the imaging apparatus further comprises sensor means operable to determine a displacement of the portable image capture device between capture of the first and second component images and the image processing means is operable to derive from an output of the sensor means an indication of offset of the first and second portions of the image subject relative to one another or to a datum, and thereby to assemble the first and second component images into the composite image.
In this specification "image subject" means a scene, an object or a portion of an object. The offset of the first and second portions of the image subject relative to one another or to the datum may be the vertical and/or horizontal displacement of the portions from each other or the datum and/or the rotational orientation of the portions relative to each other or to the datum.
Furthermore, for the purposes of this specification, the term "portable image capture device" encompasses both hand held and head mounted devices, or devices supported by a user in any other way.
The imaging apparatus of the invention can assemble component images of portions of an image subject into wider field of view composite images of the image subject more quickly than known imaging apparatus because the use of the output of the sensor means to derive the indication of the offset of the image portions enables the amount of image processing involved in aligning and registering the component images to be reduced.
Where the first and second images are images of discrete portions of the image subject, the image processing means is preferably operable to shift (and, if necessary, rotate) one of the images by the offset relative to the other image so as to juxtapose the first and second images, such that the composite image consists of two discrete portions of the image subject. This would not be possible using the known imaging apparatus.
Where, on the other hand, the first and second images are images of overlapping portions of the image subject, the image processing means is preferably operable to shift (and, if necessary, rotate) one of the images by the offset relative to the other image so as to combine the first and second images, such that the composite image consists of one continuous portion of the image subject.
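By way of illustration only (this sketch does not form part of the original description), the juxtaposing and combining operations described above might be realised along the following lines in Python, with the component images held as NumPy arrays and the sensor-derived offsets expressed as integer (row, column) pixel positions relative to a common datum; all names here are assumptions introduced for the sketch.

```python
import numpy as np

def place_on_canvas(images, offsets, canvas_shape):
    """Assemble component images onto a shared canvas.

    `offsets` are (row, col) pixel positions derived from the
    sensor output, relative to a common datum; they are assumed
    to be non-negative and to keep each image within the canvas.
    Discrete images are simply juxtaposed; where images overlap,
    later pixels overwrite earlier ones (a real implementation
    would blend the overlap instead).
    """
    canvas = np.zeros(canvas_shape, dtype=np.uint8)
    for img, (r, c) in zip(images, offsets):
        h, w = img.shape[:2]
        canvas[r:r + h, c:c + w] = img
    return canvas
```

Because the offsets come from the sensor means rather than from image content, the two images need not overlap at all for this placement to succeed, which is the point made above.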
Preferably the portable image capture device is operable to capture at least one subsequent component image of a subsequent portion of the image subject, the sensor means is operable to determine a displacement of the portable image capture device between capture of the or each subsequent component image and another component image, and the image processing means is operable to derive from the output of the sensor means an indication of offset of the or each subsequent portion of the image subject relative to another portion of the image subject, and thereby to assemble the or each subsequent component image into the composite image.
Where the imaging apparatus is so operable, the component images may include images of discrete portions and of overlapping portions of the image subject, in which case the image processing means is operable to assemble the composite image by juxtaposing and combining the component images as appropriate.
Preferably, said another component image is the first component image.
The portable image capture device preferably includes a digital image sensor operable to capture image data in the form of pixel values, such as a charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) array, and digital memory means operable to store the image data captured by the digital image sensor.
The portable image capture device may advantageously be provided with display means for displaying an image in a field of view of the image capture device. The portable image capture device may advantageously further be provided with trigger means operable to cause the image capture device to capture an image in the field of view of the device.
A user of the imaging apparatus can use the display means to ensure that a portion of interest of the image subject is in the field of view of the image capture device before operating the trigger means to cause the image capture device to capture an image of the portion of interest.
The trigger means may advantageously further be operable to cause the image capture device to capture a succession of component images at known time intervals.
Although not essential for the operation of the imaging apparatus, in this way a user of the imaging apparatus can move the image capture device relative to the image subject such that each of the succession of component images is of a portion of the image subject that overlaps another portion of the image subject, such that the resulting composite image is of one continuous portion, or even the whole, of the image subject.
Preferably the image processing means is operable to create a composite image which comprises a mosaic of said component images.
Although it is envisaged that the sensor means may be external to the image capture device, for example comprising an array of proximity detectors which are mounted in fixed positions in a room in which the device is to be used, the sensor means preferably forms part of the image capture device.
Preferably the sensor means is responsive to changes in orientation of the image capture device to produce an output from which said indication of offset is derivable.
Preferably the sensor means comprises a velocity sensor operable to determine linear and/or angular velocities of the image capture device. The velocity sensor is preferably one of a plurality of such sensors mounted at spaced apart locations and/or differing orientations on the image capture device to enable changes in orientation of the latter to be determined.
The plurality of velocity sensors are preferably arranged so as to enable angular velocities of the image capture device to be measured about any of three mutually orthogonal axes that pass through the device.
The image capture device may advantageously include an automatic focussing system to enable focussed images to be obtained at any distance in a range of possible distances from the image capture device to the image subject.
The image capture device is preferably operable, at the same time as capturing the second or any subsequent component image, to store displacement data representative of the displacement of the image capture device between capture of that component image and the first component image.
The image processing means can then use the displacement data associated with each component image to assemble the composite image from the component images. Where the component images are of overlapping portions of the image subject, the indication of offset of the portions of the image subject derived from the displacement data can be used by the image processing means to accelerate the operation of identifying the overlap of the portions of the image subject.
Where the component images are of discrete portions of the image subject, the indication of offset of the portions of the image subject derived from the displacement data can be used by the image processing means to bypass the operation of attempting to identify the (nonexistent) overlap of the image subject.
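Purely as an illustrative sketch (the patent does not give an implementation), the way a sensor-derived offset can accelerate overlap identification is to confine the correlation search to a small window around the predicted offset; the window size and function names below are assumptions.

```python
import numpy as np

def refine_offset(ref, new, predicted, search=5):
    """Refine a sensor-predicted (row, col) offset by normalised
    cross-correlation over a small window around the prediction.

    Without the prediction, every plausible offset would have to
    be scored; with it, only (2 * search + 1) ** 2 candidates are
    tried. If no candidate yields any overlap, the prediction is
    returned unchanged, which corresponds to bypassing the overlap
    search for discrete (non-overlapping) component images.
    """
    best, best_score = predicted, -np.inf
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = predicted[0] + dr, predicted[1] + dc
            ref_part = ref[max(r, 0):, max(c, 0):]
            new_part = new[max(-r, 0):, max(-c, 0):]
            h = min(ref_part.shape[0], new_part.shape[0])
            w = min(ref_part.shape[1], new_part.shape[1])
            if h == 0 or w == 0:
                continue  # no overlap at this candidate offset
            a = ref_part[:h, :w].astype(float).ravel()
            b = new_part[:h, :w].astype(float).ravel()
            a -= a.mean()
            b -= b.mean()
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            if denom == 0:
                continue  # featureless overlap: cannot score it
            score = float(a @ b) / denom
            if score > best_score:
                best, best_score = (r, c), score
    return best, best_score
```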
Where the image capture device includes trigger means, the trigger means may advantageously further be operable to cause the image capture device to capture a succession of component images in response to predetermined displacements of the image capture device from an initial position and/or orientation of the device in which the trigger means was first operated.
This feature means that a user of the imaging apparatus need not concern himself that the succession of component images captured by the image capture device are of overlapping portions of the image subject, because the predetermined displacements of the image capture device from the initial position and/or orientation can be chosen to ensure that the succession of component images are of overlapping portions of the image subject.
The image capture device is preferably operable, at the same time as capturing a component image, to store velocity data representative of the rate of change of the displacement of the image capture device at the time of capturing the component image.
In this way the image capture device and/or image processing means can discard any component image that was captured when the device was moving at a velocity greater than a predetermined rate of change of displacement, because the image may be blurred, or the displacement data may be inaccurate because the movement experienced by the sensor lies outside its range of operation.
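A minimal sketch of such a velocity-based rejection test follows; the threshold figure is illustrative and is not taken from the patent.

```python
def is_blur_suspect(angular_rates_dps, max_rate_dps=20.0):
    """Flag a component image whose stored velocity data exceed a
    threshold, so that it can be discarded before assembly.

    `angular_rates_dps` holds the roll/pitch/yaw rates (deg/s)
    sampled while the image was captured; 20 deg/s is purely an
    illustrative limit.
    """
    return any(abs(rate) > max_rate_dps for rate in angular_rates_dps)
```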
The image capture device and/or image processing means may advantageously be operable to divide a component image into a plurality of individual regions, to compare pixel values in neighbouring regions, thereby to identify individual regions of the component image that are unsuitable for assembly into the composite image, and to discard the identified individual regions.
Individual regions of the component image might be unsuitable for use because, for example, they are too dark or too bright, and correspond to portions of the image subject that are under- and over-illuminated, respectively.
This also enables the identification and elimination of pixel values that are erroneously at a maximum value, known as "hot pixels". Preferably the image capture device and/or image processing means is/are operable to compare colours represented by pixel values in neighbouring regions, thereby to identify the individual regions that are unsuitable for assembly into the composite image.
Where the imaging apparatus is to be used for capturing images of a particular type of image subject, the image capture device is preferably operable to compare colours represented by pixel values in individual regions with a range of colours that are expected to be encountered in images of the particular type of image subject, thereby to identify the individual regions that are unsuitable for assembly into the composite image.
For example, in a preferred embodiment of the invention the image capture device is an instrument for use in ophthalmology and is operable to compare colours represented by pixel values in individual regions with a range of colours that are expected to be encountered in images of the fundus of an eye.
Preferably the image capture device and/or image processing means is further operable to determine whether an individual region identified as unsuitable for assembly into the composite image is greater in area than a threshold area, and to discard the identified individual region only if it is determined to be greater in area than the threshold area.
In this way the imaging apparatus can distinguish to some extent between regions of the component image that are too dark or too bright because of incorrect illumination of the image subject, and regions of the component image that correspond to features of the image subject that are, for example, more reflective than their surroundings, and therefore properly appear as a bright region in the component image. It is undesirable to discard such regions from the component images.
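As an illustrative sketch of the region test described above, assuming an RGB image and a precomputed label map from an edge-based segmentation (both assumptions introduced for the sketch):

```python
import numpy as np

def filter_regions(image, labels, expected_lo, expected_hi, area_threshold):
    """Blank out regions that are both larger than the threshold
    area and of a colour outside the expected range.

    `labels` assigns each pixel an integer region id; `expected_lo`
    and `expected_hi` bound the RGB colours expected for the image
    subject. Small regions are kept unconditionally, since they are
    taken to be genuine features rather than exposure artefacts.
    A fuller version would also compare each region's colour with
    that of its neighbouring regions, as described above.
    """
    keep = image.copy()
    for region_id in np.unique(labels):
        mask = labels == region_id
        if mask.sum() <= area_threshold:
            continue  # small region: assumed to be a real feature
        mean_colour = image[mask].mean(axis=0)
        in_range = np.all(mean_colour >= expected_lo) and np.all(mean_colour <= expected_hi)
        if not in_range:
            keep[mask] = 0  # discard over-/underexposed region
    return keep
```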
In the preferred embodiment of the invention, many of the component images of portions of the fundus of an eye captured by the image capture device include a large bright reflection caused by the cornea of the eye. The imaging apparatus is thus operable to discard regions of the captured images corresponding to such corneal reflections, whilst retaining small bright or dark regions corresponding to features of the eye such as blood vessels. Where the image capture device and/or image processing means is operable to discard regions of component images, the image processing means can nevertheless align and register the remaining regions of the component images to form the composite image, using the displacement data associated with the component images.
Moreover, where the component images are similar to one another, but not overlapping, incorrect overlapping of the component images in the composite image can be prevented, because the displacement data associated with component images will indicate that they cannot overlap.
It will be appreciated that portions of the image subject which are omitted from one component image may nevertheless appear in another component image, so that the portions in question may still appear in the composite image.
The image processing means may advantageously further be operable to identify whether a user's movements of the image capture device follow a pattern, and if such a pattern is identified, to use the pattern to predict the approximate displacement of the image capture device from an initial position and/or orientation on the basis of previous displacements of the image capture device from the initial position and/or orientation.
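The patent does not specify how such a pattern would be modelled; one minimal, purely hypothetical predictor simply extrapolates the average of the most recent displacement steps:

```python
def predict_next_offset(offsets, window=3):
    """Predict the next (row, col) displacement from the mean of
    the most recent steps, on the assumption that the user moves
    the device in a roughly regular scanning pattern.
    """
    if len(offsets) < 2:
        return offsets[-1] if offsets else (0, 0)
    recent = offsets[-(window + 1):]
    steps = [(b[0] - a[0], b[1] - a[1]) for a, b in zip(recent, recent[1:])]
    dr = sum(s[0] for s in steps) / len(steps)
    dc = sum(s[1] for s in steps) / len(steps)
    return (offsets[-1][0] + dr, offsets[-1][1] + dc)
```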
The image processing means may advantageously be operable to use common features in component images to position and orientate the component images relative to one another, for example by correlation. In that case the ability to predict the approximate displacement of the image capture device, and hence the approximate offset of one portion of the image subject relative to another portion of the image subject, considerably reduces the amount of processing that must be carried out by the image processing means, because the image processing means can confine its processing to those component images, or even those portions of a component image, that are most likely to have features in common with a particular component image.
The image capture device is preferably provided with illumination means operable to illuminate the image subject or a portion of the image subject of which an image is to be captured by the image capture device.
The illumination means may advantageously comprise an incandescent filament bulb and/or one or more light-emitting diodes (LEDs).
Preferably, at least part of the image processing means is separate from the image capture device. Consequently, the need for the image capture device to be portable does not place any constraint on the size or weight of the image processing means.
This enables the portable image capture device to be smaller and lighter than it would otherwise be, because at least part of the image processing means does not form a part of the portable image capture device.
Preferably the image capture device includes a rechargeable battery and wireless transmission means operable to transmit image data and displacement data to the image processing means.
Where the image capture device includes a rechargeable battery the sensor means may advantageously be operable, if no movement of the device has occurred in a predetermined time, to cause the device to enter a low power consumption mode.
The absence of any movement of the device during a prolonged period, for example one minute, is indicative that the device is not being used and has been laid down without being turned off. The low power consumption mode would typically involve turning off any illumination means, image capture and wireless transmission means of the device.
Where the image capture device includes a rechargeable battery and wireless transmission means, the apparatus may advantageously further comprise a charging cradle that has an electrical connection for connection to the image capture device to recharge the battery of the image capture device. The cradle may advantageously include a further electrical connection for connection to the image capture device to transfer data between the image capture device and the cradle.
Preferably the cradle includes wireless reception means operable to receive image data from the image capture device.
Typically the cradle would be connected to the image processing means by a cable for transmission of image data received from the image capture device to the image processing means.
Preferably, the image capture device includes calibration means for calibrating the sensor means of the image capture device when the latter is placed in the cradle. Since the device is stationary when in the cradle, placing the image capture device in the cradle ensures that the device is in a position in which no significant velocity should be detected, thus facilitating calibration.
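One hedged sketch of such a calibration: with the device at rest in the cradle, the mean sensor reading estimates each gyroscope's zero-rate bias, which can then be subtracted from all subsequent samples. The `read_sensor` callable and the sample count are assumptions for the sketch.

```python
def calibrate_bias(read_sensor, samples=1000):
    """Estimate the zero-rate offset of each of the three
    gyroscopes while the device sits motionless in the cradle.

    `read_sensor` is assumed to return one (roll, pitch, yaw)
    output triple per call.
    """
    readings = [read_sensor() for _ in range(samples)]
    n = len(readings)
    return tuple(sum(r[axis] for r in readings) / n for axis in range(3))
```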
Preferably, the image capture device includes preliminary image processing means for identifying images which, on the basis of the associated velocity data, will be blurred, determining the autofocus and displacement data for each component image, linking autofocus and displacement data to their respective component images, and discarding individual regions of component images identified as being unsuitable for assembly into the composite image, prior to said data being supplied to the image processing means.
Preferably the image capture device is an instrument for use in ophthalmology, for example a direct ophthalmoscope. The image capture device is preferably a handheld device.
Where the image capture device is such an instrument the imaging apparatus preferably includes analysing means for comparing the autofocus measurements obtained during the inspection of the retina of the eye with a set of expected data for an eye to determine whether the eye has any defects associated with an unusual variation of the distance of the retina from the cornea (e.g. a detached retina). The image capture device preferably includes a user interface that allows a user of the device to transmit "yes" and "no" responses to prompts transmitted to the device by the image processing means.
Such prompts might, for example, be questions whether the user wishes to capture further component images or whether a composite image should be stored.
According to a second aspect of the invention there is provided a portable image capture device for imaging apparatus in accordance with the first aspect of the invention.
According to a third aspect of the invention there is provided a method of assembling a composite image from component images, the method comprising the steps of capturing first and second component images of respective first and second portions of an image subject using imaging apparatus including a portable image capture device comprising sensor means operable to determine a displacement of the portable image capture device between capture of the first and second component images, deriving from an output of the sensor means an offset of the first and second portions of the image subject relative to one another or to a datum, and assembling the first and second component images into the composite image using the offset of the first and second portions of the image subject.
Brief Description of the Drawings
The invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of imaging apparatus in accordance with the first aspect of the invention;
Figure 2 is a schematic diagram of a handheld direct ophthalmoscope of the imaging apparatus of Figure 1;
Figure 3 is a functional block diagram of the ophthalmoscope of Figures 1 and 2;
Figure 4 is a functional block diagram of the personal computer of Figure 1;
Figure 5 is a functional block diagram of the cradle of Figure 1;
Figure 6 is a representation of the velocity sensors of the ophthalmoscope of Figures 1 and 2 and the three mutually orthogonal axes about which the velocity sensors are operable to measure rotational velocity of the ophthalmoscope;
Figure 7 is a flow chart of the sequence of operations carried out by the ophthalmoscope when capturing component images;
Figures 8a and 8b are representations of an image of a portion of the fundus of an eye before and after, respectively, discarding individual regions of the image that are unsuitable for assembly into a composite image; and
Figure 9 is a representation of a composite image of a portion of a fundus of an eye assembled by the method of the third aspect of the invention.
Detailed Description of an Embodiment
The imaging apparatus of Figure 1 comprises an image capture device in the form of a portable handheld, battery powered direct ophthalmoscope 100, image processing means in the form of a programmed personal computer 200 and a cradle 300 in which the ophthalmoscope 100 can be placed to recharge a rechargeable battery of the ophthalmoscope. The components of the ophthalmoscope 100 are described in detail with reference to Figure 2.
The programmed personal computer 200 is of standard configuration and includes a processor 210, a monitor 212 and memory 214. The memory 214 includes both volatile memory and non-volatile memory in the form of a hard drive. The volatile memory is used for execution of an image processing program and the hard drive is used to store the image processing program and image data generated by execution of the image processing program. The components of the processor 210 and memory 214 are described in detail with reference to Figure 4.
The components of the cradle 300 are described in detail with reference to Figure 5.
Turning to Figure 2, the ophthalmoscope 100 comprises a housing 110 containing a beamsplitter 112, a corrective lens system 114, an illumination mirror 116, an aperture/graticule/colour filter assembly 118, an illumination lens 120, an illumination point source 122, an auto focus system 124, a CCD array lens 126, a CCD contrast enhancement filter 128, a CCD array 130, an orientation sensing system 132, a microprocessor 134, an encoder 136, a wireless transmitter 138 and a rechargeable battery 140. The exterior of the housing 110 is provided with a user interface denoted generally by reference numeral 142, a set of battery charging contacts 144 and a set of data transfer contacts 146.
The housing 110 is adapted to prevent light from entering inappropriate areas of the interior of the ophthalmoscope and has an internal surface of a suitable colour and texture to limit unwanted reflections. The housing 110 comprises a handle portion 148 which is topped by an elongate body portion 150 that extends substantially horizontally with the ophthalmoscope in its normally intended attitude of use. In use, the ophthalmoscope is positioned with the front end of the body portion 150 facing the eye to be examined (denoted by reference numeral 152 in Figure 2). An image of that eye can be directly viewed by the user (whose eye has been denoted by the reference numeral 154).
The body portion 150 of the housing contains a set of illumination optics, constituted by the illumination point source 122, aperture/graticule/colour filter assembly 118, illumination lens 120 and illumination mirror 116 for projecting a beam 156 of illuminating light into the eye 152. The body portion 150 of the housing also contains a set of viewing optics, constituted by the corrective lens system 114 and beamsplitter 112 through which an image of the eye 152 can be viewed. The illumination and imaging optics are, in essence, identical with the illumination and imaging optics of known designs of direct ophthalmoscope, such as the "Professional" ophthalmoscope made by Keeler Limited of the United Kingdom.
The illumination point source 122 takes the form of an incandescent bulb connected to an on-board power supply for supplying a variable voltage to the incandescent bulb in order to vary the intensity of white light emitted by the latter. Situated above the source 122 is the illumination lens 120, which causes light from the source 122 to illuminate uniformly a variable aperture in the aperture/graticule/colour filter assembly 118.
The assembly 118 has been shown in a simplified form, but does include a wheel that, as with known direct ophthalmoscopes, provides a blue or green filter to give a "red-free" illumination for contrast enhancement, graticules to provide for a "target" image within the light beam, and differently sized apertures to produce different diameters of light beams. This wheel is rotatably mounted in the handle. The wheel can be rotated by means of a control forming part of the user interface denoted generally by reference numeral 142 to bring any selected one of the apertures or colour filters into registry with the illumination lens 120 so as to provide control of the size of the spot of illuminating light projected into the eye 152.
The illumination mirror 116 of the illumination optics is mounted in the body portion 150 of the housing in a position spaced from the viewing axis 158 of the viewing optics. Light from the illumination source 122 is reflected off the illumination mirror 116 in the direction of the eye 152 to be examined, with the path 156 of the light slightly angled towards the viewing axis 158.
The corrective lens system 114 consists of an array of corrective lenses arranged in a circular pattern and hence referred to as a "lenswheel", for compensating for the combined refractive error of the eye 154 of the user and eye 152 to be examined in a manner familiar from known ophthalmoscopes.
The beamsplitter 112 comprises a partially silvered mirror, which allows a portion of the light reflected through the corrective lens system along the viewing axis 158 from the eye 152 to be examined to reach the eye 154 of the user, thereby allowing the eye 154 of the user to view an image of the retina of the eye 152 to be examined, whilst also reflecting an image of the retina of the eye 152 along an image capture axis 160 and into the CCD array 130 via the autofocus system 124, CCD array lens 126 and CCD contrast enhancement filter 128.
The autofocus system 124 focuses the image of the retina of the eye 152 onto the CCD array 130 and is needed to compensate for the change in focal length of the eye 152 experienced depending on the viewing angle into the eye 152. This change in focal length would normally be compensated for by the eye 154 of the user or by using a different corrective lens from the lenswheel; the autofocus system is provided for this function. The autofocus system 124 makes use of a known frequency detection technique whereby the sharper the edges in an image are, the more in focus the image is. The autofocus system uses known electromechanical means to carry out a "dithering" (i.e. small displacement, high frequency, rapidly oscillating) movement in addition to its corrective repositioning. The "dithering" motion is used in conjunction with frequency detection feedback to determine which direction the autofocus system should move (i.e. whether it is moving in or out of focus as it oscillates back and forth). Once the correct direction has been determined during the "dithering" motion, it will move rapidly to a point that provides for an in-focus image, again in conjunction with frequency detection feedback. The frequency detection process is carried out by the microprocessor 134.
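An illustrative sketch of this kind of focus search (not the patent's implementation): image sharpness is scored by its high-frequency content, here approximated by gradient energy, and the dither compares the score on either side of the current focus position to pick a direction. The `capture_at` callable, returning an image for a given focus position, is an assumption for the sketch.

```python
import numpy as np

def sharpness(image):
    """Gradient-energy focus metric: sharper edges mean more
    high-frequency content and therefore a larger score."""
    gy, gx = np.gradient(image.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def dither_direction(capture_at, position, step=0.01):
    """Decide which way to drive the focus element by comparing
    sharpness either side of the current position, mimicking the
    "dithering" search described above."""
    ahead = sharpness(capture_at(position + step))
    behind = sharpness(capture_at(position - step))
    return +1 if ahead > behind else -1
```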
A sensor (not shown) is used to determine the rotary position of the lenswheel in order to determine which corrective lens is engaged at any one time. This information is used by the autofocus system so that any movement of the lenswheel causes a corresponding adjustment by the autofocus system.
The CCD array 130 is operable to capture a succession of images of at least part of the retina of the eye 152 to be examined. The CCD array is covered by the contrast enhancement filter 128, whose spectral characteristics alter the spectral profile of the light striking the surface of the CCD array in such a way that certain wavelength ranges are reduced in intensity in order to improve image contrast. One effect of the filter 128 that improves the contrast of an image is the reduction in scattered light within a particular wavelength range reflected from the retina. Another effect is the removal of longer-wave infrared light, which may otherwise cause blurred images by penetrating into the material of the CCD array to such an extent that the light crosses pixel boundaries.
The orientation sensing system 132 comprises an array of velocity sensors, which are described in detail in relation to Figure 6 and are sensitive to changes in orientation of the ophthalmoscope in the form of rotations about any of three mutually perpendicular axes. The output of the orientation sensing system 132 is therefore representative of changes in orientation of the ophthalmoscope. This output is sent to the microprocessor 134 and combined with the image data captured following any such changes in orientation of the ophthalmoscope. This combined data is supplied to the encoder 136 and wireless transmitter 138 of the ophthalmoscope, which transmits the encoded combined data to the cradle 300 via a wireless link (denoted in Figure 1 by reference numeral 162).
The aperture/graticule/colour filter assembly 118 is, as with known designs of direct ophthalmoscope, operable to project a graticule onto the retina of the eye 152 to be examined.
Turning to Figure 3, this shows the electrical and electronic subsystems of the ophthalmoscope 100, which are an optical subsystem 164, an illumination subsystem 166, a communication subsystem 168, an imaging subsystem 170 and a power supply subsystem 172.
The optical subsystem 164 comprises the aperture/graticule/colour filter assembly (denoted by reference numeral 118 in Figure 2), which is made up of selectable apertures 118a, selectable graticules 118b and red-free colour filter 118c, together with corrective lens system 114, beamsplitter 112, autofocus system 124 and CCD array lens 126. The illumination subsystem 166 comprises the illumination point source 122 and illumination controller 142a, which controls the brightness of the point source 122. The communication subsystem 168 comprises the wireless transmitter 138 and data transfer contacts 146, together with a buffer memory 174 and a data port 176 adapted to receive a removable memory card 178. The imaging subsystem 170 comprises the image capture trigger 142b, CCD array 130, CCD contrast enhancement filter 128 and microprocessor 134, which is operable to carry out image pre-processing on captured images.
The power supply subsystem 172 comprises the charging contacts 144, rechargeable battery 140 and a sleep mode controller 180. The power supply subsystem 172 supplies power to the subsystems 164, 166, 168 and 170. The charging contacts 144 make electrical contact with corresponding contacts (denoted by reference numeral 344 in Figure 5) on the cradle 300 and thus enable the rechargeable battery 140 to be recharged. The power supply subsystem 172 supplies power to the illumination source 122 through the illumination controller 142a, which is controlled via the user interface 142 of the ophthalmoscope and varies the amount of power supplied to the illumination source 122 (and hence the brightness of the latter).
The sleep mode controller 180 is connected to the output of the orientation sensing system 132. The sleep mode controller monitors the output of the orientation sensing system 132 and, if the ophthalmoscope has been stationary for more than a predetermined period of time, switches the ophthalmoscope to a power saving mode by deactivating the supply of power to the illumination source 122, imaging subsystem 170 and the communication subsystem 168.
The image capture trigger 142b of the imaging subsystem 170 is operable by the user to cause the microprocessor 134 to begin combining image data from the CCD array 130 with orientation data from the orientation sensing system 132. The image data captured by the CCD array 130 is processed by the microprocessor 134 before being combined with the orientation data to identify captured images that are unsuitable for assembly into a composite image. Such unsuitable images are discarded.
It should be noted, however, that the computer 200 could perform the same pre-processing (as well as the remaining image processing/construction involved in preparing the composite image or images) since the computer 200 can receive image and orientation data directly from the ophthalmoscope by means of the memory card 178. Use of the memory card to transfer image and orientation data directly to the computer 200 is described in greater detail below. Similarly, a microprocessor located in the cradle 300 could perform the preprocessing of the image data instead of the microprocessor 134 of the ophthalmoscope.
The sleep mode controller 180 deactivates the ophthalmoscope if the latter is not moved for a period of at least one minute. This period is adjustable via the user interface 142 of the ophthalmoscope. As soon as the ophthalmoscope is moved again, for example by being picked up, this movement is detected by the orientation sensing system 132 and power is supplied as normal to all subsystems of the ophthalmoscope.
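A hedged sketch of the sleep-mode logic described above; the stillness threshold and the polling interface are assumptions, while the one-minute default matches the adjustable period mentioned in the text.

```python
import time

class SleepModeController:
    """Switch the device to a low-power mode when the orientation
    sensors report no movement for `timeout` seconds, and wake it
    as soon as movement is detected again."""

    def __init__(self, timeout=60.0, still_threshold=0.5):
        self.timeout = timeout                   # seconds without movement
        self.still_threshold = still_threshold   # deg/s regarded as "still"
        self.last_motion = time.monotonic()

    def update(self, angular_rates_dps):
        """Call periodically with the latest sensor rates; returns
        the power state the device should be in."""
        if any(abs(r) > self.still_threshold for r in angular_rates_dps):
            self.last_motion = time.monotonic()
            return "awake"
        if time.monotonic() - self.last_motion > self.timeout:
            return "sleep"  # power down illumination, imaging, wireless
        return "awake"
```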
Figure 3 also illustrates at 182 an option of feedback control of the illumination source 122 by the imaging subsystem 170. The imaging subsystem 170 can increase or decrease the illumination provided by the illumination source 122 dependent upon whether the images captured by the CCD array 130 are underexposed or overexposed, whilst differentiating between areas of interest in the images and the saturated areas typical of, for example, a glare spot due to corneal reflection.
The memory card 178 provides removable storage of image data captured by the CCD array 130, and also provides an extended buffer memory in the event of the ophthalmoscope being unable to transmit data to the cradle 300, for example because the ophthalmoscope has been used in an environment where RF noise interrupts the wireless link 162 with the cradle, or where the ophthalmoscope is used at a greater distance from the cradle than the range of the wireless transmitter 138.
The memory card 178 can be removed from the ophthalmoscope and connected to a card reader 216 connected to the computer 200 to transfer data from the ophthalmoscope to the computer if, for any reason, wireless transmission of data is not possible.
In use, light 156 from the illumination source 122 passes through the variable aperture/graticule/colour filter assembly 118 and the illumination lens 120 and is reflected by the mirror 116 into the eye 152 to be examined. Light reflected from the eye 152 passes through the corrective lens system 114 to the beamsplitter 112, which directs a portion 158 of the light reflected from the eye 152 to the eye 154 of the user, and the remainder 160 of the light to the CCD array 130 via the autofocus system 124 and CCD contrast enhancement filter 128.
Operation of the trigger 142b is sensed by the microprocessor 134, which begins to store image data captured by the CCD array and to encode associated orientation data into each image. This combined image and orientation data is transmitted to the encoder 136. The image data is representative of a portion of the eye 152 to be examined at which the ophthalmoscope is pointed when the trigger 142b is operated, and the orientation data is representative of a displacement of the ophthalmoscope from an initial orientation when the trigger 142b is first operated.
Turning to Figure 4, the computer 200 includes a cradle link subsystem 218, an image processing subsystem 220 and a user interface subsystem 222. The cradle link subsystem 218 includes a cable port 224 and a buffer memory 226. The image processing subsystem 220 includes relative positioning 228, correlation 230, blending 232 and global registration 236 software components. The user interface subsystem 222 includes feedback 238, patient records 240, parameter setting 242, correlation 244, image viewing 246 and system diagnostics 248 software components.
The cable port 224 of the cradle link subsystem 218 is connected to a cable port 324 of a computer link subsystem 318 of the cradle 300 by a cable. The computer 200 further includes a temporary memory 250 via which data is transmitted from the cradle link subsystem 218 to the image processing subsystem 220, and quasi-permanent image storage 252 and permanent image storage 254. Data transmitted to the computer 200 via the cradle link subsystem 218 is held in the temporary memory 250. The buffer memory 226 of the cradle link subsystem 218 buffers data when there are delays in the image processing and/or user interface subsystem accepting data from the cradle link subsystem.
The image processing subsystem carries out the image processing operations of relative positioning, i.e. positioning of individual component images relative to one another; correlation, i.e. comparison of component images, based on the offset of the images derived from the displacement data, to determine an exact offset of the images relative to one another; blending, i.e. manipulation of the overlapping regions of component images to give the appearance of a single continuous image; and global registration, i.e. positioning of the composite image relative to a datum, for example so that, when the composite image is displayed, the top of the image corresponds to the top of the imaged portion of the retina of the eye 152. Completed composite images are stored in the permanent image storage 254. The memory card reader 216 enables image and orientation data to be transferred directly from the ophthalmoscope 100 to the computer 200 rather than via the cradle 300.
The image processing subsystem 220 is further operable, if the orientation data associated with an image indicate that it overlaps a previously received image and correlation of the images does not identify the overlap, to correlate scaled-up and scaled-down versions of the image with the previously received image so as to identify whether the magnification of the image is different from that of the previously received image. Such a change of magnification can occur as a result of movement of the ophthalmoscope relative to the cornea of the eye 152, because of the curved shape of the cornea.
If correlation of the scaled version of the image with the previously received image identifies the overlap, the scaled version of the image is assembled into the composite image.
This process of scaling up or scaling down an image is likely to be necessary where the corrective lens system 114 of the ophthalmoscope is altered and at the start of a sequence of captured images, because the distance from the ophthalmoscope to the retina is likely to change slightly between capture of a first sequence of component images and a subsequent sequence of component images.
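The scaled correlation described above might be sketched as follows (illustrative only): a few candidate scales are tried, each scaled image is correlated against the previous image using the `refine_offset` helper sketched earlier, and the best-scoring scale wins. Nearest-neighbour resampling keeps the sketch free of dependencies beyond NumPy; the candidate scales are assumptions.

```python
import numpy as np

def best_scale(ref, new, predicted_offset, scales=(0.95, 1.0, 1.05)):
    """Correlate scaled-up and scaled-down versions of a new
    (greyscale) component image against the previous one and
    return the (score, scale, offset) of the best match."""
    results = []
    for s in scales:
        h, w = new.shape
        rows = np.clip((np.arange(int(h * s)) / s).astype(int), 0, h - 1)
        cols = np.clip((np.arange(int(w * s)) / s).astype(int), 0, w - 1)
        scaled = new[np.ix_(rows, cols)]  # nearest-neighbour rescale
        offset, score = refine_offset(ref, scaled, predicted_offset)
        results.append((score, s, offset))
    return max(results)
```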
The image processing subsystem 220 is still further operable to warp the composite images that are assembled to counteract the curvature of the retina and enable the retina to be rendered accurately as a flat surface on the monitor 212 of the computer.
The field of view of the ophthalmoscope is relatively narrow, such that the degree of warping required for a particular image is relatively constant across the image.
The warping is carried out by applying differing amounts of warping to overlapping images and correlating the warped images to identify the amount of warping for each image that gives the best correlation with its neighbouring images. This gives the most accurate rendition of the retina as a flat surface.
The user interface subsystem 222 enables the user to receive information from the computer. Such information might for example include a warning generated by the computer in response to determination that the autofocus settings generated by the autofocus system of the ophthalmoscope do not fall within a range of expected settings for a healthy eye, so as to alert the user to the possibility of a detached retina in the eye 152. The user interface subsystem also enables the user to view completed composite images, to alter system settings such as the number of times an image is captured of any given portion of the retina of the eye 152, i.e. the extent to which oversampling takes place, and to communicate with external systems such as a patient record database.
The patient records software component 240 enables the user interface subsystem to organise patient records and to store composite images of eyes as part of such patient records. The system diagnostics software component 248 enables the user interface subsystem to carry out a diagnostic procedure to determine whether the ophthalmoscope is functioning correctly, and to alert the user if a fault is detected.
Turning to Figure 5, the cradle 300 comprises a support surface 310, which is of complementary shape to the base of the handle portion 148 of the housing 110 of the ophthalmoscope, and has charging contacts 344 for engagement with the charging contacts 144 of the ophthalmoscope, such that the rechargeable battery 140 of the ophthalmoscope is recharged when the ophthalmoscope is placed in the cradle, and data transfer contacts 346 for engagement with the data transfer contacts 146 of the ophthalmoscope such that data can be transmitted between the cradle 300 and the ophthalmoscope 100.
The cradle 300 further comprises a wireless link subsystem 312, a microprocessor 314, a display subsystem 316, the computer link subsystem 318 and a power supply subsystem 320. The wireless link subsystem 312 includes a buffer memory 322, a radio frequency transceiver 326 and a decoder 328, the radio frequency transceiver 326 being adapted to receive encoded image and orientation data from the transmitter 138 of the ophthalmoscope, and the decoder 328 being adapted to decode the image and orientation data. The transceiver 326 of the cradle also enables additional data and commands (for example diagnostic data and image storage commands) to be transmitted between the ophthalmoscope and the cradle. The buffer memory 322 provides an initial store for data received from the ophthalmoscope.
The microprocessor 314 is operable to control the subsystems of the cradle and, if necessary, to carry out pre-processing of the image data, such as discarding images that are blurred because the ophthalmoscope was moving too fast as the image was captured. The display subsystem 316 includes a display screen 330 that can display information such as whether the cradle is charging the battery of the ophthalmoscope, whether the ophthalmoscope is within wireless transmission range of the cradle, and whether transmission of data between the cradle and the ophthalmoscope is in progress.
The computer link subsystem 318 includes the cable port 324 and buffer memory 332. The computer link subsystem 318 is connected to the cable port 224 of the cradle link subsystem 218 of the computer 200 by a cable.
The power supply subsystem 320 comprises a mains supply inlet 334 for connection to mains supply outlet 340, a transformer 336 and a battery charging controller 338. A rectifier (not shown) is used in conventional fashion to provide DC power to the other subsystems of the cradle and to the rechargeable battery 140 of the ophthalmoscope 100.
Image data and orientation data received by the transceiver 326 of the cradle are transmitted via the cable port 324 by a cable to the cable port 224 of the computer 200.
When two component images have been received by the computer, they are aligned with respect to their relative position data, which is obtained from the orientation data supplied with the image data. Correlation algorithms are then used to search for the exact match within the overlapping portions of the two images. Subsequent images are similarly aligned until the matching fails due to, for example, large eye movements or blinking of the eye 152. The composite image is saved within either the hard drive or volatile memory and a new composite image is started. This continues until the trigger 142b of the ophthalmoscope is released and a signal is sent via the cradle to the computer to begin final image processing.
In a similar manner to the component images, such composite images are themselves registered so as to form a single large composite image. Algorithms using, for example, averaging routines, are used to blend the overlaid images together such that boundaries between individual component images are eliminated or at least reduced in visual impact.
When the user has finished a first image capture session, by looking at the computer monitor 212 he can quickly see from the composite image which portions of the eye 152 have not yet been incorporated into the composite image, and move the ophthalmoscope as required to bring those portions into the field of view of the ophthalmoscope. When those portions of the eye 152 are in the field of view of the ophthalmoscope, as seen by the eye 154 of the user through the ophthalmoscope, the user operates the trigger 142b, the ophthalmoscope transmits the combined image and displacement data to the cradle 300 and the computer 200 incorporates the new composite image of the missing portion of the eye into the composite image displayed on the monitor 212.
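Drawing the above together, a purely illustrative outline of the assembly loop (not the patent's code): each component image arrives with its sensor-derived offset, the offset is refined by correlation against the running composite using the `refine_offset` helper sketched earlier, and overlaps are blended by running averages so that boundaries between component images are reduced in visual impact. Offsets are assumed to be non-negative integer pixel positions within the canvas.

```python
import numpy as np

def assemble(components, canvas_shape):
    """Blend (image, sensor_offset) pairs into one composite.

    On the first image the composite is empty, so correlation
    cannot score any candidate and `refine_offset` falls back to
    the sensor-derived offset, as described in the text."""
    acc = np.zeros(canvas_shape, dtype=float)  # sum of pixel values
    cnt = np.zeros(canvas_shape, dtype=int)    # samples per pixel
    for image, offset in components:
        composite = acc / np.maximum(cnt, 1)
        (r, c), _ = refine_offset(composite, image.astype(float), offset)
        h, w = image.shape
        acc[r:r + h, c:c + w] += image
        cnt[r:r + h, c:c + w] += 1
    return (acc / np.maximum(cnt, 1)).astype(np.uint8)
```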
Turning to Figure 6, the orientation sensing system 132 comprises first, second and third angular rotation sensors 132a, 132b and 132c respectively. The angular rotation sensors are ADXRS150 micro-machined gyroscopes made by Analog Devices of the United States of America. Each sensor is operable to measure rotation about an axis that passes through the sensor. The sensors 132a, 132b and 132c are arranged such that the axis about which each measures rotation is perpendicular to the axes of the other two sensors.
The first sensor 132a is arranged such that when the ophthalmoscope 100 is in its attitude of normal use, i.e. substantially upright, the axis 190 about which the sensor measures rotation is a horizontal line parallel to a line connecting the eyes 152 and 154. The second sensor 132b is arranged such that the axis 192 about which the sensor measures rotation is another horizontal line perpendicular to the axis 190. The third sensor 132c is arranged such that the axis 194 about which the sensor measures rotation is a vertical line perpendicular to the axes 190 and 192. For clarity of explanation, the axis 190 will be referred to as the roll axis, the axis 192 will be referred to as the pitch axis, and the axis 194 will be referred to as the yaw axis.
In order to calculate angular position about one of the axes, the output signal of the relevant sensor, for example first sensor 132a for the roll axis, which is related to angular velocity of the sensor about the axis, is integrated with respect to time from a starting point. The starting point is the vertical position of the ophthalmoscope when it is placed in the cradle 300 in the correct manner. If it is assumed that in this position the ophthalmoscope is vertical, then the yaw axis can be assumed to be vertical, from which it follows that the pitch and roll axes are horizontal.
Although it is the displacement of the ophthalmoscope between capture of a first image and subsequent images that is used to align and register the subsequent images relative to the first image to form the composite image, it is useful to measure the displacement of the ophthalmoscope relative to the starting point because this enables the computer to align and register one such composite image with another composite image, and enables the computer to ensure that the top of the composite image corresponds to the uppermost portion of the eye 152.
The direction in which the ophthalmoscope is facing can be considered to be a vector combination of the roll, pitch and yaw axes, with the vector projecting in varying amounts along each axis depending on the direction in which the ophthalmoscope is facing.
Using trigonometric methods, the angular position about any of the three axes can be used to calculate how far along the other two axes the vector combination projects. Assuming that the ophthalmoscope has been calibrated such that the yaw axis is vertical, the values of the components of the projecting vector resolved along the axes are used to obtain the indication of the offset of the portions of the image subject in the following manner.
For rotation of the ophthalmoscope about the pitch axis, the view of the retina of the eye 152 would appear to move up or down to a user of the ophthalmoscope and the value of the vector component along the yaw axis would be used to obtain the indication of the offset.
For rotation of the ophthalmoscope about the yaw axis, the view of the retina of the eye 152 would appear to move left or right to the user of the ophthalmoscope and the value of the vector component along the pitch axis would be used to obtain the indication of the offset.
For rotation of the ophthalmoscope about the roll axis, the view of the retina of the eye 152 would appear to rotate clockwise or anticlockwise to the user of the ophthalmoscope and the measured rotation of the ophthalmoscope about the roll axis would be used to obtain the indication of the offset using trigonometry.
The voltage output of each sensor is converted to an angular velocity and the change in orientation over a particular period of time is calculated by integration. The integration process is accomplished by sampling the voltage output of each sensor at 1 ms intervals and summing the sampled outputs, scaled by the sampling period, to obtain the angular displacement of the ophthalmoscope from an initial orientation every 1 ms. The angular displacement of the ophthalmoscope from the initial orientation is stored for the particular 1 ms interval during which a component image is captured.
The change in angular position of the ophthalmoscope about the roll axis corresponds to the difference in angular orientation of a later composite image relative to a composite image captured with the ophthalmoscope in the initial orientation. It is a simple matter to rotate the later composite image relative to the first composite image to remove the difference in orientation. The change in angular position of the ophthalmoscope about the pitch and yaw axes corresponds to the difference in horizontal and vertical position of the later component image relative to the component image captured with the ophthalmoscope in the initial orientation. The angle of rotation of the ophthalmoscope about each of the pitch and yaw axes is converted to a number of pixels by which the later component image is to be displaced vertically and/or horizontally relative to the first component image. The CCD array 130 of the ophthalmoscope comprises 640 pixels by 480 pixels and has a field of view of 8 degrees. Each degree of rotation of the ophthalmoscope about the yaw axis therefore corresponds to a horizontal displacement of the later component image relative to the first component image of 80 pixels.
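Using the figures given above (1 ms sampling, a 640-pixel row spanning an 8 degree field of view, hence 80 pixels per degree), the conversion from integrated sensor output to a pixel offset might be sketched as follows; the function names are assumptions, and the same scale factor is assumed for pitch, which the text does not state explicitly.

```python
SAMPLE_PERIOD_S = 0.001          # 1 ms sampling interval
PIXELS_PER_DEGREE = 640 / 8.0    # 640-pixel row over an 8 degree field

def integrate_rate(rate_samples_dps):
    """Rectangle-rule integration of sampled angular velocity
    (deg/s) into an angular displacement (degrees)."""
    return sum(rate_samples_dps) * SAMPLE_PERIOD_S

def rotation_to_pixels(yaw_degrees, pitch_degrees):
    """Convert yaw/pitch rotations into the (horizontal, vertical)
    pixel shift of a later component image relative to the first:
    one degree corresponds to 80 pixels at the stated field of view."""
    return (yaw_degrees * PIXELS_PER_DEGREE, pitch_degrees * PIXELS_PER_DEGREE)
```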
It will be appreciated that when the ophthalmoscope is used to capture component images of the retina of an eye, the images must be captured through the pupil of the eye. The ophthalmoscope is therefore used as close to the eye as possible so that each image captured is of as large a portion of the retina of the eye as possible, and as large a region of the retina as possible can be imaged. This means that the displacements of the ophthalmoscope between capture of the first component image and later component images are predominantly rotational rather than linear displacements of the ophthalmoscope, because such linear displacements would either (in the case of linear displacement along the roll axis) increase the distance between the ophthalmoscope and the pupil of the eye, so reducing the size of the portion of the retina captured by each image, or (in the case of linear displacement along the pitch or yaw axes) bring the ophthalmoscope out of alignment with the pupil, so that the retina can no longer be imaged through the pupil. The ophthalmoscope of the invention therefore does not require the linear motion sensors that might be required in other imaging applications within the scope of the present invention.
Figure 7 shows the initial image processing steps that are carried out by the ophthalmoscope in response to operation of the trigger 142b. At step 710 the ophthalmoscope is switched on, for example as a result of the user picking up the ophthalmoscope, and operation moves to step 712. At step 712 the ophthalmoscope waits until the trigger 142b is operated, and operation moves to step 714, at which the ophthalmoscope captures image data representative of a component image, and motion data representative of any motion of the ophthalmoscope between operation of the trigger and capture of the image, after which operation moves to step 716. At step 716 the ophthalmoscope determines from the motion data associated with the image data whether the image is blurred due to movement of the ophthalmoscope during capture of the image. If the image is blurred, the image is discarded and operation returns to step 714, as shown by path 716a, and a new component image is captured. If the image is not blurred, operation proceeds to step 718, as shown by path 716b, at which non-useful regions of the first component image are discarded.
The method of identifying and discarding non-useful regions of the first component image is as follows.
The first component image is divided into individual regions by identifying the edges of such regions, where the colours represented by the pixel values of the image change abruptly. For each individual region the area of the region is determined. If the area of the region is smaller than a threshold area, indicating that the region corresponds to a feature of the retina rather than an over- or underexposed region of the component image, the region is retained in the component image.
Otherwise, the colours represented by the pixel values of each region are compared with a predetermined range of colours expected to be encountered in an image of the retina of an eye and with the colours represented by the pixel values of the neighbouring regions. If the colours of the region do not match any of the predetermined colours and differ by more than a threshold amount from the colours of the neighbouring regions, which indicates that the region corresponds to an over- or underexposed region of the component image, the region is discarded from the component image. Otherwise the region is retained in the component image.
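A minimal sketch of this two-stage test follows; the numerical thresholds, the expected colour range and the helper functions are assumptions made for illustration, not values disclosed by the invention:

```python
from dataclasses import dataclass

@dataclass
class Region:
    area: int                                 # region size in pixels
    mean_colour: tuple[float, float, float]   # mean RGB of the region

# Illustrative per-channel (low, high) bounds for colours expected in a
# retinal image; the patent specifies only that such a range exists.
EXPECTED_RETINAL_RANGE = ((80, 255), (20, 160), (0, 120))

def matches_expected_retinal_colours(colour) -> bool:
    return all(lo <= c <= hi
               for c, (lo, hi) in zip(colour, EXPECTED_RETINAL_RANGE))

def colour_distance(a, b) -> float:
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def keep_region(region: Region, neighbours: list[Region],
                area_threshold: int = 500,
                colour_diff_threshold: float = 40.0) -> bool:
    """Retain a region unless it looks over- or underexposed: large,
    unlike any expected retinal colour, and unlike its neighbours."""
    if region.area < area_threshold:
        return True    # small regions are taken to be retinal features
    if matches_expected_retinal_colours(region.mean_colour):
        return True
    # Discard only if the region also differs from all of its
    # neighbouring regions by more than the threshold amount.
    return any(colour_distance(region.mean_colour, n.mean_colour)
               <= colour_diff_threshold for n in neighbours)
```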
Operation then passes to step 720, at which the ophthalmoscope determines whether the first component image is in focus. If the image is not in focus, the image is discarded and operation proceeds to step 722, as shown by path 720a, at which the autofocus system of the ophthalmoscope adjusts the focus setting, before operation returns to step 714. If the image is in focus, operation proceeds to steps 724 and 726 at the same time, as shown by paths 720b and 720c. At step 724 the focus setting of the autofocus system is stored, then operation returns to step 714. At step 726 the ophthalmoscope determines whether the image is a first component image in the sense that all previous component images have been discarded at step 716 or 720. If the image is such a first image, operation proceeds to step 728, as shown by path 726a, at which the image data and motion data are stored. If the image is not such a first image, but a subsequent image, operation proceeds to step 730, as shown by path 726b, at which the motion data associated with the subsequent image are used to determine the indication of the offset of the subsequent image relative to the first image, and the image data and data representative of the offset are transmitted to the computer to enable the image data to be correlated with previously captured image data. Operation then proceeds to step 732, at which the ophthalmoscope determines whether the trigger is still operated. If the trigger is still operated, operation returns to step 714, as shown by path 732a. If the trigger is no longer operated, operation returns to step 712, as shown by path 732b. Note that step 712 has, for the purpose of clarity, been reproduced at the bottom of the flowchart.

Turning finally to Figures 8 and 9, Figure 8a shows a component image 800 of a portion of a retina of an eye captured by the ophthalmoscope 100. The image consists of an underexposed outer region 810, a correctly exposed middle region 820 and an overexposed inner region 830. The underexposed region 810 corresponds to that portion of the retina in the field of view of the ophthalmoscope that is not illuminated sufficiently by the illumination source of the ophthalmoscope. The overexposed region 830 is caused by reflection of light from the illumination source by the cornea of the eye. The correctly exposed region 820 is suitable for use in a composite image of the retina of the eye.
Figure 8b shows the component image 800 after excision of the underexposed region 810 and overexposed region 830, to leave the correctly exposed region 820 corresponding to an annular portion of the retina.
Figure 9 shows a composite image 900 of a portion of the retina. The composite image is made up of a plurality of annular portions, three of which are denoted by reference numerals 910, 920 and 930, the annular portions being similar to that shown in Figure 8b. It will be appreciated that although each component image is generally annular, and does not include the portion of the retina corresponding to the overexposed region of the image, nevertheless a continuous image of the retina can be assembled if sufficient overlapping component images are captured. Reference numeral 940 denotes a region of the composite image representing a continuous portion of the retina, which has been assembled by capturing a sufficient quantity of overlapping images. The peripheral region surrounding region 940 could be rendered continuous simply by capturing more component images of the corresponding region of the retina.
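The assembly can be pictured as pasting each surviving annular region into a larger canvas at its derived pixel offset. The following NumPy sketch is illustrative only; it assumes each component arrives with a boolean mask marking its correctly exposed pixels (cf. Figure 8b) and that the offsets keep every component within the canvas:

```python
import numpy as np

def assemble_composite(components, canvas_size=(2000, 2000)):
    """Paste component images into a composite canvas.

    `components` is a sequence of (image, mask, (dx, dy)) tuples, where
    `image` is an (h, w, 3) array, `mask` is an (h, w) boolean array that
    is True for the retained annular pixels, and (dx, dy) is the pixel
    offset of the component relative to the first component image.
    """
    canvas = np.zeros((*canvas_size, 3), dtype=np.uint8)
    cy, cx = canvas_size[0] // 2, canvas_size[1] // 2
    for image, mask, (dx, dy) in components:
        h, w = image.shape[:2]
        top, left = cy + dy - h // 2, cx + dx - w // 2
        window = canvas[top:top + h, left:left + w]
        window[mask] = image[mask]    # later components overwrite overlaps
    return canvas
```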
It will be apparent that the above description relates only to one embodiment of the invention, and that the invention encompasses other embodiments as defined by the foregoing statements of the invention.

Claims
1. Imaging apparatus comprising a portable image capture device operable to capture first and second component images of respective first and second portions of an image subject, and image processing means operable to assemble the first and second component images into a composite image of at least a portion of the image subject, wherein the imaging apparatus further comprises sensor means operable to determine a displacement of the portable image capture device between capture of the first and second component images and the image processing means is operable to derive from an output of the sensor means an indication of offset of the first and second portions of the image subject relative to one another or to a datum, and thereby to assemble the first and second component images into the composite image.
2. Imaging apparatus according to claim 1, wherein the portable image capture device is operable to capture first and second component images of first and second discrete portions of the image subject, and the image processing means is operable to shift (and, if necessary, rotate) one of the images relative to the other image so as to juxtapose the first and second images, such that the composite image consists of two discrete portions of the image subject.
3. Imaging apparatus according to claim 1 or 2, wherein the portable image capture device is operable to capture first and second component images of first and second overlapping portions of the image subject, and the image processing means is operable to shift (and, if necessary, rotate) one of the images relative to the other image so as to combine the first and second images, such that the composite image consists of one continuous portion of the image subject.
4. Imaging apparatus according to any preceding claim, wherein the portable image capture device is operable to capture at least one subsequent component image of a subsequent portion of the image subject, the sensor means is operable to determine a displacement of the portable image capture device between capture of the or each subsequent component image and another component image, and the image processing means is operable to derive from the output of the sensor means an indication of offset of the or each subsequent portion of the image subject relative to another portion of the image subject, and thereby to assemble the or each subsequent component image into the composite image.
5. Imaging apparatus according to claim 4, wherein the component images include images of discrete portions and of overlapping portions of the image subject, and the image processing means is operable to assemble the composite image by juxtaposing and combining the component images, as appropriate.
6. Imaging apparatus according to claim 4 or 5, wherein said another component image is the first component image.
7. Imaging apparatus according to any preceding claim, wherein the portable image capture device is provided with display means for displaying an image in a field of view of the image capture device.
8. Imaging apparatus according to any preceding claim, wherein the portable image capture device is provided with trigger means operable to cause the image capture device to capture an image in the field of view of the device.
9. Imaging apparatus according to claim 8, wherein the trigger means is operable to cause the image capture device to capture a succession of component images at known time intervals.
10. Imaging apparatus according to any preceding claim, wherein the sensor means forms part of the image capture device.
11. Imaging apparatus according to any preceding claim, wherein the sensor means is responsive to changes in orientation of the image capture device to produce an output from which said indication of offset is derivable.
12. Imaging apparatus according to claim 10, wherein the sensor means comprises a velocity sensor operable to determine linear and/or angular velocities of the image capture device.
13. Imaging apparatus according to claim 12, wherein the velocity sensor is one of a plurality of such sensors mounted at spaced apart locations and/or differing orientations on the image capture device to enable changes in orientation of the latter to be determined.
14. Imaging apparatus according to claim 13, wherein the plurality of velocity sensors are arranged so as to enable angular velocities of the image capture device to be measured about any of three mutually orthogonal axes that pass through the device.
15. Imaging apparatus according to any preceding claim, wherein the image capture device includes an automatic focussing system to enable focussed images to be obtained at any distance in a range of possible distances from the image capture device to the image subject.
16. Imaging apparatus according to any preceding claim, wherein the image capture device is operable, at the same time as capturing the second or any subsequent component image, to store displacement data representative of the displacement of the image capture device between capture of that component image and the first component image.
17. Imaging apparatus according to claim 8 or any claim dependent therefrom, wherein the trigger means is operable to cause the image capture device to capture a succession of component images in response to predetermined displacements of the image capture device from an initial position and/or orientation of the device in which the trigger means was first operated.
18. Imaging apparatus according to any preceding claim, wherein the image capture device is operable, at the same time as capturing a component image, to store velocity data representative of the rate of change of the displacement of the image capture device at the time of capturing the component image.
19. Imaging apparatus according to any preceding claim, wherein the image processing means is operable to identify whether a user's movements of the image capture device follow a pattern, and if such a pattern is identified, to use the pattern to predict the approximate displacement of the image capture device from an initial position and/or orientation on the basis of previous displacements of the image capture device from the initial position and/or orientation.
20. Imaging apparatus according to any preceding claim, wherein the image processing means is operable to use common features in component images to position and orientate the component images relative to one another.
21. Imaging apparatus according to any preceding claim, wherein the image capture device is provided with illumination means operable to illuminate the image subject or a portion of the image subject of which an image is to be captured by the image capture device.
22. Imaging apparatus according to any preceding claim, wherein at least part of the image processing means is separate from the image capture device.
23. Imaging apparatus according to claim 22, wherein the image capture device includes a rechargeable battery and wireless transmission means operable to transmit image data and displacement data to the image processing means.
24. Imaging apparatus according to claim 23, wherein the sensor means is operable, if no movement of the device has occurred in a predetermined time, to cause the device to enter a low power consumption mode.
25. Imaging apparatus according to claim 23 or 24, wherein the apparatus further comprises a charging cradle that has an electrical connection for connection to the image capture device to recharge the battery of the image capture device.
26. Imaging apparatus according to claim 25, wherein the cradle includes a further electrical connection for connection to the image capture device to transfer data between the image capture device and the cradle.
27. Imaging apparatus according to claim 25, wherein the cradle includes wireless reception means operable to receive image data from the image capture device.
28. Imaging apparatus according to any of claims 25 to 27, wherein the image capture device includes calibration means for calibrating the sensor means of the image capture device when the latter is placed in the cradle.
29. Imaging apparatus according to any preceding claim, wherein the image capture device includes preliminary image processing means for identifying images which, on the basis of the associated velocity data, will be blurred, determining the autofocus and displacement data for each component image, linking autofocus and displacement data to their respective component images, and discarding individual regions of component images that are unsuitable for assembly into the composite image, prior to said data being supplied to the image processing means.
30. Imaging apparatus according to any preceding claim, wherein the image capture device is an instrument for use in ophthalmology.
31. Imaging apparatus according to any preceding claim, wherein the image capture device is a handheld device.
32. Imaging apparatus according to claim 30 or 31, when dependent from claim 15, wherein the imaging apparatus includes analysing means for comparing the autofocus measurements obtained during an inspection of a retina of an eye with a set of expected data for an eye to determine whether the eye has any defects associated with an unusual variation of the distance of the retina from the cornea.
33. Imaging apparatus according to any preceding claim, wherein the image capture device includes a user interface that allows a user of the device to transmit "yes" and "no" responses to prompts transmitted to the device by the image processing means.
34. A portable image capture device for imaging apparatus according to any preceding claim.
35. A method of assembling a composite image from component images, the method comprising the steps of capturing first and second component images of respective first and second portions of an image subject using imaging apparatus including a portable image capture device comprising sensor means operable to determine a displacement of the portable image capture device between capture of the first and second component images, deriving from an output of the sensor means an offset of the first and second portions of the image subject relative to one another or to a datum, and assembling the first and second component images into the composite image using the offset of the first and second portions of the image subject.
PCT/GB2006/003238 2005-09-03 2006-09-01 Imaging apparatus, portable image capture device and method of assembling composite images from component images WO2007026158A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0517948.6 2005-09-03
GBGB0517948.6A GB0517948D0 (en) 2005-09-03 2005-09-03 Imaging apparatus, portable image capture device and method of assembling composite images from component images

Publications (1)

Publication Number Publication Date
WO2007026158A1 true WO2007026158A1 (en) 2007-03-08

Family

ID=35220798

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2006/003238 WO2007026158A1 (en) 2005-09-03 2006-09-01 Imaging apparatus, portable image capture device and method of assembling composite images from component images

Country Status (2)

Country Link
GB (1) GB0517948D0 (en)
WO (1) WO2007026158A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4633317A (en) * 1983-07-02 1986-12-30 Bodenseewerk Geratetechnic GmbH Electro-optical detector system
US5227888A (en) * 1990-02-19 1993-07-13 Nikon Corporation Still image pickup device with two-dimensionally displaceable image pickup surface
US5689302A (en) * 1992-12-10 1997-11-18 British Broadcasting Corp. Higher definition video signals from lower definition sources
US5762603A (en) * 1995-09-15 1998-06-09 Pinotage, Llc Endoscope having elevation and azimuth control of camera assembly
US20030130562A1 (en) * 2002-01-09 2003-07-10 Scimed Life Systems, Inc. Imaging device and related methods

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011014204A1 (en) 2009-07-29 2011-02-03 Marc Ellman Digital imaging ophthalmoscope
EP2459052A1 (en) * 2009-07-29 2012-06-06 Marc Ellman Digital imaging ophthalmoscope
EP2459052A4 (en) * 2009-07-29 2014-02-19 Marc Ellman Digital imaging ophthalmoscope
EP2473093A4 (en) * 2009-09-01 2015-08-19 Canon Kk Fundus camera
US8944596B2 (en) 2011-11-09 2015-02-03 Welch Allyn, Inc. Digital-based medical devices
US9642517B2 (en) 2011-11-09 2017-05-09 Welch Allyn, Inc. Digital-based medical devices
US10238462B2 (en) 2011-11-09 2019-03-26 Welch Allyn, Inc. Digital-based medical devices
US11553981B2 (en) 2011-11-09 2023-01-17 Welch Allyn, Inc. Digital-based medical devices
US10078226B2 (en) 2013-10-14 2018-09-18 Welch Allyn, Inc. Portable eye viewing device enabled for enhanced field of view
US11147441B2 (en) 2018-01-16 2021-10-19 Welch Allyn, Inc. Physical assessment device
USD959661S1 (en) 2018-01-16 2022-08-02 Welch Allyn, Inc. Medical viewing device
EP4080996A1 (en) * 2021-04-19 2022-10-26 Welch Allyn, INC. Increasing battery life in handheld device

Also Published As

Publication number Publication date
GB0517948D0 (en) 2005-10-12

Similar Documents

Publication Publication Date Title
JP3298640B2 (en) Gaze tracking for visual field analyzers
US7837329B2 (en) Fundus camera
US5889577A (en) Optical apparatus or apparatus for detecting direction of line of sight, and various kinds of same apparatus using the same
CA2852671C (en) Method and apparatus for determining eye topography
US7264355B2 (en) Ophthalmologic device and ophthalmologic measuring method
WO2011038457A1 (en) Imager, module for an imager, imaging system and method
JPH06142044A (en) Ophthalmic measuring apparatus
JP2005185431A (en) Line-of-sight detection method and line-of-sight detector
WO2007026158A1 (en) Imaging apparatus, portable image capture device and method of assembling composite images from component images
JP2005185523A (en) Eye refractive power measuring instrument
US20090046248A1 (en) Ocular scanning device with programmable patterns for scanning
JP2006280612A (en) Ophthalmological device
JP6221247B2 (en) Ophthalmic equipment
US8136945B2 (en) Ophthalmological measuring apparatus
KR20140099195A (en) Eye refractive power measurement apparatus
CN112190227B (en) Fundus camera and method for detecting use state thereof
JP4469205B2 (en) Ophthalmic equipment
US20230014952A1 (en) Retinal imaging system
JPH11137521A (en) Device for measuring refractive power of eye
JP2021166817A (en) Ophthalmologic apparatus
JP4136691B2 (en) Ophthalmic equipment
JPH06178762A (en) Perimetric device provided with pupil diameter measuring function
JP2010035727A (en) Fundus camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06779259

Country of ref document: EP

Kind code of ref document: A1