WO2006078902A2 - Devices and methods for identifying and monitoring changes of a suspect area on a patient - Google Patents

Devices and methods for identifying and monitoring changes of a suspect area on a patient

Info

Publication number
WO2006078902A2
WO2006078902A2 (PCT/US2006/002037)
Authority
WO
WIPO (PCT)
Prior art keywords
image
imaging device
suspect area
images
suspect
Prior art date
Application number
PCT/US2006/002037
Other languages
French (fr)
Other versions
WO2006078902A3 (en)
Inventor
William T. Christiansen II
Eric R. Steinmetzer
James E. Torelli
Original Assignee
Dermaspect, Llc
Priority date
Filing date
Publication date
Application filed by Dermaspect, Llc filed Critical Dermaspect, Llc
Priority to NZ556655A priority Critical patent/NZ556655A/en
Priority to CA2595239A priority patent/CA2595239C/en
Priority to AU2006206334A priority patent/AU2006206334C1/en
Publication of WO2006078902A2 publication Critical patent/WO2006078902A2/en
Publication of WO2006078902A3 publication Critical patent/WO2006078902A3/en
Priority to AU2010257350A priority patent/AU2010257350A1/en

Classifications

    • A61B5/0059: Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0064: Body surface scanning
    • A61B5/442: Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
    • A61B5/444: Evaluating skin marks, e.g. mole, nevi, tumour, scar
    • A61B5/445: Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A61B5/446: Scalp evaluation or scalp disorder diagnosis, e.g. dandruff
    • A61B5/706: Indicia not located on the patient, e.g. floor marking
    • A61B5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • G06T5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T7/0016: Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T7/254: Analysis of motion involving subtraction of images
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/90: Determination of colour characteristics
    • G06V10/245: Aligning, centring, orientation detection or correction of the image by locating a pattern; special marks for positioning
    • G06V30/10: Character recognition
    • H04N23/64: Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G06T2207/10016: Video; image sequence
    • G06T2207/30088: Skin; dermal
    • G06T2207/30096: Tumor; lesion

Definitions

  • This disclosure relates to devices and methods, which can be used alone or in combination, to identify and monitor changes of a suspect area on a patient, for example dermatological changes.
  • melanoma is a malignant cancer of the pigment cells (melanocytes).
  • basal and squamous cell cancers are tumors of unpigmented cells (keratinocytes) of the skin.
  • Melanocytes occur at various depths within the epidermal (upper) and dermal (lower) layers of skin. Melanocytes are normally distributed in the layers of the skin and produce pigment in response to being subjected to ultraviolet light (e.g., sunlight). Aggregated melanocytes are termed naevus cells and can be indicative of a melanoma. Because a melanoma may appear as a mole, medical professionals typically attempt to ascertain whether the suspicious area is changing over time. Identifying changes early typically results in a rapid diagnosis, which in turn often leads to rapid and highly effective treatments that can greatly increase the patient's survival rate and, in most cases, lead to complete recovery.
  • in the method of comparison taught in Craine et al., the baseline image is compared visually by the viewer with a subsequently obtained image by alternately displaying the respective images in a blinking fashion.
  • the blinking action is created by quickly alternating the images with respect to one another on a display monitor to enable the viewer to detect changes in the skin lesion.
  • each standard of care suffers from the subjectivity and uncertainty associated with the medical professional trying to ascertain changes in a suspicious area by visual comparison.
  • the visual comparison methods are subjective and less accurate for a number of reasons. For instance, the medical professional may be inexperienced, may have been distracted during the examination, or may have selected the wrong location on the patient's body during a follow-up examination.
  • one aspect of the present invention is the comparison of a plurality (e.g., at least two) of images taken at different times utilizing a computer-based algorithm that can overlay two images and either transform the images to fit over each other or do a best-fit analysis, thereby denoting or calling out one or more color, perimeter, or depth changes.
  • the analysis can include transforming the images to match color, contrast, angle, focus (sharpness), and brightness, and subsequently comparing the multiple images with each other. Changes between the images may be called out in a variety of ways, including text reports, highlighting, or color coding of the image itself.
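As a rough sketch of how such a call-out might be realized in code (this is an illustrative example, not the disclosed implementation; the function name, fixed threshold, and use of OpenCV are assumptions), two already-aligned images can be differenced pixel by pixel and the changed regions highlighted on the image itself:

```python
# Minimal sketch, assuming two same-size, already-registered BGR images.
import cv2
import numpy as np

def call_out_changes(img_a, img_b, threshold=30):
    """Mark pixels of img_b that differ noticeably from img_a in red."""
    diff = cv2.absdiff(img_a, img_b)                   # per-pixel difference
    changed = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY) > threshold
    out = img_b.copy()
    out[changed] = (0, 0, 255)                         # BGR red highlight
    return out, float(changed.mean())                  # image + changed fraction
```

The returned fraction could feed a text report, while the highlighted image corresponds to the color-coded call-out mentioned above.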
  • a device or apparatus comprising a digital image capture device and a distance-measuring device.
  • the distance measuring device measures the distance between the suspect area and the image capture device and provides a readout or a tone to signify the optimal distance.
  • embodiments include the use of a reference, such as a strip of adhesive affixed to the surface or attached to the device, that provides contrast, color, sharpness, and/or depth references.
  • Such an embodiment could include an adhesive strip having a color palette (e.g., one or more colors), gray scale, distance references (hash marks) or depth references.
  • distance measurements can be done by a sonic device, laser or any other means of measuring distance.
  • an enclosed tube or housing is affixed to the imaging device and positioned over the suspect area.
  • the housing or enclosure is essentially light free and may contain its own light source internally to provide consistent lighting of the suspect area.
  • the enclosure is a tube of a fixed length having LEDs or fiber optics positioned inside. One end of such an enclosure may be fitted to the image capturing device and the other fitted over the suspect area.
  • an apparatus to acquire an image of a suspect area on a patient comprises an imaging device; and a separation tool having a first end connected with a first section, at least a portion of the first end formed to contact the patient, the first section having an attachment portion to receive the imaging device, the first section formed to maintain the imaging device at a substantially fixed distance from the suspect area.
  • an imaging assembly to image a suspect area on a patient comprises an imaging device and at least one sensor to indicate an orientation and/or distance of the imaging device relative to a first location.
  • an imaging assembly to image a suspect area on a patient comprises a housing; an imaging device located within the housing; and at least one sensor to indicate an orientation of the housing relative to a first location.
  • a method of acquiring an image of a suspect area includes identifying the suspect area; positioning a patient with the suspect area in approximately a first position; identifying a reference item located on the patient; determining a position of the suspect area in relationship to the reference item; aligning an imaging device to acquire the image of the suspect area; and acquiring the image after aligning the imaging device.
  • a method of comparing at least two images, each image capturing a suspect area includes identifying a reference item in the at least two images; measuring an attribute of the reference item in a first image; transforming a second image based on the measured attribute of the reference item in the first image, wherein a reference item in the second image is transformed to correspond with an orientation and size of the reference item in the first image; measuring an attribute of the suspect areas in both images; and comparing the respective measured attributes of the respective suspect areas.
  • Such reference items can be points either away from the suspect area or within the suspect area.
  • the measured attribute can be color, distance between at least two points, total perimeter, distance between multiple points etc.
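By way of illustration only (the helper below and its reliance on OpenCV are assumptions, not the disclosed method), such attributes can be measured from a binary mask of the suspect area:

```python
import cv2

def measure_attributes(image_bgr, mask):
    """Area, perimeter, and mean color of a segmented suspect region.
    `mask` is a uint8 image with 255 inside the region, 0 elsewhere,
    and is assumed to contain at least one region."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    c = max(contours, key=cv2.contourArea)        # largest connected region
    return {
        "area": cv2.contourArea(c),               # in pixels
        "perimeter": cv2.arcLength(c, True),      # border length in pixels
        "mean_bgr": cv2.mean(image_bgr, mask=mask)[:3],
    }
```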
  • the method of comparing two images comprises receiving at least two images digitally into a computer system, performing a fitting analysis on the at least two images to obtain an overlay and providing an output noting any differences between the at least two images.
  • a method of acquiring an image of a suspect area includes identifying the suspect area; positioning a patient with the suspect area in approximately a first position; aligning an imaging device to acquire an image of the suspect area; and acquiring the image after aligning the imaging device.
  • a method of comparing at least two images of a suspect area on a patient includes providing at least two digital images of the suspect area; digitally overlaying the at least two images; performing a best-fit transformation of one image to encourage the one image to approximately correspond to at least one detected attribute of the other acquired image; comparing the at least two images to determine whether a difference exists between an aspect of the one image when compared with the same aspect of the other image.
  • the present invention can be used in the context of full-body imaging wherein one or more digital or other image capture device or devices are placed around the patient and the full-body is imaged either in a piece by piece manner or in its entirety. These images can then be compared by transformation or best-fit analysis and analyzed for any changes by a computer algorithm.
  • Figure 1A is a front, left isometric view of an imaging device according to one illustrated embodiment positioned with respect to a portion of skin.
  • Figure 1B is a side view of a portion of a guide of the imaging device of Figure 1A having measurement markers and a palette according to one illustrated embodiment.
  • Figure 2A is a partially exploded, front, left, isometric view of an imaging device according to another illustrated embodiment.
  • Figure 2B is a front, left isometric view of an intermediate bracket according to one illustrated embodiment.
  • Figure 3 is a front, left, isometric view of an imaging device according to another illustrated embodiment.
  • Figure 4 is an elevational view of a hand having several reference points for locating a suspect area according to one illustrated embodiment.
  • Figure 5A is a flowchart of a method of identifying a suspect area according to one illustrated embodiment.
  • Figure 5B is a continuation of the flowchart of Figure 5A.
  • Figure 6 is a top plan view of an image after the image has been pre-processed according to one illustrated embodiment.
  • Figure 7 is a flowchart of a method of acquiring a subsequent image of a suspect area according to one illustrated embodiment.
  • Figure 8A is a left, front isometric view of a first image and a second image, each having an initial and respectively different orientation and size according to one illustrated embodiment.
  • Figure 8B is a top plan view of the first image and the second image of Figure 8A transformed to have approximately the same respective orientation and size.
  • Figure 9 is a flowchart of a method of comparing at least two images of a suspect area according to one illustrated embodiment.
  • Figures 10A-10C are images of suspect areas illustrating the various stages of the color balancing method of Figure 11 according to one illustrated embodiment.
  • Figure 11 is a flowchart of a method of color balancing an image to detect a potential suspect area according to one illustrated embodiment.
  • the term "patient" refers primarily to warm-blooded mammals and is not limited to human beings; it can include animals such as dogs, cats, horses, cows, pigs, and higher and lower primates.
  • the headings provided herein are for convenience only and do not interpret the scope or meaning of the claimed invention.
  • the embodiments disclosed herein are generally directed to acquiring images of a suspect area located on a patient, comparing the acquired images to one another, and evaluating the compared images to determine if some amount of change from one image to a subsequent image warrants a more detailed examination by a medical care professional.
  • the embodiments disclose a number of different devices and methods for achieving such results.
  • the surface of interest can be either internal or external to the patient.
  • the surface of interest is the patient's exposed skin that is monitored for the detection or growth of skin cancer.
  • the surface of interest can be the patient's mucous membranes, interior body surfaces related to reproductive and/or digestive systems of the patient, ocular surfaces, or any other accessible surface on a patient.
  • the surface of interest will be exemplified as an area on the patient's skin, referred to as a suspect area. However, this exemplification is not meant to limit or otherwise narrow the scope of the description, the claims, or any specific embodiment depicted herein.
  • the suspect area referred to herein can be the site of a suspected melanoma or mole (e.g., melanin containing areas to be monitored), but can also be any other suspect area on a patient that needs to be monitored.
  • the suspect area can be located in a variety of places on a patient, for example the patient's mucous membranes, surfaces of interior body cavities related to reproductive and/or digestive systems, ocular surfaces, or any other interior or exterior surface on a patient where monitoring is desired.
  • the suspect area can be a dermal feature, such as a type of skin cancer, a skin lesion, a skin rash, a burn or scar, an infected or inflamed area, a wound, or some other skin anomaly that may or may not be capable of growth, reduction or other change.
  • for example, a healing rate (i.e., recession) of such an area can be monitored.
  • a further embodiment envisions utilizing such technology to monitor the effectiveness of a drug or nutraceutical, such as those that heal the skin.
  • Another embodiment may monitor a patient's scalp for hair loss and/or growth.
  • the embodiments disclosed herein may be used in a number of settings, such as a home setting, a clinical setting, a laboratory or research setting, a regulatory compliance setting, or any combination of the above.
  • Figures 1A-3 show three different embodiments of a device to acquire an image of a suspect area. Each of the devices differs in its degree of complexity, accuracy, and cost. It is contemplated that many, if not all, of the features or aspects of one device can be incorporated into the other devices.
  • Figure 1A shows a first imaging device 10 for imaging a suspect area 12 on skin 14 according to the illustrated embodiment.
  • the first imaging device 10 includes a housing 16 and a lens 18 to receive and direct light to imaging components (not shown) located within the housing 16. By locating the imaging components in the housing 16, damage and/or exposure of the imaging components may be prevented.
  • the housing 16 can have a handle 20 to permit the housing 16 to be lifted, moved, positioned, or otherwise manipulated. Additionally or alternatively, the handle 20 and/or other portions of the housing 16 can be configured with support locations so that the imaging device 10 can be secured to a tripod, for example.
  • the imaging components may take the form of a camera or an optical scanner operable to capture images of the suspect area 12.
  • the camera may advantageously take the form of a digital image capture device such as a CCD or CMOS type camera.
  • a CCD camera may consist of one-dimensional or two-dimensional arrays of charge coupled devices ("CCD") and suitable optics, such as optical lenses, for focusing an image on the CCD array.
  • CCD arrays can capture whole images at a time, or can be electronically controlled to successively sample (e.g., pixel-by-pixel, row-by-row, or column-by-column) the information on a region of the skin 14 (i.e., electronically scan).
  • the imaging components can take the form of a CMOS imager capable of capturing one-dimensional or two-dimensional arrays similar to that of a CCD reader.
  • a digital image capture device advantageously provides the image in a form suitable for use with a data processing system such as a computing system.
  • the camera may take the form of a non-digital image capture device, such as a film camera.
  • Such embodiments may employ image scanners, or other devices to digitize the images captured on film.
  • the imaging device 10 may advantageously take the form of a still image capture device.
  • the imaging device 10 may take the form of a motion picture capture device such as a movie camera or video camera.
  • Such embodiments may include a frame grabber or other device to capture single images.
  • the imaging device 10 may rely on ambient light, or may include one or more light sources, such as light emitting diodes (“LEDs”) or incandescent lights, which may be manually or automatically controlled.
  • a guide 22 is attachable to the housing 16 of the imaging device 10.
  • the guide 22 is configured so that the housing 16 can be placed at a desired distance away from the skin 14 along a Z-axis, perpendicular to an X-Y plane when an image is acquired.
  • the guide 22 includes an extension member 24 having a first end 26 that is coupled to the housing 16 of the imaging device 10.
  • a second end 28 is coupled to a contact member 30.
  • the contact member 30 can include a number of features to enhance the control and/or optimization of the imaging device 10.
  • the contact member 30 of the guide 22 can include measurement markings 30a, similar to those of a ruler, and/or contrast markings 30b, which can represent a color or grayscale palette.
  • the extension member 24 includes adjustable, complementary sliding members with gradations 25 to allow the housing 16 to be placed at a desired distance from the suspect area 12.
  • the extension member 24 is formed to be non-adjustable, thus the housing 16 is set at a fixed length from the contact member 30.
  • the contact member 30 may be shaped (e.g., arc-shaped) to provide an unobstructed line of sight between the imaging device 10 and the suspect area 12. As will be discussed in more detail below, it may be desirable that the contact member 30 be shaped such that at least a portion of the contact member 30 can be captured in the acquired image.
  • a skin contact region 31 of the contact member 30 can be padded to provide a more comfortable interaction with the patient.
  • first end 26 can be coupled to the housing 16 by any number of mechanical methods, for example fasteners, clips, VELCRO®, adhesive bonding, tie down straps, or some other structure that substantially keeps the housing 16 attached to the extension member 24.
  • the imaging device 10 can include at least one sensor 32 for determining an orientation of the device 10.
  • One advantage of determining the orientation of the device 10 is to provide for repetitive and consistent images in an X-Y plane, especially if the images are acquired at different times. For example, if a second image is being acquired of the suspect area 12, but the imaging device 10 is tilted or rotated at too much of an angle, the second image may be too distorted or misaligned to digitally process or compare to a previously acquired image.
  • a variety of sensors 32 can be used to indicate the orientation of the imaging device 10.
  • the sensor 32 is a fluid level encompassed in the housing 16 and visible by a user of the imaging device 10. The sensor 32 may also be integrated with the guide 22.
  • the fluid level sensor 32 generally indicates whether the imaging device 10 is tilted relative to the ground. Additionally or alternatively, a gyroscope, which is sometimes referred to as a tilt sensor, can be used to determine an acceleration of the imaging device 10 about at least one axis.
  • a pressure sensor 34 is located in the contact member 30 to sense the pressure exerted on the contact member 30 as it is positioned against the skin 14 of the patient. By sensing the pressure that the imaging device 10 is pressed against the skin 14, the imaging device 10 can be repetitively and accurately repositioned relative to the suspect area 12 from one image to the next.
  • a proximity sensor 35 can be used to detect when the contact member 30 is at a desired distance from the patient, to include when the contact member 30 barely makes contact with the patient.
  • Each of the sensors 32, 34, and/or 35 can be electronically coupled with the imaging device 10 to provide an indication that the imaging device 10 is at the desired orientation or distance.
  • the imaging device 10 can have an indicator 36 that sends a visual and/or audio signal to indicate when the imaging device 10 is at the desired orientation, distance, and/or when an amount of pressure is present between the contact member 30 and the patient. A signal from the indicator 36 would indicate that the image could be acquired at that moment in time.
  • the sensors 32, 34, and/or 35 can also be electronically coupled with a processor (not shown) to computationally update the orientation and/or proximity to the patient of the imaging device 10.
  • the processed information can be displayed on a screen (not shown) located on the housing 16.
  • Figure 2A shows a second imaging device 100 that includes a camera 102 and a member 104 for receiving and coupling the camera 102 to an extension member 106.
  • the extension member 106 includes a contact member 108.
  • the extension member 106 further includes detents 110 sized and configured to complementarily receive the member 104, a sensor 112 to indicate the orientation of the extension member 106 about a Roll axis 114, a Pitch axis 116, and/or a Yaw axis 118, and a color palette 120 that can be used to provide color and/or contrast balancing once the image is acquired and archived. Color and/or contrast balancing are described in more detail below.
  • the camera 102 can be a digital camera as described in the previous embodiment or a film camera that includes at least a lens 122, a camera body 124, and an image trigger 126.
  • the camera can capture images on photographic film (not shown), which can be standard photographic film that is purchased in a store and is configured to be chemically processed in a photo lab after it has been exposed to light.
  • the photographic film may be specialized film, such as film that is sensitive to the non-visible portions of the electromagnetic spectrum, such as infrared or ultraviolet sensitive films.
  • the member 104 includes a compartment 128 that is sized to receive the camera 102 and a pair of flanges 130 formed to couple to the extension member 106.
  • a front portion of the compartment 128 does not obstruct the lens 122 of the camera 102 when the camera 102 is seated in the compartment 128.
  • the camera 102 can be secured to the member 104 by virtue of the compartment 128 being sized to provide a tight or snug fit for the camera body 124.
  • hook and loop fastener pads commonly available under the trademark VELCRO®, can be provided to keep the camera 102 relatively secure in the compartment 128.
  • securing the camera 102 in the compartment 128 can be accomplished in a variety of known ways.
  • the flanges 130 are further formed to complementarily engage the detents 110 provided on the extension member 106.
  • the flanges 130 include rounded, depressible buttons 132. The member 104 is assembled with the extension member 106 by sliding a first end 134 of the extension member 106 in between the flanges 130 and permitting the buttons 132 to click into the detents 110, so that the contact member 108 is at a desired distance from the camera 102.
  • Figure 2B shows a different embodiment of a member 104 without a compartment. Instead, a bonding strip 136 is provided on a base 138 of the member 104. Each side 140 extending from the base 138 can be resiliently biased to form a snug fit with the camera 102.
  • the bonding strip 136 can be a pad of hook and loop fastener, a tacky substance, or other equivalent object or substance.
  • Figure 3 shows an automated imaging device 200 according to another embodiment. Many of the aspects of the imaging device 200 are similar to the aspects described in the previous embodiments, for example a housing 202, an imager 204, a handle 206, and a sensor 208. One difference between the imaging device 200 and the previously described devices 10, 100 is that the present embodiment does not employ an extension member. In lieu of the extension member, a second sensor or range finder 210 is used to indicate the distance between the imaging device 204 and a suspect area 212.
  • the range finder 210 is a laser triangulation sensor that provides non-contact linear displacement measurements of the suspect area 212 on the skin 214.
  • a laser beam (e.g., from a semiconductor laser) is projected onto the skin 214, and a returning beam is received and focused onto a CCD sensing array (not shown) of the imager 204.
  • the CCD array detects the peak value of the light and determines the distance of the skin 214 based on the position of the beam spot.
  • the range finder 210 produces an analog voltage that is proportional to the distance of the skin 214 from the range finder 210.
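A hypothetical reading of such a sensor is a simple linear mapping from voltage to distance; the voltage span and distance span below are made-up calibration constants, not values from the disclosure:

```python
def voltage_to_distance_mm(v_out, v_span=(0.0, 10.0), d_span=(50.0, 150.0)):
    """Map the range finder's analog output voltage to a distance in mm,
    assuming the stated proportionality between voltage and distance."""
    (v0, v1), (d0, d1) = v_span, d_span
    return d0 + (v_out - v0) / (v1 - v0) * (d1 - d0)

print(voltage_to_distance_mm(5.0))   # midpoint of the voltage span -> 100.0 mm
```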
  • the range finder 210 can be a laser interferometer, an ultrasonic sensor, or an equivalent sensor to measure the linear distance from the suspect area 212 to the range finder 210.
  • Laser interferometers use the length of a wave of light as the unit for measuring position and consist of three basic components: a laser that supplies a monochromatic light beam; optics that direct the beam and generate an interference pattern; and electronics that detect and count the light and dark interference fringes and output the distance information.
  • Ultrasonic sensors offer another means to make non-contact distance measurements. An ultrasonic sensor works by measuring the time it takes a sound wave to propagate from the range finder 210, to an object and back to the range finder 210.
  • the skin 214 would reflect the ultrasonic waves generated by a transmitter and then a receiver would detect the returning waves. The elapsed time from initial transmission to reception of the returning waves is used to determine the distance to the skin 214.
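The underlying arithmetic is the standard time-of-flight relation, distance = speed x time / 2, halved because the wave travels out and back; the sketch below assumes sound in air at roughly room temperature:

```python
SPEED_OF_SOUND_M_PER_S = 343.0   # air at about 20 °C (assumed medium)

def ultrasonic_distance_m(round_trip_s):
    """One-way distance from a measured round-trip propagation time."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_s / 2.0

print(ultrasonic_distance_m(0.001))   # 1 ms round trip -> ~0.17 m
```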
  • the monochromatic light used to illuminate at least the suspect area 212 during imaging can have a wavelength outside of the visible portion of the light spectrum.
  • the monochromatic light can be in a frequency range of ultraviolet light, infrared light, or some other non-visible range along the light spectrum.
  • the imaging device 200 is a camera.
  • the imaging device 200 may advantageously take the form of a still image capture device or a motion picture capture device, such as a movie camera or video camera.
  • the alignment of the imaging device 200 is accomplished manually without the aid of an extension member or sensor.
  • the acquired images are compared in a best-fit analysis.
  • the best-fit analysis includes digitizing the images and matching key points or parameters of a first image onto similar key points or parameters of a second image. For example, the perimeter or border of the first image can be matched to the perimeter or border of the second image.
  • the first image and the second image are of the same suspect area and the best-fit analysis is employed to detect changes, if any, of the suspect area over time without respect to using other reference points and/or markers to align and/or orient the imaging device 200 relative to the suspect area.
  • the analysis software can transform, rotate, and otherwise manipulate at least one of the images until enough similarities are found between the two compared images to verify that both images are of the same suspect area or are possibly not of the same suspect area.
  • the best-fit analysis is described in more detail below in the discussion on image comparison.
  • a system is capable of contemporaneously acquiring images over a variety of locations on a patient.
  • the system can capture images over a larger surface area or can take multiple images over a larger area where the multiple images can be digitally overlaid and matched to form a large image.
  • Figure 4 shows a patient's hand 300 according to one embodiment.
  • a suspect area 302 (e.g., a grouping of melanoma cells) appears on a backside surface 304 of the patient's hand 300.
  • Two spots are identifiable on the backside surface 304: a first spot 306 (e.g., a scar) is located on the ring finger 308 and a second spot 310 (e.g., a freckle) is located on the wrist 312 of the patient's hand 300.
  • the spots 306, 310 respectively, can be freckles, birthmarks, borders of a limb, or some other equivalent feature or landmark that is not susceptible to substantial changes in shape, size, and/or location with respect to its present location on the patient.
  • the spot 306 or 310 may be naturally occurring, such as a freckle, or the spot 306 or 310 may be a portion of a scar, a tattoo, or some other feature that is not susceptible to substantial changes in shape, size, and/or location with respect to its present location on the patient.
  • One advantage of locating at least one of the spots 306 or 310 on the patient is to use one of the spots 306 or 310 as a reference object 311.
  • the reference object 311 which is equivalent to spot 310 in the illustrated embodiment, provides a starting point from which other key measurements can be taken, as explained in the method below.
  • a reference object 311 is not always necessary when the suspect area can be easily relocated.
  • a group of melanoma cells that is easily and routinely detectable on a patient could be imaged and re-imaged, especially when the images are compared using a best-fit analysis.
  • the selection of the spots 306, 310 is generally left to the discretion of the medical professional, and it is contemplated that the medical professional will select the spot 310 that is the most stable or less susceptible to change over time.
  • the imaging software may also select the spots 306, 310.
  • the first spot 306 may have a stable configuration, such as the scar
  • the location of the spot 306 on the ring finger 308 makes it less attractive as a reference object 311 because the ring finger 308 is easily moveable in relation to the hand 300, which can add error in repetitive measurements taken with respect to the spot 306 on the ring finger 308.
  • the second spot 310, shown on the wrist 312, has a more fixed relationship with respect to the suspect area 302 and thus may be a better reference object 311 from which to measure and document the location of the suspect area 302.
  • the reference object 311 or at least a reference marker 318 is located proximate to the suspect area 302 so that the reference object 311 or reference marker 318 can be captured in an image of the suspect area 302.
  • the reference marker 318 has a defined size and shape and can be placed quite near the suspect area 302. The advantage, however, of locating the reference object 311 remains unchanged because the placement of the reference marker 318 on the patient is made relative to the location of the reference object 311 on the patient.
  • Figures 5A and 5B are a flowchart illustrating a method 400 to identify and take an image of a suspect area 302 on a patient. For clarity and ease of explanation, the method 400 is described in reference to Figure 4.
  • One aspect of locating the suspect area 302 is to accurately identify, map, and document the reference object 311, the marker 318, if needed, and the suspect area 302 for image comparison purposes as described in greater detail below.
  • Figure 5A shows that the method 400 commences at 402, where the suspect area is identified on the patient.
  • a medical professional, the patient, the computing system, or some other entity may identify the suspect area. Identifying the suspect area most often will be done visually, but it is understood that other approaches may be used, such as the sense of touch.
  • a reference object is identified on the patient.
  • the medical professional determines whether the reference object will be within a first field-of-view or first frame 314 (Figure 4) of an imaging device. Recall, it is desirable to have the reference object 311 within the first frame 314 because multiple images of the suspect area 302 will be compared to one another.
  • the first frame 314 is sized to provide an amount of resolution of the suspect area 302 that will be adequate for detailed image processing and evaluation.
  • a position of the suspect area 302 is determined relative to the reference object 311.
  • a Cartesian coordinate system (X, Y) having perpendicular axes, is used to determine the position of the suspect area 302 relative to the reference object 311.
  • a spherical coordinate system (r, θ) is used.
  • the reference object 311 is assigned coordinates (0, 0) and the suspect area 302, as measured from the reference object 311, is determined to have coordinates of (a, b) at a point on the suspect area 302 that represents an approximate center point of the suspect area 302. If a smaller image frame 316 (Figure 4) is necessary, for example to get an image with higher resolution, and the reference object 311 is located outside of the smaller image frame 316, then at 410, the reference marker 318 (Figure 4) is placed in proximity to the suspect area 302.
  • a position of the reference marker 318 is determined relative to the reference object 311; for example, the position of the reference marker 318 is determined to have coordinates (c, d).
  • the position of the suspect area 302 is determined relative to the reference marker 318 and, by way of example, has coordinates (e, f).
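In code, this chain of positions is simple vector addition; the numeric offsets below are hypothetical, following the (a, b), (c, d), (e, f) notation above:

```python
# Hypothetical coordinates in millimetres, measured on the skin surface.
reference_object = (0.0, 0.0)      # reference object 311, assigned (0, 0)
marker_rel_object = (40.0, -12.0)  # (c, d): marker 318 relative to object 311
suspect_rel_marker = (6.0, 3.0)    # (e, f): suspect area relative to marker 318

# Suspect area relative to the reference object = sum of the recorded offsets.
suspect_rel_object = (marker_rel_object[0] + suspect_rel_marker[0],
                      marker_rel_object[1] + suspect_rel_marker[1])
print(suspect_rel_object)          # (46.0, -9.0)
```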
  • the reference marker 318 is a pen mark on the patient.
  • the reference marker 318 is a small patch or sticker backed with an adhesive. The shape of the patch is customized so that the reference marker 318 can be placed in a desired orientation during successive examinations of the patient.
  • the reference marker 318 includes a pointed region that points towards the finger tips and parallel sides that substantially align with the sides of the patient's arm.
  • the reference marker 318 can have a variety of shapes, sizes, colors, contrast features, textures, and can even have features, like a center dot, to identify an exact starting point for measurements. Additionally or alternatively, human-readable and/or machine-readable indicia can be encoded on the reference marker 318.
  • a computer algorithm can take points on the perimeter of the lesion or suspect area of a first image and perform multiple measurements between any two points or more and compare such measurements with a second image.
  • the first and second images may be resized, skewed, color balanced, and/or adjusted in brightness to assist in attempting to fit one image to the other.
  • the coordinates of the reference object 311, the reference marker 318, if needed, and the suspect area 302 can be recorded and/or documented on paper or via an electronic medium, for example by entering the data into a computer. In addition, descriptions of these features can also be recorded and/or documented. It is understood that the recordation and/or documentation can be accomplished in a number of known ways, which may be through manual, automatic, paper, or paperless means.
  • an imaging device 10, 100, 200 is positioned to take an image of the suspect area 302. Because the images will be digitally processed, it is desirable to position the imaging device 10, 100, 200 in a repeatable manner with respect to the suspect area 302.
  • the distance of the imaging device 10, 100, 200 from, and the angle of the imaging device 10, 100, 200 with respect to, the suspect area 302 are kept substantially constant from one image to the next.
  • the distance and angle of the imaging device 10, 100, 200 with respect to the suspect area 302 can vary by a significant amount from one image to the next, and the analysis/comparison software can use a best-fit analysis to substantially match and align respective images.
  • the imaging device 10, 100, 200 can include a stereo camera to add depth perception to the resulting image.
  • Stereo cameras that are placed at a constant distance from each other could provide two images, one from each camera, of the suspect area 302. When the images are compared against each other, the depth and/or texture of the suspect area 302 can be determined.
  • a parameter on the imaging device 10, 100, 200 may be adjusted to enhance a quality of the image.
  • light filters or color filters can be coupled with the imaging device 10, 100, 200 as a way to control the light within the frame of the image.
  • the imaging device 10, 100, 200 can be focused to obtain a desired resolution, thus increasing or decreasing the frame size of the image to be acquired.
  • a first image is acquired that captures the suspect area 302 alone or the suspect area with one of either the reference object 311 or the reference marker 318.
  • the first image is electronically archived.
  • the first image can be given an identifier such as a file name, label, number, date stamp, or some other association that makes it easy to re-locate the first image in a database.
  • the electronic format of the first image can be archived as any number of common graphics formats such as *.jpg, *.tif, *.bmp, *.gif, or another equivalent format that is readable by a standard computer system.
  • the first image may be preprocessed.
  • Preprocessing the first image may include, but is not limited to, identifying the reference object 311 and/or reference marker 318 in the image; detecting, mapping, and computing the border of the object 311 and/or marker 318; detecting, mapping, and computing the border of the suspect area 302; and/or overlaying a reference grid 608 onto the image as shown in Figure 6, according to one illustrated, exemplary embodiment.
  • Figure 6 shows an image 600 having an image frame 602. Captured within the image frame 602 is an image 604 (i.e., an image of the suspect area 302) and a reference image 606 (i.e., an image of either one of the reference object 311 or the reference marker 318) located nearby or within image 604.
  • the reference grid 608 overlies the image 604 and the reference image 606.
  • One advantage of including the reference grid 608 is that the grid 608 can be printed with the image 600. This allows the medical professional to more easily visually examine the image 604 to identify obvious changes.
  • Another advantage of the reference grid 608 is that it allows the medical professional to more accurately identify, describe, and even communicate respective changes of the suspect area 302 by referring to various quadrants or blocks of the reference grid, which can be colored or coded to indicate regions where substantial change has occurred.
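Overlaying such a grid is straightforward with standard drawing primitives; the sketch below is an illustrative assumption, not the disclosed pre-processing code, and simply draws evenly spaced lines over a copy of the image:

```python
import cv2

def overlay_reference_grid(image, spacing=50, color=(0, 255, 0)):
    """Return a copy of `image` with a grid drawn every `spacing` pixels."""
    out = image.copy()
    h, w = out.shape[:2]
    for x in range(0, w, spacing):                  # vertical lines
        cv2.line(out, (x, 0), (x, h - 1), color, 1)
    for y in range(0, h, spacing):                  # horizontal lines
        cv2.line(out, (0, y), (w - 1, y), color, 1)
    return out
```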
  • Figure 7 shows a method 700 of acquiring a subsequent image of the suspect area 302 according to one embodiment.
  • Method 700 differs from the previous method 400 in that method 700 includes relocating the suspect area 302 and realigning the imaging device 10, 100, 200.
  • the suspect area 302 is relocated on the patient.
  • one of the purposes of the embodiments described herein is to track the changes of a suspect area 302. Because some patients may have many suspicious areas that are crowded together in one location or suspicious areas that rapidly change, it is important to relocate the exact area that is to be re-evaluated.
  • the suspect area 302 can be relocated by visually inspecting the patient, reviewing the patient's records, reviewing the position of a documented reference object and then measuring to obtain the position of the suspect area 302, relying on the computing system to do so automatically, or some combination thereof.
  • a reference marker 318 can be repositioned on the patient proximate to the suspect area 302, if necessary.
  • the patient is positioned at 706 and an imaging device 10, 100, 200 is reoriented and/or realigned relative to the suspect area 302.
  • an imaging device 10, 100, 200 used to acquire a subsequent image should be approximately matched to a distance and an angle of the imaging device 10, 100, 200 of a previous image.
  • the orientation of the imaging device 10, 100, 200 does not have to exactly match because a transformation algorithm can be used to account for some amount of deviation in the angle, the distance, and even the lighting.
  • a parameter (e.g., functional features such as zoom, contrast, etc.) of the imaging device 10, 100, 200 may be adjusted before the subsequent image is acquired.
  • a subsequent image is acquired that captures both the suspect area 302 and one of either the reference object 311 or the reference marker 318.
  • the subsequent image is electronically archived according to the archiving process described above.
  • the subsequent image may be pre-processed as described above and illustrated in Figure 6.
  • Figure 8A shows two images 600a, 600b undergoing a transformation according to one embodiment.
  • a first image 600a includes a first frame 602a enclosing a first reference image 606a and a first image 604a. The orientation of the first frame 602a results from the angle and position of the imaging device 10, 100, 200 when the image was acquired.
  • a second image 600b includes a second frame 602b enclosing a second reference image 606b and a second image 604b, wherein both the first image 604a and the second image 604b are images of the suspect area 302. It should be understood that the second image 600b may be skewed, of a different size, or otherwise misaligned with respect to the first image 600a.
  • the transformation algorithm is used to align, size, deskew, or otherwise manipulate the second reference image 606b to match the first reference image 606a as closely as possible. Moreover, any changes made to the second reference image 606b during the transformation process are made to the entire image 600b and everything enclosed within the image 600b. For example, if the reference image 606b is scaled down by ten percent, then the second frame 602b, the reference grid (not shown for clarity), and the second image 604b are also scaled down by ten percent. Further, it should be noted that reference images 606a and 606b need not be separate from images 604a and 604b, but can be points within or on images 604a and 604b that are considered by the computer algorithm during the fit analysis.
  • Figure 8B shows the same two images from Figure 8A about to be overlaid after the second image 600b has been transformed.
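One conventional way to perform such a transformation, offered here as a sketch under the assumption of matched reference points rather than as the patented method, is to estimate a similarity transform (rotation plus uniform scale plus translation) from reference-point pairs and warp the whole second image with it, so that every enclosed feature is transformed together:

```python
import cv2
import numpy as np

def register_second_image(second_img, pts_second, pts_first):
    """Warp `second_img` so its reference points land on the baseline's.
    `pts_second` / `pts_first`: matching Nx2 point arrays (N >= 2)."""
    src = np.asarray(pts_second, dtype=np.float32)
    dst = np.asarray(pts_first, dtype=np.float32)
    # 2x3 similarity transform (rotation + uniform scale + translation).
    M, _ = cv2.estimateAffinePartial2D(src, dst)
    h, w = second_img.shape[:2]
    return cv2.warpAffine(second_img, M, (w, h))  # applied to the whole frame
```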
  • Figure 9 shows a method 800 to compare subsequent images 600a, 600b taken of a suspect area 302 according to one illustrated embodiment.
  • This comparison can take place in a variety of settings, for example in the facility where the patient is treated or in a remote facility.
  • when done remotely, the comparison simply means that images of the suspect area 302 are forwarded to another location, where the comparison could be performed by a computer algorithm or by a third-party technician that specializes in performing the image comparisons.
  • the images can be transferred to the remote facility through any available means, for example over a computer network (private or the Internet), through a file transfer protocol (FTP) system, by courier or regular mail, with the images stored on a computer readable medium such as a compact disk, magnetic storage device, or other equivalent digital storage media.
  • the method 800 can commence with the first image 600a being compared to a second, subsequent image 600b, for example.
  • the images have been electronically archived, but they may not have been pre-processed.
  • the present embodiment is not limited to the comparison of only two images. It is appreciated and understood that multiple images can be simultaneously compared against a baseline image and/or relative to each other. For example, each image taken over the preceding six months could be simultaneously compared to a first image 600a taken the previous year.
  • an animation software program can be used to animate the changes in the suspect area 302 over time. For purposes of clarity and brevity, however, the comparison of only two images will be described below.
  • the electronically archived images are accessed from a database of images.
  • the images are pre-processed, if desired.
  • a user may select a baseline image 600a.
  • the baseline image could be a first acquired image, an intermediately acquired image, or an image taken during the patient's previous office visit. For purposes of detecting changes, it is not necessary, but may be helpful to select a baseline image.
  • a mapping algorithm can be used to determine key features, such as a border or perimeter of the reference images 606a, 606b.
  • a mapping algorithm may look for key points on the reference images 606a, 606b with respect to the reference grid 608 ( Figure 6) or may use reference images 606a and 606b as points on images 604a and 604b during the fit analysis.
  • reference images 606a and 606b should be understood to be images such as a landmark feature on the surface, or simply a reference point, pixel, or collection of pixels either separated from images 604a and 604b or on or within images 604a and 604b.
  • the term image should be construed to mean a point that can be captured in a digital form and used as a reference point.
  • a transformation algorithm is used to transform the second image 600b into a comparative posture with the first image 600a by using the reference images 606a, 606b.
  • the second image 600b is scaled up or down in size, rotated, skewed, or otherwise manipulated so that the reference image 606b is approximately the same size, same orientation, and in the same position within the frame 602b as the first reference image 606a is within frame 602a.
  • both images 600a, 600b can be scaled up or down in size, rotated, skewed, or otherwise manipulated so that the respective reference images 606a, 606b are approximately the same size, have the same orientation, and are in the same position with respect to the reference grid 608 ( Figure 6).
  • the reference images 606a and 606b are in fact contained within images 604a and 604b. Such reference images may correspond to points on the perimeter of images 604a and 604b that appear unchanged once the images are sized and overlaid. As one of ordinary skill in the art can readily appreciate, the more reference points taken into account during the fit analysis, the higher the quality of the comparison. Accordingly, in certain embodiments at least two reference points are considered; in other embodiments, at least 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, or more are utilized by the algorithm. At 812, after the reference images 606a, 606b have been sufficiently matched during the transformation process, a comparison algorithm is used to evaluate and compare the respective images 604a, 604b.
  • Key points or parameters are identified in both the first image 604a and the second image 604b.
  • the overall area, the perimeter or border length, the percentage change in size in a given quadrant, etc. are just some of the parameters that can be evaluated in each respective image 604a, 604b.
  • a color, contrast, and/or a depth of each of the respective images 604a, 604b can be determined. Balancing the color, brightness, and/or the contrast of the respective images is described below.
  • the features that are to be evaluated can be selected by a user or can be selected automatically.
  • the images 604a, 604b are compared with respect to one another to identify differences between the evaluated features. For example, the areas or perimeter lengths of the respective images 604a, 604b can be compared. The differences may be subtle, like slight changes in color or they may be substantial like a greatly enlarged area of the second image 604b.
  • any identified differences are further compared to determine if a threshold is exceeded.
  • the threshold could be a one, two, three, four, five, six, seven, eight, nine, ten, fifteen, twenty, or twenty-five percent increase in the area of the second image 604b compared to the area of the first image 604a.
  • a notification is provided that no noteworthy changes of the image 604b were detected.
  • a notification is provided that noteworthy changes of the image 604b were detected.
  • the threshold can be a user defined setting, a preprogrammed setting, or an automatically adjustable range depending on the image quality and resolution, for example.
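A minimal version of this threshold test, with the percentage threshold treated as a user-defined setting (the function name and default value are illustrative assumptions, not the disclosed implementation):

```python
def change_exceeds_threshold(area_first, area_second, threshold_pct=10.0):
    """Report whether the suspect area changed by more than threshold_pct."""
    change_pct = abs(area_second - area_first) / area_first * 100.0
    return change_pct > threshold_pct, change_pct

exceeded, pct = change_exceeds_threshold(1200.0, 1380.0)
print(exceeded, round(pct, 1))   # True 15.0 -> noteworthy change detected
```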
  • data is provided detailing the specific changes, for example, shape, color, texture, a shift in position, etc.
  • the data and/or the results obtained from the comparison can be made available to the medical professional in a short amount of time to enable the medical professional to make a more objective, informed diagnosis and to quickly formulate a treatment plan.
  • a post-processing algorithm can be used to overlay the respective images 600a, 600b on a screen.
  • Color-coding, animation techniques, and other graphic processing techniques can be used to identify areas or regions of greatest change.
  • images may be captured and compared using only a standard imaging device, which includes digital cameras, movie cameras, film cameras, etc., as long as the image to be compared is at some point moved to a digital format, thus allowing computational analysis thereon.
  • an image from a standard consumer model digital camera is compared against a subsequent image.
  • the computer algorithm used will perform a best fit or transformation of the images by modifying size, angle, and brightness, to obtain the best possible fit prior to analysis for changes.
  • users already having an archive of older images can compare these images. While the error rate for such comparison is slightly higher, the flexibility of being able to review older images far exceeds the risk of a few false positive outcomes that can be easily discounted by the user upon further review.
  • images taken with no focal length limiter, brightness, color, or contrast control can be compared with images having such controls.
  • Figures 10A, 10B, and 10C illustrate the color balancing of an image 900.
  • Figure 1OA shows a digital image 900a of a suspect area 902 and a background region 904 prior to color balancing.
  • Figure 10B shows the image 900b after a filter has been applied to filter out the background skin region 904 based on the color, brightness, and/or contrast of the suspect area 902 compared to the background skin region 904.
  • Figure 10C shows the image 900c after it has been preprocessed, which may include but is not limited the application of additional filters to remove other features in the image and/or the application of a reference grid 908 over the image 900c.
  • One advantage of color balancing is that it provides an additional parameter that can be compared from one image to the next, for instance the respective darkness or lightness of the respective images.
  • A second advantage of color balancing is that it permits a comparison between the suspect area and the surrounding skin as a means to more accurately detect suspect areas 302 over a larger skin surface.
  • Figure 11 shows a method 1000 of detecting suspect areas over a surface of skin 14 by means of image filtration (i.e., color or contrast balancing).
  • At 1002, at least several images over a large area of skin are acquired. The images may have overlapping sections to ensure that the entire skin area is imaged.
  • Reference markers 318 can be placed at various locations on the imaged skin area so that any potential suspect areas 302 that may be discovered can be relocated at a later time.
  • Each image is color balanced with respect to a reference color that appears in the acquired image.
  • Such a reference could include distance measurements, contrast standards, color standards, etc.
  • The reference color can be a color palette of a single color placed within the frame of the image when the image is acquired (Figure 2A).
  • The color palette can have a variety of shades or colors thereon. Because the reference colors are electronically isolatable, the color and/or shading of the image can be digitally adjusted until a feature in the image approximately matches a certain reference color (a minimal sketch of this adjustment follows this list).
  • A spot detection algorithm is used to process each of the respective images to detect any suspect areas 302 in one or more images.
  • A filter is applied to the image to make darker objects, such as a mole, stand out relative to the skin.
  • The type of filter used will depend on the amount of color contrast between the skin and the suspect area. By way of example, images of a light-skinned person with dark patches on the skin may not require filtering, whereas images of a dark-skinned person with moderately dark patches on the skin may require a series of filters to achieve enough contrast between the skin and the suspect area.
  • Notification of any potential suspect areas is provided to the medical professional.
  • The medical professional can then perform a refined evaluation of any potential suspect area by taking higher-resolution images of the suspect area and comparing these images over time, as described in detail above. Additionally or alternatively, the medical professional can perform or recommend a biopsy of the suspect area.
  • The computing system for performing the image comparisons may include a number of local computers for receiving downloaded images and at least one mainframe computer for performing the image comparisons. Alternatively, the image comparisons could be performed on the local computers.
  • The local computer typically includes a processor, memory, a multiplex ("Mux") card, video and Ethernet cards, a power supply, and an image acquisition card.
  • A number of local computers can be networked together to service a number of patient treatment facilities.
  • The local computer can communicate with other local computers and/or the mainframe computer over a communications link such as a local area network ("LAN") and/or a wide area network ("WAN").
  • The communications link can be wired and/or wireless.
  • The communications link can employ Internet or World Wide Web communications protocols, and can take the form of a proprietary extranet.
  • A user could obtain images and upload them to a web-based server that performs all the analysis and sends back to the user only the analysis, or only those analyses that yielded possible changes.
  • Alternatively, all algorithms could reside on the computer where the images are uploaded, or on a server directly connected thereto.
  • In all cases, patient confidentiality is maintained.
  • A user could upload all patient information into a database and also have the image analysis linked thereto.
  • A computer, whether a remote server or housed in the user's facility, could contain a database with a unique patient identifier. This identifier can be used to add new images to a patient folder, and the image analysis could be performed either immediately, while the user waits, or in the background. Subsequent to this analysis, a notification could be sent via email, secured web access, or the like indicating that the analysis has been completed and that either no action is necessary or further review/action may be required, thus indicating a change was noted between the images.
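The palette-based color adjustment described in this list can be sketched in a few lines of code. The following is a minimal illustration only, not the algorithm of this disclosure; it assumes the pixel bounds of the palette patch within the frame are already known, and it scales each color channel so that the patch matches its known reference value.

import numpy as np

def balance_to_reference(image, patch_box, reference_rgb):
    """Scale each color channel so that a palette patch captured in the
    frame matches its known true color (a simple per-channel balance).

    image         -- H x W x 3 array with values in [0, 255]
    patch_box     -- (row0, row1, col0, col1) bounds of the palette patch
    reference_rgb -- the patch's known color, e.g. (200.0, 200.0, 200.0)
    """
    img = np.asarray(image, dtype=np.float64)
    r0, r1, c0, c1 = patch_box
    observed = img[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
    gain = np.asarray(reference_rgb, dtype=np.float64) / observed
    return np.clip(img * gain, 0, 255)

Once images are balanced this way, their overall darkness or lightness becomes a meaningful parameter to compare from one image to the next, as noted in the list above.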

Abstract

A device (10) for acquiring first and subsequent images of a suspect area on a patient and methods for monitoring or detecting changes of the suspect area over time and providing notification when the changes exceed a threshold. The device may be an imaging device, such as a digital camera, possibly augmented with physical or optical devices for arranging the orientation and/or distance of the imaging device with respect to the suspect area. In addition, methods for identifying, relocating, acquiring a first and/or subsequent image of the suspect area, and performing a comparative analysis of respective images are also described. Results of the comparative analysis can be used to notify and/or assist a medical professional in treating or counseling the patient.

Description

DEVICES AND METHODS FOR IDENTIFYING AND MONITORING CHANGES OF A SUSPECT AREA ON A PATIENT
BACKGROUND OF THE INVENTION
Field of the Invention
This disclosure relates to devices and methods, which can be used alone or in combination, to identify and monitor changes of a suspect area on a patient, for example dermatological changes.
Description of the Related Art
There are many reasons why a medical professional, patient, or both would want to monitor changes on an exterior or internal surface of a patient. For a suspect area, especially one that may indicate some form of skin cancer, it is important to detect and treat the area in its early stages. One type of skin cancer is known as melanoma, which is a malignant cancer of the pigment cells (melanocytes). Other forms of skin cancer also exist and are known as basal and squamous cell cancers, which are tumors of unpigmented cells (keratinocytes) of the skin.
Melanocytes occur at various depths within the epidermal (upper) and dermal (lower) layers of skin. Melanocytes are normally distributed in the layers of the skin and produce pigment in response to being subjected to ultraviolet light (e.g., sunlight). Aggregated melanocytes are termed naevus cells and can be indicative of a melanoma.
Because a melanoma may appear as a mole, medical professionals typically attempt to ascertain whether the suspicious area is changing over time. Identifying changes early typically results in a rapid diagnosis, which in turn often leads to rapid and highly effective treatments that can greatly increase the patient's survival rate and, in most cases, lead to complete recovery.
Historically, the standard of care for screening or monitoring melanoma is a visual inspection or visual comparison of photographs by a medical professional. These visual inspections or comparisons are subjective and do not enable the medical professional to detect small or subtle changes in a suspect area. Changes in the perimeter, depth, shape, and even color of a melanoma can be subtle for a time and then progress rapidly. Moreover, changes in the color or perimeter, for example, of a melanoma are not easily discernable to the human eye, so these changes may go unnoticed for a long period of time.
In U.S. Patent No. 6,427,022, issued to Craine et al., a skin lesion is monitored by obtaining a series of digital baseline images over time and comparing these images. The method of comparison taught in Craine et al. is that the baseline image is compared visually by the viewer with a subsequently obtained image by alternately displaying the respective images in a blinking fashion. The blinking action is created by quickly alternating the images with respect to one another on a display monitor to enable the viewer to detect changes in the skin lesion.
Even though the standards of care discussed above may involve images, each standard of care suffers from the subjectivity and uncertainty associated with the medical professional trying to ascertain changes in a suspicious area by visual comparison. The visual comparison methods are subjective and less accurate for a number of reasons. For instance, the medical professional may be inexperienced, may have been distracted during the examination, or may have selected the wrong location on the patient's body during a follow-up examination.
Therefore, a more effective, less subjective, and low-cost approach for at least monitoring changes in a suspect area is desirable.
SUMMARY OF THE INVENTION
It should be understood that one aspect of the present invention is the comparison of a plurality (e.g., at least two) of images taken at different times utilizing a computer-based algorithm that can overlay two images and either transform the images to fit over each other or perform a best-fit analysis, thereby denoting or calling out any color, perimeter, and/or depth changes. In one embodiment the analysis can include transforming the images to match color, contrast, angle, focus (sharpness), and brightness, and subsequently comparing the multiple images with each other. Changes between the images may be called out in a variety of ways, including text reports, highlighting or color-coding the image itself, etc.
In another aspect a device or apparatus is contemplated that comprises a digital image capture device and a distance-measuring device. In certain embodiments the distance-measuring device measures the distance between the suspect area and the image capture device and provides a readout or a tone to signify the optimal distance. Further embodiments include the use of a reference, such as a strip of adhesive affixed to the surface or attached to the device, that provides contrast, color, sharpness, and/or depth references. Such an embodiment could include an adhesive strip having a color palette (e.g., one or more colors), a gray scale, distance references (hash marks), or depth references.
In certain embodiments of the invention distance measurements can be made by a sonic device, a laser, or any other means of measuring distance. In one particular embodiment an enclosed tube or housing is affixed to the imaging device and positioned over the suspect area. In one embodiment of such a device the housing or enclosure is essentially light free and may contain its own internal light source to provide consistent lighting of the suspect area. In a specific embodiment the enclosure is a tube of a fixed length having LEDs or fiber optics positioned inside. One end of such an enclosure may be fitted to the image capturing device and the other fitted over the suspect area.
In one aspect, an apparatus to acquire an image of a suspect area on a patient comprises an imaging device; and a separation tool having a first end connected with a first section, at least a portion of the first end formed to contact the patient, the first section having an attachment portion to receive the imaging device, the first section formed to maintain the imaging device at a substantially fixed distance from the suspect area.
In another aspect, an imaging assembly to image a suspect area on a patient comprises an imaging device and at least one sensor to indicate an orientation and/or distance of the imaging device relative to a first location.
In another aspect, an imaging assembly to image a suspect area on a patient comprises a housing; an imaging device located within the housing; and at least one sensor to indicate an orientation of the housing relative to a first location.
In yet another aspect, a method of acquiring an image of a suspect area includes identifying the suspect area; positioning a patient with the suspect area in approximately a first position; identifying a reference item located on the patient; determining a position of the suspect area in relationship to the reference item; aligning an imaging device to acquire the image of the suspect area; and acquiring the image after aligning the imaging device.
In yet another aspect, a method of comparing at least two images, each image capturing a suspect area, includes identifying a reference item in the at least two images; measuring an attribute of the reference item in a first image; transforming a second image based on the measured attribute of the reference item in the first image, wherein a reference item in the second image is transformed to correspond with an orientation and size of the reference item in the first image; measuring an attribute of the suspect areas in both images; and comparing the respective measured attributes of the respective suspect areas. Such reference items can be points either away from the suspect area or within the suspect area. Further, the measured attribute can be color, distance between at least two points, total perimeter, distances between multiple points, etc.
In one aspect the method of comparing two images comprises receiving at least two images digitally into a computer system, performing a fitting analysis on the at least two images to obtain an overlay, and providing an output noting any differences between the at least two images.
In yet another aspect, a method of acquiring an image of a suspect area includes identifying the suspect area; positioning a patient with the suspect area in approximately a first position; aligning an imaging device to acquire an image of the suspect area; and acquiring the image after aligning the imaging device.
In still yet another aspect, a method of comparing at least two images of a suspect area on a patient includes providing at least two digital images of the suspect area; digitally overlaying the at least two images; performing a best-fit transformation of one image to encourage the one image to approximately correspond to at least one detected attribute of the other acquired image; comparing the at least two images to determine whether a difference exists between an aspect of the one image when compared with the same aspect of the other image.
In an even further aspect the present invention can be used in the context of full-body imaging, wherein one or more digital or other image capture devices are placed around the patient and the full body is imaged either in a piece-by-piece manner or in its entirety. These images can then be compared by transformation or best-fit analysis and analyzed for any changes by a computer algorithm.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements, and have been solely selected for ease of recognition in the drawings.
Figure 1A is a front, left isometric view of an imaging device according to one illustrated embodiment positioned with respect to a portion of skin. Figure 1B is a side view of a portion of a guide of the imaging device of Figure 1A having measurement markers and a palette according to one illustrated embodiment.
Figure 2A is a partially exploded, front, left, isometric view of an imaging device according to another illustrated embodiment. Figure 2B is a front, left isometric view of an intermediate bracket according to one illustrated embodiment.
Figure 3 is a front, left, isometric view of an imaging device according to another illustrated embodiment.
Figure 4 is an elevational view of a hand having several reference points for locating a suspect area according to one illustrated embodiment.
Figure 5A is a flowchart of a method of identifying a suspect area according to one illustrated embodiment.
Figure 5B is a continuation of the flowchart of Figure 5A.
Figure 6 is a top plan view of an image after the image has been pre-processed according to one illustrated embodiment.
Figure 7 is a flowchart of a method of acquiring a subsequent image of a suspect area according to one illustrated embodiment.
Figure 8A is a left, front isometric view of a first image and a second image, each having an initial and respectively different orientation and size according to one illustrated embodiment. Figure 8B is a top plan view of the first image and the second image of Figure 8A transformed to have approximately the same respective orientation and size.
Figure 9 is a flowchart of a method of comparing at least two images of a suspect area according to one illustrated embodiment.
Figures 10A - 10C are images of suspect areas illustrating the various stages of the color balancing method of Figure 11 according to one illustrated embodiment.
Figure 11 is a flowchart of a method of color balancing an image to detect a potential suspect area according to one illustrated embodiment.
DETAILED DESCRIPTION
In the following description, certain specific details are set forth in order to provide a thorough understanding of various embodiments of the disclosed subject matter. However, one skilled in the art will understand that the embodiments may be practiced without these details. In other instances, well-known structures associated with imaging systems, computing systems and processors, and various techniques for manipulating and evaluating digital image data have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments. Unless the context requires otherwise, throughout the specification and claims which follow, the word "comprise" and variations thereof, such as, "comprises" and "comprising" are to be construed in an open, inclusive sense, that is as "including, but not limited to."
Unless the context requires otherwise, throughout the specification and claims that follow, the term "patient" refers primarily to warm blooded mammals and is not limited to human beings, but could include animals such as dogs, cats, horses, cows, pigs, higher and lower primates, etc.
The headings provided herein are for convenience only and do not interpret the scope or meaning of the claimed invention. The embodiments disclosed herein are generally directed to acquiring images of a suspect area located on a patient, comparing the acquired images to one another; and evaluating the compared images to determine if some amount of change from one image to a subsequent image warrants a more detailed examination by a medical care professional. The embodiments disclose a number of different devices and methods for achieving such results.
The surface of interest can be either internal or external to the patient. In one instance, the surface of interest is the patient's exposed skin that is monitored for the detection or growth of skin cancer. In another instance, the surface of interest can be the patient's mucous membranes, interior body surfaces related to reproductive and/or digestive systems of the patient, ocular surfaces, or any other accessible surface on a patient. For purposes of this description, the surface of interest will be exemplified as an area on the patient's skin, referred to as a suspect area. However, this exemplification is not meant to limit or otherwise narrow the scope of the description, the claims, or any specific embodiment depicted herein.
The suspect area referred to herein can be the site of a suspected melanoma or mole (e.g., melanin-containing areas to be monitored), but can also be any other suspect area on a patient that needs to be monitored. Thus, it is within the scope of this disclosure that the suspect area can be located in a variety of places on a patient, for example the patient's mucous membranes, surfaces of interior body cavities related to reproductive and/or digestive systems, ocular surfaces, or any other interior or exterior surface on a patient where monitoring is desired. In the exemplary embodiment used for discussion purposes, the suspect area can be a dermal feature, such as a type of skin cancer, a skin lesion, a skin rash, a burn or scar, an infected or inflamed area, a wound, or some other skin anomaly that may or may not be capable of growth, reduction, or other change. For example, one embodiment may monitor a healing rate (i.e., recession) of a burn or scar when certain medications, lotions, or creams are applied to the skin. A further embodiment envisions utilizing such technology to monitor the effectiveness of a drug or nutriceutical, such as those that heal the skin. Another embodiment may monitor a patient's scalp for hair loss and/or growth. In addition, the embodiments disclosed herein may be used in a number of settings, such as a home setting, a clinical setting, a laboratory or research setting, a regulatory compliance setting, or any combination of the above.
Devices and Systems to Acquire an Image of a Suspect Area
Figures 1A-3 show three different embodiments of a device to acquire an image of a suspect area. Each of the devices differs in its degree of complexity, accuracy, and cost. It is contemplated that many, if not all, of the features or aspects of one device can be incorporated into the other devices. Figure 1A shows a first imaging device 10 for imaging a suspect area 12 on skin 14 according to the illustrated embodiment. The first imaging device 10 includes a housing 16 and a lens 18 to receive and direct light to imaging components (not shown) located within the housing 16. By locating the imaging components in the housing 16, damage and/or exposure of the imaging components may be prevented. The housing 16 can have a handle 20 to permit the housing 16 to be lifted, moved, positioned, or otherwise manipulated. Additionally or alternatively, the handle 20 and/or other portions of the housing 16 can be configured with support locations so that the imaging device 10 can be secured to a tripod, for example.
The imaging components may take the form of a camera or an optical scanner operable to capture images of the suspect area 12. In one embodiment, the camera may advantageously take the form of a digital image capture device such as a CCD or CMOS type camera. A CCD camera may consist of one-dimensional or two-dimensional arrays of charge coupled devices ("CCD") and suitable optics, such as optical lenses, for focusing an image on the CCD array. CCD arrays can capture whole images at a time, or can be electronically controlled to successively sample (e.g., pixel-by-pixel, row-by-row, or column-by-column) the information on a region of the skin 14 (i.e., electronically scan). Alternatively, the imaging components can take the form of a CMOS imager capable of capturing one-dimensional or two-dimensional arrays similar to that of a CCD reader.
Employing a digital image capture device advantageously provides the image in a form suitable for use with a data processing system such as a computing system. Alternatively, the camera may take the form of a non-digital image capture device, such as a film camera. Such embodiments may employ image scanners, or other devices to digitize the images captured on film. The imaging device 10 may advantageously take the form of a still image capture device. Alternatively, the imaging device 10 may take the form of a motion picture capture device such as a movie camera or video camera. Such embodiments may include a frame grabber or other device to capture single images.
The imaging device 10 may rely on ambient light, or may include one or more light sources, such as light emitting diodes ("LEDs") or incandescent lights, which may be manually or automatically controlled.
A guide 22 is attachable to the housing 16 of the imaging device 10. The guide 22 is configured so that the housing 16 can be placed at a desired distance away from the skin 14 along a Z-axis, perpendicular to an X-Y plane when an image is acquired. In the illustrated embodiment, the guide 22 includes an extension member 24 having a first end 26 that is coupled to the housing 16 of the imaging device 10. A second end 28 is coupled to a contact member 30. The contact member 30 can include a number of features to enhance the control and/or optimization of the imaging device 10. For example, as illustrated in Figure 1B, the contact member 30 of the guide 22 can include measurement markings 30a, similar to those of a ruler, and/or contrast markings 30b, which can represent a color or grayscale palette. In one embodiment, the extension member 24 includes adjustable, complementary sliding members with gradations 25 to allow the housing 16 to be placed at a desired distance from the suspect area 12. In another embodiment, the extension member 24 is formed to be non-adjustable, thus the housing 16 is set at a fixed length from the contact member 30. The contact member 30 may be shaped (e.g., arc-shaped) to provide an unobstructed line of sight between the imaging device 10 and the suspect area 12. As will be discussed in more detail below, it may be desirable that the contact member 30 be shaped such that at least a portion of the contact member 30 can be captured in the acquired image. A skin contact region 31 of the contact member 30 can be padded to provide a more comfortable interaction with the patient. One skilled in the art will understand and appreciate that the first end 26 can be coupled to the housing 16 by any number of mechanical methods, for example fasteners, clips, VELCRO®, adhesive bonding, tie down straps, or some other structure that substantially keeps the housing 16 attached to the extension member 24.
In the illustrated embodiment, the imaging device 10 can include at least one sensor 32 for determining an orientation of the device 10. One advantage of determining the orientation of the device 10 is to provide for repetitive and consistent images in an X-Y plane, especially if the images are acquired at different times. For example, if a second image is being acquired of the suspect area 12, but the imaging device 10 is tilted or rotated at too much of an angle, the second image may be too distorted or misaligned to digitally process or compare to a previously acquired image. A variety of sensors 32 can be used to indicate the orientation of the imaging device 10. In the illustrated embodiments, the sensor 32 is a fluid level encompassed in the housing 16 and visible by a user of the imaging device 10. The sensor 32 may also be integrated with the guide 22. The fluid level sensor 32 generally indicates whether the imaging device 10 is tilted relative to the ground. Additionally or alternatively, a gyroscope, which is sometimes referred to as a tilt sensor, can be used to determine an acceleration of the imaging device 10 about at least one axis.
In addition to or instead of sensing the orientation of the imaging device 10, other sensors 34 can be used to determine the proximity of the imaging device 10 in relation to the skin 14 of the patient. In one embodiment, a pressure sensor 34 is located in the contact member 30 to sense the pressure exerted on the contact member 30 as it is positioned against the skin 14 of the patient. By sensing the pressure that the imaging device 10 is pressed against the skin 14, the imaging device 10 can be repetitively and accurately repositioned relative to the suspect area 12 from one image to the next. Additionally or alternatively, a proximity sensor 35 can be used to detect when the contact member 30 is at a desired distance from the patient, to include when the contact member 30 barely makes contact with the patient. Each of the sensors 32, 34, and/or 35, described above, as well as equivalent sensors, can be electronically coupled with the imaging device 10 to provide an indication that the imaging device 10 is at the desired orientation or distance. For example, the imaging device 10 can have an indicator 36 that sends a visual and/or audio signal to indicate when the imaging device 10 is at the desired orientation, distance, and/or when an amount of pressure is present between the contact member 30 and the patient. A signal from the indicator 36 would indicate that the image could be acquired at that moment in time. Likewise, the sensors 32, 34, and/or 35 can also be electronically coupled with a processor (not shown) to computationally update the orientation and/or proximity to the patient of the imaging device 10. In one embodiment, the processed information can be displayed on a screen (not shown) located on the housing 16.
Figure 2A shows a second imaging device 100 that includes a camera 102 and a member 104 for receiving and coupling the camera 102 to an extension member 106. Similar to the extension member discussed above, the extension member 106 includes a contact member 108. In addition, the extension member 106 further includes detents 110 sized and configured to complementarily receive the member 104, a sensor 112 to indicate the orientation of the extension member 106 about a Roll axis 114, a Pitch axis 116, and/or a Yaw axis 118, and a color palette 120 that can be used to provide color and/or contrast balancing once the image is acquired and archived. Color and/or contrast balancing are described in more detail below.
The camera 102 can be a digital camera as described in the previous embodiment or a film camera that includes at least a lens 122, a camera body 124, and an image trigger 126. The camera can capture images on photographic film (not shown), which can be standard photographic film that is purchased in a store and is configured to be chemically processed in a photo lab after it has been exposed to light. Alternatively, the photographic film may be specialized film, such as film that is sensitive to the non-visible portions of the electromagnetic spectrum, such as infrared or ultraviolet sensitive films. In the illustrated embodiment, the member 104 includes a compartment 128 that is sized to receive the camera 102 and a pair of flanges 130 formed to couple to the extension member 106. A front portion of the compartment 128 does not obstruct the lens 122 of the camera 102 when the camera 102 is seated in the compartment 128. The camera 102 can be secured to the member 104 by virtue of the compartment 128 being sized to provide a tight or snug fit for the camera body 124. Alternatively, hook and loop fastener pads, commonly available under the trademark VELCRO®, can be provided to keep the camera 102 relatively secure in the compartment 128. One skilled in the art will appreciate and understand that securing the camera 102 in the compartment 128 can be accomplished in a variety of known ways. The flanges 130 are further formed to complementarily engage the detents 110 provided on the extension member 106. In the illustrated embodiment, the flanges 130 include rounded, depressible buttons 132. Sliding a first end 134 of the extension member 106 in between the flanges 130 and permitting the buttons 132 to click into the detents 110, so that the contact member 108 is at a desired distance from the camera 102, accomplishes the assembly of the member 104 with the extension member 106.
Figure 2B shows a different embodiment of a member 104 without a compartment. Instead, a bonding strip 136 is provided on a base 138 of the member 104. Each side 140, extending from the base 138, can be resiliently biased to form a snug fit with the camera 102. The bonding strip 136 can be a pad of hook and loop fastener, a tacky substance, or other equivalent object or substance.
Figure 3 shows an automated imaging device 200 according to another embodiment. Many of the aspects of the imaging device 200 are similar to the aspects described in the previous embodiments, for example a housing 202, an imager 204, a handle 206, and a sensor 208. One difference between the imaging device 200 and the previously described devices 10, 100 is that the present embodiment does not employ an extension member. In lieu of the extension member, a second sensor or range finder 210 is used to indicate the distance between the imager 204 and a suspect area 212.
In one embodiment, the range finder 210 is a laser triangulation sensor that provides non-contact linear displacement measurements of the suspect area 212 on the skin 214. A laser beam (e.g., from a semiconductor laser) is reflected off the skin 214. A returning beam is received and focused onto a CCD sensing array (not shown) of the imager 204. The CCD array detects the peak value of the light and determines the distance of the skin 214 based on the position of the beam spot. The range finder 210 produces an analog voltage that is proportional to the distance of the skin 214 from the range finder 210.
As an alternative to the above embodiment, the range finder 210 can be a laser interferometer, an ultrasonic sensor, or an equivalent sensor to measure the linear distance of the dermal suspect area 212 to the range finder 210. Laser interferometers use the length of a wave of light as the unit for measuring position and consist of three basic components: a laser that supplies a monochromatic light beam; optics that direct the beam and generate an interference pattern; and electronics that detect and count the light and dark interference fringes and output the distance information. Ultrasonic sensors offer another means of making non-contact distance measurements. An ultrasonic sensor works by measuring the time it takes a sound wave to propagate from the range finder 210 to an object and back to the range finder 210. In the illustrated embodiment, the skin 214 would reflect the ultrasonic waves generated by a transmitter and then a receiver would detect the returning waves. The elapsed time from initial transmission to reception of the returning waves is used to determine the distance to the skin 214.
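As a concrete illustration of the two non-contact distance computations just described, consider the sketch below. It is a sketch only: the speed of sound is the standard value for dry air at room temperature, and the triangulation gain and offset are hypothetical calibration constants.

SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at roughly 20 degrees Celsius

def ultrasonic_distance_m(round_trip_time_s):
    # The pulse travels from the range finder to the skin and back,
    # so the one-way distance is half the total path length.
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

def triangulation_distance_m(analog_voltage_v, volts_per_meter=2.0, offset_m=0.05):
    # The range finder's analog output is proportional to distance;
    # the gain and offset here are hypothetical calibration constants.
    return analog_voltage_v / volts_per_meter + offset_m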
The monochromatic light used to illuminate at least the suspect area 212 during imaging can have a wavelength outside of the visible portion of the light spectrum. For example, the monochromatic light can be in a frequency range of ultraviolet light, infrared light, or some other non-visible range along the light spectrum.
In yet another embodiment, the imaging device 200 is a camera. Again, the imaging device 200 may advantageously take the form of a still image capture device or a motion picture capture device such as a movie camera or video camera. In the present embodiment, the alignment of the imaging device 200 is accomplished manually without the aid of an extension member or sensor. The acquired images are compared in a best-fit analysis. The best-fit analysis includes digitizing the images and matching key points or parameters of a first image onto similar key points or parameters of a second image. For example, the perimeter or border of the first image can be matched to the perimeter or border of the second image. It is appreciated that in one embodiment the first image and the second image are of the same suspect area and the best-fit analysis is employed to detect changes, if any, of the suspect area over time without using other reference points and/or markers to align and/or orient the imaging device 200 relative to the suspect area. The analysis software can transform, rotate, and otherwise manipulate at least one of the images until enough similarities are found between the two compared images to verify that both images are of the same suspect area or are possibly not of the same suspect area. The best-fit analysis is described in more detail below in the discussion on image comparison; a minimal sketch of one such approach follows.
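One plausible realization of the best-fit analysis described above uses off-the-shelf feature matching. The sketch below assumes the OpenCV and NumPy libraries and two grayscale images as input: ORB keypoints are matched between the images and a similarity transform (rotation, scale, translation) is estimated, returning None when too few matches are found, which may indicate the images are not of the same suspect area. It is offered only as an illustration, not as the specific analysis software of this disclosure.

import cv2
import numpy as np

def estimate_best_fit(img1, img2, min_matches=10):
    """Match keypoints between two grayscale images and estimate the
    2x3 similarity transform mapping img2 onto img1, or None."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return None

    # Cross-checked Hamming matching suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None

    src = np.float32([kp2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC discards outlier matches while estimating the transform.
    transform, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return transform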
Devices and Systems to Acquire Images of Larger Surfaces
In one embodiment, a system is capable of contemporaneously acquiring images over a variety of locations on a patient. The system can capture images over a larger surface area or can take multiple images over a larger area where the multiple images can be digitally overlaid and matched to form a large image.
Methods of Identifying and Re-Locating a Suspect Area
Figure 4 shows a patient's hand 300 according to one embodiment. By way of example, a suspect area 302 (e.g., a grouping of melanoma cells) appears on a backside surface 304 of the patient's hand 300. Two spots are identifiable on the backside surface 304: a first spot 306 (e.g., a scar) is located on the ring finger 308 and a second spot 310 (e.g., a freckle) is located on the wrist 312 of the patient's hand 300. The spots 306, 310, respectively, can be freckles, birthmarks, borders of a limb, or some other equivalent feature or landmark that is not susceptible to substantial changes in shape, size, and/or location with respect to its present location on the patient. Further, the spot 306 or 310 may be naturally occurring, such as a freckle, or the spot 306 or 310 may be a portion of a scar, a tattoo, or some other feature that is not susceptible to substantial changes in shape, size, and/or location with respect to its present location on the patient. One advantage of locating at least one of the spots 306 or 310 on the patient is to use one of the spots 306 or 310 as a reference object 311. The reference object 311, which is equivalent to spot 310 in the illustrated embodiment, provides a starting point from which other key measurements can be taken, as explained in the method below. However, it is appreciated that a reference object 311 is not always necessary when the suspect area can be easily relocated. For example, a group of melanoma cells that is easily and routinely detectable on a patient could be imaged and re-imaged, especially when the images are compared using a best-fit analysis.
The selection of the spots 306, 310 is generally left to the discretion of the medical professional, and it is contemplated that the medical professional will select the spot 310 that is the most stable or least susceptible to change over time. Optionally, the imaging software may also select the spots 306, 310. Although the first spot 306 may have a stable configuration, such as the scar, the location of the spot 306 on the ring finger 308 makes it less attractive as a reference object 311 because the ring finger 308 is easily moveable in relation to the hand 300, which can add error in repetitive measurements taken with respect to the spot 306 on the ring finger 308. In contrast, the second spot 310, shown on the wrist 312, has a more fixed relationship with respect to the suspect area 302 and thus may be a better reference object 311 from which to measure and document the location of the suspect area 302.
It is also advantageous if the reference object 311 or at least a reference marker 318 is located proximate to the suspect area 302 so that the reference object 311 or reference marker 318 can be captured in an image of the suspect area 302. In accordance with the embodiments herein and described in more detail below, it is desirable to manipulate an image by matching or overlaying either the reference objects 311 or reference markers 318 that appear in different images, taken at different times. In the illustrated embodiment, the reference marker 318 has a defined size and shape and can be placed quite near the suspect area 302. The advantage, however, of locating the reference object 311 remains unchanged because the placement of the reference marker 318 on the patient is made relative to the location of the reference object 311 on the patient.
Methods of Acquiring a First Image of a Suspect Area
Figure 5 is a flowchart illustrating a method 400 to identify and take an image of a suspect area 302 on a patient. For clarity and ease of explanation, the method 400 is described in reference to Figure 4. One aspect of locating the suspect area 302 is to accurately identify, map, and document the reference object 311, the marker 318, if needed, and the suspect area 302 for image comparison purposes as described in greater detail below.
Figure 5 shows that the method 400 commences at 402 where the suspect area is identified on the patient. A medical professional, the patient, the computing system, or some other entity may identify the suspect area. Identifying the suspect area most often will be done visually, but it is understood that other approaches may be used, such as the sense of touch.
At 404, a reference object is identified on the patient. At 406, the medical professional determines whether the reference object will be within a first field-of-view or first frame 314 (Figure 4) of an imaging device. Recall that it is desirable to have the reference object 311 within the first frame 314 because multiple images of the suspect area 302 will be compared to one another. In one embodiment, the first frame 314 is sized to provide an amount of resolution of the suspect area 302 that will be adequate for detailed image processing and evaluation.
If the reference object 311 is advantageously within the first frame 314 of the imaging device, then at 408, a position of the suspect area 302 is determined relative to the reference object 311. In one embodiment, a Cartesian coordinate system (X, Y) having perpendicular axes is used to determine the position of the suspect area 302 relative to the reference object 311. In another embodiment, a polar coordinate system (r, θ) is used. By way of the exemplary embodiment illustrated in Figure 4, which employs the Cartesian coordinate system, the reference object 311 is assigned coordinates (0, 0) and the suspect area 302, as measured from the reference object 311, is determined to have coordinates of (a, b) at a point on the suspect area 302 that represents an approximate center point of the suspect area 302. If a smaller image frame 316 (Figure 4) is necessary, for example to get an image with higher resolution, and the reference object 311 is located outside of the smaller image frame 316, then at 410, the reference marker 318 (Figure 4) is placed in proximity to the suspect area 302. At 412, a position of the reference marker 318 is determined relative to the reference object 311, for example the position of the reference marker 318 is determined to have coordinates (c, d). Next, the position of the suspect area 302 is determined relative to the reference marker 318 and, by way of example, has coordinates (e, f). In one embodiment, the reference marker 318 is a pen mark on the patient. In another embodiment, the reference marker 318 is a small patch or sticker backed with an adhesive. The shape of the patch is customized so that the reference marker 318 can be placed in a desired orientation during successive examinations of the patient. In the illustrated embodiment of Figure 4, the reference marker 318 includes a pointed region that points towards the finger tips and parallel sides that substantially align with the sides of the patient's arm. One skilled in the art will understand and appreciate that the reference marker 318 can have a variety of shapes, sizes, colors, contrast features, textures, and can even have features, like a center dot, to identify an exact starting point for measurements. Additionally or alternatively, human-readable and/or machine-readable indicia can be encoded on the reference marker 318.
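The coordinate bookkeeping above amounts to simple vector addition, as the minimal sketch below illustrates; the coordinate values are hypothetical.

# Positions measured relative to the reference object 311 at (0, 0).
marker_offset = (3.0, -1.5)   # (c, d): marker 318 relative to object 311
suspect_offset = (0.8, 0.4)   # (e, f): suspect area 302 relative to marker 318

# The suspect area's position relative to the reference object is the
# component-wise sum of the two offsets.
suspect_area = (marker_offset[0] + suspect_offset[0],
                marker_offset[1] + suspect_offset[1])
print(suspect_area)  # (3.8, -1.1)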
It should be understood that in certain aspects of the invention no reference object is utilized per se, and the computer algorithm performs a best fit on two images using only the suspect area or multiple points within the picture frame to make a transformation or best-fit analysis. While user-applied reference markers or the use of anatomical features as reference points are useful and may lead to higher-quality results in certain scenarios, they are by no means required and thus should be considered optional embodiments. In addition, when utilizing such analysis in certain embodiments a computer algorithm can take points on the perimeter of the lesion or suspect area of a first image, perform multiple measurements between any two or more points, and compare such measurements with those of a second image. The first and second images may be resized, skewed, color balanced, and/or brightness adjusted to assist in attempting to fit one image to the other. When measurements between points in the first image and measurements between points in the second image are substantially the same, the images can be considered compared, and any deviations outside the error for such comparisons can be noted as a possible change for the user or medical professional to review.
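A minimal sketch of the point-to-point comparison just described follows. It assumes the perimeter points of the two lesion images have already been extracted in corresponding order after any resizing or skewing, and the tolerance value is hypothetical.

import itertools
import math

def pairwise_distances(points):
    # All point-to-point distances, in a consistent order so that two
    # sets of perimeter points can be compared measurement by measurement.
    return [math.dist(p, q) for p, q in itertools.combinations(points, 2)]

def perimeters_match(points_a, points_b, tolerance=0.05):
    """True if every corresponding pairwise distance agrees to within the
    given relative tolerance; deviations beyond it can be flagged as a
    possible change for the user or medical professional to review."""
    da, db = pairwise_distances(points_a), pairwise_distances(points_b)
    if len(da) != len(db):
        return False
    return all(abs(a - b) <= tolerance * max(a, b, 1e-9)
               for a, b in zip(da, db))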
The coordinates of the reference object 311, the reference marker 318, if needed, and the suspect area 302 can be recorded and/or documented on paper or via an electronic medium, for example by entering the data into a computer. In addition, descriptions of these features can also be recorded and/or documented. It is understood that the recordation and/or documentation can be accomplished in a number of known ways, which may be through manual, automatic, paper, or paperless means. At 414, an imaging device 10, 100, 200 is positioned to take an image of the suspect area 302. Because the images will be digitally processed, it is desirable to position the imaging device 10, 100, 200 in a repeatable manner with respect to the suspect area 302. Depending on the type of method used to compare images and the quality of the images, it may be desirable that the distance and the angle of the imaging device 10, 100, 200 with respect to the suspect area 302 be kept substantially constant from one image to the next. However, it is appreciated and understood that the distance and angle of the imaging device 10, 100, 200 with respect to the suspect area 302 can vary by a significant amount from one image to the next, and the analysis/comparison software can use a best-fit analysis to substantially match and align the respective images. In addition, it may also be desirable to maintain constant lighting, at least within the frame of the image, in order to more easily detect color changes and/or shape changes of the suspect area 302 when the images are electronically processed and compared.
In another embodiment, the imaging device 10, 100, 200 can include a stereo camera to add depth perception to the resulting image. Stereo cameras that are placed at a constant distance from each other could provide two images, one from each camera, of the suspect area 302. When the images are compared against each other, the depth and/or texture of the suspect area 302 can be determined.
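Depth recovery from a calibrated stereo pair follows the standard pinhole relationship depth = focal length x baseline / disparity. The short sketch below uses hypothetical calibration values.

def stereo_depth_mm(disparity_px, focal_length_px=1400.0, baseline_mm=60.0):
    # Standard stereo relation: depth = f * B / d. The focal length and
    # camera baseline here are hypothetical calibration values.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px

# A feature shifted 42 pixels between the two views lies about 2 m away:
print(stereo_depth_mm(42.0))  # 2000.0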
Optionally, at 416, a parameter on the imaging device 10, 100, 200 may be adjusted to enhance a quality of the image. For example, light filters or color filters can be coupled with the imaging device 10, 100, 200 as a way to control the light within the frame of the image. Additionally or alternatively, the imaging device 10, 100, 200 can be focused to obtain a desired resolution, thus increasing or decreasing the frame size of the image to be acquired.
At 418, a first image is acquired that captures the suspect area 302 alone or the suspect area with one of either the reference object 311 or the reference marker 318. At 420, the first image is electronically archived. During the archiving process, the first image can be given an identifier such as a file name, label, number, date stamp, or some other association that makes it easy to re-locate the first image in a database. The electronic format of the first image can be archived as any number of common graphics formats such as *.jpg, *.tif, *.bmp, *.gif, or another equivalent format that is readable by a standard computer system. Optionally, at 422, the first image may be preprocessed. Preprocessing the first image may include, but is not limited to, identifying the reference object 311 and/or reference marker 318 in the image; detecting, mapping, and computing the border of the object 311 and/or marker 318; detecting, mapping, and computing the border of the suspect area 302; and/or overlaying a reference grid 608 onto the image as shown in Figure 6, according to one illustrated, exemplary embodiment.
Figure 6 shows an image 600 having an image frame 602. Captured within the image frame 602 is an image 604 (i.e., an image of the suspect area 302) and a reference image 606 (i.e., an image of either one of the reference object 311 or the reference marker 318) located nearby or within image 604. The reference grid 608 overlies the image 604 and the reference image 606. One advantage of including the reference grid 608 is that the grid 608 can be printed with the image 600. This allows the medical professional to more easily visually examine the image 604 to identify obvious changes. Another advantage of the reference grid 608 is that it allows the medical professional to more accurately identify, describe, and even communicate respective changes of the suspect area 302 by referring to various quadrants or blocks of the reference grid, which can be colored or coded to indicate regions where substantial change has occurred.
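Overlaying a reference grid such as grid 608 is straightforward with common imaging libraries. The sketch below assumes OpenCV; the grid spacing and color are arbitrary choices.

import cv2

def overlay_reference_grid(image, spacing_px=50, color=(0, 255, 0)):
    """Draw evenly spaced horizontal and vertical grid lines over an
    image (H x W x 3, BGR) so that printed copies can be visually
    examined quadrant by quadrant."""
    h, w = image.shape[:2]
    for x in range(0, w, spacing_px):
        cv2.line(image, (x, 0), (x, h - 1), color, 1)
    for y in range(0, h, spacing_px):
        cv2.line(image, (0, y), (w - 1, y), color, 1)
    return image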
Methods of Acquiring a Subsequent Image of a Suspect Area
Figure 7 shows a method 700 of acquiring a subsequent image of the suspect area 302 according to one embodiment. Method 700 differs from the previous method 400 in that method 700 includes relocating the suspect area 302 and realigning the imaging device 10, 100, 200. At 702, the suspect area 302 is relocated on the patient. As explained above, one of the purposes of the embodiments described herein is to track the changes of a suspect area 302. Because some patients may have many suspicious areas that are crowded together in one location or suspicious areas that rapidly change, it is important to relocate the exact area that is to be re-evaluated.
The suspect area 302 can be relocated by visually inspecting the patient, by reviewing the patient's records, by reviewing the position of a documented reference object and then measuring to obtain the position of the suspect area 302, by automated means using the computing system, or by some combination thereof. At 704, a reference marker 318 can be repositioned on the patient proximate to the suspect area 302, if necessary.
The patient is positioned at 706 and an imaging device 10, 100, 200 is reoriented and/or realigned relative to the suspect area 302. Recall that a distance and an angle of the imaging device 10, 100, 200 used to acquire a subsequent image should be approximately matched to a distance and an angle of the imaging device 10, 100, 200 of a previous image. The orientation of the imaging device 10, 100, 200 does not have to exactly match because a transformation algorithm can be used to account for some amount of deviation in the angle, the distance, and even the lighting. At 708, a parameter (e.g., functional features such as zoom, contrast, etc.) on the imaging device 10, 100, 200 may be adjusted to enhance a quality of the image, if necessary.
At 710, a subsequent image is acquired that captures both the suspect area 302 and one of either the reference object 311 or the reference marker 318. At 712, the subsequent image is electronically archived according to the archiving process described above. Optionally, at 714, the subsequent image may be pre-processed as described above and illustrated in Figure 6.
Image Transformation
Figure 8A shows two images 600a, 600b undergoing a transformation according to one embodiment. A first image 600a includes a first frame 602a enclosing a first reference image 606a and a first image 604a. The orientation of the first frame 602a results from the angle and position of the imaging device 10, 100, 200 when the image was acquired. A second image 600b includes a second frame 602b enclosing a second reference image 606b and a second image 604b, wherein both the first image 604a and the second image 604b are images of the suspect area 302. It should be understood that the second reference image 606b may be skewed, of a different size, or otherwise misaligned with respect to the first reference image 606a. Thus, the transformation algorithm is used to align, size, deskew, or otherwise manipulate the second reference image 606b to match the first reference image 606a as closely as possible. Moreover, any changes made to the second reference image 606b during the transformation process are made to the entire image 600b and everything enclosed within the image 600b. For example, if the reference image 606b is scaled down by ten percent, then the second frame 602b, the reference grid (not shown for clarity), and the second image 604b are also scaled down by ten percent. Further, it should be noted that reference images 606a and 606b need not be separate from images 604a and 604b, but can be points within or on images 604a and 604b that are considered by the computer algorithm during the fit analysis.
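Because every change made during the transformation is applied to the entire image, a whole-frame warp is a natural implementation. The sketch below assumes OpenCV and a 2x3 similarity matrix such as the one produced by the earlier best-fit sketch; it is illustrative only.

import cv2

def apply_transform(image, transform_2x3, output_size):
    """Warp the entire second image (frame, grid, and suspect area
    alike) so that its reference image lines up with the reference
    image of the baseline frame; output_size is (width, height)."""
    return cv2.warpAffine(image, transform_2x3, output_size)

# Example: aligned = apply_transform(second_image, transform,
#                                    (first_image.shape[1], first_image.shape[0]))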
Figure 8B shows the same two images from Figure 8A about to be overlaid after the second image 600b has been transformed. Once the images 600a, 600b are overlaid with respect to one another, a comparison algorithm can be employed to detect, map, and document differences, if any, between the first image 604a and the second image 604b.
Methods of Comparing Images
Figure 9 shows a method 800 to compare subsequent images 600a, 600b taken of a suspect area 302 according to one illustrated embodiment. This comparison can take place in a variety of settings, for example in the facility where the patient is treated or in a remote facility. The comparison, when done remotely, simply means that images of the suspect area 302 are forwarded to another location, where the comparison could be performed by a computer algorithm or by a third-party technician that specializes in performing image comparisons. The images can be transferred to the remote facility through any available means, for example over a computer network (private or the Internet), through a file transfer protocol (FTP) system, or by courier or regular mail with the images stored on a computer-readable medium such as a compact disk, magnetic storage device, or other equivalent digital storage media. The method 800 can commence with the first image 600a being compared to a second, subsequent image 600b, for example. In the present embodiment, the images have been electronically archived, but they may not have been pre-processed. In addition, the present embodiment is not limited to the comparison of only two images. It is appreciated and understood that multiple images can be simultaneously compared against a baseline image and/or relative to each other. For example, each image taken over the preceding six months could be simultaneously compared to a first image 600a taken the previous year. In one embodiment, an animation software program can be used to animate the changes in the suspect area 302 over time. For purposes of clarity and brevity, however, the comparison of only two images will be described below.
At 802, the electronically archived images are accessed from a database of images. At 804, the images are pre-processed, if desired. Optionally, at 806 a user may select a baseline image 600a. The baseline image could be a first acquired image, an intermediately acquired image, or an image taken during the patient's previous office visit. For purposes of detecting changes, it is not necessary, but may be helpful, to select a baseline image. At 808, a mapping algorithm can be used to determine key features, such as a border or perimeter of the reference images 606a, 606b. Alternatively, a mapping algorithm may look for key points on the reference images 606a, 606b with respect to the reference grid 608 (Figure 6) or may use reference images 606a and 606b as points on images 604a and 604b during the fit analysis. In all of the embodiments described herein, reference images 606a and 606b should be understood to be either an image, such as a landmark feature on the surface, or simply a reference point, pixel, or collection of pixels either separate from images 604a and 604b or on or within images 604a and 604b. Thus, the term image should be construed to mean a point that can be captured in a digital form and used as a reference point. At 810, a transformation algorithm is used to transform the second image 600b into a comparative posture with the first image 600a by using the reference images 606a, 606b. In one embodiment, the second image 600b is scaled up or down in size, rotated, skewed, or otherwise manipulated so that the reference image 606b is approximately the same size as, has the same orientation as, and is in the same position within the frame 602b as the first reference image 606a within frame 602a. In an alternate embodiment, both images 600a, 600b can be scaled up or down in size, rotated, skewed, or otherwise manipulated so that the respective reference images 606a, 606b are approximately the same size, have the same orientation, and are in the same position with respect to the reference grid 608 (Figure 6). It should be understood that in certain aspects the reference images 606a and 606b are in fact contained within images 604a and 604b. Such reference images may correspond to points on the perimeter of images 604a and 604b that appear unchanged once the images are sized and overlaid. As one of ordinary skill in the art can readily appreciate, the more reference points taken into account during the fit analysis, the higher the quality of the comparison. Accordingly, in certain embodiments at least two reference points are considered; in other embodiments, at least 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, or more are utilized by the algorithm. At 812, after the reference images 606a, 606b have been sufficiently matched during the transformation process, a comparison algorithm is used to evaluate and compare the respective images 604a, 604b. Key points or parameters are identified in both the first image 604a and the second image 604b. For example, the overall area, the perimeter or border length, the percentage change in size in a given quadrant, etc. are just some of the parameters that can be evaluated in each respective image 604a, 604b.
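The area and perimeter parameters mentioned above map directly onto standard contour measurements. Below is a minimal sketch, assuming OpenCV and a binary mask (an 8-bit image in which the suspect area appears white); it is one convenient way to obtain these parameters, not the specific comparison algorithm of this disclosure.

import cv2

def measure_suspect_area(binary_mask):
    """Return (area_px, perimeter_px) of the largest white region in a
    binary mask of the suspect area, or (0.0, 0.0) if none is found."""
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0, 0.0
    largest = max(contours, key=cv2.contourArea)
    return cv2.contourArea(largest), cv2.arcLength(largest, closed=True)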
In addition, a color, contrast, and/or a depth of each of the respective images 604a, 604b can be determined. Balancing the color, brightness, and/or the contrast of the respective images is described below. The features that are to be evaluated can be selected by a user or can be selected automatically.
At 814, the images 604a, 604b are compared with respect to one another to identify differences between the evaluated features. For example, the areas or perimeter lengths of the respective images 604a, 604b can be compared. The differences may be subtle, such as slight changes in color, or they may be substantial, such as a greatly enlarged area of the second image 604b.
At 816, any identified differences are further compared to determine if a threshold is exceeded. For example, the threshold could be a one, two, three, four, five, six, seven, eight, nine, ten, fifteen, twenty, or twenty-five percent increase in the area of the second image 604b compared to the area of the first image 604a. At 818, if there are no detected differences or if the detected differences do not exceed the threshold, then a notification is provided that no noteworthy changes of the image 604b were detected. At 820, if the detected differences do exceed the threshold, then a notification is provided that noteworthy changes of the image 604b were detected. The threshold can be a user-defined setting, a preprogrammed setting, or an automatically adjustable range depending on the image quality and resolution, for example. At 822, data is provided detailing the specific changes, for example, shape, color, texture, a shift in position, etc. The data and/or the results obtained from the comparison can be made available to the medical professional in a short amount of time to enable the medical professional to make a more objective, informed diagnosis and to quickly formulate a treatment plan. Additionally or alternatively, a post-processing algorithm can be used to overlay the respective images 600a, 600b on a screen. Color-coding, animation techniques, and other graphic processing techniques can be used to identify areas or regions of greatest change.
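A minimal sketch of the threshold test at 816 through 822 follows, assuming the suspect area has already been segmented into a binary mask; the mask representation, function name, and default threshold are illustrative assumptions only.

```python
# Minimal sketch of 816-822 (illustrative only): compare a measured
# attribute, here the pixel area of a binary mask marking the suspect
# area, and report whether a percent-change threshold is exceeded.
import numpy as np

def compare_areas(mask_first, mask_second, threshold_pct=10.0):
    """mask_*: boolean arrays where True marks the suspect area."""
    area_first = float(np.count_nonzero(mask_first))
    area_second = float(np.count_nonzero(mask_second))
    if area_first == 0:
        raise ValueError("baseline mask is empty")
    change_pct = abs(area_second - area_first) / area_first * 100.0
    if change_pct > threshold_pct:
        return f"Noteworthy change detected: area changed {change_pct:.1f}%"
    return f"No noteworthy change: area changed {change_pct:.1f}%"
```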
In certain embodiments, images may be captured and compared using only a standard imaging device, which includes digital cameras, movie cameras, film cameras, etc., as long as the image to be compared is at some point moved to a digital format, thus allowing computational analysis thereon. For example, in one embodiment an image from a standard consumer-model digital camera is compared against a subsequent image. In such embodiments, the computer algorithm used will perform a best fit or transformation of the images by modifying size, angle, and brightness to obtain the best possible fit prior to analysis for changes. Accordingly, in such embodiments users already having an archive of older images can compare these images. While the error rate for such comparisons is slightly higher, the flexibility of being able to review older images far exceeds the risk of a few false-positive outcomes that can be easily discounted by the user upon further review. It should also be clearly understood that images taken with no focal-length limiter and no brightness, color, or contrast control can be compared with images having such controls.
Color and/or Contrast Balancing to Detect a Suspect Area
Figures 10A, 10B, and 10C illustrate the color balancing of an image 900. In particular, Figure 10A shows a digital image 900a of a suspect area 902 and a background region 904 prior to color balancing. Figure 10B shows the image 900b after a filter has been applied to filter out the background skin region 904 based on the color, brightness, and/or contrast of the suspect area 902 compared to the background skin region 904. Figure 10C shows the image 900c after it has been preprocessed, which may include, but is not limited to, the application of additional filters to remove other features in the image and/or the application of a reference grid 908 over the image 900c.
Due to slight differences in lighting and environment, the colors in an image will likely not be constant from one image to the next, even when steps are taken to provide constant lighting. Moreover, the skin does not provide an adequate background for evaluating color changes of a suspect area because the skin itself can change color; for example, the skin may be darker in the summer than in the winter. One advantage of color balancing is that it provides an additional parameter that can be compared from one image to the next, for instance the respective darkness or lightness of the respective images. A second advantage is that color balancing permits a comparison between the suspect area and the surrounding skin as a means to more accurately detect suspect areas 302 over a larger skin surface.
Figure 11 shows a method 1000 of detecting suspect areas over a surface of skin 14 by means of image filtration (i.e., color or contrast balancing). At 1002, at least several images over a large area of skin are acquired. The images may have overlapping sections to ensure that the entire skin area is imaged. In addition, reference markers 318 can be placed at various locations on the imaged skin area so that any potential suspect areas 302 that may be discovered can be relocated at a later time.
At 1004 and according to one embodiment, each image is color balanced with respect to a reference color. In one embodiment, the reference color appears in the acquired image. Such a reference could include distance measurements, contrast standards, color standards, etc. The reference color can be provided by a color palette or a single color placed within the frame of the image when the image is acquired (Figure 2A). The color palette can have a variety of shades or colors thereon. Because the reference colors are electronically isolatable, the color and/or shading of the image can be digitally adjusted until a feature in the image approximately matches a certain reference color.
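One plausible, non-mandated realization of the color balancing at 1004 scales each color channel so that the imaged reference patch matches its known color. The sketch below assumes NumPy, a known patch location in the frame, and a known reference color; all three are assumptions for illustration.

```python
# Minimal sketch of 1004: per-channel color balancing against a reference
# patch of known color captured in the frame. The patch coordinates and
# the known RGB value are assumptions for illustration only.
import numpy as np

def color_balance(image, patch_box, known_rgb):
    """image: HxWx3 float array in the 0-255 range;
    patch_box: (row0, row1, col0, col1) bounding the reference patch;
    known_rgb: the patch's true color, e.g. (200, 200, 200)."""
    r0, r1, c0, c1 = patch_box
    measured = image[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
    # Scale each channel so the measured patch color matches the known one.
    gain = np.asarray(known_rgb, dtype=np.float64) / measured
    return np.clip(image * gain, 0, 255)
```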
At 1006, after the color of the image has been adjusted, a spot detection algorithm is used to process each of the respective images to detect any suspect areas 302 in one or more images. In one embodiment, a filter is applied to the image to make darker objects, such as a mole, stand out relative to the skin. The type of filter used will depend on the amount of color contrast between the skin and the suspect area. By way of example, images of a light-skinned person with dark patches on their skin may not require filtering, whereas images of a dark-skinned person with moderately dark patches on their skin may require a series of filters to achieve enough contrast between the skin and the suspect area.
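The spot detection at 1006 could, for instance, combine blurring, adaptive thresholding, and connected-component filtering, as in the hedged sketch below; it assumes OpenCV, a grayscale rendering of the balanced image, and illustrative filter settings.

```python
# Minimal sketch of 1006: make dark features (e.g., moles) stand out
# against lighter surrounding skin, then report candidate suspect areas.
# Threshold and size settings are illustrative assumptions.
import cv2

def detect_spots(gray, min_area_px=50):
    """gray: 8-bit grayscale image after color/contrast balancing."""
    blurred = cv2.GaussianBlur(gray, (9, 9), 0)
    # Adaptive thresholding flags dark regions relative to the local skin
    # tone, tolerating uneven lighting better than one global cutoff.
    dark = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY_INV, 51, 10)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep only regions large enough to warrant professional review.
    return [c for c in contours if cv2.contourArea(c) >= min_area_px]
```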
At 1008, notification of any potential suspect areas is provided to the medical professional. At this point, the medical professional could perform a refined evaluation of any potential suspect area by taking higher-resolution images of the suspect area and comparing these images over time, as described in detail above. Additionally or alternatively, the medical professional can perform or recommend that a biopsy be taken of the suspect area.
Computing Systems

The computing system for performing the image comparisons may include a number of local computers for receiving downloaded images and at least one mainframe computer for performing the image comparisons. Alternatively, the image comparisons could be performed on the local computers. The local computer typically includes a processor, memory, a multiplex ("Mux") card, video and Ethernet cards, a power supply, and an image acquisition card. A number of local computers can be networked together and service a number of patient treatment facilities. The local computer can communicate with other local computers and/or the mainframe computer over a communications link such as a local area network ("LAN") and/or a wide area network ("WAN"). The communications link can be wired and/or wireless. The communications link can employ Internet or World Wide Web communications protocols, and can take the form of a proprietary extranet. In such instances, a user could obtain images and upload these to a web-based server that could perform all the analysis and send back to the user only the analysis, or only the analysis that yielded possible changes. In other embodiments, all algorithms could reside on the computer to which the images are uploaded or on a server directly connected thereto. In certain embodiments, patient confidentiality is maintained. In certain specific embodiments, a user could upload all patient information into a database and also have image analysis linked thereto. Accordingly, either on a remote server or housed in the user's facility could be a computer that contains a database with a unique patient identifier; this identifier can be used to add new images to a patient folder, and the image analysis could either be performed immediately while the user waits or be performed in the background. Subsequent to this analysis, a notification could be sent via email or secured web access or the like indicating that the analysis has been completed and that either no action is necessary or further review/action may be required, thus indicating a change was noted between the images.
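As a hedged sketch of the web-based variant just described, an upload-and-notify exchange might look like the following; the server URL, endpoints, and response fields are hypothetical, and only the unique patient identifier accompanies the image.

```python
# Minimal sketch (hypothetical endpoints): upload a new image tagged with a
# unique patient identifier to a remote analysis server, then fetch the
# result. Assumes the Python "requests" library.
import requests

SERVER = "https://example-analysis-server.invalid"  # hypothetical URL

def upload_image(patient_id, image_path):
    with open(image_path, "rb") as f:
        resp = requests.post(f"{SERVER}/images",
                             data={"patient_id": patient_id},
                             files={"image": f})
    resp.raise_for_status()
    return resp.json()["job_id"]  # hypothetical response field

def fetch_result(job_id):
    resp = requests.get(f"{SERVER}/results/{job_id}")
    resp.raise_for_status()
    # Hypothetical fields such as "status" and "change_detected".
    return resp.json()
```

Keeping only the opaque patient identifier in the exchange is one way the confidentiality requirement noted above could be respected.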
The various embodiments described above can be combined to provide further embodiments. All of the above U.S. patents, patent applications and publications referred to in this specification are incorporated herein by reference. Aspects can be modified, if necessary, to employ devices, features, and concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made in light of the above detailed description. In general, the terms used in the claims should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to cover all imaging devices, types of image formats, measuring techniques, and image transformation and comparison algorithms. Accordingly, the claims are not limited by the disclosure.

Claims

What is claimed is:
1. A method of comparing at least two images of a suspect area on a patient, the method comprising: providing at least two images of the suspect area; digitally overlaying the at least two images; performing a best-fit transformation of one image to encourage the one image to approximately correspond to at least one detected attribute of the other acquired image; and comparing the at least two images to determine whether a difference exists between an aspect of the one image when compared with the same aspect of the other image.
2. The method of claim 1, further comprising: determining whether a correlation between the at least two images permits the at least two images to be digitally compared.
3. The method of claim 1 wherein the detected attribute is a perimeter of the suspect area.
4. The method of claim 1 wherein the detected attribute is a two dimensional surface area of the suspect area.
5. The method of claim 1 wherein comparing the at least two images to determine whether the difference exists between the aspect of the one image when compared with the same aspect of the other image includes determining if a change between the respective aspects exceeds a threshold.
6. The method of claim 5, further comprising: providing an indication that the threshold is exceeded.
7. A method for monitoring a dermal area of a patient comprising: providing at least a first and second image of the dermal area; digitally overlaying the at least first and second image; performing a best-fit transformation of one image to encourage the one image to approximately correspond to at least one detected attribute of the other acquired image;
comparing the at least two images to determine whether a difference exists between an aspect of the one image when compared with the same aspect of the other image; and thereby monitoring the dermal area.
8. The method of claim 7, wherein subsequent to step d, a computer algorithm provides an output that describes the presence or absence of a difference between the two images.
9. The method of claim 7, wherein said dermal area of a patient is at risk of developing a melanoma.
10. An apparatus to acquire an image of a suspect area on a patient, the apparatus comprising: an imaging device; and a separation tool having a first end connected with a first section, at least a portion of the first end formed to contact the patient, the first section having an attachment portion to receive the imaging device, the first section formed to maintain the imaging device at a substantially fixed distance from the suspect area.
11. The apparatus of claim 10, further comprising: a bracket to couple the imaging device to the separation tool, the bracket sized to receive the imaging device and configured to releasably engage the first section of the separation tool.
12. The apparatus of claim 10, further comprising: a first sensor for determining an orientation of the separation tool.
13. The apparatus of claim 12 wherein the first sensor is a fluid level.
14. The apparatus of claim 10, further comprising: a control system to receive input from a sensor; and an indicator to notify a user that the separation tool is in a desired orientation.
15. The apparatus of claim 12, further comprising: a second sensor for determining a relationship between the separation tool and the skin.
16. The apparatus of claim 15 wherein the second sensor is a pressure sensor to sense an amount of pressure applied to the separation tool when the separation tool is in contact with the skin.
17. The apparatus of claim 15 wherein the second sensor is a proximity sensor.
18. The apparatus of claim 10 wherein the separation tool includes a color palette positioned to be captured in an image acquired by the imaging device.
19. The apparatus of claim 10 wherein the first end of the separation tool includes a curved member formed to provide an unobstructed view from the imaging device to the suspect area.
20. The apparatus of claim 10 wherein the imaging device is configured to acquire a digital image.
21. The apparatus of claim 10 wherein the imaging device is a camera having a lens and a camera body.
22. The apparatus of claim 10, further comprising a stereo camera coupled with the imaging device to obtain a depth perspective of the suspect area.
23. The apparatus of claim 22 wherein the depth perspective of the suspect area includes detecting a texture of the suspect area.
24. An imaging assembly to image a suspect area on a patient, the assembly comprising: a housing; an imaging device located within the housing; and at least one sensor to indicate an orientation of the housing relative to a first location.
25. The imaging assembly of claim 24 wherein the imaging device is a CCD camera.
26. The imaging assembly of claim 24 wherein the imaging device is a CMOS camera.
27. The imaging assembly of claim 24, further comprising a stand for statically supporting the housing of the imaging device.
28. The imaging assembly of claim 24 wherein the at least one sensor is a level and the first location is a plane substantially perpendicular to gravity.
29. The imaging assembly of claim 24 wherein the at least one sensor is a proximity sensor.
30. A method of identifying and re-locating a suspect area on a patient, the method comprising: identifying the suspect area; identifying a reference item located on the patient; and determining the location of the suspect area in relationship to the reference item.
31. The method of claim 30 wherein identifying the suspect area includes visually identifying the suspect area.
32. The method of claim 30 wherein identifying a reference item comprises identifying a reference object located on the patient.
33. The method of claim 30 wherein identifying a reference item comprises placing a reference marker on the patient.
34. The method of claim 30 wherein identifying a reference item includes the reference item being a substantially dimensionally stable object on the patient.
35. The method of claim 30, further comprising determining if the reference item is close enough to the suspect area to be identified in an image acquired of the suspect area.
36. The method of claim 30, further comprising documenting a position of the suspect area in relationship to the reference item wherein documenting includes recording at least one measurement.
37. A method of acquiring an image of a suspect area, the method comprising: identifying the suspect area; positioning a patient with the suspect area in approximately a first position; identifying a reference item located on the patient; determining a position of the suspect area in relationship to the reference item; aligning an imaging device to acquire the image of the suspect area; and acquiring the image after aligning the imaging device.
38. The method of claim 37 wherein positioning a patient with the suspect area in a first position includes placing the patient in a predetermined and repeatable position.
39. The method of claim 37 wherein aligning an imaging device to acquire the image includes placing the imaging device in a desired spatial orientation with respect to the suspect area.
40. The method of claim 37 wherein aligning an imaging device to acquire the image includes positioning the imaging device at a desired distance from the suspect area.
41. The method of claim 37 wherein aligning an imaging device to acquire the image includes aligning the imaging device based at least in part on information from a sensor coupled with the imaging device.
42. The method of claim 37, further comprising: adjusting an aspect affecting the imaging device to enhance a quality of the image.
43. The method of claim 42 wherein adjusting an aspect affecting the imaging device includes attaching a filter to the imaging device.
44. The method of claim 42 wherein adjusting an aspect affecting the imaging device includes adjusting an amount of light in a vicinity of the imaging device.
45. The method of claim 37 wherein acquiring the image is done manually.
46. The method of claim 37 wherein acquiring the image is done automatically when the imaging device achieves a desired position.
47. The method of claim 37, further comprising archiving the first image in an electronic format.
48. The method of claim 37, further comprising pre-processing the first image.
49. The method of claim 48 wherein pre-processing the image includes overlaying the image onto a grid.
50. The method of claim 37 wherein identifying the reference item includes the reference item being a substantially dimensionally stable object on the patient.
51. A method of comparing at least two images, each image capturing a suspect area, the method comprising: identifying a reference item in the at least two images; measuring an attribute of the reference item in a first image; transforming a second image based on the measured attribute of the reference item in the first image, wherein a reference item in the second image is transformed to correspond with an orientation and size of the reference item in the first image; measuring an attribute of the suspect areas in both images; and comparing the respective measured attributes of the respective suspect areas.
52. The method of claim 51, further comprising: acquiring the at least two images for comparison by downloading the at least two images from a computer database, wherein the first image was acquired at a first time and the second image was acquired at a second time.
53. The method of claim 51 wherein measuring the attribute of the reference item in the first image includes measuring a perimeter of the reference item.
54. The method of claim 51 wherein measuring the attribute of the reference item in the first image includes measuring a surface area of the reference item.
55. The method of claim 51 wherein transforming the second image based on the measured attribute of the reference item in the first image includes scaling a perimeter of the reference item in the second image to substantially match a perimeter of the reference item in the first image.
56. The method of claim 51 wherein measuring the attribute of the suspect areas in both images includes measuring a perimeter of each of the respective suspect areas.
57. The method of claim 51 wherein measuring the attribute of the suspect areas in both images includes measuring a surface area of each of the respective suspect areas.
58. The method of claim 51 wherein comparing the measured attributes of the respective suspect areas includes detecting a change with respect to the respective measured attribute of each of the respective suspect areas in each of the respective images.
59. The method of claim 58 wherein detecting a change with respect to the respective measured attribute of each of the respective suspect areas includes determining whether a perimeter corresponding to the suspect area in the first image is different than a perimeter corresponding to the suspect area in the second image.
60. The method of claim 58 wherein detecting a change with respect to the respective measured attribute of each of the respective suspect areas includes determining if a surface area corresponding to the suspect area in the first image is different than a surface area corresponding to the suspect area in the second image.
61. The method of claim 51 wherein comparing the respective measured attributes includes determining if a change between the respective measured attributes exceeds a threshold.
62. The method of claim 61, further comprising providing an indication that the threshold is exceeded.
63. A method of acquiring an image of a suspect area, the method comprising: identifying the suspect area; positioning a patient with the suspect area in approximately a first position; aligning an imaging device to acquire an image of the suspect area; and acquiring the image after aligning the imaging device.
64. The method of claim 63 wherein positioning a patient with the suspect area in a first position includes placing the patient in a predetermined and repeatable position.
65. The method of claim 63 wherein aligning an imaging device to acquire the image includes placing the imaging device in a desired spatial orientation with respect to the suspect area.
66. The method of claim 63 wherein aligning an imaging device to acquire the image includes positioning the imaging device at a desired distance from the suspect area.
67. The method of claim 63 wherein aligning an imaging device to acquire the image includes aligning the imaging device based at least in part on information from a sensor coupled with the imaging device.
68. The method of claim 63, further comprising adjusting an aspect affecting the imaging device to enhance a quality of the image.
69. The method of claim 68 wherein adjusting an aspect affecting the imaging device includes attaching a filter to the imaging device.
70. The method of claim 68 wherein adjusting an aspect affecting the imaging device includes adjusting an amount of light in a vicinity of the imaging device.
71. The method of claim 63 wherein acquiring the image is done automatically when the imaging device achieves a desired position.
72. The method of claim 63, further comprising archiving the first image in an electronic format.
73. The method of claim 63, further comprising pre-processing the first image.
74. A system to acquire an image of a suspect area on a patient, the system comprising: means for acquiring at least two images of the suspect area; means for digitally overlaying the at least two images; means for comparing the two images to encourage the one acquired image to approximately correspond to at least one detected attribute of the other acquired image; and means for detecting whether a difference exists between an aspect of the one acquired image when compared with the same aspect of the other acquired image.
75. The system of claim 74 wherein the means for comparing the two images comprises conducting a best-fit transformation of the one acquired image relative to the other acquired image.
76. The system of claim 74 wherein the means for detecting includes selecting a threshold value to indicate whether the difference between the aspect of the one acquired image when compared with the same aspect of the other acquired image warrants additional attention.
77. The use of any of the preceding claims to monitor or detect melanoma.
PCT/US2006/002037 2005-01-19 2006-01-19 Devices and methods for identifying and monitoring changes of a suspect area on a patient WO2006078902A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
NZ556655A NZ556655A (en) 2005-01-19 2006-01-19 Devices and methods for identifying and monitoring changes of a suspect area on a patient
CA2595239A CA2595239C (en) 2005-01-19 2006-01-19 Devices and methods for identifying and monitoring changes of a suspect area on a patient
AU2006206334A AU2006206334C1 (en) 2005-01-19 2006-01-19 Devices and methods for identifying and monitoring changes of a suspect area on a patient
AU2010257350A AU2010257350A1 (en) 2005-01-19 2010-12-21 Devices and methods for identifying and monitoring changes of a suspect area on a patient

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US64553805P 2005-01-19 2005-01-19
US60/645,538 2005-01-19

Publications (2)

Publication Number Publication Date
WO2006078902A2 true WO2006078902A2 (en) 2006-07-27
WO2006078902A3 WO2006078902A3 (en) 2006-12-07

Family

ID=36283976

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/002037 WO2006078902A2 (en) 2005-01-19 2006-01-19 Devices and methods for identifying and monitoring changes of a suspect area on a patient

Country Status (5)

Country Link
US (5) US7657101B2 (en)
AU (2) AU2006206334C1 (en)
CA (2) CA2595239C (en)
NZ (1) NZ556655A (en)
WO (1) WO2006078902A2 (en)

Families Citing this family (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7539334B2 (en) * 2005-09-06 2009-05-26 Intel Corporation Method and apparatus for identifying mole growth
US7613337B2 (en) * 2005-09-06 2009-11-03 Intel Corporation Method and apparatus for identifying mole growth
US20100091104A1 (en) * 2006-09-27 2010-04-15 Georgia Tech Research Corporation Systems and methods for the measurement of surfaces
GB2444738A (en) * 2006-12-12 2008-06-18 Prosurgics Ltd Registration of the location of a workpiece within the frame of reference of a device
JP4640845B2 (en) * 2007-03-05 2011-03-02 富士フイルム株式会社 Image processing apparatus and program thereof
US20080294018A1 (en) * 2007-05-22 2008-11-27 Kurtz Andrew F Privacy management for well-being monitoring
US20090072142A1 (en) * 2007-09-14 2009-03-19 Forensicare Incorporated Scanning system and techniques for medical and/or forensic assessment using the same
WO2009048833A1 (en) * 2007-10-09 2009-04-16 Siemens Healthcare Diagnostics Inc. Two dimensional imaging of reacted areas on a reagent
AU2009246917A1 (en) * 2008-05-13 2009-11-19 Spectral Image, Inc. Systems and methods for hyperspectral medical imaging using real-time projection of spectral information
US8194952B2 (en) * 2008-06-04 2012-06-05 Raytheon Company Image processing system and methods for aligning skin features for early skin cancer detection systems
US9117133B2 (en) 2008-06-18 2015-08-25 Spectral Image, Inc. Systems and methods for hyperspectral imaging
US20090327890A1 (en) * 2008-06-26 2009-12-31 Raytheon Company Graphical user interface (gui), display module and methods for displaying and comparing skin features
JP5567908B2 (en) * 2009-06-24 2014-08-06 キヤノン株式会社 Three-dimensional measuring apparatus, measuring method and program
US8606345B2 (en) 2009-08-31 2013-12-10 Gsm Of Kansas, Inc. Medical dual lens camera for documentation of dermatological conditions with laser distance measuring
US20110172004A1 (en) * 2010-01-11 2011-07-14 Vendmore Systems, Llc Venue product sales and networking
WO2011127247A2 (en) * 2010-04-07 2011-10-13 Sanjay Krishna Apparatus and techniques of non-invasive analysis
CN101966083B (en) * 2010-04-08 2013-02-13 太阳系美容事业有限公司 Abnormal skin area computing system and computing method
US8630469B2 (en) 2010-04-27 2014-01-14 Solar System Beauty Corporation Abnormal skin area calculating system and calculating method thereof
US8515144B2 (en) * 2010-04-27 2013-08-20 Solar System Beauty Corporation Abnormal skin area calculating system and calculating method thereof
US8554016B2 (en) 2010-11-10 2013-10-08 Raytheon Company Image registration system and method for registering images for deformable surfaces
US20120157800A1 (en) * 2010-12-17 2012-06-21 Tschen Jaime A Dermatology imaging device and method
TWI519277B (en) * 2011-03-15 2016-02-01 明達醫學科技股份有限公司 Skin optical diagnosing apparatus and operating method thereof
KR101903407B1 (en) * 2011-09-08 2018-10-02 엘지전자 주식회사 Health care system based on video in remote health care solution and method for providing health care service
US9345957B2 (en) 2011-09-30 2016-05-24 Microsoft Technology Licensing, Llc Enhancing a sport using an augmented reality display
US9268406B2 (en) 2011-09-30 2016-02-23 Microsoft Technology Licensing, Llc Virtual spectator experience with a personal audio/visual apparatus
US9286711B2 (en) * 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Representing a location at a previous time period using an augmented reality display
US9606992B2 (en) 2011-09-30 2017-03-28 Microsoft Technology Licensing, Llc Personal audio/visual apparatus providing resource management
US10319484B1 (en) 2011-11-17 2019-06-11 Nuscale Power, Llc Method for imaging a nuclear reactor
DE102011087748A1 (en) * 2011-12-05 2013-06-06 Deutsches Zentrum für Luft- und Raumfahrt e.V. A liquid jet scalpel and method of operating a liquid jet scalpel
WO2013144184A2 (en) * 2012-02-11 2013-10-03 Dermosafe Sa Hand held device and method for capturing images of skin portions
WO2013126877A1 (en) * 2012-02-25 2013-08-29 Massachusetts Institute Of Technology Personal skin scanner system
US9020192B2 (en) 2012-04-11 2015-04-28 Access Business Group International Llc Human submental profile measurement
US20130304476A1 (en) 2012-05-11 2013-11-14 Qualcomm Incorporated Audio User Interaction Recognition and Context Refinement
US9746916B2 (en) 2012-05-11 2017-08-29 Qualcomm Incorporated Audio user interaction recognition and application interface
WO2014108896A1 (en) * 2013-01-08 2014-07-17 Marpe Technologies Ltd. Device and method for body moles mapping and tracking
US9386908B2 (en) * 2013-01-29 2016-07-12 Gyrus Acmi, Inc. (D.B.A. Olympus Surgical Technologies America) Navigation using a pre-acquired image
US9092697B2 (en) 2013-02-07 2015-07-28 Raytheon Company Image recognition system and method for identifying similarities in different images
USD743553S1 (en) 2013-02-28 2015-11-17 DermSpectra LLC Imaging booth
US9615786B1 (en) * 2013-03-15 2017-04-11 Sally E. Roney Solo home user skin imaging method, alignment aid, and chromatic filter for detecting growth of early melanomas
US20140378810A1 (en) 2013-04-18 2014-12-25 Digimarc Corporation Physiologic data acquisition and analysis
EP2987106A4 (en) * 2013-04-18 2016-12-14 Digimarc Corp Physiologic data acquisition and analysis
US20140313303A1 (en) * 2013-04-18 2014-10-23 Digimarc Corporation Longitudinal dermoscopic study employing smartphone-based image registration
GB201308866D0 (en) * 2013-05-16 2013-07-03 Siemens Medical Solutions System and methods for efficient assessment of lesion developemnt
US9220462B2 (en) 2013-05-24 2015-12-29 Toshiba America Electronic Components, Inc. Imaging sensor and method for biometric mapping of facial skin
US20140354830A1 (en) * 2013-06-03 2014-12-04 Littleton Precision, LLC System and method for adding scale to photographic images
TWI622011B (en) * 2013-10-23 2018-04-21 Maxell Holdings Ltd Surface state photographic display system and surface state photographic display method
US10037821B2 (en) * 2013-12-27 2018-07-31 General Electric Company System for integrated protocol and decision support
PL408689A1 (en) * 2014-06-28 2016-01-04 Ktg Spółka Z Ograniczoną Odpowiedzialnością Method for diagnosing birthmarks on skin
US9852595B2 (en) * 2014-09-22 2017-12-26 Dann M Allen Photo comparison and security process called the flicker process
JP6827922B2 (en) * 2014-11-06 2021-02-10 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Skin treatment system
JP6164238B2 (en) * 2015-03-18 2017-07-19 カシオ計算機株式会社 Diagnostic device, image processing method in the diagnostic device, and program thereof
US9990472B2 (en) * 2015-03-23 2018-06-05 Ohio State Innovation Foundation System and method for segmentation and automated measurement of chronic wound images
US10004885B2 (en) * 2015-05-22 2018-06-26 L'oreal Imaging applicator for treating skin conditions
US10169882B1 (en) * 2015-09-11 2019-01-01 WinguMD, Inc. Object size detection with mobile device captured photo
US10070049B2 (en) * 2015-10-07 2018-09-04 Konica Minolta Laboratory U.S.A., Inc Method and system for capturing an image for wound assessment
TWI615130B (en) * 2016-07-21 2018-02-21 國立臺灣大學 Image processing method and non-transitory computer readable medium
US11244456B2 (en) 2017-10-03 2022-02-08 Ohio State Innovation Foundation System and method for image segmentation and digital analysis for clinical trial scoring in skin disease
US20190214127A1 (en) * 2018-01-10 2019-07-11 International Business Machines Corporation Sub-optimal health detection and alert generation using a time series of images
US10755414B2 (en) * 2018-04-27 2020-08-25 International Business Machines Corporation Detecting and monitoring a user's photographs for health issues
WO2020014779A1 (en) * 2018-07-16 2020-01-23 Swift Medical Inc. Apparatus for visualization of tissue
US10957043B2 (en) * 2019-02-28 2021-03-23 Endosoftllc AI systems for detecting and sizing lesions
US11255785B2 (en) 2019-03-14 2022-02-22 Applied Materials, Inc. Identifying fiducial markers in fluorescence microscope images
US11469075B2 (en) 2019-03-14 2022-10-11 Applied Materials, Inc. Identifying fiducial markers in microscope images
US11176669B2 (en) 2019-04-14 2021-11-16 Holovisions LLC System for remote medical imaging using two conventional smart mobile devices and/or augmented reality (AR)

Family Cites Families (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5413477A (en) * 1992-10-16 1995-05-09 Gas Research Institute Staged air, low NOX burner with internal recuperative flue gas recirculation
US5311131A (en) * 1992-05-15 1994-05-10 Board Of Regents Of The University Of Washington Magnetic resonance imaging using pattern recognition
JPH09511077A (en) * 1993-11-30 1997-11-04 アーチ ディヴェロプメント コーポレイション Automated method and system for image matching and image correlation in two different ways
IL108352A (en) * 1994-01-17 2000-02-29 Given Imaging Ltd In vivo video camera system
US5999840A (en) * 1994-09-01 1999-12-07 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets
US5810742A (en) * 1994-10-24 1998-09-22 Transcan Research & Development Co., Ltd. Tissue characterization based on impedance images and on impedance measurements
AT403654B (en) * 1994-12-01 1998-04-27 Binder Michael Dr DEVICE FOR THE OPTICAL EXAMINATION OF HUMAN SKIN AND THE SAME ASSIGNMENT EVALUATION DEVICE
US5627907A (en) * 1994-12-01 1997-05-06 University Of Pittsburgh Computerized detection of masses and microcalcifications in digital mammograms
US6221007B1 (en) * 1996-05-03 2001-04-24 Philip S. Green System and method for endoscopic imaging and endosurgery
US5673300A (en) * 1996-06-11 1997-09-30 Wisconsin Alumni Research Foundation Method of registering a radiation treatment plan to a patient
US5910972A (en) * 1996-09-25 1999-06-08 Fuji Photo Film Co., Ltd. Bone image processing method and apparatus
US6081612A (en) * 1997-02-28 2000-06-27 Electro Optical Sciences Inc. Systems and methods for the multispectral imaging and characterization of skin tissue
AU740638B2 (en) * 1997-02-28 2001-11-08 Electro-Optical Sciences, Inc. Systems and methods for the multispectral imaging and characterization of skin tissue
IL124616A0 (en) * 1998-05-24 1998-12-06 Romedix Ltd Apparatus and method for measurement and temporal comparison of skin surface images
US6425867B1 (en) * 1998-09-18 2002-07-30 University Of Washington Noise-free real time ultrasonic imaging of a treatment site undergoing high intensity focused ultrasound therapy
IL126727A (en) * 1998-10-22 2006-12-31 Given Imaging Ltd Method for delivering a device to a target location
US6266453B1 (en) * 1999-07-26 2001-07-24 Computerized Medical Systems, Inc. Automated image fusion/alignment system and method
US6941323B1 (en) * 1999-08-09 2005-09-06 Almen Laboratories, Inc. System and method for image comparison and retrieval by enhancing, defining, and parameterizing objects in images
US6490476B1 (en) * 1999-10-14 2002-12-03 Cti Pet Systems, Inc. Combined PET and X-ray CT tomograph and method for using same
US6603552B1 (en) * 1999-12-22 2003-08-05 Xillix Technologies Corp. Portable system for detecting skin abnormalities based on characteristic autofluorescence
WO2001076480A1 (en) * 2000-04-05 2001-10-18 Georgetown University Stereotactic radiosurgery methods to precisely deliver high dosages of radiation especially to the spine
US6594388B1 (en) * 2000-05-25 2003-07-15 Eastman Kodak Company Color image reproduction of scenes with preferential color mapping and scene-dependent tone scaling
DE10031074A1 (en) * 2000-06-30 2002-01-31 Schwerionenforsch Gmbh Device for irradiating a tumor tissue
CA2314794A1 (en) * 2000-08-01 2002-02-01 Dimitre Hristov Apparatus for lesion or organ localization
US6359513B1 (en) * 2001-01-31 2002-03-19 U.S. Philips Corporation CMOS power amplifier with reduced harmonics and improved efficiency
US20040201694A1 (en) * 2001-02-07 2004-10-14 Vladimir Gartstein Noninvasive methods and apparatus for monitoring at least one hair characteristic
AUPR509801A0 (en) * 2001-05-18 2001-06-14 Polartechnics Limited Boundary finding in dermatological examination
US7217266B2 (en) * 2001-05-30 2007-05-15 Anderson R Rox Apparatus and method for laser treatment with spectroscopic feedback
US6582079B2 (en) * 2001-06-05 2003-06-24 Metrologic Instruments, Inc. Modular adaptive optical subsystem for integration with a fundus camera body and CCD camera unit and improved fundus camera employing same
EP1345154A1 (en) * 2002-03-11 2003-09-17 Bracco Imaging S.p.A. A method for encoding image pixels and method for processing images aimed at qualitative recognition of the object reproduced by one more image pixels
US7136191B2 (en) * 2002-06-24 2006-11-14 Eastman Kodak Company Method for inspecting prints
US7128894B1 (en) * 2002-06-27 2006-10-31 The United States Of America As Represented By The United States Department Of Energy Contrast enhancing solution for use in confocal microscopy
US8095204B2 (en) * 2002-08-09 2012-01-10 Interstitial, Llc Apparatus and method for diagnosing breast cancer including examination table
US8088067B2 (en) * 2002-12-23 2012-01-03 Insightec Ltd. Tissue aberration corrections in ultrasound therapy
TW589170B (en) * 2002-12-25 2004-06-01 De-Yang Tian Endoscopic device
US7546156B2 (en) * 2003-05-09 2009-06-09 University Of Rochester Medical Center Method of indexing biological imaging data using a three-dimensional body representation
JP4639045B2 (en) * 2003-07-11 2011-02-23 財団法人先端医療振興財団 Non-invasive temperature distribution measuring method and apparatus for self-reference type and body movement tracking type by magnetic resonance tomography
US7162063B1 (en) * 2003-07-29 2007-01-09 Western Research Company, Inc. Digital skin lesion imaging system and method
FI116327B (en) * 2003-09-24 2005-10-31 Nokia Corp Method and system for automatically adjusting color balance in a digital image processing chain, corresponding hardware and software means for implementing the method
US20050096515A1 (en) * 2003-10-23 2005-05-05 Geng Z. J. Three-dimensional surface image guided adaptive therapy system
US7496392B2 (en) * 2003-11-26 2009-02-24 Becton, Dickinson And Company Fiber optic device for sensing analytes
DE10356088B4 (en) * 2003-12-01 2007-03-29 Siemens Ag Method and device for examining the skin
US7020240B2 (en) * 2003-12-31 2006-03-28 General Electric Company Method and apparatus for measuring matter properties
WO2005079306A2 (en) * 2004-02-13 2005-09-01 University Of Chicago Method, system, and computer software product for feature-based correlation of lesions from multiple images
US8517921B2 (en) * 2004-04-16 2013-08-27 Gyrus Acmi, Inc. Endoscopic instrument having reduced diameter flexible shaft
JP2005328845A (en) * 2004-05-06 2005-12-02 Oce Technologies Bv Methods, apparatus and computer for transforming digital colour images
US7672705B2 (en) * 2004-07-19 2010-03-02 Resonant Medical, Inc. Weighted surface-to-surface mapping
WO2006012631A2 (en) * 2004-07-23 2006-02-02 Calypso Medical Technologies, Inc. Integrated radiation therapy systems and methods for treating a target in a patient
US20060036135A1 (en) * 2004-08-10 2006-02-16 Kern Kenneth A Skin cancer identification template
US8026942B2 (en) * 2004-10-29 2011-09-27 Johnson & Johnson Consumer Companies, Inc. Skin imaging system with probe
AU2006206334C1 (en) 2005-01-19 2011-05-19 Dermaspect Llc Devices and methods for identifying and monitoring changes of a suspect area on a patient
WO2007113815A2 (en) * 2006-03-30 2007-10-11 Activiews Ltd System and method for optical position measurement and guidance of a rigid or semi flexible tool to a target
US20080045807A1 (en) * 2006-06-09 2008-02-21 Psota Eric T System and methods for evaluating and monitoring wounds
US8489177B2 (en) * 2008-07-16 2013-07-16 Dilon Technologies, Inc. Fiducial marker and method for gamma guided stereotactic localization
KR102087595B1 (en) * 2013-02-28 2020-03-12 삼성전자주식회사 Endoscope system and control method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997047235A1 (en) * 1996-06-11 1997-12-18 J.M.I. Ltd. Dermal diagnostic analysis system and method
US6427022B1 (en) * 1998-11-10 2002-07-30 Western Research Company, Inc. Image comparator system and method for detecting changes in skin lesions
US6648820B1 (en) * 1999-10-27 2003-11-18 Home-Medicine (Usa), Inc. Medical condition sensing system
WO2004095372A1 (en) * 2003-04-22 2004-11-04 Provincia Italiana Della Congregazione Dei Figli Dell'immacolata Concezione - Instituto Dermopatico Dell'immacolata Automatic detection of skin lesions

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9723270B2 (en) 2005-01-19 2017-08-01 Dermaspect Llc Devices and methods for identifying and monitoring changes of a suspect area of a patient
US8755053B2 (en) 2005-10-14 2014-06-17 Applied Research Associates Nz Limited Method of monitoring a surface feature and apparatus therefor
US10827970B2 (en) 2005-10-14 2020-11-10 Aranz Healthcare Limited Method of monitoring a surface feature and apparatus therefor
WO2008150343A1 (en) * 2007-05-22 2008-12-11 Eastman Kodak Company Monitoring physiological conditions
FR2935888A1 (en) * 2008-09-17 2010-03-19 Dataderm Internat Gmbh METHOD AND SYSTEM FOR SKIN ANALYSIS
WO2010031747A1 (en) * 2008-09-17 2010-03-25 Dataderm International Gmbh Method and system for analysing skin
CN102265308A (en) * 2008-12-23 2011-11-30 皇家飞利浦电子股份有限公司 System for monitoring medical abnormalities and method of operation thereof
WO2010073178A1 (en) * 2008-12-23 2010-07-01 Koninklijke Philips Electronics N.V. System for monitoring medical abnormalities and method of operation thereof
WO2011036259A1 (en) * 2009-09-24 2011-03-31 W.O.M. World Of Medicine Ag Dermatoscope and elevation measuring tool
US9339194B2 (en) 2010-03-08 2016-05-17 Cernoval, Inc. System, method and article for normalization and enhancement of tissue images
US10201281B2 (en) 2010-03-08 2019-02-12 Cernoval, Inc. System, method and article for normalization and enhancement of tissue images
EP2518689A1 (en) * 2011-04-27 2012-10-31 Cornelis Blokker Method for comparing medical images of a patient
US9569838B2 (en) * 2011-06-09 2017-02-14 Canon Kabushiki Kaisha Image processing apparatus, method of controlling image processing apparatus and storage medium
WO2012169562A1 (en) * 2011-06-09 2012-12-13 Canon Kabushiki Kaisha Image processing apparatus, method of controlling image processing apparatus and storage medium
US20140064563A1 (en) * 2011-06-09 2014-03-06 Canon Kabushiki Kaisha Image processing apparatus, method of controlling image processing apparatus and storage medium
JP2012254221A (en) * 2011-06-09 2012-12-27 Canon Inc Image processing apparatus, method for controlling the same, and program
AT511933B1 (en) * 2011-08-10 2021-10-15 Acd Elektronik Gmbh Process for recording, measuring and documenting wounds as well as a device for carrying out the process
AT511933A3 (en) * 2011-08-10 2021-07-15 Acd Elektronik Gmbh Process for recording, measuring and documenting wounds as well as a device for carrying out the process
US9179844B2 (en) 2011-11-28 2015-11-10 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US9861285B2 (en) 2011-11-28 2018-01-09 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US10874302B2 (en) 2011-11-28 2020-12-29 Aranz Healthcare Limited Handheld skin measuring or monitoring device
AU2019203346B2 (en) * 2013-07-22 2021-03-25 The Rockefeller University Optical detection of skin disease
US11931164B2 (en) 2013-07-22 2024-03-19 The Rockefeller University System and method for optical detection of skin disease
US10057505B2 (en) 2014-06-13 2018-08-21 FotoFinder Systems GmbH Full-body image capturing and image processing system and method for its operation
WO2015188964A1 (en) * 2014-06-13 2015-12-17 FotoFinder Systems GmbH Whole-body image recording and image processing system and method for operating same
WO2016060557A1 (en) * 2014-10-17 2016-04-21 Stichting Maastricht Radiation Oncology "Maastro-Clinic" Image analysis method supporting illness development prediction for a neoplasm in a human or animal body
EP3996039A1 (en) * 2014-10-17 2022-05-11 Stichting Maastricht Radiation Oncology "Maastro-Clinic" Image analysis method supporting illness development prediction for a neoplasm in a human or animal body
US10311571B2 (en) 2014-10-17 2019-06-04 Stichting Maastricht Radiation Oncology “Maastro-Clinic” Image analysis method supporting illness development prediction for a neoplasm in a human or animal body
US11134885B2 (en) 2015-08-13 2021-10-05 The Rockefeller University Quantitative dermoscopic melanoma screening
US11250945B2 (en) 2016-05-02 2022-02-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US10777317B2 (en) 2016-05-02 2020-09-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11923073B2 (en) 2016-05-02 2024-03-05 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
CN107798673B (en) * 2016-09-01 2021-07-27 卡西欧计算机株式会社 Diagnosis support device, image processing method for diagnosis support device, and storage medium storing program
CN107798673A (en) * 2016-09-01 2018-03-13 卡西欧计算机株式会社 The storage medium of diagnosis supporting device, the image processing method of diagnosis supporting device and storage program
EP3432268A1 (en) * 2016-09-01 2019-01-23 Casio Computer Co., Ltd. Diagnosis assisting device, image processing method in diagnosis assisting device, and computer program
US10586331B2 (en) 2016-09-01 2020-03-10 Casio Computer Co., Ltd. Diagnosis assisting device, image processing method in diagnosis assisting device, and non-transitory storage medium having stored therein program
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
US11903723B2 (en) 2017-04-04 2024-02-20 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
DE102019130369A1 (en) * 2019-11-11 2021-05-12 Gottfried Wilhelm Leibniz Universität Hannover Camera system for prescribing a detailed image of a patient's skin and online dermatoscope

Also Published As

Publication number Publication date
US7657101B2 (en) 2010-02-02
CA2595239A1 (en) 2006-07-27
US20120033867A1 (en) 2012-02-09
AU2010257350A1 (en) 2011-01-20
US20140125787A1 (en) 2014-05-08
CA2595239C (en) 2016-12-20
US8068675B2 (en) 2011-11-29
AU2006206334C1 (en) 2011-05-19
US20190273890A1 (en) 2019-09-05
US20100111387A1 (en) 2010-05-06
AU2006206334A1 (en) 2006-07-27
CA2856932A1 (en) 2006-07-27
CA2856932C (en) 2015-10-13
AU2006206334B2 (en) 2010-09-23
NZ556655A (en) 2010-10-29
US9723270B2 (en) 2017-08-01
US20060210132A1 (en) 2006-09-21
WO2006078902A3 (en) 2006-12-07

Similar Documents

Publication Publication Date Title
US20190273890A1 (en) Devices and methods for identifying and monitoring changes of a suspect area of a patient
US11382558B2 (en) Skin feature imaging system
US20210219907A1 (en) Method of monitoring a surface feature and apparatus therefor
US20120206587A1 (en) System and method for scanning a human body
US11903723B2 (en) Anatomical surface assessment methods, devices and systems
US6567682B1 (en) Apparatus and method for lesion feature identification and characterization
US6427022B1 (en) Image comparator system and method for detecting changes in skin lesions
CN103542935B (en) For the Miniaturized multi-spectral that real-time tissue oxygenation is measured
US20090060304A1 (en) Dermatology information
US8606345B2 (en) Medical dual lens camera for documentation of dermatological conditions with laser distance measuring
WO2011036259A1 (en) Dermatoscope and elevation measuring tool
KR102222059B1 (en) Object position recognition diagnostic device that outputs the object position and the diagnosis result using a diagnostic device with marker
EP4297632A1 (en) Multi-function device and a multi-function system for ergonomically and remotely monitoring a medical or a cosmetic skin condition
Sun et al. A photometric stereo approach for chronic wound measurement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase (Ref document number: 2595239; Country of ref document: CA)
WWE Wipo information: entry into national phase (Ref document number: 2006206334; Country of ref document: AU)
NENP Non-entry into the national phase (Ref country code: DE)
WWE Wipo information: entry into national phase (Ref document number: 556655; Country of ref document: NZ)
ENP Entry into the national phase (Ref document number: 2006206334; Country of ref document: AU; Date of ref document: 20060119; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 06719015; Country of ref document: EP; Kind code of ref document: A2)