US20170079577A1 - Method of monitoring a surface feature and apparatus therefor - Google Patents


Info

Publication number
US20170079577A1
Application US 15/370,284; publication US 2017/0079577 A1
Authority
US
United States
Prior art keywords
outline
camera
anatomical feature
anatomical
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/370,284
Inventor
William Richard Fright
Mark Arthur Nixon
Bruce Clinton McCallum
James Telford George Preddey
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aranz Healthcare Ltd
Original Assignee
Aranz Healthcare Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aranz Healthcare Ltd filed Critical Aranz Healthcare Ltd
Priority to US15/370,284
Publication of US20170079577A1
Assigned to APPLIED RESEARCH ASSOCIATES NZ LIMITED reassignment APPLIED RESEARCH ASSOCIATES NZ LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FRIGHT, WILLIAM RICHARD, MCCALLUM, BRUCE CLINTON, NIXON, MARK ARTHUR, PREDDEY, JAMES TELFORD GEORGE
Assigned to ARANZ HEALTHCARE LIMITED reassignment ARANZ HEALTHCARE LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: APPLIED RESEARCH ASSOCIATES NZ LIMITED


Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0013 Medical image data (remote monitoring of patients using telemetry)
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B5/0059 Measuring using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/01 Measuring temperature of body parts; diagnostic temperature sensing
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1073 Measuring volume, e.g. of limbs
    • A61B5/1077 Measuring of profiles
    • A61B5/1079 Measuring physical dimensions using optical or photographic means
    • A61B5/14539 Measuring characteristics of body fluids or tissues, for measuring pH
    • A61B5/441 Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/444 Evaluating skin marks, e.g. mole, nevi, tumour, scar
    • A61B5/445 Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • A61B5/447 Skin evaluation specially adapted for aiding the prevention of ulcer or pressure sore development
    • A61B5/6898 Sensors mounted on portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B5/7275 Determining trends in physiological measurement data; predicting development of a medical condition
    • A61B5/7282 Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • A61B5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds
    • A61B5/748 Selection of a region of interest, e.g. using a graphics tablet
    • A61B5/7485 Automatic selection of region of interest
    • A61B8/488 Diagnostic techniques involving Doppler signals
    • A61B8/5223 Ultrasonic data or image processing for extracting a diagnostic or physiological parameter
    • G — PHYSICS
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS
    • G01B11/002 Optical measurement of two or more coordinates
    • G01B11/02 Optical measurement of length, width or thickness
    • G01B11/14 Optical measurement of distance or clearance between spaced objects
    • G01B11/22 Optical measurement of depth
    • G01B11/285 Optical measurement of areas using photoelectric detection means
    • G06T7/0004 Industrial image inspection
    • G06T7/0014 Biomedical image inspection using an image reference approach
    • H — ELECTRICITY
    • H04N1/00209 Transmitting or receiving image data via a computer, e.g. using e-mail or a computer network
    • H04N1/00244 Connection of a still picture apparatus with a server, e.g. an internet server
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/64 Computer-aided capture of images, e.g. check of taken image quality, advice or proposal for image composition
    • H04N5/23222; H04N5/23293
    • A61B2034/101 Computer-aided simulation of surgical operations; A61B2034/105 Modelling of the patient
    • A61B2090/364 Correlation of different images in respect to the body; A61B2090/367 Creating a 3D dataset from 2D images using position information
    • A61B2576/02 Medical imaging apparatus involving image processing specially adapted for a particular organ or body part
    • G06T2207/10028 Range image; depth image; 3D point clouds
    • G06T2207/30088 Skin; dermal

Definitions

  • the invention relates to a method of monitoring a surface feature and an apparatus for performing such monitoring.
  • the method and apparatus may find application in a wide range of fields from industrial applications through to medical or veterinary applications such as monitoring dermatological surface features such as wounds, ulcers, sores, lesions, tumours, bruises, burns, psoriasis, keloids, skin cancers, erythema etc.
  • Various techniques have been used to monitor wounds, ulcers, sores, lesions, tumours etc. (herein referred to collectively as “wounds”) both within hospitals and outside hospitals (e.g. domiciliary based care, primary care facilities etc.). Typically these wounds are concave and up to about 250 millimetres across. Manual techniques are typically labour-intensive and require examination and contact by skilled personnel. Such measurements may be inaccurate and there may be significant variation between measurements made by different personnel. Further, these approaches may not preserve any visual record for review by an expert or for subsequent comparison.
  • WO 2006/078902 discloses a system in which the scale of a captured image is determined using a laser triangulation sensor. The distance of the camera from a patient's skin is determined using the position of a laser spot in the image. Only a single laser spot is used and the laser is used only in a simple distance measurement.
  • US2005/0027567 discloses a system in which a medical professional may enter patient information into a portable computing device. A nurse may also photograph the patient's wounds, these photographs becoming part of the patient's record. However, use of this image data is limited and the computing device is effectively used simply to allow notes to be taken.
  • in one aspect, the invention provides a method of determining at least one dimension of a surface feature, as illustrated in the following figures:
  • FIG. 1 shows the principle of operation of an apparatus according to one embodiment
  • FIG. 2 shows an image of a surface feature with a single stripe projected onto the surface feature
  • FIG. 3 a shows an image of a surface feature with cross hairs projected onto the surface feature
  • FIG. 3 b shows a cross-sectional view of a wound
  • FIG. 4 shows an image of a surface feature with a series of dots projected onto the surface feature
  • FIG. 5 shows one embodiment employing a personal digital assistant (PDA) for performing methods of the invention
  • FIG. 6 shows a bottom view of a Tablet PC and 3-D camera
  • FIG. 7 shows a top view of the Tablet PC and 3-D camera of FIG. 6 .
  • FIG. 8 shows an alternative apparatus and method
  • FIG. 9 shows an image illustrating a method of using the apparatus of FIG. 8 ;
  • FIG. 10 shows an apparatus according to a further embodiment
  • FIG. 11 shows a system according to another embodiment.
  • a camera 1 has an optical axis 2 and an image capture region 3 .
  • Laser 4 is disposed in a fixed angular relationship to optical axis 2 so that the fan beam 5 is disposed at angle α to optical axis 2 .
  • laser 4 generates a single stripe 6 .
  • a laser projecting a single dot could be used.
  • the camera 1 is preferably a high resolution digital colour camera.
  • an illumination means (such as a white LED 44 for low power applications) can be used to give relatively constant background lighting.
  • the assembly of camera 1 and laser 4 is directed so that optical axis 2 is aligned with the central region of wound 7 .
  • Laser 4 projects stripe 6 across wound 7 and the image is captured by camera 1 . It will be appreciated that due to the fixed angular relationship of the laser fan beam 5 and the optical axis 2 that the distance of points of stripe 6 from camera 1 may be determined: the distance of points of stripe 6 along the x-axis shown in FIG. 1 is directly related to the distance of the point from camera 1 .
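As an illustrative sketch (not part of the patent text), the triangulation relationship just described can be written down directly. Assume a pinhole camera at the origin looking along +z with focal length f in pixels, and a laser offset by a baseline along x and tilted back toward the optical axis by angle α (the specification later mentions roughly 50 mm and 30° for a PDA embodiment); the function name and parameters here are hypothetical:

```python
import math

def stripe_depth(u_px, focal_px, baseline_m, alpha_rad):
    """Depth (m) of a laser-stripe point from its image x-offset.

    The camera ray for pixel offset u satisfies x = z*u/f; the laser
    plane satisfies x = baseline - z*tan(alpha).  Equating the two
    and solving for z gives the point's distance from the camera.
    """
    return baseline_m / (u_px / focal_px + math.tan(alpha_rad))

# A stripe point imaged exactly on the optical axis (u = 0):
z0 = stripe_depth(0.0, focal_px=800.0, baseline_m=0.05, alpha_rad=math.radians(30))
# z0 = 0.05 / tan(30°) ≈ 0.0866 m
```

This is the sense in which the stripe's x-position in the image is directly related to distance: as the surface moves away, the stripe migrates across the image, and each pixel offset maps to exactly one depth.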
  • the assembly of camera 1 and laser 4 may be positioned above wound 7 so that stripe 6 is aligned with optical axis 2 .
  • This may be achieved by aligning cross hairs (or a dot) in the centre of a display screen displaying the image with the centre of wound 7 and stripe 6 .
  • the camera is positioned a known distance away from the centre of wound 7 and so a scale can be determined.
  • the area of a wound may be calculated by calculating the pixel area of wound 7 from a captured image and multiplying by a known scaling factor.
  • This technique may be effective where camera 1 can be oriented normal to the wound 7 and where wound 7 is generally planar. This technique offers a simple solution in such cases. However, many wounds are not generally planar and images may be taken at an oblique angle. In such cases this approach may not provide sufficient accuracy and repeatability due to the camera axis not being perpendicular to the wound and significant variation in the distance from the camera to the wound from that assumed.
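A minimal sketch of the fixed-distance area estimate described above, assuming a pinhole model where each pixel subtends (distance/focal length) of surface per side; the function and its parameters are illustrative, not from the patent:

```python
def wound_area_mm2(pixel_count, distance_mm, focal_px):
    """Area of a planar wound imaged head-on at a known distance.

    With the camera axis normal to the wound, one pixel covers
    (distance_mm / focal_px) mm on the surface, so the counted
    pixel area scales by the square of that factor.
    """
    mm_per_px = distance_mm / focal_px
    return pixel_count * mm_per_px ** 2

# 20 000 wound pixels imaged at 200 mm with an 800 px focal length:
area = wound_area_mm2(20000, distance_mm=200.0, focal_px=800.0)  # 0.25 mm/px -> 1250 mm^2
```

The quadratic dependence on distance is also why the technique degrades quickly when the true camera-to-wound distance deviates from the assumed one.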
  • an image may be captured in the same fashion except that the stripe need not be aligned with the optical axis of the camera.
  • An image as shown in FIG. 2 may be obtained.
  • Points 9 and 10 , where the outline 8 of wound 7 intersects stripe 6 , may be used to calculate scale. From the locations of points 9 and 10 in the image their corresponding (x, y, z) coordinates can be obtained using the known relationship of the laser-camera system.
  • a scale factor may be determined from the (x, y, z) coordinates of points 9 and 10 and used to scale the pixel area of wound 7 to produce a scaled area value. Whilst this technique does not require a user to align the stripe with the optical axis it still suffers from the limitations of the technique described above.
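A hedged sketch of that scale-factor step, assuming the two intersection points are known both in millimetres (from the laser-camera geometry) and in pixels; all names are illustrative:

```python
import math

def scale_mm_per_px(p1_xyz, p2_xyz, p1_px, p2_px):
    """Scale factor from two stripe/outline intersection points.

    p1_xyz, p2_xyz: (x, y, z) coordinates in mm recovered from the
    laser-camera relationship; p1_px, p2_px: the same two points in
    image pixels.  The ratio of real to pixel separation gives the
    local scale of the image.
    """
    real_mm = math.dist(p1_xyz, p2_xyz)
    pix = math.dist(p1_px, p2_px)
    return real_mm / pix

# A 30 mm span of stripe covering 120 px of image:
s = scale_mm_per_px((0, 0, 200), (30, 0, 200), (100, 240), (220, 240))
area_mm2 = 5000 * s ** 2   # scale a 5000 px^2 wound region
```

Because a single scale factor is applied to the whole outline, the estimate inherits the planarity and perpendicularity assumptions noted above.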
  • laser 4 projects structured light in the form of laser cross hairs onto the image capture area.
  • An image captured according to this embodiment is shown in FIG. 3 a .
  • the laser stripes 11 and 12 captured in the image may be identified automatically based on colour, light intensity etc.
  • the outline 13 is preferably user defined by drawing the outline on a touch display screen displaying the image.
  • the image points 14 , 15 , 16 and 17 where cross hairs 11 and 12 intersect with outline 13 may be automatically determined. From these points their corresponding (x, y, z) coordinates can be obtained as above. These three-dimensional coordinates may be utilised to determine the best-fit plane through all points.
  • the best-fit plane will generally be the plane having the minimum sum of squared orthogonal distances from the points to the plane.
  • the image may then be projected onto this plane using, for example, an affine transformation.
  • the resulting image is now scaled linearly and orthogonally.
  • the area within outline 13 may then be calculated from this transformed image. Any number of laser stripes may be used and these stripes may intersect with each other or not.
  • This approach has the advantage that it provides correction where an image is not taken normal to a wound. Determining the area within a two dimensional outline rather than in three dimensional space also reduces the computational load.
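The plane-fitting and projection steps above can be sketched as follows. This is an assumed implementation, not the patent's: it uses an SVD least-squares fit (which minimises the sum of squared orthogonal distances, matching the best-fit criterion stated above) and the shoelace formula for the area inside the projected outline:

```python
import numpy as np

def best_fit_plane(points):
    """Least-squares plane through 3-D points.

    Returns (centroid, 2x3 in-plane basis, unit normal).  The SVD of
    the centred points yields the directions of decreasing variance;
    the last right singular vector is the plane normal.
    """
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[:2], vt[2]

def projected_area(outline_xyz):
    """Project a 3-D outline onto its best-fit plane and return the
    enclosed area (shoelace formula on the 2-D coordinates)."""
    centroid, basis, _ = best_fit_plane(outline_xyz)
    uv = (np.asarray(outline_xyz, float) - centroid) @ basis.T
    x, y = uv[:, 0], uv[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
```

Working with the flattened 2-D outline is what keeps the computational load low, as noted above: the area reduces to a single pass over the outline vertices.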
  • a wound depth measurement may also be derived as will be explained in connection with FIG. 3 b .
  • the point 18 of greatest depth b from best-fit plane 19 may be determined iteratively or by other methods. This may be determined for an individual point along one of the cross hairs 11 , 12 or for a group of points.
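A sketch of the depth measurement, assuming the best-fit plane is available as a centroid and unit normal (as in the plane-fitting step described above); the helper name is hypothetical:

```python
import numpy as np

def depth_from_plane(point, centroid, normal):
    """Signed orthogonal distance of a surface point from the
    best-fit plane (centroid, unit normal).  The wound depth b is
    the largest magnitude over the sampled stripe points."""
    return float(np.dot(np.asarray(point, float) - centroid, normal))

# Deepest sampled stripe point relative to the plane z = 0:
pts = [(0, 0, -1.2), (1, 0, -3.4), (2, 0, -0.5)]
b = max(abs(depth_from_plane(p, np.zeros(3), np.array([0.0, 0.0, 1.0]))) for p in pts)
# b == 3.4
```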
  • wound measurements may be made.
  • the so-called “Kundin area” may be calculated by obtaining the maximum linear dimension of the wound and the short axis (orthogonal to the long axis) of the outline and multiplying the product of these measurements by π/4.
  • the so-called “Kundin volume” may be calculated from the product of the two diameters, the maximum depth and a factor of 0.327. The dimensions may be determined and the volume calculated by a local processor. Various other algorithms may be used to calculate wound volume as appropriate for the circumstances.
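The two Kundin formulas quoted above are simple products and can be sketched directly (the π/4 factor is the area of an ellipse with the two axes as diameters; 0.327 is the empirical volume factor stated above):

```python
import math

def kundin_area(long_axis_mm, short_axis_mm):
    """Kundin area: pi/4 times the product of the long and short
    axes of the wound outline."""
    return math.pi / 4 * long_axis_mm * short_axis_mm

def kundin_volume(long_axis_mm, short_axis_mm, max_depth_mm):
    """Kundin volume: product of the two diameters and the maximum
    depth, scaled by the factor 0.327."""
    return 0.327 * long_axis_mm * short_axis_mm * max_depth_mm

a = kundin_area(40.0, 20.0)         # ≈ 628.3 mm^2
v = kundin_volume(40.0, 20.0, 5.0)  # = 1308.0 mm^3
```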
  • in FIG. 4 another implementation is shown.
  • a series of three laser dots 31 , 32 and 33 are projected instead of one or more laser stripes.
  • the laser dots are projected in a diverging pattern so that as the device is moved towards or away from the surface feature the spacing between the dots may be scaled so that they may be aligned with the outline of the wound 30 .
  • This approach has the advantage that the intersection between the stripes and the wound outline does not need to be determined as in the previous embodiments. Further, the plane passing through the three points may be easily calculated.
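Why the three-dot plane is easy to calculate: three non-collinear points define a plane exactly, so no least-squares fit is needed. An illustrative sketch (names assumed) using the cross product of two edge vectors:

```python
import numpy as np

def plane_through_dots(p1, p2, p3):
    """Exact plane through the three projected laser dots: returns a
    point on the plane and the unit normal, computed as the cross
    product of two edge vectors."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)
    return p1, n / np.linalg.norm(n)

# Three dots recovered in mm from the laser-camera geometry:
point, normal = plane_through_dots((0, 0, 200), (30, 0, 202), (0, 30, 201))
```

The resulting plane plays the same role as the best-fit plane in the cross-hair embodiment: the outline is projected onto it for area calculation, and the extra dot 34 provides the depth measurement relative to it.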
  • a further point 34 may be provided for depth calculation. Point 34 will preferably be placed at the position of maximum wound depth.
  • the outline of the wound may be determined utilising image processing techniques. However, the results of such techniques may be variable depending upon image quality, available processing capacity and the optical characteristics of the wound. According to a preferred embodiment the outline is input by a user.
  • Apparatus for performing the method may take a variety of forms ranging from a stationary system (having a stationary camera or a handheld camera connected wirelessly or by a cable) to a fully portable unit.
  • Portable units in the form of PDAs, cell phones, notebooks, ultramobile PCs etc. including an integrated or plug-in camera allow great flexibility, especially for medical services outside of hospitals.
  • in FIG. 5 an apparatus for implementing the invention according to one exemplary embodiment is shown.
  • the apparatus consists of a PDA 20 including a camera, such as a Palm or HP iPaQ, having a cross hair laser generator 21 which projects cross hairs at an angle to the optical axis of the PDA camera (as shown in FIG. 1 ).
  • the cross hair laser generator may be offset from the camera by about 50 millimetres and disposed at an angle of about 30° to the optical axis of the camera.
  • An image is captured by the camera of the PDA and displayed by touch screen 22 .
  • a user can draw an outline 24 about the boundary of the wound 25 using input device 23 on touch screen 22 .
  • the apparatus may allow adjustment of outline 24 using input device 23 .
  • placing input device 23 near outline 24 and dragging it may drag the proximate portion of the outline as the input device 23 is dragged across the screen.
  • This may be configured so that the effect of adjustment by the input device is proportional to the proximity of the input device to the outline.
  • the portion proximate to the outline will be adjusted whereas if the input device is placed some distance from the outline a larger area of the outline will be adjusted as the input device is dragged.
  • an image may be stored by the PDA in a patient record along with measurement information (wound area, wound depth, wound volume etc.).
  • An image without the cross hairs may also be captured by the PDA deactivating laser 21 . This may be desirable where an image of the wound only is required.
  • comparative measurements may be made and an indication of improvement or deterioration may be provided.
  • the PDA has wireless capabilities images may be sent directly for storage in a central database or distributed to medical professionals for evaluation. This allows an expert to review information obtained in the field and provide medical direction whilst the health practitioner is visiting the patient. The historic record allows patient progress to be tracked and re-evaluated, if necessary.
  • Measurements of other wound information may also be made.
  • the colour of the wound and the size of particular coloured regions may also be calculated. These measurements may require a colour reference target to be placed within the image capture area for accurate colour comparison to be made.
  • FIGS. 6 and 7 show a tablet PC 26 having a stereoscopic 3-D camera 27 connected thereto.
  • Tablet PC 26 is a notebook PC with an interactive screen such as a Toshiba Portege M200 and camera 27 may be a stereo camera such as a PointGrey Bumblebee camera.
  • the stereoscopic camera 27 provides three-dimensional image information which is utilised by the tablet PC 26 to produce a three-dimensional model.
  • a user utilising input device 28 may draw outline 29 around the wound displayed on the tablet PC screen. utilising the three dimensional data, area and volume may be directly calculated.
  • time-of-flight cameras may be substituted for camera 27 .
  • Time-of-flight cameras utilise modulated coherent light illumination and per-pixel correlation hardware.
  • the apparatus shown in FIG. 8 includes a pair of lasers 35 and 36 which project crossing fan beams 37 and 38 onto surface 39 .
  • Lasers 35 and 36 are maintained in a fixed relationship with respect to each other and camera 40 .
  • crossing beams 37 and 38 the spacing between beams 37 and 38 may be adjusted by a user over a convenient range by moving the assembly of lasers 35 , 36 and camera 40 towards or away from surface 39 .
  • FIG. 9 illustrates use of the apparatus shown in FIG. 8 in relation to a cylindrical surface 42 , such as is typical for a section of an arm or leg.
  • the method may be applied to any surface that may be transformed to a planar (flat) form, i.e. “unwrapped”. In the case of a “developable” surface, there is no distortion and the surface remains continuous, by definition.
  • fan beams 37 and 38 are projected onto cylindrical surface 42 they curve in a diverging manner as shown in FIG. 9 .
  • a user moves the assembly of lasers 35 and 36 and camera 40 with respect to the surface 42 so as to place beams 37 and 38 just outside the boundary 41 of a wound.
  • Camera 40 then captures an image as shown in FIG. 9 .
  • the beams 37 and 38 may be within the boundary 41 of a wound.
  • the three-dimensional locations of elements of beams 37 and 38 may then be determined from the captured image.
  • a three dimensional model of the surface (grid 43 illustrates this) may be calculated using the three dimensional coordinates of elements along lines 37 and 38 .
  • the model may be an inelastic surface draped between the three-dimensional coordinates of the structured light elements, or an elastic surface stretched between the three-dimensional coordinates, or a model of the anatomy, or simply a scaled planar projection.
  • a model of the anatomy may be a model retrieved from a library of models, or simply a geometric shape approximating anatomy (a cylinder approximating a leg, for example).
  • The three dimensional surface may be unwrapped to form a planar image in which all regions have the same scale (i.e. for a grid, the grid is unwrapped such that all cells of the image are the same size).
  • The area within wound boundary 41 may then be easily calculated from the planar image.
  • Alternatively, the area within wound boundary 41 may be calculated by scaling the areas within each region according to scale attributes associated with each region (e.g. for the grid example, normalising the total area within each cell to be the same). The granularity can of course be adjusted depending upon the accuracy required.
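The per-region scaling described above can be sketched as follows. The data layout (a pixel count plus a millimetres-per-pixel scale attribute for each grid cell) is an assumed representation for illustration, not one specified in the text.

```python
# Sketch of per-region area scaling: each cell of the unwrapped grid model
# carries a scale attribute giving the true surface area represented by one
# image pixel inside that cell (assumed data layout, mm^2 per pixel).

def feature_area(cells):
    """Sum the true area of a surface feature over grid cells.

    `cells` is a list of (pixels_inside_outline, mm2_per_pixel) pairs,
    one per grid cell of the unwrapped model.
    """
    return sum(pixels * scale for pixels, scale in cells)

# Example: three cells with different local scales due to surface curvature.
cells = [(120, 0.25), (80, 0.30), (40, 0.40)]
area_mm2 = feature_area(cells)
```

Cells on the more steeply curved parts of the surface carry a larger scale attribute, so their pixels contribute proportionally more area.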
  • FIG. 10 shows an apparatus according to a further embodiment, in which one or more further sensors are provided.
  • The apparatus 50 includes a PDA 51, with a housing 52 containing a camera 53, a laser generator 54 and a GPS receiver 55.
  • The GPS receiver may alternatively be provided in a separate module, within the PDA 51, or on a plug-in card.
  • The positioning module may be connected to the PDA via any suitable wired or wireless connection. Positioning systems other than GPS may also be suitable.
  • A positioning system allows automation of tasks and validation of actions. This may be achieved using the apparatus alone, or through communication with a central computer system and database. For example, a nurse may be using the apparatus to monitor wound healing for a patient. The nurse arrives at the patient's home and the position of the home is determined using the GPS system. The position may be used in determining an address. This may be used to ensure that the nurse is at the correct address, possibly by comparison with a schedule of patient visits.
  • The system may also automatically select a patient associated with that address from a patient database.
  • Alternatively, the nurse enters patient information using the PDA and this information is automatically associated with the address determined using the GPS receiver. This avoids the need to enter a large amount of data using the PDA.
  • The position may also be used directly, without converting to an address, to select a patient associated with that position or to associate a new patient with a position.
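A direct position-to-patient lookup of the kind described could be sketched as below. The record fields ("name", "position") and the 100-metre acceptance radius are hypothetical choices for illustration, not taken from the text.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) pairs in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * math.asin(math.sqrt(a))

def select_patient(position, patients, max_m=100.0):
    """Select the patient registered nearest the apparatus position,
    provided the record lies within max_m metres; otherwise None."""
    best = min(patients, key=lambda rec: haversine_m(position, rec["position"]))
    return best if haversine_m(position, best["position"]) <= max_m else None

# Hypothetical patient database entries:
patients = [
    {"name": "A", "position": (-43.5321, 172.6362)},
    {"name": "B", "position": (-43.5500, 172.6500)},
]
match = select_patient((-43.5322, 172.6361), patients)  # a few metres from A
```

When no record falls within the acceptance radius the apparatus could instead offer to associate a new patient with the measured position.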
  • The positioning system may also be used in auditing user actions. For example, a nurse may enter patient information and this may be verified using the position data by checking it against a patient database. This also allows an employer to monitor staff actions, to ensure that a staff member has in fact visited a particular address or patient.
  • Data gathered using the GPS system may also be stored for future reference.
  • For example, travel data may be gathered by monitoring position information over a period of time. This data may be used later in estimating travel times between sites and in establishing or optimizing travel schedules for workers.
  • FIG. 10 also shows an auxiliary sensor 56 , connected to the PDA via a wired connection 57 .
  • A wireless connection may also be used, and any number of auxiliary sensors may be connected to the PDA.
  • Auxiliary sensors could also be included in the module 52.
  • The auxiliary sensor allows further data to be gathered. For example, where the apparatus is used to capture an image of a wound in a patient's skin, the auxiliary sensor will allow measurement of another physical or chemical parameter associated with the patient, such as temperature, pH, moisture or odour.
  • The auxiliary sensor may also be an optical probe, which illuminates the skin or wound and analyses the spectrum of scattered light. For example, a fluorescence probe could be used.
  • In some embodiments the auxiliary sensors include a Doppler Ultrasound Probe. The management of some types of wound, such as vascular ulcers, requires measurement of blood-flow in the underlying tissue, and Doppler Ultrasound is the method generally used to perform this measurement.
  • Low-power Doppler Ultrasound Probes such as those used in foetal heart-beat monitors may be suitable. This would make it unnecessary for a patient to visit a clinic or hospital, or for a separate ultrasound machine to be transported.
  • Data gathered from the auxiliary sensors may be associated with a particular address, patient or image. Data may be displayed on the PDA's screen, and may be overlaid on the associated image. The combined information may enable more advanced wound analysis methods to be employed.
  • The use of auxiliary sensors allows many measurements to be more easily performed at the same time as an image is captured and by the same person. (In a medical setting, this person may also be performing wound treatment.) This is efficient and also allows data to be easily and accurately associated with a particular image or patient.
  • The section containing the lasers and camera could be combined into a unit detachable from the PDA, interfaced via an SDIO or Compact Flash (CF) slot, for example.
  • In such a unit the camera can be optimally focussed, and an illumination means, such as a white LED, may be used to give relatively constant background lighting.
  • Alternatively, the section containing the camera and/or lasers could be movable with respect to the PDA (being interconnected by a cable or wirelessly). This allows independent manipulation of the camera to capture wounds in awkward locations whilst optimising viewing of the image to be captured.
  • Multiple images may be captured in rapid succession. This is particularly advantageous where structured light (e.g. a laser) is used.
  • For example, two images may be captured: one with the laser on and one with the laser off. Subtracting one of these images from the other yields an image containing just the laser lines (disregarding the inevitable noise). This facilitates the automated detection of the laser profiles.
  • Other combinations of images may also be useful.
  • For example, three images could be captured: one without illumination but with the laser on, one without illumination and with the laser off, and a third with the illumination on and the laser off.
  • The first two images could be used to detect the laser profile, while the third image is displayed to the user.
  • The first image, showing the laser line with the illumination off, would have higher contrast, so that the laser line would stand out more clearly. Capturing the images in rapid succession means that the motion of the camera between the images is negligible.
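The two-frame subtraction described above can be sketched as follows. The greyscale row-list image format and the brightness threshold are assumptions for illustration.

```python
def laser_mask(img_on, img_off, threshold=40):
    """Subtract the laser-off frame from the laser-on frame and threshold
    the difference: pixels that brighten by more than `threshold` when the
    laser fires are treated as laser-line pixels, the rest as background
    or noise. Images are rows of greyscale intensities (0-255).
    """
    return [[1 if on - off > threshold else 0
             for on, off in zip(row_on, row_off)]
            for row_on, row_off in zip(img_on, img_off)]

off = [[10, 12, 11, 10],
       [11, 10, 12, 11]]
on  = [[12, 200, 13, 11],   # one bright laser pixel in each row
       [10, 14, 210, 12]]
mask = laser_mask(on, off)   # [[0, 1, 0, 0], [0, 0, 1, 0]]
```

Because the frames are captured in rapid succession, camera motion between them is negligible and the subtraction isolates the laser profile cleanly.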
  • FIG. 11 shows a system including one or more portable apparatuses 60 such as those described above. These apparatuses 60 may communicate via a communication network 61 with a central server 62 . Preferably the apparatuses 60 communicate wirelessly with the server 62 . The central server 62 may utilize an external database 63 for data storage.
  • This centralised system allows appropriate categorising and storage of data for future use. For example, by mining historical data from the database it is possible to analyse the efficacy of a particular treatment or to compare different treatments. Statistical trends of conditions, treatments and outcomes can be monitored. This data can be used to suggest a particular treatment, based on a set of symptoms exhibited by a particular patient. Data can provide predictions for wound healing. Where actual healing differs from the prediction by more than a threshold, the system may issue an alert.
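The prediction-and-alert idea might be sketched as below. The linear least-squares trend and the 20% tolerance are assumed choices, since the text does not specify a prediction model.

```python
def predict_area(history, day):
    """Least-squares linear fit of (day, area) history, evaluated at `day`.
    A deliberately simple stand-in for whatever healing model the central
    system might use."""
    n = len(history)
    sx = sum(d for d, _ in history)
    sy = sum(a for _, a in history)
    sxx = sum(d * d for d, _ in history)
    sxy = sum(d * a for d, a in history)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope * day + (sy - slope * sx) / n

def healing_alert(history, day, measured_area, tolerance=0.2):
    """Raise an alert when the measured area deviates from the fitted
    healing trend by more than `tolerance` (as a fraction of the
    predicted value)."""
    predicted = predict_area(history, day)
    return abs(measured_area - predicted) > tolerance * abs(predicted)

# Wound area (mm^2) measured at days 0, 7 and 14; check a day-21 measurement:
history = [(0, 100.0), (7, 80.0), (14, 60.0)]
```

A day-21 measurement of 70 mm² sits well above the fitted trend (40 mm²) and would trigger the alert; 42 mm² would not.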
  • a healthcare provider can use the data to audit efficiency of its whole organisation, departments within the organisation or even individual workers. Historical data may be compared with historical worker schedules to determine whether workers are performing all tasks on their schedules. Efficiencies of different workers may be compared.
  • The methods utilize human image processing capabilities to minimise the processing requirements.
  • The methods do not require the placement of articles near the wound and allow historical comparison of a wound.
  • The apparatus is portable with relatively low processing requirements and enables records to be sent wirelessly for evaluation and storage.

Abstract

Dimensions of a surface feature are determined by capturing an image of the surface feature and determining a scale associated with the image. Structured light may be projected onto the surface, such that the position of structured light in the captured image allows determination of scale. A non-planar surface may be unwrapped. The surface may alternatively be projected into a plane to correct for the scene being tilted with respect to the camera axis. A border of the surface feature may be input manually by a user. An apparatus and system for implementing the method are also disclosed.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 15/338,216, filed Oct. 28, 2016, which is a continuation of U.S. application Ser. No. 15/164,793 filed May 25, 2016, which is a continuation of U.S. application Ser. No. 14/272,719 filed May 8, 2014, now U.S. Pat. No. 9,377,295, which is a continuation of U.S. application Ser. No. 12/083,491, filed May 11, 2009, now U.S. Pat. No. 8,755,053, which is a 371 U.S. National Phase of International Application No. PCT/NZ2006/000262 filed Oct. 13, 2006, each of which is incorporated herein by reference in its entirety.
  • FIELD OF THE INVENTION
  • The invention relates to a method of monitoring a surface feature and an apparatus for performing such monitoring. The method and apparatus may find application in a wide range of fields from industrial applications through to medical or veterinary applications such as monitoring dermatological surface features such as wounds, ulcers, sores, lesions, tumours, bruises, burns, psoriasis, keloids, skin cancers, erythema etc.
  • BACKGROUND TO THE INVENTION
  • Various techniques have been used to monitor wounds, ulcers, sores, lesions, tumours etc. (herein referred to collectively as “wounds”) both within hospitals and outside hospitals (e.g. domiciliary based care, primary care facilities etc.). Typically these wounds are concave and up to about 250 millimetres across. Manual techniques are typically labour-intensive and require examination and contact by skilled personnel. Such measurements may be inaccurate and there may be significant variation between measurements made by different personnel. Further, these approaches may not preserve any visual record for review by an expert or for subsequent comparison.
  • A number of techniques for the automated monitoring of wounds have been proposed; see for example U.S. Pat. No. 6,101,408, U.S. Pat. No. 6,873,340, U.S. Pat. No. 4,535,782 and U.S. Pat. No. 5,967,979. A common approach is to place a reference object next to the wound and determine the size of the wound utilising the scale of the reference object. It is often undesirable to place a reference object near to a wound and this requires an additional cumbersome step for a user and risks contamination of the wound. Further, when the target is not in the plane of the wound, or if the wound is not planar, there will be errors in any area calculation.
  • WO 2006/078902 discloses a system in which the scale of a captured image is determined using a laser triangulation sensor. The distance of the camera from a patient's skin is determined using the position of a laser spot in the image. Only a single laser spot is used and the laser is used only in a simple distance measurement.
  • Systems utilising stereoscopic vision and automated boundary determination are known but they are expensive, complex, bulky and require significant computational power. Further, automated identification of the boundary of a wound may be inaccurate and variable. U.S. Pat. No. 6,567,682 and US2005/0084176 use stereoscopic techniques and automated wound boundary determination requiring intensive processing and bulky equipment.
  • Other systems, such as that described in US2004/0136579, require the camera always to be positioned with a guide against the patient's skin. While this consistently positions the camera a desired distance from the surface to be photographed and therefore sets the scale of the image, it is unwieldy and requires undesirable contact with the skin, risking contamination of the wound.
  • US2005/0027567 discloses a system in which a medical professional may enter patient information into a portable computing device. A nurse may also photograph the patient's wounds, these photographs becoming part of the patient's record. However, use of this image data is limited and the computing device is effectively used simply to allow notes to be taken.
  • It is an object of the invention to provide a simple, inexpensive and repeatable method that does not require a scale reference object to be employed and that may be performed at remote locations or to at least provide the public with a useful choice. It is a further object of the invention to provide an apparatus that is simple, portable, inexpensive and easy to use or which at least provides the public with a useful choice.
  • SUMMARY OF THE INVENTION
  • There is thus provided a method of producing a projection of a non-planar surface feature comprising:
      • a. projecting structured light onto the surface feature;
      • b. capturing an image including the surface feature;
      • c. determining the three-dimensional coordinates of structured light elements within the image; and
      • d. unwrapping the image based on the three-dimensional coordinates of the structured light elements to produce a planar projection of the surface feature.
  • According to a further embodiment there is provided a method of determining the area of a non-planar surface feature comprising:
      • a. projecting structured light onto the surface feature;
      • b. capturing an image including the surface feature;
      • c. determining the three-dimensional coordinates of structured light elements within the image;
      • d. determining scale attributes for regions of the image on the basis of the three-dimensional coordinates of the structured light elements; and
      • e. determining the area of the surface feature by scaling regions of the surface feature based on the scale attributes.
  • According to another embodiment there is provided a method of producing a projection of a surface feature comprising:
      • a. capturing an image of a surface feature;
      • b. determining from the image the coordinates of a plurality of points of the surface feature in three-dimensional space;
      • c. determining a plane in which at least a subset of the coordinates lie; and
      • d. projecting the image onto the plane to produce a transformed image.
  • According to a further embodiment there is provided a method of determining at least one dimension of a surface feature, including:
      • a. capturing an image including a surface feature;
      • b. determining a scale associated with the image;
      • c. manually inputting at least part of an outline of the surface feature; and
      • d. determining at least one dimension of the surface feature using the manually input outline data.
  • According to another embodiment there is provided an apparatus including:
      • a. a camera for capturing an image including a surface feature; and
      • b. a portable computing device including:
        • i. a display configured to display the image and to allow a user to manually input at least part of an outline of the surface feature; and
        • ii. a processor configured to determine a scale associated with the image and to determine at least one dimension of the surface feature using the manually input outline data.
  • According to a further embodiment there is provided a portable apparatus including:
      • a. a camera for capturing an image of a surface feature;
      • b. a portable computing device including a processor adapted to determine a scale associated with the image; and
      • c. a positioning module allowing the position of the apparatus to be determined.
  • According to another embodiment there is provided a healthcare apparatus including:
      • a. a camera for capturing an image of a surface feature on a patient;
      • b. one or more auxiliary sensors for determining a physical or chemical parameter associated with the patient; and
      • c. a portable computing device configured to receive image data from the camera and output from the auxiliary sensors, including a processor adapted to determine a scale associated with the image.
  • According to a further embodiment there is provided an apparatus including:
      • a. a camera for capturing an image including a surface feature; and
      • b. one or more structured light projectors configured to project structured light onto the surface, the structured light including two or more structured light components, each projected at a different angle to the camera's optical axis.
    DRAWINGS
  • The invention will now be described by way of example with reference to possible embodiments thereof as shown in the accompanying figures in which:
  • FIG. 1 shows the principle of operation of an apparatus according to one embodiment;
  • FIG. 2 shows an image of a surface feature with a single stripe projected onto the surface feature;
  • FIG. 3a shows an image of a surface feature with cross hairs projected onto the surface feature;
  • FIG. 3b shows a cross-sectional view of a wound;
  • FIG. 4 shows an image of a surface feature with a series of dots projected onto the surface feature;
  • FIG. 5 shows one embodiment employing a personal digital assistant (PDA) for performing methods of the invention;
  • FIG. 6 shows a bottom view of a Tablet PC and 3-D camera;
  • FIG. 7 shows a top view of the Tablet PC and 3-D camera of FIG. 6;
  • FIG. 8 shows an alternative apparatus and method;
  • FIG. 9 shows an image illustrating a method of using the apparatus of FIG. 8;
  • FIG. 10 shows an apparatus according to a further embodiment; and
  • FIG. 11 shows a system according to another embodiment.
  • DETAILED DESCRIPTION
  • Referring to FIG. 1 the general principle of operation of a first embodiment of the invention will be described. A camera 1 has an optical axis 2 and an image capture region 3. Laser 4 is disposed in a fixed angular relationship to optical axis 2 so that the fan beam 5 is disposed at angle α to optical axis 2. In this embodiment laser 4 generates a single stripe 6. Alternatively a laser projecting a single dot could be used. The camera 1 is preferably a high resolution digital colour camera. Optionally, an illumination means (such as a white LED 44 for low power applications) can be used to give relatively constant background lighting.
  • In use the assembly of camera 1 and laser 4 is directed so that optical axis 2 is aligned with the central region of wound 7. Laser 4 projects stripe 6 across wound 7 and the image is captured by camera 1. It will be appreciated that due to the fixed angular relationship of the laser fan beam 5 and the optical axis 2 that the distance of points of stripe 6 from camera 1 may be determined: the distance of points of stripe 6 along the x-axis shown in FIG. 1 is directly related to the distance of the point from camera 1.
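The triangulation relationship described above can be sketched as follows under a pinhole-camera model. The focal length in pixels is an assumed camera parameter; the 50 mm offset and 30° angle are the figures quoted for the PDA embodiment.

```python
import math

def depth_from_stripe(u_px, f_px=1500.0, baseline_mm=50.0, angle_deg=30.0):
    """Depth (mm along the optical axis) of a point on the laser stripe,
    recovered from its image column u_px (pixels from the principal point).

    Pinhole model: the laser sits baseline_mm to one side of the camera and
    is aimed at angle_deg to the optical axis, so a beam point at depth z
    lies at x = -baseline + z*tan(angle) and images at u = f*x/z.
    Inverting gives z = f*baseline / (f*tan(angle) - u).
    The focal length f_px is an assumed camera parameter.
    """
    t = math.tan(math.radians(angle_deg))
    return f_px * baseline_mm / (f_px * t - u_px)

# The stripe crosses the optical axis (u = 0) at z = baseline / tan(angle):
z0 = depth_from_stripe(0.0)
```

Because of the fixed angular relationship, the stripe's image position is a monotonic function of depth, which is what lets the captured image encode scale.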
  • In a first embodiment the assembly of camera 1 and laser 4 may be positioned above wound 7 so that stripe 6 is aligned with optical axis 2. This may be achieved by aligning cross hairs (or a dot) in the centre of a display screen displaying the image with the centre of wound 7 and stripe 6. In this way the camera is positioned a known distance away from the centre of wound 7 and so a scale can be determined.
  • The area of a wound may be calculated by calculating the pixel area of wound 7 from a captured image and multiplying by a known scaling factor. This technique may be effective where camera 1 can be oriented normal to the wound 7 and where wound 7 is generally planar. This technique offers a simple solution in such cases. However, many wounds are not generally planar and images may be taken at an oblique angle. In such cases this approach may not provide sufficient accuracy and repeatability due to the camera axis not being perpendicular to the wound and significant variation in the distance from the camera to the wound from that assumed.
  • In a second embodiment an image may be captured in the same fashion except that the stripe need not be aligned with the optical axis of the camera. An image as shown in FIG. 2 may be obtained. Points 9 and 10, where the outline 8 of wound 7 intersects stripe 6, may be used to calculate scale. From the locations of points 9 and 10 in the image 3 their corresponding (x, y, z) coordinates can be obtained using the known relationship of the laser-camera system. Thus a scale factor may be determined based on the x,y,z coordinates of points 9 and 10 to scale the area 7 to produce a scaled value. Whilst this technique does not require a user to align the stripe with the optical axis it still suffers from the limitations of the technique described above.
  • In one embodiment laser 4 projects structured light in the form of laser cross hairs onto the image capture area. An image captured according to this embodiment is shown in FIG. 3a . The laser stripes 11 and 12 captured in the image may be identified automatically based on colour, light intensity etc. The outline 13 is preferably user defined by drawing the outline on a touch display screen displaying the image. The image points 14, 15, 16 and 17 where cross hairs 11 and 12 intersect with outline 13 may be automatically determined. From these points their corresponding (x, y, z) coordinates can be obtained as above. These three-dimensional coordinates may be utilised to determine the best-fit plane through all points. The best-fit plane will generally be the plane having the minimum sum of squared orthogonal distances from the points to the plane. The image may then be projected onto this plane using, for example, an affine transformation. The resulting image is now scaled linearly and orthogonally. The area within outline 13 may then be calculated from this transformed image. Any number of laser stripes may be used and these stripes may intersect with each other or not.
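The best-fit-plane step can be sketched with a standard singular-value decomposition; this is one common way to minimise the sum of squared orthogonal distances, consistent with the description, though the patent does not prescribe a particular fitting method.

```python
import numpy as np

def best_fit_plane(points):
    """Fit the plane minimising the sum of squared orthogonal point-plane
    distances: the normal is the right singular vector of the centred
    points having the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def project_to_plane(points, centroid, normal):
    """Orthogonally project points onto the fitted plane, after which
    the area within the outline can be computed in two dimensions."""
    pts = np.asarray(points, dtype=float)
    dist = (pts - centroid) @ normal
    return pts - np.outer(dist, normal)

# Four intersection points (cf. points 14-17) that happen to lie in z = 0:
pts = [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (0.0, 3.0, 0.0), (4.0, 3.0, 0.0)]
c, n = best_fit_plane(pts)
```

Working in the fitted plane reduces the area computation to a two-dimensional problem, matching the computational-load point made below.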
  • This approach has the advantage that it provides correction where an image is not taken normal to a wound. Determining the area within a two dimensional outline rather than in three dimensional space also reduces the computational load.
  • A wound depth measurement may also be derived as will be explained in connection with FIG. 3b . The point 18 of greatest depth b from best-fit plane 19 may be determined iteratively or by other methods. This may be determined for an individual point along one of the cross hairs 11, 12 or for a group of points.
  • Utilising this information standard wound measurements may be made. The so-called “Kundin area” may be calculated by obtaining the maximum linear dimension of the wound and the short axis (orthogonal to the long axis) of the outline and multiplying the product of these measurements by π/4. The so-called “Kundin volume” may be calculated from the product of the two diameters, the maximum depth and a factor of 0.327. The dimensions may be determined and the volume calculated by a local processor. Various other algorithms may be used to calculate wound volume as appropriate for the circumstances.
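The Kundin formulas quoted above translate directly to code; the millimetre units in the example are illustrative.

```python
import math

def kundin_area(long_axis, short_axis):
    """Kundin area: (pi/4) x long axis x short axis."""
    return math.pi / 4.0 * long_axis * short_axis

def kundin_volume(long_axis, short_axis, max_depth):
    """Kundin volume: long axis x short axis x maximum depth x 0.327."""
    return 0.327 * long_axis * short_axis * max_depth

# A 40 mm x 20 mm wound, 5 mm deep at its lowest point:
a = kundin_area(40, 20)        # ~628.3 mm^2
v = kundin_volume(40, 20, 5)   # ~1308.0 mm^3
```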
  • Referring now to FIG. 4 another implementation is shown. In this case a series of three laser dots 31, 32 and 33 are projected instead of one or more laser stripes. The laser dots are projected in a diverging pattern so that as the device is moved towards or away from the surface feature the spacing between the dots may be scaled so that they may be aligned with the outline of the wound 30. This approach has the advantage that the intersection between the stripes and the wound outline does not need to be determined as in the previous embodiments. Further, the plane passing through the three points may be easily calculated. A further point 34 may be provided for depth calculation. Point 34 will preferably be placed at the position of maximum wound depth.
  • The outline of the wound may be determined utilising image processing techniques. However, the results of such techniques may be variable depending upon image quality, available processing capacity and the optical characteristics of the wound. According to a preferred embodiment the outline is input by a user.
  • Apparatus for performing the method may take a variety of forms ranging from a stationary system (having a stationary camera or a handheld camera connected wirelessly or by a cable) to a fully portable unit. Portable units in the form of PDAs, cell phones, notebooks, ultramobile PCs etc. including an integrated or plug-in camera allow great flexibility, especially for medical services outside of hospitals. Referring now to FIG. 5 an apparatus for implementing the invention according to one exemplary embodiment is shown. The apparatus consists of a PDA 20, such as a Palm or HP iPaQ, including a camera and having a cross hair laser generator 21 which projects cross hairs at an angle to the optical axis of the PDA camera (as shown in FIG. 1). For this embodiment the cross hair laser generator may be offset from the camera by about 50 millimetres and disposed at an angle of about 30° to the optical axis of the camera. An image is captured by the camera of the PDA and displayed by touch screen 22. A user can draw an outline 24 about the boundary of the wound 25 using input device 23 on touch screen 22. The apparatus may allow adjustment of outline 24 using input device 23.
  • In one embodiment placing input device 23 near outline 24 and dragging it may drag the proximate portion of the outline as the input device 23 is dragged across the screen. This may be configured so that the effect of adjustment by the input device is proportional to the proximity of the input device to the outline. Thus, if the input device is placed proximate to the outline the portion proximate to the outline will be adjusted whereas if the input device is placed some distance from the outline a larger area of the outline will be adjusted as the input device is dragged.
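The proximity-weighted dragging behaviour described above can be sketched as follows. The distance from the stylus to the nearest outline point sets the reach of the edit; the Gaussian falloff is an assumed smoothing choice, since the text only requires the effect to scale with proximity.

```python
import math

def drag_outline(outline, stylus, delta):
    """Adjust an outline as the stylus is dragged by `delta`.

    Touching close to the outline moves only the nearby portion, while
    touching further away sweeps a broader section of the outline: the
    nearest-point distance sets the reach of the Gaussian weight applied
    to each vertex (an assumed falloff, for illustration).
    """
    dists = [math.hypot(x - stylus[0], y - stylus[1]) for x, y in outline]
    reach = max(min(dists), 1.0)  # nearest-point distance sets the reach
    return [(x + delta[0] * math.exp(-0.5 * (d / reach) ** 2),
             y + delta[1] * math.exp(-0.5 * (d / reach) ** 2))
            for (x, y), d in zip(outline, dists)]

outline = [(0.0, 0.0), (100.0, 0.0)]
moved = drag_outline(outline, (0.0, 5.0), (10.0, 0.0))
# The vertex near the stylus moves noticeably; the far vertex barely moves.
```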
  • Utilising manual input of the outline avoids the need for complex image processing capabilities and allows a compact portable unit, such as a PDA, to be utilised. Further, this approach utilises human image processing capabilities to determine the outline where automated approaches may be less effective.
  • Once an image is captured it may be stored by the PDA in a patient record along with measurement information (wound area, wound depth, wound volume etc.). An image without the cross hairs may also be captured by the PDA deactivating laser 21. This may be desirable where an image of the wound only is required. Where previous information has been stored comparative measurements may be made and an indication of improvement or deterioration may be provided. Where the PDA has wireless capabilities images may be sent directly for storage in a central database or distributed to medical professionals for evaluation. This allows an expert to review information obtained in the field and provide medical direction whilst the health practitioner is visiting the patient. The historic record allows patient progress to be tracked and re-evaluated, if necessary.
  • Measurements of other wound information may also be made. The colour of the wound and the size of particular coloured regions may also be calculated. These measurements may require a colour reference target to be placed within the image capture area for accurate colour comparison to be made.
  • According to another embodiment a 3-D camera may be employed. FIGS. 6 and 7 show a tablet PC 26 having a stereoscopic 3-D camera 27 connected thereto. Tablet PC 26 is a notebook PC with an interactive screen such as a Toshiba Portege M200 and camera 27 may be a stereo camera such as a PointGrey Bumblebee camera. In this embodiment the stereoscopic camera 27 provides three-dimensional image information which is utilised by the tablet PC 26 to produce a three-dimensional model. However, as in the previous embodiments, a user utilising input device 28 may draw outline 29 around the wound displayed on the tablet PC screen. Utilising the three dimensional data, area and volume may be directly calculated.
  • In other embodiments “time-of-flight” cameras may be substituted for camera 27. Time-of-flight cameras utilise modulated coherent light illumination and per-pixel correlation hardware.
  • Referring now to FIGS. 8 and 9, an alternative apparatus and method will be described. The apparatus shown in FIG. 8 includes a pair of lasers 35 and 36 which project crossing fan beams 37 and 38 onto surface 39. Lasers 35 and 36 are maintained in a fixed relationship with respect to each other and camera 40. By utilising crossing beams 37 and 38, the spacing between beams 37 and 38 may be adjusted by a user over a convenient range by moving the assembly of lasers 35, 36 and camera 40 towards or away from surface 39.
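The geometry behind this adjustment can be sketched as follows: two lasers separated by a baseline, each tilted inward, produce lines whose separation on the surface shrinks linearly with range until the beams cross, then grows again. The baseline and tilt angle below are illustrative assumptions, not values from the patent.

```python
import math

def beam_separation(z, baseline=0.04, theta_deg=2.0):
    """Separation (m) between two crossing laser fan beams on a surface
    at range z (m) from the laser apertures.

    baseline  -- distance between the two laser apertures (assumed value)
    theta_deg -- inward tilt of each fan relative to the camera axis
                 (assumed value)
    """
    t = math.tan(math.radians(theta_deg))
    # Each beam walks inward by z*tan(theta); separation is the
    # remaining gap, growing again past the crossing point.
    return abs(baseline - 2.0 * z * t)

# Range at which the beams cross (separation falls to zero):
crossing_range = 0.04 / (2.0 * math.tan(math.radians(2.0)))
```

Moving the assembly between zero range and the crossing range sweeps the line spacing continuously from the full baseline down to zero, which is how the user fits the beams just outside (or inside) the wound boundary.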
  • FIG. 9 illustrates use of the apparatus shown in FIG. 8 in relation to a cylindrical surface 42, such as is typical for a section of an arm or leg. The method may be applied to any surface that may be transformed to a planar (flat) form, i.e. “unwrapped”. In the case of a “developable” surface, there is no distortion and the surface remains continuous, by definition. When fan beams 37 and 38 are projected onto cylindrical surface 42 they curve in a diverging manner as shown in FIG. 9. A user moves the assembly of lasers 35 and 36 and camera 40 with respect to the surface 42 so as to place beams 37 and 38 just outside the boundary 41 of a wound. Camera 40 then captures an image as shown in FIG. 9. For larger wounds the beams 37 and 38 may be within the boundary 41 of a wound.
  • The three-dimensional locations of elements of beams 37 and 38 may then be determined from the captured image. A three-dimensional model of the surface (grid 43 illustrates this) may be calculated using the three-dimensional coordinates of elements along lines 37 and 38. The model may be an inelastic surface draped between the three-dimensional coordinates of the structured light elements, or an elastic surface stretched between the three-dimensional coordinates, or a model of the anatomy, or simply a scaled planar projection. A model of the anatomy may be a model retrieved from a library of models, or simply a geometric shape approximating anatomy (a cylinder approximating a leg, for example).
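In its crudest form, the "draped surface" idea amounts to interpolating a grid between the two recovered 3D laser profiles. The sketch below assumes each profile has already been triangulated into an N x 3 array of points and uses simple linear blending; a real implementation would use the inelastic or elastic constraints described above.

```python
import numpy as np

def drape_surface(profile_a, profile_b, n_rows=20):
    """Build a simple surface model (like grid 43) by linearly
    interpolating between two 3D laser profiles, each given as an
    N x 3 array of points along one fan beam. Returns an
    n_rows x N x 3 grid of 3D points."""
    a = np.asarray(profile_a, dtype=float)
    b = np.asarray(profile_b, dtype=float)
    # Blend weights run from 0 (pure profile A) to 1 (pure profile B).
    w = np.linspace(0.0, 1.0, n_rows)[:, None, None]
    return (1.0 - w) * a[None] + w * b[None]
```

The resulting grid can then be fed to an area calculation, or unwrapped to a planar image as described in the following paragraphs.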
  • In a first method the three-dimensional surface may be unwrapped to form a planar image in which all regions have the same scale (i.e. for a grid, the grid is unwrapped such that all cells of the image are the same size). The area within wound boundary 41 may then be calculated directly from the planar image.
  • Alternatively the area within wound boundary 41 may be calculated by scaling the areas within each region according to scale attributes associated with each region (e.g. for the grid example normalising the total area within each cell to be the same). The granularity can of course be adjusted depending upon the accuracy required.
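Once the surface has been unwrapped to a uniform scale, the area inside the traced boundary reduces to the area of a plane polygon, which the shoelace formula gives directly. This is a minimal sketch; the vertex list stands in for the user-traced outline after unwrapping.

```python
def polygon_area(pts):
    """Planar (shoelace) area of an outline given as a list of (x, y)
    vertices, e.g. the wound boundary after the 3D surface has been
    unwrapped to a uniform-scale planar image."""
    n = len(pts)
    s = 0.0
    for i in range(n):
        x0, y0 = pts[i]
        x1, y1 = pts[(i + 1) % n]  # wrap around to close the outline
        s += x0 * y1 - x1 * y0
    return abs(s) / 2.0

print(polygon_area([(0, 0), (4, 0), (4, 3), (0, 3)]))  # 12.0
```

For the alternative per-region method, the same formula would be applied within each cell and the result multiplied by that cell's scale attribute before summing.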
  • This approach could be extended so that a plurality of parallel crossing lines are projected to achieve greater accuracy. The lines could have different optical characteristics (e.g. colour) to enable them to be distinguished. However, the two line approach described above does have the advantage of mimicking some manual approaches currently employed, which involve tracing the wound outline onto a transparent sheet and then calculating the area.
  • FIG. 10 shows an apparatus according to a further embodiment, in which one or more further sensors are provided. The apparatus 50 includes a PDA 51, with a housing 52 containing a camera 53, laser generator 54 and a GPS receiver 55. The GPS receiver may alternatively be provided in a separate module, within the PDA 51 or in a plugin card. When external to the PDA, the positioning module may be connected to the PDA via any suitable wired or wireless connection. Positioning systems other than GPS may also be suitable.
  • Use of a positioning system allows automation of tasks and validation of actions. This may be achieved using the apparatus alone, or through communication with a central computer system and database. For example, a nurse may be using the apparatus to monitor wound healing for a patient. The nurse arrives at the patient's home and the position of the home is determined using the GPS system. The position may be used in determining an address. This may be used to ensure that the nurse is at the correct address, possibly by comparison with a schedule of patient visits.
  • In response to determination of an address, the system may automatically select a patient associated with that address from a patient database. Alternatively, for a new patient, the nurse enters patient information using the PDA and this information is automatically associated with the address determined using the GPS receiver. This avoids the necessity to enter a large amount of data using the PDA. Similarly, the position may be used directly without converting to an address, to select a patient associated with that position, or to associate a new patient with a position.
  • The positioning system may also be used in auditing user actions. For example, a nurse may enter patient information and this may be verified using the position data by checking it against a patient database. This also allows an employer to monitor staff actions, to ensure that a staff member has in fact visited a particular address or patient.
  • Data gathered using the GPS system may also be stored for future reference. For example, travel data may be gathered by monitoring position information over a period of time. This data may be used later in estimating travel times between sites and in establishing or optimizing travel schedules for workers.
  • FIG. 10 also shows an auxiliary sensor 56, connected to the PDA via a wired connection 57. A wireless connection may also be used and any number of auxiliary sensors may be connected to the PDA. Auxiliary sensors could also be included in the module 52. The auxiliary sensor allows further data to be gathered. For example, where the apparatus is used to capture an image of a wound in a patient's skin, the auxiliary sensor will allow measurement of another physical or chemical parameter associated with the patient, such as temperature, pH, moisture or odour. The auxiliary sensor may also be an optical probe, which illuminates the skin or wound and analyses the spectrum of scattered light. For example, a fluorescence probe could be used.
  • In one embodiment the auxiliary sensors include a Doppler Ultrasound Probe. The management of some types of wound, such as vascular ulcers, requires measurement of blood-flow in the underlying tissue and Doppler Ultrasound is the method generally used to perform this measurement. Low-power Doppler Ultrasound Probes such as those used in foetal heart-beat monitors may be suitable. This would make it unnecessary for a patient to visit a clinic or hospital, or for a separate ultrasound machine to be transported.
  • Data gathered from the auxiliary sensors may be associated with a particular address, patient or image. Data may be displayed on the PDA's screen, and may be overlaid on the associated image. The combined information may enable more advanced wound analysis methods to be employed.
  • Use of auxiliary sensors allows many measurements to be more easily performed at the same time as an image is captured and by the same person. (In a medical setting, this person may also be performing wound treatment.) This is efficient and also allows data to be easily and accurately associated with a particular image or patient.
  • In any of the above embodiments the section containing the lasers and camera could be combined into a unit detachable from the PDA, interfaced via an SDIO or Compact Flash (CF) slot, for example. This adds convenience for the user and enables the lasers and camera to be permanently mounted with respect to each other, for ease of calibration. Furthermore, the camera can be optimally focussed, and an illumination means, such as a white LED, may be used to give relatively constant background lighting.
  • In any of the above embodiments the section containing the camera and/or lasers could be movable with respect to the PDA (being interconnected by a cable or wirelessly). This allows independent manipulation of the camera to capture wounds in awkward locations whilst optimising viewing of the image to be captured.
  • In any of the embodiments described above, multiple images may be captured in rapid succession. This is particularly advantageous where structured light (e.g. a laser) is used. For example, two images may be captured: one with the laser on and one with the laser off. Subtracting one of these images from the other yields an image containing just the laser lines (disregarding the inevitable noise). This facilitates the automated detection of the laser profiles. Other combinations of images may also be useful. For example, three images could be captured: one without illumination but with the laser on, one without illumination and with the laser off, and a third image with the illumination on and the laser off. The first two images could be used to detect the laser profile, while the third image is displayed to the user. The first image, showing the laser line with the illumination off, would have a higher contrast, so that the laser line would stand out more clearly. Capturing the images in rapid succession means that the motion of the camera between the images is negligible.
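The laser-on/laser-off subtraction can be sketched in a few lines. The noise threshold value here is an illustrative assumption; in practice it would be tuned to the camera's noise floor.

```python
import numpy as np

def laser_mask(img_laser_on, img_laser_off, threshold=30):
    """Isolate the projected laser lines by subtracting a laser-off
    frame from a laser-on frame captured in rapid succession (camera
    motion between the frames is assumed negligible). Pixels whose
    brightness increase exceeds the threshold are treated as laser;
    the threshold suppresses sensor noise."""
    on = np.asarray(img_laser_on, dtype=np.int16)   # widen to avoid
    off = np.asarray(img_laser_off, dtype=np.int16)  # uint8 wrap-around
    diff = np.clip(on - off, 0, 255)
    return diff > threshold

# Synthetic example: a dark scene with one bright laser line in row 2.
off = np.zeros((5, 5), dtype=np.uint8)
on = off.copy()
on[2, :] = 200
mask = laser_mask(on, off)
```

The boolean mask can then be passed to a line-fitting step to recover the laser profile coordinates.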
  • FIG. 11 shows a system including one or more portable apparatuses 60 such as those described above. These apparatuses 60 may communicate via a communication network 61 with a central server 62. Preferably the apparatuses 60 communicate wirelessly with the server 62. The central server 62 may utilize an external database 63 for data storage.
  • This centralised system allows appropriate categorising and storage of data for future use. For example, by mining historical data from the database it is possible to analyse the efficacy of a particular treatment or to compare different treatments. Statistical trends of conditions, treatments and outcomes can be monitored. This data can be used to suggest a particular treatment, based on a set of symptoms exhibited by a particular patient. Data can provide predictions for wound healing. Where actual healing differs from the prediction by more than a threshold, the system may issue an alert.
  • A healthcare provider can use the data to audit efficiency of its whole organisation, departments within the organisation or even individual workers. Historical data may be compared with historical worker schedules to determine whether workers are performing all tasks on their schedules. Efficiencies of different workers may be compared.
  • There are thus provided methods of measuring wounds that are simple, inexpensive, repeatable and may be performed remotely. The methods utilize human image processing capabilities to minimise the processing requirements. The methods do not require the placement of articles near the wound and allow historical comparison of a wound. The apparatuses are portable with relatively low processing requirements and enable records to be sent wirelessly for evaluation and storage.
  • While the present invention has been illustrated by the description of the embodiments thereof, and while the embodiments have been described in detail, it is not the intention of the Applicant to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details, representative apparatus and method, and illustrative examples shown and described. Accordingly, departures may be made from such details without departure from the spirit or scope of the Applicant's general inventive concept.

Claims (31)

1-30. (canceled)
31. A computer-implemented method for evaluating an anatomical feature at the surface of a human patient's skin, the method comprising:
guiding a user to position a camera in at least one of x, y, or z directions relative to the anatomical feature by projecting visible light onto the surface at or near the anatomical feature from a light generating device fixedly coupled to the camera, wherein the projected visible light forms a first projection and a second projection on the surface and the relative positions of the first projection and the second projection change when the user moves the light generating device towards or away from the surface;
receiving, at a computing device operatively coupled to the camera, image data related to an image generated by the camera, wherein the image data characterizes the anatomical feature;
determining an area of the anatomical feature; and
displaying the area of the anatomical feature on a display.
32. The method of claim 31, further comprising determining an outline of the anatomical feature, wherein determining the area is based on the determined outline and the image data.
33. The method of claim 32, further comprising displaying the outline of the anatomical feature on the display.
34. The method of claim 31, further comprising determining at least one of a depth of the anatomical feature and a volume of the anatomical feature based on the image data.
35. The method of claim 31 wherein the camera is a three-dimensional (“3D”) camera or a time-of-flight camera.
36. The method of claim 31, further comprising determining an outline of the anatomical feature, wherein the area is determined based at least in part on the outline.
37. The method of claim 36, further comprising determining at least one of a depth of the anatomical feature and a volume of the anatomical feature based on the image data and the outline.
38. The method of claim 37, further comprising storing at least one of the image data, the area of the anatomical feature, the at least one of the depth and the volume of the anatomical feature, and the outline of the anatomical feature at a central data repository remote from the computing device.
39. The method of claim 31 wherein the computing device and the camera are physically coupled together.
40. The method of claim 31 wherein the computing device is a central server that is remote from the camera.
41. The method of claim 32 wherein determining the outline of the anatomical feature includes automatically processing the image data to generate the outline.
42. The method of claim 32 wherein determining the outline of the anatomical feature includes receiving, at the computing device, input from the user regarding at least a portion of the outline of the anatomical feature.
43. The method of claim 31 wherein guiding the user includes displaying a live image of the anatomical feature showing the first and second projections on the surface while the user positions the camera relative to the anatomical feature.
44. The method of claim 31 wherein at least one of the first projection and the second projection is a laser projection.
45. The method of claim 31 wherein the first projection and the second projection are laser fan beams.
46. The method of claim 31 wherein:
the first projection intersects the anatomical feature at a first plurality of points; and
the method further comprises:
determining the x, y, z coordinates of the first plurality of points, and
determining a depth of the anatomical feature based on the x, y, z coordinates of the first plurality of points.
47. A system for determining the area of an anatomical surface feature, the system comprising:
an imaging device configured to generate image data characterizing the anatomical surface feature;
an input device;
a display; and
one or more controllers having memory and processing circuitry, wherein the one or more controllers are configured to be in communication with the imaging device and the display, and wherein the memory includes instructions that, when processed by the processing circuitry, cause the controllers to:
display an image of the anatomical surface feature on the display;
create an outline of the anatomical surface feature based at least in part on a user-traced outline of the anatomical surface feature received from the input device.
48. The system of claim 47 wherein the memory includes instructions that, when processed by the processing circuitry, cause the one or more controllers to process the image of the anatomical surface feature to create the outline.
49. The system of claim 47 wherein the display and the imaging device are components of a portable capture device.
50. The system of claim 47 wherein the imaging device is a component of a portable capture device, and wherein the display is a component separate from the portable capture device.
51. The system of claim 47 wherein at least one of the one or more controllers is remote from the imaging device.
52. The system of claim 47 wherein the display is a touch screen.
53. The system of claim 47 wherein the input device is a touch screen.
54. The system of claim 47 wherein the imaging device is a digital camera.
55. The system of claim 47, further comprising a structured light arrangement having a known positional relationship relative to the imaging device.
56. The system of claim 55 wherein the structured light arrangement is configured to project visible light onto the anatomical surface feature.
57. The system of claim 55 wherein the structured light arrangement includes a laser.
58. The system of claim 55 wherein the imaging device and the structured light arrangement are components of a portable capture device, and wherein the display is a component separate from the portable capture device.
59. The system of claim 55 wherein the imaging device, the structured light arrangement, and the display are components of a portable capture device.
60. The system of claim 47 wherein the one or more controllers include a central server that is remote from and communicates at least partially wirelessly with the imaging device.
US15/370,284 2005-10-14 2016-12-06 Method of monitoring a surface feature and apparatus therefor Abandoned US20170079577A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/370,284 US20170079577A1 (en) 2005-10-14 2016-12-06 Method of monitoring a surface feature and apparatus therefor

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
NZ543003 2005-10-14
NZ54300305 2005-10-14
PCT/NZ2006/000262 WO2007043899A1 (en) 2005-10-14 2006-10-13 A method of monitoring a surface feature and apparatus therefor
US8349109A 2009-05-11 2009-05-11
US14/272,719 US9377295B2 (en) 2005-10-14 2014-05-08 Method of monitoring a surface feature and apparatus therefor
US15/164,793 US20160262659A1 (en) 2005-10-14 2016-05-25 Method of monitoring a surface feature and apparatus therefor
US15/338,216 US9955910B2 (en) 2005-10-14 2016-10-28 Method of monitoring a surface feature and apparatus therefor
US15/370,284 US20170079577A1 (en) 2005-10-14 2016-12-06 Method of monitoring a surface feature and apparatus therefor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/338,216 Continuation US9955910B2 (en) 2005-10-14 2016-10-28 Method of monitoring a surface feature and apparatus therefor

Publications (1)

Publication Number Publication Date
US20170079577A1 true US20170079577A1 (en) 2017-03-23

Family

ID=37943039

Family Applications (7)

Application Number Title Priority Date Filing Date
US12/083,491 Active 2030-11-10 US8755053B2 (en) 2005-10-14 2006-10-13 Method of monitoring a surface feature and apparatus therefor
US14/272,719 Active US9377295B2 (en) 2005-10-14 2014-05-08 Method of monitoring a surface feature and apparatus therefor
US15/164,793 Abandoned US20160262659A1 (en) 2005-10-14 2016-05-25 Method of monitoring a surface feature and apparatus therefor
US15/338,216 Active US9955910B2 (en) 2005-10-14 2016-10-28 Method of monitoring a surface feature and apparatus therefor
US15/370,284 Abandoned US20170079577A1 (en) 2005-10-14 2016-12-06 Method of monitoring a surface feature and apparatus therefor
US15/938,921 Active 2027-11-22 US10827970B2 (en) 2005-10-14 2018-03-28 Method of monitoring a surface feature and apparatus therefor
US17/093,488 Abandoned US20210219907A1 (en) 2005-10-14 2020-11-09 Method of monitoring a surface feature and apparatus therefor

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US12/083,491 Active 2030-11-10 US8755053B2 (en) 2005-10-14 2006-10-13 Method of monitoring a surface feature and apparatus therefor
US14/272,719 Active US9377295B2 (en) 2005-10-14 2014-05-08 Method of monitoring a surface feature and apparatus therefor
US15/164,793 Abandoned US20160262659A1 (en) 2005-10-14 2016-05-25 Method of monitoring a surface feature and apparatus therefor
US15/338,216 Active US9955910B2 (en) 2005-10-14 2016-10-28 Method of monitoring a surface feature and apparatus therefor

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/938,921 Active 2027-11-22 US10827970B2 (en) 2005-10-14 2018-03-28 Method of monitoring a surface feature and apparatus therefor
US17/093,488 Abandoned US20210219907A1 (en) 2005-10-14 2020-11-09 Method of monitoring a surface feature and apparatus therefor

Country Status (8)

Country Link
US (7) US8755053B2 (en)
EP (1) EP1948018A1 (en)
JP (1) JP2009511163A (en)
KR (1) KR20080064155A (en)
CN (1) CN101282687B (en)
AU (1) AU2006300008A1 (en)
CA (1) CA2625775A1 (en)
WO (1) WO2007043899A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10777317B2 (en) 2016-05-02 2020-09-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US10827970B2 (en) 2005-10-14 2020-11-10 Aranz Healthcare Limited Method of monitoring a surface feature and apparatus therefor
US10874302B2 (en) 2011-11-28 2020-12-29 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
US11903723B2 (en) 2017-04-04 2024-02-20 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems

Families Citing this family (77)

Publication number Priority date Publication date Assignee Title
US20070276309A1 (en) * 2006-05-12 2007-11-29 Kci Licensing, Inc. Systems and methods for wound area management
US8000777B2 (en) * 2006-09-19 2011-08-16 Kci Licensing, Inc. System and method for tracking healing progress of tissue
US20080088704A1 (en) * 2006-10-13 2008-04-17 Martin Edmund Wendelken Method of making digital planimetry measurements on digital photographs
US7705291B2 (en) 2007-11-02 2010-04-27 Woundmatrix, Inc. Apparatus and method for wound diagnosis
US20090177051A1 (en) * 2008-01-09 2009-07-09 Heal-Ex, Llc Systems and methods for providing sub-dressing wound analysis and therapy
US8330826B2 (en) * 2009-09-25 2012-12-11 Eastman Kodak Company Method for measuring photographer's aesthetic quality progress
US8505209B2 (en) 2009-10-27 2013-08-13 N.E. Solutionz, Llc Skin and wound assessment tool
US8276287B2 (en) 2009-10-27 2012-10-02 N.E. Solutionz, Llc Skin and wound assessment tool
US9161716B2 (en) 2009-10-27 2015-10-20 N.E. Solutionz, Llc Diagnostic imaging system for skin and affliction assessment
US10201296B2 (en) 2010-11-11 2019-02-12 Ascensia Diabetes Care Holdings Ag Apparatus, systems, and methods adapted to transmit analyte data having common electronic architecture
US9160898B2 (en) 2011-01-25 2015-10-13 Autofuss System and method for improved video motion control
US9875574B2 (en) * 2013-12-17 2018-01-23 General Electric Company Method and device for automatically identifying the deepest point on the surface of an anomaly
AT511265B1 (en) * 2011-03-24 2013-12-15 Red Soft It Service Gmbh DEVICE FOR DETERMINING A CHARACTERIZATION VALUE AND METHOD FOR EVALUATING THREE-DIMENSIONAL IMAGES
EP2713872A4 (en) * 2011-05-26 2014-10-22 3Derm Systems Llc Stereoscopic plug-and-play dermatoscope and web interface
JP5864950B2 (en) * 2011-08-15 2016-02-17 キヤノン株式会社 Three-dimensional measuring apparatus, three-dimensional measuring method and program
CN104054023B (en) * 2011-11-16 2017-11-17 欧特法斯公司 The system and method that the 3D of object Mapping for being controlled with robot is projected
US9832352B2 (en) 2011-11-16 2017-11-28 Autofuss System and method for 3D projection mapping with robotically controlled objects
US9351641B2 (en) * 2012-10-04 2016-05-31 Cerner Innovation, Inc. Mobile processing device system for patient monitoring data acquisition
US20150250416A1 (en) * 2012-10-05 2015-09-10 Vasamed, Inc. Apparatus and method to assess wound healing
US9277206B1 (en) * 2013-01-28 2016-03-01 Cognex Corporation Dual-view laser-based three-dimensional capture system and method for employing the same
US20140213936A1 (en) * 2013-01-29 2014-07-31 Monolysis Medical, LLC Devices, systems, and methods for tissue measurement
US20140218504A1 (en) * 2013-02-01 2014-08-07 Centre De Recherche Industrielle Du Quebec Apparatus and method for scanning a surface of an article
ES2666499T3 (en) * 2013-07-03 2018-05-04 Kapsch Trafficcom Ab Method for identification of contamination in a lens of a stereoscopic camera
CN103712572A (en) * 2013-12-18 2014-04-09 同济大学 Structural light source-and-camera-combined object contour three-dimensional coordinate measuring device
CN103697833B (en) * 2013-12-30 2016-03-09 北京农业智能装备技术研究中心 Agricultural product shape detecting method and device
EP3797680A1 (en) * 2014-01-10 2021-03-31 Ascensia Diabetes Care Holdings AG Setup synchronization apparatus and methods for end user medical devices
US9844904B2 (en) * 2014-02-18 2017-12-19 The Boeing Company Formation of thermoplastic parts
EP3129777B1 (en) 2014-04-11 2023-08-16 Ascensia Diabetes Care Holdings AG Wireless transmitter adapters for battery-operated biosensor meters and methods of providing same
EP3152737A4 (en) * 2014-06-03 2018-01-24 Jones, Freddy In-time registration of temporally separated image acquisition
CN104062840A (en) * 2014-06-19 2014-09-24 广东中烟工业有限责任公司 Fixing device of 3D imaging camera and structured light source
JP6824874B2 (en) 2014-07-07 2021-02-03 アセンシア・ディアベティス・ケア・ホールディングス・アーゲー Methods and equipment for improved low energy data communication
JP6446251B2 (en) * 2014-10-13 2018-12-26 ゼネラル・エレクトリック・カンパニイ Method and device for automatically identifying points of interest on anomalous surfaces
CN107106020A (en) * 2014-10-29 2017-08-29 组织分析股份有限公司 For analyzing and transmitting the data relevant with mammal skin damaged disease, image and the System and method for of video
CN104490361A (en) * 2014-12-05 2015-04-08 深圳市共创百业科技开发有限公司 Remote dermatosis screening system and method based on network hospitals
US10248761B2 (en) * 2015-01-07 2019-04-02 Derm Mapper, LLC Computerized system and method for recording and tracking dermatological lesions
JP6451350B2 (en) * 2015-01-28 2019-01-16 カシオ計算機株式会社 Medical image processing apparatus, medical image processing method and program
CN106152947B (en) * 2015-03-31 2019-11-29 北京京东尚科信息技术有限公司 Measure equipment, the method and apparatus of dimension of object
CA2983551A1 (en) 2015-04-29 2016-11-03 Ascensia Diabetes Care Holdings Ag Location-based wireless diabetes management systems, methods and apparatus
CN106289065B (en) * 2015-05-15 2020-10-27 高准精密工业股份有限公司 Detection method and optical device applying same
CN107920739B (en) * 2015-06-10 2021-07-09 泰拓卡尔有限公司 Device and method for examining skin lesions
US10070049B2 (en) 2015-10-07 2018-09-04 Konica Minolta Laboratory U.S.A., Inc Method and system for capturing an image for wound assessment
US10445606B2 (en) * 2015-10-08 2019-10-15 Microsoft Technology Licensing, Llc Iris recognition
CN105725979B (en) * 2016-05-16 2019-02-15 深圳大学 A kind of human body moire imaging device
CA3022540C (en) 2016-05-18 2020-02-18 Allen M. Waxman Hydrocarbon leak imaging and quantification sensor
GB2550582B (en) * 2016-05-23 2020-07-15 Bluedrop Medical Ltd A skin inspection device identifying abnormalities
WO2018013321A1 (en) 2016-06-28 2018-01-18 Kci Licensing, Inc. Semi-automated mobile system for wound image segmentation
CN106017324A (en) * 2016-07-15 2016-10-12 苏州鑫日达机械设备有限公司 Device for detecting precision of elevator guide rail-used hot-rolled section steel
CN106289064B (en) * 2016-10-18 2019-10-22 上海船舶工艺研究所 A kind of portable boat rib of slab bit line detection device
KR20180065135A (en) * 2016-12-07 2018-06-18 삼성전자주식회사 Methods and devices of reducing structure noises through self-structure analysis
CN106871814A (en) * 2017-01-16 2017-06-20 北京聚利科技股份有限公司 Contour outline measuring set and method
JP7081941B2 (en) * 2017-03-03 2022-06-07 Jfeテクノリサーチ株式会社 3D shape measuring device and 3D shape measuring method
EP3596308B1 (en) 2017-03-16 2022-11-16 Multisensor Scientific, Inc. Scanning ir sensor for gas safety and emissions monitoring
CN106974623A (en) * 2017-04-27 2017-07-25 上海迈鹊医用机器人技术有限公司 Blood vessel identification lancing system, blood vessel recognition methods
CN107388991B (en) * 2017-07-03 2019-12-03 中国计量大学 A kind of more fillet axial workpiece radius of corner measurement methods in end face
US11160491B2 (en) * 2017-09-12 2021-11-02 Hill-Rom Services, Inc. Devices, systems, and methods for monitoring wounds
CA3076483A1 (en) * 2017-11-16 2019-05-23 MultiSensor Scientific, Inc. Systems and methods for multispectral imaging and gas detection using a scanning illuminator and optical sensor
CN108917640A (en) * 2018-06-06 2018-11-30 佛山科学技术学院 A kind of laser blind hole depth detection method and its system
EP3824621A4 (en) 2018-07-19 2022-04-27 Activ Surgical, Inc. Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots
CN111156900B (en) * 2018-11-08 2021-07-13 中国科学院沈阳自动化研究所 Line-of-depth linear light measurement method for bullet primer assembly
JP6669236B2 (en) * 2018-12-12 2020-03-18 カシオ計算機株式会社 Medical image processing apparatus, medical image processing method, and program
US10976245B2 (en) 2019-01-25 2021-04-13 MultiSensor Scientific, Inc. Systems and methods for leak monitoring via measurement of optical absorption using tailored reflector installments
EP3696593A1 (en) 2019-02-12 2020-08-19 Leica Instruments (Singapore) Pte. Ltd. A controller for a microscope, a corresponding method and a microscope system
CA3132350A1 (en) 2019-04-08 2020-10-15 Stephen Tully Systems and methods for medical imaging
US11308618B2 (en) 2019-04-14 2022-04-19 Holovisions LLC Healthy-Selfie(TM): a portable phone-moving device for telemedicine imaging using a mobile phone
US11176669B2 (en) 2019-04-14 2021-11-16 Holovisions LLC System for remote medical imaging using two conventional smart mobile devices and/or augmented reality (AR)
WO2020234653A1 (en) 2019-05-20 2020-11-26 Aranz Healthcare Limited Automated or partially automated anatomical surface assessment methods, devices and systems
CN110470239B (en) * 2019-08-07 2021-05-18 上海交通大学 Laser profile sensor calibration system and method based on intersection point
TWI721533B (en) * 2019-08-19 2021-03-11 國立中央大學 Tremor identification method and system thereof
US20210093227A1 (en) * 2019-09-26 2021-04-01 Canon Kabushiki Kaisha Image processing system and control method thereof
JP2021049248A (en) * 2019-09-26 2021-04-01 キヤノン株式会社 Image processing system and method for controlling the same
US20210228148A1 (en) * 2020-01-28 2021-07-29 Zebra Technologies Corporation System and Method for Lesion Monitoring
KR102584067B1 (en) * 2020-09-09 2023-10-06 한국전자통신연구원 Apparatus and method for non-contact bio-signal measurement
US20240074735A1 (en) * 2021-01-14 2024-03-07 The Regents Of The University Of California Point of care ultrasound as a tool to assess wound size and tissue regenration after skin grafting
CN113063362B (en) * 2021-04-07 2023-05-09 湖南凌翔磁浮科技有限责任公司 Non-contact type magnetic levitation train bogie interval detection method
US11758263B2 (en) 2021-08-24 2023-09-12 Moleculight, Inc. Systems, devices, and methods for imaging and measurement using a stereoscopic camera system
CN114190890A (en) * 2021-11-26 2022-03-18 长沙海润生物技术有限公司 Wound surface imaging device and imaging method thereof
CN117442190B (en) * 2023-12-21 2024-04-02 山东第一医科大学附属省立医院(山东省立医院) Automatic wound surface measurement method and system based on target detection

Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4535782A (en) * 1984-03-07 1985-08-20 American Cyanamid Company Method for determining wound volume
US5270168A (en) * 1990-02-21 1993-12-14 Board Of Regents, The University Of Texas System Method for diagnosing non-healing ulcers
US5588428A (en) * 1993-04-28 1996-12-31 The University Of Akron Method and apparatus for non-invasive volume and texture analysis
US5957837A (en) * 1996-10-17 1999-09-28 Faro Technologies, Inc. Method and apparatus for wound management
US5969822A (en) * 1994-09-28 1999-10-19 Applied Research Associates Nz Ltd. Arbitrary-geometry laser surface scanner
US5967979A (en) * 1995-11-14 1999-10-19 Verg, Inc. Method and apparatus for photogrammetric assessment of biological tissue
US6359612B1 (en) * 1998-09-30 2002-03-19 Siemens Aktiengesellschaft Imaging system for displaying image information that has been acquired by means of a medical diagnostic imaging device
US6381488B1 (en) * 1999-06-15 2002-04-30 Sandia Corporation Method and apparatus to measure the depth of skin burns
US6381026B1 (en) * 1999-03-15 2002-04-30 Lifecell Corp. Method of measuring the contour of a biological surface
US20020054297A1 (en) * 2000-11-06 2002-05-09 Chun-Hsing Lee Three dimensional scanning system
US20020149585A1 (en) * 1996-04-24 2002-10-17 Kacyra Ben K. Integrated system for quickly and accurately imaging and modeling three-dimensional objects
US20060055943A1 (en) * 2002-11-14 2006-03-16 Technodream21 Inc. Three-dimensional shape measuring method and its device
US20060073132A1 (en) * 2004-10-06 2006-04-06 Congote Luis F Agents for wound healing
US7068836B1 (en) * 2000-04-28 2006-06-27 Orametrix, Inc. System and method for mapping a surface
US7181363B2 (en) * 2003-04-16 2007-02-20 Massachusetts Institute Of Technology Three dimensional tangible interface for interacting with spatial-temporal data using a laser scanner
US7248724B2 (en) * 2002-11-19 2007-07-24 Polartechnics Limited Method for monitoring wounds
US20070276309A1 (en) * 2006-05-12 2007-11-29 Kci Licensing, Inc. Systems and methods for wound area management
US20070276195A1 (en) * 2006-05-12 2007-11-29 Kci Licensing, Inc. Systems and methods for wound area management
US20080045807A1 (en) * 2006-06-09 2008-02-21 Psota Eric T System and methods for evaluating and monitoring wounds
US20080098322A1 (en) * 2006-06-01 2008-04-24 Simquest Llc Method and apparatus for collecting and analyzing surface wound data
US7495208B2 (en) * 2006-06-01 2009-02-24 Czarnek And Orkin Laboratories, Inc. Portable optical wound scanner
US20090116712A1 (en) * 2007-11-02 2009-05-07 Osama Al-Moosawi Apparatus and method for wound diagnosis
US20090213213A1 (en) * 2005-10-14 2009-08-27 Applied Research Associates New Zealand Limited Method of Monitoring a Surface Feature and Apparatus Therefor
US20090234313A1 (en) * 2005-05-20 2009-09-17 Peter Mullejeans Device for Recording and Transferring a Contour
US20100004564A1 (en) * 2006-07-25 2010-01-07 Johan Jendle Wound measuring device
US20120035469A1 (en) * 2006-09-27 2012-02-09 Whelan Thomas J Systems and methods for the measurement of surfaces
US20120059266A1 (en) * 2009-03-09 2012-03-08 Mologic Ltd. Imaging method
US20130051651A1 (en) * 2010-05-07 2013-02-28 Purdue Research Foundation Quantitative image analysis for wound healing assay
US20130137991A1 (en) * 2011-11-28 2013-05-30 William Richard Fright Handheld skin measuring or monitoring device
US20130335545A1 (en) * 2010-12-19 2013-12-19 Matthew Ross Darling System for integrated wound analysis
US20140088402A1 (en) * 2012-09-25 2014-03-27 Innovative Therapies, Inc. Wound measurement on smart phones
US8814841B2 (en) * 2007-12-06 2014-08-26 Smith & Nephew Plc Apparatus and method for wound volume measurement
US20150150457A1 (en) * 2013-12-03 2015-06-04 Children's National Medical Center Method and system for wound assessment and management
US9330453B2 (en) * 2011-03-24 2016-05-03 Red. Soft It-Service Gmbh Apparatus and method for determining a skin inflammation value
US9451928B2 (en) * 2006-09-13 2016-09-27 Elekta Ltd. Incorporating internal anatomy in clinical radiotherapy setups
US20160284084A1 (en) * 2015-03-23 2016-09-29 Ohio State Innovation Foundation System and method for segmentation and automated measurement of chronic wound images

Family Cites Families (325)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NL260831A (en) 1960-02-04
US3335716A (en) 1965-01-18 1967-08-15 Gen Electric Diagnostic thermography method and means
US4090501A (en) * 1976-06-24 1978-05-23 Horace Chaitin Skin lesion analyzer
DE2642841C3 (en) 1976-09-23 1981-01-08 Siemens Ag, 1000 Berlin Und 8000 Muenchen Method for the quantitative topography evaluation of SEM images
US4170987A (en) 1977-11-28 1979-10-16 California Institute Of Technology Medical diagnosis system and method with multispectral imaging
US4236082A (en) 1979-01-29 1980-11-25 Palmguard, Inc. Method and apparatus for recording image details of the palm of a hand
US4515165A (en) 1980-02-04 1985-05-07 Energy Conversion Devices, Inc. Apparatus and method for detecting tumors
EP0063431B1 (en) 1981-04-10 1987-10-28 Masaaki Konomi Spectroscopic analyzer system
JPS5940830A (en) 1982-08-31 1984-03-06 浜松ホトニクス株式会社 Apparatus for diagnosis of cancer using laser beam pulse
NL8300965A (en) 1983-03-17 1984-10-16 Nicolaas Roelof Snijder SYSTEM FOR EXAMINATION OF SKELETAL PARTS OF THE BODY OF A LIVING BEING, IN PARTICULAR THE SPINE OF THE HUMAN BODY.
IT1163442B (en) 1983-06-03 1987-04-08 Agip Spa IMPROVED METHOD OF STEREO-PHOTOGRAMMETRIC DETECTION OF LARGE OBJECTS AT SEA AND ON LAND
JPS60256443A (en) 1984-05-31 1985-12-18 オムロン株式会社 Image measuring apparatus
JPS6164232A (en) 1984-09-07 1986-04-02 株式会社資生堂 Apparatus for detecting and classifying characteristics of skin surface shape
CN85100424B (en) 1985-04-01 1986-10-29 上海医疗器械研究所 Inherent fluorescence diagnostic instrument for malignant tumor
US4724480A (en) 1985-05-02 1988-02-09 Robotic Vision Systems, Inc. Method for optical alignment of one object with respect to another
US4736739A (en) 1985-05-28 1988-04-12 Dowd & Dowd, P.C. Photographic specimen mat
US4930516B1 (en) 1985-11-13 1998-08-04 Laser Diagnostic Instr Inc Method for detecting cancerous tissue using visible native luminescence
DE3545875A1 (en) 1985-12-23 1987-07-02 Anger Wilhelm DEVICE FOR PHOTOGRAMMETRICALLY DETECTING THE HUMAN HEAD
JPS62247232A (en) 1986-04-21 1987-10-28 Agency Of Ind Science & Technol Fluorescence measuring apparatus
JPS63122421A (en) 1986-11-12 1988-05-26 株式会社東芝 Endoscope apparatus
US4851984A (en) 1987-08-03 1989-07-25 University Of Chicago Method and system for localization of inter-rib spaces and automated lung texture analysis in digital chest radiographs
US4829373A (en) 1987-08-03 1989-05-09 Vexcel Corporation Stereo mensuration apparatus
US4839807A (en) 1987-08-03 1989-06-13 University Of Chicago Method and system for automated classification of distinction between normal lungs and abnormal lungs with interstitial disease in digital chest radiographs
US4894547A (en) 1987-09-28 1990-01-16 Yale University Optical method and apparatus for detecting and measuring aging, photoaging, dermal disease and pigmentation in skin
US5051904A (en) 1988-03-24 1991-09-24 Olganix Corporation Computerized dynamic tomography system
JPH06105190B2 (en) 1988-03-31 1994-12-21 工業技術院長 Signal analyzer
US5036853A (en) 1988-08-26 1991-08-06 Polartechnics Ltd. Physiological probe
FR2637189A1 (en) 1988-10-04 1990-04-06 Cgr Mev SYSTEM AND METHOD FOR MEASURING AND / OR VERIFYING THE POSITION OF A PATIENT IN RADIOTHERAPY EQUIPMENT
JP3217343B2 (en) 1989-03-23 2001-10-09 オリンパス光学工業株式会社 Image processing device
USD315901S (en) 1989-01-30 1991-04-02 Metrologic Instruments, Inc. Portable laser scanner
US4979815A (en) * 1989-02-17 1990-12-25 Tsikos Constantine J Laser range imaging system based on projective geometry
US5241468A (en) 1989-04-13 1993-08-31 Vanguard Imaging Ltd. Apparatus and method for spectral enhancement of body-surface images to improve sensitivity of detecting subtle color features
US5016173A (en) * 1989-04-13 1991-05-14 Vanguard Imaging Ltd. Apparatus and method for monitoring visually accessible surfaces of the body
US5421337A (en) 1989-04-14 1995-06-06 Massachusetts Institute Of Technology Spectral diagnosis of diseased tissue
US5369496A (en) 1989-11-13 1994-11-29 Research Foundation Of City College Of New York Noninvasive method and apparatus for characterizing biological materials
JP2852774B2 (en) 1989-11-22 1999-02-03 株式会社エス・エル・ティ・ジャパン Diagnostic device for living tissue and treatment device provided with the diagnostic device
US5157461A (en) 1990-06-14 1992-10-20 Smiths Industries Aerospace & Defense Systems Inc. Interface configuration for rate sensor apparatus
US5083570A (en) 1990-06-18 1992-01-28 Mosby Richard A Volumetric localization/biopsy/surgical device
FR2663529A1 (en) 1990-06-26 1991-12-27 Technomed Int Sa METHOD AND DEVICE FOR CONTROLLING THE POSITION OF THE BODY OF A PATIENT TO BE SUBJECTED TO MEDICAL TREATMENT, AND MEDICAL TREATMENT APPARATUS APPLYING SAME, IN PARTICULAR A LITHOTRIPTER
US5699798A (en) 1990-08-10 1997-12-23 University Of Washington Method for optically imaging solid tumor tissue
DE4026821A1 (en) 1990-08-24 1992-03-05 Philips Patentverwaltung METHOD FOR DETECTING ANOMALIES OF THE SKIN, ESPECIALLY MELANOMAS, AND DEVICE FOR IMPLEMENTING THE METHOD
US5784162A (en) 1993-08-18 1998-07-21 Applied Spectral Imaging Ltd. Spectral bio-imaging methods for biological research, medical diagnostics and therapy
DE69329554T2 (en) 1992-02-18 2001-05-31 Neopath Inc METHOD FOR IDENTIFYING OBJECTS USING DATA PROCESSING TECHNIQUES
US5603318A (en) 1992-04-21 1997-02-18 University Of Utah Research Foundation Apparatus and method for photogrammetric surgical localization
US5311131A (en) 1992-05-15 1994-05-10 Board Of Regents Of The University Of Washington Magnetic resonance imaging using pattern recognition
US5413477A (en) 1992-10-16 1995-05-09 Gas Research Institute Staged air, low NOX burner with internal recuperative flue gas recirculation
US7064749B1 (en) * 1992-11-09 2006-06-20 Adc Technology Inc. Portable communicator
US7970620B2 (en) * 1992-11-17 2011-06-28 Health Hero Network, Inc. Multi-user remote health monitoring system with biometrics support
US5640962A (en) * 1993-01-21 1997-06-24 Technomed Gesellschaft fur Med. und Med. Techn. Systeme mbH Process and device for determining the topography of a reflecting surface
US5408996A (en) 1993-03-25 1995-04-25 Salb; Jesse System and method for localization of malignant tissue
US5396331A (en) 1993-08-10 1995-03-07 Sanyo Machine Works, Ltd. Method for executing three-dimensional measurement utilizing correctively computing the absolute positions of CCD cameras when image data vary
ZA948393B (en) 1993-11-01 1995-06-26 Polartechnics Ltd Method and apparatus for tissue type recognition
US5689575A (en) 1993-11-22 1997-11-18 Hitachi, Ltd. Method and apparatus for processing images of facial expressions
EP0731956A4 (en) 1993-11-30 1997-04-23 Arch Dev Corp Automated method and system for the alignment and correlation of images from two different modalities
US5749830A (en) 1993-12-03 1998-05-12 Olympus Optical Co., Ltd. Fluorescent endoscope apparatus
US5463463A (en) 1994-01-25 1995-10-31 Mts System Corporation Optical motion sensor
USD393068S (en) 1994-01-31 1998-03-31 Kabushiki Kaisha Toshiba Radio frequency coil for magnetic resonance imaging apparatus
US5590660A (en) 1994-03-28 1997-01-07 Xillix Technologies Corp. Apparatus and method for imaging diseased tissue using integrated autofluorescence
JPH07299053A (en) 1994-04-29 1995-11-14 Arch Dev Corp Computer diagnosis support method
US5561526A (en) * 1994-05-26 1996-10-01 Lockheed Missiles & Space Company, Inc. Three-dimensional measurement device and system
US5531520A (en) * 1994-09-01 1996-07-02 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets including anatomical body data
US5701902A (en) 1994-09-14 1997-12-30 Cedars-Sinai Medical Center Spectroscopic burn injury evaluation apparatus and method
US5519208A (en) 1994-09-29 1996-05-21 Esparza; Joel Infrared aided method and apparatus for venous examination
EP0712092A1 (en) 1994-11-10 1996-05-15 Agfa-Gevaert N.V. Image contrast enhancing method
US5627907A (en) 1994-12-01 1997-05-06 University Of Pittsburgh Computerized detection of masses and microcalcifications in digital mammograms
US6032070A (en) 1995-06-07 2000-02-29 University Of Arkansas Method and apparatus for detecting electro-magnetic reflection from biological tissue
GB9515311D0 (en) * 1995-07-26 1995-09-20 3D Scanners Ltd Stripe scanners and methods of scanning
US5644141A (en) * 1995-10-12 1997-07-01 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus and method for high-speed characterization of surfaces
US5872859A (en) 1995-11-02 1999-02-16 University Of Pittsburgh Training/optimization of computer aided detection schemes based on measures of overall image quality
US6873716B1 (en) 1995-11-14 2005-03-29 ARETé ASSOCIATES Confocal-reflection streak lidar apparatus with strip-shaped photocathode, for applications at a wide range of scales
US5648915A (en) 1995-11-20 1997-07-15 Triangle Research & Development Corporation Soft tissue damage assessment system
US6690474B1 (en) * 1996-02-12 2004-02-10 Massachusetts Institute Of Technology Apparatus and methods for surface contour measurement
US5799100A (en) 1996-06-03 1998-08-25 University Of South Florida Computer-assisted method and apparatus for analysis of x-ray images using wavelet transforms
US5673300A (en) 1996-06-11 1997-09-30 Wisconsin Alumni Research Foundation Method of registering a radiation treatment plan to a patient
US6101408A (en) * 1996-08-22 2000-08-08 Western Research Company, Inc. Probe and method to obtain accurate area measurements from cervical lesions
US5791346A (en) * 1996-08-22 1998-08-11 Western Research Company, Inc. Colposcope device and method for measuring areas of cervical lesions
US5910972A (en) 1996-09-25 1999-06-08 Fuji Photo Film Co., Ltd. Bone image processing method and apparatus
US6091995A (en) * 1996-11-08 2000-07-18 Surx, Inc. Devices, methods, and systems for shrinking tissues
GB9624003D0 (en) 1996-11-19 1997-01-08 Univ Birmingham Method and apparatus for measurement of skin histology
US7054674B2 (en) 1996-11-19 2006-05-30 Astron Clinica Limited Method of and apparatus for investigating tissue histology
WO1998037811A1 (en) 1997-02-28 1998-09-03 Electro-Optical Sciences, Inc. Systems and methods for the multispectral imaging and characterization of skin tissue
US6081612A (en) 1997-02-28 2000-06-27 Electro Optical Sciences Inc. Systems and methods for the multispectral imaging and characterization of skin tissue
US6307957B1 (en) 1997-02-28 2001-10-23 Electro-Optical Sciences Inc Multispectral imaging and characterization of biological tissue
US5810014A (en) 1997-03-25 1998-09-22 Davis; Dennis W. Method and system for detection of physiological conditions
US6587701B1 (en) 1997-04-03 2003-07-01 Miroslaw F. Stranc Method of assessing tissue viability using near-infrared spectroscopy
US5946645A (en) * 1997-04-09 1999-08-31 National Research Council Of Canada Three dimensional imaging method and device
US6873340B2 (en) * 1997-05-15 2005-03-29 Visimatix, Inc. Method and apparatus for an automated reference indicator system for photographic and video images
US6045367A (en) 1997-09-24 2000-04-04 Teledyne Industries, Inc. Multi-pin connector
JP3015354B2 (en) 1997-12-05 2000-03-06 日本電信電話株式会社 Video information storage / playback system, storage device, playback device, and control method therefor
US6265151B1 (en) 1998-03-27 2001-07-24 Seroptix, Inc. Apparatus and method for infectious disease detection
US6421463B1 (en) 1998-04-01 2002-07-16 Massachusetts Institute Of Technology Trainable system to search for objects in images
DE19821611A1 (en) * 1998-05-14 1999-11-18 Syrinx Med Tech Gmbh Recording method for spatial structure of three-dimensional surface, e.g. for person recognition
US6081739A (en) 1998-05-21 2000-06-27 Lemchen; Marc S. Scanning device or methodology to produce an image incorporating correlated superficial, three dimensional surface and x-ray images and measurements of an object
IL124616A0 (en) * 1998-05-24 1998-12-06 Romedix Ltd Apparatus and method for measurement and temporal comparison of skin surface images
WO2000003210A1 (en) 1998-07-10 2000-01-20 Sugen, Inc. Device for estimating volume
US6427022B1 (en) 1998-11-10 2002-07-30 Western Research Company, Inc. Image comparator system and method for detecting changes in skin lesions
WO2000030337A2 (en) 1998-11-19 2000-05-25 Oracis Medical Corporation Three-dimensional handheld digital camera for medical applications
GB9828474D0 (en) 1998-12-24 1999-02-17 British Aerospace Surface topology inspection
US6290646B1 (en) 1999-04-16 2001-09-18 Cardiocom Apparatus and method for monitoring and communicating wellness parameters of ambulatory patients
US6611833B1 (en) 1999-06-23 2003-08-26 Tissueinformatics, Inc. Methods for profiling and classifying tissue using a database that includes indices representative of a tissue population
US6266453B1 (en) 1999-07-26 2001-07-24 Computerized Medical Systems, Inc. Automated image fusion/alignment system and method
US6941323B1 (en) 1999-08-09 2005-09-06 Almen Laboratories, Inc. System and method for image comparison and retrieval by enhancing, defining, and parameterizing objects in images
US6816847B1 (en) 1999-09-23 2004-11-09 Microsoft Corporation Computerized aesthetic judgment of images
US20050182434A1 (en) 2000-08-11 2005-08-18 National Research Council Of Canada Method and apparatus for performing intra-operative angiography
US6490476B1 (en) 1999-10-14 2002-12-03 Cti Pet Systems, Inc. Combined PET and X-ray CT tomograph and method for using same
US6648820B1 (en) 1999-10-27 2003-11-18 Home-Medicine (Usa), Inc. Medical condition sensing system
US6678001B1 (en) 1999-11-01 2004-01-13 Elbex Video Ltd. Ball shaped camera housing with simplified positioning
US7581191B2 (en) 1999-11-15 2009-08-25 Xenogen Corporation Graphical user interface for 3-D in-vivo imaging
US6614452B1 (en) 1999-11-15 2003-09-02 Xenogen Corporation Graphical user interface for in-vivo imaging
IL132944A (en) 1999-11-15 2009-05-04 Arkady Glukhovsky Method for activating an image collecting process
US6567682B1 (en) * 1999-11-16 2003-05-20 Carecord Technologies, Inc. Apparatus and method for lesion feature identification and characterization
US6603552B1 (en) 1999-12-22 2003-08-05 Xillix Technologies Corp. Portable system for detecting skin abnormalities based on characteristic autofluorescence
USD455166S1 (en) 2000-03-14 2002-04-02 Silent Witness Enterprises, Ltd. Infrared illuminator housing
US6968094B1 (en) 2000-03-27 2005-11-22 Eastman Kodak Company Method of estimating and correcting camera rotation with vanishing point location
US6594388B1 (en) 2000-05-25 2003-07-15 Eastman Kodak Company Color image reproduction of scenes with preferential color mapping and scene-dependent tone scaling
IL136884A0 (en) 2000-06-19 2001-06-14 Yissum Res Dev Co A system for cancer detection and typing and for grading of malignancy
FR2810737B1 (en) 2000-06-23 2003-04-18 Oreal APPARATUS AND METHOD FOR EXAMINING A SURFACE
DE10033723C1 (en) 2000-07-12 2002-02-21 Siemens Ag Surgical instrument position and orientation visualization device for surgical operation has data representing instrument position and orientation projected onto surface of patient's body
US6594516B1 (en) * 2000-07-18 2003-07-15 Koninklijke Philips Electronics, N.V. External patient contouring
WO2002015559A2 (en) 2000-08-10 2002-02-21 The Regents Of The University Of California High-resolution digital image processing in the analysis of pathological materials
US6754370B1 (en) * 2000-08-14 2004-06-22 The Board Of Trustees Of The Leland Stanford Junior University Real-time structured light range scanning of moving scenes
US7106885B2 (en) 2000-09-08 2006-09-12 Carecord Technologies, Inc. Method and apparatus for subject physical position and security determination
GB0022447D0 (en) 2000-09-13 2000-11-01 Bae Systems Plc Measurement method
US7595878B2 (en) 2000-10-13 2009-09-29 Chemimage Corporation Spectroscopic methods for component particle analysis
US8218873B2 (en) 2000-11-06 2012-07-10 Nant Holdings Ip, Llc Object information derived from object images
US7680324B2 (en) 2000-11-06 2010-03-16 Evryx Technologies, Inc. Use of image-derived information as search criteria for internet and other search engines
US7565008B2 (en) 2000-11-06 2009-07-21 Evryx Technologies, Inc. Data capture and identification system and process
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
AU2002253784A1 (en) 2000-11-07 2002-08-28 Hypermed, Inc. Hyperspectral imaging calibration device
US6671349B1 (en) 2000-11-13 2003-12-30 Olganix Corporation Tomosynthesis system and registration method
US6715675B1 (en) 2000-11-16 2004-04-06 Eldat Communication Ltd. Electronic shelf label systems and methods
EP1207387A1 (en) 2000-11-20 2002-05-22 Institut Curie Multi-photon imaging installation.
US7103205B2 (en) 2000-11-24 2006-09-05 U-Systems, Inc. Breast cancer screening with ultrasound image overlays
DE10059070C1 (en) 2000-11-28 2002-02-14 Pulsion Medical Sys Ag Device for determining tissue perfusion has source and expansion optics arranged in safety housing so only expanded beam of intensity within safety limits for persons near device emanates
US6392744B1 (en) * 2000-12-11 2002-05-21 Analog Technologies, Corp. Range measurement system
US6816606B2 (en) 2001-02-21 2004-11-09 Interscope Technologies, Inc. Method for maintaining high-quality focus during high-throughput, microscopic digital montage imaging
US7155049B2 (en) 2001-01-11 2006-12-26 Trestle Acquisition Corp. System for creating microscopic digital montage images
US6798571B2 (en) 2001-01-11 2004-09-28 Interscope Technologies, Inc. System for microscopic digital montage imaging using a pulse light illumination system
US6993169B2 (en) 2001-01-11 2006-01-31 Trestle Corporation System and method for finding regions of interest for microscopic digital montage imaging
US6359513B1 (en) 2001-01-31 2002-03-19 U.S. Philips Corporation CMOS power amplifier with reduced harmonics and improved efficiency
US20040201694A1 (en) 2001-02-07 2004-10-14 Vladimir Gartstein Noninvasive methods and apparatus for monitoring at least one hair characteristic
JP2002236332A (en) 2001-02-09 2002-08-23 Olympus Optical Co Ltd Stereoscopic adapter, pattern projection adapter and adapter for light emitting member
DE10108240B4 (en) 2001-02-21 2018-06-28 Leica Microsystems Cms Gmbh Method for imaging and measuring microscopic three-dimensional structures
USD453350S1 (en) 2001-03-05 2002-02-05 Silent Witness Enterprises, Ltd. Enclosure for a video sensor
US20020181752A1 (en) 2001-03-14 2002-12-05 Warren Wallo Method for measuring changes in portions of a human body
US7064311B2 (en) 2001-04-02 2006-06-20 Atlab, Inc. Optical image detector and method for controlling illumination of the same
JP2002312079A (en) 2001-04-12 2002-10-25 Internatl Business Mach Corp <Ibm> Computer system, computer device, and feeding control method in the computer device
US8078262B2 (en) 2001-04-16 2011-12-13 The Johns Hopkins University Method for imaging and spectroscopy of tumors and determination of the efficacy of anti-tumor drug therapies
WO2002093134A1 (en) 2001-05-15 2002-11-21 Seroptix, Inc. Method for determining the presence of infection in an individual
WO2002093450A1 (en) 2001-05-16 2002-11-21 Cellavision Ab Information processing for distinguishing an object
AUPR509801A0 (en) 2001-05-18 2001-06-14 Polartechnics Limited Boundary finding in dermatological examination
US7217266B2 (en) 2001-05-30 2007-05-15 Anderson R Rox Apparatus and method for laser treatment with spectroscopic feedback
US6491632B1 (en) 2001-06-26 2002-12-10 Geoffrey L. Taylor Method and apparatus for photogrammetric orientation of ultrasound images
JP3905736B2 (en) 2001-10-12 2007-04-18 ペンタックス株式会社 Stereo image pickup device and automatic congestion adjustment device
US6961517B2 (en) 2001-11-08 2005-11-01 Johnson & Johnson Consumer Companies, Inc. Method of promoting skin care products
US7738032B2 (en) 2001-11-08 2010-06-15 Johnson & Johnson Consumer Companies, Inc. Apparatus for and method of taking and viewing images of the skin
US6907193B2 (en) 2001-11-08 2005-06-14 Johnson & Johnson Consumer Companies, Inc. Method of taking polarized images of the skin and the use thereof
US20040146290A1 (en) 2001-11-08 2004-07-29 Nikiforos Kollias Method of taking images of the skin using blue light and the use thereof
US6922523B2 (en) 2001-11-08 2005-07-26 Johnson & Johnson Consumer Companies, Inc. Method of promoting skin care products
US6770186B2 (en) 2001-11-13 2004-08-03 Eldat Communication Ltd. Rechargeable hydrogen-fueled motor vehicle
US7074509B2 (en) 2001-11-13 2006-07-11 Eldat Communication Ltd. Hydrogen generators for fuel cells
US7068828B2 (en) 2001-11-29 2006-06-27 Gaiagene Inc. Biochip image analysis system and method thereof
EP1461464A4 (en) 2001-12-03 2009-08-26 Seroptix Inc A method for identifying markers
US20040082940A1 (en) 2002-10-22 2004-04-29 Michael Black Dermatological apparatus and method
US6862542B2 (en) 2002-01-17 2005-03-01 Charlotte-Mecklenburg Hospital Erythema measuring device
US20030164875A1 (en) 2002-03-01 2003-09-04 Myers Kenneth J. System and method for passive three-dimensional data acquisition
US20030164841A1 (en) 2002-03-01 2003-09-04 Edward Greenberg System and method for passive three-dimensional data acquisition
EP1345154A1 (en) 2002-03-11 2003-09-17 Bracco Imaging S.p.A. A method for encoding image pixels and method for processing images aimed at qualitative recognition of the object reproduced by one or more image pixels
ITBO20020164A1 (en) 2002-03-28 2003-09-29 Alessandro Barducci EQUIPMENT FOR DETECTION AND PROCESSING FOR DIAGNOSTIC PURPOSE OF RADIATIONS COMING FROM HUMAN SKIN
DK1494579T3 (en) 2002-04-02 2011-11-14 Yeda Res & Dev Characterization of moving objects in a stationary background
DE10214840A1 (en) 2002-04-04 2003-11-27 Uwe Braasch Photogrammetric method for determining geometric information from images
US7136191B2 (en) 2002-06-24 2006-11-14 Eastman Kodak Company Method for inspecting prints
US7128894B1 (en) 2002-06-27 2006-10-31 The United States Of America As Represented By The United States Department Of Energy Contrast enhancing solution for use in confocal microscopy
JP2005534042A (en) 2002-07-19 2005-11-10 アストロン クリニカ リミテッド Histological examination method and apparatus for epithelial tissue
DE10239801A1 (en) 2002-08-29 2004-03-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Process for extracting texture features from a multi-channel image
US20040059199A1 (en) * 2002-09-04 2004-03-25 Thomas Pamela Sue Wound assessment and monitoring apparatus and method
US7194114B2 (en) 2002-10-07 2007-03-20 Carnegie Mellon University Object finder for two-dimensional images, and system for determining a set of sub-classifiers composing an object finder
GB2395261A (en) * 2002-11-11 2004-05-19 Qinetiq Ltd Ranging apparatus
JP4632645B2 (en) 2002-12-12 2011-02-16 オリンパス株式会社 Imaging device and processor device
US7127094B1 (en) 2003-01-02 2006-10-24 Electro Optical Sciences Inc Method of controlling data gathered at remote locations
US7613335B2 (en) 2003-02-12 2009-11-03 The University Of Iowa Research Foundation Methods and devices useful for analyzing color medical images
US7006223B2 (en) 2003-03-07 2006-02-28 3Gen, Llc. Dermoscopy epiluminescence device employing cross and parallel polarization
US7545963B2 (en) 2003-04-04 2009-06-09 Lumidigm, Inc. Texture-biometrics sensor
US7347365B2 (en) 2003-04-04 2008-03-25 Lumidigm, Inc. Combined total-internal-reflectance and tissue imaging systems and methods
DE602004030549D1 (en) 2003-04-04 2011-01-27 Lumidigm Inc MULTISPEKTRALBIOMETRIESENSOR
US7751594B2 (en) 2003-04-04 2010-07-06 Lumidigm, Inc. White-light spectral biometric sensors
US7460696B2 (en) 2004-06-01 2008-12-02 Lumidigm, Inc. Multispectral imaging biometrics
US7668350B2 (en) 2003-04-04 2010-02-23 Lumidigm, Inc. Comparative texture analysis of tissue for biometric spoof detection
CA2521857C (en) 2003-04-07 2012-06-12 E.I. Du Pont De Nemours And Company Method and apparatus for quantifying visual showthrough of printed images on the reverse of planar objects
US7400754B2 (en) 2003-04-08 2008-07-15 The Regents Of The University Of California Method and apparatus for characterization of chromophore content and distribution in skin using cross-polarized diffuse reflectance imaging
ITRM20030184A1 (en) 2003-04-22 2004-10-23 Provincia Italiana Della Congregazione Dei Figli METHOD FOR AUTOMATED DETECTION AND SIGNALING
US7233693B2 (en) 2003-04-29 2007-06-19 Inforward, Inc. Methods and systems for computer analysis of skin image
US20040225222A1 (en) 2003-05-08 2004-11-11 Haishan Zeng Real-time contemporaneous multimodal imaging and spectroscopy uses thereof
US7546156B2 (en) 2003-05-09 2009-06-09 University Of Rochester Medical Center Method of indexing biological imaging data using a three-dimensional body representation
WO2004111927A2 (en) 2003-06-13 2004-12-23 UNIVERSITé LAVAL Three-dimensional modeling from arbitrary three-dimensional curves
US20050012817A1 (en) * 2003-07-15 2005-01-20 International Business Machines Corporation Selective surveillance system with active sensor management policies
US7538869B2 (en) 2004-06-30 2009-05-26 Chemimage Corporation Multipoint method for identifying hazardous agents
US20050027567A1 (en) * 2003-07-29 2005-02-03 Taha Amer Jamil System and method for health care data collection and management
US7162063B1 (en) 2003-07-29 2007-01-09 Western Research Company, Inc. Digital skin lesion imaging system and method
US7924307B2 (en) 2003-08-14 2011-04-12 Carl Zeiss Ag Optical viewing system and method for operating the same
US7450783B2 (en) * 2003-09-12 2008-11-11 Biopticon Corporation Methods and systems for measuring the size and volume of features on live tissues
FI116327B (en) 2003-09-24 2005-10-31 Nokia Corp Method and system for automatically adjusting color balance in a digital image processing chain, corresponding hardware and software means for implementing the method
ITFI20030254A1 (en) 2003-10-08 2005-04-09 Actis Active Sensors S R L IMPROVED METHOD AND DEVICE FOR SPECTRUM ANALYSIS
US7920908B2 (en) * 2003-10-16 2011-04-05 David Hattery Multispectral imaging for quantitative contrast of functional and structural features of layers inside optically dense media such as tissue
US7460250B2 (en) 2003-10-24 2008-12-02 3Dm Devices Inc. Laser triangulation system
US20050094262A1 (en) 2003-11-05 2005-05-05 Visx, Incorporated Microscope magnification sensor
US20050111757A1 (en) 2003-11-26 2005-05-26 Brackett Charles C. Auto-image alignment system and method based on identified anomalies
US9311540B2 (en) 2003-12-12 2016-04-12 Careview Communications, Inc. System and method for predicting patient falls
JP2005197792A (en) 2003-12-26 2005-07-21 Canon Inc Image processing method, image processing apparatus, program, storage medium, and image processing system
JP4437202B2 (en) 2004-01-09 2010-03-24 学校法人慶應義塾 Telemedicine system for pigmentation site
DE102004002918B4 (en) 2004-01-20 2016-11-10 Siemens Healthcare Gmbh Device for the examination of the skin
WO2005079306A2 (en) 2004-02-13 2005-09-01 University Of Chicago Method, system, and computer software product for feature-based correlation of lesions from multiple images
US20050190988A1 (en) 2004-03-01 2005-09-01 Mass Institute Of Technology (Mit) Passive positioning sensors
JP4256294B2 (en) 2004-03-31 2009-04-22 株式会社神戸製鋼所 Die plate
JP2005328845A (en) 2004-05-06 2005-12-02 Oce Technologies Bv Methods, apparatus and computer for transforming digital colour images
US8229185B2 (en) 2004-06-01 2012-07-24 Lumidigm, Inc. Hygienic biometric sensors
USD533555S1 (en) 2004-06-04 2006-12-12 Nobel Biocare Services Ag Scanner
US8019801B1 (en) 2004-06-23 2011-09-13 Mayo Foundation For Medical Education And Research Techniques to rate the validity of multiple methods to process multi-dimensional data
US20060008178A1 (en) 2004-07-08 2006-01-12 Seeger Adam A Simulation of scanning beam images by combination of primitive features extracted from a surface model
US7672705B2 (en) 2004-07-19 2010-03-02 Resonant Medical, Inc. Weighted surface-to-surface mapping
US20060036135A1 (en) 2004-08-10 2006-02-16 Kern Kenneth A Skin cancer identification template
US8787630B2 (en) 2004-08-11 2014-07-22 Lumidigm, Inc. Multispectral barcode imaging
US20060058665A1 (en) * 2004-08-19 2006-03-16 Biosound, Inc. Noninvasive method of ultrasound wound evaluation
US7227621B2 (en) 2004-08-30 2007-06-05 Postech Foundation System for visualizing flow and measuring velocity field using X-ray particle image velocimetry
USD561804S1 (en) 2004-09-08 2008-02-12 Matsushita Electric Industrial Co., Ltd. Surveillance television camera
US7813552B2 (en) 2004-09-23 2010-10-12 Mitsubishi Denki Kabushiki Kaisha Methods of representing and analysing images
US20060072122A1 (en) * 2004-09-30 2006-04-06 Qingying Hu Method and apparatus for measuring shape of an object
US8026942B2 (en) 2004-10-29 2011-09-27 Johnson & Johnson Consumer Companies, Inc. Skin imaging system with probe
US7489799B2 (en) 2004-11-30 2009-02-10 General Electric Company Method and apparatus for image reconstruction using data decomposition for all or portions of the processing flow
GB0427642D0 (en) 2004-12-16 2005-01-19 Renovo Ltd Information collection system
US20060135953A1 (en) 2004-12-22 2006-06-22 Wlodzimierz Kania Tissue ablation system including guidewire with sensing element
USD597205S1 (en) 2004-12-22 2009-07-28 Zutron Medical, L.L.C. Handle for endoscope stiffening device
NZ556655A (en) * 2005-01-19 2010-10-29 Dermaspect Llc Devices and methods for identifying and monitoring changes of a suspect area on a patient
CA2599483A1 (en) 2005-02-23 2006-08-31 Craig Summers Automatic scene modeling for the 3d camera and 3d video
US20060222263A1 (en) 2005-04-04 2006-10-05 Carlson Eric A Linear measurement machine-readable medium, method and system
JP4230525B2 (en) 2005-05-12 2009-02-25 有限会社テクノドリーム二十一 Three-dimensional shape measuring method and apparatus
US20060293613A1 (en) * 2005-06-27 2006-12-28 Concept Development Group Method and Apparatus for Automated Monitoring and Tracking of the Trajectory of Patients' Center of Gravity Movements
US20080232679A1 (en) 2005-08-17 2008-09-25 Hahn Daniel V Apparatus and Method for 3-Dimensional Scanning of an Object
CN100484479C (en) 2005-08-26 2009-05-06 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic image enhancement and spot inhibition method
GB0517992D0 (en) 2005-09-05 2005-10-12 Sld Ltd Laser imaging
USD554682S1 (en) 2005-09-30 2007-11-06 Logitech Europe S.A. Webcam with a user-configurable shell
TW200715821A (en) 2005-10-03 2007-04-16 Chroma Ate Inc Optical image system planar surface resolution calibration method and device
EP1946567A4 (en) 2005-10-04 2009-02-18 Eugene J Alexander Device for generating three dimensional surface models of moving objects
FR2891641B1 (en) 2005-10-04 2007-12-21 Lvmh Rech METHOD AND APPARATUS FOR CHARACTERIZING SKIN IMPERFECTIONS AND METHOD OF ASSESSING THE ANTI-AGING EFFECT OF A COSMETIC PRODUCT
US7400414B2 (en) 2005-10-31 2008-07-15 Mitutoyo Corporation Hand-size structured-light three-dimensional metrology imaging system and method
WO2007051299A1 (en) 2005-11-04 2007-05-10 Cryos Technology, Inc. Surface analysis method and system
US20070129602A1 (en) 2005-11-22 2007-06-07 Given Imaging Ltd. Device, method and system for activating an in-vivo imaging device
WO2007059780A1 (en) 2005-11-28 2007-05-31 3Shape A/S Coded structured light
US20070125390A1 (en) 2005-12-07 2007-06-07 Isabelle Afriat Method of evaluating the effects of exogenous and endogenous factors on the skin
US8478386B2 (en) 2006-01-10 2013-07-02 Accuvein Inc. Practitioner-mounted micro vein enhancer
USD547347S1 (en) 2006-01-06 2007-07-24 Samsung Electronics Co., Ltd. Monitoring camera
US20070229850A1 (en) 2006-04-04 2007-10-04 Boxternal Logics, Llc System and method for three-dimensional image capture
US20080006282A1 (en) 2006-05-04 2008-01-10 Predrag Sukovic Medical imaging exchange network
US20070273894A1 (en) 2006-05-23 2007-11-29 Johnson James T Method and apparatus for remote spatial calibration and imaging
US8244333B2 (en) 2006-06-29 2012-08-14 Accuvein, Llc Scanned laser vein contrast enhancer
US8463364B2 (en) * 2009-07-22 2013-06-11 Accuvein Inc. Vein scanner
NL1032488C2 (en) 2006-09-13 2008-03-14 Alb Van Gool R & D Device and method for positioning recording means for recording images relative to an object.
US7474415B2 (en) 2006-09-13 2009-01-06 Chung Shan Institute Of Science And Technology, Armaments Bureau, M.N.D. Measurement method of three-dimensional profiles and reconstruction system thereof using subpixel localization with color gratings and picture-in-picture switching on single display
CN101534698A (en) 2006-09-27 2009-09-16 乔治亚技术研究公司 Systems and methods for the measurement of surfaces
US20080088704A1 (en) 2006-10-13 2008-04-17 Martin Edmund Wendelken Method of making digital planimetry measurements on digital photographs
US7990397B2 (en) 2006-10-13 2011-08-02 Leica Geosystems Ag Image-mapped point cloud with ability to accurately represent point coordinates
SI22424A (en) 2006-11-07 2008-06-30 ALPINA, tovarna obutve, d.d., Žiri Device and procedure for threedimensional measurement of body shape
WO2008133650A2 (en) 2006-11-07 2008-11-06 Rudolph Technologies, Inc. Method and system for providing a high definition triangulation system
DE102006059416B4 (en) 2006-12-15 2009-05-20 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Device and method for increasing the measuring accuracy of digital 3D geometry measuring systems
DE102006062447B4 (en) 2006-12-28 2009-08-20 Chronos Vision Gmbh Method and device for detecting the three-dimensional surface of an object, in particular a vehicle tire
US7912320B1 (en) 2007-01-16 2011-03-22 Paul Minor Method and apparatus for photographic measurement
US7916834B2 (en) 2007-02-12 2011-03-29 Thermo Niton Analyzers Llc Small spot X-ray fluorescence (XRF) analyzer
US8213695B2 (en) * 2007-03-07 2012-07-03 University Of Houston Device and software for screening the skin
US8659698B2 (en) 2007-05-17 2014-02-25 Ilya Blayvas Compact 3D scanner with fixed pattern projector and dual band image sensor
US8488129B2 (en) 2007-10-05 2013-07-16 Artec Group, Inc. Combined object capturing system and display device and associated method
US8105233B2 (en) 2007-10-24 2012-01-31 Tarek Ahmed Nabil Abou El Kheir Endoscopic system and method for therapeutic applications and obtaining 3-dimensional human vision simulated imaging with real dynamic convergence
USD603441S1 (en) 2007-12-25 2009-11-03 Panasonic Corporation Surveillance camera
US9757053B2 (en) 2008-02-07 2017-09-12 Thomas J. Richards Photo scaling guide configured to scale wounds or objects
US8123704B2 (en) 2008-02-07 2012-02-28 Richards Thomas J Calibration and measurement system
US8107083B2 (en) 2008-03-05 2012-01-31 General Electric Company System aspects for a probe system that utilizes structured-light
US8161826B1 (en) 2009-03-05 2012-04-24 Stryker Corporation Elastically stretchable fabric force sensor arrays and methods of making
US8533879B1 (en) 2008-03-15 2013-09-17 Stryker Corporation Adaptive cushion method and apparatus for minimizing force concentrations on a human body
CA2765419C (en) 2008-06-13 2017-10-24 Premco Medical Systems, Inc. Wound treatment apparatus and method
US20110190637A1 (en) 2008-08-18 2011-08-04 Naviswiss Ag Medical measuring system, method for surgical intervention as well as use of a medical measuring system
EP2347369A1 (en) 2008-10-13 2011-07-27 George Papaioannou Non-invasive wound prevention, detection, and analysis
US20110292406A1 (en) 2008-10-28 2011-12-01 3Shape A/S Scanner with feedback control
WO2010077900A1 (en) 2008-12-16 2010-07-08 Faro Technologies, Inc. Structured light imaging system and method
US20100278312A1 (en) 2009-04-30 2010-11-04 Kent State University Core measurements stand for use with a portable xrf analyzer
US7931149B2 (en) 2009-05-27 2011-04-26 Given Imaging Ltd. System for storing and activating an in vivo imaging capsule
WO2011063266A2 (en) 2009-11-19 2011-05-26 The Johns Hopkins University Low-cost image-guided navigation and intervention systems using cooperative sets of local sensors
USD653687S1 (en) 2010-04-02 2012-02-07 Powertech Electronics Case for security camera
US20120078088A1 (en) 2010-09-28 2012-03-29 Point of Contact, LLC. Medical image projection and tracking system
US20120078113A1 (en) 2010-09-28 2012-03-29 Point of Contact, LLC Convergent parameter instrument
US9357963B1 (en) 2011-04-04 2016-06-07 James G. Spahn Grayscale thermographic imaging
USD662122S1 (en) 2011-04-07 2012-06-19 ARANZ Medical Limited Combined handheld laser and camera unit
EP2510972B1 (en) 2011-04-14 2014-08-06 Biotronik AG Catheter device
US8638986B2 (en) 2011-04-20 2014-01-28 Qualcomm Incorporated Online reference patch generation and pose estimation for augmented reality
GB201107225D0 (en) 2011-04-29 2011-06-15 Peira Bvba Stereo-vision system
TWI471117B (en) * 2011-04-29 2015-02-01 Nat Applied Res Laboratoires Human facial skin roughness and wrinkle inspection based on smart phone
JP5887770B2 (en) 2011-09-05 2016-03-16 富士ゼロックス株式会社 Image processing apparatus and image processing program
USD697210S1 (en) 2012-02-28 2014-01-07 X-Ray Optical Systems, Inc. Handheld x-ray analyzer
US9186053B2 (en) 2012-05-03 2015-11-17 Covidien Lp Methods of using light to repair hernia defects
US9167800B2 (en) 2012-06-04 2015-10-27 Clicrweight, LLC Systems for determining animal metrics and related devices and methods
US8787621B2 (en) 2012-06-04 2014-07-22 Clicrweight, LLC Methods and systems for determining and displaying animal metrics
US9224205B2 (en) 2012-06-14 2015-12-29 Qualcomm Incorporated Accelerated geometric shape detection and accurate pose tracking
US9064765B2 (en) 2012-08-14 2015-06-23 Symbol Technologies, Llc Handheld imaging apparatus for, and method of, imaging targets using a high performance, compact scan engine
US9026187B2 (en) 2012-09-01 2015-05-05 Morphie, Inc. Wireless communication accessory for a mobile device
US8904876B2 (en) 2012-09-29 2014-12-09 Stryker Corporation Flexible piezocapacitive and piezoresistive force and pressure sensors
US8997588B2 (en) 2012-09-29 2015-04-07 Stryker Corporation Force detecting mat with multiple sensor types
US20150250416A1 (en) 2012-10-05 2015-09-10 Vasamed, Inc. Apparatus and method to assess wound healing
US9877692B2 (en) 2012-11-09 2018-01-30 Baylor University Method and system of measuring anatomical features in subcutaneous images to assess risk of injury
US9395234B2 (en) 2012-12-05 2016-07-19 Cardiocom, Llc Stabilizing base for scale
WO2014182676A2 (en) 2013-05-06 2014-11-13 Scholar Rock, Inc. Compositions and methods for growth factor modulation
US20140354830A1 (en) 2013-06-03 2014-12-04 Littleton Precision, LLC System and method for adding scale to photographic images
US9438775B2 (en) 2013-09-17 2016-09-06 Occipital, Inc. Apparatus for real-time 3D capture
USD724216S1 (en) 2013-11-18 2015-03-10 American Science And Engineering, Inc. Handheld backscatter imager
US10445465B2 (en) 2013-11-19 2019-10-15 General Electric Company System and method for efficient transmission of patient data
USD714940S1 (en) 2014-01-06 2014-10-07 Nanofocusray Co., Ltd. Medical X-ray imaging apparatus
US10008870B2 (en) 2014-03-20 2018-06-26 Otter Products, Llc Powered case for portable electronic device
WO2015198578A1 (en) 2014-06-25 2015-12-30 パナソニックIpマネジメント株式会社 Projection system
USD740945S1 (en) 2014-10-31 2015-10-13 Aranz Healthcare Limited Handheld scanner
WO2016094439A1 (en) 2014-12-08 2016-06-16 Munoz Luis Daniel Device, system and methods for assessing tissue structures, pathology, and healing
TWM501041U (en) 2014-12-24 2015-05-11 Coremate Technical Co Ltd Mobile phone protective casing having wireless charging and discharging function
US10217022B2 (en) 2015-03-06 2019-02-26 Ricoh Company, Ltd. Image acquisition and management
CN107920739B (en) 2015-06-10 2021-07-09 泰拓卡尔有限公司 Device and method for examining skin lesions
US10311567B2 (en) 2015-09-23 2019-06-04 Novadaq Technologies ULC Methods and systems for assessing healing of tissue
CA3041583A1 (en) 2015-10-29 2017-05-04 PogoTec, Inc. Hearing aid adapted for wireless power reception
US10013527B2 (en) 2016-05-02 2018-07-03 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
EP4183328A1 (en) 2017-04-04 2023-05-24 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems

Patent Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4535782A (en) * 1984-03-07 1985-08-20 American Cyanamid Company Method for determining wound volume
US5270168A (en) * 1990-02-21 1993-12-14 Board Of Regents, The University Of Texas System Method for diagnosing non-healing ulcers
US5588428A (en) * 1993-04-28 1996-12-31 The University Of Akron Method and apparatus for non-invasive volume and texture analysis
US5969822A (en) * 1994-09-28 1999-10-19 Applied Research Associates Nz Ltd. Arbitrary-geometry laser surface scanner
US5967979A (en) * 1995-11-14 1999-10-19 Verg, Inc. Method and apparatus for photogrammetric assessment of biological tissue
US20020149585A1 (en) * 1996-04-24 2002-10-17 Kacyra Ben K. Integrated system for quickly and accurately imaging and modeling three-dimensional objects
US5957837A (en) * 1996-10-17 1999-09-28 Faro Technologies, Inc. Method and apparatus for wound management
US6359612B1 (en) * 1998-09-30 2002-03-19 Siemens Aktiengesellschaft Imaging system for displaying image information that has been acquired by means of a medical diagnostic imaging device
US6381026B1 (en) * 1999-03-15 2002-04-30 Lifecell Corp. Method of measuring the contour of a biological surface
US6381488B1 (en) * 1999-06-15 2002-04-30 Sandia Corporation Method and apparatus to measure the depth of skin burns
US7068836B1 (en) * 2000-04-28 2006-06-27 Orametrix, Inc. System and method for mapping a surface
US20020054297A1 (en) * 2000-11-06 2002-05-09 Chun-Hsing Lee Three dimensional scanning system
US20060055943A1 (en) * 2002-11-14 2006-03-16 Technodream21 Inc. Three-dimensional shape measuring method and its device
US7248724B2 (en) * 2002-11-19 2007-07-24 Polartechnics Limited Method for monitoring wounds
US7181363B2 (en) * 2003-04-16 2007-02-20 Massachusetts Institute Of Technology Three dimensional tangible interface for interacting with spatial-temporal data using a laser scanner
US20060073132A1 (en) * 2004-10-06 2006-04-06 Congote Luis F Agents for wound healing
US20090234313A1 (en) * 2005-05-20 2009-09-17 Peter Mullejeans Device for Recording and Transferring a Contour
US20090213213A1 (en) * 2005-10-14 2009-08-27 Applied Research Associates New Zealand Limited Method of Monitoring a Surface Feature and Apparatus Therefor
US20070276309A1 (en) * 2006-05-12 2007-11-29 Kci Licensing, Inc. Systems and methods for wound area management
US20070276195A1 (en) * 2006-05-12 2007-11-29 Kci Licensing, Inc. Systems and methods for wound area management
US20080098322A1 (en) * 2006-06-01 2008-04-24 Simquest Llc Method and apparatus for collecting and analyzing surface wound data
US7495208B2 (en) * 2006-06-01 2009-02-24 Czarnek And Orkin Laboratories, Inc. Portable optical wound scanner
US20080045807A1 (en) * 2006-06-09 2008-02-21 Psota Eric T System and methods for evaluating and monitoring wounds
US20100004564A1 (en) * 2006-07-25 2010-01-07 Johan Jendle Wound measuring device
US9451928B2 (en) * 2006-09-13 2016-09-27 Elekta Ltd. Incorporating internal anatomy in clinical radiotherapy setups
US20120035469A1 (en) * 2006-09-27 2012-02-09 Whelan Thomas J Systems and methods for the measurement of surfaces
US20090116712A1 (en) * 2007-11-02 2009-05-07 Osama Al-Moosawi Apparatus and method for wound diagnosis
US8814841B2 (en) * 2007-12-06 2014-08-26 Smith & Nephew Plc Apparatus and method for wound volume measurement
US20120059266A1 (en) * 2009-03-09 2012-03-08 Mologic Ltd. Imaging method
US20130051651A1 (en) * 2010-05-07 2013-02-28 Purdue Research Foundation Quantitative image analysis for wound healing assay
US20130335545A1 (en) * 2010-12-19 2013-12-19 Matthew Ross Darling System for integrated wound analysis
US9330453B2 (en) * 2011-03-24 2016-05-03 Red. Soft It-Service Gmbh Apparatus and method for determining a skin inflammation value
US20130137991A1 (en) * 2011-11-28 2013-05-30 William Richard Fright Handheld skin measuring or monitoring device
US20140088402A1 (en) * 2012-09-25 2014-03-27 Innovative Therapies, Inc. Wound measurement on smart phones
US20150150457A1 (en) * 2013-12-03 2015-06-04 Children's National Medical Center Method and system for wound assessment and management
US20160284084A1 (en) * 2015-03-23 2016-09-29 Ohio State Innovation Foundation System and method for segmentation and automated measurement of chronic wound images

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Callieri, M., Cignoni, P., Pingi, P., Scopigno, R., Coluccia, M., Gaggio, G., & Romanelli, M. N. (2003, November). Derma: Monitoring the Evolution of Skin Lesions with a 3D System. In VMV (pp. 167-174). *
Thali, M. J., Braun, M., & Dirnhofer, R. (2003). Optical 3D surface digitizing in forensic medicine: 3D documentation of skin and bone injuries. Forensic science international, 137(2), 203-208. *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10827970B2 (en) 2005-10-14 2020-11-10 Aranz Healthcare Limited Method of monitoring a surface feature and apparatus therefor
US10874302B2 (en) 2011-11-28 2020-12-29 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US11850025B2 (en) 2011-11-28 2023-12-26 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US10777317B2 (en) 2016-05-02 2020-09-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11250945B2 (en) 2016-05-02 2022-02-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11923073B2 (en) 2016-05-02 2024-03-05 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
US11903723B2 (en) 2017-04-04 2024-02-20 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems

Also Published As

Publication number Publication date
EP1948018A1 (en) 2008-07-30
US20160262659A1 (en) 2016-09-15
JP2009511163A (en) 2009-03-19
US8755053B2 (en) 2014-06-17
CN101282687A (en) 2008-10-08
AU2006300008A1 (en) 2007-04-19
US20140243619A1 (en) 2014-08-28
US20170042452A1 (en) 2017-02-16
US20210219907A1 (en) 2021-07-22
KR20080064155A (en) 2008-07-08
US10827970B2 (en) 2020-11-10
US9955910B2 (en) 2018-05-01
CA2625775A1 (en) 2007-04-19
US20180214071A1 (en) 2018-08-02
US20090213213A1 (en) 2009-08-27
WO2007043899A1 (en) 2007-04-19
US9377295B2 (en) 2016-06-28
CN101282687B (en) 2011-11-16

Similar Documents

Publication Publication Date Title
US20210219907A1 (en) Method of monitoring a surface feature and apparatus therefor
US11850025B2 (en) Handheld skin measuring or monitoring device
US11903723B2 (en) Anatomical surface assessment methods, devices and systems
US20210386295A1 (en) Anatomical surface assessment methods, devices and systems
US7657101B2 (en) Devices and methods for identifying and monitoring changes of a suspect area on a patient
CN111031897B (en) System and method for analyzing skin condition
US20080045807A1 (en) System and methods for evaluating and monitoring wounds
Sprigle et al. Iterative design and testing of a hand-held, non-contact wound measurement device
Foltynski et al. Wound surface area measurement methods
KR20160117956A (en) Skin diagnostic measurement system using a terminal

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: APPLIED RESEARCH ASSOCIATES NZ LIMITED, NEW ZEALAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FRIGHT, WILLIAM RICHARD;NIXON, MARK ARTHUR;MCCALLUM, BRUCE CLINTON;AND OTHERS;SIGNING DATES FROM 20090304 TO 20090421;REEL/FRAME:044022/0061

Owner name: ARANZ HEALTHCARE LIMITED, NEW ZEALAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:APPLIED RESEARCH ASSOCIATES NZ LIMITED;REEL/FRAME:044360/0390

Effective date: 20121128