WO2016073569A2 - Video detection of tooth condition using green and red fluorescence - Google Patents


Info

Publication number
WO2016073569A2
Authority
WO
WIPO (PCT)
Prior art keywords
tooth
image data
image
fluorescence
camera
Prior art date
Application number
PCT/US2015/058977
Other languages
French (fr)
Other versions
WO2016073569A3 (en)
Inventor
Yingqian WU
Wei Wang
Victor C. Wong
Yan Zhang
Original Assignee
Carestream Health, Inc.
Priority date
Filing date
Publication date
Application filed by Carestream Health, Inc.
Publication of WO2016073569A2
Publication of WO2016073569A3

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • A61B5/0088Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10064Fluorescence image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30036Dental; Teeth

Definitions

  • U.S. 4,515,476 (Ingmar) describes the use of a laser for providing excitation energy that generates fluorescence at some other wavelength for locating carious areas.
  • U.S. 6,231,338 (de Josselin de Jong) describes an imaging apparatus for identifying dental caries using fluorescence detection.
  • U.S. 2004/0202356 (Stookey) describes mathematical processing of spectral changes in fluorescence in order to detect caries in different stages with improved accuracy. Acknowledging the difficulty of early detection when using spectral fluorescence measurements, the '2356 Stookey et al. disclosure describes approaches for enhancing the spectral values obtained, effecting a transformation of the spectral data that is adapted to the spectral response of the camera that obtains the fluorescence image.
  • One drawback of existing dental imaging systems relates to the delay between the time the tooth is initially screened and its image obtained, and the time a possible caries condition is identified or reported to the dentist or technician.
  • Tooth screening (during which the images are obtained) and caries detection (during which the images are processed and analyzed to identify carious regions) are carried out as two separate steps.
  • a still image capture is first obtained from the tooth in response to an operator instruction.
  • the image data are processed and analyzed for carious conditions to provide the clinician with a processed image (possibly also accompanied by a report) indicating caries information, such as apparent location, size, and severity, for example.
  • This caries information is available only at a later time, after the conclusion of the tooth screening step and only after image processing/analysis steps are completed.
  • U.S. 8,311,302 provides a method for real-time identification and highlighting of suspicious caries lesions in white-light video images with reduced sensitivity to illumination variation. Neither method, however, takes advantage of information that is indicative of bacterial activity, obtained from the red signal content of the fluorescence image.
  • An object of the present disclosure is to provide apparatus and methods for identifying and quantifying caries and other disease conditions in digital images of a tooth.
  • Caries identification and analysis can be executed automatically to assist the practitioner before, during, and following treatment of the patient.
  • a method for imaging a tooth executed at least in part by a computer and comprising: a) illuminating the tooth and acquiring fluorescence image data from the tooth; b) calculating a risk condition for the tooth according to the fluorescence image data; c) mapping two or more display colors to areas of the tooth according to the calculated risk condition to form a pseudo-color mapped tooth; and d) displaying the pseudo-color mapped tooth.
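A minimal sketch of steps a) through d) of the method above: the function names, the red/green risk heuristic, and the two display colors are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def risk_map(fluo_rgb):
    """Per-pixel risk from the red/green balance of a fluorescence frame (step b)."""
    r = fluo_rgb[..., 0].astype(float)
    g = fluo_rgb[..., 1].astype(float)
    return np.clip((r - g) / (r + g + 1e-6), 0.0, 1.0)

def pseudo_color(fluo_rgb, risk, threshold=0.3):
    """Map low-risk pixels to a tooth-like color and high-risk pixels to a
    contrasting highlight color (step c). Both colors are assumed values."""
    out = np.empty_like(fluo_rgb)
    healthy = np.array([235, 225, 200], dtype=np.uint8)   # assumed tooth-like color
    infected = np.array([255, 60, 60], dtype=np.uint8)    # assumed highlight color
    out[...] = np.where(risk[..., None] > threshold, infected, healthy)
    return out

# Step a) stand-in: a tiny synthetic fluorescence frame.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[..., 1] = 200               # mostly green: healthy fluorescence
frame[0, 0] = (220, 40, 0)        # one reddish pixel: suspected bacterial activity
mapped = pseudo_color(frame, risk_map(frame))   # step d) would display `mapped`
```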
  • Figure 1 is a schematic block diagram that shows a dental imaging apparatus for detection of caries and other tooth conditions according to an embodiment of the present invention.
  • Figure 2A is a schematic diagram that shows the activity of fluoresced green light for caries detection.
  • Figure 2B is an image that shows an advanced caries condition detected according to an embodiment of the present invention.
  • Figure 3A is a schematic diagram that shows the behavior of fluoresced red light for caries detection.
  • Figure 3B is an image that shows incipient caries detected according to an embodiment of the present invention.
  • Figure 4 is a diagram that shows the overall arrangement of color space using the hue-saturation-value (HSV) model.
  • Figure 5 is a process flow diagram that shows steps for processing acquired image data for caries detection in video-image mode.
  • Figure 6A shows processing of the video fluorescence image to generate a pseudo color image according to an embodiment of the present disclosure.
  • Figure 6B shows processing of the video fluorescence image to generate a pseudo color image according to an alternate embodiment of the present disclosure.
  • Figure 6C shows processing of the video fluorescence image to generate a grayscale likelihood image according to an embodiment of the present disclosure.
  • Figure 7A is a schematic diagram that shows an imaging apparatus for providing images supporting minimally invasive treatment according to an embodiment of the present disclosure.
  • Figure 7B is a schematic diagram that shows the imaging apparatus of Figure 7A with the camera moved out of imaging position.
  • Figure 7C is a schematic diagram that shows the imaging apparatus of Figure 7A with the camera repositioned in imaging position.
  • Figure 8 is a logic flow diagram showing an imaging procedure for supporting minimally invasive treatment.
  • Figure 9 shows a sequence of time-stamped images displayed to support patient treatment.
  • Figure 10 is a schematic diagram that shows an alternate embodiment of the present invention where a dental instrument is directly mounted (e.g., integral) to an intra-oral imaging camera as part of a dental treatment instrument.
  • The term “opticals” is used generally to refer to lenses and other refractive, diffractive, and reflective components used for shaping a light beam.
  • In the context of the present disclosure, the terms “viewer”, “operator”, and “user” are considered to be equivalent and refer to the viewing practitioner, technician, or other person who views and manipulates an image, such as a dental image, on a display monitor.
  • A viewer instruction is obtained from explicit commands entered by the viewer, such as by clicking a button on a camera or by using a computer mouse, touch screen, or keyboard entry.
  • The term “highlighting” for a displayed feature has its conventional meaning as understood by those skilled in the information and image display arts. In general, highlighting uses some form of localized display enhancement to attract the attention of the viewer. Highlighting a portion of an image, such as an individual tooth, a set of teeth, or other structure(s), can be achieved in any of a number of ways, including, but not limited to: annotating; displaying a nearby or overlaying symbol; outlining or tracing; display in a different color or at a markedly different intensity or gray scale value than other image or information content; blinking or animation of a portion of a display; or display at higher sharpness or contrast.
  • An image is displayed according to image data that can be acquired by a camera or other device, wherein the image data represents the image as an ordered arrangement of pixels.
  • Image content may be displayed directly from acquired image data or may be further processed, such as to combine image data from different sources or to highlight various features of tooth anatomy represented by the image data, for example.
  • In the context of the present disclosure, the terms “image” and “image data” are generally synonymous.
  • the described invention includes calculation steps. Those skilled in the art will recognize that these calculation steps may be performed by data processing hardware that is provided with instructions for image data processing. Because such image manipulation systems are well known, the present description is directed more particularly to algorithms and systems that execute the method of the present invention. Other aspects of such algorithms and systems, and data processing hardware and/or software for producing and otherwise processing the image signals may be selected from such systems, algorithms, components and elements known in the art. Given the description as set forth in the following specification, software implementation lies within the ordinary skill of those versed in the programming arts.
  • the stored instructions of such a software program may be stored in a computer readable storage medium, which may comprise, for example:
  • magnetic storage media such as a magnetic disk or magnetic tape
  • optical storage media such as an optical disc, optical tape, or machine readable bar code
  • solid state electronic storage devices such as random access memory (RAM), or read only memory (ROM); or any other physical device or medium employed to store a computer program.
  • the present invention can be utilized on a data processing hardware apparatus, such as a computer system or personal computer, or on an embedded system that employs a dedicated data processing component, such as a digital signal processing chip.
  • The word “intensity” is used to refer to light level, and is also broadly used to refer to the value of a pixel in a digital image.
  • fluorescence can be used to detect dental caries using either of two characteristic responses: First, excitation by a blue light source causes healthy tooth tissue to fluoresce in the green spectrum, between about 500 and 550 nm. Tooth material that has been damaged may fluoresce at a lower intensity or may not fluoresce perceptibly. Secondly, excitation by a blue or red light source can cause bacterial by-products, such as those indicating caries, to fluoresce in the red spectrum, above 600 nm. Some existing caries detection systems use fluorescence of either type.
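The two characteristic responses can be illustrated with a toy spectral classifier. The band edges (green between about 500 and 550 nm, red above 600 nm) come from the text above; the function names and the decision rule are invented for the example.

```python
import numpy as np

def band_energy(wavelengths_nm, intensities, lo, hi):
    """Sum intensity samples whose wavelength falls in [lo, hi) nanometers."""
    wl = np.asarray(wavelengths_nm, float)
    it = np.asarray(intensities, float)
    return it[(wl >= lo) & (wl < hi)].sum()

def fluorescence_signature(wavelengths_nm, intensities):
    # Band edges from the text: healthy green ~500-550 nm, bacterial red >600 nm.
    green = band_energy(wavelengths_nm, intensities, 500, 550)
    red = band_energy(wavelengths_nm, intensities, 600, 800)
    if red > green:
        return "suspect bacterial activity"
    return "healthy-looking" if green > 0 else "reduced fluorescence"

wl = [510, 530, 620, 650]
healthy = fluorescence_signature(wl, [0.8, 0.9, 0.10, 0.05])   # strong green emission
carious = fluorescence_signature(wl, [0.1, 0.1, 0.60, 0.70])   # strong red emission
```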
  • reflectance generally denotes the sum total of both specular reflectance and scattered reflectance.
  • specular component of reflectance is of no interest and is, instead, generally detrimental to obtaining an image or measurement from a sample.
  • The component of reflectance that is of interest for the present application is from back-scattered light only.
  • back-scattered reflectance is used in the present application to denote the component of reflectance that is of interest.
  • Back-scattered reflectance is defined as that component of the excitation light that is elastically back-scattered over a wide range of angles by the illuminated tooth structure.
  • Reflectance image data, as this term is used in the present disclosure, refers to image data obtained from back-scattered reflectance only, since specular reflectance is blocked or kept to a minimum.
  • Back-scattered reflectance may also be referred to as back-reflectance or simply as back-scattering. Back-scattered reflectance is in the visible spectrum.
  • At later stages of decay, back-scattered reflectance may be a less effective indicator than at earlier stages.
  • FIG. 1 shows a dental imaging apparatus 10 for detection of caries and other tooth conditions during a patient treatment session according to an embodiment of the present invention.
  • An intraoral camera 30 is used for imaging tooth 20, providing the different illumination sources needed for both reflectance and fluorescence imaging, with appropriate spectral filters and other optics, detector components, and other elements.
  • Camera 30 components for a camera that obtains both reflectance and fluorescence images are described, for example, in commonly assigned U.S. Patent No. 7,596,253 entitled "Method and Apparatus for Detection of Caries" to Wong et al.
  • The Wong et al. '253 patent describes a FIRE (Fluorescence Imaging with Reflectance Enhancement) method that combines reflective image data with a portion of the fluorescent content.
  • the camera illumination source may be a solid state emissive device, such as a light-emitting diode (LED) or scanned laser, for example.
  • Camera 30 provides image data to an external processor 40 over a transmission link 32, which may be a wired or wireless link.
  • Processor 40 has an associated display 42 for display of the acquired and processed images.
  • An operator interface device 44, such as a keyboard with a mouse or other pointer, or a touchscreen, allows entry of instructions for camera 30 operation.
  • one or more operator controls 41 are provided on the camera 30 handset.
  • Embodiments of the present invention utilize fluorescence response in at least two different spectral bands.
  • The two spectral bands may overlap or may be essentially non-overlapping, with spectral bands centered at different wavelengths.
  • Figure 2A shows information that is provided from fluorescence in the green spectral band.
  • Excitation light 50 of blue and near UV wavelengths (nominally about 400 nm according to an embodiment of the present disclosure) is directed toward tooth 20 with an outer enamel layer 22 and inner dentine 24.
  • Fluoresced light 52 of green wavelengths, approximately in the range of 500-550 nm, is detected from portions of the tooth 20 having normal mineral content, not exhibiting perceptible damage from decay.
  • a demineralized area 26 is more opaque than healthy enamel and tends to block the incident excitation light 50 as well as to block back-scattered fluorescent light from surrounding enamel. This effect is used by the FIRE method described in the Wong et al '253 patent, wherein the fluorescence green channel data is combined with reflectance image data to heighten the contrast of caries regions.
  • Figure 2B shows an early caries condition detected for tooth 20 using the FIRE method, according to an embodiment of the present invention.
  • An area 28, circled in Figure 2B, shows suspected caries.
  • the fluoresced red light has different significance from that of green fluorescence, indicating the presence of bacterial metabolic products. Bacteria that typically cause a caries lesion, plaque, or tartar typically generate byproducts that fluoresce in the red spectrum, above about 600 nm.
  • Figure 3A shows the behavior of fluoresced red light 53 for caries detection.
  • a caries lesion 54 has significant bacterial activity, evidenced by emission of perceptible amounts of fluoresced light 53 in the red spectral region in response to excitation light 50. With proper filtering of the fluorescent light, this red wavelength emission indicates an active lesion 54, as circled in Figure 3B.
  • Dental imaging apparatus 10 uses a camera 30 that can operate in two modes of imaging operation, still and video, using both reflectance and fluorescence imaging.
  • the present disclosure is directed primarily to image acquisition and processing during video imaging mode, in which a video stream of image data is obtained.
  • Embodiments of the present disclosure take advantage of both green and red spectral components of fluorescence image data for detecting conditions such as cavities or plaque.
  • Camera 30 typically captures color images in a tristimulus Red-Green-Blue (RGB) representation, using conventional types of CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) sensor devices.
  • The color image data is converted from tristimulus RGB to a polar-coordinate color model for simplifying calculation and conversion.
  • The diagram of Figure 4 shows the overall arrangement of color space using the hue-saturation-value (HSV) model.
  • In HSV color representation, Hue is provided with an angular coordinate. Coordinate position along a central axis represents color Value. Saturation is represented by coordinate distance from the central Value axis.
  • Three-dimensional HSV coordinates are represented in a perspective view 74.
  • a 2-D view 76 shows Hue and Value coordinate space.
  • a circular view 78 shows Hue angular relationship at a halfway point along the Value axis.
  • A 2-D view 72 shows a slice taken through perspective view 74 to show Saturation with respect to Value. Transformation of the RGB data from the camera detector to HSV data uses algorithmic methods for color conversion that are familiar to those skilled in the color imaging arts.
  • RGB values are normalized to compensate for variation in illumination levels.
  • One or more additional images are prepared for reference in subsequent processing. For example, an RG ratio image can be generated, in which each pixel has a value proportional to the ratio of red (R) to green (G) values in the initial RGB value assignment from the imaging detector of camera 30.
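The RG ratio image described above can be sketched directly; the function name and the small epsilon guard against division by zero are assumptions.

```python
import numpy as np

def rg_ratio_image(rgb, eps=1e-6):
    """Each output pixel is proportional to R/G in the camera's RGB frame."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    return r / (g + eps)   # eps guards against division by zero in dark pixels

frame = np.array([[[200, 100, 0], [50, 200, 0]]], dtype=np.uint8)
ratio = rg_ratio_image(frame)   # high values hint at red fluorescence content
```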
  • the tooth Region of Interest (ROI) is also extracted using any of a number of well known segmentation techniques. As part of this segmentation, the tooth 20 is defined apart from background image content in the fluorescence image.
  • intra-oral camera 30 ( Figure 1) is also capable of obtaining fluorescence images in video mode.
  • the video mode processing sequence is shown in the flow diagram of Figure 5.
  • an ultra-violet or near UV source of camera 30 is energized.
  • the usable UV and near-UV light range is typically above 200 nm wavelength and may extend somewhat into the blue visible light range (near about 390-450 nm). This may be a solid-state source, such as an LED or scanning laser source for example.
  • Processing is arranged in a modular form.
  • The color space of the video content is transformed from RGB to HSV, as described previously for still-mode image content. This allows feature content to be obtained and a feature map generated using only HSV values.
  • the tooth Region of Interest (ROI) is also extracted using any of a number of well known segmentation techniques. As part of this segmentation, the tooth 20 is defined apart from background image content in the fluorescence image data.
  • Feature calculation step 300 converts the color space of the current fluorescence video frame, Ifluo(t), from RGB to HSV. Because reflectance video is not obtained by camera 30 while fluorescence video is captured, the feature value contains fluorescent color information without any reflectance content. The color information reflects an overall risk condition; in fluorescence images, the metabolic by-product of bacterial activity is red in appearance.
  • the feature can be calculated as follows:
  • Each of these pixel values is in the range [0, 1].
  • Hhi and Hlo are set corresponding to known hue angle values for healthy tissue in HSV color space, for example, 0.38 and 0.36.
  • Ilkl(x,y,t) = 1/{1 + exp[-(a*Imx(x,y,t)*I_s(x,y,t) + c)]} + b.
  • the value in the likelihood map is calculated for each pixel, expressing a risk degree for the pixel.
  • The values in the likelihood map, ranging in [0,1], can be used to indicate relative risk levels corresponding to levels of bacterial activity.
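A hedged sketch of the likelihood-map computation: a logistic function maps a per-pixel feature into [0, 1]. The constants a, b, and c are placeholders, since their values are not given here.

```python
import numpy as np

def likelihood_map(feature, a=10.0, b=0.0, c=-5.0):
    """Logistic mapping of a per-pixel feature to a risk value in [0, 1].
    a, b, c are assumed placeholder constants."""
    z = 1.0 / (1.0 + np.exp(-(a * feature + c))) + b
    return np.clip(z, 0.0, 1.0)

# The feature here stands in for a per-pixel product such as Imx(x,y,t)*I_s(x,y,t).
lkl = likelihood_map(np.array([0.0, 0.5, 1.0]))
```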
  • The likelihood map, calculated as described above, is termed the raw feature map for the current frame. In the sequence of Figure 5, the raw feature map is input to a global motion compensation step 310.
  • Global motion compensation step 310
  • global motion compensation step 310 registers content from individual image frames, such as using an affine transformation matrix and related contour tracking techniques. This helps to temporally align the feature mapping obtained in successive video frames. Because it allows multiple frames to be used in combination, alignment helps to reduce noise and provide a more consistent image input for subsequent processing.
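As an illustrative stand-in for this registration step (the affine transformation with contour tracking described above is more general), the following sketch estimates a whole-frame integer translation between consecutive frames by FFT phase correlation, which is one simple way to temporally align feature maps:

```python
import numpy as np

def estimate_shift(prev, curr):
    """Return (dy, dx) such that curr is approximately prev shifted by (dy, dx)."""
    # Normalized cross-power spectrum; its inverse FFT peaks at the shift.
    cross = np.fft.fft2(curr) * np.conj(np.fft.fft2(prev))
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    # Wrap shifts into the signed range (-N/2, N/2].
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

base = np.zeros((32, 32))
base[8:12, 8:12] = 1.0
moved = np.roll(base, (3, 5), axis=(0, 1))   # simulated camera motion of (3, 5) pixels
dy, dx = estimate_shift(base, moved)
```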
  • Global motion compensation step 310 processing includes a number of tasks, including the following:
  • ICP Iterative Closest Points
  • Ilkl_out(x,y,t) = r*Ilkl_p(x,y,t) + (1 - r)*Ilkl(x,y,t), where Ilkl_p(x,y,t) is the motion-compensated feature map from the previous frame and r is a weighting factor.
  • the current Ilkl_out(t) is termed "Updated feature map for current frame”. This is fed back to Feature calculation step 300 and stored into Frame buffer management as part of a frame buffer management step 320.
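The feedback update above can be sketched as a simple temporal blend of the motion-compensated previous map with the current raw map; the weight r is an assumed parameter.

```python
import numpy as np

def update_feature_map(ilkl_prev_compensated, ilkl_current, r=0.6):
    """Blend the motion-compensated previous map with the current raw map."""
    return r * ilkl_prev_compensated + (1.0 - r) * ilkl_current

prev_map = np.array([0.8, 0.2])   # Ilkl_p: compensated map from the previous frame
curr_map = np.array([0.4, 0.4])   # Ilkl: raw feature map of the current frame
out = update_feature_map(prev_map, curr_map)   # Ilkl_out, fed back and stored
```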
  • Frame buffer management step 320
  • Frame buffer management step 320 provides management for information extracted in successive image frames as follows:
  • Croi(t) and Croi(t-1) are output from Frame Buffer Management processing in step 320.
  • Croi(t-1) is replaced by Croi(t) for use in processing the next frame.
  • Ilkl_s(t-1) (termed the “Stored feature map of previous frame” in Figure 5) is fetched from Frame Buffer Management for current use.
  • Ilkl_s(t-1) is replaced with the updated Ilkl_out(t) (termed the “Updated feature map of previous frame” in Figure 5) for use in processing the next frame and calculating an adjusted calculated risk condition.
  • An optional classification step 330 takes the feature map Ilkl_out(t) as input and processes features using one or more trained classifiers in order to distinguish healthy tissue from infected tissue.
  • the trained classifiers can include software designed with neural network response, conditioned to provide decision results according to previous interaction with a training set of conditions and interaction with ongoing conditions encountered in normal use. Neural network and training software in general are known to those skilled in the software design arts.
  • A Gaussian Mixture Model (GMM) is used for the decision process with classifier software, with the number of mixture components predetermined during the training procedure.
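In the spirit of the GMM-based decision process, though much simplified, the following sketch uses one Gaussian per class over a scalar likelihood feature and assigns each pixel to the class with the higher density. The means and variance are invented for the example; in practice they would come from the training procedure.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    """Density of a 1-D Gaussian with mean mu and variance var."""
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def classify(feature, mu_healthy=0.1, mu_infected=0.8, var=0.04):
    """Assign each pixel to the class whose Gaussian gives the higher density."""
    p_h = gaussian_pdf(feature, mu_healthy, var)
    p_i = gaussian_pdf(feature, mu_infected, var)
    return np.where(p_i > p_h, 1, 0)   # 1 = infected, 0 = healthy

labels = classify(np.array([0.05, 0.9, 0.3, 0.6]))
```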
  • an image enhancement and color mapping step 340 provides a number of processes, including color space transformation, image enhancement, and image fusion.
  • the likelihood value for each pixel in an infected tissue region is fused with the fluorescence image.
  • Pseudo-color mapping can be used to assist in forming and displaying a pseudo-color mapped tooth for visualizing the infected areas.
  • Color transformation is performed, transforming fluorescence image data from RGB to HSV color space, in order to acquire H(x,y,t), S(x,y,t), and V(x,y,t), all ranging in [0,1].
  • An image enhancement sequence processes the HSV color Value (V) of the ROI to emphasize detail.
  • the Hue value (H) is mapped to a specified range so that the high likelihood region is more noticeable.
  • the Saturation value (S) is set proportional to the likelihood value, obtained as described previously, so that color saturation expresses relative risk level for affected areas.
  • the new hue value, H'(x,y,t) can be calculated as follows:
  • H'(x,y,t) = hue_tgt + δ*[1 - Ilkl_out(x,y,t)*Rrisk(x,y,t)]*[H(x,y,t) - hue_tgt], wherein hue_tgt is a specified target hue for the highest likelihood value and δ is an empirically determined value that is used to control scaling degree;
  • the new saturation value, S'(x,y,t), can be calculated as follows:
  • the new intensity value, V'(x,y,t) can be calculated as follows:
  • v_tmp = V(x,y,t) + 2*(1 - V_max(t))*Ilkl_out(x,y,t);
  • V_max(t) is the maximum intensity value in V(x,y,t).
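The hue, saturation, and value adjustments described above can be sketched per pixel as follows. The constants hue_tgt and delta are assumed placeholders, and the formulas are simplified for illustration: hue is pulled toward the target hue for high-likelihood pixels, saturation is tied to the likelihood, and value is brightened by the likelihood term.

```python
import numpy as np

def remap_hsv(h, s, v, lkl, hue_tgt=0.0, delta=0.5):
    """Illustrative remapping of HSV channels by per-pixel likelihood."""
    h_new = hue_tgt + delta * (1.0 - lkl) * (h - hue_tgt)   # full risk -> hue_tgt
    s_new = np.clip(lkl, 0.0, 1.0)                          # saturation ~ risk level
    v_new = np.clip(v + 2.0 * (1.0 - v.max()) * lkl, 0.0, 1.0)
    return h_new, s_new, v_new

h = np.array([0.4, 0.4])
s = np.array([0.3, 0.3])
v = np.array([0.5, 0.6])
lkl = np.array([0.0, 1.0])      # second pixel carries maximum risk
h2, s2, v2 = remap_hsv(h, s, v, lkl)
```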
  • Pseudo-color mapping is used to re-map fluorescence spectral response to color content that is representative of actual tooth color. Pseudo-color mapping thus approximates tooth color and helps to highlight problem areas of the tooth where image analysis has detected significant bacterial activity, based on the fluorescence signal. Pseudo-color mapping can be done in a number of ways, as shown in the examples Figures 6A - 6C.
  • Figure 6A shows mapping of the healthy tooth content of a fluorescence image 164 to a first representative tooth color, shown in the pseudo-color mapped tooth 170 at the right. In the pseudo-color mapped tooth 170, a second representative tooth color is mapped to tooth content that appears to show considerable bacterial activity.
  • a healthy area 160 is shown in one color; the infected region is displayed in a contrasting color for emphasis, as in the example shown for a caries region 162.
  • Figure 6B shows mapping that emphasizes caries region 162 and does not emphasize healthy area 160.
  • Figure 6C shows generation of a grayscale likelihood map 166 from fluorescence image content. The practitioner is given the option to display either the pseudo-color mapped fluorescence image or the likelihood map 166.
  • At least one of the display colors is colorimetrically closer to actual tooth color than to the colors in the fluorescence image data.
  • Colorimetric proximity between two colors is defined as the three-dimensional Euclidean distance between data points for the two colors. Where colors A and B are represented in HSV form, the colorimetric distance between them can thus be computed as: D(A,B) = sqrt[(H_A - H_B)^2 + (S_A - S_B)^2 + (V_A - V_B)^2]
  • H_A, S_A, and V_A are HSV coordinates for color A and H_B, S_B, and V_B are the corresponding coordinates for color B.
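The three-dimensional Euclidean distance defined above translates directly to code:

```python
import math

def colorimetric_distance(a_hsv, b_hsv):
    """3-D Euclidean distance between two colors given as (H, S, V) tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a_hsv, b_hsv)))

d = colorimetric_distance((0.1, 0.2, 0.9), (0.1, 0.2, 0.5))
```

Note that this treats hue as a linear coordinate; an implementation concerned with hues near the angular wrap-around point would need to account for that.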
  • Because dental plaque appears red in fluorescence video, embodiments of the present disclosure also provide dental plaque detection; however, different types of lesions can be difficult to distinguish from each other using only fluorescence video content.
  • Dental plaque is a biofilm, usually pale yellowish in color when viewed under broadband visible light, that develops naturally on the teeth. Like any biofilm, dental plaque is formed by colonizing bacteria trying to attach themselves to the tooth's smooth surface. Calculus or tartar is a form of hardened dental plaque. It is caused by the continual accumulation of minerals from saliva on plaque on the teeth. Its rough surface provides an ideal medium for further plaque formation, threatening the health of the gingival tissue (gums). Brushing and flossing can remove plaque from which calculus forms; however, once formed, it is too hard and firmly attached to be removed with a toothbrush.
  • Embodiments of the present disclosure provide tools that are designed to support minimally invasive dentistry.
  • Minimally invasive dentistry strategies provide conservative treatment practices for removing only irremediably damaged tooth tissue and preserving as much sound tooth material as is possible, including tissue that can be re-mineralized in many cases.
  • FIGS. 7A, 7B, and 7C show a dental imaging apparatus 12 for supporting minimally invasive dentistry with images of tooth 20 that is being treated.
  • Camera 30, transmission link 32, processor 40, display 42, and operator interface device 44 have the functions described for imaging apparatus 10 in Figure 1.
  • Display 42 shows an infected area 56, highlighted on a pseudo-color image 64 of tooth 20, generated as described previously.
  • An optional positioning fixture 34 enables re-positioning of camera 30 in position for acquiring fluorescence images of tooth 20. This allows a sequence of imaging at a specific position, treatment of the tooth, and subsequent repositioning of the camera 30 and re-imaging at the same position.
  • positioning fixture 34 can be provided, including a hinged fixture, as suggested in Figures 7A-7C, a pivotable arm, or some other mechanical device that repositions the camera 30 at or near the same position relative to the patient's tooth 20.
  • An automated repositioning device can be provided for fine-tuning the position of the camera to align with the preceding position.
  • image processing is used to indicate re-positioning of the camera 30 using feature recognition techniques known to those skilled in the image processing arts.
  • Display 42 shows the last updated pseudo-color image until camera 30 position for imaging the tooth is restored. This provides a reference image for the practitioner during patient treatment, when the camera 30 is moved away from the patient for better access by the practitioner.
  • Alignment is achieved when the overlap of the tooth image at the current camera position with that at the preceding position exceeds a predetermined value, e.g., 98%. This computation is performed by a feature-matching algorithm inside processor 40.
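The alignment test can be sketched as an overlap ratio between segmented tooth masks; intersection-over-union is one reasonable overlap measure, assumed here for illustration.

```python
import numpy as np

def overlap_ratio(mask_prev, mask_curr):
    """Intersection-over-union of two boolean tooth masks."""
    inter = np.logical_and(mask_prev, mask_curr).sum()
    union = np.logical_or(mask_prev, mask_curr).sum()
    return inter / union if union else 1.0

def is_aligned(mask_prev, mask_curr, threshold=0.98):
    return overlap_ratio(mask_prev, mask_curr) >= threshold

a = np.zeros((10, 10), bool)
a[2:8, 2:8] = True                      # segmented tooth at the stored position
b = np.roll(a, 1, axis=1)               # tooth seen from a slightly shifted position
aligned_same = is_aligned(a, a)
aligned_shifted = is_aligned(a, b)
```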
  • An optional sensor 36 can also be used to provide a signal that indicates when camera 30 is in suitable position to resume imaging. Fluorescence imaging can re-commence automatically when proper camera 30 position is sensed.
  • Display 42 shows image 64 and updates image 64 during image acquisition.
  • While camera 30 is moved out of imaging position, image acquisition is suspended and display 42 is unchanged.
  • infected area 56' is reduced in size as infected tooth tissue is removed.
  • an outlined area 58 on display 42 shows the boundaries of the originally identified area of decay or other infection.
  • The logic flow diagram of Figure 8 shows a processing sequence for image acquisition and display as shown in Figures 7A-7C.
  • the acquired video stream includes fluorescence image content.
  • a processing step 410 identifies areas of the tooth that are infected and exhibit bacterial activity according to spectral contents of the fluorescence image data.
  • A display step 420 displays the tooth with bacterial activity highlighted, as described with reference to Figures 7A-7C.
  • a suspend imaging step 430 suspends imaging acquisition during treatment.
  • a decision step 440 then allows the practitioner to determine whether or not to continue treatment of the tooth 20 or to move on to the next tooth.
  • Processor 40 tracks the relative amount of detectable bacterial activity and provides a status message, signal, or display highlighting that indicates that excavation or other treatment appears completed or is at least nearer to completion.
  • processor 40 automatically senses repositioning of the camera 30 performed by the viewer and resumes image acquisition when camera 30 is aligned with the previous imaging frames, using image correlation and alignment techniques known to those in the image processing arts. After repositioning of camera 30, the imaging logic compares the results of the current imaging session with results from previous imaging. The processor can then determine whether or not removal of infected tooth material is completed within an area and can report this to the practitioner.
  • the system displays a sequence of time-stamped images 432a, 432b, 432c that document the progress of excavation or other treatment.
  • a time stamp 88 or other timing or sequence indicator provides tracking of treatment progress.
  • the image processor further analyzes the fluorescence image data and provides a signal that indicates completion of treatment of an infected area of a tooth.
  • Completion can be detected, for example, by comparison of a series of images obtained at intervals over the treatment procedure. Successive images can be overlaid, partially overlaid as shown in Figure 9, or presented side-by-side in sequence.
  • a dental drill 38 is coupled to intraoral imaging camera 30 as part of a dental treatment instrument 48.
  • with dental instrument 48 having this configuration, a practitioner can have the advantage of imaging during treatment activity, rather than requiring the camera 30 to pause in imaging while the practitioner drills or performs some other type of procedure.
  • camera 30 clips onto drill 38 or other type of instrument 48, allowing the camera to be an optional accessory for use where it is advantageous and otherwise removable from the treatment tool.
  • Camera 30 can similarly be clipped to other types of dental instruments, such as probes, for example.
  • Camera 30 can also be integrally designed into the drill or other instrument 48, so that it is an integral part of the dental instrument 48. Camera 30 can be separately energized from the dental instrument 48 so that image capture takes place with appropriate timing. Display 42 can indicate the position of the dental instrument 48 relative to a particular tooth 20 in the captured image, as indicated by a cross-hairs symbol 66 in Figure 10.
  • Exemplary types of dental instruments 48 for coupling with camera 30 include drills, probes, inspection devices, polishing devices, excavators, scalers, fastening devices, and plugging devices.
  • dental instrument 48 is a drill that has an integral camera 30 that images tooth 20 during drill operation, such as when actively drilling or when drilling is stopped momentarily for repositioning or for evaluation of progress. Imaging can be suspended when the drill is removed from the tooth, such as for inspection by the practitioner.
  • the display 42 can be used for inspection instead of requiring drill removal as is done in conventional practice. Images on display 42 can be used to guide ongoing drilling operation, such as by highlighting areas that have not yet undergone treatment or areas where additional drilling may be needed. Text annotation or audio signals may be provided to help guide drill operation by the practitioner.
  • the present invention utilizes a computer program with stored instructions that operate on image data accessed from an electronic memory.
  • a computer program of an embodiment of the present invention can be utilized by a suitable, general-purpose computer system, such as a personal computer or workstation.
  • many other types of computer systems can be used to execute the computer program of the present invention, including networked processors.
  • the computer program for performing the method of the present invention may be stored in a computer readable storage medium.
  • This medium may comprise, for example: magnetic storage media such as a magnetic disk (a hard drive or removable device) or magnetic tape; optical storage media such as an optical disc, optical tape, or machine readable bar code; solid state electronic storage devices such as random access memory (RAM) or read only memory (ROM); or any other physical device or medium employed to store a computer program.
  • the computer program for performing the method of the present invention may also be stored on computer readable storage medium that is connected to the image processor by way of the internet or other communication medium. Those skilled in the art will readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
  • “computer-accessible memory” in the context of the present disclosure can refer to any type of temporary or more enduring data storage workspace used for storing and operating upon image data and accessible to a computer system, including a database, such as database 50 described with reference to Figure 5A, for example.
  • the memory could be non-volatile, using, for example, a long-term storage medium such as magnetic or optical storage. Alternately, the memory could be of a more volatile nature, using an electronic circuit, such as random-access memory (RAM) that is used as a temporary buffer or workspace by a microprocessor or other control logic processor device. Displaying an image requires memory storage.
  • Display data, for example, is typically stored in a temporary storage buffer that is directly associated with a display device and is periodically refreshed as needed in order to provide displayed data.
  • This temporary storage buffer can also be considered to be a memory, as the term is used in the present disclosure.
  • Memory is also used as the data workspace for executing and storing intermediate and final results of calculations and other processing.
  • Computer-accessible memory can be volatile, non-volatile, or a hybrid combination of volatile and non-volatile types.
  • a method for imaging a tooth executed at least in part by a computer can include illuminating the tooth and acquiring fluorescence image data from the tooth;
  • calculating a risk condition for the tooth according to the fluorescence image data; mapping two or more display colors to areas of the tooth according to the calculated risk condition to form a pseudo-color mapped tooth; and displaying, storing, or transmitting the pseudo-color mapped tooth.
  • a method for imaging a tooth executed at least in part by a computer can include illuminating the tooth and acquiring a plurality of frames of fluorescence image data from the tooth on a camera that is coupled with a dental drill; processing each of two or more of the plurality of frames by calculating a risk condition for at least one portion of the tooth according to the fluorescence image data; mapping one or more display colors to the at least one portion of the tooth according to the calculated risk condition to form a pseudo-color mapped tooth; and displaying the pseudo-color mapped tooth and updating the display one or more times during drill operation.
  • the fluorescence image data is obtained from a video stream.
  • the method can include forming a likelihood map that shows the calculated risk condition for each of a plurality of teeth.
  • acquiring fluorescence image data further includes applying motion compensation to one or more individual frames in the video stream.
  • the method can include combining a plurality of the individual frames in order to calculate the risk condition.
  • the method can include transforming the fluorescence image data from RGB color data to hue- saturation- value data.
  • the calculated risk condition relates to a proportion of red image pixels in the fluorescence image data.
  • the calculated risk condition relates to a level of bacterial activity that is indicated by the fluorescence image data.
  • At least one of the display colors is colorimetrically closer to actual tooth color than to the colors in the fluorescence image.
  • illuminating the tooth is performed using a solid-state light source.
  • the fluorescence image data is in at least two non-overlapping spectral bands.
  • calculating the risk condition further includes using one or more trained classifiers obtained from a memory that is in signal communication with the computer.
  • a dental imaging apparatus for use during treatment of a patient's tooth, can include an intra-oral camera configured to acquire a video image data stream of fluorescence images from the tooth; a positioning fixture disposed to removably position the camera in an imaging position for the tooth and in a treatment position during treatment; and an image processor in signal communication with a display and disposed to display, during treatment, images processed from the video image data stream acquired by the intra-oral camera.
  • the positioning fixture is hinged or pivoted.
  • One exemplary embodiment further includes a sensor that provides a signal indicating when the camera is in the imaging position.
  • the display highlights one or more areas of a tooth for treatment.
  • the image processor further analyzes the fluorescence image data obtained during a treatment session and provides a signal that indicates completion of treatment of an infected area of a tooth.
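The HSV transformation and red-pixel proportion mentioned in the embodiments above can be illustrated with a short sketch. This is not the patented algorithm; the hue band, saturation cutoff, and function name below are illustrative assumptions only.

```python
import colorsys

def red_pixel_risk(rgb_pixels, red_hue_band=(0.95, 0.05), min_saturation=0.3):
    """Estimate a risk value as the proportion of red fluorescence pixels.

    rgb_pixels: iterable of (r, g, b) tuples with channels in [0, 1].
    red_hue_band and min_saturation are illustrative thresholds (assumptions).
    Returns a value in [0, 1]; a higher value means more red fluorescence,
    which the disclosure associates with bacterial activity.
    """
    red_count = 0
    total = 0
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)  # transform RGB -> HSV
        total += 1
        # Red hue wraps around 0, so accept hues near 1.0 or near 0.0.
        if s >= min_saturation and (h >= red_hue_band[0] or h <= red_hue_band[1]):
            red_count += 1
    return red_count / total if total else 0.0

# Mostly green (healthy) fluorescence with a few red (bacterial) pixels:
pixels = [(0.1, 0.8, 0.1)] * 8 + [(0.9, 0.1, 0.1)] * 2
print(red_pixel_risk(pixels))  # 0.2
```

In an actual system the thresholds would be tuned against clinical data, or replaced entirely by the trained classifiers that the embodiments mention.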

Abstract

A system and method for imaging a tooth. The method illuminates the tooth and acquires fluorescence image data from the tooth. A risk condition for the tooth is calculated according to the fluorescence image data. Two or more display colors are mapped to areas of the tooth according to the calculated risk condition to form a pseudo-color mapped tooth that is displayed.

Description

VIDEO DETECTION OF TOOTH CONDITION
USING GREEN AND RED FLUORESCENCE
TECHNICAL FIELD
The disclosure relates generally to the field of diagnostic imaging and more particularly relates to methods and apparatus for
detection and display of suspected tooth lesions based on response of the tooth to incident light.
BACKGROUND
While there have been improvements in detection, treatment and prevention techniques, dental caries remains a prevalent condition affecting people of all age groups. If not properly and promptly treated, caries could lead to permanent tooth damage and even to loss of teeth.
Traditional methods for caries detection include visual examination and tactile probing with a sharp dental explorer device, often assisted by radiographic (x-ray) imaging. Detection using these methods can be somewhat subjective, varying in accuracy due to many factors, including practitioner expertise, location of the infected site, extent of infection, viewing conditions, accuracy of x-ray equipment and processing, and other factors. There are also hazards associated with conventional detection techniques, including the risk of damaging weakened teeth and spreading infection with tactile methods as well as exposure to x-ray radiation. By the time a caries condition is evident under visual and tactile examination, the disease is generally in an advanced stage, requiring a filling and, if not timely treated, possibly leading to tooth loss.
In response to the need for improved caries detection methods, there has been considerable interest in improved imaging techniques that do not employ x-rays. One method employs fluorescence wherein teeth are illuminated with high intensity blue light. This technique, sometimes termed quantitative light-induced fluorescence (QLF), operates on the principle that sound, healthy tooth enamel yields a higher intensity of fluorescence under excitation from some wavelengths than does de-mineralized enamel that has been damaged by caries infection. The correlation between mineral loss and loss of fluorescence for blue light excitation is then used to identify and assess carious areas of the tooth. A different relationship has been found under blue or red light excitation: in these regions of the spectrum, bacterial by-products in carious regions absorb and fluoresce more pronouncedly than do healthy areas.
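The QLF principle described above — carious enamel fluoresces less intensely than sound enamel under blue excitation — is commonly quantified as a relative fluorescence loss. A minimal sketch of that computation follows; the function name and the use of mean green-channel intensities are illustrative assumptions, not details of any particular QLF product.

```python
def fluorescence_loss(lesion_intensity, sound_intensity):
    """Relative loss of green fluorescence, often denoted dF in QLF work.

    Both arguments are mean green-channel intensities for a lesion region
    and a sound-enamel reference region. The result is the fractional drop
    relative to sound enamel (0 = no loss, 1 = total loss).
    """
    if sound_intensity <= 0:
        raise ValueError("sound reference intensity must be positive")
    return (sound_intensity - lesion_intensity) / sound_intensity

# A lesion fluorescing at 120 against a sound-enamel reference of 160:
print(fluorescence_loss(120.0, 160.0))  # 0.25
```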
Applicants note some references related to optical detection of caries.
U.S. 4,515,476 (Ingmar) describes the use of a laser for providing excitation energy that generates fluorescence at some other wavelength for locating carious areas.
U.S. 6,231,338 (de Josselin de Jong) describes an imaging apparatus for identifying dental caries using fluorescence detection.
U.S. 2004/0240716 (de Josselin de Jong) describes methods for improved image analysis for images obtained from fluorescing tissue.
U.S. 4,479,499 (Alfano) describes a method for using transillumination to detect caries based on the translucent properties of tooth structure.
Among products for dental imaging using fluorescence behavior is the QLF Clinical System from Inspektor Research Systems BV, Amsterdam, The Netherlands. The Diagnodent Laser Caries Detection Aid from KaVo Dental
Corporation, Lake Zurich, Illinois, USA, detects caries activity by monitoring the intensity of red fluorescence of bacterial by-products under illumination from red light.
U.S. 2004/0202356 (Stookey) describes mathematical processing of spectral changes in fluorescence in order to detect caries in different stages with improved accuracy. Acknowledging the difficulty of early detection when using spectral fluorescence measurements, the '2356 Stookey et al. disclosure describes approaches for enhancing the spectral values obtained, effecting a transformation of the spectral data that is adapted to the spectral response of the camera that obtains the fluorescence image.
In U.S. 2008/0056551, a method and apparatus that employs both the reflectance and fluorescence images of the tooth is used to detect caries. It takes advantage of the back-scattering, or reflectance, observed for incipient caries, in combination with fluorescence effects, to provide a dental imaging technique to detect caries.
Existing systems and proposed solutions for automated caries detection have shown some success, but there is considered to be room for improvement. Individual detection apparatus are typically optimized for narrow functions such as advanced caries detection, while not providing assessment of other tooth conditions, such as incipient caries, plaque, and calculus. In addition, there is a lack of caries analysis tools designed to assist the practitioner during caries removal, such as during drilling, to help ensure that irreparably damaged tooth tissue is completely removed while healthy tooth structure is not
unnecessarily excavated.
One shortcoming of existing dental imaging systems relates to the delay period between the time that the tooth is initially being screened and the image of the tooth is obtained and the time a possible caries condition is identified or reported to the dentist or technician. With existing systems, tooth screening (during which the images are obtained) and caries detection (during which the images are processed and analyzed to identify carious regions) are carried out as two separate steps. In practice, at an appropriate point during screening, a still image capture is first obtained from the tooth in response to an operator instruction. Then, in a subsequent step, the image data are processed and analyzed for carious conditions to provide the clinician with a processed image (possibly also accompanied by a report) indicating caries information, such as apparent location, size, and severity, for example. This caries information is available only at a later time, after the conclusion of the tooth screening step and only after image processing/analysis steps are completed.
When the caries information becomes available at this later time after screening, the dentist often needs to go back and re-examine the imaged tooth in order to look more closely at the reported problem area. This delay is inconvenient and lengthens the duration of the examination session. It can be appreciated that there would be an advantage to an apparatus that would provide more immediate feedback to the examining practitioner, so that problem areas can be identified and examined more closely at the time of screening. However, this advantage is not available with conventional systems, due to factors such as the difficulty of detection, the intensive computation requirements needed for many existing detection methods, and the amount of image data that is required for each tooth.
In spite of some advancements, an acknowledged problem with real-time detection for existing dental imaging systems relates to the difficulty of identifying caries in teeth images without extensive image processing or absent a highly skilled practitioner who is familiar with this specialized equipment.
Systems such as the QLF system described earlier may show real-time
fluorescence images, but these displayed images are generally only of value to the experienced clinician who is trained in interpreting the displayed image from tooth fluorescence in order to identify a caries area. In general, caries detection from tooth images, whether using white light or fluorescence images, requires a relatively high level of expertise from the practitioner. Auto-detection by computer-aided image analysis can eliminate the expertise requirement.
However, because current auto-detection algorithms usually involve time-consuming image processing, they are not suitable for real-time identification of caries.
It can be appreciated that there would be advantages to a method of image processing that can quickly identify carious areas from teeth images to provide immediate feedback of information suggestive of carious conditions. Such a method would allow auto-detection of caries in real time that would be useful even for the novice or relatively untrained user and provide a capable tool for use during a treatment session.
U.S. 2009/0185712 (Wong) describes using region growing and global threshold methods to determine or segment tooth areas and caries areas, respectively, at video rate based on either the reflectance image or the
demineralization-related green signal from the fluorescence image.
U.S. 8,311,302 provides a method for real-time identification and highlighting of suspicious caries lesions in white light video images with reduced sensitivity to illumination variation. Neither method, however, takes advantage of information that is indicative of bacterial activity, obtained from the red signal content of the fluorescence image.
Thus, it can be seen that there is a need for an improved method for identifying caries and other conditions in dental images obtained while the patient is positioned for dental treatment.
SUMMARY
An object of the present disclosure is to provide apparatus and methods for identifying and quantifying caries and other disease conditions in digital images of a tooth. Caries identification and analysis can be executed automatically to assist the practitioner before, during, and following treatment of the patient.
These objects are given only by way of illustrative example, and such objects may be exemplary of one or more embodiments of the invention. Other desirable objectives and advantages inherently achieved by the disclosed invention may occur or become apparent to those skilled in the art. The invention is defined by the appended claims.
According to one aspect of the disclosure, there is provided a method for imaging a tooth, the method executed at least in part by a computer and comprising: a) illuminating the tooth and acquiring fluorescence image data from the tooth; b) calculating a risk condition for the tooth according to the fluorescence image data; c) mapping two or more display colors to areas of the tooth according to the calculated risk condition to form a pseudo-color mapped tooth; and d) displaying the pseudo-color mapped tooth.
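Steps b) and c) above — calculating a risk condition and mapping display colors to form the pseudo-color mapped tooth — can be sketched as follows. The three-color map, the threshold values, and the per-area representation are illustrative assumptions only, not the claimed method.

```python
def pseudo_color_map(risk_map, low_threshold=0.3, high_threshold=0.7):
    """Map per-area risk values to display colors for a pseudo-color tooth.

    risk_map: dict of area name -> risk value in [0, 1].
    Returns a dict of area name -> (r, g, b) display color (0-255 channels).
    Colors and thresholds are illustrative assumptions; note that the
    low-risk color is chosen colorimetrically close to actual tooth color,
    as one of the embodiments suggests.
    """
    TOOTH_WHITE = (245, 240, 230)   # near actual tooth color (low risk)
    CAUTION_YELLOW = (255, 200, 0)  # moderate risk highlight
    ALERT_RED = (220, 30, 30)       # high risk highlight
    mapped = {}
    for area, risk in risk_map.items():
        if risk >= high_threshold:
            mapped[area] = ALERT_RED
        elif risk >= low_threshold:
            mapped[area] = CAUTION_YELLOW
        else:
            mapped[area] = TOOTH_WHITE
    return mapped

colors = pseudo_color_map({"occlusal": 0.8, "mesial": 0.4, "distal": 0.1})
print(colors["occlusal"])  # (220, 30, 30)
```

The same mapping applied per pixel, rather than per named area, would produce the pseudo-color image that step d) displays.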
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of the embodiments of the invention, as illustrated in the accompanying drawings.
The elements of the drawings are not necessarily to scale relative to each other.
Figure 1 is a schematic block diagram that shows a dental imaging apparatus for detection of caries and other tooth conditions according to an embodiment of the present invention.
Figure 2A is a schematic diagram that shows the activity of fluoresced green light for caries detection.
Figure 2B is an image that shows an advanced caries condition detected according to an embodiment of the present invention.
Figure 3A is a schematic diagram that shows the behavior of fluoresced red light for caries detection.
Figure 3B is an image that shows incipient caries detected according to an embodiment of the present invention.
Figure 4 is a diagram that shows the overall arrangement of color space using the hue-saturation-value (HSV) model.
Figure 5 is a process flow diagram that shows steps for processing acquired image data for caries detection in video-image mode.
Figure 6A shows processing of the video fluorescence image to generate a pseudo color image according to an embodiment of the present disclosure.
Figure 6B shows processing of the video fluorescence image to generate a pseudo color image according to an alternate embodiment of the present disclosure.
Figure 6C shows processing of the video fluorescence image to generate a grayscale likelihood image according to an embodiment of the present disclosure.
Figure 7A is a schematic diagram that shows an imaging apparatus for providing images supporting minimally invasive treatment according to an embodiment of the present disclosure.
Figure 7B is a schematic diagram that shows the imaging apparatus of Figure 7A with the camera moved out of imaging position.
Figure 7C is a schematic diagram that shows the imaging apparatus of Figure 7A with the camera repositioned in imaging position.
Figure 8 is a logic flow diagram showing an imaging procedure for supporting minimally invasive treatment.
Figure 9 shows a sequence of time-stamped images displayed to support patient treatment.
Figure 10 is a schematic diagram that shows an alternate embodiment of the present invention where a dental instrument is directly mounted (e.g., integral) to an intra-oral imaging camera as part of a dental treatment instrument.
DETAILED DESCRIPTION OF THE EMBODIMENTS
This application claims the benefit of U.S. Provisional application U.S. Serial No. 62/075,283, provisionally filed on November 5, 2014, entitled "DETECTION OF TOOTH CONDITION USING REFLECTANCE IMAGES WITH RED AND GREEN FLUORESCENCE", in the names of Yingqian Wu et al., and U.S. Serial No. 62/075,284, provisionally filed on November 5, 2014, entitled "VIDEO DETECTION OF TOOTH CONDITION USING GREEN AND RED FLUORESCENCE", in the names of Yingqian Wu et al., which are incorporated herein by reference in their entirety.
The following is a detailed description of the preferred embodiments of the invention, reference being made to the drawings in which the same reference numerals identify the same elements of structure in each of the several figures.
Reference is made to commonly assigned PCT/CN2009/000078, filed on Jan. 20, 2009, entitled METHOD FOR DETECTION OF CARIES, by Wei Wang et al.
Reference is made to commonly assigned U.S. Patent Application Publication No. 2008/0056551, published Mar. 6, 2008, entitled METHOD FOR DETECTION OF CARIES, by Wong et al. Reference is made to commonly assigned U.S. Patent Application Publication No. 2008/0063998, published Mar. 13, 2008, entitled APPARATUS FOR CARIES DETECTION, by Liang et al.
Reference is made to U.S. Patent Application Publication No. 2008/0170764, published July 17, 2008, entitled SYSTEM FOR EARLY
DETECTION OF DENTAL CARIES, by Burns et al.
Reference is made to U.S. Patent Publication No. 2007/0099148, published on May 3, 2007, entitled METHOD AND APPARATUS FOR
DETECTION OF CARIES, by Wong et al.
Reference is made to commonly assigned U.S. Patent Publication
No. 2013/0038710, published on Feb. 14, 2013, entitled IDENTIFICATION OF DENTAL CARIES IN LIVE VIDEO IMAGES, by Inglese et al.
Reference is made to commonly assigned U.S. Patent Publication No. 2012/0148986, published on June 14, 2012, entitled METHOD FOR
IDENTIFICATION OF DENTAL CARIES IN POLYCHROMATIC IMAGES, by Yan et al.
Reference is made to commonly assigned U.S. Patent Publication No. 2009/0185712, published July 23, 2009, entitled METHOD FOR REALTIME VISUALIZATION OF CARIES CONDITION, by Wong et al.
Where they are used in the context of the present disclosure, the terms "first", "second", and so on, do not necessarily denote any ordinal, sequential, or priority relation, but are simply used to more clearly distinguish one step, element, or set of elements from another, unless specified otherwise.
In the context of the present disclosure, the term "optics" is used generally to refer to lenses and other refractive, diffractive, and reflective components used for shaping a light beam.
In the context of the present disclosure, the terms "viewer", "operator", and "user" are considered to be equivalent and refer to the viewing practitioner, technician, or other person who views and manipulates an image, such as a dental image, on a display monitor. An "operator instruction" or
"viewer instruction" is obtained from explicit commands entered by the viewer, such as by clicking a button on a camera or by using a computer mouse or by touch screen or keyboard entry.
The term "highlighting" for a displayed feature has its conventional meaning as is understood to those skilled in the information and image display arts. In general, highlighting uses some form of localized display enhancement to attract the attention of the viewer. Highlighting a portion of an image, such as an individual tooth or a set of teeth or other structure(s) can be achieved in any of a number of ways, including, but not limited to, annotating, displaying a nearby or overlaying symbol, outlining or tracing, display in a different color or at a markedly different intensity or gray scale value than other image or information content, blinking or animation of a portion of a display, or display at higher sharpness or contrast.
An image is displayed according to image data that can be acquired by a camera or other device, wherein the image data represents the image as an ordered arrangement of pixels. Image content may be displayed directly from acquired image data or may be further processed, such as to combine image data from different sources or to highlight various features of tooth anatomy represented by the image data, for example. As used in the context of the present disclosure, the terms "image" and "image data" are generally synonymous.
The term "at least one of" is used to mean one or more of the listed items can be selected. The term "about" indicates that the value listed can be somewhat altered, as long as the alteration does not result in nonconformance of the process or structure to the illustrated embodiment. The term "exemplary" indicates that a particular description or instance is used by way of example, rather than implying that it is an ideal.
The described invention includes calculation steps. Those skilled in the art will recognize that these calculation steps may be performed by data processing hardware that is provided with instructions for image data processing. Because such image manipulation systems are well known, the present description is directed more particularly to algorithms and systems that execute the method of the present invention. Other aspects of such algorithms and systems, and data processing hardware and/or software for producing and otherwise processing the image signals may be selected from such systems, algorithms, components and elements known in the art. Given the description as set forth in the following specification, software implementation lies within the ordinary skill of those versed in the programming arts.
The stored instructions of such a software program may be stored in a computer readable storage medium, which may comprise, for example:
magnetic storage media such as a magnetic disk or magnetic tape; optical storage media such as an optical disc, optical tape, or machine readable bar code; solid state electronic storage devices such as random access memory (RAM), or read only memory (ROM); or any other physical device or medium employed to store a computer program. Using such software, the present invention can be utilized on a data processing hardware apparatus, such as a computer system or personal computer, or on an embedded system that employs a dedicated data processing component, such as a digital signal processing chip.
In this disclosure, the word "intensity" is used to refer to light level, and is also broadly used to refer to the value of a pixel in a digital image.
As noted in the preceding background section, it is known that fluorescence can be used to detect dental caries using either of two characteristic responses: First, excitation by a blue light source causes healthy tooth tissue to fluoresce in the green spectrum, between about 500 and 550 nm. Tooth material that has been damaged may fluoresce at a lower intensity or may not fluoresce perceptibly. Secondly, excitation by a blue or red light source can cause bacterial by-products, such as those indicating caries, to fluoresce in the red spectrum, above 600 nm. Some existing caries detection systems use fluorescence of either type.
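The two characteristic responses just described — green fluorescence (about 500-550 nm) from healthy enamel and red fluorescence (above 600 nm) from bacterial by-products — can be combined into a simple per-pixel indicator. The ratio threshold and category labels in this sketch are illustrative assumptions, not clinical constants.

```python
def classify_fluorescence(green_band, red_band, red_ratio_threshold=0.6):
    """Classify a pixel from its two fluorescence band intensities.

    green_band: intensity in roughly 500-550 nm (healthy enamel fluoresces here).
    red_band: intensity above roughly 600 nm (bacterial by-products fluoresce here).
    The ratio threshold is an illustrative assumption.
    """
    total = green_band + red_band
    if total == 0:
        return "no-signal"          # damaged tissue may not fluoresce perceptibly
    if red_band / total >= red_ratio_threshold:
        return "suspect-bacterial"  # red dominates: possible caries activity
    return "healthy"

print(classify_fluorescence(green_band=200, red_band=20))  # healthy
print(classify_fluorescence(green_band=30, red_band=90))   # suspect-bacterial
```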
For an understanding of how light is used in the present invention, it is important to give more precise definition to the terms "reflectance" and "back-scattering" as they are used in biomedical applications in general and, more particularly, in the method and apparatus of the present invention. In broadest optical terminology, reflectance generally denotes the sum total of both specular reflectance and scattered reflectance. (Specular reflection is that component of the excitation light that is reflected by the tooth surface at the same angle as the incident angle.) In many biomedical applications, however, as in the dental application of the present invention, the specular component of reflectance is of no interest and is, instead, generally detrimental to obtaining an image or measurement from a sample. The component of reflectance that is of interest for the present application is from back-scattered light only. Specular reflectance is blocked or otherwise removed from the imaging path. With this distinction in mind, the term "back-scattered reflectance" is used in the present application to denote the component of reflectance that is of interest. "Back-scattered reflectance" is defined as that component of the excitation light that is elastically back-scattered over a wide range of angles by the illuminated tooth structure. "Reflectance image" data, as this term is used in the present disclosure, refers to image data obtained from back-scattered reflectance only, since specular reflectance is blocked or kept to a minimum. In the scientific literature, back-scattered reflectance may also be referred to as back-reflectance or simply as back-scattering. Back-scattered reflectance is in the visible spectrum (approximately 390-700 nm) at the same wavelength as the excitation light.
It has been shown that light scattering properties differ between healthy and carious dental regions. In particular, reflectance of light from the illuminated area can be at measurably different levels for normal versus carious areas. This change in reflectance, taken alone, may not be sufficiently
pronounced to be of diagnostic value when considered by itself, since this effect is very slight, although detectable. For more advanced stages of caries, for example, back-scattered reflectance may be less effective an indicator than at earlier stages.
The schematic block diagram of Figure 1 shows a dental imaging apparatus 10 for detection of caries and other tooth conditions during a patient treatment session according to an embodiment of the present invention. An intraoral camera 30 is used for imaging tooth 20, providing the different illumination sources needed for both reflectance and fluorescence imaging, with appropriate spectral filters and other optics, detector components, and other elements. Camera 30 components for a camera that obtains both reflectance and fluorescence images are described, for example, in commonly assigned U.S. Patent No. 7596253 entitled "Method and Apparatus for Detection of Caries" to Wong et al. The Wong et al. '253 patent describes a FIRE (Fluorescence Imaging with Reflective
Enhancement) method that combines reflective image data with a portion of the fluorescent content. Similar types of camera devices are familiar to those skilled in the dental imaging arts. The camera illumination source may be a solid state emissive device, such as a light-emitting diode (LED) or scanned laser, for example. Camera 30 provides image data to an external processor 40 over a transmission link 32, which may be a wired or wireless link. Processor 40 has an associated display 42 for display of the acquired and processed images. An operator interface device 44, such as a keyboard with a mouse or other pointer or touchscreen, allows entry of instructions for camera 30 operation. Optionally, one or more operator controls 41 are provided on the camera 30 handset.
Embodiments of the present invention utilize fluorescence response in at least two different spectral bands. The two spectral bands may overlap or may be essentially non-overlapping, with spectral bands centered at different wavelengths. Figure 2A shows information that is provided from fluorescence in the green spectral band. Excitation light 50 of blue and near UV wavelengths (nominally about 400 nm according to an embodiment of the present disclosure) is directed toward tooth 20 with an outer enamel layer 22 and inner dentine 24. Fluoresced light 52 of green wavelengths, approximately in the range from 500-550 nm, is detected from portions of the tooth 20 having normal mineral content, not exhibiting perceptible damage from decay. In the representation shown in Figure 2A, a demineralized area 26 is more opaque than healthy enamel and tends to block the incident excitation light 50 as well as to block back-scattered fluorescent light from surrounding enamel. This effect is used by the FIRE method described in the Wong et al. '253 patent, wherein the fluorescence green channel data is combined with reflectance image data to heighten the contrast of caries regions.
Figure 2B shows an early caries condition detected for tooth 20 using the FIRE method, according to an embodiment of the present invention. An area 28, circled in Figure 2B, shows suspected caries.
The fluoresced red light has different significance from that of green fluorescence, indicating the presence of bacterial metabolic products. Bacteria that cause a caries lesion, plaque, or tartar typically generate byproducts that fluoresce in the red spectrum, above about 600 nm. Figure 3A shows the behavior of fluoresced red light 53 for caries detection. Here, a caries lesion 54 has significant bacterial activity, evidenced by emission of perceptible amounts of fluoresced light 53 in the red spectral region in response to excitation light 50. With proper filtering of the fluorescent light, this red wavelength emission indicates an active lesion 54, as circled in Figure 3B.
Video Imaging Mode
According to an embodiment of the present disclosure, dental imaging apparatus 10 uses a camera 30 that can operate in two modes of imaging operation, still mode and video mode, using both reflectance and fluorescence imaging. The present disclosure is directed primarily to image acquisition and processing during video imaging mode, in which a video stream of image data is obtained.
Embodiments of the present disclosure take advantage of both green and red spectral components of fluorescence image data for detecting conditions such as cavities or plaque.
Camera 30 (Figure 1) typically captures color images in a tristimulus Red-Green-Blue (RGB) representation, using conventional types of CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) sensor devices. However, according to an aspect of the present disclosure, the color image data is converted from tristimulus RGB to a polar-coordinate color model for simplifying calculation and conversion. The diagram of Figure 4 shows the overall arrangement of color space using the hue-saturation-value (HSV) model. In HSV color representation, Hue is provided with an angular coordinate. Coordinate position along a central axis represents color Value. Saturation is represented by coordinate distance from the central Value axis. Three-dimensional HSV coordinates are represented in a perspective view 74. A 2-D view 76 shows Hue and Value coordinate space. A circular view 78 shows Hue angular relationship at a halfway point along the Value axis. A 2-D view 72 shows a slice taken through perspective view 74 to show Saturation with respect to Value. Transformation of the RGB data from the camera detector to HSV data uses algorithmic methods for color conversion that are familiar to those skilled in the color imaging arts.
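As an illustrative sketch (not part of the disclosed apparatus), the per-pixel RGB-to-HSV transformation described above can be performed with Python's standard colorsys module; the wrapper function name is a hypothetical choice:

```python
import colorsys

def rgb_to_hsv_pixel(r, g, b):
    """Convert one pixel from RGB (each channel in [0, 1]) to HSV.

    Hue is returned as a fraction of the full hue circle, matching the
    angular Hue coordinate of Figure 4; Saturation and Value are in [0, 1].
    """
    return colorsys.rgb_to_hsv(r, g, b)

# Pure red maps to hue 0 with full saturation and value.
print(rgb_to_hsv_pixel(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0)
```

Applying this conversion per pixel over a full frame yields the H, S, and V images used in the feature calculation described below.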
RGB values are normalized to compensate for variation in illumination levels. One or more additional images are prepared for reference in subsequent processing. For example, an RG ratio image can be generated, in which each pixel has a value proportional to the ratio of red (R) to green (G) values in the initial RGB value assignment from the imaging detector of camera 30. In addition, the tooth Region of Interest (ROI) is extracted using any of a number of well-known segmentation techniques. As part of this segmentation, the tooth 20 is defined apart from background image content in the fluorescence image.
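The RG ratio image described above can be sketched as follows; the function name, the nested-list image representation, and the eps guard are illustrative assumptions rather than details from the disclosure:

```python
def rg_ratio_image(rgb_image, eps=1e-6):
    """Build a per-pixel red-to-green ratio map from an RGB image.

    rgb_image is a nested list of (r, g, b) tuples; eps guards against
    division by zero for pixels with no green content.
    """
    return [[r / (g + eps) for (r, g, b) in row] for row in rgb_image]

img = [[(0.8, 0.4, 0.1), (0.2, 0.2, 0.0)]]
print([[round(v, 3) for v in row] for row in rg_ratio_image(img)])  # [[2.0, 1.0]]
```

A high R/G ratio flags pixels whose red fluorescence dominates the green, which is the signature of bacterial activity discussed above.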
As noted previously, intra-oral camera 30 (Figure 1) is also capable of obtaining fluorescence images in video mode. The video mode processing sequence is shown in the flow diagram of Figure 5. For video mode image acquisition, an ultra-violet or near-UV source of camera 30 is energized. The usable UV and near-UV light range is typically above 200 nm wavelength and may extend somewhat into the blue visible light range (near about 390-450 nm). This may be a solid-state source, such as an LED or scanning laser source, for example. Processing is arranged in a modular form. In a feature calculation step 300, the color space of the video content is transformed from RGB to HSV, as described previously for still mode image content. This allows feature content to be obtained and a feature map generated using only HSV values. In addition, the tooth Region of Interest (ROI) is extracted using any of a number of well-known segmentation techniques. As part of this segmentation, the tooth 20 is defined apart from background image content in the fluorescence image data.
Feature calculation step 300
Feature calculation step 300 converts the color space of the current fluorescence video frame, I_fluo(t), from RGB to HSV. Because reflectance video is not obtained by camera 30 while fluorescence video is captured, the feature value contains fluorescent color information without any reflectance content. The color information reflects an overall risk condition; in fluorescence images, the metabolic by-product of bacterial activity is red in appearance. The feature can be calculated as follows:
1) Transform the current fluorescence video frame, I(t), from RGB to HSV color space in order to acquire the following values for each pixel (x,y):
the H image, I_h(x,y,t);
the S image, I_s(x,y,t);
the V image, I_v(x,y,t).
Each of these pixel values is in the range [0, 1].
2) For each pixel (x,y) in the H image I_h(x,y,t), calculate a mapped value, Imx(x,y,t), by:
Imx(x,y,t) = 0, if Hlo < I_h(x,y,t) < Hhi;
Imx(x,y,t) = [Hlo - I_h(x,y,t)]/Hlo, if I_h(x,y,t) ≤ Hlo;
Imx(x,y,t) = [I_h(x,y,t) - Hhi]/(1 - Hhi), if I_h(x,y,t) ≥ Hhi.
Hhi and Hlo are set corresponding to known hue angle values for healthy tissue in HSV color space, for example, 0.38 and 0.36.
3) Combine the mapped image value Imx(x,y,t) and the S image I_s(x,y,t) to acquire the image Iz(x,y,t) using a mapping function:
Iz(x,y,t) = 1/{1 + exp[-a*Imx(x,y,t)*I_s(x,y,t) + c]} + b.
Here, parameters a, b, and c are empirically determined and control the shape of the mapping function; for example, a = -7.0, b = -0.05, and c = 3.0.
4) After performing a smoothing operation with a Gaussian filter on Iz(x,y,t), the result, Ilkl(t), is termed the likelihood map.
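Steps 1) through 4) above can be sketched per pixel as follows (the Gaussian smoothing of step 4) is omitted); the parameter values are the examples given in the text, and the function names are illustrative:

```python
import math

H_LO, H_HI = 0.36, 0.38      # example healthy-hue band from the text
A, B, C = -7.0, -0.05, 3.0   # example mapping-function parameters

def hue_map(h):
    """Step 2: map hue h in [0, 1] to its deviation from the healthy band."""
    if H_LO < h < H_HI:
        return 0.0
    if h <= H_LO:
        return (H_LO - h) / H_LO
    return (h - H_HI) / (1.0 - H_HI)

def likelihood(h, s):
    """Step 3: combine the mapped hue with saturation via the logistic mapping."""
    imx = hue_map(h)
    return 1.0 / (1.0 + math.exp(-A * imx * s + C)) + B

# A hue well outside the healthy band maps to the maximum deviation of 1.
print(hue_map(0.0))  # 1.0
```

Evaluating likelihood(h, s) over every pixel of the H and S images, followed by Gaussian smoothing, produces the likelihood map Ilkl(t).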
The value in the likelihood map is calculated for each pixel, expressing a risk degree for that pixel. The values in the likelihood map, ranging over [0,1], can be used to indicate relative risk levels corresponding to levels of bacterial activity. Here, the likelihood map, calculated as described above, is termed the raw feature map for the current frame. In the sequence of Figure 5, the raw feature map is input to a global motion compensation step 310.
Global motion compensation step 310
Continuing with the sequence shown in Figure 5, global motion compensation step 310 registers content from individual image frames, such as using an affine transformation matrix and related contour tracking techniques. This helps to temporally align the feature mapping obtained in successive video frames. Because it allows multiple frames to be used in combination, alignment helps to reduce noise and provide a more consistent image input for subsequent processing.
For this function, an Iterative Closest Point (ICP)-based registration is used, chosen for its registration robustness, which is particularly useful in the intra-oral environment. Global motion compensation step 310 processing includes a number of tasks, including the following:
1) Fetch the contour of the region of interest (ROI) in the current frame, Croi(t), and previous frame, Croi(t-1), from Frame Buffer Management. Initialize a 3x3 affine transformation matrix, H(t), to a unit matrix and initialize a three-element translation vector, T(t), to the zero vector [0, 0, 0]. Then, if Croi(t-1) is valid, continue to step 2); otherwise, go to step 5).
2) Use a 2-D Iterative Closest Points (ICP) routine to calculate H(t) and T(t), i.e., the affine transformation and translation between Croi(t) and Croi(t-1). Once the iteration completes, if the error in the ICP calculation is larger than a specified threshold, the current estimation has failed and processing continues to step 5). Otherwise, if the ICP calculation error is within normal range, go to step 3).
3) From Frame Buffer Management, fetch the stored Ilkl_s(t-1) (termed the "Stored feature map of previous frame" in Figure 5). Using H(t) and T(t), perform bi-linear interpolation on Ilkl_s(t-1) to get the predicted Ilkl_p(t). In detail: for each pixel in Ilkl_p(t), use H(t) and T(t) to get the corresponding position in Ilkl_s(t-1); then use bi-linear interpolation to calculate the pixel value of Ilkl_p(t).
4) Apply a 1st-order infinite impulse response (IIR) temporal filter to Ilkl_p(t) and Ilkl(t) (termed the "Raw feature map of current frame" in Figure 5) to calculate the final output Ilkl_out(t): for each pixel (x,y), Ilkl_out(x,y,t) = τ*Ilkl_p(x,y,t) + (1-τ)*Ilkl(x,y,t). Then continue to step 6). Here the parameter τ is empirically determined and controls the degree of smoothing; in the extreme case, τ = 0 removes the filtering.
5) Set the current Ilkl(t) directly as the final output Ilkl_out(t). Then proceed to step 6).
6) Store the current Ilkl_out(t) into Frame Buffer Management for use with the next frame. The algorithmic routine for global motion compensation step 310 is then complete.
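The temporal blend of step 4) is a standard 1st-order IIR filter. A minimal per-pixel sketch follows, omitting the affine warp of step 3) and treating the feature maps as nested lists; the names and default τ are illustrative:

```python
def temporal_blend(predicted, raw, tau=0.5):
    """1st-order IIR temporal filter over two aligned feature maps.

    predicted: warped feature map of the previous frame (Ilkl_p)
    raw:       raw feature map of the current frame (Ilkl)
    tau:       smoothing degree; tau = 0 disables the filtering entirely.
    """
    return [
        [tau * p + (1.0 - tau) * r for p, r in zip(prow, rrow)]
        for prow, rrow in zip(predicted, raw)
    ]

prev_map = [[0.2, 0.8]]
curr_map = [[0.4, 0.4]]
print([[round(v, 3) for v in row] for row in temporal_blend(prev_map, curr_map)])  # [[0.3, 0.6]]
```

Larger τ weights the motion-compensated history more heavily, trading responsiveness for noise suppression across frames.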
In terms of the Figure 5 sequence, the current Ilkl_out(t) is termed the "Updated feature map for current frame". This is fed back to feature calculation step 300 and stored into Frame Buffer Management as part of a frame buffer management step 320.
Frame buffer management step 320
Frame buffer management step 320 provides management for information extracted in successive image frames as follows:
1) When the tooth Region of Interest (ROI) for the current frame is determined, the contour of the ROI, Croi(t), is stored into Frame Buffer Management.
2) When Global Motion Compensation begins, Croi(t) and Croi(t-1) are output from Frame Buffer Management processing in step 320. After Global Motion Compensation step 310 completes, Croi(t-1) is replaced by Croi(t) for use in processing the next frame.
3) When Global Motion Compensation step 310 begins, Ilkl_s(t-1) (termed the "Stored feature map of previous frame" in Figure 5) is fetched from Frame Buffer Management for current use. After Global Motion Compensation step 310 completes, Ilkl_s(t-1) is replaced with the updated Ilkl_out(t) (termed the "Updated feature map of previous frame" in Figure 5) for use in processing the next frame and calculating an adjusted calculated risk condition.
Classification step 330
An optional classification step 330 takes the feature map, Ilkl_out(t), as input and processes features using one or more trained classifiers in order to distinguish healthy tissue from infected tissue. The trained classifiers can include software designed with neural network response, conditioned to provide decision results according to previous interaction with a training set of conditions and interaction with ongoing conditions encountered in normal use. Neural networks and training software in general are known to those skilled in the software design arts. According to an embodiment of the present disclosure, a Gaussian Mixture Model (GMM) is used for the decision process with classifier software, with the number of mixture components predetermined during the training procedure. As a result of classification step 330, sound tooth tissue is distinguished from infected tissue, and the infected tissue is segmented so that only the infected portion of the tissue is highlighted. Highlighting can be done using display color, for example.
A binary assignment is provided. The segmented infected tissue region is denoted as Rrisk(x,y,t) for each pixel. A value Rrisk(x,y,t) = 1 means that the pixel (x,y) represents infected tooth material while Rrisk(x,y,t) = 0 is assigned to a pixel that indicates sound material. If optional classification is deactivated, value Rrisk(x,y,t) for each pixel is set to 1.
Image enhancement and color mapping step 340
Continuing with the processing sequence of Figure 5, an image enhancement and color mapping step 340 provides a number of processes, including color space transformation, image enhancement, and image fusion. At the conclusion of this processing, the likelihood value for each pixel in an infected tissue region is fused with the fluorescence image. Pseudo-color mapping can be used to assist in forming and displaying a pseudo-color mapped tooth for visualizing the infected areas. In step 340, color transformation is performed, transforming fluorescence image data from RGB to HSV color space in order to acquire H(x,y,t), S(x,y,t), and V(x,y,t), each ranging over [0, 1].
As part of step 340, an image enhancement sequence processes the HSV color Value (V) of the ROI to emphasize detail. The Hue value (H) is mapped to a specified range so that the high-likelihood region is more noticeable. The Saturation value (S) is set proportional to the likelihood value, obtained as described previously, so that color saturation expresses the relative risk level for affected areas.
In detail, at each pixel (x,y) of the current fluorescent video frame, the new hue value, H'(x,y,t) can be calculated as follows:
1) h_tmp = hue_tgt + δ*[1 - Ilkl_out(x,y,t)*Rrisk(x,y,t)]*[Ilkl_out(x,y,t) - hue_tgt], wherein hue_tgt is a specified target hue for the highest likelihood value and δ is an empirically determined value that controls the degree of scaling;
2) if h_tmp < 0, H'(x,y,t) = h_tmp + 1.0; otherwise, H'(x,y,t) = h_tmp.
The new saturation value, S'(x,y,t), can be calculated as follows:
1) s_tmp = S(x,y,t)*1/[1 + exp(-14*Ilkl_out(x,y,t) + 6)];
2) if s_tmp > 1, S'(x,y,t) = 1; otherwise, S'(x,y,t) = s_tmp.
The new intensity value, V'(x,y,t) can be calculated as follows:
1) v_tmp = V(x,y,t) + 2*(1 - V_max(t))*Ilkl_out(x,y,t),
wherein V_max(t) is the maximum intensity value in V(x,y,t);
2) if v_tmp > 1, V'(x,y,t) = 1; otherwise, V'(x,y,t) = v_tmp.
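The three remappings above can be sketched for a single pixel as follows; the default values for hue_tgt, delta (δ), and v_max are illustrative placeholders for the empirically determined parameters, not values from the disclosure:

```python
import math

def enhance_pixel(s, v, lkl, risk, hue_tgt=0.95, delta=0.5, v_max=1.0):
    """Remap one pixel's Hue, Saturation, and Value per the steps above.

    lkl is Ilkl_out(x,y,t); risk is Rrisk(x,y,t) in {0, 1}. Note that
    the new hue, as given in the text, depends only on the likelihood
    and target hue, not on the pixel's original hue.
    """
    h_tmp = hue_tgt + delta * (1.0 - lkl * risk) * (lkl - hue_tgt)
    h_new = h_tmp + 1.0 if h_tmp < 0 else h_tmp          # wrap hue into [0, 1]
    s_new = min(s / (1.0 + math.exp(-14.0 * lkl + 6.0)), 1.0)
    v_new = min(v + 2.0 * (1.0 - v_max) * lkl, 1.0)
    return h_new, s_new, v_new

# At maximum likelihood the hue is pulled exactly to the target hue.
print(enhance_pixel(0.5, 0.7, lkl=1.0, risk=1)[0])  # 0.95
```

The logistic term in the saturation mapping suppresses color saturation for low-likelihood pixels while leaving high-likelihood pixels nearly fully saturated.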
Pseudo-color mapping is used to re-map fluorescence spectral response to color content that is representative of actual tooth color. Pseudo-color mapping thus approximates tooth color and helps to highlight problem areas of the tooth where image analysis has detected significant bacterial activity, based on the fluorescence signal. Pseudo-color mapping can be done in a number of ways, as shown in the examples of Figures 6A-6C. Figure 6A shows mapping of the healthy tooth content of a fluorescence image 164 to a first representative tooth color, shown in the pseudo-color mapped tooth 170 at the right. In the pseudo-color mapped tooth 170, a second representative tooth color is mapped to tooth content that appears to show considerable bacterial activity. A healthy area 160 is shown in one color; the infected region is displayed in a contrasting color for emphasis, as in the example shown for a caries region 162. Figure 6B shows mapping that emphasizes caries region 162 and does not emphasize healthy area 160. Figure 6C shows generation of a grayscale likelihood map 166 from fluorescence image content. The practitioner is given the option to display either the pseudo-color mapped fluorescence image or the likelihood map 166.
For pseudo-color mapping, at least one of the display colors is colorimetrically closer to actual tooth color than to the colors in the fluorescence image data. Colorimetric proximity between two colors is defined as the three-dimensional Euclidean distance between the data points for the two colors. Where colors A and B are represented in HSV form, the colorimetric distance between them can thus be computed as:
[(HA - HB)^2 + (SA - SB)^2 + (VA - VB)^2]^(1/2)
wherein HA, SA, and VA are HSV coordinates for color A and HB, SB, and VB are the corresponding coordinates for color B.
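A minimal sketch of this colorimetric distance, with colors represented as (H, S, V) triples (the function name is illustrative):

```python
def hsv_distance(color_a, color_b):
    """Euclidean distance between two colors given as (H, S, V) triples."""
    return sum((a - b) ** 2 for a, b in zip(color_a, color_b)) ** 0.5

# White (0, 0, 1) and black (0, 0, 0) differ only in Value, by 1.
print(hsv_distance((0.0, 0.0, 1.0), (0.0, 0.0, 0.0)))  # 1.0
```

Comparing this distance for a candidate display color against both the actual tooth color and the raw fluorescence colors verifies the "colorimetrically closer" condition above.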
Since dental plaque appears red in fluorescence video, embodiments of the present disclosure also provide dental plaque detection; however, different types of lesions can be difficult to distinguish from each other using only fluorescence video content. Dental plaque is a biofilm, usually pale yellowish in color when viewed under broadband visible light, that develops naturally on the teeth. Like any biofilm, dental plaque is formed by colonizing bacteria trying to attach themselves to the tooth's smooth surface. Calculus or tartar is a form of hardened dental plaque. It is caused by the continual accumulation of minerals from saliva on plaque on the teeth. Its rough surface provides an ideal medium for further plaque formation, threatening the health of the gingival tissue (gums). Brushing and flossing can remove plaque from which calculus forms; however, once formed, it is too hard and firmly attached to be removed with a toothbrush.
Procedures for plaque and calculus detection and display are similar to those described for caries detection, with corresponding changes for spectral content.
Fluorescence Imaging Aided Caries Excavation
Embodiments of the present disclosure provide tools that are designed to support minimally invasive dentistry. Minimally invasive dentistry strategies provide conservative treatment practices for removing only irremediably damaged tooth tissue and preserving as much sound tooth material as possible, including tissue that can be re-mineralized in many cases.
The schematic diagrams of Figures 7A, 7B, and 7C show a dental imaging apparatus 12 for supporting minimally invasive dentistry with images of tooth 20 that is being treated. Camera 30, transmission link 32, processor 40, display 42, and operator interface device 44 have the functions described for imaging apparatus 10 in Figure 1. Display 42 shows an infected area 56, highlighted on a pseudo-color image 64 of tooth 20, generated as described previously. An optional positioning fixture 34 enables re-positioning of camera 30 in position for acquiring fluorescence images of tooth 20. This allows a sequence of imaging at a specific position, treatment of the tooth, and subsequent repositioning of the camera 30 and re-imaging at the same position. Any of a number of configurations of positioning fixture 34 can be provided, including a hinged fixture, as suggested in Figures 7A-7C, a pivotable arm, or some other mechanical device that repositions the camera 30 at or near the same position relative to the patient's tooth 20. An automated repositioning device can be provided for fine-tuning the position of the camera to align with the preceding position. According to an alternate embodiment of the present disclosure, image processing is used to indicate re-positioning of the camera 30 using feature recognition techniques known to those skilled in the image processing arts.
When camera 30 is moved away from the patient's mouth, image acquisition is temporarily suspended according to an embodiment of the present invention. Display 42 shows the last updated pseudo-color image until camera 30 position for imaging the tooth is restored. This provides a reference image for the practitioner during patient treatment, when the camera 30 is moved away from the patient for better access by the practitioner.
Alignment is achieved when the overlap of the tooth image at the current camera position with that at the preceding position exceeds a predetermined value, e.g., 98%. This computation is performed by a feature-matching algorithm inside processor 40. An optional sensor 36 can also be used to provide a signal that indicates when camera 30 is in suitable position to resume imaging. Fluorescence imaging can re-commence automatically when proper camera 30 position is sensed.
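The overlap test can be sketched as follows; the binary-mask representation and function names are illustrative assumptions rather than the disclosed feature-matching algorithm:

```python
def overlap_ratio(mask_prev, mask_curr):
    """Fraction of tooth pixels in mask_prev also present in mask_curr.

    Masks are nested lists of 0/1 values standing in for the segmented
    tooth region at the preceding and current camera positions.
    """
    inter = sum(a & b for ra, rb in zip(mask_prev, mask_curr)
                for a, b in zip(ra, rb))
    total = sum(a for row in mask_prev for a in row)
    return inter / total if total else 0.0

def aligned(mask_prev, mask_curr, threshold=0.98):
    """True when overlap reaches the predetermined value, e.g., 98%."""
    return overlap_ratio(mask_prev, mask_curr) >= threshold

print(overlap_ratio([[1, 1, 1, 1]], [[1, 1, 1, 0]]))  # 0.75
```

When aligned() returns True, fluorescence imaging can resume automatically as described above.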
Display 42 shows image 64 and updates image 64 during image acquisition. When the practitioner actively works on tooth 20, such as by drilling or other activity, image acquisition is suspended and display 42 is unchanged.
Following practitioner activity, such as drilling, camera 30 is temporarily moved back into imaging position and update of image 64 resumes. As shown in Figure 7C, infected area 56' is reduced in size as infected tooth tissue is removed. For reference, an outlined area 58 on display 42 shows the boundaries of the originally identified area of decay or other infection.
The logic flow diagram of Figure 8 shows a processing sequence for image acquisition and display as shown in Figures 7A-7C. In an image acquisition step 400, the acquired video stream includes fluorescence image content. A processing step 410 identifies areas of the tooth that are infected and exhibit bacterial activity according to spectral content of the fluorescence image data. A display step 420 displays the tooth with bacterial activity highlighted, as described with reference to Figures 7A-7C. A suspend imaging step 430 suspends image acquisition during treatment. A decision step 440 then allows the practitioner to determine whether to continue treatment of the tooth 20 or to move on to the next tooth. According to an embodiment of the present disclosure, processor 40 tracks the relative amount of detectable bacterial activity and provides a status message, signal, or display highlighting that indicates that excavation or other treatment appears completed or is at least nearer to completion.
According to an alternate embodiment of the present disclosure, processor 40 automatically senses repositioning of the camera 30 performed by the viewer and resumes image acquisition when camera 30 is aligned with the previous imaging frames, using image correlation and alignment techniques known to those in the image processing arts. After repositioning of camera 30, the imaging logic compares the results of the current imaging session with results from previous imaging. The processor can then determine whether or not removal of infected tooth material is completed within an area and can report this to the practitioner.
According to an alternate embodiment of the present disclosure, as shown in Figure 9, the system displays a sequence of time-stamped images 432a, 432b, 432c that document the progress of excavation or other treatment. A time stamp 88 or other timing or sequence indicator provides tracking of treatment progress. According to an alternate embodiment of the present disclosure, the image processor further analyzes the fluorescence image data and provides a signal that indicates completion of treatment of an infected area of a tooth.
Completion can be detected, for example, by comparison of a series of images obtained at intervals over the treatment procedure. Successive images can be overlaid, partially overlaid as shown in Figure 9, or presented side-by-side in sequence.
According to an alternate embodiment of the present invention, as shown in imaging apparatus 12 of Figure 10, a dental drill 38 is coupled to intraoral imaging camera 30 as part of a dental treatment instrument 48. Using dental instrument 48 having this configuration, a practitioner can have the advantage of imaging during treatment activity, rather than requiring the camera 30 to pause in imaging while the practitioner drills or performs some other type of procedure. Where mechanical coupling is used, camera 30 clips onto drill 38 or other type of instrument 48, allowing the camera to be an optional accessory for use where it is advantageous and otherwise removable from the treatment tool. Camera 30 can similarly be clipped to other types of dental instruments, such as probes, for example.
Camera 30 can also be integrally designed into the drill or other instrument 48, so that it is an integral part of the dental instrument 48. Camera 30 can be separately energized from the dental instrument 48 so that image capture takes place with appropriate timing. Display 42 can indicate the position of the dental instrument 48 relative to a particular tooth 20 in the captured image, as indicated by a cross-hairs symbol 66 in Figure 10. Exemplary types of dental instruments 48 for coupling with camera 30 include drills, probes, inspection devices, polishing devices, excavators, scalers, fastening devices, and plugging devices.
According to an embodiment of the present disclosure, dental instrument 48 is a drill that has an integral camera 30 that images tooth 20 during drill operation, such as when actively drilling or when drilling is stopped momentarily for repositioning or for evaluation of progress. Imaging can be suspended when the drill is removed from the tooth, such as for inspection by the practitioner. However, the display 42 can be used for inspection instead of requiring drill removal as is done in conventional practice. Images on display 42 can be used to guide ongoing drilling operation, such as by highlighting areas that have not yet undergone treatment or areas where additional drilling may be needed. Text annotation or audio signals may be provided to help guide drill operation by the practitioner.
Consistent with one embodiment, the present invention utilizes a computer program with stored instructions that perform on image data accessed from an electronic memory. As can be appreciated by those skilled in the image processing arts, a computer program of an embodiment of the present invention can be utilized by a suitable, general-purpose computer system, such as a personal computer or workstation. However, many other types of computer systems can be used to execute the computer program of the present invention, including networked processors. The computer program for performing the method of the present invention may be stored in a computer readable storage medium. This medium may comprise, for example: magnetic storage media, such as a magnetic disk (a hard drive or removable device) or magnetic tape; optical storage media, such as an optical disc, optical tape, or machine readable bar code; solid state electronic storage devices, such as random access memory (RAM) or read only memory (ROM); or any other physical device or medium employed to store a computer program. The computer program for performing the method of the present invention may also be stored on a computer readable storage medium that is connected to the image processor by way of the internet or another communication medium. Those skilled in the art will readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
It should be noted that the term "memory", equivalent to "computer-accessible memory" in the context of the present disclosure, can refer to any type of temporary or more enduring data storage workspace used for storing and operating upon image data and accessible to a computer system, including a database, for example. The memory could be non-volatile, using, for example, a long-term storage medium such as magnetic or optical storage. Alternately, the memory could be of a more volatile nature, using an electronic circuit, such as random-access memory (RAM) that is used as a temporary buffer or workspace by a microprocessor or other control logic processor device. Displaying an image requires memory storage. Display data, for example, is typically stored in a temporary storage buffer that is directly associated with a display device and is periodically refreshed as needed in order to provide displayed data. This temporary storage buffer can also be considered to be a memory, as the term is used in the present disclosure. Memory is also used as the data workspace for executing and storing intermediate and final results of calculations and other processing. Computer-accessible memory can be volatile, non-volatile, or a hybrid combination of volatile and non-volatile types.
It will be understood that the computer program product of the present invention may make use of various image manipulation algorithms and processes that are well known. It will be further understood that the computer program product embodiment of the present invention may embody algorithms and processes not specifically shown or described herein that are useful for implementation. Such algorithms and processes may include conventional utilities that are within the ordinary skill of the image processing arts. Additional aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the images or co-operating with the computer program product of the present invention, are not specifically shown or described herein and may be selected from such algorithms, systems, hardware, components and elements known in the art. In certain exemplary embodiments according to the application, a method for imaging a tooth, executed at least in part by a computer can include illuminating the tooth and acquiring fluorescence image data from the tooth;
calculating a risk condition for the tooth according to the fluorescence image data; mapping two or more display colors to areas of the tooth according to the calculated risk condition to form a pseudo-color mapped tooth; and displaying, storing, or transmitting the pseudo-color mapped tooth. In selected exemplary embodiments according to the application, a method for imaging a tooth, executed at least in part by a computer, can include illuminating the tooth and acquiring a plurality of frames of fluorescence image data from the tooth on a camera that is coupled with a dental drill; processing each of two or more of the plurality of frames by calculating a risk condition for at least one portion of the tooth according to the fluorescence image data; mapping one or more display colors to the at least one portion of the tooth according to the adjusted calculated risk condition to form a pseudo-color mapped tooth; and displaying the pseudo-color mapped tooth and updating the display one or more times during drill operation. In one exemplary embodiment, the fluorescence image data is obtained from a video stream. In one exemplary embodiment, the method can include forming a likelihood map that shows the calculated risk condition for each of a plurality of teeth. In one exemplary embodiment, acquiring fluorescence image data further includes applying motion compensation to one or more individual frames in the video stream. In one exemplary embodiment, the method can include combining a plurality of the individual frames in order to calculate the risk condition. In one exemplary embodiment, the method can include transforming the fluorescence image data from RGB color data to hue-saturation-value data. In one exemplary embodiment, the calculated risk condition relates to a proportion of red image pixels in the fluorescence image data.
In one exemplary embodiment, the calculated risk condition relates to a level of bacterial activity that is indicated by the fluorescence image data. In one exemplary embodiment, at least one of the display colors is colorimetrically closer to actual tooth color than to the colors in the fluorescence image. In one exemplary embodiment, illuminating the tooth is performed using a solid-state light source. In one exemplary embodiment, the fluorescence image data is in at least two non-overlapping spectral bands. In one exemplary embodiment, calculating the risk condition further includes using one or more trained classifiers obtained from a memory that is in signal
communication with the computer.
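By way of illustration only, the red-pixel proportion and pseudo-color mapping embodiments described above can be sketched as follows. The hue band, saturation and value thresholds, risk cutoff, and display colors in this sketch are assumptions chosen for illustration, not values taken from this disclosure.

```python
import colorsys

# Illustrative thresholds (assumptions, not taken from this disclosure).
# Red fluorescence from bacterial porphyrins appears as hues near 0,
# with the hue axis wrapping around from just below 1.0.
RED_HUE_LO, RED_HUE_HI = 0.92, 0.08
SAT_MIN, VAL_MIN = 0.30, 0.20

def red_pixel_fraction(pixels):
    """Proportion of red pixels in a tooth region of a fluorescence frame.

    pixels: sequence of (r, g, b) tuples with channel values in [0, 1],
    e.g. the segmented tooth-area pixels of one video frame.
    """
    if not pixels:
        return 0.0
    red = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        in_red_band = h >= RED_HUE_LO or h <= RED_HUE_HI
        if in_red_band and s >= SAT_MIN and v >= VAL_MIN:
            red += 1
    return red / len(pixels)

def risk_to_display_color(risk, cutoff=0.1):
    """Map a calculated risk value in [0, 1] to a pseudo-color for display.

    Low-risk areas get a color close to natural tooth color; high-risk
    areas are highlighted in red.
    """
    return (255, 0, 0) if risk >= cutoff else (245, 245, 220)
```

A region with little red fluorescence thus keeps a near-natural tooth color on the display, while a region whose red-pixel proportion exceeds the cutoff is highlighted for the practitioner.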
In certain exemplary embodiments according to the application, a dental imaging apparatus for use during treatment of a patient's tooth can include an intra-oral camera configured to acquire a video image data stream of fluorescence images from the tooth; a positioning fixture disposed to removably position the camera in an imaging position for the tooth and in a treatment position during treatment; and an image processor in signal communication with a display and disposed to display, during treatment, images processed from the video image data stream acquired by the intra-oral camera. In one exemplary embodiment, the positioning fixture is hinged or pivoted. One exemplary embodiment further includes a sensor that provides a signal indicating when the camera is in the imaging position. In one exemplary embodiment, the display highlights one or more areas of a tooth for treatment. In one exemplary embodiment, the image processor further analyzes the fluorescence image data obtained during a treatment session and provides a signal that indicates completion of treatment of an infected area of a tooth.
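The sensor-driven switching between imaging and treatment positions described above implies a small suspend/resume control loop, which can be sketched as follows; the class and method names are hypothetical, chosen only to illustrate the logic of holding the last displayed image while acquisition is suspended.

```python
from enum import Enum

class Position(Enum):
    IMAGING = "imaging"      # camera positioned to view the tooth
    TREATMENT = "treatment"  # camera swung away; instrument in use

class AcquisitionController:
    """Hypothetical sketch: suspend frame acquisition while the positioning
    fixture holds the camera in the treatment position, and keep showing
    the last processed image until imaging resumes."""

    def __init__(self):
        self.acquiring = False
        self.displayed = None

    def on_sensor(self, position):
        # The fixture's sensor reports which position the camera occupies.
        self.acquiring = position is Position.IMAGING

    def on_frame(self, processed_frame):
        # Accept new frames only while imaging; otherwise hold the display.
        if self.acquiring:
            self.displayed = processed_frame
        return self.displayed
```

In this sketch, repositioning the camera for treatment automatically freezes the highlighted image for the practitioner, and swinging it back to the imaging position refreshes the display, mirroring the suspend/resume behavior recited in the method claims.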
The invention has been described in detail with particular reference to a presently preferred embodiment, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention. The presently disclosed embodiments are therefore considered in all respects to be illustrative and not restrictive. In addition, while a particular feature of the invention may have been disclosed with respect to only one of several implementations, such feature can be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular function. Exemplary embodiments according to the application can include various features described herein (individually or in combination). Further, "exemplary" indicates that the description is used as an example, rather than implying that it is an ideal. The scope of the invention is indicated by the appended claims, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced therein.
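The frame combining and motion compensation mentioned in the exemplary method embodiments above can be sketched as a simple shift-compensated average; the integer-pixel shifts are assumed to come from a separate motion estimator that is not shown here.

```python
import numpy as np

def combine_frames(frames, shifts):
    """Average fluorescence video frames after undoing each frame's estimated
    integer-pixel motion, reducing noise before the risk calculation.

    frames: list of equally sized 2-D arrays (one fluorescence channel each).
    shifts: per-frame (dy, dx) offsets relative to the first frame, as would
            be produced by a motion estimator (assumed given).
    """
    acc = np.zeros_like(frames[0], dtype=float)
    for frame, (dy, dx) in zip(frames, shifts):
        # Undo the estimated motion; np.roll wraps at the borders, which is
        # acceptable for this sketch but would need cropping in practice.
        acc += np.roll(frame, shift=(-dy, -dx), axis=(0, 1))
    return acc / len(frames)
```

Averaging several motion-aligned frames in this way stabilizes the per-pixel red-fluorescence signal before a risk condition is calculated from it.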

Claims

CLAIMS:
1. A dental imaging apparatus for use during treatment of a patient's tooth, comprising:
a dental instrument for treatment of diseased tooth material;
an intra-oral camera mounted to the dental instrument and configured to acquire a video image data stream of fluorescence images from the tooth during treatment of the tooth; and
an image processor in signal communication with a display and disposed to display, during treatment, images processed from the video image data stream acquired by the intra-oral camera.
2. The apparatus of claim 1 wherein the dental instrument is a drill.
3. The apparatus of claim 1 wherein the camera is detachably mechanically coupled to the dental instrument.
4. The apparatus of claim 1 wherein the camera is an integral part of the dental instrument.
5. The apparatus of claim 1 further comprising a positioning fixture disposed to removably position the camera in an imaging position for the tooth and in a treatment position during treatment, wherein the camera further comprises a light source that emits excitation light in the 200 - 450 nm range.
6. A method for imaging a tooth during treatment, the method executed at least in part by a computer, comprising:
emitting UV light and acquiring a fluorescent video stream from a surface of the tooth;
processing the acquired video stream and displaying an image that highlights one or more areas of bacterial infection along the surface of the tooth;
responding to a first instruction to suspend acquisition and maintaining the display of the image during treatment of the tooth;
responding to a second instruction to resume image acquisition and processing steps and to refresh the displayed image of the tooth; and
analyzing the displayed image and indicating an end point for tooth treatment according to processing analysis.
7. The method of claim 6 further comprising identifying and highlighting an area of the tooth for treatment.
8. The method of claim 6 wherein indicating the end point comprises providing a status message.
9. The method of claim 6 further comprising repeating steps a) through d) one or more times.
10. The method of claim 6 wherein processing the video stream comprises analyzing fluorescent image content above about 600 nm.
11. The method of claim 6 wherein displaying the image comprises displaying incipient or advanced caries.
12. The method of claim 6 further comprising prompting the viewer to treat the one or more areas of bacterial infection.
13. The method of claim 6 further comprising showing boundaries of a treated portion of the tooth and highlighting the one or more areas of bacterial infection for treatment.
14. A method for imaging a tooth, the method executed at least in part by a computer, comprising:
illuminating the tooth and acquiring a plurality of frames of fluorescence image data from the tooth on a camera;
processing each of two or more of the plurality of frames by calculating a risk condition for at least one portion of the tooth according to the fluorescence image data;
mapping one or more display colors to the at least one portion of the tooth according to the calculated risk condition to form a pseudo-color mapped tooth;
displaying the pseudo-color mapped tooth;
suspending image frame acquisition during patient treatment; and
resuming image frame acquisition and updating the mapping and display following patient treatment.
15. The method of claim 14 wherein suspending image frame acquisition is performed in response to repositioning of the camera.
16. The method of claim 14 wherein the fluorescence image data is obtained from a video stream.
17. The method of claim 16 wherein acquiring fluorescence image data further comprises applying motion compensation to one or more individual frames in the video stream.
18. The method of claim 16 further comprising combining a plurality of the individual video frames in order to calculate the risk condition.
19. The method of claim 14 further comprising transforming the fluorescence image data from RGB color data to hue-saturation-value data.
20. The method of claim 14 further comprising displaying two or more images obtained during patient treatment.
21. A method for imaging a tooth, the method executed at least in part by a computer and comprising:
illuminating the tooth and acquiring a plurality of frames of fluorescence image data from the tooth;
processing each of two or more of the plurality of frames by calculating a risk condition for at least one portion of the tooth according to the fluorescence image data;
registering the processed frames to each other and adjusting the calculated risk condition according to the registered frames;
mapping two or more display colors to areas of the tooth according to the adjusted calculated risk condition to form a pseudo-color mapped tooth; and
displaying, storing, or transmitting the pseudo-color mapped tooth.
22. The method of claim 21 wherein registering uses iterative closest points processing.
23. The method of claim 21 wherein registering uses bilinear interpolation.
24. The method of claim 21 further comprising forming a likelihood map for a plurality of teeth according to their calculated risk conditions.
25. The method of claim 21 wherein processing comprises using one or more trained classifiers obtained from a memory that is in signal communication with the computer.
26. The method of claim 21 wherein the fluorescence image data is obtained from a video stream.
27. The method of claim 26 wherein acquiring fluorescence image data further comprises applying motion compensation to one or more individual frames in the video stream.
28. The method of claim 21 wherein the calculated risk condition relates to a level of bacterial activity that is indicated by the fluorescence image data.
29. The method of claim 21 wherein at least one of the display colors is colorimetrically closer to actual tooth color than to the colors in the fluorescence image.
30. The method of claim 21 wherein the calculated risk condition relates to a proportion of red image pixels in the fluorescence image data.
31. The method of claim 21 wherein the fluorescence image data is in at least two spectral bands.
PCT/US2015/058977 2014-11-05 2015-11-04 Video detection of tooth condition using green and red fluorescence WO2016073569A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462075284P 2014-11-05 2014-11-05
US201462075283P 2014-11-05 2014-11-05
US62/075,283 2014-11-05
US62/075,284 2014-11-05

Publications (2)

Publication Number Publication Date
WO2016073569A2 true WO2016073569A2 (en) 2016-05-12
WO2016073569A3 WO2016073569A3 (en) 2016-10-27

Family

ID=55178314

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/058977 WO2016073569A2 (en) 2014-11-05 2015-11-04 Video detection of tooth condition using green and red fluorescence

Country Status (1)

Country Link
WO (1) WO2016073569A2 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4200741C2 (en) * 1992-01-14 2000-06-15 Kaltenbach & Voigt Device for the detection of caries on teeth
DE9317984U1 (en) * 1993-11-24 1995-03-23 Kaltenbach & Voigt Device for detecting caries
EP1120081A3 (en) * 2000-01-27 2002-05-08 Matsushita Electric Industrial Co., Ltd. Oral cavity image pickup apparatus
WO2005053562A1 (en) * 2003-12-08 2005-06-16 J. Morita Manufacturing Corporation Dental treating device
US20080058786A1 (en) * 2006-04-12 2008-03-06 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Autofluorescent imaging and target ablation

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4515476A (en) 1981-04-01 1985-05-07 Bjelkhagen Hans Ingmar Device for the ocular determination of any discrepancy in the luminescence capacity of the surface of a tooth for the purpose of identifying any caried area on the surface to the tooth
US4479499A (en) 1982-01-29 1984-10-30 Alfano Robert R Method and apparatus for detecting the presence of caries in teeth using visible light
US6231338B1 (en) 1999-05-10 2001-05-15 Inspektor Research Systems B.V. Method and apparatus for the detection of carious activity of a carious lesion in a tooth
US20040202356A1 (en) 2003-04-10 2004-10-14 Stookey George K. Optical detection of dental caries
US20040240716A1 (en) 2003-05-22 2004-12-02 De Josselin De Jong Elbert Analysis and display of fluorescence images
US7596253B2 (en) 2005-10-31 2009-09-29 Carestream Health, Inc. Method and apparatus for detection of caries
US20070099148A1 (en) 2005-10-31 2007-05-03 Eastman Kodak Company Method and apparatus for detection of caries
US20080056551A1 (en) 2006-08-31 2008-03-06 Wong Victor C Method for detection of caries
US20080063998A1 (en) 2006-09-12 2008-03-13 Rongguang Liang Apparatus for caries detection
US20080170764A1 (en) 2007-01-17 2008-07-17 Burns Peter D System for early detection of dental caries
US20090185712A1 (en) 2008-01-22 2009-07-23 Wong Victor C Method for real-time visualization of caries condition
US20120148986A1 (en) 2010-12-13 2012-06-14 Jiayong Yan Method for identification of dental caries in polychromatic images
US8311302B2 (en) 2010-12-13 2012-11-13 Carestream Health, Inc. Method for identification of dental caries in polychromatic images
US20130038710A1 (en) 2011-08-09 2013-02-14 Jean-Marc Inglese Identification of dental caries in live video images

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10254227B2 (en) 2015-02-23 2019-04-09 Li-Cor, Inc. Fluorescence biopsy specimen imager and methods
US10379048B2 (en) 2015-06-26 2019-08-13 Li-Cor, Inc. Fluorescence biopsy specimen imager and methods
US10948415B2 (en) 2015-06-26 2021-03-16 Li-Cor, Inc. Method of determining surgical margins using fluorescence biopsy specimen imager
WO2017178889A1 (en) * 2016-04-13 2017-10-19 Inspektor Research Systems B.V. Bi-frequency dental examination
IL262401A (en) * 2016-04-13 2018-12-31 Inspektor Res Systems B V Bi-frequency dental examination
US10849506B2 (en) 2016-04-13 2020-12-01 Inspektor Research Systems B.V. Bi-frequency dental examination
US10489964B2 (en) 2016-04-21 2019-11-26 Li-Cor, Inc. Multimodality multi-axis 3-D imaging with X-ray
US10278586B2 (en) 2016-06-23 2019-05-07 Li-Cor, Inc. Complementary color flashing for multichannel image presentation
US11051696B2 (en) 2016-06-23 2021-07-06 Li-Cor, Inc. Complementary color flashing for multichannel image presentation
WO2017223378A1 (en) * 2016-06-23 2017-12-28 Li-Cor, Inc. Complementary color flashing for multichannel image presentation
GB2560661A (en) * 2016-08-09 2018-09-19 Onaria Tech Ltd Method and system for processing an image of the teeth and gums
WO2018029276A1 (en) * 2016-08-09 2018-02-15 Onaria Technologies Ltd. Method and system for processing an image of the teeth and gums
US10993622B2 (en) 2016-11-23 2021-05-04 Li-Cor, Inc. Motion-adaptive interactive imaging method
US10386301B2 (en) 2017-04-25 2019-08-20 Li-Cor, Inc. Top-down and rotational side view biopsy specimen imager and methods
US10775309B2 (en) 2017-04-25 2020-09-15 Li-Cor, Inc. Top-down and rotational side view biopsy specimen imager and methods
US10779735B2 (en) 2018-07-23 2020-09-22 Quanta Computer Inc. Image-processing methods for marking plaque fluorescent reaction area and systems therefor
EP3599585A1 (en) * 2018-07-23 2020-01-29 Quanta Computer Inc. Image-processing methods for marking plaque fluorescent reaction area and systems therefor
WO2021183144A1 (en) * 2020-03-11 2021-09-16 Moheb Alireza System and method for classification of dental health based on digital imagery
CN111374642A (en) * 2020-03-24 2020-07-07 傅建华 Instrument for deep visual oral mucosa disease observation and oral cancer screening

Also Published As

Publication number Publication date
WO2016073569A3 (en) 2016-10-27

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15826076

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15826076

Country of ref document: EP

Kind code of ref document: A2