WO2009120228A1 - Image processing systems and methods for surgical applications - Google Patents

Info

Publication number: WO2009120228A1
Authority: WO (WIPO/PCT)
Application number: PCT/US2008/070297
Prior art keywords: data, image, frames, contrast, fluorescence
Other languages: French (fr)
Inventors: Christopher Unger, Stephen Johnson Lomnes, Floribertus Heukensfeldt Jansen, Craig Dennis
Original assignee: General Electric Company
Application filed by: General Electric Company
Publication of WO2009120228A1 (en)

Classifications

    • A61B 5/0059: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 1/0005: Endoscope display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B 1/043: Endoscopes combined with photographic or television appliances for fluorescence imaging
    • A61B 1/044: Endoscopes combined with photographic or television appliances for absorption imaging
    • A61B 5/415: Evaluating particular organs or parts of the immune or lymphatic systems, e.g. the glands (tonsils, adenoids or thymus)
    • A61B 5/418: Evaluating particular organs or parts of the immune or lymphatic systems, e.g. lymph vessels, ducts or nodes
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/373: Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • G01J 3/02: Spectrometry; spectrophotometry; monochromators; measuring colours; details
    • G01J 3/0294: Multi-channel spectroscopy
    • G01J 3/44: Raman spectrometry; scattering spectrometry; fluorescence spectrometry

Definitions

  • the subject matter disclosed herein relates generally to medical imaging and more particularly to image processing systems and methods for surgical applications.
  • Imaging systems and methods have been developed to assist surgeons in performing surgical procedures.
  • Some imaging methods utilize an agent to highlight desired tissues during surgery.
  • Fluorescence guided surgery is an example of a method involving injection of an agent into a subject to be imaged. After the fluorescent agent is injected into the subject, an excitation light source is applied to the subject to excite the fluorescent agent. The subject fluoresces in response to the excitation, and the resulting fluorescence emission is imaged to obtain information about the interior composition of the subject.
  • the region and/or vessel being illuminated fluoresces only over the time period during which the fluorescent substance remains in the region or vessel. Due to blood flow in the region, the fluorescent substance may be carried out of the region or through the vessel relatively quickly and before the surgical procedure has been completed, thereby making the patient susceptible to injury due to unintentional cutting or damage of the region or vessel that the surgeon desires to avoid. Further, numerous sets of image data may be acquired at different points in time during the surgery which may make it difficult for a surgeon to distinguish image data acquired earlier in time from image data acquired later in time.
  • image processing systems and methods that improve a surgeon's ability to identify regions or vessels of interest during surgery and/or to distinguish image data acquired earlier in time from image data acquired later in time are desirable.
  • the invention relates to a method comprising acquiring a set of frames of reflectance data representing light reflected from a subject, acquiring a set of frames of contrast data representing a contrast agent in the subject, applying a combination operator to the set of frames of contrast data to generate a combined contrast image, and generating a merged image based on the combined contrast image and at least one of the frames of reflectance data.
  • the invention relates to a system comprising a first detector configured to acquire a set of frames of reflectance data representing light reflected from a subject, a second detector configured to acquire a set of frames of contrast data representing a contrast agent in the subject, and a processor configured to apply a combination operator to the set of frames of contrast data to generate a combined contrast image and generate a merged image based on the combined contrast image and at least one of the frames of reflectance data.
  • the contrast data may comprise fluorescence emissions from a fluorescent contrast agent injected into a patient before or during surgery.
  • the contrast data may also comprise absorption data, chemiluminescence data, and/or optical scatter data.
  • the combination operator may comprise a maximum intensity projection (MIP) routine.
  • the invention also relates to an article of manufacture which comprises a computer usable medium having computer readable program code means embodied therein for causing a computer to execute the methods described herein.
  • Figure 1 is a drawing of an image processing system for image guided surgery according to one embodiment of the present invention.
  • Figure 2 is a diagram illustrating the components of an image processing system that includes an excitation source and a fluorescence camera according to an exemplary embodiment of the invention.
  • Figure 3 is a flow chart showing a method for fluorescence guided surgery according to an exemplary embodiment of the invention.
  • Figure 4 is a diagram illustrating the method of Figure 3 according to an exemplary embodiment of the invention.
  • Figure 5 is a flow chart showing an example of a combination operator comprising a maximum intensity projection (MIP) routine according to an exemplary embodiment of the invention.
  • Figure 6 is a flow chart illustrating a method of generating and displaying a merged image according to an exemplary embodiment of the invention.
  • Figure 7 is a diagram illustrating an example of the frames of reflectance data and fluorescence data involved in generating a merged image according to an exemplary embodiment of the invention.
  • Figure 8 depicts examples of reflectance images, fluorescence images, a MIP image, a modified MIP image, and a merged image according to an exemplary embodiment of the invention.
  • Figure 9 is a diagram of another example of an imaging system comprising an absorption camera.
  • Figure 10 is a diagram of another example of an imaging system comprising a chemiluminescence camera.
  • Figure 11 is a diagram of an example of an image processing system comprising an endoscope with proximal fluorescence detectors.
  • Figure 12 is a diagram of an example of an image processing system comprising an endoscope with distal fluorescence detectors.
  • While the drawings illustrate system components in a designated physical relation to one another, or having an electrical communication designation with one another, and process steps in a particular sequence, such drawings illustrate examples of the invention and may vary while remaining within the scope of the invention. For example, components illustrated within the same housing may be located within the same housing, merely in electrical communication with one another, or otherwise. Additionally, the illustrated data flows are merely exemplary, and any communication channel may be utilized to receive and transmit data in accordance with exemplary embodiments of the invention.
  • the imaging system 10 includes an imaging unit 20, an image processing engine 50, and a display 80.
  • the imaging unit 20 may include one or more excitation light sources to excite a substance such as a fluorescent contrast agent or an absorbing contrast agent administered to the subject 18.
  • the imaging unit 20 may also include a white light source to illuminate the subject 18 as well as detectors and other components that will be described further in connection with Figures 2 and 9-12.
  • the detectors detect the emitted and reflected light from the subject 18 and send image signals to the image processing engine 50.
  • the image processing engine 50 shown in Figure 1 includes a computer or other processing device 52 to process the image signals received from the imaging unit 20.
  • the image processing engine 50 is coupled to the imaging unit 20, as shown by the communication channel 12 in Figure 1.
  • the image processing engine 50 also includes a human machine interface (HMI) 54 such as a keyboard, foot pedal, or control buttons, for example, which allows the surgeon or an assistant to control the imaging system 10.
  • the image processing engine 50 may also include a display 56 which may be used primarily for controlling the image processing engine 50.
  • A display 80 is connected to the image processing engine 50 through communication channel 14 and may be used by the surgeon primarily for displaying an image of the subject 18 during surgery, such as a fluorescence image merged with a reflectance image of the subject 18.
  • the surgeon positions the imaging unit 20 to illuminate the subject 18 and to acquire reflectance images and contrast images, such as fluorescence, absorption, or chemiluminescent images, of the subject 18.
  • the image processing engine 50 is configured to process the image signals acquired by the imaging unit 20 and to display for the surgeon on the display 80 a merged image to assist the surgeon in visualizing the area to be treated and in discriminating certain tissues and vessels during surgery.
  • the fluorescence, absorption, or chemiluminescent images may be referred to as "contrast images" because they are typically achieved using a contrast agent administered to the subject prior to or during surgery. Examples of such contrast agents are well known in the art and are disclosed, for example, in U.S. Patent No.
  • Figure 2 shows the components of an exemplary imaging system including a fluorescence camera.
  • the imaging unit 120 includes at least one excitation source 123, a white light source 127, a lens 124, a beam splitter 126, a fluorescence camera 128 and a video camera 130 according to one embodiment of the invention.
  • the excitation source 123 transmits light of a preselected wavelength or wavelength range to the subject 18.
  • the wavelength or wavelength range, which may or may not be visible, is chosen to excite the fluorescent substance in the subject 18.
  • the fluorescent substance, e.g., a fluorescent contrast agent, may be injected into the subject prior to or during the surgery or may be endogenous.
  • the excitation source 123 may be any light source that emits an excitation light capable of causing a fluorescent emission from the fluorescent substance. This may include light sources that comprise light emitting diodes, laser diodes, lamps, and the like.
  • the imaging unit 120 may also include a white light source 127 for illuminating the subject 18. However, the surgeon may also simply rely on ambient light to illuminate the subject 18.
  • the excitation source 123 and the white light source 127 may each comprise a multitude of light sources and combination of light sources, such as arrays of light emitting diodes (LEDs), lasers, laser diodes, lamps of various kinds, or other known light sources.
  • the white light source 127 may comprise an incandescent, halogen, or fluorescence light source, for example.
  • the white light source typically has, either alone or in combination with the excitation source, a correlated color temperature of 2800-10,000 degrees Kelvin, more preferably 3000-6700 degrees Kelvin, and a color rendering index (CRI) of 10-100, more preferably 85-100.
  • the fluorescence emission and the visible light reflected from the subject 18 are received through a lens 124 and then propagate to a beam splitter 126.
  • the lens 124 collects light and focuses an image onto the cameras 128 and 130.
  • the beam splitter 126 splits the image information into different paths either spectrally, with the use of dichroic filters for example, or by splitting the image with a partially reflective surface.
  • the beam splitter 126 divides the fluorescence emission from the remainder of the light.
  • the fluorescence emission propagates through a filter 129 and then to the fluorescence camera 128.
  • the filter 129 is configured to reject the reflected visible and excitation light from being detected by the fluorescence camera 128 while allowing the emitted fluorescent light from the subject 18 to be detected by the fluorescence camera 128.
  • the white light source 127 is configured so that it generates little or no light in the emission band of the fluorescent substance.
  • the fluorescence camera 128 may be any device configured to acquire a fluorescence image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, and the like.
  • the fluorescence camera 128 receives the filtered fluorescence emission and converts it to a signal that is transmitted via the communications channel 112 to the image processing engine 50.
  • the remainder of the light passes through a filter 131 and then to a video camera 130.
  • the filter 131 rejects the excitation light and the fluorescence emission light to allow for accurate representation of the visible reflected light image.
  • the video camera 130 may be any device configured to acquire a reflectance image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, and the like.
  • the video camera 130 receives the filtered reflected light and converts it to a signal or image that is transmitted via the communications channel 112 to the image processing engine 50.
  • the communications channel 112 may be any known channel, such as a wired or wireless channel.
  • the fluorescence camera 128 and the video camera 130 may be referred to as "detectors" and may be digital or analog, for example.
  • the image processing engine 50 receives video signals from the imaging unit 120 through the communications channel 112 and processes the signals.
  • the image processing engine 50 includes a processor 52 that executes various image processing routines as is described further herein.
  • the image processing engine 50 also includes a memory 58 for storing, among other things, image data and various computer programs for image processing.
  • the memory 58 may be provided in various forms, such as RAM, ROM, hard drive, flash drive, etc.
  • the memory 58 may comprise different components for different functions, such as a first component for storing computer programs, a second component for storing image data, etc.
  • the image processing engine 50 may include hardware, software or a combination of hardware and software.
  • the image processing engine 50 is programmed to execute the various image processing methods described herein.
  • the image processing engine 50 also includes a human machine interface (HMI) 54, such as a keyboard, foot pedal or control buttons and a display 56, and/or other input/output devices that may be used primarily for input/output between the user and the imaging system 110.
  • the HMI may also comprise control buttons on the imaging unit 120, or other desired control mechanism, for example.
  • the surgeon or an assistant may enter various parameters through the HMI 54 to define and control the imaging method that will occur during surgery.
  • the surgeon or an assistant may also modify the imaging method during surgery or input various commands during surgery to refine the images being displayed.
  • the display 56 may be used for controlling the imaging system 10, and it may also be used to display images of the subject 18.
  • the displays 80 and 56 may present the same data or they may present different data.
  • the display 80 is typically used to display images of the subject 18 for the surgeon.
  • the display 80 may comprise a larger screen that is positioned closer to the surgeon so that the surgeon can view the images easily during surgery.
  • the display 80 may comprise an LCD display, a plasma display, an HDTV display, or other high resolution device, for example.
  • the display 80 may also include controls allowing the surgeon to adjust brightness, contrast, gain, or other parameters of the image.
  • the hardware may include one or more digital signal processing (DSP) microprocessors, application-specific integrated chips (ASIC), field programmable gate arrays (FPGA), and the like.
  • the software may include modules, submodules and generally computer program products containing computer code that may be executed to perform the image processing methods of the embodiments of the invention.
  • the memory 58 may be any type of memory suitable for storing information that may be accessed by the processor 52 for performing image processing.
  • Information stored in the memory 58 may include one or more previously acquired image data sets, computer code for executing image processing routines, and/or any other information accessed by the processor 52.
  • the memory 58 may comprise, for example, a hard drive, flash drive, random access memory (RAM), and/or read-only memory (ROM).
  • the memory 58 stores computer executable code executed by the processor 52.
  • the memory 58 may also include information descriptive of the subject 18 on which fluorescence guided surgery is being performed.
  • the imaging system is configured to execute various imaging methods according to exemplary embodiments of the invention.
  • the methods typically involve acquiring frames of image data at different points in time.
  • the frames of image data include reflectance data and fluorescence data.
  • the reflectance data sets and the fluorescence data sets may be used to generate a merged image in which the fluorescence image data are overlaid onto the reflectance image data.
  • the merged image assists the surgeon in visualizing certain tissues which emit fluorescent light during surgery.
  • the term "reflectance image data" refers to image data representing a reflection of light from a surface.
  • the term "fluorescence image data" refers to image data representing a fluorescent emission from a subject. The fluorescent emission is typically achieved with a fluorescent contrast agent administered to the subject prior to or during surgery.
  • FIG. 3 is a flow chart showing the exemplary method.
  • the method begins with block 148 in which a fluorescent contrast agent is administered to the subject.
  • a set of single frames of reflectance data and fluorescence data are acquired with the imaging system 110.
  • the individual frames may be modified with various image processing methods, such as adjustments of brightness, color and gamma.
  • weighting operators may be applied to a set of frames of fluorescence data.
  • a combination operator is applied to the set of frames of fluorescence data to generate a combined fluorescence image.
  • the combined fluorescence image may be modified with various image processing routines.
  • the combined fluorescence image is merged with one or more reflectance images to generate a merged image.
  • the merged image may be displayed on the display 80.
  • a determination is made as to whether imaging is continuing. If so, the process returns to block 150 where additional frames of image data are acquired.
  • the method may be carried out with three primary steps (150, 156, 160), and may include one or more of the other steps illustrated in Figure 3.
  • FIG. 4 is a diagram that shows the method of Figure 3 in more detail.
  • the exemplary method typically includes three primary steps (150, 156, 160) and may include one or more of the other steps, as illustrated.
  • step 150 comprises acquiring a set of single frames of reflectance image data and fluorescence image data.
  • step 156 comprises applying a combination operator to the fluorescence data to create a combined fluorescence image.
  • step 160 comprises creating a merged image based on the combined fluorescence image and one or more of the reflectance images.
  • the imaging system 110 acquires a set of six frames of reflectance data and fluorescence data.
  • the imaging system 110 is typically configured to acquire a reflectance image data set and a fluorescence image data set at frame rates between 1 and 60 frames per second, and preferably between 15 and 30 frames per second.
  • the fluorescence data set and reflectance data set can be acquired at independently controlled frame rates.
  • the fluorescence data set is acquired with the fluorescence camera 128, the reflectance data set is acquired with the video camera 130, and the data sets are stored in the memory 58 of the image processing engine 50.
  • Each pixel of image data typically includes an intensity value and one or more values to specify a color. Although the value of the pixel is typically an intensity value, the value may also include other characteristics such as fluorescence lifetime, quantum yield, absorption, and the like depending on the particular nature of the imaging system.
  • a thresholding step can be applied to individual frames of fluorescence data.
  • a thresholding step can be applied by setting all pixel values below a specified threshold value to zero. The inclusion of the thresholding step eliminates a certain subset of the fluorescence data set values below a specified threshold value.
  • This feature provides a thresholded fluorescence data set that can make it easier for the surgeon to see the highlighted fluorescence portions as compared with any background fluorescence, such as endogenous fluorescence.
  • the thresholding step can also be applied by setting all pixel values above a specified threshold value to a preselected maximum value.
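  • As a minimal MATLAB sketch of these two thresholding operations (the variable names F, T_low, and T_high are illustrative assumptions, not from the patent):

        % F: one frame of fluorescence data (2-D intensity matrix)
        F(F < T_low)  = 0;       % suppress background below the lower threshold
        F(F > T_high) = T_high;  % clip values above the upper threshold to a preselected maximum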
  • Known filtering methods can also be applied to individual frames of fluorescence data in step 152 of Figure 4.
  • known filtering operations such as blurring, sharpening, edge finding, low pass, band pass, high pass, Fourier, notch, despeckling, speckling, and other filtering operations can be applied.
  • a weighting operator can be applied to a set of frames of fluorescence data prior to application of the combination operator in step 156.
  • the weighting operator step 154 may comprise applying color changes to different frames and/or pixels of fluorescence data and/or applying weighting factors to the intensity values in different frames and/or pixels of fluorescence data (e.g., multiplying the intensity values by weighting factors between 0 and 1).
  • the color changes and/or weighting factors may be selected based on the age of the frame. For example, the method may comprise changing the color of the older frames to red and changing the color of the newer frames to green.
  • the fluorescent emission in the newest frames is more prominently displayed in green and the fluorescent emission in the older frames is less prominently displayed in red.
  • the color of the older fluorescence data is manipulated to assist the surgeon in determining the age of the fluorescence data, e.g., discriminating the newest fluorescence data from the older fluorescence data.
  • one or more weighting factors may be applied to the intensity values in the frames of fluorescence data. For example, a weighting factor of 1.0 may be multiplied by the pixel values in the newest frame of fluorescence data, a weighting factor of 0.9 applied to the next newest frame, a weighting factor of 0.8 applied to the next newest frame, and so on.
  • the processor 52 multiplies the weighting factors by each pixel intensity value and the resulting weighed frames are stored in the memory 58.
  • the older fluorescence image data when displayed, will have a lower intensity than the newer fluorescence data because it is discounted by the weighting factor.
  • This effect is illustrated in Figure 4.
  • the first weighted frame of fluorescence data that is input to the combination operator in step 156 has a lower intensity than the second weighted frame of fluorescence data, which has a lower intensity than the third weighted frame of fluorescence data, and so on.
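  • A minimal MATLAB sketch of this intensity weighting, assuming the frames are stacked oldest-to-newest along the third dimension of an array and using the 0.1 decrement from the example above:

        % frames: H-by-W-by-N stack, frames(:,:,N) being the newest frame
        N = size(frames, 3);
        w = 1.0 - 0.1 * (N - (1:N));   % e.g. for N = 6: 0.5, 0.6, ..., 1.0
        for k = 1:N
            frames(:,:,k) = w(k) * frames(:,:,k);  % discount older frames more heavily
        end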
  • Figure 4 depicts the application of the combination operator in step 156.
  • the combination operator is a routine that receives as input at least two frames of fluorescence data and outputs a single combined fluorescence image.
  • the combination operator may utilize a number of routines based on, for example, maximum pixel values, minimum pixel values, the sum of pixel values, the difference or subtraction of pixel values, the average of pixel values, the derivative of pixel values, and cross correlation of pixel values.
  • the combination operator utilizes one or more of these routines to convert the multiple input fluorescence data sets to a single combined fluorescent image data set.
  • the combination operator uses a routine based on maximum pixel values.
  • This routine may be referred to as a maximum intensity projection (MIP) routine.
  • the MIP routine is applied to the fluorescence data prior to the fluorescence data being displayed in order to assist the surgeon in visualizing the fluorescent regions of the subject 18.
  • the reflectance image data is generally displayed without MIP processing.
  • the MIP routine, which is one embodiment of the Combination Operator (156) depicted in Figure 4, typically includes the steps of comparing two or more input image data sets and generating a MIP data set, which is one embodiment of the "Combined Fluorescence Image" depicted in Figure 4.
  • the MIP data set is generated by comparing the value of a pixel from one of the input data sets to the value of a corresponding pixel from the other input data set and taking the higher value.
  • the higher value is stored in the MIP data set for the corresponding pixel.
  • the process is repeated for each pixel in the input data sets.
  • the MIP data set thus includes, for each pixel, the higher value from one of the input data sets. For example, if the values of the first five pixels in data set 1 are: (4, 6, 7, 4, 2, . . .) and the values of the corresponding first five pixels in data set 2 are: (1, 5, 8, 3, 3, . . .), the MIP data set would have the following values for its corresponding first five pixels: (4, 6, 8, 4, 3, . . .).
  • the MIP routine can have any number of input data sets, such as the six input fluorescence data sets shown in Figure 4.
  • FIG. 5 is a flowchart showing an example of the MIP routine for n input data sets.
  • the MIP routine is carried out by the image processing engine 50.
  • the processor 52 performs the mathematical operations, and the memory 58 stores the image data and the program or programs used by the processor 52 to execute the blocks of the flow chart of Figure 5.
  • In block 170, the value of a pixel P(i,j) from data set 1 is read.
  • The value of the pixel is typically an intensity value, but the value may also include other characteristics as noted above.
  • In step 171, the value of the corresponding pixel P(i,j) from data set 2 is read. This process is repeated for each remaining input data set, up to data set n in step 172.
  • The values are compared by the processor 52.
  • In step 174, the highest value is stored in the memory 58 in a MIP data set.
  • In step 175, it is determined by the processor 52 whether there are any remaining pixels to compare in the input data sets. If yes, the process returns to step 170 where the next pixel is read in each input data set. If no, the process ends.
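  • The following MATLAB sketch mirrors the loop of Figure 5; the H-by-W-by-n array layout and the variable names are illustrative assumptions, not taken from the patent:

        % sets: H-by-W-by-n stack holding the n input fluorescence data sets
        [H, W, n] = size(sets);
        mip = zeros(H, W);
        for i = 1:H
            for j = 1:W
                v = sets(i, j, 1);              % read pixel (i,j) from data set 1
                for k = 2:n
                    v = max(v, sets(i, j, k));  % compare with data set k, keep the higher value
                end
                mip(i, j) = v;                  % store the highest value in the MIP data set
            end
        end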
  • the MIP routine is used to process the image data from the fluorescence emission.
  • the MIP routine can capture the brightest fluorescence emissions during a specified time period, over a number of frames of image data, so that the MIP image shows the fluorescence substance as it has traveled through a region of the subject over time.
  • the MIP ("Max") routine is applied to six frames of fluorescence data. The result is a combined fluorescence image that contains, for each pixel, the maximum value from the corresponding pixels in each of the six input data sets. With this process, over time, the MIP data set is updated to include the maximum values from a specified number of previous frames of fluorescence data.
  • the result may be, for example, that if a fluorescent substance passes through the region of interest relatively quickly, for example as a fluorescent bolus traveling through a designated vessel, the MIP routine will generate an image where the entire vessel is highlighted rather than just one portion of the vessel that contains the bolus at a particular time.
  • the highlighting of the entire vessel can assist the surgeon in visualizing and identifying the vessel during surgery.
  • There are a number of other routines in step 156 that may be utilized and executed by the processor 52 in an analogous manner to the MIP routine.
  • the processor 52 may be programmed to compare the corresponding pixel values from each input fluorescence data set and store the minimum, rather than the maximum, pixel value in the combined fluorescence image data set in the memory 58.
  • the processor 52 may be programmed to add all the values from each corresponding pixel in the input data sets and store the sum in the combined fluorescence image data set in the memory 58.
  • the processor 52 may be programmed to subtract specified corresponding pixel values and store the difference in the combined fluorescence image data set in the memory 58.
  • the processor 52 may be programmed to calculate the average value of each corresponding pixel in the input data sets and store the average value in the combined fluorescence image data set in the memory 58.
  • the processor 52 may be programmed to calculate a derivative value based on corresponding pixel values in the input data sets and store the derivative value in the combined fluorescence image data set in the memory 58.
  • the processor 52 may be programmed to calculate a cross correlation value based on the input fluorescence image data sets and to store the calculated value in the combined fluorescence image data set in the memory 58.
  • Pc(i,j) = max([P1(i,j), ..., PN(i,j)]) for all i,j.
  • Pc(i,j) = min([P1(i,j), ..., PN(i,j)]) for all i,j.
  • Pc(i,j) = sum([P1(i,j), ..., PN(i,j)]) for all i,j.
  • Pc(i,j) = difference([P1(i,j), ..., PN(i,j)]) for all i,j.
  • Pc(i,j) = average([P1(i,j), ..., PN(i,j)]) for all i,j.
  • Pc(i,j) = derivative([P1(i,j), ..., PN(i,j)]) for all i,j.
  • Pc = xcorr(P1, ..., PN) for 2D cross correlation, where each of P1 through PN and the combined image Pc is a two-dimensional image matrix array.
  • MATLAB is a well-known programming language available commercially from The MathWorks, Inc.
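  • For instance, if the N input frames are stacked along the third dimension of a single array P (an assumed layout, not from the patent), several of the routines above reduce to one-line MATLAB operations; the "difference" and "derivative" lines show one plausible frame-to-frame reading of those terms:

        % P: H-by-W-by-N stack of input fluorescence frames
        N       = size(P, 3);
        Pc_max  = max(P, [], 3);        % maximum intensity projection
        Pc_min  = min(P, [], 3);        % minimum intensity projection
        Pc_sum  = sum(P, 3);            % per-pixel sum
        Pc_avg  = mean(P, 3);           % per-pixel average
        Pc_diff = P(:,:,N) - P(:,:,1);  % difference between newest and oldest frame
        Pc_der  = diff(P, 1, 3);        % frame-to-frame derivative along the stack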
  • the combined fluorescence image data set may be modified in a number of ways, as shown in Figure 4.
  • the modifications are executed by the image processing engine 50.
  • the brightness, contrast, and gamma (BCG), window and level may be modified according to a lookup table stored in the memory 58, as is known in the art.
  • a thresholding step can be applied to the combined fluorescence image data set.
  • a thresholding step can be applied by setting all pixel values below a specified threshold value to zero, and/or by setting all pixel values above a specified threshold value to a maximum value.
  • the thresholding step can be used, for example, to remove data representing background or endogenous fluorescence emissions.
  • Known filtering methods can also be applied to the combined fluorescence image data set. For example, known filtering operations such as blurring, sharpening, edge finding, low pass, band pass, high pass, Fourier, notch, despeckling, speckling, and other filtering operations can be applied.
  • a false coloring method can be applied to the combined fluorescence image data set in step 158. In generating the merged image, it may be helpful to the surgeon to change the color of the fluorescence image to a predetermined color that is more easily visible than the naturally occurring fluorescence emission color which may be difficult to distinguish or even invisible.
  • the false coloring method can be carried out by the image processing engine 50.
  • a lookup table may be stored in the memory 58 that maps color values according to a specified desired routine.
  • the combined fluorescence image data set may be modified so that the color of the emission, even if initially invisible, is changed to bright green to be clearly visible by a surgeon. Any desired color can be used.
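  • As a minimal sketch, assuming the combined image mip has been normalized to the range 0 to 1, a bright green false coloring can be produced in MATLAB by writing the gray-scale values into the green channel of an otherwise empty RGB image:

        % mip: combined fluorescence image, normalized to [0, 1]
        rgb = zeros([size(mip) 3]);  % H-by-W-by-3 RGB image, initially black
        rgb(:,:,2) = mip;            % green channel carries the fluorescence intensity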
  • a merge step 160 is executed, according to an exemplary embodiment of the invention.
  • the merge step 160 comprises merging the modified, combined fluorescence image data set with one or more frames of reflectance data.
  • the modified fluorescence image is overlaid onto one or more frames of reflectance data acquired by the video camera 130.
  • This merge step is carried out with the image processing engine 50.
  • the MIP data set (the combined fluorescence image) is typically merged with the most recent frame of reflectance data, although an older frame of reflectance data may also be used.
  • if the MIP data set is more closely matched to an older frame of reflectance data, the older frame may be used.
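  • One way to realize the overlay is an alpha blend, sketched below in MATLAB reusing mip and rgb from the false coloring sketch above and assuming the reflectance frame is an RGB array scaled to [0, 1]; the blending weight alpha is illustrative, not a patent value:

        % reflectance: H-by-W-by-3 reflectance frame scaled to [0, 1]
        alpha  = 0.6;                       % illustrative blending weight
        mask   = repmat(mip > 0, [1 1 3]);  % blend only where fluorescence is present
        merged = reflectance;
        merged(mask) = (1 - alpha) * reflectance(mask) + alpha * rgb(mask);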
  • one or more landmarks in the MIP data set are compared with corresponding landmarks in the reflectance data sets to identify which reflectance data set is the closest match to the MIP data set so that the identified reflectance data set is used to generate the combined image.
  • the particular reflectance data that is used may be selected automatically by mutual information cross correlation, or other anatomical segmentation and registration techniques known in the art.
  • the reflectance data chosen may be a function of the parameters of the MIP data set collection.
  • For example, the reflectance data corresponding to the eighth fluorescence data frame may be selected.
  • the reflectance image may be selected manually by the surgeon or operator.
  • the combined fluorescence image data set may also be merged with one or more additional frames of reflectance data.
  • the MIP data set once generated, may provide a good image of the particular vessel of interest. The MIP data set, therefore, may be generated early in the procedure and then used throughout the surgery by combining it with subsequently generated frames of reflectance data as they are acquired.
  • In step 162, the merged image is displayed for the surgeon on the display 80.
  • the combined fluorescence image may be displayed alone, without the reflectance image.
  • One aspect of the invention relates to the number of frames of fluorescence data that is used to generate the combined fluorescence image.
  • the MIP routine is applied by the image processing engine 50 to all historical fluorescence data sets from the beginning of the surgical procedure.
  • a first frame of fluorescence data and a first frame of reflectance data are acquired.
  • a second frame of fluorescence data and a second frame of reflectance data are acquired.
  • the MIP routine is then applied to the first and second frames of fluorescence data to generate a MIP data set.
  • the MIP data set is then stored in the memory 58 so that it can be used in the next iteration of the process. In the next step, it is determined whether imaging is continuing.
  • the process returns to acquisition of another (third) frame of fluorescence data and another (third) frame of reflectance data.
  • the MIP routine is then applied again to the previously generated, stored MIP data set and the third frame of fluorescence data to generate a new MIP data set.
  • the merged image is generated by the image processing engine 50 and displayed for the surgeon on the display 80 using the new MIP data set and a frame of reflectance data, typically the most recent (third) frame of reflectance data.
  • the process is repeated for each subsequent frame of acquired data.
  • the MIP data set therefore represents all the fluorescence data acquired starting at the beginning of the surgical procedure.
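  • The iteration just described amounts to folding each new frame into the stored MIP data set. A minimal MATLAB sketch, with acquireFluorescenceFrame and imagingContinues as hypothetical placeholders for the acquisition and control steps:

        mip = acquireFluorescenceFrame();  % first frame initializes the MIP data set
        while imagingContinues()           % hypothetical loop condition
            f   = acquireFluorescenceFrame();
            mip = max(mip, f);             % per-pixel maximum against the stored MIP
            % ...merge mip with the latest reflectance frame and display...
        end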
  • the combination operator is applied to a specified number of frames (e.g., 3 frames) of fluorescence data, rather than all previous frames of fluorescence data.
  • the use of a specified number of frames, rather than all frames, can enhance the quality of images displayed for the surgeon.
  • This process is illustrated in Figures 6 and 7. As shown in Figure 6, the process is conducted as follows.
  • a contrast agent is administered to the subject.
  • a number of initial frames (e.g., two frames) of fluorescence data and corresponding frames of reflectance data are acquired.
  • a third frame of fluorescence data and a third frame of reflectance data are acquired in step 182.
  • the MIP routine is then applied to the latest m frames of data in step 183, in this case the latest three frames of data, to generate a MIP data set.
  • the MIP data set is merged with one of the frames of reflectance data, typically the most recent (third) frame of reflectance data, to generate a merged image in step 184.
  • the merged image is displayed for the surgeon in step 185.
  • Fluorescence data sets are then stored in the memory 58 in step 186 so that they can be used in the next iteration of the process.
  • In step 187, it is determined whether imaging is continuing. If so, the process returns to step 182 for acquisition of another (fourth) frame of fluorescence data and another (fourth) frame of reflectance data.
  • the MIP routine is then applied again in step 183 to the latest three frames of fluorescence data (the second, third and fourth frames) to generate a new MIP data set.
  • the merged image is generated and displayed for the surgeon in steps 184 and 185 using the new MIP data set and a frame of reflectance data, typically the most recent (fourth) frame of reflectance data. The process is repeated for each subsequent frame of acquired data.
  • FIG. 7 is another illustration of this exemplary process, showing the frames that are used to generate the combined image.
  • the MIP routine is applied to the three most recent frames of fluorescence data and the older frames of fluorescence data are not used in generating the MIP data set.
  • the MIP routine is carried out by reading the values from corresponding pixels in the most recent three frames of data, comparing the values, and writing the highest value to the corresponding pixel in the MIP data set, which is stored in the memory 58.
  • the MIP data set contains a certain amount of historical fluorescence data, but does not contain all historical fluorescence data. The number of frames can be selected based on the extent of historical data that is desired.
  • a merged image is generated with the reflectance data set and the MIP data set.
  • the merged image assists the surgeon in visualizing the area of interest (reflectance image) combined with fluorescence data processed according to the MIP routine.
  • This embodiment may be used to reduce or prevent saturation over time, since only a limited number of previous fluorescence data sets are used.
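  • A minimal MATLAB sketch of this sliding-window variant, using a circular buffer of the latest m frames; the buffer layout and the acquireFluorescenceFrame helper are illustrative assumptions:

        m   = 3;               % number of recent frames to retain
        buf = zeros(H, W, m);  % circular buffer of the latest m frames
        for t = 1:numFrames
            buf(:,:, mod(t - 1, m) + 1) = acquireFluorescenceFrame();  % hypothetical
            mip = max(buf, [], 3);  % MIP over the retained frames only
            % note: until t >= m, the unfilled buffer slots remain zero
        end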
  • Figure 8 is an illustration of an application of the MIP algorithm to a surgical site.
  • the five photographs on the top row of Figure 8 represent five frames of reflectance data.
  • the images in the next row down represent five frames of fluorescence data.
  • the photograph in the bottom left corner represents the MIP data set for five frames of fluorescence data.
  • the image in the bottom center represents the MIP data set after it has been enhanced with respect to brightness, contrast, gain and false coloring.
  • the brightness, contrast and gamma were modified to visually enhance the fluorescence components of interest and to minimize the background contributions. Then, this gray-scale image was false colored in bright green to permit visible contrast when merged with the color reflected light image.
  • the image in the bottom right of Figure 8 is the merged image in which the false colored MIP data set has been superimposed onto a frame of reflectance data.
  • the MIP data set shows a highlighted bolus over a period of time, which enables the surgeon to see not just one snapshot of the bolus, but rather a much larger portion of the vessel of interest.
  • While the discussion thus far has focused largely on an example of a system that uses fluorescence data and a MIP routine, other embodiments of the invention comprise different image processing methods and different hardware.
  • In addition to a MIP ("Max") routine, the combination operator may also be programmed to execute routines based on minimum pixel values, the sum of pixel values, a difference of pixel values, an average of pixel values, a derivative of pixel values, and/or a cross correlation of pixel values.
  • the processor 52 may be programmed, depending on the desired application, to execute any one or more of these routines.
  • the imaging system can be designed to acquire types of contrast image data other than fluorescence image data.
  • the other types of contrast image data may include, for example, absorption data, chemiluminescence data, bioluminescence data, and scatter data.
  • Chemiluminescence generally refers to the emission of light as a result of a chemical reaction.
  • Bioluminescence generally refers to the production and emission of light by a living organism as the result of a chemical reaction. Bioluminescence is generally considered a subset of chemiluminescence.
  • the image data other than the reflectance data may be referred to as "contrast data" because it is typically achieved using a contrast agent.
  • Figure 9 is a diagram of an absorption-based system 210. As shown in Figure 9, many of the components are the same as the components in the fluorescence system depicted in Figure 2.
  • the absorption camera 228 replaces the fluorescence camera 128 of Figure 2.
  • the imaging unit 220 includes at least one excitation source 223, a white light source 227, a lens 224, a beam splitter 226, an absorption camera 228 and a video camera 230 according to one embodiment of the invention.
  • the excitation source 223 transmits light of a preselected wavelength or wavelength range to the subject 18.
  • the wavelength or wavelength range, which may or may not be visible, is chosen to target the absorbing substance in the subject 18, which may be injected into the subject prior to or during the surgery or may be endogenous.
  • the excitation source 223 may be any light source that emits an excitation light capable of being absorbed by the absorbing substance.
  • the excitation source 223 may include light sources that use light emitting diodes, laser diodes, lamps, and the like.
  • the imaging unit 220 may also include a white light source 227 for illuminating the subject 18. However, the surgeon may also simply rely on ambient light to illuminate the subject 18.
  • the excitation source 223 and the white light source 227 may each comprise a multitude of light sources and combination of light sources, such as arrays of light emitting diodes, lasers, laser diodes, lamps of various kinds, or other known light sources.
  • the white light source 227 may comprise an incandescent, halogen, or fluorescence light source, for example.
  • the white light source 227 and the excitation source 223 may comprise the same source.
  • the absorption spectrum and the visible light reflected from the subject 18 are detected through a lens 224 and then propagate to a beam splitter 226.
  • the lens 224 collects light and focuses an image onto the cameras 228 and 230.
  • the beam splitter is capable of splitting the image information into different paths either spectrally, with the use of dichroic filters for example, or by splitting the image with a partially reflective surface.
  • the beam splitter 226 divides the absorption spectrum from the remainder of the light.
  • the absorption spectrum travels through a filter 229 and then to an absorption camera 228.
  • the excitation light should not be blocked by the filter 229 in front of the absorption camera 228.
  • the filter 229 is selected to allow transmission from a particular wavelength band of interest.
  • the filter 231 may be selected to choose a second wavelength band of interest, or to broadly accept visible light to provide a reflected light image on camera 230.
  • the absorption camera 228 may be any device configured to acquire an absorption image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, or the like.
  • the absorption camera 228 receives the absorption spectrum and converts it to a signal that is transmitted to the image processing engine 50. The remainder of the light passes through the filter 231 and then to the video camera 230.
  • the video camera 230 may be any device configured to acquire a reflectance image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, or the like.
  • the video camera 230 receives the filtered reflected light and converts it to a signal or image that is transmitted to the image processing engine 50.
  • the imaging system can also be designed to acquire chemiluminescent (including bioluminescent) light.
  • Figure 10 is a drawing of chemiluminescent system 310. As shown in Figure 10, many of the components are the same as the components in the fluorescence system depicted in Figure 2. Referring to Figure 10, the chemiluminescence camera 328 replaces the fluorescence camera 128 of Figure 2. As shown in Figure 10, the imaging unit 320 includes an optional illumination source 327, a lens 324, a beam splitter 326, a chemiluminescence camera 328, and a video camera 330 according to one embodiment of the invention.
  • the illumination source 327 transmits light of a preselected wavelength or wavelength range to the subject 18.
  • the wavelength or wavelength range, which may or may not be visible, is chosen to provide visible light to illuminate the subject 18 while remaining substantially devoid of wavelengths in the acceptance band imaged by the luminescence camera 328, as defined by the characteristics of filter 329 and the beam splitting element 326.
  • the chemiluminescent light and the visible light reflected from the subject 18 are detected through a lens 324 and then propagate to a beam splitter 326.
  • the lens 324 collects light and focuses an image onto the cameras 328 and 330.
  • the beam splitter is capable of splitting the image information into different paths either spectrally, with the use of dichroic filters for example, or by splitting the image with a partially reflective surface.
  • the beam splitter 326 divides the chemiluminescent emission from the remainder of the light.
  • the chemiluminescent emission travels through a filter 329 and then to a chemiluminescence camera 328.
  • the filter 329 is typically designed to block the illumination light from the illumination source 327 if the illumination source 327 is on when the chemiluminescence camera 328 acquires the image data.
  • the illumination source 327 should not have light in the emission band of the chemiluminescent material at the time when the chemiluminescence camera 328 is acquiring image data.
  • the chemiluminescence camera 328 may be any device configured to acquire a chemiluminescent image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, or the like.
  • the chemiluminescence camera 328 receives the chemiluminescent emission and converts it to a signal that is transmitted to the image processing engine 50.
  • the remainder of the light passes to a video camera 330.
  • the video camera 330 may be any device configured to acquire a reflectance image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, and the like.
  • the video camera 330 receives the reflected light and converts it to a signal or image that is transmitted to the image processing engine 50.
  • Other embodiments of the invention include an endoscope.
  • the endoscope may be used in endoscopic surgery, which can be significantly less invasive as compared with the open surgery depicted in Figure 1.
  • the endoscope is inserted into a small incision in the patient, as known in the art, to minimize the invasive nature of the surgery.
  • Endoscopes are commercially available from a number of manufacturers, such as Stryker and Karl Storz, for example.
  • the endoscope may include proximal detectors or distal detectors. These embodiments will be described in connection with a fluorescence system. However, those skilled in the art will appreciate that the endoscopes can be designed and configured to be used with a chemiluminescent camera to acquire chemiluminescent images, an absorption camera to acquire absorption images, or other types of cameras to detect a desired form of radiation, such as optical scattering data.
  • FIG 11 is a drawing of an endoscope system having proximal fluorescence detectors.
  • the detectors are referred to as proximal detectors, because they are located near the top end of the endoscope proximate to the surgeon.
  • the imaging system 410 comprises an imaging unit 450 that houses a beam splitter 426, a fluorescence camera 428, a filter 429, a video camera 430 and a filter 431.
  • the imaging system 410 also includes an excitation source 423 and a white light source 427.
  • the excitation source 423 and the white light source 427 are optically coupled to the endoscope 440 at or near the proximal end of the endoscope 440.
  • the functions of the beam splitter 426, the cameras 428, 430, the filters 429, 431, the excitation source 423 and the white light source 427 are substantially the same as the corresponding elements in Figure 2.
  • the endoscope is connected via communications channel 412 to the image processing engine 50 which includes a processor 52, human machine interface (HMI) 54 such as a keyboard, foot pedal, or control buttons, output device 56 such as a display, and one or more memories 58 as described previously.
  • the image processing engine 50 is connected to the display 80 via communications channel 414. The image processing engine 50 and the display 80 operate in essentially the same manner as described above with respect to Figure 2.
  • a fluorescent agent is injected into the subject and the endoscope 440 is inserted into a body cavity or incision.
  • the white light source 427 and the excitation source 423 are activated, image data are acquired, and the image data are processed, as described above in connection with Figure 4.
  • the endoscope 440 can provide the advantages associated with minimally invasive surgery, for example.
  • a second embodiment of an endoscope is shown in Figure 12.
  • the endoscope depicted in Figure 12 includes distal detectors 528, 530.
  • the fluorescence detector 528 and the video camera 530 are referred to as distal detectors because they are located at the bottom end of the endoscope away from the surgeon.
  • the imaging system 510 also includes an excitation source 523 and a white light source 527.
  • the excitation source 523 and the white light source 527 are optically coupled to the endoscope 540 at or near the proximal end of the endoscope 540.
  • the functions of the detectors 528, 530, the excitation source 523 and the white light source 527 are substantially the same as the corresponding elements in Figure 2.
  • the imaging system 510 also comprises an imaging unit 550 that relays the signals and data acquired by the endoscope 540 to the image processing engine 50.
  • the imaging unit 550 is connected via communications channel 512 to the image processing engine 50 which includes a processor 52, HMI 54, output device 56 such as a display, and one or more memories 58 as described previously.
  • the image processing engine 50 is connected to the display 80 via communications channel 514.
  • the image processing engine 50 and the display 80 operate in essentially the same manner as described above with respect to Figure 2.
  • a fluorescent agent is injected into the subject and the endoscope 540 is inserted into a body cavity or incision.
  • the white light source 527 and the excitation source 523 are activated, image data are acquired, and the image data are processed, as described above.
  • the imaging systems described herein can be modified to include a laparoscope or other known surgical devices.
  • a laparoscope is typically inserted into an incision in the abdomen to provide access to an interior of the abdomen in a minimally invasive procedure.
  • the image data can be processed to account for motion during the surgery.
  • a vessel or tissue of interest, such as one that has been injected with a fluorescent substance, may move significantly during imaging.
  • the resulting processed fluorescent image consequently may be somewhat difficult to interpret because it may appear to depict multiple vessels due to the movement.
  • the image processing methods described herein can be modified as follows to address this issue.
  • the displacement or motion of the vessel or tissue of interest is detected.
  • the displacement value of the subject may be determined by a number of approaches.
  • the displacement value of the subject can be determined manually or automatically by detecting motion of one or more selected landmarks on the subject over a specified time period.
  • the distance between one or more selected landmarks can be determined before, during, and/or after the motion.
  • the types of methods used for automatically detecting motion based on comparing frames of data acquired at different times are well known in the art and include 2D cross-correlation, mathematical comparison (subtraction, division, multiplication, etc.) of the spatial or Fourier domain images, deformable image registration, feature segmentation and tracking, and the like.
  • the detection of each displacement results in a displacement value.
  • each displacement value determines the weighting factor (e.g., a number between 0 and 1) that is multiplied by the pixel values of the older fluorescence data.
  • the weighting factors can be applied to individual frames or to individual pixels, for example.
  • a look-up table is stored in the memory 58 that maps various displacement values to weighting factors. Generally, it will be advantageous to reduce the weighting factor as the displacement value increases, and vice versa.
  • if the displacement is large, the weighting factor is reduced, so that the persistence of the older data, which is derived from a substantially different location due to the motion, is reduced. Conversely, if the displacement is small, the weighting factor is larger, so that the relative persistence of the older data is increased. This feature can selectively reduce the persistence of the older data to a desired extent when there has been significant motion between the time of acquisition of the older data and the time of acquisition of the most recent data. A sketch of this displacement-based weighting follows this list.
  • the above-described method of applying the weighting operator based on a displacement value can be limited to displacement perpendicular to a vessel of interest.
  • a vessel of interest is identified by the surgeon, for example by identifying its longitudinal axis.
  • Image segmentation and feature selection methods used for this process are well known in the art and include but are not limited to atlas-based segmentation, vessel tree-tracking, and the like.
  • the weighting factors are then calculated based only on displacement perpendicular to the longitudinal axis of the vessel. This method provides the advantage of focusing the weighting operator modification so that it is applied only to perpendicular displacement of a vessel of interest.
  • the displacement values are used to change the color of the contrast image data (e.g., fluorescence data) used to generate the merged image.
  • the image processing engine 50 stores a look-up table that maps displacement values to specified colors.
  • the colors of the fluorescence data are modified based on displacement values, as illustrated in the sketch following this list. For example, a displacement value of more than two centimeters results in changing the fluorescence color to red; a displacement value between one and two centimeters results in changing the fluorescence color to yellow; a displacement value of less than one centimeter results in changing the fluorescence color to green, etc.
  • the resulting image therefore allows the surgeon to understand the extent of displacement of the fluorescence image data based on the color of the image displayed to the surgeon.
  • the image processing engine 50 may also provide the surgeon the option of erasing all historical fluorescence image data by entering a command during imaging. This option allows the surgeon to clear all historical fluorescence data from the merged image during surgery, in the event, for example, that the historical fluorescence data is difficult to interpret or otherwise not helpful to the surgeon.
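A minimal MATLAB sketch of the displacement-based weighting and coloring described in the items above; the displacement thresholds and weighting factors shown here are illustrative assumptions, not values fixed by this disclosure:
d = 1.4; % detected displacement value in centimeters
if d > 2, w = 0.2; % illustrative look-up: larger displacement,
elseif d > 1, w = 0.5; % smaller weighting factor, so that displaced
else, w = 0.9; end % older data persists less
Fold = rand(480, 640); % stand-in for an older frame of fluorescence data
Fold = w .* Fold; % reduce the persistence of the displaced older data
if d > 2, c = [1 0 0]; % red for more than two centimeters
elseif d > 1, c = [1 1 0]; % yellow for between one and two centimeters
else, c = [0 1 0]; end % green for less than one centimeter
Ftinted = cat(3, c(1)*Fold, c(2)*Fold, c(3)*Fold); % false-colored overlay data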

Abstract

The invention relates to image processing systems and methods for surgical applications. The method may comprise acquiring a set of frames of reflectance data, acquiring a set of frames of contrast data representing a contrast agent in the subject, applying a combination operator to the set of frames of contrast data to generate a combined contrast image, and generating a merged image based on the combined contrast image and at least one of the frames of reflectance data. The contrast data may comprise fluorescence emissions from an injected fluorescent substance. The combination operator may comprise a maximum intensity projection (MIP) routine. The system may comprise a first detector configured to acquire the frames of reflectance data, a second detector configured to acquire the frames of contrast data, and a processor configured to apply a combination operator to the set of frames of contrast data to generate a combined contrast image and generate a merged image.

Description

IMAGE PROCESSING SYSTEMS AND METHODS FOR SURGICAL APPLICATIONS
CROSS-REFERENCE OF RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Application No. 61/039,038, filed March 24, 2008, which is hereby incorporated by reference in its entirety.
FIELD OF THE INVENTION
[0002] The subject matter disclosed herein relates generally to medical imaging and more particularly to image processing systems and methods for surgical applications.
BACKGROUND
[0003] Various medical imaging systems and methods have been developed to assist surgeons in performing surgical procedures. Some imaging methods, for example, utilize an agent to highlight desired tissues during surgery. Fluorescence guided surgery is an example of a method involving injection of an agent into a subject to be imaged. After the fluorescent agent is injected into the subject, an excitation light source is applied to the subject to excite the fluorescent agent. The subject fluoresces in response to the excitation, and the resulting fluorescence emission is imaged to obtain information about the interior composition of the subject.
[0004] During surgery, surgeons endeavor to avoid inadvertently cutting and damaging tissues and vessels surrounding the area being treated. However, the surrounding tissues may not always be easily distinguishable. Accordingly, methods in fluorescence guided surgery can address such problems by illuminating various designated tissues and vessels. However, such systems and methods can be further improved.
[0005] For example, in fluorescence guided surgery, the region and/or vessel being illuminated fluoresces only over the time period during which the fluorescent substance remains in the region or vessel. Due to blood flow in the region, the fluorescent substance may be carried out of the region or through the vessel relatively quickly and before the surgical procedure has been completed, thereby making the patient susceptible to injury due to unintentional cutting or damage of the region or vessel that the surgeon desires to avoid. Further, numerous sets of image data may be acquired at different points in time during the surgery which may make it difficult for a surgeon to distinguish image data acquired earlier in time from image data acquired later in time.
[0006] Accordingly, image processing systems and methods that improve a surgeon's ability to identify regions or vessels of interest during surgery and/or to distinguish image data acquired earlier in time from image data acquired later in time are desirable.
SUMMARY
[0007] According to one embodiment, the invention relates to a method comprising acquiring a set of frames of reflectance data representing light reflected from a subject, acquiring a set of frames of contrast data representing a contrast agent in the subject, applying a combination operator to the set of frames of contrast data to generate a combined contrast image, and generating a merged image based on the combined contrast image and at least one of the frames of reflectance data.
[0008] According to another embodiment, the invention relates to a system comprising a first detector configured to acquire a set of frames of reflectance data representing light reflected from a subject, a second detector configured to acquire a set of frames of contrast data representing a contrast agent in the subject, and a processor configured to apply a combination operator to the set of frames of contrast data to generate a combined contrast image and generate a merged image based on the combined contrast image and at least one of the frames of reflectance data.
[0009] The contrast data, according to one embodiment, may comprise fluorescence emissions from a fluorescent contrast agent injected into a patient before or during surgery. The contrast data may also comprise absorption data, chemiluminescence data, and/or optical scatter data. The combination operator, according to one example, may comprise a maximum intensity projection (MIP) routine. The systems and methods can assist a surgeon in distinguishing selected tissues or vessels during a surgical procedure.
[00010] The invention also relates to an article of manufacture which comprises a computer usable medium having computer readable program code means embodied therein for causing a computer to execute the methods described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[00011] These and other features and aspects of exemplary embodiments of the invention will become better understood when reading the following detailed description with reference to the accompanying drawings, in which like characters represent like parts throughout the drawings, and wherein:
[00012] Figure 1 is a drawing of an image processing system for image guided surgery according to one embodiment of the present invention;
[00013] Figure 2 is a diagram illustrating the components of an image processing system that includes an excitation source and a fluorescence camera according to an exemplary embodiment of the invention;
[00014] Figure 3 is a flow chart showing a method for fluorescence guided surgery according to an exemplary embodiment of the invention;
[00015] Figure 4 is a diagram illustrating the method of Figure 3 according to an exemplary embodiment of the invention;
[00016] Figure 5 is a flow chart showing an example of a combination operator comprising a maximum intensity projection (MIP) routine according to an exemplary embodiment of the invention;
[00017] Figure 6 is a flow chart illustrating a method of generating and displaying a merged image according to an exemplary embodiment of the invention;
[00018] Figure 7 is a diagram illustrating an example of the frames of reflectance data and fluorescence data involved in generating a merged image according to an exemplary embodiment of the invention;
[00019] Figure 8 depicts examples of reflectance images, fluorescence images, a MIP image, a modified MIP image, and a merged image according to an exemplary embodiment of the invention;
[00020] Figure 9 is a diagram of another example of an imaging system comprising an absorption camera;
[00021] Figure 10 is a diagram of another example of an imaging system comprising a chemiluminescence camera;
[00022] Figure 11 is a diagram of an example of an image processing system comprising an endoscope with proximal fluorescence detectors; and
[00023] Figure 12 is a diagram of an example of an image processing system comprising an endoscope with distal fluorescence detectors.
[00024] While the drawings illustrate system components in a designated physical relation to one another or having electrical communication designation with one another, and process steps in a particular sequence, such drawings illustrate examples of the invention and may vary while remaining within the scope of the invention. For example, components illustrated within the same housing may be located within the same housing, merely in electrical communication with one another, or otherwise. Additionally, illustrated data flows are merely exemplary and any communication channel may be utilized to receive and transmit data in accordance with exemplary embodiments of the invention.
DETAILED DESCRIPTION
[00025] An imaging system for image guided surgery according to one embodiment of the present invention is shown in Figure 1. The imaging system 10 includes an imaging unit 20, an image processing engine 50, and a display 80. The imaging unit 20 may include one or more excitation light sources to excite a substance such as a fluorescent contrast agent or an absorbing contrast agent administered to the subject 18. The imaging unit 20 may also include a white light source to illuminate the subject 18 as well as detectors and other components that will be described further in connection with Figures 2 and 9-12. The detectors detect the emitted and reflected light from the subject 18 and send image signals to the image processing engine 50.
[00026] The image processing engine 50 shown in Figure 1 includes a computer or other processing device 52 to process the image signals received from the imaging unit 20. The image processing engine 50 is coupled to the imaging unit 20, as shown by the communication channel 12 in Figure 1. The image processing engine 50 also includes a human machine interface (HMI) 54 such as a keyboard, foot pedal, or control buttons, for example, which allows the surgeon or an assistant to control the imaging system 10. The image processing engine 50 may also include a display 56 which may be used primarily for controlling the image processing engine 50. Also shown in Figure 1 is a display 80, connected to the image processing engine 50 through communication channel 14, that may be used by the surgeon primarily for displaying an image of the subject 18 during surgery, such as a fluorescence image merged with a reflectance image of the subject 18. During surgery, the surgeon positions the imaging unit 20 to illuminate the subject 18 and to acquire reflectance images and contrast images, such as fluorescence, absorption, or chemiluminescent images, of the subject 18. The image processing engine 50 is configured to process the image signals acquired by the imaging unit 20 and to display for the surgeon on the display 80 a merged image to assist the surgeon in visualizing the area to be treated and in discriminating certain tissues and vessels during surgery. The fluorescence, absorption, or chemiluminescent images may be referred to as "contrast images" because they are typically achieved using a contrast agent administered to the subject prior to or during surgery. Examples of such contrast agents are well known in the art and are disclosed, for example, in U.S. Patent No. 6,436,682 entitled "Luciferases, fluorescent proteins, nucleic acids encoding the luciferases and fluorescent proteins and the use thereof in diagnostics, high throughput screening and novelty items"; P. Varghese, A.T. Abdel-Rahman, S. Akberali, A. Mostafa, J.M. Gattuso, and R. Carpenter, "Methylene Blue Dye - A Safe and Effective Alternative for Sentinel Lymph Node Localization," Breast J. 2008 Jan-Feb;14(1):61-7, PMID: 18186867 (PubMed - indexed for MEDLINE); F. Aydogan, V. Celik, C. Uras, Z. Salihoglu, and U. Topuz, "A Comparison of the Adverse Reactions Associated with Isosulfan Blue Versus Methylene Blue Dye in Sentinel Lymph Node Biopsy for Breast Cancer," Am. J. Surg. 2008 Feb;195(2):277-8, PMID: 18194680 (PubMed - indexed for MEDLINE); and as commercially available products such as Isosulfan Blue or Methylene Blue for tissue and organ staining. Additional examples of suitable contrast agents are disclosed in U.S. Application No. 12/054,214, attorney docket number 60497.000067, filed on the same day as the present application, entitled "Systems and Methods for Optical Imaging," which is incorporated herein by reference in its entirety.
[00027] Figure 2 shows the components of an exemplary imaging system including a fluorescence camera. As shown in Figure 2, the imaging unit 120 includes at least one excitation source 123, a white light source 127, a lens 124, a beam splitter 126, a fluorescence camera 128 and a video camera 130 according to one embodiment of the invention. The excitation source 123 transmits light of a preselected wavelength or wavelength range to the subject 18. The wavelength or wavelength range, which may or may not be visible, is chosen to excite the fluorescent substance in the subject 18. The fluorescent substance, e.g., a fluorescent contrast agent, may be injected into the subject prior to or during the surgery or may be endogenous. The excitation source 123 may be any light source that emits an excitation light capable of causing a fluorescent emission from the fluorescent substance. This may include light sources that comprise light emitting diodes, laser diodes, lamps, and the like. The imaging unit 120 may also include a white light source 127 for illuminating the subject 18. However, the surgeon may also simply rely on ambient light to illuminate the subject 18. The excitation source 123 and the white light source 127 may each comprise a multitude of light sources and combination of light sources, such as arrays of light emitting diodes (LEDs), lasers, laser diodes, lamps of various kinds, or other known light sources. The white light source 127 may comprise an incandescent, halogen, or fluorescence light source, for example. The white light source typically has, either alone or in combination with the excitation source, a correlated color temperature of 2800-10,000 degrees Kelvin, more preferably 3000-6700 degrees Kelvin, and a color rendering index (CRI) of 10-100, more preferably 85-100.
[00028] The fluorescence emission and the visible light reflected from the subject 18 are received through a lens 124 and then propagate to a beam splitter 126. The lens 124 collects light and focuses an image onto the cameras 128 and 130. The beam splitter 126 splits the image information into different paths either spectrally, with the use of dichroic filters for example, or by splitting the image with a partially reflective surface. The beam splitter 126 divides the fluorescence emission from the remainder of the light. The fluorescence emission propagates through a filter 129 and then to the fluorescence camera 128. The filter 129 is configured to reject the reflected visible and excitation light from being detected by the fluorescence camera 128 while allowing the emitted fluorescent light from the subject 18 to be detected by the fluorescence camera 128. Typically, the white light source 127 is configured so that it generates little or no light in the emission band of the fluorescent substance. The fluorescence camera 128 may be any device configured to acquire a fluorescence image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, and the like. The fluorescence camera 128 receives the filtered fluorescence emission and converts it to a signal that is transmitted via the communications channel 112 to the image processing engine 50. The remainder of the light passes through a filter 131 and then to a video camera 130. The filter 131 rejects the excitation light and the fluorescence emission light to allow for accurate representation of the visible reflected light image. The video camera 130 may be any device configured to acquire a reflectance image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, and the like. The video camera 130 receives the filtered reflected light and converts it to a signal or image that is transmitted via the communications channel 112 to the image processing engine 50. The communications channel 112 may be any known channel, such as a wired or wireless channel. The fluorescence camera 128 and the video camera 130 may be referred to as "detectors" and may be digital or analog, for example.
[00029] The image processing engine 50 receives video signals from the imaging unit 120 through the communications channel 112 and processes the signals. The image processing engine 50 includes a processor 52 that executes various image processing routines as is described further herein. The image processing engine 50 also includes a memory 58 for storing, among other things, image data and various computer programs for image processing. The memory 58 may be provided in various forms, such as RAM, ROM, hard drive, flash drive, etc. The memory 58 may comprise different components for different functions, such as a first component for storing computer programs, a second component for storing image data, etc. The image processing engine 50 may include hardware, software or a combination of hardware and software. The image processing engine 50 is programmed to execute the various image processing methods described herein.
[00030] The image processing engine 50 also includes a human machine interface (HMI) 54, such as a keyboard, foot pedal or control buttons and a display 56, and/or other input/output devices that may be used primarily for input/output between the user and the imaging system 110. The HMI may also comprise control buttons on the imaging unit 120, or other desired control mechanism, for example. Prior to surgery, the surgeon or an assistant may enter various parameters through the HMI 54 to define and control the imaging method that will occur during surgery. The surgeon or an assistant may also modify the imaging method during surgery or input various commands during surgery to refine the images being displayed. The display 56 may be used for controlling the imaging system 10, and it may also be used to display images of the subject 18. The displays 80 and 56 may present the same data or they may present different data.
[00031] The display 80 is typically used to display images of the subject 18 for the surgeon. The display 80 may comprise a larger screen that is positioned closer to the surgeon so that the surgeon can view the images easily during surgery. The display 80 may comprise an LCD display, a plasma display, an HDTV display, or other high resolution device, for example. The display 80 may also include controls allowing the surgeon to adjust brightness, contrast, gain, or other parameters of the image.
[00032] Those skilled in the art will appreciate that various configurations of displays, input/output devices, processors, and memory, can be utilized to process and display images for the surgeon according to various embodiments of the invention. For example, in some embodiments, the hardware may include one or more digital signal processing (DSP) microprocessors, application-specific integrated chips (ASIC), field programmable gate arrays (FPGA), and the like. In some embodiments, the software may include modules, submodules and generally computer program products containing computer code that may be executed to perform the image processing methods of the embodiments of the invention. The memory 58 may be any type of memory suitable for storing information that may be accessed by the processor 52 for performing image processing. Information stored in the memory 58 may include one or more previously acquired image data sets, computer code for executing image processing routines, and/or any other information accessed by the processor 52. The memory 58 may comprise, for example, a hard drive, flash drive, random access memory (RAM), and/or read-only memory (ROM). In various embodiments, the memory 58 stores computer executable code executed by the processor 52. In various embodiments, the memory 58 may also include information descriptive of the subject 18 on which fluorescence guided surgery is being performed.
[00033] The imaging system is configured to execute various imaging methods according to exemplary embodiments of the invention. The methods typically involve acquiring frames of image data at different points in time. According to one embodiment, the frames of image data include reflectance data and fluorescence data. The reflectance data sets and the fluorescence data sets may be used to generate a merged image in which the fluorescence image data are overlaid onto the reflectance image data. The merged image assists the surgeon in visualizing certain tissues which emit fluorescent light during surgery.
[00034] As used herein, the term "reflectance image data" refers to image data representing a reflection of light from a surface. The term "fluorescence image data" refers to image data representing a fluorescent emission from a subject. The fluorescent emission is typically achieved with a fluorescent contrast agent administered to the subject prior to or during surgery.
[00035] An exemplary method for fluorescence guided surgery will now be described with reference to Figures 3 and 4. Figure 3 is a flow chart showing the exemplary method. The method begins with block 148 in which a fluorescent contrast agent is administered to the subject. In block 150, a set of single frames of reflectance data and fluorescence data are acquired with the imaging system 110. In block 152, the individual frames may be modified with various image processing methods, such as adjustments of brightness, color and gamma. In block 154, weighting operators may be applied to a set of frames of fluorescence data. In block 156, a combination operator is applied to the set of frames of fluorescence data to generate a combined fluorescence image. In block 158, the combined fluorescence image may be modified with various image processing routines. In block 160, the combined fluorescence image is merged with one or more reflectance images to generate a merged image. In block 162, the merged image may be displayed on the display 80. In block 164, a determination is made as to whether imaging is continuing. If so, the process returns to block 150 where additional frames of image data are acquired. According to some embodiments of the invention, the method may be carried out with three primary steps (150, 156, 160), and may include one or more of the other steps illustrated in Figure 3.
[00036] Figure 4 is a diagram that shows the method of Figure 3 in more detail. The exemplary method typically includes three primary steps (150, 156, 160) and may include one or more of the other steps, as illustrated. As shown in Figure 4, step 150 comprises acquiring a set of single frames of reflectance image data and fluorescence image data. Step 156 comprises applying a combination operator to the fluorescence data to create a combined fluorescence image. Step 160 comprises creating a merged image based on the combined fluorescence image and one or more of the reflectance images. These steps are carried out by the imaging unit 120 and the image processing engine 50 shown in Figure 2.
[00037] In the example of Figure 4, the imaging system 110 acquires a set of six frames of reflectance data and fluorescence data. The imaging system 110 is typically configured to acquire a reflectance image data set and a fluorescence image data set at frame rates between 1 and 60 frames per second, and preferably between 15 and 30 frames per second. Alternatively the fluorescence data set and reflectance data set can be acquired at independently controlled frame rates. The fluorescence data set is acquired with the fluorescence camera 128, the reflectance data set is acquired with the video camera 130, and the data sets are stored in the memory 58 of the image processing engine 50. Each pixel of image data typically includes an intensity value and one or more values to specify a color. Although the value of the pixel is typically an intensity value, the value may also include other characteristics such as fluorescence lifetime, quantum yield, absorption, and the like depending on the particular nature of the imaging system.
[00038] After the frames of fluorescence data have been acquired, they may be modified individually, as shown in step 152 of Figure 4. The modifications are executed by the image processing engine 50. For example, the brightness, contrast, and gamma (BCG), window and level may be modified according to a lookup table stored in the memory 58, as is known in the art. Also shown in step 152, a thresholding step can be applied to individual frames of fluorescence data. For example, a thresholding step can be applied by setting all pixel values below a specified threshold value to zero. The inclusion of the thresholding step eliminates a certain subset of the fluorescence data set values below a specified threshold value. This feature provides a thresholded fluorescence data set that can make it easier for the surgeon to see the highlighted fluorescence portions as compared with any background fluorescence, such as endogenous fluorescence. The thresholding step can also be applied by setting all pixel values above a specified threshold value to a preselected maximum value.
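A minimal MATLAB sketch of this thresholding step, assuming a single fluorescence frame F with values scaled to [0, 1] and purely illustrative threshold values:
F = rand(480, 640); % stand-in for one acquired frame of fluorescence data
Tlow = 0.2; % illustrative lower threshold value
Tmax = 0.9; % illustrative preselected maximum value
F(F < Tlow) = 0; % suppress background and endogenous fluorescence
F(F > Tmax) = Tmax; % clip bright pixels to the preselected maximum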
[00039] Known filtering methods can also be applied to individual frames of fluorescence data in step 152 of Figure 4. For example, known filtering operations such as blurring, sharpening, edge finding, low pass, band pass, high pass, Fourier, notch, despeckling, speckling, and other filtering operations can be applied.
[00040] As shown in step 154 of Figure 4, a weighting operator can be applied to a set of frames of fluorescence data prior to application of the combination operator in step 156. In Figure 4, six frames are shown as an example. The weighting operator step 154 may comprise applying color changes to different frames and/or pixels of fluorescence data and/or applying weighting factors to the intensity values in different frames and/or pixels of fluorescence data (e.g., multiplying the intensity values by weighting factors between 0 and 1). The color changes and/or weighting factors may be selected based on the age of the frame. For example, the method may comprise changing the color of the older frames to red and changing the color of the newer frames to green. In this way, the fluorescent emission in the newest frames is more prominently displayed in green and the fluorescent emission in the older frames is less prominently displayed in red. The color of the older fluorescence data is manipulated to assist the surgeon in determining the age of the fluorescence data, e.g., discriminating the newest fluorescence data from the older fluorescence data. As another example, one or more weighting factors may be applied to the intensity values in the frames of fluorescence data. For example, a weighting factor of 1.0 may be multiplied by the pixel values in the newest frame of fluorescence data, a weighting factor of 0.9 applied to the next newest frame, a weighting factor of 0.8 applied to the next newest frame, and so on. The processor 52 multiplies the weighting factors by each pixel intensity value and the resulting weighted frames are stored in the memory 58. In this way, the older fluorescence image data, when displayed, will have a lower intensity than the newer fluorescence data because it is discounted by the weighting factor. This effect is illustrated in Figure 4. In particular, the first weighted frame of fluorescence data that is input to the combination operator in step 156 has a lower intensity than the second weighted frame of fluorescence data, which has a lower intensity than the third weighted frame of fluorescence data, and so on.
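A minimal MATLAB sketch of this age-based weighting, assuming six frames of fluorescence data stacked along the third dimension of an array F with the newest frame last; the decrement of 0.1 per frame follows the example above:
F = rand(480, 640, 6); % stand-in stack: oldest frame first, newest last
w = 0.5:0.1:1.0; % weighting factors: 1.0 for the newest, 0.9 next, and so on
for k = 1:size(F, 3)
    F(:,:,k) = w(k) .* F(:,:,k); % discount older frames more heavily
end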
[00041] Figure 4 depicts the application of the combination operator in step 156. The combination operator is a routine that receives as input at least two frames of fluorescence data and outputs a single combined fluorescence image. As shown in Figure 4, the combination operator may utilize a number of routines based on, for example, maximum pixel values, minimum pixel values, the sum of pixel values, the difference or subtraction of pixel values, the average of pixel values, the derivative of pixel values, and cross correlation of pixel values. The combination operator utilizes one or more of these routines to convert the multiple input fluorescence data sets to a single combined fluorescent image data set.
[00042] According to one embodiment of the invention, the combination operator uses a routine based on maximum pixel values. This routine may be referred to as a maximum intensity projection (MIP) routine. The MIP routine is applied to the fluorescence data prior to the fluorescence data being displayed in order to assist the surgeon in visualizing the fluorescent regions of the subject 18. The reflectance image data is generally displayed without MIP processing. [00043] The MIP routine, which is one embodiment of the Combination Operator (156) depicted in Figure 4, typically includes the steps of comparing two or more input image data sets and generating a MIP data set, which is one embodiment of the "Combined Fluorescence Image" depicted in Figure 4. According to one embodiment, with two input data sets, the MIP data set is generated by comparing the value of a pixel from one of the input data sets to the value of a corresponding pixel from the other input data set and taking the higher value. The higher value is stored in the MIP data set for the corresponding pixel. The process is repeated for each pixel in the input data sets. The MIP data set thus includes, for each pixel, the higher value from one of the input data sets. For example, if the values of the first five pixels in data set 1 are: (4, 6, 7, 4, 2, . . .) and the values of the corresponding first five pixels in data set 2 are: (1, 5, 8, 3, 3, . . .), the MIP data set would have the following values for its corresponding first five pixels: (4, 6, 8, 4, 3, . . .). The MIP routine can have any number of input data sets, such as the six input fluorescence data sets shown in Figure 4.
[00044] Figure 5 is a flowchart showing an example of the MIP routine for n input data sets. The MIP routine is carried out by the image processing engine 50. In particular, the processor 52 performs the mathematical operations, and the memory 58 stores the image data and the program or programs used by the processor 52 to execute the blocks of the flow chart of Figure 5. In block 170, the value of a pixel P_ij from data set 1 is read. The value of the pixel is typically an intensity value, but the value may also include other characteristics as noted above. In block 171, the value of the corresponding pixel P_ij from data set 2 is read. This process is repeated for each remaining input data set, up to data set n in block 172. In block 173, the values are compared by the processor 52. In block 174, the highest value is stored in the memory 58 in a MIP data set. In block 175, it is determined by the processor 52 whether there are any remaining pixels to compare in the input data sets. If yes, the process returns to block 170 where the next pixel is read in each input data set. If no, the process ends.
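A loop-based MATLAB sketch mirroring the flow chart of Figure 5, assuming the n input fluorescence data sets are stacked in a three-dimensional array D; in practice the vectorized form given later in this description (a maximum along the third dimension) would be preferred:
D = rand(480, 640, 6); % n = 6 stand-in input data sets
[rows, cols, n] = size(D);
MIP = zeros(rows, cols); % output MIP data set
for i = 1:rows
    for j = 1:cols
        MIP(i, j) = D(i, j, 1); % read pixel P_ij from data set 1
        for k = 2:n % read the corresponding pixel from data sets 2..n
            if D(i, j, k) > MIP(i, j) % compare the values
                MIP(i, j) = D(i, j, k); % store the highest value
            end
        end
    end
end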
[00045] According to exemplary embodiments of the invention, the MIP routine is used to process the image data from the fluorescence emission. In particular, the MIP routine can capture the brightest fluorescence emissions during a specified time period, over a number of frames of image data, so that the MIP image shows the fluorescence substance as it has traveled through a region of the subject over time. As shown in step 156 of Figure 4, the MIP ("Max") routine is applied to six frames of fluorescence data. The result is a combined fluorescence image that contains, for each pixel, the maximum value from the corresponding pixels in each of the six input data sets. With this process, over time, the MIP data set is updated to include the maximum values from a specified number of previous frames of fluorescence data. The result may be, for example, that if a fluorescent substance passes through the region of interest relatively quickly, for example as a fluorescent bolus traveling through a designated vessel, the MIP routine will generate an image where the entire vessel is highlighted rather than just one portion of the vessel that contains the bolus at a particular time. The highlighting of the entire vessel can assist the surgeon in visualizing and identifying the vessel during surgery.
[00046] Referring again to Figure 4, in addition to the MIP ("Max") routine, there are a number of other routines in step 156 that may be utilized and executed by the processor 52 in an analogous manner to the MIP routine. For example, the processor 52 may be programmed to compare the corresponding pixel values from each input fluorescence data set and store the minimum, rather than the maximum, pixel value in the combined fluorescence image data set in the memory 58. The processor 52 may be programmed to add all the values from each corresponding pixel in the input data sets and store the sum in the combined fluorescence image data set in the memory 58. The processor 52 may be programmed to subtract specified corresponding pixel values and store the difference in the combined fluorescence image data set in the memory 58. The processor 52 may be programmed to calculate the average value of each corresponding pixel in the input data sets and store the average value in the combined fluorescence image data set in the memory 58. The processor 52 may be programmed to calculate a derivative value based on corresponding pixel values in the input data sets and store the derivative value in the combined fluorescence image data set in the memory 58. The processor 52 may be programmed to calculate a cross correlation value based on the input fluorescence image data sets and to store the calculated value in the combined fluorescence image data set in the memory 58. These operations can be represented, respectively, by the following equations:
[00047] P_ij^C = max([P_ij^1, ..., P_ij^N]) for all i, j.
[00048] P_ij^C = min([P_ij^1, ..., P_ij^N]) for all i, j.
[00049] P_ij^C = sum([P_ij^1, ..., P_ij^N]) for all i, j.
[00050] P_ij^C = difference([P_ij^1, ..., P_ij^N]) for all i, j.
[00051] P_ij^C = average([P_ij^1, ..., P_ij^N]) for all i, j.
[00052] P_ij^C = derivative([P_ij^1, ..., P_ij^N]) for all i, j.
[00053] P^C = xcorr(P^1, ..., P^N), for 2D cross correlation, where P_ij^k denotes the value of pixel (i, j) in the k-th input data set and P^C denotes the combined fluorescence image.
[00054] Given an array M of two-dimensional image frames stacked along its third dimension, the previous operations can be implemented in MATLAB using the following commands:
Pmax = max(M, [], 3);
Pmin = min(M, [], 3);
Psum = sum(M, 3);
Pdiff = M(:,:,2) - M(:,:,1);
Pavg = sum(M, 3)./size(M, 3);
Pder = diff(M, 1, 3);
P = xcorr2(M(:,:,1), N), where N is another two-dimensional image frame. MATLAB is a well-known programming language available commercially from The MathWorks, Inc.
[00055] Once the combined fluorescence image data set has been calculated, it may be modified in a number of ways, as shown in Figure 4. The modifications are executed by the image processing engine 50. For example, as shown in step 158, the brightness, contrast, and gamma (BCG), window and level may be modified according to a lookup table stored in the memory 58, as is known in the art. Also shown in step 158, a thresholding step can be applied to the combined fluorescence image data set. For example, a thresholding step can be applied by setting all pixel values below a specified threshold value to zero, and/or by setting all pixel values above a specified threshold value to a maximum value. The thresholding step can be used, for example, to remove data representing background or endogenous fluorescence emissions. Known filtering methods can also be applied to the combined fluorescence image data set. For example, known filtering operations such as blurring, sharpening, edge finding, low pass, band pass, high pass, Fourier, notch, despeckling, speckling, and other filtering operations can be applied.
[00056] In addition, a false coloring method can be applied to the combined fluorescence image data set in step 158. In generating the merged image, it may be helpful to the surgeon to change the color of the fluorescence image to a predetermined color that is more easily visible than the naturally occurring fluorescence emission color which may be difficult to distinguish or even invisible. The false coloring method can be carried out by the image processing engine 50. A lookup table may be stored in the memory 58 that maps color values according to a specified desired routine. For example, the combined fluorescence image data set may be modified so that the color of the emission, even if initially invisible, is changed to bright green to be clearly visible by a surgeon. Any desired color can be used.
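A minimal MATLAB sketch of such a false coloring step, assuming Pmax from the commands above serves as the combined fluorescence image and mapping its intensities to bright green:
G = Pmax - min(Pmax(:)); % shift so the minimum value is zero
G = G / max(G(:)); % scale intensities to [0, 1]
falseColor = zeros([size(G), 3]); % black RGB canvas
falseColor(:,:,2) = G; % route all intensity into the green channel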
[00057] Referring again to Figure 4, after the modification step 158, a merge step 160 is executed, according to an exemplary embodiment of the invention. The merge step 160 comprises merging the modified, combined fluorescence image data set with one or more frames of reflectance data. The modified fluorescence image is overlaid onto one or more frames of reflectance data acquired by the video camera 130. This merge step, like the other image processing steps, is carried out with the image processing engine 50. In the MIP embodiment, the MIP data (combined fluorescence image) can be merged with one of the frames of reflectance data, typically the most recent frame of reflectance data, to generate a merged image.
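A minimal MATLAB sketch of the merge step, assuming a color reflectance frame R with values in [0, 1] and the false-colored overlay falseColor from the previous sketch at matching dimensions; the per-channel maximum used here is one plausible overlay rule and is not mandated by this disclosure:
R = rand([size(G), 3]); % stand-in for the most recent reflectance frame
merged = max(R, falseColor); % superimpose the green overlay onto the scene
image(merged); % present the merged image on the surgeon's display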
[00058] Although the combined image is typically generated using the most recent frame of reflectance data, an older frame of reflectance data may also be used. For example, if the MIP data set is more closely matched to an older frame of reflectance data, the older frame may be used. According to an exemplary embodiment of the invention, one or more landmarks in the MIP data set are compared with corresponding landmarks in the reflectance data sets to identify which reflectance data set is the closest match to the MIP data set so that the identified reflectance data set is used to generate the combined image. The particular reflectance data that is used may be selected automatically by mutual information cross correlation, or other anatomical segmentation and registration techniques known in the art. Alternatively, the reflectance data chosen may be a function of the parameters of the MIP data set collection. For example, if the MIP data set is generated with 15 frames at 15 frames per second, the reflectance data corresponding to the eighth fluorescence data frame may be selected. Alternatively, the reflectance image may be selected manually by the surgeon or operator.
[00059] The combined fluorescence image data set may also be merged with one or more additional frames of reflectance data. For example, the MIP data set, once generated, may provide a good image of the particular vessel of interest. The MIP data set, therefore, may be generated early in the procedure and then used throughout the surgery by combining it with subsequently generated frames of reflectance data as they are acquired.
[00060] Referring again to Figure 4, in step 162 the merged image is displayed for the surgeon on the display 80. Alternatively, the combined fluorescence image may be displayed alone, without the reflectance image.
[00061] One aspect of the invention relates to the number of frames of fluorescence data that is used to generate the combined fluorescence image. According to one embodiment of the invention, the MIP routine is applied by the image processing engine 50 to all historical fluorescence data sets from the beginning of the surgical procedure. According to this embodiment, a first frame of fluorescence data and a first frame of reflectance data are acquired. Next, a second frame of fluorescence data and a second frame of reflectance data are acquired. The MIP routine is then applied to the first and second frames of fluorescence data to generate a MIP data set. The MIP data set is then stored in the memory 58 so that it can be used in the next iteration of the process. In the next step, it is determined whether the imaging processing is continuing. If yes, the process returns to acquisition of another (third) frame of fluorescence data and another (third) frame of reflectance data. The MIP routine is then applied again to the previously generated, stored MIP data set and the third frame of fluorescence data to generate a new MIP data set. The merged image is generated by the image processing engine 50 and displayed for the surgeon on the display 80 using the new MIP data set and a frame of reflectance data, typically the most recent (third) frame of reflectance data. The process is repeated for each subsequent frame of acquired data. The MIP data set therefore represents all the fluorescence data acquired starting at the beginning of the surgical procedure.
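A minimal MATLAB sketch of this running update, assuming frames arrive one at a time; each new frame is folded into the stored MIP data set by a per-pixel maximum:
MIP = zeros(480, 640); % running MIP data set retained in the memory 58
for t = 1:100 % stand-in acquisition loop over the procedure
    Fnew = rand(480, 640); % stand-in for the newest fluorescence frame
    MIP = max(MIP, Fnew); % fold the newest frame into the stored MIP
    % at each iteration the merged image would be generated from MIP
    % and the newest reflectance frame, then displayed for the surgeon
end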
[00062] According to other embodiments of the invention, the combination operator is applied to a specified number of frames (e.g., 3 frames) of fluorescence data, rather than all previous frames of fluorescence data. The use of a specified number of frames, rather than all frames, can enhance the quality of images displayed for the surgeon. This process is illustrated in Figures 6 and 7. As shown in Figure 6, the process is conducted as follows. In step 180, a contrast agent is administered to the subject. In step 181, a number of initial frames (e.g., two frames) of reflectance data and fluorescence data are acquired. Next, a third frame of fluorescence data and a third frame of reflectance data are acquired in step 182. The MIP routine is then applied to the latest m frames of data in step 183, in this case the latest three frames of data, to generate a MIP data set. The MIP data set is merged with one of the frames of reflectance data, typically the most recent (third) frame of reflectance data, to generate a merged image in step 184. The merged image is displayed for the surgeon in step 185. Fluorescence data sets are then stored in the memory 58 in step 186 so that they can be used in the next iteration of the process. In step 187, it is determined whether the imaging processing is continuing. If so, the process returns to step 182 for acquisition of another (fourth) frame of fluorescence data and another (fourth) frame of reflectance data. The MIP routine is then applied again in step 183 to the latest three frames of fluorescence data (the second, third and fourth frames) to generate a new MIP data set. The merged image is generated and displayed for the surgeon in steps 184 and 185 using the new MIP data set and a frame of reflectance data, typically the most recent (fourth) frame of reflectance data. The process is repeated for each subsequent frame of acquired data.
[00063] Figure 7 is another illustration of this exemplary process, showing the frames that are used to generate the combined image. As illustrated in Figure 7, the MIP routine is applied to the three most recent frames of fluorescence data and the older frames of fluorescence data are not used in generating the MIP data set. In this process, the MIP routine is carried out by reading the values from corresponding pixels in the most recent three frames of data, comparing the values, and writing the highest value to the corresponding pixel in the MIP data set, which is stored in the memory 58. The MIP data set contains a certain amount of historical fluorescence data, but does not contain all historical fluorescence data. The number of frames can be selected based on the extent of historical data that is desired. As in previous examples, a merged image is generated with the reflectance data set and the MIP data set. The merged image assists the surgeon in visualizing the area of interest (reflectance image) combined with fluorescence data processed according to the MIP routine. This embodiment may be used to reduce or prevent saturation over time, since only a limited number of previous fluorescence data sets are used.
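A minimal MATLAB sketch of this sliding-window variant, assuming m = 3 recent frames held in a circular buffer so that older frames drop out of the MIP data set:
m = 3; % number of most recent frames used (per the example above)
buf = zeros(480, 640, m); % circular buffer of fluorescence frames
for t = 1:60 % stand-in acquisition loop
    slot = mod(t - 1, m) + 1; % index of the oldest slot to overwrite
    buf(:,:,slot) = rand(480, 640); % stand-in for the newest frame
    if t >= m
        MIPwin = max(buf, [], 3); % MIP over only the latest m frames
    end
end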
[00064] Figure 8 is an illustration of an application of the MIP algorithm to a surgical site. The five photographs on the top row of Figure 8 represent five frames of reflectance data. The images in the next row down represent five frames of fluorescence data. The photograph in the bottom left corner represents the MIP data set for five frames of fluorescence data. The image in the bottom center represents the MIP data set after it has been enhanced with respect to brightness, contrast, gain and false coloring. In the bottom center image, the brightness, contrast and gamma were modified to visually enhance the fluorescence components of interest and to minimize the background contributions. Then, this gray-scale image was false colored in bright green to permit visible contrast when merged with the color reflected light image. The image in the bottom right of Figure 8 is the merged image in which the false colored MIP data set has been superimposed onto a frame of reflectance data. In the merged image, the MIP data set shows a highlighted bolus over a period of time, which enables the surgeon to see not just one snapshot of the bolus, but rather a much larger portion of the vessel of interest.
[00065] Although the discussion thus far has focused largely on an example of a system that uses fluorescence data and a MIP routine, other embodiments of the invention comprise different image processing methods and different hardware. For example, as discussed above in connection with step 156 of Figure 4, the combination operator in addition to a MIP ("Max") routine, may also be programmed to execute routines based on minimum pixel values, the sum of pixel values, a difference of pixel values, an average of pixel values, a derivative of pixel values, and/or a cross correlation of pixel values. The processor 52 may be programmed, depending on the desired application, to execute any one or more of these routines.
[00066] In addition, the imaging system can be designed to acquire types of contrast image data other than fluorescence image data. The other types of contrast image data may include, for example, absorption data, chemiluminescence data, bioluminescence data, and scatter data. Chemiluminescence generally refers to the emission of light as a result of a chemical reaction. Bioluminescence generally refers to the production and emission of light by a living organism as the result of a chemical reaction. Bioluminescence is generally considered a subset of chemiluminescence. In general, the image data other than the reflectance data may be referred to as "contrast data" because it is typically achieved using a contrast agent.
[00067] Figure 9 is a diagram of an absorption-based system 210. As shown in Figure 9, many of the components are the same as the components in the fluorescence system depicted in Figure 2. Referring to Figure 9, the absorption camera 228 replaces the fluorescence camera 128 of Figure 2. As shown in Figure 9, the imaging unit 220 includes at least one excitation source 223, a white light source 227, a lens 224, a beam splitter 226, an absorption camera 228 and a video camera 230 according to one embodiment of the invention. The excitation source 223 transmits light of a preselected wavelength or wavelength range to the subject 18. The wavelength or wavelength range, which may or may not be visible, is chosen to target the absorbing substance in the subject 18, which may be injected into the subject prior to or during the surgery or may be endogenous. The excitation source 223 may be any light source that emits an excitation light capable of being absorbed by the absorbing substance. The excitation source 223 may include light sources that use light emitting diodes, laser diodes, lamps, and the like. The imaging unit 220 may also include a white light source 227 for illuminating the subject 18. However, the surgeon may also simply rely on ambient light to illuminate the subject 18. The excitation source 223 and the white light source 227 may each comprise a multitude of light sources and combination of light sources, such as arrays of light emitting diodes, lasers, laser diodes, lamps of various kinds, or other known light sources. The white light source 227 may comprise an incandescent, halogen, or fluorescence light source, for example. The white light source 227 and the excitation source 223 may comprise the same source.
[00068] The absorption spectrum and the visible light reflected from the subject 18 are detected through a lens 224 and then propagate to a beam splitter 226. The lens 224 collects light and focuses an image onto the cameras 228 and 230. The beam splitter is capable of splitting the image information into different paths either spectrally, with the use of dichroic filters for example, or by splitting the image with a partially reflective surface. The beam splitter 226 divides the absorption spectrum from the remainder of the light. The absorption spectrum travels through a filter 229 and then to an absorption camera 228. The excitation light should not be blocked by the filter 229 in front of the absorption camera 228. The filter 229 is selected to allow transmission from a particular wavelength band of interest. The filter 231 may be selected to choose a second wavelength band of interest, or to broadly accept visible light to provide a reflected light image on camera 230.
[00069] The absorption camera 228 may be any device configured to acquire an absorption image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, or the like. The absorption camera 228 receives the absorption spectrum and converts it to a signal that is transmitted to the image processing engine 50. The remainder of the light passes through the filter 231 and then to the video camera 230. The video camera 230 may be any device configured to acquire a reflectance image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, or the like. The video camera 230 receives the filtered reflected light and converts it to a signal or image that is transmitted to the image processing engine 50.
[00070] The imaging system can also be designed to acquire chemiluminescent (including bioluminescent) light. Figure 10 is a drawing of chemiluminescent system 310. As shown in Figure 10, many of the components are the same as the components in the fluorescence system depicted in Figure 2. Referring to Figure 10, the chemiluminescence camera 328 replaces the fluorescence camera 128 of Figure 2. As shown in Figure 10, the imaging unit 320 includes an optional illumination source 327, a lens 324, a beam splitter 326, a chemiluminescence camera 328, and a video camera 330 according to one embodiment of the invention. The illumination source 327 transmits light of a preselected wavelength or wavelength range to the subject 18. The wavelength or wavelength range, which may or may not be visible, is chosen to provide visible light to illuminate the subject 18, but substantially devoid of wavelengths in the acceptance band imaged by the luminescence camera 328, as defined by the characteristics of filter 329 and the beam splitting element 326.
[00071] The chemiluminescent light and the visible light reflected from the subject 18 are detected through a lens 324 and then propagate to a beam splitter 326. The lens 324 collects light and focuses an image onto the cameras 328 and 330. The beam splitter is capable of splitting the image information into different paths either spectrally, with the use of dichroic filters for example, or by splitting the image with a partially reflective surface. The beam splitter 326 divides the chemiluminescent emission from the remainder of the light. The chemiluminescent emission travels through a filter 329 and then to a chemiluminescence camera 328. The filter 329 is typically designed to block the illumination light from the illumination source 327 if the illumination source 327 is on when the chemiluminescence camera 328 acquires the image data. The illumination source 327 should not have light in the emission band of the chemiluminescent material at the time when the chemiluminescence camera 328 is acquiring image data.
[00072] The chemiluminescence camera 328 may be any device configured to acquire a chemiluminescent image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, or the like. The chemiluminescence camera 328 receives the chemiluminescent emission and converts it to a signal that is transmitted to the image processing engine 50. The remainder of the light passes to a video camera 330. The video camera 330 may be any device configured to acquire a reflectance image data set, such as a charge coupled device (CCD) camera, a photo detector, a complementary metal-oxide semiconductor (CMOS) camera, and the like. The video camera 330 receives the reflected light and converts it to a signal or image that is transmitted to the image processing engine 50.
[00073] Other embodiments of the invention include an endoscope. The endoscope may be used in endoscopic surgery, which can be significantly less invasive as compared with the open surgery depicted in Figure 1. The endoscope is inserted into a small incision in the patient, as known in the art, to minimize the invasive nature of the surgery. Endoscopes are commercially available from a number of manufacturers, such as Stryker and Karl Storz, for example.
[00074] According to exemplary embodiments of the invention, the endoscope may include proximal detectors or distal detectors. These embodiments will be described in connection with a fluorescence system. However, those skilled in the art will appreciate that the endoscopes can be designed and configured to be used with a chemiluminescent camera to acquire chemiluminescent images, an absorption camera to acquire absorption images, or other types of cameras to detect a desired form of radiation, such as optical scattering data.
[00075] Figure 11 is a drawing of an endoscope system having proximal fluorescence detectors. The detectors are referred to as proximal detectors, because they are located near the top end of the endoscope proximate to the surgeon. As shown in Figure 11, the imaging system 410 comprises an imaging unit 450 that houses a beam splitter 426, a fluorescence camera 428, a filter 429, a video camera 430 and a filter 431. The imaging system 410 also includes an excitation source 423 and a white light source 427. The excitation source 423 and the white light source 427 are optically coupled to the endoscope 440 at or near the proximal end of the endoscope 440. The functions of the beam splitter 426, the cameras 428, 430, the filters 429, 431, the excitation source 423 and the white light source 427 are substantially the same as the corresponding elements in Figure 2.
[00076] The endoscope is connected via communications channel 412 to the image processing engine 50, which includes a processor 52, a human machine interface (HMI) 54 such as a keyboard, foot pedal, or control buttons, an output device 56 such as a display, and one or more memories 58, as described previously. The image processing engine 50 is connected to the display 80 via communications channel 414. The image processing engine 50 and the display 80 operate in essentially the same manner as described above with respect to Figure 2.
[00077] In practice, a fluorescent agent is injected into the subject and the endoscope 440 is inserted into a body cavity or incision. The white light source 427 and the excitation source 423 are activated, image data are acquired, and the image data are processed, as described above in connection with Figure 4. The endoscope 440 can provide the advantages associated with minimally invasive surgery, for example.
[00078] A second embodiment of an endoscope is shown in Figure 12. The endoscope depicted in Figure 12 includes distal detectors 528, 530. The fluorescence detector 528 and the video camera 530 are referred to as distal detectors because they are located at the distal (bottom) end of the endoscope, away from the surgeon. The imaging system 510 also includes an excitation source 523 and a white light source 527. The excitation source 523 and the white light source 527 are optically coupled to the endoscope 540 at or near the proximal end of the endoscope 540. The functions of the detectors 528, 530, the excitation source 523 and the white light source 527 are substantially the same as those of the corresponding elements in Figure 2. The imaging system 510 also comprises an imaging unit 550 that relays the signals and data acquired by the endoscope 540 to the image processing engine 50.
[00079] The imaging unit 550 is connected via communications channel 512 to the image processing engine 50 which includes a processor 52, HMI 54, output device 56 such as a display, and one or more memories 58 as described previously. The image processing engine 50 is connected to the display 80 via communications channel 514. The image processing engine and the display 80 operate in essentially the same manner as described above with respect to Figure 2.
[00080] In practice, a fluorescent agent is injected into the subject and the endoscope 540 is inserted into a body cavity or incision. The white light source 527 and the excitation source 523 are activated, image data are acquired, and the image data are processed, as described above.

[00081] As will be appreciated by those skilled in the art, the imaging systems described herein can be modified to include a laparoscope or other known surgical devices. A laparoscope is typically inserted into an incision in the abdomen to provide access to an interior of the abdomen in a minimally invasive procedure.
[00082] According to other embodiments of the invention, the image data can be processed to account for motion during the surgery. In some circumstances, a vessel or tissue of interest, such as one that has been injected with a fluorescent substance, may move significantly during imaging. The resulting processed fluorescence image may then be difficult to interpret, because the movement can make a single vessel appear as multiple vessels. The image processing methods described herein can be modified as follows to address this issue.
[00083] First, the displacement or motion of the vessel or tissue of interest is detected. The displacement value of the subject may be determined by a number of approaches. In some embodiments, the displacement value can be determined manually or automatically by detecting motion of one or more selected landmarks on the subject over a specified time period. The distance between one or more selected landmarks can be determined before, during, and/or after the motion. Methods for automatically detecting motion by comparing frames of data acquired at different times are well known in the art and include 2D cross-correlation, mathematical comparison (subtraction, division, multiplication, etc.) of the spatial or Fourier domain images, deformable image registration, feature segmentation and tracking, and the like. Each detection of displacement yields a displacement value.
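By way of illustration only, the following Python sketch estimates frame-to-frame displacement by phase correlation, one Fourier-domain form of the 2D cross-correlation mentioned above. The function name and the use of NumPy are illustrative assumptions rather than part of the disclosed system; a practical implementation would also add windowing, noise suppression, and subpixel peak interpolation.

```python
import numpy as np

def estimate_displacement(frame_prev, frame_curr):
    """Estimate the shift between two grayscale frames by phase correlation.

    Returns a scalar displacement value (in pixels) and the (dy, dx) shift.
    Minimal sketch only; assumes the frames are 2D arrays of equal shape.
    """
    f0 = np.fft.fft2(frame_prev)
    f1 = np.fft.fft2(frame_curr)
    # Normalized cross-power spectrum; its inverse transform peaks at the shift.
    cross = f0 * np.conj(f1)
    cross /= np.abs(cross) + 1e-12
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peak indices past the midpoint correspond to negative displacements.
    if dy > corr.shape[0] // 2:
        dy -= corr.shape[0]
    if dx > corr.shape[1] // 2:
        dx -= corr.shape[1]
    return float(np.hypot(dy, dx)), (int(dy), int(dx))
```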
[00084] Next, the displacement values are used with a weighting operator (e.g., step 154 of Figure 4) to modify the data sets used to create the combined image (step 156). According to one embodiment, each displacement value determines the weighting factor (e.g., a number between 0 and 1) that is multiplied by the pixel values of the older fluorescence data. The weighting factors can be applied to individual frames or to individual pixels, for example. A look-up table stored in the processor 52 maps various displacement values to weighting factors. Generally, it is advantageous to reduce the weighting factor as the displacement value increases, and vice versa. For example, if the displacement value is large, the weighting factor is reduced, so that the persistence of the older data, which is derived from a substantially different location due to the motion, is reduced. Conversely, if the displacement is small, the weighting factor is larger, so that the relative persistence of the older data is increased. This feature can selectively reduce the persistence of the older data to a desired extent when there has been significant motion between the acquisition of the older data and the acquisition of the most recent data.
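A minimal sketch of this weighting scheme follows, assuming a hypothetical look-up table and a maximum intensity projection as the combination operator (one of the forms recited in the claims). The breakpoint values and all function names are illustrative, not specified by the disclosure.

```python
import numpy as np

# Hypothetical look-up table: (maximum displacement in pixels, weighting factor).
# Larger motion maps to a lower weight, reducing the persistence of older data.
DISPLACEMENT_LUT = [(5.0, 1.0), (20.0, 0.5), (50.0, 0.1)]

def weight_for_displacement(displacement):
    for max_disp, weight in DISPLACEMENT_LUT:
        if displacement <= max_disp:
            return weight
    return 0.0  # very large motion: effectively discard the old frame

def combine_weighted_frames(frames, displacements):
    """Scale each older fluorescence frame by its displacement-derived weight,
    then combine the weighted frames with a maximum intensity projection."""
    weighted = [weight_for_displacement(d) * np.asarray(f, dtype=float)
                for f, d in zip(frames, displacements)]
    return np.max(np.stack(weighted), axis=0)
```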
[00085] According to another embodiment, the above described method of applying the weighting operator based on a displacement value is limited to displacement perpendicular to a vessel of interest. According to this embodiment, a vessel of interest is identified by the surgeon, for example by identifying its longitudinal axis. Image segmentation and feature selection methods used for this process are well known in the art and include but are not limited to atlas-based segmentation, vessel tree-tracking, and the like. The weighting factors are then calculated based only on displacement perpendicular to the longitudinal axis of the vessel. This method provides the advantage of focusing the weighting operator modification so that it is applied only to perpendicular displacement of a vessel of interest.
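To illustrate the geometry, the sketch below projects a detected shift onto the unit vector along the surgeon-identified vessel axis and keeps only the perpendicular component; the representation of the axis as a 2D direction vector is an assumption made for this example.

```python
import numpy as np

def perpendicular_displacement(shift, vessel_axis):
    """Return the magnitude of the component of a (dy, dx) shift that is
    perpendicular to the longitudinal axis of the vessel of interest."""
    axis = np.asarray(vessel_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)      # unit vector along the vessel
    shift = np.asarray(shift, dtype=float)
    parallel = np.dot(shift, axis) * axis   # component along the vessel
    return float(np.linalg.norm(shift - parallel))
```

The resulting perpendicular magnitude would then be supplied to the weighting look-up in place of the raw displacement value.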
[00086] According to another embodiment of the invention, the displacement values are used to change the color of the contrast image data (e.g., fluorescence data) used to generate the merged image. The image processing engine 50 stores a look-up table that maps displacement values to specified colors. The colors of the fluorescence data are modified based on displacement values. For example, a displacement value of more than two centimeters results in changing the fluorescence color to red; a displacement value between one and two centimeters results in changing the fluorescence color to yellow; a displacement value of less than one centimeter results in changing the fluorescence color to green, etc. The resulting image therefore allows the surgeon to understand the extent of displacement of the fluorescence image data based on the color of the image displayed to the surgeon.
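The color mapping can be expressed as a simple look-up function. The sketch below mirrors the centimeter thresholds given in the example above; the RGB triples are chosen arbitrarily for illustration.

```python
def fluorescence_color(displacement_cm):
    """Map a displacement value (in centimeters) to an RGB display color,
    following the example thresholds in the text above."""
    if displacement_cm < 1.0:
        return (0, 255, 0)    # green: small displacement
    if displacement_cm <= 2.0:
        return (255, 255, 0)  # yellow: moderate displacement
    return (255, 0, 0)        # red: large displacement
```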
[00087] The image processing engine 50 may also provide the surgeon the option of erasing all historical fluorescence image data by entering a command during imaging. This option allows the surgeon to clear all historical fluorescence data from the merged image during surgery, in the event, for example, that the historical fluorescence data is difficult to interpret or otherwise not helpful to the surgeon.
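A sketch of the erase-history option might look like the following, where the buffer of historical frames and the method names are hypothetical:

```python
class FluorescenceHistory:
    """Holds the historical fluorescence frames used for the combined image
    and supports clearing them on a single command from the HMI (e.g., a
    keyboard, foot pedal, or control button)."""

    def __init__(self):
        self.frames = []

    def add(self, frame):
        self.frames.append(frame)

    def erase_history(self):
        # Removes all historical fluorescence data from subsequent merged images.
        self.frames.clear()
```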
[00088] While the foregoing description includes details and specific examples, it is to be understood that these have been included for purposes of explanation only, and are not to be interpreted as limitations of the present invention. For example, there are various types of image data, combination operators, and medical devices that can be used in the imaging system. Modifications to the embodiments described herein can be made without departing from the spirit and scope of the invention, which is intended to be encompassed by the following claims and their legal equivalents.

Claims

1. A method comprising:
    acquiring a set of frames of reflectance data representing light reflected from a subject;
    acquiring a set of frames of contrast data representing a contrast agent in the subject;
    applying a combination operator to the set of frames of contrast data to generate a combined contrast image; and
    generating a merged image based on the combined contrast image and at least one of the frames of reflectance data.
2. The method of claim 1, wherein applying the combination operator comprises applying a maximum intensity projection (MIP) routine to the set of frames of contrast data.
3. The method of claim 1, wherein applying the combination operator comprises applying a routine based on at least one of minimum pixel values, a sum of pixel values, a difference of pixel values, an average of pixel values, a derivative of pixel values, and a cross correlation of pixel values.
4. The method of claim 1, wherein the contrast data comprises fluorescence data.
5. The method of claim 1, wherein the contrast data comprises at least one of chemiluminescence data, optical scatter data, and absorption data.
6. The method of claim 1, further comprising modifying individual frames of contrast data prior to applying the combination operator.
7. The method of claim 6, wherein the modifying individual frames of contrast data comprises at least one of modifying a brightness, a contrast, and a gamma value.
8. The method of claim 6, wherein the modifying individual frames of contrast data comprises setting to zero those pixel values that are less than a specified threshold value.
9. The method of claim 1, further comprising applying one or more weighting operators to the frames of contrast data prior to applying the combination operator.
10. The method of claim 9, wherein applying one or more weighting operators comprises changing a color of at least one pixel of contrast data based on an age of the contrast data.
11. The method of claim 9, wherein applying one or more weighting operators comprises multiplying a weighting factor by a pixel value.
12. The method of claim 1, further comprising modifying the combined contrast image prior to generating the merged image.
13. The method of claim 12, further comprising changing a color of at least one pixel in the combined contrast image.
14. The method of claim 1, further comprising displaying the merged image during a surgery.
15. The method of claim 1, further comprising: determining a displacement value of the subject; and selecting a weighting factor based on the displacement value.
16. A system comprising:
    a first detector configured to acquire a set of frames of reflectance data representing light reflected from a subject;
    a second detector configured to acquire a set of frames of contrast data representing a contrast agent in the subject; and
    a processor configured to:
        apply a combination operator to the set of frames of contrast data to generate a combined contrast image; and
        generate a merged image based on the combined contrast image and at least one of the frames of reflectance data.
17. The system of claim 16, further comprising a display to display the merged image.
18. The system of claim 16, wherein the first detector comprises a video camera.
19. The system of claim 16, wherein the second detector comprises a fluorescence camera.
20. The system of claim 16, wherein the second detector comprises at least one of an absorption camera, a chemiluminescence camera, and a camera configured to detect optical scattering.
21. The system of claim 16, further comprising a white light source.
22. The system of claim 16, further comprising an excitation source configured to excite a contrast agent in the subject that produces the contrast data.
23. The system of claim 16, wherein the first and second detectors are housed in an imaging unit.
24. The system of claim 16, further comprising an endoscope.
25. The system of claim 24, wherein the first and second detectors are housed in a distal end of the endoscope.
PCT/US2008/070297 2008-03-24 2008-07-17 Image processing systems and methods for surgical applications WO2009120228A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US3903808P 2008-03-24 2008-03-24
US61/039,038 2008-03-24

Publications (1)

Publication Number Publication Date
WO2009120228A1 (en) 2009-10-01

Family

ID=41114226

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/070297 WO2009120228A1 (en) 2008-03-24 2008-07-17 Image processing systems and methods for surgical applications

Country Status (1)

Country Link
WO (1) WO2009120228A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5982953A (en) * 1994-09-02 1999-11-09 Konica Corporation Image displaying apparatus of a processed image from temporally sequential images
US5865754A (en) * 1995-08-24 1999-02-02 Purdue Research Foundation Office Of Technology Transfer Fluorescence imaging system and method
US5878159A (en) * 1996-05-02 1999-03-02 Andromis S.A. Method for processing endoscopic images obtained with multicore fibers or multifibers
US7242997B2 (en) * 2002-09-03 2007-07-10 Genex Technologies, Inc. Diffuse optical tomography system and method of use
US20050065421A1 (en) * 2003-09-19 2005-03-24 Siemens Medical Solutions Usa, Inc. System and method of measuring disease severity of a patient before, during and after treatment

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10517471B2 (en) 2011-05-12 2019-12-31 DePuy Synthes Products, Inc. Pixel array area optimization using stacking scheme for hybrid image sensor with minimal vertical interconnects
US11848337B2 (en) 2011-05-12 2023-12-19 DePuy Synthes Products, Inc. Image sensor
US11682682B2 (en) 2011-05-12 2023-06-20 DePuy Synthes Products, Inc. Pixel array area optimization using stacking scheme for hybrid image sensor with minimal vertical interconnects
US11432715B2 (en) 2011-05-12 2022-09-06 DePuy Synthes Products, Inc. System and method for sub-column parallel digitizers for hybrid stacked image sensor using vertical interconnects
US11179029B2 (en) 2011-05-12 2021-11-23 DePuy Synthes Products, Inc. Image sensor with tolerance optimizing interconnects
US11109750B2 (en) 2011-05-12 2021-09-07 DePuy Synthes Products, Inc. Pixel array area optimization using stacking scheme for hybrid image sensor with minimal vertical interconnects
US11026565B2 (en) 2011-05-12 2021-06-08 DePuy Synthes Products, Inc. Image sensor for endoscopic use
US10863894B2 (en) 2011-05-12 2020-12-15 DePuy Synthes Products, Inc. System and method for sub-column parallel digitizers for hybrid stacked image sensor using vertical interconnects
US10709319B2 (en) 2011-05-12 2020-07-14 DePuy Synthes Products, Inc. System and method for sub-column parallel digitizers for hybrid stacked image sensor using vertical interconnects
US10537234B2 (en) 2011-05-12 2020-01-21 DePuy Synthes Products, Inc. Image sensor with tolerance optimizing interconnects
US10075626B2 (en) 2012-07-26 2018-09-11 DePuy Synthes Products, Inc. Camera system with minimal area monolithic CMOS image sensor
US10785461B2 (en) 2012-07-26 2020-09-22 DePuy Synthes Products, Inc. YCbCr pulsed illumination scheme in a light deficient environment
AU2013295553B2 (en) * 2012-07-26 2017-10-19 DePuy Synthes Products, Inc. Continuous video in a light deficient environment
WO2014018936A3 (en) * 2012-07-26 2014-06-05 Olive Medical Corporation Continuous video in a light deficient environment
US11863878B2 (en) 2012-07-26 2024-01-02 DePuy Synthes Products, Inc. YCBCR pulsed illumination scheme in a light deficient environment
US11766175B2 (en) 2012-07-26 2023-09-26 DePuy Synthes Products, Inc. Camera system with minimal area monolithic CMOS image sensor
KR20150037955A (en) * 2012-07-26 2015-04-08 올리브 메디컬 코포레이션 Continuous video in a light deficient environment
US9462234B2 (en) 2012-07-26 2016-10-04 DePuy Synthes Products, Inc. Camera system with minimal area monolithic CMOS image sensor
US11089192B2 (en) 2012-07-26 2021-08-10 DePuy Synthes Products, Inc. Camera system with minimal area monolithic CMOS image sensor
US10277875B2 (en) 2012-07-26 2019-04-30 DePuy Synthes Products, Inc. YCBCR pulsed illumination scheme in a light deficient environment
US11083367B2 (en) 2012-07-26 2021-08-10 DePuy Synthes Products, Inc. Continuous video in a light deficient environment
US9762879B2 (en) 2012-07-26 2017-09-12 DePuy Synthes Products, Inc. YCbCr pulsed illumination scheme in a light deficient environment
US11070779B2 (en) 2012-07-26 2021-07-20 DePuy Synthes Products, Inc. YCBCR pulsed illumination scheme in a light deficient environment
US10568496B2 (en) 2012-07-26 2020-02-25 DePuy Synthes Products, Inc. Continuous video in a light deficient environment
KR102278509B1 (en) * 2012-07-26 2021-07-19 디퍼이 신테스 프로덕츠, 인코포레이티드 Continuous video in a light deficient environment
US10701254B2 (en) 2012-07-26 2020-06-30 DePuy Synthes Products, Inc. Camera system with minimal area monolithic CMOS image sensor
US9516239B2 (en) 2012-07-26 2016-12-06 DePuy Synthes Products, Inc. YCBCR pulsed illumination scheme in a light deficient environment
US10917562B2 (en) 2013-03-15 2021-02-09 DePuy Synthes Products, Inc. Super resolution and color motion artifact correction in a pulsed color imaging system
US11674677B2 (en) 2013-03-15 2023-06-13 DePuy Synthes Products, Inc. Controlling the integral light energy of a laser pulse
US10881272B2 (en) 2013-03-15 2021-01-05 DePuy Synthes Products, Inc. Minimize image sensor I/O and conductor counts in endoscope applications
US11903564B2 (en) 2013-03-15 2024-02-20 DePuy Synthes Products, Inc. Image sensor synchronization without input clock and data transmission clock
US9777913B2 (en) 2013-03-15 2017-10-03 DePuy Synthes Products, Inc. Controlling the integral light energy of a laser pulse
US10980406B2 (en) 2013-03-15 2021-04-20 DePuy Synthes Products, Inc. Image sensor synchronization without input clock and data transmission clock
WO2014144947A1 (en) * 2013-03-15 2014-09-18 Olive Medical Corporation Super resolution and color motion artifact correction in a pulsed color imaging system
US10670248B2 (en) 2013-03-15 2020-06-02 DePuy Synthes Products, Inc. Controlling the integral light energy of a laser pulse
US9641815B2 (en) 2013-03-15 2017-05-02 DePuy Synthes Products, Inc. Super resolution and color motion artifact correction in a pulsed color imaging system
US11344189B2 (en) 2013-03-15 2022-05-31 DePuy Synthes Products, Inc. Image sensor synchronization without input clock and data transmission clock
US10251530B2 (en) 2013-03-15 2019-04-09 DePuy Synthes Products, Inc. Scope sensing in a light controlled environment
US11253139B2 (en) 2013-03-15 2022-02-22 DePuy Synthes Products, Inc. Minimize image sensor I/O and conductor counts in endoscope applications
US10205877B2 (en) 2013-03-15 2019-02-12 DePuy Synthes Products, Inc. Super resolution and color motion artifact correction in a pulsed color imaging system
US11185213B2 (en) 2013-03-15 2021-11-30 DePuy Synthes Products, Inc. Scope sensing in a light controlled environment
US11438490B2 (en) 2014-03-21 2022-09-06 DePuy Synthes Products, Inc. Card edge connector for an imaging sensor
US10084944B2 (en) 2014-03-21 2018-09-25 DePuy Synthes Products, Inc. Card edge connector for an imaging sensor
US10911649B2 (en) 2014-03-21 2021-02-02 DePuy Synthes Products, Inc. Card edge connector for an imaging sensor
CN105808181A (en) * 2014-12-31 2016-07-27 中强光电股份有限公司 Image intermediary device, interactive display system and operating method thereof
CN105808181B (en) * 2014-12-31 2019-02-12 中强光电股份有限公司 Image mediating device, interactive display system and its operating method
WO2016157260A1 (en) * 2015-03-31 2016-10-06 パナソニックIpマネジメント株式会社 Visible light projection device
JPWO2016157260A1 (en) * 2015-03-31 2017-08-17 パナソニックIpマネジメント株式会社 Visible light projector
JP6176582B2 (en) * 2015-03-31 2017-08-09 パナソニックIpマネジメント株式会社 Visible light projector
US10080623B2 (en) 2015-03-31 2018-09-25 Panasonic Intellectual Property Management Co., Ltd. Visible light projection device for surgery to project images on a patient
EP3540494A1 (en) * 2018-03-16 2019-09-18 Leica Instruments (Singapore) Pte. Ltd. Augmented reality surgical microscope and microscopy method
US11800980B2 (en) 2018-03-16 2023-10-31 Leica Instruments (Singapore) Pte. Ltd. Augmented reality surgical microscope and microscopy method
CN113693739A (en) * 2021-08-27 2021-11-26 南京诺源医疗器械有限公司 Tumor navigation correction method and device and portable fluorescent image navigation equipment

Similar Documents

Publication Publication Date Title
WO2009120228A1 (en) Image processing systems and methods for surgical applications
JP6671442B2 (en) Method and apparatus for displaying enhanced imaging data on a clinical image
US8049184B2 (en) Fluoroscopic device and fluoroscopic method
JP5435532B2 (en) Image processing system
US8547425B2 (en) Fluorescence observation apparatus and fluorescence observation method
EP2754379B1 (en) Endoscope system and image display method
US8633976B2 (en) Position specifying system, position specifying method, and computer readable medium
JPWO2018159363A1 (en) Endoscope system and operation method thereof
US10694117B2 (en) Masking approach for imaging multi-peak fluorophores by an imaging system
EP3110314B1 (en) System and method for specular reflection detection and reduction
JP6319449B2 (en) Imaging device
WO2018175227A1 (en) Tissue identification by an imaging system using color information
US20180288404A1 (en) Image processing system, image processing apparatus, projecting apparatus, and projection method
JP2021035549A (en) Endoscope system
CN115668296A (en) Globally processing multiple luminescence images for mapping and/or segmentation thereof
JP6512320B2 (en) Imaging device
JP7090706B2 (en) Endoscope device, operation method and program of the endoscope device
US20210100440A1 (en) Endoscope apparatus, operation method of endoscope apparatus, and information storage medium
US20230020346A1 (en) Scene adaptive endoscopic hyperspectral imaging system
Taniguchi et al. Improving convenience and reliability of 5-ALA-induced fluorescent imaging for brain tumor surgery
CN113557462A (en) Medical control device and medical observation device
KR20210051192A (en) Medical system providing functional image
JPWO2018216658A1 (en) Imaging device, imaging system, and imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08781961

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08781961

Country of ref document: EP

Kind code of ref document: A1