WO2007074440A2 - System and method for displaying the location of an in-vivo device - Google Patents

System and method for displaying the location of an in-vivo device

Info

Publication number
WO2007074440A2
WO2007074440A2 (PCT/IL2006/001477)
Authority
WO
WIPO (PCT)
Prior art keywords
data
displayed
time
display
distance
Prior art date
Application number
PCT/IL2006/001477
Other languages
French (fr)
Other versions
WO2007074440A3 (en)
Inventor
Tal Davidson
Michael Skala
Mordechai Frisch
Original Assignee
Given Imaging Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Given Imaging Ltd. filed Critical Given Imaging Ltd.
Publication of WO2007074440A2 publication Critical patent/WO2007074440A2/en
Publication of WO2007074440A3 publication Critical patent/WO2007074440A3/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00043 Operational features of endoscopes provided with output arrangements
    • A61B 1/00045 Display arrangement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B 1/041 Capsule endoscopes for imaging
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/07 Endoradiosondes
    • A61B 5/073 Intestinal transmitters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00011 Operational features of endoscopes characterised by signal transmission
    • A61B 1/00016 Operational features of endoscopes characterised by signal transmission using wireless means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/25 User interfaces for surgical systems
    • A61B 2034/256 User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B 5/062 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using magnetic field
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/20 Surgical microscopes characterised by non-optical aspects

Definitions

  • The present invention relates to displaying the location of an in-vivo sensing device.
  • Various methods of displaying data recorded by a moving device are known in the art. These methods typically show the data and may show, for example, the time and position of the recording device.
  • For example, an in-vivo imaging system using a swallowable device is known. The imaging system captures and transmits images of the gastrointestinal (GI) tract to an external recording device while the device passes through the GI lumen.
  • Such an in vivo imaging system provides a platform from which moving or still images of a GI tract may be viewed. Large numbers of images may be collected for viewing. For example, the images may be combined in sequence, and a moving image of, for example, 40 minutes in length, may be presented to the user.
  • the location and motility of the device help the viewer to interpret and select the relevant data, for example, video footage. For example, if a viewer wants to focus attention on a specific region of the GI lumen, he may want to know which images were taken in that region.
  • The motility of the device may also be used to track the location of the device, for example, based on the expected motility of the device in various regions of the GI tract.
  • Other in-vivo sensing systems that have other sensors, for example, pressure or pH sensors, are known.
  • An exemplary embodiment of the present invention provides a system and method for displaying data to track the location and motility of an in-vivo recording device.
  • the device may, for example, be a swallowable capsule.
  • the device may record, for example, image data.
  • the display model may provide location data, for example, the time, displacement or position, corresponding to the device as it travels through the body. Typically, the location of the device is the location of the device at the time the data being displayed was captured.
  • a region display may indicate the general area through which the device may travel, for example, the two or three-dimensional space of the GI tract.
  • the region display may, for example, be a schematic diagram of a two or three-dimensional space, divided into a plurality of sections, each corresponding to an area of that two or three-dimensional space.
  • the region display may highlight the section of the diagram that corresponds to an area where the device is located.
  • a time display may indicate temporal data relating to the device, for example, a function of the time the device has traveled, when the device captured the data being displayed.
  • a distance display may indicate distance data relating to the device, for example, a function of the distance the device has traveled, when the device captured the data being displayed.
  • the time and distance displays may be of any suitable design, including, but not limited to, linear diagrams with indicators of measure.
  • the time and distance displays may be displayed together, and may for example, be adjacent.
  • the time and distance displays may be linear diagrams with indicators of measure, and they may, for example, be displayed in parallel.
  • Another embodiment may, for example, present the relative data of a plurality of contiguous meters, for example, a time display and a distance display or a region display and a distance display, etc.
  • Fig. 1 is a schematic diagram of an in-vivo imaging system, according to one embodiment of the present invention.
  • Fig. 2 is a schematic diagram of a display, which may be used to localize a device, according to one embodiment of the present invention.
  • Fig. 3 is a diagram of a display, which may be used to localize a device, according to one embodiment of the present invention.
  • A receiving and/or display system suitable for use with embodiments of the present invention may also be similar to embodiments described in US 2001-0035902 and/or in U.S. Patent Number 5,604,531.
  • Devices and systems as described herein may be used with other devices, for example, non-imaging and/or non-in-vivo devices.
  • Fig. 1 shows a schematic diagram of an in-vivo imaging system according to one embodiment of the present invention.
  • the system comprises a device 40 having an imager 46, for capturing images, an illumination source 42, for illuminating the body lumen, and a transmitter 41, for transmitting image and possibly other information to a receiving device.
  • An optical system (not shown), including, for example, lenses or mirrors, may aid in focusing reflected light onto the imager 46.
  • Typically, the device 40 is inserted into the patient by, for example, swallowing, and preferably traverses the patient's GI tract.
  • Device 40 typically may be or may include an autonomous swallowable capsule, but device 40 may have other shapes and need not be swallowable or autonomous. Embodiments of device 40 are typically autonomous, and are typically self-contained. For example, device 40 may be a capsule or other unit where all the components, including, for example, power components, are substantially contained within a container or shell, and where device 40 does not require any wires or cables to, for example, receive power or transmit information.
  • Device 40 may communicate with an external receiving and display system to provide display of data, control, or other functions.
  • power may be provided by an internal battery or a wireless receiving system.
  • Other embodiments may have other configurations and capabilities.
  • components may be distributed over multiple sites or units.
  • Control information may be received from an external source.
  • device 40 may, for example, be an in-vivo imaging device.
  • device 40 may sense pH or temperature or pressure or some combination of these.
  • Located outside the patient's body, in one or more locations, are an image receiver 12, preferably including an antenna 20 or antenna array 22, an image receiver storage unit 16, a data processor 14, a data processor storage unit 19, and a display or an image monitor 18, for displaying, inter alia, the images recorded by the device 40.
  • Data processor storage unit 19 includes an image database 210.
  • data processor 14, data processor storage unit 19 and monitor 18 are part of a personal computer or workstation which includes standard components such as processor 14, a memory, a disk drive, and input-output devices, although alternate configurations are possible.
  • Data processor 14 may include any standard data processor, such as a microprocessor, multiprocessor, accelerator board, or any other serial or parallel high performance data processor.
  • Image monitor 18 is preferably a conventional video display, but may, in addition, be any other device capable of providing image or other data.
  • Processor 14 or another unit may, as discussed herein, calculate position, time position, and/or distance traveled, and cause to be displayed the general (e.g., region or quadrant) location of the device 40, its distance traveled, etc.
  • Typically, the imager 46 is a suitable CMOS camera, such as a "camera on a chip" type CMOS imager specified by Given Imaging Ltd. of Israel and designed by Photobit.
  • imager 46 captures images and sends data representing, for example, images to transmitter 41, which transmits images to image receiver 12 using, for example, electromagnetic radio waves.
  • Image receiver 12 transfers the image data to image receiver storage unit 16.
  • the image data stored in storage unit 16 is sent to the data processor 14 or the data processor storage unit 19.
  • the image receiver storage unit 16 may be taken off the patient's body and connected to the personal computer or workstation which includes the data processor 14 and data processor storage unit 19 via a standard data link, e.g., a serial or parallel interface of known construction.
  • the image data is then transferred from the image receiver storage unit 16 to the image database 210 within data processor storage unit 19.
  • Data processor 14 may analyze the data and provide the analyzed data to the image monitor 18, where a health professional views the image data.
  • Data processor 14 operates software (not shown) which, in conjunction with basic operating software such as an operating system and device drivers, controls the operation of data processor 14 and which may display location and other information as discussed herein.
  • the software controlling data processor 14 includes code written in the C++ language, but may be implemented in a variety of known methods. Data in addition to or other than image data may be transmitted.
  • the image data collected and stored may be stored indefinitely, transferred to other locations, or manipulated or analyzed.
  • a health professional may use the images to diagnose pathological conditions of the GI tract, and, in addition, the system may provide information about the location of these pathologies.
  • The image monitor 18 presents the image data, preferably in the form of still and moving pictures, and preferably, may present other information. Multiple monitors 18 may be used to display images and other data. Monitor 18 may be any suitable display, for example, a CRT, an LCD display, etc.
  • The in-vivo imager system collects a series of still images as it traverses the GI tract.
  • the images may be later presented as individual still images 103, a stream of still images 103 or a moving image 103 of the traverse of the GI tract.
  • the in-vivo imager system may collect a large volume of data, as the device 40 may take several hours to traverse the GI tract, and may record images at a rate of, for example, two images every second, resulting in the recordation of thousands of images.
  • the image recordation rate (or frame capture rate) may be varied.
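The data volume mentioned above can be sanity-checked with simple arithmetic. The frame rate comes from the text; the eight-hour transit time is an assumed illustrative value (actual transit times vary per patient).

```python
# Rough estimate of the number of frames recorded during a GI transit.
FRAMES_PER_SECOND = 2   # example rate given in the text
TRANSIT_HOURS = 8       # assumed transit time, for illustration only

total_frames = FRAMES_PER_SECOND * TRANSIT_HOURS * 3600
print(total_frames)  # 57600 -- "thousands of images"
```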
  • data processor storage unit 19 stores a series of images recorded by a device
  • the images may be combined consecutively to form a moving image of the images device 40 recorded as it moved through a patient's GI tract.
  • This moving image may be displayed in a window on monitor 18.
  • the moving image may be frozen to view one frame, speeded up, or reversed; sections may be skipped; or any other method for viewing an image may be applied to the moving image.
  • each frame of image data includes 256 rows of 256 pixels each, each pixel including bytes for color and brightness, according to known methods.
  • color may be represented by a mosaic of four sub-pixels, each sub-pixel corresponding to primaries such as red, green, or blue (where one primary is represented twice).
  • the brightness of the overall pixel is recorded by a one byte (i.e., 0- 255) brightness value.
  • images are stored sequentially in data processor storage unit 19.
  • the stored data is comprised of one or more pixel properties, including color and brightness.
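The per-frame storage cost implied by the layout above can be sketched as follows. The exact byte packing is an assumption for illustration; the text specifies only 256 rows of 256 pixels, a four-sub-pixel color mosaic (one primary represented twice), and a one-byte brightness value.

```python
# Sketch of the frame layout described above: 256 x 256 pixels, each pixel
# carrying a four-sub-pixel color mosaic (e.g. R, G, G, B) plus a one-byte
# brightness value in the range 0-255. Byte packing is an assumption.
ROWS, COLS = 256, 256
SUBPIXELS = 4          # mosaic of four one-byte color samples
BRIGHTNESS_BYTES = 1   # one-byte overall brightness (0-255)

bytes_per_pixel = SUBPIXELS + BRIGHTNESS_BYTES
frame_size = ROWS * COLS * bytes_per_pixel
print(frame_size)  # 327680 bytes per frame under these assumptions
```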
  • the system and method of the present invention may be practiced with alternate configurations.
  • the components gathering image information need not be contained in a device, but may be contained in any other vehicle suitable for traversing a lumen in a human body, such as an endoscope, stent, catheter, needle etc.
  • Various methods may be used for determining the position of the device 40 as it gathers data.
  • U.S. Patent 6,904,308 to Frisch et al., entitled "Array System and Method for Locating an In-Vivo Signal Source", incorporated by reference herein in its entirety, and U.S. Patent 5,604,531 each describe methods for determining the location of an in-vivo device.
  • Location may be determined by, for example, analyzing the signal strength of data received at multiple antennas 20 placed on a patient's body. Data from the antennas 20 or antenna array 22 may be sent or transmitted to the image receiver 12 via a connection, for example, a wire, and may be stored in storage 16. Other methods may be used; for example methods where a beacon or six-degree-of-freedom sensor is included in device 40 itself.
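One simple way to turn per-antenna signal strength into a location estimate, in the spirit of the signal-strength analysis described above, is a strength-weighted centroid of the antenna positions. This is an illustrative sketch with assumed names and 2-D coordinates, not the patented algorithm.

```python
# Estimate device position as a signal-strength-weighted centroid of the
# antenna positions (stronger signal -> antenna likely closer to the device).
def weighted_centroid(antenna_positions, signal_strengths):
    """antenna_positions: list of (x, y); signal_strengths: same length."""
    total = sum(signal_strengths)
    if total == 0:
        raise ValueError("no signal received at any antenna")
    x = sum(p[0] * s for p, s in zip(antenna_positions, signal_strengths)) / total
    y = sum(p[1] * s for p, s in zip(antenna_positions, signal_strengths)) / total
    return (x, y)

# Device nearest the antenna at (1, 0) -> estimate is pulled toward it.
print(weighted_centroid([(0, 0), (1, 0), (0, 1)], [1.0, 2.0, 1.0]))  # (0.5, 0.25)
```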
  • the distance traveled may be determined by, for example, methods disclosed in U.S. Patent Application US2006/0036166 entitled "System and Method for Determining Path Lengths Through a Body Lumen" incorporated by reference herein in its entirety. Distance traveled may be determined by using, for example, motility information, location information, or other information; such information may be for example summed.
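The summation approach mentioned above can be sketched as a running total of straight-line segments between successive location estimates. A real path-length method (such as the one in US2006/0036166) would likely filter noisy estimates first; this is only a minimal illustration.

```python
import math

# Distance traveled as the cumulative length of segments between successive
# 3-D position estimates of the device.
def path_length(positions):
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

track = [(0, 0, 0), (3, 4, 0), (3, 4, 12)]
print(path_length(track))  # 5.0 + 12.0 = 17.0
```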
  • Fig. 2 shows a schematic diagram of a data viewing window 100, that may, for example, be viewed on monitor 18, according to one embodiment of the present invention.
  • the window may include, for example, multiple data representations.
  • an image window 102 may display an image 103 that may be, for example an image-stream or still portions of that image, or any combination of these.
  • Window 100 may include controls 104, for example, buttons or other symbols, which may alter the display of the image; for example, stop, play, pause, capture image, step, fast-forward, rewind, or other manipulators. Controls 104 may be activated by, for example, a pointing device such as a mouse or trackball.
  • Window 100 may display other or different data such as pH or temperature data.
  • The display window 100 includes localization displays, which may provide data about the position of the in-vivo device 40 in space or time.
  • Localization displays may display data, wherein the location of the device 40 is the location of the device 40 at the time the data being displayed was captured. For example, if the device 40 captured a moving-image stream, then the localization displays may display data, wherein the location of the device indicated by the display may correspond to the location of the device at the time it captured the data being displayed.
  • localization displays may display location data that corresponds to the image 103 frames displayed by image display 102. In one embodiment, the localization displays may display data that corresponds to individual frames. In another embodiment, localization displays may not display data that corresponds to individual image frames 103, but may correspond to multiple frames.
  • localization data may be displayed based on an average of location data of device 40 that may correspond to a number of frames. This embodiment may dampen the appearance of fluctuations in localization data, and the data displayed may appear more continuous or smooth. Localization displays may display data, wherein the data is computed from pre-recorded data.
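The frame-averaging idea above can be sketched as a sliding-window mean over per-frame location values, so the localization display does not jitter from frame to frame. The window size here is an arbitrary illustrative choice.

```python
# Smooth a sequence of per-frame localization values with a trailing
# moving average, damping frame-to-frame fluctuations as described above.
def smooth(values, window=3):
    out = []
    for i in range(len(values)):
        lo = max(0, i - window + 1)
        chunk = values[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

print(smooth([0.0, 10.0, 0.0, 10.0]))  # fluctuations are damped
```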
  • Window 100 may display, for example, a time display or timeline 106, or time bar, which may show temporal data that relates to device 40 as a function of time.
  • The time function may include, but is not limited to, the absolute time elapsed for the current image 103 or image stream 103, displayed in the image window 102.
  • This absolute time elapsed may be, for example, the amount of time that elapsed between the moment the device 40 was first activated or inserted into the body (for example, by swallowing), or the moment the image receiver 12 started receiving transmission from the device 40, and the moment that the current image being displayed was captured.
  • the time function may include the total time of the moving image, or a percentage of the total elapsed time for the current image 103 being shown in image window 102.
  • the time display or timeline 106 may show another suitable function of time.
  • the timeline 106 may be a bar labeled with time units or relative time units or other suitable labeling.
  • the value of the time function which corresponds to the current image 103 being shown, may be, for example, indicated by a time cursor 107 or other indicator that moves along the timeline 106 bar.
  • the value of the time function of the device may be the value of the time function at the time the data being displayed was captured.
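Placing the time cursor 107 along the timeline 106 reduces to scaling the elapsed-time fraction to the bar's width. The function name and the 600-pixel bar width are illustrative assumptions.

```python
# Map the elapsed time at the currently displayed frame to a horizontal
# cursor position along the timeline bar.
def cursor_x(elapsed_seconds, total_seconds, bar_width_px=600):
    fraction = elapsed_seconds / total_seconds
    return round(fraction * bar_width_px)

print(cursor_x(1200, 2400))  # halfway through a 40-minute stream -> 300
```

The same mapping applies to the distance cursor 109 on the distance line 108, with distance traveled in place of elapsed time.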
  • Published Patent Application Document Number 2005/0075551 entitled “Device, System and Method for Presentation of In- Vivo Data", assigned to the common assignee of the present application and incorporated herein by reference, includes in one embodiment, a time bar, which displays a color at each time interval, which is the average color of the moving image 102 over that time interval.
  • Window 100 may display, for example, a distance display or distance line 108 or trace bar, which may show distance or displacement data that relates to device 40, as a function of distance.
  • this function may include, but is not limited to, the absolute distance device 40 traveled at the time the current image 103 or image stream 103, displayed in image window 102, was taken.
  • The absolute distance may be, for example, the distance between the position where the device 40 was first activated or inserted into the body (for example, by swallowing), or where the image receiver 12 started receiving transmission from the device 40, and the position of the device 40 at the moment the current image 103 being displayed was captured.
  • distance function may include the total distance device 40 travels, or the percentage of the total distance the device 40 has traveled at the time the current image 103 was taken.
  • the distance display or distance line 108 may show any function of distance, including, but not limited to, all that are mentioned here.
  • the distance line 108 may be a bar labeled with distance units or relative distance units or other suitable labeling.
  • The value of the distance function, which corresponds to the current image 103 being shown, may be, for example, indicated by a distance cursor 109 or other indicator that moves along the distance line 108 bar.
  • the value of the distance function of the device may be the value of the distance function at the time the data being displayed was captured.
  • Window 100 may display a region location diagram 110.
  • the region location diagram 110 may be a schematic representation of a two or three-dimensional space or volume of the body through which the device 40 travels.
  • the region location diagram 110 may be divided into multiple regions 112.
  • a region 112 may represent a section of the two or three-dimensional space region location diagram 110 represents.
  • Region location diagram 110 may be used to show in which region 112 of that space device 40 is located.
  • Regions 112 may be of any number, shape or size, for example, four quadrants.
  • the shape and size of, and the number of, regions 112 of region location diagram 110 may vary depending on the specific details of device 40, the space through which device 40 travels, and the specific application.
  • the regions 112 may be of equal or un-equal size.
  • the region location diagram 110 may be used, for example, to represent the human torso.
  • The human torso is schematically illustrated by region location diagram 110, which is, for example, a square divided into four quadrants, where the orientation of the region location diagram 110 matches that of the torso of an erect human, and where, for example, the upper-right-hand region 112 of the region location diagram 110 represents the upper-right-hand three-dimensional space of a region of the human torso.
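Highlighting the correct region 112 amounts to classifying an estimated location into one of the four quadrants. The coordinate convention below (origin at the torso center, +y up, +x to the diagram's right) is an assumption for illustration.

```python
# Map an estimated (x, y) location in torso coordinates to the quadrant of
# region location diagram 110 that should be highlighted.
def quadrant(x, y):
    if x >= 0 and y >= 0:
        return "upper-right"
    if x < 0 and y >= 0:
        return "upper-left"
    if x < 0:
        return "lower-left"
    return "lower-right"

print(quadrant(0.3, 0.7))  # upper-right
```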
  • the region location diagram 110 may offer information that the distance line 108 may not.
  • Window 100 may display a body or overlay 114, or another contextual schematic diagram of a body, for example, a human torso, that is displayed over region location diagram 110, as in Figure 2.
  • Body overlay 114 may clarify the correspondence between the region location diagram and the space through which the device travels.
  • Fig. 3 shows a schematic diagram of an antenna display 120.
  • Window 100 may display an antenna display 120.
  • Antenna display 120 may indicate which antenna 20 of the antenna array 22 may have collected the data being displayed.
  • the antenna array 22 may be a number of antennas 20, typically, attached externally to the body that contains device 40.
  • antennas may be placed around a patient's body; for example, a set of antennas may be placed around a patient's torso.
  • Preferably all antennas 20 may receive data from device 40.
  • one or more antennas 20' may be selected for data collection (e.g., after the signal strength of the antennas is compared) and the received data may be displayed. The data from the other antennas 20 may not be displayed.
  • Antennas 20 may be selected for data transmission for display based on for example their proximity to device 40 or the strength of the data signal they receive from device 40 or any other suitable measure that may determine which of the antennas 20' record the most relevant or accurate data.
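Selecting the receiving antenna 20' by signal strength, as described above, can be sketched as a simple argmax over the per-antenna strengths. The mapping of antenna id to strength is an illustrative input format.

```python
# Pick the antenna whose received signal is strongest -- a common proxy for
# proximity to the device, per the selection criteria described above.
def select_antenna(signal_strengths):
    """signal_strengths: dict mapping antenna id -> signal strength."""
    return max(signal_strengths, key=signal_strengths.get)

print(select_antenna({1: 0.2, 2: 0.9, 3: 0.4}))  # 2
```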
  • Since the antenna display 120 may show which antennas are being used to receive data, and/or the strength of the signal at various antennas 20, which is typically a measure of the proximity of device 40 to the antennas, antenna display 120 may be used as a localization display. Display 120 may indicate where device 40 may be located relative to the antennas 20' of the antenna array 22. In an exemplary embodiment shown in Fig. 3, the antenna display 120 may schematically display a number of antennas 20 by indicating a number of antenna representations 122, for example circles, points or dots.
  • The antenna display 120 may indicate the number and arrangement of antennas 20 in the antenna array 22. For example, if eight antennas 20 are placed around the waist of a human torso, antenna display 120 may show eight dots 122, where each dot 122 may correspond to one antenna 20, and where the geometric arrangement of dots 122 may correspond to the geometric arrangement of the antennas 20.
  • antenna display 120 may, for example, highlight the antenna representations 122' of antenna display 120 that may correspond to the selected antennas 20'.
  • the antenna representations 122' of antenna display 120 may be highlighted by, for example, lighting up, having their appearance altered, or changing color or undergoing any other form of focus.
  • a single antenna 20' is selected to transmit data for display.
  • the antenna representation 122' that may correspond to the selected antenna 20' may be highlighted by, for example, surrounding the antenna representation 122' by a shape, for example, a square.
  • The shape may be, for example, colored, and the color or color intensity or brightness may vary in correspondence with signal strength or device 40 proximity to the antenna 20' or any other measure of the relevancy or accuracy of the data the antenna 20' transmits. If localization data is displayed that may correspond to multiple antennas 20', each selected antenna representation 122' may be highlighted. If a number of antennas 20' may be selected with varying priority, the display may show this by highlighting the antenna representations 122' to a varying degree, for example, by highlighting the different representations 122' with different colors or different intensities of color.
  • the signal strength at each antenna may be indicated by, for example, a color, brightness, size, etc.
  • each antenna representation may be displayed in a certain color or brightness based on signal strength, or the size of the icon or representation of the antenna may be altered based on signal strength.
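Scaling each antenna representation 122 by signal strength can be sketched as normalizing against the strongest antenna and mapping the result onto a brightness scale. The linear mapping and the 0-255 range are assumptions for illustration.

```python
# Map per-antenna signal strengths to display brightness values (0-255),
# so the strongest antenna's representation appears brightest.
def brightness_levels(signal_strengths):
    peak = max(signal_strengths)
    return [round(255 * s / peak) for s in signal_strengths]

print(brightness_levels([0.2, 0.9, 0.45]))  # strongest antenna shown at 255
```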
  • one or more of the timeline 106, the distance line 108, region location diagram 110 and antenna display 120 may be displayed in window 100.
  • The timeline 106 may be a tracking display that may report a temporal measure of the progress of device 40 along the GI tract.
  • The distance line 108 may be a tracking display that may report a distance measure of the progress of device 40 along the GI tract.
  • the region location diagram 110 may be a tracking display that may report a general region of the GI tract where device 40 is located.
  • the distance line 108, region location diagram 110 and antenna display 120 may be displayed to clarify the location and, for example, the motility of device 40, that corresponds with the image 102 displayed.
  • Combining diagram 110 and the distance line 108 by, for example, placing them side by side or adjacently, may produce two measures of the location of device 40. Together, these measures may be used to localize device 40 with greater accuracy.
  • Combining the diagram 110 with the timeline 106 may produce a similar advantage, since the timeline 106 may be interpreted as an average of the distance line 108.
  • a display that combines the distance line 108 and the timeline 106, so, for example, they are displayed so that they are parallel, may clarify the average motility of device 40.
  • a halt to the device 40 motility may, for example, indicate an abnormality in the GI tract that may prevent the device 40 from moving regularly. It may also, for example, indicate that the device 40 may be located in a region of the tract where it does not move for periods of time, such as the stomach.
  • the halt of device 40 may show a change in its path, which may mark the anatomical region where it is located.
  • the time and distance displays may be of any design, including, but not limited to, linear diagrams with indicators of measure.
  • a time display may be a circular display that resembles a clock.
  • the time and distance displays are typically adjacent, which may clarify the relationship between device 40 time and distance data.
  • the time and distance displays need not be adjacent, and may be positioned in window 100 in any suitable configuration.
  • when the time and distance displays are linear diagrams with indicators of measure, they may, for example, be displayed in parallel.

It may be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather, the scope of the invention is defined by the claims that follow:

Abstract

The application is directed to a method and system for displaying localization data of an in-vivo device. Data is displayed to track the location and motility of an in-vivo recording device.

Description

SYSTEM AND METHOD FOR DISPLAYING THE LOCATION OF AN
IN-VIVO DEVICE
FIELD OF THE INVENTION
The present invention relates to displaying the location of an in-vivo sensing device.
BACKGROUND OF THE INVENTION
Various methods of displaying data recorded by a moving device are known in the art. These methods typically show the data and may show, for example, the time and position of the recording device. For example, an in-vivo imaging system is known, using a swallowable device. The imaging system captures and transmits images of the gastrointestinal (GI) tract to an external recording device while the device passes through the GI lumen. Such an in-vivo imaging system provides a platform from which moving or still images of a GI tract may be viewed. Large numbers of images may be collected for viewing. For example, the images may be combined in sequence, and a moving image of, for example, 40 minutes in length, may be presented to the user. The location and motility of the device, which may be, for example, a capsule, help the viewer to interpret and select the relevant data, for example, video footage. For example, if a viewer wants to focus attention on a specific region of the GI lumen, he may want to know which images were taken in that region. The motility of a device may also be used to track the location of the device, for example, based on the expected motility of the device in various regions of the GI tract. Other in-vivo sensing systems that have other sensors, for example, pressure or pH sensors, are known. A need exists for a useful, aesthetically pleasing, and clear method to display the location and, for example, motility, of an in-vivo sensing device.
SUMMARY OF THE INVENTION
An exemplary embodiment of the present invention provides a system and method for displaying data to track the location and motility of an in-vivo recording device. The device may, for example, be a swallowable capsule. The device may record, for example, image data. The display model may provide location data, for example, the time, displacement or position, corresponding to the device as it travels through the body. Typically, the location of the device is the location of the device at the time the data being displayed was captured.
A region display may indicate the general area through which the device may travel, for example, the two or three-dimensional space of the GI tract. In an exemplary embodiment, the region display may, for example, be a schematic diagram of a two or three-dimensional space, divided into a plurality of sections, each corresponding to an area of that two or three-dimensional space. Preferably, the region display may highlight the section of the diagram that corresponds to an area where the device is located. A time display may indicate temporal data relating to the device, for example, a function of the time the device has traveled, when the device captured the data being displayed. A distance display may indicate distance data relating to the device, for example, a function of the distance the device has traveled, when the device captured the data being displayed. The time and distance displays may be of any suitable design, including, but not limited to, linear diagrams with indicators of measure. The time and distance displays may be displayed together, and may, for example, be adjacent. In an exemplary embodiment, the time and distance displays may be linear diagrams with indicators of measure, and they may, for example, be displayed in parallel.
Another embodiment may, for example, present the relative data of a plurality of contiguous meters, for example, a time display and a distance display or a region display and a distance display, etc.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention may be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:
Fig. 1 is a schematic diagram of an in-vivo imaging system, according to one embodiment of the present invention;
Fig. 2 is a schematic diagram of a display, which may be used to localize a device, according to one embodiment of the present invention; and
Fig. 3 is a diagram of a display, which may be used to localize a device, according to one embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
In the following description, various aspects of the present invention will be described.
For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it may also be apparent to one skilled in the art that the present invention may be practiced without the specific details presented herein. Furthermore, well known features may be omitted or simplified in order not to obscure the present invention. Some embodiments of the present invention may be directed to an autonomous, typically swallowable in-vivo device. Other embodiments need not be swallowable or autonomous. Devices or systems according to embodiments of the present invention may be similar to embodiments described in published US application US 2001-0035902 entitled "Device for In-Vivo Imaging" published November 1, 2001 and/or in U.S. Patent No. 5,604,531 entitled "In-Vivo Video Camera System" issued February 18, 1997, each of which being assigned to the common assignee of the present invention and each of which being hereby fully incorporated herein by reference. Furthermore, a receiving and/or display system suitable for use with embodiments of the present invention may also be similar to embodiments described in US 2001-0035902 and/or in U.S. Patent Number 5,604,531. Devices and systems as described herein may have other configurations and other sets of components. Alternate embodiments of a device, system and method may be used with other devices, for example, non-imaging and/or non-in-vivo devices.
Reference is made to Fig. 1, which shows a schematic diagram of an in-vivo imaging system according to one embodiment of the present invention. In an exemplary embodiment, the system comprises a device 40 having an imager 46, for capturing images, an illumination source 42, for illuminating the body lumen, and a transmitter 41, for transmitting image and possibly other information to a receiving device. An optical system (not shown), including, for example, lenses or mirrors, may aid in focusing reflected light onto the imager 46. The device 40 is inserted into the patient by, for example, swallowing, and preferably traverses the patient's GI tract.
Device 40 typically may be or may include an autonomous swallowable capsule, but device 40 may have other shapes and need not be swallowable or autonomous. Embodiments of device 40 are typically autonomous, and are typically self-contained. For example, device 40 may be a capsule or other unit where all the components, including, for example, power components, are substantially contained within a container or shell, and where device 40 does not require any wires or cables to, for example, receive power or transmit information. Device 40 may communicate with an external receiving and display system to provide display of data, control, or other functions. For example, in an autonomous system power may be provided by an internal battery or a wireless receiving system. Other embodiments may have other configurations and capabilities. For example, components may be distributed over multiple sites or units. Control information may be received from an external source. In an exemplary embodiment, device 40 may, for example, be an in-vivo imaging device. In other embodiments, device 40 may sense pH or temperature or pressure or some combination of these.
Preferably, located outside the patient's body in one or more locations, are an image receiver 12, preferably including an antenna 20 or antenna array 22, an image receiver storage unit 16, a data processor 14, a data processor storage unit 19, and a display or an image monitor 18, for displaying, inter alia, the images recorded by the device 40.
Preferably, the image receiver 12 and image receiver storage unit 16 are small and portable, and are worn on the patient's body during recording of the images. Data processor storage unit 19 includes an image database 210. Preferably, data processor 14, data processor storage unit 19 and monitor 18 are part of a personal computer or workstation which includes standard components such as processor 14, a memory, a disk drive, and input-output devices, although alternate configurations are possible. Data processor 14 may include any standard data processor, such as a microprocessor, multiprocessor, accelerator board, or any other serial or parallel high performance data processor. Image monitor 18 is preferably a conventional video display, but may, in addition, be any other device capable of providing image or other data. Processor 14 or another unit may, as discussed herein, calculate position, time, and/or distance traveled, and cause to be displayed the general (e.g., region or quadrant) location of the device 40, its distance traveled, etc. Preferably, the imager 46 is a suitable CMOS camera, such as a "camera on a chip" type CMOS imager specified by Given Imaging Ltd of Israel and designed by Photobit
Corporation of California, USA. Other imagers, such as a CCD, may be used. The illumination source 42 may be, for example, one or more light emitting diodes. In operation, imager 46 captures images and sends data representing, for example, images to transmitter 41, which transmits images to image receiver 12 using, for example, electromagnetic radio waves. Image receiver 12 transfers the image data to image receiver storage unit 16. After a certain period of time of data collection, the image data stored in storage unit 16 is sent to the data processor 14 or the data processor storage unit 19. For example, the image receiver storage unit 16 may be taken off the patient's body and connected to the personal computer or workstation which includes the data processor 14 and data processor storage unit 19 via a standard data link, e.g., a serial or parallel interface of known construction. The image data is then transferred from the image receiver storage unit 16 to the image database 210 within data processor storage unit 19. Data processor 14 may analyze the data and provide the analyzed data to the image monitor 18, where a health professional views the image data. Data processor 14 operates software (not shown) which, in conjunction with basic operating software such as an operating system and device drivers, controls the operation of data processor 14 and which may display location and other information as discussed herein. Preferably, the software controlling data processor 14 includes code written in the C++ language, but may be implemented in a variety of known methods. Data in addition to or other than image data may be transmitted.
The image data collected and stored may be stored indefinitely, transferred to other locations, or manipulated or analyzed. A health professional may use the images to diagnose pathological conditions of the GI tract, and, in addition, the system may provide information about the location of these pathologies. While, in a system where the data processor storage unit 19 first collects data and then transfers data to the data processor 14, the image data is not viewed in real time, other configurations allow for real-time viewing. The image monitor 18 presents the image data, preferably in the form of still and moving pictures, and preferably, may present other information. Multiple monitors 18 may be used to display images and other data. Monitor 18 may be any suitable display, for example, a CRT, an LCD display, etc.
Preferably, the in-vivo imager system collects a series of still images as it traverses the GI tract. The images may be later presented as individual still images 103, a stream of still images 103 or a moving image 103 of the traverse of the GI tract. The in-vivo imager system may collect a large volume of data, as the device 40 may take several hours to traverse the GI tract, and may record images at a rate of, for example, two images every second, resulting in the recordation of thousands of images. The image recordation rate (or frame capture rate) may be varied. Preferably, data processor storage unit 19 stores a series of images recorded by a device
40. The images may be combined consecutively to form a moving image of the images device 40 recorded as it moved through a patient's GI tract. This moving image may be displayed in a window on monitor 18. The moving image may be frozen to view one frame, speeded up, or reversed; sections may be skipped; or any other method for viewing an image may be applied to the moving image.
Preferably, the image data recorded and transmitted by the device 40 is digital color image data, although in alternate embodiments other image formats may be used. In an exemplary embodiment, each frame of image data includes 256 rows of 256 pixels each, each pixel including bytes for color and brightness, according to known methods. For example, in each pixel, color may be represented by a mosaic of four sub-pixels, each sub-pixel corresponding to primaries such as red, green, or blue (where one primary is represented twice). The brightness of the overall pixel is recorded by a one byte (i.e., 0- 255) brightness value. Preferably, images are stored sequentially in data processor storage unit 19. The stored data is comprised of one or more pixel properties, including color and brightness.
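Under the frame layout just described, the storage required per frame can be estimated. This is an illustrative sketch only; the one-byte-per-color-sub-pixel assumption is ours and is not stated explicitly in the description:

```python
# Illustrative frame-size estimate; the one byte per color sub-pixel
# is an assumption, not part of the described format.
ROWS, COLS = 256, 256     # 256 rows of 256 pixels each
SUBPIXELS_PER_PIXEL = 4   # mosaic of four color sub-pixels (one primary twice)
BRIGHTNESS_BYTES = 1      # one-byte (0-255) brightness value per pixel

def frame_size_bytes():
    # Total bytes for one frame: color sub-pixel bytes plus brightness byte.
    return ROWS * COLS * (SUBPIXELS_PER_PIXEL + BRIGHTNESS_BYTES)
```

At, for example, two such frames per second over several hours, thousands of frames accumulate, consistent with the data volumes described above.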
While, preferably, information gathering, storage and processing is performed by certain units, the system and method of the present invention may be practiced with alternate configurations. For example, the components gathering image information need not be contained in a device, but may be contained in any other vehicle suitable for traversing a lumen in a human body, such as an endoscope, stent, catheter, needle etc. Various methods may be used for determining the position of the device 40 as it gathers data. For example, U.S. patent 6,904,308 to Frisch et al. entitled "Array System and Method for Locating an In-Vivo Signal Source", incorporated by reference herein in its entirety, and U.S. patent 5,604,531 each describe methods for determining the location of an in-vivo device. Location may be determined by, for example, analyzing the signal strength of data received at multiple antennas 20 placed on a patient's body. Data from the antennas 20 or antenna array 22 may be sent or transmitted to the image receiver 12 via a connection, for example, a wire, and may be stored in storage 16. Other methods may be used; for example, methods where a beacon or six-degree-of-freedom sensor is included in device 40 itself. The distance traveled may be determined by, for example, methods disclosed in U.S. Patent Application US2006/0036166 entitled "System and Method for Determining Path Lengths Through a Body Lumen" incorporated by reference herein in its entirety. Distance traveled may be determined by using, for example, motility information, location information, or other information; such information may be, for example, summed. Other methods of determining the distance traveled may be used. Reference is made to Fig. 2, which shows a schematic diagram of a data viewing window 100, which may, for example, be viewed on monitor 18, according to one embodiment of the present invention. The window may include, for example, multiple data representations.
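The two location measures mentioned above — a position estimate from antenna signal strengths, and a distance traveled obtained by summing location increments — can be sketched roughly as follows. This is a simplified illustration, not the methods of the cited patents; a signal-strength-weighted centroid is only one plausible reading of "analyzing the signal strength of data received at multiple antennas", and all names are ours:

```python
def weighted_centroid(antenna_positions, signal_strengths):
    """Estimate a 2-D device position as the signal-strength-weighted
    average of the antenna positions (a simplified sketch)."""
    total = sum(signal_strengths)
    x = sum(p[0] * s for p, s in zip(antenna_positions, signal_strengths)) / total
    y = sum(p[1] * s for p, s in zip(antenna_positions, signal_strengths)) / total
    return (x, y)

def path_length(positions):
    """Approximate distance traveled by summing straight-line
    displacements between successive position estimates."""
    return sum(
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    )
```

For instance, a device reported equally strongly by two antennas would be placed midway between them, and the summed displacements of successive estimates give the value a distance line could display.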
In an exemplary embodiment, an image window 102 may display an image 103 that may be, for example an image-stream or still portions of that image, or any combination of these. Window 100 may include controls 104, for example, buttons or other symbols, which may alter the display of the image; for example, stop, play, pause, capture image, step, fast-forward, rewind, or other manipulators. Controls 104 may be activated by, for example, a pointing device such as a mouse or trackball. Window 100 may display other or different data such as pH or temperature data.
Preferably, the display window 100 includes localization displays, which may provide data about the position of the in-vivo device 40 in space or time. Localization displays may display data, wherein the location of the device 40 is the location of the device 40 at the time the data being displayed was captured. For example, if the device 40 captured a moving-image stream, then the localization displays may display data, wherein the location of the device indicated by the display may correspond to the location of the device at the time it captured the data being displayed. Typically, localization displays may display location data that corresponds to the image 103 frames displayed by image display 102. In one embodiment, the localization displays may display data that corresponds to individual frames. In another embodiment, localization displays may not display data that corresponds to individual image frames 103, but may correspond to multiple frames. For example, localization data may be displayed based on an average of location data of device 40 that may correspond to a number of frames. This embodiment may dampen the appearance of fluctuations in localization data, and the data displayed may appear more continuous or smooth. Localization displays may display data, wherein the data is computed from pre-recorded data.
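The averaging over a number of frames described above, which dampens fluctuations so the displayed location appears smoother, might look like the following sketch; the function name and the default window size are illustrative assumptions:

```python
def smoothed_locations(per_frame_positions, window=5):
    """Average each frame's (x, y) position estimate with its neighbors
    so the localization display changes smoothly rather than
    fluctuating frame to frame."""
    out = []
    for i in range(len(per_frame_positions)):
        # Clip the averaging window at the start and end of the stream.
        lo = max(0, i - window // 2)
        hi = min(len(per_frame_positions), i + window // 2 + 1)
        chunk = per_frame_positions[lo:hi]
        out.append((
            sum(p[0] for p in chunk) / len(chunk),
            sum(p[1] for p in chunk) / len(chunk),
        ))
    return out
```

A wider window gives a smoother but less responsive display, which matches the trade-off implied in the text.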
Window 100 may display, for example, a time display or timeline 106, or time bar, which may show temporal data that relates to device 40 as a function of time. In an exemplary embodiment, the time function may include, but is not limited to, the absolute time elapsed when the current image 103 or image stream 103, displayed in the image window
102, was taken. This absolute time elapsed may be, for example, the amount of time that elapsed between the moment the device 40 was first activated or inserted into the body (for example, by swallowing), or the image receiver 12 started receiving transmission from the device 40, and the moment that the current image being displayed was captured. In other embodiments, the time function may include the total time of the moving image, or a percentage of the total elapsed time for the current image 103 being shown in image window 102. In other embodiments, the time display or timeline 106 may show another suitable function of time. In an exemplary embodiment, the timeline 106 may be a bar labeled with time units or relative time units or other suitable labeling. The value of the time function, which corresponds to the current image 103 being shown, may be, for example, indicated by a time cursor 107 or other indicator that moves along the timeline 106 bar. Typically, the value of the time function of the device may be the value of the time function at the time the data being displayed was captured. For example, U.S. Patent Application serial number 10/950,480 published as US
Published Patent Application Document Number 2005/0075551, entitled "Device, System and Method for Presentation of In-Vivo Data", assigned to the common assignee of the present application and incorporated herein by reference, includes, in one embodiment, a time bar, which displays a color at each time interval, which is the average color of the moving image 103 over that time interval.
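The position of a cursor such as time cursor 107 along the timeline 106 bar (and, analogously, distance cursor 109 along the distance line 108) can be derived from the elapsed value as a fraction of the total. A minimal sketch, with illustrative names:

```python
def cursor_position(elapsed, total, bar_length_px):
    """Map an elapsed time (or distance traveled) onto a pixel offset
    along a linear bar of bar_length_px pixels."""
    frac = min(max(elapsed / total, 0.0), 1.0)  # clamp to the bar's ends
    return round(frac * bar_length_px)
```

The same mapping serves both bars, which is what makes displaying them in parallel directly comparable.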
Window 100 may display, for example, a distance display or distance line 108 or trace bar, which may show distance or displacement data that relates to device 40, as a function of distance. In an exemplary embodiment, this function may include, but is not limited to, the absolute distance device 40 traveled at the time the current image 103 or image stream 103, displayed in image window 102, was taken. The absolute distance may be, for example, the distance between the position where the device 40 was first activated or inserted into the body (for example, by swallowing), or where the image receiver 12 started receiving transmission from the device 40, and the position of the device 40 at the moment that the current image 103 being displayed was captured. In other embodiments, the distance function may include the total distance device 40 travels, or the percentage of the total distance the device 40 has traveled at the time the current image 103 was taken. In other embodiments, the distance display or distance line 108 may show any function of distance, including, but not limited to, all that are mentioned here. In an exemplary embodiment, the distance line 108 may be a bar labeled with distance units or relative distance units or other suitable labeling. The value of the distance function, which corresponds to the current image 103 being shown, may be, for example, indicated by a distance cursor 109 or other indicator that moves along the distance line 108 bar. Typically, the value of the distance function of the device may be the value of the distance function at the time the data being displayed was captured. Window 100 may display a region location diagram 110. The region location diagram 110 may be a schematic representation of a two or three-dimensional space or volume of the body through which the device 40 travels. The region location diagram 110 may be divided into multiple regions 112. A region 112 may represent a section of the two or three-dimensional space region location diagram 110 represents.
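A region location diagram 110 divided into four quadrants could determine the active region 112' from a normalized device position as in the sketch below; the 2x2 layout, the index order, and the function name are illustrative assumptions:

```python
def active_region(x, y):
    """Return the quadrant index (0-3) of a unit square that a
    normalized device position falls into: 0 = upper-left,
    1 = upper-right, 2 = lower-left, 3 = lower-right.
    y is measured downward from the top of the diagram."""
    col = 1 if x >= 0.5 else 0
    row = 1 if y >= 0.5 else 0
    return row * 2 + col
```

The display would then highlight only the region 112' whose index is returned, leaving the other quadrants unhighlighted.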
Region location diagram 110 may be used to show in which region 112 of that space device 40 is located. Regions 112 may be of any number, shape or size, for example, four quadrants. Typically, the shape and size of, and the number of, regions 112 of region location diagram 110, may vary depending on the specific details of device 40, the space through which device 40 travels, and the specific application. The regions 112 may be of equal or unequal size. In an exemplary embodiment, the region location diagram 110 may be used, for example, to represent the human torso. In such an embodiment, the human torso is schematically illustrated by region location diagram 110, which is, for example, a square divided into four quadrants, where the orientation of the region location diagram 110 matches that of the torso of an erect human, where, for example, the upper-right-hand region 112 of the region location diagram 110 represents the upper-right-hand three-dimensional space of a region of the human torso. For example, when device 40 is determined to be located in a given space of the torso, the display highlights the corresponding region 112' of region location diagram 110, where that region 112' may, for example, light up, have its appearance altered, or change color or undergo any other form of focus. The highlighted region 112 is, for example, the active location region 112', shown in Figure 2. Since, in an embodiment where the GI tract is sensed, the GI tract forms a non-linear path, the region location diagram 110 may offer information that the distance line 108 may not. Window 100 may display a body overlay 114, or another contextual schematic diagram of a body, for example, a human torso, that is displayed over region location diagram 110, as in Figure 2. Body overlay 114 may clarify the correspondence between the region location diagram and the space through which the device travels. Reference is made to Fig.
3, which shows a schematic diagram of an antenna display 120. Window 100 may display an antenna display 120. Antenna display 120 may indicate which antenna 20 of the antenna array 22 may have collected the data being displayed. In an exemplary embodiment, the antenna array 22 may be a number of antennas 20, typically, attached externally to the body that contains device 40. For example, antennas may be placed around a patient's body; for example, a set of antennas may be placed around a patient's torso. Preferably, all antennas 20 may receive data from device 40. In one embodiment, one or more antennas 20' may be selected for data collection (e.g., after the signal strength of the antennas is compared) and the received data may be displayed. The data from the other antennas 20 may not be displayed. Antennas 20 may be selected for data transmission for display based on, for example, their proximity to device 40 or the strength of the data signal they receive from device 40 or any other suitable measure that may determine which of the antennas 20' record the most relevant or accurate data. The antenna display 120 may show which antennas are being used to receive data, and/or the strength of the signal at various antennas 20, which is typically a measure of the proximity of device 40 to the antennas. Thus, antenna display 120 may be used as a localization display. Display 120 may indicate where device 40 may be located relative to the antennas 20' of the antenna array 22. In an exemplary embodiment shown in Fig. 3, the antenna display 120 may schematically display a number of antennas 20 by indicating a number of antenna representations 122, for example circles, points or dots. Other representations may be used. The antenna display 120 may indicate the number and arrangement of antennas 20 in the antenna array 22.
For example, if eight antennas 20 are placed around the waist of a human torso, antenna display 120 may show eight dots 122, where each dot 122 may correspond to one antenna 20, where the geometric arrangement of dots 122 may correspond to the geometric arrangement of the antennas
20 they represent. For example, the dots 122 may be positioned along a circular or elliptical curve, which may represent or suggest the curve of the waist. When antennas 20' are selected to transmit data for display, antenna display 120 may, for example, highlight the antenna representations 122' of antenna display 120 that may correspond to the selected antennas 20'. The antenna representations 122' of antenna display 120 may be highlighted by, for example, lighting up, having their appearance altered, or changing color or undergoing any other form of focus. In one embodiment, shown in Figure 3, a single antenna 20' is selected to transmit data for display. The antenna representation 122' that may correspond to the selected antenna 20' may be highlighted by, for example, surrounding the antenna representation 122' by a shape, for example, a square. The shape may be, for example, colored, and the color, color intensity or brightness may vary to correspond with signal strength, device 40 proximity to the antenna 20', or any other measure of the relevancy or accuracy of the data the antenna 20' transmits. If localization data is displayed that may correspond to multiple antennas 20', each selected antenna representation 122' may be highlighted. If a number of antennas 20' may be selected with varying priority, the display may show this by highlighting the antenna representations 122' to a varying degree, for example, by highlighting the different representations 122' with different colors or different intensities of color. Alternately or in addition, the signal strength at each antenna, not only those selected for data, or at only those antennas selected for data, may be indicated by, for example, a color, brightness, size, etc. For example, each antenna representation may be displayed in a certain color or brightness based on signal strength, or the size of the icon or representation of the antenna may be altered based on signal strength.
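Selecting the antenna with the strongest signal and mapping signal strength to a highlight intensity, as described above, can be sketched as follows. The names are illustrative, and a real selection might also weigh proximity or other relevancy measures:

```python
def select_antenna(signal_strengths):
    """Pick the index of the antenna with the strongest signal; its
    representation 122' would then be highlighted in the display."""
    return max(range(len(signal_strengths)), key=lambda i: signal_strengths[i])

def strength_to_intensity(strength, max_strength):
    """Map a signal strength onto a 0-255 highlight intensity, e.g. the
    brightness of the square drawn around the selected representation."""
    return round(255 * min(strength / max_strength, 1.0))
```

Applying `strength_to_intensity` to every antenna, not just the selected one, gives the variant where all representations are colored by signal strength.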
In various embodiments, one or more of the timeline 106, the distance line 108, region location diagram 110 and antenna display 120, may be displayed in window 100. In an exemplary embodiment, the timeline 106 may be a tracking display that may report a temporal measure of the progress of device 40 along the GI tract, the distance line 108 may be a tracking display that may report a distance measure of the progress of device 40 along the GI tract, and the region location diagram 110 may be a tracking display that may report a general region of the GI tract where device 40 is located. Although it is not specifically mentioned, it is assumed that other displays may be shown on the window 100, in conjunction with the localization display combinations discussed. Typically, the image window 102, image 103 and controls 104 displays may be shown. Typically, some combination of the timeline 106, the distance line 108, region location diagram 110 and antenna display 120 may be displayed to clarify the location and, for example, the motility of device 40 that corresponds with the image 103 displayed. In one embodiment, combining diagram 110 and the distance line 108, by, for example, placing them side by side or adjacently, may produce two measures of the location of device 40. Together, these measures may be used to localize device 40 with greater accuracy. Combining the diagram 110 with the timeline 106 may produce a similar advantage, since the timeline 106 may be interpreted as an average of the distance line 108. In another embodiment, a display that combines the distance line 108 and the timeline 106, for example, by displaying them in parallel, may clarify the average motility of device 40. For instance, if the distance cursor 109 lags the time cursor 107, then on average, up to that time, device 40 may have moved slower than during its entire recorded journey.
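The comparison of the distance cursor against the time cursor, used above to judge whether the device has so far moved slower or faster than its whole-journey average, can be sketched as follows; the function name and the normalized inputs are illustrative assumptions:

```python
def average_motility(elapsed_time, total_time, distance, total_distance):
    """Compare the fraction of total time elapsed with the fraction of
    total distance traveled. If the distance fraction lags the time
    fraction, the device has so far moved slower than its
    whole-journey average, and vice versa."""
    time_frac = elapsed_time / total_time
    dist_frac = distance / total_distance
    if dist_frac < time_frac:
        return "slower than average"
    if dist_frac > time_frac:
        return "faster than average"
    return "average"
```

The same comparison, applied over a sliding interval rather than from the start, could flag the stationary periods discussed next.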
Conversely, if the distance cursor 109 is leading the time cursor 107, then on average, up to that time, device 40 may have moved faster than during its entire recorded journey. This display model may also clarify instances where device 40 is stationary. A halt to the device 40 motility may, for example, indicate an abnormality in the GI tract that may prevent the device 40 from moving regularly. It may also, for example, indicate that the device 40 may be located in a region of the tract where it does not move for periods of time, such as the stomach. In this case, the halt of device 40 may show a change in its path, which may mark the anatomical region where it is located. The time and distance displays may be of any design, including, but not limited to, linear diagrams with indicators of measure. For example, in other embodiments, a time display may be a circular display that resembles a clock. When the time and distance displays are combined, they are typically adjacent, which may clarify the relationship between device 40 time and distance data. However, the time and distance displays need not be adjacent, and may be positioned in window 100 in any suitable configuration. In an exemplary embodiment, when the time and distance displays are linear diagrams with indicators of measure, they may, for example, be displayed in parallel. It may be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather, the scope of the invention is defined by the claims that follow:

Claims

CLAIMS

What is claimed is:
1. A method for displaying localization information of an in-vivo device, comprising:
displaying a schematic diagram of a function of the distance the device has traveled, when the device captured the data being displayed; and
displaying a schematic diagram of a function of the time the device has traveled, when the device captured the data being displayed.
2. The method of Claim 1, further comprising displaying a moving-image stream captured by the device, wherein the function of time and the function of distance displayed, correspond to that function of time and function of distance, respectively, of the device at the time it captured the portion of the image stream being displayed.
3. The method of Claim 1, wherein the data displayed is computed from prerecorded data.
4. A system for displaying localization data of an in-vivo device, comprising:
a processor for displaying a schematic diagram of the space through which the device may travel, the diagram divided into a plurality of sections; and
wherein the processor is for displaying a highlighted section, where the device is determined to be located,
wherein the processor is to display a schematic diagram of a function of the distance the device has traveled, when the device captured the data being displayed.
5. The system of Claim 4, wherein the location of the device is the location of the device at the time the data being displayed was captured.
6. The system of Claim 4, wherein the device is an in-vivo imaging device.
7. The system of Claim 4, wherein the processor is to display a moving-image stream captured by the device, wherein the location indicated by the highlighted section corresponds to the location of the device at the time it captured the portion of the image stream being displayed.
8. The system of Claim 4, wherein the processor is to display a schematic diagram of a function of the time the device has traveled, when the device captured the data being displayed.
9. The system of Claim 4, wherein the data displayed is computed from prerecorded data.
10. A system for displaying localization information of an in-vivo device, comprising:
a processor for displaying a schematic diagram of a function of the distance the device has traveled, when the device captured the data being displayed;
wherein the processor is for displaying a schematic diagram of a function of the time the device has traveled, when the device captured the data being displayed.
11. The system of Claim 10, wherein the displays are adjacent.
12. The system of Claim 10, wherein the schematic diagrams are linear.
13. The system of Claim 10, wherein the displays are parallel.
14. The system of Claim 10, wherein the processor is for displaying a moving-image stream captured by the device; and wherein the function of time and the function of distance displayed, correspond to that function of time and that function of distance, respectively, of the device at the time it captured the portion of the image stream being displayed.
15. The system of Claim 10, wherein the data displayed is computed from prerecorded data.
PCT/IL2006/001477 2005-12-27 2006-12-24 System and method for displaying the location of an in-vivo device WO2007074440A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US75341505P 2005-12-27 2005-12-27
US60/753,415 2005-12-27

Publications (2)

Publication Number Publication Date
WO2007074440A2 true WO2007074440A2 (en) 2007-07-05
WO2007074440A3 WO2007074440A3 (en) 2009-04-23

Family ID: 38218374

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2006/001477 WO2007074440A2 (en) 2005-12-27 2006-12-24 System and method for displaying the location of an in-vivo device

Country Status (1)

Country Link
WO (1) WO2007074440A2 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6246913B1 (en) * 1997-02-14 2001-06-12 Oractec Interventions, Inc. Method and apparatus for the treatment of strabismus
US20030193589A1 (en) * 2002-04-08 2003-10-16 Lareau Andre G. Multispectral or hyperspectral imaging system and method for tactical reconnaissance
US6810207B2 (en) * 2002-05-13 2004-10-26 Olympus Corporation Camera
US20040243533A1 (en) * 2002-04-08 2004-12-02 Wsi Corporation Method for interactively creating real-time visualizations of traffic information
US20060052684A1 (en) * 2002-05-07 2006-03-09 Takashi Takahashi Medical cockpit system
US7350236B1 (en) * 1999-05-25 2008-03-25 Silverbrook Research Pty Ltd Method and system for creation and use of a photo album


Also Published As

Publication number Publication date
WO2007074440A3 (en) 2009-04-23

Similar Documents

Publication Publication Date Title
AU2005214199B2 (en) System and method for editing an image stream captured in vivo
US7567692B2 (en) System and method for detecting content in-vivo
EP1474927B1 (en) System and method for displaying an image stream
US7724928B2 (en) Device, system and method for motility measurement and analysis
US8150124B2 (en) System and method for multiple viewing-window display of capsule images
US7577283B2 (en) System and method for detecting content in-vivo
JP5227496B2 (en) In-vivo image display system and operating method thereof
EP2290613B1 (en) System and method for presentation of data streams
US8446465B2 (en) System and method for displaying an image stream captured in-vivo
US8724868B2 (en) System and method for display of panoramic capsule images
US7805178B1 (en) Device, system and method of receiving and recording and displaying in-vivo data with user entered data
WO2002045567A2 (en) Method and system for use of a pointing device with moving images
US20110085021A1 (en) System and method for display of panoramic capsule images
JP5116070B2 (en) System for motility measurement and analysis
US8401262B2 (en) Device, system and method for motility measurement and analysis
EP1762171B1 (en) Device, system and method for determining spacial measurements of anatomical objects for in-vivo pathology detection
EP1900322A1 (en) Device and method for displaying images received from an in-vivo imaging device
WO2007074440A2 (en) System and method for displaying the location of an in-vivo device
WO2007077554A2 (en) System and method for displaying an image stream
EP1853066B1 (en) System and method for displaying an image stream

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06821660

Country of ref document: EP

Kind code of ref document: A2