WO2007005367A2 - Robotic image guided catheter-based surgical devices and techniques - Google Patents


Info

Publication number
WO2007005367A2
WO2007005367A2 (PCT/US2006/024796)
Authority
WO
WIPO (PCT)
Prior art keywords
catheter
movement
image
master
surgical
Prior art date
Application number
PCT/US2006/024796
Other languages
French (fr)
Other versions
WO2007005367A3 (en)
Inventor
Frederic Moll
Gopal Chopra
Original Assignee
Intuitive Surgical, Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical, Inc filed Critical Intuitive Surgical, Inc
Publication of WO2007005367A2 publication Critical patent/WO2007005367A2/en
Publication of WO2007005367A3 publication Critical patent/WO2007005367A3/en

Classifications

    • A HUMAN NECESSITIES
        • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
                    • A61B2017/00477 Coupling
                • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
                    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
                    • A61B34/30 Surgical robots
                        • A61B34/35 Surgical robots for telesurgery
                        • A61B34/37 Master-slave robots
                        • A61B2034/305 Details of wrist mechanisms at distal ends of robotic arms
                    • A61B34/70 Manipulators specially adapted for use in surgery
                        • A61B34/72 Micromanipulators
                        • A61B34/75 Manipulators having means for prevention or compensation of hand tremors
                        • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
                • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
                    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
                        • A61B90/361 Image-producing devices, e.g. surgical cameras
                        • A61B90/37 Surgical systems with images on a monitor during operation
                            • A61B2090/374 NMR or MRI
                            • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy

Definitions

  • This present invention is generally related to telesurgical devices, systems, and methods.
  • the invention provides systems and methods for robotically treating tissues with reference to a remote imaging modality (such as magnetic resonance imaging, computerized tomography, ultrasound, digital angiography, positron emission tomography, or the like).
  • Exemplary embodiments employ flexible tissue treating catheters for neurosurgery, cardiology, and the like; orthopedic probes for drilling of bone and the like, or other treatment structures.
  • Minimally invasive medical techniques are intended to reduce the amount of extraneous tissue which is damaged during diagnostic or surgical procedures, thereby reducing patient recovery time, discomfort, and deleterious side effects.
  • One effect of minimally invasive surgery may be reduced post-operative hospital recovery times. Because the average hospital stay for a standard surgery is typically significantly longer than the average stay for an analogous minimally invasive surgery, increased use of minimally invasive techniques could save millions of dollars in hospital costs each year. While many of the surgeries performed each year in the United States could potentially be performed in a minimally invasive manner, only a portion of the current surgeries use these advantageous techniques due to limitations in minimally invasive surgical instruments and the additional surgical training involved in mastering them.
  • Minimally invasive robotic surgical or telesurgical systems have been developed to increase a surgeon's dexterity and avoid some of the limitations on traditional minimally invasive techniques.
  • the surgeon uses some form of remote control, e.g., a servomechanism or the like, to manipulate surgical instrument movements.
  • the surgeon can be provided with an image of the surgical site at the surgical workstation. While viewing a two or three dimensional image of the surgical site on a display, the surgeon performs the surgical procedures on the patient by manipulating master control devices, which in turn control motion of the servomechanically operated instruments.
  • the servomechanism used for telesurgery will often accept input from two master controllers (one for each of the surgeon's hands) and may include two or more robotic arms or manipulators, on each of which a surgical instrument is mounted. Operative communication between master controllers and associated robotic arm and instrument assemblies is typically achieved through a control system.
  • the control system typically includes at least one processor which relays input commands from the master controllers to the associated robotic arm and instrument assemblies and back from the instrument and arm assemblies to the associated master controllers in the case of, e.g., force feedback or the like.
  • One example of a robotic surgical system is the DA VINCI® system available from Intuitive Surgical, Inc. of Mountain View, CA.
  • the new telesurgical devices have significantly advanced the art, providing huge potential improvements in endoscopic procedures. However, as with many such advances, still further improvements would be desirable.
  • The present invention generally provides improved telesurgical devices, systems, and methods.
  • Exemplary embodiments provide catheter-based treatments which can be directed using a remote imaging modality such as computerized tomography, magnetic resonance imaging, ultrasound imaging, digital angiography, positron emission tomography, and the like. These catheter-based systems are particularly well suited for neurosurgery and/or at least partially intravascular approaches.
  • Alternative embodiments may include orthopedic probes for drilling bone or the like. Regardless, movement of the catheter or probe may be robotically directed so that a treatment surface movement as seen in a display device at least substantially corresponds to a movement input to an input device.
  • the processor may optionally include a tremor filter so as to reduce input hand tremor, and may also scale the input movements and display to facilitate very precise treatments. Motion artifacts shown in the display may be reduced or eliminated, significantly facilitating the procedure. Providing real-time image guidance and allowing the surgeon or other system operator to direct the treatment without the risks presented by emissions from medical imaging systems may significantly extend the number of patients who can be treated using telesurgery.
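  The tremor filtering and motion scaling mentioned above can be sketched as follows. This is an illustrative assumption rather than the application's actual control code; the filter coefficient and scale factor (`alpha`, `scale`) are invented for the example.

```python
# Sketch of master-side input conditioning: a first-order low-pass filter
# (exponential moving average) to attenuate hand tremor, followed by a
# scale factor so large hand motions become small, precise probe motions.
# The alpha and scale values are illustrative assumptions.

def make_input_pipeline(alpha=0.1, scale=0.2):
    """Return a function mapping raw master positions (mm) to
    filtered, scaled incremental slave motion commands (mm)."""
    state = {"filtered": None, "last": None}

    def process(raw_position):
        f = state["filtered"]
        # Low-pass filter: blend the new sample with the previous estimate.
        f = raw_position if f is None else (1 - alpha) * f + alpha * raw_position
        state["filtered"] = f
        last = state["last"] if state["last"] is not None else f
        state["last"] = f
        # Scale the incremental filtered motion down for precision.
        return scale * (f - last)

    return process
```

  A first-order filter of this kind attenuates physiological tremor (roughly 8-12 Hz) while passing slower deliberate motion; a production controller would use a properly designed digital filter.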
  • the invention provides a surgical system comprising an image capture device for acquiring an image of an internal surgical site through an intermediate tissue.
  • a display is coupled to the image capture device, and an input device is near the display.
  • a catheter has a proximal end, a distal end, a flexible catheter body between the ends, and a therapy delivery surface disposed near the distal end.
  • a manipulator can be coupled to the proximal end of the catheter so as to effect movement of the distal end.
  • a processor couples the input device to the manipulator. The processor is configured to determine movement of the distal end of the catheter in response to movement of the input device.
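  One way the processor might determine distal-end movement from input-device movement is to rotate the incremental master motion from the display (eye) frame into the image frame and apply a scale factor. This is a sketch under stated assumptions, not the application's implementation; in practice the rotation would come from the registration between the imaging system and the manipulator, while here it is an illustrative constant.

```python
# Sketch: map an incremental master (input-device) motion, expressed in
# the display/eye frame, into a commanded distal-end motion in the image
# frame, so the probe tip appears connected to the operator's hand.

import math

def rotate_z(theta):
    """3x3 rotation about the z axis, as nested lists."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def map_master_to_slave(delta_master, eye_to_image_rotation, scale=1.0):
    """Transform an incremental master motion (x, y, z) into a
    distal-end motion command in the image frame."""
    return tuple(
        scale * sum(eye_to_image_rotation[i][j] * delta_master[j] for j in range(3))
        for i in range(3)
    )
```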
  • the invention provides an orthopedic surgical system.
  • the system comprises an image capture device for acquiring an image of an orthopedic surgical site through an intermediate tissue.
  • a display is coupled to the image capture device.
  • An input device is near the display.
  • a probe has a proximal end, a distal end, and an orthopedic therapy delivery surface disposed near the distal end.
  • a manipulator can be coupled to the proximal end of the probe so as to effect movement of the distal end.
  • a processor couples the input device to the manipulator. The processor is configured to effect movement of the distal end of the probe in response to movement of the input device so as to allow real-time image-guided robotic orthopedic therapy.
  • FIG. 1 is a schematic block diagram illustrating a telesurgical system in which a three-dimensional remote imaging modality provides real-time image guidance
  • FIG. 1A shows a three-dimensional view of an operator station of a telesurgical system in accordance with the invention
  • FIG. 2 shows a three-dimensional view of a patient-side cart or surgical station of the telesurgical system, the cart carrying three robotically controlled arms, the movement of the arms being remotely controllable from the operator station shown in FIG. 1A;
  • FIG. 3 shows a side view of a robotic arm and surgical instrument assembly
  • FIG. 4 shows a three-dimensional view of a surgical instrument
  • FIG. 5 shows, at an enlarged scale, a wrist member and end effector of the surgical instrument shown in FIG. 4, the wrist member and end effector being movably mounted on a working end of a shaft of the surgical instrument;
  • FIG. 6 shows a three-dimensional view of the master control device showing a wrist gimbal mounted on an articulated arm portion
  • FIG. 7 shows a schematic three-dimensional drawing indicating the positions of the end effectors relative to a viewing end of an endoscope and the corresponding positions of master control devices relative to the eyes of an operator, typically a surgeon;
  • FIG. 8 shows a schematic three-dimensional drawing indicating the position and orientation of an end effector relative to an imaging Cartesian coordinate reference system
  • FIG. 9 shows a schematic three-dimensional drawing indicating the position and orientation of a pincher formation of the master control device relative to an eye Cartesian coordinate reference system
  • FIG. 10 shows a schematic side view of part of the surgical station of the minimally invasive surgical apparatus indicating the location of Cartesian reference coordinate systems used to determine the position and orientation of an end effector relative to an image capturing device
  • FIG. 11 shows a schematic side view of part of the operator station of the minimally invasive surgical apparatus indicating the location of Cartesian reference coordinate systems used by the control systems of the minimally invasive surgical apparatus to determine the position and orientation of the input device relative to an eye of the system operator
  • FIG. 12 schematically illustrates a high level control architecture model of a master/slave surgical robotic system
  • FIG. 13 shows a block diagram representing control steps followed by the control system of the minimally invasive surgical apparatus in effecting control between input device positional and orientational movement and end effector positional and orientational movement;
  • FIG. 14 shows a fragmentary portion of the insertion portion of an endoscope for use with the present invention.
  • FIG. 15 is a flow chart schematically illustrating a method for providing realtime image guidance during a telesurgical procedure.
  • the present invention generally provides improved telesurgical devices, systems, and methods.
  • Embodiments of the invention may be particularly well suited for effective, precise, and less-traumatic ways of destroying, removing, and/or treating tumors and vascular abnormalities through robotic neurosurgery.
  • Alternatively embodiments may provide robotic endovascular and other catheter-based telesurgeries, orthopedic telesurgery, and the like.
  • the devices, systems, and methods described herein may optionally employ significant portions of commercially available robotic surgical systems, including the DaVinci™ surgical system available from Intuitive Surgical, Inc. of Sunnyvale, California.
  • the enhancements in precision and dexterity provided by robotic surgeries may be extended to image guided neurosurgery, interventional cardiology, endovascular surgery, orthopedic surgery, and a range of additional surgical interventions.
  • embodiments of the invention may increase the quality of care for patients by reducing trauma, measurably improving clinical outcomes, reducing costs, and the like.
  • objects appear "substantially connected” if a direction of an incremental positional movement of a first object matches the direction of an incremental positional movement of the second object (often as seen in an image).
  • Matching directions need not be exactly equal, as the objects (or the object and the image) may be perceived as being connected if the angular deviation between the movements remains less than about ten degrees, preferably being less than about five degrees.
  • objects and/or images may be perceived as being "substantially and orientationally connected" if they are substantially connected and if the direction of an incremental orientational movement of the first object is matched by the direction of an incremental orientational movement of the second object (often as seen in an image displayed near the first object).
  • Magnitude connection indicates substantial connection and that the magnitudes of orientational and/or positional movements of the first object and second object (typically as seen in an image) are directly related. The magnitudes need not be equal, so that it is possible to accommodate scaling and/or warping within a substantially magnitude connected robotic system.
  • Orientational magnitude connection will imply substantial and orientational connection as well as related orientational movement magnitudes, while substantial and magnitude connection means substantial connection with positional magnitudes being related.
  • a first object appears absolutely positionally connected with an image of a second object if the objects are substantially connected and the position of the first object and the position of the image of the second object appear to match, i.e., to be at the same location, during movement.
  • a first object appears absolutely orientationally connected with an image of the second object if they are substantially connected and the orientation of the first object and the second object appear to match during movement.
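  The "substantially connected" criterion defined above can be checked numerically: two incremental movement vectors match in direction when the angle between them stays below the stated threshold (about ten degrees, preferably five). A minimal sketch, with the function names invented for the example:

```python
# Sketch: test whether two incremental movements appear "substantially
# connected" by measuring the angular deviation between their directions.

import math

def angle_between(u, v):
    """Angle in degrees between two 3-vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cosang = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.degrees(math.acos(cosang))

def substantially_connected(delta_first, delta_second, limit_deg=10.0):
    """True when the directions deviate by less than limit_deg degrees."""
    return angle_between(delta_first, delta_second) < limit_deg
```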
  • a telesurgical system 1 allows a system operator O to perform a surgical treatment on an internal tissue of patient P.
  • An image capture system 2 obtains three dimensional images of the internal tissue, and an image of the tissue is displayed to the system operator O on a display device 3. While observing the image in the display device, and by reference to that image, the system operator manipulates one or more handles of an input device 4.
  • a processor 5 calculates movements of a treatment probe 6.
  • the processor transmits signals corresponding to the calculated movements to the robotic manipulator 7, and the manipulator in response effects movement of the probe.
  • movement of a treatment surface near a distal end of probe 6 (which may comprise a flexible catheter, an orthopedic probe, or the like) as seen in display 3 at least substantially corresponds to a movement of a handle of input device 4.
  • System 1 integrates telesurgical control over movement of treatment probe 6 with one or more remote image capture devices 2.
  • Image capture device 2 may employ any of a variety of medical imaging modalities, including computerized tomography, magnetic resonance imaging, ultrasound, digital angiography, positron emission tomography, planar or bi-planar fluoroscopy and/or the like.
  • Three dimensional positional and orientational data may be obtained from image capture device 2, typically using image processing in processor 5.
  • positional and orientational information regarding treatment probe 6 may be provided by sensors of manipulator 7, with information from both sources optionally being combined to effect feedback and control over an operative procedure, despite any anatomical movement of targets or tissues during the therapy.
  • this information can be used to allow robotic tissue manipulation, resection, and the like under real-time control of system operator O.
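  One simple way to combine the two position sources described above (manipulator sensors and image processing) is a fixed-weight complementary blend, leaning on the fast sensor estimate while the slower image-derived estimate corrects for drift and anatomical movement. This is an assumption for illustration, not the application's stated algorithm:

```python
# Sketch: blend the probe-tip position reported by the manipulator's
# joint sensors with the position recovered from image processing.
# The weight is an illustrative assumption.

def fuse_positions(sensor_pos, image_pos, image_weight=0.3):
    """Blend two (x, y, z) position estimates of the probe tip."""
    w = image_weight
    return tuple((1 - w) * s + w * i for s, i in zip(sensor_pos, image_pos))
```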
  • System 1 will typically provide the system operator O with effectively constant or sufficiently rapid sequential updated images so as to allow the system operator to robotically manipulate the treatment probe 6 by reference to the displayed image despite tissue movements, with the processor 5 indicating relative positions between the tissue and probe.
  • System operator O may thus have the ability to make clinical judgments and to perform delicate surgical intervention with up-to-date anatomic and functional data.
  • system operator O may have sufficient imaging information to understand the borders of a tumor as the system operator is effecting its removal, to evaluate the risk of post-operative neural-functional deficit by precisely and accurately identifying the tissue and/or function of that tissue just prior to and/or during manipulation, retraction, or removal of the tissue.
  • Such capabilities may be provided by system 1, and particularly by including a three dimensional image capture system capable of providing information regarding three dimensional shift of neural tissues intra-operatively.
  • Three dimensional shifting of neural tissues during an operation may be quite frequent, in part due to cerebrospinal fluid (CSF) leakage, the effects of gravity during tissue removal, and the like.
  • tissue shifting can be tracked and accounted for during the procedure using system 1.
  • the manipulator and the associated robotically actuated probe 6 may determine and track the position of at least a portion of the probe relative to the anatomy of patient P. Tracking may be effected using an image processing module of processor 5, maintaining a known relative position between image capture device 2 and manipulator 7, and/or the like. Intra-operatively acquiring and updating the relative or absolute position of the therapeutic probe 6 and anatomy of patient P may reduce surgical error, improving patient outcomes and avoiding duplicate or repetitive surgeries caused by incomplete or inaccurate therapies (for example, residual tumors may be less likely to be left in the patient due to lack of information on the intra-operative location of abnormal cells, improper shunt placement resulting from a lack of reliable operative guidance regarding the positioning of the catheter may be inhibited, or the like).
  • Intra-operative functional imaging feedback may help determine the safety and efficacy of, for example, a cell transplant or device implant procedure.
  • Delicate cell transplantation, gene implantation, and the like may be effected using system 1, ideally with assistance of such real-time intra-operative image guidance, giving system operator O the ability to establish, in the course of the procedure, the probable therapeutic outcome.
  • x-ray imaging can play a major role in the guidance of surgical procedures.
  • Sophisticated imaging techniques can provide, through the use of appropriate modules of processor 5, high quality three-dimensional images depicting anatomy, pathology, vascularity, and function.
  • Ongoing advancement of image guided surgery allows images derived from different image capture devices 2, different imaging modalities, and different views of patient P to be registered, optionally through image overlay of a variety of scans and registration to the patient via fiducial markers.
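  Registration to the patient via fiducial markers, as mentioned above, can be illustrated with the planar rigid case, which has a closed-form solution; the three-dimensional case is analogous (e.g. via the Kabsch/SVD method). The function below is a sketch, not the application's implementation:

```python
# Sketch: recover the rigid transform (rotation theta, translation tx, ty)
# mapping fiducial positions in one frame onto their positions in another.
# Planar closed form: center both point sets, recover the rotation from
# the summed cross/dot terms, then compute the translation.

import math

def register_fiducials_2d(src, dst):
    """Return (theta, tx, ty) mapping src fiducials onto dst fiducials."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    num = den = 0.0
    for (sx, sy), (dx_, dy_) in zip(src, dst):
        ax, ay = sx - csx, sy - csy        # centered source point
        bx, by = dx_ - cdx, dy_ - cdy      # centered destination point
        num += ax * by - ay * bx           # cross terms -> sin component
        den += ax * bx + ay * by           # dot terms  -> cos component
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty
```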
  • processing and image capture components of the image capture system may be commercially available or included in systems available commercially from a variety of suppliers.
  • system 1 may find applications in a variety of neurosurgical tasks, including tumor resection, shunt placement, vascular aneurism (surgical and catheter based) repair, neurofunctional restorative surgery, cell transplantation, and the like.
  • system 1 may employ any of a wide variety of imaging modalities.
  • the image capture devices may at least in part comprise magnetic resonance imaging structures.
  • at least the portion of treatment probe 6 used within an imaging field will often comprise a nonferromagnetic material, with optionally all of probe 6 and at least a portion of manipulator 7 comprising such a material.
  • Suitable materials may comprise nonferromagnetic metals such as titanium and the like, and the probe and manipulator may be designed so as to be compatible with a magnetic resonance imaging field.
  • Compatibility with such a field may be provided by removal and/or shielding of any ferromagnetic parts of, for example, manipulator 7 or couplings between the manipulator and other components of system 1 near the magnetic field.
  • movements of probe 6 via manipulator 7 may be interspersed with image capture, rather than continuously performing both robotic movements and imaging simultaneously. Depending on the image capture and manipulator cycle rates, such interspersing may be noticeable to the system operator or the interspersing may be transparent.
  • robotic movements of probe 6 may be sequenced so as to inhibit robotic movements during scanning intervals. By inhibiting motion of the actuators of manipulator 7 while a scan is taken, motion artifacts in an image may be reduced or eliminated.
  • one or more image capture intervals may be initiated by system operator O.
  • the system may stop movements before imaging and restart them after imaging without losing position and/or orientation awareness.
  • processor 5 may hold one or more treatment probe 6 at a position during imaging, or may remove treatment probe 6 from a position in an image field and return it into that position after imaging, or the like, particularly during intra-operative image updating (such as during a magnetic resonance image capture or the like). Movement of treatment probe 6 and articulated robotic structures may be filtered so as to reduce or remove tremor of the system operator's hand. Such tremor can be particularly significant when the imaging and/or manipulator systems are used with high magnification for precise therapies such as delivery of clips and other devices to occlude or ablate vascular aneurisms, using a surgical or intravascular approach, or the like.
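  The sequencing described above, inhibiting robotic motion during scan intervals while holding or deferring commanded movement, can be sketched as a small state machine. This is an illustrative assumption; `position` stands in one-dimensionally for the probe pose:

```python
# Sketch: a sequencer that inhibits actuator motion while an image is
# being acquired, accumulates motion commanded during the scan, and
# applies it once the scan completes, so no position awareness is lost.

class ScanMoveSequencer:
    def __init__(self):
        self.scanning = False
        self.position = 0.0      # 1-D stand-in for the probe pose
        self.pending = 0.0       # motion accumulated during a scan

    def begin_scan(self):
        self.scanning = True

    def end_scan(self):
        self.scanning = False
        # Apply motion that was commanded while imaging was active.
        self.position += self.pending
        self.pending = 0.0

    def command_move(self, delta):
        if self.scanning:
            self.pending += delta   # hold: defer until the scan ends
        else:
            self.position += delta  # move immediately
```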
  • system operator O and components of robotic system 1 may be isolated from the imaging field using isolation 9. This may facilitate, for example, providing the surgeon with an ability to control and monitor movements of probe 6 while the system operator O remains safely isolated from the patient (optionally at a sufficient distance from the patient) so as to inhibit injury to the surgeon from potentially harmful prolonged exposure to radiation from x-rays, computerized tomography, angiography, or the like.
  • image capture device 2 will provide a three-dimensional view of the target anatomy via video transmission. The three-dimensional view may be provided by a wide range of image capture devices, including optical viewing devices, magnetic resonance systems, computerized tomography systems, angiography systems, or the like.
  • the operator may accurately and safely position and/or orient one or more treatment surfaces of one or more probes 6 relative to a target tissue or anatomy of patient P.
  • the treatment surfaces may comprise stimulators, electrodes, implant receptacles, transplantable biological tissue receptacles, or the like.
  • robotic control elements of system 1 together with image processing and/or image capture device control elements so as to positionally relate the robotic and imaging functionalities, registration and/or coordination of movement of the treatment probe 6 with the images shown in display 3 may be provided. Exemplary techniques for coordinating image capture and input movement will be described below, with many other alternatives also being available.
  • system 1 may find utility in a variety of surgical specialties, including orthopedic surgery and cardiology. Examples of potential uses in orthopedics include real-time drilling of the femoral canal using robotics controlled by imaging in the radiology suite. System 1's combination of robotics and imaging may obviate any need for fiducial markers or the like, dramatically simplifying such drilling procedures. Other uses in orthopedics include operations on the lumbar and thoracic spine.
  • Examples of the potential use for system 1 in cardiology include positionally aware robotically controlled image guided catheter placement.
  • system operator O can control both the speed and direction of the catheter movement quite accurately, despite any disparity between the input movement and the output movement seen at the distal end of probe 6 by the image capture system.
  • Such robotic control of the tip of the flexible catheter may dramatically reduce the training for traversing intravascular anatomies, and may reduce the difficulty and/or increase the efficiency in traversing tortuous and diseased vessels.
  • the use of system 1 for cardiology and other body lumen-based treatments may also allow the system operator O to be isolated from the surgical site, protecting the surgeon from exposure to potentially harmful ionizing radiation.
  • System 1 may significantly enhance the use of remote imaging modalities for navigation to a desired surgical site through the use of real-time digital image displays and robotic operative control over probe movements.
  • a combination of remote imaging and telesurgical probe manipulation may thus have a variety of advantages.
  • an operator station or surgeon's console of a minimally invasive telesurgical system is generally indicated by reference numeral 200.
  • the station 200 includes a viewer 202 where an image of a surgical site is displayed in use.
  • a support 204 is provided on which an operator, typically a surgeon, can rest his or her forearms while gripping two master controls (not shown in FIG. IA), one in each hand.
  • the master controls are positioned in a space 206 inwardly beyond the support 204.
  • the surgeon typically sits in a chair in front of the control station 200, positions his or her eyes in front of the viewer 202 and grips the master controls one in each hand while resting his or her forearms on the support 204.
  • a cart or surgical station of the telesurgical system is generally indicated by reference numeral 300.
  • the cart 300 is positioned close to a patient requiring surgery and is then normally caused to remain stationary until a surgical procedure to be performed has been completed.
  • the cart 300 typically has wheels or castors to render it mobile.
  • the station 200 is typically positioned remote from the cart 300 and can be separated from the cart 300 by a great distance, even miles away, but will typically be used within an operating room with the cart 300.
  • the cart 300 typically carries three robotic arm assemblies.
  • One of the robotic arm assemblies is arranged to hold an image capturing device 304, e.g., a remote image device, an endoscope, or the like.
  • Each of the two other arm assemblies 10, 10 respectively includes a surgical instrument 14. While described in portions of the following description with reference to endoscopic instruments and/or image capture devices, many embodiments will instead include intravascular and/or orthopedic instruments and remote imaging systems.
  • the endoscope 304 has a viewing end 306 at a remote end of an elongate shaft thereof. It will be appreciated that the endoscope 304 has an elongate shaft to permit its viewing end 306 to be inserted through an entry port into an internal surgical site of a patient's body.
  • the endoscope 304 is operatively connected to the viewer 202 to display an image captured at its viewing end 306 on the viewer 202.
  • Each robotic arm assembly 10, 10 is normally operatively connected to one of the master controls. Thus, the movement of the robotic arm assemblies 10, 10 is controlled by manipulation of the master controls.
  • the instruments 14 of the robotic arm assemblies 10, 10 have end effectors which are mounted on wrist members which are pivotally mounted on distal ends of elongate shafts of the instruments 14, as is described in greater detail hereinbelow. It will be appreciated that the instruments 14 have elongate shafts to permit the end effectors to be inserted through entry ports into the internal surgical site of a patient's body.
  • Movement of the end effectors relative to the ends of the shafts of the instruments 14 is also controlled by the master controls.
  • the robotic arms 10, 10, 302 are mounted on a carriage 97 by means of setup joint arms 95.
  • the carriage 97 can be adjusted selectively to vary its height relative to a base 99 of the cart 300, as indicated by arrows K.
  • the setup joint arms 95 are arranged to enable the lateral positions and orientations of the arms 10, 10, 302 to be varied relative to a vertically extending column 93 of the cart 300. Accordingly, the positions, orientations and heights of the arms 10, 10, 302 can be adjusted to facilitate passing the elongate shafts of the instruments 14 and the endoscope 304 through the entry ports to desired positions relative to the surgical site.
  • the setup joint arms 95 and carriage 97 are typically locked in position.
  • Each assembly 10 includes an articulated robotic arm 12, and a surgical instrument, schematically and generally indicated by reference numeral 14, mounted thereon.
  • FIG. 4 indicates the general appearance of the surgical instrument 14 in greater detail.
  • the surgical instrument 14 includes an elongate shaft 14.1.
  • the wrist-like mechanism, generally indicated by reference numeral 50, is located at a working end of the shaft 14.1.
  • a housing 53, arranged releasably to couple the instrument to the robotic arm 12, is located at an opposed end of the shaft 14.1.
  • the shaft 14.1 extends along an axis indicated at 14.2.
  • the instrument 14 is typically releasably mounted on a carriage 11, which can be driven to translate along a linear guide formation 24 of the arm 12 in the direction of arrows P.
  • the robotic arm 12 is typically mounted on a base or platform at an end of its associated setup joint arm 95 by means of a bracket or mounting plate 16.
  • the robotic arm 12 includes a cradle, generally indicated at 18, an upper arm portion 20, a forearm portion 22 and the guide formation 24.
  • the cradle 18 is pivotally mounted on the plate 16 in a gimbaled fashion to permit rocking movement of the cradle 18 about a pivot axis 28.
  • the upper arm portion 20 includes link members 30, 32 and the forearm portion 22 includes link members 34, 36.
  • the link members 30, 32 are pivotally mounted on the cradle 18 and are pivotally connected to the link members 34, 36.
  • the link members 34, 36 are pivotally connected to the guide formation 24.
  • the pivotal connections between the link members 30, 32, 34, 36, the cradle 18, and the guide formation 24 are arranged to constrain the robotic arm 12 to move in a specific manner, specifically such that a pivot center 49 is coincident with the port of entry, so that movement of the arm does not excessively affect the surrounding tissue at the port of entry.
  • the wrist-like mechanism 50 includes a wrist member 52.
  • One end portion of the wrist member 52 is pivotally mounted in a clevis, generally indicated at 17, on the end 14.3 of the shaft 14.1 by means of a pivotal connection 54.
  • the wrist member 52 can pivot in the direction of arrows 56 about the pivotal connection 54.
  • An end effector, generally indicated by reference numeral 58 is pivotally mounted on an opposed end of the wrist member 52.
  • the end effector 58 is in the form of, e.g., a clip applier for anchoring clips during a surgical procedure. Accordingly, the end effector 58 has two parts 58.1, 58.2 together defining a jaw-like arrangement.
  • the end effector can be in the form of any desired surgical tool, e.g., having two members or fingers which pivot relative to each other, such as scissors, pliers for use as needle drivers, or the like. Instead, it can include a single working member, e.g., a scalpel, cautery electrode, or the like.
  • If a tool other than a clip applier is desired during the surgical procedure, the instrument 14 is simply removed from its associated arm and replaced with an instrument bearing the desired end effector, e.g., a scissors, or pliers, or the like.
  • the end effector 58 is pivotally mounted in a clevis, generally indicated by reference numeral 19, on an opposed end of the wrist member 52, by means of a pivotal connection 60. It will be appreciated that free ends 11, 13 of the parts 58.1, 58.2 are angularly displaceable about the pivotal connection 60 toward and away from each other as indicated by arrows 62, 63. It will further be appreciated that the members 58.1, 58.2 can be displaced angularly about the pivotal connection 60 to change the orientation of the end effector 58 as a whole, relative to the wrist member 52.
  • each part 58.1, 58.2 is angularly displaceable about the pivotal connection 60 independently of the other, so that the end effector 58, as a whole, is angularly displaceable about the pivotal connection 60 as indicated in dashed lines in FIG. 5.
  • the shaft 14.1 is rotatably mounted on the housing 53 for rotation as indicated by the arrows 59.
  • the end effector 58 has three degrees of freedom of movement relative to the arm 12, namely, rotation about the axis 14.2 as indicated by arrows 59, angular displacement as a whole about the pivot 60 and angular displacement about the pivot 54 as indicated by arrows 56.
  • a hand held part or wrist gimbal of the master control device 700 is generally indicated by reference numeral 699.
  • Part 699 has an articulated arm portion including a plurality of members or links connected together by pivotal connections or joints. The surgeon grips the part 699 by positioning his or her thumb and index finger over a pincher formation. When the pincher formation is squeezed between the thumb and index finger, the fingers or end effector elements of the end effector 58 close. When the thumb and index finger are moved apart the fingers of the end effector 58 move apart in sympathy with the moving apart of the pincher formation.
  • the joints of the part 699 are operatively connected to actuators, e.g., electric motors, or the like, to provide for, e.g., force feedback, gravity compensation, and/or the like, as described in greater detail hereinbelow.
  • the part 699 is typically mounted on an articulated arm 712.
  • the articulated arm 712 includes a plurality of links 714 connected together at pivotal connections or joints 714.
  • the articulated arm 712 has appropriately positioned actuators, e.g., electric motors, or the like, to provide for, e.g., force feedback, gravity compensation, and/or the like.
  • appropriately positioned sensors e.g., encoders, or potentiometers, or the like, are positioned on the joints so as to enable joint positions of the articulated arm 712 to be determined by the control system.
  • To move the orientation of the end effector 58 and/or its position along a translational path, the surgeon simply moves the pincher formation to cause the end effector 58 to move to where he or she wants the end effector 58 to be in the image viewed in the viewer 202. Thus, the end effector position and/or orientation is caused to follow that of the pincher formation.
  • the master control devices 700, 700 are typically mounted on the station 200 through pivotal connections.
  • the electric motors and sensors associated with the robotic arms 12 and the surgical instruments 14 mounted thereon, and the electric motors and sensors associated with the master control devices 700 are operatively linked in the control system.
  • the control system typically includes at least one processor, typically a plurality of processors, for effecting control between master control device input and responsive robotic arm and surgical instrument output and for effecting control between robotic arm and surgical instrument input and responsive master control output in the case of, e.g., force feedback.
  • the surgeon views the surgical site through the viewer 202.
  • the end effector 58 carried on each arm 12 is caused to perform positional and orientational movements in response to movement and action inputs on its associated master controls.
  • the master controls are indicated schematically at 700, 700. It will be appreciated that during a surgical procedure images of the end effectors 58 are captured by the endoscope 304 together with the surgical site and are displayed on the viewer 202 so that the surgeon sees the responsive movements and actions of the end effectors 58 as he or she controls such movements and actions by means of the master control devices 700, 700.
  • the control system is arranged to cause end effector orientational and positional movement as viewed in the image at the viewer 202 to be mapped onto orientational and positional movement of a pincher formation of the master control as will be described in greater detail hereinbelow.
  • the operation of the control system of the minimally invasive surgical apparatus will now be described in greater detail. In the description which follows, the control system will be described with reference to a single master control 700 and its associated robotic arm 12 and surgical instrument 14.
  • the master control 700 will be referred to simply as "master” and its associated robotic arm 12 and surgical instrument 14 will be referred to simply as "slave.”
  • Control between master and slave movement is achieved by comparing master position and orientation in an eye Cartesian coordinate reference system with slave position and orientation in a camera Cartesian coordinate reference system.
  • The Cartesian coordinate reference system will simply be referred to as a "frame" in the rest of this specification. Accordingly, when the master is stationary, the slave position and orientation within the camera frame is compared with the master position and orientation in the eye frame, and should the position and/or orientation of the slave in the camera frame not correspond with the position and/or orientation of the master in the eye frame, the slave is caused to move to a position and/or orientation in the camera frame at which its position and/or orientation in the camera frame does correspond with the position and/or orientation of the master in the eye frame.
  • the camera frame is generally indicated by reference numeral 610 and the eye frame is generally indicated by reference numeral 612 in FIG. 9.
  • the new master position and/or orientation does not correspond with the previously corresponding slave position and/or orientation in the camera frame 610.
  • the control system then causes the slave to move into a new position and/or orientation in the camera frame 610 at which new position and/or orientation, its position and orientation in the camera frame 610 does correspond with the new position and/or orientation of the master in the eye frame 612.
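The comparison-and-correction behavior described above can be illustrated with a minimal, position-only sketch. This is not the patent's implementation: the function name and proportional gain are assumptions, and the actual control system also matches orientation between the two frames.

```python
def follow_step(master_pos_eye, slave_pos_camera, gain=0.5):
    """One servo cycle: move the slave position (camera frame) a fraction
    of the way toward the master position (eye frame)."""
    return tuple(s + gain * (m - s)
                 for m, s in zip(master_pos_eye, slave_pos_camera))

# When the master is stationary and the slave already corresponds,
# no corrective movement is commanded.
slave = (1.0, 2.0, 3.0)
assert follow_step((1.0, 2.0, 3.0), slave) == slave

# After the master moves, repeated servo cycles drive the slave toward
# the new corresponding position in the camera frame.
master = (2.0, 2.0, 3.0)
for _ in range(50):
    slave = follow_step(master, slave)
```

In the real system this comparison is recomputed at the servo cycle rate (about 1000 Hz or more), so the slave appears to track the master continuously.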
  • The control system includes at least one, and typically a plurality, of processors which compute new corresponding positions and orientations of the slave in response to master movement input commands on a continual basis determined by the processing cycle rate of the control system.
  • a typical processing cycle rate of the control system under discussion is about 1000 Hz or more, often being about 1300 Hz.
  • the control system can have any appropriate processing cycle rate depending on the processor or processors used in the control system. All real-time servocycle processing is preferably conducted on a DSP (Digital Signal Processor) chip. DSPs are preferable because of their constant calculation predictability and reproducibility.
  • a SHARC DSP from Analog Devices, Inc. of Massachusetts is an acceptable example of such a processor for performing the functions described herein.
  • the camera frame 610 is positioned such that its origin 614 is positioned at the viewing end 306 of the endoscope 304.
  • the z axis of the camera frame 610 extends axially along a viewing axis 616 of the endoscope 304.
  • While the viewing axis 616 is shown in coaxial alignment with a shaft axis of the endoscope 304, it is to be appreciated that the viewing axis 616 can be angled relative thereto.
  • the endoscope can be in the form of an angled scope.
  • the x and y axes are positioned in a plane perpendicular to the z axis.
  • the endoscope is typically angularly displaceable about its shaft axis.
  • the x, y and z axes are fixed relative to the viewing axis of the endoscope 304 so as to displace angularly about the shaft axis in sympathy with angular displacement of the endoscope 304 about its shaft axis.
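The sympathetic rotation of the camera frame's x and y axes with endoscope roll might be sketched as follows; the roll-angle convention and function name are assumptions for illustration only.

```python
import math

def camera_frame_axes(shaft_roll):
    """Axes of camera frame 610: x and y rotate about the viewing (z) axis
    in sympathy with angular displacement of the endoscope about its shaft."""
    c, s = math.cos(shaft_roll), math.sin(shaft_roll)
    x_axis = (c, s, 0.0)
    y_axis = (-s, c, 0.0)
    z_axis = (0.0, 0.0, 1.0)  # fixed along viewing axis 616
    return x_axis, y_axis, z_axis
```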
  • a frame is defined on or attached to the end effector 58.
  • This frame is referred to as an end effector frame or slave tip frame, in the rest of this specification, and is generally indicated by reference numeral 618.
  • the end effector frame 618 has its origin at the pivotal connection 60.
  • one of the axes e.g. the z axis, of the frame 618 is defined to extend along an axis of symmetry, or the like, of the end effector 58.
  • the x and y axes then extend perpendicularly to the z axis.
  • orientation of the slave is then defined by the orientation of the frame 618 having its origin at the pivotal connection 60, relative to the camera frame 610.
  • position of the slave is then defined by the position of the origin of the frame at 60 relative to the camera frame 610.
  • the eye frame 612 is chosen such that its origin corresponds with a position 201 where the surgeon's eyes are normally located when he or she is viewing the surgical site at the viewer 202.
  • the z axis extends along a line of sight of the surgeon, indicated by axis 620, when viewing the surgical site through the viewer 202.
  • the x and y axes extend perpendicularly from the z axis at the origin 201.
  • the y axis is chosen to extend generally vertically relative to the viewer 202 and the x axis is chosen to extend generally horizontally relative to the viewer 202.
  • a point on the master is chosen which defines an origin of a master or master tip frame, indicated by reference numeral 622. This point is chosen at a point of intersection indicated by reference numeral 3A between axes of rotation 1 and 3 of the master.
  • the z axis of the master frame 622 on the master extends along an axis of symmetry of the pincher formation 706 which extends coaxially along the rotational axis 1.
  • the x and y axes then extend perpendicularly from the axis of symmetry 1 at the origin 3A.
  • orientation of the master within the eye frame 612 is defined by the orientation of the master frame 622 relative to the eye frame 612.
  • the position of the master in the eye frame 612 is defined by the position of the origin 3A relative to the eye frame 612.
  • FIG. 10 shows a schematic diagram of one of the robotic arm 12 and surgical instrument 14 assemblies mounted on the cart 300.
  • the linkages of the robotic arm and its associated instrument may be altered or tailored for positioning and moving a flexible catheter body, an orthopedic probe, or the like.
  • In use, when it is desired to perform a surgical procedure by means of the minimally invasive surgical apparatus, the surgical station 300 is moved into close proximity to a patient requiring the surgical procedure.
  • the patient is normally supported on a surface such as an operating table, or the like.
  • the surgical station 300 is provided with the ability to have varying initial setup configurations. Accordingly, the robotic arms 12, 12, and the endoscope arm 302 are mounted on the carriage 97 which is heightwise adjustable, as indicated by arrows K, relative to the base 99 of the cart 300, as can best be seen in FIGS. 2 and 10 of the drawings.
  • the robotic arms 12, 12 and the endoscope arm 302 are mounted on the carriage 97 by means of the setup joint arms 95.
  • the lateral position and orientation of the arms 12, 12, 302 can be selected by moving the setup joint arms 95.
  • Once the cart 300 has been moved into position in close proximity to the patient, an appropriate height of the carriage 97 is selected by moving it relative to the base 99, and the surgical instruments 14 are moved relative to the carriage 97 so as to introduce the shafts of the instruments 14 and the endoscope 304 through the ports of entry and into positions in which the end effectors 58 and the viewing end 306 of the endoscope 304 are appropriately positioned at the surgical site and the fulcrums are coincident with the ports of entry.
  • the carriage 97 is locked at its appropriate height and the setup joint arms 95 are locked in their positions and orientations. Normally, throughout the surgical procedure, the carriage 97 is maintained at the selected height and similarly the setup joint arms 95 are maintained in their selected positions. However, if desired, either the endoscope or one or both of the instruments can be introduced through other ports of entry during the surgical procedure.
  • the control system determines the position and orientation of the slave within the camera frame 610 by determining the position and orientation of the slave relative to a cart frame 624 and by determining the orientation and position of the endoscope 304 with reference to the same cart frame 624.
  • the cart frame 624 has an origin indicated by reference numeral 626 in FIG. 10.
  • the position of a fulcrum frame 630 having its origin at the fulcrum 49 is determined within the cart frame 624 as indicated by the arrow 628 in dashed lines. It will be appreciated that the position of the fulcrum 49 normally remains at the same location, coincident with a port of entry into the surgical site, throughout the surgical procedure.
  • the position of the end effector frame 618 on the slave, having its origin at the pivotal connection 60, is then determined relative to the fulcrum frame 630 and the orientation of the end effector frame 618 on the slave is also determined relative to the fulcrum frame 630.
  • the position and orientation of the end effector frame 618 relative to the cart frame is then determined by means of routine calculation using trigonometric relationships.
  • the robotic arm 302 of the endoscope 304 is constrained to move in similar fashion to the robotic arm 10.
  • the endoscope 304 when positioned with its viewing end 306 directed at the surgical site, also defines a fulcrum coincident with its associated port of entry into the surgical site.
  • the endoscope arm 302 can be driven to cause the endoscope 304 to move into a different position during a surgical procedure, to enable the surgeon to view the surgical site from a different position in the course of performing the surgical procedure. It will be appreciated that movement of the viewing end 306 of the endoscope 304 is performed by varying the orientation of the endoscope 304 relative to its pivot center or fulcrum.
  • the position and orientation of the camera frame 610 within the cart frame 624 is determined in similar fashion to the position and orientation of the slave within the cart frame 624.
  • the position and orientation of the camera frame 610 relative to the cart frame 624, and the position and orientation of the slave relative to the cart frame 624 have been determined in this manner, the position and the orientation of the slave relative to the camera frame 610 is readily determinable through routine calculation using trigonometric relationships.
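The "routine calculation using trigonometric relationships" amounts to composing homogeneous transforms. A minimal sketch, assuming made-up poses for the camera frame 610 and end effector frame 618 expressed in the common cart frame 624 (helper names and the example poses are illustrative, not from the patent):

```python
import math

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_inv_rigid(T):
    """Invert a rigid-body transform: inverse rotation is the transpose,
    inverse translation is -R^T * t."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [T[i][3] for i in range(3)]
    t_inv = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [t_inv[0]], Rt[1] + [t_inv[1]],
            Rt[2] + [t_inv[2]], [0.0, 0.0, 0.0, 1.0]]

def rot_z(a, tx=0.0, ty=0.0, tz=0.0):
    """Homogeneous transform: rotation about z plus a translation."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0, tx], [s, c, 0.0, ty],
            [0.0, 0.0, 1.0, tz], [0.0, 0.0, 0.0, 1.0]]

# Hypothetical poses in the cart frame 624:
T_cart_camera = rot_z(math.pi / 2, 0.1, 0.2, 0.5)  # camera frame 610
T_cart_slave = rot_z(0.0, 0.3, 0.2, 0.4)           # end effector frame 618

# Slave pose relative to the camera frame follows by composition:
T_camera_slave = mat_mul(mat_inv_rigid(T_cart_camera), T_cart_slave)
```

The same pattern of chained intermediate frames applies on the master side, relating the master frame to the eye frame through the surgeon's station frame.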
  • FIG. 11 shows a schematic diagram of one of the master controls 700 at the operator station 200.
  • the operator station 200 optionally also includes setup joint arms, as indicated at 632, to enable the general location of the masters 700, 700 to be varied to suit the surgeon.
  • the general position of the masters 700, 700 can be selectively varied to bring the masters 700, 700 into a general position at which they are comfortably positioned for the surgeon.
  • the setup joint arms 632 are locked in position and are normally maintained in that position throughout the surgical procedure.
  • To determine the position and orientation of the master 700 within the eye frame 612, the position and orientation of the eye frame 612 relative to a surgeon's station frame 634, and the position and orientation of the master 700 relative to the surgeon's frame 634, are determined.
  • the surgeon's station frame 634 has its origin at a location which is normally stationary during the surgical procedure, and is indicated at 636.
  • a position of a master setup frame 640 at an end of the setup joint arms 632 on which the master 700 is mounted, relative to the station frame 636, is determined, as indicated by the arrow 638 in dashed lines.
  • the position and orientation of the master frame 622 on the master 700 having its origin at 3A is then determined relative to the master setup frame 640.
  • the position and orientation of the master frame 622 relative to the frame 634 can be determined by means of routine calculation using trigonometric relationships.
  • the position and orientation of the eye frame 612 relative to the station frame 634 is determined in similar fashion.
  • the position of the viewer 202 relative to the rest of the surgeon's console 200 can selectively be varied to suit the surgeon.
  • the position and orientation of the master frame 622 relative to the eye frame 612 can then be determined from the position and orientation of the master frame 622 and the eye frame 612 relative to the surgeon station frame 634 by means of routine calculation using trigonometric relationships.
  • the control system of the minimally invasive surgical apparatus determines the position and orientation of the end effector 58 by means of the end effector frame 618 in the camera frame 610, and, likewise, determines the position and orientation of the master by means of the master frame 622 relative to the eye frame 612.
  • the surgeon grips the master by locating his or her thumb and index finger over the pincher formation 706. When the surgeon's thumb and index finger are located on the pincher formation, the point of intersection 3A is positioned inwardly of the thumb and index finger tips.
  • the master frame having its origin at 3A is effectively mapped onto the end effector frame 618, having its origin at the pivotal connection 60 of the end effector 58 as viewed by the surgeon in the viewer 202.
  • As the surgeon manipulates the position and orientation of the pincher formation 706 to cause the position and orientation of the end effector 58 to follow, it appears to the surgeon that his or her thumb and index finger are mapped onto the fingers of the end effector 58 and that the pivotal connection 60 of the end effector 58 corresponds with a virtual pivot point of the surgeon's thumb and index finger inwardly from the tips of the thumb and index finger.
  • actuation of the end effector 58 corresponds intuitively to the opening and closing of the surgeon's thumb and index finger.
  • actuation of the end effector 58 as viewed in the viewer 202 is performed by the surgeon in a natural intuitive manner, since the pivot point 60 of the end effector 58 is appropriately mapped onto a virtual pivot point between the surgeon's thumb and index finger.
  • the cart frame is chosen at 624. It will be appreciated that determining the position of the fulcrum frame 630 relative to the cart frame 624 is achieved through appropriately positioned sensors, such as potentiometers, encoders, or the like. Conveniently, the fulcrum frame position 630 relative to the cart frame 624 is determined through two intermediate frames. One of the frames is a carriage guide frame 644 which has its origin at a convenient location on a guide along which the carriage 97 is guided. The other frame, an arm platform frame indicated at 646 is positioned at an end of the setup joint arm 95 on which the robotic arm 12 is mounted.
  • the carriage guide frame 644 position relative to the cart frame 624 is determined, then the platform frame 646 position relative to the carriage guide frame 644, then the fulcrum frame 630 relative to the platform frame 646, and then the slave orientation and position relative to the fulcrum frame 630, thereby to determine the slave position and orientation relative to the cart frame 624.
  • the slave position and orientation relative to the cart frame 624 is determined in this manner for each arm 10 and in similar fashion for the camera frame 610, through its arm 302, relative to the cart frame 624.
  • the position and orientation of the master control is determined by determining the position of a base frame 648 relative to the surgeon's station frame 634, then determining the position of the platform frame 640 relative to the base frame 648, and then determining master position and orientation relative to the platform frame 640.
  • the position and orientation of the master frame 622 relative to the surgeon's station frame 634 is then readily determined through routine calculation using trigonometric relationships. It will be appreciated that the position and orientation of the other master frame relative to the surgeon console frame 634 is determined in a similar fashion.
  • FIG. 12 schematically illustrates a high level control architecture for a master/slave robotic system 1000.
  • a surgeon 1002 moves an input device of a master manipulator 1004 by applying manual or human forces f h against the input device.
  • Encoders of master manipulator 1004 generate master encoder signals e m which are interpreted by a master input/output processor 1006 to determine the master joint positions Q m
  • the master joint positions are used to generate Cartesian positions of the input device of the master x m using a master kinematics model 1008.
  • the tissue structures in the surgical workspace will impose forces f e against a surgical end effector (and possibly against other elements of the tool and/or manipulator).
  • Environmental forces f e from the surgical environment 1018 alter position of the slave 1016, thereby altering slave encoder values e s transmitted to the slave input/output processor 1014.
  • Slave input/output processor 1014 interprets the slave encoder values to determine joint positions Q s , which are then used to generate Cartesian slave position signals x s according to the slave kinematics processing block 1012.
  • the master and slave Cartesian positions x m , x s are input into bilateral controller 1010, which uses these inputs to generate the desired Cartesian forces to be applied by the slave f s so that the surgeon can manipulate the slave as desired to perform a surgical procedure. Additionally, bilateral controller 1010 uses the Cartesian master and slave positions x m , x s to generate the desired Cartesian forces to be applied by the master f m so as to provide force feedback to the surgeon.
  • bilateral controller 1010 will generate the slave and master forces f s , f m by mapping the Cartesian position of the master in the master controller workspace with the Cartesian position of the end effector in the surgical workspace according to a transformation.
  • the control system 1000 will derive the transformation in response to state variable signals provided from the imaging system so that an image of the end effector in a display appears substantially connected to the input device.
  • state variables will generally indicate the Cartesian position of the field of view from the image capture device, as supplied by the slave manipulators supporting the image capture device.
  • coupling of the image capture manipulator and slave end effector manipulator is beneficial for deriving this transformation.
  • bilateral controller 1010 may be used to control more than one slave arm, and/or may be provided with additional inputs.
  • Based generally on the difference in position between the master and the slave in the mapped workspace, bilateral controller 1010 generates a Cartesian slave force f s to urge the slave to follow the position of the master.
  • the slave kinematics 1012 are used to interpret the Cartesian slave forces f s to generate joint torques of the slave τ s which will result in the desired forces at the end effector.
  • Slave input/output processor 1014 uses these joint torques to calculate slave motor currents i s , which reposition the slave x e within the surgical worksite.
  • the desired feedback forces from the bilateral controller are similarly interpreted from the Cartesian force on the master f m based on the master kinematics 1008 to generate master joint torques τ m .
  • the master joint torques are interpreted by the master input/output controller 1006 to provide master motor current i m to the master manipulator 1004, which changes the position of the hand held input device x h in the surgeon's hand.
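A much-simplified, position-only sketch of this bilateral flow, with assumed proportional gains standing in for the actual controller design (names and gain values are illustrative, not the patent's):

```python
def bilateral_step(x_m, x_s, k_slave=2.0, k_master=0.5):
    """Return (f_s, f_m): a slave force that urges the slave to follow
    the master, and a master force that feeds the same position error
    back to the surgeon's hand as tactile feedback."""
    error = tuple(m - s for m, s in zip(x_m, x_s))
    f_s = tuple(k_slave * e for e in error)    # urge slave toward master
    f_m = tuple(-k_master * e for e in error)  # resist the surgeon's hand
    return f_s, f_m

# Master displaced 1 unit ahead of the slave along x:
f_s, f_m = bilateral_step((1.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

When the slave is held back by tissue forces, the persistent position error produces a persistent master force, which is what the surgeon feels as force feedback.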
  • The control system 1000 illustrated in FIG. 12 is a simplification.
  • the surgeon does not only apply forces against the master input device, but also moves the handle within the master workspace.
  • the motor current supplied to the motors of the master manipulator may not result in movement if the surgeon maintains the position of the master controller. Nonetheless, the motor currents do result in tactile force feedback to the surgeon based on the forces applied to the slave by the surgical environment.
  • While Cartesian coordinate mapping is preferred, the use of spherical, cylindrical, or other reference frames may provide at least some of the advantages of the invention.
  • the master control 700 has sensors, e.g., encoders, or potentiometers, or the like, associated therewith to enable the control system 810 to determine the position of the master control 700 in joint space as it is moved from one position to a next position on a continual basis during the course of performing a surgical procedure.
  • signals from these positional sensors are indicated by arrow 814.
  • Positional readings measured by the sensors at 687 are read by the processor.
  • Since the master control 700 includes a plurality of joints connecting one arm member thereof to the next, sufficient positional sensors are provided on the master 700 to enable the angular position of each arm member relative to the arm member to which it is joined to be determined, thereby enabling the position and orientation of the master frame 622 on the master to be determined.
  • as the angular positions of one arm member relative to the arm member to which it is joined are read cyclically by the processor 689 in response to movements induced on the master control 700 by the surgeon, the angular positions are continuously changing. The processor at 689 reads these angular positions and computes the rate at which they are changing.
  • the processor 689 reads angular positions and computes the rate of angular change, or joint velocity, on a continual basis corresponding to the system processing cycle time, i.e., 1300 Hz. Joint position and joint velocity commands thus computed at 689 are then input to the Forward Kinematics (FKIN) controller at 691, as already described hereinabove.
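By way of illustration only (not part of the original disclosure), the cyclic joint-velocity computation described above may be sketched in Python. The 1300 Hz cycle rate is taken from the text; the function name and joint readings are illustrative assumptions:

```python
CYCLE_HZ = 1300            # system processing cycle rate stated in the text
DT = 1.0 / CYCLE_HZ        # time between successive cyclic sensor readings


def joint_velocities(prev_angles, curr_angles, dt=DT):
    """Estimate joint velocities by finite differences between two
    successive cyclic readings of the master's joint position sensors."""
    return [(c - p) / dt for p, c in zip(prev_angles, curr_angles)]


# Two successive readings (radians) of a hypothetical three-joint master arm:
prev = [0.10, -0.20, 0.05]
curr = [0.10 + 0.001, -0.20, 0.05 - 0.002]
vel = joint_velocities(prev, curr)
```

The joint position and velocity pair computed this way would then be handed to the forward-kinematics stage.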
  • the positions and velocities in joint space are transformed into corresponding positions and velocities in Cartesian space, relative to the eye frame 612.
  • the FKIN controller 691 is a processor which typically employs a Jacobian (J) matrix to accomplish this. It will be appreciated that the Jacobian matrix transforms angular positions and velocities into corresponding positions and velocities in Cartesian space by means of conventional trigonometric relationships. Thus, corresponding positions and velocities in Cartesian space, or Cartesian velocity and position commands, are computed by the FKIN controller 691 which correspond to Cartesian position and velocity changes of the master frame 622 in the eye frame 612.
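The Jacobian-based transformation performed by the FKIN controller 691 can be illustrated, purely as a sketch, with a planar two-link arm in Python. The link lengths, angles, and function names are illustrative assumptions; the actual master kinematics are considerably more complex:

```python
import math


def jacobian_2link(theta1, theta2, l1=1.0, l2=1.0):
    """Jacobian of a planar two-link arm: maps joint velocities
    (dtheta1, dtheta2) to Cartesian tip velocity (dx, dy) via the
    conventional trigonometric relationships mentioned in the text."""
    s1, c1 = math.sin(theta1), math.cos(theta1)
    s12, c12 = math.sin(theta1 + theta2), math.cos(theta1 + theta2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]


def cartesian_velocity(J, joint_vel):
    """Multiply the Jacobian by the joint-velocity vector."""
    return [sum(J[i][j] * joint_vel[j] for j in range(2)) for i in range(2)]


# Arm stretched out along x; rotating the base joint moves the tip in +y:
J = jacobian_2link(0.0, 0.0)
tip_vel = cartesian_velocity(J, [1.0, 0.0])
```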
  • the minimally invasive surgical apparatus provides for a scale change between master control input movement and responsive slave output movement.
  • a scale can be selected where, for example, a 1-inch movement of the master control 700 is transformed into a corresponding responsive 1/5-inch movement on the slave.
  • the Cartesian position and velocity values are scaled in accordance with the scale selected to perform the surgical procedure.
  • where no scale change is selected, no change in scale is effected at 822.
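The scale-and-offset step at 822 may be sketched as follows (illustrative only; the 1/5 scale matches the example in the text, while the function name and vector form are assumptions):

```python
SCALE = 1.0 / 5.0   # e.g., a 1-inch master motion maps to a 1/5-inch slave motion


def scale_master_to_slave(cart_pos, cart_vel, scale=SCALE):
    """Apply the selected motion scale to Cartesian position and
    velocity commands before they enter the simulated domain."""
    return ([scale * p for p in cart_pos],
            [scale * v for v in cart_vel])


scaled_pos, scaled_vel = scale_master_to_slave([1.0, 0.0, 0.0], [5.0, 0.0, 0.0])
```

The inverse scale and offset converter at 826 would apply the reciprocal factor on the return path.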
  • offsets are taken into account which determine the corresponding position and/or orientation of the end effector frame 618 in the camera frame 610 relative to the position and orientation of the master frame 622 in the eye frame 612.
  • a resultant desired slave position and desired slave velocity in Cartesian space is input to a simulated or virtual domain at 812, as indicated by arrows 811.
  • the term "simulated or virtual domain" is used for identification only. Accordingly, the simulated control described hereinbelow is performed by elements outside the block 812 also.
  • the simulated domain 812 will be described in greater detail hereinbelow. However, the steps imposed on the desired slave velocity and position in the virtual domain 812 will now be described broadly for ease of understanding of the description which follows.
  • a current slave position and velocity is continually monitored in the virtual or simulated domain 812.
  • the desired slave position and velocity is compared with the current slave position and velocity. Should the desired slave position and/or velocity as input from 822 not cause transgression of limitations, e.g., velocity and/or position and/or singularity, and/or the like, as set in the virtual domain 812, a similar Cartesian slave velocity and position is output from the virtual domain 812 and input into an inverse scale and offset converter as indicated at 826.
  • the similar velocity and position output in Cartesian space from the virtual domain 812 is indicated by arrows 813 and corresponds with actual commands in joint space output from the virtual domain 812 as indicated by arrows 815 as will be described in greater detail hereinbelow.
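The limit checking performed in the virtual domain 812 may be sketched in Python. This is a simplified illustration only: symmetric bounds and the specific limit values are assumptions, and singularity limitations mentioned in the text are not modeled here:

```python
def clamp(value, lo, hi):
    """Restrict a value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))


def virtual_domain(desired_pos, desired_vel, pos_limit, vel_limit):
    """Enforce position and velocity limitations on the desired slave
    command, returning the achievable ('similar') Cartesian command
    that is output toward the inverse scale and offset converter."""
    pos = [clamp(p, -pos_limit, pos_limit) for p in desired_pos]
    vel = [clamp(v, -vel_limit, vel_limit) for v in desired_vel]
    return pos, vel


# A desired command that transgresses both limits on the second axis:
sim_pos, sim_vel = virtual_domain([0.5, 2.0], [0.1, -3.0], 1.0, 1.0)
```

Any difference between the desired and clamped commands is what later appears as an error signal at the Cartesian controller 820.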
  • from the inverse scale and offset converter 826, which performs the scale and offset step of 822 in reverse, the reverted Cartesian position and velocity is input into the Cartesian controller at 820.
  • the Cartesian position and velocity input from the FKIN controller 691 is compared with the Cartesian position and velocity input from the simulated domain 812.
  • the error signal is typically used to calculate a Cartesian force.
  • the Cartesian force is typically calculated, by way of example, in accordance with the following formula: F = KΔx + BΔẋ
  • K is a spring constant
  • B is a damping constant
  • Δẋ is the difference between the Cartesian velocity inputs to the Cartesian controller 820
  • Δx is the difference between the Cartesian position inputs to the Cartesian controller 820. It will be appreciated that for an orientational error, a corresponding torque in Cartesian space is determined in accordance with conventional methods.
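The spring-and-damper force law above may be sketched per Cartesian component in Python (the gain values below are illustrative assumptions, not values from the disclosure):

```python
def cartesian_force(K, B, dx, dx_dot):
    """F = K*Δx + B*Δẋ per component: a spring term on the position
    error plus a damping term on the velocity error."""
    return [K * x + B * xd for x, xd in zip(dx, dx_dot)]


# 1 cm position error and 0.1 m/s velocity error on the first axis:
f = cartesian_force(100.0, 10.0, [0.01, 0.0], [0.1, 0.0])
```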
  • the Cartesian force corresponds to an amount by which the desired slave position and/or velocity extends beyond the limitations imposed in the simulated domain 812.
  • the Cartesian force, which could result from a velocity limitation, a positional limitation, and/or a singularity limitation, as described in greater detail below, is then converted into a corresponding torque signal by means of the master transpose kinematics controller 828, which typically includes a processor employing a Jacobian Transpose (Jᵀ) matrix and kinematic relationships to convert the Cartesian force to a corresponding torque in joint space.
  • the torque thus determined is then input to a processor at 830 whereby appropriate electrical currents to the motors associated with the master 700 are computed and supplied to the motors. These torques are then applied on the motors operatively associated with the master control 700.
  • the effect of this is that the surgeon experiences a resistance on the master control to either move it at the rate at which he or she is urging the master control to move, or to move it into the position into which he or she is urging the master control to move.
  • the resistance to movement on the master control is due to the torque on the motors operatively associated therewith. Accordingly, the higher the force applied on the master control to urge the master control to move to a position beyond the imposed limitation, the higher the magnitude of the error signal and the higher an opposing torque on the motors resisting displacement of the master control in the direction of that force. Similarly, the higher the velocity imposed on the master beyond the velocity limitation, the higher the error signal and the higher the opposing torque on the motors associated with the master.
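The Jacobian-transpose mapping performed by the master transpose kinematics controller 828 may be sketched as follows (illustrative only; the matrix and force values are hypothetical):

```python
def master_joint_torques(J, cart_force):
    """tau = Jᵀ · f : map a Cartesian feedback force at the master tip
    to torques on the motors of the master's joints."""
    rows, cols = len(J), len(J[0])
    return [sum(J[i][j] * cart_force[i] for i in range(rows))
            for j in range(cols)]


# A diagonal toy Jacobian and a 2-D Cartesian feedback force:
tau = master_joint_torques([[1.0, 0.0], [0.0, 2.0]], [3.0, 4.0])
```

The resulting torques are what the processor at 830 converts into motor currents, producing the resistance the surgeon feels.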
  • the slave joint velocity and position signal is passed from the simulated domain 812 to a joint controller 848.
  • the resultant joint velocity and position signal is compared with the current joint position and velocity.
  • the current joint position and velocity is derived through the sensors on the slave as indicated at 849 after having been processed at an input processor 851 to yield slave current position and velocity in joint space.
  • the joint controller 848 computes the torques desired on the slave motors to cause the slave to follow the resultant joint position and velocity signal taking its current joint position and velocity into account.
  • the joint torques so determined are then routed to a feedback processor at 852 and to an output processor at 854.
  • the joint torques are typically computed, by way of example, by means of the following formula:
  • T = KΔθ + BΔθ̇, where K is a spring constant, B is a damping constant, Δθ̇ is the difference between the joint velocity inputs to the joint controller 848, and Δθ is the difference between the joint position inputs to the joint controller 848.
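This joint-space control law parallels the Cartesian force law given earlier and may be sketched per joint in Python (gain values are illustrative assumptions):

```python
def joint_torques(K, B, dtheta, dtheta_dot):
    """T = K*Δθ + B*Δθ̇ for each slave joint: spring term on the joint
    position error plus damping term on the joint velocity error."""
    return [K * q + B * qd for q, qd in zip(dtheta, dtheta_dot)]


# 0.02 rad position error and 0.2 rad/s velocity error on the first joint:
T = joint_torques(50.0, 5.0, [0.02, 0.0], [0.2, 0.0])
```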
  • the output processor 854 determines the electrical currents to be supplied to the motors associated with the slave to yield the commanded torques and causes the currents to be supplied to the motors as indicated by arrow 855.
  • force feedback is supplied to the master.
  • force feedback is provided on the master 700 whenever a limitation is induced in the simulated domain 812.
  • force feedback is provided directly from the slave 798, in other words, not through a virtual or simulated domain but through direct slave movement. This will be described in greater detail hereinbelow.
  • the slave indicated at 798 is provided with a plurality of sensors. These sensors are typically operatively connected to pivotal joints on the robotic arm 10 and on the instrument 14.
  • These sensors are operatively linked to the processor at 851. It will be appreciated that these sensors determine current slave position. Should the slave 798 be subjected to an external force great enough to induce reactive movement on the slave 798, the sensors will naturally detect such movement. Such an external force could originate from a variety of sources such as when the robotic arm 10 is accidentally knocked, or knocks into the other robotic arm 10 or the endoscope arm 302, or the like. As mentioned, the joint controller 848 computes torques desired to cause the slave 798 to follow the master 700. An external force on the slave 798 which causes its current position to vary also causes the desired slave movement to follow the master to vary.
  • a compounded joint torque is generated by the joint controller 848, which torque includes the torque desired to move the slave to follow the master and the torque desired to compensate for the reactive motion induced on the slave by the external force.
  • the torque generated by the joint controller 848 is routed to the feedback processor at 852, as already mentioned.
  • the feedback processor 852 analyzes the torque signal from the joint controller 848 and accentuates that part of the torque signal resulting from the extraneous force on the slave 798.
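One possible interpretation of this accentuation step is sketched below in Python. The gain value, the joint selection, and the availability of a separate estimate of the master-following (tracking) torque are all assumptions made for illustration:

```python
def accentuate_external_torque(total_torque, tracking_torque,
                               gain=5.0, arm_joints=(0, 1)):
    """Split the compounded joint torque into the part expected for
    following the master and the remainder attributed to an external
    force, then amplify only the external part on selected arm joints."""
    out = []
    for j, (total, track) in enumerate(zip(total_torque, tracking_torque)):
        external = total - track
        if j in arm_joints:
            out.append(track + gain * external)   # accentuated component
        else:
            out.append(track + external)          # passed through unchanged
    return out


# Joints 0 and 1 belong to the robotic arm; joint 2 does not:
fb = accentuate_external_torque([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])
```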
  • the part of the torque signal accentuated can be chosen depending on requirements. In this case, only the part of the torque signal relating to the robotic arm 10, 12, 302 joints is accentuated.
  • the torque signal after having been processed in this way is routed to a kinematic mapping block 860 from which a corresponding Cartesian force is determined.
  • the information determining slave fulcrum position relative to the camera frame is input from 647 as indicated.
  • the Cartesian force is readily determined relative to the camera frame.
  • This Cartesian force is then passed through a gain step at 862 appropriately to vary the magnitude of the Cartesian force.
  • the resultant force in Cartesian space is then passed to the summation junction at 827 and is then communicated to the master control 700 as described earlier.
  • Reference numeral 866 generally indicates another direct force feedback path of the control system 810, whereby direct force feedback is supplied to the master control 700.
  • the path 866 includes one or more sensors which are not necessarily operatively connected to slave joints. These sensors can typically be in the form of force or pressure sensors appropriately positioned on the surgical instrument 14, typically on the end effector 58. Thus, should the end effector 58 contact an extraneous body, such as body tissue at the surgical site, it generates a corresponding signal proportionate to the force of contact. This signal is processed by a processor at 868 to yield a corresponding torque. This torque is passed to a kinematic mapping block 864, together with information from 647 to yield a corresponding Cartesian force relative to the camera frame.
  • resultant force is passed through a gain block at 870 and then forwarded to the summation junction 827.
  • Feedback is imparted on the master control 700 by means of torque supplied to the motors operatively associated with the master control 700 as described earlier. It will be appreciated that this can be achieved by means of any appropriate sensors such as current sensors, pressure sensors, accelerometers, proximity detecting sensors, or the like.
  • resultant forces from kinematic mapping 864 may be transmitted to an alternative presentation block 864.1 so as to indicate the applied forces in an alternative format to the surgeon.
  • Reference is now made to FIG. 14, wherein a distal end portion, or tip, 260 of the insertion section of a flexible instrument or endoscope is shown.
  • the insertion end of the instrument includes a pair of spaced viewing windows 262R and 262L and an illumination source 264 for viewing and illuminating a workspace to be observed.
  • Light received at the windows is focused by objective lens means, not shown, and transmitted through fiberoptic bundles to a pair of cameras at the operating end of the instrument, not shown.
  • the camera outputs are converted to a three-dimensional image of the workspace, which image is located adjacent hand-operated means at the operator's station, not shown.
  • Right and left steerable catheters 268R and 268L pass through accessory channels in the endoscope body, which catheters are adapted for extension from the distal end portion, as illustrated.
  • End effectors 270R and 270L are provided at the ends of the catheters which may comprise conventional endoscopic instruments. Force sensors, not shown, also are inserted through the endoscope channels.
  • Steerable catheters which include control wires for controlling bending of the catheters and operation of an end effector suitable for use with this invention are well known.
  • Control motors for operation of the control wires are provided at the operating end of the endoscope, which motors are included in a servomechanism of a type described above for operation of the steerable catheters and associated end effectors from a remote operator's station.
  • the interfacing computer in the servomechanism system remaps the operator's hand motion into the coordinate system of the end effectors, and images of the end effectors are viewable adjacent the hand-operated controllers in a manner described above.
  • Flexible catheter-based instruments and probes of different types may be employed in this embodiment of the invention.
  • a method for using system 1 can be initiated by capturing an image 504 of the target anatomy and/or adjacent portions of treatment probe 6.
  • An image is displayed 506 to the system operator, with the image preferably comprising a three dimensional image of the surgical site.
  • the system operator inputs a command to move the treatment probe with reference to the displayed image 508.
  • the processor of the robotic system calculates a substantially corresponding movement 510.
  • the processor, manipulator, and probe effect a robotic movement of a treatment surface per the calculated movement 512.
  • the system operator may actuate the treatment surface or the like so as to treat tissue with the treatment surface 514.
  • the telesurgical system will again capture a new image 504, and the like.
  • Method 502 illustrated in FIG. 15 is a simplified schematic illustration, as treatment of a target tissue need not occur each time an image is captured. Additionally, the image capture and movement cycle times may differ.
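The capture/display/command/move cycle of method 502 may be sketched as a loop body in Python. All of the callables below are hypothetical placeholders standing in for system components; treatment (step 514) is optional per iteration, as the text notes:

```python
def image_guided_cycle(capture, display, read_input, solve_motion,
                       move, treat=None):
    """One iteration of method 502: capture an image (504), display it
    (506), read the operator's command (508), compute the corresponding
    probe movement (510), effect it (512), and optionally treat (514)."""
    image = capture()                 # step 504: image the target anatomy
    display(image)                    # step 506: show it to the operator
    command = read_input()            # step 508: operator input movement
    motion = solve_motion(command)    # step 510: processor computes motion
    move(motion)                      # step 512: manipulator moves probe
    if treat is not None:
        treat()                       # step 514: actuate treatment surface
    return motion


# Exercising the cycle with stub components:
shown, moved = [], []
result = image_guided_cycle(
    capture=lambda: "image",
    display=shown.append,
    read_input=lambda: [1.0, 2.0],
    solve_motion=lambda c: [0.2 * x for x in c],   # e.g. a 1/5 motion scale
    move=moved.append,
)
```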
  • position and/or orientation information from a distal portion of a flexible body may be provided at least in part by image processing techniques applied to one or more high contrast reference markers of the treatment probe.
  • the treatment probe may include structures which facilitate identifying a position of the distal end within the body, and/or which measure flexing of the catheter body along its length so that the position of the distal end may be calculated by processor 5.

Abstract

Telesurgical devices, systems, and methods provide catheter-based treatments which can be directed by a remote imaging modality for neurosurgery and/or at least partially intravascular approaches. Alternative embodiments include orthopedic probes. Movement of the catheter or probe may be robotically directed with reference to three-dimensional images.

Description

ROBOTIC IMAGE GUIDED CATHETER-BASED SURGICAL DEVICES AND TECHNIQUES
CROSS-REFERENCES TO RELATED APPLICATIONS
[0001] This non-provisional United States (U.S.) patent application claims the benefit of U.S. Provisional Patent Application No. 60/695,366 entitled "Robotic Image Guided Catheter-Based Surgical Devices and Techniques" filed on June 30, 2005 by inventors Frederic Moll et al.
STATEMENT AS TO RIGHTS TO INVENTIONS MADE UNDER FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT [0002] NOT APPLICABLE
REFERENCE TO A "SEQUENCE LISTING," A TABLE, OR A COMPUTER
PROGRAM LISTING APPENDIX SUBMITTED ON A COMPACT DISK. [0003] NOT APPLICABLE
BACKGROUND OF THE INVENTION [0004] The present invention is generally related to telesurgical devices, systems, and methods. In an exemplary embodiment, the invention provides systems and methods for robotically treating tissues with reference to a remote imaging modality (such as magnetic resonance imaging, computerized tomography, ultrasound, digital angiography, positron emission tomography, or the like). Exemplary embodiments employ flexible tissue treating catheters for neurosurgery, cardiology, and the like; orthopedic probes for drilling of bone and the like; or other treatment structures.
[0005] Minimally invasive medical techniques are intended to reduce the amount of extraneous tissue which is damaged during diagnostic or surgical procedures, thereby reducing patient recovery time, discomfort, and deleterious side effects. One effect of minimally invasive surgery, for example, may be reduced post-operative hospital recovery times. Because the average hospital stay for a standard surgery is typically significantly longer than the average stay for an analogous minimally invasive surgery, increased use of minimally invasive techniques could save millions of dollars in hospital costs each year. While many of the surgeries performed each year in the United States could potentially be performed in a minimally invasive manner, only a portion of the current surgeries use these advantageous techniques due to limitations in minimally invasive surgical instruments and the additional surgical training involved in mastering them.
[0006] Minimally invasive robotic surgical or telesurgical systems have been developed to increase a surgeon's dexterity and avoid some of the limitations on traditional minimally invasive techniques. In telesurgery, the surgeon uses some form of remote control, e.g., a servomechanism or the like, to manipulate surgical instrument movements. In telesurgery systems, the surgeon can be provided with an image of the surgical site at the surgical workstation. While viewing a two or three dimensional image of the surgical site on a display, the surgeon performs the surgical procedures on the patient by manipulating master control devices, which in turn control motion of the servomechanically operated instruments.
[0007] The servomechanism used for telesurgery will often accept input from two master controllers (one for each of the surgeon's hands) and may include two or more robotic arms or manipulators, on each of which a surgical instrument is mounted. Operative communication between master controllers and associated robotic arm and instrument assemblies is typically achieved through a control system. The control system typically includes at least one processor which relays input commands from the master controllers to the associated robotic arm and instrument assemblies and back from the instrument and arm assemblies to the associated master controllers in the case of, e.g., force feedback or the like. One example of a robotic surgical system is the DA VINCI® system available from Intuitive Surgical, Inc. of Mountain View, CA.
[0008] The new telesurgical devices have significantly advanced the art, providing huge potential improvements in endoscopic procedures. However, as with many such advances, still further improvements would be desirable. In general, it would be desirable to provide improved telesurgical devices, systems, and methods, particularly for performing telesurgical procedures which are not well suited for an endoscopic approach. In particular, it would be desirable to extend the improvements provided through robotics and telemanipulation to additional forms of surgery. It would be, for example, advantageous to facilitate robotic neurosurgery, robotic intravascular and other catheter-based therapies, robotic orthopedic surgeries, and the like. It would be particularly beneficial if these improved treatments and therapies could take advantage of at least a portion of the recently developed robotic surgical systems and technologies, optionally in combination with other medical technologies that are currently available.
BRIEF SUMMARY OF THE INVENTION
[0009] The present invention generally provides improved telesurgical devices, systems, and methods. Exemplary embodiments provide catheter-based treatments which can be directed using a remote imaging modality such as computerized tomography, magnetic resonance imaging, ultrasound imaging, digital angiography, positron emission tomography, and the like. These catheter-based systems are particularly well suited for neurosurgery and/or at least partially intravascular approaches. Alternative embodiments may include orthopedic probes for drilling bone or the like. Regardless, movement of the catheter or probe may be robotically directed so that a treatment surface movement as seen in a display device at least substantially corresponds to a movement input to an input device. The processor may optionally include a tremor filter so as to reduce input hand tremor, and may also scale the input movements and display to facilitate very precise treatments. Motion artifacts shown in the display may be reduced or eliminated, significantly facilitating the procedure. Providing real-time image guidance and allowing the surgeon or other system operator to direct the treatment without the risks presented by emissions from medical imaging systems may significantly extend the number of patients which can be treated using telesurgery. [0010] In a first aspect, the invention provides a surgical system comprising an image capture device for acquiring an image of an internal surgical site through an intermediate tissue. A display is coupled to the image capture device, and an input device is near the display. A catheter has a proximal end, a distal end, a flexible catheter body between the ends, and a therapy delivery surface disposed near the distal end. A manipulator can be coupled to the proximal end of the catheter so as to effect movement of the distal end. A processor couples the input device to the manipulator.
The processor is configured to determine movement of the distal end of the catheter in response to movement of the input device.
[0011] In another aspect, the invention provides an orthopedic surgical system. The system comprises an image capture device for acquiring an image of an orthopedic surgical site through an intermediate tissue. A display is coupled to the image capture device. An input device is near the display. A probe has a proximal end, a distal end, and an orthopedic therapy delivery surface disposed near the distal end. A manipulator can be coupled to the proximal end of the probe so as to effect movement of the distal end. A processor couples the input device to the manipulator. The processor is configured to effect movement of the distal end of the probe in response to movement of the input device so as to allow real-time image-guided robotic orthopedic therapy.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIG. 1 is a schematic block diagram illustrating a telesurgical system in which a three-dimensional remote imaging modality provides real-time image guidance;
[0013] FIG. 1A shows a three-dimensional view of an operator station of a telesurgical system in accordance with the invention;
[0014] FIG. 2 shows a three-dimensional view of a patient-side cart or surgical station of the telesurgical system, the cart carrying three robotically controlled arms, the movement of the arms being remotely controllable from the operator station shown in FIG. 1A;
[0015] FIG. 3 shows a side view of a robotic arm and surgical instrument assembly; [0016] FIG. 4 shows a three-dimensional view of a surgical instrument;
[0017] FIG. 5 shows, at an enlarged scale, a wrist member and end effector of the surgical instrument shown in FIG. 4, the wrist member and end effector being movably mounted on a working end of a shaft of the surgical instrument;
[0018] FIG. 6 shows a three-dimensional view of the master control device showing a wrist gimbal mounted on an articulated arm portion;
[0019] FIG. 7 shows a schematic three-dimensional drawing indicating the positions of the end effectors relative to a viewing end of an endoscope and the corresponding positions of master control devices relative to the eyes of an operator, typically a surgeon;
[0020] FIG. 8 shows a schematic three-dimensional drawing indicating the position and orientation of an end effector relative to an imaging Cartesian coordinate reference system;
[0021] FIG. 9 shows a schematic three-dimensional drawing indicating the position and orientation of a pincher formation of the master control device relative to an eye Cartesian coordinate reference system; [0022] FIG. 10 shows a schematic side view of part of the surgical station of the minimally invasive surgical apparatus indicating the location of Cartesian reference coordinate systems used to determine the position and orientation of an end effector relative to an image capturing device; [0023] FIG. 11 shows a schematic side view of part of the operator station of the minimally invasive surgical apparatus indicating the location of Cartesian reference coordinate systems used by the control systems of the minimally invasive surgical apparatus to determine the position and orientation of the input device relative to an eye of the system operator; [0024] FIG. 12 schematically illustrates a high level control architecture model of a master/slave surgical robotic system;
[0025] FIG. 13 shows a block diagram representing control steps followed by the control system of the minimally invasive surgical apparatus in effecting control between input device positional and orientational movement and end effector positional and orientational movement; and
[0026] FIG. 14 shows a fragmentary portion of the insertion portion of an endoscope for use with the present invention.
[0027] FIG. 15 is a flow chart schematically illustrating a method for providing realtime image guidance during a telesurgical procedure.
DETAILED DESCRIPTION OF THE INVENTION
[0028] The present invention generally provides improved telesurgical devices, systems, and methods. Embodiments of the invention may be particularly well suited for effective, precise, and less-traumatic ways of destroying, removing, and/or treating tumors and vascular abnormalities through robotic neurosurgery. Alternative embodiments may provide robotic endovascular and other catheter-based telesurgeries, orthopedic telesurgery, and the like. Advantageously, the devices, systems, and methods described herein may optionally employ significant portions of commercially available robotic surgical systems, including the DaVinci™ surgical system available from Intuitive Surgical, Inc. of Sunnyvale, California. By combining the robotic input device, processor, and/or portions of the robotic manipulator of this advantageous surgical system with commercially available remote imaging technologies, catheter-based treatment devices, orthopedic probes, and the like, the enhancements in precision and dexterity provided by robotic surgeries may be extended to image guided neurosurgery, interventional cardiology, endovascular surgery, orthopedic surgery, and a range of additional surgical interventions. Hence, embodiments of the invention may increase the quality of care for patients by reducing trauma, measurably improving clinical outcomes, reducing costs, and the like.
[0029] As used herein, objects (and/or images) appear "substantially connected" if a direction of an incremental positional movement of a first object matches the direction of an incremental positional movement of the second object (often as seen in an image).
Matching directions need not be exactly equal, as the objects (or the object and the image) may be perceived as being connected if the angular deviation between the movements remains less than about ten degrees, preferably being less than about five degrees. Similarly, objects and/or images may be perceived as being "substantially and orientationally connected" if they are substantially connected and if the direction of an incremental orientational movement of the first object is matched by the direction of an incremental orientational movement of the second object (often as seen in an image displayed near the first object).
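The directional-match criterion above can be expressed as a simple angular test between incremental movement vectors. The sketch below is illustrative only; the function name and vectors are assumptions, with the ten-degree threshold taken from the text:

```python
import math


def substantially_connected(move_a, move_b, max_deg=10.0):
    """Return True if two incremental movement vectors point in
    directions differing by less than max_deg degrees."""
    dot = sum(a * b for a, b in zip(move_a, move_b))
    norm_a = math.sqrt(sum(a * a for a in move_a))
    norm_b = math.sqrt(sum(b * b for b in move_b))
    if norm_a == 0.0 or norm_b == 0.0:
        return False                      # no direction to compare
    cos = max(-1.0, min(1.0, dot / (norm_a * norm_b)))
    return math.degrees(math.acos(cos)) < max_deg
```

A nearly parallel pair of movements (a few degrees apart) passes the test, while perpendicular movements do not.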
[0030] Additional levels of connectedness may, but need not, be provided. "Magnitude connection" indicates substantial connection and that the magnitude of orientational and/or positional movements of the first object and second object (typically as seen in an image) are directly related. The magnitudes need not be equal, so that it is possible to accommodate scaling and/or warping within a substantially magnitude connected robotic system. Orientational magnitude connection will imply substantial and orientational connection as well as related orientational movement magnitudes, while substantial and magnitude connection means substantial connection with positional magnitudes being related.
[0031] As used herein, a first object appears absolutely positionally connected with an image of a second object if the objects are substantially connected and the position of the first object and the position of the image of the second object appear to match, i.e., to be at the same location, during movement. A first object appears absolutely orientationally connected with an image of the second object if they are substantially connected and the orientation of the first object and the second object appear to match during movement.
[0032] A telesurgical system 1 allows a system operator O to perform a surgical treatment on an internal tissue of patient P. An image capture system 2 obtains three dimensional images of the internal tissue, and an image of the tissue is displayed to the system operator O on a display device 3. While observing the image in the display device, and by reference to that image, the system operator manipulates one or more handles of an input device 4.
[0033] In response to signals from the input device, a processor 5 calculates movements of a treatment probe 6. The processor transmits signals corresponding to the calculated movements to the robotic manipulator 7, and the manipulator in response effects movement of the probe. In the exemplary embodiments, movement of a treatment surface near a distal end of probe 6 (which may comprise a flexible catheter, an orthopedic probe, or the like) as seen in display 3 at least substantially corresponds to a movement of a handle of input device 4.
[0034] System 1 integrates telesurgical control over movement of treatment probe 6 with one or more remote image capture devices 2. Image capture device 2 may employ any of a variety of medical imaging modalities, including computerized tomography, magnetic resonance imaging, ultrasound, digital angiography, positron emission tomography, planar or bi-planar fluoroscopy and/or the like. Three dimensional positional and orientational data may be obtained from image capture device 2, typically using image processing in processor 5. Alternatively, positional and orientational information regarding treatment probe 6 may be provided by sensors of manipulator 7, with information from both sources optionally being combined to effect feedback and control over an operative procedure, despite any anatomical movement of targets or tissues during the therapy. Advantageously, this information can be used to allow robotic tissue manipulation, resection, and the like under real-time control of system operator O.
[0035] Real-time control over remote imaging guided robotic surgery is facilitated by integration of the manipulator 7 and its associated treatment probe 6 with the image capture device 2. In many embodiments, such integration will provide coordination between the image of the surgical field shown in display 3 and movement of the input device 4. This coordination will often rely at least in part on software and/or hardware of processor 5, in many cases allowing the system operator O to control movement of the treatment probe (often a catheter, orthopedic probe, or the like) with precision relative to the imaged target anatomy.
[0036] System 1 will typically provide the system operator O with effectively constant or sufficiently rapid sequential updated images so as to allow the system operator to robotically manipulate the treatment probe 6 by reference to the displayed image despite tissue movements, with the processor 5 indicating relative positions between the tissue and probe. System operator O may thus have the ability to make clinical judgments and to perform delicate surgical intervention with up-to-date anatomic and functional data. For example, system operator O may have sufficient imaging information to understand the borders of a tumor as the system operator is effecting its removal, to evaluate the risk of post-operative neural-functional deficit by precisely and accurately identifying the tissue and/or function of that tissue just prior to and/or during manipulation, retraction, or removal of the tissue. Such capabilities may be provided by system 1, and particularly by including a three dimensional image capture system capable of providing information regarding three dimensional shift of neural tissues intra-operatively.
[0037] Three dimensional shifting of neural tissues during an operation may be quite frequent, in part due to cerebrospinal fluid (CSF) leakage, the effects of gravity during tissue removal, and the like. Advantageously, such tissue shifting can be tracked and accounted for during the procedure using system 1.
[0038] In addition to tracking of tissue movements, the manipulator and the associated robotically actuated probe 6 may determine and track the position of at least a portion of the probe relative to the anatomy of patient P. Tracking may be effected using an image processing module of processor 5, maintaining a known relative position between image capture device 2 and manipulator 7, and/or the like. Intra-operatively acquiring and updating the relative or absolute position of the therapeutic probe 6 and anatomy of patient P may reduce surgical error, improving patient outcomes and avoiding duplicate or repetitive surgeries caused by incomplete or inaccurate therapies (for example, residual tumors may be less likely to be left in the patient due to lack of information on the intra-operative location of abnormal cells, improper shunt placement due to any lack of reliable operative guidance regarding the positioning of the catheter may be inhibited, or the like). System 1 and its use may also significantly advance development of restorative neurosurgery and/or brain improvement operations. Intra-operative functional imaging feedback may help determine the safety and efficacy of, for example, a cell transplant or device implant procedure. Delicate cell transplantation, gene implantation, and the like may be effected using system 1, ideally with assistance of such real-time intra-operative image guidance, giving system operator O the ability to establish, in the course of the procedure, the probable therapeutic outcome.
[0039] Regarding image capture devices 2, x-ray imaging can play a major role in the guidance of surgical procedures. Sophisticated imaging techniques can provide, through the use of appropriate modules of processor 5, high quality three-dimensional images depicting anatomy, pathology, vascularity, and function. Ongoing advancement of image guided surgery allows images derived from different image capture devices 2, different imaging modalities, and different views of patient P to be registered, optionally through image overlay of a variety of scans and registration to the patient via fiducial markers. Such processing and image capture components may be included in commercially available systems from a variety of suppliers.
[0040] By adding, to known image guided surgery capabilities, the ability to track therapeutic probe 6 in real-time during a procedure within patient P, system 1 may find applications in a variety of neurosurgical tasks, including tumor resection, shunt placement, vascular aneurism (surgical and catheter based) repair, neurofunctional restorative surgery, cell transplantation, and the like.
[0041] Along with x-ray based image capture devices 2, system 1 may employ any of a wide variety of imaging modalities. For example, the image capture devices may at least in part comprise magnetic resonance imaging structures. In such embodiments, at least the portion of treatment probe 6 used within an imaging field will often comprise a nonferromagnetic material, with optionally all of probe 6 and at least a portion of manipulator 7 comprising such a material. Suitable materials may comprise nonferromagnetic metals such as titanium and the like, and the probe and manipulator may be designed so as to be compatible with a magnetic resonance imaging field. Compatibility with such a field may be provided by removal and/or shielding of any ferromagnetic parts of, for example, manipulator 7 or couplings between the manipulator and other components of system 1 near the magnetic field. [0042] Where appropriate, movements of probe 6 via manipulator 7 may be interspersed with image capture, rather than continuously performing both robotic movements and imaging simultaneously. Depending on the image capture and manipulator cycle rates, such interspersing may be noticeable to the system operator or the interspersing may be transparent. When magnetic resonance imaging is used for image capture device 2, robotic movements of probe 6 may be sequenced so as to inhibit robotic movements during scanning intervals. By inhibiting motion of the actuators of manipulator 7 while a scan is taken, motion artifacts in an image may be reduced or eliminated. In some embodiments, one or more image capture intervals may be initiated by system operator O. Advantageously, the system may start and stop movements after and before imaging, respectively, without losing position and/or orientation awareness.
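The sequencing of robotic movement around image capture intervals described in [0042] may be sketched as a simple gating state machine that holds deferred commands until the scan ends. The interface and names below are hypothetical illustrations, not part of the disclosed system.

```python
class MotionSequencer:
    """Gate actuator commands so that robotic movement is inhibited while
    a scan is in progress, holding deferred commands until the scan ends.
    The interface is hypothetical."""

    def __init__(self):
        self.scanning = False
        self.pending = []  # movement commands deferred during a scan

    def begin_scan(self):
        self.scanning = True

    def end_scan(self):
        self.scanning = False
        flushed, self.pending = self.pending, []
        return flushed  # deferred commands released after the scan

    def command(self, move):
        if self.scanning:
            self.pending.append(move)  # hold position during imaging
            return None
        return move  # pass the command through to the manipulator

seq = MotionSequencer()
seq.begin_scan()
assert seq.command("advance 1 mm") is None  # motion inhibited during scan
print(seq.end_scan())  # ['advance 1 mm']
```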
For example, processor 5 may hold one or more treatment probes 6 at a position during imaging, or may remove treatment probe 6 from a position in an image field and return it to that position after imaging, or the like, particularly during intra-operative image updating (such as during a magnetic resonance image capture or the like). Movement of treatment probe 6 and articulated robotic structures may be filtered so as to reduce or remove tremor of the system operator's hand. Such tremor can be particularly significant when the imaging and/or manipulator systems are used with high magnification for precise therapies such as delivery of clips and other devices to occlude or ablate vascular aneurisms, using a surgical or intravascular approach, or the like.
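The tremor filtering mentioned above may be sketched as a first-order low-pass filter applied to sampled master input. The filter coefficient below is an illustrative assumption; an actual system would tune it to the servocycle rate and the tremor band.

```python
def low_pass(samples, alpha=0.1):
    """First-order low-pass filter: attenuates high-frequency hand tremor
    (typically roughly 8-12 Hz) in sampled master input while passing
    slow, deliberate motion. The coefficient alpha is an illustrative
    choice tied to the sampling rate."""
    filtered, y = [], samples[0]
    for x in samples:
        y = y + alpha * (x - y)  # exponential moving average update
        filtered.append(y)
    return filtered

# A steady input passes unchanged; a rapidly alternating (tremor-like)
# input is strongly attenuated after the initial transient.
steady = low_pass([5.0] * 10)
tremor = low_pass([(-1.0) ** i for i in range(50)])
```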
[0043] As schematically illustrated in Fig. 1, system operator O and components of robotic system 1 may be isolated from the imaging field using isolation 9. This may facilitate, for example, providing the surgeon with an ability to control and monitor movements of probe 6 while the system operator O remains safely isolated from the patient (optionally at a sufficient distance from the patient) so as to inhibit injury to the surgeon from prolonged exposure to potentially harmful radiation from x-rays, computed tomography, angiography, or the like. In many embodiments, image capture device 2 will provide a three-dimensional view of the target anatomy via video transmission. The three-dimensional view may be provided by a wide range of the image capture devices, including optical viewing devices, magnetic resonance systems, computed tomography systems, angiography systems, or the like.
[0044] Through isolation of system operator O from the imaging field of patient P, and through the use of appropriate robotic components of system 1, the operator may accurately and safely position and/or orient one or more treatment surfaces of one or more probes 6 relative to a target tissue or anatomy of patient P. The treatment surfaces may comprise stimulators, electrodes, implant receptacles, transplantable biological tissue receptacles, or the like. By integrating into processor 5 the robotic control elements of system 1 together with image processing and/or image capture device control elements so as to positionally relate the robotic and imaging functionalities, registration and/or coordination of movement of the treatment probe 6 with the images shown in display 3 may be provided. Exemplary techniques for coordinating image capture and input movement will be described below, with many other alternatives also being available. [0045] The basic arrangement of system 1 may find utility in a variety of surgical specialties, including orthopedic surgery and cardiology. Examples of potential uses in orthopedics include the real-time drilling of the femoral canal using robotics controlled by imaging in the radiology suite. System 1's combination of robotics and imaging may obviate any need for fiducial markers or the like, dramatically simplifying such drilling procedures. Other uses in orthopedics include operations on the lumbar and thoracic spine.
[0046] Examples of the potential use for system 1 in cardiology include positionally aware robotically controlled image guided catheter placement. In driving a catheter robotically, system operator O can control both the speed and direction of the catheter movement quite accurately, despite any disparity between the input movement and the output movement seen at the distal end of probe 6 by the image capture system. Such robotic control of the tip of the flexible catheter may dramatically reduce the training required for traversing intravascular anatomies, and may reduce the difficulty and/or increase the efficiency in traversing tortuous and diseased vessels. The use of system 1 for cardiology and other body lumen-based treatments may also allow the system operator O to be isolated from the surgical site, protecting the surgeon from exposure to potentially harmful ionizing radiation.
[0047] Registration of probe 6 with the anatomy and elimination of motion artifacts such as those produced by hand tremor may be particularly advantageous. System 1 may significantly enhance the use of remote imaging modalities for navigation to a desired surgical site through the use of real-time digital image displays and robotic operative control over probe movements. A combination of remote imaging and telesurgical probe manipulation may thus have a variety of advantages.
[0048] Referring to FIG. 1A of the drawings, an operator station or surgeon's console of a minimally invasive telesurgical system is generally indicated by reference numeral 200. The station 200 includes a viewer 202 where an image of a surgical site is displayed in use. A support 204 is provided on which an operator, typically a surgeon, can rest his or her forearms while gripping two master controls (not shown in FIG. 1A), one in each hand. The master controls are positioned in a space 206 inwardly beyond the support 204. When using the control station 200, the surgeon typically sits in a chair in front of the control station 200, positions his or her eyes in front of the viewer 202 and grips the master controls one in each hand while resting his or her forearms on the support 204.
[0049] In FIG. 2 of the drawings, a cart or surgical station of the telesurgical system is generally indicated by reference numeral 300. In use, the cart 300 is positioned close to a patient requiring surgery and is then normally caused to remain stationary until a surgical procedure to be performed has been completed. The cart 300 typically has wheels or castors to render it mobile. The station 200 is typically positioned remote from the cart 300 and can be separated from the cart 300 by a great distance, even miles away, but will typically be used within an operating room with the cart 300.
[0050] The cart 300 typically carries three robotic arm assemblies. One of the robotic arm assemblies, indicated by reference numeral 302, is arranged to hold an image capturing device 304, e.g., a remote image device, an endoscope, or the like. Each of the two other arm assemblies 10, 10 respectively, includes a surgical instrument 14. While described in portions of the following description with reference to endoscopic instruments and/or image capture devices, many embodiments will instead include intravascular and/or orthopedic instruments and remote imaging systems.
[0051] The endoscope 304 has a viewing end 306 at a remote end of an elongate shaft thereof. It will be appreciated that the endoscope 304 has an elongate shaft to permit its viewing end 306 to be inserted through an entry port into an internal surgical site of a patient's body. The endoscope 304 is operatively connected to the viewer 202 to display an image captured at its viewing end 306 on the viewer 202. Each robotic arm assembly 10, 10 is normally operatively connected to one of the master controls. Thus, the movement of the robotic arm assemblies 10, 10 is controlled by manipulation of the master controls. The instruments 14 of the robotic arm assemblies 10, 10 have end effectors which are mounted on wrist members which are pivotally mounted on distal ends of elongate shafts of the instruments 14, as is described in greater detail hereinbelow. It will be appreciated that the instruments 14 have elongate shafts to permit the end effectors to be inserted through entry ports into the internal surgical site of a patient's body.
Movement of the end effectors relative to the ends of the shafts of the instruments 14 is also controlled by the master controls.
[0052] The robotic arms 10, 10, 302 are mounted on a carriage 97 by means of setup joint arms 95. The carriage 97 can be adjusted selectively to vary its height relative to a base 99 of the cart 300, as indicated by arrows K. The setup joint arms 95 are arranged to enable the lateral positions and orientations of the arms 10, 10, 302 to be varied relative to a vertically extending column 93 of the cart 300. Accordingly, the positions, orientations and heights of the arms 10, 10, 302 can be adjusted to facilitate passing the elongate shafts of the instruments 14 and the endoscope 304 through the entry ports to desired positions relative to the surgical site. When the surgical instruments 14 and endoscope 304 are so positioned, the setup joint arms 95 and carriage 97 are typically locked in position.
[0053] In FIG. 3 of the drawings, one of the robotic arm assemblies 10 is shown in greater detail. Each assembly 10 includes an articulated robotic arm 12, and a surgical instrument, schematically and generally indicated by reference numeral 14, mounted thereon.
[0054] FIG. 4 indicates the general appearance of the surgical instrument 14 in greater detail. The surgical instrument 14 includes an elongate shaft 14.1. A wrist-like mechanism, generally indicated by reference numeral 50, is located at a working end of the shaft 14.1. A housing 53, arranged releasably to couple the instrument to the robotic arm 12, is located at an opposed end of the shaft 14.1. In FIG. 3, and when the instrument 14 is coupled or mounted on the robotic arm 12, the shaft 14.1 extends along an axis indicated at 14.2. The instrument 14 is typically releasably mounted on a carriage 11, which can be driven to translate along a linear guide formation 24 of the arm 12 in the direction of arrows P. [0055] The robotic arm 12 is typically mounted on a base or platform at an end of its associated setup joint arm 95 by means of a bracket or mounting plate 16. The robotic arm 12 includes a cradle, generally indicated at 18, an upper arm portion 20, a forearm portion 22 and the guide formation 24. The cradle 18 is pivotally mounted on the plate 16 in a gimbaled fashion to permit rocking movement of the cradle 18 about a pivot axis 28. The upper arm portion 20 includes link members 30, 32 and the forearm portion 22 includes link members 34, 36. The link members 30, 32 are pivotally mounted on the cradle 18 and are pivotally connected to the link members 34, 36. The link members 34, 36 are pivotally connected to the guide formation 24. The pivotal connections between the link members 30, 32, 34, 36, the cradle 18, and the guide formation 24 are arranged to constrain the robotic arm 12 to move in a specific manner, specifically such that a pivot center 49 is coincident with the port of entry, so that movement of the arm does not excessively affect the surrounding tissue at the port of entry.
[0056] Referring now to FIG. 5 of the drawings, the wrist-like mechanism 50 will now be described in greater detail. In FIG. 5, the working end of the shaft 14.1 is indicated at 14.3. The wrist-like mechanism 50 includes a wrist member 52. One end portion of the wrist member 52 is pivotally mounted in a clevis, generally indicated at 17, on the end 14.3 of the shaft 14.1 by means of a pivotal connection 54. The wrist member 52 can pivot in the direction of arrows 56 about the pivotal connection 54. An end effector, generally indicated by reference numeral 58, is pivotally mounted on an opposed end of the wrist member 52. The end effector 58 is in the form of, e.g., a clip applier for anchoring clips during a surgical procedure. Accordingly, the end effector 58 has two parts 58.1, 58.2 together defining a jaw-like arrangement.
[0057] It will be appreciated that the end effector can be in the form of any desired surgical tool, e.g., having two members or fingers which pivot relative to each other, such as scissors, pliers for use as needle drivers, or the like. Instead, it can include a single working member, e.g., a scalpel, cautery electrode, or the like. When a tool other than a clip applier is desired during the surgical procedure, the tool 14 is simply removed from its associated arm and replaced with an instrument bearing the desired end effector, e.g., a scissors, or pliers, or the like.
[0058] The end effector 58 is pivotally mounted in a clevis, generally indicated by reference numeral 19, on an opposed end of the wrist member 52, by means of a pivotal connection 60. It will be appreciated that free ends 11, 13 of the parts 58.1, 58.2 are angularly displaceable about the pivotal connection 60 toward and away from each other as indicated by arrows 62, 63. It will further be appreciated that the members 58.1, 58.2 can be displaced angularly about the pivotal connection 60 to change the orientation of the end effector 58 as a whole, relative to the wrist member 52. Thus, each part 58.1, 58.2 is angularly displaceable about the pivotal connection 60 independently of the other, so that the end effector 58, as a whole, is angularly displaceable about the pivotal connection 60 as indicated in dashed lines in FIG. 5. Furthermore, the shaft 14.1 is rotatably mounted on the housing 53 for rotation as indicated by the arrows 59. Thus, the end effector 58 has three degrees of freedom of movement relative to the arm 12, namely, rotation about the axis 14.2 as indicated by arrows 59, angular displacement as a whole about the pivot 60 and angular displacement about the pivot 54 as indicated by arrows 56. By moving the end effector within its three degrees of freedom of movement, its orientation relative to the end 14.3 of the shaft 14.1 can selectively be varied. It will be appreciated that movement of the end effector relative to the end 14.3 of the shaft 14.1 is controlled by appropriately positioned actuators, e.g., electrical motors, or the like, which respond to inputs from the associated master control to drive the end effector 58 to a desired orientation as dictated by movement of the master control. Furthermore, appropriately positioned sensors, e.g., encoders, or potentiometers, or the like, are provided to permit the control system of the minimally invasive telesurgical system to determine joint positions as described in greater detail hereinbelow.
[0059] One of the master controls 700, 700 is indicated in FIG. 6 of the drawings. A hand held part or wrist gimbal of the master control device 700 is generally indicated by reference numeral 699. Part 699 has an articulated arm portion including a plurality of members or links connected together by pivotal connections or joints. The surgeon grips the part 699 by positioning his or her thumb and index finger over a pincher formation. When the pincher formation is squeezed between the thumb and index finger, the fingers or end effector elements of the end effector 58 close. When the thumb and index finger are moved apart the fingers of the end effector 58 move apart in sympathy with the moving apart of the pincher formation. The joints of the part 699 are operatively connected to actuators, e.g., electric motors, or the like, to provide for, e.g., force feedback, gravity compensation, and/or the like, as described in greater detail hereinbelow. Furthermore, appropriately positioned sensors, e.g., encoders, or potentiometers, or the like, are positioned on each joint of the part 699, so as to enable joint positions of the part 699 to be determined by the control system. [0060] The part 699 is typically mounted on an articulated arm 712. The articulated arm 712 includes a plurality of links 714 connected together at pivotal connections or joints 714. It will be appreciated that also the articulated arm 712 has appropriately positioned actuators, e.g., electric motors, or the like, to provide for, e.g., force feedback, gravity compensation, and/or the like. Furthermore, appropriately positioned sensors, e.g., encoders, or potentiometers, or the like, are positioned on the joints so as to enable joint positions of the articulated arm 712 to be determined by the control system.
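The coupling between the pincher formation and the closure of the end effector described in [0059] may be sketched as a clamped linear mapping. The normalization and the jaw limit below are illustrative assumptions only.

```python
def jaw_command(pincher_open, max_jaw_deg=60.0):
    """Map a normalized pincher opening (0.0 squeezed closed to 1.0 fully
    apart) to a commanded opening angle, in degrees, between end effector
    parts 58.1 and 58.2. The linear map and the limit are illustrative."""
    x = min(max(pincher_open, 0.0), 1.0)  # clamp out-of-range input
    return x * max_jaw_deg

# Opening the pincher halfway commands a half-open jaw.
print(jaw_command(0.5))  # 30.0
```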
[0061] To change the orientation of the end effector 58 and/or move its position along a translational path, the surgeon simply moves the pincher formation to cause the end effector 58 to move to where he wants the end effector 58 to be in the image viewed in the viewer 202. Thus, the end effector position and/or orientation is caused to follow that of the pincher formation. The master control devices 700, 700 are typically mounted on the station 200 through pivotal connections.
[0062] The electric motors and sensors associated with the robotic arms 12 and the surgical instruments 14 mounted thereon, and the electric motors and sensors associated with the master control devices 700 are operatively linked in the control system. The control system typically includes at least one processor, typically a plurality of processors, for effecting control between master control device input and responsive robotic arm and surgical instrument output and for effecting control between robotic arm and surgical instrument input and responsive master control output in the case of, e.g., force feedback.
[0063] In use, and as schematically indicated in FIG. 7 of the drawings, the surgeon views the surgical site through the viewer 202. The end effector 58 carried on each arm 12 is caused to perform positional and orientational movements in response to movement and action inputs on its associated master controls. The master controls are indicated schematically at 700, 700. It will be appreciated that during a surgical procedure images of the end effectors 58 are captured by the endoscope 304 together with the surgical site and are displayed on the viewer 202 so that the surgeon sees the responsive movements and actions of the end effectors 58 as he or she controls such movements and actions by means of the master control devices 700, 700. The control system is arranged to cause end effector orientational and positional movement as viewed in the image at the viewer 202 to be mapped onto orientational and positional movement of a pincher formation of the master control as will be described in greater detail hereinbelow. [0064] The operation of the control system of the minimally invasive surgical apparatus will now be described in greater detail. In the description which follows, the control system will be described with reference to a single master control 700 and its associated robotic arm 12 and surgical instrument 14. The master control 700 will be referred to simply as "master" and its associated robotic arm 12 and surgical instrument 14 will be referred to simply as "slave."
[0065] The method whereby control between master movement and corresponding slave movement is achieved by the control system of the minimally invasive surgical apparatus will now be described with reference to FIGS. 7 to 9 of the drawings in overview fashion. The method will then be described in greater detail with reference to FIGS. 10 to 21 of the drawings.
[0066] Control between master and slave movement is achieved by comparing master position and orientation in an eye Cartesian coordinate reference system with slave position and orientation in a camera Cartesian coordinate reference system. For ease of understanding and economy of words, the term "Cartesian coordinate reference system" will simply be referred to as "frame" in the rest of this specification. Accordingly, when the master is stationary, the slave position and orientation within the camera frame is compared with the master position and orientation in the eye frame, and should the position and/or orientation of the slave in the camera frame not correspond with the position and/or orientation of the master in the eye frame, the slave is caused to move to a position and/or orientation in the camera frame at which its position and/or orientation in the camera frame does correspond with the position and/or orientation of the master in the eye frame. In FIG. 8, the camera frame is generally indicated by reference numeral 610 and the eye frame is generally indicated by reference numeral 612 in FIG. 9. [0067] When the master is moved into a new position and/or orientation in the eye frame 612, the new master position and/or orientation does not correspond with the previously corresponding slave position and/or orientation in the camera frame 610. The control system then causes the slave to move into a new position and/or orientation in the camera frame 610 at which new position and/or orientation, its position and orientation in the camera frame 610 does correspond with the new position and/or orientation of the master in the eye frame 612. [0068] It will be appreciated that the control system includes at least one, and typically a plurality, of processors which compute new corresponding positions and orientations of the slave in response to master movement input commands on a continual basis determined by the processing cycle rate of the control system. 
A typical processing cycle rate of the control system under discussion is about 1000 Hz or more, often being about 1300 Hz. Thus, when the master is moved from one position to a next position, the corresponding slave movement is computed at about 1300 Hz. Naturally, the control system can have any appropriate processing cycle rate depending on the processor or processors used in the control system. All real-time servocycle processing is preferably conducted on a DSP (Digital Signal Processor) chip. DSPs are preferable because of their constant calculation predictability and reproducibility. A SHARC DSP from Analog Devices, Inc. of Massachusetts is an acceptable example of such a processor for performing the functions described herein.
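The per-cycle comparison of master and slave positions described in [0066] through [0068] may be sketched, for the positional component only, as a proportional correction computed each servocycle. The gain and names below are illustrative; orientation handling, kinematic limits, and velocity terms are omitted.

```python
def servo_step(master_in_eye, slave_in_camera, gain=1.0):
    """One control-cycle update (nominally repeated about 1300 times per
    second): compare the master position in the eye frame with the slave
    position in the camera frame and command the slave toward
    correspondence. A purely proportional positional term is a
    simplification of the control described in the text."""
    return tuple(s + gain * (m - s)
                 for m, s in zip(master_in_eye, slave_in_camera))

# With unity gain the slave is commanded directly to the corresponding
# position in a single cycle.
print(servo_step((1.0, 2.0, 3.0), (0.0, 0.0, 0.0)))  # (1.0, 2.0, 3.0)
```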
[0069] The camera frame 610 is positioned such that its origin 614 is positioned at the viewing end 306 of the endoscope 304. Conveniently, the z axis of the camera frame 610 extends axially along a viewing axis 616 of the endoscope 304. Although in FIG. 8, the viewing axis 616 is shown in coaxial alignment with a shaft axis of the endoscope 304, it is to be appreciated that the viewing axis 616 can be angled relative thereto. Thus, the endoscope can be in the form of an angled scope. Naturally, the x and y axes are positioned in a plane perpendicular to the z axis. The endoscope is typically angularly displaceable about its shaft axis. The x, y and z axes are fixed relative to the viewing axis of the endoscope 304 so as to displace angularly about the shaft axis in sympathy with angular displacement of the endoscope 304 about its shaft axis.
[0070] To enable the control system to determine slave position and orientation, a frame is defined on or attached to the end effector 58. This frame is referred to as an end effector frame or slave tip frame, in the rest of this specification, and is generally indicated by reference numeral 618. The end effector frame 618 has its origin at the pivotal connection 60. Conveniently, one of the axes, e.g., the z axis, of the frame 618 is defined to extend along an axis of symmetry, or the like, of the end effector 58. Naturally, the x and y axes then extend perpendicularly to the z axis. It will be appreciated that the orientation of the slave is then defined by the orientation of the frame 618 having its origin at the pivotal connection 60, relative to the camera frame 610. Similarly, the position of the slave is then defined by the position of the origin of the frame at 60 relative to the camera frame 610.
[0071] Referring now to FIG. 9 of the drawings, the eye frame 612 is chosen such that its origin corresponds with a position 201 where the surgeon's eyes are normally located when he or she is viewing the surgical site at the viewer 202. The z axis extends along a line of sight of the surgeon, indicated by axis 620, when viewing the surgical site through the viewer 202. Naturally, the x and y axes extend perpendicularly from the z axis at the origin 201. Conveniently, the y axis is chosen to extend generally vertically relative to the viewer 202 and the x axis is chosen to extend generally horizontally relative to the viewer 202.
[0072] To enable the control system to determine master position and orientation within the viewer frame 612, a point on the master is chosen which defines an origin of a master or master tip frame, indicated by reference numeral 622. This point is chosen at a point of intersection indicated by reference numeral 3A between axes of rotation 1 and 3 of the master. Conveniently, the z axis of the master frame 622 on the master extends along an axis of symmetry of the pincher formation 706 which extends coaxially along the rotational axis 1. The x and y axes then extend perpendicularly from the axis of symmetry 1 at the origin 3A. Accordingly, orientation of the master within the eye frame 612 is defined by the orientation of the master frame 622 relative to the eye frame 612. The position of the master in the eye frame 612 is defined by the position of the origin 3A relative to the eye frame 612.
[0073] How the position and orientation of the slave within the camera frame 610 is determined by the control system will now be described with reference to FIG. 10 of the drawings. FIG. 10 shows a schematic diagram of one of the robotic arm 12 and surgical instrument 14 assemblies mounted on the cart 300. When used for neurosurgery, cardiology, and/or orthopedic surgery, the linkages of the robotic arm and its associated instrument may be altered or tailored for positioning and moving a flexible catheter body, an orthopedic probe, or the like. However, before commencing with a description of FIG. 10, it is appropriate to describe certain previously mentioned aspects of the surgical station 300 which impact on the determination of the orientation and position of the slave relative to the camera frame 610. [0074] In use, when it is desired to perform a surgical procedure by means of the minimally invasive surgical apparatus, the surgical station 300 is moved into close proximity to a patient requiring the surgical procedure. The patient is normally supported on a surface such as an operating table, or the like. To make allowance for support surfaces of varying height, and to make allowance for different positions of the surgical station 300 relative to the surgical site at which the surgical procedure is to be performed, the surgical station 300 is provided with the ability to have varying initial setup configurations. Accordingly, the robotic arms 12, 12, and the endoscope arm 302 are mounted on the carriage 97 which is heightwise adjustable, as indicated by arrows K, relative to the base 99 of the cart 300, as can best be seen in FIGS. 2 and 10 of the drawings. Furthermore, the robotic arms 12, 12 and the endoscope arm 302 are mounted on the carriage 97 by means of the setup joint arms 95. Thus, the lateral position and orientation of the arms 12, 12, 302 can be selected by moving the setup joint arms 95. 
Thus, at the commencement of the surgical procedure, the cart 300 is moved into the position in close proximity to the patient, an appropriate height of the carriage 97 is selected by moving it to an appropriate height relative to the base 99 and the surgical instruments 14 are moved relative to the carriage 97 so as to introduce the shafts of the instruments 14 and the endoscope 304 through the ports of entry and into positions in which the end effectors 58 and the viewing end 306 of the endoscope 304 are appropriately positioned at the surgical site and the fulcrums are coincident with the ports of entry. Once the height and positions are selected, the carriage 97 is locked at its appropriate height and the setup joint arms 95 are locked in their positions and orientations. Normally, throughout the surgical procedure, the carriage 97 is maintained at the selected height and similarly the setup joint arms 95 are maintained in their selected positions. However, if desired, either the endoscope or one or both of the instruments can be introduced through other ports of entry during the surgical procedure.
[0075] Returning now to FIG. 10, the determination by the control system of the position and orientation of the slave within the camera frame 610 will now be described. It will be appreciated that this is achieved by means of one or more processors having a specific processing cycle rate. Thus, where appropriate, whenever position and orientation are referred to in this specification, it should be borne in mind that a corresponding velocity is also readily determined. The control system determines the position and orientation of the slave within the camera frame 610 by determining the position and orientation of the slave relative to a cart frame 624 and by determining the orientation and position of the endoscope 304 with reference to the same cart frame 624. The cart frame 624 has an origin indicated by reference numeral 626 in FIG. 10.
[0076] To determine the position and orientation of the slave relative to the cart frame 624, the position of a fulcrum frame 630 having its origin at the fulcrum 49 is determined within the cart frame 624 as indicated by the arrow 628 in dashed lines. It will be appreciated that the position of the fulcrum 49 normally remains at the same location, coincident with a port of entry into the surgical site, throughout the surgical procedure. The position of the end effector frame 618 on the slave, having its origin at the pivotal connection 60, is then determined relative to the fulcrum frame 630 and the orientation of the end effector frame 618 on the slave is also determined relative to the fulcrum frame 630. The position and orientation of the end effector frame 618 relative to the cart frame is then determined by means of routine calculation using trigonometric relationships.
[0077] It will be appreciated that the robotic arm 302 of the endoscope 304 is constrained to move in similar fashion to the robotic arm 10. Thus, the endoscope 304 when positioned with its viewing end 306 directed at the surgical site, also defines a fulcrum coincident with its associated port of entry into the surgical site. The endoscope arm 302 can be driven to cause the endoscope 304 to move into a different position during a surgical procedure, to enable the surgeon to view the surgical site from a different position in the course of performing the surgical procedure. It will be appreciated that movement of the viewing end 306 of the endoscope 304 is performed by varying the orientation of the endoscope 304 relative to its pivot center or fulcrum. The position and orientation of the camera frame 610 within the cart frame 624 is determined in similar fashion to the position and orientation of the slave within the cart frame 624. When the position and orientation of the camera frame 610 relative to the cart frame 624, and the position and orientation of the slave relative to the cart frame 624 have been determined in this manner, the position and the orientation of the slave relative to the camera frame 610 is readily determinable through routine calculation using trigonometric relationships.
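By way of illustration only (this sketch is not part of the patent disclosure, and the function names are assumed), the "routine calculation using trigonometric relationships" can be expressed with 4x4 homogeneous transforms: the slave pose in the camera frame 610 is the cart-to-camera transform inverted and composed with the cart-to-slave transform.

```python
# Illustrative sketch: poses as 4x4 homogeneous transforms stored as
# nested lists of floats; both poses are measured in the cart frame 624.

def matmul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert(t):
    """Invert a rigid transform: rotation transposed, translation negated."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0.0, 0.0, 0.0, 1.0]]

def slave_in_camera(t_cart_camera, t_cart_slave):
    """T_camera_slave = inverse(T_cart_camera) composed with T_cart_slave."""
    return matmul(invert(t_cart_camera), t_cart_slave)
```

For example, a camera frame offset one unit along x from the cart origin and a slave three units along x yields a slave position of two units along x in the camera frame.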
[0078] How the position and orientation of the master within the viewer frame 612 is determined by the control system will now be described with reference to FIG. 11 of the drawings. FIG. 11 shows a schematic diagram of one of the master controls 700 at the operator station 200. [0079] The operator station 200 optionally also includes setup joint arms, as indicated at 632, to enable the general location of the masters 700, 700 to be varied to suit the surgeon. Thus, the general position of the masters 700, 700 can be selectively varied to bring the masters 700, 700 into a general position at which they are comfortably positioned for the surgeon. When the masters 700, 700 are thus comfortably positioned, the setup joint arms 632 are locked in position and are normally maintained in that position throughout the surgical procedure.
[0080] To determine the position and orientation of the master 700, as indicated in FIG. 11, within the eye frame 612, the position and orientation of the eye frame 612 relative to a surgeon's station frame 634, and the position and orientation of the master 700 relative to the surgeon's frame 634 is determined. The surgeon's station frame 634 has its origin at a location which is normally stationary during the surgical procedure, and is indicated at 636.
[0081] To determine the position and orientation of the master 700 relative to the station frame 634, a position of a master setup frame 640 at an end of the setup joint arms 632 on which the master 700 is mounted, relative to the station frame 634, is determined, as indicated by the arrow 638 in dashed lines. The position and orientation of the master frame 622 on the master 700 having its origin at 3A is then determined relative to the master setup frame 640. In this manner, the position and orientation of the master frame 622 relative to the frame 634 can be determined by means of routine calculation using trigonometric relationships. The position and orientation of the eye frame 612 relative to the station frame 634 is determined in similar fashion. It will be appreciated that the position of the viewer 202 relative to the rest of the surgeon's console 200 can selectively be varied to suit the surgeon. The position and orientation of the master frame 622 relative to the eye frame 612 can then be determined from the position and orientation of the master frame 622 and the eye frame 612 relative to the surgeon station frame 634 by means of routine calculation using trigonometric relationships.
[0082] In the manner described above, the control system of the minimally invasive surgical apparatus determines the position and orientation of the end effector 58 by means of the end effector frame 618 in the camera frame 610, and, likewise, determines the position and orientation of the master by means of the master frame 622 relative to the eye frame 612. [0083] As mentioned, the surgeon grips the master by locating his or her thumb and index finger over the pincher formation 706. When the surgeon's thumb and index finger are located on the pincher formation, the point of intersection 3A is positioned inwardly of the thumb and index finger tips. The master frame having its origin at 3A is effectively mapped onto the end effector frame 618, having its origin at the pivotal connection 60 of the end effector 58 as viewed by the surgeon in the viewer 202. Thus, when performing the surgical procedure, and the surgeon manipulates the position and orientation of the pincher formation 706 to cause the position and orientation of the end effector 58 to follow, it appears to the surgeon that his or her thumb and index finger are mapped onto the fingers of the end effector 58 and that the pivotal connection 60 of the end effector 58 corresponds with a virtual pivot point of the surgeon's thumb and index finger inwardly from the tips of the thumb and index finger.
[0084] Accordingly, as the surgical procedure is being performed the position and orientation of the fingers of the end effector tracks orientation and position changes of the surgeon's thumb and index finger in a natural intuitive or superimposed fashion.
Furthermore, actuation of the end effector 58, namely causing the end effector fingers selectively to open and close, corresponds intuitively to the opening and closing of the surgeon's thumb and index finger. Thus, actuation of the end effector 58 as viewed in the viewer 202 is performed by the surgeon in a natural intuitive manner, since the pivot point 60 of the end effector 58 is appropriately mapped onto a virtual pivot point between the surgeon's thumb and index finger.
[0085] Referring again to FIG. 10 of the drawings, the cart frame is chosen at 624. It will be appreciated that determining the position of the fulcrum frame 630 relative to the cart frame 624 is achieved through appropriately positioned sensors, such as potentiometers, encoders, or the like. Conveniently, the fulcrum frame position 630 relative to the cart frame 624 is determined through two intermediate frames. One of the frames is a carriage guide frame 644 which has its origin at a convenient location on a guide along which the carriage 97 is guided. The other frame, an arm platform frame indicated at 646 is positioned at an end of the setup joint arm 95 on which the robotic arm 12 is mounted. Thus, when slave position and orientation is determined relative to the cart frame 624, the carriage guide frame 644 position relative to the cart frame 624 is determined, then the platform frame 646 position relative to the carriage guide frame 644, then the fulcrum frame 630 relative to the platform frame 646, and then the slave orientation and position relative to the fulcrum frame 630, thereby to determine the slave position and orientation relative to the cart frame 624. It will be appreciated that the slave position and orientation relative to the cart frame 624 is determined in this manner for each arm 10 and in similar fashion for the camera frame 610, through its arm 302, relative to the cart frame 624.
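The chain of intermediate frames described above amounts to an ordered product of frame-to-frame transforms; a minimal sketch follows (illustrative only, not from the patent — real transforms would carry measured rotations as well as translations):

```python
# Illustrative sketch: the slave pose in the cart frame 624 as the ordered
# product of the transforms cart -> carriage guide frame 644 -> platform
# frame 646 -> fulcrum frame 630 -> slave.

def matmul4(a, b):
    """Multiply two 4x4 matrices stored as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def compose(*transforms):
    """Left-to-right product of 4x4 homogeneous transforms."""
    result = [[float(i == j) for j in range(4)] for i in range(4)]
    for t in transforms:
        result = matmul4(result, t)
    return result

def translation(x, y, z):
    """A pure-translation transform (stand-in for measured joint values)."""
    return [[1.0, 0.0, 0.0, x], [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z], [0.0, 0.0, 0.0, 1.0]]
```

With pure translations, composing a carriage height of 2 units, a setup-arm offset of 1 unit along x, and a fulcrum offset of 0.5 units along y yields the summed translation (1, 0.5, 2) in the cart frame.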
[0086] Referring to FIG. 11, the position and orientation of the master control is determined by determining the position of a base frame 648 relative to the surgeon's station frame 634, then determining the position of the platform frame 640 relative to the base frame 648, and then determining master position and orientation relative to the platform frame 640. The position and orientation of the master frame 622 relative to the surgeon's station frame 634 is then readily determined through routine calculation using trigonometric relationships. It will be appreciated that the position and orientation of the other master frame relative to the surgeon console frame 634 is determined in a similar fashion. [0087] FIG. 12 schematically illustrates a high level control architecture for a master/slave robotic system 1000. Beginning at the operator input, a surgeon 1002 moves an input device of a master manipulator 1004 by applying manual or human forces fh against the input device. Encoders of master manipulator 1004 generate master encoder signals em which are interpreted by a master input/output processor 1006 to determine the master joint positions θm. The master joint positions are used to generate Cartesian positions of the input device of the master xm using a master kinematics model 1008.
[0088] Starting now with the input from the surgical environment 1018, the tissue structures in the surgical workspace will impose forces fe against a surgical end effector (and possibly against other elements of the tool and/or manipulator). Environmental forces fe from the surgical environment 1018 alter the position of the slave 1016, thereby altering slave encoder values es transmitted to the slave input/output processor 1014. Slave input/output processor 1014 interprets the slave encoder values to determine joint positions θs, which are then used to generate Cartesian slave position signals xs according to the slave kinematics processing block 1012. [0089] The master and slave Cartesian positions xm, xs are input into bilateral controller 1010, which uses these inputs to generate the desired Cartesian forces to be applied by the slave fs so that the surgeon can manipulate the slave as desired to perform a surgical procedure. Additionally, bilateral controller 1010 uses the Cartesian master and slave positions xm, xs to generate the desired Cartesian forces to be applied by the master fm so as to provide force feedback to the surgeon.
[0090] In general, bilateral controller 1010 will generate the slave and master forces fs, fm by mapping the Cartesian position of the master in the master controller workspace with the Cartesian position of the end effector in the surgical workspace according to a transformation. Preferably, the control system 1000 will derive the transformation in response to state variable signals provided from the imaging system so that an image of the end effector in a display appears substantially connected to the input device. These state variables will generally indicate the Cartesian position of the field of view from the image capture device, as supplied by the slave manipulators supporting the image capture device. Hence, coupling of the image capture manipulator and slave end effector manipulator is beneficial for deriving this transformation. Clearly, bilateral controller 1010 may be used to control more than one slave arm, and/or may be provided with additional inputs.
[0091] Based generally on the difference in position between the master and the slave in the mapped workspace, bilateral controller 1010 generates Cartesian slave force fs to urge the slave to follow the position of the master. The slave kinematics 1012 are used to interpret the Cartesian slave forces fs to generate joint torques of the slave τs which will result in the desired forces at the end effector. Slave input/output processor 1014 uses these joint torques to calculate slave motor currents is, which reposition the slave xe within the surgical worksite.
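A one-axis sketch of such a position-following law is shown below; the proportional/damping form and the gain values are assumptions for illustration only, since the patent does not specify the controller internals:

```python
def slave_force(x_master, x_slave, v_master, v_slave, k=500.0, b=5.0):
    """One-axis position-following law: push the slave toward the mapped
    master position. fm for force feedback would mirror this difference.
    The gains k (stiffness) and b (damping) are illustrative assumptions."""
    return k * (x_master - x_slave) + b * (v_master - v_slave)
```

For instance, a unit position error with no velocity error produces a force equal to the stiffness gain, urging the slave toward the master position.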
[0092] The desired feedback forces from the bilateral controller are similarly interpreted from the Cartesian force on the master fm based on the master kinematics 1008 to generate master joint torques τm. The master joint torques are interpreted by the master input/output controller 1006 to provide master motor current im to the master manipulator 1004, which changes the position of the hand held input device xh in the surgeon's hand.
[0093] It will be recognized that the control system 1000 illustrated in FIG. 12 is a simplification. For example, the surgeon does not only apply forces against the master input device, but also moves the handle within the master workspace. Similarly, the motor current supplied to the motors of the master manipulator may not result in movement if the surgeon maintains the position of the master controller. Nonetheless, the motor currents do result in tactile force feedback to the surgeon based on the forces applied to the slave by the surgical environment. Additionally, while Cartesian coordinate mapping is preferred, the use of spherical, cylindrical, or other reference frames may provide at least some of the advantages of the invention. [0094] The control system, generally indicated by reference numeral 810, will now be described in greater detail with reference to FIG. 13 of the drawings, in which like reference numerals are used to designate similar parts or aspects, unless otherwise stated.
[0095] As mentioned earlier, the master control 700 has sensors, e.g., encoders, or potentiometers, or the like, associated therewith to enable the control system 810 to determine the position of the master control 700 in joint space as it is moved from one position to a next position on a continual basis during the course of performing a surgical procedure. In FIG. 13, signals from these positional sensors are indicated by arrow 814. Positional readings measured by the sensors at 687 are read by the processor. It will be appreciated that since the master control 700 includes a plurality of joints connecting one arm member thereof to the next, sufficient positional sensors are provided on the master 700 to enable the angular position of each arm member relative to the arm member to which it is joined to be determined, thereby to enable the position and orientation of the master frame 622 on the master to be determined. As the angular positions of one arm member relative to the arm member to which it is joined are read cyclically by the processor 689 in response to movements induced on the master control 700 by the surgeon, the angular positions are continuously changing. The processor at 689 reads these angular positions and computes the rate at which these angular positions are changing. Thus, the processor 689 reads angular positions and computes the rate of angular change, or joint velocity, on a continual basis corresponding to the system processing cycle time, i.e., 1300 Hz. Joint position and joint velocity commands thus computed at 689 are then input to the Forward Kinematics (FKIN) controller at 691, as already described hereinabove.
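The cyclic velocity computation described above amounts to a finite difference taken at the processing rate; a minimal sketch follows (the patent does not give an implementation, and a real controller would typically filter this estimate):

```python
CYCLE_HZ = 1300.0  # processing cycle rate stated in the text

def joint_velocities(theta_now, theta_prev, rate_hz=CYCLE_HZ):
    """Finite-difference joint velocity between two successive cyclic
    sensor reads: (theta_now - theta_prev) / dt, with dt = 1 / rate_hz."""
    return [(a - b) * rate_hz for a, b in zip(theta_now, theta_prev)]
```

A joint that moves one radian between consecutive reads would thus report a velocity of 1300 rad/s; in practice per-cycle changes are tiny and the products are small.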
[0096] At the FKIN controller 691, the positions and velocities in joint space are transformed into corresponding positions and velocities in Cartesian space, relative to the eye frame 612. The FKIN controller 691 is a processor which typically employs a Jacobian (J) matrix to accomplish this. It will be appreciated that the Jacobian matrix transforms angular positions and velocities into corresponding positions and velocities in Cartesian space by means of conventional trigonometric relationships. Thus, corresponding positions and velocities in Cartesian space, or Cartesian velocity and position commands, are computed by the FKIN controller 691 which correspond to Cartesian position and velocity changes of the master frame 622 in the eye frame 612.
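The Jacobian step can be sketched as an ordinary matrix-vector product, ẋ = J·q̇; the sketch below is illustrative only, since the actual FKIN controller 691 also handles orientation and the frame offsets described earlier:

```python
def cartesian_velocity(jacobian, joint_velocity):
    """x_dot = J * q_dot: map joint-space velocities to Cartesian
    velocities, with J supplied as an m x n nested list."""
    return [sum(row[j] * joint_velocity[j] for j in range(len(joint_velocity)))
            for row in jacobian]
```

With a diagonal Jacobian [[1, 0], [0, 2]] and joint velocities [3, 4], the Cartesian velocity is [3, 8].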
[0097] The velocity and the position in Cartesian space is input into a Cartesian controller, indicated at 820, and into a scale and offset converter, indicated at 822.
[0098] The minimally invasive surgical apparatus provides for a scale change between master control input movement and responsive slave output movement. Thus, a scale can be selected where, for example, a 1-inch movement of the master control 700 is transformed into a corresponding responsive 1/5-inch movement on the slave. At the scale and offset step 822, the Cartesian position and velocity values are scaled in accordance with the scale selected to perform the surgical procedure. Naturally, if a scale of 1:1 has been selected, no change in scale is effected at 822. Similarly, offsets are taken into account which determine the corresponding position and/or orientation of the end effector frame 618 in the camera frame 610 relative to the position and orientation of the master frame 622 in the eye frame 612.
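A minimal sketch of the scale and offset step 822 follows; the scale value 0.2 reproduces the 1-inch to 1/5-inch example from the text, while the offset values are hypothetical placeholders for the frame alignment:

```python
def scale_and_offset(master_position, scale=0.2, offset=(0.0, 0.0, 0.0)):
    """Map a master Cartesian position to a desired slave position.
    scale=0.2 corresponds to the 1-inch -> 1/5-inch example; a scale of
    1.0 leaves the motion unchanged. Offsets align the end effector
    frame 618 in the camera frame with the master frame 622 in the eye
    frame (placeholder values here). Velocities would be scaled without
    the offset term."""
    return [scale * p + o for p, o in zip(master_position, offset)]
```

So a 1-unit master movement along x becomes a 0.2-unit desired slave movement along x.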
[0099] After a scale and offset step is performed at 822, a resultant desired slave position and desired slave velocity in Cartesian space is input to a simulated or virtual domain at 812, as indicated by arrows 811. It will be appreciated that the labeling of the block 812 as a simulated or virtual domain is for identification only. Accordingly, the simulated control described hereinbelow is performed by elements outside the block 812 also.
[0100] The simulated domain 812 will be described in greater detail hereinbelow. However, the steps imposed on the desired slave velocity and position in the virtual domain 812 will now be described broadly for ease of understanding of the description which follows. A current slave position and velocity is continually monitored in the virtual or simulated domain 812. The desired slave position and velocity is compared with the current slave position and velocity. Should the desired slave position and/or velocity as input from 822 not cause transgression of limitations, e.g., velocity and/or position and/or singularity, and/or the like, as set in the virtual domain 812, a similar Cartesian slave velocity and position is output from the virtual domain 812 and input into an inverse scale and offset converter as indicated at 826. The similar velocity and position output in Cartesian space from the virtual domain 812 is indicated by arrows 813 and corresponds with actual commands in joint space output from the virtual domain 812 as indicated by arrows 815 as will be described in greater detail hereinbelow. From the inverse scale and offset converter 826, which performs the scale and offset step of 822 in reverse, the reverted Cartesian position and velocity is input into the Cartesian controller at 820. At the Cartesian controller 820, the original Cartesian position and velocities as output from the FKIN controller 691 are compared with the Cartesian position and velocity input from the simulated domain 812. If no limitations were transgressed in the simulated domain 812, the velocity and position values input from the FKIN controller 691 would be the same as the velocity and position values input from the simulated domain 812. In such a case, a zero error signal is generated by the Cartesian controller 820.
[0101] In the event that the desired Cartesian slave position and velocity input at 811 would transgress one or more set limitations, the desired values are restricted to stay within the bounds of the limitations. Consequently, the Cartesian velocity and position forwarded from the simulated domain 812 to the Cartesian controller 820 would then not be the same as the values from the FKIN controller 691. In such a case, when the values are compared by the Cartesian controller 820, an error signal is generated.
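The limiting behavior described in these paragraphs can be sketched as a per-component clamp (illustrative only; the simulated domain also enforces singularity limits, which a simple interval clamp does not capture):

```python
def clamp_desired(value, lower, upper):
    """Restrict one desired slave position or velocity component to the
    limits set in the simulated domain 812. The Cartesian controller
    later compares the clamped output with the original input; any
    difference becomes the error signal."""
    return max(lower, min(upper, value))
```

A desired value inside the limits passes through unchanged (zero error downstream); a value beyond a limit is held at the bound, so the downstream comparison yields a nonzero error.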
[0102] Assuming that a zero error is generated at the Cartesian controller 820, no signal is passed from the Cartesian controller or converter 820. In the case that an error signal is generated, the signal is passed through a summation junction 827 to a master transpose kinematics controller 828.
[0103] The error signal is typically used to calculate a Cartesian force. The Cartesian force is typically calculated, by way of example, in accordance with the following formula:
F = K(Δx) + B(Δẋ)
[0104] where K is a spring constant, B is a damping constant, Δẋ is the difference between the Cartesian velocity inputs to the Cartesian controller 820 and Δx is the difference between the Cartesian position inputs to the Cartesian controller 820. It will be appreciated that for an orientational error, a corresponding torque in Cartesian space is determined in accordance with conventional methods. [0105] The Cartesian force corresponds to an amount by which the desired slave position and/or velocity extends beyond the limitations imposed in the simulated domain 812. The Cartesian force, which could result from a velocity limitation, a positional limitation, and/or a singularity limitation, as described in greater detail below, is then converted into a corresponding torque signal by means of the master transpose kinematics controller 828 which typically includes a processor employing a Jacobian Transpose (JT) matrix and kinematic relationships to convert the Cartesian force to a corresponding torque in joint space. The torque thus determined is then input to a processor at 830 whereby appropriate electrical currents to the motors associated with the master 700 are computed and supplied to the motors. These torques are then applied on the motors operatively associated with the master control 700. The effect of this is that the surgeon experiences a resistance on the master control to either move it at the rate at which he or she is urging the master control to move, or to move it into the position into which he or she is urging the master control to move. The resistance to movement on the master control is due to the torque on the motors operatively associated therewith.
Accordingly, the higher the force applied on the master control to urge the master control to move to a position beyond the imposed limitation, the higher the magnitude of the error signal and the higher an opposing torque on the motors resisting displacement of the master control in the direction of that force. Similarly, the higher the velocity imposed on the master beyond the velocity limitation, the higher the error signal and the higher the opposing torque on the motors associated with the master. [0106] Referring once again to FIG. 13 of the drawings, the slave joint velocity and position signal is passed from the simulated domain 812 to a joint controller 848. At the joint controller 848, the resultant joint velocity and position signal is compared with the current joint position and velocity. The current joint position and velocity is derived through the sensors on the slave as indicated at 849 after having been processed at an input processor 851 to yield slave current position and velocity in joint space.
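The error-to-force law of paragraphs [0103] through [0105] and the Jacobian-transpose mapping can be sketched as follows; the gains, dimensions, and a numeric Jacobian are assumptions for illustration:

```python
def cartesian_error_force(dx, dv, k=200.0, b=2.0):
    """F = K(dx) + B(dv), applied per Cartesian axis: dx is the position
    error, dv the velocity error between the FKIN output and the
    simulated-domain output. Gains k and b are illustrative."""
    return [k * x + b * v for x, v in zip(dx, dv)]

def master_torques(jacobian, force):
    """tau = J^T * F: the master transpose kinematics step mapping the
    Cartesian force to joint torques for the master motors."""
    n_joints = len(jacobian[0])
    return [sum(jacobian[i][j] * force[i] for i in range(len(force)))
            for j in range(n_joints)]
```

With an identity Jacobian the joint torques equal the Cartesian force components, and a larger position or velocity error directly yields a larger opposing torque, matching the resistance behavior described above.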
[0107] The joint controller 848 computes the torques desired on the slave motors to cause the slave to follow the resultant joint position and velocity signal taking its current joint position and velocity into account. The joint torques so determined are then routed to a feedback processor at 852 and to an output processor at 854. [0108] The joint torques are typically computed, by way of example, by means of the following formula:
τ = K(Δθ) + B(Δθ̇) [0109] where K is a spring constant, B is a damping constant, Δθ̇ is the difference between the joint velocity inputs to the joint controller 848, and Δθ is the difference between the joint position inputs to the joint controller 848.
[0110] The output processor 854 determines the electrical currents to be supplied to the motors associated with the slave to yield the commanded torques and causes the currents to be supplied to the motors as indicated by arrow 855.
[0111] From the feedback processor 852 force feedback is supplied to the master. As mentioned earlier, force feedback is provided on the master 700 whenever a limitation is induced in the simulated domain 812. Through the feedback processor 852 force feedback is provided directly from the slave 798, in other words, not through a virtual or simulated domain but through direct slave movement. This will be described in greater detail hereinbelow.
[0112] As mentioned earlier, the slave indicated at 798 is provided with a plurality of sensors. These sensors are typically operatively connected to pivotal joints on the robotic arm 10 and on the instrument 14.
[0113] These sensors are operatively linked to the processor at 851. It will be appreciated that these sensors determine current slave position. Should the slave 798 be subjected to an external force great enough to induce reactive movement on the slave 798, the sensors will naturally detect such movement. Such an external force could originate from a variety of sources such as when the robotic arm 10 is accidentally knocked, or knocks into the other robotic arm 10 or the endoscope arm 302, or the like. As mentioned, the joint controller 848 computes torques desired to cause the slave 798 to follow the master 700. An external force on the slave 798 which causes its current position to vary also causes the desired slave movement to follow the master to vary. Thus a compounded joint torque is generated by the joint controller 848, which torque includes the torque desired to move the slave to follow the master and the torque desired to compensate for the reactive motion induced on the slave by the external force. The torque generated by the joint controller 848 is routed to the feedback processor at 852, as already mentioned. The feedback processor 852 analyzes the torque signal from the joint controller 848 and accentuates that part of the torque signal resulting from the extraneous force on the slave 798. The part of the torque signal accentuated can be chosen depending on requirements. In this case, only the part of the torque signal relating to the robotic arm 12, 12, 302 joints is accentuated. The torque signal, after having been processed in this way, is routed to a kinematic mapping block 860 from which a corresponding Cartesian force is determined. At the kinematic block 860, the information determining slave fulcrum position relative to the camera frame is input from 647 as indicated. Thus, the Cartesian force is readily determined relative to the camera frame.
This Cartesian force is then passed through a gain step at 862 appropriately to vary the magnitude of the Cartesian force. The resultant force in Cartesian space is then passed to the summation junction at 827 and is then communicated to the master control 700 as described earlier.
[0114] Reference numeral 866 generally indicates another direct force feedback path of the control system 810, whereby direct force feedback is supplied to the master control 700. The path 866 includes one or more sensors which are not necessarily operatively connected to slave joints. These sensors can typically be in the form of force or pressure sensors appropriately positioned on the surgical instrument 14, typically on the end effector 58. Thus, should the end effector 58 contact an extraneous body, such as body tissue at the surgical site, it generates a corresponding signal proportionate to the force of contact. This signal is processed by a processor at 868 to yield a corresponding torque. This torque is passed to a kinematic mapping block 864, together with information from 647 to yield a corresponding Cartesian force relative to the camera frame. From 864, the resultant force is passed through a gain block at 870 and then forwarded to the summation junction 827. Feedback is imparted on the master control 700 by means of torque supplied to the motors operatively associated with the master control 700 as described earlier. It will be appreciated that this can be achieved by means of any appropriate sensors such as current sensors, pressure sensors, accelerometers, proximity detecting sensors, or the like. In some embodiments, resultant forces from kinematic mapping 864 may be transmitted to an alternative presentation block 864.1 so as to indicate the applied forces in an alternative format to the surgeon.
[0115] Reference now is made to FIG. 14 wherein a distal end portion, or tip, 260 of the insertion section of a flexible instrument or endoscope is shown. The insertion end of the instrument includes a pair of spaced viewing windows 262R and 262L and an illumination source 264 for viewing and illuminating a workspace to be observed. Light received at the windows is focused by objective lens means, not shown, and transmitted through fiberoptic bundles to a pair of cameras at the operating end of the instrument, not shown. The camera outputs are converted to a three-dimensional image of the workspace which image is located adjacent hand-operated means at the operator's station, not shown. Right and left steerable catheters 268R and 268L pass through accessory channels in the endoscope body, which catheters are adapted for extension from the distal end portion, as illustrated. End effectors 270R and 270L are provided at the ends of the catheters which may comprise conventional endoscopic instruments. Force sensors, not shown, also are inserted through the endoscope channels. Steerable catheters which include control wires for controlling bending of the catheters and operation of an end effector suitable for use with this invention are well known. Control motors for operation of the control wires are provided at the operating end of the endoscope, which motors are included in a servomechanism of a type described above for operation of the steerable catheters and associated end effectors from a remote operator's station. As with the other embodiments, the interfacing computer in the servomechanism system remaps the operator's hand motion into the coordinate system of the end effectors, and images of the end effectors are viewable adjacent the hand-operated controllers in a manner described above. Flexible catheter-based instruments and probes of different types may be employed in this embodiment of the invention.
[0116] Referring now to FIG. 15, a method for using system 1 (See FIG. 1) can be initiated by capturing an image 504 of the target anatomy and/or adjacent portions of treatment probe 6. An image is displayed 506 to the system operator, with the image preferably comprising a three dimensional image of the surgical site. The system operator inputs a command to move the treatment probe with reference to the displayed image 508.
[0117] In response to the input command, the processor of the robotic system calculates a substantially corresponding movement 510. The processor, manipulator, and probe effect a robotic movement of a treatment surface per the calculated movement 512. The system operator may actuate the treatment surface or the like so as to treat tissue with the treatment surface 514. The telesurgical system then again captures a new image 504, and the cycle repeats. Method 502 illustrated in FIG. 15 is a simplified schematic illustration, as treatment of a target tissue need not occur each time an image is captured. The image capture and movement cycle times may differ.

[0118] While the actual manipulator structures, image capture devices, and the like may differ significantly from the exemplary structures described herein, coordination of an image and robotic command/movement will generally provide benefits for a wide variety of differing embodiments. For example, rather than electromechanically correlating input and output of linkages that rely on rigid link structures coupled together at discrete joints, position and/or orientation information from a distal portion of a flexible body may be provided at least in part by image processing techniques applied to one or more high contrast reference markers of the treatment probe. Alternatively, the treatment probe may include structures which facilitate identifying a position of the distal end within the body, and/or which measure flexing of the catheter body along its length so that the position of the distal end may be calculated by processor 5.
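The cycle of steps 504 through 514 can be expressed as a simple control loop. The collaborator objects and method names below are hypothetical, chosen only to mirror the numbered steps of FIG. 15:

```python
def image_guided_cycle(imager, display, operator, processor, manipulator):
    """One pass through the method of FIG. 15 (steps 504-514).
    Treatment is optional on any given pass, as the text notes that
    treatment need not occur each time an image is captured."""
    image = imager.capture()                # 504: acquire image of target anatomy
    display.show(image)                     # 506: display 3-D image to operator
    command = operator.input_move(image)    # 508: command with reference to image
    motion = processor.calculate(command)   # 510: corresponding movement
    manipulator.move(motion)                # 512: robotic movement of treatment surface
    if operator.wants_treatment():
        manipulator.treat()                 # 514: actuate the treatment surface
```

A supervising routine would call `image_guided_cycle` repeatedly; as the text observes, the image capture rate and the movement command rate need not be locked together, so in practice the capture and movement steps might run on independent timers rather than in strict sequence.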
[0119] While the exemplary embodiments have been described in some detail, for clarity of understanding and by way of example, a number of adaptations, changes, and modifications will be obvious to those skilled in the art. Hence, the scope of the present invention is limited solely by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A surgical system comprising: an image capture device for acquiring an image of an internal surgical site through an intermediate tissue; a display coupled to the image capture device; an input device near the display; a catheter having a proximal end, a distal end, a flexible catheter body therebetween, and a therapy delivery surface disposed near the distal end; a manipulator coupleable to the proximal end of the catheter so as to effect movement of the distal end; and a processor coupling the input device to the manipulator, the processor configured to effect movement of the distal end of the catheter in response to movement of the input device.
2. The system of claim 1, wherein the image capture device is configured to capture images including the distal end of the catheter substantially contemporaneously with movement of the distal end of the catheter so as to allow real time image guided catheter therapy.
3. The system of claim 2, wherein the image capture device comprises at least one of: a magnetic resonance (MR) imaging device; computerized tomography; ultrasound; digital angiography; or positron emission tomography.
4. The system of claim 3, wherein the image capture device comprises an MR device, and wherein the catheter and at least a portion of the manipulator comprise non-ferromagnetic materials.
5. The system of claim 1, wherein the input device and display are isolated from at least potentially harmful emissions of the image capture device.
6. The system of claim 1, wherein the processor comprises a hand tremor filter.
7. The system of claim 1, wherein the catheter comprises a neurosurgical catheter.
8. The system of claim 1, wherein the catheter comprises a cardiology catheter.
9. The system of claim 1, wherein the therapy delivery surface comprises one or more of: a tissue manipulator; a retractor; a cutting blade; a transplant cell, biological tissue, or gene receptacle; a stimulator; an electrode; a vascular clip applier; a vascular occluder; an ablation surface; or an implantable device receptacle.
10. The system of claim 1, wherein the image capture device is coupled to the processor, and wherein the processor is configured so that a motion of the image seen in the display is less than a motion of the tissue.
11. The system of claim 10, wherein the processor is configured to sequence scanning of the image capture device with the motion of the tissue.
12. An orthopedic surgical system comprising: an image capture device for acquiring an image of an orthopedic surgical site through an intermediate tissue; a display coupled to the image capture device; an input device near the display; a probe having a proximal end, a distal end, and an orthopedic therapy delivery surface disposed near the distal end; a manipulator coupleable to the proximal end of the probe so as to effect movement of the distal end; and a processor coupling the input device to the manipulator, the processor configured to effect movement of the distal end of the probe in response to movement of the input device so as to allow real time image-guided robotic orthopedic therapy.
13. The system of claim 12, wherein the orthopedic therapy delivery surface comprises a bone drill.
14. A surgical method comprising: acquiring an image of an internal surgical site through an intermediate tissue; displaying the image; inputting a movement command with reference to the displayed image so as to align a treatment delivery surface of a flexible catheter with a target tissue; calculating a movement of the catheter per the input movement; robotically moving the distal end of the catheter per the calculated movement; and treating the target tissue with the treatment delivery surface.
15. An orthopedic method comprising: acquiring an image of an orthopedic surgical site through an intermediate tissue; displaying the image; inputting a movement command with reference to the displayed image so as to align a treatment delivery surface of an orthopedic probe with a target orthopedic tissue; calculating a movement of the probe per the input movement; robotically moving the distal end of the probe per the calculated movement; and treating the target orthopedic tissue with the treatment delivery surface.
PCT/US2006/024796 2005-06-30 2006-06-26 Robotic image guided catheter-based surgical devices and techniques WO2007005367A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US69536605P 2005-06-30 2005-06-30
US60/695,366 2005-06-30

Publications (2)

Publication Number Publication Date
WO2007005367A2 true WO2007005367A2 (en) 2007-01-11
WO2007005367A3 WO2007005367A3 (en) 2007-03-01

Family

ID=37499923

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/024796 WO2007005367A2 (en) 2005-06-30 2006-06-26 Robotic image guided catheter-based surgical devices and techniques

Country Status (1)

Country Link
WO (1) WO2007005367A2 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007045075A1 (en) * 2007-09-21 2009-04-09 Siemens Ag Interventional medical diagnosis and / or therapy system
EP2130479A1 (en) * 2007-03-29 2009-12-09 Olympus Medical Systems Corp. Device for controlling position of treatment instrument for endoscope
WO2010025943A1 (en) * 2008-09-08 2010-03-11 Kuka Roboter Gmbh Medical workstation and operating device for the manual movement of a robot arm
CN101870107A (en) * 2010-06-26 2010-10-27 上海交通大学 Control system of auxiliary robot of orthopedic surgery
WO2012034886A1 (en) * 2010-09-17 2012-03-22 Siemens Aktiengesellschaft Method for positioning a laparoscopic robot in a specifiable relative position with respect to a trocar
WO2014036034A1 (en) * 2012-08-27 2014-03-06 University Of Houston Robotic device and system software, hardware and methods of use for image-guided and robot-assisted surgery
EP2979605A4 (en) * 2013-03-29 2016-11-23 Tokyo Inst Tech Endoscopic operating system and endoscopic operating program
US20170135772A1 (en) * 2012-08-24 2017-05-18 University Of Houston System Robotic device for image-guided surgery and interventions
WO2021200881A1 (en) * 2020-03-31 2021-10-07 地方独立行政法人神奈川県立産業技術総合研究所 Medical device and medical program
CN114504388A (en) * 2022-03-15 2022-05-17 山东大学齐鲁医院 Flexible surgical robot system and control method thereof
US11395708B2 (en) 2018-06-20 2022-07-26 Gabriel Gruionu Systems and methods for automatic guidance of medical catheters and endoscopes

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2440130A4 (en) 2009-06-08 2015-06-03 Mri Interventions Inc Mri-guided surgical systems with proximity alerts
US8396532B2 (en) 2009-06-16 2013-03-12 MRI Interventions, Inc. MRI-guided devices and MRI-guided interventional systems that can track and generate dynamic visualizations of the devices in near real time

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5749362A (en) * 1992-05-27 1998-05-12 International Business Machines Corporation Method of creating an image of an anatomical feature where the feature is within a patient's body
US6402737B1 (en) * 1998-03-19 2002-06-11 Hitachi, Ltd. Surgical apparatus
WO2003086190A1 (en) * 2002-04-10 2003-10-23 Stereotaxis, Inc. Systems and methods for interventional medicine
US20040077939A1 (en) * 2001-02-22 2004-04-22 Rainer Graumann Device and method for controlling surgical instruments


Also Published As

Publication number Publication date
WO2007005367A3 (en) 2007-03-01

Similar Documents

Publication Publication Date Title
WO2007005367A2 (en) Robotic image guided catheter-based surgical devices and techniques
US11045077B2 (en) Autofocus and/or autoscaling in telesurgery
Freschi et al. Technical review of the da Vinci surgical telemanipulator
Tendick et al. Applications of micromechatronics in minimally invasive surgery
US8396598B2 (en) Microsurgical robot system
US8583274B2 (en) Method for graphically providing continuous change of state directions to a user of medical robotic system
US8419717B2 (en) Control system configured to compensate for non-ideal actuator-to-joint linkage characteristics in a medical robotic system
Taylor et al. Medical robotics and computer-integrated interventional medicine
US11179213B2 (en) Controllers for robotically-enabled teleoperated systems
Ginoya et al. A historical review of medical robotic platforms
US11950869B2 (en) System and method for providing on-demand functionality during a medical procedure
CN109431606A (en) A kind of blood vessel intervention operation robot combined system and its application method
CN115334993A (en) System and method for constrained motion control of a medical instrument
Yen et al. Shared control for a handheld orthopedic surgical robot
WO2021191690A1 (en) Systems and methods of communicating thermal information for surgical robotic devices
WO2023180891A1 (en) Physician console generating haptic vibration for teleoperation
WO2021191693A1 (en) Hand-manipulated input device for robotic system
Heller Robotic laparoscopic surgery

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06773994

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 06773994

Country of ref document: EP

Kind code of ref document: A2