US20040015079A1 - Ultrasound probe with integrated electronics - Google Patents

Ultrasound probe with integrated electronics

Info

Publication number
US20040015079A1
US20040015079A1
Authority
US
United States
Prior art keywords
interface
data
ultrasonic
image
imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/386,360
Inventor
Noah Berger
Michael Brodsky
Alice Chiang
Mark LaForest
William Wong
Xingbai He
Peter Chang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TeraTech Corp
Original Assignee
TeraTech Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/449,780 external-priority patent/US6530887B1/en
Priority claimed from US09/822,764 external-priority patent/US6669633B2/en
Priority claimed from US10/094,950 external-priority patent/US6969352B2/en
Priority claimed from US10/354,946 external-priority patent/US9402601B1/en
Application filed by TeraTech Corp filed Critical TeraTech Corp
Priority to US10/386,360 priority Critical patent/US20040015079A1/en
Assigned to TERATECH CORPORATION reassignment TERATECH CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BERGER, NOAH, BRODSKY, MICHAEL, CHANG, PETER P., CHIANG, ALICE M., HE, XINGBAI, LAFOREST, MARK, WONG, WILLIAM
Publication of US20040015079A1 publication Critical patent/US20040015079A1/en
Priority to US13/846,231 priority patent/US11547382B2/en
Abandoned legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0833Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/0841Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating instruments
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/54Control of the diagnostic device
    • A61B8/546Control of the diagnostic device involving monitoring or regulation of device temperature
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B10/00Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B10/02Instruments for taking cell samples or for biopsy
    • A61B10/04Endoscopic instruments
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B18/00Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B18/02Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by cooling, e.g. cryogenic techniques
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0866Detecting organic movements or changes, e.g. tumours, cysts, swellings involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0883Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4427Device being portable or laptop-like
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4444Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4483Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461Displaying means of special interest
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461Displaying means of special interest
    • A61B8/463Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/467Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient characterised by special input means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/56Details of data transmission or power supply
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/56Details of data transmission or power supply
    • A61B8/565Details of data transmission or power supply involving data transmission via a network
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/58Testing, adjusting or calibrating the diagnostic device
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M25/00Catheters; Hollow probes
    • A61M25/01Introducing, guiding, advancing, emplacing or holding catheters
    • A61M25/0105Steering means as part of the catheter or advancing means; Markers for positioning
    • A61M25/0108Steering means as part of the catheter or advancing means; Markers for positioning using radio-opaque or ultrasound markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61MDEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M5/00Devices for bringing media into the body in a subcutaneous, intra-vascular or intramuscular way; Accessories therefor, e.g. filling or cleaning devices, arm-rests
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61NELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00Radiation therapy
    • A61N5/10X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/103Treatment planning systems
    • A61N5/1039Treatment planning systems using functional images, e.g. PET or MRI
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8909Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
    • G01S15/8915Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
    • G01S15/892Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array the array being curvilinear
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/899Combination of imaging systems with ancillary equipment
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52023Details of receivers
    • G01S7/52025Details of receivers for pulse systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52079Constructional features
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52079Constructional features
    • G01S7/5208Constructional features with integration of processing functions inside probe or scanhead
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0808Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the brain
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4444Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B8/4472Wireless probes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461Displaying means of special interest
    • A61B8/465Displaying means of special interest adapted to display user selection data, e.g. icons or menus
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8909Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
    • G01S15/8915Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
    • G01S15/8918Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array the array being linear
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/89Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8979Combined Doppler and pulse-echo imaging systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/52017Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S7/52053Display arrangements
    • G01S7/52055Display arrangements in association with ancillary recording equipment

Definitions

  • Conventional ultrasound imaging systems typically include a hand-held probe coupled by cables to a large rack-mounted console processing and display unit.
  • the probe typically includes an array of ultrasonic transducers which transmit ultrasonic energy into a region being examined and receive reflected ultrasonic energy returning from the region.
  • the transducers convert the received ultrasonic energy into low-level electrical signals which are transferred over the cable to the processing unit.
  • the processing unit applies appropriate beam forming techniques to combine the signals from the transducers to generate an image of the region of interest.
  • Typical conventional ultrasound systems include a transducer array, with each transducer associated with its own processing circuitry located in the console processing unit.
  • the processing circuitry typically includes driver circuits which, in the transmit mode, send precisely timed drive pulses to the transducer to initiate transmission of the ultrasonic signal. These transmit timing pulses are forwarded from the console processing unit along the cable to the scan head.
  • beamforming circuits of the processing circuitry introduce the appropriate delay into each low-level electrical signal from the transducers to dynamically focus the signals such that an accurate image can subsequently be generated.
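The per-channel focusing delay described above can be sketched with a simple delay-and-sum geometry calculation. This is a minimal geometric model with illustrative element pitch, focal depth and sound speed, not the charge-domain beamformer used in the probe:

```python
import math

def focus_delays(num_elements, pitch_m, focus_m, c=1540.0):
    """Per-element delays (seconds) that align echoes from a focal
    point directly ahead of the array center (simple geometric model).
    c is an assumed average speed of sound in tissue, in m/s."""
    center = (num_elements - 1) / 2.0
    extra = []
    for i in range(num_elements):
        x = (i - center) * pitch_m             # element position along the array
        path = math.sqrt(x * x + focus_m ** 2) # element-to-focus distance
        extra.append((path - focus_m) / c)     # extra travel time vs. center
    # Shift so all delays are non-negative: edge elements (longest path)
    # get zero delay, the center elements wait the longest.
    dmax = max(extra)
    return [dmax - d for d in extra]

delays = focus_delays(8, pitch_m=0.3e-3, focus_m=0.04)
assert delays[0] < delays[3]   # center channels are delayed more than edges
```

Applying these delays before summation is what dynamically focuses the low-level channel signals into a coherent line of the image.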
  • control circuitry and beamforming circuitry are localized in a portable assembly. Such an integrated package simplifies the cable requirements of the assembly, without adding significant weight.
  • a system and method for gathering ultrasonic data on a standard user computing device and employing the data via an integrated interface program allows such ultrasonic data to be invoked by a variety of external applications having access to the integrated interface program via a standard, predetermined platform such as Visual Basic or C++.
  • the system provides external application integration in an ultrasonic imaging system by defining an ultrasonic application server for performing ultrasonic operations.
  • An integrated interface program with a plurality of entry points into the ultrasonic application server is defined. The entry points are operable to access each of the ultrasonic operations.
  • An external application sends a command indicative of at least one of the ultrasonic operations. The command is transmitted via the integrated interface program to the ultrasonic application server. Concurrently, at periodic intervals, raw ultrasonic data indicative of ultrasonic image information is received by the ultrasonic application server over a predetermined communication interface. A result corresponding to the command is computed by the ultrasonic application server, and transmitted to the external application by the integrated interface program.
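The command/result round trip described above can be modeled as a dispatch table of entry points into the application server. The operation names and result format below are purely illustrative, not the patent's actual interface:

```python
class UltrasoundAppServer:
    """Toy model of the ultrasonic application server: each entry point
    is a named operation an external application can invoke through the
    integrated interface program. (Names here are hypothetical.)"""
    def __init__(self):
        self._frozen = False
        self._entry_points = {
            "freeze": self._freeze,
            "unfreeze": self._unfreeze,
            "status": self._status,
        }

    def invoke(self, command):
        """Route a client command to the matching entry point and
        return the computed result to the external application."""
        op = self._entry_points.get(command)
        if op is None:
            return {"ok": False, "error": "unknown command"}
        return op()

    def _freeze(self):
        self._frozen = True
        return {"ok": True, "frozen": True}

    def _unfreeze(self):
        self._frozen = False
        return {"ok": True, "frozen": False}

    def _status(self):
        return {"ok": True, "frozen": self._frozen}

server = UltrasoundAppServer()
assert server.invoke("freeze")["frozen"] is True
assert server.invoke("status")["frozen"] is True
```

In the actual system the raw ultrasonic data arrives over a separate communication interface at periodic intervals; this sketch shows only the command-dispatch half of the exchange.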
  • An embodiment of the invention includes a probe having a plurality of circuit boards or circuit panels that are mounted within a generally rectangular cavity within a hand-held housing.
  • the circuit panels each have one or more integrated circuits and are mounted in planes that are parallel to one another.
  • These integrated circuits can be fabricated using a standard CMOS process that supports voltage levels between 3.3 V and 200 V.
  • a particular embodiment of the invention utilizes two or three circuit boards or panels, with a center panel having a system controller and a communication link to an external processor.
  • the center panel can be mounted between a pair of surrounding panels, each including a memory and a beamforming circuit.
  • the system accommodates the use of different probe elements and can employ a variable power supply that is adjusted to different levels for different probes. Also, it is desirable to use a variable clock generator so that different frequencies can be selected for different probes.
  • Another preferred embodiment of the invention provides a small probe that is connected by a first cable to an interface-housing.
  • the interface housing can contain the beamformer device and associated circuits and is a small, lightweight unit that can be held in one hand by the user while the other hand manipulates the probe.
  • the probe can be any of several conventional probes that can be interchangeably connected by cable to the interface housing.
  • the interface housing can be worn on the body of the user with a strap, on the forearm or the waist with a belt, for example, or in a pocket of the user.
  • a preferred embodiment using such an interface can include two or three circuit boards as described in greater detail herein.
  • the interface housing is connected to a personal computer by a standard FireWire or serial bus connection.
  • the probe incorporating the beamformer, or the probe with the interface housing can be connected to a wearable personal computer.
  • the computer performing scan conversion, post signal processing or color Doppler processing is located in a housing worn by the user, such as on the forearm, on the waist or in a pocket.
  • a power supply board can be inserted into the probe, into the interface housing or in another external pod and can include a DC-DC converter.
  • the display system can also include a head mounted display.
  • a hand-held controller can be connected to the computer or interface by wire or wireless connection.
  • a preferred embodiment of the invention can utilize certain safety features, including circuits that check the power supply voltage level, that test every channel of the beamformer and assist in setting gain levels, and that count pulses per second and automatically shut off the system to prevent over-radiating the patient.
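The pulse-counting shutoff can be illustrated with a windowed counter that trips when too many transmit pulses occur in one second. The pulse limit below is an arbitrary illustrative value, not a regulatory exposure limit:

```python
class PulseWatchdog:
    """Counts transmit pulses per one-second window and latches a
    shutoff when the limit is exceeded (illustrative safety sketch)."""
    def __init__(self, max_pulses_per_second):
        self.limit = max_pulses_per_second
        self.window_start = 0.0
        self.count = 0
        self.shutdown = False

    def pulse(self, t):
        """Register a transmit pulse at time t (seconds).
        Returns True if transmission may continue."""
        if t - self.window_start >= 1.0:  # start a new 1 s window
            self.window_start = t
            self.count = 0
        self.count += 1
        if self.count > self.limit:
            self.shutdown = True          # latch: over-radiation guard
        return not self.shutdown

wd = PulseWatchdog(max_pulses_per_second=3)
assert wd.pulse(0.1) and wd.pulse(0.2) and wd.pulse(0.3)
assert not wd.pulse(0.4)  # fourth pulse in the same second trips shutoff
```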
  • Another preferred embodiment of the invention employs the use of dedicated controls that the user can employ to perform specific tasks during a patient study.
  • These controls are readily accessible and intuitive to use. They provide for freezing or unfreezing the image on the display; recording an image in electronic memory; measuring distances in two dimensions using a marker or caliper, with a “set” function to fix two markers or calipers on screen; a track ball, touchpad or other manually manipulated element to control the marker; a time gain compensation control, such as eight slide pots, to correct for sound attenuation in the body; and a scale or depth control to provide a zoom feature and selection of focal zones.
  • the system can be employed with a number of probe systems and imaging methods. These include the generation of color Doppler, power Doppler and spectral density studies. These studies can be aided by the use of contrast agents that are introduced into the body during a study to enhance the response to ultrasound signals. Such agents can also include medications that are acoustically released into the body when they are activated by specific acoustic signals generated by the probe transducer array.
  • a system for ultrasonic imaging includes a probe and a computing device.
  • the probe has a transducer array, control circuitry and a digital communication control circuit.
  • the control circuitry includes a transmit/receive module, beamforming module and a system controller.
  • a computing device connects to the digital communication control circuit of the probe with a communication interface.
  • the computer processes display data.
  • the communication interface between the probe and the computing device is a wireless interface in several embodiments.
  • the wireless interface is a radio frequency (RF) interface.
  • the wireless interface is an infrared interface (IR).
  • the communication interface between the probe and the computing device is a wired link.
  • the beamforming module is a charge domain processor beamforming module.
  • the control circuitry has a pre-amp/time-gain compensation (TGC) module.
  • a supplemental display device is connected to the computing device by a second communication interface.
  • the supplemental display device is a computing device in several embodiments. At least one of the communication interfaces can be a wireless interface.
  • the communication between the probe and the computing device is a wireless interface.
  • the second communication interface between the supplemental display device and the computing device is wireless.
  • the second communication interface includes a hub to connect a plurality of secondary supplemental devices.
  • the ultrasonic imaging system includes a handheld probe system which is in communication with a remotely located computing device.
  • the computing device can be a handheld portable information device such as a personal digital assistant provided by Compaq or Palm, Inc.
  • the communication link between the probe and the computing device can be a wired link such as, but not limited to, IEEE 1394 (FireWire), or a wireless link.
  • the computing device may be used for controlling, monitoring or displaying ultrasonic imaging data.
  • a method of controlling an ultrasonic imaging system from a unitary operating position facilitates ultrasonic image processing by defining ultrasonic imaging operations and defining a range of values corresponding to each of the ultrasonic imaging operations.
  • An operator selects, via a first control, one of the ultrasonic imaging operations, and then selects, via a second control, a parameter in the range of values corresponding to the selected ultrasonic imaging operation.
  • the ultrasonic imaging system applies the selected ultrasonic imaging operation employing the selected parameter.
  • the operator produces the desired ultrasonic image processing results by employing both the first control and the second control from a common operating position with one hand, thereby allowing the operator to continue scanning with the free hand while continuing to control the ultrasonic imaging system.
  • the ultrasonic imaging system is controlled from a control keypad accessible from one hand of the operator, or user.
  • the other hand of the operator may therefore be employed in manipulating an ultrasonic probe attached to the ultrasonic imaging system for gathering ultrasonic data employed in the ultrasonic imaging operations.
  • the first control allows qualitative selection of the various ultrasonic imaging operations which may be invoked using the system.
  • the second control allows quantitative selection of parameters along a range to be employed in the ultrasonic operation.
  • the range of parameters may be a continuum, or may be a series of discrete values along the range.
  • the control keypad includes two keys for scrolling through the qualitative ultrasonic operations, and two keys for selecting the quantitative parameters along the corresponding range.
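The two-control scheme (one key pair scrolls through the qualitative operations, the other steps the selected operation's parameter along its range) can be sketched as a small state machine. The operation names and value ranges below are illustrative, not the system's actual controls:

```python
class TwoControlKeypad:
    """Sketch of one-handed control: the first control selects an
    operation, the second selects a parameter from that operation's
    range of discrete values. All names/ranges are hypothetical."""
    def __init__(self):
        self.ops = ["depth", "gain", "focus"]
        self.ranges = {
            "depth": [4, 8, 12, 16],          # cm, discrete steps
            "gain": list(range(0, 101, 10)),  # percent
            "focus": [1, 2, 3],               # focal zone index
        }
        self.op_idx = 0
        self.param_idx = {op: 0 for op in self.ops}  # remembered per op

    def next_op(self):
        """First control: scroll to the next operation."""
        self.op_idx = (self.op_idx + 1) % len(self.ops)

    def param_up(self):
        """Second control: step the selected operation's parameter."""
        op = self.ops[self.op_idx]
        r = self.ranges[op]
        self.param_idx[op] = min(self.param_idx[op] + 1, len(r) - 1)

    def current(self):
        op = self.ops[self.op_idx]
        return op, self.ranges[op][self.param_idx[op]]

pad = TwoControlKeypad()
pad.param_up(); pad.param_up()     # depth: 4 -> 8 -> 12
assert pad.current() == ("depth", 12)
pad.next_op(); pad.param_up()      # switch to gain, 0 -> 10
assert pad.current() == ("gain", 10)
```

Each operation keeps its own parameter position, so scrolling away and back does not lose the operator's setting.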
  • the ultrasonic imaging system in accordance with preferred embodiments may be used for patient monitoring systems such as bedside monitoring systems, pacemaker monitoring, image guided implants, and pacemaker implantation. Further, preferred embodiments of the systems of the present invention may be used for cardiac rhythm management, radiation therapy systems and image guided surgery, such as, but not limited to, image guided neurosurgery, breast biopsy and computer enabled surgery.
  • the ultrasonic imaging operations which may be invoked include scanning operations, to be applied to live, real time ultrasonic image gathering, and processing operations, which may be applied to live or frozen ultrasonic images.
  • Typical scanning ultrasonic imaging operations which are known to those skilled in the art and which may be applied by the ultrasonic imaging system include size, depth, focus, gain, Time Gain Compensation (TGC) and TGC lock.
  • Typical processing ultrasonic imaging operations include view, inversion, palette, smoothing, persistence, map, and contrast.
  • Preferred embodiments of the present invention include control and data transfer methods that allow a third party Windows® based application to control, for example, a portable Windows® based ultrasound system by running the ultrasound application as a background task, sending control commands to the ultrasound application server and receiving images (data) in return. Further, the embodiment configures a portable ultrasound Windows® based application as a server of live ultrasound image frames supplying another Windows® based application that acts as a client. This client application receives these ultrasound image frames and processes them further.
  • an alternate embodiment configures the portable ultrasound Windows® based application as a server interacting with a third party client application via two communication mechanisms: a component object model (COM) automation interface, used by the third party (hereinafter referred to as an external application or client) to start up and control the portable ultrasound Windows® based application, and a high-speed shared memory interface to deliver live ultrasound images.
  • a preferred embodiment includes and configures a shared memory interface to act as a streaming video interface between a portable Windows® based Ultrasound application and another third party Windows® based application.
  • This streaming video interface is designed to provide ultrasound images to a third party client in real-time.
  • a preferred embodiment allows the third party Windows® based application to control the flow rate of images from the portable ultrasound Windows® based application through the shared memory interface within the same PC platform and the amount of memory required to implement this interface.
  • These controls consist of a way to set the number of image buffers, the size of each buffer, and the rate of image transfer. The flow rate control can be set for zero data loss, thus ensuring that every frame is delivered from the ultrasound system to the third party Windows® based application, or for minimum latency, thus delivering the latest frame generated by the ultrasound system to the third party Windows® based application first.
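The buffer-count, buffer-size, and transfer-policy controls described above can be sketched as a small ring buffer. This is only an illustrative model, not the patent's implementation; the names and the drop-oldest behavior used to approximate the minimum-latency policy are assumptions.

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical sketch of the shared-memory flow-rate control described
 * above; the API and policies are illustrative, not the patent's. */
enum xfer_policy { XFER_ZERO_LOSS, XFER_MIN_LATENCY };

#define MAX_FRAMES 64

struct frame_ring {
    int frames[MAX_FRAMES]; /* frame ids stand in for image buffers */
    size_t capacity;        /* configurable number of image buffers */
    size_t head, count;
    enum xfer_policy policy;
};

void ring_init(struct frame_ring *r, size_t capacity, enum xfer_policy p) {
    r->capacity = capacity < MAX_FRAMES ? capacity : MAX_FRAMES;
    r->head = r->count = 0;
    r->policy = p;
}

/* Producer (ultrasound server) side: returns false only when a
 * zero-loss ring is full and the frame must be retried later. */
bool ring_push(struct frame_ring *r, int frame_id) {
    if (r->count == r->capacity) {
        if (r->policy == XFER_ZERO_LOSS)
            return false;               /* never drop: caller must wait */
        r->head = (r->head + 1) % r->capacity; /* drop the oldest frame */
        r->count--;
    }
    r->frames[(r->head + r->count) % r->capacity] = frame_id;
    r->count++;
    return true;
}

/* Consumer (third party client) side: returns false when empty. */
bool ring_pop(struct frame_ring *r, int *frame_id) {
    if (r->count == 0)
        return false;
    *frame_id = r->frames[r->head];
    r->head = (r->head + 1) % r->capacity;
    r->count--;
    return true;
}
```

Under zero-loss the producer stalls rather than drops, so every frame reaches the client; under minimum latency the oldest frames are sacrificed so the client always sees the most recent ones.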
  • a preferred embodiment formats the ultrasound image frame such that probe, spatial, and temporal information can be interpreted by the third party Windows® based application as it retrieves the images (generated by the portable ultrasound Windows® based application) from the shared memory interface.
  • the actual image data passed between the server (i.e. portable ultrasound application) and the client application (third party Windows® based application) is a Microsoft® device independent bitmap (DIB) with 8-bit pixels and a 256-entry color table.
  • the image frame also contains a header that provides the following additional information, for example, but not limited to, Probe Type, Probe Serial Number, Frame Sequence Number, Frame Rate, Frame Timestamp, Frame Trigger Timestamp, Image Width (in pixels), Image Height (in pixels), Pixel Size (in X and Y), Pixel Origin (x,y location of the first pixel in the image relative to the Transducer Head), and Direction (spatial direction along or across each line of the image).
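The header fields listed above could be laid out as a fixed-size structure at the front of each shared-memory frame, followed by the 8-bit DIB pixel data. The field order, widths, and type choices below are assumptions for illustration; the patent does not specify a binary layout.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative layout for the per-frame header described above; the
 * actual field order, sizes, and names are assumptions. */
struct frame_header {
    uint32_t probe_type;
    uint32_t probe_serial_number;
    uint32_t frame_sequence_number;
    float    frame_rate;                     /* frames per second */
    uint64_t frame_timestamp;                /* e.g. microseconds */
    uint64_t frame_trigger_timestamp;
    uint32_t image_width;                    /* pixels */
    uint32_t image_height;                   /* pixels */
    float    pixel_size_x, pixel_size_y;     /* physical pixel size */
    float    pixel_origin_x, pixel_origin_y; /* first pixel relative to
                                                the transducer head */
    int32_t  direction;                      /* along/across each line */
};

/* Total bytes occupied by one frame in shared memory: the header plus
 * one byte per pixel (8-bit DIB pixels; the color table is shared). */
static inline size_t frame_total_bytes(const struct frame_header *h) {
    return sizeof(*h) + (size_t)h->image_width * h->image_height;
}
```

A client reading from the shared memory would interpret the header first, then use the width and height to locate the pixel data that follows it.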
  • the preferred embodiment controls the shared memory interface used to transfer ultrasound images between a Windows® based portable ultrasound system and a third party Windows® based system through the use of ActiveX controls.
  • the Windows® based portable ultrasound application contains an ActiveX control that transfers a frame into the shared memory and sends out a Windows® Event (that includes a pointer to the frame just written) to the third party Windows® based application.
  • This third party application has a similar ActiveX control that receives this Event and pulls the image frame out of shared memory.
  • the present invention includes a method for providing streaming video in an ultrasonic imaging system including providing an ultrasonic application server having at least one ultrasonic operation and corresponding ultrasonic data.
  • the method further includes sending, from an external application, a command indicative of one of the ultrasonic operations, executing the command in the ultrasonic application server to produce a corresponding result, and sending data from the ultrasonic application server to the external application.
  • a shared memory is in communication with the ultrasonic application server and the external application.
  • the method further includes an integrated interface program having a plurality of entry points into the application server, transmitting via the integrated interface program a command to the ultrasonic application server, receiving over a predetermined communication interface, ultrasonic data indicative of ultrasonic image formation and transmitting the result to the external application via the integrated interface program.
  • the integrated interface program is adapted to transmit real-time imaging data including ultrasonic imaging for radiation therapy planning and treatment, minimally invasive and robotic surgery methods including biopsy procedures, invasive procedures such as catheter introduction for diagnostic and therapeutic angiography, fetal imaging, cardiac imaging, vascular imaging, imaging during endoscopic procedures, imaging for telemedicine applications, imaging for veterinary applications, cryotherapy and ultrasound elastography.
  • the streaming video includes radio frequency data, real-time image data and transformation parameters.
  • the external application can reside on the same computing device as the ultrasonic application server or be resident on a different computing device.
  • the external application communicates with the ultrasonic application server through a control program using a component object model automation interface and a shared memory interface.
  • the command in the method for providing streaming video includes operations selected from the group consisting of ultrasound application initialization/shutdown functions such as, for example, start ultrasound application, load preset files, and exit application; ultrasound setup functions such as, for example, set shared memory parameters, initialize communication to shared memory, set image frame size, set shared memory size, set transfer priority (for low latency, high throughput, or first in, first out), and set image resolution and format; and ultrasound image capture functions such as, for example, freeze live data, fetch live data, and resume live imaging.
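The three command groups can be modeled as a simple dispatch table on the server side. The command names below are hypothetical; the actual interface is a COM automation interface rather than string commands, so this is only a sketch of how commands map to the groups listed above.

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical classification of commands into the three groups
 * described above: initialization/shutdown, setup, and image capture. */
enum cmd_group { CMD_INIT_SHUTDOWN, CMD_SETUP, CMD_CAPTURE, CMD_UNKNOWN };

struct cmd_entry { const char *name; enum cmd_group group; };

static const struct cmd_entry cmd_table[] = {
    { "start_application",     CMD_INIT_SHUTDOWN },
    { "load_presets",          CMD_INIT_SHUTDOWN },
    { "exit_application",      CMD_INIT_SHUTDOWN },
    { "set_shared_memory",     CMD_SETUP },
    { "set_frame_size",        CMD_SETUP },
    { "set_transfer_priority", CMD_SETUP }, /* low latency / throughput / FIFO */
    { "freeze_live_data",      CMD_CAPTURE },
    { "fetch_live_data",       CMD_CAPTURE },
    { "resume_live_imaging",   CMD_CAPTURE },
};

/* Look up which group a command belongs to. */
enum cmd_group classify_command(const char *name) {
    for (size_t i = 0; i < sizeof cmd_table / sizeof cmd_table[0]; i++)
        if (strcmp(cmd_table[i].name, name) == 0)
            return cmd_table[i].group;
    return CMD_UNKNOWN;
}
```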
  • the ultrasonic application server includes a graphical user interface having image control presets which are operable to store image settings.
  • the image settings include application controls such as, for example, image mode, patient name, patient ID; B-mode controls, for example, size, depth, focus, TGC, change examination type; M-mode controls, for example, sweep speed, scan line position; image quality controls, for example, brightness, contrast, invert, palette, smoothing persistence; and Doppler controls, for example, color region of interest, pulse repetition rate, wall filter, steering angle, color gain, color invert, color priority, color baseline and line density control.
  • a display can be integrated into the probe housing and/or the interface housing.
  • FIG. 1 is a schematic block diagram of an integrated probe system.
  • FIGS. 2A-2C illustrate a particular embodiment of packaging integrated probe electronics.
  • FIG. 3A is a schematic block diagram of a particular embodiment of an integrated probe system.
  • FIGS. 3B and 3C illustrate embodiments of the transmit/receive circuit.
  • FIG. 3D illustrates an alternate embodiment in which the probe housing is separated from the interface housing by a cable.
  • FIG. 4A is a block diagram of a particular 1-dimensional time-domain beamformer.
  • FIG. 4B illustrates another preferred embodiment of a beamformer in accordance with the invention.
  • FIG. 5A is a functional block diagram of the system controller of FIG. 3.
  • FIG. 5B schematically illustrates a timing diagram for the control of modules in the system.
  • FIG. 6 shows a block diagram of an ultrasonic imaging system adapted for external application integration as defined by the present claims.
  • FIG. 7A shows an integrated interface program operable for use with a local external application.
  • FIG. 7B shows an integrated interface program operable for use with a remote external application.
  • FIG. 8 shows a flowchart of external application integration as defined herein.
  • FIG. 9 shows a graphical user interface (GUI) for use with the ultrasonic imaging system as defined herein.
  • FIG. 10 is a preferred embodiment of a portable ultrasound system in accordance with the invention.
  • FIG. 11 illustrates a wearable or body mounted ultrasound system in accordance with the invention.
  • FIG. 12 illustrates an interface system using a standard communication link to a personal computer.
  • FIG. 13 shows the top-level screen of a graphical user interface (GUI) for controlling the ultrasonic imaging system.
  • FIG. 14 shows a unitary control keypad for use in conjunction with the GUI of FIGS. 15A-15B.
  • FIG. 15A shows a graphical user interface (GUI) for controlling the scanning operations of the ultrasonic imaging system.
  • FIG. 15B shows a graphical user interface (GUI) for controlling the processing operations of the ultrasonic imaging system.
  • FIG. 16 shows a state diagram corresponding to the GUI of FIGS. 15A-15B.
  • FIG. 17A is a block diagram illustrating an ultrasound imaging system with wired and wireless communication.
  • FIG. 17B is a block diagram illustrating an ultrasound imaging system with wireless and wired communication.
  • FIG. 17C is a block diagram illustrating an ultrasound imaging system with wireless communication.
  • FIG. 18 is a block diagram illustrating an ultrasound imaging system with a remote or secondary controller/viewer and wireless communication.
  • FIG. 19 is a block diagram illustrating an ultrasound imaging system with wired and wireless network communication capability.
  • FIG. 20 is a diagram illustrating further details of the architecture of the ultrasound imaging system in accordance with a preferred embodiment of the present invention.
  • FIG. 21 is a diagram of a wireless viewer graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 22 is a diagram of a facility wide ultrasound image distribution system in accordance with a preferred embodiment of the present invention.
  • FIG. 23 is a diagram illustrating an ultrasound imaging system in accordance with a preferred embodiment of the present invention.
  • FIG. 24 is a block diagram illustrating a personal digital assistant (PDA) in communication with the host computer or probe system in accordance with preferred embodiment of the present invention.
  • FIGS. 25A-25C illustrate an ultrasound system in accordance with a preferred embodiment of the present invention integrated with an angiography system, a high frequency image of the carotid artery with directional power Doppler and an image of the carotid artery with simultaneous quantitative spectral Doppler, respectively.
  • FIGS. 26A and 26B illustrate an ultrasound image of vessel walls in accordance with a preferred embodiment of the system of the present invention and a catheter used with the system, respectively.
  • FIGS. 27A and 27B illustrate a radiation planning system integrating the ultrasound system in accordance with preferred embodiments of the present invention and the probe of the ultrasound system, respectively.
  • FIGS. 28A and 28B illustrate an ultrasonic imaging system for cryotherapy in accordance with a preferred embodiment of the present invention and a probe used in the system, respectively.
  • FIG. 29 is a schematic diagram illustrating a robotic imaging and surgical system integrating the ultrasound system in accordance with a preferred embodiment of the present invention.
  • FIG. 30 is a schematic diagram illustrating an imaging and telemedicine system integrating the ultrasound system in accordance with a preferred embodiment of the present invention.
  • FIGS. 31A and 31B are three-dimensional images from fetal imaging obtained from an ultrasound system in accordance with a preferred embodiment of the present invention.
  • FIG. 32 is a block diagram illustrating the structure of the physical shared memory in accordance with a preferred embodiment of the present invention.
  • FIG. 33 is a schematic block diagram of the processing flow between the server side, the client side and the shared memory control in accordance with a preferred embodiment of the present invention.
  • FIG. 34 is a view of a graphical user interface of the Autoview user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 35 illustrates a view of a main screen display of a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIGS. 36A-36C are views of a graphical user interface showing the icons used to control the size of windows, and the creation of floating windows in accordance with a preferred embodiment of the present invention.
  • FIGS. 37A and 37B are views of a graphical user interface illustrating a patient folder and an image folder directory in accordance with a preferred embodiment of the present invention.
  • FIG. 38 illustrates a tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention, whereby different modes of imaging can be selected.
  • FIG. 39 illustrates a measurement tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 40 illustrates a playback tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIGS. 41A and 41B illustrate Live/Freeze interface buttons in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 42 illustrates a file tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 43 illustrates a view of patient information screen in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 44 illustrates further interface buttons in a patient interface screen in accordance with a preferred embodiment of the present invention.
  • FIG. 45 illustrates a view of a screen for adding a new patient in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 46 illustrates an image in the B-mode including the controls provided by a graphical user interface in the B-mode in accordance with a preferred embodiment of the present invention.
  • FIGS. 47A-47H illustrate the control interfaces for adjusting a B-mode image in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 48 illustrates the image quality control setting provided in the B-mode image option in accordance with a preferred embodiment of the present invention.
  • FIG. 49 illustrates an M-mode image and the controls provided to adjust the M-mode image in accordance with a preferred embodiment of the present invention.
  • FIG. 50 illustrates a portable ultrasound imaging system including a hand-held probe with an integrated display in accordance with a preferred embodiment of the present invention.
  • FIGS. 51A and 51B are embodiments illustrating a hand-held scan head having a display unit integrated on the scan head housing or the display unit being attached to the scan head, respectively, in accordance with the present invention.
  • FIG. 52 illustrates the single board computer and beamformer circuits that form the processing unit in accordance with a preferred embodiment of the present invention.
  • FIGS. 53A and 53B illustrate alternate preferred embodiments wherein the beamformer electronics are housed either in the hand-held scan head assembly or in the processing unit, respectively, in accordance with the present invention.
  • FIG. 54 illustrates a trapezoidal scan format using a uniform angular increment with non-uniform tangential increment.
  • FIG. 55 illustrates a trapezoidal scan format using uniform tangential increment with non-uniform angle increment.
  • FIG. 56 schematically describes preferred embodiments for generating trapezoidal scan formats in accordance with the present invention.
  • FIG. 58 illustrates the steering angle as a function of scan line for the uniform scan line position approach.
  • FIG. 1 is a schematic block diagram of an integrated probe system. Illustrated are a target object 1, a front-end probe 3, a host computer 5, and a supplemental display/recording device 9.
  • the front-end probe 3 integrates a transducer array 10 and control circuitry into a single hand-held housing.
  • the control circuitry includes a transmit/receive module 12 , a pre-amp/time-gain compensation (TGC) module 14 , a charge domain processor (CDP) beamforming module 16 , and a system controller 18 .
  • Memory 15 stores program instructions and data.
  • the CDP beamformer integrated circuit 16 includes a computational capacity that can be used to calculate the delay coefficients used in each channel.
  • the probe 3 interfaces with the host computer 5 over a communications link 40 , which can follow a standard high-speed communications protocol, such as the FireWire (IEEE P1394 Standards Serial Interface) or fast (e.g., 200 Mbits/second or faster) Universal Serial Bus (USB 2.0) protocol.
  • the standard communication link to the personal computer operates at least at 100 Mbits/second or higher, preferably at 200 Mbits/second, 400 Mbits/second or higher.
  • the link 40 can be a wireless connection such as an infrared (IR) link.
  • the probe 3 thus includes a communications chipset 20 .
  • the components in the portable ultrasound system require a continuous source of data for correct operation.
  • the beamformer 16 requires steering data
  • the transmit circuitry 12 requires data to instruct it where to focus the next pulse and when to fire
  • the TGC 14 needs to know what gain level is appropriate at the given time.
  • further information may be required synchronous to the scanning operation to control how the beamformed data is sent back to the host. For instance, a DATAVALID signal can be helpful to reduce the amount of data that the host 5 actually has to process.
  • the various parts of the ultrasound system rely on common synchronization for the system to work in harmony. For example, the transmitter must be fired at an exact time with respect to when the beamformer is looking at a particular position.
  • FIGS. 2 A- 2 C illustrate a particular embodiment of integrated probe electronics.
  • FIG. 2A is a perspective view showing a transducer array housing 32 , an upper circuit board 100 A, a lower circuit board 100 B, and a central circuit board 200 . Also shown is a lower Molex connector 150 B carrying data and signal lines between a central circuit board 200 and the lower circuit board 100 B.
  • the transducer array housing 32 can be a commercially available unit having a pair of flexible cable connectors 120 A, 120 B (see FIG. 2C) connected to the upper board 100 A and lower board 100 B, respectively, with strain relief.
  • FIG. 2B is a back-end view of the probe, which also shows an upper Molex connector 150 A.
  • FIG. 2C is a side-view of the probe. Using 8 mm high Molex connectors 150 A, 150 B, the entire stack has a thickness of approximately 30 mm or less, with this particular embodiment being about 21 mm.
  • More functionality can be included within the hand-held probe such as a wireless IEEE 1394 connection to the personal computer.
  • a display can be mounted directly on the hand-held probe, for example, to provide a more usable and user-friendly instrument.
  • FIG. 3A is a schematic block diagram of a particular embodiment of an integrated probe system.
  • the host computer 5 can be a commercially available personal computer having a microprocessor CPU 52 and a communications chipset 54 .
  • a communications cable 40 is connected through a communications port 56 to the communications chipset 54 .
  • the front-end probe 3 ′ includes a transducer head 32 , which can be an off-the-shelf commercial product, and an ergonomic hand-held housing 30 .
  • the transducer head 32 houses the transducer array 10 .
  • the housing 30 provides a thermally and electrically insulated molded plastic handle that houses the beamforming and control circuitry.
  • The beamforming circuitry can be embodied in a pair of analog circuit boards 100 A, 100 B.
  • Each analog circuit board 100 A, 100 B includes a respective transmit/receive chip 112 A, 112 B; a preamp/TGC chip 114 A, 114 B; a beamformer chip 116 A, 116 B; all of which are interconnected with a pair of the memory chips 115 A- 1 , 115 B- 1 , 115 A- 2 , 115 B- 2 via an operational bus 159 A, 159 B.
  • the memory chips are Video Random Access Memory (VRAM) chips and the operational bus is 32 bits wide.
  • preamp/TGC chips 114 and beamformer chips 116 operate on 32 channels simultaneously.
  • the transmit/receive chips 112 include a 64 channel driver and a 64-to-32 demultiplexer.
  • FIG. 4A is a block diagram of a particular 1-dimensional time-domain beamformer.
  • the beamformer 600 features 32-channel programmable apodized delay lines.
  • the beamformer 600 can include on-chip output bandpass filtering and analog-to-digital conversion.
  • the beamformer 600 includes a plurality of single channel beamforming processors 620-1, . . . , 620-J. Imaging signals are represented by solid leader lines, digital data is represented by dashed leader lines, and clock and control signals are illustrated by alternating dot and dash leader lines.
  • a timing controller 610 and memory 615 interface with the single channel beamforming processors 620 .
  • Each single channel beamforming processor includes clock circuitry 623, memory and control circuitry 625, a programmable delay unit with sampling circuitry 621, and a multiplier circuit 627.
  • Each programmable delay unit 621 receives an imaging signal echo E from a respective transducer element.
  • the outputs from the single channel beamforming processors 620 are added in a summer 630 .
  • a finite impulse response (FIR) filter 640 processes the resulting imaging signal, which is digitized by the analog-to-digital (A/D) converter 650.
  • both the FIR filter 640 and the A/D converter 650 are fabricated on chip with the beamforming processors 620 .
  • VRAM is a standard Dynamic RAM (DRAM) with an additional higher-speed serial access port. While DRAM has two basic operations, for example, read and write memory location, VRAM adds a third operation: transfer block to serial readout register. This transfers a block (typically 128 or 256 words) of data to the serial readout register, which can then be clocked out at a constant rate without further tying up the DRAM core. Thus refresh, random access data read/write, and sequential readout can operate concurrently. Alternate embodiments may include a synchronous DRAM (SDRAM) memory.
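The three VRAM operations described above can be modeled in a few lines to show why the block transfer frees the DRAM core during readout. The core size, block size, and function names are illustrative assumptions.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Toy model of the VRAM described above: a random-access DRAM core
 * plus a serial readout register.  Sizes are arbitrary. */
#define CORE_WORDS  1024
#define BLOCK_WORDS 256   /* typical transfer block: 128 or 256 words */

struct vram {
    uint16_t core[CORE_WORDS];     /* random-access DRAM core */
    uint16_t serial[BLOCK_WORDS];  /* serial readout register */
    size_t   serial_pos;           /* next word to be clocked out */
};

/* Operations 1 and 2: random read/write of the DRAM core. */
void vram_write(struct vram *v, size_t addr, uint16_t data) { v->core[addr] = data; }
uint16_t vram_read(const struct vram *v, size_t addr) { return v->core[addr]; }

/* Operation 3: transfer a block to the serial register.  After this,
 * the core is free again while words are clocked out of the register. */
void vram_transfer_block(struct vram *v, size_t block_start) {
    memcpy(v->serial, &v->core[block_start], sizeof v->serial);
    v->serial_pos = 0;
}

/* Serial clock: emits the next word without touching the DRAM core,
 * which is why refresh and host writes can proceed concurrently. */
uint16_t vram_serial_clock(struct vram *v) {
    return v->serial[v->serial_pos++ % BLOCK_WORDS];
}
```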
  • dual-ported operation is beneficial so the data loading performed by the host 5 can be decoupled from data sent to memory modules.
  • a modular architecture which allows additional VRAMs to be added in order to obtain additional bandwidth is useful, particularly when the exact data rate requirements may change. Using wide memories, the data does not have to be buffered before going to the various destination modules in the system.
  • a particular embodiment uses five 256 Kword by 16 bit VRAMs which yields a total of 80 output lines. If fewer output lines are required, fewer VRAMs can be used. If more output lines are required, only very minor modifications to the controller have to be made.
  • VRAM is lower density than other varieties of DRAM.
  • the use of SDRAM implies that the modules accept data bursts instead of continuous data. Additionally, more buffering of host data can be used or else concurrent readout and loading may not be possible.
  • Using a multiple data rate feature in the controller can reduce the storage requirements, making VRAM suitable for a first embodiment.
  • a further preferred embodiment uses SDRAM to provide further improvements in the speed and capacity of the system.
  • the control circuitry is embodied in a digital circuit board 200 .
  • the digital circuit board 200 includes a FireWire chipset 220 , a system control chip 218 to control the scan head, and a memory chip 215 .
  • the memory chip 215 is a VRAM chip and the system control chip 218 is interconnected to the various memory chips 115 , 215 over a control bus 155 , which in this particular application is 16 bits wide.
  • the system control chip 218 provides scan head control signals to the transmit/receive chips 112 A, 112 B over respective signal lines 152 A, 152 B.
  • the transmit/receive chips 112 A, 112 B energize the transducer array 10 over transmit lines 124 A, 124 B.
  • Received energy from the transducer array 10 is provided to the transmit/receive chips 112 A, 112 B over receive lines 122 A, 122 B.
  • the received signals are provided to the pre-amp/TGC chips 114 A, 114 B.
  • the signals are provided to the beamformer chips 116 A, 116 B. Control signals are exchanged between the beamformer and the system controller over signal lines 154 A, 154 B to adjust the scan beam.
  • the five VRAM chips 115 A- 1 , 115 A- 2 , 115 B- 1 , 115 B- 2 , 215 serve to supply the real-time control data needed by the various operating modules.
  • the term “operating modules” refers to the different parts of the system that require control data - namely the beamformers 116 A, 116 B, transmit/receive chips 112 A, 112 B, and preamp/TGC chips 114 A, 114 B.
  • the system controller 218 maintains proper clocking and operation of the VRAM to assure continuous data output. Additionally, it generates clocks and control signals for the various operating modules of the system so that they know when the data present at the DRAM serial port output is for them. Finally, it also interfaces with the host (PC) 5 via a PC communications protocol (e.g., FireWire or high speed bus) to allow the host 5 to write data into the VRAM.
  • VRAMs are shared by multiple modules.
  • the 64-bit output of four VRAMs 115 A- 1 , 115 A- 2 , 115 B- 1 , 115 B- 2 is used by both the transmit module and the beamformer. This is not a problem, because typically only one requires data at any given time. Additionally, the transmit module chip uses relatively less data and thus it is wasteful to have to dedicate entire VRAMs for transmit operations.
  • codes are embedded in the VRAM data; the controller deciphers these codes and asserts the appropriate MODCLOCK line.
  • the fifth VRAM 215 is used to generate data that is not shared by multiple modules. For example, it is convenient to put the control for the TGC here because that data is required concurrently with beamformer data. It can also be useful to have one dedicated control bit which indicates when valid data is available from the beamformer and another bit indicating frame boundaries. Thus, because the location of the data in the VRAM corresponds to the position in the frame scanning sequence, additional bits are synchronized with the operation of the system. CCD clock enable signals can also be generated to gate the CCD clock to conserve power. Lastly, the VRAM can be used to generate test data for a D/A converter to test the analog circuitry with known waveforms.
  • the number of VRAMs may be reduced.
  • the four shared VRAM chips may be merged into two SDRAM chips in a 128 line system, for example.
  • the data sent to the beamformer and transmit modules are bit-serial within a channel, with all channels being available in parallel.
  • For the transmit module two transmit channels share each bit line with alternating clocks strobing in data for the two channels. All per channel transmit module coefficients (such as start time) are presented bit-serially.
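The bit-serial arrangement can be illustrated as a bit transpose: each output word carries one bit from every channel in parallel, so a 16-bit per-channel coefficient (such as a start time) is delivered over 16 clocks. The 32-channel width and 16-bit coefficient size below are assumptions for this sketch.

```c
#include <stdint.h>

/* Sketch of the bit-serial-within-a-channel, parallel-across-channels
 * data arrangement described above.  32 channels assumed. */
#define CHANNELS 32

/* Transpose per-channel 16-bit coefficients into per-clock words:
 * bit c of out[t] is bit t of coeff[c], i.e. clock t presents bit t
 * of every channel's coefficient simultaneously. */
void serialize_coeffs(const uint16_t coeff[CHANNELS], uint32_t out[16]) {
    for (int t = 0; t < 16; t++) {          /* one word per bit-time */
        uint32_t word = 0;
        for (int c = 0; c < CHANNELS; c++)
            word |= (uint32_t)((coeff[c] >> t) & 1u) << c;
        out[t] = word;
    }
}
```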
  • a run consists of a one word header, which is interpreted by the VRAM controller, followed by zero or more actual data words which are used by the various modules.
  • the headers (see Table 1) specify where the data in the run is destined, how fast it should be clocked out, and how many values there are in the run. (Note that the run destination is only for the data coming out of the 4 VRAMs. The bits coming out of the controller VRAM always have the same destinations.)
  • the headers are also used to encode the special instructions for Jump, Pause, and End described below.
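Since Table 1 is not reproduced here, the following sketch assumes one possible packing of the run header into a 16-bit VRAM word: a 2-bit destination, a 2-bit RATE, and a 12-bit run length. The field widths and positions are assumptions, not the patent's actual encoding.

```c
#include <stdint.h>

/* Hypothetical packing of the one-word run header described above
 * (destination, clock-out rate, run length) into a 16-bit VRAM word. */
#define HDR_DEST_SHIFT 14
#define HDR_RATE_SHIFT 12
#define HDR_LEN_MASK   0x0FFFu

uint16_t pack_header(unsigned dest, unsigned rate, unsigned len) {
    return (uint16_t)(((dest & 3u) << HDR_DEST_SHIFT) |
                      ((rate & 3u) << HDR_RATE_SHIFT) |
                      (len & HDR_LEN_MASK));
}

/* Field extractors the controller would apply when parsing a header. */
unsigned hdr_dest(uint16_t h) { return (h >> HDR_DEST_SHIFT) & 3u; }
unsigned hdr_rate(uint16_t h) { return (h >> HDR_RATE_SHIFT) & 3u; }
unsigned hdr_len(uint16_t h)  { return h & HDR_LEN_MASK; }
```

In this model the special Jump, Pause, and End instructions would be encoded as reserved destination/length combinations, mirroring how the text describes them as header variants.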
  • the data in the VRAM are read out basically sequentially but some variations are allowed to reduce the memory requirements and facilitate system operation based on several observations about how the ultrasound system operates.
  • RATE field can only take on the values 0, 2 and 3 because a pause of RATE 1 is interpreted as a wait command, described next. This is not a problem, however, because typically only RATE 0 is used for maximum wait accuracy (to within one clock) and RATE 3 is used for maximum wait time (up to 16376 clock cycles).
  • the buffering is achieved by a 16 K by 18 FIFO while the flow control is achieved by feeding the FIFO fullness indication back to the system controller 218 .
  • the scanning stops until the FIFO has been emptied.
  • the scanning should not stop arbitrarily because it is timed with the propagation of the sound waves.
  • explicit synchronization points can be inserted into the code, and at these points the controller waits until the FIFO is empty enough to proceed safely. The wait command is used to indicate these synchronization points.
  • the wait command causes the controller to wait until the WAITPROCEED line is high. In one embodiment, this is connected (via the aux FPGA) to the “not half-full” indicator on the FIFO.
  • the wait commands can be placed at least every 8 K data-generating cycles to assure that data overflow cannot occur. Because this is greater than one ultrasound line, it still allows multi-line interleaving to be used.
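The interaction between the FIFO's half-full feedback and the 8 K wait spacing can be checked numerically: starting from any not-half-full state (at most 8191 words buffered), a further 8 K data-generating cycles cannot overflow a 16 K FIFO. A minimal model, with illustrative names:

```c
#include <stdbool.h>
#include <stddef.h>

/* Sketch of the output flow control described above: a 16K-deep FIFO
 * whose "not half-full" flag gates WAIT commands in the readout
 * sequence.  The constants mirror the text; the API is illustrative. */
#define FIFO_DEPTH    16384
#define WAIT_INTERVAL 8192   /* wait commands at least every 8K cycles */

struct fifo_model {
    size_t fill;   /* words currently buffered */
};

/* WAITPROCEED is wired (via the aux FPGA) to the FIFO's
 * not-half-full indicator. */
bool wait_proceed(const struct fifo_model *f) {
    return f->fill < FIFO_DEPTH / 2;
}

/* A wait point is safe if, from any not-half-full starting point,
 * free-running for a full WAIT_INTERVAL of data-generating cycles
 * still cannot overflow the FIFO. */
bool wait_interval_is_safe(void) {
    return (FIFO_DEPTH / 2 - 1) + WAIT_INTERVAL <= FIFO_DEPTH;
}
```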
  • the next command is the jump command.
  • This allows non-sequential traversal through the VRAM memory. This is employed so that the VRAM memory can be modified concurrently with the readout operation and also to make it easier to add and remove variable size control sequences.
  • To see why this is useful, consider the following example: imagine that one wants to change the data in VRAM locations 512-1023 while continuing operation of the scanning using the other locations. If the host were simply to modify locations 512-1023, there is no guarantee that they would not be used exactly when they are in the middle of being modified. Thus the data would be in an indeterminate state, which can lead to an erroneous sequence.
  • the last command is the end command. This is used at the end of the sequence for a frame to tell the system controller that the frame has completed. The controller then stops fetching instructions until it is restarted (from location 0) by the host if it is in single-frame mode. If it is in continuous mode, it starts immediately on the next frame (after the 128 cycles required for the implied jump to location 0).
  • FIG. 5A is a functional block diagram of the architecture of the system controller of FIG. 3A.
  • the system controller 218 has four basic parts: a readout controller 282 , a host controller 284 , the refresh controller 286 , and the Arbiter 288 .
  • the first three support the three basic operations on the VRAM: reading out data, writing in of data at host's request, and refreshing the DRAM core.
  • the arbiter 288 is responsible for merging the requests of the first three sections into one connection to the VRAM's DRAM core. Only one of the first three sections can have control at a given time, so they explicitly request control and wait until this request is acknowledged by the arbiter 288 . They also must tell the arbiter 288 when they are still using the DRAM so that the arbiter knows not to grant it to one of the other sections. This is done via the INUSE lines.
  • the arbiter 288 sends the host controller 284 a RELREQ or relinquish request signal to ask the host controller 284 to give up ownership of the DRAM core because some other section wants it. Note that only the host controller 284 needs to be asked to relinquish the bus because the readout controller 282 and refresh controller 286 both only use the DRAM core for fixed short intervals. The host controller 284 , however, can hold on to the DRAM as long as there is data coming over the FireWire to be written into the DRAM, so it needs to be told when to temporarily stop transferring data.
  • the serial section of the VRAMs is not multiplexed; it is always controlled by the readout controller 282 .
  • the VRAM serial data also only goes to the readout controller 282 .
  • the readout controller 282 controls the sequencing of the data out of the VRAMs' serial access ports. This involves parsing the data headers to determine which locations should be read, clocking the VRAM Serial Clock at the correct time, driving the module control lines, and also arranging for the proper data from the VRAM's DRAM core to be transferred into the serial access memory.
  • the host controller 284 is the part of the VRAM Controller that interfaces to the host 5 via FireWire to allow the host to write into the VRAM.
  • the host sends asynchronous packets specifying which VRAM and which addresses to modify, as well as the new data to write.
  • the host controller 284 then asks the arbiter 288 for access to the VRAM.
  • the arbiter 288 grants control to the host controller 284 .
  • the host controller 284 then takes care of address and control signal generation. When the whole packet has been decoded, the host controller 284 releases its request line giving up the DRAM control, allowing the other two sections to use it.
  • the refresh controller 286 is responsible for periodically generating refresh cycles to keep the DRAM core of the VRAM from losing its data.
  • the refresh controller 286 has its own counter to keep track of when it needs to request a refresh. Once it gains access to the VRAMs via the arbiter 288 , it generates one refresh cycle for each of the VRAMs sequentially. This reduces the amount of spikes on the DRAM power supply lines as compared to refreshing all 5 VRAMs in parallel.
  • the REFRATE inputs control how many system clock cycles occur between refresh cycles (see Table 3). This compensates for different system clock rates. Additionally, refresh may be disabled for debugging purposes.

    TABLE 3: Refresh Rate Definitions
    RefRate1  RefRate0  System clock cycles      Minimum system clock to
                        between refresh cycles   achieve 16 μs refresh rate
    0         0         128                      8 MHz
    0         1         256                      16 MHz
    1         0         512                      32 MHz
    1         1         No Refresh               n/a
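Table 3's entries are consistent with simple arithmetic: to refresh at least once every 16 μs, the programmed cycle count divided by the refresh interval gives the minimum system clock. A quick illustrative check (the helper name is an assumption, not the patent's terminology):

```python
def min_clock_for_refresh(cycles_between, refresh_interval_s=16e-6):
    """Minimum system clock (Hz) at which `cycles_between` system-clock
    cycles still fit inside one 16 us refresh interval."""
    return cycles_between / refresh_interval_s

# 128 cycles -> 8 MHz, 256 -> 16 MHz, 512 -> 32 MHz, matching Table 3.
```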
  • the arbiter 288 controls access to the VRAM by the Readout, Host, and Refresh Controller sections 282 , 284 , 286 . Only one section may have access to the DRAM port of the VRAM at any given time. The arbiter 288 does not reassign control of the VRAM to another section until the section with control relinquishes it by de-asserting its IN_USE line. The sections are prioritized, with the readout controller 282 getting the highest priority and the host controller 284 getting the lowest priority. The reasoning is that if the readout controller 282 needs access to the VRAM but does not get it, the system may break down because the serial output data will be incorrect. The refresh controller 286 can tolerate occasional delay, although it should not happen much. Finally, the host controller 284 can potentially tolerate very long delays, because the host can be kept waiting without many consequences except that the writing of the VRAM may take longer.
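The priority scheme can be modeled in a few lines. The section names follow the text; the function shapes and names are assumptions made for illustration:

```python
PRIORITY = ("readout", "refresh", "host")  # highest to lowest

def arbitrate(requests, in_use):
    """Grant the DRAM port: the current owner keeps it until it
    de-asserts IN_USE; otherwise the highest-priority requester wins."""
    if in_use is not None:
        return in_use
    for section in PRIORITY:
        if section in requests:
            return section
    return None

def relinquish_request(requests, in_use):
    """RELREQ: asserted only against the host controller, since the
    readout and refresh sections use the core only for fixed short
    intervals and never need to be asked to give it up."""
    return in_use == "host" and bool(set(requests) - {"host"})
```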
  • the FireWire standard is also known as IEEE 1394.
  • the FireWire standard is used for multimedia equipment and allows 100-200 Mbps and preferably in the range of 400-800 Mbps operation over an inexpensive 6 wire cable. Power is also provided on two of the six wires so that the FireWire cable is the only necessary electrical connection to the probe head.
  • a power source such as a battery or an IEEE 1394 hub can be used.
  • the FireWire protocol provides both isochronous communication for transferring high-rate, low-latency video data as well as asynchronous, reliable communication that can be used for configuration and control of the peripherals as well as obtaining status information from them.
  • VRAM controller directly controls the ultrasound scan head
  • higher level control, initialization, and data processing and display comes from a general purpose host such as a desktop PC, laptop, or palmtop computer.
  • the display can include a touchscreen capability.
  • the host writes the VRAM data via the VRAM Controller. This is performed both at initialization as well as whenever any parameters change (such as number or positions of zones, or types of scan head) requiring a different scanning pattern.
  • the host need not write to the VRAM.
  • Because the VRAM controller also tracks where in the scan pattern it is, it can perform the packetization to mark frame boundaries in the data that goes back to the host.
  • the control of additional functions such as power-down modes and the querying of buttons or dials on the head can also be performed via the FireWire connection.
  • FireWire chipsets manage electrical and low-level protocol interface to the FireWire interface
  • the system controller has to manage the interface to the FireWire chipset as well as handling higher level FireWire protocol issues such as decoding asynchronous packets and keeping frames from spanning isochronous packet boundaries.
  • Asynchronous data transfer occurs at anytime and is asynchronous with respect to the image data.
  • Asynchronous data transfers take the form of a write or read request from one node to another.
  • the writes and reads are to a specific range of locations in the target node's address space.
  • the address space can be 48 bits.
  • the individual asynchronous packet lengths are limited to 1024 bytes for 200 Mbps operation.
  • Both reads and writes are supported by the system controller.
  • Asynchronous writes are used to allow the host to modify the VRAM data as well as a control word in the controller which can alter the operation mode.
  • Asynchronous reads are used to query a configuration ROM (in the system controller FPGA) and can also be used to query external registers or I/O such as a “pause” button.
  • the configuration ROMs contain a queryable “unique ID” which can be used to differentiate the probe heads as well as allow node-locking of certain software features based on a key.
  • Using isochronous transfers, a node reserves a specified amount of bandwidth and gets guaranteed low-overhead bursts of link access every 1/8000 second. All image data from the head to the host is sent via isochronous packets.
  • the FireWire protocol allows for some packet-level synchronization and additional synchronization is built into the system controller.
  • the asynchronous write request packets are sent from the host to the probehead in order to write VRAM data and control registers, as described below.
  • Both the “Asynchronous Write Request with Block Payload” and the “Asynchronous Write Request with Quadlet Payload” forms can be used. The latter simply restricts the payload to one quadlet (4 bytes).
  • the formats of the two packets are shown in Table 4 and Table 5. Note that these are how the packets are passed on by the TI LINK controller chip. The difference between this and the format over the wire is that the CRCs are stripped and the speed code (spd) and acknowledgment code (ackSent) are appended to the end.
  • the Adaptec API and device driver take care of assembling the packets.
  • the destinationID field holds the node ID of the destination, which is the probe head FireWire controller.
  • the physical layer chip can use this to determine if the packet is for it.
  • the system controller can ignore this field.
  • the tLabel field is used to match requests and responses. For write requests, this does not matter and can be ignored.
  • the rt is the retry code used at link and/or phy level. It is not used by the system controller.
  • the tCode field is the transaction code which determines what type of packet it is. In particular 0 is for quadlet write requests and 1 is for block write requests. The system controller parses this field to determine what type of packet it is. Currently only tCode values of 0 and 1 are recognized.
  • the priority field is used by the PHY chip only and is ignored by the system controller. It is used, for example, in selecting which unit on the interface is to receive a particular packet of data.
  • the destinationOffsetHi and destinationOffsetLo fields form the 48-bit destination start address. This indicates within the node what the data should be used for.
  • the system controller uses the destinationOffsetHi to determine the function as shown in Table 6. Note that only the 3 least significant bits of the destinationOffsetHi field are currently examined.
  • the spd field indicates the speed at which the data was sent, while the ackSent field is used to indicate status by saying how the LINK chip acknowledged the packet.

    TABLE 6: destinationOffsetHi values
    destinationOffsetHi   Meaning
    0                     Write VRAM 0
    1                     Write VRAM 1
    2                     Write VRAM 2
    3                     Write VRAM 3
    4                     Write VRAM 4
    5                     Write ISO Packet Length Register
    6                     Write System Controller Mode Word
    7                     Write to LINK chip
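The Table 6 decode can be sketched as a lookup on the 3 low bits of destinationOffsetHi. The function strings below are paraphrases of the table entries, and `decode_destination` is an invented name:

```python
# Paraphrased from Table 6; strings are descriptive labels,
# not the patent's register names.
DEST_FUNCTIONS = {
    0: "write VRAM 0",
    1: "write VRAM 1",
    2: "write VRAM 2",
    3: "write VRAM 3",
    4: "write VRAM 4",
    5: "write ISO packet length register",
    6: "write system controller mode word",
    7: "write to LINK chip",
}

def decode_destination(destination_offset_hi):
    """Only the 3 least significant bits are examined, per the text."""
    return DEST_FUNCTIONS[destination_offset_hi & 0x7]
```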
  • destinationOffsetHi values of 0-4 correspond to writing the VRAMs.
  • the destinationOffsetLow is set to the byte address to start writing. This is twice the standard VRAM address which is typically formed in 16-bit words.
  • the start address (destinationOffsetLow) and the length (dataLength) can both be multiples of 4 such that all operations are quadlet aligned.
  • the payload data is little endian and thus need not be converted if written by an Intel PC host.
  • the length (dataLength) must additionally be between 4 and 128 bytes due to the size of the GPLynx FIFO.
  • the total FIFO size is 200 bytes, but 72 bytes are dedicated to the asynchronous transmit FIFO required for read responses.
  • a destinationOffsetHi value of 5 signifies that the system controller ISO Packet Length register is to be written.
  • the ISO Packet Length has to be set in the controller to allow it to correctly format the ISO packets back to the host via FireWire.
  • An explicit counter in the system controller is used due to the fact that the TI GPLynx chip does not assert the end-of-packet indication until one word too late.
  • the ISO Packet length also has to be set in the LINK chip.
  • the value written is the number of 16-bit words in the ISO packet (i.e., bytes/2), and it is written in little-endian order because it is only interpreted by the system controller and not the LINK chip.
  • the BOF Word field is used to set the value that the system controller will put in the high byte of the first word of an isochronous packet to indicate the beginning of frame.
  • the BOF word field can be set to some value that is not likely to occur in typical data. This not crucial, however, because choosing a BOF word that occurs in the data will make it more likely to miss incorrect frame synchronization but will never cause false alarms where it thinks it is mis-synchronized but is really correctly synchronized.
  • the initial value upon reset is 80 hex.
  • the DataLoopback bit is used to control whether the data that is read back by the host comes from the A/D or from one of the VRAMs. (Currently this is VRAM 1 .) This second option can be used for test purposes to test the digital data generation and collection without testing the beamformer and A/D conversion.
  • a 0 in the DataLoopback bit indicates normal operation of reading from A/D while a 1 means that it should get data from the VRAM.
  • Extra1 and Extra2 bits are available for general use. They are latched by the system controller and currently brought out on pins called EXTRACLOCK0 and EXTRACLOCK1 but can be used for any purpose.
  • a destinationOffsetHi value of 7 indicates that the data in the asynchronous packet is to be written back to the FireWire LINK chip.
  • the destinationOffsetLow specifies the first register to write. Because the registers are all 4-bytes in size and must be written in their entirety, destinationOffsetLow and dataLength must both be multiples of 4. Multiple consecutive registers can be written with a single packet. Note that the data is big-endian because the TSB12LV31 is designed as big-endian. This byte-swapping must be performed by the Intel PC host.
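The byte swapping required when a little-endian Intel host writes the big-endian LINK-chip registers can be sketched as below; the helper name is hypothetical:

```python
import struct

def to_link_register(value):
    """Swap a 32-bit value between the little-endian layout of an Intel
    host and the big-endian layout the TSB12LV31 registers expect.
    (Illustrative helper; the swap is symmetric, so the same function
    converts in either direction.)"""
    return struct.unpack("<I", struct.pack(">I", value & 0xFFFFFFFF))[0]
```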
  • Read request packets are used to asynchronously read data from the probehead. This currently only consists of configuration ROM data (see below) but can be easily used for other types of data such as status information or button indications.
  • the Adaptec device drivers send Asynchronous Read Requests in response to explicit application requests as well as to interrogate the node's FireWire configuration ROM in response to a SendPAPICommand of P_GET_DEV_INFO or after a bus reset or when an application tries to obtain a handle to a node.
  • Asynchronous read requests can either be of the quadlet or block variety as with the asynchronous write requests.
  • the formats are shown in Table 9 and Table 10. They are similar to the write request formats.
  • the destinationOffsetHi and destinationOffsetLow determine what is being requested.
  • the high addresses are defined for use as Control and Status Registers and the configuration ROM, while the lower addresses are for more general-purpose use.
  • When the system controller receives a Quadlet or Block Read Request packet from the TI LINK chip's General Receive FIFO, it formulates a Quadlet or Block Read Response packet and places it in the LINK chip's Asynchronous Transmit FIFO.
  • the format of these packets (as placed in the Asynchronous Transmit FIFO) is shown in Table 11 and Table 12.
  • the spd, tLabel, rt, and priority values are copied from the request packet.
  • the destinationID is taken from the sourceID of the request packet. Note that all packet CRCs are generated by the TI LINK chip and are thus not included in the data that the system controller must generate. (The ROM CRCs do have to be computed explicitly off-line.)
  • the rCode field is used to indicate the status of the reply. In particular, 0 means resp_complete, indicating all is well. A value of 6 means resp_type_error, indicating that some field of the packet was invalid or unsupported. In this case, if the request was a block request, then the dataLength of the response packet must be 0 and no data should be included. A resp_type_error is returned if the dataLength or destinationOffsetLow of the request packet were not multiples of 4, or if the dataLength was not between 4 and 32 (for block packets).
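The validity rules above can be captured in a small checker. The function name is illustrative; the constant values (0 and 6) are taken from the text:

```python
RESP_COMPLETE = 0    # resp_complete: all is well
RESP_TYPE_ERROR = 6  # resp_type_error: invalid or unsupported field

def validate_block_read(data_length, destination_offset_low):
    """Return the rCode for a block read request per the rules above:
    both fields must be quadlet-aligned (multiples of 4) and the
    length must be between 4 and 32 bytes."""
    if data_length % 4 or destination_offset_low % 4:
        return RESP_TYPE_ERROR
    if not 4 <= data_length <= 32:
        return RESP_TYPE_ERROR
    return RESP_COMPLETE
```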
  • the TI chip's asynchronous transmit FIFO is configured to be 12 quadlets (for 8 payload quadlets+4 quadlet header) so that the receive FIFO can be 36 quadlets in order to allow 128 byte payload write packets.
  • the longest request the Adaptec device drivers should request is 8 quadlets because that is the length of the configuration ROM. In any case, it is assumed that if a long transfer failed, the driver falls back to a smaller request.
  • the FireWire specification expects each FireWire node to have a configuration ROM that contains various details about the device, its requirements, and its capabilities. This ROM is to be queried via Read Request packets.
  • the general ROM has many other fields, and many which are optional ranging from the ASCII name of the vendor and device to its power consumption and how to access its capabilities.
  • a node unique ID consists of the 24-bit vendor ID and a 40-bit chip ID.
  • the 40-bit chip-ID is up to the vendor to assign such that all nodes have unique values.
  • the node unique ID's are required to keep a consistent handle on the device if the FireWire bus is reset or reconfigured during operation.
  • the application reads its configuration ROM and determines if it wants to work with it. If so it records its node unique ID and opens a connection to the device via that node unique ID. This is then at any given time mapped to its FireWire ID (16-bit) by the host adapter and its device driver.
  • the adapter automatically determines the new FireWire ID and continues. Thus for smooth operation, particularly with multiple heads attached to the system, implementing node unique IDs and the configuration ROM is required.
  • the configuration ROM is divided into several sections. The sections of particular interest are the first word, which defines the length and CRC of the ROM; the next 4 words comprising the Bus_Info_Block, which gives some fixed 1394-specific information (such as the Node Unique ID); and the last 3 words representing the Root Directory, which is a set of key-value tagged entries. Only the two required key-value pairs are included in the ROM built into the FPGA. An 8-word ROM that can be used is shown in Table 13.
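The ROM CRC that must be computed off-line is, for 1394 configuration ROMs, the CRC-16 defined by IEEE 1212. A sketch of that algorithm in its standard quadlet-at-a-time formulation (presented as an illustration, not as the patent's tooling):

```python
def rom_crc16(quadlets):
    """CRC-16 over configuration-ROM quadlets, following the IEEE 1212
    algorithm used for 1394 configuration ROMs. This is computed
    off-line, since the LINK chip only generates packet CRCs."""
    crc = 0
    for q in quadlets:
        # Process each quadlet a nibble at a time, high nibble first.
        for shift in range(28, -1, -4):
            s = ((crc >> 12) ^ (q >> shift)) & 0xF
            crc = ((crc << 4) ^ (s << 12) ^ (s << 5) ^ s) & 0xFFFF
    return crc
```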
  • Isochronous packets are used for the probehead-to-host communication of beamformed data. This is conceptually a stream of 16-bit numbers punctuated by frame markers.
  • the frame markers are important to keep in sync with where in the frame the data corresponds. While some ultrasound systems use elaborate frame and line markers embedded in the data, the integrated system can use a single auxiliary bit, which is not sent as part of the data, to mark frame boundaries. Line boundaries can be derived by knowing the VRAM sequencing program.
  • isochronous packets can be used as a low-overhead way to send a guaranteed rate of data.
  • When a peripheral reserves a specified amount of bandwidth, it gets guaranteed bursts of link access every 1/8000 second. All data from the head to the host is sent via isochronous packets. Because isochronous packets are limited to one burst every 1/8000 second, the data sent in each such interval constitutes a frame of data.
  • the FireWire specification describes the use of synchronization bits which can be used to tag each isochronous packet with a 4 bit SYNC code.
  • the Adaptec FireWire-to-PCI bridge can then use the Sync field to assure proper frame alignment.
  • the TI GPLynx Controller chip only supports frame-level granularity (not packet-level) of when to send packets, so when the System Controller tells the FireWire link chip it has data, it must be prepared to send a whole frame of data. Because the FIFO is much smaller than a frame, a safe option is to reduce the effective FireWire frame size to one packet. A specific Beginning of Frame (BOF) code is then placed in the high byte of the first word of every ultrasound frame, the start of ultrasound frames is forced to occur at the beginning of FireWire frames (and packets), and frame-level synchronization is done in the ultrasound application software. For efficiency, a full ultrasound frame of data can still be read in one FireWire call (and hence one interrupt).
  • BOF stands for Beginning of Frame.
  • the first step is to reserve isochronous bandwidth.
  • This reservation causes a central record of the request (in the FireWire isochronous cycle manager node) to be kept to assure that the total bandwidth allocated does not exceed the total bandwidth of the link.
  • this reservation is achieved using the Adaptec API BusConfig 0 command with Cmd field set to P_ALLOCATE_RESOURCE.
  • a requested payload in bytes is passed in. This can be the amount of data desired in every 1/8000 second. Setting this value too high simply wastes reserved bandwidth on the FireWire interface, which is not a problem if there is only one device. Setting this value too low may constrain the head-to-host data rate.
  • the resource allocation call will return both an isochronous channel number as well as the payload size granted. This payload size granted may be less than that requested if part of the link has already been reserved.
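The relation between the reserved per-cycle payload and the sustained head-to-host data rate is simple arithmetic, since there are 8000 isochronous cycles per second. These helper names are illustrative:

```python
CYCLES_PER_SECOND = 8000  # one isochronous cycle every 1/8000 second

def rate_for_payload(payload_bytes):
    """Sustained data rate (bytes/s) for a given per-cycle payload."""
    return payload_bytes * CYCLES_PER_SECOND

def payload_for_rate(bytes_per_second):
    """Per-cycle payload (bytes) needed to sustain a target rate."""
    return bytes_per_second / CYCLES_PER_SECOND
```

For instance, a roughly 3000-byte payload per cycle sustains about 24 MB/s, which is why setting the requested payload too low can constrain the head-to-host rate.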
  • the next step is to set the system controller ISO packet length word to tell how long of an ISO packet to expect.
  • the final step is to initialize the probehead LINK chip. This is done via the write-back-to-LINK-chip asynchronous packets described above. In particular, initializing registers 54h, 58h, and 5Ch is necessary. The probehead can then be told to start sequencing and the data will flow back.
  • the isochronous bandwidth reservation can take place once but at any given time, only one probe's isochronous transmission (as well as its sequencing) is enabled.
  • isochronous data transfers are used to deliver the probe head data to the host. Maintaining frame synchronization is necessary.
  • the FireWire will support sub-frame packetization of about 3000 bytes but it is up to the system controller to implement frame synchronization on top of this. Synchronization is achieved via two methods:
  • the high byte of the first word in the first packet of a frame is set to the Beginning of Frame (BOF) code. (This can be set in the system controller Mode word).
  • An example packetization is shown in Table 14. This depicts 4 packets of 4 words (8 bytes) apiece showing one complete ultrasound frame and the first packet of the next frame.
  • the ultrasound frame size is 10 words.
  • the Hi byte of the first word is set to the BOF code. This can be examined to assure that proper synchronization has been maintained.
  • the data is then split into the three packets 1 - 3 . Because the frame ends in the middle of packet 3 , the end of packet 3 is padded with the BOF code in the high word. Importantly, this means that the first word of the fourth packet will be the first word of the second frame even though the ultrasound frame size is not a multiple of the packet size.
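The padding scheme in this example can be modeled as follows. The BOF value of 80 hex matches the reset default mentioned earlier, and `packetize_frame` is an invented name for an illustrative model, not the FPGA implementation:

```python
BOF = 0x80  # default BOF code after reset (80 hex)

def packetize_frame(frame_words, packet_words):
    """Split one ultrasound frame of 16-bit words into fixed-size
    isochronous packets: the high byte of the frame's first word
    carries the BOF code, and the final packet is padded with
    BOF-tagged words so every frame starts on a packet boundary."""
    words = list(frame_words)
    # Mark the beginning of the frame in the high byte of word 0.
    words[0] = (BOF << 8) | (words[0] & 0xFF)
    # Pad to a multiple of the packet size with BOF-marked filler.
    while len(words) % packet_words:
        words.append(BOF << 8)
    return [words[i:i + packet_words]
            for i in range(0, len(words), packet_words)]
```

Run on the 10-word frame of the Table 14 example with 4-word packets, this yields three packets, the last two words padded with the BOF code, so the next frame begins cleanly in packet four.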
  • the TSB12LV31 (or 32) performs packetization of the isochronous data but informs the system controller of packet boundaries via the ISORST signal. The system controller then uses this to reset its internal word-to-byte multiplexer as well as its packetization circuitry. If it receives a frame marker from the FIFO, it stops clocking data out of the FIFO until it receives an ISORST pulse.
  • the module interface defines how the various modules in the system are controlled by the VRAM controller. There are two types of modules, those that receive data from the four VRAMs which are shared (two on each analog board), and those that receive data from the VRAM on the digital board, (via the VRAM controller) which is dedicated. The two types of modules use different control signals to synchronize their operation.
  • FIG. 5B shows typical timing for the different module interfacing modes for a typical program sequence.
  • VRAMDATA, the data from the loopback VRAM, controls the execution.
  • the diagonal shaded boxes denote header data used by the VRAM controller while the shaded boxes denote module data in FIG. 5B.
  • the data in the four other VRAMs go to the modules.
  • the data from the first VRAM is looped back into the system controller and then used for dedicated data supply for things like the TGC, feedback control, etc.
  • modules accepting data at the full rate must additionally make sure that they do not latch the data more than T hold after the rising clock. This is because the same clock is used to retrieve the next words from the VRAM. Thus in general modules should make sure to delay the data inputs at least as much as they delay the clock inputs to effectively clock at or before the rising edge of their module clock. This second constraint does not exist when 1/2, 1/4, or 1/8 rate data is used.
  • the MODULEFASTCLOCK0 signal follows the MODULECLOCK0 line. They will only differ when 1/2, 1/4, or 1/8 rate data is used.
  • Clocks 7 - 15 show a run of length 2 at rate 1/4 destined for Module 2 .
  • new data will be clocked out of the VRAMs only once every 4 th master clock.
  • MODULEFASTCLOCK2 will exhibit different behavior than MODULECLOCK2.
  • the NEWRUNCLOCK at clock 7 signals that a new run is beginning on the next clock cycle.
  • the VRAM controller has latched the header data indicating that the next run is for module 2 at a rate of 1/4.
  • the VRAM generates the module data that the module will use.
  • a MODCLOCK2 occurs, telling module 2 to latch in and use the VRAM's data. Note that the data will be present until the master clock before the next MODCLOCK2.
  • MODCLOCK2 is only clocked once per new data word
  • MODULEFASTCLOCK2 is clocked once per master clock for the duration of the run. This is useful for modules, such as the beamformer which may only need data at a lower rate but need to perform computation at the full rate.
  • the MODNEWDATA signal can also be used by modules using the MODFASTCLOCK lines to determine on which of the fast clocks new data has been presented.
  • Clocks 16 - 18 show the result of a pause command.
  • the NEWRUNCLOCK is sequenced as usual but no MODCLOCK or MODFASTCLOCK is generated.
  • the T/R circuit and the preamplifier/TGC circuit are fabricated in a single integrated circuit and are placed on one board with a CDP beamformer that is formed as a second integrated circuit.
  • the beamformer control circuits can include the calculation of weighted inputs with processor 670 .
  • the memory for this system is either a SDRAM or VRAM located on the second board along with the system controller and the digital communication control circuit.
  • the standard FireWire cable 40 includes a plurality of FireWire signal lines 42 and a FireWire power line 44 .
  • the FireWire power line 44 is fed to an inline DC-DC converter 300 .
  • the DC-DC converter 300 generates the necessary voltages and provides them over a plurality of power lines 46 . These new power lines 46 are repackaged with the FireWire signal lines 42 in a custom cable 40 ′.
  • the FireWire signal lines 42 are connected to the FireWire chipset 220 and the custom power lines 46 are connected to a power distributor 48 , which filters and distributes the various voltages over respective internal voltage lines 148 A, 148 B, 248 .
  • the power distributor 48 may perform additional DC-DC conversions, as described in more detail below.
  • the transmit/receive control chip is needed to interface with the transducer array.
  • the chip can provide delays to the high-voltage driving pulses applied to each of the selected transducer elements such that the transmitted pulses will be coherently summed on the image plane at the required transmit focus point.
  • In receive mode, it provides connection of the reflected sound waves received by a selected element to its corresponding amplifier.
  • the functions of a multi-channel transmit/receive chip can be separated into two parts: a core function which provides low-voltage transmit/receive control, and a buffer function which level-shifts the low-voltage transmit/receive control into high voltage and directly interfaces with the transducer array.
  • the core function of the transmit/receive chip includes: a global counter which broadcasts a master clock and bit values to each channel processor; a global memory which controls transmit frequency, pulse number, pulse sequence, and transmit/receive select; a local comparator which provides delay selection for each channel (for example, with a 60 MHz clock and a 10-bit global counter, it can provide each channel with up to 17 μs of delay); a local frequency counter which provides programmable transmit frequency; a local pulse counter which provides different pulse sequences (for example, a 7-bit counter can provide programmable transmitted pulse lengths from one pulse up to 128 pulses); and a locally programmable phase selector which provides sub-clock delay resolution (for example, with a 60 MHz master clock, a two-to-one phase selector provides 8 ns delay resolution).
  • programmable sub-clock delay resolution allows the delay resolution to be more precise than the clock period.
  • the output of the frequency counter is gated with a phase of the clock that is programmable on a per-channel basis.
  • a two-phase clock is used, and the output of the frequency counter is gated with either the asserted or the deasserted clock.
  • multiple skewed clocks can be used. One per channel can be selected and used to gate the coarse timing signal from the frequency counter.
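The delay arithmetic implied by the examples above (a 10-bit coarse counter plus phase selection) can be checked numerically. The function names are assumptions; the figures match the text's 60 MHz / 17 μs / 8 ns examples:

```python
def delay_resolution_ns(master_clock_hz, phases):
    """Finest delay step when the frequency counter's output is gated
    with one of `phases` evenly skewed clock phases."""
    return 1e9 / master_clock_hz / phases

def channel_delay_ns(counter_ticks, phase_index, master_clock_hz, phases):
    """Total per-channel delay: coarse counter ticks at the master
    clock period, plus the selected sub-clock phase step."""
    period_ns = 1e9 / master_clock_hz
    return counter_ticks * period_ns + phase_index * (period_ns / phases)

# A 60 MHz clock with a two-to-one phase selector gives ~8.3 ns steps
# (quoted as 8 ns in the text), and a full 10-bit counter (1023 ticks)
# gives ~17 us of coarse delay, matching the text's figures.
```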
  • a semiconductor process that can support both high-voltage and low-voltage operations is ideally matched for a single-chip solution to the transmit/receive chip described above.
  • the core function of the transmit/receive chip can be implemented on low-voltage transistors to reduce power consumption.
  • the level-shifting function can be implemented on high-voltage transistors to provide the necessary driving pulses to the transducer array.
  • only selected semiconductor processes can make the integration of both high-voltage (buffer 292 ) and low-voltage transistors ( 294 ) on one chip 290 possible.
  • the high/low voltage process has so far been offered only with 0.8-to-1 μm design rules. With these design rules, a 64-channel transmit/receive chip can easily be integrated on a single chip in less than 1 cm² of chip area.
  • a multi-chip module 295 can be used to implement a transmit/receive chip.
  • a deep-sub-micron process can be used to implement the core function 296 of the module, and a separate process can be used to implement the buffer 298 function.
  • the multi-chip set can be mounted in a single package to realize the transmit/receive control function.
  • a 128-channel transmit/receive controller can easily be integrated on one package.
  • FIG. 3D illustrates an alternate embodiment in which the transducer array 10 ′ is located in a separate probe housing 410 connected to the interface housing 404 by a cable 412 .
  • a probe housing in which certain circuit elements such as the transmit/receive circuitry and/or the preamp/TGC circuitry is included with the transducer array while the beamformer, system control and memory circuits remain in the interface.
  • the system in FIG. 3D provides for the use of standard probes and a beamformer interface that weighs less than 10 lbs and which can be connected to a standard personal computer.
  • the interface 404 has a volume of less than 1500 cm 3 and a weight that is preferably less than 5 lbs.
  • FIG. 6 shows a block diagram of another particular embodiment of an ultrasonic imaging system adapted for external application integration.
  • the transducer array housing 32 and associated circuitry are connected to a system controller 500 via an ultrasound (US) interface 502 .
  • the system controller 500 is connected to a host user computing device 5 such as a PC via a standard interface 40 which is a predetermined communication link, such as an IEEE 1394 interface, also known as FireWire.
  • the US data, therefore, is transmitted to the user computing device 5 via the standard interface 40 , eliminating the need for specialized components in the user computing device 5 .
  • the user computing device 5 therefore provides an ultrasonic application server which may be integrated with an external application, as will be described further below.
  • the ultrasonic application server running on the user computing device 5 , therefore, receives the US data and makes it available to be invoked by an external application for further processing.
  • the external application may be either local, and therefore running on the user computing device 5 , or remote, accessing the ultrasonic application server remotely.
  • FIG. 7A shows an integrated interface program operable for use with a local external application.
  • the ultrasonic server application 504 is running on the user computing device 5 .
  • a local external application 506 is also running on the user computing device 5 , and transmits to and from the ultrasonic server application 504 via an integrated interface program 508 .
  • the integrated interface program 508 contains a series of predetermined entry points 510 a . . . 510 n corresponding to operations which the ultrasonic application server 504 may perform on behalf of the local external application 506 .
  • the local external application 506 sends a command, which includes an instruction and optional parameters as defined by the predetermined entry points 510 .
  • the local external application 506 transmits the command to the ultrasonic server application 504 by invoking the entry point 510 n in the integrated interface program which corresponds to the intended operation.
  • the entry point may be invoked by procedure or function call via a stack call, message transmission, object passing, or other suitable interprocess communication mechanism.
  • Windows® messages may be used.
  • the command is received by the ultrasonic server application 504 via the desired entry point 510 n from the integrated interface program 508 , and is processed.
  • the ultrasonic server application 504 executes a result corresponding to the desired function, and transmits the result back to the external application 506 via the integrated interface program 508 , typically by similar interprocess communication mechanisms employed in transmitting the corresponding command.
  • the operations performed by the ultrasonic application server may include the following, as referenced in Table 15:

TABLE 15
OPERATION           DESCRIPTION
Freeze Image        Freeze the active ultrasound data image; used to capture still frames
Resume Live         Obtain a realtime ultrasound image
Export Frame        Export a frame of ultrasound image data in a format determined by the parameters
Application Status  Return a status code of a previous operation
Initialize          Initialize the Ultrasonic Application Server to begin receiving commands from an external application
Exit Application    Disconnect the external application from the Ultrasonic Application Server
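The entry-point mechanism described above can be sketched as a dispatch table that routes each command (instruction plus optional parameters) to the corresponding server operation. All class and function names below (`UltrasonicServer`, `invoke`) are illustrative assumptions, not identifiers from the patent.

```python
# Sketch of the integrated interface program's entry points (510a..510n)
# dispatching a subset of the Table 15 operations. Hypothetical names.

class UltrasonicServer:
    """Minimal stand-in for the ultrasonic application server."""

    def __init__(self):
        self.frozen = False
        self.last_status = "OK"

    def freeze_image(self):
        self.frozen = True          # capture a still frame
        return "frozen"

    def resume_live(self):
        self.frozen = False         # return to realtime imaging
        return "live"

    def application_status(self):
        return self.last_status     # status code of the previous operation


def invoke(server, command):
    """Entry point: route an {instruction, params} command to a handler."""
    instruction = command["instruction"]
    params = command.get("params", {})
    entry_points = {
        "Freeze Image": server.freeze_image,
        "Resume Live": server.resume_live,
        "Application Status": server.application_status,
    }
    return entry_points[instruction](**params)


server = UltrasonicServer()
result = invoke(server, {"instruction": "Freeze Image"})
```

A local external application would call `invoke` (or the equivalent procedure call or message) and receive the result back through the same mechanism.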
  • the result received by the local external application 506 may be employed and analyzed by any functions provided by the local external application 506 .
  • the local external application 506 may be extended and modified to provide desired functions without modifying the ultrasonic application server 504 or the integrated interface program 508 . Adding entry points 510 n for other operations provided by the ultrasonic server application 504 may require modification of only the integrated interface program 508 . Further, multiple external applications may access the integrated interface program 508 by computing the proper instructions and parameters of the commands as defined by the integrated interface program 508 .
  • the external application is operable to process two-dimensional and three-dimensional radiation therapy data, fetal image data, cardiac image data, and image guided surgery data.
  • Such applications are employed in the medical field by operators such as surgeons to provide visual feedback about medical information.
  • fetal image data is used to view a fetus in utero.
  • the multidimensional data is processed to provide a visual image
  • conditions such as birth defects, treatable ailments, gender, size, and others can be determined.
  • radiation therapy data may be employed to simultaneously display information about the direction and intensity of radiation treatment, and a visual image of the treatment area.
  • Such visual image data may also be employed in image guided surgery, to indicate the location of a surgical instrument. Such information is particularly useful in contexts such as brain surgery, where it may not be possible to expose the afflicted area.
  • FIG. 7B shows an integrated interface program 508 operable for use with a remote external application.
  • a remote external application 512 is running on a remote computing device 514 such as a PC, and is connected to the user computing device 5 through a communication link 518 over a public access network 517 such as the Internet.
  • the integrated interface program 508 includes connection points 516 a . . . 516 n such as remote procedure call (RPC) points or other inter-node communication mechanism.
  • the connection points are sockets in accordance with the TCP/IP protocol.
  • the remote external application 512 is operable to compute a command corresponding to an intended operation in the ultrasonic application server 504 .
  • the connection points 516 n are generally operable to receive a command transmitted from the remote external application 512 .
  • the ultrasonic application server 504 computes a result corresponding to the command, and transmits the result back to the remote external application 512 via the integrated interface program 508 by an inter-node communication mechanism such as that used to transmit the command.
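A connection point of the kind described (a TCP/IP socket carrying commands and results) might look like the following loopback sketch. The JSON wire format, the handler names, and the use of `socket.socketpair` for demonstration are assumptions; the patent specifies only sockets in accordance with the TCP/IP protocol.

```python
import json
import socket

def serve_one_command(conn, handlers):
    """Connection point 516n: read one JSON command, send back a result."""
    command = json.loads(conn.recv(4096).decode())
    result = handlers[command["instruction"]](**command.get("params", {}))
    conn.sendall(json.dumps({"result": result}).encode())

# Loopback demonstration using a connected socket pair in one process.
client, server_sock = socket.socketpair()
client.sendall(json.dumps(
    {"instruction": "Export Frame", "params": {"fmt": "raw"}}).encode())

# Hypothetical handler standing in for an ultrasonic-server operation.
handlers = {"Export Frame": lambda fmt: "frame exported as " + fmt}
serve_one_command(server_sock, handlers)

reply = json.loads(client.recv(4096).decode())
```

A real remote external application would open an ordinary TCP connection to the user computing device instead of a socket pair; the command/result round trip is the same.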
  • the same integrated interface program could have both entry points 510 n, generally to be accessed by the local external application 506 , and connection points 516 n, generally accessible by the remote external application 512 .
  • FIG. 8 shows a flowchart of external application integration.
  • an external application determines a desired US operation to be employed in processing and/or analysis, as depicted at step 550 .
  • the operation may provide data, and may cause a certain result or state change, or a combination.
  • the external application determines the instruction corresponding to this operation, as shown at step 552 , as defined by the integrated interface program.
  • the external application determines if any parameters are required for the operation, as disclosed at step 554 . If parameters are required, the external application determines the parameters, as depicted at step 556 . If no parameters are required, execution continues.
  • the external application determines a command including the instruction and any required parameters, corresponding to the desired US operation, as shown at step 558 .
  • the command is transmitted to the ultrasonic application server via the integrated interface program, as disclosed at step 560 .
  • the transmission may be by any suitable method, such as those described above and others, depending on whether the external application is local or remote.
  • Ultrasonic data is received by the ultrasonic server application 504 via the standard communication interface 40 indicative of ultrasonic image information, as depicted at step 562 .
  • the ultrasonic data is received via a test probe disposed in contact with the subject, or patient, for viewing such visual information as radiation therapy data, fetal image data, cardiac image data, and image guided surgery data.
  • the ultrasonic application server 504 computes a result corresponding to the command from the ultrasonic data, as disclosed at step 564 .
  • step 564 may involve control signals being generated to define or re-define a region of interest in which radiation is to be directed for treatment.
  • the ultrasonic application server 504 then transmits the computed result to the external application via the integrated interface program 508 , as shown at step 566 . Note that many successive commands and results are expected to be computed, with the ultrasonic data concurrently sent in an iterative manner over the standard communication interface 40 .
  • the integrated application program includes both entry points for local external applications, and connection points for remote external applications.
  • the instructions and parameters corresponding to the entry points are known to the local external application, and the instruction and parameters corresponding to the connection points are known to the remote external application.
  • a semaphore or reentrancy mechanism is employed in the ultrasonic application server to avoid deadlock or simultaneous attempts to invoke the same operation.
  • Both the local and remote external applications invoke the ultrasound application server via the integrated interface program 508 (FIGS. 7A and 7B).
  • the ultrasonic application server also includes a graphical user interface for manipulating operations without accessing the external application.
  • a control bar 578 of a top level GUI screen is shown.
  • the control bar allows manipulation of tools affecting image settings of the display via image control presets.
  • the image settings are controlled for each of three sizes: small 570 a, medium 570 b, and large 570 c.
  • the image settings within that size may be controlled, including depth 572 , focus 574 , and time gain compensation 576 .
  • Each of these settings may be saved under a user defined name for later recall.
  • Each of the three sets of image settings corresponding to the size settings 570 a, 570 b, and 570 c is then stored corresponding to the file name, and may be recalled by the user at a later time.
  • the programs defining the operations and methods defined herein are deliverable to a user computing device and a remote computing device in many forms, including but not limited to a) information permanently stored on non-writeable storage media such as ROM devices, b) information alterably stored on writeable storage media such as floppy disks, magnetic tapes, CDs, RAM devices, and other magnetic and optical media, or c) information conveyed to a computer through communication media, for example using baseband signaling or broadband signaling techniques, as in an electronic network such as the Internet or telephone modem lines.
  • the operations and methods may be implemented in a software executable by a processor or as a set of instructions embedded in a carrier wave. Alternatively, the operations and methods may be embodied in whole or in part using hardware components, such as Application Specific Integrated Circuits (ASICs), state machines, controllers or other hardware components or devices, or a combination of hardware, software, and firmware components.
  • FIG. 10 illustrates a preferred embodiment of a portable ultrasound system 470 in accordance with the invention.
  • a personal computer 472 such as a laptop, a hand-held computer or a desktop workstation can provide power and a standard interface (e.g. IEEE 1394 or USB) to a housing 474 along cable 476 .
  • Housing 474 includes a DC-DC converter to deliver power along cable 480 to interface housing ( 482 , 490 ).
  • This interface housing has two or three circuit boards 484 , 486 , 488 as described previously.
  • a standard transducer housing 496 with transducer array 498 is connected to the interface housing along cable 494 and connector 492 .
  • the beamformer integrated circuit mounted on circuit board 486 requires steering data, the transmit circuitry requires data to provide proper transmit focus and the TGC must have gain level information for a given depth.
  • FIG. 11 illustrates a wearable ultrasound imaging system that can include a belt mounted computer 360 or interface connected by cable 362 to hand-held probe 364 , and a second hand-held unit 368 that can include various controls, including a mouse control and buttons to freeze the displayed image or to store a particular image in electronic memory.
  • the unit 368 can be connected by wireless (RF or infrared) connection or by cable 366 to housing 360 .
  • the computer 360 can be connected to a desktop, laptop or hand-held display or can be connected by cable to a headmounted display system 370 that includes a microphone, a pair of speakers for audio and a high resolution display positioned adjacent the user's eye.
  • another preferred embodiment is illustrated in FIG. 12, in which a laptop computer 450 , having a flat panel display and a standard keyboard, has been programmed to perform scan conversion, Doppler processing, etc. on a beamformed representation of the region of interest that has been transmitted from interface housing 454 along a standard communications link such as cable 458 that conforms to the IEEE 1394 FireWire standard or the USB 2.0 standard, for example.
  • the computer 450 and/or the interface can optionally include a control panel 452 , 456 , that can be used to control the study being conducted.
  • a preferred embodiment of the interface housing 454 is controlled solely by the personal computer 450 and provides for the use of standard transducer array probes that can be interchangeably attached to the interface housing 454 with a cable.
  • an additional remote controller 464 can be used to control system operation.
  • the interface 454 can house the circuit boards on which the beamformer, memory, system controller and digital communication circuits are mounted.
  • the interface 454 is connected to the hand-held probe 460 with a cable that is preferably between two feet and six feet in length, however longer lengths can be used.
  • the transmit/receive and/or the preamplifier/TGC circuits can be in the probe housing 460 or in the interface housing 454 .
  • the computer can also be configured for gigabit Ethernet operation and for transmitting video and image data over networks to remote systems at clinics or hospitals.
  • the video data can also be sent to a VCR or standard video recorder or video camera with an IEEE 1394 port for recording on videotape.
  • the VCR or video camera can be controlled using the computer.
  • the host 5 can be a desktop, laptop, palmtop or other portable computer executing software instructions to display ultrasound images.
  • Doppler ultrasound data can be used to display an estimate of blood velocity in the body in real time.
  • the color-flow imaging modality interrogates a specific region of the body, and displays a real-time image of mean velocity distribution.
  • the CFI images are usually shown on top of the dynamic B-mode image.
  • different colors indicate velocity toward and away from the transducer.
  • color flow images display the mean or standard deviation of the velocity of reflectors (i.e., blood cells) in a given region
  • power Doppler displays a measurement of the amount of moving reflectors in the area, similar to a B-mode image's display of the total amount of reflectivity.
  • a PD image is an energy image in which the energy of the flow signal is displayed.
  • the spectral Doppler or spectral sonogram modality utilizes a pulsed-wave system to interrogate a single range gate and displays the velocity distribution as a function of time.
  • This sonogram can be combined with a B-mode image to yield a duplex image.
  • the top side of the display shows a B-mode image of the region under investigation, and the bottom shows the sonogram.
  • the sonogram can also be combined with the CFI image to yield a triplex image.
  • the time for data acquisition is divided between acquiring all three sets of data. Consequently, the frame rate of the complex image is generally decreased, compared to either CFI or duplex imaging.
  • color-flow imaging combines, in a single modality, the capabilities of ultrasound to image tissue and to investigate blood flow.
  • Color Doppler (CD) images consist of Doppler information that can be color-encoded and superimposed on a B-mode gray-scale image.
  • Color-flow imaging is a mean velocity estimator.
  • one approach uses fast Fourier transforms (FFTs), as in a pulsed Doppler system.
  • the other approach uses a one-dimensional auto correlation.
  • P(ω) is the power-spectral density of the received, demodulated signal.
  • the mean velocity estimator can be reduced to an estimation of the autocorrelation and the derivative of the autocorrelation.
  • the estimator given by the preceding expression can be calculated when data from two returned lines are used
  • f_prf is the pulse repetition frequency
  • N_c is the number of lines used in the autocorrelation estimator. In practice, more than two lines are used to improve the signal-to-noise ratio. Data from several RF lines are needed in order to get useful velocity estimates by the autocorrelation technique. Typically, between 8 and 16 lines are acquired for the same image direction. The lines are divided into range gates throughout the image depths and the velocity is estimated along the lines.
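The autocorrelation technique above can be sketched as a lag-one (Kasai-style) estimator: the phase of R(1), summed over the N_c lines for one range gate, gives the mean Doppler shift, from which the mean velocity follows. The numeric parameters below are illustrative, not from the patent.

```python
import cmath

def mean_velocity(samples, f_prf, f0, c=1540.0):
    """Autocorrelation mean-velocity estimate for one range gate.

    samples: complex baseband values, one per firing (N_c lines).
    The phase of R(1) = sum conj(z[n]) * z[n+1] gives the mean Doppler
    shift f_d; the velocity is v = c * f_d / (2 * f0).
    """
    r1 = sum(z.conjugate() * zn for z, zn in zip(samples, samples[1:]))
    f_d = f_prf * cmath.phase(r1) / (2 * cmath.pi)
    return c * f_d / (2 * f0)

# Synthetic data: scatterers moving at 0.10 m/s toward the transducer,
# 5 MHz center frequency, 5 kHz pulse repetition frequency, 16 lines.
f0, f_prf, v_true = 5e6, 5e3, 0.10
f_d_true = 2 * v_true * f0 / 1540.0              # expected Doppler shift
lines = [cmath.exp(2j * cmath.pi * f_d_true * n / f_prf) for n in range(16)]
v_est = mean_velocity(lines, f_prf, f0)
```

With noise-free synthetic data the estimate matches the true velocity; with real RF data the averaging over 8 to 16 lines reduces the estimator variance, as the text notes.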
  • the CFI pulses are interspersed between the B-mode image pulses.
  • CFI pulses it is known that a longer duration pulse train gives an estimator with a lower variance, however, good spatial resolution necessitates a short pulse train. Consequently, a separate pulse train must be used for the B-mode image, because the CFI pulse train is too long for high-resolution, gray-scale images.
  • for CFI, the velocity estimator is given by Eq. (5). This can be computed by serial processing, since the arrival of samples for a new line results in the addition of the new data to an already calculated sum. Four multiplications, three additions, and a subtraction are performed for each range gate and each new line. Stationary echo cancellation is also performed for each new sample. A filter with N_e coefficients necessitates 2 N_e multiplications and additions per gate and line.
  • N_ops = (2 N_e + 2) M f_0   (7)
  • N_ops ≈ (2 N_e + 2) M f_0 · (N_c − N_B) / N_c   (8)
  • N_c is the number of CFI lines per estimate
  • N_B is the number of B-mode image lines interspersed between CFI lines
  • the factor (N_c − N_B) / N_c denotes the effective fraction of time spent on acquiring useful data
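As a rough illustration, the operation counts of Eqs. (7) and (8) can be evaluated for hypothetical values. The interpretation of M as the number of range gates per line and f_0 as the line (firing) rate is an assumption for this sketch, as are all the numbers used.

```python
def cfi_ops_per_second(n_e, m, f0, n_c=None, n_b=0):
    """Operations/s for echo cancellation plus velocity estimation.

    Implements Eq. (7): (2*N_e + 2) * M * f_0. When n_c is given,
    the (N_c - N_B)/N_c factor of Eq. (8) discounts the time spent
    on B-mode lines interspersed between the CFI lines.
    """
    ops = (2 * n_e + 2) * m * f0
    if n_c:
        ops *= (n_c - n_b) / n_c
    return ops

# Hypothetical example: 4-tap echo canceller, 250 range gates,
# 5 kHz line rate, 8 CFI lines per estimate, 2 B-mode lines interspersed.
full_rate = cfi_ops_per_second(4, 250, 5e3)                  # Eq. (7)
effective = cfi_ops_per_second(4, 250, 5e3, n_c=8, n_b=2)    # Eq. (8)
```

For these assumed values the full rate is 12.5 Mops/s, reduced to 9.375 Mops/s of useful CFI computation by the interleaved B-mode acquisition, which is comfortably within the reach of the general-purpose processors the text discusses.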
  • Power Doppler (PD) imaging provides an alternative method of displaying the blood stream in the insonified regions of interest. While CF imaging displays the mean or standard deviation of the velocity of reflectors (e.g., blood cells) in a given region, PD displays a measurement of the density of moving reflectors in the area, similar to the B-mode image's display of reflectivity.
  • Power Doppler is akin to a B-mode image with stationary reflectivity suppressed. This is particularly useful for viewing moving particles with small cross-sectional scattering, such as red blood cells.
  • Power Doppler displays the integrated Doppler power instead of the mean frequency shift as used for color Doppler imaging.
  • the integrated power in the frequency domain is the same as the integrated power in the time domain and hence the power Doppler can be computed from either the time-domain or the frequency-domain data.
  • the undesired signals from the surrounding tissue such as the vessel walls, should be removed via filtering. This calculation is also referred to as a Wall filter.
  • the PD can be computed in software running on a microprocessor; similar to the computation of the CFI processing described above.
  • Parallel computation units, such as those in the Intel Pentium™ and Pentium II's MMX coprocessors, allow rapid computation of the required functions.
  • a digital signal processor (DSP) can also be used to perform this task.
  • the frequency content of the Doppler signal is related to the velocity distribution of the blood. It is common to devise a system for estimating blood movement at a fixed depth in tissue. A transmitter emits an ultrasound pulse that propagates into and interacts with tissue and blood. The backscattered signal is received by the same transducer and amplified. For a multiple-pulse system, one sample is acquired for each line or pulse emitted. A display of the distribution of velocities can be made by Fourier transforming the received signal and showing the result. This display is also called a sonogram. Often a B-mode image is presented along with the sonogram in a duplex system, and the area of investigation, or range gate, is shown as an overlay on the B-mode image. The placement and size of the range gate is determined by the user. In turn, this selects the epoch for data processing. The range gate length determines the area of investigation and sets the length of the emitted pulse.
  • the calculated spectral density is displayed on a screen with frequency on the y-axis and time on the x-axis.
  • the intensity of a pixel on the screen indicates the magnitude of the spectrum; thus, it is proportional to the number of blood scatterers moving at a particular velocity.
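A minimal sonogram computation consistent with this description (an assumed sketch, not the patent's code) transforms successive blocks of range-gate samples and displays the magnitude spectrum of each block as one time column.

```python
import cmath

def dft(block):
    """Naive DFT magnitude; one spectral column of the sonogram."""
    n = len(block)
    return [abs(sum(block[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

def sonogram(samples, block_len):
    """Split the range-gate sample stream into blocks, one spectrum each.

    Columns correspond to time; rows to Doppler frequency bins, whose
    magnitudes are proportional to the number of scatterers at each
    velocity.
    """
    return [dft(samples[i:i + block_len])
            for i in range(0, len(samples) - block_len + 1, block_len)]

# Synthetic range-gate samples with a Doppler shift at bin 3 of a
# 16-point DFT; two blocks yield two time columns.
n = 16
samples = [cmath.exp(2j * cmath.pi * 3 * t / n) for t in range(2 * n)]
columns = sonogram(samples, n)
peak_bin = max(range(n), key=lambda k: columns[0][k])
```

A production system would use an FFT (as the text suggests for the MMX or DSP implementations) rather than this O(n²) DFT, but the displayed quantity is the same.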
  • the range gate length and position are selected by the user. Through this selection, both emitted pulse and pulse repetition frequency are determined.
  • the size of the range gate is determined by the length of the pulse.
  • l_g is the gate length and M is the number of periods in the emitted pulse.
  • the gate duration determines how rapidly pulse echo lines can be acquired. This is referred to as the pulse-repetition frequency, f_prf ≤ c / (2 d_0)   (14)
  • d 0 is the distance to the gate.
  • a 4 period, 7 MHz pulse is used for probing a blood vessel lying at a depth of 3 cm with a 10 ms observation time.
  • the gate length is computed as l_g = M c / (2 f_0) = (4 × 1540 m/s) / (2 × 7 MHz) ≈ 0.44 mm   (15)
  • the pulse-repetition frequency is f_prf ≤ c / (2 d_0) ≈ 25 kHz   (16)
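The worked example can be checked numerically. The speed of sound c ≈ 1540 m/s in soft tissue is assumed, which gives a maximum PRF of about 25.7 kHz for a 3 cm depth, consistent with the ≈ 25 kHz quoted above.

```python
C = 1540.0  # assumed speed of sound in soft tissue, m/s

def gate_length(m_periods, f0):
    """Spatial range-gate length for an M-period pulse: l_g = M*c/(2*f0)."""
    return m_periods * C / (2 * f0)

def max_prf(depth):
    """Upper bound on the pulse-repetition frequency: f_prf <= c/(2*d0)."""
    return C / (2 * depth)

l_g = gate_length(4, 7e6)    # 4-period, 7 MHz pulse -> ~0.44 mm
f_prf = max_prf(0.03)        # vessel at 3 cm depth  -> ~25.7 kHz
```

The bound in `max_prf` simply reflects the round-trip travel time: a new pulse cannot be fired before the echo from depth d_0 has returned.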
  • the sonograph computation can be carried out in software running on a microprocessor (similar to the computation of the CFI processing described above).
  • Parallel computation units, such as those inside the Intel Pentium™ and Pentium II's MMX coprocessors, allow rapid computation of the required FFT functions. All three velocity estimation systems can be implemented in software on current microprocessors, such as the Intel Pentium, or digital signal processors (DSP).
  • Stabilized microbubbles are used for ultrasound contrast imaging because of their unique acoustic properties compared to biological tissues. They present superior backscattering and nonlinear behavior, and fragility upon exposure to ultrasound. A number of ultrasound imaging modalities have been created to exploit these features.
  • fundamental B-Mode imaging the transmitting and receiving frequencies are the same.
  • the echogenicity of blood is significantly increased with the administration of a contrast material.
  • Gas microbubbles scatter sound much more intensely than an equivalent size liquid or solid particle owing to the acoustic impedance mismatch (particularly the difference in compressibility) between the gas and the surrounding tissue or blood. This effect will be observed in Doppler and M-Mode imaging techniques as well.
  • One disadvantage of using fundamental B-Mode for contrast imaging is that the level of the echoes created by the bubbles is similar to the level of the echoes resulting from the biological tissues.
  • a technique using the second harmonic relies on the fact that bubbles generate harmonics of the transmitted frequency at a level much higher than the harmonics generated by the tissues. By creating images from the signal received at twice the transmitted frequency, high image contrast is achieved between regions with and without bubbles.
  • a problem with this imaging modality is that a short pulse (typically used in B-mode imaging) has a broad bandwidth and the transmitting and receiving frequencies overlap, contaminating the harmonic image with the fundamental frequency. To alleviate this problem, the pulse length is increased to achieve a narrow bandwidth, however, at the expense of decreasing the axial resolution of the image.
  • the pulse inversion method (also called wideband harmonic imaging or dual pulse imaging), solves the problem of overlapping frequencies observed with the second harmonic technique.
  • Each scan line is formed by summing the signals received from two ultrasound pulses, where the second pulse is inverted and slightly delayed relative to the first. This procedure cancels the response of all linear scatterers (if there is no tissue movement between the two pulses) while enhancing the effects of nonlinear scatterers. Because there is a delay between the two pulses, any bubble displacement adds an additional signal, resulting in velocity-dependent enhancement.
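Pulse inversion can be sketched with a toy scatterer model: a response with a linear term (tissue) and a quadratic term (bubbles). Summing the echoes from a pulse and its inverted copy cancels the linear term exactly and doubles the even-harmonic term. The coefficients below are illustrative only.

```python
import math

def echo(pressure, a=1.0, b=0.2):
    """Toy scatterer: linear term a*p (tissue) plus nonlinear b*p**2
    (microbubble contribution)."""
    return [a * p + b * p * p for p in pressure]

n = 64
pulse = [math.sin(2 * math.pi * t / 16) for t in range(n)]
inverted = [-p for p in pulse]

# Sum of the echoes from the normal and inverted transmissions:
# (a*p + b*p^2) + (-a*p + b*p^2) = 2*b*p^2, i.e. only the nonlinear part.
summed = [e1 + e2 for e1, e2 in zip(echo(pulse), echo(inverted))]
residual_linear = max(abs(s - 2 * 0.2 * p * p) for s, p in zip(summed, pulse))
```

In this noise-free model the summed signal contains no trace of the linear (fundamental) response, which is why regions with bubbles stand out so strongly against tissue.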
  • the stimulated acoustic emission method typically involves color Doppler with the transmitting power set high to ensure bubble disruption with the first pulse.
  • a broadband acoustic signal is generated. Since ultrasound Doppler systems compare the backscattered signal with respect to a “clean” reference signal, this loss of frequency correlation caused by the bubble collapse is interpreted by the machine as a random Doppler shift, resulting in a mosaic of colors at the location of the microbubbles.
  • a preferred embodiment of the invention employs a spatial filter in providing a power doppler image, for example.
  • This spatial or high pass filter can also be used effectively with a contrast agent to further differentiate between blood flow and the surrounding vessel or artery.
  • First the power is computed and a two pulse canceller is employed.
  • the ratio of the power of the signal before and after the filter provides a data set yielding clear images of moving fluid within the body.
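The two-pulse canceller and power ratio just described might be sketched as follows; the stationary-tissue and moving-blood signals are synthetic, and the decision threshold is left to the reader.

```python
import cmath

def wall_filtered_power_ratio(samples):
    """Power after a two-pulse canceller y[n] = z[n] - z[n-1],
    divided by the power before filtering.

    The ratio is near 0 for stationary echoes (vessel wall) and
    large for moving scatterers (blood), so it differentiates flow
    from the surrounding vessel.
    """
    before = sum(abs(z) ** 2 for z in samples)
    after = sum(abs(zn - z) ** 2 for z, zn in zip(samples, samples[1:]))
    return after / before

wall = [1.0 + 0j] * 16                                           # stationary
blood = [cmath.exp(2j * cmath.pi * 0.2 * n) for n in range(16)]  # moving
ratio_wall = wall_filtered_power_ratio(wall)
ratio_blood = wall_filtered_power_ratio(blood)
```

With a contrast agent boosting the blood echo, the separation between the two ratios grows further, which is the effect the text describes.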
  • FIG. 13 shows the top-level screen of a graphical user interface (GUI) for controlling the ultrasonic imaging system.
  • a selection bar 702 allows the operator to select the active focus areas of the screen.
  • An image area 704 displays the ultrasonic image of the subject area.
  • a patient information area 706 displays information about the subject from whom ultrasonic data is being gathered.
  • a Time Gain Compensation area 708 provides feedback about time gain compensation, described further below.
  • a control bar 710 allows qualitative and quantitative selection of ultrasonic imaging operations, as will be described further below with respect to FIGS. 15A and 15B.
  • FIG. 14 shows the unitary, directional keypad which provides a single operating position from which to control the ultrasonic imaging operations.
  • an up arrow key 712 and a down arrow key 714 allow a user to scroll through the qualitative ultrasonic imaging operations of the system, as will be described further below.
  • a left arrow key 716 and a right arrow key 718 allow a user to select quantitative parameters corresponding to the ultrasonic imaging operation selected. As described above, the quantitative parameters may be in a range of discrete values, or may span a continuum.
  • a control key 720 employed in conjunction with the up arrow key 712 or down arrow key 714 allows an operator to toggle between the two control tabs depicted in FIGS. 15A and 15B.
  • FIGS. 15A and 15B show qualitative and quantitative selection of ultrasonic imaging operations via invoking the unitary directional keypad of FIG. 14.
  • the scanning operations direct active acquisition of real-time, dynamic ultrasonic image data, and are typically applied as the hand-held probe is manipulated over the subject imaging area.
  • a size operation 722 sets a series of predetermined defaults for other ultrasonic imaging operations.
  • a small, medium, or large subject may be selected via the left and right arrow keys 716 , 718 (FIG. 14).
  • a depth operation 724 allows selection of a depth parameter via the arrow keys 716 , 718 . Focus is controlled by a focus 726 operation.
  • Gain 728 control adjusts the TGC for all TGC settings 730 a - 730 h.
  • TGC operations 730 a - 730 h adjust amplification of return signals at varying depths, ranging from the least depth 730 a to the greatest depth 730 h, via the arrow keys 716 , 718 .
  • ultrasonic imaging operations applicable to processing are shown.
  • the processing operations may be applied to static real-time or frozen images.
  • An inversion operation is controlled by the inversion 732 selection, and rotates the image via the arrow keys 716 , 718 (FIG. 14).
  • Palette, smoothing, persistence, and mapping operations 734 , 736 , 738 and 740 , respectively, are selected via the up and down arrow keys 712 , 714 , and parameters selected via the arrow keys 716 , 718 (FIG. 14).
  • Brightness and contrast scales are selected via sliders 742 and 744 , respectively, and are changed using arrow keys 716 , 718 .
  • FIG. 16 shows a state diagram depicting transition between the ultrasonic imaging operations depicted in FIGS. 15A and 15B.
  • the Tab 1 746 operations are selected via the up and down arrow keys 712 , 714 and transition according to the following state sequence: size 600 , depth 602 , focus 604 , Gain 606 and TGC settings 608 , 610 , 612 , 614 , 616 , 618 , 620 and 622 .
  • the Tab 2 operations are selected according to the following sequence: invert 624 , palette 626 , smoothing 628 , persistence 630 , map 632 , brightness 634 , and contrast 636 . As indicated above, selection of operations may be toggled between Tab 1 746 and Tab 2 748 using control key 720 and arrow keys 712 , 714 .
  • FIG. 15A The scanning operations shown in FIG. 15A are displayed on Tab 1 746 , as shown in FIG. 13.
  • the processing operations shown in FIG. 15B are displayed and selected on Tab 2 , as shown in FIG. 13.
  • control is toggled between Tab 1 746 and Tab 2 748 using a combination of the control key 720 and either the up or down arrow keys 712 , 714 , as shown by dotted lines 638 a and 638 b.
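The tab and operation navigation of FIG. 16 amounts to a small state machine: up/down scrolls through the operations of the active tab, and control plus up/down toggles between the two tabs. The key names and operation labels below are assumptions for illustration.

```python
# Hypothetical operation lists mirroring Tab 1 (scanning) and
# Tab 2 (processing) of FIGS. 15A/15B and 16.
TAB1 = ["size", "depth", "focus", "gain"] + ["tgc%d" % i for i in range(8)]
TAB2 = ["invert", "palette", "smoothing", "persistence",
        "map", "brightness", "contrast"]

class Keypad:
    """Unitary directional keypad: up/down scroll through operations,
    ctrl + up/down toggles between Tab 1 and Tab 2."""

    def __init__(self):
        self.tabs = {1: TAB1, 2: TAB2}
        self.tab, self.index = 1, 0

    def press(self, key, ctrl=False):
        if ctrl and key in ("up", "down"):
            self.tab = 2 if self.tab == 1 else 1   # toggle tabs
            self.index = 0
        elif key == "down":
            self.index = (self.index + 1) % len(self.tabs[self.tab])
        elif key == "up":
            self.index = (self.index - 1) % len(self.tabs[self.tab])
        return self.tabs[self.tab][self.index]

pad = Keypad()
```

Left/right key handling (quantitative parameter adjustment for the selected operation) would hang off the returned operation name in the same way.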
  • Another embodiment of the invention involves providing the user with an intuitive and simple way to use the interface, and with the ability to quickly and automatically set imaging parameters based on a software module. This enables general medical personnel with limited ultrasound experience to obtain diagnostic-quality images without having to adjust the controls.
  • the “Quick Look” feature provides the user with a very simple mechanism of image optimization. It allows the user to simply adjust the image so as to obtain appropriate diagnostic image quality with one push of one button.
  • the procedure involves the use of predefined histograms. Separate histograms are provided for different anatomical structures that are to be examined. The user chooses a structure, similar to the existing method of choosing a preset. Once the structure is chosen, the user places the transducer on the area of interest in the scanning window. At that time, pressing the selected control button triggers the system to adjust the system contrast and brightness control values so that a histogram of the gray levels in the image closely matches the corresponding pre-defined histogram for that structure. The result is an image of diagnostic image quality that is easily recreated.
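One simple way to realize the histogram-matching adjustment is to scale contrast and shift brightness so that the image's gray-level mean and spread match those of the predefined histogram for the chosen structure. This moment-matching shortcut, and the "liver preset" numbers, are assumptions for illustration, not the patent's exact procedure.

```python
def quick_look(pixels, target_mean, target_std):
    """Adjust contrast (scale) and brightness (offset) so the gray
    levels match the target histogram's mean and standard deviation,
    clamping to the displayable 0..255 range."""
    n = len(pixels)
    mean = sum(pixels) / n
    std = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5
    contrast = target_std / std
    brightness = target_mean - contrast * mean
    return [max(0.0, min(255.0, contrast * p + brightness)) for p in pixels]

# A dim, low-contrast frame mapped toward a hypothetical preset.
frame = [40.0, 50.0, 60.0, 70.0, 80.0]
adjusted = quick_look(frame, target_mean=128.0, target_std=40.0)
```

A fuller implementation would match the entire cumulative histogram rather than its first two moments, but this captures the one-button intent: the operator selects a structure and the system brings the image to a repeatable, diagnostic-quality appearance.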
  • the integrated probe system 24 includes the front end probe 3 , the host computer 5 , and a portable information device such as a personal digital assistant (PDA) 9 .
  • the PDA 9 , such as a Palm Pilot device or other hand-held computing device, serves as a remote display and/or recording device 9 .
  • the front end probe 3 is connected to the host computer 5 by the communication link 40 that is a wired link.
  • the host computer 5, a computing device, is connected to the PDA 9 by a communication link or interface 46 that is a wireless link.
  • the integrated ultrasound probe system 24 in the embodiment described has a Windows®-based host computer 5
  • the system can leverage the extensive selection of software available for the Windows® operating system.
  • One potentially useful application is electronically connecting ultrasound systems, allowing physicians to send and receive messages, diagnostic images, instructions and reports, or even remotely control the front-end probe 3 using the system.
  • connections through the communication links or interfaces 40 and 746 can be either wired through an Ethernet or wireless through a wireless communication link such as, but not limited to, IEEE 802.11a, IEEE 802.11b, Hyperlink or HomeRF.
  • FIG. 17A shows a wired link for the communication link 40 and a wireless link for the communication link 746 .
  • Alternative embodiments and protocols for wired links are described above with respect to FIG. 1. It is recognized that other wired embodiments or protocols can be used.
  • the wireless communication link 746 can use various protocols, such as an RF link, which may be implemented using all or parts of a specialized protocol stack, such as the IEEE 1394 protocol stack or the Bluetooth system protocol stack.
  • IEEE 1394 is a preferred interface for high bandwidth applications such as high quality digital video editing of ultrasonic imaging data.
  • the Bluetooth protocol uses a combination of circuit and packet switching. Slots can be reserved for synchronous packets.
  • Bluetooth can support an asynchronous data channel, up to three simultaneous synchronous channels, or a channel which simultaneously supports asynchronous data and synchronous voice. Each synchronous channel supports a 64 kb/s synchronous (voice) channel in each direction.
  • the asynchronous channel can support at most 723.2 kb/s asymmetric, or 433.9 kb/s symmetric.
  • the Bluetooth system consists of a radio unit, a link control unit, and a support unit for link management and host terminal interface functions.
  • the link controller carries out the baseband protocols and other low-level link routines.
  • the Bluetooth system provides a point-to-point connection (only two Bluetooth units involved), or a point-to-multipoint connection.
  • the channel is shared among several Bluetooth units. Two or more units sharing the same channel form a piconet.
  • One Bluetooth unit acts as the master of the piconet, whereas the other units act as slaves. Up to seven slaves can be active in a piconet.
  • the Bluetooth link controller has two major states: STANDBY and CONNECTION. In addition, there are seven substates: page, page scan, inquiry, inquiry scan, master response, slave response, and inquiry response.
  • the substates are interim states that are used to add new slaves to a piconet.
  • the link may also be implemented using, but not limited to, Home RF, or the IEEE 802.11 wireless LAN specification.
  • For the IEEE 802.11 Wireless LAN specification, see the Institute of Electrical and Electronics Engineers (IEEE) standard for wireless LANs, incorporated herein by reference. IEEE standards can be found on the World Wide Web at the Uniform Resource Locator (URL) www.ieee.org.
  • IEEE standard 802.11b provides a communications link between two personal computers at rates of 2 to 11 Mbps. The frequency band allocated for transmission and reception of the signals is approximately 2.4 GHz.
  • IEEE standard 802.11a provides 54 Mbps communications. The frequency allocation for this standard is around 5 GHz.
  • PC Cards and access points are available that use a proprietary data-doubling chipset technology to achieve 108 Mbps communications.
  • the chip that provides the data doubling (the AR5000) is manufactured by Atheros Communications.
  • the actual data rate maintained between two computers is related to the physical distance between the transmitter and receiver.
  • the wireless link 746 can also take on other forms, such as an infrared communications link as defined by the Infrared Data Association (IrDA).
  • the host computer 5 and the remote display and/or recording device 9 each has the desired communication port.
  • FIG. 17B shows the communication link 40 between the probe 3 and the host computer 5 as a wireless link.
  • the communication link 746 between the host computer 5 and the PDA 9 is shown as a wired link.
  • the integrated probe system 24 of FIG. 17C has wireless links for both the communication link 40 between the probe 3 and the host computer 5 and the communication link 746 between the host computer 5 and the PDA 9 . It is recognized that wired and wireless links can both be used together or in the alternative, can be exclusively wired links or wireless links in a system 24 .
  • the remote display and/or recording device 9 of the integrated probe system 24 of FIG. 18 is a remote computing system 26 .
  • the remote computing system 26 in addition to having remote display and/or recording capability can also remotely control the probe 3 .
  • the communication link 746 is shown as a wireless link.
  • the communication link 40 between the probe 3 and the host computer 5 is shown as a wired link.
  • An example of a remote control system includes using a wearable computer (such as the one manufactured by Xybernaut Corporation), a pair of high-speed, wireless PC Cards (such as those provided by Proxim) and the ultrasound program and the probe 3 .
  • a portable-networked ultrasound system can be configured weighing less than 2.5 pounds.
  • Using a program similar to Microsoft® NetMeeting, a real-time connection between a remote PC and the wearable computer can be established.
  • the remote host can monitor all interactions with the wearable computer, including real-time ultrasound imaging (at display rates up to approximately 4 frames per second). NetMeeting can also be used to “take control” of the wearable computer and manage the ultrasound session from the remote personal computer in real time.
  • images and iterative executable software instructions that are archived to the hard disk on the wearable computer can be transferred at 108 Mbps to the host computer.
  • real-time ultrasound diagnoses can be performed and relayed to a remote site at speeds that rival a hardwired 100 million bits per second (Mbps) local area network (LAN).
  • FIG. 19 illustrates an integrated probe system 800 that has a hub 748 for connecting a plurality of remote devices 9 to the host computer 5 .
  • the communication links 750 from the hub 748 to the remote devices are shown as both wireless and wired links. It is recognized that a completely wired network such as a LAN or Ethernet can be used.
  • With a wireless transceiver and port in each of the computers (remote devices) 9, a wireless network/communication system can readily be established. With the recent advent of high-speed wireless standards, such as IEEE 802.11a, the communications between the remote and local machines can rival that of a wired, 100 Mbps local area network (LAN).
  • Another alternative is using a Bluetooth system to form a piconet.
  • IEEE 1394 can use a wireless solution for the transmission of 1394 protocols over IEEE 802.11, the emerging standard for wireless data transmission in the corporate environment and increasingly in the home as well.
  • IEEE 1394 is implemented as a Protocol Adaptation Layer (PAL) on top of the 802.11 radio hardware and Ethernet protocols, bringing together a convergence of these important technologies.
  • This protocol adaptation layer enables the PC to function as a wireless 1394 device.
  • the engineering goal is for real delivered IEEE 1394 bandwidth sufficient for the transmission of a single high-definition MPEG2 video stream (or multiple standard-definition MPEG2 video streams) from one room in a facility to another.
  • Preferred embodiments of the present invention include the use of wireless transmission of IEEE 1394 at 2.4 GHz using Wi-LAN's Wideband Orthogonal Frequency Division Multiplexing (W-OFDM) technology.
  • the Wireless IEEE 1394 system includes an MPEG-2 data stream generator, which feeds a multiple transport stream into a Set Top Box (STB) such as provided by Philips Semiconductors.
  • the STB converts this signal to an IEEE 1394 data stream and applies it to the W-OFDM radio system such as provided by Wi-LAN™.
  • the radio transmitter then sends the IEEE 1394 data stream over the air to the corresponding W-OFDM receiver in the host computer, for example.
  • the IEEE 1394 signal is demodulated and sent to two STBs, which display the content of the different MPEG-2 data streams on two separate TV monitors.
  • W-OFDM technology is inherently immune to the effects of multipath. Like all modulation schemes, OFDM encodes data inside a radio frequency (RF) signal. Radio communications are often obstructed by noise and by stray and reflected signals. By sending high-speed signals concurrently on different frequencies, OFDM technology offers robust communications. OFDM-enabled systems are highly tolerant of noise and multipath, making wide-area and in-home multi-point coverage possible. Additionally, as these systems are very efficient in their use of bandwidth, many more high-speed channels are possible within a frequency band. W-OFDM is a cost-effective variation of OFDM that allows much larger throughputs than conventional OFDM by using a broad frequency band. W-OFDM further processes the signal to maximize the range. These improvements to conventional OFDM result in dramatically increased transmission speeds.
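The multi-carrier principle behind OFDM can be illustrated with a short sketch: data bits are mapped to QPSK symbols on parallel subcarriers, an inverse FFT produces the time-domain signal, and a cyclic prefix absorbs multipath echoes. This is a generic textbook sketch, not Wi-LAN's W-OFDM implementation; the subcarrier count, prefix length, and QPSK mapping are arbitrary assumptions.

```python
import numpy as np

def ofdm_symbol(bits, n_subcarriers=64, cp_len=16):
    """Build one OFDM symbol: map bits to QPSK subcarriers,
    IFFT to the time domain, then prepend a cyclic prefix."""
    assert len(bits) == 2 * n_subcarriers  # 2 bits per QPSK subcarrier
    b = np.asarray(bits, dtype=float).reshape(-1, 2)
    symbols = ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)
    time = np.fft.ifft(symbols)                    # parallel subcarriers -> time signal
    return np.concatenate([time[-cp_len:], time])  # cyclic prefix against multipath

def ofdm_demod(symbol, n_subcarriers=64, cp_len=16):
    """Strip the cyclic prefix, FFT back to subcarriers, slice bits."""
    freq = np.fft.fft(symbol[cp_len:])
    bits = np.empty((n_subcarriers, 2))
    bits[:, 0] = (freq.real < 0).astype(float)
    bits[:, 1] = (freq.imag < 0).astype(float)
    return bits.reshape(-1).astype(int)
```

Because each subcarrier is narrowband, a reflected copy of the signal corrupts only the prefix rather than smearing symbols into each other, which is the robustness property the passage above describes.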
  • OFDM technology is becoming increasingly more visible as American and European standardization committees are choosing it as the only technology capable of providing reliable wireless high data rate connections.
  • European terrestrial digital video broadcasting uses OFDM and the IEEE 802.11 working group recently selected OFDM in its proposed 6 to 54 Mbps wireless LAN standard.
  • the European Telecommunications Standards Institute is considering W-OFDM for the ETSI BRAN standard.
  • Wi-LAN™ can be found on the Web at http://www.wi-lan.com/. Philips Semiconductors is a division of Royal Philips Electronics, headquartered in Eindhoven, The Netherlands. Additional information on Philips Semiconductors can be obtained by accessing its home page at http://www.semiconductors.philips.com/.
  • NEC Corporation's wireless transmission technology, based on the IEEE 1394 high-speed serial bus and capable of 400 megabits per second (Mbps) at transmission ranges of up to 7 meters through interior walls and up to 12 meters by line-of-sight, may also be used in preferred embodiments.
  • This embodiment uses 60 GHz millimeter-wavelength transmissions, which do not require any kind of license, with the amplitude shift keying (ASK) modulation scheme and a low-cost transceiver.
  • This embodiment incorporates an echo detection function in NEC's PD72880 400 Mbps long-distance transmission physical layer device, to prevent the influence of signal reflections, a significant obstacle to stable operation of IEEE 1394 over a wireless connection.
  • Wireless IEEE 1394 can play an important role in bridging the PC to clusters of interconnected IEEE 1394 devices, which can be in another room in the facility.
  • Three example applications are sourcing a video or audio stream from a PC, providing Internet content and connectivity to an IEEE 1394 cluster, and providing command, control and configuration capabilities to the cluster.
  • the PC may provide data to someone in another room in a facility.
  • the PC may provide an avenue for 1394 enabled devices to access the Internet.
  • the PC plays the role of orchestrating activities in the 1394 clusters and routing data within the clusters and over bridges—though the actual data does not flow through the PC.
  • FIG. 20 is a diagram showing the provision of wireless access to the images created by a preferred embodiment ultrasound imaging system and the associated architecture.
  • the imaging system 906 exports patient information and images to files in corresponding folders.
  • Executable software instructions have all functionality required to implement the ultrasonic imaging methods described hereinbefore.
  • the wireless agent 910 serves to detect patient directories and image files and opens a port for wireless clients to connect to. Upon establishing a connection, it sends back to the client a list of patients and corresponding images.
  • the wireless agent 910 may include data interface circuitry, which may include a first port such as an RF interface port.
  • the wireless viewer 912, residing on the handheld, can establish a connection to the wireless agent 910 and retrieve patient and image information. Upon user selection of the patient and image, it initiates file transmission from the wireless agent. Upon receiving an image, the viewer 912 displays the image along with patient information. The image is stored on the handheld for future use. The handheld user can view images retrieved in previous sessions or can request new image transmission.
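The agent/viewer exchange can be sketched with a simple message framing scheme: a length prefix followed by a JSON payload carrying the patient and image lists. The framing, message types, and field names here are illustrative assumptions, not the protocol actually used by the wireless agent 910.

```python
import json
import struct

def encode_message(obj):
    """Frame a message for the agent<->viewer link:
    4-byte big-endian length prefix, then a JSON payload."""
    payload = json.dumps(obj).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_message(data):
    """Inverse of encode_message; returns (obj, remaining_bytes)."""
    (length,) = struct.unpack(">I", data[:4])
    obj = json.loads(data[4:4 + length].decode("utf-8"))
    return obj, data[4 + length:]

def patient_list_reply(patient_dirs):
    """The agent's reply on a new connection: patients and their images."""
    return encode_message({"type": "patient_list",
                           "patients": [{"name": p, "images": imgs}
                                        for p, imgs in patient_dirs.items()]})
```

The length prefix lets the viewer read exactly one message at a time from a stream socket before requesting an image transfer.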
  • FIG. 24 is a block diagram illustrating a portable information device such as a personal digital assistant (PDA) or any computing device according to an exemplary embodiment of the present invention.
  • the link interface or data interface circuitry 1020 illustrates, but is not limited to, one link interface for establishing a wireless link to another device.
  • the wireless link is preferably an RF link, defined by IEEE 1394 communications specifications.
  • the wireless link can take on other forms, such as the infrared communications link as defined by the Infrared Data Association (IrDA).
  • the PDA includes a processor 1050 that is capable of executing an RF stack 1150 that communicates with a data interface circuitry 1020 through bus 1110 .
  • the processor 1050 is also connected through bus 1110 to user interface circuitry 1040 , data storage 1090 and memory 1100 .
  • the data interface circuitry 1020 includes a port such as the RF interface port.
  • the RF link interface may include a first connection 1022 which includes radio-frequency (RF) circuitry 1024 for converting signals into radio-frequency output and for accepting radio-frequency input.
  • the RF circuitry 1024 can send and receive RF data communications via a transceiver that establishes communication port 1026 .
  • RF communication signals received by the RF circuitry 1024 are converted into electrical signals and relayed to the RF stack 1150 in processor 1050 via bus 1110 .
  • the radio interface 1024 , 1026 and the link between the laptop PC (host computer) and the PDA may be implemented by, without limitation, IEEE 1394 specifications.
  • the PC host computer has a RF stack and circuitry to be able to communicate to the remotely located image viewer.
  • the remote image viewer may be used to monitor and/or control the ultrasonic imaging operations not just display the resultant imaging data.
  • the handheld market offers various handheld devices as well. For imaging purposes it is very important to have a high-quality screen and enough processing power to display an image. Considering these factors, in a preferred embodiment a Compaq iPAQ is used, in particular a Compaq iPAQ 3870. A wireless PC card compatible with the handheld is used, such as Compaq's Wireless PC Card WL110 and a corresponding Wireless Access Point.
  • FIG. 21 illustrates the image viewer 920 in communication with the personal computer in a preferred embodiment or the probe in an alternate embodiment.
  • the image viewer has user interface buttons 922 , 924 , 926 , 928 that allow the user to interface with the ultrasonic imaging system computer or probe in accordance with preferred embodiments of the present invention.
  • a communicating interface such as button 922 allows the user to initiate a connection with the ultrasonic imaging application.
  • button 924 is used to terminate an established connection with the ultrasonic imaging application.
  • a button 926 functions as a selection button that is used to provide a list of patients and corresponding selectable images. These images are stored either locally or remotely. If selected, an image stored remotely is transmitted to the viewer. The selected image is displayed on the viewer 930.
  • a button such as button 928 functions as an options button which may, but is not limited to, allow changing configuration parameters such as an Internet protocol (IP) address.
  • FIG. 22 is a diagram illustrating a preferred embodiment ultrasound image collection and distribution system including four major software components.
  • the main hardware element of the system is ultrasound probe 942 a . . . n.
  • the probe in communication with the laptop computer 944 a . . . n allows generation of the ultrasound images and related patient information and submits images and information to an image/patient information distribution server 946 .
  • the distribution server utilizes an SQL database server 948 to store and retrieve images and related patient information.
  • the SQL server provides distributed database management. Multiple workstations can manipulate data stored on the server, and the server coordinates operations and performs resource-intensive calculations.
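The server-side storage can be sketched with a small relational schema. Here sqlite3 stands in for the SQL database server 948, and the table and column names are illustrative assumptions rather than the system's actual schema.

```python
import sqlite3

def open_image_db(path=":memory:"):
    """Create patient and image tables (sqlite3 stands in for the
    distribution system's SQL database server)."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS patients (
                    id INTEGER PRIMARY KEY, name TEXT NOT NULL)""")
    db.execute("""CREATE TABLE IF NOT EXISTS images (
                    id INTEGER PRIMARY KEY,
                    patient_id INTEGER REFERENCES patients(id),
                    acquired TEXT, data BLOB)""")
    return db

def store_image(db, patient_name, acquired, data):
    """Submit an image and its patient association, creating the
    patient record on first use."""
    row = db.execute("SELECT id FROM patients WHERE name = ?",
                     (patient_name,)).fetchone()
    pid = row[0] if row else db.execute(
        "INSERT INTO patients (name) VALUES (?)", (patient_name,)).lastrowid
    db.execute("INSERT INTO images (patient_id, acquired, data) VALUES (?, ?, ?)",
               (pid, acquired, data))
    db.commit()

def images_for(db, patient_name):
    """Retrieve the acquisition timestamps of a patient's images."""
    return db.execute("""SELECT images.acquired FROM images
                         JOIN patients ON patients.id = images.patient_id
                         WHERE patients.name = ? ORDER BY acquired""",
                      (patient_name,)).fetchall()
```

Letting the server own the patient/image join is what allows multiple viewer workstations to manipulate the same data while the server coordinates the operations.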
  • Image viewing software or executable instructions may be implemented in two different embodiments.
  • a full stationary version of the Image Viewer as described in FIG. 21 may reside on a workstation or laptop computer equipped with high bandwidth network connection.
  • a light weight version of the Image Viewer may reside on a small PocketPC handheld 952 equipped with IEEE 802.11b and/or IEEE 802.11a compliant network card.
  • the PocketPC image viewer implements only limited functionality allowing basic image viewing operations.
  • the wireless network protocols 950 such as IEEE 802.11 may be used to transmit information to a handheld or other computing devices 952 in communication with a hospital network.
  • This preferred embodiment extends the ultrasound imaging system to cover hospital-wide image collection and retrieval needs. It also provides instant access to non-image patient-related information.
  • In order to provide inter-hospital information exchange, image distribution servers have the ability to maintain connectivity with each other across wide area networks.
  • the probe may directly communicate with a remote computing device such as a PDA 964 using a wireless communication link 966 .
  • the communication link may use the IEEE 1394 protocol.
  • the probe and the PDA both have an RF stack and circuitry described with respect to FIG. 24 to communicate using wireless protocols.
  • the probe includes a transducer array, beamforming circuitry, transmit/receive module, a system controller and digital communication control circuitry. Post processing of the ultrasonic image data including scan conversion is provided in the PDA.
  • a preferred embodiment of the microminiaturized PC-enabled ultrasound imaging system runs on an industry-standard PC and the Windows® 2000 operating system (OS). It is therefore network ready, which makes it ideal for telemedicine solutions while being cost efficient. It provides open-architecture support and can be embedded in, and thus integrated with, third-party applications.
  • the preferred embodiment includes an enhanced Application Programming Interface (API), common interface, export support for third party applications, such as, but not limited to, for example, radiation therapy planning, image guided surgery, integrated solutions, for example, calculations, three-dimensional and reporting packages.
  • the API provides a set of software interrupts, calls, and data formats that application programs use to initiate contact with network services, mainframe communication programs, telephone equipment or program-to-program communications. Software based feature enhancements reduce hardware obsolescence and provide efficient upgrades.
  • the preferred embodiment includes system-on-chip integrated circuits (ICs) which run on PCs and have a large channel count, large dynamic range, high image quality, full feature sets, broad diagnostic coverage, minimal supply chain requirements, simplified design for easy testing and high reliability, and very low maintenance costs.
  • the preferred embodiment includes a PC-based design which is intuitive, has a simple graphical user interface, is easy to use and train with, and leverages PC industry know-how, robust electronics, high-quality displays and low manufacturing costs. It also supports software-controlled communications with other embedded applications, allowing management of patient data, scanner images, Current Procedural Terminology (CPT) codes (a numeric coding system by which physicians record their procedures and services), physician's plans, and outcome assessment reports, all on an integrated PC.
  • reforms to the health care system have been applying pressure to lower costs and highlight the need for first-visit/in-field diagnosis and for data storage and retrieval solutions. Combined with technology innovations such as data storage and retrieval based on the Digital Imaging and Communications in Medicine (DICOM) standard, broadband and Picture Archiving and Communications Systems (PACS) drives, changes in patient record storage, retrieval and transmission, and innovations in lower-cost/handheld devices for ultrasound data acquisition, these pressures enable the preferred embodiment of the present invention.
  • the DICOM standard aids the distribution and viewing of medical images such as, for example, ultrasound, Magnetic Resonance Images (MRIs), and CT scans.
  • Broadband is a wide area network term that refers to a transmission facility providing bandwidth greater than 45 Mbps. Broadband systems are generally fiber optic in nature.
  • a preferred embodiment of the present invention provides image acquisition and end-user application, for example, radiation therapy, surgery, angiography, all applications executed on the same platform. This provides low cost, user friendly controls through a common software interface.
  • the ultrasound system has scalable user interfaces for advanced users and has an intuitive Windows® based PC interface.
  • a preferred embodiment of the ultrasound system also provides an enhanced diagnostic ability due to the features of one-stop image capture, analysis, storage, retrieval and transmittal capability for the data and images.
  • a high image quality is provided by a 128 channel bandwidth.
  • the ultrasound system also provides patient access at any time, any location and using any tool. Point of care imaging is provided with a 10 ounce probe in accordance with a preferred embodiment of the present invention.
  • the data storage and retrieval abilities are based on the DICOM standard and are compatible with off-the-shelf third party analytical and patient record systems.
  • the ultrasound system in accordance with a preferred embodiment also provides immediate image transfer ability using, but not limited to, for example, electronic mail, LAN/WAN, DICOM and Digital Imaging Network—Picture Archiving and Communications Systems (DINPACs).
  • the choices to display the images captured include, but are not limited to, a desktop computer, a laptop computer, wearable personal computers and handheld devices such as personal digital assistants.
  • FIGS. 25A-25C illustrate an ultrasound system 1200 in accordance with a preferred embodiment of the present invention integrated with an angiography system, a high-frequency image 1220 of the carotid artery with directional power Doppler, and an image 1240 of the carotid artery with simultaneous quantitative spectral Doppler, respectively.
  • Intravascular ultrasound with the system of the present invention can evaluate the entire coronary circulation. Ultrasonographic screening reduces mortality from abdominal aortic aneurysms.
  • the ultrasound system of the present invention provides easy guidance and confirmation of aortic arch placement, and helps the rapid delivery of cold perfusate into the aortic arch and hypothermic preservation of the brain, heart, and spinal cord. Further, sensor monitoring for critical flow/temperature/physiological data can be provided. Automatic computer-controlled flow-temperature adjustments can be facilitated, along with exsanguination control and blood pressure management, using this embodiment of the present invention. Preferred embodiments use a touch screen display.
  • FIGS. 26A and 26B illustrate an ultrasound image of vessel walls 1260 in accordance with a preferred embodiment of the system of the present invention and a catheter placement 1270 used with the system. Surgeons can use the ultrasonic system for catheter or line placements for both guidance and confirmation of placement.
  • the image file or raw RF data is directly accessed using a direct digital memory access.
  • the ultrasound system provides real-time RF data output.
  • the ultrasound system of the present invention contributes to accurate surgical planning and imaging by providing neuro-navigation during neurosurgery.
  • FIGS. 27A and 27B illustrate a radiation planning system 1280 integrating the ultrasound system in accordance with a preferred embodiment of the present invention and the probe 1290 of the ultrasound system, respectively.
  • the ultrasound image can be integrated in the display.
  • FIGS. 28A and 28B illustrate an ultrasonic imaging system 1300 for cryotherapy in accordance with a preferred embodiment of the present invention and a probe 1310 used in the system, respectively.
  • for prostate cancer patients with limited disease, percutaneous ultrasound-guided cryosurgery applied focally can spare one neurovascular bundle and thus preserve potency without compromising cancer control.
  • the cryotherapy can be used for urological surgery also.
  • Preferred embodiments of the present invention provide multi-plane images with processing instructions that easily switch between planes in real time. At least two orthogonal transducer arrays having 64 or 128 elements can be used.
  • FIG. 29 is a schematic diagram 1320 illustrating a robotic imaging and surgical system integrating the ultrasound system in accordance with a preferred embodiment of the present invention.
  • the system ensures appropriate vessel harvesting.
  • the operating surgeon uses the ultrasound system to visualize the forceps and cautery controls.
  • the surgeon is seated across the room from the patient and peers into a monitor and manipulates the robot with controls such as, for example, joystick-like hand controls.
  • the robotic arms slip through the small, for example, nickel-size incisions between the ribs.
  • a camera, forceps and a cautery are used to free up the mammary artery and attach it to the heart.
  • the smaller incisions due to ultrasonic image-guided surgery result in lower trauma to patients, less post-operative pain, less patient morbidity and shorter recovery times.
  • image-guided surgery benefits from the provision of real-time RF data output in accordance with preferred embodiments of the system.
  • processed data includes data compression processing which masks differences between bone and tissue.
  • RF data emphasizes the reflectivity and thus the differences between bone and tissue.
  • the output data from the beamformer can be formatted and provided to enhance surgical imaging. This is enabling for surgeries such as, for example, hip and pelvic replacements.
  • computer enhanced image guided surgery further benefits a patient as it combines the dexterity of open surgery with low patient trauma.
  • an ultrasound system can be used for pacemaker placement surgery and monitoring.
  • a three-dimensional ultrasound can be integrated into the systems for providing direct access to digital data via a shared memory.
  • FIG. 30 is a schematic diagram 1340 illustrating an imaging and telemedicine system integrating the ultrasound system in accordance with a preferred embodiment of the present invention.
  • Preferred embodiments of the system output the real-time RF digital data or the front-end data.
  • FIGS. 31A and 31B are three-dimensional images from fetal imaging obtained from an ultrasound system in accordance with a preferred embodiment of the present invention.
  • Preferred embodiments for fetal imaging use the streaming data.
  • the location of each frame of data can be provided thus allowing for spatial registration.
  • Three-dimensional alignment is provided by looking at the frame locations.
  • the ultrasound imaging system provides an elastographic image of a tissue's elastic properties both in-vitro and in-vivo.
  • Ultrasound elastography is an imaging technique whereby local axial tissue strains are estimated from differential ultrasonic speckle displacements. These displacements are generated by a weak, quasi-static stress field. The resultant strain image is called an elastogram. Most pathological changes are associated with changes in tissue stiffness. Palpation is an effective method for lesion detection and evaluation. Many cancers (breast, prostate) are isoechoic, and hence difficult to detect by ultrasound alone.
  • Elastography uses the principle by which a small compression (strain) of the tissue results in a small compression of the signal (similar to frequency modulation).
  • Ultrasound elastography conveys new and clinically important tissue information. The tradeoffs among engineering and elastographic image parameters are now reasonably well understood. Elastography can operate in hypoechoic areas, for example, shadows. Reliable small elastic contrast exists among normal soft-tissue components, and a good contrast-to-noise ratio (CNR) allows its visualization. Pathology generally exhibits large elastic contrast. Areas that can benefit from elastography include the breast, prostate, vasculature, small parts and treatment monitoring.
  • breast cancer is the most frequent cancer in women; every ninth woman in the U.S. is affected during her lifetime. It is well known that palpation of the breast is a very helpful means to detect conspicuous lesions. Although much effort is put into screening methods for breast cancer, in the majority of cases the patient herself is the first to notice palpable changes in her breast during self-examination. Although ultrasound can support the diagnostics of breast tissue, there is still a need for an imaging modality that can provide a direct measure of material parameters related to tissue elasticity, such as Young's modulus. Concerning breast imaging, the elasticity tensor can be reconstructed three-dimensionally using magnetic resonance imaging.
  • a semi-quantitative measure of elasticity with ultrasound has recently become a real-time imaging modality.
  • the strain imaging or elastography method is helpful to describe mechanical properties of tissue in vivo.
  • Elastography compares ultrasonic radio frequency (RF) data of an object before and after the application of a slight compression step. Time delays between the pre- and post-compression RF signals can be estimated and converted to mechanical displacement in the axial direction. The derivative of axial displacement leads to axial strain as a semi-quantitative measure of elastic properties.
  • a slight compression can be applied to the breast and a palpation performed with the transducer including both compression and relaxation.
  • the system is able to store the last two compression cycles in a cine-buffer of the ultrasound system, including approximately 80 images.
  • demodulated echo data, i.e., gray-scale image data.
  • a color-mode window of 20×40 mm, i.e., 20 mm depth and 40 mm width, can be used.
  • the RF data from the color-mode window which is usually used to calculate flow parameters, can be recorded as IQ-data (base-band data). Prior to the off-line calculation of strain images, the limited bandwidth of color-mode RF data is compensated for.
  • the ultrasound system is reprogrammed to use broadband transmit pulses and broadband receive filter for the color-mode. After performing time delay estimation on every two successive frames of an IQ-data series, a series of time-delay images or axial displacement images is obtained, respectively.
  • the purpose of imaging elastic tissue properties using ultrasound elastography is to support the detection, localization and differential diagnosis of lesions. Strain imaging is a semi-quantitative method; therefore, elastograms can be evaluated qualitatively.
  • a qualitative method for the evaluation of ultrasound elastograms uses a gray-scaled colormap. The appearance (visualization, brightness, margin) and size of each lesion on the elastogram, in comparison to the B-mode image, are used in order to distinguish breast tissues. However, the results of a qualitative image analysis depend on the choice of the colormap and the image scaling, respectively.
  • the ultrasound systems of the present invention are used in minimally invasive surgery and robotic surgery methods including biopsy procedures, catheter introduction for diagnostic and therapeutic angiography, fetal imaging, cardiac imaging, vascular imaging, imaging during endoscopic procedures, imaging for telemedicine applications and imaging for veterinary applications, radiation therapy, and cryotherapy, without limitation.
  • the embodiments use computer based tracking systems and CT and MR images to pinpoint the precise location of the target areas.
  • Alternative preferred embodiments of ultrasound systems can, at lower cost and using smaller-footprint devices, provide images just prior to, during, and just after the procedure.
  • a preferred embodiment of the ultrasound system provides a fully integrated solution, since it can run its ultrasound application on the same platform as any third party application that is processing the images.
  • the system includes a streaming video interface, an interface between a third party application and the system's ultrasound application.
  • a key component of this system allows the two applications to run on the same computer platform using the same operating system (OS), such as, for example, a Windows® based platform; other platforms such as Linux can also be used, thus providing a seamless integration of the two applications.
  • the details of the software interface to move images from the system's ultrasound application to another application are described herein below.
  • Preferred embodiments include control and data transfer methods that allow a third party Windows® based application to control, for example, a portable Windows® based ultrasound system by running the ultrasound application as a background task, sending control commands to the ultrasound application and receiving images (data) in return. Further, the embodiment configures a portable ultrasound Windows® based application as a server of live ultrasound image frames supplying another Windows® based application that acts as a client. This client application receives these ultrasound image frames and processes them further.
  • an alternate embodiment configures the portable ultrasound Windows® based application as a server, interacting with a third party client application via two communication mechanisms: for example, a component object model (COM) automation interface used by the third party (hereinafter referred to interchangeably as the external or client application) to start up and control the portable ultrasound Windows® based application, and a high-speed shared memory interface to deliver live ultrasound images.
  • a preferred embodiment includes and configures a shared memory interface to act as a streaming video interface between a portable Windows® based Ultrasound application and another third party Windows® based application.
  • This streaming video interface is designed to provide ultrasound images to a third party client in real-time.
  • a preferred embodiment allows the third party Windows® based application to control the flow rate of images from the portable ultrasound Windows® based application through the shared memory interface within the same PC platform and the amount of memory required to implement this interface.
  • These controls consist of a way to set the number of image buffers, the size of each buffer and the rate of image transfer. This flow rate control can be set for zero data loss thus ensuring that every frame is delivered to the third party Windows® based application from the ultrasound system, or minimum latency thus delivering the latest frame generated by ultrasound system to the third party Windows® based application first.
  • a preferred embodiment formats the ultrasound image frame such that probe, spatial, and temporal information can be interpreted by the third party Windows® based application as it retrieves the images (generated by the portable ultrasound Windows® based application) from the shared memory interface.
  • the actual image data passed between the server (i.e. the portable ultrasound application) and the client application (third party Windows® based application) is a Microsoft® device independent bitmap (DIB) with 8 bit pixels and a 256 entry color table.
  • the image frame also contains a header that provides the following additional information, for example, but not limited to, Probe Type, Probe Serial Number, Frame Sequence Number, Frame Rate, Frame Timestamp, Frame Trigger Timestamp, Image Width (in pixels), Image Height (in pixels), Pixel Size (in X and Y), Pixel Origin (x, y location of the first pixel in the image relative to the Transducer Head), and Direction (spatial direction along or across each line of the image).
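  • The header fields listed above can be modeled as a C-style structure preceding the DIB pixel data. The field names, types, and ordering below are illustrative assumptions only; the actual layout of the shared-memory frame header is not specified here:

```cpp
#include <cassert>
#include <cstdint>

// Illustrative header preceding the 8-bit DIB pixel data of each frame.
// Field names, types, and ordering are assumptions for illustration only.
struct FrameHeader {
    uint32_t probeType;             // probe model identifier
    uint32_t probeSerialNumber;
    uint32_t frameSequenceNumber;   // increments with each transmitted frame
    double   frameRate;             // frames per second
    double   frameTimestamp;        // acquisition time
    double   frameTriggerTimestamp; // trigger time
    uint32_t imageWidth;            // pixels
    uint32_t imageHeight;           // pixels
    double   pixelSizeX;            // spatial size of one pixel in X
    double   pixelSizeY;            // spatial size of one pixel in Y
    double   pixelOriginX;          // first pixel relative to the transducer head
    double   pixelOriginY;
    int32_t  directionAlongLine;    // spatial direction along each image line
    int32_t  directionAcrossLines;  // spatial direction across lines
};
```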
  • the preferred embodiment controls the shared memory interface used to transfer ultrasound images between a Windows® based portable ultrasound system and a third party Windows® based system through the use of ActiveX controls.
  • the Windows® based portable ultrasound application contains an ActiveX control that transfers a frame into the shared memory and sends out a Windows® Event (that includes a pointer to the frame just written) to the third party Windows® based application.
  • This third party application has a similar ActiveX control that receives this Event and pulls the image frame out of shared memory.
  • the graphical user interface includes one or more control programs, each of which is preferably a self-contained, for example, client-side script.
  • the control programs are independently configured for, among other functions, generating graphical or text-based user controls in the user interface, for generating a display area in the user interface as directed by the user controls, or for displaying the processed streaming media.
  • the control programs can be implemented as ActiveX controls, as Java applets, or as any other self-contained and/or self-executing application, or portion thereof, operable within a media gateway container environment and controllable through the web page.
  • Ultrasonic content can be displayed within a frame in the graphical user interface.
  • the program generates an instance of an ActiveX control.
  • ActiveX refers to a set of object-oriented programming technologies and tools provided by Microsoft® Corporation of Redmond, Washington.
  • the core part of the ActiveX technology is the component object model (COM).
  • a program run in accordance with the ActiveX environment is known as “component,” a self-sufficient program that can be run anywhere in the network, as long as the program is supported. This component is commonly known as an “ActiveX control.”
  • an ActiveX control is a component program object that can be re-used by many application programs within a computer or among computers in a network, regardless of the programming language with which it was created.
  • An ActiveX control runs in what is known as a container, which is an application program utilizing the COM program interfaces.
  • One advantage of using a component is that it can be re-used by many applications, which are known as “component containers.”
  • an ActiveX control can be created using one of several well-known languages or development tools, including C++, Visual Basic, or PowerBuilder, or with scripting tools such as VBScript. ActiveX controls can be downloaded as small executable programs, or as self-executable code for Web page animation, for example. Similar to ActiveX controls, and suitable for the client-side scripts, are applets. An applet is typically a self-contained, self-executing computer program written in Java™, a web-based, object-oriented programming language promulgated by SUN Microsystems Corporation of Sunnyvale, Calif.
  • control programs can be stored and accessed locally at the client system, or downloaded from the network. Downloading is typically done by encapsulating a control program in one or more markup language-based files.
  • the control programs can also be used for any commonly-needed task by an application program running in one of several operating system environments. Windows®, Linux and Macintosh are examples of operating system environments that can be used in preferred embodiments.
  • a preferred embodiment of the Ultrasound Imaging System has specific software architecture for the image streaming capabilities.
  • This Ultrasound Imaging System is an application that controls the Ultrasound Probe of a preferred embodiment and allows a user to obtain and display visual images for medical purposes.
  • the Imaging System has its own graphical user interface. This interface is rich in features and is conveniently organized to provide maximum flexibility when working with separate images as well as streams of images. Some of the possible medical applications require the development of graphical user interfaces with significantly different features. This involves integration of the Imaging System into other, more complicated medical systems.
  • the preferred embodiment allows exporting imaging data in a highly effective and convenient fashion for original equipment manufacturers (OEMs) to have direct access to imaging data.
  • the quality of the Image Streaming solution in accordance with a preferred embodiment is measured by several criteria. The first is data transfer performance. Imaging data consume a significant amount of memory and processor power. A large number of separate image frames is required to produce live medical video during a patient examination. It therefore becomes very important to minimize data copying operations in the process of transferring data from the process generating video data to the process consuming video data.
  • the second criterion is an industry standard imaging format. Since applications consuming video imaging data are intended to be developed by third party companies, the data can be represented in industry standard formats. A third criterion is convenience. Imaging data may be presented by means of a programming interface that is convenient to use and does not require additional learning.
  • a fourth criterion includes scalability and extensibility.
  • the streaming data architecture may be easily extended to accommodate new data types. It may provide a basic framework for the future multiplication of video streams targeting more than one data-receiving process.
  • the image streaming architecture of the preferred embodiment provides methods of data transportation between two processes.
  • the image streaming architecture defines operational parameters regulating data transferring process, and describes the mechanism of transferring parameters between processes.
  • One of the methods to transfer operational parameters from a third party client application to the imaging system of a preferred embodiment is by using existing COM interface.
  • the image transferring architecture intensively uses object-oriented programming methodology and the inter-process communication capabilities of the Microsoft Windows® operating system.
  • Object-oriented methodology provides a necessary foundation allowing an architectural solution that satisfies the necessary requirements. It also lays ground for future enhancements and extensions making modification relatively simple and backward compatible.
  • Video imaging data represent complicated data structures with mutual dependencies between different data elements. The data also permit, and often require, different interpretations of the same data elements.
  • the preferred embodiment of the following image transferring architecture includes a shared memory for physical data exchange. For example, Windows® shared memory is a fast and economical way to exchange data between processes. Further, the shared memory can be subdivided into separate sections of a fixed size in certain embodiments. Each section then serves as the minimum controllable unit.
  • the imaging data can be abstracted as objects. Each frame of the imaging data can be represented by a separate object. The objects can then be mapped to the sections of the shared memory.
  • Preferred embodiments can include the locking-unlocking of a section-object.
  • the programming API notification mechanism used is an event-driven mechanism. The event-driven mechanism is implemented using C++ pure virtual functions.
  • the image transferring architecture consists of three layers: an application programming interface (API) layer, a programming interface implementation and shared memory access layer, and a physical shared memory layer.
  • the application programming interface layer provides two different C++ class library interfaces to applications on the client and server sides. All associated sequences of instructions that belong to the application itself are part of this layer as well.
  • Application derived classes and their implementation are the key elements of application programming interface layer.
  • the server, which is the imaging data provider side, uses, for example, an Object Transmitter class and related derived and base classes.
  • the client, which is the imaging data consumer side, uses an Object Factory class, for example, and related derived and base classes.
  • the programming interface implementation layer provides two different Dynamic Link Libraries (DLLs) implementing classes for the applications.
  • This layer maps objects of the classes associated with the application to an internal implementation of objects accessing the shared memory physical system object. This layer allows the hiding of all implementation specific member variables and functions from the scope of the application.
  • the server side application can use, for example, ObjectXmitter.DLL, while the client side application can use, for example, ObjectFactory.DLL.
  • the physical shared memory layer represents the operating system object implementing shared memory functionality. It also describes the structure of the shared memory, its segmentation, and memory controlling blocks.
  • the server side of the application is responsible for the creation of the shared memory. In the process of creation, it has to specify not only a unique name for the shared memory but also other configuration parameters. These parameters include, but are not limited to, a segment count which specifies the number of segments to be allocated, the segment size, and operational flags. There are three such flags in a preferred embodiment. The first one specifies the segment submission and retrieval order. It can be one of Last In First Out (LIFO), First In First Out (FIFO), or Last In Out (LIO). LIO is a modification of the usual LIFO such that, when a new frame arrives, any frames that were ready for retrieval but not yet locked for retrieval are erased.
  • the second flag specifies the shared memory implementation behavior under the condition when a new segment allocation is requested but there is no segment available. Typically this happens when the receiving application processes data more slowly than the submitting application. This flag may allow deleting one of the previously allocated segments. If it does not allow deleting one of the previously allocated segments, an exceptional condition is reported back to the application. Using this flag, the application may automatically select overwriting of data in shared memory or it may control the data overwrite process itself.
  • the third flag can be used only when the second one allows overwriting segments in a shared memory. It specifies how to select a segment to be overwritten. By default, shared memory implementation deletes the youngest or the most recently submitted data segment. Alternatively, the oldest segment can be selected for overwrite process.
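  • The three submission/retrieval orders can be illustrated with a toy in-process queue. This is a simplified model of the behavior described above, not the shared-memory implementation; in particular, LIO is modeled by erasing all frames that are ready but not locked for retrieval whenever a new frame is submitted, and locked segments are not modeled:

```cpp
#include <cassert>
#include <cstddef>
#include <deque>

enum class QueueOrder { FIFO, LIFO, LIO };

// Toy model of segment submission/retrieval order. Real segments live in
// shared memory; here each "frame" is just an int payload.
class FrameQueue {
public:
    explicit FrameQueue(QueueOrder order) : order_(order) {}

    void submit(int frame) {
        // LIO: frames ready for retrieval but not yet locked for retrieval
        // are erased whenever a new frame arrives.
        if (order_ == QueueOrder::LIO)
            frames_.clear();
        frames_.push_back(frame);
    }

    // Retrieve the next frame according to the queue order.
    // Caller must check pending() first.
    int retrieve() {
        int f;
        if (order_ == QueueOrder::FIFO) {
            f = frames_.front();
            frames_.pop_front();
        } else {                      // LIFO and LIO retrieve newest first
            f = frames_.back();
            frames_.pop_back();
        }
        return f;
    }

    std::size_t pending() const { return frames_.size(); }

private:
    QueueOrder order_;
    std::deque<int> frames_;
};
```

Submitting frames 1, 2, 3 then retrieving yields 1 under FIFO, 3 under LIFO (with 1 and 2 still pending), and 3 under LIO (with 1 and 2 already erased).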
  • FIG. 32 is a block diagram illustrating the structure of the physical shared memory.
  • the memory segment header contains the occupied size of the segment, unique tag of the class of the object mapped to the segment, and the segment state.
  • Each segment can be in one of four states: unused, wherein the segment is available for allocation; locked for write, wherein the segment is mapped to an object of a specific class and is currently being formed; written, wherein the segment is mapped to an object of a specific class and available for retrieval; and locked for read, wherein the segment is mapped to an object of a specific class and is currently in the process of data retrieval. Since every segment has its own state it is possible for the application to lock more than one segment for object forming and object retrieval.
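  • The four segment states can be sketched as a small state machine. The transition rules below are inferred from the state descriptions above and are an illustrative assumption, not the actual shared-memory control code:

```cpp
#include <cassert>

// Segment lifecycle: Unused -> LockedForWrite -> Written -> LockedForRead
// -> Unused. Transition rules are inferred from the state descriptions.
enum class SegmentState { Unused, LockedForWrite, Written, LockedForRead };

struct Segment {
    SegmentState state = SegmentState::Unused;

    bool lockForWrite() {   // map an object to the segment and begin forming it
        if (state != SegmentState::Unused) return false;
        state = SegmentState::LockedForWrite;
        return true;
    }
    bool markWritten() {    // object fully formed; available for retrieval
        if (state != SegmentState::LockedForWrite) return false;
        state = SegmentState::Written;
        return true;
    }
    bool lockForRead() {    // client begins data retrieval
        if (state != SegmentState::Written) return false;
        state = SegmentState::LockedForRead;
        return true;
    }
    bool release() {        // retrieval complete; segment reusable
        if (state != SegmentState::LockedForRead) return false;
        state = SegmentState::Unused;
        return true;
    }
};
```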
  • the last element in a physical shared memory layout contains memory segments.
  • in addition to the physical shared memory, the logical shared memory contains a physical system mutex 1388 and a system event 1390 .
  • the physical mutex provides mutual exclusive access to physical shared memory.
  • the physical event is of a manual control type. It stays at the level “high” all the time that at least one of the segments is in a “written” state. It goes to the level “low” only when no segment is in a “written” state. This mechanism allows retrieval of “written” objects from the shared memory without passing control to the operating system within the same time-slice allocation for the thread.
  • the object transmitting programming interface consists of three classes: namely, AObjectXmitter, USFrame, and BModeFrame.
  • the AObjectXmitter class allows the initiation of an object transferring service specifying desired operational parameters. Once the AObjectXmitter class object is instantiated, initialized objects of the USFrame and BModeFrame classes can be created.
  • the USFrame class constructor requires a reference to an object of the AObjectXmitter class.
  • the first action that has to be accomplished upon instantiation of the USFrame object is to establish association of the object with one of the segments in the shared memory.
  • the function Allocate( ) maps an object to an unused shared memory segment and locks this segment for the current object usage.
  • a bitmap size may be provided by an application. The provided size represents only the size required for the bitmap data, not including the memory size required for other data elements of the object.
  • the BModeFrame class is a class derived from the USFrame class. It inherits all the methods and functionality of the base class. The only additional functionality provided by the BModeFrame class is additional methods that provide information related specifically to the BMode operation.
  • the USFrame or BModeFrame object can be reused by means of subsequent remapping and resubmitting. Alternatively, it can be deleted and a new one can be created when it is appropriate for an application. Since object instantiation does not require any interprocess-communication mechanisms, it is as simple as memory allocation for an ordinary variable.
  • since the ObjectXmitter class does not have knowledge about the USFrame or BModeFrame class, it is very easy to introduce additional classes similar to, or directly or indirectly derived from, the USFrame class. This allows production of future versions of the Object Transmitting Programming Interface without requiring any modifications to the code or sequence of instructions that was developed for existing embodiments. Further, Object Transmitting Programming Interface classes do not have any member variables. This provides two more benefits of the interface. The first one is that these classes are COM object interface oriented and can be directly used for the COM object interface specification and implementation. The second benefit is that these classes effectively hide all implementation specific details, making the interface very clear, easy to understand and use.
  • the Object Transmitting Programming Interface is implemented by the ObjectXmitter.DLL.
  • for every object created by the application there is a mirroring implementation object created by the code residing in the ObjectXmitter.DLL. Since every programming interface class has a corresponding mirroring class in the implementation, modifications are facilitated and the currently specified image types can be extended. This can be accomplished by the creation of the corresponding mirroring classes in the implementation DLL.
  • Implementation objects are responsible for handling of the shared memory and the mapping of programming interface objects.
  • An embodiment of the present invention includes the DLL allowing instantiation of only one ObjectXmitter class object using only one communication channel with one client application. The Object Transmitting implementation transmits not only object data but also provides additional information describing the type of the object transferred.
  • the Object Factory Programming Interface consists of three classes: AObjectFactory, USFrame, and BModeFrame.
  • AObjectFactory contains three pure virtual member functions. This makes it an abstract class that cannot be instantiated by an application. The application is required to define its own class derived from the AObjectFactory class. There is no need to define any “special” class derived from the AObjectFactory class. Since the application intends to process the images that will be received, the chances that it will have a class processing images are very high. An image processing class can very well be derived from the AObjectFactory class.
  • the class derived from an AObjectFactory class has to define and implement only pure virtual functions such as, for example, OnFrameOverrun( ), OnUSFrame( ), and OnBModeFrame( ).
  • a derived class can be defined as follows:

    class ImageProcessor : public AObjectFactory
    {
    public:
        ImageProcessor(void);
        ~ImageProcessor(void);
        virtual unsigned long OnFrameOverrun(void);
        virtual unsigned long OnBModeFrame(const BModeFrame *frame);
        virtual unsigned long OnUSFrame(const USFrame *frame);
    };
  • Upon instantiation of an object of the class ImageProcessor, the base class member function Open( ) can be called. This function is provided a shared memory name that matches the shared memory name being used by the server side of the application. The function Open( ) connects the client application to the server application via the specified shared memory.
  • the application can expect a call on the virtual functions OnFrameOverrun( ), OnUSFrame( ), and OnBModeFrame( ). Every invocation of the OnUSFrame( ) function carries as an argument an object of USFrame class type. Every invocation of the OnBModeFrame( ) function carries as an argument an object of BModeFrame class type. There is no need for an application to instantiate an object of the USFrame or BModeFrame class. USFrame and BModeFrame objects are “given” to an application by the underlying implementation of an AObjectFactory class.
  • OnFrameOverrun( ) is called when the Frame Overrun condition is raised by the servicing application. This condition is raised any time the servicing application makes an attempt to submit a new frame and there are no available shared segments to map an object to. This condition can be cleared only by the client side of the application by means of calling the function ResetFrameOverrun( ). If this function is not called by the client application, the Frame Overrun condition remains raised and the OnFrameOverrun( ) pure virtual function is called again.
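  • The event-driven client pattern can be sketched as follows. The AObjectFactory and BModeFrame types here are simplified stand-ins for the classes delivered by ObjectFactory.DLL, and the simulated-delivery helper is an assumption made so the sketch is self-contained:

```cpp
#include <cassert>
#include <vector>

// Simplified stand-in for the BModeFrame delivered by ObjectFactory.DLL.
struct BModeFrame { unsigned long sequence; };

// Abstract factory: the pure virtual handlers make the class uninstantiable,
// so the application must derive its own image-processing class from it.
class AObjectFactory {
public:
    virtual ~AObjectFactory() {}
    virtual unsigned long OnFrameOverrun() = 0;
    virtual unsigned long OnBModeFrame(const BModeFrame* frame) = 0;

    // In the real implementation a separate thread fires these events as
    // objects arrive in shared memory; this helper simulates such delivery.
    void simulateArrival(const BModeFrame& f) { OnBModeFrame(&f); }
};

// Application-defined processor, as in the derived-class example above.
class ImageProcessor : public AObjectFactory {
public:
    std::vector<unsigned long> seen;
    unsigned long overruns = 0;

    unsigned long OnFrameOverrun() override {
        ++overruns;             // real code would call ResetFrameOverrun()
        return 0;
    }
    unsigned long OnBModeFrame(const BModeFrame* frame) override {
        seen.push_back(frame->sequence);  // process, then release the object
        return 0;
    }
};
```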
  • the Object Factory Interface has the same advantages that were outlined herein above in describing the Object Transmitting Interface. In addition to these advantages, it implements an event-driven programming method that minimizes programming effort and maximizes execution performance. At the same time there are functions such as, for example, USFrames( ), BModeFrames( ), GetUSFrame( ), and GetBModeFrame( ). These functions can be used to implement less efficient “polling” programming methods.
  • the Object Factory Programming Interface is implemented by the ObjectFactory.DLL.
  • This DLL retrieves an object class type information as well as object related data from the shared memory. It creates an object of the type that is used by the transmitter.
  • the Object factory implementation maps newly created objects to the corresponding data.
  • Object factory implementation has a separate thread that fires newly generated and mapped object via pure virtual function event.
  • the application “owns” this object for the duration of processing and, by calling the Release( ) function, indicates that the object is no longer needed by the application.
  • the factory implementation releases resources allocated for the object locally as well as shared memory resources.
  • the new types of objects can be introduced by deriving a new class from one of the existing classes.
  • a newly derived class can be derived from the appropriate level of the base classes.
  • An alternative way to create a new object type is by the creation of a new base class. This method may be advantageous in the case when a newly defined class differs significantly from existing ones.
  • alternate preferred embodiments can support more than one AObjectXmitter class object and more than one corresponding communication channel. The interface also can be extended in such a way that it allows communication channels transmitting objects in opposite directions. This allows the application to distribute imaging data to more than one client application. It can also accept incoming communications controlling image creation and probe operation.
  • wireless and remote image streaming channels can be accommodated in preferred embodiments.
  • the same Object Transmitting Programming Interface can be implemented to transfer images not via the shared memory but via a high-speed wireless communication network such as, for example, IEEE 802.11a. It also can be used to transfer images across a wired Ethernet connection.
  • Remote and wireless image streaming assumes that the recipient computing system can differ in performance. This makes the selection of a model of the recipient's device one of the important factors for the successful implementation.
  • the streamed imaging included in preferred embodiments thus utilizes a shared-memory client-server architecture that provides high bandwidth with low overhead.
  • the Ultrasound Imaging System software application of a preferred embodiment is used as a server of live ultrasound image frames by a client application.
  • This client-server relationship is supported by two communications mechanisms as described hereinabove.
  • a COM automation interface is used by the client application to start-up and control the ultrasound imaging system application.
  • a high-speed shared-memory interface delivers live ultrasound images with probe identification, spatial and temporal information from the application to the client application.
  • the shared-memory communications have flexible parameters that are specified by the client application. Queue order, number of buffers, buffer size and overwrite permission are all specified by the client when opening the image-frame stream.
  • the queue order mode can be specified as First-In-First-Out (FIFO), Last-In-First-Out (LIFO) and Last-In-Out (LIO).
  • the FIFO mode is preferred when zero data loss is more important than minimum latency.
  • the LIO mode delivers only the most recent image frames and is preferred when minimum latency is more important than data loss.
  • the LIFO mode can be used when minimum latency and minimum data loss are both important. However, in the LIFO mode, frames might not always be delivered in sequential order and a more complicated client application is required to sort them after they are received. Overwrite permission, when all of the shared-memory buffers are full, is specified as not allowed, overwrite oldest and overwrite newest.
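  • Because LIFO mode may deliver frames out of sequential order, a client that needs ordered frames can reorder them using the Frame Sequence Number carried in each frame header. A minimal sketch follows; the ReceivedFrame type is a hypothetical stand-in for the client's frame record:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Hypothetical client-side record for a received frame; in practice the
// sequence number is read from the frame header.
struct ReceivedFrame { unsigned long sequenceNumber; };

// Restore sequential order for frames received via a LIFO stream by sorting
// on the header's frame sequence number.
void sortBySequence(std::vector<ReceivedFrame>& frames) {
    std::sort(frames.begin(), frames.end(),
              [](const ReceivedFrame& a, const ReceivedFrame& b) {
                  return a.sequenceNumber < b.sequenceNumber;
              });
}
```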
  • Each image frame contains a single ultrasound image, probe identification information, pixel spatial information and temporal information.
  • the image format is a standard Microsoft device independent bitmap (DIB) with 8-bit pixels and a 256-entry color table.
  • the TTFrameReceiver ActiveX control provides two schemes for receiving frames.
  • the first scheme is event driven.
  • a COM event, FrameReady is fired when a frame has been received.
  • the image and associated data can be read using the data access methods of the interface.
  • the client releases the frame by calling the ReleaseFrame method.
  • the next FrameReady event does not occur until after the previous frame is released.
  • the client can poll for the next available frame using the WaitForFrame method.
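  • The WaitForFrame/ReleaseFrame handshake can be sketched with a stub receiver. The stub below merely models the rule that the next frame cannot be obtained until the current one is released; it stands in for the TTFrameReceiver ActiveX control and is not the control itself:

```cpp
#include <cassert>
#include <deque>

// Stub standing in for the TTFrameReceiver ActiveX control. It models only
// the rule that the next frame cannot be received until the current frame
// is released; the timeout parameter is accepted but not simulated.
class MockFrameReceiver {
public:
    void push(long frameId) { pending_.push_back(frameId); }

    // Returns the next frame id, or -1 if no frame is available (or the
    // previous frame has not yet been released).
    long WaitForFrame(long /*timeoutms*/) {
        if (current_ != -1) return -1;   // previous frame not released yet
        if (pending_.empty()) return -1;
        current_ = pending_.front();
        pending_.pop_front();
        return current_;
    }

    // Release the current frame so the next one can be received.
    bool ReleaseFrame() {
        if (current_ == -1) return false;
        current_ = -1;
        return true;
    }

private:
    std::deque<long> pending_;
    long current_ = -1;
};
```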
  • both the client application and the server application are executed on the same computer.
  • the computer can be running the Microsoft® Windows® 2000/XP operating system, for example, without limitation.
  • the client application (USAutoView) can be developed using Microsoft® Visual C++6.0 and MFC.
  • the source code can be compiled, for example, in Visual Studio 6.0.
  • the server side COM Automation interface and the TTFrameReceiver ActiveX control may be compatible with other MS Windows® software development environments and languages.
  • the name of the server side COM automation interface is, for example, “Ultrasound.Document” and the interface is registered on the computer the first time the application is run.
  • the dispatch interface can be imported into a client application from a type library.
  • the automation interface is extended to support frame streaming with the addition of different methods such as void OpenFrameStream(BSTR* queueName, short numBuffers, long bufferSize, BSTR* queueOrder, short overwritePermission).
  • OpenFrameStream opens the frame stream transmitter on the server side, which opens the shared-memory interface to the client application.
  • queueName is a unique name of the shared-memory “file” and is the same name that is used when opening the receiver.
  • numBuffers is the number of buffers in the shared-memory queue
  • bufferSize is the size of each buffer in the shared-memory queue in bytes wherein the buffer size is 5120 bytes larger than the largest image that can be transmitted
  • overwritePermission is 0 for overwrite not allowed, 1 for overwrite oldest, or 2 for overwrite newest.
  • OpenFrameStream must be called before opening the TTFrameReceiver control.
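The buffer-size requirement above can be made concrete. The sketch below computes a conforming bufferSize for an 8-bit DIB frame; the 5120-byte margin comes from the text, while the header arithmetic assumes the standard Windows DIB layout (a 40-byte BITMAPINFOHEADER, a 256-entry 4-byte color table, and rows padded to 4-byte multiples). The function names are hypothetical.

```cpp
#include <cstddef>
#include <cassert>

const std::size_t kHeaderBytes  = 40 + 256 * 4;  // BITMAPINFOHEADER + 256-entry color table
const std::size_t kStreamMargin = 5120;          // margin required by OpenFrameStream

// DIB rows of 8-bit pixels are padded up to a 4-byte boundary.
std::size_t dibRowStride(std::size_t widthPixels) {
    return (widthPixels + 3) & ~std::size_t(3);
}

// Total bytes for the header plus pixel data of one image.
std::size_t dibImageBytes(std::size_t width, std::size_t height) {
    return kHeaderBytes + dibRowStride(width) * height;
}

// Minimum shared-memory buffer size for the largest transmittable image.
std::size_t frameBufferSize(std::size_t maxWidth, std::size_t maxHeight) {
    return dibImageBytes(maxWidth, maxHeight) + kStreamMargin;
}
```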
  • the next additional methods include void CloseFrameStream( ) which closes the frame stream transmitter on the server side, void StartTransmitting( ), which tells the server side to start transmitting ultrasound frames, void StopTransmitting( ), which tells the server side to stop transmitting ultrasound frames, and short GetFrameStreamStatus( ), which gets the status of the frame stream transmitter. It is important to check that the stream transmitter is open before opening the TTFrameReceiver.
  • the COM automation interface is non-blocking, so the frame stream may not actually be open at the instant the OpenFrameStream call returns to the client application.
  • the TTFrameReceiver ActiveX Control is the client application's interface to the live ultrasound frame stream.
  • Frame Stream Control Methods include boolean Open(BSTR name), which opens the frame stream receiver. The frame stream receiver cannot be opened until after the frame stream transmitter on the server has been opened. They also include boolean Close( ), which closes the frame stream receiver; long WaitForFrame(long timeoutms), which waits for a frame to be ready or until the end of the timeout period; and boolean ReleaseFrame( ), which releases the current image frame. The current frame can be released as soon as all of the desired data has been copied. The next frame cannot be received until the current frame is released. The return values of the other data access functions are not valid after the current frame is released until the next FrameReady event.
  • Data Access Methods in a preferred embodiment for the image include long GetPtrBitmapInfo( ), which gets a pointer to the header (with color table) of the DIB that contains the image.
  • the ultrasound image is stored as a standard Microsoft device independent bitmap (DIB).
  • BITMAPINFO and BITMAPINFOHEADER structures can be cast to the returned pointer as needed.
  • Memory for the BITMAPINFO structure is allocated in shared-memory and may not be de-allocated; instead, ReleaseFrame( ) can be called to return the memory to the shared-memory mechanism.
  • Further methods include long GetPtrBitmapBits( ), which gets a pointer to the image pixels. The returned pointer can be cast as needed for use with the Microsoft DIB API.
  • Memory for the bitmap pixels is allocated in shared-memory and may not be de-allocated; instead, ReleaseFrame( ) is called to return the memory to the shared-memory mechanism.
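To illustrate how the pointers returned by GetPtrBitmapInfo( ) and GetPtrBitmapBits( ) would be interpreted, the sketch below defines a minimal stand-in for the Windows BITMAPINFOHEADER structure (normally obtained from <windows.h>) and reads a pixel from the raw bits. The `pixelAt` helper is hypothetical; the field layout and the bottom-up row order match the standard packed Windows DIB definitions.

```cpp
#include <cstdint>
#include <cstring>
#include <cassert>

// Minimal stand-in for the packed Windows BITMAPINFOHEADER.
#pragma pack(push, 1)
struct BitmapInfoHeader {
    std::uint32_t biSize;
    std::int32_t  biWidth;
    std::int32_t  biHeight;      // positive height = bottom-up DIB
    std::uint16_t biPlanes;
    std::uint16_t biBitCount;    // 8 for the ultrasound frames described here
    std::uint32_t biCompression;
    std::uint32_t biSizeImage;
    std::int32_t  biXPelsPerMeter;
    std::int32_t  biYPelsPerMeter;
    std::uint32_t biClrUsed;
    std::uint32_t biClrImportant;
};
#pragma pack(pop)

// Fetch one 8-bit pixel from the raw bits; rows are padded to 4 bytes.
std::uint8_t pixelAt(const void* bitmapInfo, const void* bits, int x, int y) {
    const BitmapInfoHeader* hdr = static_cast<const BitmapInfoHeader*>(bitmapInfo);
    int stride = (hdr->biWidth + 3) & ~3;
    const std::uint8_t* p = static_cast<const std::uint8_t*>(bits);
    int row = hdr->biHeight > 0 ? hdr->biHeight - 1 - y : y;  // bottom-up vs top-down
    return p[row * stride + x];
}
```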
  • the methods related to probe identification include short GetProbeType( ), which gets the defined ultrasound probe type being used; BSTR GetProbeName( ), which gets the defined probe name; and long GetProbeSN( ), which gets the serial number of the probe being used.
  • the methods include short GetSequenceNum( ), which gets the sequence number of the current frame.
  • the sequence number is derived from an 8-bit counter and thus repeats every 256 frames. It is useful for determining gaps in the frame sequence and for re-ordering frames received when using the LIFO buffer order mode.
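Because the counter is only 8 bits wide, gap detection must be done modulo 256. The helper below (a hypothetical name, not part of the interface) returns how many frames were skipped between two consecutively received frames, handling wraparound correctly.

```cpp
#include <cstdint>
#include <cassert>

// Frames skipped between two consecutively received frames of the
// 8-bit sequence counter, which wraps every 256 frames.
// Returns 0 when currSeq directly follows prevSeq.
int framesSkipped(std::uint8_t prevSeq, std::uint8_t currSeq) {
    // Modulo-256 difference minus one for the expected increment.
    return static_cast<std::uint8_t>(currSeq - prevSeq - 1);
}
```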
  • double GetRate( ) gets the frame rate, which, when combined with the sequence number, provides precise relative timing for the received frames.
  • BSTR GetTimestamp( ) gets a timestamp for the current frame which provides an absolute time for the current frame that may be useful when synchronizing to external events.
  • the resolution is approximately one millisecond. Timestamps can be averaged and used in conjunction with rate and sequence number to achieve higher precision.
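The averaging idea above can be sketched as follows: each millisecond-resolution timestamp is back-projected to a common start time using the frame rate and the (already unwrapped, monotonic) sequence number, and the back-projected values are averaged. Both function names are hypothetical illustrations, not part of the interface.

```cpp
#include <vector>
#include <cmath>
#include <cstddef>
#include <cassert>

// Average the implied start time over many (sequence, timestamp) pairs
// to beat the ~1 ms resolution of any single timestamp.
double estimateStartTime(const std::vector<int>& seq,
                         const std::vector<double>& timestamp,
                         double frameRateHz) {
    double sum = 0.0;
    for (std::size_t i = 0; i < seq.size(); ++i)
        sum += timestamp[i] - seq[i] / frameRateHz;  // back-project each frame to t0
    return sum / seq.size();
}

// High-precision absolute time of a frame given the averaged start time.
double frameTime(double t0, int seq, double frameRateHz) {
    return t0 + seq / frameRateHz;
}
```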
  • the methods include BSTR GetTriggerTimestamp( ), which gets a timestamp for the start of ultrasound scanning. Because the ultrasound probe is stopped when “freezing” the image, the trigger timestamp is recorded when live imaging is resumed.
  • Spatial Information in preferred embodiments has the following methods, short GetXPixels( ), which get the width of the image in pixels; short GetYPixels( ), which gets the height of the image in pixels; double GetXPixelSize( ), which gets the size of each pixel in the x-direction, (x-direction is defined to be horizontal and parallel to each image line); and double GetYPixelSize( ), which gets the size of each pixel in the y-direction.
  • the y-direction is defined to be vertical and perpendicular to each image line.
  • the positive y-direction is defined to be away from the transducer head into the patient.
  • Another method includes short GetXDirection( ), which gets the spatial direction along each line of the image.
  • the positive x-direction is defined to be away from the probe marker.
  • the short GetYDirection( ) gets the spatial direction across each line of the image.
  • the positive y-direction is defined to be away from the transducer head into the patient.
  • In the spatial mapping, N is the index of the pixel in the image and D is the direction of the pixel.
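Combining the spatial-information methods above, the physical offset of a pixel along an axis can be computed from the index N, the pixel size from GetXPixelSize( )/GetYPixelSize( ), and the direction D from GetXDirection( )/GetYDirection( ). This one-line helper is a hypothetical illustration and assumes D is returned as +1 or −1.

```cpp
#include <cassert>

// Physical offset along one axis: index * pixel size * direction (+1/-1).
double pixelOffset(int n, double pixelSize, int direction) {
    return n * pixelSize * direction;
}
```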
  • void FrameReady( ) is used when a frame is ready and data can be read.
  • the handler copies data from the data access methods and then calls ReleaseFrame( ). It is recommended that any kind of indefinite processing, for example, functions that invoke message loops, be avoided in the handler.
  • void FrameOverrun( ) is used when the server is unable to send a frame or a frame has to be overwritten in the buffers because the buffers are full. This only applies to the FIFO and LIFO modes, since the LILO mode automatically releases old buffers. This event is useful for determining whether the client application is reading frames quickly enough and whether the number of buffers allocated is sufficient for the latency of the client.
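The copy-then-release discipline recommended for the FrameReady handler can be sketched as follows. The `Receiver` struct here is a hypothetical stand-in for the TTFrameReceiver control (not its real interface): the handler copies the pixel data out of shared memory, releases the frame immediately so the next FrameReady can fire, and defers any heavy processing to the returned copy.

```cpp
#include <vector>
#include <cstdint>
#include <cstddef>
#include <cassert>

// Hypothetical stand-in for the parts of TTFrameReceiver used here.
struct Receiver {
    const std::uint8_t* bits;    // would come from GetPtrBitmapBits()
    std::size_t nbytes;
    bool released;
    void ReleaseFrame() { released = true; }  // returns buffer to shared memory
};

// Handler body: copy first, release immediately, process the copy later.
// No message loops or other indefinite processing happens in here.
std::vector<std::uint8_t> onFrameReady(Receiver& rx) {
    std::vector<std::uint8_t> copy(rx.bits, rx.bits + rx.nbytes);
    rx.ReleaseFrame();           // next FrameReady cannot fire until this call
    return copy;
}
```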
  • USAutoView is a sample client application that automates the server side and displays live ultrasound image frames. It has functions to demonstrate starting and stopping the server side, hiding and showing the server side, toggling between showing and not showing graphics on the image, freezing and resuming the ultrasound acquisition, loading a preset exam, changing the designated patient size, changing the image size, spatial information, and inverting the image.
  • FIG. 34 is a view of a graphical user interface used for a USAutoView UI in accordance with a preferred embodiment of the present invention.
  • the USAutoView program is a Windows® dialog application with three ActiveX components.
  • TTFrameReceiver which supplies ActiveX interface to receive ultrasound frames
  • TTAutomate which encapsulates automation of the server side
  • TTSimpleImageWnd which is the image display window.
  • CUSAutoViewDlg is the main dialog. It manages the automation of the server side through the TTAutomate control, receiving ultrasound frames through TTFrameReceiver and image display through TTSimpleImageWnd.
  • the OnStartUS( ) method of CUSAutoViewDlg calls the TTAutomate and TTFrameReceiver methods needed to start or stop automation and data transmission from the server side.
  • OnFrameReady( ) handles the FrameReady event from TTFrameReceiver. It copies the desired data from TTFrameReceiver and then releases the frame with TTFrameReceiver's ReleaseFrame( ) method. It avoids any functions that perform indeterminate processing, such as functions that invoke message loops.
  • TTAutomate is an ActiveX control that encapsulates automation functions for the server side.
  • the native COM Automation interface of the server side is non-blocking and requires waiting with GetStatusFlags to coordinate functions.
  • TTAutomate wraps each function in the required wait loops.
  • the wait loops allow Windows® messages to be processed so that the client application's user interface thread does not become blocked while waiting.
  • Although automation methods in TTAutomate do not return until the function has been completed, other Windows® messages are still processed before the function is completed. It is recommended to prevent multiple concurrent calls from message handlers to TTAutomate methods, as coordination with the server side is generally non-reentrant.
  • Source code for this control is included in the USAutoView workspace. It may be reused or modified as desired.
  • TTSimpleImageWnd is an ActiveX control that provides a display window for device independent bitmaps (DIB's).
  • the two properties of the display interface are long DIBitmapInfo and long DIBits.
  • DIBitmapInfo corresponds to a pointer to a block of memory that contains the BITMAPINFO structure for the DIB.
  • DIBits corresponds to a pointer to a block of memory that contains the image pixels.
  • the DIBitmapInfo is set to the pointer to the bitmap info of the DIB. Then DIBits is set to the pointer to the bitmap bits.
  • When DIBits is set, the pointer that was set for DIBitmapInfo is expected to still be valid and both the bitmap info and bitmap bits are copied internally for display on the screen. Both DIBitmapInfo and DIBits are set to zero to clear the image. Source code for this control is included in the USAutoView workspace. It may be reused or modified as desired.
  • the preferred embodiments of the present invention include a plurality of probe types.
  • the probes include, but are not limited to, a convex-linear transducer array operating between 2-4 MHz, a phased-linear transducer array operating between 2-4 MHz, a convex-linear endocavity transducer array operating between 4-8 MHz, a linear transducer array operating between 4-8 MHz and a linear transducer array operating between 5-10 MHz.
  • Preferred embodiments of the portable ultrasound system of the present invention provide high resolution images such as the following during an examination: B-mode, M-mode, Color Doppler (CD), Pulsed Wave Doppler (PWD), Directional Power Doppler (DirPwr) and Power Doppler (PWR).
  • the probe device is connected into a desktop or laptop.
  • the probe can be an industry standard transducer connected to a 28 oz. case that contains the system's beamforming hardware. If the probe is connected to a laptop, then a 4-pin FireWire cable is connected to an IEEE 1394 serial connection located on a built-in MediaBay.
  • the computer may not be equipped with a MediaBay.
  • EDCM: External DC Module
  • the EDCM is designed to accept a 6-pin IEEE 1394 (also referred to as FireWire) cable at one end and a Lemo connector from the probe at the other end.
  • the EDCM accepts an input DC voltage from +10 to +40 Volts.
  • the system in an embodiment, can be connected to a host computer with IEEE 1394.
  • the 6-pin IEEE 1394 input to the EDCM can originate from any IEEE 1394 equipped host computer running, for example, the Windows® 2000 operating system.
  • An external IEEE 1394 hub may also be necessary to provide the requisite DC voltage to the EDCM.
  • In a host computer equipped with IEEE 1394, there is one of two types of IEEE 1394 connectors: a 4-pin or a 6-pin.
  • the 6-pin connector is most often found in PC-based workstations that use internal PCI-bus cards. Typically, the 6-pin connector provides the necessary DC voltage to the EDCM.
  • a 6-pin-male to 6-pin-male IEEE 1394 cable is used to connect the host computer to the EDCM.
  • the 4-pin connector is found in laptop computers that, in accordance with a preferred embodiment, do not contain a MediaBay or provide a DC voltage output.
  • an external IEEE-1394 hub can be used to power the EDCM and the probe.
  • an external IEEE-1394 hub can be used between the host computer and the EDCM.
  • the hub derives its power from a wall outlet and is connected using a medical-grade power supply that conforms to the IEC 60601-1 electrical safety standard.
  • a 4-pin-male to 6-pin-male or 6-pin-male to 6-pin-male IEEE 1394 cable is required.
  • the appropriate connector (4-pin or 6-pin) is inserted into the host computer and the 6-pin connector into the hub.
  • the hub is then connected to the EDCM using a 6-pin-male to 6-pin-male IEEE 1394 cable.
  • An IEEE 1394 hub is only necessary when the host computer cannot supply at least +10 to +40 DC volts and 10 watts power to the EDCM. If the host computer can supply adequate voltage and power, a 6-pin-male to 6-pin-male IEEE 1394 cable can be used to connect the computer directly to the EDCM.
  • FIG. 35 illustrates a view of a main screen display of a graphical user interface in accordance with a preferred embodiment of the present invention.
  • the main screen can be considered as four separate work areas that provide information to help one perform tasks. These include a menu bar, an image display window, an image control bar and a tool bar.
  • In order to resize windows and regions, the user can click the small buttons in the upper right of the window to close, resize, and exit the program.
  • a user interface or button closes the window but leaves the program running (minimizing the window).
  • a system button appears at the bottom of the screen, in the area called the taskbar. By clicking the system button in the taskbar the window re-opens.
  • Another interface button enlarges the window to fill the entire screen (called maximizing); however, when the window is at its largest, the frame rates may decrease.
  • Another interface button returns the window to the size that it was before being enlarged.
  • the system program can be closed by another interface button.
  • the user can increase or decrease the width of each region of the application as needed. For example, to make the Explorer window narrower, the cursor is placed at either end of the region and the new desired size is obtained by clicking and dragging. One can re-position the size and location of each region so that they become floating windows. To create floating windows, the user simply clicks the mouse on the double-edged border of the specific region and drags it until it appears as a floating window. To restore the floating window back to its original form, one double-clicks in the window.
  • FIGS. 36 A- 36 C are views in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • the Explorer window provides the nested file directory for all patient folders the user creates and for the images that are created and saved.
  • the folder directory structure includes the following, but is not limited to, patient folder, and an image folder.
  • the patient folder directory is where patient information files are stored along with any associated images.
  • the image folder directory contains images by date and exam type. The images in this directory are not associated with a patient and are created without patient information.
  • FIGS. 37 A- 37 B illustrate the patient folder and image folder in accordance with a preferred embodiment of the present invention.
  • the menu bar at the top of the screen provides nine options one can use to perform basic tasks. To access a menu option simply click the menu name to display the drop-down menu options. The user can also access any menu by using its shortcut key combination.
  • the Image display window provides two tabs: Image Display and Patient Information.
  • the image is displayed in the window according to the control settings that are defined. Once the image is saved, when the user retrieves it again, the category, date and time of the image is also shown in the Image Display window.
  • the Patient Info tab is used to enter new patient information which is later stored in a patient folder. The user can access this tab to also make modifications and updates to the patient information.
  • the Image Mode bar is illustrated in FIG. 38 in accordance with a preferred embodiment of the present invention. It provides six control modes the user can choose from when performing an examination.
  • the modes include B-Mode which is a brightness mode providing a standard two-dimensional display in real time, an M-Mode which is used to display motion along a line depicted in the B-mode image as a function of time, and CD mode which is Color Doppler tab that displays, in real time, a two-dimensional image of blood flow overlaid to the B-mode image.
  • the hues in the color palette indicate mean flow velocity, and the different colors indicate the direction of blood flow.
  • Pulsed-Wave Doppler (PWD) mode displays a line in the B-mode image, which contains the sample size and location of interest.
  • the pulsed Doppler waveform depicts the instantaneous velocity of flow within that sample, as a function of time.
  • the Directional Power Doppler (DirPwr) mode displays, in real time, a two-dimensional image of blood flow overlaid to the B-mode image.
  • the hues in the color palette indicate the density of red blood cells. Brighter hues indicate greater density. The different colors indicate the direction of blood flow.
  • the Power Doppler (Pwr) mode displays, in real time, a two-dimensional image of blood flow overlaid to the B-mode image.
  • the hues in the color palette indicate the density of red blood cells. Brighter hues indicate greater density. It should be noted that directional information is not provided. Power Doppler is not subject to aliasing. It is generally more sensitive to low flow than Color Doppler or Directional Power Doppler.
  • FIG. 38 illustrates the tool bar in the graphical user interface in accordance with a preferred embodiment of the present invention.
  • FIG. 39 illustrates a measurement tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • the Measurements toolbar provides the following buttons: a zoom button that lets a user magnify the selected region of the image of interest, an ellipse button that lets one perform area and ellipse measurements on the image, a measure distance button that performs distance measurements on an image, an SD button that provides cursors for measurement of the systolic and diastolic portions of the pulse Doppler waveform, a delete button that removes a selected measurement or the last measurement made, and a text button that lets one enter text on live or frozen images.
  • FIG. 40 illustrates a playback toolbar in a graphical user interface in accordance with a preferred embodiment of the present invention.
  • the Playback toolbar provides the following buttons: a play button that lets one play and pause loops of data. The user can play or pause up to sixty frames of loop information. The flip side of this button is a Pause Loop button, which lets the user pause the loops of data in Play mode. Further buttons include a previous button that lets a user return to the previous frame during Playback Mode, a next image button that allows a user to advance to the next frame during Playback Mode, and a status indicator button that shows graphically and numerically the frame number being viewed.
  • The Live/Freeze buttons that are used during a scan to record the examination or save the image to a file are illustrated in FIGS. 41A and 41B in accordance with a preferred embodiment of the present invention.
  • the live button provides a real-time image display, while the freeze button freezes the image during the scan to allow the user to print or save to a file.
  • FIG. 42 illustrates the file toolbar in a graphical user interface of a preferred embodiment.
  • the file toolbar provides the following buttons: a save button that saves the current image to a file, a save loop button that saves the maximum allowed number of previous frames as a Cine loop, and a print button that lets the user print the current image.
  • the preferred embodiments of the present invention also provide an online help system from the system program that provides information grouped as contents, index and search.
  • the preferred embodiments of the present invention provide the steps a user needs to take to set up and modify information relating to a new patient in the system program.
  • the user can enter new patient information so that pertinent exam history can be retained when a patient is scanned.
  • a patient folder is created when one sets up a new patient which stores the patient information. All examination images for the patient are stored in this folder and are viewable by accessing the system Explorer window.
  • FIG. 43 illustrates a view of a patient information screen in a graphical user interface of a preferred embodiment.
  • the Patient Information screen is accessible by selecting the Patient Information tab from the main system window.
  • Several fields in the Patient Information screen provide drop-down arrows.
  • the drop-down arrows display a list box of choices for a field when selected.
  • the choices available in the list box are created based on new data the user enters into a field for patients each time they perform an exam. For example, one can enter a new comment or choose a previously entered comment available from the list box in the Comment field. In the examination location and clinical information fields, one can enter new data or choose a value from a list box displaying existing names or locations.
  • FIG. 44 illustrates further interface buttons in a patient interface screen.
  • the interface buttons provide the following functions: a save button lets a user save new or modified patient information, a New Patient button clears the Patient Information screen so the user can add a new patient, a cancel button stops the current function and reverts to the last saved data, and a help button provides access to the online system Help. If previous data was entered for the last patient, the data is eliminated from the screen when one clicks on New Patient. A dialog box is displayed prompting one to save the data. If the user chooses Yes, the data is saved. If the user chooses No, the screen is cleared with no changes saved. If one chooses Cancel, the operation is cancelled.
  • To view the data in the file, the user locates the specific patient information folder and clicks to select the Patient Information file. The data appears again in the Patient Information screen. If the user clicks the Cancel button while entering new patient information, the data is lost and cannot be retrieved.
  • FIG. 45 illustrates a view of a screen for adding a new patient in a graphical user interface of a preferred embodiment.
  • the user can enter information in the fields that display in the Patient Information screen.
  • a Patient Information file is created that resides in the system directory.
  • the Patient Information file is stored in the patient folder. Any associated images for a patient are also stored in the same directory.
  • the Patient Information tab can be chosen in the image area to enter new patient information.
  • the New Patient button can be chosen at the bottom to clear all previous information entered from the last patient.
  • the user can choose the exam type they want for this examination. Each time one performs an examination on the specific patient, the user can choose the new examination type. The information is stored as part of the image filename. Further, to save a new patient that has been added, the user clicks on Save. The patient information is saved in the Patient Information file and displays in the system Explorer window next to the patient folder for the patient. Important patient information the user entered in the Patient Information tab is displayed in the Image Display screen. To view the patient information, one clicks the Image Display tab. The patient information is shown across the top of the screen and is saved with scanned images one creates for the patient. The user, be it a clinician or an ultrasound technician, for example, can update information for an existing patient. First they need to retrieve the appropriate file and then make their changes.
  • Ultrasound is primarily an operator-dependent imaging technology.
  • the quality of images and the ability to make a correct diagnosis based on scans depends upon precise image adjustments and adequate control settings applied during the exam.
  • the user can optimize the image quality during a patient exam while using any of the six image modes available in the system software.
  • a two-dimensional (2D) image control setting tab that provides adjustments for size depth, focus, as well as time gain compensation is available for each of the six modes.
  • An image quality (IQ) control setting that allows the user to further adjust the image for clarity is also available for each of the six modes.
  • B-mode and M-mode images are provided.
  • the B-mode (Brightness mode) tab provides two-dimensional image adjustments that allow the user to control the size, depth, focus, and overall image gain as well as TGC (Time Gain Compensation).
  • the user can further manipulate the image quality by selecting from various palettes, smoothing, persistence settings and maps offered in the Image Quality tab menu.
  • B-mode is selected when the user wants to view only a two-dimensional display of the anatomy.
  • the B-mode provides the standard image mode options necessary for an ultrasound using the 2D and IQ (image control) settings. To select the B-mode for image display, the user chooses the B-mode button from the image mode bar, or from the Modes menu.
  • FIG. 46 illustrates an image in the B-mode including the controls provided by a graphical user interface of a preferred embodiment.
  • FIGS. 47 A- 47 H illustrate the different control interfaces for adjusting a B-mode image in the graphical user interface of the preferred embodiment.
  • the user can choose the parameters for the scan that meet the size of the patient, or structured anatomy. For example, the user clicks the T-shirt button that matches the patient size for small, medium, or large (or for superficial, moderately deep, and deep areas of interest). In the alternative, the user can access the size option from the Image menu and choose the size from the drop-down list. Selection of the appropriate scan size can provide the user with fast and easy baseline settings.
  • Other B-mode settings such as Gain, Focus and Depth have been optimized according to the T-shirt size in a preferred embodiment. All controls return to default settings when a new T-shirt size is selected. This feature makes it easy for the user to reset multiple parameters.
  • the user can control the field of view for the scanned image by using depth control. If they want to capture deeper structures, they increase the depth. If there is a large part of the exam display that is unused or not necessary at the bottom of the screen, the user decreases the depth. To select the depth, the user clicks on the down arrow next to the Depth field label, and chooses a value from the list of available options. The available depth option depends on the probe that the user is working with. To decrease the depth, the user chooses a lower value from the depth list box. Or, in the alternative, the user can access the depth option from the Image menu and choose depth from the drop-down list.
  • the depth control adjusts the user's field of view. It increases one's field of view to see larger or deeper structures. Depth control also decreases the user's field of vision to enlarge the display of structures near the skin line. After adjusting Depth, the user may want to adjust the TGC and focus control settings.
  • the user can change the position of the focal zone to specify a location for the optimal area of focus.
  • a graphic caret is positioned on the depth scale to represent the focal zone.
  • a value can be chosen from the list of values displayed in the menu.
  • the available focal position options depend on the probe being used.
  • the Focus optimizes the image by increasing the resolution for a specific area.
  • Adjusting Gain may have the effect of brightening or darkening the image if sufficient echo information is generated.
  • the user slides the bar illustrated in FIG. 47F to the right to increase, or to the left to decrease, the level. Gain allows the user to balance echo contrast so that cystic structures appear echo-free and reflecting tissue fills in.
  • a slider illustrated in FIG. 47C adjusts how the control amplifies returning signals to correct for the attenuation caused by tissues at increasing depths.
  • Each of the eight TGC slider bars are spaced proportionately to the depth.
  • the TGC curve on the image display illustrated in FIG. 47H shows the values that match the TGC control.
  • When the depth is changed, the TGC is rescaled across the new depth range. The TGC is used to balance the image so that the brightness of echoes is the same from near field to far field.
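Since the eight TGC sliders are spaced proportionately to depth, the gain applied at an arbitrary depth can be read off a piecewise-linear curve between adjacent sliders. The sketch below is a hypothetical illustration of that interpolation, not the system's actual TGC implementation.

```cpp
#include <cassert>

// Gain at a given depth from eight slider values spaced evenly over
// the imaging range, using piecewise-linear interpolation.
double tgcGainAtDepth(const double sliders[8], double depth, double maxDepth) {
    if (depth <= 0.0) return sliders[0];
    if (depth >= maxDepth) return sliders[7];
    double pos = depth / maxDepth * 7.0;       // position on the 8-point curve
    int i = static_cast<int>(pos);
    double frac = pos - i;
    return sliders[i] * (1.0 - frac) + sliders[i + 1] * frac;
}
```

Rescaling the TGC when the depth changes (as described above) amounts to re-evaluating this curve over the new maxDepth.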
  • FIG. 48 illustrates the image quality control setting provided in the B-mode image option in a preferred embodiment.
  • the user can invert the scanned image to the left or right by clicking on the Left/Right button. Further, the user can also invert the image in the top to bottom direction by clicking on the Up/Down button. It is appropriate to refer to the display to confirm the orientation of the image.
  • the palette can be adjusted by the user.
  • the user can choose from a palette color range to define the reference bar, or the user can choose to use the default conventional gray scale display.
  • the down-arrow next to the Palette field label can be clicked or selected to view the list box of color choices.
  • the color the user wants can be chosen.
  • the color scale changes to the new color and is represented in the image display.
  • the gray palette is the most frequently used palette. To determine if another palette will improve visualization of the anatomy, the user can cycle through the available options. The anatomy that is being imaged has an effect on which palette is most advantageous.
  • the user can select from a range of A to E to smooth the image display.
  • the user can click on the down-arrow next to the Smoothing field label to view the list box of values.
  • the value displays in the probe information and is the first value in A/4/E.
  • Increasing the smoothing increases the amount of interpolation between scan lines which makes the image appear smoother. It also decreases the frame rate. The opposite is true when the user decreases the amount of smoothing.
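The interpolation-between-scan-lines idea above can be sketched as follows: one interpolated line is inserted between each pair of neighboring scan lines, doubling the line density at the cost of extra computation (hence the reduced frame rate). Higher smoothing settings would insert more lines; the function name is hypothetical.

```cpp
#include <vector>
#include <cstddef>
#include <cassert>

// Insert one interpolated (averaged) line between each pair of
// neighboring scan lines. Each inner vector is one scan line.
std::vector<std::vector<double> >
interpolateLines(const std::vector<std::vector<double> >& lines) {
    std::vector<std::vector<double> > out;
    for (std::size_t i = 0; i + 1 < lines.size(); ++i) {
        out.push_back(lines[i]);
        std::vector<double> mid(lines[i].size());
        for (std::size_t s = 0; s < mid.size(); ++s)
            mid[s] = 0.5 * (lines[i][s] + lines[i + 1][s]);  // average of neighbors
        out.push_back(mid);
    }
    out.push_back(lines.back());
    return out;
}
```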
  • the user selects from a range of 0 to 4 to define the amount of image frame averaging desired.
  • the user views the list box of values.
  • the value displays in the probe information and is the second value in A/4/E.
  • When the persistence rate is high, the image appears less speckled and smoother.
  • However, increasing the persistence rate also increases the possibility that the image appears blurred if the tissue is moving when the user freezes the image.
  • When the persistence is low, the opposite is true.
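One common way to realize persistence is recursive frame averaging, sketched below. The specific weighting (persistence setting divided by five as the fraction of the old average retained) is an assumption for illustration, not the system's documented formula.

```cpp
#include <vector>
#include <cmath>
#include <cstddef>
#include <cassert>

// Recursive frame averaging: persistence 0 shows only the new frame;
// higher settings weight the running average more heavily, reducing
// speckle but blurring moving tissue, as described in the text.
void applyPersistence(std::vector<double>& average,
                      const std::vector<double>& newFrame,
                      int persistence /* 0..4 */) {
    double keep = persistence / 5.0;           // fraction of old average retained
    for (std::size_t i = 0; i < average.size(); ++i)
        average[i] = keep * average[i] + (1.0 - keep) * newFrame[i];
}
```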
  • the user can select from a range of A to F in the Map label field to change gray levels.
  • the value displays in the probe information and is the third value in A/4/E. Adjusting the map can assist in closely viewing certain anatomical features or to detect subtle pathologies.
  • the brightness is adjusted by defining the brightness range. By adjusting the brightness control to the right, the brightness of the image is increased. By adjusting the brightness control to the left, the brightness is decreased. Increasing the brightness increases the overall brightness of the image. The brightness is adjusted to correspond with map and contrast values.
  • the contrast of the image tone of the display is adjusted by defining the contrast range. Adjusting the contrast control to the right increases the contrast of the image; adjusting it to the left decreases the contrast. Increasing the contrast decreases the number of gray levels in the image and makes the image appear more contrasted. The opposite is true when the user decreases the contrast. It is recommended to adjust contrast to correspond with and complement the brightness and map values.
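The brightness and contrast adjustments above can be sketched as a per-pixel gray-level mapping. Pivoting contrast around mid-gray (128) is an assumption for illustration; the actual transfer function is not specified here:

```python
def adjust_pixel(gray, brightness=0, contrast=1.0):
    """Map an 8-bit gray level: contrast scales the value around
    mid-gray (128), brightness shifts the whole range, and the result
    is clamped to 0..255. Raising contrast pushes more values into the
    clamped extremes, so fewer distinct gray levels survive, matching
    the behavior described above."""
    v = (gray - 128) * contrast + 128 + brightness
    return max(0, min(255, int(round(v))))
```

For example, doubling the contrast maps gray level 200 to 255 (clamped), while mid-gray 128 is unchanged.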
  • FIG. 49 illustrates the M-mode image option in accordance with a preferred embodiment of the present invention.
  • the M-mode (motion mode) provides a display format and measurement capability that represents tissue displacement (motion) occurring over time along a single vector.
  • the M-mode is used to determine patterns of motion for objects within the ultrasound beam. Typically, this mode is used for viewing motion patterns of the heart.
  • in M-mode the user can view the image in B-mode at the top of the screen as well as view the M-mode depth scale in the window at the bottom of the screen.
  • the user chooses the M-mode button from the image mode bar.
  • the following image control settings can be used to make adjustments to the image display: 2D (two-dimensional), IQ (image quality), and M-mode.
  • the M-mode display is optimized by adjusting the Depth, Focus, Gain and TGC controls on the 2D tab of the Image Control bar.
  • the user can also adjust the image quality by selecting the IQ tab in the Image Control bar.
  • the user clicks on the horizontal bar between the two images and drags it to the appropriate new window size. Changing the Sweep Speed can affect the thermal index (TI) and/or mechanical index (MI).
  • the user can click on the left or right arrows next to the Scan Line Position label.
  • the scan line moves accordingly to the left or the right.
  • the M-mode cursor line can be moved manually.
  • during a scan, live images are recorded frame by frame. Depending upon the mode the user selects, a certain number of frames are recorded. For example, the B-mode allows the capture of up to 60 frames in a Cine loop.
  • when the user freezes a real-time image during a scan, all movement is suspended in the image display area.
  • the frozen frame can be saved as a single image file or an entire image loop, depending upon the mode.
  • the ultrasound images can be magnified and text annotation can be added to the image area. Further, measurements accompanying ultrasound images can be added to supplement other clinical procedures available to the attending physician. The accuracy of measurements is determined not only by the system software, but also by the use of proper medical protocols by the users. The user can create measurements for Distance, Ellipse, or Peak Systole/End Diastole depending upon the mode being used.
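A distance measurement of the kind described above can be sketched as converting two on-screen caliper positions to physical units. The per-axis pixel-size conversion is an assumption for illustration:

```python
import math

def caliper_distance(p1, p2, pixel_dx, pixel_dy):
    """Distance between two calipers placed at pixel coordinates p1 and
    p2, converted to physical units via the per-axis pixel sizes (e.g.
    millimeters per pixel). The measurement is only as accurate as the
    calibration behind these pixel sizes."""
    dx = (p2[0] - p1[0]) * pixel_dx
    dy = (p2[1] - p1[1]) * pixel_dy
    return math.hypot(dx, dy)
```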
  • Images and loops can be exported in formats that can be viewed by others who do not have the system software.
  • the preferred embodiment of the present invention can electronically mail (e-mail) image and loop files or include them as graphics in other applications.
  • the user can export images to one of the following graphic file formats: Bitmap (.bmp), and DICOM (.dcm).
  • the user can export loops to one of the following graphic file formats: Tagged Image File Format (.tif), DICOM (.dcm).
  • the Obstetrical measurement generated by the preferred embodiments of the system can easily be exported to the R4 Obstetrical reporting package.
  • Examination types in preferred embodiments are considered presets that contain standard image control settings.
  • the system provides the user with a wide variety of exam types.
  • the exam types automatically optimize multiple parameters used during a scan.
  • the examination types available depend on which probe is being used. Although several probes may provide the same examination type, the preset image control parameters are unique to the characteristics of each probe.
  • Each examination type contains three T-shirt presets: Small, Medium, and Large.
  • the T-shirt icons, or interfaces, represent predefined parameters for image control settings used with small, medium and large patients, or for superficial, moderately deep and deep areas of interest.
  • the image settings that can be optimized for each size using a two dimensional graphical interface include: depth, focus, gain, and TGC.
  • the image setting controls that are optimized for each examination type using the image quality graphical user interface include: left/right, up/down, palette, smoothing, persistence, map, brightness and contrast.
  • the image setting controls that are optimized for each examination type using the M-mode graphical user interface include: sweep speed, scan line position, and B-mode to M-mode display ratio.
  • the image setting controls that are optimized for each examination type using the Pulsed Wave Doppler (PWD) interface include: sweep speed, velocity display, PRF (Pulse repetition frequency), wall filter, steering angle, invert, correction angle, sample volume size, gain, baseline, and sound volume.
  • the image setting controls that are optimized for each examination type using the Color Doppler (CD) graphical user interface include: scan area, PRF (Pulse repetition frequency), wall filter, steering angle, invert, color gain, priority, persistence, baseline, high spatial resolution, and high frame rate.
  • the image setting controls that are optimized for each examination type using the Direction Power Doppler (DirPwr) interface include: scan area, PRF (Pulse repetition frequency), wall filter, steering angle, invert, gain, priority, persistence, baseline, high spatial resolution, and high frame rate.
  • the image setting controls that are optimized for each examination type using the PWR interface include: scan area, PRF (Pulse repetition frequency), wall filter, steering angle, gain, priority, persistence, high spatial resolution, and high frame rate.
  • Customized exam types can include specific modifications to the preset image control setting parameters.
  • the user can select a customized exam type for future use without having to perform the exact settings again.
  • Any examination type can be customized to include the specific control settings the user wishes to use.
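The preset scheme above can be sketched as a table of exam types, each carrying three "T-shirt" presets, with customization producing a modified copy. The exam name and parameter values below are hypothetical, chosen only to illustrate the structure:

```python
# Hypothetical preset table: each exam type carries Small/Medium/Large
# presets of image control settings. Values are illustrative only.
EXAM_PRESETS = {
    "abdominal": {
        "Small":  {"depth_cm": 8,  "focus_cm": 4, "gain": 50},
        "Medium": {"depth_cm": 12, "focus_cm": 6, "gain": 55},
        "Large":  {"depth_cm": 16, "focus_cm": 8, "gain": 60},
    },
}

def customize(exam, size, **overrides):
    """Return a new settings dict based on a preset, with specific user
    overrides applied; the built-in preset itself is left untouched so
    it remains available as the standard starting point."""
    settings = dict(EXAM_PRESETS[exam][size])
    settings.update(overrides)
    return settings
```

A customized exam type is then just such a dict saved under a user-chosen name, so the operator need not re-enter the settings for each study.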
  • Certain embodiments include diagnostic ultrasound images being displayed on a display monitor away from the operator's hands.
  • Larger ultrasound imaging systems have the display integrated in the main operator console. The operator is accustomed to turning his or her head between the hand holding the probe or scan head and the system console and display, or keeping his or her eyes on the display and manipulating the scan head without looking at the patient. This does not work well for some medical procedures in which the operator performs visually intensive operations at the same location on the patient as the scan head. It is therefore very beneficial to locate the display proximate to the operator's hands. A visor mounted display is a workable alternative in alternate preferred embodiments, but is deemed uncomfortable for some operators.
  • a preferred embodiment integrates the display on the hand-held scan head, thus allowing the operator to easily view the image and operate the probe or scan head, as well as perform operations in the same local area with the other hand.
  • the data/video processing unit is also compact and portable, and may be placed close to the operator or alternatively at a remote location.
  • a display is also integrated into the data/video processing unit.
  • the processing unit also provides an external monitor port for use with traditional display monitors.
  • FIG. 50 illustrates a preferred embodiment of a portable ultrasound imaging system 1900 including a hand-held scan head with integrated display and a portable processing unit.
  • the hand-held scan head assembly 1902 comprises an ultrasound transducer and a compact video display 1904 .
  • the display unit 1942 can be integrated into the hand-held scan head either directly on the scan head housing as illustrated in FIG. 51 A, or the display unit 1962 can be mounted onto the scan head via a mechanical swivel assembly 1966 as illustrated in FIG. 51B, which may be permanently attached or detachable.
  • the display can be mounted on the interface 1908 in close proximity to the probe 1904 .
  • the interface can use a system such as that shown in FIG. 3D, for example, that is modified to have a video cable and power running alongside the Firewire connection from the processor housing 5 .
  • the on-the-probe display unit such as shown in FIG. 51 A can be a 2.5 inch to 6.4 inch color LCD panel display, with a screen resolution ranging from quarter VGA (320×240 pixels) to full VGA (640×480 pixels).
  • the video data delivered to the display may be composite video via a thin coaxial cable, or digital VGA using Low Voltage Differential Signaling (LVDS) via four coaxial or twisted-pair wires.
  • the display may be powered by battery in the unit, or by DC-power supplied by the portable data/video processing unit 1946 via power wires from the processing unit.
  • the video and power wires for the display are integrated with the transducer data wires for the scan head to form a single cable assembly 1948 that connects the hand-held scan head to the portable data/video processing unit.
  • the data/video processing unit 1908 is compact and portable.
  • the beamformer electronics is an integral part of the hand-held scan head assembly, and the scan head communicates with the processing unit via a Firewire (IEEE 1394) cable as illustrated in FIG. 53A.
  • the beamformer electronics is moved inside the processing unit to further reduce the size and weight of the hand-held scan head as illustrated in FIG. 53B.
  • the processing unit in this configuration comprises a compact single board computer 1982 and the beamformer electronics as illustrated in FIG. 52.
  • the beamformer electronics includes a digital processing printed circuit board 1984 and an analog processing printed circuit board 1986 .
  • the beamforming electronics communicates with the single board computer via a Firewire (IEEE 1394) interface 1988 .
  • the compact single board computer has a printed circuit board the size of a 5¼ inch disk drive or a 3½ inch disk drive.
  • One embodiment of the present invention uses a NOVA-7800-P800 single board computer in a 5¼ inch form factor, with a low power Mobile Pentium-III 800 MHz processor, 512 Mbytes of memory, and on-board interface ports for Firewire (IEEE 1394), LAN, Audio, IDE, PCMCIA and Flash memories.
  • the entire system, including the hand-held scan head with an integrated display and the portable data/video processing unit, can be operated without any controls other than power on/off.
  • the system is equipped with an optional operator interface such as buttons and knobs, either on the processing unit, or integrated in the scan head, or both.
  • the processing unit provides an additional video output to drive an external monitor, or optionally an integrated display on the processing unit itself.
  • the hand-held scan head display may be used in a View-Finder mode wherein the operator uses the smaller hand-held display to locate the region of interest, and uses the larger monitor to observe more detailed images either in real time, or capture-then-review manner.
  • a first method is to perform conventional rectangular linear scanning over the center of the array and scan a portion of a sector field at each end of the array.
  • a second approach is to activate groups of elements of the transducer successively such that the transmitted beams form a sector scan field having the origin point located behind the transducer array.
  • the second approach has only one coordinate system, thus it simplifies the scan conversion. It also creates a more uniform image over the whole field of view. However, because all the beams originate from one point, it is difficult to have both steering angle and scan-position (tangential) increments be uniform as illustrated in FIGS. 54 and 55 , respectively.
  • the uniform angular increment approach has higher scan line density in the center of the array than on the side, so the images on the side are more degraded.
  • the scan line density of the constant tangential increment approach is uniform, but almost every line has a larger steering angle than in the uniform angular increment approach, which degrades the image quality especially in the near field.
  • a preferred embodiment creates trapezoidal image scans using an extension of the two methods.
  • By choosing different functions f(x_s) or g(θ), one can get different scan patterns.
  • Three different approaches include the following:
  • the value x_s can be calculated by solving the following equation
  • the scan lines are evenly spaced at the front face of the transducer.
  • FIG. 57 shows a comparison of the steering angle using the method of the preferred embodiment and the uniform tangential increment approach. Note that at every beam position, the steering angle using the preferred embodiment approach is smaller than the uniform tangential approach (better near field). FIG. 58 shows more clearly that the preferred embodiment approach can have uniform tangential increment and also nearly uniform steering angle increments.
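The two baseline schemes being compared can be sketched with a virtual apex one unit behind the array face (an assumed geometry for illustration). With matched endpoints, equal angular steps make the face positions follow tan() and bunch toward the center, while equal tangential steps make the steering angles follow atan() and grow faster toward the edges:

```python
import math

def uniform_angular(n, theta_max):
    """Uniform angular increments: n beam angles evenly spaced in
    [-theta_max, theta_max]; the tangential position of each beam at
    the array face (apex a unit distance behind) follows tan(), so
    line density is higher at the center than at the sides."""
    step = 2 * theta_max / (n - 1)
    angles = [-theta_max + i * step for i in range(n)]
    return angles, [math.tan(a) for a in angles]

def uniform_tangential(n, x_max):
    """Uniform tangential increments: n face positions evenly spaced
    in [-x_max, x_max]; the steering angle of each beam then follows
    atan(), giving larger steering angles at intermediate beams than
    the uniform angular scheme."""
    step = 2 * x_max / (n - 1)
    xs = [-x_max + i * step for i in range(n)]
    return [math.atan(x) for x in xs], xs
```

Comparing the two for matched endpoints shows the trade-off the text describes: the angular scheme has non-uniform face spacing, and the tangential scheme has larger steering angles at most beam positions.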

Abstract

A hand-held ultrasound system includes integrated electronics within an ergonomic housing. The electronics include control circuitry, beamforming circuitry and transducer drive circuitry. The electronics communicate with a host computer using an industry standard high speed serial bus. The ultrasonic imaging system is operable on a standard, commercially available, user computing device without specific hardware modifications, and is adapted to interface with an external application without modification to the ultrasonic imaging system to allow a user to gather ultrasonic data on a standard user computing device such as a PC, and employ the data so gathered via an independent external application without requiring a custom system, expensive hardware modifications, or system rebuilds. An integrated interface program allows such ultrasonic data to be invoked by a variety of such external applications having access to the integrated interface program via a standard, predetermined platform such as Visual Basic or C++.

Description

    CROSS REFERENCES TO RELATED APPLICATIONS
  • This application is a continuation-in-part of U.S. application Ser. No. 10/354,946 filed Jan. 30, 2003 which is a continuation-in-part of U.S. application Ser. No. 10/094,950 filed Mar. 11, 2002 which is a continuation-in-part of International Application PCT/US02/05764 filed on Feb. 22, 2002 which is a continuation-in-part of application Ser. No. 09/822,764 filed Mar. 30, 2001, which is a continuation-in-part of application Ser. No. 09/791,491 filed Feb. 22, 2001, which is a continuation-in-part of International Application No. PCT/US00/17236 filed on Jun. 22, 2000 which is a continuation-in-part of U.S. application Ser. No. 09/449,780 filed on Nov. 26, 1999 and claims the benefit of U.S. Provisional Application No. 60/140,430, filed on Jun. 22, 1999, the entire contents of the above applications being incorporated herein by reference in their entirety.[0001]
  • BACKGROUND OF THE INVENTION
  • Conventional ultrasound imaging systems typically include a hand-held probe coupled by cables to a large rack-mounted console processing and display unit. The probe typically includes an array of ultrasonic transducers which transmit ultrasonic energy into a region being examined and receive reflected ultrasonic energy returning from the region. The transducers convert the received ultrasonic energy into low-level electrical signals which are transferred over the cable to the processing unit. The processing unit applies appropriate beam forming techniques to combine the signals from the transducers to generate an image of the region of interest. [0002]
  • Typical conventional ultrasound systems include a transducer array each transducer being associated with its own processing circuitry located in the console processing unit. The processing circuitry typically includes driver circuits which, in the transmit mode, send precisely timed drive pulses to the transducer to initiate transmission of the ultrasonic signal. These transmit timing pulses are forwarded from the console processing unit along the cable to the scan head. In the receive mode, beamforming circuits of the processing circuitry introduce the appropriate delay into each low-level electrical signal from the transducers to dynamically focus the signals such that an accurate image can subsequently be generated. [0003]
  • SUMMARY OF THE INVENTION
  • A preferred embodiment of the invention provides for further improvements in portable ultrasound medical imaging systems developed for use with personal computers. In one embodiment the control circuitry and beamforming circuitry are localized in a portable assembly. Such an integrated package simplifies the cable requirements of the assembly, without adding significant weight. [0004]
  • Traditional ultrasonic imaging systems have been dedicated systems having specialized hardware for processing the large amounts of data generated by ultrasonic transducers providing input to such systems. These imaging systems tend to be unwieldy, expensive, and difficult to upgrade. Further, since dedicated systems have specialized components, it is difficult to employ the gathered ultrasound data in other contexts, such as by downloading to another application for processing and/or operations which are unavailable on the native dedicated system. Accordingly, it is beneficial to provide an ultrasonic imaging system operable on a standard, commercially available, user computing device without specific hardware modifications, and adapted to interface with an external application without modification to the ultrasonic imaging system. In this manner, a user may gather ultrasonic data on a standard user computing device such as a personal computer (PC), and employ the data so gathered via an independent external application without requiring a custom system, expensive hardware modifications, or system rebuilds. [0005]
  • A system and method for gathering ultrasonic data on a standard user computing device and employing the data via an integrated interface program allows such ultrasonic data to be invoked by a variety of external applications having access to the integrated interface program via a standard, predetermined platform such as Visual Basic or C++. [0006]
  • The system provides external application integration in an ultrasonic imaging system by defining an ultrasonic application server for performing ultrasonic operations. An integrated interface program with a plurality of entry points into the ultrasonic application server is defined. The entry points are operable to access each of the ultrasonic operations. An external application sends a command indicative of at least one of the ultrasonic operations. The command is transmitted via the integrated interface program to the ultrasonic application server. Concurrently, at periodic intervals, raw ultrasonic data indicative of ultrasonic image information is received by the ultrasonic application server over a predetermined communication interface. A result corresponding to the command is computed by the ultrasonic application server, and transmitted to the external application by the integrated interface program. [0007]
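The command flow described above can be sketched as a server exposing its ultrasonic operations through a table of entry points that the integrated interface program dispatches into. The operation names here are hypothetical, chosen only to show the dispatch pattern:

```python
class UltrasoundAppServer:
    """Sketch of an ultrasonic application server: each operation is
    reachable through a table of entry points that an integrated
    interface program exposes to external applications."""

    def __init__(self):
        self.frozen = False
        # Entry points: one per ultrasonic operation (names assumed).
        self.entry_points = {"freeze": self.freeze, "status": self.status}

    def freeze(self):
        self.frozen = not self.frozen
        return self.frozen

    def status(self):
        return {"frozen": self.frozen}

def dispatch(server, command):
    """The integrated interface program routes an external
    application's command to the matching entry point and returns the
    result computed by the server."""
    return server.entry_points[command]()
```

An external application never calls the server directly; it only names an operation, and the interface program resolves it to the corresponding entry point.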
  • An embodiment of the invention includes a probe having a plurality of circuit boards or circuit panels that are mounted within a generally rectangular cavity within a hand-held housing. The circuit panels each have one or more integrated circuits and are mounted in planes that are parallel to one another. These integrated circuits can be fabricated using a standard CMOS process that support voltage levels between 3.3 V and 200 V. [0008]
  • A particular embodiment of the invention utilizes two or three circuit boards or panels, a center panel having a center system controller and a communication link to an external processor. The center panel can be mounted between a pair of surrounding panels, each including a memory and a beamforming circuit. The system accommodates the use of different probe elements and can employ a variable power supply that is adjusted to different levels for different probes. Also, it is desirable to use a variable clock generator so that different frequencies can be selected for different probes. [0009]
  • Another preferred embodiment of the invention provides a small probe that is connected by a first cable to an interface housing. The interface housing can contain the beamformer device and associated circuits and is a small lightweight unit that can be held in one hand by the user while the other hand manipulates the probe. The probe can be any of several conventional probes that can be interchangeably connected by cable to the interface housing. Alternatively, the interface housing can be worn on the body of the user with a strap, on the forearm or the waist with a belt, for example, or in a pocket of the user. A preferred embodiment using such an interface can include two or three circuit boards as described in greater detail herein. The interface housing is connected to a personal computer by a standard FireWire or serial bus connection. [0010]
  • In another preferred embodiment, the probe incorporating the beamformer, or the probe with the interface housing can be connected to a wearable personal computer. In this embodiment, the computer performing scan conversion, post signal processing or color doppler processing is located in a housing worn by the user, such as on the forearm, on the waist or in a pocket. A power supply board can be inserted into the probe, into the interface housing or in another external pod and can include a DC-DC converter. The display system can also include a head mounted display. A hand-held controller can be connected to the computer or interface by wire or wireless connection. [0011]
  • A preferred embodiment of the invention can utilize certain safety features, including circuits that check the power supply voltage level, that test every channel of the beamformer and assist in setting gain levels, and that count pulses per second and automatically shut off the system to prevent over-radiating the patient. [0012]
  • Another preferred embodiment of the invention employs dedicated controls that the user can use to perform specific tasks during a patient study. These controls are readily accessible and intuitive in use. These controls provide for freezing or unfreezing of the image on the display, for recording an image in electronic memory, for measuring distances in two dimensions using a marker or caliper and a “set” function to fix two markers or calipers on screen, a track ball, touchpad or other manually manipulated element to control the marker, a time gain compensation control, such as 8 slide pots, to correct for sound attenuation in the body, and a scale or depth control to provide a zoom feature and selection of focal zones. [0013]
  • The system can be employed with a number of probe systems and imaging methods. These include the generation of color Doppler, power Doppler and spectral density studies. These studies can be aided by the use of contrast agents that are introduced into the body during a study to enhance the response to ultrasound signals. Such agents can also include medications that are acoustically released into the body when they are activated by specific acoustic signals generated by the probe transducer array. [0014]
  • In accordance with another aspect of the present invention, a system for ultrasonic imaging includes a probe and a computing device. The probe has a transducer array, control circuitry and a digital communication control circuit. The control circuitry includes a transmit/receive module, beamforming module and a system controller. A computing device connects to the digital communication control circuit of the probe with a communication interface. The computer processes display data. [0015]
  • The communication interface between the probe and the computing device is a wireless interface in several embodiments. In an embodiment, the wireless interface is a radio frequency (RF) interface. In another embodiment, the wireless interface is an infrared interface (IR). In an alternative embodiment, the communication interface between the probe and the computing device is a wired link. [0016]
  • In a preferred embodiment, the beamforming module is a charge domain processor beamforming module. The control circuitry has a pre-amp/time-gain compensation (TGC) module. [0017]
  • A supplemental display device is connected to the computing device by a second communication interface. The supplemental display device is a computing device in several embodiments. At least one of the communication interfaces can be a wireless interface. [0018]
  • In an embodiment, the communication between the probe and the computing device is a wireless interface. The second communication interface between the supplemental display device and the computing device is wireless. In an embodiment, the second communication interface includes a hub to connect a plurality of secondary supplemental devices. [0019]
  • In another preferred embodiment, the ultrasonic imaging system includes a handheld probe system which is in communication with a remotely located computing device. The computing device can be a handheld portable information device such as a personal digital assistant provided by Compaq or Palm, Inc. The communication link between the probe and the computing device is a wireless link such as, but not limited to, IEEE 1394 (FireWire). The computing device may be used for controlling, monitoring or displaying ultrasonic imaging data. [0020]
  • A method of controlling an ultrasonic imaging system from a unitary operating position facilitates ultrasonic image processing by defining ultrasonic imaging operations and defining a range of values corresponding to each of the ultrasonic imaging operations. An operator then selects, via a first control, one of the ultrasonic imaging operations, and then selects, via a second control, a parameter in the range of values corresponding to the selected ultrasonic imaging operation. The ultrasonic imaging system applies the selected ultrasonic imaging operation employing the selected parameter. In this manner, the operator produces the desired ultrasonic image processing results by employing both the first control and the second control from a common operating position from one hand, thereby allowing the operator to continue scanning with a free hand while continuing to control the ultrasonic imaging system. [0021]
  • The ultrasonic imaging system is controlled from a control keypad accessible from one hand of the operator, or user. The other hand of the operator may therefore be employed in manipulating an ultrasonic probe attached to the ultrasonic imaging system for gathering ultrasonic data employed in the ultrasonic imaging operations. The first control allows qualitative selection of the various ultrasonic imaging operations which may be invoked using the system. The second control allows quantitative selection of parameters along a range to be employed in the ultrasonic operation. The range of parameters may be a continuum, or may be a series of discrete values along the range. The control keypad includes two keys for scrolling through the qualitative ultrasonic operations, and two keys for selecting the quantitative parameters along the corresponding range. [0022]
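The two-control scheme above can be sketched as a small state machine: one key pair scrolls through the qualitative operations, the other steps the selected operation's value through its discrete range. The operations and ranges below are illustrative, not the system's actual set:

```python
class KeypadController:
    """Two-control model for one-handed operation: next_op() models the
    qualitative key pair (which operation), step_value() models the
    quantitative key pair (which value in that operation's range).
    Each operation remembers its own current value."""

    def __init__(self):
        # Illustrative operations and discrete ranges (assumed values).
        self.ops = ["depth", "gain", "focus"]
        self.ranges = {"depth": [4, 8, 12, 16],
                       "gain": list(range(0, 101, 10)),
                       "focus": [2, 4, 6]}
        self.op_idx = 0
        self.val_idx = {op: 0 for op in self.ops}

    def next_op(self):
        self.op_idx = (self.op_idx + 1) % len(self.ops)
        return self.ops[self.op_idx]

    def step_value(self, delta):
        op = self.ops[self.op_idx]
        r = self.ranges[op]
        # Clamp to the ends of the range rather than wrapping.
        self.val_idx[op] = max(0, min(len(r) - 1, self.val_idx[op] + delta))
        return r[self.val_idx[op]]
```

Because both controls sit under one hand, the operator can step through operations and values without releasing the probe held in the other hand.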
  • The ultrasonic imaging system in accordance with preferred embodiments may be used for patient monitoring systems such as bedside monitoring system, pacemaker monitoring, for providing image guided implants, and pacemaker implantation. Further, preferred embodiments of the systems of the present invention may be used for cardiac rhythm management, for radiation therapy systems and for image guided surgery, such as, but not limited to, image guided neurosurgery, breast biopsy and computer enabled surgery. [0023]
  • The ultrasonic imaging operations which may be invoked include scanning operations, to be applied to live, real time ultrasonic image gathering, and processing operations, which may be applied to live or frozen ultrasonic images. Typical scanning ultrasonic imaging operations which are known to those skilled in the art and which may be applied by the ultrasonic imaging system include size, depth, focus, gain, Time Gain Compensation (TGC) and TGC lock. Typical processing ultrasonic imaging operations include view, inversion, palette, smoothing, persistence, map, and contrast. [0024]
  • Preferred embodiments of the present invention include control and data transfer methods that allow a third party Windows® based application to control, for example, a portable Windows® based ultrasound system by running the ultrasound application as a background task, sending control commands to the ultrasound application server and receiving images (data) in return. Further, the embodiment configures a portable ultrasound Windows® based application as a server of live ultrasound image frames supplying another Windows® based application that acts as a client. This client application receives these ultrasound image frames and processes them further. In addition, an alternate embodiment configures the portable ultrasound Windows® based application as a server, interacting with a third party client application via two communication mechanisms, for example, a component object model (COM) automation interface used by third party, hereinafter referred to as an external application or a client to startup and control the portable ultrasound Windows® based application and a high-speed shared memory interface to deliver live ultrasound images. [0025]
  • A preferred embodiment includes and configures a shared memory interface to act as a streaming video interface between a portable Windows® based Ultrasound application and another third party Windows® based application. This streaming video interface is designed to provide ultrasound images to a third party client in real-time. [0026]
  • A preferred embodiment allows the third party Windows® based application to control the flow rate of images from the portable ultrasound Windows® based application through the shared memory interface within the same PC platform and the amount of memory required to implement this interface. These controls consist of a way to set the number of image buffers, the size of each buffer and the rate of image transfer. This flow rate control can be set for zero data loss thus ensuring that every frame is delivered to the third party Windows® based application from the ultrasound system, or minimum latency thus delivering the latest frame generated by the ultrasound system to the third party Windows® based application first. [0027]
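The two flow-rate policies described above can be sketched with a bounded frame buffer. This is a simplification of the shared memory interface (a plain in-process queue rather than cross-process shared memory), intended only to show the zero-loss versus minimum-latency behavior:

```python
from collections import deque

class FrameQueue:
    """Bounded buffer between the ultrasound server and a client.
    In 'zero_loss' mode a full buffer rejects the new frame (the
    producer must retry), so no frame is ever dropped; in
    'min_latency' mode the oldest frame is discarded so the client
    always receives the newest data first."""

    def __init__(self, n_buffers, mode="zero_loss"):
        self.q = deque()
        self.n = n_buffers
        self.mode = mode

    def put(self, frame):
        if len(self.q) == self.n:
            if self.mode == "zero_loss":
                return False        # producer must retry later
            self.q.popleft()        # min_latency: drop the oldest frame
        self.q.append(frame)
        return True

    def get(self):
        return self.q.popleft() if self.q else None
```

The number of buffers corresponds to the configurable buffer count in the text; the two `put` behaviors correspond to the zero-data-loss and minimum-latency settings.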
  • A preferred embodiment formats the ultrasound image frame such that probe, spatial, and temporal information can be interpreted by the third party Windows® based application as it retrieves the images (generated by the portable ultrasound Windows® based application) from the shared memory interface. The actual image data passed between the server (i.e. portable ultrasound application) and the client application (third party Windows® based application) is a Microsoft® device independent bitmap (DIB) with 8 bit pixels and a 256 entry color table. The image frame also contains a header that provides the following additional information, for example, but not limited to, Probe Type, Probe Serial Number, Frame Sequence Number, Frame Rate, Frame Timestamp, Frame Trigger Timestamp, Image Width (in pixels), Image Height (in pixels), Pixel Size (in X and Y), Pixel Origin (x,y location of the first pixel in the image relative to the transducer head), and Direction (spatial direction along or across each line of the image). [0028]
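The frame header fields listed above can be sketched as a packed binary record. The field order, widths, and little-endian layout below are hypothetical choices made for illustration; the text does not specify the actual wire format:

```python
import struct
from collections import namedtuple

# Hypothetical fixed layout for the frame header fields listed above
# (little-endian, no padding): probe type, 16-byte serial, sequence
# number, frame rate, two timestamps, width/height, per-axis pixel
# size, pixel origin, and scan direction.
HEADER_FMT = "<H16sIfddHHffffh"

FrameHeader = namedtuple("FrameHeader",
    "probe_type probe_serial frame_seq frame_rate timestamp "
    "trigger_timestamp width height pixel_dx pixel_dy "
    "origin_x origin_y direction")

def pack_header(h):
    """Serialize a FrameHeader to bytes for the shared memory frame."""
    return struct.pack(HEADER_FMT, *h)

def unpack_header(buf):
    """Recover a FrameHeader from the bytes preceding the DIB data."""
    return FrameHeader(*struct.unpack(HEADER_FMT, buf))
```

A client retrieving a frame would unpack this header first, then interpret the 8-bit DIB pixel data that follows using the width, height, and pixel-size fields.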
  • Further, the preferred embodiment controls the shared memory interface used to transfer ultrasound images between a Windows® based portable ultrasound system and a third party Windows® based system through the use of ActiveX controls. The Windows® based portable ultrasound application contains an ActiveX control that transfers a frame into the shared memory and sends out a Windows® Event (that includes a pointer to the frame just written) to the third party Windows® based application. This third party application has a similar ActiveX control that receives this Event and pulls the image frame out of shared memory. [0029]
  • In accordance with a preferred embodiment the present invention includes a method for providing streaming video in an ultrasonic imaging system including providing an ultrasonic application server having at least one ultrasonic operation and corresponding ultrasonic data. The method further includes sending, from an external application, a command indicative of one of the ultrasonic operations, generating, in the ultrasonic application server, a result corresponding to the command, and sending data from the ultrasonic application server to the external application. A shared memory is in communication with the ultrasonic application server and the external application. The method further includes providing an integrated interface program having a plurality of entry points into the application server, transmitting via the integrated interface program a command to the ultrasonic application server, receiving over a predetermined communication interface, ultrasonic data indicative of ultrasonic image formation and transmitting the result to the external application via the integrated interface program. [0030]
  • The integrated interface program is adapted to transmit real-time imaging data including ultrasonic imaging for radiation therapy planning and treatment, minimally invasive and robotic surgery methods including biopsy procedures, invasive procedures such as catheter introduction for diagnostic and therapeutic angiography, fetal imaging, cardiac imaging, vascular imaging, imaging during endoscopic procedures, imaging for telemedicine applications, imaging for veterinary applications, cryotherapy and ultrasound elastography. [0031]
  • In preferred embodiments, the streaming video includes radio frequency data, real-time image data and transformation parameters. The external application can reside on the same computing device as the ultrasonic application server or be resident on a different computing device. The external application communicates with the ultrasonic application server using a control program using a component object model automation interface and a shared memory interface. [0032]
  • The command in the method for providing streaming video includes operations selected from the group consisting of ultrasound application initialization/shutdown functions such as, for example, start ultrasound application, load preset files, exit application; ultrasound setup functions such as, for example, set shared memory parameters, initialize communication to shared memory, set image frame size, set shared memory size, set transfer priority (for low latency, high throughput, or first in, first out), set image resolution and format; and ultrasound image capture functions such as, for example, freeze live data, fetch live data, and resume live imaging. [0033]
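As an illustration only, the command groups above can be modeled as methods on a server stub. The Python names below are paraphrases of the listed functions; the real interface is a COM automation/ActiveX interface, which this plain-Python sketch only approximates:

```python
class UltrasoundServerStub:
    """Hypothetical stand-in for the ultrasound application server,
    grouping the command families listed in the text."""

    def __init__(self):
        self.live = True
        self.shared_mem = {}

    # --- initialization / shutdown functions ---
    def start_application(self): pass
    def load_preset_files(self, path): pass
    def exit_application(self): self.live = False

    # --- setup functions (shared memory parameters) ---
    def set_shared_memory(self, num_buffers, buffer_size, priority="fifo"):
        # priority: "low_latency", "high_throughput", or "fifo" (first in, first out)
        self.shared_mem = dict(num_buffers=num_buffers,
                               buffer_size=buffer_size, priority=priority)

    # --- image capture functions ---
    def freeze(self):
        self.live = False

    def resume(self):
        self.live = True

    def fetch_live_frame(self):
        # Returns a dummy frame of the configured size while live, else None.
        if not self.live:
            return None
        return b"\x00" * self.shared_mem.get("buffer_size", 0)
```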
  • The ultrasonic application server includes a graphical user interface having image control presets which are operable to store image settings. The image settings include application controls such as, for example, image mode, patient name, patient ID; B-mode controls, for example, size, depth, focus, TGC, change examination type; M-mode controls, for example, sweep speed, scan line position; image quality controls, for example, brightness, contrast, invert, palette, smoothing, persistence; and Doppler controls, for example, color region of interest, pulse repetition rate, wall filter, steering angle, color gain, color invert, color priority, color baseline and line density control. [0034]
  • In a preferred embodiment, in certain applications such as angiographic monitoring, it is desirable for the system user to remain visually focused on the patient and not be distracted by the need to view a display outside the user's field of view. For such applications, a display can be integrated into the probe housing and/or the interface housing. [0035]
  • The foregoing and other objects, features and advantages of the invention will be apparent from the following more particular description of preferred embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.[0036]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic block diagram of an integrated probe system. [0037]
  • FIGS. 2A-2C illustrate a particular embodiment of packaging integrated probe electronics. [0038]
  • FIG. 3A is a schematic block diagram of a particular embodiment of an integrated probe system. [0039]
  • FIGS. 3B and 3C illustrate embodiments of the transmit/receive circuit. [0040]
  • FIG. 3D illustrates an alternate embodiment in which the probe housing is separated from the interface housing by a cable. [0041]
  • FIG. 4A is a block diagram of a particular 1-dimensional time-domain beamformer. [0042]
  • FIG. 4B illustrates another preferred embodiment of a beamformer in accordance with the invention. [0043]
  • FIG. 5A is a functional block diagram of the system controller of FIG. 3. [0044]
  • FIG. 5B schematically illustrates a timing diagram for the control of modules in the system. [0045]
  • FIG. 6 shows a block diagram of an ultrasonic imaging system adapted for external application integration as defined by the present claims. [0046]
  • FIG. 7A shows an integrated interface program operable for use with a local external application. [0047]
  • FIG. 7B shows an integrated interface program operable for use with a remote external application. [0048]
  • FIG. 8 shows a flowchart of external application integration as defined herein. [0049]
  • FIG. 9 shows a graphical user interface (GUI) for use with the ultrasonic imaging system as defined herein. [0050]
  • FIG. 10 is a preferred embodiment of a portable ultrasound system in accordance with the invention. [0051]
  • FIG. 11 illustrates a wearable or body mounted ultrasound system in accordance with the invention. [0052]
  • FIG. 12 illustrates an interface system using a standard communication link to a personal computer. [0053]
  • FIG. 13 shows the top-level screen of a graphical user interface (GUI) for controlling the ultrasonic imaging system. [0054]
  • FIG. 14 shows a unitary control keypad for use in conjunction with the GUI of FIGS. 15A-15B. [0055]
  • FIG. 15A shows a graphical user interface (GUI) for controlling the scanning operations of the ultrasonic imaging system. [0056]
  • FIG. 15B shows a graphical user interface (GUI) for controlling the processing operations of the ultrasonic imaging system; and [0057]
  • FIG. 16 shows a state diagram corresponding to the GUI of FIGS. 15A-15B. [0058]
  • FIG. 17A is a block diagram illustrating an ultrasound imaging system with wired and wireless communication. [0059]
  • FIG. 17B is a block diagram illustrating an ultrasound imaging system with wireless and wired communication. [0060]
  • FIG. 17C is a block diagram illustrating an ultrasound imaging system with wireless communication. [0061]
  • FIG. 18 is a block diagram illustrating an ultrasound imaging system with a remote or secondary controller/viewer and wireless communication. [0062]
  • FIG. 19 is a block diagram illustrating an ultrasound imaging system with wired and wireless network communication capability. [0063]
  • FIG. 20 is a diagram illustrating further details of the architecture of the ultrasound imaging system in accordance with a preferred embodiment of the present invention. [0064]
  • FIG. 21 is a diagram of a wireless viewer graphical user interface in accordance with a preferred embodiment of the present invention. [0065]
  • FIG. 22 is a diagram of a facility wide ultrasound image distribution system in accordance with a preferred embodiment of the present invention. [0066]
  • FIG. 23 is a diagram illustrating an ultrasound imaging system in accordance with a preferred embodiment of the present invention. [0067]
  • FIG. 24 is a block diagram illustrating a personal digital assistant (PDA) in communication with the host computer or probe system in accordance with preferred embodiment of the present invention. [0068]
  • FIGS. 25A-25C illustrate an ultrasound system in accordance with a preferred embodiment of the present invention integrated with an angiography system, a high frequency image of the carotid artery with directional power Doppler and an image of the carotid artery with simultaneous quantitative spectral Doppler, respectively. [0069]
  • FIGS. 26A and 26B illustrate an ultrasound image of vessel walls in accordance with a preferred embodiment of the system of the present invention and a catheter used with the system, respectively. [0070]
  • FIGS. 27A and 27B illustrate a radiation planning system integrating the ultrasound system in accordance with preferred embodiments of the present invention and the probe of the ultrasound system, respectively. [0071]
  • FIGS. 28A and 28B illustrate an ultrasonic imaging system for cryotherapy in accordance with a preferred embodiment of the present invention and a probe used in the system, respectively. [0072]
  • FIG. 29 is a schematic diagram illustrating a robotic imaging and surgical system integrating the ultrasound system in accordance with a preferred embodiment of the present invention. [0073]
  • FIG. 30 is a schematic diagram illustrating an imaging and telemedicine system integrating the ultrasound system in accordance with a preferred embodiment of the present invention. [0074]
  • FIGS. 31A and 31B are three-dimensional images from fetal imaging obtained from an ultrasound system in accordance with a preferred embodiment of the present invention. [0075]
  • FIG. 32 is a block diagram illustrating the structure of the physical shared memory in accordance with a preferred embodiment of the present invention. [0076]
  • FIG. 33 is a schematic block diagram of the processing flow between the server side, the client side and the shared memory control in accordance with a preferred embodiment of the present invention. [0077]
  • FIG. 34 is a view of a graphical user interface of the Autoview user interface in accordance with a preferred embodiment of the present invention. [0078]
  • FIG. 35 illustrates a view of a main screen display of a graphical user interface in accordance with a preferred embodiment of the present invention. [0079]
  • FIGS. 36A-36C are views of a graphical user interface showing the icons used to control the size of windows, and creation of floating windows in accordance with a preferred embodiment of the present invention. [0080]
  • FIGS. 37A and 37B are views of a graphical user interface illustrating a patient folder and an image folder directory in accordance with a preferred embodiment of the present invention. [0081]
  • FIG. 38 illustrates a tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention, whereby different modes of imaging can be selected. [0082]
  • FIG. 39 illustrates a measurement tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention. [0083]
  • FIG. 40 illustrates a playback tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention. [0084]
  • FIGS. 41A and 41B illustrate Live/Freeze interface buttons in a graphical user interface in accordance with a preferred embodiment of the present invention. [0085]
  • FIG. 42 illustrates a file tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention. [0086]
  • FIG. 43 illustrates a view of patient information screen in a graphical user interface in accordance with a preferred embodiment of the present invention. [0087]
  • FIG. 44 illustrates further interface buttons in a patient interface screen in accordance with a preferred embodiment of the present invention. [0088]
  • FIG. 45 illustrates a view of a screen for adding a new patient in a graphical user interface in accordance with a preferred embodiment of the present invention. [0089]
  • FIG. 46 illustrates an image in the B-mode including the controls provided by a graphical user interface in the B-mode in accordance with a preferred embodiment of the present invention. [0090]
  • FIGS. 47A-47H illustrate the control interfaces for adjusting a B-mode image in a graphical user interface in accordance with a preferred embodiment of the present invention. [0091]
  • FIG. 48 illustrates the image quality control setting provided in the B-mode image option in accordance with a preferred embodiment of the present invention. [0092]
  • FIG. 49 illustrates an M-mode image and the controls provided to adjust the M-mode image in accordance with a preferred embodiment of the present invention. [0093]
  • FIG. 50 illustrates a portable ultrasound imaging system including a hand-held probe with an integrated display in accordance with a preferred embodiment of the present invention. [0094]
  • FIGS. 51A and 51B are embodiments illustrating a hand-held scan head having a display unit integrated on the scan head housing or the display unit being attached to the scan head, respectively, in accordance with the present invention. [0095]
  • FIG. 52 illustrates the single board computer and beamformer circuits that form the processing unit in accordance with a preferred embodiment of the present invention. [0096]
  • FIGS. 53A and 53B illustrate alternate preferred embodiments wherein the beamformer electronics are housed either in the hand-held scan head assembly or in the processing unit, respectively, in accordance with the present invention. [0097]
  • FIG. 54 illustrates a trapezoidal scan format using a uniform angular increment with non-uniform tangential increment. [0098]
  • FIG. 55 illustrates a trapezoidal scan format using uniform tangential increment with non-uniform angle increment. [0099]
  • FIG. 56 schematically describes preferred embodiments for generating trapezoidal scan formats in accordance with the present invention. [0100]
  • FIG. 57 illustrates the steering angle comparison wherein the solid line equals even scan line space with adjustable steering angle (α=0.5), dashed line equals uniform tangential increment. [0101]
  • FIG. 58 illustrates the steering angle as a function of scan line for the uniform scan line position approach.[0102]
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 1 is a schematic block diagram of an integrated probe system. Illustrated are a target object 1, a front-end probe 3, a host computer 5, and a supplemental display/recording device 9. The front-end probe 3 integrates a transducer array 10 and control circuitry into a single hand-held housing. The control circuitry includes a transmit/receive module 12, a pre-amp/time-gain compensation (TGC) module 14, a charge domain processor (CDP) beamforming module 16, and a system controller 18. Memory 15 stores program instructions and data. The CDP beamformer integrated circuit 16 includes a computational capacity that can be used to calculate the delay coefficients used in each channel. The probe 3 interfaces with the host computer 5 over a communications link 40, which can follow a standard high-speed communications protocol, such as the FireWire (IEEE P1394 Standards Serial Interface) or fast (e.g., 200 Mbits/second or faster) Universal Serial Bus (USB 2.0) protocol. The standard communication link to the personal computer operates at least at 100 Mbits/second or higher, preferably at 200 Mbits/second, 400 Mbits/second or higher. Alternatively, the link 40 can be a wireless connection such as an infrared (IR) link. The probe 3 thus includes a communications chipset 20. [0103]
  • The components in the portable ultrasound system require a continuous source of data for correct operation. For instance, the beamformer 16 requires steering data, the transmit circuitry 12 requires data to instruct it where to focus the next pulse and when to fire, and the TGC 14 needs to know what gain level is appropriate at the given time. Additionally, further information may be required synchronous to the scanning operation to control how the beamformed data is sent back to the host. For instance, a DATAVALID signal can be helpful to reduce the amount of data that the host 5 actually has to process. Along with data, the various parts of the ultrasound system rely on common synchronization for the system to work in harmony. For example, the transmitter must be fired at an exact time with respect to when the beamformer is looking at a particular position. [0104]
  • Engineering goals of the ultrasonic probe include small size, thermal management, low-power consumption, and the capability and flexibility to allow efficient high resolution imaging as well as calibration and experimentation. The small size and low-power operation implies dense storage. The capability and flexibility entails the ability to use irregular firing sequences, concurrent reprogramming and use for seamless adaptive beamforming modes, as well as full flexibility to perform debugging and complete-set imaging. Ergonomic, economic portable design also requires a cost-effective, non-encumbering connection between the scan head 3 and the PC host 5. A general description of the probe system can be found in International Application PCT/US96/11166, filed on Jun. 28, 1996, in U.S. application Ser. No. 08/981,427 filed on Dec. 29, 1997 now U.S. Pat. No. 5,964,709 issued on Oct. 12, 1999, in U.S. application Ser. No. 08/599,816 filed on Feb. 12, 1996 now U.S. Pat. No. 5,690,114 issued on Nov. 25, 1997, in U.S. application Ser. Nos. 08/496,804 and 08/496,805 both filed on Jun. 29, 1995, now U.S. Pat. Nos. 5,590,658 and 5,839,442, respectively, issued Jan. 7, 1997 and Nov. 24, 1998, respectively, and further embodiments are described in U.S. application Ser. No. 09/364,699 filed Jul. 30, 1999, now U.S. Pat. No. 6,292,433 issued on Sep. 18, 2001, in International Application No. PCT/US98/02291 filed on Feb. 3, 1998, and in U.S. application Ser. No. 09/447,144 filed on Nov. 23, 1999 now U.S. Pat. No. 6,379,304 issued on Apr. 30, 2002, in International Application No. PCT/US97/24291 filed on Dec. 23, 1997 the above patents and applications being incorporated herein by reference in their entirety. [0105]
  • Additional factors of interest include ease, speed, and low-cost of design and manufacturing. These factors motivate the use of a Field Programmable Gate Array (FPGA) architecture. Additionally, they involve the use of a design that can be extended easily to diverse applications. [0106]
  • FIGS. 2A-2C illustrate a particular embodiment of integrated probe electronics. FIG. 2A is a perspective view showing a transducer array housing 32, an upper circuit board 100A, a lower circuit board 100B, and a central circuit board 200. Also shown is a lower Molex connector 150B carrying data and signal lines between the central circuit board 200 and the lower circuit board 100B. The transducer array housing 32 can be a commercially available unit having a pair of flexible cable connectors 120A, 120B (see FIG. 2C) connected to the upper board 100A and lower board 100B, respectively, with strain relief. FIG. 2B is a back-end view of the probe, which also shows an upper Molex connector 150A. FIG. 2C is a side view of the probe. Using 8 mm high Molex connectors 150A, 150B, the entire stack has a thickness of approximately 30 mm or less, with this particular embodiment being about 21 mm. [0107]
  • Small size is achieved through the use of modern fabrication and packaging techniques. For example, by exploiting modern semiconductor fabrication techniques, numerous circuit functions can be integrated onto single chips. Furthermore, the chips can be mounted using space-saving packaging, such as chip-on-board technology. As technology improves, it is expected that the size of the electronic components will decrease further. [0108]
  • More functionality can be included within the hand-held probe, such as a wireless IEEE 1394 connection to the personal computer. A display can be mounted directly on the hand-held probe, for example, to provide a more usable and user-friendly instrument. [0109]
  • FIG. 3A is a schematic block diagram of a particular embodiment of an integrated probe system. The host computer 5 can be a commercially available personal computer having a microprocessor CPU 52 and a communications chipset 54. A communications cable 40 is connected through a communications port 56 to the communications chipset 54. [0110]
  • The front-end probe 3′ includes a transducer head 32, which can be an off-the-shelf commercial product, and an ergonomic hand-held housing 30. The transducer head 32 houses the transducer array 10. The housing 30 provides a thermally and electrically insulated molded plastic handle that houses the beamforming and control circuitry. [0111]
  • The beamforming circuitry, as shown, can be embodied in a pair of analog circuit boards 100A, 100B. Each analog circuit board 100A, 100B includes a respective transmit/receive chip 112A, 112B; a preamp/TGC chip 114A, 114B; and a beamformer chip 116A, 116B; all of which are interconnected with a pair of memory chips 115A-1, 115B-1, 115A-2, 115B-2 via an operational bus 159A, 159B. In a particular embodiment of the invention, the memory chips are Video Random Access Memory (VRAM) chips and the operational bus is 32 bits wide. Furthermore, the preamp/TGC chips 114 and beamformer chips 116 operate on 32 channels simultaneously. The transmit/receive chips 112 include a 64 channel driver and a 64-to-32 demultiplexer. [0112]
  • FIG. 4A is a block diagram of a particular 1-dimensional time-domain beamformer. The beamformer 600 features 32-channel programmable apodized delay lines. In addition, the beamformer 600 can include on-chip output bandpass filtering and analog-to-digital conversion. [0113]
  • As illustrated in FIG. 4A, the beamformer 600 includes a plurality of single channel beamforming processors 620-1, . . . , 620-J. Imaging signals are represented by solid leader lines, digital data is represented by dashed leader lines, and clock and control signals are illustrated by alternating dot and dash leader lines. A timing controller 610 and memory 615 interface with the single channel beamforming processors 620. Each single channel beamforming processor includes clock circuitry 623, memory and control circuitry 625, a programmable delay unit with sampling circuitry 621, and a multiplier circuit 627. [0114]
  • Each programmable delay unit 621 receives an imaging signal echo E from a respective transducer element. The outputs from the single channel beamforming processors 620 are added in a summer 630. A finite impulse response (FIR) filter 640 processes the resulting imaging signal, which is digitized by the analog-to-digital (A/D) converter 650. In a particular embodiment of the invention, both the FIR filter 640 and the A/D converter 650 are fabricated on chip with the beamforming processors 620. [0115]
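The per-channel delay, apodization (the multiplier circuit 627), and summation (summer 630) amount to delay-and-sum beamforming, which can be illustrated with a toy sketch; the real hardware applies fine programmable delays in the charge domain, so this whole-sample-delay version, with hypothetical names, only shows the principle:

```python
def delay_and_sum(channels, delays, weights):
    """Toy delay-and-sum: each channel's echo is delayed by a whole number
    of samples, scaled by an apodization weight, and summed across channels."""
    n = len(channels[0])
    out = [0.0] * n
    for sig, d, w in zip(channels, delays, weights):
        for i in range(n):
            if 0 <= i - d < n:           # shift channel by d samples
                out[i] += w * sig[i - d]  # weighted (apodized) contribution
    return out
```

When the delays compensate for the differing echo arrival times, the per-channel impulses align and add coherently, which is the focusing effect the programmable delay lines provide.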
  • The choice of a Field Programmable Gate Array (FPGA) implementation, as well as extensibility for ease of modification, points to the use of VRAMs for the memory modules. VRAM is a standard Dynamic RAM (DRAM) with an additional higher-speed serial access port. While DRAM has two basic operations, for example, read and write memory location, VRAM adds a third operation: transfer block to serial readout register. This transfers a block (typically 128 or 256 words) of data to the serial readout register, which can then be clocked out at a constant rate without further tying up the DRAM core. Thus refresh, random access data read/write, and sequential readout can operate concurrently. Alternate embodiments may include synchronous DRAM (SDRAM) memory. [0116]
  • In the probe 3′, dual-ported operation is beneficial so the data loading performed by the host 5 can be decoupled from data sent to memory modules. A modular architecture which allows additional VRAMs to be added in order to obtain additional bandwidth is useful, particularly when the exact data rate requirements may change. Using wide memories, the data does not have to be buffered before going to the various destination modules in the system. A particular embodiment uses five 256 Kword by 16 bit VRAMs which yields a total of 80 output lines. If fewer output lines are required, fewer VRAMs can be used. If more output lines are required, only very minor modifications to the controller have to be made. [0117]
  • The downside is that VRAM is lower density than other varieties of DRAM. Currently only 512 Kbyte VRAM chips are available. Synchronous DRAM (SDRAM) is 2 Mbyte/chip, but requires buffering of all data from the memory to the various destination modules because its readout is not continuous. The use of SDRAM implies that the modules accept data bursts instead of continuous data. Additionally, more buffering of host data can be used or else concurrent readout and loading may not be possible. Using a multiple data rate feature in the controller can reduce the storage requirements, making VRAM suitable for a first embodiment. However, a further preferred embodiment uses SDRAM to provide further improvements in the speed and capacity of the system. [0118]
  • The control circuitry, as shown in FIG. 3A, is embodied in a digital circuit board 200. The digital circuit board 200 includes a FireWire chipset 220, a system control chip 218 to control the scan head, and a memory chip 215. In a particular embodiment of the invention, the memory chip 215 is a VRAM chip and the system control chip 218 is interconnected to the various memory chips 115, 215 over a control bus 155, which in this particular application is 16 bits wide. [0119]
  • As illustrated, the system control chip 218 provides scan head control signals to the transmit/receive chips 112A, 112B over respective signal lines 152A, 152B. The transmit/receive chips 112A, 112B energize the transducer array 10 over transmit lines 124A, 124B. Received energy from the transducer array 10 is provided to the transmit/receive chips 112A, 112B over receive lines 122A, 122B. The received signals are provided to the pre-amp/TGC chips 114A, 114B. After being amplified, the signals are provided to the beamformer chips 116A, 116B. Control signals are exchanged between the beamformer and the system controller over signal lines 154A, 154B to adjust the scan beam. [0120]
  • The five VRAM chips 115A-1, 115A-2, 115B-1, 115B-2, 215 serve to supply the real-time control data needed by the various operating modules. The term “operating modules” refers to the different parts of the system that require control data, namely the beamformers 116A, 116B, transmit/receive chips 112A, 112B, and preamp/TGC chips 114A, 114B. The system controller 218 maintains proper clocking and operation of the VRAM to assure continuous data output. Additionally, it generates clocks and control signals for the various operating modules of the system so that they know when the data present at the DRAM serial port output is for them. Finally, it also interfaces with the host (PC) 5 via a PC communications protocol (e.g., FireWire or high speed bus) to allow the host 5 to write data into the VRAM. [0121]
  • Some of the VRAMs are shared by multiple modules. The 64-bit output of four VRAMs 115A-1, 115A-2, 115B-1, 115B-2 is used by both the transmit module and the beamformer. This is not a problem, because typically only one requires data at any given time. Additionally, the transmit module chip uses relatively little data, so it would be wasteful to dedicate entire VRAMs to transmit operations. In order to allow the VRAM data to be shared by multiple modules, codes are embedded in the VRAM data that the controller deciphers in order to assert the appropriate MODCLOCK line. [0122]
  • The fifth VRAM 215 is used to generate data that is not shared by multiple modules. For example, it is convenient to put the control for the TGC here because that data is required concurrently with beamformer data. It can also be useful to have one dedicated control bit which indicates when valid data is available from the beamformer and another bit indicating frame boundaries. Thus, because the location of the data in the VRAM corresponds to the position in the frame scanning sequence, additional bits are synchronized with the operation of the system. CCD clock enable signals can also be generated to gate the CCD clock to conserve power. Lastly, the VRAM can be used to generate test data for a D/A converter to test the analog circuitry with known waveforms. [0123]
  • As the system is reduced in size, the number of VRAMs may be reduced. In a SDRAM system clocked twice as fast, the four shared VRAM chips may be merged into two SDRAM chips in a 128 line system, for example. [0124]
  • The data sent to the beamformer and transmit modules are bit-serial within a channel, with all channels being available in parallel. For the transmit module, two transmit channels share each bit line with alternating clocks strobing in data for the two channels. All per channel transmit module coefficients (such as start time) are presented bit-serially. [0125]
  • The data in the VRAM is organized into runs. A run consists of a one word header, which is interpreted by the VRAM controller, followed by zero or more actual data words which are used by the various modules. The headers (see Table 1) specify where the data in the run is destined, how fast it should be clocked out, and how many values there are in the run. (Note that the run destination is only for the data coming out of the 4 VRAMs. The bits coming out of the controller VRAM always have the same destinations.) The headers are also used to encode the special instructions for Jump, Pause, and End described below. [0126]
    TABLE 1
    VRAM Instruction Data Format (Only top VRAM matters)

    Command   Bits 15-13      Bits 12-11       Bits 10-0
    -------   -------------   --------------   ----------------------
    Data      Mod Sel (2-7)   Rate             Length
    Pause     0 0 1           Rate (not 0 1)   Pause Count
    Wait      0 0 1           0 1              0 0 0 0 0 0 0 0 0 0 1
    Jump      0 0 0 0 0 0 (bits 15-10)         Jump Addr/0x100 (bits 9-0)
    End       0 0 0 0 0 1 (bits 15-10)         X X X X X X X X X X (bits 9-0)
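For illustration, a run header of Table 1 can be decoded as follows. The function name and return format are illustrative, but the bit positions follow the table: Mod Sel in bits 15-13, Rate in bits 12-11, Length/Pause Count in bits 10-0, with Jump and End recognized by their upper six bits:

```python
def decode_header(word):
    """Decode a 16-bit VRAM run header per Table 1 (illustrative sketch)."""
    top6   = (word >> 10) & 0x3F   # upper six bits identify Jump and End
    modsel = (word >> 13) & 0x7    # module select (001 = Pause/Wait, 2-7 = Data)
    rate   = (word >> 11) & 0x3    # 2-bit RATE field
    low11  = word & 0x7FF          # Length or Pause Count

    if top6 == 0b000000:
        return ("jump", (word & 0x3FF) * 0x100)  # table stores Jump Addr/0x100
    if top6 == 0b000001:
        return ("end",)
    if modsel == 1:                # 001 prefix: Pause, except rate 01 means Wait
        if rate == 0b01:
            return ("wait",)
        return ("pause", rate, low11)
    return ("data", modsel, rate, low11)
```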
  • The data in the VRAM are read out basically sequentially but some variations are allowed to reduce the memory requirements and facilitate system operation based on several observations about how the ultrasound system operates. [0127]
  • The first observation is that the peak control data rate requirements are far higher than the average rates needed. This is because, during close zone imaging, the focus may be updated at every clock to maintain maximal sharpness. However, for deep zones approaching the far field, the focusing parameters need not vary very quickly. Thus the data may be supplied at a lower rate. This is accomplished by the use of a 2-bit RATE field associated with each run (see Table 2). The RATE field allows the specified run to be clocked out at either the full system clock rate (which can be 8-32 MHZ), one-half, one-quarter, or one-eighth of that rate. [0128]
    TABLE 2
    Rate Field Definitions

    Bit 12   Bit 11   Data Meaning                 Pause Length
    ------   ------   --------------------------   -------------------
    0        0        New Data Every Clock         PauseCount Clocks
    0        1        New Data Every Other Clock   PauseCount*2 Clocks
    1        0        New Data Every 4 Clocks      PauseCount*4 Clocks
    1        1        New Data Every 8 Clocks      PauseCount*8 Clocks
  • The next observation is that there are often large gaps during which data is not required. After a transmit pulse is fired into a deep zone, a relatively large amount of time can pass before its echo is received and the beamformer is activated. Thus it is advantageous not to waste VRAM space on such idle time periods. For this reason, explicit pause commands are allowed. When the [0129] system controller 218 receives a pause command, it waits the specified number of clock cycles before reading the next word in the VRAM memory. The PAUSECOUNT is an 11-bit number which can take on the range 1-2047. This is additionally scaled by the RATE field to allow pauses of up to 16376 (2047*8) system clock cycles. Note that the RATE field can only take on the values 0, 2 and 3 because a pause of RATE 1 is interpreted as a wait command, described next. This is not a problem, however, because typically only RATE 0 is used for maximum wait accuracy (to within one clock) and RATE 3 is used for maximum wait time (up to 16376 clock cycles).
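  • The pause scaling described above can be checked with a short sketch (illustrative; names are mine, not from the patent):

```python
# Illustrative: effective pause length in system clocks (Tables 1 and 2).
# PAUSECOUNT is 11 bits (1-2047); RATE scales it by 1, 4, or 8.
# RATE 1 is unavailable because it is reinterpreted as the wait command.

RATE_SCALE = {0: 1, 2: 4, 3: 8}

def pause_clocks(pause_count, rate):
    assert 1 <= pause_count <= 2047 and rate in RATE_SCALE
    return pause_count * RATE_SCALE[rate]
```

The maximum pause, `pause_clocks(2047, 3)`, gives the 16376-cycle figure quoted in the text.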
  • Because the data from the beamformer [0130] 116 has to be sent back to the host 5 over a bandwidth-constrained link, buffering and flow-control are required to prevent data loss. The buffering is achieved by a 16 K by 18 FIFO while the flow control is achieved by feeding the FIFO fullness indication back to the system controller 218. In this way, if the FIFO becomes too full, the scanning stops until the FIFO has been emptied. However, the scanning should not stop arbitrarily because it is timed with the propagation of the sound waves. Thus explicit synchronization points can be inserted into the code, and at these points the controller waits until the FIFO is empty enough to proceed safely. The wait command is used to indicate these synchronization points. The wait command causes the controller to wait until the WAITPROCEED line is high. In one embodiment, this is connected (via the aux FPGA) to the “not half-full” indicator on the FIFO. Thus the wait commands can be placed at least every 8 K data-generating cycles to assure that data overflow cannot occur. Because this is greater than one ultrasound line, it still allows multi-line interleaving to be used.
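  • The overflow-safety argument above can be sketched numerically (an illustrative model, not production code): with a 16 K-word FIFO, a proceed-only-below-half-full wait threshold, and wait commands placed at least every 8 K data-generating cycles, the worst-case fill never exceeds the FIFO depth:

```python
# Illustrative check of the FIFO flow-control bound described above.
# The controller proceeds past a wait command only while the FIFO is
# below half full, and at most 8K words are generated before the next
# wait, so the fill can never exceed 16K.

FIFO_DEPTH = 16 * 1024            # 16 K by 18 FIFO
MAX_RUN_BETWEEN_WAITS = 8 * 1024  # wait commands at least every 8 K cycles

def can_overflow(fifo_fill_at_wait):
    """True if proceeding at this fill level could overflow the FIFO."""
    proceed = fifo_fill_at_wait < FIFO_DEPTH // 2   # "not half-full" indicator
    worst_case = fifo_fill_at_wait + MAX_RUN_BETWEEN_WAITS
    return proceed and worst_case > FIFO_DEPTH
```

Even at the worst allowed fill (one word under half full), the worst case is 16383 words, just inside the 16384-word FIFO.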
  • The next command is the jump command. This allows non-sequential traversal through the VRAM memory. It is employed so that the VRAM memory can be modified concurrently with the readout operation, and also to make it easier to add and remove variable-size control sequences. To understand why this is useful, consider the following example: imagine that one wants to change the data in VRAM locations [0131] 512-1023 while continuing operation of the scanning using the other locations. If the host were to just modify locations 512-1023, there is no guarantee that they will not be used exactly while they are in the middle of being modified. The data would then be in an indeterminate state, which can lead to an erroneous sequence. However, if location 512 is first modified to be a jump to location 1024, locations 513-1023 are then modified to their new values, and location 512 is finally modified to its new value, this race condition cannot occur. (This assumes that the controller is not reading locations 513-1023 at the start of the modifications, but blank regions can be left to get around this.) Additionally, “subroutines” (which can only be used once per scan due to the fact that the return is coded as an absolute jump) can be used to allow easy change of the scan sequence.
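  • The race-free update sequence above can be sketched as follows. This is an illustrative model: the `vram` list stands in for the sequencing memory, and the helper names are mine:

```python
# Illustrative sketch of the race-free VRAM update described above.

def jump_word(addr):
    """Build a jump header as in Table 1: low bits hold addr/0x100."""
    return (addr // 0x100) & 0x3FF

def safe_update(vram, new_values):
    """Replace locations 512-1023 without racing the readout."""
    # 1. Redirect location 512 around the region being rewritten.
    vram[512] = jump_word(1024)
    # 2. Rewrite locations 513-1023 while readout bypasses them.
    vram[513:1024] = new_values[1:]
    # 3. Finally install the new value at 512, re-enabling the region.
    vram[512] = new_values[0]
```

The readout either sees the old contents, the detour jump, or the fully written new contents, never a half-modified region.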
  • A jump always takes 128 cycles to execute because the system controller has to load this new start address into the VRAMs and transfer the new row of data to the serial shift register. This typically takes only about 25 cycles, but because other parts of the system controller may have access to the VRAM (such as the refresh or host controller), a safe upper bound is used to maintain a fixed delay. [0132]
  • The last command is the end command. This is used at the end of the sequence for a frame to tell the system controller that the frame has completed. The controller then stops fetching instructions until it is restarted (from location [0133] 0) by the host if it is in single-frame mode. If it is in continuous mode, then it starts immediately on the next frame (after the 128 cycles required for the implied jump to location 0).
  • FIG. 5A is a functional block diagram of the architecture of the system controller of FIG. 3A. The [0134] system controller 218 has four basic parts: a readout controller 282, a host controller 284, the refresh controller 286, and the arbiter 288. The first three support the three basic operations on the VRAM: reading out data, writing in of data at the host's request, and refreshing the DRAM core. The arbiter 288 is responsible for merging the requests of the first three sections into one connection to the VRAM's DRAM core. Only one of the first three sections can have control at a given time, so they explicitly request control and wait until this request is acknowledged by the arbiter 288. They must also tell the arbiter 288 when they are still using the DRAM so that the arbiter knows not to grant it to one of the other sections. This is done via the INUSE lines.
  • Additionally the [0135] arbiter 288 sends the host controller 284 a RELREQ or relinquish request signal to ask the host controller 284 to give up ownership of the DRAM core because some other section wants it. Note that only the host controller 284 needs to be asked to relinquish the bus because the readout controller 282 and refresh controller 286 both use the DRAM core only for fixed short intervals. The host controller 284, however, can hold on to the DRAM as long as there is data coming over the FireWire to be written into the DRAM, so it needs to be told when to temporarily stop transferring data.
  • Note that the serial section of the VRAMs is not multiplexed—it is always controlled by the [0136] readout controller 282. The VRAM serial data also only goes to the readout controller 282.
  • The [0137] readout controller 282 controls the sequencing of the data out the VRAMs' serial access ports. This involves parsing the data headers to determine what locations should be read, clocking the VRAM Serial Clock at the correct time, driving the module control lines, and also arranging for the proper data from the VRAM's DRAM core to be transferred into the serial access memory.
  • The [0138] host controller 284 is the part of the VRAM Controller that interfaces to the host 5 via FireWire to allow the host to write into the VRAM. When the host wants to write into the VRAM, it sends asynchronous packets specifying which VRAM and which addresses to modify as well as the new data to write. The host controller 284 then asks the arbiter 288 for access to the VRAM. When the DRAM core is not in use by either the readout 282 or refresh 286 controller, the arbiter 288 grants control to the host controller 284. The host controller 284 then takes care of address and control signal generation. When the whole packet has been decoded, the host controller 284 releases its request line giving up the DRAM control, allowing the other two sections to use it.
  • The [0139] refresh controller 286 is responsible for periodically generating refresh cycles to keep the DRAM core of the VRAM from losing its data. The refresh controller 286 has its own counter to keep track of when it needs to request a refresh. Once it gains access to the VRAMs via the arbiter 288, it generates one refresh cycle for each of the VRAMs sequentially. This reduces the current spikes on the DRAM power supply lines as compared to refreshing all 5 VRAMs in parallel.
  • The REFRATE inputs control how many system clock cycles occur between refresh cycles (see Table 3). This compensates for different system clock rates. Additionally, refresh may be disabled for debugging purposes. [0140]
    TABLE 3
    Refresh Rate Definitions

                        System clock cycles     Minimum system clock to
    RefRate1  RefRate0  between refresh cycles  achieve 16 μs refresh rate
       0         0             128                       8 MHz
       0         1             256                      16 MHz
       1         0             512                      32 MHz
       1         1         No Refresh                     —
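  • The relationship in Table 3 is simple arithmetic: the divider setting divided by the clock frequency gives the refresh interval. An illustrative check (names mine):

```python
# Illustrative check of Table 3: refresh interval = divider / clock.
# Each REFRATE divider meets the 16 us target exactly at its minimum
# system clock, and runs faster (shorter interval) above it.

def refresh_interval_us(cycles_between, clock_mhz):
    return cycles_between / clock_mhz

for cycles, min_mhz in [(128, 8), (256, 16), (512, 32)]:
    assert refresh_interval_us(cycles, min_mhz) == 16.0
```

At a 32 MHz clock the 128-cycle setting refreshes every 4 μs, which is more often than needed but harmless apart from the extra DRAM traffic.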
  • The arbiter 288 controls [0141] the access to the VRAM by the readout, host, and refresh controller 282, 284, 286 sections. Only one section may have access to the DRAM port of the VRAM at any given time. The arbiter 288 does not reassign control of the VRAM to another section until the section with control relinquishes it by de-asserting its IN_USE line. The sections are prioritized, with the readout controller 282 getting the highest priority and the host controller 284 getting the lowest priority. The reasoning is that if the readout controller 282 needs access to the VRAM but does not get it, then the system may break down because the serial output data will be incorrect. The refresh controller 286 can tolerate occasional delay, although this should not happen often. Finally, the host controller 284 can potentially tolerate very long delays because the host can be kept waiting without many consequences other than the writing of the VRAM taking longer.
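  • The grant policy just described can be modeled as a fixed-priority arbiter (an illustrative sketch; the function and its arguments are mine):

```python
# Illustrative model of the arbiter's fixed-priority grant logic:
# readout > refresh > host, and no re-grant while IN_USE is asserted.

def arbiter_grant(current_owner, in_use, requests):
    """Return the section granted the DRAM port this cycle.

    current_owner: section holding the port, or None
    in_use:        True while the owner's IN_USE line is asserted
    requests:      set of sections currently requesting the port
    """
    if current_owner and in_use:
        return current_owner                 # owner keeps the port until release
    for section in ("readout", "refresh", "host"):   # priority order
        if section in requests:
            return section
    return None
```

Note that the host keeps the port while IN_USE is high even if the readout controller requests it; that is what the separate RELREQ signal is for.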
  • A highly capable, yet cost-effective and physically non-encumbering connection between the scan head and host computer is possible using the FireWire standard (also known as IEEE 1394). The FireWire standard is used for multimedia equipment and allows 100-200 Mbps, and preferably in the range of 400-800 Mbps, operation over an inexpensive 6-wire cable. Power is also provided on two of the six wires so that the FireWire cable is the only necessary electrical connection to the probe head. A power source such as a battery or IEEE 1394 hub can be used. The FireWire protocol provides both isochronous communication for transferring high-rate, low-latency video data as well as asynchronous, reliable communication that can be used for configuration and control of the peripherals as well as obtaining status information from them. Several chipsets are available to interface custom systems to the FireWire bus. Additionally, PCI-to-FireWire chipsets and boards are currently available to complete the other end of the head-to-host connection. CardBus-to-FireWire boards can also be used. [0142]
  • Although the VRAM controller directly controls the ultrasound scan head, higher-level control, initialization, and data processing and display come from a general purpose host such as a desktop PC, laptop, or palmtop computer. The display can include a touchscreen capability. The host writes the VRAM data via the VRAM controller. This is performed both at initialization as well as whenever any parameters change (such as the number or positions of zones, or the type of scan head) requiring a different scanning pattern. During routine operation, when data is just being continually read from the scan head with the same scanning parameters, the host need not write to the VRAM. Because the VRAM controller also tracks where in the scan pattern it is, it can perform the packetization to mark frame boundaries in the data that goes back to the host. The control of additional functions such as power-down modes and the querying of buttons or dials on the head can also be performed via the FireWire connection. [0143]
  • Although FireWire chipsets manage the electrical and low-level protocol interface to the FireWire bus, the system controller has to manage the interface to the FireWire chipset as well as handle higher-level FireWire protocol issues such as decoding asynchronous packets and keeping frames from spanning isochronous packet boundaries. [0144]
  • Asynchronous data transfer can occur at any time and is asynchronous with respect to the image data. Asynchronous data transfers take the form of a write or read request from one node to another. The writes and reads are to a specific range of locations in the target node's address space. The address space can be 48 bits. The individual asynchronous packet lengths are limited to 1024 bytes for 200 Mbps operation. Both reads and writes are supported by the system controller. Asynchronous writes are used to allow the host to modify the VRAM data as well as a control word in the controller which can alter the operation mode. Asynchronous reads are used to query a configuration ROM (in the system controller FPGA) and can also be used to query external registers or I/O such as a “pause” button. The configuration ROMs contain a queryable “unique ID” which can be used to differentiate the probe heads as well as allow node-locking of certain software features based on a key. [0145]
  • Using isochronous transfers, a node reserves a specified amount of bandwidth, and it gets guaranteed low-overhead bursts of link access every 1/8000 second. All image data from the head to the host is sent via isochronous packets. The FireWire protocol allows for some packet-level synchronization, and additional synchronization is built into the system controller. [0146]
  • The asynchronous write request packets are sent from the host to the probehead in order to: [0147]
  • a) Configure the Link Layer controller chip (TI GPLynx or TI GP2 Lynx) [0148]
  • b) Control the system controller FPGA [0149]
  • c) Write sequencing data into the VRAM [0150]
  • Both the “Asynchronous Write Request with Block Payload” and the “Asynchronous Write Request with Quadlet Payload” forms can be used. The latter simply restricts the payload to one quadlet (4 bytes). The formats of the two packets, as passed on by the TI LINK controller chip, are shown in Table 4 and Table 5. The difference between this and the format over the wire is that the CRCs are stripped and the speed code (spd) and acknowledgment code (ackSent) are appended to the end. The Adaptec API and device driver take care of assembling the packets. [0151]
    TABLE 4
    Asynchronous Write Request with Quadlet Payload as Delivered by TI LINK chip

    Word 0:  destinationID | tLabel | rt | tCode=0 | priority
    Word 1:  sourceID | destinationOffsetHi
    Word 2:  destinationOffsetLow
    Word 3:  Data 0 | Data 1 | Data 2 | Data 3
    Word 4:  spd | ackSent
  • [0152]
    TABLE 5
    Asynchronous Write Request with Block Payload as Delivered by TI LINK chip

    Word 0:      destinationID | tLabel | rt | tCode=1 | priority
    Word 1:      sourceID | destinationOffsetHi
    Word 2:      destinationOffsetLow
    Word 3:      dataLength=N | extendedTcode
    Word 4:      Data 0 | Data 1 | Data 2 | Data 3
    Word 5:      Data 4 | Data 5 | Data 6 | Data 7
    . . .
    Word 3+N/4:  Data N-4 | Data N-3 | Data N-2 | Data N-1
    Word 4+N/4:  spd | ackSent
  • The destinationID field holds the node ID of the destination, which is the probe head FireWire controller. The physical layer chip can use this to determine if the packet is for it; the system controller can ignore this field. The tLabel field is used to match requests and responses. For write requests, this does not matter and can be ignored. The rt field is the retry code used at the link and/or phy level. It is not used by the system controller. The tCode field is the transaction code which determines what type of packet it is. In particular, 0 is for quadlet write requests and 1 is for block write requests. The system controller parses this field to determine what type of packet it is. Currently only tCode values of 0 and 1 are recognized. The priority field is used by the PHY chip only and is ignored by the system controller. It is used, for example, in selecting which unit on the interface is to receive a particular packet of data. [0153]
  • Next, the destinationOffsetHi and destinationOffsetLo fields form the 48-bit destination start address. This indicates within the node what the data should be used for. The system controller uses the destinationOffsetHi to determine the function as shown in Table 6. Note that only the 3 least significant bits of the destinationOffsetHi field are currently examined. The spd field indicates the speed at which the data was sent, while the ackSent field is used to indicate status by saying how the LINK chip acknowledged the packet. [0154]
    TABLE 6
    destinationOffsetHi values
    destinationOffsetHi Meaning
    0 Write VRAM 0
    1 Write VRAM 1
    2 Write VRAM 2
    3 Write VRAM 3
    4 Write VRAM 4
    5 Write ISO packet Length Register
    6 Write System Controller Mode Word
    7 Write to LINK chip
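  • The address decode of Table 6 can be sketched as a simple dispatch (illustrative; the dictionary and labels are my own shorthand, not register names from the patent):

```python
# Illustrative dispatch on destinationOffsetHi (Table 6). Only the 3
# least significant bits are examined, mirroring the text above.

DEST = {0: "VRAM0", 1: "VRAM1", 2: "VRAM2", 3: "VRAM3", 4: "VRAM4",
        5: "ISO_PACKET_LENGTH", 6: "MODE_WORD", 7: "LINK_CHIP"}

def route_write(destination_offset_hi):
    """Map an asynchronous write to its target, per Table 6."""
    return DEST[destination_offset_hi & 0x7]
```

Because only the low 3 bits are examined, any destinationOffsetHi with the same low bits routes to the same target.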
  • As can be seen, destinationOffsetHi values of 0-4 correspond to writing the VRAMs. In this case the destinationOffsetLow is set to the byte address at which to start writing. This is twice the standard VRAM address, which is typically formed in 16-bit words. Note also that the start address (destinationOffsetLow) and the length (dataLength) must both be multiples of 4 so that all operations are quadlet-aligned. The payload data is little-endian and thus need not be converted if written by an Intel PC host. The length (dataLength) must additionally be between 4 and 128 bytes due to the size of the GPLynx FIFO. The total FIFO size is 200 bytes, but 72 bytes are dedicated to the asynchronous transmit FIFO required for read responses. [0155]
  • A destinationOffsetHi value of 5 signifies that the system controller ISO Packet Length register is to be written. The ISO packet length has to be set in the controller to allow it to correctly format the ISO packets back to the host via FireWire. An explicit counter in the system controller is used because the TI GPLynx chip does not assert the end-of-packet indication until one word too late. Note that the ISO packet length also has to be set in the LINK chip. The value written is the number of 16-bit words in the ISO packet (i.e., bytes/2), and it is written in little-endian order because it is interpreted only by the system controller and not by the LINK chip. [0156]
  • Specifying a destinationOffsetHi value of 6 signifies that the system controller mode word is to be modified. Currently only the least significant 16 bits are used out of each quadlet and all quadlets go to the same place so writing multiple values just causes the system controller mode word to be rewritten. Please note that the payload data is again little endian. (Putting these two facts together yields that the first two out of every four bytes are used and the second two are ignored.) The definition of the system controller Mode Word is given in Table 7. [0157]
    TABLE 7
    System Controller Mode Word

    Bit (bit 31 is MSB)
    31-16   15-8      7       6       5           4            3    2       1       0
    unused  BOF Word  unused  unused  AbortFrame  SingleFrame  Run  Extra2  Extra1  DataLoopback
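  • The mode word layout can be sketched as a pack/unpack pair (illustrative; bit positions follow Table 7, but the function and keyword names are mine):

```python
# Illustrative pack of the system controller mode word (Table 7).
# Bits 15-8: BOF word; bit 5: AbortFrame; bit 4: SingleFrame;
# bit 3: Run; bit 2: Extra2; bit 1: Extra1; bit 0: DataLoopback.

def make_mode_word(bof=0x80, abort_frame=0, single_frame=0, run=0,
                   extra2=0, extra1=0, data_loopback=0):
    return ((bof & 0xFF) << 8 | abort_frame << 5 | single_frame << 4 |
            run << 3 | extra2 << 2 | extra1 << 1 | data_loopback)
```

For example, `make_mode_word(run=1)` selects continuous scanning with the default BOF byte of 80 hex.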
  • The BOF Word field is used to set the value that the system controller will put in the high byte of the first word of an isochronous packet to indicate the beginning of frame. The BOF Word field can be set to some value that is not likely to occur in typical data. This is not crucial, however: choosing a BOF word that occurs in the data makes it more likely that incorrect frame synchronization will go unnoticed, but it will never cause false alarms in which the system appears mis-synchronized when it is really correctly synchronized. The initial value upon reset is 80 hex. [0158]
  • The AbortFrame, SingleFrame, and Run bits are used to control the system operation. Their use is shown in Table [0159] 8. The data FIFO is never allowed to fully empty, so an entire frame cannot be read out until part of the next one is in the queue.
    TABLE 8
    Use of AbortFrame, SingleFrame, and Run bits in System
    Controller Mode Word
    Abort Frame Single Frame Run Meaning
    1 0 0 Abort any current frame and wait
    0 1 0 Start a single new frame
    0 0 1 Keep scanning new frames
    0 0 0 Let any current frame complete
  • The DataLoopback bit is used to control whether the data that is sent back to the host comes from the A/D or from one of the VRAMs. (Currently this is [0160] VRAM 1.) The second option can be used for test purposes to exercise the digital data generation and collection without testing the beamformer and A/D conversion. A 0 in the DataLoopback bit indicates normal operation of reading from the A/D, while a 1 means that data is taken from the VRAM.
  • The Extra1 and Extra2 bits are available for general use. They are latched by the system controller and currently brought out on pins called EXTRACLOCK0 and EXTRACLOCK1 but can be used for any purpose. [0161]
  • Finally, setting destinationOffsetHi to 7 indicates that the data in the asynchronous packet is to be written back to the FireWire LINK chip. This allows any of the TI TSB12LV31's (or 32's) registers to be modified by the host, which can be used to configure and enable the isochronous data transmit. The destinationOffsetLow specifies the first register to write. Because the registers are all 4 bytes in size and must be written in their entirety, destinationOffsetLow and dataLength must both be multiples of 4. Multiple consecutive registers can be written with a single packet. Note that the data is big-endian because the TSB12LV31 is designed as big-endian. This byte-swapping must be performed by the Intel PC host. [0162]
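  • The byte-swapping duty described above can be sketched as follows (illustrative; the helper name is mine): a little-endian Intel host packs each 32-bit register value big-endian before placing it in the packet payload.

```python
# Illustrative byte swap for LINK chip register writes: the TSB12LV31
# registers are big-endian, so a little-endian host emits each 4-byte
# register value most-significant byte first.

import struct

def to_link_payload(register_values):
    """Pack 32-bit register values big-endian for the LINK chip."""
    return b"".join(struct.pack(">I", v) for v in register_values)
```

Writing consecutive registers is then just a matter of concatenating their big-endian values into one payload.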
  • Read request packets are used to asynchronously read data from the probehead. This currently only consists of configuration ROM data (see below) but can be easily used for other types of data such as status information or button indications. [0163]
  • The Adaptec device drivers send Asynchronous Read Requests in response to explicit application requests as well as to interrogate the node's FireWire configuration ROM in response to a SendPAPICommand of P_GET_DEV_INFO or after a bus reset or when an application tries to obtain a handle to a node. [0164]
  • Asynchronous read requests can be of either the quadlet or block variety, as with the asynchronous write requests. The formats are shown in Table 9 and Table 10. They are similar to the write request formats. [0165]
    TABLE 9
    Asynchronous Read Request with Quadlet Payload as Delivered by TI LINK chip

    Word 0:  destinationID | tLabel | rt | tCode=4 | priority
    Word 1:  sourceID | destinationOffsetHi
    Word 2:  destinationOffsetLow
    Word 3:  spd | ackSent
  • [0166]
    TABLE 10
    Asynchronous Read Request with Block Payload as Delivered by TI LINK chip

    Word 0:  destinationID | tLabel | rt | tCode=5 | priority
    Word 1:  sourceID | destinationOffsetHi
    Word 2:  destinationOffsetLow
    Word 3:  dataLength | extendedTcode
    Word 4:  spd | ackSent
  • As with the asynchronous write packets, the destinationOffsetHi and destinationOffsetLow determine what is being requested. The high addresses are defined for use as Control and Status Registers and the configuration ROM, while the lower addresses are for more general purpose use. In particular, the FireWire configuration ROM starts at destinationOffsetHi=0xffff and destinationOffsetLow=0xf0000400, for example. [0167]
  • When the system controller receives a Quadlet or Block Read Request packet from the TI LINK chip's General Receive FIFO, it formulates a Quadlet or Block Read Response packet and places it in the LINK chip's Asynchronous Transmit FIFO. The format of these packets (as placed in the Asynchronous Transmit FIFO) is shown in Table 11 and Table 12. [0168]
    TABLE 11
    Asynchronous Read Response with Quadlet Payload as Expected by TI LINK chip

    Word 0:  spd | tLabel | rt | tCode=6 | priority
    Word 1:  destinationID | rCode | reserved=0
    Word 2:  reserved=0
    Word 3:  Data 0 | Data 1 | Data 2 | Data 3
  • [0169]
    TABLE 12
    Asynchronous Read Response with Block Payload as Expected by TI LINK chip

    Word 0:      spd | tLabel | rt | tCode=7 | priority
    Word 1:      destinationID | rCode | reserved=0
    Word 2:      reserved=0
    Word 3:      dataLength=N | extendedTcode
    Word 4:      Data 0 | Data 1 | Data 2 | Data 3
    Word 5:      Data 4 | Data 5 | Data 6 | Data 7
    . . .
    Word 3+N/4:  Data N-4 | Data N-3 | Data N-2 | Data N-1
  • The spd, tLabel, rt, and priority values are copied from the request packet. The destinationID is taken from the sourceID of the request packet. Note that all packet CRCs are generated by the TI LINK chip and are thus not included in the data that the system controller must generate. (The ROM CRCs do have to be computed explicitly off-line.) [0170]
  • The rCode field is used to indicate the status of the reply. In particular, 0 means resp_complete, indicating all is well. A value of 6 means resp_type_error, indicating that some field of the packet was invalid or unsupported. In this case, if the request was a block request, then the dataLength of the response packet must be 0 and no data should be included. A resp_type_error is returned if the dataLength or destinationOffsetLow of the request packet were not multiples of 4, or if the dataLength was not between 4 and 32 (for block packets). This is because the TI chip's asynchronous transmit FIFO is configured to be 12 quadlets (8 payload quadlets + 4 header quadlets) so that the receive FIFO can be 36 quadlets in order to allow 128-byte payload write packets. The longest request the Adaptec device drivers should issue is 8 quadlets because that is the length of the configuration ROM. In any case, it is assumed that if a long transfer fails, the driver falls back to a smaller request. [0171]
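  • The validation rules above can be sketched as follows (illustrative; constant and function names are mine, though the rCode values 0 and 6 come from the text):

```python
# Illustrative validation of a block read request, per the rules above:
# quadlet alignment of both fields and a 4-32 byte length, otherwise a
# resp_type_error (rCode 6) with an empty payload.

RESP_COMPLETE, RESP_TYPE_ERROR = 0, 6

def validate_block_read(data_length, destination_offset_low):
    """Return the rCode the controller would place in the response."""
    if data_length % 4 or destination_offset_low % 4:
        return RESP_TYPE_ERROR          # not quadlet-aligned
    if not 4 <= data_length <= 32:
        return RESP_TYPE_ERROR          # exceeds 8-quadlet transmit FIFO payload
    return RESP_COMPLETE
```

An 8-quadlet (32-byte) configuration ROM read passes; a 64-byte request is rejected and the driver is expected to retry with a smaller length.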
  • The FireWire specification expects each FireWire node to have a configuration ROM that contains various details about the device, its requirements, and its capabilities. This ROM is queried via read request packets. There are two types of ROM implementations: a minimal ROM and a general ROM. The former has only one quadlet (4-byte) piece of data indicating a 24-bit vendor ID. The general ROM has many other fields, many of which are optional, ranging from the ASCII name of the vendor and device to its power consumption and how to access its capabilities. [0172]
  • One of the required fields in a general ROM is a node unique ID. This consists of the 24-bit vendor ID and a 40-bit chip ID. The 40-bit chip ID is up to the vendor to assign such that all nodes have unique values. The node unique IDs are required to keep a consistent handle on the device if the FireWire bus is reset or reconfigured during operation. When a device is first opened, the application reads its configuration ROM and determines if it wants to work with it. If so, it records the node unique ID and opens a connection to the device via that node unique ID. This is then at any given time mapped to its 16-bit FireWire ID by the host adapter and its device driver. If the topology changes or a FireWire bus reset occurs, the node's FireWire ID can change; however, the node unique ID will not. Thus, in such an event, the adapter automatically determines the new FireWire ID and continues. For smooth operation, particularly with multiple heads attached to the system, implementing node unique IDs and the configuration ROM is therefore required. [0173]
  • The configuration ROM is divided into several sections. The sections of particular interest are the first word, which defines the length and CRC of the ROM; the next 4 words comprising the Bus_Info_Block, which gives some fixed 1394-specific information (such as the node unique ID); and the last 3 words representing the Root Directory, which is a set of key-value tagged entries. Only the two required key-value pairs are included in the ROM built into the FPGA. An 8-word ROM that can be used is shown in Table 13. [0174]
    TABLE 13
    FireWire Configuration ROM built into FPGA

    Word 0:  crc_length=0x07 | rom_crc_value=0xfbc8
    Word 1:  bus_name = “1394” (0x31 ‘1’, 0x33 ‘3’, 0x39 ‘9’, 0x34 ‘4’)
    Word 2:  cyc_clk_acc=0xff | max_rec=6 | reserved=0x000
    Word 3:  node_vendor_id | chip_id_hi=0
    Word 4:  chip_id_lo
    Word 5:  root_dir_length | Root_Dir_CRC=0xbc8e
    Word 6:  module_vendor_id=1234567 (0x12d687)
    Word 7:  node_capabilities=0x000000
  • Isochronous packets are used for the probehead-to-host communication of beamformed data. This is conceptually a stream of 16-bit numbers punctuated by frame markers. The frame markers are important to keep in sync with where in the frame the data corresponds. While some ultrasound systems use elaborate frame and line markers embedded in the data, the integrated system can use a single auxiliary bit, which is not sent as part of the data, to mark frame boundaries. Line boundaries can be derived by knowing the VRAM sequencing program. [0175]
  • While asynchronous packets can be sent at will and do not have any guarantee of bandwidth availability, isochronous packets can be used as a low-overhead way to send a guaranteed rate of data. Once a peripheral reserves a specified amount of bandwidth, it gets guaranteed bursts of link access every 1/8000 second. All data from the head to the host is sent via isochronous packets. Because isochronous packets are limited to one burst per 1/8000 second, this defines a FireWire frame of data. The FireWire specification describes the use of synchronization bits which can be used to tag each isochronous packet with a 4-bit SYNC code. The Adaptec FireWire-to-PCI bridge can then use the SYNC field to assure proper frame alignment. However, the TI GPLynx controller chip only supports frame-level granularity of when to send packets, not packet-level granularity, so when the system controller tells the FireWire link chip it has data, it must be prepared to send a whole frame of data. Because the FIFO is much smaller than a frame, a safe option is to reduce the effective FireWire frame size to one packet. A specific Beginning of Frame (BOF) code is then placed in the high byte of the first word of every ultrasound frame, the start of ultrasound frames is forced to occur at the beginning of FireWire frames (and packets), and frame-level synchronization is performed in the ultrasound application software. For efficiency, a full ultrasound frame of data can still be read in one FireWire call (and hence one interrupt). [0176]
  • There are three steps in setting up for Isochronous head-to-host data transfers. These initialization steps need only be performed once per probe initialization. [0177]
  • The first step is to reserve isochronous bandwidth. This reservation causes a central record of the request (in the FireWire isochronous cycle manager node) to be kept to assure that the total bandwidth allocated does not exceed the total bandwidth of the link. For example, this reservation is achieved using the [0178] Adaptec API BusConfig 0 command with the Cmd field set to P_ALLOCATE_RESOURCE. A requested payload in bytes is passed in. This can be the amount of data desired in every 1/8000 second. Setting this value too high simply wastes reserved bandwidth on the FireWire interface, which is not a problem if there is only one device. Setting this value too low may constrain the head-to-host data rate; no overflows or data loss are likely to occur, but the scanning may simply proceed more slowly. The resource allocation call will return both an isochronous channel number as well as the payload size granted. This payload size granted may be less than that requested if part of the link has already been reserved.
  • The next step is to set the system controller ISO packet length word to tell how long of an ISO packet to expect. [0179]
  • The final step is to initialize the probehead LINK chip. This is done via the write-to-LINK-chip asynchronous packets described above. In particular, initializing [0180] registers 54h, 58h, and 5Ch is necessary. The probehead can then be told to start sequencing, and the data will flow back.
  • If multiple probes are connected to the system then the isochronous bandwidth reservation can take place once but at any given time, only one probe's isochronous transmission (as well as its sequencing) is enabled. [0181]
  • As previously described, isochronous data transfers are used to deliver the probe head data to the host, and maintaining frame synchronization is necessary. FireWire supports sub-frame packetization of about 3000 bytes, but it is up to the system controller to implement frame synchronization on top of this. Synchronization is achieved via two methods: [0182]
  • 1. The high byte of the first word in the first packet of a frame is set to the Beginning of Frame (BOF) code. (This can be set in the system controller Mode word). [0183]
  • 2. All frames are padded to consume a whole number of packets. [0184]
  • When these two are combined, they guarantee that frame synchronization will be maintained as long as the correct number of packets is read at a time, and that resynchronization can be effected by simply scanning the high byte of the first word of each packet in the data stream. [0185]
  • An example packetization is shown in Table 14. It depicts four packets of 4 words (8 bytes) apiece, showing one complete ultrasound frame and the first packet of the next frame; the ultrasound frame size is 10 words. As can be seen, the high byte of the first word is set to the BOF code, which can be examined to assure that proper synchronization has been maintained. The data is then split into the three packets [0186] 1-3. Because the frame ends in the middle of packet 3, the end of packet 3 is padded with the BOF code in the high byte. Importantly, this means that the first word of the fourth packet will be the first word of the second frame even though the ultrasound frame size is not a multiple of the packet size.
    TABLE 14
    Example Packetization of Isochronous Head-to-Host Data
    Packet     Word  Lo Byte     Hi Byte
    1          1     Data 1 Lo   BOF
    (Frame 1)  2     Data 2 Lo   Data 2 Hi
               3     Data 3 Lo   Data 3 Hi
               4     Data 4 Lo   Data 4 Hi
    2          1     Data 5 Lo   Data 5 Hi
    (Frame 1)  2     Data 6 Lo   Data 6 Hi
               3     Data 7 Lo   Data 7 Hi
               4     Data 8 Lo   Data 8 Hi
    3          1     Data 9 Lo   Data 9 Hi
    (Frame 1)  2     Data 10 Lo  Data 10 Hi
               3     (pad)       BOF
               4     (pad)       BOF
    4          1     Data 1 Lo   BOF
    (Frame 2)  2     Data 2 Lo   Data 2 Hi
               3     Data 3 Lo   Data 3 Hi
               4     Data 4 Lo   Data 4 Hi
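  • The BOF tagging and whole-packet padding rules of Table 14 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the BOF value, packet size, and function names are assumptions chosen for the example.

```python
# Sketch of the packetization scheme of Table 14 (illustrative values;
# the BOF code and packet size here are assumptions, not from the patent).
BOF = 0xAA          # hypothetical Beginning-of-Frame code
PACKET_WORDS = 4    # 16-bit words per isochronous packet (8 bytes)

def packetize(frame_words):
    """Split one ultrasound frame into whole packets, tagging the first
    word's high byte with BOF and padding the last packet with BOF words."""
    words = list(frame_words)
    words[0] = (BOF << 8) | (words[0] & 0xFF)   # BOF in high byte of word 1
    while len(words) % PACKET_WORDS:            # pad to a whole packet
        words.append(BOF << 8)
    return [words[i:i + PACKET_WORDS]
            for i in range(0, len(words), PACKET_WORDS)]

def synchronized(packets):
    """Frame-sync check: the first word of a frame's first packet must
    carry the BOF code in its high byte."""
    return (packets[0][0] >> 8) == BOF

frame = list(range(0x0101, 0x0101 + 10))   # a 10-word frame, as in Table 14
packets = packetize(frame)
```

As in the table, a 10-word frame occupies three 4-word packets, with the last two words of packet 3 padded so the next frame starts on a packet boundary.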
  • The TSB12LV31 (or 32) performs packetization of the isochronous data but informs the system controller of packet boundaries via the ISORST signal. The system controller uses this to reset its internal word-to-byte multiplexer as well as its packetization circuitry. If it receives a frame marker from the FIFO, it stops clocking data out of the FIFO until it receives an ISORST pulse. [0187]
  • The module interface defines how the various modules in the system are controlled by the VRAM controller. There are two types of modules: those that receive shared data from the four VRAMs (two on each analog board), and those that receive dedicated data from the VRAM on the digital board (via the VRAM controller). The two types of modules use different control signals to synchronize their operation. [0188]
  • Much of the timing depends on the rate of the module's runs (and on shared versus dedicated VRAM usage). FIG. 5B shows typical timing for the different module interfacing modes for a typical program sequence. [0189]
  • As previously stated, VRAMDATA, the data from the loopback VRAM, controls the execution. In FIG. 5B, the diagonally shaded boxes denote header data used by the VRAM controller, while the shaded boxes denote module data. The data in the four other VRAMs go to the modules. The data from the first VRAM is looped back into the system controller and then used as a dedicated data supply for things like the TGC, feedback control, etc. [0190]
  • Clocks [0191] 1-4 in FIG. 5B show a run of data at rate 1/1 destined for module 0. The header is clocked out at clock 1. The pulse of NEWRUNCLOCK at clock 1 lets the modules know that the next clock will be the first in a run; they thus reset their internal run-related state if necessary. The data is clocked out during clocks 2, 3, and 4. Since the data is destined for module 0, MODCLOCK0 is pulsed once per new data word. Module 0 should latch the data at VRAMDATA on the rising edge of MODCLOCK0.
  • Note that the access and hold times of the VRAM (T[0192] acc and Thold in FIG. 5B) must be observed carefully. The access time of the VRAM is 15 ns-25 ns depending on the speed grade, and the hold time can be as low as 4 ns. This does not leave a lot of margin: modules can expect valid data no earlier than Tclk-Tacc before the rising edge of their module clock. (Any skew between the system controller clock and the MODCLOCK tightens this bound accordingly, but because the VRAM controller was designed to generate both signals as gated clocks from the same MASTER CLK, the skew is minimal assuming that the loading conditions are not too dissimilar.) Given a master clock frequency of 33 MHz and the fast VRAM, this gives 15 ns of slack; using the slower VRAMs gives 5 ns of slack.
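  • The slack figures quoted here follow from one line of arithmetic, slack = Tclk − Tacc, using the 33 MHz master clock (period ≈ 30 ns) and the two VRAM access-time grades. A quick check:

```python
# Check of the timing-margin arithmetic: slack = Tclk - Tacc for a
# 33 MHz master clock (period ~30 ns) and the two VRAM speed grades.
T_CLK_NS = 1e9 / 33e6            # master clock period, ~30.3 ns

def slack_ns(t_acc_ns):
    """Setup margin left for a module after the VRAM access time."""
    return T_CLK_NS - t_acc_ns

fast_vram_slack = slack_ns(15)   # fast-grade VRAM, Tacc = 15 ns -> ~15 ns
slow_vram_slack = slack_ns(25)   # slow-grade VRAM, Tacc = 25 ns -> ~5 ns
```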
  • Modules accepting data at the full rate must additionally make sure that they do not latch the data more than T[0193] hold after the rising clock edge, because the same clock is used to retrieve the next words from the VRAM. Thus, in general, modules should delay their data inputs at least as much as they delay their clock inputs, to effectively clock at or before the rising edge of their module clock. This second constraint does not exist when 1/2, 1/4, or 1/8 rate data is used.
  • Since the first example is of 1/1 rate data, the MODULEFASTCLOCK0 signal follows the MODULECLOCK0 line. They will only differ when 1/2, 1/4, or 1/8 rate data is used. [0194]
  • Clocks [0195] 7-15 show a run of length 2 at rate 1/4 destined for module 2, so new data will be clocked out of the VRAMs only once every fourth master clock. Here MODULEFASTCLOCK2 will exhibit different behavior than MODULECLOCK2. Again, the NEWRUNCLOCK at clock 7 signals that a new run is beginning on the next clock cycle. During clock 7, the VRAM controller has latched the header data indicating that the next run is for module 2 at a rate of 1/4. Also during clock 7, the VRAM generates the module data that the module will use. At clock 8, a MODCLOCK2 pulse occurs, telling module 2 to latch in and use the VRAM's data. Note that the data will be present until the master clock before the next MODCLOCK2.
  • Although MODCLOCK2 is only clocked once per new data word, MODULEFASTCLOCK2 is clocked once per master clock for the duration of the run. This is useful for modules, such as the beamformer, which may need data at only a lower rate but must perform computation at the full rate. The MODNEWDATA signal can also be used by modules using the MODFASTCLOCK lines to determine on which of the fast clocks new data has been presented. [0196]
  • Clocks [0197] 16-18 show the result of a pause command. Here the NEWRUNCLOCK is sequenced as usual but no MODCLOCK or MODFASTCLOCK is generated.
  • As noted above, the particular embodiment was chosen based on a number of criteria, including simplicity of implementation using an FPGA. This motivated the use of VRAMs. An ASIC interface using more dense SDRAM requires at least some buffering, but this can be built into the controller, or alternatively, with the beamformer, T/R circuit or amplifier modules. In this way they receive bursts of data as opposed to the simple synchronous, continuous data that the above system supplies. The benefit is that SDRAMs are more dense and can provide data at higher rates, which reduces the parts count. Such a configuration is shown in FIG. 4B, for example, in which the 64 or 128 channel ([0198] 660 I-660 J) system is configured on one or two printed circuit boards. In this two board system, the T/R circuit and the preamplifier/TGC circuit are fabricated in a single integrated circuit and are placed on one board with a CDP beamformer that is formed as a second integrated circuit. The beamformer control circuits can include the calculation of weighted inputs with processor 670. The memory for this system is either a SDRAM or VRAM located on the second board along with the system controller and the digital communication control circuit.
  • Returning to FIG. 3A, the [0199] standard FireWire cable 40 includes a plurality of FireWire signal lines 42 and a FireWire power line 44. In order to provide the necessary voltages, the FireWire power line 44 is fed to an inline DC-DC converter 300. The DC-DC converter 300 generates the necessary voltages and provides them over a plurality of power lines 46. These new power lines 46 are repackaged with the FireWire signal lines 42 in a custom cable 40′. In the probe housing 3′, the FireWire signal lines 42 are connected to the FireWire chipset 220 and the custom power lines 46 are connected to a power distributor 48, which filters and distributes the various voltages over respective internal voltage lines 148A, 148B, 248. In addition, the power distributor 48 may perform additional DC-DC conversions, as described in more detail below.
  • The transmit/receive control chip is needed to interface with the transducer array. In transmit mode, the chip provides delays to the high-voltage driving pulses applied to each of the selected transducer elements such that the transmitted pulses are coherently summed on the image plane at the required transmit focus point. In receive mode, it connects the reflected sound waves received by a selected element to its corresponding amplifier. The functions of a multi-channel transmit/receive chip can be separated into two parts: a core function, which provides low-voltage transmit/receive control, and a buffer function, which level-shifts the low-voltage transmit/receive control into high voltage and directly interfaces with the transducer array. The core function of the transmit/receive chip includes a global counter, which broadcasts a master clock and bit values to each channel processor; a global memory, which controls transmit frequency, pulse number, pulse sequence and transmit/receive select; a local comparator, which provides delay selection for each channel (for example, with a 60 MHz clock and a 10-bit global counter, it can provide each channel with up to 17 μs of delay); a local frequency counter, which provides programmable transmit frequency; a local pulse counter, which provides different pulse sequences (for example, a 7-bit counter can provide programmable transmitted pulse lengths from one pulse up to 128 pulses); and a locally programmable phase selector, which provides sub-clock delay resolution (for example, with a 60 MHz master clock, a two-to-one phase selector provides approximately 8 ns delay resolution). [0200]
  • While typically the period of the transmit-chip clock determines the delay resolution, a technique called programmable sub-clock delay resolution allows the delay resolution to be finer than the clock period. With programmable sub-clock delay resolution, the output of the frequency counter is gated with a phase of the clock that is programmable on a per-channel basis. In the simplest form, a two-phase clock is used and the output of the frequency counter is gated with either the asserted or the deasserted clock. Alternatively, multiple skewed clocks can be used; one per channel can be selected and used to gate the coarse timing signal from the frequency counter. [0201]
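  • The delay arithmetic for the two-phase case can be sketched as follows, using the 60 MHz, 10-bit-counter figures from the text. The function name and parameterization are illustrative assumptions, not the chip's actual register model.

```python
# Sketch of programmable sub-clock delay resolution: a coarse delay from
# the frequency counter plus a per-channel phase select that adds half a
# clock period. Values follow the 60 MHz example in the text; the function
# name and interface are hypothetical.
MASTER_CLOCK_HZ = 60e6
T_CLK_NS = 1e9 / MASTER_CLOCK_HZ          # ~16.7 ns clock period

def channel_delay_ns(coarse_count, phase_select):
    """coarse_count: whole clock periods from the counter;
    phase_select: 0 or 1, gating with the asserted or the deasserted
    (half-period-shifted) phase of the two-phase clock."""
    return coarse_count * T_CLK_NS + phase_select * (T_CLK_NS / 2)

resolution_ns = channel_delay_ns(0, 1)               # ~8 ns sub-clock step
max_delay_us = channel_delay_ns(2**10 - 1, 1) / 1e3  # 10-bit counter -> ~17 us
```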
  • As can be seen in FIG. 3B, a semiconductor process that can support both high-voltage and low-voltage operation is ideally matched to a single-chip solution for the transmit/receive chip described above. The core function of the transmit/receive chip can be implemented with low-voltage transistors to reduce power consumption, while the level-shifting function can be implemented with high-voltage transistors to provide the necessary driving pulses to the transducer array. However, only selected semiconductor processes can make the integration of both high-voltage (buffer [0202] 292) and low-voltage transistors (294) on one chip 290 possible. As a result, the high/low-voltage process has so far been offered only with 0.8-to-1 μm design rules. With these design rules, a 64-channel transmit/receive chip can easily be integrated on a single chip in less than 1 cm² of chip area.
  • In order to save power and silicon area, a [0203] multi-chip module 295 can be used to implement the transmit/receive chip. For example, a deep-sub-micron process can be used to implement the core function 296 of the module, and a separate process can be used to implement the buffer 298 function. As shown in FIG. 3C, the multi-chip set can be mounted in a single package to realize the transmit/receive control function. With the multi-chip module approach, a 128-channel transmit/receive controller can easily be integrated in one package.
  • FIG. 3D illustrates an alternate embodiment in which the [0204] transducer array 10′ is located in a separate probe housing 410 connected to the interface housing 404 by a cable 412. Such a system is also illustrated in connection with FIG. 12. Note that another embodiment involves a probe housing in which certain circuit elements such as the transmit/receive circuitry and/or the preamp/TGC circuitry are included with the transducer array while the beamformer, system control and memory circuits remain in the interface. The system in FIG. 3D provides for the use of standard probes and a beamformer interface that weighs less than 10 lbs and which can be connected to a standard personal computer. The interface 404 has a volume of less than 1500 cm3 and a weight that is preferably less than 5 lbs.
  • FIG. 6 shows a block diagram of another particular embodiment of an ultrasonic imaging system adapted for external application integration. [0205] Referring to FIG. 6, the transducer array housing 32 and associated circuitry are connected to a system controller 500 via an ultrasound (US) interface 502. The system controller 500 is connected to a host user computing device 5 such as a PC via a standard interface 40, which is a predetermined communication link, such as an IEEE 1394 interface, also known as FireWire. The US data, therefore, is transmitted to the user computing device 5 via the standard interface 40, eliminating the need for specialized components in the user computing device 5. The user computing device 5 thus provides an ultrasonic application server which may be integrated with an external application, as will be described further below.
  • The ultrasonic application server running on the [0206] user computer device 5, therefore, receives the US data, and makes it available to be invoked by an external application for further processing. The external application may be either local, and therefore running on the user computer device 5, or remote, and accessing the ultrasonic application server remotely.
  • FIG. 7A shows an integrated interface program operable for use with a local external application. Referring to FIG. 7A, the [0207] ultrasonic server application 504 is running on the user computing device 5. A local external application 506 is also running on the user computing device 5, and transmits to and from the ultrasonic server application 504 via an integrated interface program 508. The integrated interface program 508 contains a series of predetermined entry points 510 a . . . 510 n corresponding to operations which the ultrasonic application server 504 may perform on behalf of the local external application 506. The local external application 506 sends a command, which includes an instruction and optional parameters as defined by the predetermined entry points 510. The local external application 506 transmits the command to the ultrasonic server application 504 by invoking the entry point 510 n in the integrated interface program which corresponds to the intended operation. The entry point may be invoked by a procedure or function call via a stack call, message transmission, object passing, or other suitable interprocess communication mechanism. In a particular embodiment, Windows® messages may be used.
  • The command is received by the [0208] ultrasonic server application 504 via the desired entry point 510 n from the integrated interface program 508, and is processed. The ultrasonic server application 504 executes a result corresponding to the desired function, and transmits the result back to the external application 506 via the integrated interface program 508, typically by similar interprocess communication mechanisms employed in transmitting the corresponding command. The operations performed by the ultrasonic application server may include the following as referenced in Table 15:
    TABLE 15
    OPERATION           DESCRIPTION
    Freeze Image        Freeze active ultrasound data image; used to
                        capture still frames
    Resume Live         Obtain realtime ultrasound image
    Export Frame        Export a frame of ultrasound image data in a
                        format as determined by the parameters
    Application Status  Return a status code of a previous operation
    Initialize          Initialize Ultrasonic Application Server to
                        begin receiving commands from an external
                        application
    Exit Application    Disconnect external application from the
                        Ultrasonic Application Server
  • and may also include others by defining an entry point in the [0209] integrated interface program 508 and a corresponding operation in the ultrasonic server application 504.
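  • The entry-point mechanism can be sketched as a dispatch table that maps the Table 15 operation names onto server operations. The operation names come from the table; the server class, handler bodies, and return codes below are hypothetical illustrations, not the patent's actual API.

```python
# Sketch of entry-point dispatch for the Table 15 operations. The class
# name, handlers and "OK" status code are assumptions for illustration.
class UltrasonicApplicationServer:
    def __init__(self):
        self.frozen = False
        self.connected = False

    def initialize(self, **params):        # begin accepting commands
        self.connected = True
        return "OK"

    def freeze_image(self, **params):      # capture a still frame
        self.frozen = True
        return "OK"

    def resume_live(self, **params):       # return to realtime imaging
        self.frozen = False
        return "OK"

    def exit_application(self, **params):  # disconnect external application
        self.connected = False
        return "OK"

# The integrated interface program exposes entry points keyed by
# instruction name; an external application only needs to know the
# instruction and its parameters, not the server internals.
def make_interface(server):
    return {
        "Initialize": server.initialize,
        "Freeze Image": server.freeze_image,
        "Resume Live": server.resume_live,
        "Exit Application": server.exit_application,
    }

server = UltrasonicApplicationServer()
interface = make_interface(server)
interface["Initialize"]()
result = interface["Freeze Image"]()
```

Adding an operation then amounts to adding one handler and one dispatch entry, mirroring how new entry points require only modification of the integrated interface program.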
  • The result received by the local [0210] external application 506, therefore, may be employed and analyzed by any functions provided by the local external application 506. The local external application 506 may be extended and modified to provide desired functions without modifying the ultrasonic application server 504 or the integrated interface program 508. Further, additional entry points 510 n to other operations provided by the ultrasonic server application 504 may require only modification of the integrated interface program 508. Further, multiple external applications may access the integrated interface program 508 by computing the proper instructions and parameters of the commands as defined by the integrated interface program 508.
  • In particular embodiments, the external application is operable to process [0211] two-dimensional and three-dimensional radiation therapy data, fetal image data, cardiac image data, and image guided surgery data. Such applications are employed in the medical field by operators such as surgeons to provide visual feedback about medical information. For example, fetal image data is used to view a fetus in utero. By employing multidimensional data to provide a visual image, conditions such as birth defects, treatable ailments, gender, size, and others can be determined. Similarly, radiation therapy data may be employed to simultaneously display information about the direction and intensity of radiation treatment, and a visual image of the treatment area. Such visual image data may also be employed in image guided surgery, to indicate the location of a surgical instrument. Such information is particularly useful in contexts such as brain surgery, where it may not be possible to expose the afflicted area.
  • FIG. 7B shows an [0212] integrated interface program 508 operable for use with a remote external application. In such an embodiment, a remote external application 512 is running on a remote computing device 514 such as a PC, and is connected to the user computing device 5 over a communication link 518 to a public access network 517 such as the Internet. The integrated interface program 508 includes connection points 516 a . . . 516 n such as remote procedure call (RPC) points or another inter-node communication mechanism. In a particular embodiment the connection points are sockets in accordance with the TCP/IP protocol.
  • Similar to the local [0213] external application 506, the remote external application 512 is operable to compute a command corresponding to an intended operation in the ultrasonic application server 504. The connection points 516 n are generally operable to receive a command transmitted from the remote external application 512. The ultrasonic application server 504 sends a result corresponding to the command, and transmits the result back to the remote external application 512 via the integrated interface program 508 by an inter-node communication mechanism such as that used to transmit the command. Further, the same integrated interface program could have both entry points 510 n, generally to be accessed by the local external application 506, and connection points 516 n, generally accessible by the remote external application 512.
  • FIG. 8 shows a flowchart of external application integration. Referring to FIGS. 6, 7A, [0214] 7B and 8, an external application determines a desired US operation to be employed in processing and/or analysis, as depicted at step 550. The operation may provide data, and may cause a certain result or state change, or a combination. The external application determines the instruction corresponding to this operation, as shown at step 552, as defined by the integrated interface program. The external application then determines if any parameters are required for the operation, as disclosed at step 554. If parameters are required, the external application determines the parameters, as depicted at step 556. If no parameters are required, execution continues. The external application determines a command including the instruction and any required parameters, corresponding to the desired US operation, as shown at step 558. The command is transmitted to the ultrasonic application server via the integrated interface program, as disclosed at step 560. The transmission may be by any suitable method, such as those described above and others, depending on whether the external application is local or remote.
  • Ultrasonic data indicative of ultrasonic image information is received by the [0215] ultrasonic server application 504 via the standard communication interface 40, as depicted at step 562. As described above, the ultrasonic data is received via a test probe disposed in contact with the subject, or patient, for viewing such visual information as radiation therapy data, fetal image data, cardiac image data, and image guided surgery data. The ultrasonic application server 504 executes a result corresponding to the command from the ultrasonic data, as disclosed at step 564. Thus step 564 may involve control signals being generated to define or re-define a region of interest in which radiation is to be directed for treatment. The ultrasonic application server 504 then transmits the computed result to the external application via the integrated interface program 508, as shown at step 566. Note that many successive commands and results are expected to be computed, with the ultrasonic data concurrently sent in an iterative manner over the standard communication interface 40.
  • In another particular embodiment, the integrated application program includes both entry points for local external applications, and connection points for remote external applications. The instructions and parameters corresponding to the entry points are known to the local external application, and the instruction and parameters corresponding to the connection points are known to the remote external application. Further, there may be both an entry point and a connection point operable to invoke the same operation in the integrated application server. In such an embodiment, a semaphore or reentrancy mechanism is employed in the ultrasonic application server to avoid deadlock or simultaneous attempts to invoke the same operation. Both the local and remote external applications invoke the ultrasound application server via the integrated interface program [0216] 508 (FIGS. 7A and 7B).
  • The ultrasonic application server also includes a graphical user interface for manipulating operations without accessing the external application. Referring to FIG. 9, a [0217] control bar 578 of a top level GUI screen is shown. The control bar allows manipulation of tools affecting image settings of the display via image control presets. The image settings are controlled for each of three sizes: small 570 a, medium 570 b, and large 570 c. For each size, the image settings may be controlled, including depth 572, focus 574, and time gain compensation 576. Each of these settings may be saved under a user-defined name for later recall. The user clicks on a save button and is prompted to enter a file name. Each of the three sets of image settings corresponding to the size settings 570 a, 570 b, and 570 c is then stored under the file name, and may be recalled by the user at a later time.
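  • The save-and-recall behavior of these presets can be sketched as follows. The storage format (a JSON file per preset name), default values, and function names are assumptions for illustration only; the patent does not specify a file format.

```python
# Sketch of named image-control presets: depth, focus and time-gain-
# compensation settings kept per display size, saved under a user-chosen
# name and recalled later. The JSON format and values are hypothetical.
import json
import os
import tempfile

settings = {size: {"depth": 10, "focus": 5, "tgc": [50, 50, 50]}
            for size in ("small", "medium", "large")}

def save_preset(name, settings, directory):
    """Store all three size-specific setting groups under one name."""
    path = os.path.join(directory, name + ".json")
    with open(path, "w") as f:
        json.dump(settings, f)
    return path

def recall_preset(path):
    """Reload a previously saved preset."""
    with open(path) as f:
        return json.load(f)

tmp = tempfile.mkdtemp()
settings["medium"]["depth"] = 14          # user adjusts one setting
path = save_preset("obstetric", settings, tmp)
restored = recall_preset(path)
```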
  • Those skilled in the art should readily appreciate that the programs defining the operations and methods defined herein are deliverable to a user computing device and a remote computing device in many forms, including but not limited to a) information permanently stored on non-writeable storage media such as ROM devices, b) information alterably stored on writeable storage media such as floppy disks, magnetic tapes, CDs, RAM devices, and other magnetic and optical media, or c) information conveyed to a computer through communication media, for example using baseband signaling or broadband signaling techniques, as in an electronic network such as the Internet or telephone modem lines. The operations and methods may be implemented in software executable by a processor or as a set of instructions embedded in a carrier wave. Alternatively, the operations and methods may be embodied in whole or in part using hardware components, such as Application Specific Integrated Circuits (ASICs), state machines, controllers or other hardware components or devices, or a combination of hardware, software, and firmware components. [0218]
  • FIG. 10 illustrates a preferred embodiment of a [0219] portable ultrasound system 470 in accordance with the invention. A personal computer 472 such as a laptop, a hand-held computer or a desktop workstation can provide power and a standard interface (e.g. IEEE 1394 or USB) to a housing 474 along cable 476. Housing 474 includes a DC-DC converter to deliver power along cable 480 to the interface housing (482, 490). This interface housing has two or three circuit boards 484, 486, 488 as described previously. A standard transducer housing 496 with transducer array 498 is connected to the interface housing along cable 494 and connector 492. The beamformer integrated circuit mounted on circuit board 486 requires steering data, the transmit circuitry requires data to provide proper transmit focus, and the TGC must have gain level information for a given depth.
  • FIG. 11 illustrates a wearable ultrasound imaging system that can include a belt-mounted [0220] computer 360 or interface connected by cable 362 to a hand-held probe 364, and a second hand-held unit 368 that can include various controls, including a mouse control and buttons to freeze the displayed image or to store a particular image in electronic memory. The unit 368 can be connected by a wireless (RF or infrared) connection or by cable 366 to the housing 360. The computer 360 can be connected to a desktop, laptop or hand-held display, or can be connected by cable to a head-mounted display system 370 that includes a microphone, a pair of speakers for audio, and a high-resolution display positioned adjacent to the user's eye.
  • Another preferred embodiment is illustrated in FIG. 12, in which a [0221] laptop computer 450, having a flat panel display and a standard keyboard, has been programmed to perform scan conversion, Doppler processing, etc., on a beamformed representation of the region of interest that has been transmitted from interface housing 454 along a standard communications link such as cable 458 that conforms to the IEEE 1394 FireWire standard or the USB 2.0 standard, for example. The computer 450 and/or the interface can optionally include a control panel 452, 456, that can be used to control the study being conducted. A preferred embodiment of the interface housing 454 is controlled solely by the personal computer 450 and provides for the use of standard transducer array probes that can be interchangeably attached to the interface housing 454 with a cable. Alternatively, an additional remote controller 464 can be used to control system operation. The interface 454 can house the circuit boards on which the beamformer, memory, system controller and digital communication circuits are mounted. The interface 454 is connected to the hand-held probe 460 with a cable that is preferably between two feet and six feet in length, although longer lengths can be used. The transmit/receive and/or the preamplifier/TGC circuits can be in the probe housing 460 or in the interface housing 454. The computer can also be configured for gigabit Ethernet operation and for transmitting video and image data over networks to remote systems at clinics or hospitals. The video data can also be sent to a VCR or standard video recorder or video camera with an IEEE 1394 port for recording on videotape. The VCR or video camera can be controlled using the computer.
  • Returning to FIG. 1, the [0222] host 5 can be a desktop, laptop, palmtop or other portable computer executing software instructions to display ultrasound images. In addition to real-time B-mode ultrasound images for displaying soft-tissue structures in the human body, Doppler ultrasound data can be used to display an estimate of blood velocity in the body in real time. Three different velocity estimation systems exist: color-flow imaging (CFI), power Doppler, and spectral sonogram.
  • The color-flow imaging modality interrogates a specific region of the body and displays a real-time image of the mean velocity distribution. The CFI images are usually shown on top of the dynamic B-mode image. To indicate the direction of blood flow, different colors denote velocity toward and away from the transducer. [0223]
  • While color flow images display the mean or standard deviation of the velocity of reflectors (i.e., blood cells) in a given region, power Doppler (PD) displays a measurement of the amount of moving reflectors in the area, similar to a B-mode image's display of the total amount of reflectivity. A PD image is an energy image in which the energy of the flow signal is displayed. These images give no velocity information but only show the location of flow. [0224]
  • The spectral Doppler or spectral sonogram modality utilizes a pulsed-wave system to interrogate a single range gate and displays the velocity distribution as a function of time. This sonogram can be combined with a B-mode image to yield a duplex image. Typically, the top side of the display shows a B-mode image of the region under investigation, and the bottom shows the sonogram. Similarly, the sonogram can also be combined with the CFI image to yield a triplex image. Thus, the time for data acquisition is divided between acquiring all three sets of data. Consequently, the frame rate of the triplex image is generally decreased compared to either CFI or duplex imaging. [0225]
  • A pulsed-Doppler processor for color-flow map applications is now described. Color Doppler (CD) or color-flow imaging combines, in a single modality, the capabilities of ultrasound to image tissue and to investigate blood flow. CD images consist of Doppler information that can be color-encoded and superimposed on a B-mode gray-scale image. [0226]
  • Color-flow imaging is a mean velocity estimator. There are two different techniques for computing the mean velocity. First, in a pulsed Doppler system, fast Fourier transforms (FFTs) can be used to yield the velocity distribution of the region of interest, and both the mean and variance of the velocity profile can be calculated and displayed as a color flow image. The other approach uses a one-dimensional autocorrelation. [0227]
  • An estimate of the mean velocity in the range gate gives an indication of the volume flow rate. Given that the frequency of the reflected, range-gated signal is proportional to the flow velocity, the spatial mean velocity is determined by the mean angular frequency: [0228]

  $$\bar{\omega} = \frac{\int_{-\infty}^{+\infty}\omega P(\omega)\,d\omega}{\int_{-\infty}^{+\infty}P(\omega)\,d\omega} \tag{1}$$
  • Here, P(ω) is the power-spectral density of the received, demodulated signal. The inverse Fourier transform of the power-spectral density is the autocorrelation: [0229]

  $$R(\tau) = \int_{-\infty}^{+\infty}P(\omega)\exp(j\omega\tau)\,d\omega \tag{2}$$
  • The derivative of the autocorrelation with respect to τ is: [0230]

  $$R'(\tau) = \int_{-\infty}^{+\infty}j\omega P(\omega)\exp(j\omega\tau)\,d\omega \tag{3}$$
  • Substituting Eqs. (2) and (3) into Eq. (1) yields: [0231]

  $$\bar{\omega} = \frac{R'(0)}{jR(0)} \tag{4}$$
  • Therefore, the mean velocity estimator can be reduced to an estimation of the autocorrelation and the derivative of the autocorrelation. The estimator given by the preceding expression can be calculated when data from two returned lines are used, i.e., [0232]

  $$\bar{\omega} = -f_{\mathrm{prf}}\arctan(\Phi) \tag{5}$$
  • where [0233]

  $$\Phi = \frac{\dfrac{1}{N_c-1}\displaystyle\sum_{i=0}^{N_c-2}\left[y(i+1)x(i) - x(i+1)y(i)\right]}{\dfrac{1}{N_c-1}\displaystyle\sum_{i=0}^{N_c-2}\left[x(i+1)x(i) + y(i+1)y(i)\right]} \tag{6}$$
  • f_prf is the pulse repetition frequency, and N_c is the number of lines used in the autocorrelation estimator. In practice, more than 2 lines are used to improve the signal-to-noise ratio. Data from several RF lines are needed in order to get useful velocity estimates by the autocorrelation technique. Typically, between 8 and 16 lines are acquired for the same image direction. The lines are divided into range gates throughout the image depths, and the velocity is estimated along the lines. [0234]
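  The estimator of Eqs. (5) and (6) reduces to the real and imaginary parts of the lag-one autocorrelation of the slow-time samples. The NumPy sketch below is illustrative only; the function and variable names are not from the specification:

```python
import numpy as np

def kasai_mean_frequency(iq, f_prf):
    """Autocorrelation (Kasai-type) mean-frequency estimate per Eqs. (5)-(6).

    iq : complex array of shape (Nc,), demodulated samples x(i) + j*y(i)
         from one range gate across Nc successive lines.
    """
    x, y = iq.real, iq.imag
    # Numerator and denominator of Eq. (6): imaginary and real parts
    # of the lag-one autocorrelation estimate.
    num = np.mean(y[1:] * x[:-1] - x[1:] * y[:-1])
    den = np.mean(x[1:] * x[:-1] + y[1:] * y[:-1])
    phi = num / den
    return -f_prf * np.arctan(phi)   # Eq. (5)

# Synthetic gate: constant Doppler shift f_d observed over 8 lines.
f_prf, f_d = 5_000.0, 400.0
n = np.arange(8)
iq = np.exp(-2j * np.pi * f_d * n / f_prf)
# Recovers the angular Doppler frequency 2*pi*f_d (within the arctan
# unwrap limit |phase step| < pi/2).
print(kasai_mean_frequency(iq, f_prf))
```

  With 8 lines per estimate, the sums in Eq. (6) run over 7 line pairs, matching the patent's observation that several lines are averaged to improve the signal-to-noise ratio.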
  • For duplex imaging, the CFI pulses are interspersed between the B-mode image pulses. For CFI pulses, it is known that a longer-duration pulse train gives an estimator with a lower variance; however, good spatial resolution necessitates a short pulse train. Consequently, a separate pulse train must be used for the B-mode image, because the CFI pulse train is too long for high-resolution, gray-scale images. [0235]
  • For color-flow imaging (CFI), the velocity estimator is given by Eq. (5). This can be computed by serial processing, since the arrival of samples for a new line results in the addition of the new data to an already calculated sum. Four multiplications, three additions, and a subtraction are performed for each range gate and each new line. Stationary echo cancellation is also performed for each new sample. A filter with N_e coefficients necessitates 2N_e multiplications and additions per gate and line. [0236]
  • Assuming that all data samples are used for CFI imaging, the total number of multiplications and additions per second is [0237]

  $$N_{\mathrm{ops}} = (2N_e + 2)Mf_0 \tag{7}$$
  • where Mf_0 is the number of data samples per second. This is a conservative value, since B-mode lines are interspersed with CF imaging lines, causing time to be lost switching between modes. It follows that [0238]

  $$N_{\mathrm{ops}} = \eta\,(2N_e + 2)Mf_0\,\frac{N_c - N_B}{N_c} \tag{8}$$
  • where N_c is the number of CFI lines per estimate, N_B is the number of B-mode image lines interspersed between CFI lines, and η denotes the fraction of time spent acquiring useful data. [0239]
  • For a CFI system using 8 lines per estimate, an echo cancellation filter with 4 coefficients, and an 8-times-oversampled 41 MHz pulse, one B-mode line is interspersed between CFI lines and 80% of the time is consumed acquiring data. Using Eq. (8), the number of calculations per second is N_ops = 172×10^6. This is within the capability of a current Pentium-class laptop computer. Thus, all of the CFI signal processing can be performed in software using a state-of-the-art microprocessor. [0240]
  • While Color Flow Imaging (CFI) has been an effective diagnostic tool in clinical cardiovascular applications, Power Doppler (PD) imaging provides an alternative method of displaying the blood stream in the insonified regions of interest. While CF imaging displays the mean or standard deviation of the velocity of reflectors (e.g., blood cells) in a given region, PD displays a measurement of the density of moving reflectors in the area, similar to the B-mode image's display of reflectivity. Thus, Power Doppler is akin to a B-mode image with stationary reflectivity suppressed. This is particularly useful for viewing moving particles with small cross-sectional scattering, such as red blood cells. [0241]
  • Power Doppler displays the integrated Doppler power instead of the mean frequency shift used for color Doppler imaging. As discussed in the previous section, the color-flow mapping is a mean-frequency estimator that is expressed as [0242]

  $$\bar{\omega} = \frac{\int_{-\infty}^{+\infty}\omega P(\omega)\,d\omega}{\int_{-\infty}^{+\infty}P(\omega)\,d\omega} \tag{9}$$
  • where {overscore (ω)} represents the mean-frequency shift and P(ω) is the power-spectral density of the received signal. The inverse Fourier transform of the power-spectral density is the autocorrelation: [0243]

  $$R(\tau) = \int_{-\infty}^{+\infty}P(\omega)\exp(j\omega\tau)\,d\omega \tag{10}$$
  • The total Doppler power can be expressed as the integral of the power-spectral density over all angular frequencies: [0244]

  $$pw = \int_{-\infty}^{+\infty}P(\omega)\,d\omega \tag{11}$$
  • By observing the similarities between Eqs. (2) and (10), it follows that the 0th lag of the autocorrelation function can be used to compute the integrated total Doppler power: [0245]

  $$R(0) = \int_{-\infty}^{+\infty}P(\omega)\exp(0)\,d\omega = \int_{-\infty}^{+\infty}P(\omega)\,d\omega = pw \tag{12}$$
  • In other words, the integrated power in the frequency domain is the same as the integrated power in the time domain, and hence the power Doppler can be computed from either the time-domain or the frequency-domain data. In either case, the undesired signals from the surrounding tissue, such as the vessel walls, should be removed via filtering. This filtering operation is also referred to as a wall filter. [0246]
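  Following Eq. (12), the integrated Doppler power for one range gate is the zero-lag autocorrelation of the wall-filtered slow-time signal. A minimal NumPy sketch (illustrative only, not the patent's implementation; the default wall filter here is a simple two-pulse canceller):

```python
import numpy as np

def power_doppler(iq, wall_coeffs=(1.0, -1.0)):
    """Integrated Doppler power (Eq. 12) after stationary-echo (wall) filtering.

    iq : complex array (Nc,) of slow-time samples for one range gate.
    wall_coeffs : FIR high-pass; the default (1, -1) is a two-pulse canceller.
    """
    filtered = np.convolve(iq, wall_coeffs, mode='valid')  # remove stationary tissue
    return np.mean(np.abs(filtered) ** 2)                  # R(0) of filtered signal

# Stationary tissue (constant echo) plus a weak moving-blood component.
n = np.arange(16)
tissue = 10.0 * np.ones_like(n, dtype=complex)
blood = 0.5 * np.exp(2j * np.pi * 0.2 * n)
print(power_doppler(tissue))          # ~0: stationary echo cancelled
print(power_doppler(tissue + blood))  # dominated by the moving component
```

  Mapping this power estimate over all gates and lines yields the power Doppler image, with stationary reflectivity suppressed as described above.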
  • In a preferred embodiment, the PD can be computed in software running on a microprocessor, similar to the computation of the CFI processing described above. Parallel computation units, such as those in the Intel Pentium™ and Pentium II's MMX coprocessors, allow rapid computation of the required functions. A digital signal processor (DSP) can also be used to perform this task. In either case, a software implementation permits the flexibility to change and investigate digital signal processing algorithms and transmitted signals to achieve the best performance as the region of interest changes. [0247]
  • The above showed that the frequency content of the Doppler signal is related to the velocity distribution of the blood. It is common to devise a system for estimating blood movement at a fixed depth in tissue. A transmitter emits an ultrasound pulse that propagates into and interacts with tissue and blood. The backscattered signal is received by the same transducer and amplified. For a multiple-pulse system, one sample is acquired for each line or pulse emitted. A display of the distribution of velocities can be made by Fourier transforming the received signal and showing the result. This display is also called a sonogram. Often a B-mode image is presented along with the sonogram in a duplex system, and the area of investigation, or range gate, is shown as an overlay on the B-mode image. The placement and size of the range gate is determined by the user. In turn, this selects the epoch for data processing. The range gate length determines the area of investigation and sets the length of the emitted pulse. [0248]
  • The calculated spectral density is displayed on a screen with frequency on the y-axis and time on the x-axis. The intensity of a pixel on the screen indicates the magnitude of the spectrum; thus, it is proportional to the number of blood scatterers moving at a particular velocity. [0249]
  • The range gate length and position are selected by the user. Through this selection, both the emitted pulse and the pulse repetition frequency are determined. The size of the range gate is determined by the length of the pulse. The pulse duration is [0250]

  $$T_p = \frac{2l_g}{c} = \frac{M}{f_0} \tag{13}$$
  • where the gate length is l_g and M is the number of periods. The gate duration determines how rapidly pulse-echo lines can be acquired. This is referred to as the pulse-repetition frequency: [0251]

  $$f_{\mathrm{prf}} \le \frac{c}{2d_0} \tag{14}$$
  • where d_0 is the distance to the gate. For example, a 4-period, 7 MHz pulse is used for probing a blood vessel lying at a depth of 3 cm with a 10 ms observation time. [0252]
  • The gate length is computed as [0253]

  $$l_g = \frac{Mc}{2f_0} = 0.44\ \mathrm{mm} \tag{15}$$
  • The pulse-repetition frequency is [0254]

  $$f_{\mathrm{prf}} \le \frac{c}{2d_0} \approx 25\ \mathrm{kHz} \tag{16}$$
  • The total number of independent spectral lines is N = T_obs·f_prf = 250. It follows that the maximum detectable velocity is [0255]

  $$v_{\max} = \frac{f_{\mathrm{prf}}}{2}\,\frac{c}{2f_0} = 1.4\ \frac{\mathrm{m}}{\mathrm{s}} \tag{17}$$
  • Using a 256-point FFT to compute the Fourier transform, the total number of multiplications/additions per second required for the preceding example is less than 10 MOPs/s. In a preferred embodiment, the sonograph computation can be carried out in software running on a microprocessor (similar to the computation of the CFI processing described above). Parallel computation units, such as those inside the Intel Pentium™ and Pentium II's MMX coprocessors, allow rapid computation of the required FFT functions. All three velocity estimation systems can be implemented in software on current microprocessors, such as the Intel Pentium, or digital signal processors (DSP). [0256]
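  As an illustrative sketch of the sonogram computation (NumPy; names are not from the specification), each observation epoch of range-gated samples is Fourier transformed to produce one frequency column of the frequency-versus-time display:

```python
import numpy as np

def sonogram_column(iq_lines, nfft=256):
    """One column of the spectral Doppler display.

    iq_lines : complex array (N,) of range-gated samples, one per emitted
               pulse. Stacking the columns of successive epochs side by side
               gives the frequency-vs-time sonogram image.
    """
    spec = np.fft.fftshift(np.fft.fft(iq_lines, n=nfft))
    return np.abs(spec) ** 2

# 250 lines acquired in a 10 ms epoch at f_prf = 25 kHz, as in the example.
f_prf, f_d = 25_000.0, 2_000.0
n = np.arange(250)
iq = np.exp(2j * np.pi * f_d * n / f_prf)
col = sonogram_column(iq)
freqs = np.fft.fftshift(np.fft.fftfreq(256, d=1 / f_prf))
print(freqs[np.argmax(col)])   # peak near the 2 kHz Doppler shift
```

  A 256-point FFT per 250-line epoch keeps the workload at the few-MOPs level noted above.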
  • Methods employing contrast agents have been developed to enhance certain imaging methods. Stabilized microbubbles are used for ultrasound contrast imaging because of their unique acoustic properties compared to biological tissues. They present superior backscattering and nonlinear behavior, and fragility upon exposure to ultrasound. A number of ultrasound imaging modalities have been created to exploit these features. [0257]
  • In fundamental B-Mode imaging, the transmitting and receiving frequencies are the same. The echogenicity of blood is significantly increased with the administration of a contrast material. Gas microbubbles scatter sound much more intensely than an equivalent size liquid or solid particle owing to the acoustic impedance mismatch (particularly the difference in compressibility) between the gas and the surrounding tissue or blood. This effect will be observed in Doppler and M-Mode imaging techniques as well. One disadvantage of using fundamental B-Mode for contrast imaging is that the level of the echoes created by the bubbles is similar to the level of the echoes resulting from the biological tissues. [0258]
  • A technique using the second harmonic relies on the fact that bubbles generate harmonics of the transmitted frequency at a level much higher than the harmonics generated by the tissues. By creating images from the signal received at twice the transmitted frequency, high image contrast is achieved between regions with and without bubbles. A problem with this imaging modality is that a short pulse (typically used in B-mode imaging) has a broad bandwidth and the transmitting and receiving frequencies overlap, contaminating the harmonic image with the fundamental frequency. To alleviate this problem, the pulse length is increased to achieve a narrow bandwidth, however, at the expense of decreasing the axial resolution of the image. [0259]
  • The pulse inversion method (also called wideband harmonic imaging or dual-pulse imaging) solves the problem of overlapping frequencies observed with the second harmonic technique. Each scan line is formed by summing the signals received from two ultrasound pulses, where the second pulse is inverted and slightly delayed relative to the first. This procedure cancels the response of all linear scatterers (if there is no tissue movement between the two pulses) while enhancing the effects of nonlinear scatterers. Because there is a delay between the two pulses, any bubble displacement adds an additional signal, resulting in velocity-dependent enhancement. [0260]
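  A toy numeric illustration of this cancellation, assuming a memoryless quadratic model for the bubble echo (an assumption made here purely for demonstration; real bubble dynamics are more complex):

```python
import numpy as np

# Pulse-inversion demo: summing the echoes of a pulse and its inverted copy
# cancels a linear scatterer but leaves the even-order (bubble-like) term.
t = np.linspace(0, 1e-6, 200)
pulse = np.sin(2 * np.pi * 5e6 * t)        # transmitted pulse
inverted = -pulse                          # second, phase-inverted pulse

linear = lambda p: 0.8 * p                 # linear tissue echo
nonlinear = lambda p: 0.8 * p + 0.1 * p**2 # quadratic term models a bubble

print(np.max(np.abs(linear(pulse) + linear(inverted))))        # cancels to ~0
print(np.max(np.abs(nonlinear(pulse) + nonlinear(inverted))))  # ~0.2 residual
```

  The residual is twice the quadratic term, which is why the summed line carries high contrast between regions with and without bubbles.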
  • Because most ultrasound contrast agents are destroyed by ultrasound irradiation, intermittent or gated imaging techniques have been used. By acquiring an image frame at each cardiac cycle (or after several cardiac cycles), ultrasound exposure is reduced, increasing the longevity of the contrast agents in the region of interest. Another benefit of intermittent imaging is the filling of the vascular space during the off-cycle. The degree of filling produces enhancement that is directly related to blood volume and blood flow, since the higher the flow rate, the greater the number of bubbles that enter the region of interest, and thus the greater the fractional blood volume. [0261]
  • The stimulated acoustic emission method (also known as transient response imaging) typically involves color Doppler with the transmitting power set high to ensure bubble disruption with the first pulse. When the bubbles collapse, a broadband acoustic signal is generated. Since ultrasound Doppler systems compare the backscattered signal with respect to a “clean” reference signal, this loss of frequency correlation caused by the bubble collapse is interpreted by the machine as a random Doppler shift, resulting in a mosaic of colors at the location of the microbubbles. [0262]
  • A preferred embodiment of the invention employs a spatial filter in providing a power Doppler image, for example. This spatial or high-pass filter can also be used effectively with a contrast agent to further differentiate between blood flow and the surrounding vessel or artery. First the power is computed, and a two-pulse canceller is employed. The ratio of the power of the signal before and after the filter provides a data set yielding clear images of moving fluid within the body. [0263]
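  A minimal sketch of the described power-ratio computation (NumPy; the function name and test signals are illustrative, not from the specification):

```python
import numpy as np

def flow_power_ratio(iq):
    """Ratio of signal power after vs. before a two-pulse canceller.

    Near 0 for a gate dominated by stationary tissue; larger where moving
    blood (or contrast agent) dominates the gate.
    """
    before = np.mean(np.abs(iq) ** 2)
    after = np.mean(np.abs(np.diff(iq)) ** 2)  # two-pulse canceller output power
    return after / before

n = np.arange(32)
vessel_wall = 20.0 * np.ones_like(n, dtype=complex)  # strong stationary echo
flow = np.exp(2j * np.pi * 0.25 * n)                 # moving scatterers
print(flow_power_ratio(vessel_wall + 0.1 * flow))    # small: mostly wall
print(flow_power_ratio(0.1 * vessel_wall + flow))    # larger: mostly flow
```

  Displaying this ratio per gate suppresses the bright but stationary vessel wall relative to the moving fluid, as the paragraph above describes.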
  • FIG. 13 shows the top-level screen of a graphical user interface (GUI) for controlling the ultrasonic imaging system. Referring to FIG. 13, ultrasonic image data gathered by the hand-held probe is displayed and manipulated by the ultrasonic imaging system using this screen. A selection bar 702 allows the operator to select the active focus areas of the screen. An image area 704 displays the ultrasonic image of the subject area. A patient information area 706 displays information about the subject from whom ultrasonic data is being gathered. A Time Gain Compensation area 708 provides feedback about time gain compensation, described further below. A control bar 710 allows qualitative and quantitative selection of ultrasonic imaging operations, as will be described further below with respect to FIGS. 15A and 15B. [0265]
  • FIG. 14 shows the unitary, directional keypad which provides a single operating position from which to control the ultrasonic imaging operations. Referring to FIG. 14, an up arrow key 712 and a down arrow key 714 allow a user to scroll through the qualitative ultrasonic imaging operations of the system, as will be described further below. A left arrow key 716 and a right arrow key 718 allow a user to select quantitative parameters corresponding to the ultrasonic imaging operation selected. As described above, the quantitative parameters may be in a range of discrete values, or may span a continuum. A control key 720, employed in conjunction with the up arrow key 712 or down arrow key 714, allows an operator to toggle between the two control tabs depicted in FIGS. 15A and 15B, as will be described further below. Since all keys employed in controlling and selecting the ultrasonic imaging operations are accessible from a common operating position, an operator may focus on the ultrasonic image of the subject and on the hand-held probe, and need not be distracted by unwieldy controls. Traditional directional keypads allow only directional control to be applied, and do not allow both qualitative and quantitative selection of operations from a common, unitary operating position accessible by a single hand. [0266]
  • FIGS. 15A and 15B show qualitative and quantitative selection of ultrasonic imaging operations via the unitary directional keypad of FIG. 14. Referring to FIG. 15A, ultrasonic imaging operations applicable to scanning are shown. The scanning operations are directed to active acquisition of real-time, dynamic ultrasonic image data, and are typically applied as the hand-held probe is manipulated over the subject imaging area. A size operation 722 sets a series of predetermined defaults for other ultrasonic imaging operations. A small, medium, or large subject may be selected via the left and right arrow keys 716, 718 (FIG. 14). A depth operation 724 allows selection of a depth parameter via the arrow keys 716, 718. Focus is controlled by a focus 726 operation. Gain 728 control adjusts the TGC for all TGC settings 730 a-730 h. TGC operations 730 a-730 h adjust amplification of return signals at varying depths, ranging from the least depth 730 a to the greatest depth 730 h, via the arrow keys 716, 718. [0267]
  • Referring to FIG. 15B, ultrasonic imaging operations applicable to processing are shown. The processing operations may be applied to static real-time or frozen images. An inversion operation is controlled by the inversion 732 selection, and rotates the image via the arrow keys 716, 718 (FIG. 14). Palette, smoothing, persistence, and mapping operations 734, 736, 738, and 740, respectively, are selected via the up and down arrow keys 712, 714, and parameters are selected via the arrow keys 716, 718 (FIG. 14). Brightness and contrast scales are selected via sliders 742 and 744, respectively, and are changed using arrow keys 716, 718. [0268]
  • FIG. 16 shows a state diagram depicting transition between the ultrasonic imaging operations depicted in FIGS. 15A and 15B. Referring to FIGS. 1, 14, and 16, the Tab 1 746 operations are selected via the up and down arrow keys 712, 714 and transition according to the following state sequence: size 600, depth 602, focus 604, gain 606, and TGC settings 608, 610, 612, 614, 616, 618, 620, and 622. Similarly, the Tab 2 748 operations are selected according to the following sequence: invert 624, palette 626, smoothing 628, persistence 630, map 632, brightness 634, and contrast 636. As indicated above, selection of operations may be toggled between Tab 1 746 and Tab 2 748 using the control key 720 and arrow keys 712, 714. [0269]
  • The scanning operations shown in FIG. 15A are displayed on Tab 1 746, as shown in FIG. 13. The processing operations shown in FIG. 15B are displayed and selected on Tab 2 748, as shown in FIG. 13. Referring again to FIG. 14, control is toggled between Tab 1 746 and Tab 2 748 using a combination of the control key 720 and either the up or down arrow keys 712, 714, as shown by dotted lines 638 a and 638 b. [0270]
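  The tab and arrow-key transitions above amount to a small state machine. The sketch below is illustrative only, with operation names paraphrased from FIGS. 15A-15B rather than taken from the patent's reference numerals:

```python
# Sketch of the Tab 1 / Tab 2 control-state cycling of FIG. 16.
TAB1 = ["size", "depth", "focus", "gain"] + [f"tgc{i}" for i in range(1, 9)]
TAB2 = ["invert", "palette", "smoothing", "persistence", "map",
        "brightness", "contrast"]

class Keypad:
    def __init__(self):
        self.tabs = {"tab1": TAB1, "tab2": TAB2}
        self.tab, self.index = "tab1", 0

    def down(self):
        """Down arrow: advance to the next operation on the active tab."""
        ops = self.tabs[self.tab]
        self.index = (self.index + 1) % len(ops)
        return ops[self.index]

    def up(self):
        """Up arrow: return to the previous operation on the active tab."""
        ops = self.tabs[self.tab]
        self.index = (self.index - 1) % len(ops)
        return ops[self.index]

    def ctrl_toggle(self):
        """Control + arrow: switch between the two tabs."""
        self.tab = "tab2" if self.tab == "tab1" else "tab1"
        self.index = 0
        return self.tabs[self.tab][self.index]

kp = Keypad()
print(kp.down())         # from "size" to "depth"
print(kp.ctrl_toggle())  # jumps to Tab 2's first operation, "invert"
```

  The wrap-around indexing mirrors the cyclic state sequences of FIG. 16, and the single toggle method reflects the one-handed control-key combination described above.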
  • In general the use of medical ultrasound systems requires the user to have significant training and regular practice to keep skills at a high level. Another embodiment of the invention involves providing the user with an intuitive and simple way to use the interface, and with the ability to quickly and automatically set imaging parameters based on a software module. This enables general medical personnel with limited ultrasound experience to obtain diagnostic-quality images without having to adjust the controls. The “Quick Look” feature provides the user with a very simple mechanism of image optimization. It allows the user to simply adjust the image so as to obtain appropriate diagnostic image quality with one push of one button. [0271]
  • The benefits of programmed image parameters are many. The user no longer is required to adjust multiple controls in order to obtain a good image. Exams may be performed in a shorter period of time as a result. The use of this feature also results in more uniform images, regardless of the skills and expertise of the user. This approach is advantageous when performing exams under adverse circumstances such as emergency medical procedures performed in ambulances or remote locations. [0272]
  • The procedure involves the use of predefined histograms. Separate histograms are provided for different anatomical structures that are to be examined. The user chooses a structure, similar to the existing method of choosing a preset. Once the structure is chosen, the user places the transducer on the area of interest in the scanning window. At that time, pressing the selected control button triggers the system to adjust the system contrast and brightness control values so that a histogram of the gray levels in the image closely matches the corresponding pre-defined histogram for that structure. The result is an image of diagnostic image quality that is easily recreated. [0273]
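  The specification calls for adjusting contrast and brightness until the image's gray-level histogram closely matches the stored target for the chosen anatomy. One conventional way to realize such an adjustment is classic histogram matching, sketched below; this particular algorithm is an assumption for illustration, not mandated by the patent:

```python
import numpy as np

def quick_look(image, target_hist, bins=256):
    """Remap gray levels so the image histogram approximates a predefined
    target histogram (classic histogram matching via CDF lookup).

    image : integer array of gray levels in [0, bins).
    target_hist : (bins,) predefined histogram for the chosen anatomy.
    """
    src_hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    src_cdf = np.cumsum(src_hist) / image.size
    tgt_cdf = np.cumsum(target_hist) / np.sum(target_hist)
    # For each source gray level, find the target level whose CDF is closest.
    mapping = np.searchsorted(tgt_cdf, src_cdf)
    return mapping[image.astype(np.int64)]

rng = np.random.default_rng(0)
dark = rng.integers(0, 64, size=(64, 64))   # under-exposed scan
target = np.ones(256)                        # flat target histogram
adjusted = quick_look(dark, target)
print(dark.mean(), adjusted.mean())          # brightness pulled up toward target
```

  A single button press can thus drive the display toward the stored histogram, with the independent brightness/contrast control described below left available for fine-tuning under different ambient lighting.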
  • The procedure is highly dependent upon the brightness and contrast controls. As a result, a preferred embodiment provides an independent control which allows the user to adjust for ambient lighting changes. In many applications the programmed parameters get the user very close, but the user may choose to fine-tune the contrast and brightness. [0274]
  • Referring to FIG. 17A, the integrated probe system 24 includes the front-end probe 3, the host computer 5, and a portable information device such as a personal digital assistant (PDA) 9. The PDA 9, such as a Palm Pilot device or other hand-held computing device, is a remote display and/or recording device 9. In the embodiment shown, the front-end probe 3 is connected to the host computer 5 by the communication link 40, which is a wired link. The host computer 5, a computing device, is connected to the PDA 9 by a communication link or interface 46, which is a wireless link 46. [0275]
  • Because the integrated ultrasound probe system 24 in the embodiment described has a Windows®-based host computer 5, the system can leverage the extensive selection of software available for the Windows® operating system. One potentially useful application is electronically connecting ultrasound systems, allowing physicians to send and receive messages, diagnostic images, instructions, and reports, or even remotely control the front-end probe 3 using the system. [0276]
  • The connections through the communication links or interfaces 40 and 746 can be either wired through an Ethernet or wireless through a wireless communication link such as, but not limited to, IEEE 802.11a, IEEE 802.11b, Hyperlink, or HomeRF. FIG. 17A shows a wired link for the communication link 40 and a wireless link for the communication link 746. Alternative embodiments and protocols for wired links are described above with respect to FIG. 1. It is recognized that other wired embodiments or protocols can be used. [0277]
  • The wireless communication link 746 can use various different protocols, such as an RF link, which may be implemented using all or parts of a specialized protocol such as the IEEE 1394 protocol stack or the Bluetooth system protocol stack. IEEE 1394 is a preferred interface for high-bandwidth applications such as high-quality digital video editing of ultrasonic imaging data. The Bluetooth protocol uses a combination of circuit and packet switching. Slots can be reserved for synchronous packets. Bluetooth can support an asynchronous data channel, up to three simultaneous synchronous channels, or a channel which simultaneously supports asynchronous data and synchronous voice. Each synchronous channel supports a 64 kb/s synchronous (voice) channel in each direction. The asynchronous channel can support a maximum of 723.2 kb/s asymmetric, or 433.9 kb/s symmetric. [0278]
  • The Bluetooth system consists of a radio unit, a link control unit, and a support unit for link management and host terminal interface functions. The link controller carries out the baseband protocols and other low-level link routines. [0279]
  • The Bluetooth system provides a point-to-point connection (only two Bluetooth units involved), or a point-to-multipoint connection. In the point-to-multipoint connection, the channel is shared among several Bluetooth units. Two or more units sharing the same channel form a piconet. One Bluetooth unit acts as the master of the piconet, whereas the other units act as slaves. Up to seven slaves can be active in a piconet. [0280]
  • The Bluetooth link controller has two major states: STANDBY and CONNECTION. In addition, there are seven substates: page, page scan, inquiry, inquiry scan, master response, slave response, and inquiry response. The substates are interim states that are used to add new slaves to a piconet. [0281]
  • The link may also be implemented using, but not limited to, HomeRF or the IEEE 802.11 wireless LAN specification. For more information on the IEEE 802.11 wireless LAN specification, see the Institute of Electrical and Electronics Engineers (IEEE) standard for wireless LANs, incorporated herein by reference. IEEE standards can be found on the World Wide Web at the Universal Resource Locator (URL) www.ieee.org. For example, hardware supporting IEEE standard 802.11b provides a communications link between two personal computers at 2 and 11 Mbps. The frequency band allocated for transmission and reception of the signals is approximately 2.4 GHz. In comparison, IEEE standard 802.11a provides 54 Mbps communications. The frequency allocation for this standard is around 5 GHz. Recently, vendors, such as Proxim, have manufactured PC Cards and access points (basestations) that use a proprietary data-doubling chipset technology to achieve 108 Mbps communications. The chip that provides the data doubling (the AR5000) is manufactured by Atheros Communications. As with any radio system, the actual data rate maintained between two computers is related to the physical distance between the transmitter and receiver. [0282]
  • The wireless link 746 can also take on other forms, such as an infrared communications link as defined by the Infrared Data Association (IrDA). Depending on the type of communication desired (i.e., Bluetooth, infrared, etc.), the host computer 5 and the remote display and/or recording device 9 each has the desired communication port. [0283]
  • FIG. 17B shows the communication link 40 between the probe 3 and the host computer 5 as a wireless link. The communication link 746 between the host computer 5 and the PDA 9 is shown as a wired link. [0284]
  • The integrated probe system 24 of FIG. 17C has wireless links for both the communication link 40 between the probe 3 and the host computer 5 and the communication link 746 between the host computer 5 and the PDA 9. It is recognized that wired and wireless links can be used together or, in the alternative, the links in a system 24 can be exclusively wired or exclusively wireless. [0285]
  • The remote display and/or recording device 9 of the integrated probe system 24 of FIG. 18 is a remote computing system 26. The remote computing system 26, in addition to having remote display and/or recording capability, can also remotely control the probe 3. The communication link 746 is shown as a wireless link. The communication link 40 between the probe 3 and the host computer 5 is shown as a wired link. [0286]
  • An example of a remote control system includes a wearable computer (such as the one manufactured by Xybernaut Corporation), a pair of high-speed wireless PC Cards (such as those provided by Proxim), the ultrasound program, and the probe 3. A portable networked ultrasound system can be configured weighing less than 2.5 pounds. Using a program similar to Microsoft® NetMeeting, a real-time connection between a remote PC and the wearable computer can be established. The remote host can monitor all interactions with the wearable computer, including real-time ultrasound imaging (at display rates up to approximately 4 frames per second). NetMeeting can also be used to “take control” of the wearable computer and manage the ultrasound session from the remote personal computer in real time. In addition, images and iterative executable software instructions that are archived to the hard disk on the wearable computer can be transferred at 108 Mbps to the host computer. With this technology, real-time ultrasound diagnoses can be performed and relayed to a remote site at speeds that rival a hardwired 100 million bits per second (Mbps) local area network (LAN). [0287]
  • FIG. 19 illustrates an integrated probe system 800 that has a hub 748 for connecting a plurality of remote devices 9 to the host computer 5. The communication links 750 from the hub 748 to the remote devices are shown as both wireless and wired links. It is recognized that a completely wired network, such as a LAN or Ethernet, can be used. In the alternative, with a wireless transceiver and port in each of the computers (remote devices) 9, a wireless network/communication system can readily be established. With the recent advent of high-speed wireless standards, such as IEEE 802.11a, the communications between the remote and local machines can rival that of a wired, 100 Mbps local area network (LAN). Another alternative is using a Bluetooth system to form a piconet. [0288]
  • The increasing use of combined audio-visual and computer data is leading to a greater need for multimedia networking capabilities, and solutions are beginning to emerge that are included in preferred embodiments of the present invention. Standardization of multimedia networking is underway, and IEEE 1394 is emerging as the leading contender, capable of interfacing with a number of audio-visual (AV), computer, and other digital consumer electronics devices and providing transmission bandwidth of up to 400 Mbps. [0289]
  • Preferred embodiments use IEEE 1394 technology with a wireless solution for the transmission of 1394 protocols over IEEE 802.11, the emerging standard for wireless data transmission in the corporate environment and, increasingly, in the home as well. In a preferred embodiment, IEEE 1394 is implemented as a Protocol Adaptation Layer (PAL) on top of the 802.11 radio hardware and Ethernet protocols, bringing together a convergence of these important technologies. This protocol adaptation layer enables the PC to function as a wireless 1394 device. The engineering goal is real delivered IEEE 1394 bandwidth sufficient for the transmission of a single high-definition MPEG2 video stream (or multiple standard-definition MPEG2 video streams) from one room in a facility to another. [0290]
  • Preferred embodiments of the present invention include the use of wireless transmission of IEEE 1394 at 2.4 GHz using Wi-LAN's Wideband Orthogonal Frequency Division Multiplexing (W-OFDM) technology. This development establishes W-OFDM, the most bandwidth-efficient wireless transmission technology, as one of the technologies capable of providing the data rates necessary for in-home multimedia networking. [0291]
  • The Wireless IEEE 1394 system includes an MPEG-2 data stream generator, which feeds a multiple transport stream into a Set Top Box (STB) such as provided by Philips Semiconductors. The STB converts this signal to an IEEE 1394 data stream and applies it to the W-OFDM radio system such as provided by Wi-LAN™. The radio transmitter then sends the IEEE 1394 data stream over the air to the corresponding W-OFDM receiver in the host computer, for example. On the receive side, the IEEE 1394 signal is demodulated and sent to two STBs, which display the content of the different MPEG-2 data streams on two separate TV monitors. Using IEEE 1394 as the interface for the wired part of the network optimizes the entire system for transmission of isochronous information (voice, live video) and provides an ideal interface to multimedia devices in the facility. W-OFDM technology is inherently immune to the effects of multipath. Like all modulation schemes, OFDM encodes data inside a radio frequency (RF) signal. Radio communications are often obstructed by noise and by stray and reflected signals. By sending high-speed signals concurrently on different frequencies, OFDM technology offers robust communications. OFDM-enabled systems are highly tolerant of noise and multipath, making wide-area and in-home multi-point coverage possible. Additionally, as these systems are very efficient in their use of bandwidth, many more high-speed channels are possible within a frequency band. W-OFDM is a cost-effective variation of OFDM that allows much larger throughputs than conventional OFDM by using a broad frequency band. W-OFDM further processes the signal to maximize the range. These improvements to conventional OFDM result in dramatically increased transmission speeds. [0292]
  • OFDM technology is becoming increasingly visible as American and European standardization committees are choosing it as the only technology capable of providing reliable wireless high data rate connections. European terrestrial digital video broadcasting uses OFDM, and the IEEE 802.11 working group recently selected OFDM in its proposed 6 to 54 Mbps wireless LAN standard. The European Telecommunications Standards Institute is considering W-OFDM for the ETSI BRAN standard. Detailed information on Wi-LAN™ can be found on the Web at http://www.wi-lan.com/. Philips Semiconductors is a division of Royal Philips Electronics, headquartered in Eindhoven, The Netherlands. Additional information on Philips Semiconductors can be obtained by accessing its home page at http://www.semiconductors.philips.com/. [0293]
  • Further, NEC Corporation's wireless transmission technology based on the [0294] IEEE 1394 high-speed serial bus, capable of 400 megabits per second (Mbps) at transmission ranges of up to 7 meters through interior walls and up to 12 meters by line-of-sight, may also be used in preferred embodiments. This embodiment uses 60 GHz millimeter wavelength transmissions, which do not require any kind of license, with the amplitude shift keying (ASK) modulation scheme and a low cost transceiver. This embodiment incorporates an echo detection function in NEC's PD72880 400 Mbps long-distance transmission physical layer device to prevent the influence of signal reflections, a significant obstacle to stable operation of IEEE 1394 over a wireless connection.
  • [0295] Wireless IEEE 1394 can play an important role in bridging the PC to clusters of interconnected IEEE 1394 devices, which can be in another room in the facility. Three example applications are sourcing a video or audio stream from a PC, providing Internet content and connectivity to an IEEE 1394 cluster, and providing command, control and configuration capabilities to the cluster. In the first embodiment, the PC may provide data to someone in another room in a facility. In the second embodiment, the PC may provide an avenue for 1394 enabled devices to access the Internet. In the third embodiment, the PC plays the role of orchestrating activities in the 1394 clusters and routing data within the clusters and over bridges, though the actual data does not flow through the PC.
  • FIG. 20 is a diagram showing the provision of wireless access to the images created by a preferred embodiment ultrasound imaging system and the associated architecture. The [0296] imaging system 906 exports patient information and images to files in corresponding folders. Executable software instructions provide all of the functionality required to implement the ultrasonic imaging methods described hereinbefore.
  • The [0297] wireless agent 910 serves to detect patient directories and image files and opens a port to which wireless clients can connect. Upon establishing a connection, it sends the client a list of patients and corresponding images. For example, the wireless agent 910 may include data interface circuitry which may include a first port such as an RF interface port.
  • The [0298] wireless viewer 912, residing on the handheld, can establish a connection to the wireless agent 910 and retrieve patient and image information. Upon user selection of the patient and image, it initiates file transmission from the wireless agent. Upon receiving an image, the viewer 912 displays the image along with patient information. The image is stored on the handheld for future use. The handheld user can view images retrieved in previous sessions or can request new image transmission.
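The agent/viewer exchange described above can be sketched as a simple length-prefixed client/server protocol. This is a minimal illustration only: the port number, the JSON listing format, and the request syntax are assumptions for the sketch, not part of the original system, which communicates over a wireless (e.g., 802.11b) link.

```python
import json
import socket

def run_agent(patients, host="127.0.0.1", port=5900, ready=None):
    """Wireless-agent sketch: send the patient/image listing, then serve
    one image request. `patients` maps patient -> {image name: bytes}."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    if ready is not None:
        ready.set()                       # signal that the port is open
    conn, _ = srv.accept()
    with conn:
        # On connection, send the patient directory listing as JSON.
        listing = json.dumps({p: list(imgs) for p, imgs in patients.items()})
        conn.sendall(len(listing).to_bytes(4, "big") + listing.encode())
        # Serve one request of the form "<patient>/<image>\n".
        request = conn.makefile().readline().strip()
        patient, image = request.split("/", 1)
        data = patients[patient][image]
        conn.sendall(len(data).to_bytes(4, "big") + data)
    srv.close()

def viewer_fetch(host, port, patient, image):
    """Viewer side: read the listing, then request and receive one image."""
    cli = socket.create_connection((host, port))
    with cli:
        n = int.from_bytes(_recv_exact(cli, 4), "big")
        listing = json.loads(_recv_exact(cli, n))
        cli.sendall(f"{patient}/{image}\n".encode())
        n = int.from_bytes(_recv_exact(cli, 4), "big")
        return listing, _recv_exact(cli, n)

def _recv_exact(sock, n):
    """Read exactly n bytes from a stream socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed early")
        buf += chunk
    return buf
```

In practice the agent would loop over many requests and clients; the single-request flow above mirrors the connect, list, select, and transmit sequence described for the wireless agent 910 and viewer 912.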
  • FIG. 24 is a block diagram illustrating a portable information device such as a personal digital assistant (PDA) or any computing device according to an exemplary embodiment of the present invention. The link interface or [0299] data interface circuitry 1020 illustrates, but is not limited to, one link interface for establishing a wireless link to another device. The wireless link is preferably an RF link, defined by IEEE 1394 communications specifications. However, the wireless link can take on other forms, such as the infrared communications link as defined by the Infrared Data Association (IrDA). The PDA includes a processor 1050 that is capable of executing an RF stack 1150 that communicates with data interface circuitry 1020 through bus 1110. The processor 1050 is also connected through bus 1110 to user interface circuitry 1040, data storage 1090 and memory 1100.
  • The [0300] data interface circuitry 1020 includes a port such as the RF interface port. The RF link interface may include a first connection 1022 which includes radio-frequency (RF) circuitry 1024 for converting signals into radio-frequency output and for accepting radio-frequency input. The RF circuitry 1024 can send and receive RF data communications via a transceiver that establishes communication port 1026. RF communication signals received by the RF circuitry 1024 are converted into electrical signals and relayed to the RF stack 1150 in processor 1050 via bus 1110. The radio interface 1024, 1026 and the link between the laptop PC (host computer) and the PDA may be implemented by, without limitation, IEEE 1394 specifications.
  • Similarly, the PC host computer has an RF stack and circuitry to be able to communicate with the remotely located image viewer. In a preferred embodiment, the remote image viewer may be used to monitor and/or control the ultrasonic imaging operations, not just display the resultant imaging data. [0301]
  • The current market offers many different wireless connectivity options. In a preferred embodiment, spread-spectrum wireless LAN technology is used. Among wireless LAN solutions the most advanced is the 802.11b standard, and many manufacturers offer 802.11b compliant equipment. Compatibility with the selected handheld is the major criterion in a specified class of wireless connectivity options. [0302]
  • The handheld market offers various handheld devices as well. For imaging purposes it is very important to have a high quality screen and enough processing power to display an image. Considering these factors, in a preferred embodiment a Compaq iPAQ is used, in particular the Compaq iPAQ 3870. A wireless PC card compatible with the handheld is used, such as Compaq's Wireless PC Card WL110 and a corresponding Wireless Access Point. [0303]
  • FIG. 21 illustrates the [0304] image viewer 920 in communication with the personal computer in a preferred embodiment, or the probe in an alternate embodiment. The image viewer has user interface buttons 922, 924, 926, 928 that allow the user to interface with the ultrasonic imaging system computer or probe in accordance with preferred embodiments of the present invention. In a preferred embodiment, a communicating interface such as button 922 allows the user to initiate a connection with the ultrasonic imaging application. Similarly, button 924 is used to terminate an established connection with the ultrasonic imaging application. A button 926 functions as a selection button that is used to provide a list of patients and corresponding selectable images. These images are stored either locally or remotely. If selected, an image stored remotely is transmitted to the viewer. The selected image is displayed on the viewer 930.
  • Additional communication interface buttons, such as [0305] button 928, function as options buttons which may, for example, allow changing configuration parameters such as an internet protocol (IP) address.
  • FIG. 22 is a diagram illustrating a preferred embodiment ultrasound image collection and distribution system including four major software components. The main hardware element of the system is [0306] ultrasound probe 942 a . . . n. The probe, in communication with the laptop computer 944 a . . . n, allows generation of the ultrasound images and related patient information and submits the images and information to an image/patient information distribution server 946. The distribution server utilizes an SQL database server 948 to store and retrieve images and related patient information. The SQL server provides distributed database management. Multiple workstations can manipulate data stored on the server, and the server coordinates operations and performs resource-intensive calculations.
  • Image viewing software or executable instructions may be implemented in two different embodiments. In a first embodiment, a full stationary version of the Image Viewer as described in FIG. 21 may reside on a workstation or laptop computer equipped with a high bandwidth network connection. In a second embodiment, a lightweight version of the Image Viewer may reside on a small PocketPC handheld [0307] 952 equipped with an IEEE 802.11b and/or IEEE 802.11a compliant network card. The PocketPC image viewer implements only limited functionality allowing basic image viewing operations. The wireless network protocols 950 such as IEEE 802.11 may be used to transmit information to a handheld or other computing devices 952 in communication with a hospital network.
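The distribution server's storage layer, which stores and retrieves images together with related patient information, can be sketched with a relational schema. SQLite is used below only as a stand-in for the SQL database server 948; the table and column names are illustrative assumptions, not taken from the original system.

```python
import sqlite3

def open_store(path=":memory:"):
    """Create (or open) the image/patient store."""
    db = sqlite3.connect(path)
    db.executescript("""
        CREATE TABLE IF NOT EXISTS patients (
            id   INTEGER PRIMARY KEY,
            name TEXT NOT NULL
        );
        CREATE TABLE IF NOT EXISTS images (
            id         INTEGER PRIMARY KEY,
            patient_id INTEGER NOT NULL REFERENCES patients(id),
            acquired   TEXT NOT NULL,      -- ISO-8601 acquisition time
            frame      BLOB NOT NULL       -- encoded ultrasound frame
        );
    """)
    return db

def submit_image(db, patient_name, acquired, frame):
    """Probe/laptop side: submit an image with its patient information."""
    row = db.execute("SELECT id FROM patients WHERE name = ?",
                     (patient_name,)).fetchone()
    pid = row[0] if row else db.execute(
        "INSERT INTO patients (name) VALUES (?)", (patient_name,)).lastrowid
    db.execute("INSERT INTO images (patient_id, acquired, frame) VALUES (?,?,?)",
               (pid, acquired, frame))
    db.commit()

def images_for(db, patient_name):
    """Viewer side: retrieve all stored frames for one patient, in order."""
    return db.execute(
        "SELECT acquired, frame FROM images "
        "JOIN patients ON patients.id = images.patient_id "
        "WHERE patients.name = ? ORDER BY acquired", (patient_name,)).fetchall()
```

In the deployed system multiple workstations would issue such queries concurrently against the central SQL server, which coordinates the operations as described above.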
  • This preferred embodiment extends the ultrasound imaging system to cover hospital-wide image collection and retrieval needs. It also provides instant access to non-image patient related information. In order to provide inter-hospital information exchange, image distribution servers have the ability to maintain connectivity with each other across wide area networks. [0308]
  • In another preferred embodiment, the probe may directly communicate with a remote computing device such as a [0309] PDA 964 using a wireless communication link 966. The communication link may use the IEEE 1394 protocol. The probe and the PDA both have an RF stack and circuitry described with respect to FIG. 24 to communicate using wireless protocols. The probe includes a transducer array, beamforming circuitry, transmit/receive module, a system controller and digital communication control circuitry. Post processing of the ultrasonic image data including scan conversion is provided in the PDA.
  • A preferred embodiment of the microminiaturized PC enabled ultrasound imaging system runs on an industry standard PC and [0310] Windows® 2000 operating system (OS). It is therefore network ready, which makes it ideal for telemedicine solutions while being cost efficient. It provides open architecture support and can be embedded in and thus integrated with third party applications. The preferred embodiment includes an enhanced Application Programming Interface (API), a common interface, and export support for third party applications such as, but not limited to, radiation therapy planning, image guided surgery, and integrated solutions, for example, calculations, three-dimensional and reporting packages. The API provides a set of software interrupts, calls, and data formats that application programs use to initiate contact with network services, mainframe communication programs, telephone equipment or program-to-program communications. Software based feature enhancements reduce hardware obsolescence and provide efficient upgrades.
  • Further, the preferred embodiment includes system-on-chip integrated circuits (ICs) which run on PCs and have a large channel count, large dynamic range, high image quality, full feature sets, broad diagnostic coverage, minimal supply chain requirements, simplified design for easy testing and high reliability, and very low maintenance costs. [0311]
  • As previously described herein, the preferred embodiment includes a PC based design which is intuitive, has a simple graphical user interface, is easy to use and train with, and which leverages PC industry know-how, robust electronics, high quality displays and low manufacturing costs. It also supports software controlled communications with other embedded applications, allowing patient data, scanner images, Current Procedural Terminology (CPT) code management (a numeric coding system by which physicians record their procedures and services), physician's plans and outcome assessment reports to be handled on an integrated PC. Reforms to the health care system have been applying pressure to lower costs and highlight the need for first visit/in-field diagnosis and for data storage and retrieval solutions. These needs, when combined with technology innovations such as data storage and retrieval based on the Digital Imaging and Communications in Medicine (DICOM) standard, broadband and Picture Archiving and Communications Systems (PACS) drives, changes in patient record storage, retrieval and transmission, and innovations in lower cost/handheld devices for ultrasound data acquisition, all enable the preferred embodiment of the present invention. The DICOM standard aids the distribution and viewing of medical images such as, for example, ultrasound, Magnetic Resonance Images (MRIs), and CT scans. Broadband is a wide area network term that refers to a transmission facility providing bandwidth greater than 45 Mbps. Broadband systems are generally fiber optic in nature. [0312]
  • A preferred embodiment of the present invention provides image acquisition and end-user application, for example, radiation therapy, surgery, angiography, all applications executed on the same platform. This provides low cost, user friendly controls through a common software interface. The ultrasound system has scalable user interfaces for advanced users and has an intuitive Windows® based PC interface. A preferred embodiment of the ultrasound system also provides an enhanced diagnostic ability due to the features of one-stop image capture, analysis, storage, retrieval and transmittal capability for the data and images. A high image quality is provided by a 128 channel bandwidth. Besides ease of use, the ultrasound system also provides patient access at any time, any location and using any tool. Point of care imaging is provided with a 10 ounce probe in accordance with a preferred embodiment of the present invention. The data storage and retrieval abilities are based on the DICOM standard and are compatible with off-the-shelf third party analytical and patient record systems. The ultrasound system in accordance with a preferred embodiment also provides immediate image transfer ability using, but not limited to, for example, electronic mail, LAN/WAN, DICOM and Digital Imaging Network—Picture Archiving and Communications Systems (DINPACs). The choices to display the images captured include, but are not limited to, a desktop computer, a laptop computer, wearable personal computers and handheld devices such as personal digital assistants. [0313]
  • FIGS. [0314] 25A-25C illustrate an ultrasound system 1200 in accordance with a preferred embodiment of the present invention integrated with an angiography system, a high frequency image 1220 of the carotid artery with directional power Doppler, and an image 1240 of the carotid artery with simultaneous, quantitative spectral Doppler, respectively. During an acute coronary syndrome, multiple atherosclerotic plaques typically rupture, suggesting that the syndrome is associated with overall coronary instability. Intravascular ultrasound with the system of the present invention can evaluate the entire coronary circulation. Ultrasonographic screening reduces mortality from abdominal aortic aneurysms. The ultrasound system of the present invention provides easy guidance and confirmation of aortic arch placement, and helps the rapid delivery of cold perfusate into the aortic arch and hypothermic preservation of the brain, heart, and spinal cord. Further, sensor monitoring for critical flow/temperature/physiological data can be provided. Automatic computer controlled flow-temperature adjustments can be facilitated, along with exsanguination control and blood pressure management, using the embodiment of the present invention. Preferred embodiments use a touch screen display.
  • In an alternate preferred embodiment, the ultrasound system ensures accurate vessel localization. FIGS. 26A and 26B illustrate an ultrasound image of [0315] vessel walls 1260 in accordance with a preferred embodiment of the system of the present invention and a catheter placement 1270 used with the system. Surgeons can use the ultrasonic system for catheter or line placements for both guidance and confirmation of placement. In preferred embodiments, the image file or raw RF data is directly accessed using a direct digital memory access. Thus, the ultrasound system provides real-time RF data output.
  • In an alternate embodiment, the ultrasound system of the present invention contributes to accurate surgical planning and imaging by providing neuro-navigation during neurosurgery. [0316]
  • In an alternate embodiment, the ultrasound system of the present invention assists in radiation therapy by helping in planning and treatment phases. FIGS. 27A and 27B illustrate a radiation planning system [0317] 1280 integrating the ultrasound system in accordance with a preferred embodiment of the present invention and the probe 1290 of the ultrasound system, respectively. The ultrasound image can be integrated in the display.
  • FIGS. 28A and 28B illustrate an [0318] ultrasonic imaging system 1300 for cryotherapy in accordance with a preferred embodiment of the present invention and a probe 1310 used in the system, respectively. In prostate cancer patients with limited disease, percutaneous ultrasound-guided cryosurgery applied focally can spare one neurovascular bundle and thus preserve potency without compromising cancer control. The cryotherapy can be used for urological surgery also. Preferred embodiments of the present invention provide multi-plane images with processing instructions that easily switch between planes in real time. At least two orthogonal transducer arrays having 64 or 128 elements can be used.
  • FIG. 29 is a schematic diagram [0319] 1320 illustrating a robotic imaging and surgical system integrating the ultrasound system in accordance with a preferred embodiment of the present invention. The system ensures appropriate vessel harvesting. The operating surgeon uses the ultrasound system to visualize the forceps and cautery controls. The surgeon is seated across the room from the patient, peers into a monitor and manipulates the robot with controls such as, for example, joystick-like hand controls. The robotic arms slip through the small, for example, nickel-size incisions between the ribs. A camera, forceps and a cautery are used to free up the mammary artery and attach it to the heart. The smaller incisions due to ultrasonic image-guided surgery result in lower trauma to patients, less post-operative pain, less patient morbidity and shorter recovery times.
  • Further, image-guided surgery benefits from the provision of real-time RF data output in accordance with preferred embodiments of the system. In contrast to raw RF data, processed data includes data compression processing which masks differences between bone and tissue. RF data emphasizes the reflectivity and thus the differences between bone and tissue. Thus, the output data from the beamformer can be formatted and provided to enhance surgical imaging. This is enabling for surgeries such as, for example, hip and pelvic replacements. [0320]
  • In addition, computer enhanced image guided surgery further benefits a patient as it combines the dexterity of open surgery with low patient trauma. [0321]
  • In another preferred embodiment, an ultrasound system can be used for pacemaker placement surgery and monitoring. A three-dimensional ultrasound can be integrated into the systems for providing direct access to digital data via a shared memory. [0322]
  • FIG. 30 is a schematic diagram [0323] 1340 illustrating an imaging and telemedicine system integrating the ultrasound system in accordance with a preferred embodiment of the present invention. Preferred embodiments of the system output the real-time RF digital data or the front-end data.
  • FIGS. 31A and 31B are three-dimensional images from fetal imaging obtained from an ultrasound system in accordance with a preferred embodiment of the present invention. Preferred embodiments for fetal imaging use the streaming data. Using an RF transmitter in the ultrasound probe the location of each frame of data can be provided thus allowing for spatial registration. Three-dimensional alignment is provided by looking at the frame locations. [0324]
  • In a preferred embodiment, the ultrasound imaging system provides an elastographic image of a tissue's elastic properties both in-vitro and in-vivo. Ultrasound elastography is an imaging technique whereby local axial tissue strains are estimated from differential ultrasonic speckle displacements. These displacements are generated by a weak, quasi-static stress field. The resultant strain image is called an elastogram. Most pathological changes are associated with changes in tissue stiffness. Palpation is an effective method for lesion detection and evaluation. Many cancers (breast, prostate) are isoechoic, and hence difficult to detect by ultrasound alone. [0325]
  • Elastography uses the principle by which a small compression (strain) of the tissue results in a small compression of the signal (similar to frequency modulation). Ultrasound elastography conveys new and clinically important tissue information. The tradeoffs among engineering and elastographic image parameters are now reasonably well understood. Elastography can operate in hypoechoic areas, for example, shadows. Reliable small elastic contrast exists among normal soft-tissue components, and a good contrast-to-noise ratio (CNR) allows its visualization. Pathology generally exhibits large elastic contrast. Areas that can benefit from elastography include the breast, prostate, vasculature, small parts and treatment monitoring. [0326]
  • Currently, breast cancer is the most frequent cancer in women. Every ninth woman in the U.S. is affected during her lifetime. It is well known that palpation of the breast is a very helpful means to detect conspicuous lesions. Although much effort is put into screening methods for breast cancer, in the majority of cases the patient herself is the first to notice palpable changes in her breast during self-examination. Although ultrasound can support the diagnosis of breast tissue, there is still a need for an imaging modality that can provide a direct measure of material parameters related to tissue elasticity such as Young's modulus. Concerning breast imaging, the elasticity tensor can be reconstructed three-dimensionally using magnetic resonance imaging. A semi-quantitative measure of elasticity with ultrasound has recently become a real-time imaging modality. As described hereinbefore, the strain imaging or elastography method is helpful for describing mechanical properties of tissue in vivo. Elastography compares ultrasonic radio frequency (RF) data of an object before and after the application of a slight compression step. Time delays between the pre- and post-compression RF signals can be estimated and converted to mechanical displacement in the axial direction. The derivative of axial displacement leads to axial strain as a semi-quantitative measure of elastic properties. [0327]
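The processing chain just described (estimate local time delays between pre- and post-compression RF lines, then differentiate the resulting displacement profile to obtain axial strain) can be sketched with windowed cross-correlation. The window length, hop, lag search range, and the synthetic 1% compression below are illustrative assumptions, not parameters of the described system.

```python
import numpy as np

def delay_between(a, b, max_lag):
    """Integer lag L (in samples) maximizing the correlation of b with a.

    Positive L means b looks like a advanced by L samples (echoes
    arriving earlier, as under compression)."""
    best_lag, best_score = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            score = np.dot(b[:len(b) - lag], a[lag:])
        else:
            score = np.dot(b[-lag:], a[:len(a) + lag])
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def axial_displacement(pre, post, win=128, hop=64, max_lag=35):
    """Windowed time-delay profile between pre- and post-compression lines."""
    centers, delays = [], []
    for start in range(0, len(pre) - win, hop):
        a = pre[start:start + win] - pre[start:start + win].mean()
        b = post[start:start + win] - post[start:start + win].mean()
        centers.append(start + win // 2)
        delays.append(delay_between(a, b, max_lag))
    return np.array(centers), np.array(delays, dtype=float)

def axial_strain(centers, delays):
    """Axial strain: spatial derivative of the axial displacement profile."""
    return np.gradient(delays, centers)

# Synthetic check: band-limited speckle under a uniform 1% compression,
# which maps the post-compression line to post(i) ~ pre(1.01 * i).
rng = np.random.default_rng(7)
rf = np.convolve(rng.standard_normal(3000), np.hanning(8), mode="same")
idx = np.arange(3000)
pre = rf
post = np.interp(idx * 1.01, idx, rf)
centers, delays = axial_displacement(pre, post)
strain = axial_strain(centers, delays)   # on average close to the applied 0.01
```

Real elastography pipelines add subsample interpolation of the correlation peak and the decorrelation countermeasures discussed below; the integer-lag sketch only illustrates the principle.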
  • The quality of strain images or elastograms is limited by noise due to signal decorrelation in the time delay estimation. One method to minimize this source of error is to apply motor-driven compression plates and to use multi-compression averaging. Another method is to correct for lateral displacements by interpolating A-lines two-dimensionally. For breast imaging, initial clinical results have been published which indicate that ultrasound elastography has the potential to improve differential diagnosis of benign and malignant breast lesions. Preferred embodiments of the present invention ultrasound systems can be used for ultrasound elastography and provide elastograms which can be used to diagnose benign and malignant lesions. [0328]
  • In accordance with a preferred embodiment, when a suspect lesion is found, a slight compression can be applied to the breast and a palpation performed with the transducer, including both compression and relaxation. Usually the system is able to store the last two compression cycles in a cine-buffer of the ultrasound system, including approximately 80 images. In the conventional B-mode it is only possible to record demodulated echo data, i.e., gray-scale image data. In order to acquire RF data for elastography at the same time, a color-mode window of 20×40 mm, i.e., 20 mm depth and 40 mm width, can be used. The RF data from the color-mode window, which is usually used to calculate flow parameters, can be recorded as IQ-data (base-band data). Prior to the off-line calculation of strain images, the limited bandwidth of color-mode RF data is compensated for. The ultrasound system is reprogrammed to use broadband transmit pulses and a broadband receive filter for the color-mode. After performing time delay estimation on every two successive frames of an IQ-data series, a series of time-delay images or axial displacement images is obtained. [0329]
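For IQ (base-band) data, a standard way to estimate the inter-frame time delay is from the phase of the zero-lag complex correlation: a small echo delay tau multiplies the base-band signal by exp(-j*2*pi*f0*tau), where f0 is the demodulation frequency. The sketch below illustrates that phase-based estimator; f0, the window length, and the sound speed are assumed example values, not parameters taken from the described system.

```python
import numpy as np

def iq_delay(iq_pre, iq_post, f0):
    """Estimate the echo time delay (seconds) between two base-band windows.

    A small delay tau rotates the IQ signal by exp(-1j*2*pi*f0*tau), so the
    phase of the zero-lag complex correlation recovers tau. The estimate is
    unambiguous only for |tau| < 1/(2*f0)."""
    corr = np.vdot(iq_pre, iq_post)          # sum(conj(pre) * post)
    return -np.angle(corr) / (2 * np.pi * f0)

def delay_to_displacement(tau, c=1540.0):
    """Convert a round-trip echo delay to axial displacement in meters."""
    return c * tau / 2.0

# Synthetic check: a 20 ns delay applied to a complex speckle window at an
# assumed 5 MHz demodulation frequency.
f0 = 5e6
rng = np.random.default_rng(1)
iq_pre = rng.standard_normal(256) + 1j * rng.standard_normal(256)
tau_true = 20e-9
iq_post = iq_pre * np.exp(-2j * np.pi * f0 * tau_true)
```

Applying this per window across successive IQ frames yields the axial displacement images described above; differentiating along depth then gives the strain image.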
  • The purpose of imaging elastic tissue properties using ultrasound elastography is to support the detection, localization and differential diagnosis of lesions. Strain imaging is a semi-quantitative method; therefore, it is possible to evaluate elastograms qualitatively. A qualitative method for the evaluation of ultrasound elastograms uses a gray-scaled colormap. The appearance (visualization, brightness, margin) and size of each lesion on the elastogram, in comparison to the B-mode image, are used to distinguish breast tissues. However, the results of a qualitative image analysis depend on the choice of the colormap and the image scaling. [0330]
  • As described hereinbefore, the ultrasound systems of the present invention are used in minimally invasive surgery and robotic surgery methods including biopsy procedures, catheter introduction for diagnostic and therapeutic angiography, fetal imaging, cardiac imaging, vascular imaging, imaging during endoscopic procedures, imaging for telemedicine applications, imaging for veterinary applications, radiation therapy, and cryotherapy, without limitation. The embodiments use computer based tracking systems and CT and MR images to pinpoint the precise location of the target areas. Alternative preferred embodiments of ultrasound systems can provide images just prior to, during, and just after the procedure at lower cost and using smaller footprint devices. Preferred embodiments overcome the need for a separate ultrasound appliance to be wheeled into the procedure room and for a method to move images from the ultrasound to the device that is tracking position and registering targets against previously captured CT and MR images. A preferred embodiment of the ultrasound system provides a fully integrated solution, since it can run its ultrasound application on the same platform as any third party application that is processing the images. The system includes a streaming video interface, an interface between a third party application and the system's ultrasound application. A key component of this system allows the two applications to run on the same computer platform, using the same operating system (OS), such as, for example, a Windows® based platform; other platforms such as Linux can also be used, thus providing a seamless integration of the two applications. The details of the software interface to move images from the system's ultrasound application to another application are described herein below. [0331]
  • Preferred embodiments include control and data transfer methods that allow a third party Windows® based application to control, for example, a portable Windows® based ultrasound system by running the ultrasound application as a background task, sending control commands to the ultrasound application and receiving images (data) in return. Further, the embodiment configures a portable ultrasound Windows® based application as a server of live ultrasound image frames supplying another Windows® based application that acts as a client. This client application receives these ultrasound image frames and processes them further. In addition, an alternate embodiment configures the portable ultrasound Windows® based application as a server interacting with a third party client application via two communication mechanisms: a component object model (COM) automation interface used by the third party (hereinafter referred to interchangeably as an external or client application) to start up and control the portable ultrasound Windows® based application, and a high-speed shared memory interface to deliver live ultrasound images. [0332]
  • A preferred embodiment includes and configures a shared memory interface to act as a streaming video interface between a portable Windows® based Ultrasound application and another third party Windows® based application. This streaming video interface is designed to provide ultrasound images to a third party client in real-time. [0333]
  • A preferred embodiment allows the third party Windows® based application to control the flow rate of images from the portable ultrasound Windows® based application through the shared memory interface within the same PC platform and the amount of memory required to implement this interface. These controls consist of a way to set the number of image buffers, the size of each buffer and the rate of image transfer. This flow rate control can be set for zero data loss thus ensuring that every frame is delivered to the third party Windows® based application from the ultrasound system, or minimum latency thus delivering the latest frame generated by ultrasound system to the third party Windows® based application first. [0334]
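The two delivery policies described above, zero data loss versus minimum latency, can be sketched as a bounded frame queue whose full-buffer behavior differs per policy. The class and policy names below are illustrative assumptions; the actual interface is the shared memory mechanism described in the surrounding text.

```python
from collections import deque

class FrameBuffer:
    """Bounded frame buffer between the ultrasound server and a client.

    policy="lossless": when full, the producer is refused (it must retry),
    so every frame is eventually delivered in order -- zero data loss.
    policy="latest": when full, the oldest frame is dropped and reads
    return newest-first -- minimum latency."""

    def __init__(self, num_buffers=8, policy="lossless"):
        self.frames = deque()
        self.num_buffers = num_buffers    # configurable buffer count
        self.policy = policy

    def put(self, frame):
        """Producer side; returns False if a lossless buffer is full."""
        if len(self.frames) == self.num_buffers:
            if self.policy == "lossless":
                return False              # caller must retry; nothing lost
            self.frames.popleft()         # drop oldest to make room
        self.frames.append(frame)
        return True

    def get(self):
        """Consumer side; order depends on the configured policy."""
        if not self.frames:
            return None
        if self.policy == "lossless":
            return self.frames.popleft()  # oldest first, in order
        return self.frames.pop()          # newest first, lowest latency
```

Setting `num_buffers` corresponds to the buffer-count control described above; in the real interface the buffers live in shared memory and the producer blocks rather than returning False.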
  • A preferred embodiment formats the ultrasound image frame such that probe, spatial, and temporal information can be interpreted by the third party Windows® based application as it retrieves the images (generated by the portable ultrasound Windows® based application) from the shared memory interface. The actual image data passed between the server (i.e., the portable ultrasound application) and the client application (the third party Windows® based application) is a Microsoft® device independent bitmap (DIB) with 8 bit pixels and a 256 entry color table. The image frame also contains a header that provides the following additional information, for example, but not limited to: Probe Type, Probe Serial Number, Frame Sequence Number, Frame Rate, Frame Timestamp, Frame Trigger Timestamp, Image Width (in pixels), Image Height (in pixels), Pixel Size (in X and Y), Pixel Origin (x, y location of the first pixel in the image relative to the Transducer Head), and Direction (spatial direction along or across each line of the image). [0335]
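A header carrying the fields listed above can be sketched as a fixed binary layout packed ahead of the 8-bit pixel data. The field order, types, string widths, and little-endian layout below are assumptions made for the sketch; the actual header definition is not given in the text.

```python
import struct

# Assumed layout: probe type, serial (16-byte strings), sequence number,
# frame rate, two timestamps, width/height, pixel size, origin, direction.
HEADER_FMT = "<16s16sIfddIIffffi"
HEADER_SIZE = struct.calcsize(HEADER_FMT)

def pack_frame(probe_type, serial, seq, rate, ts, trig_ts,
               width, height, px_x, px_y, org_x, org_y, direction, pixels):
    """Pack a header plus 8-bit pixel data into one frame buffer."""
    header = struct.pack(
        HEADER_FMT,
        probe_type.encode(),   # probe type
        serial.encode(),       # probe serial number
        seq,                   # frame sequence number
        rate,                  # frame rate (Hz)
        ts, trig_ts,           # frame / frame trigger timestamps
        width, height,         # image size in pixels
        px_x, px_y,            # pixel size in X and Y
        org_x, org_y,          # pixel origin relative to the transducer head
        direction)             # direction code along/across image lines
    assert len(pixels) == width * height   # 8-bit pixels, one byte each
    return header + pixels

def unpack_header(frame):
    """Recover the metadata dictionary from a packed frame."""
    f = struct.unpack(HEADER_FMT, frame[:HEADER_SIZE])
    return {
        "probe_type": f[0].rstrip(b"\0").decode(),
        "serial": f[1].rstrip(b"\0").decode(),
        "seq": f[2], "rate": f[3],
        "timestamp": f[4], "trigger_timestamp": f[5],
        "width": f[6], "height": f[7],
        "pixel_size": (f[8], f[9]),
        "origin": (f[10], f[11]),
        "direction": f[12],
    }
```

The pixel payload that follows the header corresponds to the DIB pixel array; the 256-entry color table of the DIB is omitted from the sketch.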
  • Further, the preferred embodiment controls the shared memory interface used to transfer ultrasound images between a Windows® based portable ultrasound system and a third party Windows® based system through the use of ActiveX controls. The Windows® based portable ultrasound application contains an ActiveX control that transfers a frame into the shared memory and sends out a Windows® Event (that includes a pointer to the frame just written) to the third party Windows® based application. This third party application has a similar ActiveX control that receives this Event and pulls the image frame out of shared memory. [0336]
  • The graphical user interface includes one or more control programs, each of which is preferably a self-contained, for example, client-side script. The control programs are independently configured for, among other functions, generating graphical or text-based user controls in the user interface, for generating a display area in the user interface as directed by the user controls, or for displaying the processed streaming media. The control programs can be implemented as ActiveX controls, as Java applets, or as any other self-contained and/or self-executing application, or portion thereof, operable within a media gateway container environment and controllable through the web page. [0337]
  • Ultrasonic content can be displayed within a frame in the graphical user interface. In an embodiment, the program generates an instance of an ActiveX control. ActiveX refers to a set of object-oriented programming technologies and tools provided by Microsoft® Corporation of Redmond, Washington. The core part of the ActiveX technology is the component object model (COM). A program run in accordance with the ActiveX environment is known as a “component,” a self-sufficient program that can be run anywhere in the network, as long as the program is supported. This component is commonly known as an “ActiveX control.” Thus, an ActiveX control is a component program object that can be re-used by many application programs within a computer or among computers in a network, regardless of the programming language with which it was created. An ActiveX control runs in what is known as a container, which is an application program utilizing the COM program interfaces. [0338]
  • One advantage of using a component is that it can be re-used by many applications, which are known as “component containers.” Another advantage is that an ActiveX control can be created using one of several well-known languages or development tools, including C++, Visual Basic, or PowerBuilder, or with scripting tools such as VBScript. ActiveX controls can be downloaded as small executable programs, or as self-executable code for Web page animation, for example. Similar to ActiveX controls, and suitable for the client-side scripts, are applets. An applet is typically a self-contained, self-executing computer program written in Java™, a web-based, object-oriented programming language promulgated by SUN Microsystems Corporation of Sunnyvale, Calif. [0339]
  • The control programs can be stored and accessed locally at the client system, or downloaded from the network. Downloading is typically done by encapsulating a control program in one or more markup language-based files. The control programs can also be used for any commonly-needed task by an application program running in one of several operating system environments. Windows®, Linux and Macintosh are examples of operating system environments that can be used in preferred embodiments. [0340]
  • A preferred embodiment of the Ultrasound Imaging System has a specific software architecture for the image streaming capabilities. This Ultrasound Imaging System is an application that controls the Ultrasound Probe of a preferred embodiment and allows the user to obtain and display visual images for medical purposes. The Imaging System has its own graphical user interface. This interface is rich in features and is conveniently organized to provide maximum flexibility working with separate images as well as streams of images. Some of the possible medical applications require the development of graphical user interfaces with significantly different features. This involves integration of the Imaging System into other, more complicated medical systems. The preferred embodiment allows exporting imaging data in a highly effective and convenient fashion for original equipment manufacturers (OEMs) to have direct access to imaging data. [0341]
  • The quality of the Image Streaming solution in accordance with a preferred embodiment is measured by criteria such as data transfer performance. Imaging data consume a significant amount of memory and processor power. A large number of separate image frames is required to produce live medical video of a patient examination. It thus becomes very important to minimize data copying operations in the process of transferring data from the process generating video data to the process consuming video data. The second criterion is an industry standard imaging format. Since applications consuming video imaging data are intended to be developed by third party companies, the data can be represented in industry standard formats. A third criterion is convenience. Imaging data may be presented by means of a programming interface that is convenient to use and does not require additional learning. [0342]
  • Further, the criteria include scalability and extendibility. The streaming data architecture may be easily extendable to accommodate new data types. It may provide a basic framework for future multiplication of video streams targeting more than one data receiving process. [0343]
  • The image streaming architecture of the preferred embodiment provides methods of data transportation between two processes. The image streaming architecture defines operational parameters regulating the data transferring process, and describes the mechanism of transferring parameters between processes. One of the methods to transfer operational parameters from a third party client application to the imaging system of a preferred embodiment is by using an existing COM interface. [0344]
  • In a preferred embodiment, the image transferring architecture intensively uses object-oriented programming methodology and the interprocess communication capabilities of the Microsoft Windows® operating system. Object-oriented methodology provides a necessary foundation allowing an architectural solution that satisfies the necessary requirements. It also lays the ground for future enhancements and extensions, making modifications relatively simple and backward compatible. [0345]
  • Video imaging data represent complicated data structures with mutual interferences between different data elements. They also permit and often require different interpretations of the same data elements. The preferred embodiment of the following image transferring architecture includes a shared memory for physical data exchange. For example, Windows® shared memory is a fast and economical way to exchange data between processes. Further, the shared memory can be subdivided into separate sections of a fixed size in certain embodiments. Each section is then the minimum controllable unit. In addition, the imaging data can be abstracted as objects. Each frame of the imaging data can be represented by a separate object. The objects can then be mapped to the sections of the shared memory. [0346]
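The subdivision of memory into fixed-size sections, each acting as a minimal controllable unit to which a frame object is mapped, can be illustrated with a portable sketch; the class and method names here are hypothetical, and a real implementation would back the storage with a named shared-memory object rather than a local vector.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative only: a memory region subdivided into fixed-size sections.
// A frame object is "mapped" to a section by owning the section index.
class SectionedMemory {
public:
    SectionedMemory(std::size_t sectionCount, std::size_t sectionSize)
        : sectionSize_(sectionSize),
          storage_(sectionCount * sectionSize),
          inUse_(sectionCount, false) {}

    // Find an unused section, mark it used, return its index (-1 if full).
    int allocateSection() {
        for (std::size_t i = 0; i < inUse_.size(); ++i)
            if (!inUse_[i]) { inUse_[i] = true; return static_cast<int>(i); }
        return -1;
    }

    // Return a section to the pool once its frame object is released.
    void releaseSection(int idx) { inUse_[static_cast<std::size_t>(idx)] = false; }

    // Raw bytes of a section; a frame object would serialize itself here.
    uint8_t* sectionData(int idx) {
        return storage_.data() + static_cast<std::size_t>(idx) * sectionSize_;
    }

private:
    std::size_t sectionSize_;
    std::vector<uint8_t> storage_;
    std::vector<bool> inUse_;
};
```

A frame object holds only its section index, so creating or releasing a frame never copies the image data itself.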
  • Preferred embodiments can include the locking and unlocking of a section-object. The programming API notification mechanism used is an event-driven mechanism, implemented based on C++ pure virtual functions. [0347]
  • In a preferred embodiment, the image transferring architecture consists of three layers: an application programming interface (API) layer, a programming interface implementation and shared memory access layer, and a physical shared memory layer. The application programming interface layer provides two different C++ class library interfaces to applications on the client and server sides. All the associated sequences of instructions that belong to the application itself are part of this layer as well. Application derived classes and their implementation are the key elements of the application programming interface layer. The server, which is the imaging data provider side, uses, for example, an Object Transmitter class and related derived and base classes. The client, which is the imaging data consumer side, uses an Object Factory class, for example, and related derived and base classes. [0348]
  • The programming interface implementation layer provides two different Dynamic Link Libraries (DLLs) implementing classes for the applications. This layer maps objects of the classes associated with the application to an internal implementation of objects accessing the shared memory physical system object. This layer allows the hiding of all implementation specific member variables and functions from the scope of the application. Thus, the application programming interface layers become uncluttered, easy to understand and use. The server side application can use, for example, ObjectXmitter.DLL, while the client side application can use, for example, ObjectFactory.DLL. [0349]
  • The physical shared memory layer represents the operating system object implementing shared memory functionality. It also describes the structure of the shared memory, its segmentation, and memory controlling blocks. [0350]
  • With respect to the organization of the shared memory, since the shared memory is intended to be used for interprocess communications, the operating system specifies a unique name at the time of its creation. In order to manage the shared memory, other interprocess communication (IPC) system objects are required. They need to have unique names as well. To simplify the unique name generation process, only one base name is required. All other names are derived from the base one by the implementation code. Thus, the application programming interface requires the specification of only one base name for the logical shared memory object. The same unique name can be used by both the server side of the application and the client side of the application. [0351]
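The derivation of all IPC object names from a single base name can be sketched as follows; the suffixes are illustrative assumptions, since the text does not specify the actual derivation scheme used by the implementation code.

```cpp
#include <string>

// Hypothetical name-derivation scheme: the application supplies one base
// name, and the implementation derives the names of the shared memory,
// the guarding mutex, and the signaling event from it.
std::string sharedMemoryName(const std::string& base) { return base + ".shm"; }
std::string mutexName(const std::string& base)        { return base + ".mutex"; }
std::string eventName(const std::string& base)        { return base + ".event"; }
```

Both sides of the application call these with the same base name, so the derived names match without any further coordination.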
  • The server side of the application is responsible for the creation of the shared memory. In the process of creation, it has to specify not only the unique name of the shared memory but other configuration parameters. These parameters include, but are not limited to, the segment count which specifies the number of segments to be allocated, the segment size, and operational flags. There are three such flags in a preferred embodiment. The first one specifies the segment submission and retrieval order. It can be one of Last In First Out (LIFO), First In First Out (FIFO), or Last In Out (LIO). LIO is a modification of the usual LIFO in such a way that whenever a new frame arrives, if it finds frames that were ready for retrieval but not yet locked for retrieval, they are erased. The second flag specifies the shared memory implementation behavior under the condition when a new segment allocation is requested but there is no segment available. Typically this may happen when the receiving application processes data slower than the submitting application. This flag may allow deleting one of the previously allocated segments. If it does not allow deleting one of the previously allocated segments, an exceptional condition is reported back to the application. Using this flag, the application may automatically select overwriting of data in the shared memory or it may control the data overwrite process itself. The third flag can be used only when the second one allows overwriting segments in the shared memory. It specifies how to select a segment to be overwritten. By default, the shared memory implementation deletes the youngest or most recently submitted data segment. Alternatively, the oldest segment can be selected for the overwrite process. [0352]
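The three submission and retrieval orders, including the LIO behavior in which a newly arriving frame erases frames that are ready but not yet locked for retrieval, can be modeled with a minimal in-process sketch; the class and names below are hypothetical, and locking is omitted for clarity.

```cpp
#include <deque>

// Minimal model of the three submission/retrieval orders described above.
enum class QueueOrder { FIFO, LIFO, LIO };

class FrameQueue {
public:
    explicit FrameQueue(QueueOrder order) : order_(order) {}

    void submit(int frameId) {
        if (order_ == QueueOrder::LIO)
            frames_.clear();  // LIO: erase ready-but-unlocked frames
        frames_.push_back(frameId);
    }

    // Returns the next frame id per the configured order, or -1 if empty.
    int retrieve() {
        if (frames_.empty()) return -1;
        int id;
        if (order_ == QueueOrder::FIFO) { id = frames_.front(); frames_.pop_front(); }
        else                            { id = frames_.back();  frames_.pop_back();  }
        return id;
    }

private:
    QueueOrder order_;
    std::deque<int> frames_;
};
```

Under LIO the reader always receives only the most recent frame, which is why that mode minimizes latency at the cost of dropped frames.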
  • At the time of the creation of the shared memory, its physical layout is initialized. Since the operating system does not allow address calculation in physical shared memory, data pointers are not used within the shared memory. All addressing within the shared memory controlling blocks and segments is implemented in terms of relative offsets from the Virtual Origin (VO). At offset zero from the VO, the shared memory heading structure is allocated. It contains all the parameters listed herein above. FIG. 32 is a block diagram illustrating the structure of the physical shared memory. [0353]
  • Immediately after the allocation of the shared memory heading structure [0354] 1382 follows the creation of an array of headers, one for every memory segment. The memory segment header contains the occupied size of the segment, a unique tag for the class of the object mapped to the segment, and the segment state. Each segment can be in one of four states: unused, where the segment is available for allocation; locked for write, where the segment is mapped to an object of a specific class and currently is being formed; written, wherein the segment is mapped to an object of a specific class and available for retrieval; and locked for read, wherein the segment is mapped to an object of a specific class and currently is in a process of data retrieval. Since every segment has its own state, it is possible for the application to lock more than one segment for object forming and object retrieval. This allows the system to have a flexible multithreaded architecture on both the server and client sides of the application. Further, the ability to have more than one segment in a “written” state provides a “buffering” mechanism nullifying or minimizing performance differences of the applications on the server and client sides.
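The four segment states and the transitions implied above (claim for writing, submit, claim for reading, release) can be sketched as follows. This is a single-threaded illustration with hypothetical names; it omits the mutex that guards the real shared memory, and `anyWritten()` models the condition that keeps the manual-reset event at the "high" level.

```cpp
#include <vector>

// The four segment states described in the text.
enum class SegmentState { Unused, LockedForWrite, Written, LockedForRead };

class SegmentTable {
public:
    explicit SegmentTable(int count) : states_(count, SegmentState::Unused) {}

    // Server side: claim an unused segment for object forming (-1 = overrun).
    int lockForWrite() {
        for (int i = 0; i < (int)states_.size(); ++i)
            if (states_[i] == SegmentState::Unused) {
                states_[i] = SegmentState::LockedForWrite;
                return i;
            }
        return -1;
    }

    // Server side: mark a formed segment as available for retrieval.
    void submit(int i) { states_[i] = SegmentState::Written; }

    // Client side: claim the first written segment for retrieval.
    int lockForRead() {
        for (int i = 0; i < (int)states_.size(); ++i)
            if (states_[i] == SegmentState::Written) {
                states_[i] = SegmentState::LockedForRead;
                return i;
            }
        return -1;
    }

    // Client side: return a fully processed segment to the pool.
    void release(int i) { states_[i] = SegmentState::Unused; }

    // Models the manual event: "high" while any segment is written.
    bool anyWritten() const {
        for (SegmentState s : states_)
            if (s == SegmentState::Written) return true;
        return false;
    }

private:
    std::vector<SegmentState> states_;
};
```

Because each segment carries its own state, several segments can be locked for writing and reading at once, which is what enables the multithreaded buffering described above.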
  • The last element in the physical shared memory layout contains the memory segments. The logical shared memory, besides the physical shared memory, contains a physical system mutex [0355] 1388 and a system event 1390. The physical mutex provides mutually exclusive access to the physical shared memory. The physical event is of a manual control type. It stays at the level “high” all the time when at least one of the segments has a “written” state. It goes to the level “low” only when there is no single segment in a “written” state. This mechanism allows the retrieval of “written” objects from the shared memory without passing control to the operating system within the same time-slice allocation for the thread.
  • In a preferred embodiment, the object transmitting programming interface consists of three classes: namely, AObjectXmitter, USFrame, and BModeFrame. The AObjectXmitter class allows the initiation of an object transferring service specifying desired operational parameters. Once the AObjectXmitter class object is instantiated, initialized objects of the USFrame and BModeFrame classes can be created. The USFrame class constructor requires a reference to an object of the AObjectXmitter class. The first action that has to be accomplished upon instantiation of the USFrame object is to establish an association of the object with one of the segments in the shared memory. The function Allocate( ) maps an object to an unused shared memory segment and locks this segment for the current object's usage. At the time of mapping an object, a bitmap size may be provided by the application. The provided size represents only the size required for bitmap data, not including the memory size required for other data elements of the object. [0356]
  • The BModeFrame class is a class derived from the USFrame class. It inherits all the methods and functionality that the base class has. The only additional functionality provided by the BModeFrame class is additional methods allowing it to provide information related specifically to the BMode operation. [0357]
  • After the USFrame or BModeFrame class object has been instantiated and mapped to the shared memory segment, the application can fill all desired data elements of the object. It is not necessary to provide a value for every data element. At the time when an object is being mapped to the shared memory segment, all data elements of the object are initialized with default values. The only data elements that are not initialized upon mapping are the bitmap data elements. When the server side of the application has provided all desired data elements, it can hand over the object to the client side of the application by calling a method, for example, Submit( ). [0358]
  • The USFrame or BModeFrame object can be reused by means of subsequent remapping and resubmitting. Alternatively, it can be deleted and a new one can be created when it is appropriate for an application. Since object instantiation does not require any interprocess-communication mechanisms, it is as simple as memory allocation for an ordinary variable. [0359]
  • There are at least two advantages of the architecture of the preferred embodiment. Since the ObjectXmitter class does not have knowledge about the USFrame or BModeFrame class, it is very easy to introduce additional classes similar to, or directly or indirectly derived from, the USFrame class. This allows the production of future versions of the Object Transmitting Programming Interface without requiring any modifications to the code or sequence of instructions that was developed for existing embodiments. Further, the Object Transmitting Programming Interface classes do not have any member variables. This provides two more benefits of the interface. The first one is that these classes are COM object interface oriented and can be directly used for the COM object interface specification and implementation. The second benefit is that these classes effectively hide all implementation specific details, making the interface very clear, easy to understand and use. [0360]
  • The Object Transmitting Programming Interface is implemented by ObjectXmitter.DLL. For every object created by the application there is a mirroring implementation object created by the code residing in ObjectXmitter.DLL. Since every programming interface class has a corresponding mirroring class in the implementation, modifications are facilitated and can extend beyond the currently specified image types. This can be accomplished by the creation of the corresponding mirroring classes in the implementation DLL. Implementation objects are responsible for the handling of the shared memory and the mapping of programming interface objects. An embodiment of the present invention includes a DLL allowing the instantiation of only one ObjectXmitter class object using only one communication channel with one client application. The Object Transmitting implementation transmits not only object data but provides additional information describing the type of object transferred. [0361]
  • The Object Factory Programming Interface consists of three classes: AObjectFactory, USFrame, and BModeFrame. The class AObjectFactory contains three pure virtual member functions. This makes this class an abstract class that cannot be instantiated by an application. The application is required to define its own class derived from the AObjectFactory class. There is no need to define any “special” class derived from the AObjectFactory class. Since the application intends to process images that will be received, the chances that it will have a class processing images are very high. An image processing class can very well be derived from the AObjectFactory class. [0362]
  • The class derived from an AObjectFactory class has to define and implement only pure virtual functions such as, for example, OnFrameOverrun( ), OnUSFrame( ), and OnBModeFrame( ). For instance, a derived class can be defined as follows: [0363]
    class ImageProcessor : public AObjectFactory {
    public:
        ImageProcessor(void);
        ~ImageProcessor(void);
        virtual unsigned long OnFrameOverrun(void);
        virtual unsigned long OnBModeFrame(const BModeFrame *frame);
        virtual unsigned long OnUSFrame(const USFrame *frame);
    };
  • Upon instantiation of an object of the class ImageProcessor, the base class member function Open( ) can be called. This function is provided a shared memory name that matches the shared memory name being used by the server side of the application. The function Open( ) connects the client application to the server application via the specified shared memory. [0364]
  • At any moment after opening the shared memory, the application can expect a call on the virtual functions OnFrameOverrun( ), OnUSFrame( ), and OnBModeFrame( ). Every invocation of the OnUSFrame( ) function carries as an argument an object of USFrame class type. Every invocation of the OnBModeFrame( ) function carries as an argument an object of BModeFrame class type. There is no need for an application to instantiate an object of the USFrame or BModeFrame class. USFrame and BModeFrame objects are “given” to the application by the underlying implementation of the AObjectFactory class. [0365]
  • The only action that the application needs to accomplish is to process the received frame and to release the “given” object. The application does not attempt to delete a frame object, as deletion is done by the underlying implementation. The member function Release( ) of the USFrame object is called only when all data processing is done by the application and the USFrame object, or object of the derived class, is no longer needed by the application. [0366]
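The event-driven delivery cycle can be mocked with a simplified single-threaded sketch. The USFrame stub here carries only a sequence number, the base class declares a single handler, and the deliverFrame helper stands in for the implementation-created thread; all three are assumptions for illustration, not the actual interface.

```cpp
// Simplified mock of the pure-virtual event dispatch described above.
struct USFrame { int sequenceNumber; };  // stub; the real class is richer

class AObjectFactory {
public:
    virtual ~AObjectFactory() {}
    // The implementation calls this when a frame arrives.
    virtual unsigned long OnUSFrame(const USFrame* frame) = 0;
};

class ImageProcessor : public AObjectFactory {
public:
    int lastSequence = -1;
    unsigned long OnUSFrame(const USFrame* frame) override {
        lastSequence = frame->sequenceNumber;  // process, then release
        return 0;
    }
};

// Stand-in for the implementation thread that fires the event.
void deliverFrame(AObjectFactory& factory, const USFrame& f) {
    factory.OnUSFrame(&f);
}
```

The application never constructs or deletes the frame; it only overrides the handler, processes the data, and releases the object back to the implementation.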
  • Once the application has received an object of class USFrame or BModeFrame, it can retrieve the imaging data and process them appropriately. The application needs to be aware that it processes the frame object data in a separate thread and must make sure that the processing function is written using thread-safe programming techniques. Since the pure virtual functions are called within a separate thread created by the implementation DLL, no subsequent calls are possible before the virtual function returns control back to the calling thread. This means that as long as the application has not returned control to the implementation-created thread, no new frames can be received by the application. Meanwhile, the server side of the application can continue to submit extra frames. This eventually causes the shared memory to overflow and prevents any new frame transmission. [0367]
  • All the time that the application processes frame data, it keeps shared memory resources locked from subsequent remapping. The more frames not released by the application, the fewer shared memory segments are available to the Object Transmitting Interface on the server side of the application. If frame objects are not released at an appropriate rate, eventually all memory segments of the shared memory become locked by the client application. At that time the image transmitting application stops sending new frames or overwrites frames that are not yet locked by the receiving application. If all the segments were locked by the receiving application, the transmitting application does not even have the option to overwrite existing frames. [0368]
  • The function OnFrameOverrun( ) is called when the Frame Overrun condition is raised by the servicing application. This condition is raised any time the servicing application makes an attempt to submit a new frame and there are no available shared segments to map an object to. This condition can be cleared only by the client side of the application by means of calling the function ResetFrameOverrun( ). If this function is not called by the client application, the Frame Overrun condition remains raised and the OnFrameOverrun( ) pure virtual function is called again. [0369]
  • The Object Factory Interface has the same advantages that were outlined herein above in describing the Object Transmitting Interface. In addition to these advantages, it implements an event-driven programming method that minimizes programming effort and maximizes execution performance. At the same time there are functions such as, for example, USFrames( ), BModeFrames( ), GetUSFrame( ), and GetBModeFrame( ). These functions can be used to implement less efficient “polling” programming methods. [0370]
  • The Object Factory Programming Interface is implemented by ObjectFactory.DLL. This DLL retrieves the object class type information as well as object related data from the shared memory. It creates an object of the type that is used by the transmitter. The Object Factory implementation maps newly created objects to the corresponding data. The Object Factory implementation has a separate thread that fires the newly generated and mapped object via a pure virtual function event. The application “owns” this object for the duration of processing and, by calling the Release( ) function, indicates that the object is no longer needed by the application. The factory implementation then releases resources allocated for the object locally as well as the shared memory resources. [0371]
  • The processing flow described herein above is pictorially represented in the block diagram of FIG. 33. Preferred embodiments include ease of code maintenance and feature enhancement for the image transferring mechanism. The Object Transferring and Object Factory Interfaces as well as their implementations allow such modifications to be made at a relatively low development cost. With respect to object modification, the shared memory implementation is completely independent from the transferred data types. Thus, any type modification does not require making any changes to the underlying code controlling the shared memory. Since transferred data are encapsulated within classes of a particular type, the only action that is needed to modify a transferred object is to modify the corresponding class defining that object. Since the objects represent a class derivation tree, any modification of the base class causes an appropriate change in every object of the derived classes. Such modifications of the object types do not affect application code not related to the modified object classes. [0372]
  • The new types of objects can be introduced by deriving a new class from one of the existing classes. A newly derived class can be derived from the appropriate level of the base classes. An alternative way to create a new object type is by the creation of a new base class. This method may have the advantage in the case when a newly defined class differs from existing ones significantly. [0373]
  • With respect to multiple Object Transferring Channels, alternate preferred embodiments can support more than one AObjectXmitter class object and more than one corresponding communication channel. The architecture also can be extended in such a way that it allows communication channels transmitting objects in opposite directions. This allows the application to distribute imaging data to more than one client application. It can accept incoming communication controlling image creation and probe operation. [0374]
  • Further, wireless and remote image streaming channels can be accommodated in preferred embodiments. The same Object Transmitting Programming Interface can be implemented to transfer images not via the shared memory but via a high-speed wireless communication network such as, for example, IEEE 802.11a. It also can be used to transfer images across a wired Ethernet connection. Remote and wireless image streaming assumes that recipient computing systems can differ in performance. This makes the selection of a model of the recipient's device one of the important factors for a successful implementation. [0375]
  • The streamed imaging included in preferred embodiments thus utilizes a shared-memory client-server architecture that provides high bandwidth with low overhead. [0376]
  • The Ultrasound Imaging System software application of a preferred embodiment is used as a server of live ultrasound image frames by a client application. This client-server relationship is supported by two communications mechanisms as described hereinabove. A COM automation interface is used by the client application to start-up and control the ultrasound imaging system application. A high-speed shared-memory interface delivers live ultrasound images with probe identification, spatial and temporal information from the application to the client application. [0377]
  • Complexities of the shared-memory implementation are encapsulated for the client application in a simple ActiveX COM API (TTFrameReceiver). The shared-memory communications have flexible parameters that are specified by the client application. Queue order, number of buffers, buffer size and overwrite permission are all specified by the client when opening the image-frame stream. The queue order mode can be specified as First-In-First-Out (FIFO), Last-In-First-Out (LIFO) and Last-In-Out (LIO). In general, the FIFO mode is preferred when zero data loss is more important than minimum latency. The LIO mode delivers only the most recent image frames and is preferred when minimum latency is more important than data loss. The LIFO mode can be used when minimum latency and minimum data loss are both important. However, in the LIFO mode, frames might not always be delivered in sequential order and a more complicated client application is required to sort them after they are received. Overwrite permission, when all of the shared-memory buffers are full, is specified as not allowed, overwrite oldest and overwrite newest. [0378]
  • Each image frame contains a single ultrasound image, probe identification information, pixel spatial information and temporal information. The image format is a standard Microsoft device independent bitmap (DIB) with 8-bit pixels and a 256-entry color table. [0379]
  • The TTFrameReceiver ActiveX control provides two schemes for receiving frames. The first scheme is event driven. A COM event, FrameReady, is fired when a frame has been received. Following the FrameReady event, the image and associated data can be read using the data access methods of the interface. After image and other data have been copied, the client releases the frame by calling the ReleaseFrame method. The next FrameReady event does not occur until after the previous frame is released. In the second scheme, the client can poll for the next available frame using the WaitForFrame method. [0380]
  • In a preferred embodiment, both the client application and the server application are executed on the same computer. The computer can be running the Microsoft® [0381] Windows® 2000/XP operating system, for example, without limitation. The client application (USAutoView) can be developed using Microsoft® Visual C++ 6.0 and MFC. The source code can be compiled, for example, in Visual Studio 6.0. The server side COM Automation interface and the TTFrameReceiver ActiveX control may be compatible with other MS Windows® software development environments and languages.
  • In an embodiment of the present invention, the name of the server side COM automation interface (ProgID) is, for example, “Ultrasound.Document” and the interface is registered on the computer the first time the application is run. The dispatch interface can be imported into a client application from a type library. [0382]
  • In a preferred embodiment, the automation interface is extended to support frame streaming with the addition of different methods such as void OpenFrameStream (BSTR* queueName, short numBuffers, long bufferSize, BSTR* queueOrder, short overwritePermission). This opens the frame stream transmitter on the server side and opens the shared-memory interface to the client application. queueName is a unique name for the shared-memory “file” and is the same name that is used when opening the receiver; numBuffers is the number of buffers in the shared-memory queue; bufferSize is the size of each buffer in the shared-memory queue in bytes, wherein the buffer size is 5120 bytes larger than the largest image that can be transmitted; queueOrder is “LIO”, “FIFO”, or “LIFO”; overwritePermission is 0 for overwrite not allowed, 1 for overwrite oldest, or 2 for overwrite newest. Note, OpenFrameStream must be called before opening the TTFrameReceiver control. [0383]
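The bufferSize rule stated above (each buffer 5120 bytes larger than the largest transmittable image) can be expressed directly; for the 8-bit DIB format, the pixel data occupies width times height bytes. The helper name is hypothetical.

```cpp
#include <cstddef>

// Per the text, each shared-memory buffer must be 5120 bytes larger than
// the largest image that can be transmitted.  For 8-bit DIB pixels the
// image payload is width * height bytes; the 5120-byte margin leaves room
// for the DIB header, 256-entry color table, and frame header.
std::size_t requiredBufferSize(std::size_t maxWidth, std::size_t maxHeight) {
    return maxWidth * maxHeight + 5120;
}
```

For example, a client expecting images up to 640 by 480 pixels would pass the result of requiredBufferSize(640, 480) as the bufferSize argument.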
  • The next additional methods include void CloseFrameStream( ), which closes the frame stream transmitter on the server side; void StartTransmitting( ), which tells the server side to start transmitting ultrasound frames; void StopTransmitting( ), which tells the server side to stop transmitting ultrasound frames; and short GetFrameStreamStatus( ), which gets the status of the frame stream transmitter. It is important to check that the stream transmitter is open before opening the TTFrameReceiver. The COM automation interface is non-blocking and the OpenFrameStream call may not complete at the instant it is called from the client application. [0384]
  • In a preferred embodiment, the TTFrameReceiver ActiveX Control is the client application's interface to the live ultrasound frame stream. Frame stream control methods include boolean Open(BSTR name), which opens the frame stream receiver. The frame stream receiver cannot be opened until after the frame stream transmitter on the server has been opened. They also include boolean Close( ), which closes the frame stream receiver; long WaitForFrame(long timeoutms), which waits for a frame to be ready or until the end of the timeout period; and boolean ReleaseFrame( ), which releases the current image frame. The current frame can be released as soon as all of the desired data has been copied. The next frame cannot be received until the current frame is released. The return values of the other data access functions are not valid after the current frame is released until the next FrameReady event. [0385]
  • Data Access Methods in a preferred embodiment for the image include long GetPtrBitmapInfo( ), which gets a pointer to the header (with color table) of the DIB that contains the image. The ultrasound image is stored as a standard Microsoft device independent bitmap (DIB). BITMAPINFO and BITMAPINFOHEADER structures can be cast to the returned pointer as needed. Memory for the BITMAPINFO structure is allocated in shared-memory and may not be de-allocated; instead, ReleaseFrame( ) can be called to return the memory to the shared-memory mechanism. Further methods include long GetPtrBitmapBits( ), which gets a pointer to the image pixels. The returned pointer can be cast as needed for use with the Microsoft DIB API. Memory for the bitmap pixels is allocated in shared-memory and may not be de-allocated; instead, ReleaseFrame( ) is called to return the memory to the shared-memory mechanism. [0386]
  • The methods related to probe identification include short GetProbeType( ), which gets the defined type of the ultrasound probe being used; BSTR GetProbeName( ), which gets the defined probe name; and long GetProbeSN( ), which gets the serial number of the probe being used. [0387]
  • With respect to temporal information, the methods include short GetSequenceNum( ), which gets the sequence number of the current frame. The sequence number is derived from an 8-bit counter and thus repeats every 256 frames. It is useful for determining gaps in the frame sequence and for re-ordering frames received when using the LIFO buffer order mode. Further, double GetRate( ) gets the frame rate, which, when combined with the sequence number, provides precise relative timing for the received frames. BSTR GetTimestamp( ) gets a timestamp for the current frame, which provides an absolute time that may be useful when synchronizing to external events. The resolution is approximately milliseconds. Timestamps can be averaged and used in conjunction with rate and sequence number to achieve higher precision. Lastly, with respect to temporal information, the methods include BSTR GetTriggerTimestamp( ), which gets a timestamp for the start of ultrasound scanning. The ultrasound probe is stopped when “freezing” the image, and the trigger timestamp is recorded when live imaging is resumed. [0388]
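Gap detection and relative timing with the wrapping 8-bit sequence counter can be sketched as follows; the function names are illustrative:

```python
def frames_elapsed(prev_seq, cur_seq):
    """Number of frames between two 8-bit sequence numbers, accounting
    for wraparound at 256. A result of 1 means no gap; anything larger
    means (result - 1) frames were missed."""
    return (cur_seq - prev_seq) % 256

def relative_time(prev_seq, cur_seq, frame_rate_hz):
    """Relative time in seconds between two frames, derived from the
    sequence numbers and the frame rate reported by GetRate()."""
    return frames_elapsed(prev_seq, cur_seq) / frame_rate_hz
```

For example, going from sequence number 254 to 1 spans three frames (254 → 255 → 0 → 1), which the modular subtraction recovers correctly.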
  • Spatial Information in preferred embodiments has the following methods: short GetXPixels( ), which gets the width of the image in pixels; short GetYPixels( ), which gets the height of the image in pixels; double GetXPixelSize( ), which gets the size of each pixel in the x-direction (the x-direction is defined to be horizontal and parallel to each image line); and double GetYPixelSize( ), which gets the size of each pixel in the y-direction. The y-direction is defined to be vertical and perpendicular to each image line. Further, double GetXOrigin( ) gets the x-location of the first pixel in the image relative to the transducer head, and double GetYOrigin( ) gets the y-location of the first pixel in the image relative to the transducer head. The positive y-direction is defined to be away from the transducer head into the patient. Another method includes short GetXDirection( ), which gets the spatial direction along each line of the image. The positive x-direction is defined to be away from the probe marker. The short GetYDirection( ) gets the spatial direction across each line of the image. The positive y-direction is defined to be away from the transducer head into the patient. [0389]
  • The spatial position of any pixel in the image relative to the transducer head can easily be calculated as follows: [0390]
  • PX=OX+NX*SX*DX
  • PY=OY+NY*SY*DY
  • wherein, [0391]
  • P=the position of the pixel relative to the transducer head, [0392]
  • O=the origin [0393]
  • N=the index of the pixel in the image, [0394]
  • S=the pixel size [0395]
  • D=the direction of the pixel. [0396]
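A minimal sketch of the calculation, applied to one coordinate at a time; the example values (0.2 mm pixel size, origin at the transducer face) are assumed for illustration:

```python
def pixel_position(n, origin, pixel_size, direction):
    """Position of one pixel coordinate relative to the transducer head:
    P = O + N * S * D, applied separately to the x and y coordinates."""
    return origin + n * pixel_size * direction

# Example with assumed values: origin 0.0 mm, 0.2 mm pixels, positive
# direction (+1), pixel index 50 -> 10.0 mm into the patient.
```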
  • Further, with respect to events in a preferred embodiment, void FrameReady( ) fires when a frame is ready and data can be read. The handler copies data from the data access methods and then calls ReleaseFrame( ). It is recommended that any kind of indefinite processing, for example, functions that invoke message loops, be avoided in the handler. Further, void FrameOverrun( ) fires when the server is unable to send a frame or a frame has to be overwritten in the buffers because the buffers are full. This only applies to the FIFO and LIFO modes, since the LIO mode automatically releases old buffers. This event is useful for determining whether the client application is reading frames quickly enough and whether the number of buffers allocated is sufficient for the latency of the client. [0397]
  • In a preferred embodiment, USAutoView is a sample client application that automates the server side and displays live ultrasound image frames. It has functions to demonstrate starting and stopping the server side, hiding and showing the server side, toggling between showing and not showing graphics on the image, freezing and resuming the ultrasound acquisition, loading a preset exam, changing the designated patient size, changing the image size, spatial information, and inverting the image. [0398]
  • FIG. 34 is a view of a graphical user interface used for a USAutoView UI in accordance with a preferred embodiment of the present invention. The USAutoView program is a Windows® dialog application with three ActiveX components: TTFrameReceiver, which supplies the ActiveX interface to receive ultrasound frames; TTAutomate, which encapsulates automation of the server side; and TTSimpleImageWnd, which is the image display window. CUSAutoViewDlg is the main dialog. It manages the automation of the server side through the TTAutomate control, receiving ultrasound frames through TTFrameReceiver and image display through TTSimpleImageWnd. The OnStartUS( ) method of CUSAutoViewDlg calls the TTAutomate and TTFrameReceiver methods needed to start or stop automation and data transmission from the server side. [0399]
  • The method OnFrameReady( ) handles the FrameReady event from TTFrameReceiver. It copies the desired data from TTFrameReceiver and then releases the frame with TTFrameReceiver's ReleaseFrame( ) method. It avoids any functions that perform indeterminate processing, such as functions that invoke message loops. [0400]
  • TTAutomate is an ActiveX control that encapsulates automation functions for the server side. The native COM Automation interface of the server side is non-blocking and requires waiting with GetStatusFlags to coordinate functions. TTAutomate wraps each function in the required wait loops. The wait loops allow Windows® messages to be processed so that the client application's user interface thread does not become blocked while waiting. Although automation methods in TTAutomate cannot return until the function has been completed, other Windows® messages are still processed before the function is completed. It is recommended to prevent multiple concurrent calls from message handlers to TTAutomate methods, as coordination with the server side is generally non-reentrant. Source code for this control is included in the USAutoView workspace. It may be reused or modified as desired. [0401]
  • TTSimpleImageWnd is an ActiveX control that provides a display window for device independent bitmaps (DIBs). The two properties of the display interface are long DIBitmapInfo and long DIBits. DIBitmapInfo corresponds to a pointer to a block of memory that contains the BITMAPINFO structure for the DIB. DIBits corresponds to a pointer to a block of memory that contains the image pixels. To load a new image, DIBitmapInfo is set to the pointer to the bitmap info of the DIB. Then DIBits is set to the pointer to the bitmap bits. When DIBits is set, the pointer that was set for DIBitmapInfo is expected to still be valid, and both the bitmap info and bitmap bits are copied internally for display on the screen. Both DIBitmapInfo and DIBits are set to zero to clear the image. Source code for this control is included in the USAutoView workspace. It may be reused or modified as desired. [0402]
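As a sketch of the memory layout that DIBitmapInfo (and GetPtrBitmapInfo( )) points to, the standard 40-byte BITMAPINFOHEADER at the start of the block can be packed as follows. This is illustrative Python, not part of the control; the actual structure is allocated in shared memory by the server:

```python
import struct

def bitmapinfoheader(width, height, bit_count):
    """Pack a minimal 40-byte BITMAPINFOHEADER (BI_RGB, uncompressed),
    the structure at the start of the memory block returned by
    GetPtrBitmapInfo(); the color table follows it for paletted images."""
    return struct.pack(
        "<IiiHHIIiiII",
        40,          # biSize: size of this header in bytes
        width,       # biWidth: image width in pixels
        height,      # biHeight: positive = bottom-up DIB
        1,           # biPlanes: always 1
        bit_count,   # biBitCount, e.g. 8 for a paletted grayscale image
        0,           # biCompression = BI_RGB (uncompressed)
        0,           # biSizeImage: may be 0 for BI_RGB
        0, 0,        # biXPelsPerMeter, biYPelsPerMeter
        256 if bit_count == 8 else 0,  # biClrUsed: palette entries
        0)           # biClrImportant
```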
  • The preferred embodiments of the present invention include a plurality of probe types. For example, the probes include, but are not limited to, a convex-linear transducer array operating between 2-4 MHz, a phased-linear transducer array operating between 2-4 MHz, a convex-linear endocavity transducer array operating between 4-8 MHz, a linear transducer array operating between 4-8 MHz and a linear transducer array operating between 5-10 MHz. [0403]
  • Preferred embodiments of the portable ultrasound system of the present invention provide high resolution images such as the following during an examination: B-mode, M-mode, Color Doppler (CD), Pulsed Wave Doppler (PWD), Directional Power Doppler (DirPwr) and Power Doppler (PWR). Once the system software is installed, the probe device is connected to a desktop or laptop. The probe can be an industry standard transducer connected to a 28 oz. case that contains the system's beamforming hardware. If the probe is connected to a laptop, then a 4-pin FireWire cable is connected to an IEEE 1394 serial connection located on a built-in MediaBay. However, if the probe is connected to a desktop, the computer may not be equipped with a MediaBay. One can connect the probe using an External DC Module (EDCM) connector. Before connecting the probe, one needs to make sure that the FireWire is connected on both the right and left sides of the computer. [0404]
  • In an embodiment, the EDCM is designed to accept a 6-pin IEEE 1394 (also referred to as FireWire) cable at one end and a Lemo connector from the probe at the other end. The EDCM accepts an input DC voltage from +10 to +40 Volts. Further, the system, in an embodiment, can be connected to a host computer with IEEE 1394. The 6-pin IEEE 1394 input to the EDCM can originate from any IEEE 1394 equipped host computer running, for example, the Windows® 2000 operating system. An external IEEE 1394 hub may also be necessary to provide the requisite DC voltage to the EDCM. In a host computer equipped with IEEE 1394, there is one of two types of IEEE 1394 connectors: a 4-pin or a 6-pin. The 6-pin connector is most often found in PC-based workstations that use internal PCI-bus cards. Typically, the 6-pin connector provides the necessary DC voltage to the EDCM. A 6-pin-male to 6-pin-male IEEE 1394 cable is used to connect the host computer to the EDCM. [0405]
  • The 4-pin connector is found in laptop computers that do not contain a MediaBay in accordance with a preferred embodiment or provide a DC voltage output. When using this connector type, an external IEEE-1394 hub can be used to power the EDCM and the probe. [0406]
  • When power is not provided from the host computer, an external IEEE-1394 hub can be used between the host computer and the EDCM. The hub derives its power from a wall outlet and is connected using a medical-grade power supply that conforms to the IEC 60601-1 electrical safety standard. [0407]
  • To connect the hub to the host computer, a 4-pin-male to 6-pin-male or 6-pin-male to 6-pin-male IEEE 1394 cable is required. The appropriate connector (4-pin or 6-pin) is inserted into the host computer and the 6-pin connector into the hub. The hub is then connected to the EDCM using a 6-pin-male to 6-pin-male IEEE 1394 cable. An IEEE 1394 hub is only necessary when the host computer cannot supply at least +10 to +40 DC volts and 10 watts of power to the EDCM. If the host computer can supply adequate voltage and power, a 6-pin-male to 6-pin-male IEEE 1394 cable can be used to connect the computer directly to the EDCM. [0408]
  • FIG. 35 illustrates a view of a main screen display of a graphical user interface in accordance with a preferred embodiment of the present invention. When the user starts the system in accordance with the present invention, the main screen is displayed. To help the user navigate, the main screen can be considered as four separate work areas that provide information to help one perform tasks. These include a menu bar, an image display window, an image control bar and a tool bar. [0409]
  • In order to resize windows and regions, the user can click the small buttons in the upper right of the window to close, resize, and exit the program. One interface button closes the window but leaves the program running (minimizing the window). A system button appears at the bottom of the screen, in the area called the taskbar. By clicking the system button in the taskbar, the window re-opens. Another interface button enlarges the window to fill the entire screen (called maximizing); however, when the window is at its largest, the frame rates may decrease. Another interface button returns the window to the size that it was before being enlarged. The system program can be closed by another interface button. [0410]
  • The user can increase or decrease the width of each region of the application to meet one's needs. For example, to make the Explorer window narrower, the cursor is placed at either end of the region, and the new desired size is obtained by clicking and dragging. One can re-position the size and location of each region so that they become floating windows. To create floating windows, the user simply clicks the mouse on the double-edged border of the specific region and drags it until it appears as a floating window. To restore the floating window back to its original form, one double-clicks in the window. These functionalities are depicted in FIGS. 36A-36C, which are views in a graphical user interface in accordance with a preferred embodiment of the present invention. [0411]
  • The Explorer window provides the nested-level file directory for all patient folders the user creates and for the images that are created and saved. The folder directory structure includes the following, but is not limited to: a patient folder and an image folder. The patient folder directory is where patient information files are stored along with any associated images. The image folder directory contains images by date and exam type. The images in this directory are not associated with a patient and are created without patient information. FIGS. 37A-37B illustrate the patient folder and image folder in accordance with a preferred embodiment of the present invention. The menu bar at the top of the screen provides nine options one can use to perform basic tasks. To access a menu option, simply click the menu name to display the drop-down menu options. The user can also access any menu by using its shortcut key combination. [0412]
  • The Image display window provides two tabs: Image Display and Patient Information. The user clicks on the Image Display tab to view the ultrasound image. The image is displayed in the window according to the control settings that are defined. Once the image is saved, when the user retrieves it again, the category, date and time of the image are also shown in the Image Display window. The Patient Info tab is used to enter new patient information which is later stored in a patient folder. The user can access this tab to also make modifications and updates to the patient information. [0413]
  • The Image Mode bar is illustrated in FIG. 38 in accordance with a preferred embodiment of the present invention. It provides six control modes the user can choose from when performing an examination. The modes include B-Mode, which is a brightness mode providing a standard two-dimensional display in real time; M-Mode, which is used to display motion along a line depicted in the B-mode image as a function of time; and CD mode, a Color Doppler tab that displays, in real time, a two-dimensional image of blood flow overlaid on the B-mode image. The hues in the color palette indicate mean flow velocity, and the different colors indicate the direction of blood flow. [0414]
  • Further, Pulsed-Wave Doppler (PWD) mode displays a line in the B-mode image, which contains the sample size and location of interest. The pulsed Doppler waveform depicts the instantaneous velocity of flow within that sample, as a function of time. [0415]
  • The Directional Power Doppler (DirPwr) mode displays, in real time, a two-dimensional image of blood flow overlaid on the B-mode image. The hues in the color palette indicate the density of red blood cells. Brighter hues indicate greater density. The different colors indicate the direction of blood flow. [0416]
  • The Power Doppler (Pwr) mode displays, in real time, a two-dimensional image of blood flow overlaid on the B-mode image. The hues in the color palette indicate the density of red blood cells. Brighter hues indicate greater density. It should be noted that directional information is not provided. Power Doppler is not subject to aliasing. It is generally more sensitive to low flow than Color Doppler or Directional Power Doppler. [0417]
  • When a user is in the process of performing the ultrasound, which can be referred to as viewing a live image, the user chooses the control mode they wish to use to view the image. The tabs available in the Image Control bar complement the specific mode the user selects from the Image Mode bar. FIG. 38 illustrates the tool bar in the graphical user interface in accordance with a preferred embodiment of the present invention. [0418]
  • FIG. 39 illustrates a measurement tool bar in a graphical user interface in accordance with a preferred embodiment of the present invention. The Measurements toolbar provides the following buttons: a zoom button that lets a user magnify the selected region of the image of interest, an ellipse button that lets one perform area and ellipse measurements on the image, a measure distance button that performs distance measurements on an image, an SD button that provides cursors for measurement of the systolic and diastolic portions of the pulse Doppler waveform, a delete button that removes a selected measurement or the last measurement made, and a text button that lets one enter text on live or frozen images. [0419]
  • FIG. 40 illustrates a playback toolbar in a graphical user interface in accordance with a preferred embodiment of the present invention. The Playback toolbar provides the following buttons: a play button that lets one play and pause loops of data. The user can play or pause up to sixty frames of loop information. The flip side of this button is a Pause Loop button, which lets the user pause the loops of data in Play mode. Further, a previous button lets a user return to the previous frame during Playback Mode, a next image button allows a user to advance to the next frame during Playback Mode, and a status indicator button shows graphically and numerically the frame number being viewed. [0420]
  • The Live/Freeze buttons that are used during a scan to record the examination or save the image to a file are illustrated in FIGS. 41A and 41B in accordance with a preferred embodiment of the present invention. The live button provides a real-time image display, while the freeze button freezes the image during the scan to allow the user to print or save to a file. [0421]
  • FIG. 42 illustrates the file toolbar in a graphical user interface of a preferred embodiment. The file toolbar provides the following buttons: a Save button that saves the current image to a file, a Save Loop button that saves the maximum allowed number of previous frames as a Cine loop, and a Print button that lets the user print the current image. [0422]
  • The preferred embodiments of the present invention also provide an online help system from the system program that provides information grouped as contents, index and search. [0423]
  • The preferred embodiments of the present invention provide the steps a user needs to take to set up and modify information relating to a new patient in the system program. The user can enter new patient information so that pertinent exam history can be retained when a patient is scanned. A patient folder that stores the patient information is created when one sets up a new patient. All examination images for the patient are stored in this folder and are viewable by accessing the system Explorer window. [0424]
  • FIG. 43 illustrates a view of a patient information screen in a graphical user interface of a preferred embodiment. The Patient Information screen is accessible by selecting the Patient Information tab from the main system window. Several fields in the Patient Information screen provide drop-down arrows. The drop-down arrows display a list box of choices for a field when selected. The choices available in the list box are created based on new data the user enters into a field for patients each time they perform an exam. For example, one can enter a new comment or choose a previously entered comment available from the list box in the Comment field. In the examination location and clinical information fields, one can enter new data or choose a value from a list box displaying existing names or locations. [0425]
  • FIG. 44 illustrates further interface buttons in a patient interface screen. The interface buttons provide the following functions: a Save button lets a user save new or modified patient information, a New Patient button clears the Patient Information screen so the user can add a new patient, a Cancel button stops the function being performed and reverts back to the last saved data, and a Help button provides access to the online system Help. If previous data was entered for the last patient, when one clicks on New Patient the data is eliminated from the screen. A dialog box is displayed prompting one to save the data. If the user chooses Yes, the data is saved. If the user chooses No, the screen is cleared with no changes saved. If one chooses Cancel, the operation is cancelled. [0426]
  • To view the data in the file, the user locates the specific patient information folder and clicks to select the Patient Information file. The data appears again in the Patient Information screen. If the user clicks the Cancel button while entering new patient information, the data is lost and cannot be retrieved. [0427]
  • FIG. 45 illustrates a view of a screen for adding a new patient in a graphical user interface of a preferred embodiment. When adding a new patient, the user can enter information in the fields that display in the Patient Information screen. When the user is finished adding new patient information, a Patient Information file is created that resides in the system directory. The Patient Information file is stored in the patient folder. Any associated images for a patient are also stored in the same directory. The Patient Information tab can be chosen in the image area to enter new patient information. Further, the New Patient button can be chosen at the bottom to clear all previous information entered from the last patient. [0428]
  • In the examination (exam) selection field, the user can choose the exam type they want for this examination. Each time one performs an examination on the specific patient, the user can choose the new examination type. The information is stored as part of the image filename. Further, to save a new patient that has been added, the user clicks on Save. The patient information is saved in the Patient Information file and displays in the system Explorer window next to the patient folder for the patient. Important patient information the user entered in the Patient Information tab is displayed in the Image Display screen. To view the patient information, one clicks the Image Display tab. The patient information is shown across the top of the screen and is saved with scanned images one creates for the patient. The user, be it a clinician or an ultrasound technician, for example, can update information for an existing patient. First they need to retrieve the appropriate file and then make their changes. [0429]
  • Ultrasound is primarily an operator-dependent imaging technology. The quality of images and the ability to make a correct diagnosis based on scans depend upon precise image adjustments and adequate control settings applied during the exam. The user can optimize the image quality during a patient exam while using any of the six image modes available in the system software. A two-dimensional (2D) image control setting tab that provides adjustments for size, depth and focus, as well as time gain compensation, is available for each of the six modes. An image quality (IQ) control setting that allows the user to further adjust the image for clarity is also available for each of the six modes. [0430]
  • Preferred embodiments of the present invention, provide B-mode and M-mode images. The B-mode (Brightness mode) tab provides two-dimensional image adjustments that allow the user to control the size, depth, focus, and overall image gain as well as TGC (Time Gain Compensation). The user can further manipulate the image quality by selecting from various palettes, smoothing, persistence settings and maps offered in the Image Quality tab menu. B-mode is selected when the user wants to view only a two-dimensional display of the anatomy. The B-mode provides the standard image mode options necessary for an ultrasound using the 2D and IQ (image control) settings. To select the B-mode for image display, the user chooses the B-mode button from the image mode bar, or from the Modes menu. [0431]
  • FIG. 46 illustrates an image in the B-mode including the controls provided by a graphical user interface of a preferred embodiment. FIGS. 47A-47H illustrate the different control interfaces for adjusting a B-mode image in the graphical user interface of the preferred embodiment. For adjusting size, the user can choose the parameters for the scan that meet the size of the patient, or structured anatomy. For example, the user clicks the T-shirt button that matches the patient size for small, medium, or large (or for superficial, moderately deep, and deep areas of interest). In the alternative, the user can access the size option from the Image menu and choose the size from the drop-down list. Selection of the appropriate scan size can provide the user with fast and easy baseline settings. Other B-mode settings such as Gain, Focus and Depth have been optimized according to the T-shirt size in a preferred embodiment. All controls return to default settings when a new T-shirt size is selected. This feature makes it easy for the user to reset multiple parameters. [0432]
  • With respect to adjusting depth, the user can control the field of view for the scanned image by using depth control. If they want to capture deeper structures, they increase the depth. If there is a large part of the exam display that is unused or not necessary at the bottom of the screen, the user decreases the depth. To select the depth, the user clicks on the down arrow next to the Depth field label, and chooses a value from the list of available options. The available depth options depend on the probe that the user is working with. To decrease the depth, the user chooses a lower value from the depth list box. Or, in the alternative, the user can access the depth option from the Image menu and choose depth from the drop-down list. [0433]
  • The depth control adjusts the user's field of view. It increases one's field of view to see larger or deeper structures. Depth control also decreases the user's field of vision to enlarge the display of structures near the skin line. After adjusting Depth, the user may want to adjust the TGC and focus control settings. [0434]
  • With respect to adjusting focus, the user can change the position of the focal zone to specify a location for the optimal area of focus. A graphic caret is positioned on the depth scale to represent the focal zone. To adjust the focus, the user clicks on the down-arrow next to the Focus label to view the list box, or selects Focus from the Image menu. The list of values displayed in the menu can be chosen. The available focal position options depend on the probe being used. The Focus optimizes the image by increasing the resolution for a specific area. [0435]
  • With regard to adjusting Gain, the user increases or decreases the amount of echo information displayed in an image. Adjusting Gain may have the effect of brightening or darkening the image if sufficient echo information is generated. To adjust the Gain in a preferred embodiment, the user slides the bar illustrated in FIG. 47F to the right to increase, or to the left to decrease, the level. Gain allows the user to balance echo contrast so that cystic structures appear echo-free and reflecting tissue fills in. [0436]
  • With respect to adjusting TGC, or Time Gain Compensation, a slider illustrated in FIG. 47C adjusts how the control amplifies returning signals to correct for the attenuation caused by tissues at increasing depths. Each of the eight TGC slider bars is spaced proportionately to the depth. Depending on the adjustments, the TGC curve on the image display illustrated in FIG. 47H shows the values that match the TGC control. When a user changes the depth, the TGC is rescaled across the new depth range. The TGC is used to balance the image so that the brightness of echoes is the same from near field to far field. [0437]
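One plausible model of the eight proportionately spaced sliders is linear interpolation of gain across the depth range; the sketch below is an assumed illustration of such a control, not the system's actual signal path:

```python
def tgc_gain_at(depth_frac, sliders):
    """Linearly interpolate the gain at a normalized depth (0.0 = skin
    line, 1.0 = maximum depth) from slider values spaced proportionately
    across the depth range. Rescaling after a depth change is implicit:
    the same sliders always span the full (new) depth."""
    n = len(sliders)            # eight sliders in this system
    pos = depth_frac * (n - 1)  # fractional slider index
    i = min(int(pos), n - 2)
    t = pos - i
    return sliders[i] * (1 - t) + sliders[i + 1] * t

def apply_tgc(samples, sliders):
    """Scale each sample of one scan line by its depth-dependent gain."""
    last = len(samples) - 1
    return [s * tgc_gain_at(i / last, sliders) for i, s in enumerate(samples)]
```

Raising only the deeper sliders boosts late (deep) echoes, which is how the control compensates for attenuation with depth.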
  • FIG. 48 illustrates the image quality control setting provided in the B-mode image option in a preferred embodiment. To select the I.Q. (image quality) control setting, the user clicks the I.Q. tab at the bottom of the image control bar, or accesses the Image menu. The user can invert the scanned image to the left or right by clicking on the Left/Right button. Further, the user can also invert the image in the top to bottom direction by clicking on the Up/Down button. It is appropriate to refer to the display to confirm the orientation of the image. [0438]
  • In addition, the palette can be adjusted by the user. The user can choose from a palette color range to define the reference bar, or the user can choose to use the default conventional gray scale display. The down-arrow next to the Palette field label can be clicked or selected to view the list box of color choices. The color the user wants can be chosen. The color scale changes to the new color and is represented in the image display. The gray palette is the most frequently used palette. To determine if another palette will improve visualization of the anatomy, the user can cycle through the available options. The anatomy that is being imaged has an effect on which palette is most advantageous. [0439]
  • With respect to image smoothing, the user can select from a range of A to E for the smoothed image display. The user can click on the down-arrow next to the Smoothing field label to view the list box of values. The value is displayed in the probe information and is the first value in A/4/E. Increasing the smoothing increases the amount of interpolation between scan lines, which makes the image appear smoother. It also decreases the frame rate. The opposite is true when the user decreases the amount of smoothing. [0440]
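Smoothing as interpolation between scan lines can be sketched as follows; the linear interpolation and the factor parameter are illustrative assumptions (a higher smoothing setting would correspond to a larger factor and more computation per frame):

```python
def smooth_lines(lines, factor):
    """Insert (factor - 1) linearly interpolated lines between each pair
    of adjacent scan lines, making the image appear smoother at the cost
    of extra work per frame."""
    out = []
    for a, b in zip(lines, lines[1:]):
        for k in range(factor):
            t = k / factor      # interpolation weight toward line b
            out.append([x * (1 - t) + y * t for x, y in zip(a, b)])
    out.append(lines[-1])       # keep the final original line
    return out
```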
  • For image persistence, the user selects from a range of 0 to 4 to define the amount of image frame averaging desired. By clicking on the down-arrow next to the Persistence field label, the user views the list box of values. The value is displayed in the probe information and is the second value in A/4/E. When the persistence rate is high, the image appears less speckled and smoother. However, increasing the persistence rate also increases the possibility that the image appears blurred if the tissue is moving when the user freezes the image. When the persistence is low, the opposite is true. [0441]
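Persistence as recursive frame averaging can be sketched as follows; the mapping from the 0-4 setting to averaging weights is an assumed example, not the system's actual coefficients:

```python
def persist(prev_display, new_frame, persistence):
    """Recursive frame averaging: persistence 0 shows each new frame
    unmodified; higher settings weight the running average more heavily,
    reducing speckle at the cost of blurring moving tissue."""
    weights = {0: 0.0, 1: 0.2, 2: 0.4, 3: 0.6, 4: 0.8}  # assumed mapping
    w = weights[persistence]
    return [w * p + (1 - w) * n for p, n in zip(prev_display, new_frame)]
```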
  • With respect to an image map, the user can select from a range of A to F in the Map label field to change gray levels. The user clicks on the down-arrow next to the Map field label to view the list box of values. The value displays in the probe information and is the third value in A/4/E. Adjusting the map can assist in closely viewing certain anatomical features or to detect subtle pathologies. [0442]
  • The brightness of the image tone of the display is adjusted by defining the brightness range. Adjusting the brightness control to the right increases the brightness of the image; adjusting it to the left decreases the brightness. Increasing the brightness increases the overall brightness of the image. The brightness is adjusted to correspond with the map and contrast values. [0443]
  • The contrast of the image tone of the display is adjusted by defining the contrast range. Adjusting the contrast control to the right increases the contrast of the image; adjusting it to the left decreases the contrast. Increasing the contrast decreases the number of gray levels in the image, making the image appear more contrasted. The opposite is true when the user decreases the contrast. It is recommended to adjust the contrast to correspond with and complement the brightness and map values. [0444]
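A sketch of how brightness and contrast adjustments of this kind are commonly applied to 8-bit pixel values, scaling about mid-gray for contrast and offsetting for brightness; the exact transfer function used by the system is not specified in the text, so this is an assumed form:

```python
import numpy as np

def adjust_tone(image: np.ndarray, brightness: float, contrast: float) -> np.ndarray:
    """Scale pixel values about mid-gray (128) by `contrast`, then add
    `brightness`, clipping back to the 8-bit range. Raising contrast
    pushes values toward the extremes, effectively reducing the number
    of distinct gray levels visible, as described above."""
    out = (image.astype(float) - 128.0) * contrast + 128.0 + brightness
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.array([[100, 150]], dtype=np.uint8)
brighter = adjust_tone(img, brightness=20, contrast=1.0)
punchier = adjust_tone(img, brightness=0, contrast=2.0)
```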
  • FIG. 49 illustrates the M-mode image option in accordance with a preferred embodiment of the present invention. The M-mode (motion mode) provides a display format and measurement capability that represents tissue displacement (motion) occurring over time along a single vector. The M-mode is used to determine patterns of motion for objects within the ultrasound beam. Typically, this mode is used for viewing motion patterns of the heart. When a user chooses M-mode for the image display the user can view the image in B-mode at the top of the screen as well as view the M-mode depth scale in the window at the bottom of the screen. [0445]
  • To select the M-mode for image display, the user chooses the M-mode button from the image mode bar. The following image control settings can be used to make adjustments to the image display: 2D (two dimensional), I.Q. (image quality), and M-mode. [0446]
  • To select the M-mode image control setting, the user clicks the M-mode tab at the bottom of the image control bar. To adjust the sweep, the user can change the speed at which the timeline is swept. To increase the speed, in the Sweep Speed field, the user clicks the Fast button. To decrease the speed, the user clicks the Slow button. The sweep speed can also be accessed by right-clicking in the M-mode window. As necessary, the M-mode display is optimized by adjusting the Depth, Focus, Gain and TGC controls on the 2D tab of the Image Control bar. The user can also adjust the image quality by selecting the IQ tab in the Image Control bar. To adjust the size of the 2D and M-mode image display, the user clicks on the horizontal bar between the two images and drags to the new appropriate window size. Changing the Sweep Speed can affect the thermal index (TI) and/or mechanical index (MI). [0447]
  • To adjust the scan line position in the B-mode window, the user can click on the left or right arrows next to the Scan Line Position label. The scan line moves accordingly to the left or the right. The M-mode cursor line can be moved manually. [0448]
  • During a scan, live images are recorded by frame. Depending upon the mode the user selects, a certain number of frames is recorded. For example, the B-mode allows the capture of up to 60 frames in a Cine loop. When the user freezes a real-time image during a scan, all movement is suspended in the image display area. The frozen frame can be saved as a single image file or as an entire image loop, depending upon the mode. [0449]
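The Cine loop capture described above behaves like a fixed-capacity ring buffer that a freeze operation snapshots; a minimal sketch, assuming the 60-frame B-mode capacity mentioned in the text (class and method names are illustrative):

```python
from collections import deque

class CineLoop:
    """Keep the most recent `capacity` frames (e.g. 60 for B-mode).
    Freezing snapshots the loop, which can then be saved as a single
    image or as an entire loop; recording is suspended while frozen."""
    def __init__(self, capacity: int = 60):
        self.frames = deque(maxlen=capacity)
        self.frozen = None

    def record(self, frame):
        if self.frozen is None:
            self.frames.append(frame)

    def freeze(self):
        self.frozen = list(self.frames)
        return self.frozen

loop = CineLoop(capacity=3)   # small capacity for demonstration
for f in range(5):
    loop.record(f)
snapshot = loop.freeze()      # only the last 3 frames survive
```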
  • The ultrasound images can be magnified and text annotation can be added to the image area. Further, measurements accompanying ultrasound images can be added to supplement other clinical procedures available to the attending physician. The accuracy of measurements is determined not only by the system software, but also by the use of proper medical protocols by the users. The user can create measurements for Distance, Ellipse, or Peak Systole/End Diastole depending upon the mode being used. [0450]
  • Images and loops can be exported in formats that can be viewed by others who do not have the system software. The preferred embodiment of the present invention can electronically mail (e-mail) an image and loop files or include them as graphics in other applications. The user can export images to one of the following graphic file formats: Bitmap (.bmp), and DICOM (.dcm). The user can export loops to one of the following graphic file formats: Tagged Image File Format (.tif), DICOM (.dcm). The Obstetrical measurement generated by the preferred embodiments of the system can easily be exported to the R4 Obstetrical reporting package. [0451]
  • Examination types in preferred embodiments are considered presets that contain standard image control settings. The system provides the user with a wide variety of exam types. The exam types automatically optimize multiple parameters one can use during a scan. The examination types available depend on which probe is being used. Although several probes may provide the same examination type, the preset image control parameters are unique to the characteristics of each probe. Each examination type contains three T-shirt presets: Small, Medium, and Large. The T-shirt icons also serve as interfaces that represent predefined parameters for image control settings used with small, medium and large patients, or for superficial, moderately deep and deep areas of interest. [0452]
  • The image settings that can be optimized for each size using a two dimensional graphical interface include: depth, focus, gain, and TGC. The image setting controls that are optimized for each examination type using the image quality graphical user interface include: left/right, up/down, palette, smoothing, persistence, map, brightness and contrast. The image setting controls that are optimized for each examination type using the M-mode graphical user interface include: sweep speed, scan line position, and B-mode to M-mode display ratio. The image setting controls that are optimized for each examination type using the Pulsed Wave Doppler (PWD) interface include: sweep speed, velocity display, PRF (pulse repetition frequency), wall filter, steering angle, invert, correction angle, sample volume size, gain, baseline, and sound volume. [0453]
  • Further, the image setting controls that are optimized for each examination type using the Color Doppler (CD) graphical user interface include: scan area, PRF (Pulse repetition frequency), wall filter, steering angle, invert, color gain, priority, persistence, baseline, high spatial resolution, and high frame rate. The image setting controls that are optimized for each examination type using the Direction Power Doppler (DirPwr) interface include: scan area, PRF (Pulse repetition frequency), wall filter, steering angle, invert, gain, priority, persistence, baseline, high spatial resolution, and high frame rate. The image setting controls that are optimized for each examination type using the PWR interface include: scan area, PRF (Pulse repetition frequency), wall filter, steering angle, gain, priority, persistence, high spatial resolution, and high frame rate. [0454]
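The examination-type presets described above could be modeled as a lookup table keyed by exam type and T-shirt size; the exam name and all numeric values below are hypothetical placeholders, not values from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BModePreset:
    """A subset of the 2D controls listed above; values are placeholders."""
    depth_cm: float
    focus_cm: float
    gain_db: float

# Hypothetical preset table: exam type -> T-shirt size -> settings.
PRESETS = {
    "abdominal": {
        "Small":  BModePreset(depth_cm=8.0,  focus_cm=4.0, gain_db=50.0),
        "Medium": BModePreset(depth_cm=12.0, focus_cm=6.0, gain_db=55.0),
        "Large":  BModePreset(depth_cm=16.0, focus_cm=8.0, gain_db=60.0),
    },
}

def load_preset(exam: str, size: str) -> BModePreset:
    """Fetch the stored control settings for one exam type and size."""
    return PRESETS[exam][size]

medium = load_preset("abdominal", "Medium")
```

A customized exam type would then simply be a new table entry derived from one of the presets, which matches the customization workflow described next.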
  • Although the user can use the preset examination types available with the probe, the user can also create customized exam types based on the presets. Customized exam types can include specific modifications to the preset image control setting parameters. The user can select a customized exam type for future use without having to enter the exact settings again. Any examination type can be customized to include the specific control settings the user wishes to use. [0455]
  • Certain embodiments include diagnostic ultrasound images being displayed on a display monitor away from the operator's hands. Larger ultrasound imaging systems have the display integrated in the main operator console. The operator is accustomed to turning his/her head between the hand holding the probe or scan head and the system console and display, or he/she keeps his/her eyes on the display and manipulates the scan head without looking at the patient. This does not work well for some medical procedures in which the operator is performing visually intensive operations at the same location on the patient where the scan head is. Therefore, it is very beneficial to locate the display proximate to the operator's hands. In alternate preferred embodiments, a visor-mounted display is a workable alternative, but is deemed uncomfortable for some operators. [0456]
  • A preferred embodiment integrates the display on the hand-held scan head, thus allowing the operator to easily view the image and operate the probe or scan head, as well as perform operations in the same local area with the other hand. The data/video processing unit is also compact and portable, and may be placed close to the operator or alternatively at a remote location. Optionally, in another embodiment, a display is also integrated into the data/video processing unit. The processing unit also provides an external monitor port for use with traditional display monitors. [0457]
  • FIG. 50 illustrates a preferred embodiment of a portable ultrasound imaging system 1900 including a hand-held scan head with integrated display and a portable processing unit. The hand-held scan head assembly 1902 comprises an ultrasound transducer and a compact video display 1904. The display unit 1942 can be integrated into the hand-held scan head either directly on the scan head housing as illustrated in FIG. 51A, or the display unit 1962 can be mounted onto the scan head via a mechanical swivel assembly 1966 as illustrated in FIG. 51B, which may be permanently attached or detachable. Alternatively, the display can be mounted on the interface 1908 in close proximity to the probe 1904. The interface can use a system such as that shown in FIG. 3D, for example, that is modified to have a video cable and power running alongside the Firewire connection from the processor housing 5. [0458]
  • The on-the-probe display unit such as shown in FIG. 51A can be a 2.5 inch to 6.4 inch color LCD panel display, with a screen resolution ranging from quarter VGA (320×240 pixels) to full VGA (640×480 pixels). The video data delivered to the display may be composite video via a thin coaxial cable, or digital VGA using Low Voltage Differential Signaling (LVDS) via four coaxial or twisted-pair wires. The display may be powered by a battery in the unit, or by DC power supplied by the portable data/video processing unit 1946 via power wires from the processing unit. [0459]
  • In a preferred embodiment, the video and power wires for the display are integrated with the transducer data wires for the scan head to form a single cable assembly 1948 that connects the hand-held scan head to the portable data/video processing unit. [0460]
  • The data/video processing unit 1908 is compact and portable. In a preferred embodiment, the beamformer electronics is an integral part of the hand-held scan head assembly, and the scan head communicates with the processing unit via a Firewire (IEEE 1394) cable as illustrated in FIG. 53A. [0461]
  • In another preferred embodiment, the beamformer electronics is moved inside the processing unit to further reduce the size and weight of the hand-held scan head as illustrated in FIG. 53B. The processing unit in this configuration comprises a compact single board computer 1982 and the beamformer electronics as illustrated in FIG. 52. The beamformer electronics includes a digital processing printed circuit board 1984 and an analog processing printed circuit board 1986. The beamforming electronics communicates with the single board computer via a Firewire (IEEE 1394) interface 1988. [0462]
  • In an embodiment, the compact single board computer has a printed circuit board size of a 5¼ inch disk drive or a 3½ inch disk drive. One embodiment of the present invention uses a NOVA-7800-P800 single board computer in a 5¼ inch form factor, with a low power Mobile Pentium-III 800 MHz processor, 512 Mbytes of memory, and on-board interface ports for Firewire (IEEE 1394), LAN, Audio, IDE, PCMCIA and Flash memories. [0463]
  • For some dedicated applications, the entire system, including the hand-held scan head with an integrated display and the portable data/video processing unit, can be operated without any controls other than power on/off. For other applications, the system is equipped with an optional operator interface such as buttons and knobs, either on the processing unit, or integrated in the scan head, or both. The processing unit provides an additional video output to drive an external monitor, or optionally an integrated display on the processing unit itself. [0464]
  • When an external monitor is attached to the processing unit, and/or an integrated monitor is available on the processing unit, the hand-held scan head display may be used in a View-Finder mode wherein the operator uses the smaller hand-held display to locate the region of interest, and uses the larger monitor to observe more detailed images either in real time, or capture-then-review manner. [0465]
  • There are different methods to realize trapezoidal scan format using Linear Arrays. A first method is to perform conventional rectangular linear scanning over the center of the array and scan a portion of a sector field at each end of the array. A second approach is to activate groups of elements of the transducer successively such that the transmitted beams form a sector scan field having the origin point located behind the transducer array. [0466]
  • The second approach has only one coordinate system, thus it simplifies the scan conversion. It also creates a more uniform image over the whole field of view. However, because all the beams originate from one point, it is difficult to have both the steering angle and scan-position (tangential) increments be uniform, as illustrated in FIGS. 54 and 55, respectively. The uniform angular increment approach has higher scan line density in the center of the array than on the sides, so the images on the sides are more degraded. The scan line density of the constant tangential increment approach is uniform, but almost every line has a larger steering angle than with the uniform angular increment, which degrades the image quality especially in the near field. [0467]
  • A preferred embodiment creates trapezoidal image scans using an extension of the two methods. The steered angle of each scan line is determined by a monotonic function of the scan position as illustrated in FIG. 56, or vice versa, which can be written as θ = f(x_s) or x_s = g(θ). By choosing different functions f(x_s) or g(θ), one can obtain different scan patterns. Three different approaches include the following: [0468]
  • 1. Even scan line space and uniform steered angle increment. [0469]
  • If one defines the function θ = f(x_s) = λx_s and has the scan line positions be evenly spaced on the face of the transducer, one obtains both even scan line spacing and a uniform steered angle increment. The scan conversion between the two coordinate systems, i.e. the scan coordinate system (x_s, r) or (θ, r), and the display screen coordinate system (x, y), is given as follows: [0470]
  • 1.1 Scan position x_s and radius r to raster image coordinate (x, y) [0471]
  • x = x_s + r sin(λx_s)
  • y = r cos(λx_s)
  • 1.2 Raster image coordinate (x, y) to scan position x_s and radius r [0472]
  • The x_s can be calculated by solving the following equation [0473]
  • y tan(λx_s) + x_s - x = 0
  • After x_s has been calculated, r = y/cos(λx_s). [0474]
  • This method involves a solution using a numerical method and is computationally expensive. Two alternative approaches are as follows: [0475]
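As a sketch of the numerical step, the inverse mapping of method 1 can be computed by bisection on the equation y tan(λx_s) + x_s - x = 0 (whose left-hand side is monotonic in x_s for y > 0), then recovering r; the function name, tolerance, and test values are illustrative:

```python
import math

def invert_method1(x: float, y: float, lam: float,
                   xs_max: float, tol: float = 1e-9):
    """Recover (xs, r) from raster (x, y) for method 1: solve
    y*tan(lam*xs) + xs - x = 0 for xs by bisection over the transducer
    face [-xs_max, xs_max] (assumes the root is bracketed there and
    |lam*xs| < pi/2), then r = y / cos(lam*xs)."""
    f = lambda xs: y * math.tan(lam * xs) + xs - x
    lo, hi = -xs_max, xs_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    xs = 0.5 * (lo + hi)
    return xs, y / math.cos(lam * xs)

# Round trip: forward-map a known (xs, r), then invert it.
lam, xs0, r0 = 0.01, 5.0, 50.0
x0 = xs0 + r0 * math.sin(lam * xs0)
y0 = r0 * math.cos(lam * xs0)
xs1, r1 = invert_method1(x0, y0, lam, xs_max=20.0)
```

The per-pixel iteration is what makes this approach computationally expensive compared with the closed-form inverses of methods 2 and 3.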
  • 2. Even scan line space with adjustable steering angle. [0476]
  • The scan lines are evenly spaced along the front face of the transducer. The steered angle is defined as θ = arctan(x_s/(β - αx_s)), where β = x_s,max/tan(θ_max) + αx_s,max. [0477]
  • By adjusting the parameter α, one can control the steering angle of each scan line. The scan conversion between the two coordinate systems, i.e. the scan coordinate system (x_s, r) and the display screen coordinate system (x, y), is given as follows: [0478]
  • 2.1 Scan position x_s and radius r to raster image coordinate (x, y): [0479]
  • x = x_s + r sin(arctan(x_s/(β - αx_s)))
  • y = r cos(arctan(x_s/(β - αx_s)))
  • 2.2 Raster image coordinate (x, y) to scan position x_s and radius r: [0480]
  • x_s = 2βx/(y + β + αx + √((y + β + αx)² - 4αβx))
  • r = √(x² + (y + β - αx_s)²) - √(x_s² + (β - αx_s)²)
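The forward and closed-form inverse mappings of the second method can be sketched directly from the formulas above; the round-trip check at the end uses illustrative values of α and β:

```python
import math

def forward2(xs, r, alpha, beta):
    """Scan coords (xs, r) -> raster (x, y): the beam at face position
    xs is steered by theta = arctan(xs / (beta - alpha*xs))."""
    theta = math.atan(xs / (beta - alpha * xs))
    return xs + r * math.sin(theta), r * math.cos(theta)

def inverse2(x, y, alpha, beta):
    """Raster (x, y) -> scan coords (xs, r) in closed form. xs is the
    smaller root of alpha*xs^2 - (alpha*x + beta + y)*xs + beta*x = 0,
    written in rationalized form; r is measured from the transducer
    face along the beam, whose virtual apex lies at (0, -(beta - alpha*xs))."""
    s = y + beta + alpha * x
    xs = 2 * beta * x / (s + math.sqrt(s * s - 4 * alpha * beta * x))
    d = beta - alpha * xs
    r = math.hypot(x, y + d) - math.hypot(xs, d)
    return xs, r

x0, y0 = forward2(10.0, 40.0, alpha=0.1, beta=50.0)
xs1, r1 = inverse2(x0, y0, alpha=0.1, beta=50.0)
```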
  • FIG. 57 shows a comparison of the steering angle using the method of the preferred embodiment and the uniform tangential increment approach. Note that at every beam position, the steering angle using the preferred embodiment approach is smaller than the uniform tangential approach (better near field). FIG. 58 shows more clearly that the preferred embodiment approach can have uniform tangential increment and also nearly uniform steering angle increments. [0481]
  • 3. Uniform steering angle increment with adjustable scan position. [0482]
  • The third method is to increment the steering angle uniformly and to adjust each scan line position by the following function: [0483]
  • x_s = (β - α|tan θ|) tan θ
  • where β = x_s,max/tan(θ_max) + α tan(θ_max) [0484]
  • 3.1 Scan angle θ and radius r to raster image coordinate (x, y): [0485]
  • x = (r + (β - α|tan θ|)/cos θ) sin θ
  • y = r cos θ
  • 3.2 Raster image coordinate (x, y) to scan angle θ and radius r: [0486]
  • θ = arctan(2x/(y + β + √((y + β)² - 4αx)))
  • r = y/cos θ
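Similarly, the third method's mappings can be sketched as follows, using the lateral symmetry of the scan to handle negative x (the values of α and β in the round-trip check are illustrative):

```python
import math

def forward3(theta, r, alpha, beta):
    """Scan coords (theta, r) -> raster (x, y) for a uniform-angle scan
    whose line positions follow xs = (beta - alpha*|tan(theta)|)*tan(theta)."""
    xs = (beta - alpha * abs(math.tan(theta))) * math.tan(theta)
    return xs + r * math.sin(theta), r * math.cos(theta)

def inverse3(x, y, alpha, beta):
    """Raster (x, y) -> scan coords (theta, r). tan(theta) is the smaller
    root of alpha*t^2 - (beta + y)*t + |x| = 0, in rationalized form;
    copysign restores the sign of theta for the left half of the image."""
    s = y + beta
    t = 2 * abs(x) / (s + math.sqrt(s * s - 4 * alpha * abs(x)))
    theta = math.copysign(math.atan(t), x)
    return theta, y / math.cos(theta)

x0, y0 = forward3(0.2, 40.0, alpha=2.0, beta=60.0)
theta1, r1 = inverse3(x0, y0, alpha=2.0, beta=60.0)
```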
  • The claims should not be read as limited to the described order or elements unless stated to that effect. Therefore, all embodiments that come within the scope and spirit of the following claims and equivalents thereto are claimed as the invention. [0487]
  • While this invention has been particularly shown and described with references to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention. [0488]

Claims (154)

What is claimed:
1. A method for providing streaming video in an ultrasonic imaging system comprising:
providing an ultrasonic application server having at least one ultrasonic operation and corresponding ultrasonic data;
sending, from an external application, a command indicative of at least one of the ultrasonic operations;
executing, in the ultrasonic application server, a result corresponding to the command; and
sending data from the ultrasonic application server to the external application.
2. The method of claim 1 further comprising providing a shared memory in communication with the ultrasonic application server and the external application.
3. The method of claim 1 further comprising the steps of:
defining an integrated interface program having a plurality of entry points into the ultrasonic application server, the entry points operable to access each of the at least one ultrasonic operations;
transmitting, via the integrated interface program, the command to the ultrasonic application server;
receiving, over a predetermined communication interface, ultrasonic data indicative of ultrasonic image information; and
transmitting, via the integrated interface program, the result to the external application.
4. The method of claim 3 wherein the integrated interface program is adapted to transmit real-time imaging data including ultrasonic imaging for radiation therapy planning and treatment, minimally invasive and robotic surgery methods including biopsy procedures, catheter introduction for diagnostic and therapeutic angiography, fetal imaging, cardiac imaging, vascular imaging, imaging during endoscopic procedures, imaging for telemedicine applications, imaging for veterinary applications, cryotherapy, and ultrasound elastography.
5. The method of claim 1 wherein the streaming video includes radio frequency data.
6. The method of claim 1 wherein the result is real-time image data and transformation parameters.
7. The method of claim 1 wherein the external application is on a remote computer.
8. The method of claim 7 wherein the remote computer is connected to the ultrasonic application server by a public access network.
9. The method of claim 8 wherein the public access network is the Internet.
10. The method of claim 1 wherein the external application resides on the same computer.
11. The method of claim 1 wherein the command includes an instruction and at least one parameter.
12. The method of claim 1 wherein the external application communicates with the ultrasonic application server using at least one of a control program using a component object model automation interface and a shared memory interface.
13. The method of claim 1 wherein the command includes operations selected from the group consisting of ultrasound application initialization and shutdown functions, ultrasound setup functions and ultrasound image capture functions such as, for example, freeze live data, fetch live data, and resume live imaging.
14. The method of claim 1 wherein control instructions using ActiveX controls are used to transfer data between the ultrasonic application server and the external application.
15. The method of claim 14 wherein the command conforms to transmitting via an integrated communication interface and program which conforms to a predetermined protocol.
16. The method of claim 15 wherein the protocol is TCP/IP.
17. The method of claim 1 wherein the receiving of ultrasonic data further comprises receiving according to a standardized interface.
18. The method of claim 17 wherein the standardized interface is IEEE 1394.
19. The method of claim 1 wherein the ultrasonic application server includes a graphical user interface (GUI).
20. The method of claim 19 wherein the GUI includes image control presets.
21. The method of claim 20 wherein the image control presets are operable to store image settings.
22. The method of claim 21 wherein the image settings include settings selected from the group consisting of application controls, B-mode controls, M-mode controls, image quality controls, and Doppler controls.
23. The method of claim 1 further comprising:
providing a probe housing having a transducer array that is connected to a processing circuit having a beamforming circuit, a memory, a system controller integrated circuit and a digital communication control circuit;
connecting the digital communication control circuit to a personal computer with a standard communication interface; and
transmitting data along the communication interface.
24. The method of claim 23 further comprising providing an interface housing in which the first circuit board assembly and the second board assembly are mounted.
25. The method of claim 23 further comprising providing an interface housing in which a first circuit board assembly having the beamforming circuit and a second circuit board assembly having the memory, controller and communication control circuit are mounted.
26. The method of claim 23 further comprising providing a body mounted personal computer.
27. The method of claim 23 further comprising providing a body mounted interface housing.
28. The method of claim 23 wherein the communication interface is a wireless interface.
29. The method of claim 28 wherein the wireless interface is a RF interface.
30. The method of claim 28 wherein the wireless interface is an infrared interface.
31. The method of claim 23 further comprising the probe housing having a display unit.
32. A system for providing streaming data in an ultrasonic imaging system comprising:
a user computing device having an ultrasonic application server operable to receive and process ultrasonic data via a predetermined interface;
an integrated interface program in communication with the ultrasonic application server and operable to invoke an operation in the ultrasonic application server;
an external application operable to generate a command corresponding to the operation, and further operable to transmit the command to the integrated interface program, the integrated interface program invoking the ultrasonic application server to compute a result in response to the command.
33. The system of claim 32 wherein the integrated interface program is adapted to transmit real-time imaging data including ultrasonic imaging for radiation therapy planning and treatment, minimally invasive and robotic surgery methods including biopsy procedures, catheter introduction for diagnostic and therapeutic angiography, fetal imaging, cardiac imaging, vascular imaging, imaging during endoscopic procedures, imaging for telemedicine applications, imaging for veterinary applications, cryotherapy, and ultrasound elastography.
34. The system of claim 32 wherein the result further comprises real time image data and transformation parameters.
35. The system of claim 32 further comprising a shared memory in communication with the ultrasonic application server and the external application.
36. The system of claim 32 further comprising a remote computer wherein the external application is on a remote computer.
37. The system of claim 32 wherein the streaming data includes radio frequency data.
38. The system of claim 32 further comprising a public access network, the remote computer being connected to the ultrasonic application via the public access network.
39. The system of claim 38 wherein the public access network is the Internet.
40. The system of claim 32 wherein the external application resides on the user computing device.
41. The system of claim 32 wherein the command further comprises an instruction and at least one parameter.
42. The system of claim 41 wherein the command conforms to a predetermined interprocess communication interface.
43. The system of claim 42 wherein the command includes operations selected from the group consisting of ultrasound application initialization and shutdown functions, ultrasound setup functions and ultrasound image capture functions such as, for example, freeze live data, fetch live data, and resume live imaging.
44. The system of claim 32 wherein at least one of the interfaces is a wireless interface.
45. The system of claim 39 wherein the transmitting via the integrated interface program further includes sockets.
46. The system of claim 39 wherein the transmitting via the integrated interface program conforms to a predetermined protocol.
47. The system of claim 40 wherein the predetermined protocol is TCP/IP.
48. The system of claim 32 wherein the external application communicates with the ultrasonic application server using at least one of a control program using a component object model automation interface and a shared memory interface.
49. The system of claim 39 further comprising a standardized interface, wherein the ultrasonic data is received via the standardized interface.
50. The system of claim 42 wherein the standardized interface is IEEE 1394.
51. The system of claim 39 wherein the ultrasonic application server includes a graphical user interface (GUI).
52. The system of claim 51 wherein the GUI includes image control presets.
53. The system of claim 52 wherein the image control presets are operable to store image settings.
54. The system of claim 53 wherein the image settings include settings selected from the group consisting of application controls, B-mode controls, M-mode controls, image quality controls, and Doppler controls.
55. The system of claim 32 further comprising:
a probe housing having a transducer array;
an interface system communicating with the probe housing, the interface system having a beamforming circuit, a memory, a system controller integrated circuit and a communication control circuit connected to the computing device with a standard communication interface.
56. The system of claim 55 further comprising the probe housing having a display unit.
57. The system of claim 55 wherein the interface system has a first circuit board assembly and a second circuit board assembly mounted in an interface housing.
58. The system of claim 57 wherein the first circuit board assembly and the second circuit board assembly are electrically connected by a connector.
59. The system of claim 55 wherein the computing device comprises a body mounted system.
60. The system of claim 55 wherein the memory further comprises a video random access memory (VRAM).
61. The system of claim 55 wherein the memory comprises a synchronous dynamic random access memory (synchDRAM).
62. The system of claim 55 wherein the standard communication interface comprises an IEEE 1394 interface.
63. The system of claim 55 wherein the standard communication interface comprises a universal serial bus (USB) interface.
64. The system of claim 55 wherein the communication system is a wireless interface between the communication control circuit of the interface system and the computing device.
65. A computer program product having computer program instructions for integration of an external application in an ultrasonic imaging system comprising:
computer program instructions for defining an ultrasonic application server having at least one ultrasonic operation;
computer program code for defining an integrated interface program having a plurality of entry points into the ultrasonic application server, the entry points operable to access each of the at least one ultrasonic operations;
computer program code for sending, from the external application, a command indicative of at least one of the ultrasonic operations; and
computer program code for executing, in the ultrasonic application server, a result corresponding to the command.
66. A method of external application integration in an ultrasonic imaging system comprising:
providing an ultrasonic application server having at least one ultrasonic operation;
sending, from the external application, a command indicative of at least one of the ultrasonic operations; and
executing, in the ultrasonic application server, a result corresponding to the command.
67. The method of claim 66 further comprising the steps of:
defining an integrated interface program having a plurality of entry points into the ultrasonic application server, the entry points operable to access each of the at least one ultrasonic operations;
transmitting, via the integrated interface program, the command to the ultrasonic application server;
receiving, over a predetermined communication interface, ultrasonic data indicative of ultrasonic image information; and
transmitting, via the integrated interface program, the result to the external application.
68. The method of claim 67 wherein the integrated interface program is adapted to transmit information pertinent to data selected from the group consisting of radiation therapy, fetal images, cardiac images, and image guided surgery.
69. The method of claim 66 wherein the result is image data and transformation parameters.
70. The method of claim 66 wherein the external application is on a remote computer.
71. The method of claim 70 wherein the remote computer is connected to the ultrasonic application server by a public access network.
72. The method of claim 71 wherein the public access network is the Internet.
73. The method of claim 66 wherein the external application is on the same computer.
74. The method of claim 66 wherein the command includes an instruction and at least one parameter.
75. The method of claim 66 wherein the command conforms to a predetermined interprocess communication interface.
76. The method of claim 66 wherein the command includes operations selected from the group consisting of freeze live data, fetch live data, export image, exit, initialize, and get status.
77. The method of claim 66 wherein the transmitting via the integrated interface program employs sockets.
78. The method of claim 77 wherein the transmitting via the integrated interface program conforms to a predetermined protocol.
79. The method of claim 78 wherein the protocol is TCP/IP.
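As a non-authoritative illustration of the socket-based interprocess command interface recited in claims 74-79, the following minimal sketch sends one of the claim-76 command strings over a TCP connection and returns the server's result. The wire format, function names, and "OK"/"ERR" replies are assumptions for illustration, not the patented protocol.

```python
import socket
import threading

# Command vocabulary taken from the claim language (claim 76).
COMMANDS = {"freeze live data", "fetch live data", "export image",
            "exit", "initialize", "get status"}

def serve_once(server_sock):
    """Accept one connection, read a command, reply with a result string."""
    conn, _ = server_sock.accept()
    with conn:
        command = conn.recv(1024).decode().strip()
        if command in COMMANDS:
            conn.sendall(b"OK " + command.encode())
        else:
            conn.sendall(b"ERR unknown command")

def send_command(host, port, command):
    """External-application side: transmit a command over TCP, return the result."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(command.encode() + b"\n")
        return sock.recv(1024).decode()

if __name__ == "__main__":
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))            # ephemeral local port
    server.listen(1)
    port = server.getsockname()[1]
    threading.Thread(target=serve_once, args=(server,), daemon=True).start()
    print(send_command("127.0.0.1", port, "get status"))  # prints "OK get status"
```

Because the claims only require conformance to "a predetermined protocol" (TCP/IP in claim 79), any framing with an instruction plus parameters (claim 74) would satisfy the same pattern.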
80. The method of claim 66 wherein the receiving of ultrasonic data further comprises receiving according to a standardized interface.
81. The method of claim 80 wherein the standardized interface is IEEE 1394.
82. The method of claim 66 wherein the ultrasonic application server includes a graphical user interface (GUI).
83. The method of claim 82 wherein the GUI includes image control presets.
84. The method of claim 83 wherein the image control presets are operable to store image settings.
85. The method of claim 84 wherein the image settings include settings selected from the group consisting of size, depth, focus, time gain compensation (TGC) and TGC lock.
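Claims 83-85 describe GUI image control presets that store image settings (size, depth, focus, TGC, TGC lock). A minimal sketch of such a preset store follows; the field names, units, and defaults are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass, asdict

@dataclass
class ImagePreset:
    """Hypothetical preset record for the claim-85 image settings."""
    size: int = 128            # scan width, assumed unit
    depth_cm: float = 12.0     # imaging depth
    focus_cm: float = 6.0      # focal depth
    tgc: tuple = (0.0,) * 8    # time gain compensation slider curve
    tgc_lock: bool = False     # claim-85 "TGC lock"

presets = {}

def store_preset(name, preset):
    # Store a plain-dict snapshot so later edits to the object don't leak in.
    presets[name] = asdict(preset)

def recall_preset(name):
    return ImagePreset(**presets[name])

store_preset("abdominal", ImagePreset(depth_cm=16.0))
assert recall_preset("abdominal").depth_cm == 16.0
```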
86. The method of claim 66 further comprising:
providing a probe housing having a transducer array that is connected to a processing circuit having a beamforming circuit, a memory, a system controller integrated circuit and a digital communication control circuit;
connecting the digital communication control circuit to a personal computer with a standard communication interface; and
transmitting data along the communication interface.
87. The method of claim 86 further comprising the probe housing having a display unit.
88. The method of claim 86 further comprising providing an interface housing in which a first circuit board assembly and a second circuit board assembly are mounted.
89. The method of claim 86 further comprising providing an interface housing in which a first circuit board assembly having the beamforming circuit and a second circuit board assembly having the memory, controller and communication control circuit are mounted.
90. The method of claim 86 further comprising providing a body mounted personal computer.
91. The method of claim 86 further comprising providing a body mounted interface housing.
92. The method of claim 86 wherein the communication interface is a wireless interface.
93. The method of claim 92 wherein the wireless interface is a RF interface.
94. The method of claim 92 wherein the wireless interface is an infrared interface.
95. A system for external application integration in an ultrasonic imaging system comprising:
a user computing device having an ultrasonic application server operable to receive and process ultrasonic data via a predetermined interface;
an integrated interface program in communication with the ultrasonic application server and operable to invoke operations in the ultrasonic application server;
an external application operable to generate a command corresponding to the operations, and further operable to transmit the command to the integrated interface program, wherein the integrated interface program invokes the ultrasonic application server to compute a result in response to the command, and transmits the result back to the external application.
96. The system of claim 95 wherein the integrated interface program is adapted to transmit information pertaining to data selected from the group consisting of radiation therapy data, fetal images, cardiac images, and image guided surgery.
97. The system of claim 95 wherein the result further comprises image data and transformation parameters.
98. The system of claim 95 further comprising a remote computer wherein the external application is on a remote computer.
99. The system of claim 98 further comprising a public access network, wherein the remote computer is connected to the ultrasonic application server via the public access network.
100. The system of claim 99 wherein the public access network is the Internet.
101. The system of claim 95 wherein the external application is on the same computer.
102. The system of claim 95 wherein the command further comprises an instruction and at least one parameter.
103. The system of claim 102 wherein the command conforms to a predetermined interprocess communication interface.
104. The system of claim 103 wherein the command includes operations selected from the group consisting of freeze live data, fetch live data, export image, exit, initialize, and get status.
105. The system of claim 95 wherein at least one of the interfaces is a wireless interface.
106. The system of claim 100 wherein the transmitting via the integrated interface program further includes sockets.
107. The system of claim 100 wherein the transmitting via the integrated interface program conforms to a predetermined protocol.
108. The system of claim 107 wherein the predetermined protocol is TCP/IP.
109. The system of claim 100 further comprising a standardized interface, wherein the ultrasonic data is received via the standardized interface.
110. The system of claim 109 wherein the standardized interface is IEEE 1394.
111. The system of claim 100 wherein the ultrasonic application server includes a graphical user interface (GUI).
112. The system of claim 111 wherein the GUI includes image control presets.
113. The system of claim 95 further comprising:
a probe housing having a transducer array;
an interface system communicating with the probe housing, the interface system having a beamforming circuit, a memory, a system controller integrated circuit and a communication control circuit connected to the computing device with a standard communication interface.
114. The system of claim 113 wherein the interface system has a first circuit board assembly and a second circuit board assembly mounted in an interface housing.
115. The system of claim 114 wherein the first circuit board assembly and the second circuit board assembly are electrically connected by a connector.
116. The system of claim 113 wherein the computing device comprises a body mounted system.
117. The system of claim 113 wherein the memory further comprises a video random access memory (VRAM).
118. The system of claim 113 wherein the probe housing further comprises a display unit.
119. The system of claim 113 wherein the standard communication interface comprises an IEEE 1394 interface.
120. The system of claim 113 wherein the standard communication interface comprises a universal serial bus (USB) interface.
121. The system of claim 113 wherein the standard communication interface is a wireless interface between the communication control circuit of the interface system and the computing device.
122. A system for accessing and displaying ultrasonic imaging data, comprising:
a portable information device having a wireless interface port; and
a computing device having a wireless interface port being operable to communicate with the portable information device and able to respond to requests for transmitting ultrasonic imaging data to the portable information device.
123. The system of claim 122 further comprising a standardized interface in the portable information device and the computing device, wherein the ultrasonic data is requested and received via the standardized interface.
124. The system of claim 123 wherein the standardized interface is an IEEE 1394 interface.
125. The system of claim 122 further comprising:
a probe having a transducer array, a beamforming circuit, a memory, a system controller integrated circuit and a communication control circuit, the probe being operable to communicate with at least one of the computing device and the portable information device.
126. A method for accessing and displaying ultrasonic imaging data comprising the steps of:
sending a connection request from a portable information device to a computing device in communication with a probe having a transducer array, control circuitry and communication control circuitry;
accepting and responding to the connection request by the computing device; and
sending ultrasonic imaging data from the computing device to the portable information device.
127. The method of claim 126, wherein the portable information device and the computing device communicate amongst each other using at least one wireless link.
128. The method of claim 127, wherein the wireless link is a radio-frequency (RF) communication link.
129. A computer readable medium having stored therein a set of instructions for causing a processing unit to execute the steps of the method of claim 126.
130. A portable information device for accessing and displaying ultrasonic imaging data, comprising:
an interface for receiving a user input;
a wireless transceiver for transmitting and receiving wireless messages; and
a processor coupled to the interface and the wireless transceiver, the processor being configured to execute an ultrasonic imaging application, the processor being further configured to format a wireless message in accordance with a predetermined protocol suitable for communication with a computing device having ultrasonic imaging data, the processor sending a wireless connection message to the wireless transceiver for transmission.
131. The portable information device of claim 130, wherein the wireless transceiver further comprises a radio frequency transmitter and receiver and wherein the predetermined protocol further comprises a radio frequency protocol such as one compliant with IEEE 1394.
132. The portable information device of claim 130 further comprising at least one of controlling and monitoring a plurality of operations of the collection and processing of the ultrasonic imaging data in at least one of a probe assembly and a computing device.
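Claim 130 recites a processor that formats a wireless connection message "in accordance with a predetermined protocol" before handing it to the transceiver. A sketch of one possible framing is shown below; the header layout (magic value, message type, payload length) is entirely an assumption for illustration, since the claims leave the protocol unspecified.

```python
import struct

# Hypothetical frame header: 2-byte magic, 1-byte type, 2-byte payload length,
# all in network byte order. These values are illustrative, not from the patent.
MAGIC = 0x55AB
MSG_CONNECT = 1   # the claim-130 "wireless connection message"

def pack_message(msg_type, payload=b""):
    """Format a message frame for transmission by the wireless transceiver."""
    return struct.pack("!HBH", MAGIC, msg_type, len(payload)) + payload

def unpack_message(frame):
    """Parse a received frame back into (message type, payload)."""
    magic, msg_type, length = struct.unpack("!HBH", frame[:5])
    if magic != MAGIC:
        raise ValueError("bad magic value in frame header")
    return msg_type, frame[5:5 + length]
```

A connection request would then be `pack_message(MSG_CONNECT, b"probe-id")`, which the computing device of claim 126 could parse with `unpack_message` before replying with imaging data.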
133. An ultrasonic imaging system comprising:
a probe having a transducer array, control circuitry and a digital communication control circuit, the control circuitry including a transmit/receive module, a beamforming module and a system controller; and
a computing device connected to the digital communication control circuit of the probe with a communication interface, the computing device processing display data.
134. The system of claim 133 wherein the communication interface between the probe and the computing device is a wireless interface.
135. The system of claim 134 wherein the wireless interface is a RF interface.
136. The system of claim 134 wherein the wireless interface is an infrared interface.
137. The system of claim 133 wherein the communication interface between the probe and the computing device is a wired link.
138. The system of claim 133 wherein the beamforming module is a charge domain processor beamforming module and wherein the control circuitry further comprises a pre-amp/TGC module.
139. The system of claim 133 further comprising a supplemental display device connected to the computing device by a second communication interface.
140. The system of claim 139 wherein the supplemental display device is a computing device.
141. The system of claim 139 wherein the communication interface between the probe and the computing device is a wireless interface and the second communication interface between the supplemental display device and the computing device is wireless.
142. The system of claim 139 wherein the second communication interface includes a hub to connect a plurality of secondary supplemental devices.
143. The system of claim 139 wherein the computing device has a velocity estimation system further comprising at least one of color-flow imaging, power-Doppler and spectral sonogram.
144. The system of claim 133 wherein the probe has a display integral with a probe housing.
145. An ultrasound imaging system, comprising:
a probe housing having a transducer array and a display unit; and
a processing unit in communication with the probe housing.
146. The ultrasound imaging system of claim 145 wherein the display unit is at least a 2.5 inch color liquid crystal display (LCD) panel.
147. The ultrasound imaging system of claim 145 wherein the screen resolution ranges from a quarter VGA (320×240 pixels) to full VGA (640×480 pixels).
148. The ultrasound imaging system of claim 145 wherein the display unit is integrated into the probe housing.
149. The ultrasound imaging system of claim 145 wherein the display unit is detachably mounted onto the probe housing.
150. The ultrasound imaging system of claim 145 wherein the processing unit comprises a single board computer and beamforming electronics.
151. The ultrasound imaging system of claim 150 wherein the beamforming electronics and the single board computer interface via a standardized Firewire connection.
152. The ultrasound imaging system of claim 150 wherein the single board computer comprises a processor, a memory unit and a plurality of interface ports.
153. The ultrasound imaging system of claim 152 wherein the interface ports comprise an IEEE 1394 port, LAN, Audio, IDE, PCMCIA and Flash memories.
154. The ultrasound imaging system of claim 145 further comprising a trapezoidal display.
US10/386,360 1999-06-22 2003-03-11 Ultrasound probe with integrated electronics Abandoned US20040015079A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/386,360 US20040015079A1 (en) 1999-06-22 2003-03-11 Ultrasound probe with integrated electronics
US13/846,231 US11547382B2 (en) 1999-06-22 2013-03-18 Networked ultrasound system and method for imaging a medical procedure using an invasive probe

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US14043099P 1999-06-22 1999-06-22
US09/449,780 US6530887B1 (en) 1996-12-24 1999-11-26 Ultrasound probe with integrated electronics
PCT/US2000/017236 WO2000079300A1 (en) 1999-06-22 2000-06-22 Ultrasound probe with integrated electronics
US09/791,491 US6783493B2 (en) 1999-06-22 2001-02-22 Ultrasound probe with integrated electronics
US09/822,764 US6669633B2 (en) 1999-06-22 2001-03-30 Unitary operator control for ultrasonic imaging graphical user interface
PCT/US2002/005764 WO2002068992A2 (en) 1999-06-22 2002-02-22 Ultrasound probe with integrated electronics
US10/094,950 US6969352B2 (en) 1999-06-22 2002-03-11 Ultrasound probe with integrated electronics
US10/354,946 US9402601B1 (en) 1999-06-22 2003-01-30 Methods for controlling an ultrasound imaging procedure and providing ultrasound images to an external non-ultrasound application via a network
US10/386,360 US20040015079A1 (en) 1999-06-22 2003-03-11 Ultrasound probe with integrated electronics

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/354,946 Continuation-In-Part US9402601B1 (en) 1999-06-22 2003-01-30 Methods for controlling an ultrasound imaging procedure and providing ultrasound images to an external non-ultrasound application via a network

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/846,231 Continuation US11547382B2 (en) 1999-06-22 2013-03-18 Networked ultrasound system and method for imaging a medical procedure using an invasive probe

Publications (1)

Publication Number Publication Date
US20040015079A1 true US20040015079A1 (en) 2004-01-22

Family

ID=30449718

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/386,360 Abandoned US20040015079A1 (en) 1999-06-22 2003-03-11 Ultrasound probe with integrated electronics
US13/846,231 Active 2024-08-05 US11547382B2 (en) 1999-06-22 2013-03-18 Networked ultrasound system and method for imaging a medical procedure using an invasive probe

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/846,231 Active 2024-08-05 US11547382B2 (en) 1999-06-22 2013-03-18 Networked ultrasound system and method for imaging a medical procedure using an invasive probe

Country Status (1)

Country Link
US (2) US20040015079A1 (en)

Cited By (320)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020107445A1 (en) * 1999-03-11 2002-08-08 Assaf Govari Implantable and insertable passive tags
US20030004411A1 (en) * 1999-03-11 2003-01-02 Assaf Govari Invasive medical device with position sensing and display
US20030018246A1 (en) * 1999-03-11 2003-01-23 Assaf Govari Guidance of invasive medical procedures using implantable tags
US20030050557A1 (en) * 1998-11-04 2003-03-13 Susil Robert C. Systems and methods for magnetic-resonance-guided interventional procedures
US20040024306A1 (en) * 2002-07-29 2004-02-05 Hamilton Craig A. Cardiac diagnostics using time compensated stress test cardiac MRI imaging and systems for cardiac diagnostics
US20040227723A1 (en) * 2003-05-16 2004-11-18 Fisher-Rosemount Systems, Inc. One-handed operation of a handheld field maintenance tool
US20040242998A1 (en) * 2003-05-29 2004-12-02 Ge Medical Systems Global Technology Company, Llc Automatic annotation filler system and method for use in ultrasound imaging
US20050050403A1 (en) * 2003-08-26 2005-03-03 Frank Glaser Method for requesting information regarding a network subscriber station in a network of distributed stations, and network subscriber station for carrying out the method
US20050114175A1 (en) * 2003-11-25 2005-05-26 O'dea Paul J. Method and apparatus for managing ultrasound examination information
US20050113689A1 (en) * 2003-11-21 2005-05-26 Arthur Gritzky Method and apparatus for performing multi-mode imaging
US20050131279A1 (en) * 2003-04-01 2005-06-16 Boston Scientific Scimed, Inc. Articulation joint for video endoscope
US20050148878A1 (en) * 2003-12-19 2005-07-07 Siemens Medical Solutions Usa, Inc.. Probe based digitizing or compression system and method for medical ultrasound
US20050148873A1 (en) * 2003-12-19 2005-07-07 Siemens Medical Solutions Usa, Inc. Ultrasound adaptor methods and systems for transducer and system separation
US20050197536A1 (en) * 2003-04-01 2005-09-08 Banik Michael S. Video endoscope
US20050228287A1 (en) * 2004-04-08 2005-10-13 Sonosite, Inc. Systems and methods providing ASICs for use in multiple applications
US20050234321A1 (en) * 2004-03-26 2005-10-20 Fuji Photo Film Co., Ltd. Diagnostic support system and method used for the same
US20050246064A1 (en) * 2004-04-29 2005-11-03 Smith Gregory C Method for detecting position errors using a motion detector
US20060029901A1 (en) * 2004-07-02 2006-02-09 Discus Dental Impressions, Inc. Light guide for dentistry applications
US20060034538A1 (en) * 2004-07-23 2006-02-16 Agfa Corporation Method and system of automating echocardiogram measurements and reporting
US20060084875A1 (en) * 2004-10-14 2006-04-20 Scimed Life Systems, Inc. Integrated bias circuitry for ultrasound imaging devices
US20060092893A1 (en) * 2004-11-03 2006-05-04 Mark Champion Method and system for processing wireless digital multimedia
US20060092930A1 (en) * 2004-10-28 2006-05-04 General Electric Company Ultrasound beamformer with high speed serial control bus packetized protocol
US20060116579A1 (en) * 2004-11-29 2006-06-01 Pai-Chi Li Ultrasound imaging apparatus and method thereof
US20060122489A1 (en) * 2004-06-23 2006-06-08 Makoto Kato Vascular endothelial reactivity measuring apparatus and method for controlling the measuring apparatus
EP1672552A1 (en) * 2005-08-02 2006-06-21 Agilent Technologies Inc Independently installable component for measurement device
US20060173335A1 (en) * 2005-01-11 2006-08-03 General Electric Company Ultrasound beamformer with scalable receiver boards
US20060203963A1 (en) * 2005-02-14 2006-09-14 Helmut Biedermann Medical apparatus system, and method for operation thereof
WO2006111872A2 (en) * 2005-04-18 2006-10-26 Koninklijke Philips Electronics, N.V. Pc-based portable ultrasonic diagnostic imaging system
US20060293594A1 (en) * 2005-06-24 2006-12-28 Siemens Aktiengesellschaft Device for carrying out intravascular examinations
US7162462B1 (en) * 2003-03-14 2007-01-09 Unisys Corporation Providing time sensitivity to an inference engine
US20070011118A1 (en) * 2005-06-28 2007-01-11 Snook James A Addressing Scheme for Neural Modeling and Brain-Based Devices using Special Purpose Processor
US20070016027A1 (en) * 2005-07-14 2007-01-18 Marco Gerois D Method and apparatus for utilizing a high speed serial data bus interface within an ultrasound system
US20070038086A1 (en) * 2005-06-20 2007-02-15 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and method of ultrasonic measurement
US20070043454A1 (en) * 2005-08-22 2007-02-22 John Sonnenberg Multi-function remote controller and programmer for landscape systems
US20070066894A1 (en) * 2003-03-28 2007-03-22 Koninklijke Philips Electronics N.V. Remote wireless control device for an ultrasound machine and method
US20070088416A1 (en) * 2001-04-13 2007-04-19 Surgi-Vision, Inc. Mri compatible medical leads
US20070109294A1 (en) * 2003-11-26 2007-05-17 Koninklijke Philips Electronics Nv Workflow optimization for high thoughput imaging enviroments
WO2007067200A2 (en) * 2005-10-26 2007-06-14 Aloka Co., Ltd. Method and apparatus for elasticity imaging
US20070161904A1 (en) * 2006-11-10 2007-07-12 Penrith Corporation Transducer array imaging system
US20070167766A1 (en) * 2005-12-26 2007-07-19 Masao Takimoto Ultrasonic diagnostic apparatus
EP1815796A1 (en) * 2004-11-17 2007-08-08 Hitachi Medical Corporation Ultrasonograph and ultrasonic image display method
US20070195539A1 (en) * 2006-02-21 2007-08-23 Karl Storz Gmbh & Co. Kg Ultra wide band wireless optical endoscopic device
US20070258632A1 (en) * 2006-05-05 2007-11-08 General Electric Company User interface and method for identifying related information displayed in an ultrasound system
US20070282201A1 (en) * 2006-05-03 2007-12-06 Nam Ju Kim Ultrasonic moving-picture real-time service system and method and recording medium having embodied thereon computer program for performing method
US20080027322A1 (en) * 2004-02-26 2008-01-31 Siemens Medical Solutions Usa, Inc. Steered continuous wave doppler methods and systems for two-dimensional ultrasound transducer arrays
US20080058635A1 (en) * 1998-11-04 2008-03-06 Johns Hopkins University School Of Medicine Mri-guided therapy methods and related systems
US20080086054A1 (en) * 2006-10-04 2008-04-10 Slayton Michael H Ultrasound system and method for imaging and/or measuring displacement of moving tissue and fluid
US20080104533A1 (en) * 2006-10-31 2008-05-01 Steffen List Method and system for generation of a user interface
US20080114248A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080114241A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080114253A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080114251A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080114245A1 (en) * 2006-11-10 2008-05-15 Randall Kevin S Transducer array imaging system
US20080114247A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080125655A1 (en) * 2006-11-23 2008-05-29 Medison Co., Ltd. Portable ultrasound system
US20080144777A1 (en) * 2006-12-14 2008-06-19 Wilson Kevin S Portable digital radiographic devices
US20080146925A1 (en) * 2006-12-14 2008-06-19 Ep Medsystems, Inc. Integrated Electrophysiology and Ultrasound Imaging System
EP1935343A1 (en) * 2006-12-18 2008-06-25 Esaote S.p.A. Ergonomic housing for electroacoustic transducers and ultrasound probe with said housing
US20080194951A1 (en) * 2005-04-18 2008-08-14 Koninklijke Philips Electronics N.V. Ultrasonic Diagnostic Imaging System Configured By Probe Firmware
US20080194964A1 (en) * 2007-02-08 2008-08-14 Randall Kevin S Ultrasound imaging systems
US20080194920A1 (en) * 2007-02-09 2008-08-14 Siemens Aktiengesellschaft Methods for determining parameters and planning clinical studies in automatic study and data management systems
US20080208061A1 (en) * 2007-02-23 2008-08-28 General Electric Company Methods and systems for spatial compounding in a handheld ultrasound device
US20080208045A1 (en) * 2005-05-10 2008-08-28 Koninklijke Philips Electronics N.V. Optimization of User Settings for an Ultrasonic Imaging System
US20080255451A1 (en) * 2007-04-10 2008-10-16 C.R. Bard, Inc. Low power ultrasound system
US20080287789A1 (en) * 2007-05-14 2008-11-20 Sonosite, Inc. Computed volume sonography
US20080287780A1 (en) * 2007-05-16 2008-11-20 James Geoffrey Chase Integral based parameter identification applied to three dimensional tissue stiffness reconstruction in a digital image-based elasto-tomography system
US20080287807A1 (en) * 2007-05-16 2008-11-20 James Geoffrey Chase Global motion invariant signatures for fast and accurate motion tracking in a digital image-based elasto-tomography system
US20080314530A1 (en) * 2007-06-22 2008-12-25 Li-Ming Cheng Window coverings
US20090043196A1 (en) * 2007-08-08 2009-02-12 Aloka Co., Ltd. Ultrasound diagnosis apparatus
WO2009021179A1 (en) * 2007-08-09 2009-02-12 Volcano Corporation Controller user interface for a catheter lab intravascular ultrasound system
US20090062648A1 (en) * 2007-08-29 2009-03-05 Siemens Medical Solutions Usa, Inc. Automatic gain control in medical diagnostic ultrasound imaging
US20090062644A1 (en) * 2002-06-07 2009-03-05 Mcmorrow Gerald System and method for ultrasound harmonic imaging
WO2009012298A3 (en) * 2007-07-16 2009-03-26 Sunrise Medical Hhg Inc Physiological data collection system
US20090088774A1 (en) * 2007-09-30 2009-04-02 Nitish Swarup Apparatus and method of user interface with alternate tool mode for robotic surgical tools
US20090105585A1 (en) * 2007-05-16 2009-04-23 Yanwei Wang System and method for ultrasonic harmonic imaging
US20090112070A1 (en) * 2007-10-31 2009-04-30 Yen-Shan Lin Telemedicine Device and System
US20090112093A1 (en) * 2007-10-25 2009-04-30 Medison Co., Ltd. Ultrasound diagnostic device and method for forming scan line data
US20090141631A1 (en) * 2007-12-03 2009-06-04 Nec Laboratories America, Inc. Voice adaptive gateway pacing methods and systems for wireless multi-hop networks
US20090149758A1 (en) * 2000-12-13 2009-06-11 Leonard Smith Gain Setting in Doppler Haemodynamic Monitors
US20090149750A1 (en) * 2005-03-30 2009-06-11 Hitachi Medical Corporation Ultrasonic Diagnostic Apparatus
US20090227873A1 (en) * 2004-04-19 2009-09-10 Koninklijke Philips Electronics, N.V. Data visualization method for an ultrasound imaging system
US20090240642A1 (en) * 2005-06-28 2009-09-24 Neurosciences Research Foundation, Inc. Neural modeling and brain-based devices using special purpose processor
US20090247871A1 (en) * 2008-03-25 2009-10-01 Tomy Varghese Rapid two/three-dimensional sector strain imaging
US20090306518A1 (en) * 2008-06-06 2009-12-10 Boston Scientific Scimed, Inc. Transducers, devices and systems containing the transducers, and methods of manufacture
US20090307619A1 (en) * 2008-06-05 2009-12-10 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US20100016763A1 (en) * 2008-07-17 2010-01-21 Keilman George W Intraluminal fluid property status sensing system and method
US20100056922A1 (en) * 2008-09-02 2010-03-04 Thierry Florent Method and diagnostic ultrasound apparatus for determining the condition of a person's artery or arteries
US20100094131A1 (en) * 2006-10-02 2010-04-15 Washington, University Of Ultrasonic estimation of strain induced by in vivo compression
US20100118482A1 (en) * 2008-11-13 2010-05-13 Mosaid Technologies Incorporated System including a plurality of encapsulated semiconductor chips
US20100121195A1 (en) * 2008-11-13 2010-05-13 Kang Hak Il Medical instrument
US20100125205A1 (en) * 2008-11-20 2010-05-20 Sang Shik Park Adaptive Persistence Processing Of Elastic Images
US20100145195A1 (en) * 2008-12-08 2010-06-10 Dong Gyu Hyun Hand-Held Ultrasound System
US20100160997A1 (en) * 2001-04-13 2010-06-24 Greatbatch Ltd. Tuned energy balanced system for minimizing heating and/or to provide emi protection of implanted leads in a high power electromagnetic field environment
US20100168821A1 (en) * 2001-04-13 2010-07-01 Greatbatch Ltd. Switched diverter circuits for minimizing heating of an implanted lead in a high power electromagnetic field environment
US20100183208A1 (en) * 2009-01-21 2010-07-22 Kabushiki Kaisha Toshiba Image display method, medical diagnostic imaging apparatus, and medical image processing apparatus
US20100191236A1 (en) * 2001-04-13 2010-07-29 Greatbatch Ltd. Switched diverter circuits for minimizing heating of an implanted lead and/or providing emi protection in a high power electromagnetic field environment
US20100187304A1 (en) * 2009-01-29 2010-07-29 Searete Llc, A Limited Liability Corporation Of The State Delaware Diagnostic delivery service
US20100191094A1 (en) * 2009-01-29 2010-07-29 Searete Llc Diagnostic delivery service
US20100208397A1 (en) * 2008-12-17 2010-08-19 Greatbatch Ltd. Switched safety protection circuit for an aimd system during exposure to high power electromagnetic fields
US20100217262A1 (en) * 2001-04-13 2010-08-26 Greatbatch Ltd. Frequency selective passive component networks for active implantable medical devices utilizing an energy dissipating surface
US20100222856A1 (en) * 2006-06-08 2010-09-02 Greatbatch Ltd. Band stop filter employing a capacitor and an inductor tank circuit to enhance MRI compatibility of active medical devices
US20100228130A1 (en) * 2009-03-09 2010-09-09 Teratech Corporation Portable ultrasound imaging system
US20100249598A1 (en) * 2009-03-25 2010-09-30 General Electric Company Ultrasound probe with replaceable head portion
US20100256448A1 (en) * 2003-04-01 2010-10-07 Boston Scientific Scimed, Inc. Fluid manifold for endoscope system
WO2010114573A1 (en) * 2009-04-01 2010-10-07 Analogic Corporation Ultrasound probe
US20100292571A1 (en) * 2009-05-13 2010-11-18 Washington, University Of Nodule screening using ultrasound elastography
US20100295870A1 (en) * 2009-05-22 2010-11-25 Amir Baghdadi Multi-source medical imaging system
US20100312092A1 (en) * 2004-02-09 2010-12-09 Roch Listz Maurice Method and system for vascular elastography
US20100324423A1 (en) * 2009-06-23 2010-12-23 Essa El-Aklouk Ultrasound transducer device and method of operation
US20100331694A1 (en) * 2008-02-07 2010-12-30 Koji Waki Ultrasonic diagnostic apparatus.
US20110019893A1 (en) * 2009-07-22 2011-01-27 Norbert Rahn Method and Device for Controlling the Ablation Energy for Performing an Electrophysiological Catheter Application
US20110054349A1 (en) * 2007-12-27 2011-03-03 Devicor Medical Products, Inc. Clutch and valving system for tetherless biopsy device
US20110087091A1 (en) * 2009-10-14 2011-04-14 Olson Eric S Method and apparatus for collection of cardiac geometry based on optical or magnetic tracking
US20110105904A1 (en) * 2009-04-24 2011-05-05 Yasuhito Watanabe Wireless ultrasonic diagnostic apparatus, wireless ultrasonic probe, and probe authentication method
US20110112778A1 (en) * 2009-11-12 2011-05-12 Medison Co., Ltd. Ultrasound system and method for providing doppler sound
US20110160582A1 (en) * 2008-04-29 2011-06-30 Yongping Zheng Wireless ultrasonic scanning system
USD640977S1 (en) 2009-09-25 2011-07-05 C. R. Bard, Inc. Charging station for a battery operated biopsy device
US8002713B2 (en) 2002-03-19 2011-08-23 C. R. Bard, Inc. Biopsy device and insertable biopsy needle module
US8012102B2 (en) 2005-01-31 2011-09-06 C. R. Bard, Inc. Quick cycle biopsy system
US20110218436A1 (en) * 2010-03-06 2011-09-08 Dewey Russell H Mobile ultrasound system with computer-aided detection
US8016772B2 (en) 2002-03-19 2011-09-13 C. R. Bard, Inc. Biopsy device for removing tissue specimens using a vacuum
US20110224552A1 (en) * 2008-12-03 2011-09-15 Koninklijke Philips Electronics N.V. Ultrasound assembly and system comprising interchangable transducers and displays
USRE42856E1 (en) 2002-05-29 2011-10-18 MRI Interventions, Inc. Magnetic resonance probes
US8052615B2 (en) 2004-07-09 2011-11-08 Bard Peripheral Vascular, Inc. Length detection system for biopsy device
US20110276283A1 (en) * 2006-12-07 2011-11-10 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US8095224B2 (en) 2009-03-19 2012-01-10 Greatbatch Ltd. EMI shielded conduit assembly for an active implantable medical device
US20120041278A1 (en) * 2010-03-12 2012-02-16 Rajendra Padma Sadhu User wearable portable communication device
US8118732B2 (en) 2003-04-01 2012-02-21 Boston Scientific Scimed, Inc. Force feedback control system for video endoscope
US20120057009A1 (en) * 2010-09-03 2012-03-08 Liao Shih-Wen Wireless Endoscope Apparatus
US20120092527A1 (en) * 2010-10-18 2012-04-19 General Electric Company Method for Multiple Image Parameter Adjustment Based on Single User Input
US8162851B2 (en) 2003-03-29 2012-04-24 C. R. Bard, Inc. Biopsy needle system having a pressure generating unit
US8228347B2 (en) 2006-05-08 2012-07-24 C. R. Bard, Inc. User interface and methods for sonographic display device
US20120197124A1 (en) * 2011-02-01 2012-08-02 Fujifilm Corporation Ultrasound diagnostic apparatus
US8251917B2 (en) 2006-08-21 2012-08-28 C. R. Bard, Inc. Self-contained handheld biopsy needle
US8262585B2 (en) 2005-08-10 2012-09-11 C. R. Bard, Inc. Single-insertion, multiple sampling biopsy device with linear drive
US8262586B2 (en) 2006-10-24 2012-09-11 C. R. Bard, Inc. Large sample low aspect ratio biopsy needle
US20120232397A1 (en) * 2011-03-11 2012-09-13 Fujifilm Corporation Ultrasound probe and ultrasound diagnostic apparatus
US20120232540A1 (en) * 2011-03-10 2012-09-13 Thomas Baur Surgical instrument with digital data interface
US8267868B2 (en) 2005-08-10 2012-09-18 C. R. Bard, Inc. Single-insertion, multiple sample biopsy device with integrated markers
US8282574B2 (en) 2005-08-10 2012-10-09 C. R. Bard, Inc. Single-insertion, multiple sampling biopsy device usable with various transport systems and integrated markers
US8312771B2 (en) 2006-11-10 2012-11-20 Siemens Medical Solutions Usa, Inc. Transducer array imaging system
WO2012158720A1 (en) * 2011-05-15 2012-11-22 Spacelabs Healthcare, Llc User configurable central monitoring station
US20120296210A1 (en) * 2007-08-10 2012-11-22 Ultrasonix Medical Corporation Wireless network having portable ultrasound devices
US20120324397A1 (en) * 2011-06-20 2012-12-20 Tabb Alan Patz System and method for wireless interaction with medical image data
US20130028153A1 (en) * 2011-07-25 2013-01-31 Samsung Electronics Co., Ltd. Wireless communication method of probe for ultrasound diagnosis and apparatus therefor
US20130079630A1 (en) * 2011-09-28 2013-03-28 Terumo Kabushiki Kaisha Imaging apparatus for diagnosis
US8430824B2 (en) 2009-10-29 2013-04-30 Bard Peripheral Vascular, Inc. Biopsy driver assembly having a control circuit for conserving battery power
CN103140162A (en) * 2010-09-29 2013-06-05 Olympus Medical Systems Corp. Medical system, medical system communications method, medical image photography device, and server
US20130144165A1 (en) * 2010-06-09 2013-06-06 Emad S. Ebbini Dual mode ultrasound transducer (dmut) system and method for controlling delivery of ultrasound therapy
US20130150721A1 (en) * 2010-12-24 2013-06-13 Panasonic Corporation Ultrasound diagnostic apparatus and ultrasound diagnostic apparatus control method
US8485987B2 (en) 2006-10-06 2013-07-16 Bard Peripheral Vascular, Inc. Tissue handling system with reduced operator exposure
US8485989B2 (en) 2009-09-01 2013-07-16 Bard Peripheral Vascular, Inc. Biopsy apparatus having a tissue sample retrieval mechanism
CN103202712A (en) * 2012-01-17 2013-07-17 三星电子株式会社 Probe Device, Server, System For Diagnosing Ultrasound Image, And Method Of Processing Ultrasound Image
US20130184582A1 (en) * 2012-01-16 2013-07-18 Yuko KANAYAMA Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image parallel display method
WO2013109965A1 (en) * 2012-01-19 2013-07-25 Brigham And Women's Hospital, Inc. Data reconstruction for improved ultrasound imaging
US20130245449A1 (en) * 1996-06-28 2013-09-19 Sonosite, Inc. Balance body ultrasound system
US20130246714A1 (en) * 2012-03-16 2013-09-19 Oracle International Corporation System and method for supporting buffer allocation in a shared memory queue
US20130253317A1 (en) * 2010-12-15 2013-09-26 Koninklijke Philips Electronics N.V. Ultrasound imaging system with patient-specific settings
US20130249842A1 (en) * 2012-03-26 2013-09-26 General Electric Company Ultrasound device and method thereof
WO2013148730A3 (en) * 2012-03-26 2013-11-28 Teratech Corporation Tablet ultrasound system
US8597206B2 (en) 2009-10-12 2013-12-03 Bard Peripheral Vascular, Inc. Biopsy probe assembly having a mechanism to prevent misalignment of components prior to installation
US8597205B2 (en) 2007-12-20 2013-12-03 C. R. Bard, Inc. Biopsy device
US8600519B2 (en) 2001-04-13 2013-12-03 Greatbatch Ltd. Transient voltage/current protection system for electronic circuits associated with implanted leads
US8622894B2 (en) 2003-04-01 2014-01-07 Boston Scientific Scimed, Inc. Articulation joint
US20140031694A1 (en) * 2012-07-26 2014-01-30 Interson Corporation Portable ultrasonic imaging probe including a transducer array
EP2710960A1 (en) * 2012-09-24 2014-03-26 Samsung Electronics Co., Ltd Ultrasound apparatus and information providing method of the ultrasound apparatus
US8690793B2 (en) 2009-03-16 2014-04-08 C. R. Bard, Inc. Biopsy device having rotational cutting
US8708929B2 (en) 2009-04-15 2014-04-29 Bard Peripheral Vascular, Inc. Biopsy apparatus having integrated fluid management
US20140129004A1 (en) * 2012-02-28 2014-05-08 Panasonic Corporation Display apparatus for control information, method for displaying control information, and system for displaying control information
WO2013039910A3 (en) * 2011-09-12 2014-05-15 Bedford, Freeman & Worth Publishing Group, Llc Interactive online laboratory
US20140180721A1 (en) * 2012-12-26 2014-06-26 Volcano Corporation Data Labeling and Indexing in a Multi-Modality Medical Imaging System
US20140236009A1 (en) * 2011-11-10 2014-08-21 Fujifilm Corporation Ultrasound diagnostic apparatus and ultrasound image producing method
US8845548B2 (en) 2009-06-12 2014-09-30 Devicor Medical Products, Inc. Cutter drive assembly for biopsy device
US8882763B2 (en) 2010-01-12 2014-11-11 Greatbatch Ltd. Patient attached bonding strap for energy dissipation from a probe or a catheter during magnetic resonance imaging
US8903505B2 (en) 2006-06-08 2014-12-02 Greatbatch Ltd. Implantable lead bandstop filter employing an inductive coil with parasitic capacitance to enhance MRI compatibility of active medical devices
WO2014194288A1 (en) * 2013-05-31 2014-12-04 eagleyemed, Inc. Dynamic adjustment of image compression for high resolution live medical image sharing
US20140364741A1 (en) * 2013-06-11 2014-12-11 Samsung Electronics Co., Ltd. Portable ultrasonic probe
US20140371592A1 (en) * 2012-02-06 2014-12-18 Hitachi Aloka Medical, Ltd. Mobile ultrasonic diagnostic device
US20150032004A1 (en) * 2013-07-24 2015-01-29 Samsung Electronics Co., Ltd. Ultrasonic probe, system including the same, and operation method thereof
US20150065881A1 (en) * 2013-08-29 2015-03-05 Samsung Medison Co., Ltd. Ultrasound diagnostic apparatus and method of operating the same
US8977355B2 (en) 2001-04-13 2015-03-10 Greatbatch Ltd. EMI filter employing a capacitor and an inductor tank circuit having optimum component values
US9002458B2 (en) 2013-06-29 2015-04-07 Thync, Inc. Transdermal electrical stimulation devices for modifying or inducing cognitive state
US20150099968A1 (en) * 2013-10-07 2015-04-09 Acist Medical Systems, Inc. Systems and methods for controlled single touch zoom
US9070453B2 (en) 2010-04-15 2015-06-30 Ramot At Tel Aviv University Ltd. Multiple programming of flash memory without erase
US20150190111A1 (en) * 2014-01-03 2015-07-09 William R. Fry Ultrasound-guided non-invasive blood pressure measurement apparatus and methods
US9108066B2 (en) 2008-03-20 2015-08-18 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
US20150238168A1 (en) * 2012-09-13 2015-08-27 Koninklijke Philips N.V. Mobile 3d wireless ultrasound image acquisition device and ultrasound imaging system
WO2015137543A1 (en) * 2014-03-14 2015-09-17 Alpinion Medical Systems Co., Ltd. Software-based ultrasound imaging system
US20150265857A1 (en) * 2014-03-21 2015-09-24 Stephen R. Barnes Hybrid Ultrasound and Magnetic Resonance Imaging Device
US9152765B2 (en) 2010-03-21 2015-10-06 Spacelabs Healthcare Llc Multi-display bedside monitoring system
US20150285886A1 (en) * 2014-04-03 2015-10-08 Industry-Academic Cooperation Foundation, Yonsei University Method and apparatus for adjusting the parameters of a magnetic resonance image
US20150305824A1 (en) * 2014-04-26 2015-10-29 Steven Sounyoung Yu Technique for Inserting Medical Instruments Using Head-Mounted Display
US9173641B2 (en) 2009-08-12 2015-11-03 C. R. Bard, Inc. Biopsy apparatus having integrated thumbwheel mechanism for manual rotation of biopsy cannula
US20150346157A1 (en) * 2013-02-07 2015-12-03 Siemens Aktiengesellschaft Method and device for improving the saft analysis when measuring irregularities
US20150350329A1 (en) * 2014-06-03 2015-12-03 Canon Kabushiki Kaisha Method and apparatus for transmitting sensor data in a wireless network
US20150374346A1 (en) * 2013-02-15 2015-12-31 B-K Medical Aps Ultrasound display client
US9248283B2 (en) 2001-04-13 2016-02-02 Greatbatch Ltd. Band stop filter comprising an inductive component disposed in a lead wire in series with an electrode
US9283409B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, Llc Energy based fat reduction
US9283410B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, L.L.C. System and method for fat and cellulite reduction
US9295828B2 (en) 2001-04-13 2016-03-29 Greatbatch Ltd. Self-resonant inductor wound portion of an implantable lead for enhanced MRI compatibility of active implantable medical devices
US9298889B2 (en) 2007-03-09 2016-03-29 Spacelabs Healthcare Llc Health data collection tool
US9295444B2 (en) 2006-11-10 2016-03-29 Siemens Medical Solutions Usa, Inc. Transducer array imaging system
USD754357S1 (en) 2011-08-09 2016-04-19 C. R. Bard, Inc. Ultrasound probe head
US9320537B2 (en) 2004-10-06 2016-04-26 Guided Therapy Systems, Llc Methods for noninvasive skin tightening
WO2016068604A1 (en) * 2014-10-31 2016-05-06 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US9333334B2 (en) 2014-05-25 2016-05-10 Thync, Inc. Methods for attaching and wearing a neurostimulator
WO2016077822A1 (en) 2014-11-14 2016-05-19 Ursus Medical, Llc Ultrasound beamforming system and method based on aram array
US20160139789A1 (en) * 2014-11-18 2016-05-19 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and method of controlling the same
US20160135786A1 (en) * 2014-11-18 2016-05-19 General Electric Company Wireless ultrasound probe tethered to a pod
US20160174937A1 (en) * 2014-12-23 2016-06-23 General Electric Company Wireless ultrasound probe
US9384652B2 (en) 2010-11-19 2016-07-05 Spacelabs Healthcare, Llc System and method for transfer of primary alarm notification on patient monitoring systems
US9399126B2 (en) 2014-02-27 2016-07-26 Thync Global, Inc. Methods for user control of neurostimulation to modify a cognitive state
WO2016123069A1 (en) * 2015-01-26 2016-08-04 Northeastern University Internet-linked ultrasonic network
US9421029B2 (en) 2004-10-06 2016-08-23 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US9427596B2 (en) 2013-01-16 2016-08-30 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
US9427600B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US9427601B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, Llc Methods for face and neck lifts
US9440096B2 (en) 2004-10-06 2016-09-13 Guided Therapy Systems, Llc Method and system for treating stretch marks
US20160346936A1 (en) * 2015-05-29 2016-12-01 Kuka Roboter Gmbh Selection of a device or object using a camera
US9510802B2 (en) 2012-09-21 2016-12-06 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US20170055944A1 (en) * 2015-09-02 2017-03-02 Aningbo Youchang Ultrasonic Technology Co., Ltd Method for controlling wireless intelligent ultrasound fetal imaging system
US9604020B2 (en) 2009-10-16 2017-03-28 Spacelabs Healthcare Llc Integrated, extendable anesthesia system
US9649091B2 (en) 2011-01-07 2017-05-16 General Electric Company Wireless ultrasound imaging system and method for wireless communication in an ultrasound imaging system
US20170146643A1 (en) * 2015-11-19 2017-05-25 Analog Devices, Inc. Analog ultrasound beamformer
US9667889B2 (en) 2013-04-03 2017-05-30 Butterfly Network, Inc. Portable electronic devices with integrated imaging capabilities
US9694212B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, Llc Method and system for ultrasound treatment of skin
US20170220024A1 (en) * 2014-07-30 2017-08-03 Kawasaki Jukogyo Kabushiki Kaisha Robot control program generation method and apparatus
US20170252013A1 (en) * 2016-03-01 2017-09-07 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and non-transitory computer-readable medium
US9760584B2 (en) 2012-03-16 2017-09-12 Oracle International Corporation Systems and methods for supporting inline delegation of middle-tier transaction logs to database
US9784805B2 (en) 2007-06-19 2017-10-10 Koninklijke Philips N.V. MRI radio frequency receiver comprising digital down converter with connector that passes analog signal being contained within radio frequency receiver coil unit
US9797764B2 (en) 2009-10-16 2017-10-24 Spacelabs Healthcare, Llc Light enhanced flow tube
US9827449B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
RU2638621C2 (en) * 2012-01-18 2017-12-14 Конинклейке Филипс Н.В. Ultrasonic management of needle trajectory during biopsy
US9877699B2 (en) 2012-03-26 2018-01-30 Teratech Corporation Tablet ultrasound system
USRE46699E1 (en) 2013-01-16 2018-02-06 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
US20180060490A1 (en) * 2016-08-29 2018-03-01 Siemens Healthcare Gmbh Medical imaging system
US9931514B2 (en) 2013-06-30 2018-04-03 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
US20180153515A1 (en) * 2016-12-02 2018-06-07 Samsung Medison Co., Ltd. Ultrasonic probe and ultrasonic diagnostic apparatus including the same
US10080889B2 (en) 2009-03-19 2018-09-25 Greatbatch Ltd. Low inductance and low resistance hermetically sealed filtered feedthrough for an AIMD
US20190059728A1 (en) * 2011-02-17 2019-02-28 Tyto Care Ltd. System and method for performing an automatic and remote trained personnel guided medical examination
US20190102046A1 (en) * 2017-09-29 2019-04-04 Qualcomm Incorporated Layer for inducing varying delays in ultrasonic signals propagating in ultrasonic sensor
US10285673B2 (en) 2013-03-20 2019-05-14 Bard Peripheral Vascular, Inc. Biopsy device
US20190156935A1 (en) * 2017-11-20 2019-05-23 Nihon Kohden Corporation Patient monitor and physiological information management system
US10350421B2 (en) 2013-06-30 2019-07-16 Greatbatch Ltd. Metallurgically bonded gold pocket pad for grounding an EMI filter to a hermetic terminal for an active implantable medical device
US10401493B2 (en) * 2014-08-18 2019-09-03 Maui Imaging, Inc. Network-based ultrasound imaging system
US10420960B2 (en) 2013-03-08 2019-09-24 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US10456111B2 (en) 2006-12-07 2019-10-29 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US10469846B2 (en) 2017-03-27 2019-11-05 Vave Health, Inc. Dynamic range compression of ultrasound images
WO2019222478A2 (en) 2018-05-17 2019-11-21 Teratech Corporation Portable ultrasound system
US10499885B2 (en) * 2013-06-25 2019-12-10 Hitachi, Ltd. Ultrasound system and method, and ultrasound probe
CN110675902A (en) * 2018-07-03 2020-01-10 SK Hynix Inc. Storage system and operation method of storage system
US10537304B2 (en) 2008-06-06 2020-01-21 Ulthera, Inc. Hand wand for ultrasonic cosmetic treatment and imaging
US10559409B2 (en) 2017-01-06 2020-02-11 Greatbatch Ltd. Process for manufacturing a leadless feedthrough for an active implantable medical device
US10561837B2 (en) 2011-03-01 2020-02-18 Greatbatch Ltd. Low equivalent series resistance RF filter for an active implantable medical device utilizing a ceramic reinforced metal composite filled via
US10589107B2 (en) 2016-11-08 2020-03-17 Greatbatch Ltd. Circuit board mounted filtered feedthrough assembly having a composite conductive lead for an AIMD
US10588602B2 (en) * 2015-02-10 2020-03-17 Samsung Electronics Co., Ltd. Portable ultrasound apparatus and control method for the same
US10603521B2 (en) 2014-04-18 2020-03-31 Ulthera, Inc. Band transducer ultrasound therapy
US10617384B2 (en) 2011-12-29 2020-04-14 Maui Imaging, Inc. M-mode ultrasound imaging of arbitrary paths
US10639008B2 (en) 2009-10-08 2020-05-05 C. R. Bard, Inc. Support and cover structures for an ultrasound probe head
US10646201B2 (en) 2014-11-18 2020-05-12 C. R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US10653392B2 (en) 2013-09-13 2020-05-19 Maui Imaging, Inc. Ultrasound imaging using apparent point-source transmit transducer
US20200160990A1 (en) * 2018-11-20 2020-05-21 Siemens Healthcare Gmbh Control unit for a medical imaging system comprising a processor and a logic gate; imaging system and method for controlling a medical imaging system
US10675000B2 (en) 2007-10-01 2020-06-09 Maui Imaging, Inc. Determining material stiffness using multiple aperture ultrasound
US10699811B2 (en) 2011-03-11 2020-06-30 Spacelabs Healthcare L.L.C. Methods and systems to determine multi-parameter managed alarm hierarchy during patient monitoring
CN111449757A (en) * 2020-04-10 2020-07-28 BOE Technology Group Co., Ltd. Telemedicine robot, control method and charging method thereof
US20200275908A1 (en) * 2017-11-17 2020-09-03 Nihon Kohden Corporation Ultrasonic probe and ultrasonic measurement system
EP3413803B1 (en) * 2016-02-12 2020-09-23 Qualcomm Incorporated Ultrasound devices for estimating blood pressure and other cardiovascular properties
US10816650B2 (en) 2016-05-27 2020-10-27 Interson Corporation Ultrasonic imaging probe including composite aperture receiving array
US10820885B2 (en) 2012-06-15 2020-11-03 C. R. Bard, Inc. Apparatus and methods for detection of a removable cap on an ultrasound probe
US10835208B2 (en) 2010-04-14 2020-11-17 Maui Imaging, Inc. Concave ultrasound transducers and 3D arrays
US10856843B2 (en) 2017-03-23 2020-12-08 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US10856846B2 (en) 2016-01-27 2020-12-08 Maui Imaging, Inc. Ultrasound imaging with sparse array probes
US10864385B2 (en) 2004-09-24 2020-12-15 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
CN112105945A (en) * 2018-04-24 2020-12-18 Koninklijke Philips N.V. Ultrasound imaging system for high resolution broadband harmonic imaging
US20210016060A1 (en) * 2019-07-15 2021-01-21 Boston Scientific Scimed, Inc. Medical systems, devices, and related methods
US10905396B2 (en) 2014-11-18 2021-02-02 C. R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US10905888B2 (en) 2018-03-22 2021-02-02 Greatbatch Ltd. Electrical connection for an AIMD EMI filter utilizing an anisotropic conductive layer
US10912945B2 (en) 2018-03-22 2021-02-09 Greatbatch Ltd. Hermetic terminal for an active implantable medical device having a feedthrough capacitor partially overhanging a ferrule for high effective capacitance area
US20210038188A1 (en) * 2012-12-26 2021-02-11 Philips Image Guided Therapy Corporation Measurement navigation in a multi-modality medical imaging system
US10987026B2 (en) 2013-05-30 2021-04-27 Spacelabs Healthcare Llc Capnography module with automatic switching between mainstream and sidestream monitoring
US20210212660A1 (en) * 2020-01-09 2021-07-15 Hitachi, Ltd. Ultrasound diagnosis apparatus and program
TWI736206B (en) * 2019-05-24 2021-08-11 Nyquest Technology Co., Ltd. Audio receiving device and audio transmitting device
US11103213B2 (en) 2009-10-08 2021-08-31 C. R. Bard, Inc. Spacers for use with an ultrasound probe
US11115475B2 (en) 2015-01-26 2021-09-07 Northeastern University Software-defined implantable ultrasonic device for use in the internet of medical things
TWI739156B (en) * 2019-09-16 2021-09-11 Taipei Medical University System and method for biological object imaging and treatment
US11116474B2 (en) 2013-07-23 2021-09-14 Regents Of The University Of Minnesota Ultrasound image formation and/or reconstruction using multiple frequency waveforms
US11116483B2 (en) 2017-05-19 2021-09-14 Merit Medical Systems, Inc. Rotating biopsy needle
US11198014B2 (en) 2011-03-01 2021-12-14 Greatbatch Ltd. Hermetically sealed filtered feedthrough assembly having a capacitor with an oxide resistant electrical connection to an active implantable medical device housing
US11207548B2 (en) 2004-10-07 2021-12-28 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US11224895B2 (en) 2016-01-18 2022-01-18 Ulthera, Inc. Compact ultrasound device having annular ultrasound array peripherally electrically connected to flexible printed circuit board and method of assembly thereof
US20220026483A1 (en) * 2020-07-09 2022-01-27 Tektronix, Inc. Indicating a probing target for a fabricated electronic circuit
US11235179B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc Energy based skin gland treatment
US20220039281A1 (en) * 2020-07-31 2022-02-03 FLIR Belgium BVBA Modular electrical power distribution system with module detection systems and methods
US11241218B2 (en) 2016-08-16 2022-02-08 Ulthera, Inc. Systems and methods for cosmetic ultrasound treatment of skin
US11253233B2 (en) 2012-08-10 2022-02-22 Maui Imaging, Inc. Calibration of multiple aperture ultrasound probes
US11338156B2 (en) 2004-10-06 2022-05-24 Guided Therapy Systems, Llc Noninvasive tissue tightening system
US11432800B2 (en) * 2019-03-25 2022-09-06 Exo Imaging, Inc. Handheld ultrasound imager
US11446003B2 (en) 2017-03-27 2022-09-20 Vave Health, Inc. High performance handheld ultrasound
US11452504B2 (en) * 2019-04-02 2022-09-27 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Regional contrast enhancement based on complementary information to reflectivity information
US11458337B2 (en) 2017-11-28 2022-10-04 Regents Of The University Of Minnesota Adaptive refocusing of ultrasound transducer arrays using image data
US20220378403A1 (en) * 2021-05-25 2022-12-01 Fujifilm Healthcare Corporation Ultrasound diagnostic apparatus and diagnosis assistance method
US11531096B2 (en) 2017-03-23 2022-12-20 Vave Health, Inc. High performance handheld ultrasound
US11547382B2 (en) 1999-06-22 2023-01-10 Teratech Corporation Networked ultrasound system and method for imaging a medical procedure using an invasive probe
US11547384B2 (en) 2011-04-14 2023-01-10 Regents Of The University Of Minnesota Vascular characterization using ultrasound imaging
US20230061122A1 (en) * 2021-08-24 2023-03-02 Saudi Arabian Oil Company Convex ultrasonic sensor for weld inspection
US11596812B2 (en) 2018-04-06 2023-03-07 Regents Of The University Of Minnesota Wearable transcranial dual-mode ultrasound transducers for neuromodulation
US11602372B2 (en) 2019-12-31 2023-03-14 Auris Health, Inc. Alignment interfaces for percutaneous access
US11631496B2 (en) 2013-09-12 2023-04-18 Johnson & Johnson Surgical Vision, Inc. Computer-based operating room support system
US11660147B2 (en) * 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
US11666305B2 (en) * 2018-02-12 2023-06-06 Koninklijke Philips N.V. Workflow assistance for medical doppler ultrasound evaluation
US11675073B2 (en) 2003-11-26 2023-06-13 Teratech Corporation Modular portable ultrasound systems
US11715560B2 (en) 2013-09-12 2023-08-01 Johnson & Johnson Surgical Vision, Inc. Computer-based operating room support system
US11724133B2 (en) 2004-10-07 2023-08-15 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US11730451B2 (en) 2018-03-22 2023-08-22 Exo Imaging, Inc. Integrated ultrasonic transducers
US11766241B2 (en) * 2018-04-27 2023-09-26 Fujifilm Corporation Ultrasound system in which an ultrasound probe and a display device are wirelessly connected with each other and method for controlling ultrasound system in which an ultrasound probe and a display device are wirelessly connected with each other
US11793498B2 (en) 2017-05-19 2023-10-24 Merit Medical Systems, Inc. Biopsy needle devices and methods of use
US11844500B2 (en) 2017-05-19 2023-12-19 Merit Medical Systems, Inc. Semi-automatic biopsy needle device and methods of use
US11883688B2 (en) 2004-10-06 2024-01-30 Guided Therapy Systems, Llc Energy based fat reduction
US20240041433A1 (en) * 2016-06-04 2024-02-08 Otonexus Medical Technologies, Inc. Apparatus and method for characterization of a ductile membrane, surface and sub-surface properties
US11944849B2 (en) 2018-02-20 2024-04-02 Ulthera, Inc. Systems and methods for combined cosmetic treatment of cellulite with ultrasound

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9402601B1 (en) * 1999-06-22 2016-08-02 Teratech Corporation Methods for controlling an ultrasound imaging procedure and providing ultrasound images to an external non-ultrasound application via a network
JP6049371B2 (en) * 2011-11-09 2016-12-21 Toshiba Medical Systems Corporation Ultrasound diagnostic system
JP6363229B2 (en) 2014-02-05 2018-07-25 Verathon Inc. Ultrasonic data collection
FR3036943A1 (en) * 2015-06-02 2016-12-09 Echosens NON-INVASIVE HEPATIC LESION DETECTION DEVICE
US20170000456A1 (en) * 2015-07-01 2017-01-05 Edan Instruments, Inc. Apparatus and method for semi-automatic ultrasound transducer connector lock
US10575825B2 (en) 2015-07-27 2020-03-03 Siemens Medical Solutions Usa, Inc. Doppler imaging
EP3350589A4 (en) * 2015-09-18 2019-10-23 Chirp Microsystems Inc. Programmable ultrasonic transceiver
WO2017114874A2 (en) * 2015-12-29 2017-07-06 Koninklijke Philips N.V. Ultrasound imaging system with a multi-mode touch screen interface
US20200129151A1 (en) * 2018-10-25 2020-04-30 Butterfly Network, Inc. Methods and apparatuses for ultrasound imaging using different image formats
WO2020103103A1 (en) * 2018-11-22 2020-05-28 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Ultrasonic data processing method, ultrasonic device and storage medium
US11185379B2 (en) * 2019-01-10 2021-11-30 Verily Life Sciences Llc Comprehensive messaging system for robotic surgical systems
JP7211150B2 (en) * 2019-02-21 2023-01-24 Konica Minolta, Inc. ULTRASOUND DIAGNOSTIC DEVICE, ULTRASOUND IMAGE GENERATING METHOD AND PROGRAM
WO2020206075A1 (en) * 2019-04-05 2020-10-08 Butterfly Network, Inc. Wireless ultrasound device and related apparatus and methods
EP4230145A4 (en) * 2020-11-18 2024-04-03 Wuhan United Imaging Healthcare Co Ltd Ultrasonic imaging method, system and storage medium
WO2023086430A1 (en) * 2021-11-10 2023-05-19 Genesis Medtech (USA) Inc. Integrated digital surgical system
US20230200847A1 (en) * 2021-12-29 2023-06-29 Creare Llc Penetrative Medical Access Devices, and Related Methods and Systems

Citations (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4070905A (en) * 1975-10-13 1978-01-31 The Commonwealth Of Australia Ultrasonic beam scanning
US4140022A (en) * 1977-12-20 1979-02-20 Hewlett-Packard Company Acoustic imaging apparatus
US4159462A (en) * 1977-08-18 1979-06-26 General Electric Company Ultrasonic multi-sector scanner
US4310907A (en) * 1978-12-08 1982-01-12 Matsushita Electric Industrial Company, Limited Scan converter for a sector scan type ultrasound imaging system
US4319489A (en) * 1980-03-28 1982-03-16 Yokogawa Electric Works, Ltd. Ultrasonic diagnostic method and apparatus
US4344327A (en) * 1979-12-28 1982-08-17 Aloka Co., Ltd. Electronic scanning ultrasonic diagnostic system
US4368643A (en) * 1979-11-16 1983-01-18 Matsushita Electric Industrial Company, Limited Ultrasonic imaging by radial scan beams emanating from a hypothetical point located behind linear transducer array
US4573477A (en) * 1982-04-28 1986-03-04 Aloka Co., Ltd. Ultrasonic diagnostic apparatus
US4582065A (en) * 1984-06-28 1986-04-15 Picker International, Inc. Ultrasonic step scanning utilizing unequally spaced curvilinear transducer array
US4622977A (en) * 1983-12-05 1986-11-18 Aloka Co., Ltd. Ultrasonic diagnostic apparatus
US4664122A (en) * 1984-07-25 1987-05-12 Kabushiki Kaisha Toshiba Ultrasonic transducer array used in ultrasonic diagnosis apparatus
US4759375A (en) * 1985-12-26 1988-07-26 Aloka Co., Ltd. Ultrasonic doppler diagnostic apparatus
US4852577A (en) * 1988-04-07 1989-08-01 The United States Of America As Represented By The Department Of Health And Human Services High speed adaptive ultrasonic phased array imaging system
US4896283A (en) * 1986-03-07 1990-01-23 Hewlett-Packard Company Iterative real-time XY raster path generator for bounded areas
US4937797A (en) * 1988-11-14 1990-06-26 Hewlett-Packard Company Method and apparatus for controlling scan line direction in a linear array ultrasonic doppler scanning system
US4949259A (en) * 1987-10-29 1990-08-14 Hewlett-Packard Company Delay coefficient generator for accumulators
US4962667A (en) * 1985-10-09 1990-10-16 Hitachi, Ltd. Ultrasonic imaging apparatus
US5123415A (en) * 1990-07-19 1992-06-23 Advanced Technology Laboratories, Inc. Ultrasonic imaging by radial scan of trapezoidal sector
US5148810A (en) * 1990-02-12 1992-09-22 Acuson Corporation Variable origin-variable angle acoustic scanning method and apparatus
US5235986A (en) * 1990-02-12 1993-08-17 Acuson Corporation Variable origin-variable angle acoustic scanning method and apparatus for a curved linear array
US5261408A (en) * 1990-02-12 1993-11-16 Acuson Corporation Variable origin-variable acoustic scanning method and apparatus
US5295485A (en) * 1991-12-13 1994-03-22 Hitachi, Ltd. Ultrasonic diagnostic system
US5379771A (en) * 1993-04-06 1995-01-10 Kabushiki Kaisha Toshiba Ultrasonic imaging apparatus
US5383457A (en) * 1987-04-20 1995-01-24 National Fertility Institute Method and apparatus for processing images
US5590658A (en) * 1995-06-29 1997-01-07 Teratech Corporation Portable ultrasound imaging system
US5609155A (en) * 1995-04-26 1997-03-11 Acuson Corporation Energy weighted parameter spatial/temporal filter
US5615679A (en) * 1995-02-06 1997-04-01 Ge Yokogawa Medical Systems, Limited Method of displaying ultrasonic images and apparatus for ultrasonic diagnosis
US5680536A (en) * 1994-03-25 1997-10-21 Tyuluman; Samuel A. Dual motherboard computer system
US5685307A (en) * 1995-02-28 1997-11-11 Iowa State University Research Foundation, Inc. Method and apparatus for tissue characterization of animals using ultrasound
US5715823A (en) * 1996-02-27 1998-02-10 Atlantis Diagnostics International, L.L.C. Ultrasonic diagnostic imaging system with universal access to diagnostic information and images
US5718228A (en) * 1996-03-13 1998-02-17 Fujitsu Ltd. Ultrasonic diagnostic apparatus
US5722412A (en) * 1996-06-28 1998-03-03 Advanced Technology Laboratories, Inc. Hand held ultrasonic diagnostic instrument
US5758649A (en) * 1995-09-01 1998-06-02 Fujitsu Limited Ultrasonic module and ultrasonic diagnostic system
US5763785A (en) * 1995-06-29 1998-06-09 Massachusetts Institute Of Technology Integrated beam forming and focusing processing circuit for use in an ultrasound imaging system
US5774876A (en) * 1996-06-26 1998-06-30 Par Government Systems Corporation Managing assets with active electronic tags
US5795297A (en) * 1996-09-12 1998-08-18 Atlantis Diagnostics International, L.L.C. Ultrasonic diagnostic imaging system with personal computer architecture
US5798461A (en) * 1993-06-02 1998-08-25 Hewlett-Packard Company Methods and apparatus for ultrasound imaging using combined scan patterns
US5817024A (en) * 1996-06-28 1998-10-06 Sonosight, Inc. Hand held ultrasonic diagnostic instrument with digital beamformer
US5839442A (en) * 1995-06-29 1998-11-24 Teratech Corporation Portable ultrasound imaging system
US5855556A (en) * 1997-09-19 1999-01-05 Fujitsu Ltd. Ultrasonic diagnostic apparatus
US5891030A (en) * 1997-01-24 1999-04-06 Mayo Foundation For Medical Education And Research System for two dimensional and three dimensional imaging of tubular structures in the human body
US5893363A (en) * 1996-06-28 1999-04-13 Sonosight, Inc. Ultrasonic array transducer transceiver for a hand held ultrasonic diagnostic instrument
US5897498A (en) * 1996-09-25 1999-04-27 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with electronic message communications capability
US5904652A (en) * 1995-06-29 1999-05-18 Teratech Corporation Ultrasound scan conversion with spatial dithering
US5957846A (en) * 1995-06-29 1999-09-28 Teratech Corporation Portable ultrasound imaging system
US5961610A (en) * 1996-08-13 1999-10-05 General Electric Company Systems, methods and apparatus for generating and controlling display of medical images
US5964709A (en) * 1995-06-29 1999-10-12 Teratech Corporation Portable ultrasound imaging system
US5971923A (en) * 1997-12-31 1999-10-26 Acuson Corporation Ultrasound system and method for interfacing with peripherals
US6032120A (en) * 1997-12-16 2000-02-29 Acuson Corporation Accessing stored ultrasound images and other digital medical images
US6063030A (en) * 1993-11-29 2000-05-16 Adalberto Vara PC based ultrasound device with virtual control user interface
US6101407A (en) * 1998-02-13 2000-08-08 Eastman Kodak Company Method and system for remotely viewing and configuring output from a medical imaging device
US6106468A (en) * 1999-04-05 2000-08-22 Agilent Technologies, Inc. Ultrasound system employing a unified memory
US6111816A (en) * 1997-02-03 2000-08-29 Teratech Corporation Multi-dimensional beamforming device
US6135961A (en) * 1996-06-28 2000-10-24 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US6139498A (en) * 1998-12-29 2000-10-31 Ge Diasonics Israel, Ltd. Ultrasound system performing simultaneous parallel computer instructions
US6139496A (en) * 1999-04-30 2000-10-31 Agilent Technologies, Inc. Ultrasonic imaging system having isonification and display functions integrated in an easy-to-manipulate probe assembly
US6142946A (en) * 1998-11-20 2000-11-07 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with cordless scanheads
US6159150A (en) * 1998-11-20 2000-12-12 Acuson Corporation Medical diagnostic ultrasonic imaging system with auxiliary processor
US6198283B1 (en) * 1998-09-18 2001-03-06 Ge Medical Systems Global Technology Llc System and method of phase sensitive MRI reconstruction using partial k-space data and including a network
US6210327B1 (en) * 1999-04-28 2001-04-03 General Electric Company Method and apparatus for sending ultrasound image data to remotely located device
US6306089B1 (en) * 1999-09-24 2001-10-23 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with customized measurements and calculations
US6337306B1 (en) * 1997-02-06 2002-01-08 Rhone-Poulenc Agro Phenoxyacetic acid derivatives and their use as herbicides
US6381557B1 (en) * 1998-11-25 2002-04-30 Ge Medical Systems Global Technology Company, Llc Medical imaging system service evaluation method and apparatus
US6464636B1 (en) * 2000-10-18 2002-10-15 Koninklijke Philips Electronics N.V. Configuration tool for use in ultrasound imaging device
US6475146B1 (en) * 2001-09-24 2002-11-05 Siemens Medical Solutions Usa, Inc. Method and system for using personal digital assistants with diagnostic medical ultrasound systems
US6511426B1 (en) * 1998-06-02 2003-01-28 Acuson Corporation Medical diagnostic ultrasound system and method for versatile processing
US20030028113A1 (en) * 1995-06-29 2003-02-06 Teratech Corporation Ultrasound scan conversion with spatial dithering
US6530887B1 (en) * 1996-12-24 2003-03-11 Teratech Corporation Ultrasound probe with integrated electronics
US6537219B2 (en) * 2001-04-04 2003-03-25 Koninklijke Philips Electronics N.V. Static focus ultrasound apparatus and method
US20030073894A1 (en) * 1999-06-22 2003-04-17 Tera Tech Corporation Ultrasound probe with integrated electronics
US20030088182A1 (en) * 2001-09-28 2003-05-08 Teratech Corporation Ultrasound imaging system
US6569097B1 (en) * 2000-07-21 2003-05-27 Diagnostics Ultrasound Corporation System for remote evaluation of ultrasound information obtained by a programmed application-specific data collection device
US20030125629A1 (en) * 2002-01-02 2003-07-03 Ustuner E. Tuncay Ultrasound system and method
US6603494B1 (en) * 1998-11-25 2003-08-05 Ge Medical Systems Global Technology Company, Llc Multiple modality interface for imaging systems including remote services over a network
US20030233046A1 (en) * 2002-06-17 2003-12-18 Board Of Trustees Of The University Of Arkansas Ultrasonic guided catheter deployment system
US6669633B2 (en) * 1999-06-22 2003-12-30 Teratech Corporation Unitary operator control for ultrasonic imaging graphical user interface
US20050228281A1 (en) * 2004-03-31 2005-10-13 Nefos Thomas P Handheld diagnostic ultrasound system with head mounted display
US7110583B2 (en) * 2001-01-31 2006-09-19 Matsushita Electric Industrial, Co., Ltd. Ultrasonic diagnostic device and image processing device
US7331925B2 (en) * 2000-07-21 2008-02-19 Verathon, Inc. System for remote evaluation of ultrasound information obtained by a programmed application-specific data collection device
US20130184587A1 (en) * 2012-01-17 2013-07-18 Samsung Electronics Co., Ltd. Probe device, server, system for diagnosing ultrasound image, and method of processing ultrasound image

Family Cites Families (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4976702A (en) * 1989-04-17 1990-12-11 Serad, Inc. Syringe needle guard
US5291893A (en) * 1992-10-09 1994-03-08 Acoustic Imaging Technologies Corporation Endo-luminal ultrasonic instrument and method for its use
US5526816A (en) 1994-09-22 1996-06-18 Bracco Research S.A. Ultrasonic spectral contrast imaging
DE4446429C1 (en) * 1994-12-23 1996-08-22 Siemens Ag Device for treating an object with focused ultrasound waves
US5553620A (en) 1995-05-02 1996-09-10 Acuson Corporation Interactive goal-directed ultrasound measurement system
AU700274B2 (en) 1995-06-29 1998-12-24 Teratech Corporation Portable ultrasound imaging system
US6256529B1 (en) 1995-07-26 2001-07-03 Burdette Medical Systems, Inc. Virtual reality 3D visualization for surgical procedures
US5603323A (en) 1996-02-27 1997-02-18 Advanced Technology Laboratories, Inc. Medical ultrasonic diagnostic system with upgradeable transducer probes and other features
US5851186A (en) 1996-02-27 1998-12-22 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with universal access to diagnostic information and images
US5986662A (en) * 1996-10-16 1999-11-16 Vital Images, Inc. Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging
US6515016B2 (en) * 1996-12-02 2003-02-04 Angiotech Pharmaceuticals, Inc. Composition and methods of paclitaxel for treating psoriasis
US5853367A (en) * 1997-03-17 1998-12-29 General Electric Company Task-interface and communications system and method for ultrasound imager control
ATE234042T1 (en) 1997-03-25 2003-03-15 Dwl Elektron Systeme Gmbh DEVICE FOR OBSERVING VESSELS, IN PARTICULAR BLOOD VESSELS
AU4318499A (en) 1997-11-24 1999-12-13 Burdette Medical Systems, Inc. Real time brachytherapy spatial registration and visualization system
US6171244B1 (en) 1997-12-31 2001-01-09 Acuson Corporation Ultrasonic system and method for storing data
CA2332107A1 (en) 1998-05-13 1999-11-18 Inbae Yoon Penetrating endoscope and endoscopic surgical instrument with cmos image sensor and display
AU3999299A (en) 1998-05-18 1999-12-06 Intracom Corporation System for transmitting video images over a computer network to a remote receiver
US6203499B1 (en) * 1998-10-05 2001-03-20 Atl Ultrasound Inc. Multiple angle needle guide
US6216539B1 (en) * 1998-11-23 2001-04-17 Csi Technology, Inc. Equipment setup for ultrasonic monitoring
US6424996B1 (en) 1998-11-25 2002-07-23 Nexsys Electronics, Inc. Medical network system and method for transfer of information
US6337481B1 (en) 1998-11-25 2002-01-08 General Electric Company Data binning method and apparatus for pet tomography including remote services over a network
US6126605A (en) 1998-12-31 2000-10-03 General Electric Company Ultrasound color flow display optimization by adjusting dynamic range
US6547730B1 (en) * 1998-12-31 2003-04-15 U-Systems, Inc. Ultrasound information processing system
US6839762B1 (en) 1998-12-31 2005-01-04 U-Systems, Inc. Ultrasound information processing system and ultrasound information exchange protocol therefor
US6514201B1 (en) * 1999-01-29 2003-02-04 Acuson Corporation Voice-enhanced diagnostic medical ultrasound system and review station
US20010050087A1 (en) * 1999-03-24 2001-12-13 Pmd Holdings Corp. Ultrasonic detection of restenosis in stents
WO2000060522A2 (en) 1999-04-01 2000-10-12 Acist Medical Systems, Inc. An integrated medical information management and medical device control system and method
US6519632B1 (en) * 1999-04-28 2003-02-11 General Electric Company Method and apparatus for configuring imaging system to communicate with multiple remote devices
JP2000316865A (en) 1999-05-12 2000-11-21 Olympus Optical Co Ltd Ultrasonic image sonograph
US6126608A (en) 1999-05-18 2000-10-03 Pie Medical Equipment B.V. Portable ultrasound diagnostic system with handsfree display
US9402601B1 (en) 1999-06-22 2016-08-02 Teratech Corporation Methods for controlling an ultrasound imaging procedure and providing ultrasound images to an external non-ultrasound application via a network
US20040015079A1 (en) 1999-06-22 2004-01-22 Teratech Corporation Ultrasound probe with integrated electronics
US6251073B1 (en) 1999-08-20 2001-06-26 Novasonics, Inc. Miniaturized ultrasound apparatus and method
US6325759B1 (en) * 1999-09-23 2001-12-04 Ultrasonix Medical Corporation Ultrasound imaging system
US6734880B2 (en) 1999-11-24 2004-05-11 Stentor, Inc. User interface for a medical informatics systems
US20020156864A1 (en) * 2000-06-06 2002-10-24 Kniest James Newton System for wireless exchange of data with hand held devices
US6368279B1 (en) 2000-09-15 2002-04-09 Siemens Medical Solutions, Usa Inc. Time-delay compensation system and method for adaptive ultrasound imaging
US6524246B1 (en) 2000-10-13 2003-02-25 Sonocine, Inc. Ultrasonic cellular tissue screening tool
US7597663B2 (en) 2000-11-24 2009-10-06 U-Systems, Inc. Adjunctive ultrasound processing and display for breast cancer screening
US20040260790A1 (en) * 2000-12-21 2004-12-23 Ge Medical System Global Technology Company, Llc Method and apparatus for remote or collaborative control of an imaging system
US7162439B2 (en) 2000-12-22 2007-01-09 General Electric Company Workstation configuration and selection method and apparatus
US20020193685A1 (en) 2001-06-08 2002-12-19 Calypso Medical, Inc. Guided Radiation Therapy System
US7418480B2 (en) * 2001-10-25 2008-08-26 Ge Medical Systems Global Technology Company, Llc Medical imaging data streaming
US7115093B2 (en) 2001-11-21 2006-10-03 Ge Medical Systems Global Technology Company, Llc Method and system for PDA-based ultrasound system
US6633833B2 (en) 2001-12-14 2003-10-14 Ge Medical Systems Global Technology Company, Llc Method and apparatus for remote diagnosis of an ultrasound scanner
EP1551303A4 (en) 2002-05-16 2009-03-18 Karmanos B A Cancer Inst Method and system for combined diagnostic and therapeutic ultrasound system incorporating noninvasive thermometry, ablation control and automation
US20090112089A1 (en) 2007-10-27 2009-04-30 Bill Barnard System and method for measuring bladder wall thickness and presenting a bladder virtual image
US7549961B1 (en) 2003-07-31 2009-06-23 Sonosite, Inc. System and method supporting imaging and monitoring applications
US8055323B2 (en) 2003-08-05 2011-11-08 Imquant, Inc. Stereotactic system and method for defining a tumor treatment region
EP2857861B1 (en) 2003-11-26 2020-07-22 Teratech Corporation Modular portable ultrasound systems
US8095203B2 (en) 2004-07-23 2012-01-10 Varian Medical Systems, Inc. Data processing for real-time tracking of a target in radiation therapy
US7298819B2 (en) 2004-09-30 2007-11-20 Accuray Incorporated Flexible treatment planning
JP2008515583A (en) 2004-10-12 2008-05-15 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Ultrasonic touchscreen user interface and display
US20080161686A1 (en) 2006-10-31 2008-07-03 Nahi Halmann Methods and apparatus for controlling handheld medical devices
US20090198132A1 (en) 2007-08-10 2009-08-06 Laurent Pelissier Hand-held ultrasound imaging device having reconfigurable user interface
US9414804B2 (en) 2007-08-24 2016-08-16 General Electric Company Diagnostic imaging device having protective facade and method of cleaning and disinfecting same
WO2010051587A1 (en) 2008-11-07 2010-05-14 Signostics Limited Dynamic control of medical device user interface
JP5566766B2 (en) 2009-05-29 2014-08-06 株式会社東芝 Ultrasonic diagnostic apparatus, image display apparatus, image display method, and display method
US20110125022A1 (en) 2009-11-25 2011-05-26 Siemens Medical Solutions Usa, Inc. Synchronization for multi-directional ultrasound scanning

Patent Citations (88)

Publication number Priority date Publication date Assignee Title
US4070905A (en) * 1975-10-13 1978-01-31 The Commonwealth Of Australia Ultrasonic beam scanning
US4159462A (en) * 1977-08-18 1979-06-26 General Electric Company Ultrasonic multi-sector scanner
US4140022A (en) * 1977-12-20 1979-02-20 Hewlett-Packard Company Acoustic imaging apparatus
US4140022B1 (en) * 1977-12-20 1995-05-16 Hewlett Packard Co Acoustic imaging apparatus
US4310907A (en) * 1978-12-08 1982-01-12 Matsushita Electric Industrial Company, Limited Scan converter for a sector scan type ultrasound imaging system
US4368643A (en) * 1979-11-16 1983-01-18 Matsushita Electric Industrial Company, Limited Ultrasonic imaging by radial scan beams emanating from a hypothetical point located behind linear transducer array
US4344327A (en) * 1979-12-28 1982-08-17 Aloka Co., Ltd. Electronic scanning ultrasonic diagnostic system
US4344327B1 (en) * 1979-12-28 1994-05-03 Aloka Co Ltd Electronic scanning ultrasonic diagnostic system
US4319489A (en) * 1980-03-28 1982-03-16 Yokogawa Electric Works, Ltd. Ultrasonic diagnostic method and apparatus
US4573477B1 (en) * 1982-04-28 1991-10-22 Aloka Co Ltd
US4573477A (en) * 1982-04-28 1986-03-04 Aloka Co., Ltd. Ultrasonic diagnostic apparatus
US4622977A (en) * 1983-12-05 1986-11-18 Aloka Co., Ltd. Ultrasonic diagnostic apparatus
US4622977B1 (en) * 1983-12-05 1992-01-07 Aloka Co Ltd
US4582065A (en) * 1984-06-28 1986-04-15 Picker International, Inc. Ultrasonic step scanning utilizing unequally spaced curvilinear transducer array
US4664122A (en) * 1984-07-25 1987-05-12 Kabushiki Kaisha Toshiba Ultrasonic transducer array used in ultrasonic diagnosis apparatus
US4962667A (en) * 1985-10-09 1990-10-16 Hitachi, Ltd. Ultrasonic imaging apparatus
US4759375A (en) * 1985-12-26 1988-07-26 Aloka Co., Ltd. Ultrasonic doppler diagnostic apparatus
US4896283A (en) * 1986-03-07 1990-01-23 Hewlett-Packard Company Iterative real-time XY raster path generator for bounded areas
US5383457A (en) * 1987-04-20 1995-01-24 National Fertility Institute Method and apparatus for processing images
US4949259A (en) * 1987-10-29 1990-08-14 Hewlett-Packard Company Delay coefficient generator for accumulators
US4852577A (en) * 1988-04-07 1989-08-01 The United States Of America As Represented By The Department Of Health And Human Services High speed adaptive ultrasonic phased array imaging system
US4937797A (en) * 1988-11-14 1990-06-26 Hewlett-Packard Company Method and apparatus for controlling scan line direction in a linear array ultrasonic doppler scanning system
US5148810A (en) * 1990-02-12 1992-09-22 Acuson Corporation Variable origin-variable angle acoustic scanning method and apparatus
US5235986A (en) * 1990-02-12 1993-08-17 Acuson Corporation Variable origin-variable angle acoustic scanning method and apparatus for a curved linear array
US5261408A (en) * 1990-02-12 1993-11-16 Acuson Corporation Variable origin-variable acoustic scanning method and apparatus
US5123415A (en) * 1990-07-19 1992-06-23 Advanced Technology Laboratories, Inc. Ultrasonic imaging by radial scan of trapezoidal sector
US5295485A (en) * 1991-12-13 1994-03-22 Hitachi, Ltd. Ultrasonic diagnostic system
US5379771A (en) * 1993-04-06 1995-01-10 Kabushiki Kaisha Toshiba Ultrasonic imaging apparatus
US5798461A (en) * 1993-06-02 1998-08-25 Hewlett-Packard Company Methods and apparatus for ultrasound imaging using combined scan patterns
US6063030A (en) * 1993-11-29 2000-05-16 Adalberto Vara PC based ultrasound device with virtual control user interface
US5680536A (en) * 1994-03-25 1997-10-21 Tyuluman; Samuel A. Dual motherboard computer system
US5615679A (en) * 1995-02-06 1997-04-01 Ge Yokogawa Medical Systems, Limited Method of displaying ultrasonic images and apparatus for ultrasonic diagnosis
US5685307A (en) * 1995-02-28 1997-11-11 Iowa State University Research Foundation, Inc. Method and apparatus for tissue characterization of animals using ultrasound
US5609155A (en) * 1995-04-26 1997-03-11 Acuson Corporation Energy weighted parameter spatial/temporal filter
US5860930A (en) * 1995-04-26 1999-01-19 Acuson Corporation Energy weighted parameter spatial/temporal filter
US5839442A (en) * 1995-06-29 1998-11-24 Teratech Corporation Portable ultrasound imaging system
US5904652A (en) * 1995-06-29 1999-05-18 Teratech Corporation Ultrasound scan conversion with spatial dithering
US5763785A (en) * 1995-06-29 1998-06-09 Massachusetts Institute Of Technology Integrated beam forming and focusing processing circuit for use in an ultrasound imaging system
US20030028113A1 (en) * 1995-06-29 2003-02-06 Teratech Corporation Ultrasound scan conversion with spatial dithering
US6106472A (en) * 1995-06-29 2000-08-22 Teratech Corporation Portable ultrasound imaging system
US5590658A (en) * 1995-06-29 1997-01-07 Teratech Corporation Portable ultrasound imaging system
US5690114A (en) * 1995-06-29 1997-11-25 Teratech Corporation Portable ultrasound imaging system
US5964709A (en) * 1995-06-29 1999-10-12 Teratech Corporation Portable ultrasound imaging system
US5957846A (en) * 1995-06-29 1999-09-28 Teratech Corporation Portable ultrasound imaging system
US5758649A (en) * 1995-09-01 1998-06-02 Fujitsu Limited Ultrasonic module and ultrasonic diagnostic system
US5715823A (en) * 1996-02-27 1998-02-10 Atlantis Diagnostics International, L.L.C. Ultrasonic diagnostic imaging system with universal access to diagnostic information and images
US5718228A (en) * 1996-03-13 1998-02-17 Fujitsu Ltd. Ultrasonic diagnostic apparatus
US5774876A (en) * 1996-06-26 1998-06-30 Par Government Systems Corporation Managing assets with active electronic tags
US6135961A (en) * 1996-06-28 2000-10-24 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US5722412A (en) * 1996-06-28 1998-03-03 Advanced Technology Laboratories, Inc. Hand held ultrasonic diagnostic instrument
US5893363A (en) * 1996-06-28 1999-04-13 Sonosight, Inc. Ultrasonic array transducer transceiver for a hand held ultrasonic diagnostic instrument
US5817024A (en) * 1996-06-28 1998-10-06 Sonosight, Inc. Hand held ultrasonic diagnostic instrument with digital beamformer
US5961610A (en) * 1996-08-13 1999-10-05 General Electric Company Systems, methods and apparatus for generating and controlling display of medical images
US5795297A (en) * 1996-09-12 1998-08-18 Atlantis Diagnostics International, L.L.C. Ultrasonic diagnostic imaging system with personal computer architecture
US5897498A (en) * 1996-09-25 1999-04-27 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with electronic message communications capability
US6530887B1 (en) * 1996-12-24 2003-03-11 Teratech Corporation Ultrasound probe with integrated electronics
US5891030A (en) * 1997-01-24 1999-04-06 Mayo Foundation For Medical Education And Research System for two dimensional and three dimensional imaging of tubular structures in the human body
US6111816A (en) * 1997-02-03 2000-08-29 Teratech Corporation Multi-dimensional beamforming device
US6337306B1 (en) * 1997-02-06 2002-01-08 Rhone-Poulenc Agro Phenoxyacetic acid derivatives and their use as herbicides
US5855556A (en) * 1997-09-19 1999-01-05 Fujitsu Ltd. Ultrasonic diagnostic apparatus
US6032120A (en) * 1997-12-16 2000-02-29 Acuson Corporation Accessing stored ultrasound images and other digital medical images
US5971923A (en) * 1997-12-31 1999-10-26 Acuson Corporation Ultrasound system and method for interfacing with peripherals
US6101407A (en) * 1998-02-13 2000-08-08 Eastman Kodak Company Method and system for remotely viewing and configuring output from a medical imaging device
US6511426B1 (en) * 1998-06-02 2003-01-28 Acuson Corporation Medical diagnostic ultrasound system and method for versatile processing
US6198283B1 (en) * 1998-09-18 2001-03-06 Ge Medical Systems Global Technology Llc System and method of phase sensitive MRI reconstruction using partial k-space data and including a network
US6142946A (en) * 1998-11-20 2000-11-07 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with cordless scanheads
US6159150A (en) * 1998-11-20 2000-12-12 Acuson Corporation Medical diagnostic ultrasonic imaging system with auxiliary processor
US6381557B1 (en) * 1998-11-25 2002-04-30 Ge Medical Systems Global Technology Company, Llc Medical imaging system service evaluation method and apparatus
US6603494B1 (en) * 1998-11-25 2003-08-05 Ge Medical Systems Global Technology Company, Llc Multiple modality interface for imaging systems including remote services over a network
US6139498A (en) * 1998-12-29 2000-10-31 Ge Diasonics Israel, Ltd. Ultrasound system performing simultaneous parallel computer instructions
US6106468A (en) * 1999-04-05 2000-08-22 Agilent Technologies, Inc. Ultrasound system employing a unified memory
US6210327B1 (en) * 1999-04-28 2001-04-03 General Electric Company Method and apparatus for sending ultrasound image data to remotely located device
US6139496A (en) * 1999-04-30 2000-10-31 Agilent Technologies, Inc. Ultrasonic imaging system having isonification and display functions integrated in an easy-to-manipulate probe assembly
US6669633B2 (en) * 1999-06-22 2003-12-30 Teratech Corporation Unitary operator control for ultrasonic imaging graphical user interface
US20030176787A1 (en) * 1999-06-22 2003-09-18 Teratech Corporation Ultrasound probe with integrated electronics
US20030073894A1 (en) * 1999-06-22 2003-04-17 Tera Tech Corporation Ultrasound probe with integrated electronics
US6306089B1 (en) * 1999-09-24 2001-10-23 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with customized measurements and calculations
US6569097B1 (en) * 2000-07-21 2003-05-27 Diagnostics Ultrasound Corporation System for remote evaluation of ultrasound information obtained by a programmed application-specific data collection device
US7331925B2 (en) * 2000-07-21 2008-02-19 Verathon, Inc. System for remote evaluation of ultrasound information obtained by a programmed application-specific data collection device
US6464636B1 (en) * 2000-10-18 2002-10-15 Koninklijke Philips Electronics N.V. Configuration tool for use in ultrasound imaging device
US7110583B2 (en) * 2001-01-31 2006-09-19 Matsushita Electric Industrial, Co., Ltd. Ultrasonic diagnostic device and image processing device
US6537219B2 (en) * 2001-04-04 2003-03-25 Koninklijke Philips Electronics N.V. Static focus ultrasound apparatus and method
US6475146B1 (en) * 2001-09-24 2002-11-05 Siemens Medical Solutions Usa, Inc. Method and system for using personal digital assistants with diagnostic medical ultrasound systems
US20030088182A1 (en) * 2001-09-28 2003-05-08 Teratech Corporation Ultrasound imaging system
US20030125629A1 (en) * 2002-01-02 2003-07-03 Ustuner E. Tuncay Ultrasound system and method
US20030233046A1 (en) * 2002-06-17 2003-12-18 Board Of Trustees Of The University Of Arkansas Ultrasonic guided catheter deployment system
US20050228281A1 (en) * 2004-03-31 2005-10-13 Nefos Thomas P Handheld diagnostic ultrasound system with head mounted display
US20130184587A1 (en) * 2012-01-17 2013-07-18 Samsung Electronics Co., Ltd. Probe device, server, system for diagnosing ultrasound image, and method of processing ultrasound image

Cited By (658)

Publication number Priority date Publication date Assignee Title
US20130245449A1 (en) * 1996-06-28 2013-09-19 Sonosite, Inc. Balance body ultrasound system
US9301705B2 (en) 1998-11-04 2016-04-05 Johns Hopkins University School Of Medicine System and method for magnetic-resonance-guided electrophysiologic and ablation procedures
US20030050557A1 (en) * 1998-11-04 2003-03-13 Susil Robert C. Systems and methods for magnetic-resonance-guided interventional procedures
US8099151B2 (en) 1998-11-04 2012-01-17 Johns Hopkins University School Of Medicine System and method for magnetic-resonance-guided electrophysiologic and ablation procedures
US20080058635A1 (en) * 1998-11-04 2008-03-06 Johns Hopkins University School Of Medicine Mri-guided therapy methods and related systems
US7844319B2 (en) 1998-11-04 2010-11-30 Susil Robert C Systems and methods for magnetic-resonance-guided interventional procedures
US7822460B2 (en) 1998-11-04 2010-10-26 Surgi-Vision, Inc. MRI-guided therapy methods and related systems
US20030018246A1 (en) * 1999-03-11 2003-01-23 Assaf Govari Guidance of invasive medical procedures using implantable tags
US20030004411A1 (en) * 1999-03-11 2003-01-02 Assaf Govari Invasive medical device with position sensing and display
US20020107445A1 (en) * 1999-03-11 2002-08-08 Assaf Govari Implantable and insertable passive tags
US7590441B2 (en) * 1999-03-11 2009-09-15 Biosense, Inc. Invasive medical device with position sensing and display
US11547382B2 (en) 1999-06-22 2023-01-10 Teratech Corporation Networked ultrasound system and method for imaging a medical procedure using an invasive probe
US8075485B2 (en) * 2000-12-13 2011-12-13 Deltex Medical Limited Gain setting in doppler haemodynamic monitors
US20090149758A1 (en) * 2000-12-13 2009-06-11 Leonard Smith Gain Setting in Doppler Haemodynamic Monitors
US20100168821A1 (en) * 2001-04-13 2010-07-01 Greatbatch Ltd. Switched diverter circuits for minimizing heating of an implanted lead in a high power electromagnetic field environment
US20100160997A1 (en) * 2001-04-13 2010-06-24 Greatbatch Ltd. Tuned energy balanced system for minimizing heating and/or to provide emi protection of implanted leads in a high power electromagnetic field environment
US8457760B2 (en) 2001-04-13 2013-06-04 Greatbatch Ltd. Switched diverter circuits for minimizing heating of an implanted lead and/or providing EMI protection in a high power electromagnetic field environment
US20100191236A1 (en) * 2001-04-13 2010-07-29 Greatbatch Ltd. Switched diverter circuits for minimizing heating of an implanted lead and/or providing emi protection in a high power electromagnetic field environment
US9242090B2 (en) 2001-04-13 2016-01-26 MRI Interventions Inc. MRI compatible medical leads
US8855785B1 (en) 2001-04-13 2014-10-07 Greatbatch Ltd. Circuits for minimizing heating of an implanted lead and/or providing EMI protection in a high power electromagnetic field environment
US9248283B2 (en) 2001-04-13 2016-02-02 Greatbatch Ltd. Band stop filter comprising an inductive component disposed in a lead wire in series with an electrode
US8989870B2 (en) 2001-04-13 2015-03-24 Greatbatch Ltd. Tuned energy balanced system for minimizing heating and/or to provide EMI protection of implanted leads in a high power electromagnetic field environment
US20070088416A1 (en) * 2001-04-13 2007-04-19 Surgi-Vision, Inc. Mri compatible medical leads
US9295828B2 (en) 2001-04-13 2016-03-29 Greatbatch Ltd. Self-resonant inductor wound portion of an implantable lead for enhanced MRI compatibility of active implantable medical devices
US8509913B2 (en) 2001-04-13 2013-08-13 Greatbatch Ltd. Switched diverter circuits for minimizing heating of an implanted lead and/or providing EMI protection in a high power electromagnetic field environment
US20100217262A1 (en) * 2001-04-13 2010-08-26 Greatbatch Ltd. Frequency selective passive component networks for active implantable medical devices utilizing an energy dissipating surface
US8751013B2 (en) 2001-04-13 2014-06-10 Greatbatch Ltd. Switched diverter circuits for minimizing heating of an implanted lead and/or providing EMI protection in a high power electromagnetic field environment
US8219208B2 (en) 2001-04-13 2012-07-10 Greatbatch Ltd. Frequency selective passive component networks for active implantable medical devices utilizing an energy dissipating surface
US8600519B2 (en) 2001-04-13 2013-12-03 Greatbatch Ltd. Transient voltage/current protection system for electronic circuits associated with implanted leads
US8977355B2 (en) 2001-04-13 2015-03-10 Greatbatch Ltd. EMI filter employing a capacitor and an inductor tank circuit having optimum component values
US8172773B2 (en) 2002-03-19 2012-05-08 C. R. Bard, Inc. Biopsy device and biopsy needle module that can be inserted into the biopsy device
US10335128B2 (en) 2002-03-19 2019-07-02 C. R. Bard, Inc. Biopsy device and insertable biopsy needle module
US11382608B2 (en) 2002-03-19 2022-07-12 C. R. Bard, Inc. Disposable biopsy unit
US10271827B2 (en) 2002-03-19 2019-04-30 C. R. Bard, Inc. Disposable biopsy unit
US9421002B2 (en) 2002-03-19 2016-08-23 C. R. Bard, Inc. Disposable biopsy unit
US8002713B2 (en) 2002-03-19 2011-08-23 C. R. Bard, Inc. Biopsy device and insertable biopsy needle module
US8052614B2 (en) 2002-03-19 2011-11-08 C. R. Bard, Inc. Biopsy device having a vacuum pump
US8016772B2 (en) 2002-03-19 2011-09-13 C. R. Bard, Inc. Biopsy device for removing tissue specimens using a vacuum
US8951209B2 (en) 2002-03-19 2015-02-10 C. R. Bard, Inc. Biopsy device and insertable biopsy needle module
US8109885B2 (en) 2002-03-19 2012-02-07 C. R. Bard, Inc. Biopsy device for removing tissue specimens using a vacuum
US9439631B2 (en) 2002-03-19 2016-09-13 C. R. Bard, Inc. Biopsy device and insertable biopsy needle module
US9072502B2 (en) 2002-03-19 2015-07-07 C. R. Bard, Inc. Disposable biopsy unit
USRE42856E1 (en) 2002-05-29 2011-10-18 MRI Interventions, Inc. Magnetic resonance probes
USRE44736E1 (en) 2002-05-29 2014-01-28 MRI Interventions, Inc. Magnetic resonance probes
US20090062644A1 (en) * 2002-06-07 2009-03-05 Mcmorrow Gerald System and method for ultrasound harmonic imaging
US20040024306A1 (en) * 2002-07-29 2004-02-05 Hamilton Craig A. Cardiac diagnostics using time compensated stress test cardiac MRI imaging and systems for cardiac diagnostics
US8494611B2 (en) * 2002-07-29 2013-07-23 Wake Forest University Health Sciences Cardiac diagnostics using time compensated stress test cardiac MRI imaging and systems for cardiac diagnostics
US7162462B1 (en) * 2003-03-14 2007-01-09 Unisys Corporation Providing time sensitivity to an inference engine
US20070066894A1 (en) * 2003-03-28 2007-03-22 Koninklijke Philips Electronics N.V. Remote wireless control device for an ultrasound machine and method
US8728004B2 (en) 2003-03-29 2014-05-20 C.R. Bard, Inc. Biopsy needle system having a pressure generating unit
US8162851B2 (en) 2003-03-29 2012-04-24 C. R. Bard, Inc. Biopsy needle system having a pressure generating unit
US20100256448A1 (en) * 2003-04-01 2010-10-07 Boston Scientific Scimed, Inc. Fluid manifold for endoscope system
US8608648B2 (en) 2003-04-01 2013-12-17 Boston Scientific Scimed, Inc. Articulation joint
US20050197536A1 (en) * 2003-04-01 2005-09-08 Banik Michael S. Video endoscope
US10765307B2 (en) 2003-04-01 2020-09-08 Boston Scientific Scimed, Inc. Endoscopic imaging system
US8118732B2 (en) 2003-04-01 2012-02-21 Boston Scientific Scimed, Inc. Force feedback control system for video endoscope
US8622894B2 (en) 2003-04-01 2014-01-07 Boston Scientific Scimed, Inc. Articulation joint
US20050131279A1 (en) * 2003-04-01 2005-06-16 Boston Scientific Scimed, Inc. Articulation joint for video endoscope
US11324395B2 (en) 2003-04-01 2022-05-10 Boston Scientific Scimed, Inc. Endoscopic imaging system
US8535219B2 (en) 2003-04-01 2013-09-17 Boston Scientific Scimed, Inc. Fluid manifold for endoscope system
US9913573B2 (en) 2003-04-01 2018-03-13 Boston Scientific Scimed, Inc. Endoscopic imaging system
US8475366B2 (en) 2003-04-01 2013-07-02 Boston Scientific Scimed, Inc. Articulation joint for a medical device
US8425408B2 (en) 2003-04-01 2013-04-23 Boston Scientific Scimed, Inc. Articulation joint for video endoscope
US20100048999A1 (en) * 2003-04-01 2010-02-25 Boston Scientific Scimed, Inc. Video endoscope
US20100076266A1 (en) * 2003-04-01 2010-03-25 Boston Scientific Scimed, Inc Articulation joint for video endoscope
US20040227723A1 (en) * 2003-05-16 2004-11-18 Fisher-Rosemount Systems, Inc. One-handed operation of a handheld field maintenance tool
US7199784B2 (en) * 2003-05-16 2007-04-03 Fisher Rosemount Systems, Inc. One-handed operation of a handheld field maintenance tool
US6988990B2 (en) * 2003-05-29 2006-01-24 General Electric Company Automatic annotation filler system and method for use in ultrasound imaging
US20040242998A1 (en) * 2003-05-29 2004-12-02 Ge Medical Systems Global Technology Company, Llc Automatic annotation filler system and method for use in ultrasound imaging
US20050050403A1 (en) * 2003-08-26 2005-03-03 Frank Glaser Method for requesting information regarding a network subscriber station in a network of distributed stations, and network subscriber station for carrying out the method
JP2005102176A (en) * 2003-08-26 2005-04-14 Thomson Licensing Sa Request method for information about network substitution station and network substitution station executing the method
JP4619726B2 (en) * 2003-08-26 2011-01-26 トムソン ライセンシング Method for requesting information relating to network subscriber station and network subscriber station performing the method
US20050113689A1 (en) * 2003-11-21 2005-05-26 Arthur Gritzky Method and apparatus for performing multi-mode imaging
US20050114175A1 (en) * 2003-11-25 2005-05-26 O'dea Paul J. Method and apparatus for managing ultrasound examination information
US20070109294A1 (en) * 2003-11-26 2007-05-17 Koninklijke Philips Electronics Nv Workflow optimization for high throughput imaging environments
US8712798B2 (en) * 2003-11-26 2014-04-29 Koninklijke Philips N.V. Workflow optimization for high throughput imaging environments
US11675073B2 (en) 2003-11-26 2023-06-13 Teratech Corporation Modular portable ultrasound systems
US7998072B2 (en) * 2003-12-19 2011-08-16 Siemens Medical Solutions Usa, Inc. Probe based digitizing or compression system and method for medical ultrasound
US20050148878A1 (en) * 2003-12-19 2005-07-07 Siemens Medical Solutions Usa, Inc. Probe based digitizing or compression system and method for medical ultrasound
US20050148873A1 (en) * 2003-12-19 2005-07-07 Siemens Medical Solutions Usa, Inc. Ultrasound adaptor methods and systems for transducer and system separation
US8257262B2 (en) 2003-12-19 2012-09-04 Siemens Medical Solutions Usa, Inc. Ultrasound adaptor methods and systems for transducer and system separation
US20100312092A1 (en) * 2004-02-09 2010-12-09 Roch Listz Maurice Method and system for vascular elastography
US7972268B2 (en) 2004-02-26 2011-07-05 Siemens Medical Solutions Usa, Inc. Steered continuous wave doppler methods for two-dimensional ultrasound transducer arrays
US20080027322A1 (en) * 2004-02-26 2008-01-31 Siemens Medical Solutions Usa, Inc. Steered continuous wave doppler methods and systems for two-dimensional ultrasound transducer arrays
US20050234321A1 (en) * 2004-03-26 2005-10-20 Fuji Photo Film Co., Ltd. Diagnostic support system and method used for the same
US7680678B2 (en) * 2004-03-26 2010-03-16 Fujifilm Corporation Diagnostic support system and method used for the same
US8213467B2 (en) * 2004-04-08 2012-07-03 Sonosite, Inc. Systems and methods providing ASICs for use in multiple applications
US20050228287A1 (en) * 2004-04-08 2005-10-13 Sonosite, Inc. Systems and methods providing ASICs for use in multiple applications
US20090227873A1 (en) * 2004-04-19 2009-09-10 Koninklijke Philips Electronics, N.V. Data visualization method for an ultrasound imaging system
US20050246064A1 (en) * 2004-04-29 2005-11-03 Smith Gregory C Method for detecting position errors using a motion detector
US20060122489A1 (en) * 2004-06-23 2006-06-08 Makoto Kato Vascular endothelial reactivity measuring apparatus and method for controlling the measuring apparatus
US8025502B2 (en) * 2004-07-02 2011-09-27 Discus Dental, Llc Light guide for dentistry applications
US20060029901A1 (en) * 2004-07-02 2006-02-09 Discus Dental Impressions, Inc. Light guide for dentistry applications
US8864680B2 (en) 2004-07-09 2014-10-21 Bard Peripheral Vascular, Inc. Transport system for biopsy device
US8992440B2 (en) 2004-07-09 2015-03-31 Bard Peripheral Vascular, Inc. Length detection system for biopsy device
US8157744B2 (en) 2004-07-09 2012-04-17 Bard Peripheral Vascular, Inc. Tissue sample flushing system for biopsy device
US9872672B2 (en) 2004-07-09 2018-01-23 Bard Peripheral Vascular, Inc. Length detection system for biopsy device
US9345458B2 (en) 2004-07-09 2016-05-24 Bard Peripheral Vascular, Inc. Transport system for biopsy device
US8926527B2 (en) 2004-07-09 2015-01-06 Bard Peripheral Vascular, Inc. Tissue sample flushing system for biopsy device
US8366636B2 (en) 2004-07-09 2013-02-05 Bard Peripheral Vascular, Inc. Firing system for biopsy device
US9456809B2 (en) 2004-07-09 2016-10-04 Bard Peripheral Vascular, Inc. Tissue sample flushing system for biopsy device
US8052615B2 (en) 2004-07-09 2011-11-08 Bard Peripheral Vascular, Inc. Length detection system for biopsy device
US10499888B2 (en) 2004-07-09 2019-12-10 Bard Peripheral Vascular, Inc. Tissue sample flushing system for biopsy device
US10166011B2 (en) 2004-07-09 2019-01-01 Bard Peripheral Vascular, Inc. Transport system for biopsy device
US20060034538A1 (en) * 2004-07-23 2006-02-16 Agfa Corporation Method and system of automating echocardiogram measurements and reporting
US10864385B2 (en) 2004-09-24 2020-12-15 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US9895560B2 (en) 2004-09-24 2018-02-20 Guided Therapy Systems, Llc Methods for rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US11590370B2 (en) 2004-09-24 2023-02-28 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US10328289B2 (en) 2004-09-24 2019-06-25 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US10238894B2 (en) 2004-10-06 2019-03-26 Guided Therapy Systems, L.L.C. Energy based fat reduction
US10888716B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, Llc Energy based fat reduction
US10252086B2 (en) 2004-10-06 2019-04-09 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US9694211B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US9707412B2 (en) 2004-10-06 2017-07-18 Guided Therapy Systems, Llc System and method for fat and cellulite reduction
US9533175B2 (en) 2004-10-06 2017-01-03 Guided Therapy Systems, Llc Energy based fat reduction
US9713731B2 (en) 2004-10-06 2017-07-25 Guided Therapy Systems, Llc Energy based fat reduction
US9522290B2 (en) 2004-10-06 2016-12-20 Guided Therapy Systems, Llc System and method for fat and cellulite reduction
US11717707B2 (en) 2004-10-06 2023-08-08 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US10960236B2 (en) 2004-10-06 2021-03-30 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US9827450B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. System and method for fat and cellulite reduction
US9827449B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US11400319B2 (en) 2004-10-06 2022-08-02 Guided Therapy Systems, Llc Methods for lifting skin tissue
US10603523B2 (en) 2004-10-06 2020-03-31 Guided Therapy Systems, Llc Ultrasound probe for tissue treatment
US11167155B2 (en) 2004-10-06 2021-11-09 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10888717B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, Llc Probe for ultrasound tissue treatment
US9421029B2 (en) 2004-10-06 2016-08-23 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US9833639B2 (en) 2004-10-06 2017-12-05 Guided Therapy Systems, L.L.C. Energy based fat reduction
US11179580B2 (en) 2004-10-06 2021-11-23 Guided Therapy Systems, Llc Energy based fat reduction
US9833640B2 (en) 2004-10-06 2017-12-05 Guided Therapy Systems, L.L.C. Method and system for ultrasound treatment of skin
US11697033B2 (en) 2004-10-06 2023-07-11 Guided Therapy Systems, Llc Methods for lifting skin tissue
US10046182B2 (en) 2004-10-06 2018-08-14 Guided Therapy Systems, Llc Methods for face and neck lifts
US10046181B2 (en) 2004-10-06 2018-08-14 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US10010724B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US11207547B2 (en) 2004-10-06 2021-12-28 Guided Therapy Systems, Llc Probe for ultrasound tissue treatment
US9440096B2 (en) 2004-10-06 2016-09-13 Guided Therapy Systems, Llc Method and system for treating stretch marks
US10525288B2 (en) 2004-10-06 2020-01-07 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US10245450B2 (en) 2004-10-06 2019-04-02 Guided Therapy Systems, Llc Ultrasound probe for fat and cellulite reduction
US11235179B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc Energy based skin gland treatment
US9320537B2 (en) 2004-10-06 2016-04-26 Guided Therapy Systems, Llc Methods for noninvasive skin tightening
US9694212B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, Llc Method and system for ultrasound treatment of skin
US11883688B2 (en) 2004-10-06 2024-01-30 Guided Therapy Systems, Llc Energy based fat reduction
US10532230B2 (en) 2004-10-06 2020-01-14 Guided Therapy Systems, Llc Methods for face and neck lifts
US9427600B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US10603519B2 (en) 2004-10-06 2020-03-31 Guided Therapy Systems, Llc Energy based fat reduction
US9283410B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, L.L.C. System and method for fat and cellulite reduction
US9283409B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, Llc Energy based fat reduction
US9427601B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, Llc Methods for face and neck lifts
US10888718B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US9974982B2 (en) 2004-10-06 2018-05-22 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US10010726B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US11235180B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US10010721B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, L.L.C. Energy based fat reduction
US10610706B2 (en) 2004-10-06 2020-04-07 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10610705B2 (en) 2004-10-06 2020-04-07 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US10010725B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, Llc Ultrasound probe for fat and cellulite reduction
US10265550B2 (en) 2004-10-06 2019-04-23 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US11338156B2 (en) 2004-10-06 2022-05-24 Guided Therapy Systems, Llc Noninvasive tissue tightening system
US11207548B2 (en) 2004-10-07 2021-12-28 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US11724133B2 (en) 2004-10-07 2023-08-15 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US20110218442A1 (en) * 2004-10-14 2011-09-08 Scimed Life Systems, Inc. Integrated bias circuitry for ultrasound imaging devices configured to image the interior of a living being
US8313438B2 (en) 2004-10-14 2012-11-20 Scimed Life Systems, Inc. Integrated bias circuitry for ultrasound imaging devices configured to image the interior of a living being
US7967754B2 (en) * 2004-10-14 2011-06-28 Scimed Life Systems, Inc. Integrated bias circuitry for ultrasound imaging devices configured to image the interior of a living being
US20060084875A1 (en) * 2004-10-14 2006-04-20 Scimed Life Systems, Inc. Integrated bias circuitry for ultrasound imaging devices
US20060092930A1 (en) * 2004-10-28 2006-05-04 General Electric Company Ultrasound beamformer with high speed serial control bus packetized protocol
US7611463B2 (en) 2004-10-28 2009-11-03 General Electric Company Ultrasound beamformer with high speed serial control bus packetized protocol
US7228154B2 (en) 2004-11-03 2007-06-05 Sony Corporation Method and system for processing wireless digital multimedia
US20060092893A1 (en) * 2004-11-03 2006-05-04 Mark Champion Method and system for processing wireless digital multimedia
JPWO2006054635A1 (en) * 2004-11-17 2008-05-29 株式会社日立メディコ Ultrasonic diagnostic apparatus and ultrasonic image display method
US20090124903A1 (en) * 2004-11-17 2009-05-14 Takashi Osaka Ultrasound Diagnostic Apparatus and Method of Displaying Ultrasound Image
EP1815796A1 (en) * 2004-11-17 2007-08-08 Hitachi Medical Corporation Ultrasonograph and ultrasonic image display method
US8708912B2 (en) 2004-11-17 2014-04-29 Hitachi Medical Corporation Ultrasound diagnostic apparatus and method of displaying ultrasound image
EP1815796A4 (en) * 2004-11-17 2009-10-28 Hitachi Medical Corp Ultrasonograph and ultrasonic image display method
JP5113387B2 (en) * 2004-11-17 2013-01-09 株式会社日立メディコ Ultrasonic diagnostic apparatus and ultrasonic image display method
US20060116579A1 (en) * 2004-11-29 2006-06-01 Pai-Chi Li Ultrasound imaging apparatus and method thereof
US20060173335A1 (en) * 2005-01-11 2006-08-03 General Electric Company Ultrasound beamformer with scalable receiver boards
US8002708B2 (en) 2005-01-11 2011-08-23 General Electric Company Ultrasound beamformer with scalable receiver boards
US8702621B2 (en) 2005-01-31 2014-04-22 C.R. Bard, Inc. Quick cycle biopsy system
US11166702B2 (en) 2005-01-31 2021-11-09 C.R. Bard, Inc. Quick cycle biopsy system
US8012102B2 (en) 2005-01-31 2011-09-06 C. R. Bard, Inc. Quick cycle biopsy system
US8702622B2 (en) 2005-01-31 2014-04-22 C.R. Bard, Inc. Quick cycle biopsy system
US9161743B2 (en) 2005-01-31 2015-10-20 C. R. Bard, Inc. Quick cycle biopsy system
US10058308B2 (en) 2005-01-31 2018-08-28 C. R. Bard, Inc. Method for operating a biopsy apparatus
US20060203963A1 (en) * 2005-02-14 2006-09-14 Helmut Biedermann Medical apparatus system, and method for operation thereof
US9144413B2 (en) * 2005-03-30 2015-09-29 Hitachi Medical Corporation Ultrasonic diagnostic apparatus
US20090149750A1 (en) * 2005-03-30 2009-06-11 Hitachi Medical Corporation Ultrasonic Diagnostic Apparatus
US20080194951A1 (en) * 2005-04-18 2008-08-14 Koninklijke Philips Electronics N.V. Ultrasonic Diagnostic Imaging System Configured By Probe Firmware
WO2006111872A3 (en) * 2005-04-18 2007-03-01 Koninkl Philips Electronics Nv Pc-based portable ultrasonic diagnostic imaging system
US9301730B2 (en) * 2005-04-18 2016-04-05 Koninklijke Philips N.V. Ultrasonic diagnostic imaging system configured by probe firmware
WO2006111872A2 (en) * 2005-04-18 2006-10-26 Koninklijke Philips Electronics, N.V. Pc-based portable ultrasonic diagnostic imaging system
US20080208045A1 (en) * 2005-05-10 2008-08-28 Koninklijke Philips Electronics N.V. Optimization of User Settings for an Ultrasonic Imaging System
US7871379B2 (en) * 2005-06-20 2011-01-18 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and method of ultrasonic measurement
US20070038086A1 (en) * 2005-06-20 2007-02-15 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and method of ultrasonic measurement
US20060293594A1 (en) * 2005-06-24 2006-12-28 Siemens Aktiengesellschaft Device for carrying out intravascular examinations
US7627540B2 (en) * 2005-06-28 2009-12-01 Neurosciences Research Foundation, Inc. Addressing scheme for neural modeling and brain-based devices using special purpose processor
US20090240642A1 (en) * 2005-06-28 2009-09-24 Neurosciences Research Foundation, Inc. Neural modeling and brain-based devices using special purpose processor
US20070011118A1 (en) * 2005-06-28 2007-01-11 Snook James A Addressing Scheme for Neural Modeling and Brain-Based Devices using Special Purpose Processor
US8326782B2 (en) 2005-06-28 2012-12-04 Neurosciences Research Foundation, Inc. Addressing scheme for neural modeling and brain-based devices using special purpose processor
US8126828B2 (en) * 2005-06-28 2012-02-28 Neurosciences Research Foundation, Inc. Special purpose processor implementing a synthetic neural model of the human brain
US20070016027A1 (en) * 2005-07-14 2007-01-18 Marco Gerois D Method and apparatus for utilizing a high speed serial data bus interface within an ultrasound system
EP1672552A1 (en) * 2005-08-02 2006-06-21 Agilent Technologies Inc Independently installable component for measurement device
US20070033587A1 (en) * 2005-08-02 2007-02-08 Stephan Siebrecht Independently installable component for measurement device
US8961430B2 (en) 2005-08-10 2015-02-24 C.R. Bard, Inc. Single-insertion, multiple sampling biopsy device usable with various transport systems and integrated markers
US8771200B2 (en) 2005-08-10 2014-07-08 C.R. Bard, Inc. Single insertion, multiple sampling biopsy device with linear drive
US8267868B2 (en) 2005-08-10 2012-09-18 C. R. Bard, Inc. Single-insertion, multiple sample biopsy device with integrated markers
US11219431B2 (en) 2005-08-10 2022-01-11 C.R. Bard, Inc. Single-insertion, multiple sampling biopsy device with linear drive
US8728003B2 (en) 2005-08-10 2014-05-20 C.R. Bard Inc. Single insertion, multiple sample biopsy device with integrated markers
US8282574B2 (en) 2005-08-10 2012-10-09 C. R. Bard, Inc. Single-insertion, multiple sampling biopsy device usable with various transport systems and integrated markers
US10010307B2 (en) 2005-08-10 2018-07-03 C. R. Bard, Inc. Single-insertion, multiple sampling biopsy device with linear drive
US8721563B2 (en) 2005-08-10 2014-05-13 C. R. Bard, Inc. Single-insertion, multiple sample biopsy device with integrated markers
US10368849B2 (en) 2005-08-10 2019-08-06 C. R. Bard, Inc. Single-insertion, multiple sampling biopsy device usable with various transport systems and integrated markers
US11849928B2 (en) 2005-08-10 2023-12-26 C. R. Bard, Inc. Single-insertion, multiple sampling biopsy device usable with various transport systems and integrated markers
US8262585B2 (en) 2005-08-10 2012-09-11 C. R. Bard, Inc. Single-insertion, multiple sampling biopsy device with linear drive
US20070043454A1 (en) * 2005-08-22 2007-02-22 John Sonnenberg Multi-function remote controller and programmer for landscape systems
WO2007067200A3 (en) * 2005-10-26 2007-12-06 Aloka Co Ltd Method and apparatus for elasticity imaging
WO2007067200A2 (en) * 2005-10-26 2007-06-14 Aloka Co., Ltd. Method and apparatus for elasticity imaging
US9629604B2 (en) * 2005-12-26 2017-04-25 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus
US20070167766A1 (en) * 2005-12-26 2007-07-19 Masao Takimoto Ultrasonic diagnostic apparatus
US20070195539A1 (en) * 2006-02-21 2007-08-23 Karl Storz Gmbh & Co. Kg Ultra wide band wireless optical endoscopic device
EP1847214A3 (en) * 2006-04-20 2010-11-03 Karl Storz Endovision, Inc. Ultra wide band wireless optical endoscopic device
EP1847214A2 (en) * 2006-04-20 2007-10-24 Karl Storz Endovision, Inc. Ultra wide band wireless optical endoscopic device
US20070282201A1 (en) * 2006-05-03 2007-12-06 Nam Ju Kim Ultrasonic moving-picture real-time service system and method and recording medium having embodied thereon computer program for performing method
US9024971B2 (en) 2006-05-05 2015-05-05 General Electric Company User interface and method for identifying related information displayed in an ultrasound system
US20070258632A1 (en) * 2006-05-05 2007-11-08 General Electric Company User interface and method for identifying related information displayed in an ultrasound system
US8471866B2 (en) 2006-05-05 2013-06-25 General Electric Company User interface and method for identifying related information displayed in an ultrasound system
US8937630B2 (en) 2006-05-08 2015-01-20 C. R. Bard, Inc. User interface and methods for sonographic display device
US8228347B2 (en) 2006-05-08 2012-07-24 C. R. Bard, Inc. User interface and methods for sonographic display device
US8432417B2 (en) 2006-05-08 2013-04-30 C. R. Bard, Inc. User interface and methods for sonographic display device
US20100222856A1 (en) * 2006-06-08 2010-09-02 Greatbatch Ltd. Band stop filter employing a capacitor and an inductor tank circuit to enhance MRI compatibility of active medical devices
US8275466B2 (en) 2006-06-08 2012-09-25 Greatbatch Ltd. Band stop filter employing a capacitor and an inductor tank circuit to enhance MRI compatibility of active medical devices
US9119968B2 (en) 2006-06-08 2015-09-01 Greatbatch Ltd. Band stop filter employing a capacitor and an inductor tank circuit to enhance MRI compatibility of active medical devices
US9008799B2 (en) 2006-06-08 2015-04-14 Greatbatch Ltd. EMI filter employing a self-resonant inductor bandstop filter having optimum inductance and capacitance values
US8903505B2 (en) 2006-06-08 2014-12-02 Greatbatch Ltd. Implantable lead bandstop filter employing an inductive coil with parasitic capacitance to enhance MRI compatibility of active medical devices
US9439632B2 (en) 2006-08-21 2016-09-13 C. R. Bard, Inc. Self-contained handheld biopsy needle
US10617399B2 (en) 2006-08-21 2020-04-14 C.R. Bard, Inc. Self-contained handheld biopsy needle
US8251917B2 (en) 2006-08-21 2012-08-28 C. R. Bard, Inc. Self-contained handheld biopsy needle
US8951208B2 (en) 2006-08-21 2015-02-10 C. R. Bard, Inc. Self-contained handheld biopsy needle
US9144417B2 (en) 2006-09-15 2015-09-29 Volcano Corporation Controller user interface for a catheter lab intravascular ultrasound system
US8491477B2 (en) 2006-10-02 2013-07-23 University Of Washington Ultrasonic estimation of strain induced by in vivo compression
US20100094131A1 (en) * 2006-10-02 2010-04-15 Washington, University Of Ultrasonic estimation of strain induced by in vivo compression
US20080086054A1 (en) * 2006-10-04 2008-04-10 Slayton Michael H Ultrasound system and method for imaging and/or measuring displacement of moving tissue and fluid
US9241683B2 (en) * 2006-10-04 2016-01-26 Ardent Sound Inc. Ultrasound system and method for imaging and/or measuring displacement of moving tissue and fluid
US9566045B2 (en) 2006-10-06 2017-02-14 Bard Peripheral Vascular, Inc. Tissue handling system with reduced operator exposure
US8485987B2 (en) 2006-10-06 2013-07-16 Bard Peripheral Vascular, Inc. Tissue handling system with reduced operator exposure
US11559289B2 (en) 2006-10-06 2023-01-24 Bard Peripheral Vascular, Inc. Tissue handling system with reduced operator exposure
US10172594B2 (en) 2006-10-06 2019-01-08 Bard Peripheral Vascular, Inc. Tissue handling system with reduced operator exposure
US8262586B2 (en) 2006-10-24 2012-09-11 C. R. Bard, Inc. Large sample low aspect ratio biopsy needle
US11583261B2 (en) 2006-10-24 2023-02-21 C. R. Bard, Inc. Large sample low aspect ratio biopsy needle
US10149664B2 (en) 2006-10-24 2018-12-11 C. R. Bard, Inc. Large sample low aspect ratio biopsy needle
US20080104533A1 (en) * 2006-10-31 2008-05-01 Steffen List Method and system for generation of a user interface
US8086965B2 (en) * 2006-10-31 2011-12-27 Siemens Aktiengesellschaft Method and system for generation of a user interface
US8490489B2 (en) 2006-11-10 2013-07-23 Siemens Medical Solutions Usa, Inc. Transducer array imaging system
US20080114248A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US8499634B2 (en) * 2006-11-10 2013-08-06 Siemens Medical Solutions Usa, Inc. Transducer array imaging system
US9084574B2 (en) 2006-11-10 2015-07-21 Siemens Medical Solutions Usa, Inc. Transducer array imaging system
US20070161904A1 (en) * 2006-11-10 2007-07-12 Penrith Corporation Transducer array imaging system
US8312771B2 (en) 2006-11-10 2012-11-20 Siemens Medical Solutions Usa, Inc. Transducer array imaging system
US20080114247A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080114250A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US9295444B2 (en) 2006-11-10 2016-03-29 Siemens Medical Solutions Usa, Inc. Transducer array imaging system
US20080114241A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080114253A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080114245A1 (en) * 2006-11-10 2008-05-15 Randall Kevin S Transducer array imaging system
US20080114251A1 (en) * 2006-11-10 2008-05-15 Penrith Corporation Transducer array imaging system
US20080125655A1 (en) * 2006-11-23 2008-05-29 Medison Co., Ltd. Portable ultrasound system
US9055920B2 (en) 2006-12-07 2015-06-16 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US9259209B2 (en) 2006-12-07 2016-02-16 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US10321891B2 (en) 2006-12-07 2019-06-18 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US11633174B2 (en) 2006-12-07 2023-04-25 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for Time Gain and Lateral Gain Compensation
US10456111B2 (en) 2006-12-07 2019-10-29 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US8403855B2 (en) * 2006-12-07 2013-03-26 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US20110276283A1 (en) * 2006-12-07 2011-11-10 Samsung Medison Co., Ltd. Ultrasound system and signal processing unit configured for time gain and lateral gain compensation
US7684544B2 (en) * 2006-12-14 2010-03-23 Wilson Kevin S Portable digital radiographic devices
US20080146925A1 (en) * 2006-12-14 2008-06-19 Ep Medsystems, Inc. Integrated Electrophysiology and Ultrasound Imaging System
US20080144777A1 (en) * 2006-12-14 2008-06-19 Wilson Kevin S Portable digital radiographic devices
EP1935343A1 (en) * 2006-12-18 2008-06-25 Esaote S.p.A. Ergonomic housing for electroacoustic transducers and ultrasound probe with said housing
US20080194964A1 (en) * 2007-02-08 2008-08-14 Randall Kevin S Ultrasound imaging systems
US9706976B2 (en) * 2007-02-08 2017-07-18 Siemens Medical Solutions Usa, Inc. Ultrasound imaging systems and methods of performing ultrasound procedures
US20080194920A1 (en) * 2007-02-09 2008-08-14 Siemens Aktiengesellschaft Methods for determining parameters and planning clinical studies in automatic study and data management systems
US8529446B2 (en) * 2007-02-09 2013-09-10 Siemens Aktiengesellschaft Methods for determining parameters and planning clinical studies in automatic study and data management systems
US20080208061A1 (en) * 2007-02-23 2008-08-28 General Electric Company Methods and systems for spatial compounding in a handheld ultrasound device
US9298889B2 (en) 2007-03-09 2016-03-29 Spacelabs Healthcare Llc Health data collection tool
US8500645B2 (en) * 2007-04-10 2013-08-06 C. R. Bard, Inc. Low power ultrasound system
US9826960B2 (en) 2007-04-10 2017-11-28 C. R. Bard, Inc. Low power ultrasound system
US20080255451A1 (en) * 2007-04-10 2008-10-16 C.R. Bard, Inc. Low power ultrasound system
US20080287789A1 (en) * 2007-05-14 2008-11-20 Sonosite, Inc. Computed volume sonography
US9213086B2 (en) * 2007-05-14 2015-12-15 Fujifilm Sonosite, Inc. Computed volume sonography
US8249691B2 (en) 2007-05-16 2012-08-21 Boundary Life Sciences Global motion invariant signatures for fast and accurate motion tracking in a digital image-based elasto-tomography system
US20080287780A1 (en) * 2007-05-16 2008-11-20 James Geoffrey Chase Integral based parameter identification applied to three dimensional tissue stiffness reconstruction in a digital image-based elasto-tomography system
US20080287807A1 (en) * 2007-05-16 2008-11-20 James Geoffrey Chase Global motion invariant signatures for fast and accurate motion tracking in a digital image-based elasto-tomography system
US20110208043A1 (en) * 2007-05-16 2011-08-25 Boundary Life Sciences Global motion invariant signatures for fast and accurate motion tracking in a digital image-based elasto-tomography system
US20090105585A1 (en) * 2007-05-16 2009-04-23 Yanwei Wang System and method for ultrasonic harmonic imaging
US9784805B2 (en) 2007-06-19 2017-10-10 Koninklijke Philips N.V. MRI radio frequency receiver comprising digital down converter with connector that passes analog signal being contained within radio frequency receiver coil unit
US20080314530A1 (en) * 2007-06-22 2008-12-25 Li-Ming Cheng Window coverings
WO2009012298A3 (en) * 2007-07-16 2009-03-26 Sunrise Medical Hhg Inc Physiological data collection system
US20110172503A1 (en) * 2007-07-16 2011-07-14 Sunrise Medical Hhg, Inc. Physiological Data Collection System
US9307951B2 (en) * 2007-08-08 2016-04-12 Hitachi Aloka Medical, Ltd. Ultrasound diagnosis apparatus
US20090043196A1 (en) * 2007-08-08 2009-02-12 Aloka Co., Ltd. Ultrasound diagnosis apparatus
US8289284B2 (en) 2007-08-09 2012-10-16 Volcano Corporation Controller user interface for a catheter lab intravascular ultrasound system
US8531428B2 (en) 2007-08-09 2013-09-10 Volcano Corporation Controller user interface for a catheter lab intravascular ultrasound system
US8803837B2 (en) 2007-08-09 2014-08-12 Volcano Corporation Controller user interface for a catheter lab intravascular ultrasound system
WO2009021179A1 (en) * 2007-08-09 2009-02-12 Volcano Corporation Controller user interface for a catheter lab intravascular ultrasound system
US20120296210A1 (en) * 2007-08-10 2012-11-22 Ultrasonix Medical Corporation Wireless network having portable ultrasound devices
US8414493B2 (en) * 2007-08-29 2013-04-09 Siemens Medical Solutions Usa, Inc. Automatic gain control in medical diagnostic ultrasound imaging
US20090062648A1 (en) * 2007-08-29 2009-03-05 Siemens Medical Solutions Usa, Inc. Automatic gain control in medical diagnostic ultrasound imaging
US20120179169A1 (en) * 2007-09-30 2012-07-12 Nitish Swarup User interface with state machine for alternate tool mode for robotic surgical tools
US20090088774A1 (en) * 2007-09-30 2009-04-02 Nitish Swarup Apparatus and method of user interface with alternate tool mode for robotic surgical tools
US9649174B2 (en) * 2007-09-30 2017-05-16 Intuitive Surgical Operations, Inc. User interface with state machine for alternate tool mode for robotic surgical tools
US9339343B2 (en) 2007-09-30 2016-05-17 Intuitive Surgical Operations, Inc. User interface methods for alternate tool modes for robotic surgical tools
US9050120B2 (en) 2007-09-30 2015-06-09 Intuitive Surgical Operations, Inc. Apparatus and method of user interface with alternate tool mode for robotic surgical tools
US10675000B2 (en) 2007-10-01 2020-06-09 Maui Imaging, Inc. Determining material stiffness using multiple aperture ultrasound
US9022938B2 (en) * 2007-10-25 2015-05-05 Medison Co., Ltd. Ultrasound diagnostic device and method for forming scan line data
US20090112093A1 (en) * 2007-10-25 2009-04-30 Medison Co., Ltd. Ultrasound diagnostic device and method for forming scan line data
US20090112070A1 (en) * 2007-10-31 2009-04-30 Yen-Shan Lin Telemedicine Device and System
US7839859B2 (en) * 2007-12-03 2010-11-23 Nec Laboratories America, Inc. Voice adaptive gateway pacing methods and systems for wireless multi-hop networks
US20090141631A1 (en) * 2007-12-03 2009-06-04 Nec Laboratories America, Inc. Voice adaptive gateway pacing methods and systems for wireless multi-hop networks
US9775588B2 (en) 2007-12-20 2017-10-03 C. R. Bard, Inc. Biopsy device
US10687791B2 (en) 2007-12-20 2020-06-23 C. R. Bard, Inc. Biopsy device
US8597205B2 (en) 2007-12-20 2013-12-03 C. R. Bard, Inc. Biopsy device
US8858463B2 (en) 2007-12-20 2014-10-14 C. R. Bard, Inc. Biopsy device
US20110054349A1 (en) * 2007-12-27 2011-03-03 Devicor Medical Products, Inc. Clutch and valving system for tetherless biopsy device
US8454532B2 (en) 2007-12-27 2013-06-04 Devicor Medical Products, Inc. Clutch and valving system for tetherless biopsy device
US8864682B2 (en) 2007-12-27 2014-10-21 Devicor Medical Products, Inc. Clutch and valving system for tetherless biopsy device
US20100331694A1 (en) * 2008-02-07 2010-12-30 Koji Waki Ultrasonic diagnostic apparatus
US9108066B2 (en) 2008-03-20 2015-08-18 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
US20090247871A1 (en) * 2008-03-25 2009-10-01 Tomy Varghese Rapid two/three-dimensional sector strain imaging
US8403850B2 (en) * 2008-03-25 2013-03-26 Wisconsin Alumni Research Foundation Rapid two/three-dimensional sector strain imaging
US20110160582A1 (en) * 2008-04-29 2011-06-30 Yongping Zheng Wireless ultrasonic scanning system
US9367214B2 (en) 2008-06-05 2016-06-14 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US8555201B2 (en) * 2008-06-05 2013-10-08 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US20090307619A1 (en) * 2008-06-05 2009-12-10 Qualcomm Incorporated Wireless communication device having deterministic control of foreground access of the user interface
US20090306518A1 (en) * 2008-06-06 2009-12-10 Boston Scientific Scimed, Inc. Transducers, devices and systems containing the transducers, and methods of manufacture
US8197413B2 (en) 2008-06-06 2012-06-12 Boston Scientific Scimed, Inc. Transducers, devices and systems containing the transducers, and methods of manufacture
US11723622B2 (en) 2008-06-06 2023-08-15 Ulthera, Inc. Systems for ultrasound treatment
US11123039B2 (en) 2008-06-06 2021-09-21 Ulthera, Inc. System and method for ultrasound treatment
US10537304B2 (en) 2008-06-06 2020-01-21 Ulthera, Inc. Hand wand for ultrasonic cosmetic treatment and imaging
US20100016763A1 (en) * 2008-07-17 2010-01-21 Keilman George W Intraluminal fluid property status sensing system and method
US20100056922A1 (en) * 2008-09-02 2010-03-04 Thierry Florent Method and diagnostic ultrasound apparatus for determining the condition of a person's artery or arteries
US20100118482A1 (en) * 2008-11-13 2010-05-13 Mosaid Technologies Incorporated System including a plurality of encapsulated semiconductor chips
US8908378B2 (en) 2008-11-13 2014-12-09 Conversant Intellectual Property Management Inc. System including a plurality of encapsulated semiconductor chips
US8472199B2 (en) * 2008-11-13 2013-06-25 Mosaid Technologies Incorporated System including a plurality of encapsulated semiconductor chips
US20100121195A1 (en) * 2008-11-13 2010-05-13 Kang Hak Il Medical instrument
US8337406B2 (en) 2008-11-20 2012-12-25 Medison Co., Ltd. Adaptive persistence processing of elastic images
EP2189116A1 (en) * 2008-11-20 2010-05-26 Medison Co., Ltd. Adaptive persistence processing of elastic images
US20100125205A1 (en) * 2008-11-20 2010-05-20 Sang Shik Park Adaptive Persistence Processing Of Elastic Images
US20110224552A1 (en) * 2008-12-03 2011-09-15 Koninklijke Philips Electronics N.V. Ultrasound assembly and system comprising interchangable transducers and displays
US20100145195A1 (en) * 2008-12-08 2010-06-10 Dong Gyu Hyun Hand-Held Ultrasound System
US20100208397A1 (en) * 2008-12-17 2010-08-19 Greatbatch Ltd. Switched safety protection circuit for an aimd system during exposure to high power electromagnetic fields
US8447414B2 (en) 2008-12-17 2013-05-21 Greatbatch Ltd. Switched safety protection circuit for an AIMD system during exposure to high power electromagnetic fields
US20100183208A1 (en) * 2009-01-21 2010-07-22 Kabushiki Kaisha Toshiba Image display method, medical diagnostic imaging apparatus, and medical image processing apparatus
US10210309B2 (en) * 2009-01-21 2019-02-19 Toshiba Medical Systems Corporation Image display method, medical diagnostic imaging apparatus, and medical image processing apparatus
US20100191092A1 (en) * 2009-01-29 2010-07-29 Bowers Jeffrey A Diagnostic delivery service
US20100189224A1 (en) * 2009-01-29 2010-07-29 Searete Llc Diagnostic delivery service
US20100187304A1 (en) * 2009-01-29 2010-07-29 Searete Llc, A Limited Liability Corporation Of The State Delaware Diagnostic delivery service
US8111809B2 (en) 2009-01-29 2012-02-07 The Invention Science Fund I, Llc Diagnostic delivery service
US8254524B2 (en) 2009-01-29 2012-08-28 The Invention Science Fund I, Llc Diagnostic delivery service
US8249218B2 (en) 2009-01-29 2012-08-21 The Invention Science Fund I, Llc Diagnostic delivery service
US8116429B2 (en) 2009-01-29 2012-02-14 The Invention Science Fund I, Llc Diagnostic delivery service
US20100191094A1 (en) * 2009-01-29 2010-07-29 Searete Llc Diagnostic delivery service
US8031838B2 (en) 2009-01-29 2011-10-04 The Invention Science Fund I, Llc Diagnostic delivery service
US20100189219A1 (en) * 2009-01-29 2010-07-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Diagnostic delivery service
US20100191093A1 (en) * 2009-01-29 2010-07-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Diagnostic delivery service
US8130904B2 (en) 2009-01-29 2012-03-06 The Invention Science Fund I, Llc Diagnostic delivery service
US8041008B2 (en) 2009-01-29 2011-10-18 The Invention Science Fund I, Llc Diagnostic delivery service
US8047714B2 (en) 2009-01-29 2011-11-01 The Invention Science Fund I, Llc Diagnostic delivery service
US8083406B2 (en) 2009-01-29 2011-12-27 The Invention Science Fund I, Llc Diagnostic delivery service
US20100191091A1 (en) * 2009-01-29 2010-07-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Diagnostic delivery service
US20100228130A1 (en) * 2009-03-09 2010-09-09 Teratech Corporation Portable ultrasound imaging system
US8690793B2 (en) 2009-03-16 2014-04-08 C. R. Bard, Inc. Biopsy device having rotational cutting
US8095224B2 (en) 2009-03-19 2012-01-10 Greatbatch Ltd. EMI shielded conduit assembly for an active implantable medical device
US10080889B2 (en) 2009-03-19 2018-09-25 Greatbatch Ltd. Low inductance and low resistance hermetically sealed filtered feedthrough for an AIMD
US20100249598A1 (en) * 2009-03-25 2010-09-30 General Electric Company Ultrasound probe with replaceable head portion
WO2010114573A1 (en) * 2009-04-01 2010-10-07 Analogic Corporation Ultrasound probe
US10736602B2 (en) 2009-04-01 2020-08-11 Bk Medical Holding Company, Inc. Ultrasound probe
US8708930B2 (en) 2009-04-15 2014-04-29 Bard Peripheral Vascular, Inc. Biopsy apparatus having integrated fluid management
US8708929B2 (en) 2009-04-15 2014-04-29 Bard Peripheral Vascular, Inc. Biopsy apparatus having integrated fluid management
US8708928B2 (en) 2009-04-15 2014-04-29 Bard Peripheral Vascular, Inc. Biopsy apparatus having integrated fluid management
US9622718B2 (en) 2009-04-24 2017-04-18 Konica Minolta, Inc. Wireless ultrasonic diagnostic apparatus, wireless ultrasonic probe, and probe authentication method
EP2422703A4 (en) * 2009-04-24 2014-04-16 Panasonic Corp Wireless ultrasonic diagnostic device, wireless ultrasonic probe, and probe certification method
EP2422703A1 (en) * 2009-04-24 2012-02-29 Panasonic Corporation Wireless ultrasonic diagnostic device, wireless ultrasonic probe, and probe certification method
US20110105904A1 (en) * 2009-04-24 2011-05-05 Yasuhito Watanabe Wireless ultrasonic diagnostic apparatus, wireless ultrasonic probe, and probe authentication method
US8366619B2 (en) * 2009-05-13 2013-02-05 University Of Washington Nodule screening using ultrasound elastography
US20100292571A1 (en) * 2009-05-13 2010-11-18 Washington, University Of Nodule screening using ultrasound elastography
US20100295870A1 (en) * 2009-05-22 2010-11-25 Amir Baghdadi Multi-source medical imaging system
US8845548B2 (en) 2009-06-12 2014-09-30 Devicor Medical Products, Inc. Cutter drive assembly for biopsy device
US9468424B2 (en) 2009-06-12 2016-10-18 Devicor Medical Products, Inc. Cutter drive assembly for biopsy device
US20100324423A1 (en) * 2009-06-23 2010-12-23 Essa El-Aklouk Ultrasound transducer device and method of operation
US20110019893A1 (en) * 2009-07-22 2011-01-27 Norbert Rahn Method and Device for Controlling the Ablation Energy for Performing an Electrophysiological Catheter Application
US10575833B2 (en) 2009-08-12 2020-03-03 C. R. Bard, Inc. Biopsy apparatus having integrated thumbwheel mechanism for manual rotation of biopsy cannula
US9173641B2 (en) 2009-08-12 2015-11-03 C. R. Bard, Inc. Biopsy apparatus having integrated thumbwheel mechanism for manual rotation of biopsy cannula
US9655599B2 (en) 2009-08-12 2017-05-23 C. R. Bard, Inc. Biopsy apparatus having integrated thumbwheel mechanism for manual rotation of biopsy cannula
US9949726B2 (en) 2009-09-01 2018-04-24 Bard Peripheral Vascular, Inc. Biopsy driver assembly having a control circuit for conserving battery power
US9282949B2 (en) 2009-09-01 2016-03-15 Bard Peripheral Vascular, Inc. Charging station for battery powered biopsy apparatus
US8485989B2 (en) 2009-09-01 2013-07-16 Bard Peripheral Vascular, Inc. Biopsy apparatus having a tissue sample retrieval mechanism
USD640977S1 (en) 2009-09-25 2011-07-05 C. R. Bard, Inc. Charging station for a battery operated biopsy device
US8283890B2 (en) 2009-09-25 2012-10-09 Bard Peripheral Vascular, Inc. Charging station for battery powered biopsy apparatus
US11103213B2 (en) 2009-10-08 2021-08-31 C. R. Bard, Inc. Spacers for use with an ultrasound probe
US10639008B2 (en) 2009-10-08 2020-05-05 C. R. Bard, Inc. Support and cover structures for an ultrasound probe head
US8597206B2 (en) 2009-10-12 2013-12-03 Bard Peripheral Vascular, Inc. Biopsy probe assembly having a mechanism to prevent misalignment of components prior to installation
US8409098B2 (en) * 2009-10-14 2013-04-02 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for collection of cardiac geometry based on optical or magnetic tracking
US20110087091A1 (en) * 2009-10-14 2011-04-14 Olson Eric S Method and apparatus for collection of cardiac geometry based on optical or magnetic tracking
US9604020B2 (en) 2009-10-16 2017-03-28 Spacelabs Healthcare Llc Integrated, extendable anesthesia system
US9797764B2 (en) 2009-10-16 2017-10-24 Spacelabs Healthcare, Llc Light enhanced flow tube
US8430824B2 (en) 2009-10-29 2013-04-30 Bard Peripheral Vascular, Inc. Biopsy driver assembly having a control circuit for conserving battery power
US8808197B2 (en) 2009-10-29 2014-08-19 Bard Peripheral Vascular, Inc. Biopsy driver assembly having a control circuit for conserving battery power
US20110112778A1 (en) * 2009-11-12 2011-05-12 Medison Co., Ltd. Ultrasound system and method for providing doppler sound
US9026385B2 (en) * 2009-12-11 2015-05-05 Samsung Medison Co., Ltd. Ultrasound system and method for providing Doppler sound
US8882763B2 (en) 2010-01-12 2014-11-11 Greatbatch Ltd. Patient attached bonding strap for energy dissipation from a probe or a catheter during magnetic resonance imaging
US20110218436A1 (en) * 2010-03-06 2011-09-08 Dewey Russell H Mobile ultrasound system with computer-aided detection
US20120041278A1 (en) * 2010-03-12 2012-02-16 Rajendra Padma Sadhu User wearable portable communication device
US9545230B2 (en) 2010-03-12 2017-01-17 Rajendra Padma Sadhu User wearable portable communication device for collection and transmission of physiological data
US8568313B2 (en) * 2010-03-12 2013-10-29 Rajendra Padma Sadhu User wearable portable communication device for collection and transmission of physiological data
US9152765B2 (en) 2010-03-21 2015-10-06 Spacelabs Healthcare Llc Multi-display bedside monitoring system
US10835208B2 (en) 2010-04-14 2020-11-17 Maui Imaging, Inc. Concave ultrasound transducers and 3D arrays
US9070453B2 (en) 2010-04-15 2015-06-30 Ramot At Tel Aviv University Ltd. Multiple programming of flash memory without erase
US20130144165A1 (en) * 2010-06-09 2013-06-06 Emad S. Ebbini Dual mode ultrasound transducer (dmut) system and method for controlling delivery of ultrasound therapy
US11076836B2 (en) 2010-06-09 2021-08-03 Regents Of The University Of Minnesota Dual mode ultrasound transducer (DMUT) system and method for controlling delivery of ultrasound therapy
US10231712B2 (en) * 2010-06-09 2019-03-19 Regents Of The University Of Minnesota Dual mode ultrasound transducer (DMUT) system and method for controlling delivery of ultrasound therapy
US20120057009A1 (en) * 2010-09-03 2012-03-08 Liao Shih-Wen Wireless Endoscope Apparatus
EP2606814A4 (en) * 2010-09-29 2013-08-21 Olympus Medical Systems Corp Medical system, medical system communications method, medical image photography device, and server
EP2606814A1 (en) * 2010-09-29 2013-06-26 Olympus Medical Systems Corp. Medical system, medical system communications method, medical image photography device, and server
CN103140162A (en) * 2010-09-29 2013-06-05 奥林巴斯医疗株式会社 Medical system, medical system communications method, medical image photography device, and server
US20120092527A1 (en) * 2010-10-18 2012-04-19 General Electric Company Method for Multiple Image Parameter Adjustment Based on Single User Input
US8526669B2 (en) * 2010-10-18 2013-09-03 General Electric Company Method for multiple image parameter adjustment based on single user input
CN102579072A (en) * 2010-10-18 2012-07-18 通用电气公司 Method for multiple image parameter adjustment based on single user input
US9384652B2 (en) 2010-11-19 2016-07-05 Spacelabs Healthcare, Llc System and method for transfer of primary alarm notification on patient monitoring systems
US20130253317A1 (en) * 2010-12-15 2013-09-26 Koninklijke Philips Electronics N.V. Ultrasound imaging system with patient-specific settings
US20130150721A1 (en) * 2010-12-24 2013-06-13 Panasonic Corporation Ultrasound diagnostic apparatus and ultrasound diagnostic apparatus control method
US9649091B2 (en) 2011-01-07 2017-05-16 General Electric Company Wireless ultrasound imaging system and method for wireless communication in an ultrasound imaging system
US8876721B2 (en) * 2011-02-01 2014-11-04 Fujifilm Corporation Ultrasound diagnostic apparatus
CN102626321A (en) * 2011-02-01 2012-08-08 富士胶片株式会社 Ultrasound diagnostic apparatus
US20120197124A1 (en) * 2011-02-01 2012-08-02 Fujifilm Corporation Ultrasound diagnostic apparatus
US20190059728A1 (en) * 2011-02-17 2019-02-28 Tyto Care Ltd. System and method for performing an automatic and remote trained personnel guided medical examination
US11198014B2 (en) 2011-03-01 2021-12-14 Greatbatch Ltd. Hermetically sealed filtered feedthrough assembly having a capacitor with an oxide resistant electrical connection to an active implantable medical device housing
US10561837B2 (en) 2011-03-01 2020-02-18 Greatbatch Ltd. Low equivalent series resistance RF filter for an active implantable medical device utilizing a ceramic reinforced metal composite filled via
US10596369B2 (en) 2011-03-01 2020-03-24 Greatbatch Ltd. Low equivalent series resistance RF filter for an active implantable medical device
US11071858B2 (en) 2011-03-01 2021-07-27 Greatbatch Ltd. Hermetically sealed filtered feedthrough having platinum sealed directly to the insulator in a via hole
US9131953B2 (en) * 2011-03-10 2015-09-15 Erbe Elektromedizin Gmbh Surgical instrument with digital data interface
US20120232540A1 (en) * 2011-03-10 2012-09-13 Thomas Baur Surgical instrument with digital data interface
US20120232397A1 (en) * 2011-03-11 2012-09-13 Fujifilm Corporation Ultrasound probe and ultrasound diagnostic apparatus
US11562825B2 (en) 2011-03-11 2023-01-24 Spacelabs Healthcare L.L.C. Methods and systems to determine multi-parameter managed alarm hierarchy during patient monitoring
US8702611B2 (en) * 2011-03-11 2014-04-22 Fujifilm Corporation Ultrasound probe and ultrasound diagnostic apparatus
US10699811B2 (en) 2011-03-11 2020-06-30 Spacelabs Healthcare L.L.C. Methods and systems to determine multi-parameter managed alarm hierarchy during patient monitoring
US11139077B2 (en) 2011-03-11 2021-10-05 Spacelabs Healthcare L.L.C. Methods and systems to determine multi-parameter managed alarm hierarchy during patient monitoring
US11547384B2 (en) 2011-04-14 2023-01-10 Regents Of The University Of Minnesota Vascular characterization using ultrasound imaging
GB2505133B (en) * 2011-05-15 2017-07-19 Spacelabs Healthcare Llc User configurable central monitoring station
WO2012158720A1 (en) * 2011-05-15 2012-11-22 Spacelabs Healthcare, Llc User configurable central monitoring station
GB2505133A (en) * 2011-05-15 2014-02-19 Spacelabs Healthcare Llc User configurable central monitoring station
US20120324397A1 (en) * 2011-06-20 2012-12-20 Tabb Alan Patz System and method for wireless interaction with medical image data
US10129926B2 (en) * 2011-07-25 2018-11-13 Samsung Electronics Co., Ltd. Wireless communication method of probe for ultrasound diagnosis and apparatus therefor
US20130028153A1 (en) * 2011-07-25 2013-01-31 Samsung Electronics Co., Ltd. Wireless communication method of probe for ultrasound diagnosis and apparatus therefor
USD754357S1 (en) 2011-08-09 2016-04-19 C. R. Bard, Inc. Ultrasound probe head
WO2013039910A3 (en) * 2011-09-12 2014-05-15 Bedford, Freeman & Worth Publishing Group, Llc Interactive online laboratory
US20130079630A1 (en) * 2011-09-28 2013-03-28 Terumo Kabushiki Kaisha Imaging apparatus for diagnosis
US11234679B2 (en) 2011-11-10 2022-02-01 Fujifilm Corporation Ultrasound diagnostic apparatus and ultrasound image producing method
US20140236009A1 (en) * 2011-11-10 2014-08-21 Fujifilm Corporation Ultrasound diagnostic apparatus and ultrasound image producing method
US10321895B2 (en) * 2011-11-10 2019-06-18 Fujifilm Corporation Ultrasound diagnostic apparatus and ultrasound image producing method
US10617384B2 (en) 2011-12-29 2020-04-14 Maui Imaging, Inc. M-mode ultrasound imaging of arbitrary paths
US20130184582A1 (en) * 2012-01-16 2013-07-18 Yuko KANAYAMA Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image parallel display method
US10335118B2 (en) * 2012-01-16 2019-07-02 Canon Medical Systems Corporation Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image parallel display method
CN103202712A (en) * 2012-01-17 2013-07-17 三星电子株式会社 Probe Device, Server, System For Diagnosing Ultrasound Image, And Method Of Processing Ultrasound Image
RU2627638C2 (en) * 2012-01-17 2017-08-09 Самсунг Электроникс Ко., Лтд. Device of sensor, server, system for diagnostics of ultrasound image and method of processing of ultrasonic image
RU2638621C2 (en) * 2012-01-18 2017-12-14 Конинклейке Филипс Н.В. Ultrasonic management of needle trajectory during biopsy
WO2013109965A1 (en) * 2012-01-19 2013-07-25 Brigham And Women's Hospital, Inc. Data reconstruction for improved ultrasound imaging
US9706979B2 (en) * 2012-02-06 2017-07-18 Hitachi, Ltd. Mobile ultrasonic diagnostic device
US20140371592A1 (en) * 2012-02-06 2014-12-18 Hitachi Aloka Medical, Ltd. Mobile ultrasonic diagnostic device
US11430415B2 (en) 2012-02-28 2022-08-30 Panasonic Intellectual Property Management Co., Ltd. Apparatus and method
US20160365073A1 (en) * 2012-02-28 2016-12-15 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for control information, method for displaying control information, and system for displaying control information
US10978025B2 (en) * 2012-02-28 2021-04-13 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for control information, method for displaying control information, and system for displaying control information
US20220392423A1 (en) * 2012-02-28 2022-12-08 Panasonic Intellectual Property Management Co., Ltd. Apparatus and method
US10170083B2 (en) * 2012-02-28 2019-01-01 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for control information, method for displaying control information, and system for displaying control information
US20190096369A1 (en) * 2012-02-28 2019-03-28 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for control information, method for displaying control information, and system for displaying control information
US9459606B2 (en) * 2012-02-28 2016-10-04 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for control information, method for displaying control information, and system for displaying control information
US11769470B2 (en) * 2012-02-28 2023-09-26 Panasonic Intellectual Property Management Co., Ltd. Apparatus and method for obtaining and displaying appliance photographic image and supplemental data
US20140129004A1 (en) * 2012-02-28 2014-05-08 Panasonic Corporation Display apparatus for control information, method for displaying control information, and system for displaying control information
US20130246714A1 (en) * 2012-03-16 2013-09-19 Oracle International Corporation System and method for supporting buffer allocation in a shared memory queue
US9405574B2 (en) 2012-03-16 2016-08-02 Oracle International Corporation System and method for transmitting complex structures based on a shared memory queue
US10289443B2 (en) 2012-03-16 2019-05-14 Oracle International Corporation System and method for sharing global transaction identifier (GTRID) in a transactional middleware environment
US9760584B2 (en) 2012-03-16 2017-09-12 Oracle International Corporation Systems and methods for supporting inline delegation of middle-tier transaction logs to database
US9665392B2 (en) 2012-03-16 2017-05-30 Oracle International Corporation System and method for supporting intra-node communication based on a shared memory queue
US9389905B2 (en) 2012-03-16 2016-07-12 Oracle International Corporation System and method for supporting read-only optimization in a transactional middleware environment
US9658879B2 (en) * 2012-03-16 2017-05-23 Oracle International Corporation System and method for supporting buffer allocation in a shared memory queue
US10133596B2 (en) 2012-03-16 2018-11-20 Oracle International Corporation System and method for supporting application interoperation in a transactional middleware environment
JP2019141629A (en) * 2012-03-26 2019-08-29 テラテク・コーポレーシヨン Portable medical ultrasonic wave imaging unit
US9024902B2 (en) * 2012-03-26 2015-05-05 General Electric Company Ultrasound device and method thereof
WO2013148730A3 (en) * 2012-03-26 2013-11-28 Teratech Corporation Tablet ultrasound system
US10667790B2 (en) 2012-03-26 2020-06-02 Teratech Corporation Tablet ultrasound system
US20130249842A1 (en) * 2012-03-26 2013-09-26 General Electric Company Ultrasound device and method thereof
US11179138B2 (en) 2012-03-26 2021-11-23 Teratech Corporation Tablet ultrasound system
US11857363B2 (en) 2012-03-26 2024-01-02 Teratech Corporation Tablet ultrasound system
US9877699B2 (en) 2012-03-26 2018-01-30 Teratech Corporation Tablet ultrasound system
US10820885B2 (en) 2012-06-15 2020-11-03 C. R. Bard, Inc. Apparatus and methods for detection of a removable cap on an ultrasound probe
US11364010B2 (en) 2012-07-26 2022-06-21 Interson Corporation Portable ultrasound imaging probe including a transducer array
US20140031694A1 (en) * 2012-07-26 2014-01-30 Interson Corporation Portable ultrasonic imaging probe including a transducer array
US10499878B2 (en) * 2012-07-26 2019-12-10 Interson Corporation Portable ultrasonic imaging probe including a transducer array
US11253233B2 (en) 2012-08-10 2022-02-22 Maui Imaging, Inc. Calibration of multiple aperture ultrasound probes
US20150238168A1 (en) * 2012-09-13 2015-08-27 Koninklijke Philips N.V. Mobile 3d wireless ultrasound image acquisition device and ultrasound imaging system
US9510802B2 (en) 2012-09-21 2016-12-06 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US9802063B2 (en) 2012-09-21 2017-10-31 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US20150196277A1 (en) * 2012-09-24 2015-07-16 Samsung Medison Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10588603B2 (en) * 2012-09-24 2020-03-17 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
EP2710960A1 (en) * 2012-09-24 2014-03-26 Samsung Electronics Co., Ltd Ultrasound apparatus and information providing method of the ultrasound apparatus
US20150051491A1 (en) * 2012-09-24 2015-02-19 Samsung Medison Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
EP3446634A1 (en) * 2012-09-24 2019-02-27 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
EP2898832A1 (en) * 2012-09-24 2015-07-29 Samsung Electronics Co., Ltd Ultrasound apparatus and information providing method of the ultrasound apparatus
EP2974664A1 (en) * 2012-09-24 2016-01-20 Samsung Electronics Co., Ltd Ultrasound apparatus and information providing method of the ultrasound apparatus
CN103654860A (en) * 2012-09-24 2014-03-26 三星电子株式会社 Ultrasound apparatus and information providing method of the ultrasound apparatus
US20140088428A1 (en) * 2012-09-24 2014-03-27 Samsung Medison Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10413277B2 (en) * 2012-09-24 2019-09-17 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10617391B2 (en) 2012-09-24 2020-04-14 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10285666B2 (en) * 2012-09-24 2019-05-14 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10537307B2 (en) 2012-09-24 2020-01-21 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
EP3494892A3 (en) * 2012-09-24 2019-07-10 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
CN107320130A (en) * 2012-09-24 2017-11-07 三星电子株式会社 The information providing method of ultrasonic device and ultrasonic device
EP2946731A1 (en) * 2012-09-24 2015-11-25 Samsung Electronics Co., Ltd Ultrasound apparatus and information providing method of the ultrasound apparatus
EP3494891A3 (en) * 2012-09-24 2019-07-17 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10595827B2 (en) * 2012-09-24 2020-03-24 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10642953B2 (en) * 2012-12-26 2020-05-05 Philips Image Guided Therapy Corporation Data labeling and indexing in a multi-modality medical imaging system
US11766238B2 (en) * 2012-12-26 2023-09-26 Philips Image Guided Therapy Corporation Measurement navigation in a multi-modality medical imaging system
US20140180721A1 (en) * 2012-12-26 2014-06-26 Volcano Corporation Data Labeling and Indexing in a Multi-Modality Medical Imaging System
US20210038188A1 (en) * 2012-12-26 2021-02-11 Philips Image Guided Therapy Corporation Measurement navigation in a multi-modality medical imaging system
US9427596B2 (en) 2013-01-16 2016-08-30 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
USRE46699E1 (en) 2013-01-16 2018-02-06 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
US20150346157A1 (en) * 2013-02-07 2015-12-03 Siemens Aktiengesellschaft Method and device for improving the saft analysis when measuring irregularities
US10222352B2 (en) * 2013-02-07 2019-03-05 Siemens Aktiengesellschaft Method and device for improving the SAFT analysis when measuring irregularities
US20150374346A1 (en) * 2013-02-15 2015-12-31 B-K Medical Aps Ultrasound display client
US10420960B2 (en) 2013-03-08 2019-09-24 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US11517772B2 (en) 2013-03-08 2022-12-06 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US11779316B2 (en) 2013-03-20 2023-10-10 Bard Peripheral Vascular, Inc. Biopsy device
US10285673B2 (en) 2013-03-20 2019-05-14 Bard Peripheral Vascular, Inc. Biopsy device
US9667889B2 (en) 2013-04-03 2017-05-30 Butterfly Network, Inc. Portable electronic devices with integrated imaging capabilities
US10987026B2 (en) 2013-05-30 2021-04-27 Spacelabs Healthcare Llc Capnography module with automatic switching between mainstream and sidestream monitoring
US9197899B2 (en) 2013-05-31 2015-11-24 Eagleyemed Inc. Dynamic adjustment of image compression for high resolution live medical image sharing
WO2014194288A1 (en) * 2013-05-31 2014-12-04 eagleyemed, Inc. Dynamic adjustment of image compression for high resolution live medical image sharing
US20140364741A1 (en) * 2013-06-11 2014-12-11 Samsung Electronics Co., Ltd. Portable ultrasonic probe
US10327735B2 (en) * 2013-06-11 2019-06-25 Samsung Electronics Co., Ltd. Portable ultrasonic probe having a folder part
US10499885B2 (en) * 2013-06-25 2019-12-10 Hitachi, Ltd. Ultrasound system and method, and ultrasound probe
US9002458B2 (en) 2013-06-29 2015-04-07 Thync, Inc. Transdermal electrical stimulation devices for modifying or inducing cognitive state
US9014811B2 (en) 2013-06-29 2015-04-21 Thync, Inc. Transdermal electrical stimulation methods for modifying or inducing cognitive state
US9233244B2 (en) 2013-06-29 2016-01-12 Thync, Inc. Transdermal electrical stimulation devices for modifying or inducing cognitive state
US10350421B2 (en) 2013-06-30 2019-07-16 Greatbatch Ltd. Metallurgically bonded gold pocket pad for grounding an EMI filter to a hermetic terminal for an active implantable medical device
US9931514B2 (en) 2013-06-30 2018-04-03 Greatbatch Ltd. Low impedance oxide resistant grounded capacitor for an AIMD
US11116474B2 (en) 2013-07-23 2021-09-14 Regents Of The University Of Minnesota Ultrasound image formation and/or reconstruction using multiple frequency waveforms
US20150032004A1 (en) * 2013-07-24 2015-01-29 Samsung Electronics Co., Ltd. Ultrasonic probe, system including the same, and operation method thereof
US20150065881A1 (en) * 2013-08-29 2015-03-05 Samsung Medison Co., Ltd. Ultrasound diagnostic apparatus and method of operating the same
US10052084B2 (en) * 2013-08-29 2018-08-21 Samsung Electronics Co., Ltd. Ultrasound diagnostic apparatus and method of operating the same
US11631496B2 (en) 2013-09-12 2023-04-18 Johnson & Johnson Surgical Vision, Inc. Computer-based operating room support system
US11715560B2 (en) 2013-09-12 2023-08-01 Johnson & Johnson Surgical Vision, Inc. Computer-based operating room support system
US10653392B2 (en) 2013-09-13 2020-05-19 Maui Imaging, Inc. Ultrasound imaging using apparent point-source transmit transducer
US20150099968A1 (en) * 2013-10-07 2015-04-09 Acist Medical Systems, Inc. Systems and methods for controlled single touch zoom
US10387013B2 (en) * 2013-10-07 2019-08-20 Acist Medical Systems, Inc. Systems and methods for controlled single touch zoom
US20150190111A1 (en) * 2014-01-03 2015-07-09 William R. Fry Ultrasound-guided non-invasive blood pressure measurement apparatus and methods
US9399126B2 (en) 2014-02-27 2016-07-26 Thync Global, Inc. Methods for user control of neurostimulation to modify a cognitive state
US10420536B2 (en) 2014-03-14 2019-09-24 Alpinion Medical Systems Co., Ltd. Software-based ultrasound imaging system
WO2015137543A1 (en) * 2014-03-14 2015-09-17 알피니언메디칼시스템 주식회사 Software-based ultrasound imaging system
CN106102584A (en) * 2014-03-14 2016-11-09 爱飞纽医疗机械贸易有限公司 Ultrasonic imaging system based on software
US20150265857A1 (en) * 2014-03-21 2015-09-24 Stephen R. Barnes Hybrid Ultrasound and Magnetic Resonance Imaging Device
US9827448B2 (en) * 2014-03-21 2017-11-28 Siemens Aktiengesellschaft Hybrid ultrasound and magnetic resonance imaging device
US9989609B2 (en) * 2014-04-03 2018-06-05 Industry-Academic Foundation, Yonsei University Method and apparatus for adjusting the parameters of a magnetic resonance image
US20150285886A1 (en) * 2014-04-03 2015-10-08 Industry-Academic Cooperation Foundation, Yonsei University Method and apparatus for adjusting the parameters of a magnetic resonance image
US10603521B2 (en) 2014-04-18 2020-03-31 Ulthera, Inc. Band transducer ultrasound therapy
US11351401B2 (en) 2014-04-18 2022-06-07 Ulthera, Inc. Band transducer ultrasound therapy
US20150305824A1 (en) * 2014-04-26 2015-10-29 Steven Sounyoung Yu Technique for Inserting Medical Instruments Using Head-Mounted Display
US9333334B2 (en) 2014-05-25 2016-05-10 Thync, Inc. Methods for attaching and wearing a neurostimulator
US10250688B2 (en) * 2014-06-03 2019-04-02 Canon Kabushiki Kaisha Method and apparatus for transmitting sensor data in a wireless network
US20150350329A1 (en) * 2014-06-03 2015-12-03 Canon Kabushiki Kaisha Method and apparatus for transmitting sensor data in a wireless network
US20170220024A1 (en) * 2014-07-30 2017-08-03 Kawasaki Jukogyo Kabushiki Kaisha Robot control program generation method and apparatus
US10747200B2 (en) * 2014-07-30 2020-08-18 Kawasaki Jukogyo Kabushiki Kaisha Robot control program generation method and apparatus
US11016191B2 (en) * 2014-08-18 2021-05-25 Maui Imaging, Inc. Network-based ultrasound imaging system
US20200003896A1 (en) * 2014-08-18 2020-01-02 Maui Imaging, Inc. Network-based ultrasound imaging system
US10401493B2 (en) * 2014-08-18 2019-09-03 Maui Imaging, Inc. Network-based ultrasound imaging system
WO2016068604A1 (en) * 2014-10-31 2016-05-06 Samsung Electronics Co., Ltd. Ultrasound apparatus and information providing method of the ultrasound apparatus
US10627510B2 (en) 2014-11-14 2020-04-21 Ursus Medical, Llc Ultrasound beamforming system and method based on analog random access memory array
WO2016077822A1 (en) 2014-11-14 2016-05-19 Ursus Medical, Llc Ultrasound beamforming system and method based on aram array
US20160135786A1 (en) * 2014-11-18 2016-05-19 General Electric Company Wireless ultrasound probe tethered to a pod
US10646201B2 (en) 2014-11-18 2020-05-12 C. R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US10905396B2 (en) 2014-11-18 2021-02-02 C. R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US20160139789A1 (en) * 2014-11-18 2016-05-19 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and method of controlling the same
US10402074B2 (en) * 2014-11-18 2019-09-03 Samsung Medison Co., Ltd. Ultrasound imaging apparatus and method of controlling the same
US11696746B2 (en) 2014-11-18 2023-07-11 C.R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US20160174937A1 (en) * 2014-12-23 2016-06-23 General Electric Company Wireless ultrasound probe
US11115475B2 (en) 2015-01-26 2021-09-07 Northeastern University Software-defined implantable ultrasonic device for use in the internet of medical things
US11826119B2 (en) 2015-01-26 2023-11-28 Northeastern University Internet-linked ultrasonic network for medical devices
US20180000344A1 (en) * 2015-01-26 2018-01-04 Northeastern University Internet-Linked Ultrasonic Network for Medical Devices
US10849503B2 (en) 2015-01-26 2020-12-01 Northeastern University Internet-linked ultrasonic network for medical devices
WO2016123069A1 (en) * 2015-01-26 2016-08-04 Northeastern University Internet-linked ultrasonic network
US10588602B2 (en) * 2015-02-10 2020-03-17 Samsung Electronics Co., Ltd. Portable ultrasound apparatus and control method for the same
US10095216B2 (en) * 2015-05-29 2018-10-09 Kuka Roboter Gmbh Selection of a device or object using a camera
US20160346936A1 (en) * 2015-05-29 2016-12-01 Kuka Roboter Gmbh Selection of a device or object using a camera
US20170055944A1 (en) * 2015-09-02 2017-03-02 Aningbo Youchang Ultrasonic Technology Co., Ltd Method for controlling wireless intelligent ultrasound fetal imaging system
US10413274B2 (en) * 2015-09-02 2019-09-17 Ningbo Marvoto Intelligent Technology Co., Ltd Method for controlling wireless intelligent ultrasound fetal imaging system
US10656254B2 (en) * 2015-11-19 2020-05-19 Analog Devices, Inc. Analog ultrasound beamformer
US20170146643A1 (en) * 2015-11-19 2017-05-25 Analog Devices, Inc. Analog ultrasound beamformer
US11224895B2 (en) 2016-01-18 2022-01-18 Ulthera, Inc. Compact ultrasound device having annular ultrasound array peripherally electrically connected to flexible printed circuit board and method of assembly thereof
US10856846B2 (en) 2016-01-27 2020-12-08 Maui Imaging, Inc. Ultrasound imaging with sparse array probes
US11020057B2 (en) 2016-02-12 2021-06-01 Qualcomm Incorporated Ultrasound devices for estimating blood pressure and other cardiovascular properties
EP3413803B1 (en) * 2016-02-12 2020-09-23 Qualcomm Incorporated Ultrasound devices for estimating blood pressure and other cardiovascular properties
US11020058B2 (en) 2016-02-12 2021-06-01 Qualcomm Incorporated Methods and devices for calculating blood pressure based on measurements of arterial blood flow and arterial lumen
US20170252013A1 (en) * 2016-03-01 2017-09-07 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and non-transitory computer-readable medium
US10980516B2 (en) * 2016-03-01 2021-04-20 Canon Medical Systems Corporation Ultrasonic diagnostic apparatus and non-transitory computer-readable medium
US10816650B2 (en) 2016-05-27 2020-10-27 Interson Corporation Ultrasonic imaging probe including composite aperture receiving array
US11536817B2 (en) 2016-05-27 2022-12-27 Interson Corporation Ultrasonic imaging probe including composite aperture receiving array
US20240041433A1 (en) * 2016-06-04 2024-02-08 Otonexus Medical Technologies, Inc. Apparatus and method for characterization of a ductile membrane, surface and sub-surface properties
US11241218B2 (en) 2016-08-16 2022-02-08 Ulthera, Inc. Systems and methods for cosmetic ultrasound treatment of skin
US10540480B2 (en) * 2016-08-29 2020-01-21 Siemens Healthcare Gmbh Medical imaging system
US20180060490A1 (en) * 2016-08-29 2018-03-01 Siemens Healthcare Gmbh Medical imaging system
US10589107B2 (en) 2016-11-08 2020-03-17 Greatbatch Ltd. Circuit board mounted filtered feedthrough assembly having a composite conductive lead for an AIMD
US20180153515A1 (en) * 2016-12-02 2018-06-07 Samsung Medison Co., Ltd. Ultrasonic probe and ultrasonic diagnostic apparatus including the same
US10559409B2 (en) 2017-01-06 2020-02-11 Greatbatch Ltd. Process for manufacturing a leadless feedthrough for an active implantable medical device
US11531096B2 (en) 2017-03-23 2022-12-20 Vave Health, Inc. High performance handheld ultrasound
US11553896B2 (en) 2017-03-23 2023-01-17 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US10856843B2 (en) 2017-03-23 2020-12-08 Vave Health, Inc. Flag table based beamforming in a handheld ultrasound device
US10681357B2 (en) 2017-03-27 2020-06-09 Vave Health, Inc. Dynamic range compression of ultrasound images
US10469846B2 (en) 2017-03-27 2019-11-05 Vave Health, Inc. Dynamic range compression of ultrasound images
US11446003B2 (en) 2017-03-27 2022-09-20 Vave Health, Inc. High performance handheld ultrasound
US11793498B2 (en) 2017-05-19 2023-10-24 Merit Medical Systems, Inc. Biopsy needle devices and methods of use
US11116483B2 (en) 2017-05-19 2021-09-14 Merit Medical Systems, Inc. Rotating biopsy needle
US11844500B2 (en) 2017-05-19 2023-12-19 Merit Medical Systems, Inc. Semi-automatic biopsy needle device and methods of use
US11086453B2 (en) * 2017-09-29 2021-08-10 Qualcomm Incorporated Layer for inducing varying delays in ultrasonic signals propagating in ultrasonic sensor
US20190102046A1 (en) * 2017-09-29 2019-04-04 Qualcomm Incorporated Layer for inducing varying delays in ultrasonic signals propagating in ultrasonic sensor
US20200275908A1 (en) * 2017-11-17 2020-09-03 Nihon Kohden Corporation Ultrasonic probe and ultrasonic measurement system
US20190156935A1 (en) * 2017-11-20 2019-05-23 Nihon Kohden Corporation Patient monitor and physiological information management system
US11458337B2 (en) 2017-11-28 2022-10-04 Regents Of The University Of Minnesota Adaptive refocusing of ultrasound transducer arrays using image data
US11826585B2 (en) 2017-11-28 2023-11-28 Regents Of The University Of Minnesota Adaptive refocusing of ultrasound transducer arrays using image data
US11666305B2 (en) * 2018-02-12 2023-06-06 Koninklijke Philips N.V. Workflow assistance for medical doppler ultrasound evaluation
US11944849B2 (en) 2018-02-20 2024-04-02 Ulthera, Inc. Systems and methods for combined cosmetic treatment of cellulite with ultrasound
US10905888B2 (en) 2018-03-22 2021-02-02 Greatbatch Ltd. Electrical connection for an AIMD EMI filter utilizing an anisotropic conductive layer
US11712571B2 (en) 2018-03-22 2023-08-01 Greatbatch Ltd. Electrical connection for a hermetic terminal for an active implantable medical device utilizing a ferrule pocket
US11730451B2 (en) 2018-03-22 2023-08-22 Exo Imaging, Inc. Integrated ultrasonic transducers
US10912945B2 (en) 2018-03-22 2021-02-09 Greatbatch Ltd. Hermetic terminal for an active implantable medical device having a feedthrough capacitor partially overhanging a ferrule for high effective capacitance area
US11596812B2 (en) 2018-04-06 2023-03-07 Regents Of The University Of Minnesota Wearable transcranial dual-mode ultrasound transducers for neuromodulation
CN112105945A (en) * 2018-04-24 2020-12-18 皇家飞利浦有限公司 Ultrasound imaging system for high resolution broadband harmonic imaging
US20210077078A1 (en) * 2018-04-24 2021-03-18 Koninklijke Philips N.V. Ultrasound imaging system for high resolution wideband harmonic imaging
US11766241B2 (en) * 2018-04-27 2023-09-26 Fujifilm Corporation Ultrasound system in which an ultrasound probe and a display device are wirelessly connected with each other and method for controlling ultrasound system in which an ultrasound probe and a display device are wirelessly connected with each other
WO2019222478A2 (en) 2018-05-17 2019-11-21 Teratech Corporation Portable ultrasound system
US10698609B2 (en) * 2018-07-03 2020-06-30 SK Hynix Inc. Memory system and operating method of a memory system
CN110675902A (en) * 2018-07-03 2020-01-10 爱思开海力士有限公司 Storage system and operation method of storage system
US11327657B2 (en) 2018-07-03 2022-05-10 SK Hynix Inc. Memory system and operating method of a memory system
US11699520B2 (en) * 2018-11-20 2023-07-11 Siemens Healthcare Gmbh Control unit for a medical imaging system comprising a processor and a logic gate; imaging system and method for controlling a medical imaging system
US20200160990A1 (en) * 2018-11-20 2020-05-21 Siemens Healthcare Gmbh Control unit for a medical imaging system comprising a processor and a logic gate; imaging system and method for controlling a medical imaging system
US11432800B2 (en) * 2019-03-25 2022-09-06 Exo Imaging, Inc. Handheld ultrasound imager
AU2022201206B2 (en) * 2019-03-25 2023-09-28 Exo Imaging, Inc. Handheld ultrasound imager
US20220401076A1 (en) * 2019-04-02 2022-12-22 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Regional contrast enhancement based on complementary information to reflectivity information
US11751852B2 (en) * 2019-04-02 2023-09-12 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Regional contrast enhancement based on complementary information to reflectivity information
US11452504B2 (en) * 2019-04-02 2022-09-27 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Regional contrast enhancement based on complementary information to reflectivity information
TWI736206B (en) * 2019-05-24 2021-08-11 九齊科技股份有限公司 Audio receiving device and audio transmitting device
US11751753B2 (en) * 2019-07-15 2023-09-12 Boston Scientific Scimed, Inc. Medical systems and devices for physically and electronically coupling medical devices
US20210016060A1 (en) * 2019-07-15 2021-01-21 Boston Scientific Scimed, Inc. Medical systems, devices, and related methods
TWI739156B (en) * 2019-09-16 2021-09-11 臺北醫學大學 System and method for biological object imaging and treatment
US11351400B2 (en) 2019-09-16 2022-06-07 Taipei Medical University Biological object image-capturing and treatment system and method
US11602372B2 (en) 2019-12-31 2023-03-14 Auris Health, Inc. Alignment interfaces for percutaneous access
US11660147B2 (en) * 2019-12-31 2023-05-30 Auris Health, Inc. Alignment techniques for percutaneous access
US20210212660A1 (en) * 2020-01-09 2021-07-15 Hitachi, Ltd. Ultrasound diagnosis apparatus and program
CN111449757A (en) * 2020-04-10 2020-07-28 京东方科技集团股份有限公司 Telemedicine robot, control method and charging method thereof
US11714121B2 (en) * 2020-07-09 2023-08-01 Tektronix, Inc. Indicating a probing target for a fabricated electronic circuit
US20220026483A1 (en) * 2020-07-09 2022-01-27 Tektronix, Inc. Indicating a probing target for a fabricated electronic circuit
US20220039281A1 (en) * 2020-07-31 2022-02-03 FLIR Belgium BVBA Modular electrical power distribution system with module detection systems and methods
US20220378403A1 (en) * 2021-05-25 2022-12-01 Fujifilm Healthcare Corporation Ultrasound diagnostic apparatus and diagnosis assistance method
US20230061122A1 (en) * 2021-08-24 2023-03-02 Saudi Arabian Oil Company Convex ultrasonic sensor for weld inspection

Also Published As

Publication number Publication date
US20140051984A1 (en) 2014-02-20
US11547382B2 (en) 2023-01-10

Similar Documents

Publication Publication Date Title
US20210052256A1 (en) Ultrasound probe with integrated electronics
US11547382B2 (en) Networked ultrasound system and method for imaging a medical procedure using an invasive probe
US20220304661A1 (en) Tablet ultrasound system
US6969352B2 (en) Ultrasound probe with integrated electronics
US6783493B2 (en) Ultrasound probe with integrated electronics
US6669633B2 (en) Unitary operator control for ultrasonic imaging graphical user interface
US20190336101A1 (en) Portable ultrasound system
US20190365350A1 (en) Portable ultrasound system
US20160228091A1 (en) Tablet ultrasound system
TWI710356B (en) Tablet ultrasound system
US8900149B2 (en) Wall motion analyzer
US20230181160A1 (en) Devices and methods for ultrasound monitoring
EP3236855A1 (en) Device and system for monitoring internal organs of a human or animal
WO2019222478A2 (en) Portable ultrasound system
TWI834668B (en) Portable ultrasound system
AU2002238135A1 (en) Ultrasound Probe with Integrated Electronics
US20240057971A1 (en) Transcranial ultrasound devices and methods
JP7214367B2 (en) Carrying case for ultrasonic probe
Kang Medical Ultrasound Imaging and Interventional Component (MUSiiC) Framework for Advanced Ultrasound Image-guided Therapy

Legal Events

Date Code Title Description
AS Assignment

Owner name: TERATECH CORPORATION, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BERGER, NOAH;BRODSKY, MICHAEL;CHIANG, ALICE M.;AND OTHERS;REEL/FRAME:014355/0133;SIGNING DATES FROM 20030609 TO 20030709

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION