US20090012394A1 - User interface for ultrasound system


Info

Publication number
US20090012394A1
Authority
US
United States
Prior art keywords
ultrasound
ultrasound system
virtual
control member
display elements
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/112,946
Inventor
Petra Hobelsberger
Walter Duda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
General Electric Co
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by General Electric Co
Priority to US12/112,946
Assigned to GENERAL ELECTRIC COMPANY. Assignment of assignors interest (see document for details). Assignors: DUDA, WALTER; HOBELSBERGER, PETRA
Publication of US20090012394A1
Legal status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4405: Device being mounted on a trolley
    • A61B 8/4427: Device being portable or laptop-like
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461: Displaying means of special interest
    • A61B 8/462: Displaying means characterised by constructional features of the display
    • A61B 8/463: Displaying means characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 8/465: Displaying means adapted to display user selection data, e.g. icons or menus
    • A61B 8/466: Displaying means adapted to display 3D data
    • A61B 8/467: Devices characterised by special input means
    • A61B 8/469: Special input means for selection of a region of interest
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/56: Details of data transmission or power supply
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/52: Details of systems according to group G01S 15/00
    • G01S 7/52017: Systems particularly adapted to short-range imaging
    • G01S 7/52079: Constructional features
    • G01S 7/52084: Constructional features related to particular user interfaces

Definitions

  • This invention relates generally to ultrasound systems and, more particularly, to a user interface for controlling ultrasound imaging systems, especially portable ultrasound medical imaging systems.
  • Ultrasound systems typically include ultrasound scanning devices, such as ultrasound probes having transducers that allow for performing various ultrasound scans (e.g., imaging a volume or body).
  • The ultrasound probes are typically connected to an ultrasound system for controlling the operation of the probes.
  • The ultrasound system usually includes a control portion (e.g., a control console or portable unit) that provides interfaces for interacting with a user, such as receiving user inputs. For example, different buttons, knobs, etc. can be provided to allow a user to select different options and control the scanning of an object using the connected ultrasound probe.
  • Ultrasound systems increasingly use volume probes, for example, three-dimensional (3D) or four-dimensional (4D) probes.
  • Certain procedures may require multiple steps and adjustments that can be controlled by different controllers, for example, using several rotatable control members (commonly referred to as rotaries) to adjust different settings.
  • Numerous control members of each of several different types can be included as part of the control portion.
  • The control members are often mode dependent such that each of the control members controls a different function or allows adjusting a different setting based on the mode of operation, for example, a visualization or rendering mode of operation.
  • In accordance with an embodiment, an ultrasound system includes a user interface having at least one user control member and a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display.
  • A function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.
  • In accordance with another embodiment, an ultrasound system includes an ultrasound volume probe for acquiring one of three-dimensional (3D) ultrasound data and four-dimensional (4D) ultrasound data, and a portable control unit having a user interface and a display.
  • The ultrasound volume probe is connected to the portable control unit, and manipulation of one of the 3D ultrasound data and 4D ultrasound data is provided by a single user control member of the user interface.
  • In accordance with yet another embodiment, a method for controlling an ultrasound probe using a portable ultrasound system includes receiving a user input selecting one of a plurality of virtual display elements on a display of the portable ultrasound system. The method further includes configuring a user control member of the portable ultrasound system based on the received user input to control an operation of the portable ultrasound system.
  • FIG. 1 is a block diagram of an ultrasound system formed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 2 is a block diagram of the ultrasound processor module of FIG. 1 formed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 3 is a top perspective view of a portable ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.
  • FIG. 4 is a top plan view of a user interface of the portable ultrasound imaging system of FIG. 3.
  • FIG. 5 is an elevation view of a backend of the portable ultrasound imaging system of FIG. 3.
  • FIG. 6 is a side elevation view of the portable ultrasound imaging system of FIG. 3.
  • FIG. 7 is a perspective view of a case for the portable ultrasound imaging system of FIG. 3.
  • FIG. 8 is a perspective view of a movable cart that is capable of supporting the portable ultrasound imaging system of FIG. 3.
  • FIG. 9 is a top view of a hand carried or pocket-sized ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.
  • FIG. 10 is a screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 11 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 12 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 13 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 14 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 15 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 16 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 17 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 18 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 19 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 20 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • The functional blocks are not necessarily indicative of the division between hardware circuitry.
  • For example, one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware or in multiple pieces of hardware.
  • The programs may be stand-alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
  • Although the various embodiments may be described in connection with an ultrasound system, the methods and systems described herein are not limited to ultrasound imaging.
  • In particular, the various embodiments may be implemented in connection with different types of medical imaging, including, for example, magnetic resonance imaging (MRI) and computed tomography (CT) imaging.
  • The various embodiments also may be implemented in other non-medical imaging systems, for example, non-destructive testing systems.
  • Exemplary embodiments of ultrasound systems provide a user interface for an ultrasound system.
  • A plurality of virtual display elements (e.g., display icons) are provided that are selectable by a user to change the function controlled by a particular user control member.
  • The selection of one of the virtual display elements reconfigures one or more of the user control members for controlling certain parameters, settings, etc. based on the selected virtual display element.
  • FIG. 1 illustrates a block diagram of an ultrasound system 20 formed in accordance with various embodiments of the inventive arrangements.
  • The ultrasound system 20 includes a transmitter 22 that drives an array of elements 24 (e.g., piezoelectric crystals) within a transducer 26 to emit pulsed ultrasonic signals into a body or volume.
  • The transducer 26 may be provided as part of, for example, different types of ultrasound probes.
  • The ultrasound probe may be a volume probe such as a three-dimensional (3D) probe or a four-dimensional (4D) probe wherein the array of elements 24 can be mechanically moved.
  • For example, the array of elements 24 may be swept or swung about an axis powered by a motor 25.
  • Movement of the array of elements 24 is controlled by a motor controller 27 and motor driver 29.
  • Alternatively, the ultrasound system 20 may have connected thereto an ultrasound probe that is not capable of mechanical movement of the array of elements 24.
  • In such a case, the motor controller 27 and motor driver 29 may or may not be provided and/or may be deactivated. Accordingly, the motor controller 27 and motor driver 29 are optionally provided.
  • The emitted pulsed ultrasonic signals are back-scattered from structures in a body, for example, blood cells or muscular tissue, to produce echoes that return to any of the elements 24.
  • The echoes are received by a receiver 28.
  • The received echoes are provided to a beamformer 30 that performs beamforming and outputs an RF signal.
  • The RF signal is then provided to an RF processor 32 that processes the RF signal.
  • The RF processor 32 may include a complex demodulator (not shown) that demodulates the RF signal to form IQ data pairs representative of the echo signals.
  • The RF or IQ signal data may then be provided directly to a memory 34 for storage (e.g., temporary storage).
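The complex demodulation step described above can be sketched in a few lines. This is an illustrative Python sketch, not the patent's implementation: the function name, the moving-average low-pass filter, and the parameter choices are all assumptions made for the example.

```python
import math

def demodulate_to_iq(rf, fs, f0, taps=8):
    """Mix real RF samples down to baseband and low-pass filter,
    yielding one complex IQ pair per input sample.
    rf: list of real RF samples, fs: sample rate (Hz),
    f0: carrier (transmit) frequency (Hz)."""
    # Mix: multiply by a complex exponential at -f0 to shift the
    # echo spectrum down to baseband.
    mixed = [s * complex(math.cos(-2 * math.pi * f0 * n / fs),
                         math.sin(-2 * math.pi * f0 * n / fs))
             for n, s in enumerate(rf)]
    # Crude moving-average low-pass filter to suppress the
    # image component at twice the carrier frequency.
    iq = []
    for n in range(len(mixed)):
        window = mixed[max(0, n - taps + 1): n + 1]
        iq.append(sum(window) / len(window))
    return iq
```

Mixing against the complex exponential shifts the echo band to baseband; the averaging filter then rejects the component at 2·f0, leaving the complex I and Q components per sample.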
  • The ultrasound system 20 also includes a processor module 36 to process the acquired ultrasound information (e.g., RF signal data or IQ data pairs) and prepare frames of ultrasound information for display on a display 38.
  • The processor module 36 is adapted to perform one or more processing operations according to a plurality of selectable ultrasound modalities on the acquired ultrasound information.
  • Acquired ultrasound information may be processed in real-time during a scanning session as the echo signals are received. Additionally or alternatively, the ultrasound information may be stored temporarily in the memory 34 during a scanning session and processed in less than real-time in a live or off-line operation.
  • An image memory 40 is included for storing processed frames of acquired ultrasound information that are not scheduled to be displayed immediately.
  • The image memory 40 may comprise any known data storage medium, for example, a permanent storage medium, removable storage medium, etc.
  • The processor module 36 is connected to a user interface 42 that controls operation of the processor module 36 as explained below in more detail and is configured to receive inputs from an operator.
  • The display 38 includes one or more monitors that present patient information, including diagnostic ultrasound images, to the user for review, diagnosis, and/or analysis.
  • The display 38 may automatically display, for example, one or more planes from a 3D ultrasound data set stored in the memory 34 or 40.
  • One or both of the memories 34, 40 may store 3D data sets of the ultrasound data, where such 3D data sets are accessed to present 2D and 3D images.
  • A 3D ultrasound data set may be mapped into the corresponding memory 34 or 40, as well as one or more reference planes.
  • The processing of the data, including the data sets, is based, at least in part, on user inputs, for example, user selections received at the user interface 42.
  • The display 38 also may display one or more virtual display elements 49 that are selectable by a user, as described in more detail below. Based on the selection of a virtual display element 49, one or more corresponding controls of the user interface 42, for example, the operations controlled by a trackball and/or the like (not shown), may be reconfigured.
  • The ultrasound system 20 acquires data, for example, volumetric data sets, by various techniques (e.g., 3D scanning, real-time 3D imaging, volume scanning, 2D scanning with transducers having positioning sensors, freehand scanning using a voxel correlation technique, scanning using 2D or matrix array transducers, etc.).
  • The data may be acquired by mechanically moving the array of elements 24 of the transducer 26, for example, by performing a sweeping type of scan.
  • The transducer 26 also may be moved manually, such as along a linear or arcuate path, while scanning a region of interest (ROI). At each linear or arcuate position, the transducer 26 obtains scan planes that are stored in the memory 34.
  • FIG. 2 illustrates an exemplary block diagram of the processor module 36 of FIG. 1 .
  • The processor module 36 is illustrated conceptually as a collection of sub-modules, but it may also be implemented utilizing any combination of dedicated hardware boards, digital signal processors (DSPs), processors, etc.
  • The sub-modules of FIG. 2 may be implemented utilizing an off-the-shelf PC with a single processor or multiple processors, with functional operations distributed between the processors.
  • The sub-modules of FIG. 2 may also be implemented utilizing a hybrid configuration, in which certain modular functions are performed utilizing dedicated hardware, while the remaining modular functions are performed utilizing an off-the-shelf PC and/or the like.
  • The sub-modules also may be implemented as software modules within a processing unit.
  • The operations of the sub-modules illustrated in FIG. 2 may be controlled by a local ultrasound controller 50 or by the processor module 36.
  • The sub-modules 52-68 perform mid-processor operations.
  • The ultrasound processor module 36 may receive ultrasound data 70 in one of several forms.
  • The received ultrasound data 70 constitutes IQ data pairs representing the real and imaginary components associated with each data sample.
  • The IQ data pairs are provided, for example, to one or more of a color-flow sub-module 52, a power Doppler sub-module 54, a B-mode sub-module 56, a spectral Doppler sub-module 58, and an M-mode sub-module 60.
  • The IQ data pairs also may be provided to other sub-modules, for example, an Acoustic Radiation Force Impulse (ARFI) sub-module 62, a strain sub-module 64, a strain rate sub-module 66, and a Tissue Doppler (TDE) sub-module 68, among others.
  • Each of the sub-modules 52-68 is configured to process the IQ data pairs in a corresponding manner to generate color-flow data 72, power Doppler data 74, B-mode data 76, spectral Doppler data 78, M-mode data 80, ARFI data 82, echocardiographic strain data 84, echocardiographic strain rate data 86, and tissue Doppler data 88, all of which may be stored in a memory 90 (or the memory 34 or image memory 40 shown in FIG. 1) temporarily before subsequent processing.
  • The data 72-88 may be stored, for example, as sets of vector data values, where each set defines an individual ultrasound image frame. The vector data values are generally organized based on the polar coordinate system.
  • A scan converter sub-module 92 accesses and obtains from the memory 90 the vector data values associated with an image frame and converts the set of vector data values to Cartesian coordinates to generate an ultrasound image frame 93 formatted for display.
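The polar-to-Cartesian scan conversion performed by such a sub-module can be sketched with a nearest-neighbour lookup. This is an illustrative Python sketch under simplifying assumptions (normalized range, a sector geometry with the apex at the top centre of the image, nearest-neighbour rather than interpolated resampling); the names are not from the patent.

```python
import math

def scan_convert(beams, angles, n_pixels):
    """Nearest-neighbour scan conversion of polar vector data to a
    Cartesian image. beams[i][k] is the sample at range bin k along
    the beam steered to angles[i] (radians from the vertical centre
    line); all beams share the same number of range bins.
    Returns an n_pixels x n_pixels image; pixels outside the imaged
    sector are left at 0."""
    n_bins = len(beams[0])
    img = [[0.0] * n_pixels for _ in range(n_pixels)]
    for row in range(n_pixels):
        for col in range(n_pixels):
            # Cartesian pixel centre, with the beam apex at top centre.
            x = (col - n_pixels / 2 + 0.5) / n_pixels  # lateral offset
            z = (row + 0.5) / n_pixels                 # depth, (0, 1]
            r = math.hypot(x, z)                       # normalised range
            theta = math.atan2(x, z)                   # beam angle
            if r > 1.0 or not (min(angles) <= theta <= max(angles)):
                continue  # pixel falls outside the scanned sector
            # Nearest beam and nearest range bin for this pixel.
            i = min(range(len(angles)), key=lambda j: abs(angles[j] - theta))
            k = min(int(r * n_bins), n_bins - 1)
            img[row][col] = beams[i][k]
    return img
```

A production scan converter would typically interpolate between neighbouring beams and range bins rather than taking the nearest sample, but the coordinate mapping is the same.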
  • The ultrasound image frames 93 generated by the scan converter sub-module 92 may be provided back to the memory 90 for subsequent processing or may be provided to the memory 34 or image memory 40.
  • The image frames may be stored in the memory 90 or communicated over a bus 96 to a database (not shown), the memory 34, the image memory 40, and/or to other processors (not shown).
  • A 2D video processor sub-module 94 may be used to combine one or more of the frames generated from the different types of ultrasound information.
  • The 2D video processor sub-module 94 may combine different image frames by mapping one type of data to a gray map and mapping the other type of data to a color map for video display.
  • The color pixel data is superimposed on the gray scale pixel data to form a single multi-mode image frame 98 that is again re-stored in the memory 90 or communicated over the bus 96.
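The gray-map/color-map superposition described above reduces to a per-pixel choice. This is an illustrative Python sketch, not the patent's implementation; the function name and the frame representation (RGB triples, with None marking pixels that carry no colour-flow data) are assumptions for the example.

```python
def compose_multimode(gray_frame, color_frame):
    """Superimpose colour-flow pixels on a B-mode grayscale frame.
    gray_frame: 2-D list of intensities already passed through a
    gray map (each value v renders as the RGB triple (v, v, v));
    color_frame: same shape, holding an RGB triple where flow was
    detected and None elsewhere.  Where colour data exists it wins;
    otherwise the grayscale pixel shows through."""
    return [[c if c is not None else (g, g, g)
             for g, c in zip(gray_row, color_row)]
            for gray_row, color_row in zip(gray_frame, color_frame)]
```

Real systems usually also blend or threshold the colour data against the B-mode intensity, but the superposition principle is the same.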
  • Successive frames of images may be stored as a cine loop in the memory 90 or memory 40 (shown in FIG. 1 ).
  • The cine loop represents a first-in, first-out circular image buffer to capture image data that is displayed in real-time to the user, such as one or more heart cycles.
  • The user may freeze the cine loop by entering a freeze command at the user interface 42.
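The cine loop described above behaves like a bounded first-in, first-out buffer with a freeze control. A minimal Python sketch follows; the class and method names are illustrative assumptions, as the patent does not specify an API.

```python
from collections import deque

class CineLoop:
    """First-in, first-out circular image buffer: keeps the most
    recent `capacity` frames while live, and stops accepting new
    frames once frozen (e.g., on a freeze command from the UI)."""
    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)  # oldest frame evicted first
        self.frozen = False

    def push(self, frame):
        if not self.frozen:       # live: new frames displace the oldest
            self.frames.append(frame)

    def freeze(self):             # lock the buffer for review/replay
        self.frozen = True

    def replay(self):
        return list(self.frames)  # frames in acquisition order
```

Using `deque(maxlen=...)` gives the circular-buffer eviction for free: appending to a full deque silently drops the oldest frame.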
  • The user interface 42 may include, for example, a keyboard, mouse, trackball, and/or all other input controls associated with inputting information into the ultrasound system 20 (shown in FIG. 1), which input controls may be reconfigured automatically based on selection of a virtual display element 49 (shown in FIG. 1) by the user.
  • A 3D processor sub-module 100 is also controlled by the user interface 42 and accesses the memory 90 to obtain spatially consecutive groups of ultrasound image frames (that may be acquired, for example, by a sweeping ultrasound scan) and to generate three-dimensional image representations thereof, such as through volume rendering or surface rendering algorithms, as are known.
  • The three-dimensional images may be generated utilizing various imaging techniques, such as ray-casting, maximum intensity pixel projection, and/or the like. Additionally, the three-dimensional images may be displayed over time, thereby providing four-dimensional operation, as is known.
  • The user interface 42 may be provided as part of a portable ultrasound imaging system 110, as shown in FIG. 3.
  • The portable ultrasound imaging system 110 may be, for example, a Voluson i compact 4D ultrasound system available from GE Healthcare in Waukesha, Wis.
  • The portable ultrasound imaging system 110 controls a probe (not shown) connected to the portable ultrasound imaging system 110 via a probe connector 112 that may be locked to the portable ultrasound imaging system 110 using a probe locking handle 114.
  • The user interface 42 includes a plurality of user inputs and/or controls, which may be of different types, and are configured to receive commands from a user or operator.
  • The user interface 42 may include a plurality of "soft" buttons 116, for example, toggle buttons, and a keyboard 118, for example, an alphanumeric keyboard.
  • A functional keyboard portion 120 may be provided that includes other user selectable buttons and controls.
  • Other user controls also may be provided, such as a trackball 122 having a trackball ring 124 and a plurality of associated buttons 126, which may be activated by the fingers of a user when operating the trackball 122.
  • A plurality of sliding control members 128 (e.g., time gain control potentiometers) also may be provided.
  • The portable ultrasound imaging system 110 also includes a display 130, for example, an integrated LCD display with a display latch 132 provided to lock the display 130 to the user interface 42.
  • A power button 134 is provided to power on and off the portable ultrasound imaging system 110.
  • The portable ultrasound imaging system 110 with the user interface 42 and the display 130 defines a portable control unit.
  • As used herein, miniaturized generally means that the ultrasound system 110 is a handheld or hand-carried device and/or is configured to be carried in a person's hand, pocket, briefcase-sized case, backpack, and/or the like.
  • For example, the ultrasound system 110 may be a hand-carried device having the size of a typical laptop computer, for instance, having dimensions of approximately 2.5 inches in depth, approximately 14 inches in width, and approximately 12 inches in height.
  • The ultrasound system 110 may weigh about ten pounds or less, and is thus easily portable by the operator.
  • The display 130 is configured to display, for example, a medical image and virtual display elements, as described below.
  • Ultrasonic data from the portable ultrasound imaging system 110 may be sent to an external device (not shown), such as a printer or display, via a wired or wireless network (or direct connection, for example, via a serial or parallel cable or USB port).
  • The external device may be a computer or a workstation having a display.
  • Alternatively, the external device may be a separate external display or a printer capable of receiving image data from the portable ultrasound imaging system 110 and of displaying or printing images that may have greater resolution than the display 130.
  • The buttons 116 may include a first menu button 140, a second menu button 142, a third menu button 144, and a fourth menu button 146, each capable of movement in four directions.
  • A plurality of imaging buttons 148 may also be provided to select different imaging functions or operations.
  • A plurality of mode selection buttons 136 also may be provided to select different scanning modes, for example, 2D, 4D, pulsed wave Doppler (PW), color flow mode (CFM), etc.
  • The functional keyboard portion 120 also includes other user selectable buttons and controls, such as buttons that allow for obtaining saved information, storing information, manipulating information or displayed images, calculating measurements relating to displayed images, changing a display format, etc.
  • The portable ultrasound imaging system 110 also includes internal and external connections on a back end 160, as shown in FIG. 5, and on a side portion 170, as shown in FIG. 6.
  • The back end 160 may include a VGA connector 162 (for connection, for example, to an external monitor), an RGB connector 164 (for connection, for example, to a printer), and a power supply input 166.
  • A network connector 168, for example, an Ethernet LAN input/output, also may be provided, and one or more USB connectors 169 may be provided.
  • A probe connection 172 for connection to a probe may be provided, and the probe locking handle 114 is provided. It should be noted that different or additional connectors may be provided as desired or known, for example, based on the scanning applications for the portable ultrasound imaging system 110.
  • The portable ultrasound imaging system 110 also may be transported, stored, or operated in a case 180, as shown in FIG. 7.
  • The case 180 may be, for example, a padded case to protect the portable ultrasound imaging system 110.
  • The portable ultrasound imaging system 110 also may be configured for mounting to or to be supported by a moveable base 190, for example, a movable cart as shown in FIG. 8.
  • The moveable base 190 includes a support portion 192 for receiving and supporting the portable ultrasound imaging system 110 and a tray portion 194 that may be used, for example, to store peripherals.
  • The movable base 190 also may include one or more probe holders 196 for supporting and holding therein one or more ultrasound probes, for example, one probe connected to the portable ultrasound imaging system 110 and other probes configured to be connected to the portable ultrasound imaging system 110.
  • A foot rest 198 also may be provided. Accordingly, the portable ultrasound imaging system 110 may be configured to appear like a console-based type ultrasound imaging system.
  • A hand carried or pocket-sized ultrasound imaging system 200 may be provided, as shown in FIG. 9.
  • In the system 200, the display 130 and user interface 42 can form a single unit.
  • The pocket-sized ultrasound imaging system 200 may be a pocket-sized or hand-sized ultrasound system approximately 2 inches wide, approximately 4 inches in length, and approximately ½ inch in depth, and/or may weigh less than 3 ounces.
  • The display 130 may be, for example, a 320 × 320 pixel color LCD display (on which a medical image 210 can be displayed).
  • A typewriter-like keyboard 202 of buttons 203 may optionally be included in the user interface 42.
  • It should be noted that the various embodiments may be implemented in connection with a pocket-sized ultrasound system 200 having different dimensions, weights, and/or power consumptions.
  • Multi-function controls 204 may each be assigned functions in accordance with the mode of system operation. Therefore, each of the multi-function controls 204 may be configured to provide a plurality of different actions. Label display areas 206 associated with the multi-function controls 204 may be included as necessary on the display 130.
  • The system 200 may also have additional keys and/or controls 208 for special purpose functions, which may include, but are not limited to, "freeze," "depth control," "gain control," "color-mode," "print," and "store."
  • Various embodiments of the inventive arrangements provide virtual display elements (e.g., display icons) that are selectable by a user to change the function controlled by a particular user control.
  • The selection of the virtual display elements reconfigures one or more of the user controls for controlling certain parameters, settings, etc. based on the selected virtual display element.
  • a user is presented with a plurality of virtual display elements 220 a - 220 e that may be displayed, for example, on a screen 222 , such as the display 38 (shown in FIG. 1 ).
  • the virtual display elements 220 a - 220 e are displayed on the screen 222 adjacent (e.g., surrounding) or proximate an image that is selected by a user.
  • the virtual display elements 220 a - 220 e are displayed on the screen 222 .
  • the virtual display elements 220 a - 220 e disappear once the image 225 is no longer selected or the virtual pointer 224 is moved away from the image 225 and another image 226 is selected or the virtual pointer 224 is moved over that image or another user control member is activated (e.g., depressed).
  • the virtual display elements 220 a - 220 e may continue to be displayed in connection with the image 225 for a predetermined period of time (e.g., 2 seconds) even after the image 225 is no longer selected.
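The appear-and-linger behavior described in the bullets above can be sketched as follows. This is a minimal, hypothetical Python sketch; the class name, method names, and the use of the 2-second period as a constant are illustrative of the described behavior, not taken from any actual implementation:

```python
HOVER_LINGER_S = 2.0  # illustrative "predetermined period" from the text

class VirtualDisplayElements:
    """Sketch of the described behavior: the elements appear while the
    pointer is over an image and linger briefly after it moves away."""

    def __init__(self):
        self.visible = False
        self._deselected_at = None  # time at which the image was deselected

    def on_pointer_moved(self, over_image, now):
        if over_image:
            self.visible = True
            self._deselected_at = None
        elif self.visible and self._deselected_at is None:
            self._deselected_at = now  # start the linger timer

    def tick(self, now):
        # Hide the elements once the linger period has fully elapsed.
        if (self._deselected_at is not None
                and now - self._deselected_at >= HOVER_LINGER_S):
            self.visible = False
            self._deselected_at = None
```

A caller would feed pointer events and periodic ticks into this object and show or hide the elements based on `visible`.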
  • a user may select one of the virtual display elements 220 a - 220 e .
  • the corresponding function represented by that virtual display element 220 a - 220 e is now adjusted or controlled by one of the controls of the user interface 42 (shown in FIG. 4 ), for example, the trackball 122 .
  • the operation of the trackball 122 is reconfigured and the control thereof remapped, for example, as shown in Table 1 below.
  • the trackball 122 is reconfigured to control or adjust the parameter, function, etc. corresponding to that virtual display element 220 a , which, in the embodiment shown in Table 1, is to control rotation around the X-axis of the image 225 .
  • the operation of the trackball 122 is reconfigured to control or adjust the X-axis rotation.
  • a user may then click one of the buttons 126 to return the trackball 122 to controlling movement of the virtual pointer 224 , allowing selection of one of the other virtual display elements 220 b - 220 e .
  • another one of the buttons of the user interface 42 may deselect the operation corresponding to a virtual display element 220 a and allow the selection of one of the other virtual display elements 220 b - 220 e.
  • the virtual display elements 220 a - 220 e may be configured as different icons and correspond to different function or operations than those illustrated in Table 1. It also should be noted that the selection of one of the virtual display elements 220 a - 220 e may, instead of reconfiguring the trackball 122 , reconfigure another user control or the user interface 42 or an external user control (e.g., a connected mouse).
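The remapping described above can be illustrated with a short sketch. Only the 220a entry (rotation about the X-axis) is stated in the description of Table 1; the other identifiers and description strings below are illustrative placeholders:

```python
# Hypothetical remapping table in the spirit of Table 1.
TRACKBALL_FUNCTIONS = {
    "pointer": "move virtual pointer",
    "220a": "rotate image about X-axis",
}

class Trackball:
    def __init__(self):
        # By default the trackball moves the virtual pointer.
        self.function = TRACKBALL_FUNCTIONS["pointer"]

    def select_element(self, element_id):
        # Selecting a virtual display element remaps the trackball to the
        # operation that element represents.
        self.function = TRACKBALL_FUNCTIONS.get(element_id, self.function)

    def button_pressed(self):
        # Pressing a button returns the trackball to pointer control,
        # allowing another virtual display element to be selected.
        self.function = TRACKBALL_FUNCTIONS["pointer"]
```

The same lookup could equally drive a different user control member or an external mouse, as the text notes.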
  • selectable elements may be displayed on the screen 222 .
  • a plurality of selectable elements 230 may be provided to allow for the selection of a particular visualization mode.
  • referring to FIGS. 11-20 , which illustrate exemplary screenshots 232 including the virtual display elements 220 a - 220 e , a render visualization mode (which may be selected using the selectable elements 230 ) for a 4D realtime acquisition is shown.
  • the virtual pointer 224 , illustrated as a mouse pointer, is moved, for example, using the trackball 122 (shown in FIG. 4 ), over a side 240 of a render box 242 (identifying the region of the image 244 to be rendered).
  • the side 240 may be highlighted (e.g., highlighted by a color) when the virtual pointer 224 is placed over the side 240 .
  • the virtual display elements 220 a - 220 d and 220 f are displayed. It should be noted that the virtual display element 220 e is not displayed in this screenshot, but it may be displayed in some embodiments.
  • the virtual pointer 224 has now been placed over virtual display element 220 f (the icon shaped as a dot) that corresponds to a curved render start function.
  • the virtual display element 220 f may be highlighted (e.g., highlighted or shadowed in yellow).
  • the trackball 122 is reconfigured to adjust the curved render start function as shown in FIG. 13 .
  • the virtual display element 220 f may be highlighted differently (e.g., highlighted in a different color, such as red) and a curved render start portion 244 of the render box 240 is displayed. It should be noted that once the virtual display element 220 f is selected, the other virtual display elements 220 a - 220 d disappear, and when the trackball 122 is moved, the curved render start portion 244 is changed, for example, curved as adjusted by the trackball 122 instead of straight as shown in FIG. 12 . It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4 ) and the other virtual display elements 220 a - 220 d appear again.
  • a virtual representation 246 of the trackball 122 may be displayed on the display 130 and indicate the functions corresponding to the trackball 122 and the buttons 126 in the current active display mode.
  • another side 248 (or border) of the render box 240 may be selected, which reconfigures the functionality of the trackball 122 to allow adjustment of the size of the render box 240 .
  • the side 248 may be highlighted (e.g., highlighted in red) and all of virtual display elements 220 a - 220 d and 220 f disappear.
  • the size of the render box 240 is changed. For example, the render box 240 is now smaller in FIG. 14 than in FIGS. 11-13 . It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4 ) and the virtual display elements 220 a - 220 d and 220 f appear again.
  • the virtual display element 220 a has been selected and the other virtual display elements 220 b - 220 d and 220 f have disappeared.
  • the selected virtual display element 220 a corresponds to a rotation about the y-axis, which may now be adjusted by the trackball 122 , which is reconfigured to control this operation.
  • the selected virtual display element 220 a may be highlighted (e.g., highlighted in red) and when the trackball 122 is moved, the volume data displayed is rotated around or about the y-axis, with the content of the displayed images 244 and 245 changed accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4 ) and the virtual display elements 220 b - 220 d and 220 f appear again.
  • the virtual pointer 224 has been moved over the image 245 .
  • a different set of virtual display elements 220 a - 220 d and now 220 g appear.
  • the virtual display element 220 g now appears and is configured as a “house” icon.
  • a render box 241 may now appear on the image 245 .
  • the virtual display element 220 g , when selected, changes the 3D display.
  • the rendered image, specifically the image 245 , is now displayed at an angle.
  • the render box 241 is displayed as a three-dimensional box and the virtual display element 220 g changes shape, for example, the “house” icon is rotated. If the virtual display element 220 g is again selected, the image 245 will again appear as shown in FIG. 16 and the shape of the virtual display element 220 g will return to the “house” icon as shown in FIG. 16 .
  • FIG. 18 illustrates a quad-view mode and the visualization mode now shows sectional planes.
  • the virtual pointer 224 is now shown as moved over a center dot 252 in the image 250 and the dot is marked, for example, with a marker 254 , such as cross-hairs that may be highlighted, for example, highlighted in yellow.
  • the user may then select the marker 254 , which may, for example, change color to red and a move center dot function is now assigned to the trackball 122 . All of the other virtual display elements 220 a - 220 d also disappear.
  • the center dot 252 is moved and the content of the image 250 , as well as the images 260 and 262 , is changed accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4 ) and the virtual display elements 220 a - 220 d appear again.
  • the virtual pointer 224 has been moved over the image 260 and not over any of the virtual display elements 220 a - 220 d .
  • the virtual pointer 224 now has a different shape, for example, a hand instead of an arrow or pointer.
  • a move image functionality is selected and assigned to the trackball 122 .
  • the image 260 is moved on the display.
  • a single user control member can be used to manipulate, for example, 3D or 4D ultrasound data. For example, by assigning different operations to the single user control member based on selecting from a plurality of virtual display elements, the single user control member is reconfigured to control different operations or adjust different settings, parameters, etc.
  • virtual display elements 220 a - 220 g may be displayed in different imaging modes, for example, a tomographic ultrasound image (TUI) mode or a SonoVCAD mode.
  • different virtual display elements corresponding to different operations or functions may be displayed in addition to or instead of some or all of the virtual display elements 220 a - 220 g .
  • only a specific image or images can be adjusted, and accordingly the virtual display elements appear only when the virtual pointer 224 is moved over those images. It also should be noted that only a single image may be displayed instead of the multiple images as illustrated.
  • the various embodiments automatically reconfigure the operation of a user control member (e.g., a trackball) based on a selected virtual display element such that the control operations performed by the user control member are remapped.
  • the user control member is thereby used to adjust or control different functions based on the virtual display element selected.
  • the movement of the user control member is remapped to, for example, allow the particular function, setting, parameter, etc. to be adjusted or changed based on the movement of the user control member that has been remapped. For example, a table or database is accessed and the corresponding motion of the user control member is mapped for the particular function, setting, parameter, etc. Thereafter, the relative movement of the user control member adjusts the particular function, setting, parameter, etc. corresponding to the selected virtual display element.
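The table-lookup remapping step described above might be sketched as follows, where the table keys, parameter names, and gain values are all hypothetical:

```python
# Hypothetical remapping table: each selected element maps to a settings
# key and a sensitivity (units per trackball count). All names and gains
# here are illustrative, not from the specification.
REMAP_TABLE = {
    "x_rotation": ("rotation_x_deg", 0.5),
    "render_start": ("render_start_mm", 0.1),
}

def apply_motion(settings, selected, dx, dy):
    """Apply relative control-member motion (dx, dy) to the parameter
    that the currently selected virtual display element is mapped to.
    Only dy is used in this simplified one-axis sketch."""
    param, gain = REMAP_TABLE[selected]
    updated = dict(settings)  # leave the caller's settings untouched
    updated[param] = settings.get(param, 0.0) + gain * dy
    return updated
```

Each trackball event would thus adjust a different setting depending on which element was last selected.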
  • At least one technical effect of the various embodiments of the inventive arrangements is automatically changing the control function or operation of a user control member based on the selection of a virtual display element.
  • the user control member is reconfigured or reassigned to control or adjust a different operation or function based on the selected virtual display element.
  • Some embodiments of the inventive arrangements provide a machine-readable medium or media having instructions recorded thereon for a processor or computer to operate an imaging apparatus to perform one or more embodiments of the methods described herein.
  • the medium or media may be any type of CD-ROM, DVD, floppy disk, hard disk, optical disk, flash RAM drive, and/or other type of computer-readable medium, and/or a combination thereof.
  • Such a computer or processor may include a computing device, an input device, a display unit, and/or an interface, for example, for accessing the Internet.
  • the computer or processor may include a microprocessor.
  • the microprocessor may be connected to a communication bus.
  • the computer or processor may also include a memory.
  • the memory may include Random Access Memory (RAM) and/or Read Only Memory (ROM).
  • the computer or processor may further include a storage device, which may be a hard disk drive or a removable storage drive, such as a floppy disk drive, optical disk drive, and/or the like.
  • the storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
  • As used herein, the term “computer” may include any processor-based or microprocessor-based system, including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein.
  • the above examples are exemplary only and thus not intended to limit in any way the definition and/or meaning of the term “computer.”
  • the computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data.
  • the storage elements may also store data or other information as desired and/or needed.
  • the storage element may be in the form of an information source or a physical memory element within a processing machine.
  • the set of instructions may include various commands that instruct the computer or processor as a processing machine to perform specific operations, such as the methods and processes of the various embodiments of the inventive arrangements.
  • the set of instructions may be in the form of a software program.
  • the software may be in various forms, such as system software or application software.
  • the software may be in the form of a collection of separate programs, a program module within a larger program, or a portion of a program module.
  • the software may also include modular programming in the form of object-oriented programming.
  • the processing of input data by the processing machine may be in response to user commands, or in response to results of previous processing, or in response to a request made by another processing machine.
  • the terms “software” and “firmware” are interchangeable and include any computer program stored in memory for execution by a computer, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and/or non-volatile RAM (NVRAM) memory.

Abstract

A user interface for an ultrasound system is provided. The ultrasound system includes the user interface having at least one user control member and a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display. A function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to and the benefit of the filing date of U.S. Provisional Application No. 60/914,893, filed Apr. 30, 2007 for “PORTABLE 3D/4D ULTRASOUND,” which is hereby incorporated by reference in its entirety.
  • BACKGROUND OF INVENTION
  • This invention relates generally to ultrasound systems and, more particularly, to a user interface for controlling ultrasound imaging systems, especially portable ultrasound medical imaging systems.
  • Ultrasound systems typically include ultrasound scanning devices, such as ultrasound probes having transducers that allow for performing various ultrasound scans (e.g., imaging a volume or body). The ultrasound probes are typically connected to an ultrasound system for controlling the operation of the probes. The ultrasound system usually includes a control portion (e.g., a control console or portable unit) that provides interfaces for interacting with a user, such as receiving user inputs. For example, different buttons, knobs, etc. can be provided to allow a user to select different options and control the scanning of an object using the connected ultrasound probe.
  • When using volume probes, for example three-dimensional (3D) or four-dimensional (4D) probes, certain procedures may require multiple steps and adjustments that can be controlled by different controllers, for example, using several rotatable control members (commonly referred to as rotaries) to adjust different settings. As a result, numerous control members of each of several different types can be included as part of the control portion. The control members are often mode dependent such that each of the control members controls a different function or allows adjusting a different setting based on the mode of operation, for example, a visualization or rendering mode of operation.
  • As the size of ultrasound systems continues to decrease, the space available for the various controllers on the control portion is limited. Moreover, as processing power continues to increase, portable ultrasound systems, which have increasingly smaller footprints, often include an entire ultrasound system (e.g., processing components, etc.) embodied within a housing having the dimensions of a typical laptop computer or smaller. Thus, the same functionality is often now available in portable systems as in larger systems. However, with the reduced space available in a compact unit, the reduced number of available control members can make it difficult or complex to control certain procedures or adjust different parameters. In some instances, these portable ultrasound systems may not have enough controls to allow a user to control all of the operations that would otherwise be available on a larger system, but that are still desirable in portable systems.
  • BRIEF DESCRIPTION OF INVENTION
  • In accordance with one embodiment, an ultrasound system is provided that includes a user interface having at least one user control member and a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display. A function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.
  • In accordance with another embodiment, an ultrasound system is provided that includes an ultrasound volume probe for acquiring one of three-dimensional (3D) ultrasound data and four-dimensional (4D) ultrasound data and a portable control unit having a user interface and a display. The ultrasound volume probe is connected to the portable control unit and wherein manipulation of one of the 3D ultrasound data and 4D ultrasound data is provided by a single user control member of the user interface.
  • In accordance with yet another embodiment, a method for controlling an ultrasound probe using a portable ultrasound system is provided. The method includes receiving a user input selecting one of a plurality of virtual display elements on a display on the portable ultrasound system. The method further includes configuring a user control member of the portable ultrasound system based on the received user input to control an operation of the portable ultrasound system.
  • BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 is a block diagram of an ultrasound system formed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 2 is a block diagram of the ultrasound processor module of FIG. 1 formed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 3 is a top perspective view of a portable ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.
  • FIG. 4 is a top plan view of a user interface of the portable ultrasound imaging system of FIG. 3.
  • FIG. 5 is an elevation view of a backend of the portable ultrasound imaging system of FIG. 3.
  • FIG. 6 is a side elevation view of the portable ultrasound imaging system of FIG. 3.
  • FIG. 7 is a perspective view of a case for the portable ultrasound imaging system of FIG. 3.
  • FIG. 8 is a perspective view of a movable cart that is capable of supporting the portable ultrasound imaging system of FIG. 3.
  • FIG. 9 is a top view of a hand carried or pocket-sized ultrasound imaging system formed in accordance with an exemplary embodiment of the inventive arrangements having at least one reconfigurable user input member.
  • FIG. 10 is a screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 11 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 12 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 13 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 14 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 15 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 16 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 17 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 18 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 19 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • FIG. 20 is another screenshot of a display illustrating a plurality of virtual display elements displayed in accordance with an exemplary embodiment of the inventive arrangements.
  • DETAILED DESCRIPTION OF VARIOUS PREFERRED EMBODIMENTS
  • The foregoing summary, as well as the following detailed description of certain embodiments of the inventive arrangements, will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. Thus, for example, one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or random access memory, hard disk, or the like). Similarly, the programs may be stand alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
  • As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” of the inventive arrangements are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising” or “having” an element or a plurality of elements having a particular property may include additional such elements not having that property.
  • It should be noted that although the various embodiments may be described in connection with an ultrasound system, the methods and systems described herein are not limited to ultrasound imaging. In particular, the various embodiments may be implemented in connection with different types of medical imaging, including, for example, magnetic resonance imaging (MRI) and computed-tomography (CT) imaging. Further, the various embodiments may be implemented in other non-medical imaging systems, for example, non-destructive testing systems.
  • Exemplary embodiments of ultrasound systems provide a user interface for an ultrasound system. A plurality of virtual display elements (e.g., display icons) are selectable by a user to change the function controlled by a particular user control member. The selection of the virtual display elements reconfigures one or more of the user control members for controlling certain parameters, settings, etc. based on the selected virtual display element.
  • FIG. 1 illustrates a block diagram of an ultrasound system 20 formed in accordance with various embodiments of the inventive arrangements. The ultrasound system 20 includes a transmitter 22 that drives an array of elements 24 (e.g., piezoelectric crystals) within a transducer 26 to emit pulsed ultrasonic signals into a body or volume. A variety of geometries may be used, and the transducer 26 may be provided as part of, for example, different types of ultrasound probes. For example, the ultrasound probe may be a volume probe such as a three-dimensional (3D) probe or a four-dimensional (4D) probe wherein the array of elements 24 can be mechanically moved. The array of elements 24 may be swept or swung about an axis powered by a motor 25. In these embodiments, movement of the array of elements 24 is controlled by a motor controller 27 and motor driver 29. However, it should be noted that the ultrasound system 20 may have connected thereto an ultrasound probe that is not capable of mechanical movement of the array of elements 24. In such embodiments, the motor controller 27 and motor driver 29 may or may not be provided and/or may be deactivated. Accordingly, the motor controller 27 and motor driver 29 are optionally provided.
  • The emitted pulsed ultrasonic signals are back-scattered from structures in a body, for example, blood cells or muscular tissue, to produce echoes that return to any of the elements 24. The echoes are received by a receiver 28. The received echoes are provided to a beamformer 30 that performs beamforming and outputs an RF signal. The RF signal is then provided to an RF processor 32 that processes the RF signal. Alternatively, the RF processor 32 may include a complex demodulator (not shown) that demodulates the RF signal to form IQ data pairs representative of the echo signals. The RF or IQ signal data may then be provided directly to a memory 34 for storage (e.g., temporary storage).
  • The ultrasound system 20 also includes a processor module 36 to process the acquired ultrasound information (e.g., RF signal data or IQ data pairs) and prepare frames of ultrasound information for display on a display 38. The processor module 36 is adapted to perform one or more processing operations according to a plurality of selectable ultrasound modalities on the acquired ultrasound information. Acquired ultrasound information may be processed in real-time during a scanning session as the echo signals are received. Additionally or alternatively, the ultrasound information may be stored temporarily in the memory 34 during a scanning session and processed in less than real-time in a live or off-line operation. An image memory 40 is included for storing processed frames of acquired ultrasound information that are not scheduled to be displayed immediately. The image memory 40 may comprise any known data storage medium, for example, a permanent storage medium, removable storage medium, etc.
  • The processor module 36 is connected to a user interface 42 that controls operation of the processor module 36 as explained below in more detail and is configured to receive inputs from an operator. The display 38 includes one or more monitors that present patient information, including diagnostic ultrasound images to the user for review, diagnosis, and/or analysis. The display 38 may automatically display, for example, one or more planes from a 3D ultrasound data set stored in the memory 34 or 40. One or both of the memories 34, 40 may store 3D data sets of the ultrasound data, where such 3D data sets are accessed to present 2D and 3D images. For example, a 3D ultrasound data set may be mapped into the corresponding memory 34 or 40, as well as one or more reference planes. The processing of the data, including the data sets, is based, at least in part, on user inputs, for example, user selections received at the user interface 42.
  • The display 38 also may display one or more virtual display elements 49 that are selectable by a user as described in more detail below. Based on the selection of a virtual display element 49 , one or more corresponding controls of the user interface 42 , for example, the operations controlled by a trackball and/or the like (not shown), may be reconfigured.
  • In operation, the ultrasound system 20 acquires data, for example, volumetric data sets by various techniques (e.g., 3D scanning, real-time 3D imaging, volume scanning, 2D scanning with transducers having positioning sensors, freehand scanning using a voxel correlation technique, scanning using 2D or matrix array transducers, etc.). The data may be acquired by mechanically moving the array of elements 24 of the transducer 26, for example, by performing a sweeping type of scan. The transducer 26 also may be moved manually, such as along a linear or arcuate path, while scanning a region of interest (ROI). At each linear or arcuate position, the transducer 26 obtains scan planes that are stored in the memory 34.
  • FIG. 2 illustrates an exemplary block diagram of the processor module 36 of FIG. 1. The processor module 36 is illustrated conceptually as a collection of sub-modules, but it may also be implemented utilizing any combination of dedicated hardware boards, digital signal processors (DSPs), processors, etc. Alternatively, the sub-modules of FIG. 2 may be implemented utilizing an off-the-shelf PC with a single processor or multiple processors, with functional operations distributed between the processors. As a further option, the sub-modules of FIG. 2 may also be implemented utilizing a hybrid configuration, in which certain modular functions are performed utilizing dedicated hardware, while the remaining modular functions are performed utilizing an off-the-shelf PC and/or the like. The sub-modules also may be implemented as software modules within a processing unit.
  • The operations of the sub-modules illustrated in FIG. 2 may be controlled by a local ultrasound controller 50 or by the processor module 36. The sub-modules 52-68 perform mid-processor operations. The ultrasound processor module 36 may receive ultrasound data 70 in one of several forms. In the embodiment of FIG. 2, for example, the received ultrasound data 70 constitutes IQ data pairs representing the real and imaginary components associated with each data sample. The IQ data pairs are provided, for example, to one or more of a color-flow sub-module 52, a power Doppler sub-module 54, a B-mode sub-module 56, a spectral Doppler sub-module 58, and an M-mode sub-module 60. Other sub-modules may also be included, such as an Acoustic Radiation Force Impulse (ARFI) sub-module 62, a strain sub-module 64, a strain rate sub-module 66, a Tissue Doppler (TDE) sub-module 68, among others.
  • Each of the sub-modules 52-68 is configured to process the IQ data pairs in a corresponding manner to generate color-flow data 72, power Doppler data 74, B-mode data 76, spectral Doppler data 78, M-mode data 80, ARFI data 82, echocardiographic strain data 84, echocardiographic strain rate data 86, and tissue Doppler data 88, all of which may be stored in a memory 90 (or memory 34 or image memory 40 shown in FIG. 1) temporarily before subsequent processing. The data 72-88 may be stored, for example, as sets of vector data values, where each set defines an individual ultrasound image frame. The vector data values are generally organized based on the polar coordinate system.
  • A scan converter sub-module 92 accesses and obtains from the memory 90 the vector data values associated with an image frame and converts the set of vector data values to Cartesian coordinates to generate an ultrasound image frame 93 formatted for display. The ultrasound image frames 93 generated by the scan converter sub-module 92 may be provided back to the memory 90 for subsequent processing or may be provided to the memory 34 or image memory 40.
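The conversion performed by the scan converter sub-module 92 can be sketched as follows: for each Cartesian output pixel, compute the corresponding polar coordinates and look up the nearest vector sample. This is a simplified nearest-neighbor sketch under assumed geometry (transducer at the top-center, evenly spaced beams and depths), not the actual sub-module; all names and parameters are illustrative.

```python
import math

def scan_convert(vectors, num_samples, theta_min, theta_max, r_max, width, height):
    """Nearest-neighbor scan conversion of polar vector data to a Cartesian grid.

    `vectors` is a list of beams; each beam holds `num_samples` values taken at
    evenly spaced depths along evenly spaced beam angles.
    """
    image = [[0] * width for _ in range(height)]
    for row in range(height):
        for col in range(width):
            # Place the transducer at the top-center of the output image.
            x = (col - width / 2) * (2 * r_max / width)
            y = row * (r_max / height)
            r = math.hypot(x, y)
            theta = math.atan2(x, y)  # angle measured from the central beam axis
            if r <= r_max and theta_min <= theta <= theta_max:
                beam = round((theta - theta_min) / (theta_max - theta_min)
                             * (len(vectors) - 1))
                sample = min(int(r / r_max * num_samples), num_samples - 1)
                image[row][col] = vectors[beam][sample]
    return image
```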
  • Once the scan converter sub-module 92 generates the ultrasound image frames 93 associated with the data, the image frames may be re-stored in the memory 90 or communicated over a bus 96 to a database (not shown), the memory 34, the image memory 40, and/or to other processors (not shown).
  • A 2D video processor sub-module 94 may be used to combine one or more of the frames generated from the different types of ultrasound information. For example, the 2D video processor sub-module 94 may combine different image frames by mapping one type of data to a gray map and mapping the other type of data to a color map for video display. In the final displayed image, the color pixel data is superimposed on the gray scale pixel data to form a single multi-mode image frame 98 that is again re-stored in the memory 90 or communicated over the bus 96. Successive frames of images may be stored as a cine loop in the memory 90 or memory 40 (shown in FIG. 1). The cine loop represents a first in, first out circular image buffer to capture image data that is displayed in real-time to the user, such as one or more heart cycles. The user may freeze the cine loop by entering a freeze command at the user interface 42. The user interface 42 may include, for example, a keyboard, mouse, trackball, and/or all other input controls associated with inputting information into the ultrasound system 20 (shown in FIG. 1), which input controls may be reconfigured automatically based on selection of a virtual display element 49 (shown in FIG. 1) by the user.
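The cine loop described above behaves as a first-in, first-out circular buffer: once full, each new frame overwrites the oldest, and a freeze command stops acquisition. A minimal sketch of that behavior follows; the class and method names are illustrative, not part of the described system.

```python
class CineLoop:
    """First-in, first-out circular buffer of image frames for cine review."""

    def __init__(self, capacity):
        self.frames = [None] * capacity
        self.head = 0       # index where the next frame is written
        self.count = 0      # number of valid frames currently stored
        self.frozen = False

    def add_frame(self, frame):
        if self.frozen:
            return  # a freeze command stops acquisition into the loop
        self.frames[self.head] = frame
        self.head = (self.head + 1) % len(self.frames)
        self.count = min(self.count + 1, len(self.frames))

    def frozen_sequence(self):
        """Return the stored frames from oldest to newest."""
        start = (self.head - self.count) % len(self.frames)
        return [self.frames[(start + i) % len(self.frames)]
                for i in range(self.count)]
```

In production code the same behavior is available from `collections.deque` with a `maxlen`; the explicit version above makes the circular indexing visible.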
  • A 3D processor sub-module 100 is also controlled by the user interface 42 and accesses the memory 90 to obtain spatially consecutive groups of ultrasound image frames (that may be acquired, for example, by a sweeping ultrasound scan) and to generate three dimensional image representations thereof, such as through volume rendering or surface rendering algorithms, as are known. The three-dimensional images may be generated utilizing various imaging techniques, such as ray-casting, maximum intensity pixel projection, and/or the like. Additionally, the three-dimensional images may be displayed over time, thereby providing four-dimensional operation, as is known.
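Of the rendering techniques mentioned, maximum intensity projection is the simplest to sketch: each output pixel takes the maximum voxel value along the viewing axis. The sketch below assumes the volume is a list of 2D slices viewed along the slice axis; it is an illustration of the general technique, not the 3D processor sub-module 100.

```python
def max_intensity_projection(volume):
    """Project a 3D volume (a list of 2D slices) onto a 2D image by taking,
    for each pixel position, the maximum value along the slice (viewing) axis.
    """
    depth = len(volume)
    rows, cols = len(volume[0]), len(volume[0][0])
    return [[max(volume[z][r][c] for z in range(depth)) for c in range(cols)]
            for r in range(rows)]
```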
  • Various embodiments of the inventive arrangements can also be implemented in a miniaturized ultrasound imaging system, for example, a portable ultrasound imaging system 110 as shown in FIG. 3. The portable ultrasound imaging system 110 may be, for example, a Voluson i compact 4D ultrasound system available from G.E. Healthcare in Waukesha, Wis. The portable ultrasound imaging system 110 controls a probe (not shown) connected to the portable ultrasound imaging system 110 via a probe connector 112 that may be locked to the portable ultrasound imaging system 110 using a probe locking handle 114. The user interface 42 includes a plurality of user inputs and/or controls, which may be of different types, and are configured to receive commands from a user or operator. For example, the user interface 42 may include a plurality of “soft” buttons 116, for example, toggle buttons, and a keyboard 118, for example, an alphanumeric keyboard. Additionally, a functional keyboard portion 120 may be provided that includes other user selectable buttons and controls. Other user controls also may be provided, such as a trackball 122 having a trackball ring 124 and a plurality of associated buttons 126, which may be activated by the fingers of a user when operating the trackball 122. A plurality of sliding control members 128 (e.g., time gain control potentiometers) may also be provided, for example, adjacent the keyboard 118.
  • The portable ultrasound imaging system 110 also includes a display 130, for example, an integrated LCD display with a display latch 132 provided to lock the display 130 to the user interface 42. A power button 134 is provided to power on and off the portable ultrasound imaging system 110. The portable ultrasound imaging system 110 with the user interface 42 and the display defines a portable control unit.
  • It should be noted that as used herein, “miniaturized” generally means that the ultrasound system 110 is a handheld or hand-carried device and/or is configured to be carried in a person's hand, pocket, briefcase-sized case, backpack, and/or the like. For example, the ultrasound system 110 may be a hand-carried device having a size of a typical laptop computer, for instance, having dimensions of approximately 2.5 inches in depth, approximately 14 inches in width, and approximately 12 inches in height. The ultrasound system 110 may weigh about ten pounds or less, and is thus easily portable by the operator. The display 130 is configured to display, for example, a medical image and virtual display elements, as described below.
  • It further should be noted that ultrasonic data from the portable ultrasound imaging system 110 may be sent to an external device (not shown), such as a printer or display, via a wired or wireless network (or direct connection, for example, via a serial or parallel cable or USB port). In some embodiments, the external device may be a computer or a workstation having a display. Alternatively, the external device may be a separate external display or a printer capable of receiving image data from the portable ultrasound imaging system 110 and of displaying or printing images that may have greater resolution than the display 130.
  • With particular reference to the user interface 42, and as shown in more detail in FIG. 4, a plurality of user controls may be provided as part of the user interface 42. For example, “soft” buttons 116 may include a first menu button 140, a second menu button 142, a third menu button 144, and a fourth menu button 146, each capable of movement in four directions. A plurality of imaging buttons 148 may also be provided to select different imaging functions or operations. A plurality of mode selection buttons 136 also may be provided to select different scanning modes, for example, 2D, 4D, pulsed wave Doppler (PW), color flow mode (CFM), etc. The functional keyboard portion 120 also includes other user selectable buttons and controls, such as buttons that allow for obtaining saved information, storing information, manipulating information or displayed images, calculating measurements relating to displayed images, changing a display format, etc.
  • The portable ultrasound imaging system 110 also includes internal and external connections on a back end 160 as shown in FIG. 5 and on a side portion 170 as shown in FIG. 6. For example, the back end 160 may include a VGA connector 162 (for connection, for example, to an external monitor), an RGB connector 164 (for connection, for example, to a printer) and a power supply input 166. A network connector 168, for example, an Ethernet LAN input/output, also may be provided, and one or more USB connectors 169 may be provided. On the side portion 170, for example, a probe connection 172 for connection to a probe may be provided, as well as the probe locking handle 114. It should be noted that different or additional connectors may be provided as desired or known, for example, based on the scanning applications for the portable ultrasound imaging system 110.
  • The portable ultrasound imaging system 110 also may be transported, stored, or operated in a case 180, as shown in FIG. 7. The case 180 may be, for example, a padded case to protect the portable ultrasound imaging system 110.
  • The portable ultrasound imaging system 110 also may be configured for mounting to or to be supported by a moveable base 190, for example, a movable cart as shown in FIG. 8. The moveable base 190 includes a support portion 192 for receiving and supporting the portable ultrasound imaging system 110 and a tray portion 194 that may be used, for example, to store peripherals. The movable base 190 also may include one or more probe holders 196 for supporting and holding therein one or more ultrasound probes, for example, one probe connected to the portable ultrasound imaging system 110 and other probes configured to be connected to the portable ultrasound imaging system 110. A foot rest 198 also may be provided. Accordingly, the portable ultrasound imaging system 110 may be configured to appear like a console-based type ultrasound imaging system.
  • However, it should be noted that the various embodiments may be implemented in connection with ultrasound systems having different sizes and shapes. For example, a hand carried or pocket-sized ultrasound imaging system 200 may be provided as shown in FIG. 9. In such a system 200, the display 130 and user interface 42 can form a single unit. By way of example, the pocket-sized ultrasound imaging system 200 may be a pocket-sized or hand-sized ultrasound system approximately 2 inches wide, approximately 4 inches in length, and approximately ½ inch in depth and/or weigh less than 3 ounces. The display 130 may be, for example, a 320×320 pixel color LCD display (on which a medical image 210 can be displayed). A typewriter-like keyboard 202 of buttons 203 may optionally be included in the user interface 42. It should be noted that the various embodiments may be implemented in connection with a pocket-sized ultrasound system 200 having different dimensions, weights, and/or power consumption.
  • Multi-function controls 204 may each be assigned functions in accordance with the mode of system operation. Therefore, each of the multi-function controls 204 may be configured to provide a plurality of different actions. Label display areas 206 associated with the multi-function controls 204 may be included as necessary on the display 130. The system 200 may also have additional keys and/or controls 208 for special purpose functions, which may include, but are not limited to “freeze,” “depth control,” “gain control,” “color-mode,” “print,” and “store.”
  • Various embodiments of the inventive arrangements provide virtual display elements (e.g., display icons) that are selectable by a user to change the function controlled by a particular user control. The selection of the virtual display elements reconfigures one or more of the user controls for controlling certain parameters, settings, etc. based on the selected virtual display element. In general, and as shown in FIG. 10, a user is presented with a plurality of virtual display elements 220 a-220 e that may be displayed, for example, on a screen 222, such as the display 38 (shown in FIG. 1). The virtual display elements 220 a-220 e are displayed on the screen 222 adjacent (e.g., surrounding) or proximate an image that is selected by a user. For example, when a user places a virtual pointer 224 (e.g., virtual cross-hairs) over a particular image 225, the virtual display elements 220 a-220 e are displayed on the screen 222. It should be noted that the virtual display elements 220 a-220 e disappear once the image 225 is no longer selected, for example, when the virtual pointer 224 is moved away from the image 225, when another image 226 is selected or the virtual pointer 224 is moved over that other image, or when another user control member is activated (e.g., depressed). However, the virtual display elements 220 a-220 e may continue to be displayed in connection with the image 225 for a predetermined period of time (e.g., 2 seconds) even after the image 225 is no longer selected.
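The show/hide behavior described above, including the grace period after deselection, can be sketched as a small state holder. The class, its names, and the injectable clock are illustrative assumptions, not part of the described system; the 2-second default matches the example in the text.

```python
import time

class ElementOverlay:
    """Show virtual display elements while an image is selected, and keep
    them visible for a grace period after the selection is lost."""

    def __init__(self, linger_seconds=2.0, clock=time.monotonic):
        self.linger = linger_seconds
        self.clock = clock          # injectable for testing
        self.selected = False
        self.deselected_at = None   # time at which selection was lost

    def pointer_moved(self, over_image):
        if over_image:
            self.selected = True
            self.deselected_at = None
        elif self.selected:
            self.selected = False
            self.deselected_at = self.clock()

    def elements_visible(self):
        if self.selected:
            return True
        if self.deselected_at is None:
            return False
        # Elements linger for a predetermined period after deselection.
        return self.clock() - self.deselected_at < self.linger
```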
  • With the virtual display elements 220 a-220 e displayed on the screen 222, a user may select one of the virtual display elements 220 a-220 e. Upon selecting one of the virtual display elements 220 a-220 e, the corresponding function represented by that virtual display element 220 a-220 e is now adjusted or controlled by one of the controls of the user interface 42 (shown in FIG. 4), for example, the trackball 122. Accordingly, when a virtual display element 220 a-220 e is selected, the operation of the trackball 122 is reconfigured and the control thereof remapped, for example, as shown in Table 1 below.
  • TABLE 1

      Screen Ctrl            Icon             Meaning
      Rot X                  [icon P00001]    Rotate around X-axis when in ref image A (respective axis in B, C, 3D).
      Rot Y                  [icon P00002]    Rotate around Y-axis when in ref image A (respective axis in B, C, 3D).
      Rot Z                  [icon P00003]    Rotate around Z-axis when in ref image A (respective axis in B, C, 3D).
      Parallel Shift         [icon P00004]    Shift in Z-direction.
      Curved Render Start    [icon P00005]    Move curved render start.
      Move                   [icon P00006]    Move the data around.
      Borders of Renderbox   (none)           Can be selected to resize the Renderbox.
      Home3D                 [icon P00007]    Switch 3D rendered image back to initial 3D position.
      Home3Dflat             [icon P00008]    Toggle between 3D view and flat 3D view of the rendered image.
  • Accordingly, and for example, if a particular virtual display element 220 a is selected, which may be selected by using the trackball 122 to move the virtual pointer 224 to the image 225 and pressing one of the buttons 126 (shown in FIG. 4), then the trackball 122 is reconfigured to control or adjust the parameter, function, etc. corresponding to that virtual display element 220 a, which, in the embodiment shown in Table 1, is to control rotation around the X-axis of the image 225. Thus, the operation of the trackball 122 is reconfigured to control or adjust the X-axis rotation. A user may then click one of the buttons 126 to return the trackball 122 to controlling movement of the virtual pointer 224 and allowing selection of one of the other virtual display elements 220 b-220 e. Alternatively, another one of the buttons of the user interface 42 may deselect the operation corresponding to a virtual display element 220 a and allow the selection of one of the other virtual display elements 220 b-220 e.
  • It should be noted that the virtual display elements 220 a-220 e may be configured as different icons and correspond to different function or operations than those illustrated in Table 1. It also should be noted that the selection of one of the virtual display elements 220 a-220 e may, instead of reconfiguring the trackball 122, reconfigure another user control or the user interface 42 or an external user control (e.g., a connected mouse).
  • Moreover, other information or selectable elements may be displayed on the screen 222. For example, a plurality of selectable elements 230 may be provided to allow for the selection of a particular visualization mode.
  • Referring now to FIGS. 11-20 illustrating exemplary screenshots 232 including the virtual display elements 220 a-220 e, a render visualization mode (which may be selected using the selectable elements 230) for a 4D realtime acquisition is shown. Specifically, as shown in FIG. 11, the virtual pointer 224, illustrated as a mouse pointer, is moved, for example, using the trackball 122 (shown in FIG. 4), over a side 240 of a render box 242 (identifying the region of the image 244 to be rendered). The side 240 may be highlighted (e.g., highlighted by a color) when the virtual pointer 224 is placed over the side 240. When the side 240 is selected, the virtual display elements 220 a-220 d and 220 f (e.g., icons) are displayed. It should be noted that the virtual display element 220 e is not displayed in this screenshot, but it may be displayed in some embodiments.
  • As shown in FIG. 12, the virtual pointer 224 has now been placed over virtual display element 220 f (the icon shaped as a dot) that corresponds to a curved render start function. When the virtual display element 220 f is selected, or when the virtual pointer 224 is moved over the virtual display element 220 f, the virtual display element 220 f may be highlighted (e.g., highlighted or shadowed in yellow). Once the virtual display element 220 f is selected, the trackball 122 is reconfigured to adjust the curved render start function as shown in FIG. 13. Once selected, the virtual display element 220 f may be highlighted differently (e.g., highlighted in a different color, such as red) and a curved render start portion 244 of the render box 242 is displayed. It should be noted that once the virtual display element 220 f is selected, the other virtual display elements 220 a-220 d disappear, and when the trackball 122 is moved, the curved render start portion 244 is changed, for example, curved as adjusted by the trackball 122 instead of straight as shown in FIG. 12. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the other virtual display elements 220 a-220 d appear again.
  • It also should be noted that a virtual representation 246 of the trackball 122 may be displayed on the display 130 and indicate the functions corresponding to the trackball 122 and the buttons 126 in the current active display mode.
  • As shown in FIG. 14, another side 248 (or border) of the render box 242 may be selected, which reconfigures the functionality of the trackball 122 to allow adjustment of the size of the render box 242. The side 248 may be highlighted (e.g., highlighted in red) and all of the virtual display elements 220 a-220 d and 220 f disappear. When the trackball 122 is now moved, the size of the render box 242 is changed. For example, the render box 242 is now smaller in FIG. 14 than in FIGS. 11-13. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220 a-220 d and 220 f appear again.
  • In the screenshot 232 of FIG. 15, the virtual display element 220 a has been selected and the other virtual display elements 220 b-220 d and 220 f have disappeared. The selected virtual display element 220 a corresponds to rotation about the y-axis, which now may be adjusted by the trackball 122 that is reconfigured to control this operation. The selected virtual display element 220 a may be highlighted (e.g., highlighted in red) and when the trackball 122 is moved, the volume data displayed is rotated about the y-axis, with the content of the displayed images 244 and 245 changed accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220 b-220 d and 220 f appear again.
  • In FIG. 16, the virtual pointer 224 has been moved over the image 245. When the virtual pointer 224 is moved over the image 245, a different set of virtual display elements 220 a-220 d and now 220 g appear. In particular, the virtual display element 220 g now appears and is configured as a “house” icon. It should be noted that a render box 241 may now appear on the image 245. The virtual display element 220 g when selected changes the 3D display. In particular, as shown in FIG. 17, the rendered image, specifically, the image 245 is now displayed at an angle, the render box 241 is displayed as a three-dimensional box and the virtual display element 220 g changes shape, for example, the “house” icon is rotated. If the virtual display element 220 g is again selected, the image 245 will again appear as shown in FIG. 16 and the shape of the virtual display element 220 g will return to the “house” icon as shown in FIG. 16.
  • FIG. 18 illustrates a quad-view mode in which the visualization mode now shows sectional planes. The virtual pointer 224 is now shown as moved over a center dot 252 in the image 250 and the dot is marked, for example, with a marker 254, such as cross-hairs that may be highlighted, for example, highlighted in yellow. The user may then select the marker 254, which may, for example, change color to red; a move center dot function is then assigned to the trackball 122. All of the other virtual display elements 220 a-220 d also disappear. When the trackball 122 is moved, the center dot 252 is moved and the content of the image 250, as well as the images 260 and 262, is changed accordingly. It should be noted that once the adjustment is complete, a user may press any of the buttons 126 (shown in FIG. 4) and the virtual display elements 220 a-220 d appear again.
  • As shown in FIG. 20, the virtual pointer 224 has been moved over the image 260 and not over any of the virtual display elements 220 a-220 d. The virtual pointer 224 now has a different shape, for example, a hand instead of an arrow or pointer. In this mode, if one of the buttons 126 is selected (e.g., pressed by a user), a move image functionality is selected and assigned to the trackball 122. When the trackball 122 is moved, the image 260 is moved on the display.
  • Thus, a single user control member can be used to manipulate, for example, 3D or 4D ultrasound data. For example, by assigning different operations to the single user control member based on selecting from a plurality of virtual display elements, the single user control member is reconfigured to control different operations or adjust different settings, parameters, etc.
  • It should be noted that some (or all) of the virtual display elements 220 a-220 g may be displayed in different imaging modes, for example, a tomographic ultrasound image (TUI) mode or a SonoVCAD mode. However, different virtual display elements corresponding to different operations or functions may be displayed in addition to or instead of some or all of the virtual display elements 220 a-220 g. Also, it should be noted that in some modes, only a specific image or images can be adjusted and, accordingly, the virtual display elements only appear when the virtual pointer 224 is moved over those images. It also should be noted that only a single image may be displayed instead of the multiple images as illustrated.
  • Accordingly, the various embodiments automatically reconfigure the operation of a user control member (e.g., a trackball) based on a selected virtual display element such that the control operations performed by the user control member are remapped. The user control member is thereby used to adjust or control different functions based on the virtual display element selected. In one embodiment, based on the selected virtual display element corresponding to a particular function, setting, parameter, etc., the movement of the user control member is remapped to, for example, allow the particular function, setting, parameter, etc. to be adjusted or changed based on the movement of the user control member that has been remapped. For example, a table or database is accessed and the corresponding motion of the user control member is mapped for the particular function, setting, parameter, etc. Thereafter, the relative movement of the user control member adjusts the particular function, setting, parameter, etc. corresponding to the selected virtual display element.
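The table-driven remapping just described can be sketched as a dictionary from selected elements to motion handlers, loosely following Table 1. Everything here is a hypothetical illustration: the handler names, the state dictionary, and the subset of table entries are assumptions, not the actual table or database of the described system.

```python
# Hypothetical handlers for trackball motion (dx, dy are relative movements).
def rotate_x(state, dx, dy):
    state["rot_x"] += dy

def rotate_y(state, dx, dy):
    state["rot_y"] += dx

def shift_z(state, dx, dy):
    state["z"] += dy

def move_pointer(state, dx, dy):
    state["pointer"] = (state["pointer"][0] + dx, state["pointer"][1] + dy)

# Mapping from selected virtual display element to the function the
# trackball controls; None is the default pointer-movement mode.
REMAP_TABLE = {
    None: move_pointer,
    "Rot X": rotate_x,
    "Rot Y": rotate_y,
    "Parallel Shift": shift_z,
}

class Trackball:
    def __init__(self, state):
        self.state = state
        self.handler = REMAP_TABLE[None]

    def select_element(self, element):
        """Remap trackball motion to the selected element's function."""
        self.handler = REMAP_TABLE[element]

    def moved(self, dx, dy):
        # Relative movement adjusts whatever function is currently mapped.
        self.handler(self.state, dx, dy)
```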
  • At least one technical effect of the various embodiments of the inventive arrangements is automatically changing the control function or operation of a user control member based on the selection of a virtual display element. The user control member is reconfigured or reassigned to control or adjust a different operation or function based on the selected virtual display element.
  • Some embodiments of the inventive arrangements provide a machine-readable medium or media having instructions recorded thereon for a processor or computer to operate an imaging apparatus to perform one or more embodiments of the methods described herein. The medium or media may be any type of CD-ROM, DVD, floppy disk, hard disk, optical disk, flash RAM drive, and/or other type of computer-readable medium, and/or a combination thereof.
  • The various embodiments and/or components, for example, the processors, or components and controllers therein, may also be implemented as part of one or more computers or processors. Such a computer or processor may include a computing device, an input device, a display unit, and/or an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and/or Read Only Memory (ROM). The computer or processor may further include a storage device, which may be a hard disk drive or a removable storage drive, such as a floppy disk drive, optical disk drive, and/or the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
  • As used herein, the term “computer” may include any processor-based or microprocessor-based system, including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only and thus not intended to limit in any way the definition and/or meaning of the term “computer.”
  • The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired and/or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.
  • The set of instructions may include various commands that instruct the computer or processor as a processing machine to perform specific operations, such as the methods and processes of the various embodiments of the inventive arrangements. The set of instructions may be in the form of a software program. The software may be in various forms, such as system software or application software. In addition, the software may be in the form of a collection of separate programs, a program module within a larger program, or a portion of a program module. The software may also include modular programming in the form of object-oriented programming. The processing of input data by the processing machine may be in response to user commands, or in response to results of previous processing, or in response to a request made by another processing machine.
  • As used herein, the terms “software” and “firmware” are interchangeable and include any computer program stored in memory for execution by a computer, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and/or non-volatile RAM (NVRAM) memory. The above memory types are exemplary only and are thus not limiting as to the types of memory usable for storage of a computer program.
  • It is to be understood that the above description is intended to be illustrative and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the inventive arrangements without departing from their scope. For example, the ordering of steps recited in a method need not be performed in a particular order unless explicitly stated or implicitly required (e.g., one step requires the results or a product of a previous step to be available). While some of the dimensions and types of materials described herein are intended to define the parameters of the inventive arrangements, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing and understanding the above description. The scope of the inventive arrangements should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and they are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
  • This written description uses examples to disclose the inventive arrangements, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices and/or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.

Claims (20)

1. An ultrasound system, comprising:
a user interface having at least one user control member; and
a display having a plurality of virtual display elements displayed thereon when a virtual pointer is positioned over an image displayed on the display and wherein a function controlled by the at least one user control member is determined based on a selected one of the plurality of virtual display elements.
2. The ultrasound system of claim 1, wherein the at least one user control member is configured to be operated to select one of the plurality of virtual display elements using the virtual pointer.
3. The ultrasound system of claim 1, wherein the at least one user control member comprises a trackball.
4. The ultrasound system of claim 1, wherein the display is configured to display only the selected one of the virtual display elements.
5. The ultrasound system of claim 4, wherein the display is configured to display the other display elements when one of (i) an adjustment using the at least one user control member is completed and (ii) a button corresponding to the user control member is activated.
6. The ultrasound system of claim 1, wherein the plurality of virtual display elements comprise icons representative of the corresponding controlled function.
7. The ultrasound system of claim 1, wherein the plurality of virtual display elements are displayed only when the virtual pointer is positioned over an image that can be changed.
8. The ultrasound system of claim 1, wherein the plurality of virtual display elements are changed based on one of a mode of operation and a mode of visualization.
9. The ultrasound system of claim 1, wherein the display automatically displays the plurality of virtual display elements when the virtual pointer is positioned over the image.
10. The ultrasound system of claim 1, wherein the virtual display elements are displayed adjacent the image.
11. The ultrasound system of claim 1, wherein the selected one of the plurality of virtual display elements is highlighted.
12. The ultrasound system of claim 1, wherein the function controlled by the at least one user control member comprises an adjustment.
13. The ultrasound system of claim 1, wherein an icon representing the selected one of the plurality of virtual display elements changes based on an input from the at least one user control member.
14. The ultrasound system of claim 1, wherein the display is configured to display a virtual representation of the at least one user control member along with at least one indicated function corresponding to at least one user control member and one or more buttons associated with the at least one user control member.
15. The ultrasound system of claim 1, further comprising:
a portable ultrasound unit including the user interface and the display.
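The behavior recited in claims 1–15 amounts to a small UI state machine: positioning the pointer over an adjustable image shows the virtual display elements, selecting one binds the single control member (e.g. a trackball) to that function, and completing the adjustment restores the other elements. A minimal sketch of that flow, with all names hypothetical and not taken from the patent:

```python
# Illustrative sketch (not the patented implementation) of the hover-driven
# control mapping described in claims 1-15.

class VirtualControlBar:
    """Maps a selected virtual display element to a trackball function."""

    def __init__(self, elements):
        self.elements = list(elements)   # e.g. ["rotate", "zoom", "pan"]
        self.visible = []                # elements currently shown
        self.selected = None             # function the trackball controls

    def on_pointer_over_image(self, image_adjustable):
        # Claims 7/9: elements appear automatically, but only over an
        # image that can be changed.
        self.visible = list(self.elements) if image_adjustable else []

    def on_select(self, element):
        # Claims 1/2/4: selection fixes the trackball's function; only
        # the selected element remains displayed during the adjustment.
        if element in self.visible:
            self.selected = element
            self.visible = [element]
        return self.selected

    def on_trackball_move(self, dx, dy):
        # Claim 12: trackball input drives the currently selected function.
        if self.selected is None:
            return None
        return (self.selected, dx, dy)

    def on_adjust_done(self):
        # Claim 5: completing the adjustment restores the other elements.
        self.visible = list(self.elements)
```

The element names and method signatures here are assumptions chosen for illustration; the claims do not prescribe any particular API.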
16. An ultrasound system, comprising:
an ultrasound volume probe for acquiring one of three-dimensional (3D) ultrasound data and four-dimensional (4D) ultrasound data; and
a portable control unit having a user interface and a display, the ultrasound volume probe connected to the portable control unit, and wherein manipulation of one of the 3D ultrasound data and 4D ultrasound data is provided by a single user control member of the user interface.
17. The ultrasound system of claim 16, wherein the single user control member comprises a trackball.
18. The ultrasound system of claim 16, wherein the display is configured to display a plurality of selectable virtual display elements and a type of manipulation provided by the single user control member is determined based on a selected one of the plurality of selectable virtual display elements.
19. The ultrasound system of claim 16, wherein the user interface does not include rotary controls and the manipulation is performed without the use of the rotary controls.
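Claims 16–19 recite manipulating 3D/4D volume data with a single control member and no rotary controls. One conventional way such a mapping can work, sketched here purely as an illustration (the sensitivity factor and axis convention are assumptions, not taken from the patent):

```python
import math

# Illustrative only: trackball deltas become rotation angles about the
# x and y axes, so a single control member suffices for volume rotation.

def trackball_to_rotation(dx, dy, sensitivity=0.01):
    """Convert trackball deltas (counts) to rotation angles (radians)."""
    return (dy * sensitivity, dx * sensitivity)  # (about x, about y)

def rotate_point(p, angle_x, angle_y):
    """Rotate a 3D point about the x axis, then about the y axis."""
    x, y, z = p
    ca, sa = math.cos(angle_x), math.sin(angle_x)
    y, z = y * ca - z * sa, y * sa + z * ca      # rotation about x
    cb, sb = math.cos(angle_y), math.sin(angle_y)
    x, z = x * cb + z * sb, -x * sb + z * cb     # rotation about y
    return (x, y, z)
```

Applying `rotate_point` to every voxel (or to the render camera) with angles from `trackball_to_rotation` gives the rotary-control-free manipulation the claims describe.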
20. A method for controlling an ultrasound probe using a portable ultrasound system, comprising:
receiving a user input selecting one of a plurality of virtual display elements on a display on the portable ultrasound system; and
configuring a user control member of the portable ultrasound system based on the received user input to control an operation of the portable ultrasound system.
US12/112,946 2007-04-30 2008-04-30 User interface for ultrasound system Abandoned US20090012394A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/112,946 US20090012394A1 (en) 2007-04-30 2008-04-30 User interface for ultrasound system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US91489307P 2007-04-30 2007-04-30
US12/112,946 US20090012394A1 (en) 2007-04-30 2008-04-30 User interface for ultrasound system

Publications (1)

Publication Number Publication Date
US20090012394A1 true US20090012394A1 (en) 2009-01-08

Family

ID=40222014

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/112,911 Expired - Fee Related US8038619B2 (en) 2007-04-30 2008-04-30 Motor driver for ultrasound system
US12/112,946 Abandoned US20090012394A1 (en) 2007-04-30 2008-04-30 User interface for ultrasound system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/112,911 Expired - Fee Related US8038619B2 (en) 2007-04-30 2008-04-30 Motor driver for ultrasound system

Country Status (1)

Country Link
US (2) US8038619B2 (en)


Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8721553B2 (en) * 2007-05-15 2014-05-13 General Electric Company Fluid-fillable ultrasound imaging catheter tips
JP2009011711A (en) * 2007-07-09 2009-01-22 Toshiba Corp Ultrasonic diagnosis apparatus
EP2413802A1 (en) * 2009-04-01 2012-02-08 Analogic Corporation Ultrasound probe
US8647279B2 (en) * 2010-06-10 2014-02-11 Siemens Medical Solutions Usa, Inc. Volume mechanical transducer for medical diagnostic ultrasound
US8684933B2 (en) 2010-08-17 2014-04-01 Imsonic Medical, Inc. Handheld ultrasound color flow imaging system with mechanically scanned, mechanically focused multi-element transducers
JP6069848B2 (en) * 2012-02-24 2017-02-01 セイコーエプソン株式会社 Probe head, ultrasonic probe, electronic device and diagnostic device
GB201204831D0 (en) 2012-03-20 2012-05-02 Netscientific Ltd Programmable medical devices
US10517569B2 (en) 2012-05-09 2019-12-31 The Regents Of The University Of Michigan Linear magnetic drive transducer for ultrasound imaging
US11406415B2 (en) 2012-06-11 2022-08-09 Tenex Health, Inc. Systems and methods for tissue treatment
US9414810B2 (en) * 2013-01-24 2016-08-16 B-K Medical Aps Ultrasound imaging system
JP2015136569A (en) * 2014-01-24 2015-07-30 日立金属株式会社 Ultrasonic probe
US9962181B2 (en) 2014-09-02 2018-05-08 Tenex Health, Inc. Subcutaneous wound debridement
CN112535499A (en) 2019-09-20 2021-03-23 巴德阿克塞斯系统股份有限公司 Automated vessel detection tool and method
EP4203799A1 (en) * 2020-09-10 2023-07-05 Bard Access Systems, Inc. Ultrasound probe with pressure measurement capability

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6039047A (en) * 1998-10-30 2000-03-21 Acuson Corporation Method and system for changing the appearance of a control region of a medical device such as a diagnostic medical ultrasound system
US20050168488A1 (en) * 2004-02-03 2005-08-04 Montague Roland W. Combination tool that zooms in, zooms out, pans, rotates, draws, or manipulates during a drag
US6961905B1 (en) * 2000-06-23 2005-11-01 Microsoft Corporation Method and system for modifying an image on a web page
US7010761B2 (en) * 2001-10-18 2006-03-07 Sony Computer Entertainment America Inc. Controller selectable hyperlinks
US7242387B2 (en) * 2002-10-18 2007-07-10 Autodesk, Inc. Pen-mouse system
US20090054768A1 (en) * 2007-08-24 2009-02-26 Menachem Halmann Method and apparatus for voice recording with ultrasound imaging

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5159931A (en) * 1988-11-25 1992-11-03 Riccardo Pini Apparatus for obtaining a three-dimensional reconstruction of anatomic structures through the acquisition of echographic images
US7534211B2 (en) * 2002-03-29 2009-05-19 Sonosite, Inc. Modular apparatus for diagnostic ultrasound
US7282878B1 (en) * 2006-04-28 2007-10-16 Rakov Mikhail A Systems for brushless DC electrical drive control


Cited By (119)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9272162B2 (en) 1997-10-14 2016-03-01 Guided Therapy Systems, Llc Imaging, therapy, and temperature monitoring ultrasonic method
US9907535B2 (en) 2000-12-28 2018-03-06 Ardent Sound, Inc. Visual imaging system for ultrasonic probe
US10039938B2 (en) 2004-09-16 2018-08-07 Guided Therapy Systems, Llc System and method for variable depth ultrasound treatment
US9114247B2 (en) 2004-09-16 2015-08-25 Guided Therapy Systems, Llc Method and system for ultrasound treatment with a multi-directional transducer
US9011336B2 (en) 2004-09-16 2015-04-21 Guided Therapy Systems, Llc Method and system for combined energy therapy profile
US9095697B2 (en) 2004-09-24 2015-08-04 Guided Therapy Systems, Llc Methods for preheating tissue for cosmetic treatment of the face and body
US11590370B2 (en) 2004-09-24 2023-02-28 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US10864385B2 (en) 2004-09-24 2020-12-15 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US10328289B2 (en) 2004-09-24 2019-06-25 Guided Therapy Systems, Llc Rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US9895560B2 (en) 2004-09-24 2018-02-20 Guided Therapy Systems, Llc Methods for rejuvenating skin by heating tissue for cosmetic treatment of the face and body
US10888718B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US10610706B2 (en) 2004-10-06 2020-04-07 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US11883688B2 (en) 2004-10-06 2024-01-30 Guided Therapy Systems, Llc Energy based fat reduction
US11717707B2 (en) 2004-10-06 2023-08-08 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US11697033B2 (en) 2004-10-06 2023-07-11 Guided Therapy Systems, Llc Methods for lifting skin tissue
US20100022922A1 (en) * 2004-10-06 2010-01-28 Guided Therapy Systems, L.L.C. Method and system for treating stretch marks
US11400319B2 (en) 2004-10-06 2022-08-02 Guided Therapy Systems, Llc Methods for lifting skin tissue
US11338156B2 (en) 2004-10-06 2022-05-24 Guided Therapy Systems, Llc Noninvasive tissue tightening system
US11235179B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc Energy based skin gland treatment
US11235180B2 (en) 2004-10-06 2022-02-01 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US8932224B2 (en) 2004-10-06 2015-01-13 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US9283409B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, Llc Energy based fat reduction
US9283410B2 (en) 2004-10-06 2016-03-15 Guided Therapy Systems, L.L.C. System and method for fat and cellulite reduction
US9320537B2 (en) 2004-10-06 2016-04-26 Guided Therapy Systems, Llc Methods for noninvasive skin tightening
US11207547B2 (en) 2004-10-06 2021-12-28 Guided Therapy Systems, Llc Probe for ultrasound tissue treatment
US9421029B2 (en) 2004-10-06 2016-08-23 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US9427600B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US9427601B2 (en) 2004-10-06 2016-08-30 Guided Therapy Systems, Llc Methods for face and neck lifts
US9440096B2 (en) 2004-10-06 2016-09-13 Guided Therapy Systems, Llc Method and system for treating stretch marks
US11179580B2 (en) 2004-10-06 2021-11-23 Guided Therapy Systems, Llc Energy based fat reduction
US11167155B2 (en) 2004-10-06 2021-11-09 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10960236B2 (en) 2004-10-06 2021-03-30 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US9522290B2 (en) 2004-10-06 2016-12-20 Guided Therapy Systems, Llc System and method for fat and cellulite reduction
US10888716B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, Llc Energy based fat reduction
US10888717B2 (en) 2004-10-06 2021-01-12 Guided Therapy Systems, Llc Probe for ultrasound tissue treatment
US9533175B2 (en) 2004-10-06 2017-01-03 Guided Therapy Systems, Llc Energy based fat reduction
US10610705B2 (en) 2004-10-06 2020-04-07 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US9694212B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, Llc Method and system for ultrasound treatment of skin
US9694211B2 (en) 2004-10-06 2017-07-04 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US9700340B2 (en) 2004-10-06 2017-07-11 Guided Therapy Systems, Llc System and method for ultra-high frequency ultrasound treatment
US9707412B2 (en) 2004-10-06 2017-07-18 Guided Therapy Systems, Llc System and method for fat and cellulite reduction
US9713731B2 (en) 2004-10-06 2017-07-25 Guided Therapy Systems, Llc Energy based fat reduction
US10603519B2 (en) 2004-10-06 2020-03-31 Guided Therapy Systems, Llc Energy based fat reduction
US10603523B2 (en) 2004-10-06 2020-03-31 Guided Therapy Systems, Llc Ultrasound probe for tissue treatment
US10532230B2 (en) 2004-10-06 2020-01-14 Guided Therapy Systems, Llc Methods for face and neck lifts
US9827449B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. Systems for treating skin laxity
US9827450B2 (en) 2004-10-06 2017-11-28 Guided Therapy Systems, L.L.C. System and method for fat and cellulite reduction
US9833639B2 (en) 2004-10-06 2017-12-05 Guided Therapy Systems, L.L.C. Energy based fat reduction
US9833640B2 (en) 2004-10-06 2017-12-05 Guided Therapy Systems, L.L.C. Method and system for ultrasound treatment of skin
US10525288B2 (en) 2004-10-06 2020-01-07 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US8915853B2 (en) 2004-10-06 2014-12-23 Guided Therapy Systems, Llc Methods for face and neck lifts
US10265550B2 (en) 2004-10-06 2019-04-23 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US8915870B2 (en) 2004-10-06 2014-12-23 Guided Therapy Systems, Llc Method and system for treating stretch marks
US10252086B2 (en) 2004-10-06 2019-04-09 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10245450B2 (en) 2004-10-06 2019-04-02 Guided Therapy Systems, Llc Ultrasound probe for fat and cellulite reduction
US10010726B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US10010724B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US10010721B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, L.L.C. Energy based fat reduction
US10010725B2 (en) 2004-10-06 2018-07-03 Guided Therapy Systems, Llc Ultrasound probe for fat and cellulite reduction
US10238894B2 (en) 2004-10-06 2019-03-26 Guided Therapy Systems, L.L.C. Energy based fat reduction
US9039619B2 (en) 2004-10-06 2015-05-26 Guided Therapy Systems, L.L.C. Methods for treating skin laxity
US10046181B2 (en) 2004-10-06 2018-08-14 Guided Therapy Systems, Llc Energy based hyperhidrosis treatment
US10046182B2 (en) 2004-10-06 2018-08-14 Guided Therapy Systems, Llc Methods for face and neck lifts
US9974982B2 (en) 2004-10-06 2018-05-22 Guided Therapy Systems, Llc System and method for noninvasive skin tightening
US11724133B2 (en) 2004-10-07 2023-08-15 Guided Therapy Systems, Llc Ultrasound probe for treatment of skin
US11207548B2 (en) 2004-10-07 2021-12-28 Guided Therapy Systems, L.L.C. Ultrasound probe for treating skin laxity
US9566454B2 (en) 2006-09-18 2017-02-14 Guided Therapy Systems, Llc Method and sysem for non-ablative acne treatment and prevention
US11717661B2 (en) 2007-05-07 2023-08-08 Guided Therapy Systems, Llc Methods and systems for ultrasound assisted delivery of a medicant to tissue
US9216276B2 (en) 2007-05-07 2015-12-22 Guided Therapy Systems, Llc Methods and systems for modulating medicants using acoustic energy
WO2009097652A1 (en) * 2008-02-07 2009-08-13 Signostics Pty Ltd Remote display for medical scanning apparatus
US11123039B2 (en) 2008-06-06 2021-09-21 Ulthera, Inc. System and method for ultrasound treatment
US20110112405A1 (en) * 2008-06-06 2011-05-12 Ulthera, Inc. Hand Wand for Ultrasonic Cosmetic Treatment and Imaging
US10537304B2 (en) 2008-06-06 2020-01-21 Ulthera, Inc. Hand wand for ultrasonic cosmetic treatment and imaging
US11723622B2 (en) 2008-06-06 2023-08-15 Ulthera, Inc. Systems for ultrasound treatment
US9345910B2 (en) 2009-11-24 2016-05-24 Guided Therapy Systems Llc Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy
US9039617B2 (en) 2009-11-24 2015-05-26 Guided Therapy Systems, Llc Methods and systems for generating thermal bubbles for improved ultrasound imaging and therapy
US9504446B2 (en) 2010-08-02 2016-11-29 Guided Therapy Systems, Llc Systems and methods for coupling an ultrasound source to tissue
US10183182B2 (en) 2010-08-02 2019-01-22 Guided Therapy Systems, Llc Methods and systems for treating plantar fascia
US9149658B2 (en) * 2010-08-02 2015-10-06 Guided Therapy Systems, Llc Systems and methods for ultrasound treatment
US20120029353A1 (en) * 2010-08-02 2012-02-02 Guided Therapy Systems, Llc Systems and methods for ultrasound treatment
US8857438B2 (en) 2010-11-08 2014-10-14 Ulthera, Inc. Devices and methods for acoustic shielding
US9452302B2 (en) 2011-07-10 2016-09-27 Guided Therapy Systems, Llc Systems and methods for accelerating healing of implanted material and/or native tissue
US9011337B2 (en) 2011-07-11 2015-04-21 Guided Therapy Systems, Llc Systems and methods for monitoring and controlling ultrasound power output and stability
US9263663B2 (en) 2012-04-13 2016-02-16 Ardent Sound, Inc. Method of making thick film transducer arrays
US11086513B2 (en) 2012-04-26 2021-08-10 Samsung Electronics Co., Ltd. Method and apparatus for displaying function of button of ultrasound apparatus on the button
US10031666B2 (en) 2012-04-26 2018-07-24 Samsung Electronics Co., Ltd. Method and apparatus for displaying function of button of ultrasound apparatus on the button
US11726655B2 (en) 2012-04-26 2023-08-15 Samsung Electronics Co., Ltd. Method and apparatus for displaying function of button of ultrasound apparatus on the button
US9510802B2 (en) 2012-09-21 2016-12-06 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US9802063B2 (en) 2012-09-21 2017-10-31 Guided Therapy Systems, Llc Reflective ultrasound technology for dermatological treatments
US10499884B2 (en) 2012-12-06 2019-12-10 White Eagle Sonic Technologies, Inc. System and method for scanning for a second object within a first object using an adaptive scheduler
US9529080B2 (en) 2012-12-06 2016-12-27 White Eagle Sonic Technologies, Inc. System and apparatus having an application programming interface for flexible control of execution ultrasound actions
US11490878B2 (en) 2012-12-06 2022-11-08 White Eagle Sonic Technologies, Inc. System and method for scanning for a second object within a first object using an adaptive scheduler
US10235988B2 (en) 2012-12-06 2019-03-19 White Eagle Sonic Technologies, Inc. Apparatus and system for adaptively scheduling ultrasound system actions
US9773496B2 (en) 2012-12-06 2017-09-26 White Eagle Sonic Technologies, Inc. Apparatus and system for adaptively scheduling ultrasound system actions
US10076313B2 (en) 2012-12-06 2018-09-18 White Eagle Sonic Technologies, Inc. System and method for automatically adjusting beams to scan an object in a body
US9530398B2 (en) 2012-12-06 2016-12-27 White Eagle Sonic Technologies, Inc. Method for adaptively scheduling ultrasound system actions
US11883242B2 (en) 2012-12-06 2024-01-30 White Eagle Sonic Technologies, Inc. System and method for scanning for a second object within a first object using an adaptive scheduler
US9983905B2 (en) 2012-12-06 2018-05-29 White Eagle Sonic Technologies, Inc. Apparatus and system for real-time execution of ultrasound system actions
US11517772B2 (en) 2013-03-08 2022-12-06 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US10420960B2 (en) 2013-03-08 2019-09-24 Ulthera, Inc. Devices and methods for multi-focus ultrasound therapy
US10561862B2 (en) 2013-03-15 2020-02-18 Guided Therapy Systems, Llc Ultrasound treatment device and methods of use
US20150220259A1 (en) * 2013-07-01 2015-08-06 Samsung Electronics Co. Ltd. Method and apparatus for changing user interface based on user motion information
US20150301712A1 (en) * 2013-07-01 2015-10-22 Samsung Electronics Co., Ltd. Method and apparatus for changing user interface based on user motion information
US10558350B2 (en) 2013-07-01 2020-02-11 Samsung Electronics Co., Ltd. Method and apparatus for changing user interface based on user motion information
US9792033B2 (en) * 2013-07-01 2017-10-17 Samsung Electronics Co., Ltd. Method and apparatus for changing user interface based on information related to a probe
US10095400B2 (en) * 2013-07-01 2018-10-09 Samsung Electronics Co., Ltd. Method and apparatus for changing user interface based on user motion information
US9904455B2 (en) 2013-07-01 2018-02-27 Samsung Electronics Co., Ltd. Method and apparatus for changing user interface based on user motion information
US20150272547A1 (en) * 2014-03-31 2015-10-01 Siemens Medical Solutions Usa, Inc. Acquisition control for elasticity ultrasound imaging
US11351401B2 (en) 2014-04-18 2022-06-07 Ulthera, Inc. Band transducer ultrasound therapy
US10603521B2 (en) 2014-04-18 2020-03-31 Ulthera, Inc. Band transducer ultrasound therapy
US20180021015A1 (en) * 2015-02-09 2018-01-25 Hitachi, Ltd. Ultrasonic diagnostic device
US11224895B2 (en) 2016-01-18 2022-01-18 Ulthera, Inc. Compact ultrasound device having annular ultrasound array peripherally electrically connected to flexible printed circuit board and method of assembly thereof
US11241218B2 (en) 2016-08-16 2022-02-08 Ulthera, Inc. Systems and methods for cosmetic ultrasound treatment of skin
US11944849B2 (en) 2018-02-20 2024-04-02 Ulthera, Inc. Systems and methods for combined cosmetic treatment of cellulite with ultrasound
US20200060669A1 (en) * 2018-08-22 2020-02-27 Covidien Lp Surgical retractor including three-dimensional (3d) imaging capability
US10828020B2 (en) * 2018-08-22 2020-11-10 Covidien Lp Surgical retractor including three-dimensional (3D) imaging capability
KR102500589B1 (en) * 2019-10-22 2023-02-15 쥐이 프리시즌 헬스케어 엘엘씨 Method and system for providing freehand render start line drawing tools and automatic render preset selections
KR20210048415A (en) * 2019-10-22 2021-05-03 쥐이 프리시즌 헬스케어 엘엘씨 Method and system for providing freehand render start line drawing tools and automatic render preset selections
CN112690825A (en) * 2019-10-22 2021-04-23 通用电气精准医疗有限责任公司 Method and system for providing a hand-drawn rendering start line drawing tool and automatic rendering preset selection

Also Published As

Publication number Publication date
US20090012401A1 (en) 2009-01-08
US8038619B2 (en) 2011-10-18

Similar Documents

Publication Publication Date Title
US20090012394A1 (en) User interface for ultrasound system
US9943288B2 (en) Method and system for ultrasound data processing
US20120108960A1 (en) Method and system for organizing stored ultrasound data
US9420996B2 (en) Methods and systems for display of shear-wave elastography and strain elastography images
US20170238907A1 (en) Methods and systems for generating an ultrasound image
US8469890B2 (en) System and method for compensating for motion when displaying ultrasound motion tracking information
US8172753B2 (en) Systems and methods for visualization of an ultrasound probe relative to an object
US7894663B2 (en) Method and system for multiple view volume rendering
US8414495B2 (en) Ultrasound patch probe with micro-motor
US20110255762A1 (en) Method and system for determining a region of interest in ultrasound data
US9848849B2 (en) System and method for touch screen control of an ultrasound system
US8480583B2 (en) Methods and apparatus for 4D data acquisition and analysis in an ultrasound protocol examination
JP5475516B2 (en) System and method for displaying ultrasonic motion tracking information
US20080161688A1 (en) Portable Ultrasonic Diagnostic Imaging System with Docking Station
US9332966B2 (en) Methods and systems for data communication in an ultrasound system
US20120116218A1 (en) Method and system for displaying ultrasound data
US20100249589A1 (en) System and method for functional ultrasound imaging
US20090153548A1 (en) Method and system for slice alignment in diagnostic imaging systems
US9390546B2 (en) Methods and systems for removing occlusions in 3D ultrasound images
US20090187102A1 (en) Method and apparatus for wide-screen medical imaging
WO2006111874A2 (en) Portable ultrasonic diagnostic imaging system with docking station
US8636662B2 (en) Method and system for displaying system parameter information
EP3451932B1 (en) Ultrasonic imaging system with simplified 3d imaging controls
US20110055148A1 (en) System and method for reducing ultrasound information storage requirements
US20170086789A1 (en) Methods and systems for providing a mean velocity

Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HOBELSBERGER, PETRA;DUDA, WALTER;REEL/FRAME:021035/0496

Effective date: 20080430

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION