WO2006130771A2 - Four-dimensional volume of interest - Google Patents

Four-dimensional volume of interest

Info

Publication number
WO2006130771A2
Authority
WO
WIPO (PCT)
Prior art keywords
voi
time
target
point
generating
Prior art date
Application number
PCT/US2006/021291
Other languages
French (fr)
Other versions
WO2006130771A3 (en)
Inventor
Hongwu Wang
John R. Dooley
Jay B. West
Original Assignee
Accuray Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Accuray Incorporated
Publication of WO2006130771A2
Publication of WO2006130771A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/103 Treatment planning systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/08 Volume rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61N ELECTROTHERAPY; MAGNETOTHERAPY; RADIATION THERAPY; ULTRASOUND THERAPY
    • A61N5/00 Radiation therapy
    • A61N5/10 X-ray therapy; Gamma-ray therapy; Particle-irradiation therapy
    • A61N5/103 Treatment planning systems
    • A61N5/1037 Treatment planning systems taking into account the movement of the target, e.g. 4D-image based planning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10076 4D tomography; Time-sequential 3D tomography
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30096 Tumor; Lesion

Abstract

A method and apparatus for a four-dimensional volume of interest is described.

Description

FOUR-DIMENSIONAL VOLUME OF INTEREST

TECHNICAL FIELD
[0001] This invention relates to the field of radiation treatment
planning and, in particular, to a volume of interest applied to treatment
planning.
BACKGROUND
[0002] A non-invasive method for pathological anatomy (e.g., tumor,
lesion, vascular malformation, nerve disorder, etc.) treatment is external beam
radiation therapy. In one type of external beam radiation therapy, an
external radiation source is used to direct a sequence of x-ray beams at a
pathological anatomy site from multiple angles, with the patient positioned so
the pathological anatomy is at the center of rotation (isocenter) of the beam.
As the angle of the radiation source is changed, every beam passes through
the pathological anatomy site, but passes through a different area of healthy
tissue on its way to the pathological anatomy. As a result, the cumulative
radiation dose at the pathological anatomy is high and the average radiation
dose to healthy tissue is low. The term radiotherapy refers to a procedure in
which radiation is applied to a target region for therapeutic, rather than
necrotic, purposes. The amount of radiation utilized in radiotherapy
treatment sessions is typically about an order of magnitude smaller, as compared to the amount used in a radiosurgery session. Radiotherapy is
typically characterized by a low dose per treatment (e.g., 100-200 centi-Gray
(cGy)), short treatment times (e.g., 10 to 30 minutes per treatment) and
hyperfractionation (e.g., 30 to 45 days of treatment). For convenience, the
term "radiation treatment" is used herein to mean radiosurgery and/or
radiotherapy unless otherwise noted by the magnitude of the radiation.
[0003] Traditionally, medical imaging was used to represent two-
dimensional views of the human anatomy. Modern anatomical imaging
modalities such as computed tomography (CT) are able to provide an
accurate three-dimensional model of a volume of interest (e.g., skull or
pathological anatomy bearing portion of the body) generated from a
collection of CT slices and, thereby, the volume requiring treatment can be
visualized in three dimensions. More particularly, in CT scanning numerous
x-ray beams are passed through a volume of interest in a body structure at
different angles. Then, sensors measure the amount of radiation absorbed by
different tissues. As a patient lies on a couch, an imaging system records x-
ray beams from multiple points. A computer program is used to measure the
differences in x-ray absorption to form cross-sectional images, or "slices" of
the head and brain. These slices are called tomograms, hence the name
"computed tomography." [0004] During treatment planning, a volume of interest (VOI) from
anatomical (e.g., CT) and/or functional imaging is used to delineate structures
to be targeted or avoided with respect to the administered radiation dose. A
volume of interest (VOI) may be defined as a set of planar, closed polygons, as
illustrated in Figure 1A. The coordinates of the polygon vertices are defined
as the x/y/z offsets in a given unit from the image origin. Once a VOI has
been defined, it may be represented as a bit wise mask overlaid on the
functional and/or anatomical image (so that each bit is zero or one according
to whether the corresponding image volume pixel (voxel) is contained within
the VOI represented by that bit), or a set of contours defining the boundary of
the VOI in each image slice. Conventional VOI imaging architectures may
utilize a three-tier representation structure: VOI / contour slice / contour. Figure
1B illustrates the three-tier VOI structure in a Unified Modeling Language
(UML) graph with a sample VOI.
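For illustration only, the three-tier representation (VOI, contour slice, contour) could be organized as in the following Python sketch; the class and field names are assumptions made for this example, not taken from any actual treatment planning system.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Contour:
    # One planar, closed polygon: (x, y) vertex offsets from the image
    # origin, in image units, lying within a single axial slice.
    vertices: List[Tuple[float, float]]

@dataclass
class ContourSlice:
    # All contours drawn on one image slice.
    z: float                                          # slice position
    contours: List[Contour] = field(default_factory=list)

@dataclass
class VOI:
    # Top tier: a named volume of interest built from contour slices.
    name: str
    slices: List[ContourSlice] = field(default_factory=list)
```

Rasterizing such a structure onto the image grid yields the bit-wise mask representation mentioned above, with one membership bit per voxel.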
[0005] One problem encountered in external beam radiation treatment
is that pathological anatomies (e.g., a tumor) may move during treatment,
which reduces the accuracy of target localization (i.e., accurate tracking of the
position of the target). Most notably, soft tissue targets tend to move with
patient breathing during radiation treatment delivery sessions. Respiratory
motion can move a pathological anatomy in the chest or abdomen, for
example, by more than 3 centimeters (cm). In the presence of such respiratory motion, for example, it is difficult to achieve the goal of precisely and
accurately delivering the radiation dose to the target, while avoiding
surrounding healthy tissue. In external beam radiation treatment, accurate
delivery of the radiation beams to the pathological anatomy being treated can
be critical, in order to achieve the radiation dose distribution that was
computed during the treatment planning stage.
[0006] Conventional methods for tracking anatomical motion utilize
external markers and/or internal fiducial markers. Such conventional
methods do not enable modeling of the anatomical change due to the
respiratory cycle using conventional VOI imaging architectures. Moreover,
such conventional methods do not take into account non-rigid motions and
deformations of surrounding anatomy, as a function of motion cycle.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The present invention is illustrated by way of example, and not
by way of limitation, in the figures of the accompanying drawings.
[0008] Figure 1A illustrates a volume of interest defined by a stack of
planar closed polygons.
[0009] Figure 1B illustrates a conventional three-tier VOI structure in a
Unified Modeling Language (UML) graph with a sample VOI.
[0010] Figure 2 illustrates one embodiment of the acquisition of pre-
treatment images (e.g., CT scans) of a changing target within a patient's
anatomy.
[0011] Figure 3 illustrates one embodiment of a 4D VOI architecture,
where the fourth dimension in the VOI architecture is a time dimension.
[0012] Figure 4 illustrates one embodiment of generating a VOI using a
4D VOI architecture.
[0013] Figure 5 illustrates a medical diagnostic imaging system
implementing one embodiment of the present invention.
[0014] Figure 6 illustrates one embodiment of a 4D mask volume.
[0015] Figure 7 illustrates a 2-dimensional perspective of radiation
beams of a radiation treatment system directed at a target region according to
a treatment plan. [0016] Figure 8 is a flow chart illustrating one embodiment of
generating a treatment plan using a 4D mask volume.
DETAILED DESCRIPTION
[0017] In the following description, numerous specific details are set
forth such as examples of specific systems, components, methods, etc. in order
to provide a thorough understanding of the present invention. It will be
apparent, however, to one skilled in the art that these specific details need not
be employed to practice the present invention. In other instances, well-known
components or methods have not been described in detail in order to avoid
unnecessarily obscuring the present invention.
[0018] Embodiments of the present invention include various steps,
which will be described below. The steps of the present invention may be
performed by hardware components or may be embodied in machine-
executable instructions, which may be used to cause a general-purpose or
special-purpose processor programmed with the instructions to perform the
steps. Alternatively, the steps may be performed by a combination of
hardware and software.
[0019] Embodiments of the present invention may be provided as a
computer program product, or software, that may include a machine-readable
medium having stored thereon instructions, which may be used to program a
computer system (or other electronic devices) to perform a process. A
machine-readable medium includes any mechanism for storing or
transmitting information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The machine-readable medium may
include, but is not limited to, magnetic storage medium (e.g., floppy diskette);
optical storage medium (e.g., CD-ROM); magneto-optical storage medium;
read-only memory (ROM); random-access memory (RAM); erasable
programmable memory (e.g., EPROM and EEPROM); flash memory;
electrical, optical, acoustical, or other form of propagated signal (e.g., carrier
waves, infrared signals, digital signals, etc.); or other type of medium suitable
for storing electronic instructions.
[0020] Embodiments of the present invention may also be practiced in
distributed computing environments where the machine-readable medium is
stored on and/or executed by more than one computer system. In addition,
the information transferred between computer systems may either be pulled
or pushed across the communication medium connecting the computer
systems, such as in a remote diagnosis or monitoring system. In remote
diagnosis or monitoring, a user may utilize embodiments of the present
invention to diagnose or monitor a patient despite the existence of a physical
separation between the user and the patient.
[0021] Some portions of the description that follow are presented in
terms of algorithms and symbolic representations of operations on data bits
that may be stored within a memory and operated on by a processor. These
algorithmic descriptions and representations are the means used by those skilled in the art to effectively convey their work. An algorithm is generally
conceived to be a self-consistent sequence of acts leading to a desired result.
The acts are those requiring manipulation of quantities. Usually, though not
necessarily, these quantities take the form of electrical or magnetic signals
capable of being stored, transferred, combined, compared, and otherwise
manipulated. It has proven convenient at times, principally for reasons of
common usage, to refer to these signals as bits, values, elements, symbols,
characters, terms, numbers, parameters, or the like.
[0022] It should also be noted that the methods and apparatus are
discussed herein in relation to CT imaging only for ease of explanation. The
method and apparatus discussed herein may also be used to generate VOIs
from other types of medical diagnostic images (anatomical and/or functional),
for example, magnetic resonance (MR), ultrasound (US), nuclear medicine
(NM) PET/SPECT, etc. In addition, the "targets" discussed herein may be an
anatomical feature(s) of a patient such as a pathological or normal anatomy
and may include one or more non-anatomical reference structures.
[0023] Figure 2 illustrates one embodiment of the acquisition of pre-
treatment images (e.g., CT scans) of a changing target within a patient's
anatomy. In the illustrated embodiment, the target 211 (e.g., anatomical
feature volume) may move or undergo a deformation (which may be a non-
rigid deformation) during a patient's respiration, heartbeats, or other motion. While in embodiments herein the target 211 is described as moving
periodically while undergoing a non-rigid deformation, any other type of
motion (e.g. aperiodic motion) and any type of deformation or motion of the
target may be accounted for. Furthermore, while embodiments may be
discussed herein in regard to CT scans, any other type of imaging modality
may be used, for example, magnetic resonance imaging (MRI), positron
emission tomography (PET), ultrasound, etc.
[0024] In one embodiment, CT scans (e.g., CT images 221, 222, 223) are
taken at different time points ti within, for example, a breathing cycle P 270 of
a patient. The time points ti correspond to different epochs in the patient
breathing cycle, with, for example, t0 < t1 < t2. The cycle P 270 may be
monitored by an external sensor, for example, a breathing sensor, markers
placed on the chest, etc. The CT images 221, 222, and 223 are taken at time
points t0, t1, and t2, respectively, containing the target 211a, 211b and 211c at
those respective time points. The epochs or time points within the breathing
cycle P 270 may be chosen to substantially encompass the overall dynamic
range of the periodic motion 275. For example, in one embodiment, the time
points may include: a time point t1 corresponding to a trough of the cycle P; a
time point t2 corresponding to a peak of the cycle P; and a third time point t0
disposed at an intermediate location between the peak and the trough of the
cycle P 270. In other embodiments, the time points selected for taking the CT images may include fewer or more than the three time points t0, t1, and t2
described above. Accordingly, fewer or more than three CT images may be
used.
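As a simple illustration of choosing such time points from a monitored breathing signal, the following sketch picks the trough, the peak, and a point between them; the signal arrays are hypothetical stand-ins for whatever the external sensor actually provides.

```python
import numpy as np

def pick_phase_points(times, signal):
    """Pick trough, peak, and an intermediate time point from one sampled
    breathing cycle, one simple way of spanning its dynamic range.

    times, signal: 1-D arrays of sample times and respiration amplitude
    (illustrative stand-ins for the external sensor's output).
    """
    t_trough = times[np.argmin(signal)]
    t_peak = times[np.argmax(signal)]
    t_mid = 0.5 * (t_trough + t_peak)   # an intermediate point (midpoint in time)
    return t_trough, t_mid, t_peak
```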
[0025] One way of providing a model for the continuous non-rigid
deformation of the anatomy as a function of the motion cycle involves
constructing a 4D mathematical model that morphs the CT image acquired at
one instant or time point in the motion cycle into another CT image acquired
at a subsequent instant or time point in the motion cycle. Any standard
software and/or algorithm that is known and may be commercially available
can be used to morph one image into another image, and to describe this in
terms of a mathematical model. The 4D mathematical model relates the 3D
locations of one or more reference structures (e.g., fiducials, skeletal structure,
etc.) with the 3D locations of the target, as a function of the instant in the
motion cycle. As such, if the deformation of the target is desired to be known
at an intermediate position between CT images, e.g., at t1.5, the 4D mathematical
model is applied to every voxel in one of the acquired CT images (e.g., CT
image 222) to map the deformation of the target in each voxel of the CT
image. However, such 4D mathematical modeling may require a lot of
computing power and may be slower than desirable due to the deformation
mapping of each voxel in the CT image. [0026] Described hereafter is a method and apparatus for tracking of a
changing (e.g., moving, deforming, etc.) target 211 using a four-dimensional
(4D) VOI, where the fourth dimension in the VOI architecture is a time
dimension. A 4D VOI may be represented as a set {V0, V1, . . . , Vk} where each
Vi is a 3D VOI representing a given point in time. In one embodiment, such a
4D VOI may be created by generating a 3D VOI (V0) in a 3D image I0. In the
above example, the first and second dimensions of the VOI correspond to one
image slice in the 3D VOI and the third dimension corresponds to other slices
in the 3D VOI. It should also be noted that the definitions of the first three
dimensions could be the three axes of any coordinate system defined within
the 3D image space.
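A minimal sketch of the 4D VOI set described above, assuming each member pairs a 3D VOI with the time point it represents; the names (TimedVOI, VOI4D) are illustrative, and the 3D VOI type is left generic rather than tied to any particular system.

```python
from dataclasses import dataclass
from typing import Any, List

@dataclass
class TimedVOI:
    t: float      # point in the motion cycle this member represents
    voi: Any      # the 3D VOI at that time (e.g., the VOI class sketched earlier)

@dataclass
class VOI4D:
    members: List[TimedVOI]   # kept sorted by t

    def bracketing(self, t: float):
        """Return the members immediately preceding and succeeding time t.
        Assumes t lies within the portion of the cycle covered by the set."""
        before = max((m for m in self.members if m.t <= t), key=lambda m: m.t)
        after = min((m for m in self.members if m.t >= t), key=lambda m: m.t)
        return before, after
```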
[0027] Then, given subsequent images {I0, I1, . . . , Ik} taken at different
times (e.g., to model anatomical change due to a patient's respiratory cycle),
the corresponding VOIs could be formed by performing a non-rigid
registration to determine the deformation mapping of image I0 to image Ii,
and applying the deformation mapping to V0 to give an initial estimate of Vi.
In one embodiment, one or more additional refinement steps may be applied
to Vi, for example, a model-based refinement if Vi represents an anatomical
organ.
[0028] A VOI representing any given point in time may be formed by
direct interpolation between Vi and Vi+1, where i and i+1 are the VOIs immediately preceding and succeeding a desired point in time. This method
may be much faster than having to interpolate the underlying deformation
field itself and may result in a more efficient way of modeling tissue (e.g.,
organ) deformation over time. This is because the underlying deformation
field maps the space of image Ii to the space of image Ii+1, hence if the
deformation field is used directly to map between Vi and Vi+1 the underlying
bit mask representation must be used (for example as described below in
relation to Figure 6). However, if interpolation is used between the contours
(e.g., of Figure 1A and Figure 1B) of Vi and Vi+1, it is possible to perform the
interpolation much faster. In other words, instead of having to look up values
of the deformation field for all the voxels within Vi and Vi+1, the points on the
contours of Vi and Vi+1 may be directly interpolated to create a set of bounding
contours for the intermediate VOI.
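The contour interpolation described above might look like the following sketch, under the simplifying assumption that corresponding contours of Vi and Vi+1 already have the same number of vertices in point-to-point correspondence (a real implementation would first have to establish that correspondence).

```python
def interpolate_contour(c0, c1, alpha):
    """Linearly interpolate between two corresponding bounding contours.

    c0, c1: lists of (x, y) vertices from Vi and Vi+1, assumed equal in length
            and in point-to-point correspondence (a simplification).
    alpha:  fraction of the way from Vi's time point to Vi+1's (0..1).
    """
    if len(c0) != len(c1):
        raise ValueError("contours must have matching vertex counts")
    return [((1.0 - alpha) * x0 + alpha * x1, (1.0 - alpha) * y0 + alpha * y1)
            for (x0, y0), (x1, y1) in zip(c0, c1)]
```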
[0029] In one embodiment, the 4D VOI architecture may be used with a
robotic based linear accelerator (LINAC) radiosurgery system (as discussed in
further detail below) to supplement or supplant the robot motion tracking
mechanisms that may already be present in such a system. Because the
coordinate system in which each member of the VOI set is represented is
arbitrary, a coordinate system may be chosen that is invariant with respect to
the robotic-based LINAC. That is, the compensation for target change that
would have otherwise been determined by the treatment delivery system for the robotic controlled LINAC during the treatment delivery is already
predetermined by the treatment planning system using the 4D VOI
architecture.
[0030] Figure 3 illustrates one embodiment of a 4D VOI architecture,
where the fourth dimension in the VOI architecture is a time dimension. In
this embodiment, the VOI is represented using a three-tier structure (VOI /
contour slice / contour) in a UML graph with an example target. UML is a
graphical language for visualizing, specifying, constructing and documenting
artifacts of a software-intensive system. The UML offers a standard way to
write programming language statements, database schemas, and software
components. UML is well known in the art; accordingly, a more detailed
discussion is not provided herein. In one embodiment, each contour slice in
VOI architecture 200 may be restricted so that it contains only a simple (i.e.
closed boundary with no holes or intersections) contour. If this restriction is
present, contour slices may be created in non-adjacent slices, and
interpolation used to create the contours in the intermediate slices. In another
embodiment, VOI architecture 200 allows multiple contours to be defined for
each contour slice. In this case, VOIs with cavities, branches, and
unconnected bodies may be drawn. Alternatively, the methods described
herein may also be used with other tiered (e.g., 4 tier) VOI architectures. [0031] In this embodiment, VOI architecture 200 includes a contour
tier 210, a contour slice (L) tier 220, and a VOI (Vi) tier 230. In this embodiment,
the 4D VOI architecture 200 may be represented as a set {V0, V1, . . . , Vk} where
each of the Vi is a 3D VOI representing a given point in time. Only three Vi
(V0 235a, V1 235b, and V2 235c) are illustrated for ease of discussion purposes.
Continuing the example of Figure 2, the CT slices 221 and 223 are taken at
time points t1 and t2, respectively, containing the target 211 in its state (e.g.,
position and deformation) at those respective time points 211b and 211c.
[0032] Figure 4 illustrates one embodiment of generating a VOI using a
4D VOI architecture. In this embodiment, the method may include receiving
at least two 3D images I1 222 and I2 223, step 405, and generating at least two
3D VOIs (e.g., V1 and V2) with the at least two corresponding 3D images I1 222
and I2 223, step 410. In step 420, a non-rigid registration of V1 235b and V2
235c is performed. Registration may be performed using techniques known to
those of ordinary skill in the art; accordingly, a detailed description of
registration is not provided. In one embodiment, for example, a registration
technique as described in "Hybrid point-and-intensity based deformable
registration for abdominal CT images", J. B. West, C. R. Maurer, Jr., and J. R.
Dooley, Proc. SPIE vol. 5747, pp. 204-211, 2005, may be used. The output of
the registration of step 420 may be referred to as a deformation field. A
deformation field relating two images, A and B, is a set of vectors defined such that in any position x in image A, the corresponding element of the
deformation field, D(x), is a vector v such that the position (x+v) in image B
describes the same anatomical location as x in image A.
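As a small illustration of the deformation-field convention just defined, the following hedged sketch maps a voxel position in image A to its anatomical counterpart in image B; the array layout is an assumption made for the example.

```python
import numpy as np

def corresponding_position(x, deformation_field):
    """Map a position in image A to the same anatomical location in image B.

    x: (i, j, k) voxel index in image A.
    deformation_field: float array of shape (X, Y, Z, 3); its element at x is
    the vector v such that x + v in image B describes the same anatomical
    location as x in image A, matching the definition in the text.
    """
    v = deformation_field[tuple(int(c) for c in x)]
    return np.asarray(x, dtype=float) + v
```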
[0033] Then, in order to determine V1.5 236 at a time in between t1 and t2,
an interpolation may be performed, using the registration results of step 420,
on V1 235b and V2 235c, step 430. More generally, a VOI representing any
given point in time may be formed by direct interpolation between Vi and Vi+1,
where i and i+1 are the 3D VOIs immediately preceding and succeeding a
desired point in time. For example, a three-dimensional space deformation
model, as described in "D. Bechmann and N. Dubreuil, Animation through
space and time based on a space deformation model, The Journal of
Visualization and Computer Animation, 4(3):165-184, 1993," may be used to
generate the interpolated VOI.
[0034] In one embodiment, one or more additional refinements, step
440, may be applied to V1.5 236, such as a model-based refinement. With a
model-based refinement, a model of V1.5 236 is used to ensure that the
contours describing V1.5 236 give a valid shape for the organ being described
by V1.5 236. In one embodiment of a model-based refinement, the principal
modes of variation of the boundary of V1.5 236 are stored as part of the model,
and the contours of V1.5 236 are refined so that their principal modes of
variation are within given limits of those of the model. Refinement of a VOI may be performed either manually by the user (e.g., through a graphical user
interface) or through the use of an algorithm that operates on the VOI.
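One way such a model-based refinement could be sketched is shown below, assuming the model stores a mean shape, orthonormal principal modes, and per-mode standard deviations; all names and the clamping threshold are illustrative assumptions, not the method recited in the application.

```python
import numpy as np

def clamp_to_shape_model(points, mean_shape, modes, sigmas, limit=3.0):
    """Keep a contour point set within the principal modes of a shape model.

    points, mean_shape: arrays of shape (P, 3).
    modes:  array of shape (M, 3*P) with orthonormal rows (principal modes).
    sigmas: array of shape (M,), standard deviation of each mode.
    limit:  allowed excursion per mode, in standard deviations.
    """
    x = points.reshape(-1) - mean_shape.reshape(-1)
    coeffs = modes @ x                                    # project onto the modes
    coeffs = np.clip(coeffs, -limit * sigmas, limit * sigmas)
    refined = mean_shape.reshape(-1) + modes.T @ coeffs   # rebuild the shape
    return refined.reshape(points.shape)
```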
[0035] Once V1.5 236 has been generated, it may be used to generate a
visualization of the VOI at time point t1.5, step 450. The generation of a
visualization from a VOI could be achieved by rendering the mask volume of
that VOI using volume rendering techniques as described in "Levoy, M., et
al., Volume Rendering in Radiation Treatment Planning, Proc. First
Conference on Visualization in Biomedical Computing, IEEE Computer
Society Press, Atlanta, Georgia, May 1990, pp. 4-10," or by directly rendering
the 3D geometrical structure of that VOI. Using 4D VOI architecture 200, the
visualization (e.g., images 222, 224 and 223) may be graphically displayed to a
user to animate changing structures (e.g., image 211) faster and more accurately
than may be possible when performing 4D mathematical modeling to map the
underlying deformation field itself.
[0036] Figure 5 illustrates one embodiment of a medical diagnostic
imaging system in which features of the present invention may be
implemented. The medical diagnostic imaging system may be discussed
below at times in relation to CT imaging modality only for ease of
explanation. However, other imaging modalities may be used as previously
mentioned. [0037] Medical diagnostic imaging system 700 includes an imaging
source 710 to generate a beam (e.g., kilovoltage x-rays, megavoltage x-rays,
ultrasound, MRI, etc.) and an imager 720 to detect and receive the beam
generated by imaging source 710. In an alternative embodiment, system 700
may include two diagnostic X-ray sources and/or two corresponding image
detectors. For example, two x-ray sources may be nominally mounted
angularly apart (e.g., 90 degrees apart or 45 degree orthogonal angles) and
aimed through the patient toward the imager(s). A single large imager, or
multiple imagers, can be used that would be illuminated by each x-ray
imaging source. Alternatively, other numbers and configurations of imaging
sources and imagers may be used.
[0038] The imaging source 710 and the imager 720 are coupled to a
digital processing system 730 to control the imaging operation. Digital
processing system 730 includes a bus or other means 735 for transferring data
among components of digital processing system 730. Digital processing
system 730 also includes a processing device 740. Processing device 740 may
represent one or more general-purpose processors (e.g., a microprocessor),
special purpose processor such as a digital signal processor (DSP) or other
type of device such as a controller or field programmable gate array (FPGA).
Processing device 740 may be configured to execute the instructions for
performing the operations and steps discussed herein. In particular, processing device 740 may be configured to execute instructions to perform
the Boolean operations on the contour sets 241-244 to define VOI 231 as
discussed above with respect to Figure 3 and to generate a VOI mask volume
as discussed above with respect to Figure 5.
[0039] Digital processing system 730 may also include system memory
750 that may include a random access memory (RAM), or other dynamic
storage device, coupled to bus 735 for storing information and instructions to
be executed by processing device 740. System memory 750 also may be used
for storing temporary variables or other intermediate information during
execution of instructions by processing device 740. System memory 750 may
also include a read only memory (ROM) and/or other static storage device
coupled to bus 735 for storing static information and instructions for
processing device 740.
[0040] A storage device 760 represents one or more storage devices
(e.g., a magnetic disk drive or optical disk drive) coupled to bus 735 for
storing information and instructions. Storage device 760 may be used for
storing instructions for performing the steps discussed herein.
[0041] Digital processing system 730 may also be coupled to a display
device 770, such as a cathode ray tube (CRT) or liquid crystal display (LCD),
for displaying information (e.g., image slice, animation of the target using the
4D VOI, etc.) to the user. An input device 780, such as a keyboard, may be coupled to digital processing system 730 for communicating information
and/or command selections to processing device 740. One or more other user
input devices, such as a mouse, a trackball, or cursor direction keys for
communicating direction information and command selections to processing
device 740 and for controlling cursor movement on display 770 may also be
used.
[0042] It will be appreciated that the digital processing system 730
represents only one example of a system, which may have many different
configurations and architectures, and which may be employed with the
present invention. For example, some systems often have multiple buses,
such as a peripheral bus, a dedicated cache bus, etc.
[0043] One or more of the components of digital processing system 730
may form a treatment planning system. The treatment planning system may
share its database (e.g., stored in storage device 760) with a treatment delivery
system, so that it is not necessary to export from the treatment planning
system prior to treatment delivery. The treatment planning system may also
include MIRIT (Medical Image Review and Import Tool) to support DICOM
import (so images can be fused and targets delineated on different systems
and then imported into the treatment planning system for planning and dose
calculations), expanded image fusion capabilities that allow the user to treatment plan and view isodose distributions on any one of various imaging
modalities (e.g., MRI, CT, PET, etc.).
[0044] In one embodiment, the treatment delivery system may be an
image guided robotic based linear accelerator (LINAC) radiation treatment
(e.g., for performing radiosurgery) system, such as the CyberKnife® system
developed by Accuray, Inc. of California. In such a system, the LINAC is
mounted on the end of a robotic arm having multiple (e.g., 5 or more) degrees
of freedom in order to position the LINAC to irradiate the pathological
anatomy with beams delivered from many angles in an operating volume
(e.g., sphere) around the patient. Treatment may involve beam paths with a
single isocenter, multiple isocenters, or with a non-isocentric approach (i.e.,
the beams need only intersect with the pathological target volume and do not
necessarily converge on a single point, or isocenter, within the target).
Treatment can be delivered in either a single session (mono-fraction) or in a
small number of sessions (hypo-fractionation) as determined during
treatment planning. Treatment may also be delivered without the use of a
rigid external frame for performing registration of pre-operative position of
the target during treatment planning to the intra-operative delivery of the
radiation beams to the target according to the treatment plan.
[0045] Alternatively, another type of treatment delivery system may
be used, for example, a gantry based (isocentric) intensity modulated radiotherapy (IMRT) system. In a gantry based system, a radiation source
(e.g., a LINAC) is mounted on the gantry in such a way that it rotates in a
plane corresponding to an axial slice of the patient. Radiation is then
delivered from several positions on the circular plane of rotation. In IMRT,
the shape of the radiation beam is defined by a multi-leaf collimator that
allows portions of the beam to be blocked, so that the remaining beam
incident on the patient has a pre-defined shape. The resulting system
generates arbitrarily shaped radiation beams that intersect each other at the
isocenter to deliver a dose distribution to the target. In IMRT planning, the
optimization algorithm selects subsets of the main beam and determines the
amount of time for which the subset of beams should be exposed, so that the
dose constraints are best met.
[0046] In other embodiments, yet other types of treatment delivery
systems may be used, for example, a stereotactic frame system such as the
GammaKnife®, available from Elekta of Sweden. With such a system, the
optimization algorithm (also referred to as a sphere packing algorithm) of the
treatment plan determines the selection and dose weighting assigned to a
group of beams forming isocenters in order to best meet provided dose
constraints.
[0047] The 4D VOI architecture described herein may be used to
perform inverse planning. Inverse planning, in contrast to forward planning, allows the medical physicist to independently specify the minimum tumor
dose and the maximum dose to other healthy tissues, and lets the treatment
planning software select the direction, distance, and total number and energy
of the beams. Conventional treatment planning software packages are
designed to import 3-D images from a diagnostic imaging source, for
example, magnetic resonance imaging (MRI), positron emission tomography
(PET) scans, angiograms and computerized x-ray tomography (CT) scans.
These anatomical imaging modalities such as CT are able to provide an
accurate three-dimensional model of a volume of interest (e.g., skull or other
tumor bearing portion of the body) generated from a collection of CT slices
and, thereby, the volume requiring treatment can be visualized in three
dimensions.
[0048] During inverse planning, the VOI 230 is used to delineate
structures to be targeted or avoided with respect to the administered
radiation dose. That is, the radiation source is positioned in a sequence
calculated to localize the radiation dose into VOI 230 that as closely as
possible conforms to the target (e.g., pathological anatomy such as a tumor)
requiring treatment, while avoiding exposure of nearby healthy tissue. Once
the target (e.g., tumor) VOI has been defined, and the critical and soft tissue
volumes have been specified, the responsible radiation oncologist or medical
physicist specifies the minimum radiation dose to the target VOI and the maximum dose to normal and critical healthy tissue. The software then
produces the inverse treatment plan, relying on the positional capabilities of
radiation treatment system, to meet the min/max dose constraints of the
treatment plan.
[0049] The 4D VOI architecture 200 may be used to create a 4D mask
volume, as discussed in further detail below. Hence, beams may be enabled
or disabled depending on the target's change over time. Although the change in
target during treatment delivery may be different than the change in the
target during treatment planning (e.g., due to differences in a patient's
respiration at those different times), certain gross changes may be assumed to
be similar. Accordingly, in one embodiment, the 4D VOI architecture 200 may
be used to supplement (or possibly supplant) the robot motion tracking
mechanisms that may otherwise be present in a robotic-based LINAC
radiation treatment system, with finer changes handled by dynamic tracking
capabilities of the treatment delivery system. Dynamic tracking is known in
the art; accordingly a detailed description is not provided. Alternatively, the
4D VOI architecture 200 may be used to supplant the robot motion tracking
mechanisms that may otherwise be present in a robotic-based LINAC
radiation treatment system.
[0050] Because the coordinate system in which each member of the set
is represented is arbitrary, a coordinate system may be chosen that is invariant with respect to the robot. That is, the compensation for target motion is already taken into
account by the treatment planning software using the 4D VOI architecture
200.
[0051] Figure 6 illustrates one embodiment of a 4D mask volume. For
ease of discussion, 4D mask volume 600 is shown with two overlaid masks
635b and 636 on V1 235b and V1.5 236, respectively, for times t1 and t1.5,
respectively. The VOI mask volumes 635b and 636 are volume
representations of all user defined VOIs that are geometrically considered as a
cuboid composed of many small cuboids of the same size (i.e., the voxels). In
this embodiment, every voxel (e.g., voxels 650, 665, etc.) contains 32 bits.
Alternatively, a different number of bits may be used for a voxel. One bit, or
more, of a voxel (e.g., the ith bit) may be used to represent if the voxel is
covered by a VOI that is defined by the index of the bit. At every voxel
location (e.g., voxel 665), the bit value will be either a "1" or a "0" indicating
whether a particular voxel is part of the target. For example, a "1" bit value
may be used to indicate a voxel is contained within the VOI represented by
that mask position (as conceptually illustrated by the "1" for ith bit of voxel
665 of mask 635b). If, for example, the voxel bit is a "0" (as conceptually
illustrated by the "0" for the ith bit of voxel 667 of mask 636), the treatment
planning algorithm ignores the dose constraints for that corresponding voxel.
The VOI mask volume serves as an interface between the VOI structures and the rest of an imaging system's functions such as, for example, 4-D VOI
visualization and dose calculation in treatment planning. Using the 4D mask
volume 600, the radiation beams of a treatment delivery system may be
enabled or disabled based on a target's change in position.
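A rough sketch of such a bit-wise mask volume follows, with one 32-bit word per voxel and one membership bit per VOI; the function names and array shapes are assumptions for illustration. A 4D mask volume would then simply be one such 3D array per time point.

```python
import numpy as np

def new_mask_volume(shape):
    """Allocate a mask volume: one 32-bit word per voxel, one bit per VOI."""
    return np.zeros(shape, dtype=np.uint32)

def set_voi_bit(mask, voxel_index_arrays, voi_bit):
    """Mark the given voxels as covered by the VOI identified by bit `voi_bit`."""
    mask[voxel_index_arrays] |= np.uint32(1 << voi_bit)

def voxel_in_voi(mask, voxel, voi_bit):
    """True if the voxel's bit for this VOI is "1" (voxel is part of that VOI)."""
    return bool(mask[voxel] & np.uint32(1 << voi_bit))
```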
[0052] The dose calculation process in the treatment planning
algorithm considers a set of beams that are directed at the target region 211.
In one embodiment, the treatment planning algorithm is used with a radiation
source that has a collimator that defines the width of the set of beams that is
produced. For each target 211, for example, the number of beams, their sizes
(e.g., as established by the collimator), their positions and orientations are
determined. Having defined the position, orientation, and size of the beams
to be used for planning, how much radiation should be delivered via each
beam is also determined. The total amount of radiation exiting the collimator
for one beam is defined in terms of Monitor Units (MU). Because the intensity
of the radiation source is constant, the MU is linearly related to the amount of
time for which the beam is enabled. The radiation dose absorbed (in units of
cGy) by tissue in the path of the beam is also linearly related to the MU. The
absorbed dose related to a beam is also affected by the collimator size of the
beam, the amount of material between the collimator and the calculation
point, the distance of the collimator from the calculation point, and the
distance of the calculation point from the central axis of the beam. [0053] Figure 7 illustrates a 2-dimensional perspective of radiation
beams of a radiation treatment system directed at a target region according to
a treatment plan. It should be noted that 3 beams are illustrated in Figure 7
only for ease of discussion and that an actual treatment plan may include
more, or fewer, than 3 beams. Furthermore, although the 3 beams appear to
intersect in the 2-dimensional perspective of Figure 7, the beams may not
intersect in their actual 3-dimensional space. The radiation beams need only
intersect with the target volume and do not necessarily converge on a single
point, or isocenter, within the target. In one embodiment, using 4D mask
volume 600 beams may be enabled or disabled based on the change in the
target over time.
[0054] Figure 8 is a flow chart illustrating one embodiment of
generating a treatment plan using a 4D mask volume. In one embodiment,
the treatment planning algorithm receives as input from a user, step 610: (1) the
delineated target region 220 and any critical region 210 on one or more slices
of a CT image; and (2) dose constraints defining the minimum and maximum
doses for target region 220 and the maximum dose for the critical region 210.
It should be noted that additional dose constraints for additional regions may
also be provided. The delineation of the regions and the dose constraints
may be performed in any order. [0055] Then, the treatment planning algorithm performs beam
weighting of each one or more beams of the radiation treatment system to be
used in the treatment plan according to the inputs provided by the user
above. The user or the treatment planning algorithm assigns an arbitrary
weighting to each of one or more beams (e.g., beam 1, beam 2, beam 3 of
Figure 7) of the radiation treatment system. This weighting may be
determined using an algorithm designed to give a suitable "start point" for
planning, may be randomly chosen, or may simply be a constant weighting
for each beam.
[0056] In step 630, the 4D VOI is generated and, in step 640, the 4D
mask volume is generated with the methods discussed above in relation to
Figures 3 and 4. If a voxel bit from a 4D volume mask 600 is a "0", the
planning algorithm ignores the dose constraints for that corresponding dose
voxel for the particular point in time VOI. However, if a voxel bit from dose
contour mask 400 has a "1" bit value for a particular point in time VOI, then it is
determined whether any penalties should be assessed when performing beam
weighting based on the dose constraints for that dose voxel, step 650. In one
embodiment, in order to reduce dose to a given sensitive organ to minimal
levels, the 4D volume mask may be used so that if the 3D VOI at any of the
time points has a "1" bit value at any position intersected by a beam, that
beam is automatically set to have zero MU in the final plan.
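The zero-MU rule just described could be sketched as follows; the beam-to-voxel lookup and all names here are hypothetical stand-ins, since the source does not specify how beam traversal through the volume is represented.

```python
import numpy as np

def zero_out_blocked_beams(weights, beam_voxels, masks_over_time, organ_bit):
    """Force zero MU for any beam that crosses the sensitive organ at any time point.

    weights:         array of beam weights (MU), modified in place.
    beam_voxels[i]:  index arrays for the voxels traversed by beam i (assumed given).
    masks_over_time: list of 3D uint32 mask volumes, one per time point (the 4D mask).
    organ_bit:       bit index identifying the sensitive-organ VOI in the mask words.
    """
    bit = np.uint32(1 << organ_bit)
    for i, voxels in enumerate(beam_voxels):
        if any((mask[voxels] & bit).any() for mask in masks_over_time):
            weights[i] = 0.0
    return weights
```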
[0057] In one embodiment, the following algorithm may be used to perform beam weighting. In this embodiment, to begin the beam weighting,
step 660, an assumption may be made that the size and trajectory of the beam
set has been defined. Let the beam set be {Bi; 1 ≤ i ≤ N}, where N ≈ 500. Beam
1, beam 2, and beam 3 illustrated in Figure 7 have a respective weight 1, weight
2, weight 3 (i.e., a number of MU assigned to the beam, or how long a beam
will be maintained on) associated with it. The weight in MU of each beam is
designated by wi. The delineated regions are represented as objects Tj
(derived from the 4D mask volume 600), with corresponding minimum and
maximum allowed dose minj and maxj, and critical structures (critical region
210) Cj, with corresponding maxj defined. Each region has an integer priority
pj ∈ [0, 100] defining the relative importance of the dose constraints applied to
that region. For each beam, a 4D dose value mask is created. The 4D dose
value mask may be regarded as a set of 3D dose value masks, in the same way
that a 4D VOI is a set of 3D VOIs. Each 3D dose value mask provides a linked
list of floating point values and positions di(r,t) at a given point in time, where
r is the position within the dose calculation volume, and di is the dose in cGy
delivered to r by beam i when wi is set to unity, and t is the time, represented
as a position in the respiratory cycle. Thus, the total dose at r, at position t in
the respiratory cycle is given by:

D(r,t) = ∑i wi di(r,t).

Hence the total dose at r, summed over the entire respiratory cycle is,

D(r) = ∑t D(r,t)    (1),

and the total dose for beam i, summed over the entire respiratory cycle is,

di(r) = ∑t di(r,t).

[0058] For each Bi, we define a beam value υi, where

υi = ( ∑j ∑r∈Tj di(r) ) / ( ∑r di(r) )    (2)
[0059] The beam value is the ratio of dose delivered into target region
220 to total dose delivered. To define the initial set of wi for optimization, we
set wi = υi, ∀i. The maximum dose within the dose calculation volume, Dmax,
is computed and the beam weights renormalized so that the new maximum
dose is equal to the largest of the maximum dose constraints, maxj. Hence,
this provides:
wi = υi · sup(maxj) / Dmax.    (3)
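Equations (2) and (3) could be computed, for illustration, roughly as follows, assuming the unit-weight dose of each beam has already been summed over the respiratory cycle into a beams-by-voxels array; the array and function names are assumptions, not part of the described system.

```python
import numpy as np

def beam_values(dose_per_beam, target_mask):
    """Beam value per equation (2): fraction of each beam's dose that lands
    in the target regions.

    dose_per_beam: array (N_beams, N_voxels), dose di(r) at unit weight.
    target_mask:   boolean array (N_voxels,), True inside any target region Tj.
    """
    in_target = dose_per_beam[:, target_mask].sum(axis=1)
    total = dose_per_beam.sum(axis=1)
    return in_target / np.maximum(total, 1e-12)

def initial_weights(values, dose_per_beam, largest_max_constraint):
    """Set wi = vi, then renormalize so the maximum total dose equals the
    largest maximum-dose constraint, per equation (3)."""
    w = values.copy()
    d_max = (w[:, None] * dose_per_beam).sum(axis=0).max()
    return w * largest_max_constraint / d_max
```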
[0060] At one iteration of the treatment planning algorithm, the
optimization process looks at all of the dose values in the dose volume and
determines if the target region 211 and a critical region 610 are within the dose constraints. For example, suppose the dose in the target region 211 is
specified to be equal to or greater than 2000 cGy and less than or equal to 2500
cGy. Suppose, the current dose value at grid location for voxel 665 of Figure 6
is 1800 cGy, then the optimization process determines that, at the current
beam weightings, the dose value at voxel 665 is 200 cGy short in order to
satisfy the treatment plan constraints.
[0061] Given the initial weights, the optimization process then alters
the beam weights so that the treatment solution is closer to meeting the
provided dose constraints. First, a set of Δwi, the amount by which each
beam weight may be changed, is defined:
[the equation defining the per-beam step sizes Δwi in terms of the search resolution s is not legible in the source]
[0062] where s is the search resolution, having an initial value of 1.
[0063] The optimization process iterates through one or more of the
beams and for each of the beams, if a beam weight is increased or decreased
by a certain amount, determines the resulting dose distribution from such a
change (i.e., how such a change alters the amount of violation of the treatment
plan constraints). For example, an increase in one or more of the beam
weights may typically help in achieving the constraint in the target (e.g.,
tumor) region but, depending on the location of the beam, it may also hurt in the critical region due to a possible resulting increase of dose above the
maximum value in the critical region.
[0064] The optimization process traverses the volume of interest, adds
up all the penalties that are incurred by the increase in a beam weight, adds
up all the penalties that are incurred by the decreasing the beam weight (e.g.,
under-dosing the target region), and then provides a result. In one
embodiment, a multiplier may be used with each penalty to stress the
importance of one constraint (e.g., minimum dose value in the target region)
versus another constraint (e.g., maximum dose value in the target region).
For example, it may be more important to achieve a minimum dose value than to
stay under the maximum dose value in the target region.
[0065] The optimization process then updates the dose and goes on to
the next beam and repeats the process until it has made its way through the
beam set. The optimization process then reaches a stage where it has looked
at all of the different weights for each of the beams at the different dose levels
and selects the beam weight that provides the optimal resulting dose values in
both the target region and critical region.
[0066] In one embodiment, an iterative optimization process is used as
follows: Iterate over the beams in decreasing order of υi. For each beam Bj, calculate Pj+ and Pj-, the relative penalties for respectively increasing or
decreasing wj, that are defined as:

[the equations defining Pj+ and Pj-, involving terms of the form Δwj dj(r), are not legible in the source]

[0067] where Vi is the volume in mm³ of region i. Hence, the penalty
for this beam is the sum of the additional amount of over-dosing and under-
dosing that would be created by the change in the beam, weighted by the
priorities of the different regions and normalized according to the region
volumes. If Pj+ and Pj- are both positive, wj is kept the same; otherwise
change wj = wj ± Δwj according to whichever of Pj+ and Pj- is smaller. If the
previous iteration moved wj in the same direction as this iteration, the
following is set:

Δwj = Δwj + Δwj(0),    (5)

else set:

Δwj = Δwj(0).    (6)

[0068] The change in dose according to Δwj is computed and applied
to the dose volume before the optimization process moves on to a next beam,
because a correct decision on how to change the beam weight assumes an up-
to-date view of the dose including changes in previous wi. If all wj remained
unchanged by the current iteration, s is reduced by a factor of 2.
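A hedged sketch of one pass of this iterative weight adjustment is shown below; the penalty callables stand in for the Pj+ and Pj- sums described above, and the exact update order and step handling are assumptions where the source is ambiguous.

```python
def weight_adjustment_pass(w, dw, dw0, penalty_up, penalty_down, order, last_dir):
    """One greedy weight-adjustment pass over the beam set.

    penalty_up(j, step) / penalty_down(j, step): hypothetical callables returning
        the relative penalties for raising or lowering beam j's weight by `step`
        (the text defines them as priority-weighted, volume-normalized
        over/under-dose sums).
    w, dw, dw0: current weights, current steps, and initial steps per beam.
    order:      beam indices in decreasing order of beam value.
    last_dir:   previous move direction per beam (+1, -1, or 0).
    Returns True if any weight changed; the caller halves the search
    resolution s (and thereby dw0) once a pass leaves every weight unchanged.
    """
    changed = False
    for j in order:
        p_up, p_down = penalty_up(j, dw[j]), penalty_down(j, dw[j])
        if p_up > 0 and p_down > 0:
            last_dir[j] = 0              # both moves make things worse: keep wj
            continue
        direction = 1 if p_up <= p_down else -1
        # Accelerate when moving the same way as the previous pass, per (5)/(6).
        dw[j] = dw[j] + dw0[j] if direction == last_dir[j] else dw0[j]
        w[j] = max(0.0, w[j] + direction * dw[j])
        last_dir[j] = direction
        changed = True
    return changed
```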
[0069] In an alternative embodiment, the optimization algorithm may
perform convex optimization via, for example, the Simplex algorithm, in an
attempt to find an MU setting for all beams so that the dose constraints are
nowhere violated. The Simplex algorithm is known in the art; accordingly, a
detailed description is not provided. Alternatively, other iterative and non-
iterative optimization algorithms may be used.
[0070] The 4D VOI architecture 200 may also be used with a mixed
planning in which part of the treatment dose is generated by an isocenter
placed using forward planning and part generated by individual beams
during inverse planning.
[0071] It should be noted that although discussed at times herein in
regards to radiation treatment, the methods and apparatus described herein
are not limited for use solely in treatment planning but may also be used
independently for other applications, such as simulation and animation of
object changes (e.g., deformation) over time. In alternative embodiments, the methods and apparatus herein may be used outside of the medical technology
field, such as non-destructive testing of materials (e.g., motor blocks in the
automotive industry and drill cores in the petroleum industry) and seismic
surveying.
[0072] In the foregoing specification, the invention has been described
with reference to specific exemplary embodiments thereof. It will, however,
be evident that various modifications and changes may be made thereto
without departing from the broader spirit and scope of the invention as set
forth in the appended claims. The specification and drawings are,
accordingly, to be regarded in an illustrative sense rather than a restrictive
sense.

Claims

CLAIMS

What is claimed is:
1. A volume of interest (VOI) architecture comprising a fourth dimension, where the fourth dimension is a time dimension.
2. The VOI architecture of claim 1, further comprising: first, second and third dimensions comprising three space dimensions within an image space.
3. The VOI architecture of claim 1, further comprising a plurality of VOIs, wherein each of the plurality of VOIs is a three-dimensional VOI representing a given point in time.
4. The VOI architecture of claim 3, wherein the three-dimensional VOI comprises three space dimensions within an image space.
5. A method, comprising: generating a first VOI including a target at a first point in time; generating a second VOI including the target at a second point in time; registering the first VOI with the second VOI; and interpolating a third VOI including the target at a third point in time based on the registering of the first VOI with the second VOI, the third point in time being between the first and second points in time.
6. The method of claim 5, further comprising generating a visualization of the third VOI.
7. The method of claim 6, wherein generating the visualization comprises rendering a mask volume of the third VOI.
8. The method of claim 7, wherein rendering comprises volume rendering.
9. The method of claim 7, wherein rendering comprises direct rendering of a three-dimensional geometrical structure of the third VOI.
10. The method of claim 5, further comprising: receiving a first image having the target at the first point in time; and receiving a second image having the target at the second point in time.
11. The method of claim 6, further comprising refining the third VOI.
12. The method of claim 11, wherein refining comprises a model-based refinement.
13. The method of claim 12, wherein the third VOI is changed with respect to at least one of the first and second VOIs.
14. The method of claim 5, wherein the target at the first time point is changed with respect to the target at the second time point based on motion adjacent the target.
15. The method of claim 14, wherein the motion comprises a periodic motion characterized by a cycle.
16. The method of claim 6, wherein the first and second images are acquired in an anatomical imaging modality.
17. The method of claim 16, wherein the anatomical imaging modality is computed tomography.
18. An article of manufacture, comprising a machine-accessible medium including data that, when accessed by a machine, cause the machine to perform operations comprising: generating a first VOI including a target at a first point in time; generating a second VOI including the target at a second point in time; registering the first VOI with the second VOI; and interpolating a third VOI including the target at a third point in time based on the registering, the third point in time being between the first and second points in time.
19. The article of manufacture of claim 18, wherein the instructions further cause the processor to perform the following comprising generating a visualization of the third VOI.
20. The article of manufacture of claim 19, wherein the instructions further cause the processor to perform the following comprising refining the third VOI.
21. The article of manufacture of claim 19, wherein generating the visualization comprises rendering a mask volume of the third VOI.
22. An apparatus, comprising: a storage device to store first and second images; and a processor coupled to the storage device to receive the first and second images, the processor to generate a first VOI including a target at a first point in time and a second VOI including the target at a second point in time, the processor further to register the first VOI with the second VOI and interpolate a third VOI including the target at a third point in time based on the registering, the third point in time being between the first and second points in time.
23. The apparatus of claim 22, wherein the processor is further configured to generate a visualization of the third VOI.
24. The apparatus of claim 23, wherein the processor is further configured to refine the third VOI.
25. The apparatus of claim 23, wherein the processor is further configured to render a mask volume of the third VOI.
26. The apparatus of claim 23, further comprising an imager coupled to the storage device, the imager to generate the first and second images.
27. A method of tracking a changing target within an anatomical region to deliver radiation to the target during motion of the anatomical region, the method, comprising: determining, at a time of treatment planning, a plurality of locations of the target over time using a four-dimensional volume of interest (VOI) having a fourth dimension being a time dimension; and determining, at a time of treatment planning using the four- dimensional VOI, one or more radiation beam trajectories to be delivered to the target within the moving anatomical region at a time of treatment delivery.
28. The method of claim 27, wherein determining comprises performing at least one of enabling and disabling of the one or more radiation beams depending whether the one or more radiation beams intersect the target at the plurality of locations of the target determined using the four-dimensional VOI.
29. The method of claim 27, further comprising generating a four- dimensional mask volume using the four-dimensional VOI, and wherein determining is performed using the four-dimensional mask volume.
30. The method of claim 29, wherein determining comprises: setting a bit in the four-dimensional mask volume to have a value indicating that a corresponding voxel is part of the target at a point in time; and setting a beam, which is to intersect the voxel at the point in time, to have a zero weight.
31. The method of claim 28, further comprising generating the four- dimensional VOI, wherein generating comprises: generating a first VOI including the target at a first point in time; generating a second VOI including the target at a second point in time; registering the first VOI with the second VOI; and interpolating a third VOI including the target at a third point in time based on the registering of the first VOI with the second VOI, the third point in time being between the first and second points in time.
32. The method of claim 31, wherein the target at the first time point is changed with respect to the target at the second time point based on the motion of the anatomical region.
33. The method of claim 32, wherein the motion comprises a periodic motion characterized by a cycle.
34. An apparatus, comprising: means for directly interpolating between a first contour of a first volume of interest (VOI) and a second contour of a second VOI using one or more points on the first and second contours; and means for generating an intermediate contour of an intermediate VOI based on the interpolating.
35. The apparatus of claim 34, further comprising means for generating a visualization of the intermediate VOI.
36. The apparatus of claim 34, further comprising means for refining the intermediate VOI.
37. The apparatus of claim 34, wherein the means for directly interpolating comprises a four-dimensional VOI having a fourth dimension being a time dimension, and wherein the apparatus further comprises: means for determining, at a time of treatment planning, a plurality of locations of a target over time using the four-dimensional volume of interest (VOI); and means for determining, at a time of treatment planning using the four-dimensional VOI, one or more radiation beam trajectories to be delivered to the target within the moving anatomical region at a time of treatment delivery.
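One way to read the "directly interpolating between ... contours using one or more points on the ... contours" of claims 34-37 is the planar Python sketch below. Two simplifications are introduced here and are not taken from the claims: both contours are resampled to the same number of points, and their starting points are assumed to correspond (a real system would also align the contours before blending).

import numpy as np

def resample_contour(contour, n):
    """Resample a closed 2-D contour, given as an (M, 2) array of points, to
    n points evenly spaced along its arc length, so that two contours can be
    paired point-for-point."""
    contour = np.asarray(contour, dtype=float)
    closed = np.vstack([contour, contour[:1]])             # close the loop
    seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)  # segment lengths
    s = np.concatenate([[0.0], np.cumsum(seg)])            # arc length at vertices
    targets = np.linspace(0.0, s[-1], n, endpoint=False)
    out = np.empty((n, 2))
    for i, t in enumerate(targets):
        j = np.searchsorted(s, t, side="right") - 1
        frac = (t - s[j]) / max(seg[j], 1e-12)
        out[i] = closed[j] + frac * (closed[j + 1] - closed[j])
    return out

def intermediate_contour(contour_first, contour_second, alpha, n=100):
    """Directly interpolate an intermediate contour from points on the first
    and second contours; alpha in [0, 1] is the fraction of the way from the
    first VOI's point in time to the second's."""
    a = resample_contour(contour_first, n)
    b = resample_contour(contour_second, n)
    return (1.0 - alpha) * a + alpha * b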
PCT/US2006/021291 2005-06-02 2006-06-01 Four-dimensional volume of interest WO2006130771A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/144,247 US7352370B2 (en) 2005-06-02 2005-06-02 Four-dimensional volume of interest
US11/144,247 2005-06-02

Publications (2)

Publication Number Publication Date
WO2006130771A2 true WO2006130771A2 (en) 2006-12-07
WO2006130771A3 WO2006130771A3 (en) 2008-01-17

Family ID=37482320

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/021291 WO2006130771A2 (en) 2005-06-02 2006-06-01 Four-dimensional volume of interest

Country Status (2)

Country Link
US (1) US7352370B2 (en)
WO (1) WO2006130771A2 (en)

Families Citing this family (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8989349B2 (en) * 2004-09-30 2015-03-24 Accuray, Inc. Dynamic tracking of moving targets
EP1909904B1 (en) * 2005-07-25 2013-09-04 Karl Otto Methods and apparatus for the planning of radiation treatments
US7880154B2 (en) 2005-07-25 2011-02-01 Karl Otto Methods and apparatus for the planning and delivery of radiation treatments
US8073102B2 (en) * 2005-10-17 2011-12-06 Alberta Health Services Real-time dose reconstruction using dynamic simulation and image guided adaptive radiotherapy
CA2626538C (en) 2005-10-17 2018-01-23 Alberta Cancer Board Integrated external beam radiotherapy and mri system
US7660481B2 (en) * 2005-11-17 2010-02-09 Vital Images, Inc. Image enhancement using anisotropic noise filtering
US20080021300A1 (en) * 2006-06-29 2008-01-24 Allison John W Four-dimensional target modeling and radiation treatment
US7496174B2 (en) 2006-10-16 2009-02-24 Oraya Therapeutics, Inc. Portable orthovoltage radiotherapy
US7620147B2 (en) 2006-12-13 2009-11-17 Oraya Therapeutics, Inc. Orthovoltage radiotherapy
USRE46953E1 (en) 2007-04-20 2018-07-17 University Of Maryland, Baltimore Single-arc dose painting for precision radiation therapy
US8433159B1 (en) * 2007-05-16 2013-04-30 Varian Medical Systems International Ag Compressed target movement model using interpolation
US8920406B2 (en) 2008-01-11 2014-12-30 Oraya Therapeutics, Inc. Device and assembly for positioning and stabilizing an eye
US8363783B2 (en) 2007-06-04 2013-01-29 Oraya Therapeutics, Inc. Method and device for ocular alignment and coupling of ocular structures
US8184886B2 (en) * 2007-08-21 2012-05-22 Siemens Aktiengesellschaft Deformable 2D-3D registration
US7551717B2 (en) * 2007-08-21 2009-06-23 Wisconsin Alumni Research Foundation Virtual 4D treatment suite
US7801271B2 (en) 2007-12-23 2010-09-21 Oraya Therapeutics, Inc. Methods and devices for orthovoltage ocular radiotherapy and treatment planning
EP3272395B1 (en) 2007-12-23 2019-07-17 Carl Zeiss Meditec, Inc. Devices for detecting, controlling, and predicting radiation delivery
US7720196B2 (en) * 2008-01-07 2010-05-18 Accuray Incorporated Target tracking using surface scanner and four-dimensional diagnostic imaging data
WO2009106784A1 (en) * 2008-02-25 2009-09-03 Inventive Medical Limited Medical training method and apparatus
US8737721B2 (en) * 2008-05-07 2014-05-27 Microsoft Corporation Procedural authoring
DE102008044901A1 (en) * 2008-08-29 2010-03-04 Siemens Aktiengesellschaft Method and device for selecting an irradiation plan and irradiation facility
WO2010120534A1 (en) 2009-03-31 2010-10-21 Whitten Matthew R System and method for radiation therapy treatment planning using a memetic optimization algorithm
US20110050692A1 (en) * 2009-09-01 2011-03-03 Accuray Incorporated Interpolating and rendering sub-phases of a 4d dataset
WO2011160235A1 (en) 2010-06-22 2011-12-29 Karl Otto System and method for estimating and manipulating estimated radiation dose
US10311585B2 (en) 2010-06-23 2019-06-04 Varian Medical Systems International Ag Mechanism for advanced structure generation and editing
WO2012080949A1 (en) 2010-12-15 2012-06-21 Koninklijke Philips Electronics N.V. Contour guided deformable image registration
US8948842B2 (en) 2011-01-21 2015-02-03 Headwater Partners Ii Llc Radiation treatment with multiple imaging elements
US9283404B2 (en) * 2011-01-21 2016-03-15 Headwater Partners Ii Llc Imaging observation timing for assisting radiation treatment
US8900113B2 (en) 2011-01-21 2014-12-02 Headwater Partners Ii Llc Tracking of tumor location for targeted radiation treatment
US9364687B2 (en) 2011-01-21 2016-06-14 Headwater Partners Ii Llc Imaging observation timing based on radiation treatment system element delay
US10152951B2 (en) 2011-02-28 2018-12-11 Varian Medical Systems International Ag Method and system for interactive control of window/level parameters of multi-image displays
DE102011075917B4 (en) * 2011-05-16 2021-10-28 Siemens Healthcare Gmbh Method for providing a 3D image data set with suppressed measuring field exceedance artifacts and computer tomograph
CN103782320B (en) * 2011-08-30 2017-03-15 皇家飞利浦有限公司 In deformable image registration workflow, the correction of user input and deformation vector field is integrated
EP2979248A1 (en) * 2013-03-28 2016-02-03 Koninklijke Philips N.V. Interactive follow-up visualization
WO2015023787A1 (en) * 2013-08-13 2015-02-19 Coffey Dane Computer visualization of anatomical items
US10551464B2 (en) 2013-10-31 2020-02-04 The Board Of Trustees Of The University Of Illinois Three dimensional multislab, multi-shot magnetic resonance elastography
US9974977B2 (en) * 2014-10-27 2018-05-22 Elekta, Inc. Image guidance for radiation therapy
WO2017200527A1 (en) * 2016-05-16 2017-11-23 Hewlett-Packard Development Company, L.P. Generating a shape profile for a 3d object
US10527700B2 (en) 2016-10-19 2020-01-07 The Board Of Trustees Of The University Of Illinois Multiband, multishot magnetic resonance elastography
US10183179B1 (en) 2017-07-21 2019-01-22 Varian Medical Systems, Inc. Triggered treatment systems and methods
US10843011B2 (en) 2017-07-21 2020-11-24 Varian Medical Systems, Inc. Particle beam gun control systems and methods
US10092774B1 (en) 2017-07-21 2018-10-09 Varian Medical Systems International, AG Dose aspects of radiation therapy planning and treatment
US10245448B2 (en) 2017-07-21 2019-04-02 Varian Medical Systems Particle Therapy Gmbh Particle beam monitoring systems and methods
US10549117B2 (en) * 2017-07-21 2020-02-04 Varian Medical Systems, Inc Geometric aspects of radiation therapy planning and treatment
US10609806B2 (en) 2017-07-21 2020-03-31 Varian Medical Systems Particle Therapy Gmbh Energy modulation of a cyclotron beam
US20230097277A1 * 2021-09-29 2023-03-30 Siemens Healthineers International AG On-line adaptive deep inspiration breath-hold treatment

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0153439B1 (en) * 1983-06-03 1993-08-04 Fondazione Pro Juventute Don Carlo Gnocchi Modularly expansible system for real time processing of a TV display, useful in particular for the acquisition of coordinates of known shape objects and method using said system in radiography.
US4788975B1 (en) * 1987-11-05 1999-03-02 Trimedyne Inc Control system and method for improved laser angioplasty
US5396418A (en) * 1988-10-20 1995-03-07 Picker International, Inc. Four dimensional spiral volume imaging using fast retrace
US5384861A (en) * 1991-06-24 1995-01-24 Picker International, Inc. Multi-parameter image display with real time interpolation
US5359513A (en) * 1992-11-25 1994-10-25 Arch Development Corporation Method and system for detection of interval change in temporally sequential chest images
EP0602730B1 (en) * 1992-12-18 2002-06-19 Koninklijke Philips Electronics N.V. Registration of Volumetric images which are relatively elastically deformed by matching surfaces
US5802220A (en) * 1995-12-15 1998-09-01 Xerox Corporation Apparatus and method for tracking facial motion through a sequence of images
US5798982A (en) * 1996-04-29 1998-08-25 The Trustees Of Columbia University In The City Of New York Method for inverting reflection trace data from 3-D and 4-D seismic surveys and identifying subsurface fluid and pathways in and among hydrocarbon reservoirs based on impedance models
JP3053389B1 (en) * 1998-12-03 2000-06-19 三菱電機株式会社 Moving object tracking irradiation device
US6169817B1 (en) * 1998-11-04 2001-01-02 University Of Rochester System and method for 4D reconstruction and visualization
US6757423B1 (en) * 1999-02-19 2004-06-29 Barnes-Jewish Hospital Methods of processing tagged MRI data indicative of tissue motion including 4-D LV tissue tracking
US6139500A (en) * 1999-02-24 2000-10-31 Agilent Technologies Inc. Methods and apparatus for 3D cardiac ultrasound imaging
US6535623B1 (en) * 1999-04-15 2003-03-18 Allen Robert Tannenbaum Curvature based system for the segmentation and analysis of cardiac magnetic resonance images
US6438403B1 (en) * 1999-11-01 2002-08-20 General Electric Company Method and apparatus for cardiac analysis using four-dimensional connectivity
US6563941B1 (en) * 1999-12-14 2003-05-13 Siemens Corporate Research, Inc. Model-based registration of cardiac CTA and MR acquisitions
US6615070B2 (en) * 2000-06-01 2003-09-02 Georgia Tech Research Corporation Automated planning volume contouring algorithm for adjuvant brachytherapy treatment planning in sarcoma
JP3442346B2 (en) * 2000-06-01 2003-09-02 カナガワ アンド カンパニー株式会社 Image forming apparatus and image forming method using the same
US6466813B1 (en) * 2000-07-22 2002-10-15 Koninklijke Philips Electronics N.V. Method and apparatus for MR-based volumetric frameless 3-D interactive localization, virtual simulation, and dosimetric radiation therapy planning
US6539074B1 (en) * 2000-08-25 2003-03-25 General Electric Company Reconstruction of multislice tomographic images from four-dimensional data
ATE413135T1 (en) * 2000-09-14 2008-11-15 Univ Leland Stanford Junior ASSESSMENT OF THE CONDITION OF A JOINT AND THE LOSS OF CARTILAGE TISSUE
US6728424B1 (en) * 2000-09-15 2004-04-27 Koninklijke Philips Electronics, N.V. Imaging registration system and method using likelihood maximization
US7031504B1 (en) * 2000-09-26 2006-04-18 Vital Images, Inc. Image data based retrospective temporal selection of medical images
DE10048029A1 (en) * 2000-09-26 2002-04-25 Philips Corp Intellectual Pty Procedure for calculating a transformation connecting two images
US6473634B1 (en) * 2000-11-22 2002-10-29 Koninklijke Philips Electronics N.V. Medical imaging at two temporal resolutions for tumor treatment planning
US7020305B2 (en) * 2000-12-06 2006-03-28 Microsoft Corporation System and method providing improved head motion estimations for animation
EP1363535B1 (en) * 2001-01-30 2012-01-04 R. Christopher Decharms Methods for physiological monitoring, training, exercise and regulation
US6816607B2 (en) * 2001-05-16 2004-11-09 Siemens Corporate Research, Inc. System for modeling static and dynamic three dimensional anatomical structures by 3-D models
US20030072479A1 (en) * 2001-09-17 2003-04-17 Virtualscopics System and method for quantitative assessment of cancers and their change over time
US7286866B2 (en) * 2001-11-05 2007-10-23 Ge Medical Systems Global Technology Company, Llc Method, system and computer product for cardiac interventional procedure planning
DE10224011A1 (en) * 2002-05-29 2003-12-24 Siemens Ag Computer-aided reconstruction method for a three-dimensional object
AU2003263003A1 (en) * 2002-08-29 2004-03-19 Computerized Medical Systems, Inc. Methods and systems for localizing of a medical imaging probe and of a biopsy needle
US7042975B2 (en) * 2002-10-25 2006-05-09 Koninklijke Philips Electronics N.V. Four-dimensional helical tomographic scanner
FR2848093B1 (en) * 2002-12-06 2005-12-30 Ge Med Sys Global Tech Co Llc METHOD FOR DETECTING THE CARDIAC CYCLE FROM AN ANGIOGRAM OF CORONARY VESSELS
US7505809B2 (en) * 2003-01-13 2009-03-17 Mediguide Ltd. Method and system for registering a first image with a second image relative to the body of a patient
US7486983B2 (en) * 2003-02-12 2009-02-03 Siemens Medical Solutions Usa, Inc. Verification of radiation beam characteristics
US7218320B2 (en) * 2003-03-13 2007-05-15 Sony Corporation System and method for capturing facial and body motion
DE10317367B4 (en) * 2003-04-15 2007-01-11 Siemens Ag Method of performing digital subtraction angiography using native volume data sets
DE10333543A1 (en) * 2003-07-23 2005-02-24 Siemens Ag A method for the coupled presentation of intraoperative as well as interactive and iteratively re-registered preoperative images in medical imaging
US20050053267A1 (en) * 2003-09-05 2005-03-10 Varian Medical Systems Technologies, Inc. Systems and methods for tracking moving targets and monitoring object positions
US7154498B2 (en) * 2003-09-10 2006-12-26 Siemens Medical Solutions Usa, Inc. System and method for spatio-temporal guidepoint modeling
US7796790B2 (en) * 2003-10-17 2010-09-14 Koninklijke Philips Electronics N.V. Manual tools for model based image segmentation
TWI220234B (en) * 2003-10-21 2004-08-11 Ind Tech Res Inst A method to simulate animated images for an object
US7486812B2 (en) * 2003-11-25 2009-02-03 Icad, Inc. Shape estimates and temporal registration of lesions and nodules
US7327865B2 (en) * 2004-06-30 2008-02-05 Accuray, Inc. Fiducial-less tracking with non-rigid image registration
US7366278B2 (en) * 2004-06-30 2008-04-29 Accuray, Inc. DRR generation using a non-linear attenuation model
US7729744B2 (en) * 2004-07-20 2010-06-01 Resonant Medical, Inc. Verifying lesion characteristics using beam shapes
US7283654B2 (en) * 2004-08-26 2007-10-16 Lumeniq, Inc. Dynamic contrast visualization (DCV)
US8989349B2 (en) * 2004-09-30 2015-03-24 Accuray, Inc. Dynamic tracking of moving targets
WO2006062958A2 (en) * 2004-12-10 2006-06-15 Worcester Polytechnic Institute Image-based computational mechanical analysis and indexing for cardiovascular diseases
US7453983B2 (en) * 2005-01-20 2008-11-18 Carestream Health, Inc. Radiation therapy method with target detection

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6835137B1 (en) * 1998-08-06 2004-12-28 Namco Limited Game apparatus and communication game system
US20020009224A1 (en) * 1999-01-22 2002-01-24 Claudio Gatti Interactive sculpting for volumetric exploration and feature extraction
US6892089B1 (en) * 1999-04-22 2005-05-10 Johns Hopkins University Cardiac motion tracking using cine harmonic phase (HARP) magnetic resonance imaging
US20050261570A1 (en) * 2001-06-08 2005-11-24 Mate Timothy P Guided radiation therapy system
US20030184291A1 (en) * 2002-03-28 2003-10-02 Rehwald Wolfgang G. Shifting of artifacts by reordering of k-space
US20030228905A1 (en) * 2002-06-07 2003-12-11 Satoru Osako Game system and game program

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2584970A4 (en) * 2010-06-23 2017-08-02 Varian Medical Systems International AG Mechanism for advanced structure generation and editing
WO2016070938A1 (en) * 2014-11-07 2016-05-12 Raysearch Laboratories Ab Robust radiotherapy treatment plan generation
US10137314B2 (en) 2014-11-07 2018-11-27 Raysearch Laboratories Ab Robust radiotherapy treatment plan generation
EP3081262A1 (en) * 2015-04-14 2016-10-19 RaySearch Laboratories AB A method, a computer program product and a system for optimization of radiotherapy treatment planning
WO2016166059A3 (en) * 2015-04-14 2017-02-16 Raysearch Laboratories Ab A method, a computer program product and a system for optimization of radiotherapy treatment planning
US10888712B2 (en) 2015-04-14 2021-01-12 Raysearch Laboratories Ab Method, a computer program product and a system for optimization of radiotherapy treatment planning
EP3228357A1 (en) * 2016-04-08 2017-10-11 RaySearch Laboratories AB Method, computer program product and computer system for radiotherapy treatment planning
WO2017174643A1 (en) * 2016-04-08 2017-10-12 Raysearch Laboratories Ab Method, computer program product and computer system for radiotherapy treatment planning
US10843009B2 (en) 2016-04-08 2020-11-24 Raysearch Laboratories Ab Method, computer program product and computer system for radiotherapy treatment planning

Also Published As

Publication number Publication date
US7352370B2 (en) 2008-04-01
WO2006130771A3 (en) 2008-01-17
US20060274061A1 (en) 2006-12-07

Similar Documents

Publication Publication Date Title
US7352370B2 (en) Four-dimensional volume of interest
US11547870B2 (en) Radiation treatment planning using four-dimensional imaging data
US8077936B2 (en) Treatment planning software and corresponding user interface
US20060274925A1 (en) Generating a volume of interest using a dose isocontour
CN101248441B (en) Precision registration of x-ray images to cone-beam CT scan for image-guided radiation treatment
US8406851B2 (en) Patient tracking using a virtual image
US7298819B2 (en) Flexible treatment planning
US7623679B2 (en) Temporal smoothing of a deformation model
US7801349B2 (en) Automatic generation of an envelope of constraint points for inverse planning
AU2020264304B2 (en) Adaptive radiotherapy system
US20080037843A1 (en) Image segmentation for DRR generation and image registration
US20090005668A1 (en) Non-invasive method for using 2D angiographic images for radiosurgical target definition
Xing et al. Computational challenges for image-guided radiation therapy: framework and current research
WO2006130863A2 (en) Inverse planning using optimization constraints derived from image intensity
Chao et al. Development of a Dedicated Radiotherapy Unit with Real-Time Image Guidance and Motion Management for Accelerated Partial Breast Irradiation

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 06771846

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 06771846

Country of ref document: EP

Kind code of ref document: A2