US20050196025A1 - Method and apparatus for processing medical image data in a network environment - Google Patents
- Publication number
- US20050196025A1 (application US10/794,104)
- Authority
- US
- United States
- Prior art keywords
- network
- hardware
- logic
- processor
- network service
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
- G06V10/95—Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/803—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
Definitions
- the network image fusion service module 112 is configured to perform the image fusion service described above with respect to FIGS. 1 and 2 .
- a network image fusion service module 112 is coupled to the processor 302 and memory 306 .
- the network image fusion service module 112 includes a programmable reconfigurable logic device 328 and a memory 330 .
- the network image fusion service module 112 may optionally include the storage location 114 of the image fusion process(es) software instructions and hardware logic files. Alternatively, the storage location 114 of the fusion process(es) software instructions and hardware logic file(s) may be external to the network element 106 and coupled to the network element via one of the ports 308 or via a separate port 332 .
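As a rough illustration of this composition, the sketch below models the module's parts after the reference numerals; the class, its method names, and the file names are assumptions chosen for clarity, not details taken from the patent.

```python
# Hypothetical model of the network image fusion service module (112):
# a reprogrammable logic device (328), working memory (330), and an
# optional local storage location (114) for process files.

class FusionServiceModule:
    def __init__(self, storage=None):
        self.logic_device = None      # stands in for the FPGA (328)
        self.memory = {}              # working memory (330)
        self.storage = storage or {}  # storage location (114), possibly external

    def load_process(self, name):
        # Fetch the software instructions and hardware logic files for a
        # named fusion process, then "configure" the stand-in logic
        # device with the hardware part and cache the software part.
        software, hardware = self.storage[name]
        self.logic_device = hardware
        self.memory[name] = software
        return software

# Usage: the storage location maps a process name to its two halves.
module = FusionServiceModule({"fuse": ("fuse_sw", "fuse.bit")})
loaded = module.load_process("fuse")
print(loaded)  # -> fuse_sw
```

In a real deployment the storage lookup would instead go out over one of the ports when the storage location is external to the network element.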
- the image fusion service module 112 may be implemented on a network element 106 as illustrated in FIG. 3 , or on a separate computer platform apart from the network element 106 as illustrated in FIG. 4 .
- the image fusion service module 112 includes a processor 402 in addition to the components described above with respect to FIG. 3 .
- the image fusion service module 112 can be implemented, for example, as a blade server external to the associated network element 106 , but coupled thereto to enable the blade server to receive and transmit messages via the network 104 and to make use of the processor and memory and other resources available within the network element 106 .
- the methods described herein may be implemented as a set of program logic that is stored in a computer readable memory within the network element or accessible thereto, and executed on one or more processors within the network element.
- All logic and methods described herein can be embodied using discrete components, integrated circuitry such as an ASIC, PLA, PAL, FPGA, or microprocessor, a state machine, or any other device including any combination thereof.
- Programmable logic can be fixed temporarily or permanently in a tangible medium such as one of the variety of read-only memory (ROM) chips, a computer memory, a disk, or other storage medium.
- Programmable logic can also be fixed in a computer data signal embodied in a carrier wave, allowing the programmable logic to be transmitted over an interface such as a computer bus or communication network. All such embodiments are intended to fall within the scope of the present invention.
Abstract
Description
- 1. Field of the Invention
- This invention generally relates to image processing systems and in particular to a method and apparatus for processing medical image data in a network environment.
- 2. Description of the Related Art
- Medical diagnostic imaging allows radiologists to perform diagnosis of many types of injury and disease by imaging various internal body parts. For example, radiologists utilize diagnostic imaging to visualize organs in the abdomen, the chest cavity, the brain and central nervous system, the musculoskeletal system, and other body parts. Diagnostic imaging may be used to detect potential cancerous abnormalities, to measure bone density, and to diagnose joint, bone, or soft tissue injuries and many other types of disease. Presently, diagnostic imaging includes many different imaging modalities such as x-rays, ultrasound, computed tomography, magnetic resonance imaging, and nuclear medicine, to name but a few.
- Traditionally, almost all diagnostic imaging was film based. An image was recorded on a physical piece of film that had to be developed, provided to the physician for viewing, reviewed by the physician, and recorded and stored in an archive. Often there was a significant time delay between the taking of the image and the physician reviewing the image. In addition, the storage of film images required a large physical space and associated record keeping. If a physician needed to refer to a patient's stored records, the film images needed to be physically found, retrieved, and provided to the physician. Often there was a significant time delay in this process as well.
- To address these issues, diagnostic imaging technology has advanced and medical diagnostic imaging has shifted from a film based system to a digitally based system in which diagnostic images are recorded, transferred, viewed, and stored electronically. Several types of digital imaging modalities, such as computed tomography or magnetic resonance imaging, generate a large number of images during each diagnostic examination that are then combined to form a three-dimensional volume image. One examination can generate between twenty and one hundred images or more. The processing of this large number of image files to render a single fused image is time consuming, so a technician typically performs the image fusion before the fused image is provided to the radiologist for review. In some instances, the image formed by the technician is not exactly what the radiologist needed, and the radiologist determines that other images are necessary. The time delay between taking the image and the review by the doctor or radiologist may thus be considerable. The patient may then be required to return at a later time for more images to be taken, or to wait until the radiologist has reviewed the images and determined whether more images are needed. In either case, the patient is subjected to further inconvenience.
- A method and apparatus for processing medical image data in a network environment is provided to reduce the amount of time required to process images, for example, to combine or fuse a plurality of image data into a single image data. According to an embodiment of the present invention, a network service is provided that is configured to receive a plurality of image data and to execute one or more image fusion processes in both software and hardware. The logic for each image fusion process may be divided between instructions that are to be executed in software and logic that is to be executed in hardware. The software instructions may be executed either by a processor within the network element or by a processor associated with the network service. The hardware logic is executed in a reconfigurable programmable logic device such as a Field Programmable Gate Array (FPGA). The network service may be included within the network element, be separate from a network element but associated therewith, for example, as a blade server, or may be external to the network and accessible by a plurality of network elements. The process logic and hardware parameters may be stored with the network service components or may be stored external to the network as well.
- Aspects of the present invention are pointed out with particularity in the appended claims. The present invention is illustrated by way of example in the following drawings in which like references indicate similar elements. The following drawings disclose various embodiments of the present invention for purposes of illustration only and are not intended to limit the scope of the invention. For purposes of clarity, not every component may be labeled in every figure. In the figures:
- FIG. 1 is a functional block diagram of a network environment incorporating an embodiment of the present invention;
- FIG. 2 is a flow chart illustrating a process according to an embodiment of the present invention;
- FIG. 3 is a functional block diagram of a network element according to an embodiment of the invention; and
- FIG. 4 is a functional block diagram of a network element and associated network service module according to an embodiment of the present invention.
- The following detailed description sets forth numerous specific details to provide a thorough understanding of the invention. However, those skilled in the art will appreciate that the invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, protocols, processes, and circuits have not been described in detail so as not to obscure the invention.
- As described in greater detail below, a method and apparatus is provided for processing medical image data in a network environment to form image data from a plurality of image data files in conjunction with the transmission of the plurality of image data files across a network. In one embodiment of the invention, a network service is provided and configured to execute one portion of an image fusion process in software and to execute another portion of the image fusion process in a reconfigurable and reprogrammable hardware device, such as a Field Programmable Gate Array (FPGA). The division of the process logic between software and hardware is a function of the process step, the level of complexity of the process, the speed of the processor used to execute the software instructions, the desirability of providing for future upgrades, and other system requirements with respect to speed and memory storage.
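As a rough sketch of this software/hardware division, the example below splits a toy fusion pipeline into stages flagged for software or hardware execution. The stage names, the pixel-wise averaging fusion step, and the fact that the "hardware" stage runs in ordinary software here are all illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch: one image fusion process divided between
# software instructions and a stage earmarked for reprogrammable
# hardware, as the embodiment describes.

from dataclasses import dataclass
from typing import Callable, List

Image = List[List[float]]  # minimal stand-in for pixel data

@dataclass
class Stage:
    name: str
    run: Callable[[List[Image]], List[Image]]
    in_hardware: bool  # True -> would execute in the FPGA

def normalize(images):
    # Light software step: scale each image so its peak value is 1.0.
    out = []
    for img in images:
        peak = max(max(row) for row in img) or 1.0
        out.append([[v / peak for v in row] for row in img])
    return out

def fuse(images):
    # Pixel-wise average of all images; the heavy step that a real
    # system might load into the FPGA as hardware logic.
    rows, cols = len(images[0]), len(images[0][0])
    return [[[sum(img[r][c] for img in images) / len(images)
              for c in range(cols)] for r in range(rows)]]

def execute(stages, images):
    for stage in stages:
        # A real system would dispatch in_hardware stages to the
        # reprogrammable device; both paths run in software here.
        images = stage.run(images)
    return images

pipeline = [Stage("normalize", normalize, in_hardware=False),
            Stage("fuse", fuse, in_hardware=True)]
fused = execute(pipeline, [[[2.0, 4.0]], [[1.0, 1.0]]])
print(fused[0])  # -> [[0.75, 1.0]]
```

The `in_hardware` flag is where the factors listed above (process step, complexity, processor speed, upgradability) would drive the partitioning decision.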
- According to an embodiment of the present invention, the network image fusion service is provided in conjunction with a network element that is configured to receive a plurality of medical image data, retrieve the process software instructions and the hardware logic files for at least one image fusion process, execute the image fusion process both in software instructions and in hardware logic and to provide the desired fused image as an output. In the event that there is more than one image fusion process to be executed, the network service executes the plurality of image fusion processes in a predetermined order that may be a stored value or set by an operator. By dividing the execution of the image fusion process logic between software code and hardware logic, and executing these on the network with a network service in conjunction with a network element, the present invention described herein is able to shift responsibility for the image fusion processing from medical equipment used to display the images or an image archive system, to the network service. This allows a user to request a particular image and to receive the image in a shorter period of time than images formed using only software implemented processes or images processed by the reviewing/reporting workstation or image archive system.
- Although one aspect of the present invention described herein is directed toward a network service configured to fuse multiple medical images into a single medical image, the invention described is not limited in this manner as other forms of image processing may be done as well, such as edge detection, filtering, morphing, and other types of medical image processing.
- In the embodiments that follow, the network image fusion service is described as being used in conjunction with a network which may be, for example, an enterprise network that may be deployed in a medical facility or other facility. Examples of typical networks may include a radiological information system (RIS) or a hospital information system (HIS). However, the network may also be a more extensive network such as a wide area network (WAN), a metro area network (MAN), or a public network such as the Internet.
- As used herein, image data means the pixel data and other data that are used to render a single image and a plurality of image data means pixel data and other data representative of a plurality of images that may be combined to create one or more combined images. Image data for a single image and a plurality of image data representing a plurality of images may be stored in one or more image data files depending on the system configuration and requirements. Also as used herein, hardware logic files may include without limitation, scripts, setup files, tool lineup files, and other such files that implement the process logic in a reprogrammable device such as an FPGA.
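As an illustrative aside, the terms defined above can be modeled as simple data structures; the class and field names below are assumptions chosen for clarity, not structures defined by the patent.

```python
# Hypothetical data structures for the terms "image data" and
# "hardware logic files" as used in this document.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ImageData:
    # Pixel data plus the other data used to render a single image.
    pixels: List[List[int]]
    attributes: Dict[str, str] = field(default_factory=dict)

@dataclass
class HardwareLogicFiles:
    # Files that implement process logic in a reprogrammable device:
    # scripts, setup files, tool lineup files, and the like.
    bitstream: bytes = b""
    scripts: List[str] = field(default_factory=list)

# A "plurality of image data": several images that may be combined.
study = [ImageData([[0, 1], [1, 0]], {"modality": "CT"}),
         ImageData([[1, 1], [0, 0]], {"modality": "MR"})]
print(len(study))  # -> 2
```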
- As depicted in FIG. 1 , one or more imaging modalities 102 are configured to generate two or more image data that are to be fused into a single image data. The imaging modalities may include, without limitation, an x-ray system, a computed tomography system, an ultrasound system, a magnetic resonance imaging system, or a nuclear medicine system. Other modalities may similarly be used and the invention is not limited to these particular modalities. The imaging modalities 102 transfer the image data to a desired destination for storage, processing, or display via a network 104 that is made up of one or more network elements 106 .
- The image data provided by the imaging modalities 102 can be provided to an image archive system 108 for storage or to a reviewing/reporting workstation 110 for display to be reviewed by a radiologist. According to an embodiment of the present invention, an image fusion network service is deployed on the network to provide the image fusion of medical files transferred over the network 104 . A network image fusion service module 112 that provides the network image fusion service functionality may be located, for example, on one or more of the network elements 106 configured to communicate on the network.
- In the embodiments described herein the network image fusion module 112 is associated with a network element 106 that is configured to form a part of the network 104 described above. For example and without limitation, the network element 106 may be a router, bridge, gateway, content switch, or other type of network device capable of hosting the network image fusion service. Alternatively, as will be explained in more detail below, the network image fusion module may be a blade server that is provided with an interface to the associated network element 106 , but that has its own processor, memory, and reconfigurable reprogrammable logic device, e.g., an FPGA, and is capable of executing both the software instructions and the FPGA logic.
-
FIG. 2 is a flow chart that depicts a process for performing real time image fusion according to an embodiment of the invention. Initially, a request for an image created from a plurality of image data supplied by a data source is generated (202). Optionally, the request may be accompanied by a desired order of execution of one or more image fusion processes. The data source may be an image archive system or an imaging modality. The data source provides the plurality of image data and places the plurality of image data on the network (204). The plurality of image data are received by the network element associated with the real time image fusion service (206). The network service, in conjunction with the network element, requests the software instructions and hardware logic files, e.g., the FPGA parameters, for each real time image fusion process from a storage location (208). The storage location provides the software instructions and the hardware logic files for the requested process (210). The network element receives the software instructions and the hardware logic files for the requested process and provides these to the network image fusion service (212). The network image fusion service, in conjunction with the network element, loads and executes the software instructions and the hardware logic files in the reprogrammable hardware for the image fusion process (214). If another process is to be run (216), the network image fusion service, in conjunction with the network element, loads and executes the software instructions and the hardware logic files for the next image fusion process (214). If no more processes are to be run, the network image fusion service places the fused image data on the network (218).
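The numbered flow above can be sketched as a short loop; the function names, the in-memory storage location, and the row-averaging fusion step are illustrative assumptions rather than details from the patent.

```python
# Hypothetical walk-through of the FIG. 2 flow; step numbers from the
# flow chart are kept as comments. Storage and the network are stubbed.

def run_fusion_request(image_data, processes, storage):
    received = list(image_data)       # (204)/(206) data arrives via network
    for name in processes:            # requested execution order
        sw, hw = storage[name]        # (208)-(212) fetch instructions + logic
        received = sw(received, hw)   # (214) execute software and hw logic
    return received                   # (218) fused data placed on network

def average_rows(images, hw_params):
    # Stand-in fusion step; hw_params would configure the FPGA but is
    # unused in this software-only sketch.
    n = len(images)
    return [[sum(vals) / n for vals in zip(*images)]]

storage = {"fuse": (average_rows, {"bitstream": "fuse.bit"})}
result = run_fusion_request([[1.0, 3.0], [3.0, 5.0]], ["fuse"], storage)
print(result)  # -> [[2.0, 4.0]]
```

Fetching `storage[name]` inside the loop corresponds to the per-process retrieval described next; prefetching all entries before the loop would correspond to the batch alternative.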
- In the event that there is more than one image fusion process, the real time fusion service storage location may provide the software instructions and hardware logic files for each process individually, in which case the process iterates through process steps (208) to (214) for each process. Alternatively, if there is more than one image fusion process, the storage location may provide the software instructions and the hardware logic files for all processes at the same time. In this event, the real time image fusion service stores the software instructions and hardware logic files and iterates at step (214) until all the processes have been executed thereby.
- FIG. 3 depicts a network element 106 according to an embodiment of the present invention. In particular, the network element generally includes a processor 302, which includes control logic 304, and a memory 306. The processor 302, control logic 304 and memory 306 provide the functionality and control of the network element 106. The network element 106 also includes one or more network data ports 308 that enable the network element 106 to be connected to the network 104. A switch fabric 310, under the control of the processor 302, is provided to interconnect the network data ports 308 and to direct packets between the network ports 308. The switch fabric 310 may be supported by a packet queue 312 that is configured to temporarily store packets or other protocol data units prior to transmission on the network 104 or before being processed by the processor 302. - The
network element 106 may also include one or more subsystems under the control of the processor 302 and control logic 304. For example, if the network element is configured to make routing decisions for data packets on the network, routing software 314 and routing tables 316 containing routing information may be provided to enable the network element to route data packets and other protocol data units on the network. Other subsystems may include, for example, a protocol subsystem that includes a protocol stack 318 that is configured to store data and logic to enable the network element to participate in protocol exchanges on the network. The network element 106 may also include a security subsystem 320 that may include an authentication module 322 that is configured to store authentication information to authenticate users, devices, network connections, or a combination thereof. The security subsystem 320 may further include an authorization module 324 that is configured to provide authorization information to prevent unauthorized access to the network, the network element, or both. The security subsystem 320 may also include an accounting module 326 that is configured to enable accounting entries to be established for sessions on the network, the network element, or both. - As depicted in
FIG. 3, the network image fusion service module 112 is configured to perform the image fusion service described above with respect to FIGS. 1 and 2. In the illustrated embodiment, the network image fusion service module 112 is coupled to the processor 302 and memory 306. The network image fusion service module 112 includes a programmable reconfigurable logic device 328 and a memory 330. The network image fusion service module 112 may optionally include the storage location 114 of the image fusion process(es) software instructions and hardware logic files. Alternatively, the storage location 114 of the image fusion process(es) software instructions and hardware logic file(s) may be external to the network element 106 and coupled to the network element via one of the ports 308 or via a separate port 332. - The image
fusion service module 112 may be implemented on a network element 106 as illustrated in FIG. 3, or on a separate computer platform, apart from the network element 106, as illustrated in FIG. 4. In the embodiment illustrated in FIG. 4, the image fusion service module 112 includes a processor 402 in addition to the components described above with respect to FIG. 3. In this embodiment, the image fusion service module 112 can be implemented, for example, as a blade server external to the associated network element 106, but coupled thereto to enable the blade server to receive and transmit messages via the network 104 and to make use of the processor, memory and other resources available within the network element 106. - The methods described herein may be implemented as a set of program logic that is stored in a computer readable memory within the network element, or accessible thereto, and executed on one or more processors within the network element. However, it will be apparent to a skilled artisan that all logic and methods described herein can be embodied using discrete components; integrated circuitry such as an ASIC, PLA, PAL, FPGA, or microprocessor; a state machine; or any other device, including any combination thereof. Programmable logic can be fixed temporarily or permanently in a tangible medium such as a read-only memory (ROM) chip, a computer memory, a disk, or other storage medium. Programmable logic can also be fixed in a computer data signal embodied in a carrier wave, allowing the programmable logic to be transmitted over an interface such as a computer bus or communication network. All such embodiments are intended to fall within the scope of the present invention.
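As a rough illustration of the two deployment options described above (the fusion service embedded in the network element, or hosted on a separate platform such as a blade server), a dispatcher might forward image data to whichever backend is configured. This is a sketch under stated assumptions; the function and parameter names are hypothetical and not drawn from the patent.

```python
# Sketch of selecting a fusion service deployment: in-element module
# (FIG. 3) or an external platform reached via the network (FIG. 4).
# All names are hypothetical illustrations.

def make_fusion_endpoint(local_service=None, remote_send=None):
    """Return a callable that forwards image data to the available
    deployment: the module embedded in the network element if present,
    otherwise a sender that relays messages to an external platform."""
    if local_service is not None:
        return local_service   # fusion module inside the network element
    if remote_send is not None:
        return remote_send     # e.g., messages to an external blade server
    raise ValueError("no fusion service deployment configured")
```

Either way the caller sees one interface, which mirrors the point above: the service module's location (inside the element or on a separate platform) is a deployment choice, not a change to the service itself.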
- It should be appreciated that other variations to and modifications of the above-described method and system for processing medical image data may be made without departing from the inventive concepts described herein. Accordingly, the invention should not be viewed as limited except by the scope and spirit of the appended claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/794,104 US7492932B2 (en) | 2004-03-05 | 2004-03-05 | Method and apparatus for processing medical image data in a network environment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/794,104 US7492932B2 (en) | 2004-03-05 | 2004-03-05 | Method and apparatus for processing medical image data in a network environment |
Publications (2)
Publication Number | Publication Date |
---|---|
US20050196025A1 true US20050196025A1 (en) | 2005-09-08 |
US7492932B2 US7492932B2 (en) | 2009-02-17 |
Family
ID=34912184
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/794,104 Expired - Fee Related US7492932B2 (en) | 2004-03-05 | 2004-03-05 | Method and apparatus for processing medical image data in a network environment |
Country Status (1)
Country | Link |
---|---|
US (1) | US7492932B2 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070162972A1 (en) * | 2006-01-11 | 2007-07-12 | Sensory Networks, Inc. | Apparatus and method for processing of security capabilities through in-field upgrades |
US20090041329A1 (en) * | 2007-08-07 | 2009-02-12 | Nextslide Imaging Llc. | Network Review in Clinical Hematology |
CN103093446A (en) * | 2013-01-18 | 2013-05-08 | 北京理工大学 | Multi-source image fusion device and method based on on-chip system of multiprocessor |
US20130204136A1 (en) * | 2012-02-03 | 2013-08-08 | Delphinus Medical Technologies, Inc. | System and method for imaging a volume of tissue |
US9101290B2 (en) | 2010-02-12 | 2015-08-11 | Delphinus Medical Technologies, Inc. | Method of characterizing breast tissue using multiple contrast enhanced ultrasound renderings |
US9113835B2 (en) | 2011-02-08 | 2015-08-25 | Delphinus Medical Technologies, Inc. | System and method for generating a rendering of a volume of tissue based upon differential time-of-flight data |
US9144403B2 (en) | 2010-02-12 | 2015-09-29 | Delphinus Medical Technologies, Inc. | Method of characterizing the pathological response of tissue to a treatment plan |
US9763641B2 (en) | 2012-08-30 | 2017-09-19 | Delphinus Medical Technologies, Inc. | Method and system for imaging a volume of tissue with tissue boundary detection |
US10123770B2 (en) | 2013-03-13 | 2018-11-13 | Delphinus Medical Technologies, Inc. | Patient support system |
US10143443B2 (en) | 2014-05-05 | 2018-12-04 | Delphinus Medical Technologies, Inc. | Method for representing tissue stiffness |
US10201324B2 (en) | 2007-05-04 | 2019-02-12 | Delphinus Medical Technologies, Inc. | Patient interface system |
US10285667B2 (en) | 2014-08-05 | 2019-05-14 | Delphinus Medical Technologies, Inc. | Method for generating an enhanced image of a volume of tissue |
CN109791518A (en) * | 2016-09-28 | 2019-05-21 | 亚马逊科技公司 | Debugging message is extracted from the FPGA in multi-tenant environment |
CN113016039A (en) * | 2018-11-12 | 2021-06-22 | 皇家飞利浦有限公司 | System and method for processing waveform data in a medical device |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7492480B2 (en) * | 2001-08-27 | 2009-02-17 | Phototype Engraving Company | System for halftone screen production |
CN113646799A (en) | 2018-12-05 | 2021-11-12 | 史赛克公司 | System and method for displaying medical imaging data |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5684621A (en) * | 1995-05-08 | 1997-11-04 | Downing; Elizabeth Anne | Method and system for three-dimensional display of information based on two-photon upconversion |
US6064423A (en) * | 1998-02-12 | 2000-05-16 | Geng; Zheng Jason | Method and apparatus for high resolution three dimensional display |
US6177913B1 (en) * | 1998-04-23 | 2001-01-23 | The United States Of America As Represented By The Secretary Of The Navy | Volumetric display |
US6208318B1 (en) * | 1993-06-24 | 2001-03-27 | Raytheon Company | System and method for high resolution volume display using a planar array |
US20010041991A1 (en) * | 2000-02-09 | 2001-11-15 | Segal Elliot A. | Method and system for managing patient medical records |
US6327074B1 (en) * | 1998-11-25 | 2001-12-04 | University Of Central Florida | Display medium using emitting particles dispersed in a transparent host |
US20020006216A1 (en) * | 2000-01-18 | 2002-01-17 | Arch Development Corporation | Method, system and computer readable medium for the two-dimensional and three-dimensional detection of lesions in computed tomography scans |
US6348793B1 (en) * | 2000-11-06 | 2002-02-19 | Ge Medical Systems Global Technology, Company, Llc | System architecture for medical imaging systems |
US20020029264A1 (en) * | 2000-09-04 | 2002-03-07 | Tetsuo Ogino | Medical image service method, medical software service method, medical image central management server apparatus, medical software central management server apparatus, medical image service system and medical software service system |
US20020067467A1 (en) * | 2000-09-07 | 2002-06-06 | Dorval Rick K. | Volumetric three-dimensional display system |
US6466184B1 (en) * | 1998-12-29 | 2002-10-15 | The United States Of America As Represented By The Secretary Of The Navy | Three dimensional volumetric display |
US6470071B1 (en) * | 2001-01-31 | 2002-10-22 | General Electric Company | Real time data acquisition system including decoupled host computer |
US20030126148A1 (en) * | 2001-11-21 | 2003-07-03 | Amicas, Inc. | System and methods for real-time worklist service |
US20030165262A1 (en) * | 2002-02-21 | 2003-09-04 | The University Of Chicago | Detection of calcifications within a medical image |
2004
- 2004-03-05 US US10/794,104 patent/US7492932B2/en not_active Expired - Fee Related
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6208318B1 (en) * | 1993-06-24 | 2001-03-27 | Raytheon Company | System and method for high resolution volume display using a planar array |
US5684621A (en) * | 1995-05-08 | 1997-11-04 | Downing; Elizabeth Anne | Method and system for three-dimensional display of information based on two-photon upconversion |
US5956172A (en) * | 1995-05-08 | 1999-09-21 | 3D Technology Laboratories, Inc. | System and method using layered structure for three-dimensional display of information based on two-photon upconversion |
US6064423A (en) * | 1998-02-12 | 2000-05-16 | Geng; Zheng Jason | Method and apparatus for high resolution three dimensional display |
US6177913B1 (en) * | 1998-04-23 | 2001-01-23 | The United States Of America As Represented By The Secretary Of The Navy | Volumetric display |
US6327074B1 (en) * | 1998-11-25 | 2001-12-04 | University Of Central Florida | Display medium using emitting particles dispersed in a transparent host |
US6466184B1 (en) * | 1998-12-29 | 2002-10-15 | The United States Of America As Represented By The Secretary Of The Navy | Three dimensional volumetric display |
US20020006216A1 (en) * | 2000-01-18 | 2002-01-17 | Arch Development Corporation | Method, system and computer readable medium for the two-dimensional and three-dimensional detection of lesions in computed tomography scans |
US20010041991A1 (en) * | 2000-02-09 | 2001-11-15 | Segal Elliot A. | Method and system for managing patient medical records |
US20020029264A1 (en) * | 2000-09-04 | 2002-03-07 | Tetsuo Ogino | Medical image service method, medical software service method, medical image central management server apparatus, medical software central management server apparatus, medical image service system and medical software service system |
US20020067467A1 (en) * | 2000-09-07 | 2002-06-06 | Dorval Rick K. | Volumetric three-dimensional display system |
US6348793B1 (en) * | 2000-11-06 | 2002-02-19 | Ge Medical Systems Global Technology, Company, Llc | System architecture for medical imaging systems |
US6470071B1 (en) * | 2001-01-31 | 2002-10-22 | General Electric Company | Real time data acquisition system including decoupled host computer |
US20030126148A1 (en) * | 2001-11-21 | 2003-07-03 | Amicas, Inc. | System and methods for real-time worklist service |
US20030165262A1 (en) * | 2002-02-21 | 2003-09-04 | The University Of Chicago | Detection of calcifications within a medical image |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070162972A1 (en) * | 2006-01-11 | 2007-07-12 | Sensory Networks, Inc. | Apparatus and method for processing of security capabilities through in-field upgrades |
US10201324B2 (en) | 2007-05-04 | 2019-02-12 | Delphinus Medical Technologies, Inc. | Patient interface system |
US20090041329A1 (en) * | 2007-08-07 | 2009-02-12 | Nextslide Imaging Llc. | Network Review in Clinical Hematology |
US11399798B2 (en) | 2010-02-12 | 2022-08-02 | Delphinus Medical Technologies, Inc. | Method of characterizing tissue of a patient |
US10278672B2 (en) | 2010-02-12 | 2019-05-07 | Delphinus Medical Technologies, Inc. | Method of characterizing the pathological response of tissue to a treatment plan |
US9101290B2 (en) | 2010-02-12 | 2015-08-11 | Delphinus Medical Technologies, Inc. | Method of characterizing breast tissue using multiple contrast enhanced ultrasound renderings |
US9144403B2 (en) | 2010-02-12 | 2015-09-29 | Delphinus Medical Technologies, Inc. | Method of characterizing the pathological response of tissue to a treatment plan |
US10231696B2 (en) | 2010-02-12 | 2019-03-19 | Delphinus Medical Technologies, Inc. | Method of characterizing tissue of a patient |
US9814441B2 (en) | 2010-02-12 | 2017-11-14 | Delphinus Medical Technologies, Inc. | Method of characterizing tissue of a patient |
US9113835B2 (en) | 2011-02-08 | 2015-08-25 | Delphinus Medical Technologies, Inc. | System and method for generating a rendering of a volume of tissue based upon differential time-of-flight data |
US20130204136A1 (en) * | 2012-02-03 | 2013-08-08 | Delphinus Medical Technologies, Inc. | System and method for imaging a volume of tissue |
US9763641B2 (en) | 2012-08-30 | 2017-09-19 | Delphinus Medical Technologies, Inc. | Method and system for imaging a volume of tissue with tissue boundary detection |
CN103093446A (en) * | 2013-01-18 | 2013-05-08 | 北京理工大学 | Multi-source image fusion device and method based on on-chip system of multiprocessor |
US10123770B2 (en) | 2013-03-13 | 2018-11-13 | Delphinus Medical Technologies, Inc. | Patient support system |
US11064974B2 (en) | 2013-03-13 | 2021-07-20 | Delphinus Medical Technologies, Inc. | Patient interface system |
US10143443B2 (en) | 2014-05-05 | 2018-12-04 | Delphinus Medical Technologies, Inc. | Method for representing tissue stiffness |
US11147537B2 (en) | 2014-05-05 | 2021-10-19 | Delphinus Medical Technologies, Inc. | Method for representing tissue stiffness |
US11298111B2 (en) | 2014-08-05 | 2022-04-12 | Delphinus Medical Technologies, Inc. | Method for generating an enhanced image of a volume of tissue |
US10285667B2 (en) | 2014-08-05 | 2019-05-14 | Delphinus Medical Technologies, Inc. | Method for generating an enhanced image of a volume of tissue |
CN109791518A (en) * | 2016-09-28 | 2019-05-21 | 亚马逊科技公司 | Debugging message is extracted from the FPGA in multi-tenant environment |
CN113016039A (en) * | 2018-11-12 | 2021-06-22 | 皇家飞利浦有限公司 | System and method for processing waveform data in a medical device |
Also Published As
Publication number | Publication date |
---|---|
US7492932B2 (en) | 2009-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8583449B2 (en) | Method and apparatus for providing network based load balancing of medical image data | |
US7492932B2 (en) | Method and apparatus for processing medical image data in a network environment | |
US8165426B2 (en) | Workflow-based management of medical image data | |
CN103648387B (en) | Medical image control system and portable terminal | |
CN103279637B (en) | Medical imaging diagnosis supporting apparatus and image diagnosis supporting method | |
CN1542671A (en) | Method for monitoring checking and/or treating process | |
US20050251006A1 (en) | Method and system for remote post-processing of medical image information | |
US20050207658A1 (en) | Method and apparatus for extracting information from a medical image | |
US20190228857A1 (en) | Methods, systems, and computer readable media for smart image protocoling | |
CN112582070A (en) | Providing and receiving medical data records | |
EP3799056A1 (en) | Cloud-based patient data exchange | |
Robertson et al. | Hospital, radiology, and picture archiving and communication systems | |
JP4812299B2 (en) | Virtual patient system | |
US20090136105A1 (en) | Retrieval system and retrieval method for retrieving medical images | |
WO2020087792A1 (en) | Artificial-intelligence disease analysis method and apparatus, storage medium, and computer device | |
US20040190795A1 (en) | Image sending device and image receiving device | |
US20070214235A1 (en) | Application server for processing medical image data | |
US8156210B2 (en) | Workflow for computer aided detection | |
JP2008253401A (en) | Data management system | |
JP2012179367A (en) | Data management system | |
JP6825606B2 (en) | Information processing device and information processing method | |
KR20190046306A (en) | System for managing diagnostic images with enhanced secruity | |
JP2009106503A (en) | Medical image selecting system and medical image diagnostic apparatus equipped with the system | |
US8285826B2 (en) | Grid computing on radiology network | |
JPH1091401A (en) | Medical system architecture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NORTEL NETWORKS LIMITED, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SCHOFIELD, BRUCE;REEL/FRAME:021848/0441 Effective date: 20040305 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: ROCKSTAR BIDCO, LP, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NORTEL NETWORKS LIMITED;REEL/FRAME:027164/0356 Effective date: 20110729 |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: ROCKSTAR CONSORTIUM US LP, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROCKSTAR BIDCO, LP;REEL/FRAME:032425/0867 Effective date: 20120509 |
|
AS | Assignment |
Owner name: RPX CLEARINGHOUSE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROCKSTAR CONSORTIUM US LP;ROCKSTAR CONSORTIUM LLC;BOCKSTAR TECHNOLOGIES LLC;AND OTHERS;REEL/FRAME:034924/0779 Effective date: 20150128 |
|
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, N.A., AS COLLATERAL AGENT, IL Free format text: SECURITY AGREEMENT;ASSIGNORS:RPX CORPORATION;RPX CLEARINGHOUSE LLC;REEL/FRAME:038041/0001 Effective date: 20160226 |
|
REMI | Maintenance fee reminder mailed | ||
LAPS | Lapse for failure to pay maintenance fees | ||
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20170217 |
|
AS | Assignment |
Owner name: RPX CORPORATION, CALIFORNIA Free format text: RELEASE (REEL 038041 / FRAME 0001);ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:044970/0030 Effective date: 20171222 Owner name: RPX CLEARINGHOUSE LLC, CALIFORNIA Free format text: RELEASE (REEL 038041 / FRAME 0001);ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:044970/0030 Effective date: 20171222 |