US20090034790A1 - Method for customs inspection of baggage and cargo - Google Patents
- Publication number
- US20090034790A1 (application US 11/888,481)
- Authority
- US
- United States
- Prior art keywords
- baggage
- scan data
- scanner
- image
- view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G01V5/226
  - G—PHYSICS
  - G01—MEASURING; TESTING
  - G01T—MEASUREMENT OF NUCLEAR OR X-RADIATION
  - G01T7/00—Details of radiation-measuring instruments
- G01V5/271
  - G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
  - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
  - G06T7/00—Image analysis
  - G06T7/0002—Inspection of images, e.g. flaw detection
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
  - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
  - G06T2207/00—Indexing scheme for image analysis or image enhancement
  - G06T2207/10—Image acquisition modality
  - G06T2207/10072—Tomographic images
  - G06T2207/10081—Computed x-ray tomography [CT]
Definitions
- Certain embodiments generally relate to methods and systems for providing remote access to baggage scanned images and passenger security information on a global level.
- the CT scanners generate data sets that are used to form images representative of each scanned bag.
- the data sets are currently processed by an automated image recognition system, which searches for certain patterns, characteristics, and the like.
- When the image recognition system identifies a potential threat, the images are brought to the attention of a local operator, for example one who is located at the port of origin of an airline flight.
- CT scanners, better known as explosive detection systems (EDS), are capable of producing fully three-dimensional (3-D) images.
- 3-D images are complex and generally require sophisticated local operators with expertise in 3-D rendering software tools.
- CT scanners are able to generate a 3-D voxel data set that represents the volume of the scanned bag.
- scanners provide 3-D images by stacking a series of closely spaced cross section images into a 3-D matrix. The 3-D image may then be viewed by a local operator/screener.
- the local operator at the airport terminal usually steps through two-dimensional (2-D) slices (e.g., planes) of the 3-D matrix to detect and identify potential threats within the packed bag.
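- For illustration, the slice-stacking described above can be sketched as follows. This is a minimal pure-Python sketch with toy data; the function names and the tiny 2x2 slices are assumptions for illustration, not part of the patent's method.

```python
# Sketch: stack a series of closely spaced 2-D cross-section images into a
# 3-D matrix, then step through its axial planes as a screener would.
# The slice data here is illustrative; a real scanner produces one 2-D
# attenuation image per gantry position.

def stack_slices(slices):
    """Stack a list of 2-D slices (list-of-rows) into a 3-D matrix [z][y][x]."""
    return [[row[:] for row in s] for s in slices]

def axial_plane(volume, z):
    """Return the 2-D axial plane at index z, as viewed slice by slice."""
    return volume[z]

# Three tiny 2x2 "slices" standing in for scanner output
slices = [
    [[0, 0], [0, 0]],
    [[0, 9], [0, 0]],   # a dense object appears only in this slice
    [[0, 0], [0, 0]],
]
volume = stack_slices(slices)
print(axial_plane(volume, 1))  # the one plane that reveals the object
```

The example also illustrates why slice-by-slice review is error-prone: the object is visible in only one of the stacked planes.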
- the baggage is scanned by a CT scanner and axial slices or images are created of the baggage.
- the local operator/screener views the axial slices or images by scrolling through each image slice one by one to determine if any potential threats are present in an image. Scrolling through dozens of images (or even more for future generation scanners) for each bag is a laborious task, and the local operator/screener must be alert to detect features of any potential threats within an image in order to flag them. Examination of each axial slice image gives rise to operator/screener fatigue that eventually leads to sub-optimal performance, causing the operator to miss some threats.
- a CT 3-D data set of a packed bag is obtained and may, for example, include hundreds of axial slice images. Of these images, only a few may show the potential threat. If the local operator misses any one of these few images, the undetected threats could result in disaster either while a plane, train, ship, or cargo vessel is in transit or upon arrival at the destination. Customs officials in a destination country must wait until the arrival of the carrier to search the baggage or cargo, either manually or using a scanner, for contraband or any threatening objects.
- a method for inspecting baggage to be transported from a location of origin to a destination.
- the method includes generating scan data representative of a piece of baggage while the baggage is at the location of origin, and storing the scan data in a database.
- the method further provides producing rendered views representative of the content of the baggage, where the rendered views are based on the scan data retrieved from the database over a network.
- the rendered views, whether generated at the point of origin, while en route, or at a destination port, are presented to customs officials at the destination.
- a system for inspecting baggage transported from a location of origin to a destination includes a database to store scan data acquired while scanning a piece of baggage while the baggage is at the location of origin, a network configured to transmit the scan data, and a workstation for producing a rendered view of the content of the piece of baggage at a destination.
- FIG. 1 illustrates a block diagram of a customs inspection system for baggage and cargo formed in accordance with an embodiment of the invention.
- FIG. 2 illustrates a flow diagram of a customs official using electronic unpacking in accordance with an embodiment of the invention.
- FIG. 3 illustrates a flow chart for an exemplary sequence of operations carried out by a scanner to electronically unpack a piece of baggage performed in accordance with an embodiment of the invention.
- FIGS. 4A-4I illustrate screen shots for a display at a local screener's terminal containing an exemplary data set of scanned images formed in accordance with an embodiment of the invention.
- FIG. 5 illustrates a screen shot for a display at a local screener's terminal containing a data set of scanned images of a suitcase and a radio containing simulated explosives shown in FIG. 5, formed in accordance with an embodiment of the invention.
- FIG. 6 illustrates a block diagram of exemplary manners in which embodiments of the present invention may be stored, distributed and installed on computer readable medium.
- the terms “a” or “an” are used to include one or more than one.
- the term “or” is used to refer to a nonexclusive or, unless otherwise indicated.
- the phrase “an image” is not intended to exclude embodiments of the present invention in which data representing an image is generated, but a viewable image is not generated. Therefore, as used herein, the term “image” broadly refers to both viewable images and data representing a viewable image. However, many embodiments generate (or are configured to generate) at least one viewable image.
- a customs inspection system is provided to allow for inspection of passenger baggage or cargo shipment while the baggage/cargo is in transit (e.g., by airplane, train, or ship) prior to arrival at a destination.
- a customs or other official accesses a database containing raw scan data of three-dimensional (3D) views of electronically unpacked baggage/cargo to determine if any illegal substances/objects (e.g., contraband or threatening materials) are being smuggled while the baggage/cargo is en route to a destination port.
- An electronic unpacking process simulates physical unpacking of the packed bags in order for the customs official to visualize various objects within the bag, where the objects may represent threats.
- FIG. 1 illustrates a block diagram of a customs inspection system 10 for baggage and cargo formed in accordance with an embodiment of the invention.
- the system 10 includes a database 12 , a workstation 14 , an x-ray scanner 15 , a CT scanner 16 , an orthogonal sensor 18 , and a passenger information database 20 that are interconnected together by a local area network 22 (e.g., LAN).
- Orthogonal sensors 18 are used to measure different characteristics of objects contained within the baggage to determine the presence of any threats (e.g., explosives, guns, knives, and the like).
- the CT scanner 16 may include a moving source and moving detector.
- the x-ray scanner 15 may include a stationary source and stationary detector that operate as a line scanner.
- the CT and x-ray scanners 15 and 16 generate scan data representative of the scanned baggage.
- the CT scanner 16 may be used to scan checked baggage and/or carry-on baggage.
- the x-ray scanner 15 may be used to scan checked baggage and/or carry-on baggage.
- the scan data may represent a simple projection view from a fixed, known scan angle (such as with a stationary source and detector).
- the scan data may represent a series of projection views from multiple scan angles about a z-axis (such as with a rotating source and detector).
- the scan data may be used to reconstruct a volumetric data set, from which rendered views are produced.
- the acquired raw scan data, volumetric data set and/or rendered views are stored in the database 12 via a high-speed connection, such as the LAN 22 .
- the projection view may also be stored in the database 12 .
- Passenger information from the passenger information database 20 is linked or indexed to the stored scan data to associate a particular passenger with the scan data of the passenger's luggage and belongings.
- the database 12 is connected to a network 24 .
- the network 24 may represent the internet, a private network, a high-speed network, an intranet, a local area network (LAN), a wide area network (WAN), a peer-to-peer network, a client/server network, a metropolitan area network (MAN), and the like to provide access to the database 12 .
- a customs official 26 may be located, for example, at an airport, a seaport, a border entry post, a rail station 34 , and the like.
- the customs official 26 accesses the database 12 via a workstation 28 and the network 24 to inspect the baggage and cargo aboard an airplane.
- There may be multiple workstations 28 for use by customs officials 26 , located in multiple countries.
- the workstations 28 have network interfaces 29 to simultaneously access the database 12 via the network 24 .
- the workstations 28 acquire scan data from the database 12 , produce image views therefrom, and perform electronic unpacking while the plane, boat, train, or the like is inbound to a destination. Typically, during electronic unpacking, one piece of baggage is divided into hundreds of slices or images.
- Electronic unpacking utilizes a stored scan data set to electronically unpack the same identical bag by performing a slice-by-slice processing.
- the electronic unpacking is superior to manual unpacking because the electronic unpacking can identify whether various objects within the bag are innocuous or not, based on the Hounsfield Unit (HU).
- electronic unpacking can determine the exact volume of all objects, both threatening and innocuous, as well as contraband objects, within the packed bag.
- the unpacking process may be used to visualize organic material (e.g., typical bombs, explosives or biological weapons) or metals (e.g., guns, knives, and the like) for detection of certain threats while unpacking the bag.
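- The volume determination mentioned above can be sketched as counting voxels whose Hounsfield values fall in an object's range and multiplying by the per-voxel volume. The HU window and voxel size below are illustrative assumptions, not values from the patent.

```python
# Sketch: estimate an object's volume from a 3-D data set by counting
# voxels in a Hounsfield range and scaling by the per-voxel volume (mm^3).

def object_volume(volume, hu_min, hu_max, voxel_mm3):
    """Volume (mm^3) of all voxels with hu_min <= HU <= hu_max."""
    count = sum(
        1
        for plane in volume
        for row in plane
        for hu in row
        if hu_min <= hu <= hu_max
    )
    return count * voxel_mm3

# 2x2x2 toy volume; suppose organic material sits near 0 HU (assumption)
vol = [[[0, 1000], [-5, 2]], [[3, -998], [1, 900]]]
print(object_volume(vol, -100, 100, voxel_mm3=1.5))  # 5 voxels * 1.5 mm^3
```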
- a customs official 26 when examining projection or rendered views, on the workstation 28 of objects within the baggage, may contact remote screening experts 30 to determine if an object is a threat.
- the customs official 26 and one or more remote experts 30 with access to the network 24 are able to examine the projection and rendered views simultaneously and discuss whether an object is a threat or not a threat.
- the remote screening experts 30 may utilize a variety of modes to view the rendered views, for example, a laptop 31 , a desktop 32 , a workstation 33 , a personal digital assistant/cell phone 34 , and the like.
- FIG. 2 illustrates a process 100 that allows a customs official 26 to perform an inspection of the baggage/cargo while in transit in accordance with an embodiment of the invention.
- Process 100 illustrates a typical passenger having checked luggage and carry-on belongings when boarding a plane to a particular destination.
- the process 100 may be implemented for any mode of transportation in addition to an airplane, for instance, a ship, a train, a bus, and the like.
- the process 100 may be used for any type of cargo to be transported/shipped to another location by air, freight, or ship, and the like.
- the process 100 begins when a passenger arrives with baggage at a point of departure (e.g., an airport terminal).
- the passenger checks-in with the carrier (e.g., airline, ship, or train), receives his/her ticket confirmation and proceeds to a security area for screening.
- the checked luggage is placed on a conveyor belt and transferred to a secure area to be scanned.
- a scanning device 16 (e.g., a CT scanner, a cine computed tomography scanner, a helical CT scanner, a four-dimensional (4D) cine computed tomography scanner, an electron beam scanner, an x-ray scanner, a dual-energy x-ray scanner, a dual-energy CT scanner, and the like) scans the checked baggage or cargo (e.g., luggage, suitcases, backpacks, boxes, crates, briefcases, and the like).
- Each scanning device 16 includes a scanner source and detector that are capable of obtaining a volumetric (or a cross-sectional) scan of each item of interest, a controller module to control operation of the scanner device 16 , a user interface to afford operator control, and a monitor to display images obtained by the scanner 16 .
- the source and detector may rotate about the baggage as the baggage is conveyed along a belt (e.g., to perform a helical scan). After the checked baggage is scanned, the baggage is loaded onto the airplane (following the flow at 109 ).
- the raw scan data (e.g., the 3D volumetric data set for each piece of baggage) generated by the scanning device 16 is stored in real-time to a database 12 .
- the term “real-time” as used throughout this document shall include the time period while the object being scanned is still within the scanner device 16 , and shall also include a period of time immediately after the object exits the scanning device 16 .
- “real-time” would include the time from when a bag is checked, the time in which the bag is transported to the flight, the time in flight, and the time in which the bag is transported from the flight to the bag retrieval area at the destination airport.
- the scan data is downloaded from the scanning device 16 to be stored in the database 12 in one of several image formats, for example, DICONDE, TIFF, JPEG, PDF, and the like. Each image file is assigned a header that identifies which scanning device 16 produced the image, the time of the scan, the passenger ID, and other data obtained at the point of scan.
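- A header of the kind described above might be assembled and serialized as follows. The field names, IDs, and JSON serialization are illustrative assumptions; the patent only specifies that the header identifies the scanning device, the scan time, the passenger ID, and other point-of-scan data.

```python
import json

# Sketch: attach an identifying header to each stored image file.

def make_header(scanner_id, passenger_id, scan_time, extra=None):
    header = {
        "scanner_id": scanner_id,      # which scanning device 16 produced the image
        "scan_time": scan_time,        # e.g., epoch seconds at the point of scan
        "passenger_id": passenger_id,  # links to the passenger information database 20
        "format": "DICONDE",           # one of the image formats named in the text
    }
    if extra:
        header.update(extra)           # other data obtained at the point of scan
    return header

header = make_header("CT-16", "PAX-0042", scan_time=1200000000,
                     extra={"flight": "XY123"})
record = json.dumps(header)  # serialized alongside the image payload
print(record)
```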
- the image files may be stored for forty-eight (48) hours or more.
- the scanning device 16 may produce rendered views that are pre-sorted and stored as a sequence of images in the database 12 .
- the scan data may also be combined in data sets that are compressed and encrypted prior to storage in the database 12 . Compressed and encrypted data sets are conveyed over a high-speed connection 24 with standard internet transport protocols to a requesting terminal/server or workstation 28 .
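- The compress-then-encrypt step can be sketched with standard-library compression. The encryption step is only indicated by a placeholder comment here, since the patent does not name a cipher; applying, e.g., AES to the compressed bytes would be one conventional choice.

```python
import zlib

# Sketch: compress a scan-data set before storage and network transport.
# Sparse CT data (mostly air/background voxels) compresses well.

raw = bytes([0]) * 4096 + b"object voxels" + bytes([0]) * 4096  # illustrative payload
compressed = zlib.compress(raw, level=9)
# ...encrypt `compressed` here before conveying it over the connection 24...
restored = zlib.decompress(compressed)

assert restored == raw
print(len(raw), "->", len(compressed))
```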
- the database 12 is connected to a plurality of terminal/servers 14 , processors 32 or workstations 28 over a high-speed connection 24 (e.g., electronic network, a private network, a LAN, internet and the like) to display a rendered view.
- the other workstations 28 may be co-located in the same terminal building, different buildings in the same general geographic area, as well as multiple locations within a country or multiple locations outside the country. Therefore, multiple terminal/servers or workstations 28 located at multiple locations may simultaneously download raw scan data stored in the database 12 and perform electronic unpacking and threat detection in accordance with various techniques described thereafter.
- the passenger is in the security area, and the passenger's carry-on items (e.g., purse, wallet, coat, jacket, shoes, backpacks, baby strollers, briefcases, laptops, personal digital assistants, cell phones, and the like) that are being carried onto the plane (or a train, a bus, a ship, and the like) are placed onto a conveyor belt within a container for scanning by the x-ray scanner 15 prior to passenger boarding.
- if the passenger has no carry-on belongings, he/she may proceed to board the plane.
- the carry-on items are scanned by the x-ray scanner 15 to obtain a 2-D projection data set and/or a volumetric 3-D data set (e.g., scan data) representative of the baggage.
- the scan data of the carry-on baggage are stored in the database 12 .
- the scanner 15 or 16 is connected to a local terminal/server or workstation 14 having a display that shows projection or rendered images to a local screener to examine for any threats. If a threat is detected, the items may be confiscated and the passenger may be prevented from boarding. However, if the scan of the carry-on baggage does not identify any threats, the baggage is returned to the passenger and process 100 continues by following flow 115 to 118 .
- customs officials 26 located in other countries are able to access the database 12 via a high-speed connection 24 to download scan data to perform a pre-arrival inspection.
- the high-speed connection 24 is a private communications link.
- One aspect of the pre-arrival inspection is to perform electronic unpacking of the baggage/cargo that is in transit by accessing the database 12 to download the raw data and then analyze the raw scan data using a terminal/server or workstation 28 that utilizes electronic unpacking to generate rendered views.
- the customs official has access to all relevant passenger information, as well as the baggage content (both checked and carry-on).
- an electronic unpacking process (described in FIG. 3 ) is performed to generate a three-dimensional (3D) data set of the packed bag or cargo utilizing, for example, the raw scan data of the baggage/cargo stored in the database 12 .
- Electronic unpacking utilizes the stored CT raw data set to electronically unpack the same identical bag by performing a slice-by-slice processing.
- the 3D computed tomography data may be interpolated to generate isotropic volume data.
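- The interpolation to isotropic volume data can be sketched as linear resampling along z so the slice spacing matches the in-plane pixel spacing. The spacings and 1x1 slices below are illustrative assumptions.

```python
# Sketch: linearly interpolate between axial slices to produce isotropic voxels.

def interp_slices(s0, s1, t):
    """Blend two same-shaped 2-D slices: t=0 gives s0, t=1 gives s1."""
    return [
        [a * (1 - t) + b * t for a, b in zip(r0, r1)]
        for r0, r1 in zip(s0, s1)
    ]

def resample_z(slices, src_dz, dst_dz):
    """Resample a slice stack from spacing src_dz to dst_dz (linear in z)."""
    depth = (len(slices) - 1) * src_dz
    out, z = [], 0.0
    while z <= depth + 1e-9:
        i = min(int(z / src_dz), len(slices) - 2)  # lower bracketing slice
        t = z / src_dz - i                          # fractional position
        out.append(interp_slices(slices[i], slices[i + 1], t))
        z += dst_dz
    return out

slices = [[[0.0]], [[10.0]]]          # two 1x1 slices, 2 mm apart
iso = resample_z(slices, src_dz=2.0, dst_dz=1.0)
print([s[0][0] for s in iso])         # [0.0, 5.0, 10.0]
```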
- electronic unpacking offers a number of different visualization techniques/algorithms to visualize the objects within the packed bags.
- the bag may be electronically unpacked in 3D using surface rendering (SR), volume rendering (VR), maximum intensity projection (MIP), or a combination thereof.
- the electronic unpacking is superior to manual unpacking because the electronic unpacking can identify whether various objects within the bag are innocuous or not, based on the Hounsfield Unit (HU).
- electronic unpacking can determine the exact volume of all objects, both threats and innocuous, within the packed bag.
- the unpacking process may be used to visualize organic material (e.g., typical bombs) or metals (e.g., guns, knives, and the like) for detection of certain threats while unpacking the bag.
- a 3D threat detection algorithm may be provided to detect possible threats.
- the electronic unpacking and threat detection enhance both the threat detection accuracy and the image quality of the electronically unpacked bag.
- the bags that are automatically flagged by the threat detection algorithm are sent, along with images of the electronically unpacked bag clearly marked with threats or contrabands, to the database 12 for the customs official's review.
- the rendered images marked to show the threats or contrabands may be sent via a high-speed connection 24 to a local terminal/server or workstation 28 to be displayed to the customs official 26 .
- the terminal/server 28 includes a user interface and a display (not shown) that shows multiple views of an object.
- the display may contain multiple windows that show at least one of a three-dimensional (3D) rendering, a two-dimensional (2D) view, a cut plane, and a magnified view.
- the workstation 28 has the ability to display various angles, perspectives, rotations, magnifications of an object.
- the user interface may allow a customs official 26 to utilize a drawing function to trace, to sketch, or to outline around the area of interest.
- the customs official 26 also has the capability to toggle on and toggle off various portions of the display. If a display portion is not shown, the remaining portion may be re-sized and/or rotated.
- the object displayed in any of the windows can be rotated about at least two axes, typically a vertical axis and one or both horizontal axes.
- the user interface allows the customs official 26 to measure at the workstation 28 a plurality of distances and save each distance in memory.
- the distances may include a length, a diameter, a radius, and the like. The distances can be utilized to determine a volume of an object.
- user interface provides a variety of markers for the user to identify potential areas of interest, cut-out specific sections or cut-out specific portions of an object, and to identify threats and contraband.
- the electronic unpacking process seamlessly integrates with both the currently deployed and future generation computed tomography (CT) scanners 16 as well as current and future generation explosives detection systems (EDS), while allowing a desired allocation of expert screening 30 capabilities and remote monitoring of the inspection process itself.
- the electronic unpacking process integrates the EDS at all installations (e.g., airports, seaports, buses, trains, and the like) and permits screeners 14 , 28 , and 30 (e.g., local screeners, experts, customs officials and the like) to view the images via a secure network.
- the electronic unpacking process also integrates other orthogonal sensors 18 (e.g., ETD) and the passenger information database 20 .
- the electronic unpacking process may provide an expert-on-demand (EoD) service, through which the customs official 26 has instant access to remote screening experts 30 located anywhere in the world. Therefore, upon review of the 3D electronically unpacked bag images, the customs official 26 has the option to either accept the bag or, if there is any question as to whether the rendered image depicts threats or contraband, request consultation with one or more remote screening experts 30 using the EoD service.
- the electronic unpacking process allows remote experts 30 to be off-site anywhere in the world.
- a remote expert 30 thus, may be off-site with a laptop computer 31 , at home with a PC 32 or on the road with a PDA 34 .
- the electronic unpacking process supports transmitting text, voice, video, white board, and the like.
- Various passenger information data such as passenger itinerary, travel history, credit information, passenger profile, passport information, passenger photograph, family history, age, physical characteristics, job information and the like are also available for review to assist in the decision whether the rendered image depicts a possible threat.
- Communication between the customs official 26 and the remote expert 30 is similar to public domain messenger services such as MSN®, AOL® and Yahoo® messengers.
- the electronic unpacking process is specially developed for security applications with careful consideration of the airport inspection process and EDS requirements. Therefore, users of public messenger services are unable to establish a link with expert screeners 30 .
- a customs official 26 upon scrolling through slices of the 3D data set would initially examine separate CT slices and decide whether a threat is present or not. At 126 , any contraband the customs official 26 detected would be marked as such and stored in the database 12 .
- the passenger information database 20 is accessed to determine the owner of the baggage. The owner of the baggage and all the baggage are then placed on a list for a manual inspection when the plane arrives.
- the scan data associated with the baggage and cargo, as well as, any identified threats/contraband are accessed from the database 12 .
- the plane arrives at the destination and the passengers exit the plane with their carry-on items; the checked baggage is removed from the airplane and placed on a conveyor belt to be transferred into the airport terminal.
- the passenger enters a customs inspection area with their carry-on bags and checked luggage. If a passenger has been selected for inspection, the process continues to 134 . However, if a passenger has not been selected for inspection because no illegal substances and no threats have been identified during electronic unpacking, the passenger may take the baggage and exit the customs area.
- a customs official 26 has, on a display screen of the workstation 28 , the rendered views of all the checked luggage belonging to a passenger that has been identified to contain a threat or illegal contraband.
- a manual inspection of the baggage is performed, taking into account the location of the illegal contraband or threat inside the baggage based on the rendered views.
- the illegal contraband may be confiscated and the passenger may be detained per the laws of the local jurisdiction.
- the process terminates.
- FIG. 3 illustrates a flow diagram 300 depicting the process of electronically unpacking a bag.
- a CT scanner scans a piece of baggage for a scannable characteristic to acquire scan data representative of a content of the piece of baggage, wherein the scannable characteristic is a measured attenuation.
- the electronic unpacking process and system may be implemented in accordance with the techniques described in the present inventor's co-pending patent application titled “Method and Systems for Electronic Unpacking of Baggage and Cargo,” Ser. No. 11/702,794, filed on Feb. 5, 2007, attorney docket number 10000-0002US1, the complete subject matter of which is incorporated by reference in its entirety herein.
- the scan data is stored in the database 12 .
- the scan data provides axial slices (or z-slices) with isotropic pixels.
- a volumetric data set is generated from the scan data, where the volumetric data set includes voxel values that are in Hounsfield units, for the scannable characteristic throughout a volume of interest in the piece of baggage.
- a portion of the volumetric data set is segmented based on the voxel values to identify an object and provide a visual marker outlining the object.
- the voxel values are segmented into categories. Each category is selected by the Hounsfield unit value of the voxel, the categories being an innocuous material, an organic material, and a metallic material.
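- The category selection above can be sketched as binning each voxel's Hounsfield value. The threshold values below are illustrative assumptions only; real systems calibrate such cutoffs per scanner.

```python
# Sketch: bin a voxel's Hounsfield value into the three claimed categories.

ORGANIC_MIN = -100   # assumed lower bound for soft/organic densities
ORGANIC_MAX = 300    # assumed upper bound for soft/organic densities
METALLIC_MIN = 2000  # dense metals sit far above tissue densities (assumed)

def categorize(hu):
    """Assumed HU bins: metals high, organic materials in a soft-density
    band, and everything else treated as innocuous."""
    if hu >= METALLIC_MIN:
        return "metallic"
    if ORGANIC_MIN <= hu <= ORGANIC_MAX:
        return "organic"
    return "innocuous"

print([categorize(h) for h in (-500, 0, 1000, 2500)])
```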
- a visualization technique is selected.
- a variety of visualization techniques may be utilized to render the isotropic volume data in 3-D.
- the visualization technique may be selected automatically depending on the size of the bag and the density distribution of objects inside the bag.
- the electronic unpacking renders the scanned bag data using surface rendering (SR) 303 , volume rendering (VR) 304 , or maximum intensity projection (MIP) 305 .
- minimum intensity projection (MinIP), multi-planar reformatting (MPR), or radiographic projection (RP) may also be utilized in place of MIP to visualize the 3-D results.
- the selected visualization technique produces a rendered view of the content of the piece of baggage based on voxel values within a selected range from the volumetric data set.
- the rendered view is produced from voxel values that lie within a selectable range of thresholds, which the customs official 26 can adjust interactively.
- SR 303 is a visualization that is utilized to unpack a bag to show the exterior surface of objects within the packed bag. Before the actual visualization step occurs, SR 303 requires the volume data to be preprocessed (e.g., a surface extraction is performed to determine the exterior surface of an object). In general, surface boundary voxels are determined using a threshold value from the isotropic volume data. Then a marching cubes algorithm or a marching voxel algorithm is applied to the surface boundary voxels, which provides the surface data of the object.
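As a rough, hypothetical sketch of the thresholding step above (the patent does not give an implementation), surface boundary voxels can be flagged as in-object voxels that touch at least one out-of-object neighbor under 6-connectivity; a marching cubes pass would then triangulate these into surface data:

```python
import numpy as np

def surface_boundary_voxels(volume, threshold):
    """Flag voxels on the exterior surface of an object: voxels at or
    above `threshold` with at least one below-threshold neighbor
    (6-connectivity). Marching cubes would then triangulate these."""
    inside = volume >= threshold
    boundary = np.zeros_like(inside)
    for axis in range(3):
        for shift in (1, -1):
            neighbor = np.roll(inside, shift, axis=axis)
            boundary |= inside & ~neighbor
    return boundary

# Hypothetical example: a 3x3x3 solid cube inside a 5x5x5 volume.
vol = np.zeros((5, 5, 5))
vol[1:4, 1:4, 1:4] = 100.0
surf = surface_boundary_voxels(vol, threshold=50.0)
print(int(surf.sum()))  # 26: all cube voxels except the hidden center
```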
- the object is rendered in 3-D using a light source and a reflection resulting from the light source.
- There are three types of reflection (diffuse, specular, and ambient), and an object is represented by the sum of the diffuse, specular, and ambient reflections.
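Summing the three reflection terms is the classic Phong shading model. Below is a minimal sketch for a single surface point; the coefficient values and shininess exponent are arbitrary illustrative choices, not taken from the patent:

```python
import numpy as np

def phong_intensity(normal, light_dir, view_dir,
                    ka=0.1, kd=0.6, ks=0.3, shininess=16):
    """Phong model: total intensity is the sum of ambient (ka), diffuse
    (kd), and specular (ks) terms. Coefficients here are arbitrary."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    diffuse = kd * max(float(np.dot(n, l)), 0.0)
    r = 2.0 * np.dot(n, l) * n - l        # mirror reflection of the light
    specular = ks * max(float(np.dot(r, v)), 0.0) ** shininess
    return ka + diffuse + specular

# Light along the normal, viewed head-on: every term is at its maximum.
intensity = phong_intensity(np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 0.0, 1.0]))
print(round(intensity, 2))  # 1.0 = 0.1 + 0.6 + 0.3
```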
- SR 303 handles only surface data of the object after surface extraction, so the surface rendering speed is higher than the other visualization techniques. SR 303 is very good for various texture effects, but SR 303 needs preprocessing of volume data such as surface extraction. Therefore, SR 303 is not suitable for thin and detailed objects as well as that object that have a transparent effect.
- Volume rendering (VR) 304 is a visualization technique that uses volume data directly, without preprocessing such as the surface extraction required by SR 303 .
- a characteristic aspect of VR 304 is that opacity and color are determined from the voxel intensities and threshold values. The opacity can have a value between zero and unity, so the opacities of the voxels allow multiple objects to be rendered simultaneously with a transparent effect.
- surface rendering 303 can be regarded as a special case of volume rendering 304 with an opacity equal to one.
- the colors of voxels are used to distinguish the kinds of objects to be rendered simultaneously. Using opacities and colors of voxels, the objects are rendered in 3-D by a composite ray-tracing algorithm.
- the composite ray-tracing algorithm used in VR 304 is based on the physics of light transport, neglecting scattering and frequency effects.
- VR 304 is known as a method for visualizing thin and detailed objects and suitable for good transparent effects. But VR 304 uses whole volume data, so the volume rendering speed is relatively low due to the expensive computation.
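A minimal sketch of the composite ray-tracing idea for a single ray, with scattering neglected as the text states; the sample colors, opacities, and early-termination cutoff are hypothetical illustrative values:

```python
def composite_ray(colors, opacities):
    """Front-to-back compositing along one ray, neglecting scattering:
    each sample contributes its color weighted by its own opacity and by
    the transparency accumulated in front of it."""
    accumulated, transparency = 0.0, 1.0
    for c, a in zip(colors, opacities):
        accumulated += transparency * a * c
        transparency *= 1.0 - a
        if transparency < 1e-3:     # early termination: ray is opaque
            break
    return accumulated

# Hypothetical ray: two semi-transparent samples, then an opaque one.
result = composite_ray([1.0, 0.5, 0.2], [0.4, 0.5, 1.0])
print(round(result, 2))  # 0.61
```

With all opacities set to one, only the first sample hit contributes, which matches the observation above that surface rendering behaves like volume rendering with opacity equal to one.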
- Maximum Intensity Projection (MIP) 305 is a visualization technique that is realized by a simple ray-tracing algorithm.
- MIP 305 uses the intensity volume data directly without preprocessing of any volume data. Each ray traverses the voxels, and only the maximum voxel value is retained on the projection plane perpendicular to the ray. As with VR 304 , resampling of voxel intensities at new voxel locations according to the viewing direction is also needed before ray-tracing can be performed.
- MIP 305 is a good visualization method, especially for vessel structures.
- MIP 305 discards the information of depth while transforming 3-D data to 2-D data that results in an ambiguity of geometry of an object in the MIP 305 images. But, this ambiguity can be solved, for example, by showing MIP 305 images at several different angles in a short sequential movie.
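For an axis-aligned viewing direction, MIP reduces to a per-ray maximum; a minimal numpy sketch with made-up voxel values follows. For arbitrary view angles the volume would first be resampled along the viewing direction, as noted above:

```python
import numpy as np

# Volume indexed (z, y, x); made-up voxel values for illustration.
volume = np.zeros((4, 3, 3))
volume[1, 1, 1] = 500.0   # a bright voxel (e.g., dense/metallic)
volume[3, 0, 2] = 120.0

# One ray per (y, x) position: keep only the maximum value along z.
mip = volume.max(axis=0)
print(mip[1, 1], mip[0, 2])  # 500.0 120.0
```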
- the unpacking process is repeated with a new set of rendering parameters.
- the new rendering parameters may include, for example, the opacity and coloring scheme for volume rendering 304 and lighting conditions for surface rendering 303 , a new rendering region, orientation of the viewpoint, and the like.
- the rendered image as displayed to the customs official 26 may simultaneously co-display at least one of a surface and volume rendered view of the content of the piece of baggage, and an enlarged image of a region of interest from the rendered view.
- the customs official 26 may zoom on a region of interest within the rendered view or may rotate the rendered view to display the content of the piece of baggage from a new viewpoint.
- the customs official 26 decides whether one or more threat objects exist within the packed bag.
- an automatic threat detection analysis of at least a portion of the volumetric data set based on the scannable characteristic is performed.
- the surface rendering 303 and volume rendering 304 process visit all the voxels within the 3-D dataset, where each voxel is classified into one of several categories, such as innocuous, organic, steel, and the like. The voxel is categorized based on the Hounsfield unit value of the voxel.
- Low Hounsfield unit values correspond to voxels for air or water and are classified as innocuous; medium Hounsfield unit values correspond to voxels classified as organic material (e.g., shampoo or explosives); and high Hounsfield unit values correspond to voxels classified as aluminum or steel (e.g., for guns or knives).
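A sketch of this three-way classification; the HU cut-off values below are illustrative guesses, since the patent does not specify exact thresholds:

```python
import numpy as np

# Illustrative HU cut-offs only; the patent does not state exact values.
BOUNDS = [300.0, 2000.0]                 # innocuous | organic | metallic
LABELS = np.array(["innocuous", "organic", "metallic"])

def classify_voxels(hu_values):
    """Map each voxel's Hounsfield-unit value to one of three categories."""
    return LABELS[np.digitize(hu_values, BOUNDS)]

# Air-like, shampoo-like, and steel-like voxel values, respectively.
categories = classify_voxels(np.array([-1000.0, 900.0, 7000.0]))
print(categories)
```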
- the volume data is initially segmented by determining the edges and borders of an object, connecting together voxels that have similar Hounsfield unit values. For example, the voxels are connected using a 3D connectivity algorithm known in the art, such as the marching-cubes algorithm or a 3D region-growing algorithm. A surface is then provided by averaging the connected voxels and applying a known smoothing algorithm.
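The connected-voxel grouping can be illustrated with an off-the-shelf 3-D connected-component labeler such as `scipy.ndimage.label` (one possible stand-in for the region-growing step; not the patent's own implementation):

```python
import numpy as np
from scipy.ndimage import label

# A binarized volume with two disconnected 2x2x2 objects.
volume = np.zeros((6, 6, 6), dtype=int)
volume[1:3, 1:3, 1:3] = 1
volume[4:6, 4:6, 4:6] = 1

# 6-connectivity by default: voxels sharing a face join the same object.
labeled, num_objects = label(volume)
print(num_objects)  # 2
```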
- the volume rendered 304 images are compared against the segmented regions for consistency, and the initial segmentation is modified accordingly.
- the rendered views are generated using a portion of the 3D data to render a particular object (e.g., the threat) or objects within the packed bag, and to discard obstructing structures to clearly render the object of interest (e.g., the threat).
- the detected threat objects are automatically indicated with ellipsoids on the 3D rendered images for the customs official 26 .
- the threats (e.g., explosive simulants)
- the rendered views of the threats may be shared across the network 24 .
- the decision is made whether to accept 308 or reject 309 the bag in question.
- the inspection completes and the customs official 26 is presented with the next bag.
- FIG. 4 illustrates an exemplary process for electronic unpacking 200 of a piece of luggage.
- the unpacking begins with a surface rendering of the entire bag 202 as shown in FIG. 4( a ).
- the initial surface rendering with a portion of the bag 202 peeled (i.e., unpacked) shows the radio 204 packed within the bag 202 as shown in FIG. 4( b ).
- the surface rendering of the radio 204 is shown in FIG. 4( c ).
- the CT raw scan data is volume rendered with color transparencies as shown in FIGS. 4 ( d ), 4 ( e ), and 4 ( f ).
- FIG. 5 illustrates a high resolution 3-D view of the image of the radio 204 (shown in FIG. 4( e )).
- the 3-D views may be in color and the detected explosives may be displayed in a shade of orange with an ellipsoid surrounding the explosives.
- the customs official 26 is able to rotate, zoom and localize these 3-D views as necessary, with or without the assistance of a remote expert 30 in resolving the suspect bag.
- the customs official 26 may also utilize some measurement tools (e.g., such as distance/volume) to determine whether an object is a threat.
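Such distance/volume measurement tools can be sketched in a few lines, given a segmented voxel mask and a known voxel size; the isotropic 1 mm voxel size below is an assumption for illustration:

```python
import numpy as np

VOXEL_MM = (1.0, 1.0, 1.0)   # assumed isotropic 1 mm voxels

def object_volume_cm3(mask, voxel_mm=VOXEL_MM):
    """Volume of a segmented object: voxel count times one voxel's volume."""
    voxel_mm3 = voxel_mm[0] * voxel_mm[1] * voxel_mm[2]
    return float(mask.sum()) * voxel_mm3 / 1000.0     # mm^3 -> cm^3

def distance_mm(p, q, voxel_mm=VOXEL_MM):
    """Euclidean distance between two voxel coordinates, in millimetres."""
    delta = (np.array(p) - np.array(q)) * np.array(voxel_mm)
    return float(np.linalg.norm(delta))

mask = np.ones((10, 10, 10), dtype=bool)    # a 10x10x10 mm cube
print(object_volume_cm3(mask))              # 1.0
print(distance_mm((0, 0, 0), (3, 4, 0)))    # 5.0
```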
- the scanners 16 are described in connection with CT and DI scanners and the raw data sets are described in connection with attenuation measurement data.
- the scanners 16 may include a cine computed tomography scanner, a helical CT scanner, a dual-energy x-ray scanner, a dual-energy CT scanner, and a four-dimensional (4-D) cine computed tomography scanner.
- the scanner 16 may represent an electron beam scanner.
- the scanner 16 may transmit and receive non-x-ray forms of energy, such as electromagnetic waves, microwaves, ultraviolet waves, ultrasound waves, radio frequency waves and the like.
- the raw data set is representative of attenuation measurements taken at various detector positions and projection angles, while the object is stationary within the scanner 16 or while the object is continuously moving through the scanner 16 (e.g., helical or spiral scanning).
- the raw data set may represent non-attenuation characteristics of the object.
- the raw data may represent an energy response or signature associated with the object and/or the content of the object, wherein different types of objects may exhibit unique energy responses or signatures.
- explosives, biological agents, and other potentially threatening medium may exhibit unique electromagnetic responses when exposed to certain fields, waves, pulse sequences and the like.
- FIG. 6 illustrates a block diagram of exemplary manners in which embodiments of the present invention may be stored, distributed and installed on computer readable medium.
- the “application” represents one or more of the methods and process operations discussed above.
- the application may represent the process carried out in connection with FIGS. 2-3 as discussed above.
- the application is initially generated and stored as source code 1001 on a source computer readable medium 1002 .
- the source code 1001 is then conveyed over path 1004 and processed by a compiler 1006 to produce object code 1010 .
- the object code 1010 is conveyed over path 1008 and saved as one or more application masters on a master computer readable medium 1011 .
- the object code 1010 is then copied numerous times, as denoted by path 1012 , to produce production application copies 1013 that are saved on separate production computer readable medium 1014 .
- the production computer readable medium 1014 are then conveyed, as denoted by path 1016 , to various systems, devices, terminals and the like.
- a user terminal 1020 , a device 1021 and a system 1022 are shown as examples of hardware components, on which the production computer readable medium 1014 are installed as applications (as denoted by 1030 - 1032 ).
- the source code may be written as scripts, or in any high-level or low-level language.
- Examples of the source, master, and production computer readable medium 1002 , 1011 and 1014 include, but are not limited to, CDROM, RAM, ROM, Flash memory, RAID drives, memory on a computer system and the like.
- Examples of the paths 1004 , 1008 , 1012 , and 1016 include, but are not limited to, network paths, the internet, Bluetooth, GSM, infrared, wireless LANs, HIPERLAN, 3G, satellite, and the like.
- the paths 1004 , 1008 , 1012 , and 1016 may also represent public or private carrier services that transport one or more physical copies of the source, master, or production computer readable medium 1002 , 1011 or 1014 between two geographic locations.
- the paths 1004 , 1008 , 1012 and 1016 may represent threads carried out by one or more processors in parallel.
- one computer may hold the source code 1001 , compiler 1006 and object code 1010 . Multiple computers may operate in parallel to produce the production application copies 1013 .
- the paths 1004 , 1008 , 1012 , and 1016 may be intra-state, inter-state, intra-country, inter-country, intra-continental, intercontinental and the like.
- the operations noted in FIG. 6 may be performed in a widely distributed manner world-wide with only a portion thereof being performed in the United States.
- the application source code 1001 may be written in the United States and saved on a source computer readable medium 1002 in the United States, but transported to another country (corresponding to path 1004 ) before compiling, copying and installation.
- the application source code 1001 may be written in or outside of the United States, compiled at a compiler 1006 located in the United States and saved on a master computer readable medium 1011 in the United States, but the object code 1010 transported to another country (corresponding to path 1012 ) before copying and installation.
- the application source code 1001 and object code 1010 may be produced in or outside of the United States, but the production application copies 1013 may be produced in or conveyed to the United States (e.g., as part of a staging operation) before the production application copies 1013 are installed on user terminals 1020 , devices 1021 , and/or systems 1022 located in or outside the United States as applications 1030 - 1032 .
- the phrases “computer readable medium” and “instructions configured to” shall refer to any one or all of i) the source computer readable medium 1002 and source code 1001 , ii) the master computer readable medium and object code 1010 , iii) the production computer readable medium 1014 and production application copies 1013 and/or iv) the applications 1030 - 1032 saved in memory in the terminal 1020 , device 1021 and system 1022 .
Abstract
Description
- Certain embodiments generally relate to methods and systems for providing remote access to baggage scanned images and passenger security information on a global level.
- In recent years there has been increasing interest in the use of imaging devices at airports to improve security. Today thousands of computed tomography (CT) scanners are installed at airports to scan checked baggage. The CT scanners generate data sets that are used to form images representative of each scanned bag. The data sets are currently processed by an automated image recognition system, such as for certain patterns, characteristics and the like. When the image recognition system identifies a potential threat, the images are brought to the attention of a local operator, for example, who is located at the port of origin of an airline flight.
- The CT scanners, better known as explosive detection systems (EDS), are capable of producing fully 3-dimensional (3-D) images. However, the software required to view such 3-D images is complex and generally requires sophisticated local operators with expertise in 3-D rendering software tools. CT scanners are able to generate a 3-D voxel data set that represents the volume of the scanned bag. Conventionally, scanners provide 3-D images by stacking a series of closely spaced cross section images into a 3-D matrix. The 3-D image may then be viewed by a local operator/screener. The local operator at the airport terminal usually steps through two-dimensional (2-D) slices (e.g., planes) of the 3-D matrix to detect and identify potential threats within the packed bag.
- Currently, existing CT based EDS are deployed at airports to detect various threats within packed bags. The suspicious bags are passed onto a human screener who examines individual CT slice images of the scanned bag. The CT slice images of alarmed bags are carefully examined by the human screener who then either accepts or redirects the bag for explosive trace detection (ETD) and/or manual unpacking for a visual inspection. This two step process allows approximately 250 bags per hour to be examined with a false-alarm rate of about 20-30%. Currently, one in five bags must be further inspected by carefully reviewing CT slice images.
- After the baggage is checked in, the baggage is scanned by a CT scanner and axial slices or images are created of the baggage. The local operator/screener views the axial slices or images by scrolling through each image slice one by one to determine if any potential threats are present in an image. Scrolling through dozens of images (or even more for future generation scanners) for each bag is a laborious task, and the local operator/screener must be alert to detect features of any potential threats within an image in order to flag the possible threats. Examination of each axial slice image gives rise to operator/screener fatigue that eventually leads to sub-optimal performance by the operator, causing him/her to miss some threats. After a bag is checked, a CT 3-D data set of a packed bag is obtained and may, for example, include hundreds of axial slice images. Of these images only a few images may show the potential threat. If the local operator misses any one of these few images, the undetected threats could result in disaster either while a plane, train, ship, or cargo vessel is in transit or upon arrival at the destination. Customs officials in a destination country must wait until the arrival of the carrier to search the baggage or cargo, either manually or using a scanner, for contraband or any threatening objects.
- There is a need for an improved baggage scanning system and method to allow a customs agent in a foreign country to be able to screen baggage or cargo while the baggage/cargo is in transit before arriving in the destination country. The customs agent needs to be able to perform an inspection of the baggage/cargo for contraband before the illegal material arrives in the country. Thus, the need exists for the customs agent to be able to electronically unpack the scanned baggage/cargo to inspect views of the inside of the packed baggage/cargo while in transit without having to physically unpack the baggage/cargo upon arrival.
- In accordance with certain embodiments, a method is provided for inspecting baggage to be transported from a location of origin to a destination. The method includes generating scan data representative of a piece of baggage while the baggage is at the location of origin, and storing the scan data in a database. The method further provides rendering a rendered view representative of a content of the baggage where the rendered views are based on the scan data retrieved from the database over a network. The rendered views, whether generated at the point of origin, while en route, or at a destination port, are presented to customs officials at the destination.
- According to another embodiment, a system for inspecting baggage transported from a location of origin to a destination is provided. The system includes a database to store scan data acquired while scanning a piece of baggage while the baggage is at the location of origin, a network configured to transmit the scan data, and a workstation for producing a rendered view of the content of the piece of baggage at a destination.
- In the drawings, which are not necessarily drawn to scale, like numerals describe substantially similar components throughout the several views. Like numerals having different letter suffixes represent different instances of substantially similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
-
FIG. 1 illustrates a block diagram of a customs inspection system for baggage and cargo formed in accordance with an embodiment of the invention. -
FIG. 2 illustrates a flow diagram of a customs official using electronic unpacking in accordance with an embodiment of the invention. -
FIG. 3 illustrates a flow chart for an exemplary sequence of operations carried out by a scanner to electronically unpack a piece of baggage performed in accordance with an embodiment of the invention. -
FIGS. 4A-4I illustrate screen shots for a display at a local screener's terminal containing an exemplary data set of scanned images formed in accordance with an embodiment of the invention. -
FIG. 5 illustrates a screen shot for a display at a local screener's terminal containing a data set of scanned images of a suitcase and a radio containing simulated explosives, shown in FIG. 4 , formed in accordance with an embodiment of the invention. -
FIG. 6 illustrates a block diagram of exemplary manners in which embodiments of the present invention may be stored, distributed and installed on computer readable medium. - In the following detailed description, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the present invention may be practiced. It is to be understood that the embodiments may be combined, or that other embodiments may be utilized and that structural, logical and electrical changes may be made without departing from the scope of the various embodiments of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents.
- In this document, the terms “a” or “an” are used to include one or more than one. In this document, the term “or” is used to refer to a nonexclusive or, unless otherwise indicated. Also as used herein, the phrase “an image” is not intended to exclude embodiments of the present invention in which data representing an image is generated, but a viewable image is not generated. Therefore, as used herein, the term “image” broadly refers to both viewable images and data representing a viewable image. However, many embodiments generate (or are configured to generate) at least one viewable image.
- A customs inspection system is provided to allow for inspection of passenger baggage or cargo shipment while the baggage/cargo is in transit (e.g., by airplane, train, or ship) prior to arrival at a destination. A customs or other official accesses a database containing raw scan data of three-dimensional (3D) views of electronically unpacked baggage/cargo to determine if any illegal substances/objects (e.g., contraband or threatening materials) are being smuggled while the baggage/cargo is en route to a destination port. An electronic unpacking process simulates physical unpacking of the packed bags in order for the customs official to visualize various objects within the bag, where the objects may represent threats.
-
FIG. 1 illustrates a block diagram of a customs inspection system 10 for baggage and cargo formed in accordance with an embodiment of the invention. The system 10 includes a database 12, a workstation 14, an x-ray scanner 15, a CT scanner 16, an orthogonal sensor 18, and a passenger information database 20 that are interconnected by a local area network 22 (e.g., LAN). When a passenger enters, for example, an airport to board an airplane, the passenger's baggage and carry-on belongings are scanned during a check-in process. The CT scanner 16 and x-ray scanner 15 scan the checked baggage and carry-on baggage, respectively. Orthogonal sensors 18 are used to measure different characteristics of objects contained within the baggage to determine the presence of any threats (e.g., explosives, guns, knives, and the like). The CT scanner 16 may include a moving source and moving detector. The x-ray scanner 15 may include a stationary source and stationary detector that operate as a line scanner. The CT scanner 16 may be used to scan checked baggage and/or carry-on baggage. Similarly, the x-ray scanner 15 may be used to scan checked baggage and/or carry-on baggage. The scan data may represent a simple projection view from a fixed, known scan angle (such as with a stationary source and detector). Optionally, the scan data may represent a series of projection views from multiple scan angles about a z-axis (such as with a rotating source and detector). When the scan data represents a series of projection views, the scan data may be used to reconstruct a volumetric data set, from which rendered views are produced. - The acquired raw scan data, volumetric data set and/or rendered views are stored in the
database 12 via a high-speed connection, such as the LAN 22. When the scan data corresponds to one projection view, the projection view may also be stored in the database 12. Passenger information from the passenger information database 20 is linked or indexed to the stored scan data to associate a particular passenger with the scan data of the passenger's luggage and belongings. The database 12 is connected to a network 24. The network 24 may represent the internet, a private network, a high-speed network, an intranet, a local area network (LAN), a wide area network (WAN), a peer-to-peer network, a client/server network, a metropolitan area network (MAN) and the like to provide access to the database 12. - A
customs official 26 may be located, for example, at an airport, a seaport, a border entry post, a rail station 34, and the like. The customs official 26 accesses the database 12 via a workstation 28 and the network 24 to inspect the baggage and cargo aboard an airplane. There may be multiple workstations 28, for use by customs officials 26, located in multiple countries. The workstations 28 have network interfaces 29 to simultaneously access the database 12 via the network 24. The workstations 28 acquire scan data from the database 12, produce image views therefrom, and perform electronic unpacking while the plane, boat, train and the like is inbound to a destination. Typically, during electronic unpacking, one piece of baggage is divided into hundreds of slices or images. Electronic unpacking utilizes a stored scan data set to electronically unpack the identical bag by performing slice-by-slice processing. The electronic unpacking is superior to manual unpacking because the electronic unpacking can identify whether various objects within the bag are innocuous or not, based on the Hounsfield Unit (HU). Moreover, electronic unpacking can determine the exact volume of all objects, both threats and innocuous, as well as contraband objects, within the packed bag. The unpacking process may be used to visualize organic material (e.g., typical bombs, explosives or biological weapons) or metals (e.g., guns, knives, and the like) for detection of certain threats while unpacking the bag. - In addition, a
customs official 26, when examining projection or rendered views of objects within the baggage on the workstation 28, may contact remote screening experts 30 to determine if an object is a threat. The customs official 26 and one or more remote experts 30 with access to the network 24 are able to examine the projection and rendered views simultaneously and discuss whether an object is a threat. The remote screening experts 30 may utilize a variety of devices to view the rendered views, for example, a laptop 31, a desktop 32, a workstation 33, a personal digital assistant/cell phone 34, and the like. -
FIG. 2 illustrates a process 100 that allows a customs official 26 to perform an inspection of the baggage/cargo while in transit in accordance with an embodiment of the invention. Process 100 illustrates a typical passenger having checked luggage and carry-on belongings when boarding a plane to a particular destination. However, the process 100 may be implemented for any mode of transportation in addition to an airplane, for instance, a ship, a train, a bus, and the like. In addition, the process 100 may be used for any type of cargo to be transported/shipped to another location by air, freight, or ship, and the like. At 102, the process 100 begins when a passenger arrives with baggage at a point of departure (e.g., an airport terminal). At 104, the passenger checks in with the carrier (e.g., airline, ship, or train), receives his/her ticket confirmation and proceeds to a security area for screening. - At 106, the checked luggage is placed on a conveyor belt and transferred to a secure area to be scanned. At 108, a scanning device 16 (e.g., a CT scanner, a cine computed tomography scanner, a helical CT scanner, a four-dimensional (4D) cine computed tomography scanner, an electronic beam scanner, an X-ray scanner, a dual-energy x-ray scanner, a dual-energy CT scanner, and the like) scans the checked-baggage (e.g., luggage, suitcases, backpacks, boxes, crates, briefcases, and the like) or cargo to obtain a volumetric data set representative of every voxel within the object of interest. Each
scanning device 16 includes a scanner source and detector that are capable of obtaining a volumetric (or a cross-sectional) scan of each item of interest, a controller module to control operation of the scanner device 16, a user interface to afford operator control, and a monitor to display images obtained by the scanner 16. For example, the source and detector may rotate about the baggage as the baggage is conveyed along a belt (e.g., to perform a helical scan). After the checked-baggage is scanned, the baggage is loaded onto the airplane (following the flow at 109). - At 110, the raw scan data (e.g., the 3D volumetric data set for each piece of baggage) generated by the
scanning device 16 is stored in real-time to a database 12. The term “real-time” as used throughout this document shall include the time period while the object being scanned is still within the scanner device 16, and shall also include a period of time immediately after the object exits the scanning device 16. For example, “real-time” would include the time from when a bag is checked, the time in which the bag is transported to the flight, the time in flight, and the time in which the bag is transported from the flight to the bag retrieval area at the destination airport. - The scan data is downloaded from the
scanning device 16 to be stored in the database 12 in one of several image formats, for example, DICONDE, TIFF, JPEG, PDF, and the like. Each image file is assigned a header that identifies which scanning device 16 produced the image, the time of the scan, the passenger ID, and other data obtained at the point of scan. The image files may be stored for forty-eight (48) hours or more. Optionally, the scanning device 16 may produce rendered views that are pre-sorted and stored as a sequence of images in the database 12. The scan data may also be combined in data sets that are compressed and encrypted prior to storage in the database 12. Compressed and encrypted data sets are conveyed over a high-speed connection 24 with standard internet transport protocols to a requesting terminal/server or workstation 28. - In an embodiment, the
database 12 is connected to a plurality of terminal/servers 14, processors 32 or workstations 28 over a high-speed connection 24 (e.g., an electronic network, a private network, a LAN, the internet and the like) to display a rendered view. The workstations 28 may be co-located in the same terminal building, located in different buildings in the same general geographic area, or located at multiple locations within or outside the country. Therefore, multiple terminal/servers or workstations 28 located at multiple locations may simultaneously download raw scan data stored in the database 12 and perform electronic unpacking and threat detection in accordance with various techniques described hereafter. - At 112, the passenger is in the security area and the passenger's carry-on baggage (e.g., purse, wallet, coat, jacket, shoes, back packs, baby strollers, briefcases, laptops, personal digital assistants, cell phones, and the like) that are being carried onto the plane (or a train, a bus, a ship and the like) are placed onto a conveyor belt within a container for scanning by the
x-ray scanner 15 prior to passenger boarding. At 118, if the passenger has no carry-on belongings, he/she may proceed to board the plane. - At 114, the carry-on items are scanned by the
x-ray scanner 15 to obtain a 2D projection data set or a volumetric 3-D data set (e.g., scan data) representative of the baggage. At 116, the scan data of the carry-on baggage are stored in the database 12. The scanner workstation 14 has a display that shows projection or rendered images to a local screener, who examines them for any threats. If a threat is detected, the items may be confiscated and the passenger may be prevented from boarding. However, if the scan of the carry-on baggage does not identify any threats, the baggage is returned to the passenger and process 100 continues by following flow 115 to 118. - At 120, while the plane is in transit,
customs officials 26 located in other countries are able to access the database 12 via a high-speed connection 24 to download scan data and perform a pre-arrival inspection. The high-speed connection 24 is a private communications link. One aspect of the pre-arrival inspection is to perform electronic unpacking of the baggage/cargo that is in transit by accessing the database 12 to download the raw scan data and then analyzing it at a terminal/server or workstation 28 that utilizes electronic unpacking to generate rendered views. The customs official has access to all relevant passenger information, as well as the baggage content (both checked and carry-on). - At 122, an electronic unpacking process (described in
FIG. 3 ) is performed to generate a three-dimensional (3D) data set of the packed bag or cargo utilizing, for example, the raw scan data of the baggage/cargo stored in the database 12. Typically, one piece of baggage is divided into hundreds of slices or images. Electronic unpacking utilizes the stored CT raw data set to electronically unpack the identical bag by performing slice-by-slice processing. First, the 3D computed tomography data may be interpolated to generate isotropic volume data. Just as there are many ways to unpack real physical bags, electronic unpacking offers a number of different visualization techniques/algorithms to visualize the objects within the packed bag. For instance, the bag may be electronically unpacked in 3D using surface rendering (SR), volume rendering (VR), maximum intensity projection (MIP), or a combination thereof. Electronic unpacking is superior to manual unpacking because it can identify whether various objects within the bag are innocuous, based on the Hounsfield Unit (HU). Moreover, electronic unpacking can determine the exact volume of all objects within the packed bag, both threatening and innocuous. The unpacking process may be used to visualize organic material (e.g., typical bombs) or metals (e.g., guns, knives, and the like) to detect certain threats while unpacking the bag. - Optionally, a 3D threat detection algorithm may be provided to detect possible threats. The electronic unpacking and threat detection enhance both the threat detection accuracy and the image quality of the electronically unpacked bag. The bags that are automatically flagged by the threat detection algorithm are sent, along with images of the electronically unpacked bag clearly marked with threats or contraband, to the
database 12 for the customs official's review. In addition, the rendered images marked to show the threats or contraband may be sent via a high-speed connection 24 to a local terminal/server or workstation 28 to be displayed to the customs official 26. - The terminal/
server 28 includes a user interface and a display (not shown) that shows multiple views of an object. For instance, the display may contain multiple windows that show at least one of a three-dimensional (3D) rendering, a two-dimensional (2D) view, a cut plane, and a magnified view. The workstation 28 has the ability to display various angles, perspectives, rotations, and magnifications of an object. In addition, the user interface may allow a customs official 26 to utilize a drawing function to trace, sketch, or outline around the area of interest. The customs official 26 also has the capability to toggle various portions of the display on and off. If a display portion is not shown, the remaining portion may be re-sized and/or rotated. For example, the object displayed in any of the windows can be rotated about at least two axes, typically a vertical axis and one or both horizontal axes. The user interface allows the customs official 26 to measure a plurality of distances at the workstation 28 and save each distance in memory. The distances may include a length, a diameter, a radius, and the like. The distances can be utilized to determine a volume of an object. Further, the user interface provides a variety of markers for the user to identify potential areas of interest, to cut out specific sections or portions of an object, and to identify threats and contraband. - In accordance with certain embodiments, the electronic unpacking process seamlessly integrates with both currently deployed and future generation computed tomography (CT)
scanners 16 as well as current and future generation explosives detection systems (EDS), while allowing a desired allocation of expert screening 30 capabilities and remote monitoring of the inspection process itself. The electronic unpacking process integrates the EDS at all installations (e.g., airports, seaports, buses, trains, and the like) and permits screeners 30 to access the passenger information database 20. - The electronic unpacking process may provide an expert-on-demand (EoD) service, through which the
customs official 26 has instant access to remote screening experts 30 located anywhere in the world. Therefore, upon review of the 3D electronically unpacked bag images, the customs official 26 has the option to either accept the bag or, if there is any question as to whether the rendered image depicts threats or contraband, request consultation with one or more remote screening experts 30 using the expert-on-demand (EoD) service. The electronic unpacking process allows remote experts 30 to be off-site anywhere in the world. A remote expert 30, thus, may be off-site with a laptop computer 31, at home with a PC 32 or on the road with a PDA 34. The electronic unpacking process supports transmitting text, voice, video, white board, and the like. Various passenger information data, such as passenger itinerary, travel history, credit information, passenger profile, passport information, passenger photograph, family history, age, physical characteristics, job information and the like are also available for review to assist in the decision whether the rendered image depicts a possible threat. Communication between the customs official 26 and the remote expert 30 is similar to public domain messenger services such as MSN®, AOL® and Yahoo® messengers. However, the electronic unpacking process is specially developed for security applications with careful consideration of the airport inspection process and EDS requirements. Therefore, users of public messenger services are unable to establish a link with the expert screeners 30. - A
customs official 26, upon scrolling through slices of the 3D data set, would initially examine separate CT slices and decide whether a threat is present. At 126, any contraband the customs official 26 detects would be marked as such and stored in the database 12. - If the electronic unpacking process identifies any organic material (e.g., typical bombs) or illegal metal objects (e.g., guns, knives, and the like), or detects any contraband while unpacking the bag, the
passenger information database 20 is accessed to determine the owner of the baggage. The owner and all of the owner's baggage are then placed on a list for a manual inspection when the plane arrives. - At 128, while the aircraft is in transit but nearing arrival at the destination, the scan data associated with the baggage and cargo, as well as any identified threats/contraband, are accessed from the
database 12. - At 130, the plane arrives at the destination, the passengers exit the plane with their carry-on items, and the checked baggage is removed from the airplane and placed on a conveyor belt to be transferred inside the airport terminal.
- At 132, the passenger enters a customs inspection area with their carry-on bags and checked luggage. If a passenger has been selected for inspection, the process continues to 134. However, if a passenger has not been selected for inspection because no illegal substances and no threats have been identified during electronic unpacking, the passenger may take the baggage and exit the customs area.
- At 134, a
customs official 26 has, on a display screen of the workstation 28, the rendered views of all the checked luggage belonging to the passenger that has been identified as containing a threat or illegal contraband. A manual inspection of the baggage is performed, taking into account the location of the illegal contraband or threat inside the baggage based on the rendered views. The illegal contraband may be confiscated and the passenger may be detained per the laws of the local jurisdiction. At 136, the process terminates. -
FIG. 3 illustrates a flow diagram 300 depicting the process of electronically unpacking a bag. Prior to unpacking, a CT scanner scans a piece of baggage for a scannable characteristic to acquire scan data representative of the content of the piece of baggage, wherein the scannable characteristic is an attenuation measurement. The electronic unpacking process and system may be implemented in accordance with the techniques described in the present inventor's co-pending patent application titled “Method and Systems for Electronic Unpacking of Baggage and Cargo,” Ser. No. 11/702,794, filed on Feb. 5, 2007, attorney docket number 10000-0002US1, the complete subject matter of which is incorporated by reference in its entirety herein. The scan data is stored in the database 12. The scan data provides axial slices (or z-slices) with isotropic pixels. A volumetric data set is generated from the scan data, where the volumetric data set includes voxel values, in Hounsfield units, for the scannable characteristic throughout a volume of interest in the piece of baggage. A portion of the volumetric data set is segmented based on the voxel values to identify an object and provide a visual marker outlining the object. The voxel values are segmented into categories. The categories are selected by the Hounsfield unit value of the voxel, the categories being an innocuous material, an organic material and a metallic material. - At 302, depending on the type of bag and/or the screener's preference, a visualization technique is selected. A variety of visualization techniques may be utilized to render the isotropic volume data in 3-D. Furthermore, the visualization technique may be selected automatically depending on the size of the bag and the density distribution of objects inside the bag. Depending on which visualization technique is selected by the
customs official 26, the electronic unpacking renders the scanned bag data using surface rendering (SR) 303, volume rendering (VR) 304, or maximum intensity projection (MIP) 305. Alternatively, minimum intensity projection (MinIP), multi-planar reformatting (MPR), or radiographic projection (RP) may be utilized in place of MIP to visualize the 3-D results. The selected visualization technique produces a rendered view of the content of the piece of baggage based on voxel values within a selected range from the volumetric data set. The rendered view is produced from voxel values that lie within a range of thresholds selected by the customs official 26, who has the ability to interactively adjust the selectable range. - Surface Rendering (SR) 303 is a visualization that is utilized to unpack a bag to show the exterior surface of objects within the packed bag. But, before the actual visualization step occurs,
SR 303 requires the volume data to be preprocessed (e.g., a surface extraction is performed to determine the exterior surface of an object). In general, surface boundary voxels are determined using a threshold value from the isotropic volume data. Then a marching cubes algorithm or a marching voxel algorithm is applied to the surface boundary voxels, which provides the surface data of the object. - After surface extraction, the object is rendered in 3-D using a light source and a reflection resulting from the light source. There are three types of reflection (e.g., diffuse, specular, and ambient reflections), and an object is represented by the sum of the diffuse, specular, and ambient reflections.
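As an illustration of the surface-extraction preprocessing described above, the thresholding step that produces surface boundary voxels can be sketched in a few lines of numpy. This is a minimal sketch under assumed array shapes and an assumed threshold value, not the patent's implementation; the subsequent marching cubes step is omitted.

```python
import numpy as np

def surface_boundary_voxels(volume, threshold):
    """Flag voxels at or above `threshold` that touch at least one
    below-threshold 6-neighbor, i.e. candidate surface voxels."""
    inside = volume >= threshold
    boundary = np.zeros_like(inside)
    # Pad so voxels on the array edge count as adjacent to "outside".
    padded = np.pad(inside, 1, constant_values=False)
    core = padded[1:-1, 1:-1, 1:-1]
    for axis in range(3):
        for shift in (-1, 1):
            neighbor = np.roll(padded, shift, axis=axis)[1:-1, 1:-1, 1:-1]
            boundary |= core & ~neighbor
    return boundary

# Toy 5x5x5 volume: a solid 3x3x3 block of dense material (3000 HU) in air (-1000 HU).
vol = np.full((5, 5, 5), -1000.0)
vol[1:4, 1:4, 1:4] = 3000.0
surf = surface_boundary_voxels(vol, threshold=500.0)
```

For the toy block, every dense voxel except the fully enclosed center is flagged as a boundary voxel; the marching cubes or marching voxel algorithm would then be run over these flagged voxels to produce the surface mesh.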
SR 303 handles only surface data of the object after surface extraction, so the surface rendering speed is higher than that of the other visualization techniques. SR 303 is very good for various texture effects, but it needs preprocessing of the volume data, such as surface extraction. Therefore, SR 303 is not suitable for thin and detailed objects, nor for objects that have a transparent effect. - On the other hand, Volume Rendering (VR) 304 is a visualization technique that uses volume data directly, without preprocessing such as the surface extraction required in
SR 303. Characteristic aspects of VR 304 are opacity and color, which are determined from the voxel intensities and threshold values. The opacity can have a value between zero and unity, so the opacities of the voxels render multiple objects simultaneously via a transparency effect. Thus, for example, surface rendering 303 is an extension of volume rendering 304 with an opacity equal to one. The colors of voxels are used to distinguish the kinds of objects to be rendered simultaneously. Using the opacities and colors of voxels, the objects are rendered in 3-D by a composite ray-tracing algorithm. The composite ray-tracing algorithm used in VR 304 is based on the physics of light transport when scattering and frequency effects are neglected. VR 304 is known as a method for visualizing thin and detailed objects and is suitable for good transparency effects. But VR 304 uses the whole volume data, so the volume rendering speed is relatively low due to the expensive computation. - A Maximum Intensity Projection (MIP) 305 is a visualization technique that is realized by a simple ray-tracing algorithm.
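For an axis-aligned viewing direction, that ray-tracing collapses to taking the maximum along one array axis; a minimal numpy sketch (illustrative only, with assumed array shapes and HU values):

```python
import numpy as np

def mip(volume, axis=0):
    """Maximum intensity projection: each ray keeps only the largest
    voxel value encountered along the (axis-aligned) viewing direction."""
    return volume.max(axis=axis)

# Toy 4x4x4 volume of air (-1000 HU) containing one dense voxel (2500 HU).
vol = np.full((4, 4, 4), -1000.0)
vol[2, 1, 3] = 2500.0
proj = mip(vol, axis=0)  # a 4x4 projection image
```

The dense voxel survives into the projection image regardless of its depth along the viewing axis. For oblique viewing directions, the voxel intensities would first be resampled along the ray direction, as noted in the MIP description.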
MIP 305 uses the intensity volume data directly, without preprocessing of any volume data. The ray traverses the voxels and only the maximum voxel value is retained on the projection plane perpendicular to the ray. Similar to VR 304, resampling of voxel intensities at the new voxel locations according to the viewing direction is also needed before ray-tracing can be performed. MIP 305 is a good visualization method, especially for vessel structures. But MIP 305 discards depth information while transforming 3-D data to 2-D data, which results in an ambiguity in the geometry of an object in the MIP 305 images. This ambiguity can be resolved, for example, by showing MIP 305 images at several different angles in a short sequential movie. - At 306, if the rendered image is not sufficiently detailed to determine whether any objects within the bag are threats, in order to accept or reject the bag, the unpacking process is repeated with a new set of rendering parameters. The new rendering parameters may include, for example, the opacity and coloring scheme for
volume rendering 304 and lighting conditions for surface rendering 303, a new rendering region, the orientation of the viewpoint, and the like. For instance, the rendered image as displayed to the customs official 26 may simultaneously co-display at least one of a surface and volume rendered view of the content of the piece of baggage, and an enlarged image of a region of interest from the rendered view. The customs official 26 may zoom in on a region of interest within the rendered view or may rotate the rendered view to display the content of the piece of baggage from a new viewpoint. - At 307, the
customs official 26 decides whether one or more threat objects exist within the packed bag. In an embodiment, an automatic threat detection analysis of at least a portion of the volumetric data set, based on the scannable characteristic, is performed. During electronic unpacking 104, the surface rendering 303 and volume rendering 304 processes visit all the voxels within the 3-D dataset, where each voxel is classified into one of several categories, such as innocuous, organic, steel, and the like. The voxel is categorized based on its Hounsfield unit value. Low Hounsfield unit values correspond to voxels for air or water and are classified as innocuous; medium Hounsfield unit values correspond to voxels classified as organic material (e.g., shampoo or explosives); and high Hounsfield unit values correspond to voxels classified as aluminum or steel (e.g., for guns or knives). Once the classification marking is completed, 3-D views are generated. The volume data is initially segmented by determining the edges and borders of an object by connecting together voxels having similar Hounsfield unit values. For example, the voxels are connected together using a 3D connectivity algorithm as known in the art, such as the marching-cubes algorithm or a 3D region growing algorithm. Furthermore, a surface is provided by taking the average of each of the connected voxels and utilizing a known smoothing algorithm. - Upon completion of the segmentation, the volume rendered 304 images are compared against the segmented regions for consistency, and the initial segmentation is modified accordingly. The rendered views are generated using a portion of the 3D data to render a particular object (e.g., the threat) or objects within the packed bag, and to discard obstructing structures to clearly render the object of interest (e.g., the threat).
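The Hounsfield-unit classification step described above can be sketched as follows. The HU cutoffs here are assumed for illustration and are not values specified in the patent; the connectivity and smoothing steps are omitted.

```python
import numpy as np

# Coarse material categories, per the classification described in the text.
INNOCUOUS, ORGANIC, METAL = 0, 1, 2

def classify_voxels(volume_hu, organic_lo=100.0, metal_lo=1500.0):
    """Assign each voxel a coarse material category by Hounsfield unit:
    low (air/water) -> innocuous, medium -> organic, high -> metal.
    The cutoff values are illustrative assumptions."""
    volume_hu = np.asarray(volume_hu)
    labels = np.full(volume_hu.shape, INNOCUOUS, dtype=np.int8)
    labels[(volume_hu >= organic_lo) & (volume_hu < metal_lo)] = ORGANIC
    labels[volume_hu >= metal_lo] = METAL
    return labels

# Air, water, a shampoo/explosive-like density, and a steel-like density.
vol = np.array([-1000.0, 0.0, 800.0, 3000.0])
labels = classify_voxels(vol)
```

A 3D region growing or connected-component pass over these labels would then group same-category voxels into candidate objects for rendering and marking.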
Once the final segmentation is completed, the detected threat objects are automatically indicated with ellipsoids on the 3D rendered images for the
customs official 26. By combining volume rendering 304 with segmentation, the threats (e.g., explosive simulants) are clearly visible and identified. The rendered views of the threats may be shared across the network 24. - Based on the
customs official's 26 experience, perhaps aided by a remote expert 30 through the Expert-on-Demand service, the decision is made whether to accept 308 or reject 309 the bag in question. At 310, once the final decision is made, the inspection is complete and the customs official 26 is presented with the next bag. -
FIG. 4 illustrates an exemplary process for electronic unpacking 200 of a piece of luggage. The unpacking begins with a surface rendering of the entire bag 202 as shown in FIG. 4( a). The initial surface rendering with a portion of the bag 202 peeled (i.e., unpacked) shows the radio 204 packed within the bag 202 as shown in FIG. 4( b). The surface rendering of the radio 204 is shown in FIG. 4( c). To view a structure within the radio 204, the CT raw scan data is volume rendered with color transparencies as shown in FIGS. 4( d), 4( e), and 4( f). Clearly visible, due to the different transparencies for different objects, are two speakers 206, a pack of batteries 208, and two six-ounce explosive simulants 210 (e.g., soap bars) that are hidden underneath the right speaker. The detected explosives 210 will be clearly marked for the customs official's 26 benefit as shown in FIG. 4( g). The views (e.g., FIGS. 4( d) through 4( f)) can be displayed in a continuous 3-D display that rotates for the customs official 26 to examine. Finally, the customs official 26 may want to zoom in on the possible explosive simulants 210 underneath the right speaker 206 as shown in FIGS. 4( g), 4( h), and 4( i). -
FIG. 5 illustrates a high resolution 3-D view of the image of the radio 204 (shown in FIG. 4( e)). In an exemplary embodiment, the 3-D views (shown in FIG. 4 ) may be in color and the detected explosives may be displayed in a shade of orange with an ellipsoid surrounding the explosives. The customs official 26 is able to rotate, zoom and localize these 3-D views as necessary, with or without the assistance of a remote expert 30, in resolving the suspect bag. In addition, the customs official 26 may also utilize measurement tools (e.g., distance/volume) to determine whether an object is a threat. - In the above examples, the
scanners 16 are described in connection with CT and DI scanners and the raw data sets are described in connection with attenuation measurement data. For instance, the scanners 16 may include a cine computed tomography scanner, a helical CT scanner, a dual-energy x-ray scanner, a dual-energy CT scanner, and a four-dimensional (4-D) cine computed tomography scanner. However, other types of scanners 16 and other types of raw data may alternatively be obtained, processed and displayed without departing from the metes and bounds of the present invention. For example, the scanner 16 may represent an electron beam scanner. Alternatively, the scanner 16 may transmit and receive non-x-ray forms of energy, such as electromagnetic waves, microwaves, ultraviolet waves, ultrasound waves, radio frequency waves and the like. Similarly, in the above described embodiments, the raw data set is representative of attenuation measurements taken at various detector positions and projection angles, while the object is stationary within the scanner 16 or while the object is continuously moving through the scanner 16 (e.g., helical or spiral scanning). Alternatively, when non-x-ray forms of energy are used, the raw data set may represent non-attenuation characteristics of the object. For example, the raw data may represent an energy response or signature associated with the object and/or the content of the object, wherein different types of objects may exhibit unique energy responses or signatures. For example, explosives, biological agents, and other potentially threatening media may exhibit unique electromagnetic responses when exposed to certain fields, waves, pulse sequences and the like. The electromagnetic response of the object and the content of the object are recorded by the scanner 16 as raw scan data stored in the database 12. As a further example, the scanner 16 may be used to obtain fingerprints from the object. The fingerprints would be recorded as scan data in the database 12. FIG.
6 illustrates a block diagram of exemplary manners in which embodiments of the present invention may be stored, distributed and installed on computer readable medium. In FIG. 6 , the “application” represents one or more of the methods and process operations discussed above. For example, the application may represent the process carried out in connection with FIGS. 2-3 as discussed above. - As shown in
FIG. 6 , the application is initially generated and stored as source code 1001 on a source computer readable medium 1002. The source code 1001 is then conveyed over path 1004 and processed by a compiler 1006 to produce object code 1010. The object code 1010 is conveyed over path 1008 and saved as one or more application masters on a master computer readable medium 1011. The object code 1010 is then copied numerous times, as denoted by path 1012, to produce production application copies 1013 that are saved on separate production computer readable media 1014. The production computer readable media 1014 are then conveyed, as denoted by path 1016, to various systems, devices, terminals and the like. In the example of FIG. 6 , a user terminal 1020, a device 1021 and a system 1022 are shown as examples of hardware components on which the production computer readable media 1014 are installed as applications (as denoted by 1030-1032). - The source code may be written as scripts, or in any high-level or low-level language. Examples of the source, master, and production computer readable media 1002, 1011 and 1014 include, but are not limited to, CDROM, RAM, ROM, Flash memory, RAID drives, memory on a computer system and the like. Examples of the
paths 1004, 1008, 1012 and 1016 include, but are not limited to, electronic transfer over networks and physical transport of the computer readable media, and separate computers may hold the source code 1001, compiler 1006 and object code 1010. Multiple computers may operate in parallel to produce the production application copies 1013. - The operations noted in
FIG. 6 may be performed in a widely distributed manner world-wide with only a portion thereof being performed in the United States. For example, the application source code 1001 may be written in the United States and saved on a source computer readable medium 1002 in the United States, but transported to another country (corresponding to path 1004) before compiling, copying and installation. Alternatively, the application source code 1001 may be written in or outside of the United States, compiled at a compiler 1006 located in the United States and saved on a master computer readable medium 1011 in the United States, but the object code 1010 transported to another country (corresponding to path 1012) before copying and installation. Alternatively, the application source code 1001 and object code 1010 may be produced in or outside of the United States, but the production application copies 1013 produced in or conveyed to the United States (e.g., as part of a staging operation) before the production application copies 1013 are installed on user terminals 1020, devices 1021, and/or systems 1022 located in or outside the United States as applications 1030-1032. - As used throughout the specification and claims, the phrases “computer readable medium” and “instructions configured to” shall refer to any one or all of i) the source computer
readable medium 1002 and source code 1001, ii) the master computer readable medium 1011 and object code 1010, iii) the production computer readable medium 1014 and production application copies 1013 and/or iv) the applications 1030-1032 saved in memory in the terminal 1020, device 1021 and system 1022. - It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. While the dimensions, types of materials and coatings described herein are intended to define the parameters of the invention, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
Claims (34)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/888,481 US8320659B2 (en) | 2007-08-01 | 2007-08-01 | Method for customs inspection of baggage and cargo |
PCT/US2008/069028 WO2009017931A2 (en) | 2007-08-01 | 2008-07-02 | A method for customs inspection of baggage and cargo |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/888,481 US8320659B2 (en) | 2007-08-01 | 2007-08-01 | Method for customs inspection of baggage and cargo |
Publications (2)
Publication Number | Publication Date |
---|---|
US20090034790A1 true US20090034790A1 (en) | 2009-02-05 |
US8320659B2 US8320659B2 (en) | 2012-11-27 |
Family
ID=40305171
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/888,481 Active 2030-06-11 US8320659B2 (en) | 2007-08-01 | 2007-08-01 | Method for customs inspection of baggage and cargo |
Country Status (2)
Country | Link |
---|---|
US (1) | US8320659B2 (en) |
WO (1) | WO2009017931A2 (en) |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100046704A1 (en) * | 2008-08-25 | 2010-02-25 | Telesecurity Sciences, Inc. | Method and system for electronic inspection of baggage and cargo |
WO2011130845A1 (en) * | 2010-04-21 | 2011-10-27 | Optosecurity Inc. | Method and system for use in performing security screening |
DE102011004871A1 (en) * | 2011-02-28 | 2012-08-30 | Siemens Aktiengesellschaft | System for performing centralized safety examination of subject matter e.g. baggage in airport, has data management system to perform retransmission of statements between suburban investigation units and evaluation units |
US20130094740A1 (en) * | 2010-06-11 | 2013-04-18 | Bart Vandenberghe | Method for quantifying local bone changes |
US20130101172A1 (en) * | 2011-09-07 | 2013-04-25 | Shehul Sailesh Parikh | X-ray inspection system that integrates manifest data with imaging/detection processing |
US20130170723A1 (en) * | 2011-12-07 | 2013-07-04 | Telesecurity Sciences, Inc. | Extraction of objects from ct images by sequential segmentation and carving |
WO2013158640A1 (en) * | 2012-04-16 | 2013-10-24 | Neurologica Corp. | Wireless imaging system |
WO2013158655A1 (en) * | 2012-04-16 | 2013-10-24 | Neurologica Corp. | Imaging system with rigidly mounted fiducial markers |
US20150022522A1 (en) * | 2012-03-20 | 2015-01-22 | Siemens Corporation | Luggage Visualization and Virtual Unpacking |
US9235889B1 (en) * | 2012-06-11 | 2016-01-12 | University Of Central Florida Research Foundation, Inc. | Systems, apparatus and methods for collecting and storing raw scan data and software for performing data processing, image reconstruction and interpretation |
US9405990B2 (en) * | 2014-08-19 | 2016-08-02 | Morpho Detection, Llc | X-ray diffraction imaging system with signal aggregation across voxels containing objects and method of operating the same |
CN108226196A (en) * | 2018-02-22 | 2018-06-29 | 青岛智慧云谷智能科技有限公司 | A kind of intelligence X-ray machine detection system and detection method |
US20180300668A1 (en) * | 2017-04-18 | 2018-10-18 | International Bridge, Inc. | Item shipping screening and validation |
US20180308255A1 (en) * | 2017-04-25 | 2018-10-25 | Analogic Corporation | Multiple Three-Dimensional (3-D) Inspection Renderings |
WO2018235320A1 (en) * | 2017-06-21 | 2018-12-27 | 株式会社日立製作所 | Adhered matter inspection method |
CN109785446A (en) * | 2019-02-28 | 2019-05-21 | 清华大学 | Image identification system and its method |
US10302807B2 (en) | 2016-02-22 | 2019-05-28 | Rapiscan Systems, Inc. | Systems and methods for detecting threats and contraband in cargo |
US10331979B2 (en) * | 2016-03-24 | 2019-06-25 | Telesecurity Sciences, Inc. | Extraction and classification of 3-D objects |
US10585207B2 (en) | 2008-02-28 | 2020-03-10 | Rapiscan Systems, Inc. | Scanning systems |
CN111352166A (en) * | 2018-12-21 | 2020-06-30 | 同方威视技术股份有限公司 | Method, apparatus, system, device and storage medium for remote baggage inspection |
US10726380B2 (en) | 2014-12-22 | 2020-07-28 | International Bridge, Inc. | Parcel shipping screening and validation |
CN112204623A (en) * | 2018-06-01 | 2021-01-08 | 电子湾有限公司 | Rendering three-dimensional digital model generation |
US20210225088A1 (en) * | 2020-01-20 | 2021-07-22 | Rapiscan Systems, Inc. | Methods and Systems for Generating Three-Dimensional Images that Enable Improved Visualization and Interaction with Objects in the Three-Dimensional Images |
US20210239875A1 (en) * | 2020-01-30 | 2021-08-05 | Hitachi, Ltd. | Alert output timing control apparatus, alert output timing control method, and non-transitory computer readable storage medium |
CN113252712A (en) * | 2021-04-22 | 2021-08-13 | 盛视科技股份有限公司 | Early machine-inspection baggage risk interception system and method |
WO2021199158A1 (en) * | 2020-03-30 | 2021-10-07 | 日本電気株式会社 | Information processing device, information processing method, and recording medium |
US20210398350A1 (en) * | 2018-09-24 | 2021-12-23 | K2M, Inc. | System And Method For Isolating Anatomical Features In Computerized Tomography Data |
CN114615907A (en) * | 2019-10-01 | 2022-06-10 | John Peter Parkins | Safety device for luggage |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011014192A1 (en) * | 2009-07-31 | 2011-02-03 | Analogic Corporation | Two-dimensional colored projection image from three-dimensional image data |
US9355502B2 (en) * | 2012-12-12 | 2016-05-31 | Analogic Corporation | Synthetic image generation by combining image of object under examination with image of target |
US9306970B2 (en) | 2013-10-25 | 2016-04-05 | MSA Security, Inc. | Systems and methods for facilitating remote security threat detection |
US9922386B2 (en) | 2013-10-25 | 2018-03-20 | Michael Stapleton Associates, LTD | Systems and methods for facilitating remote security threat detection |
CN104658034B (en) * | 2013-11-18 | 2019-03-01 | Tsinghua University | Fusion rendering method for CT image data |
US9457917B2 (en) | 2014-06-16 | 2016-10-04 | Vanderlande Industries, B.V. | Airport baggage handling system |
US10019834B2 (en) | 2014-09-26 | 2018-07-10 | Microsoft Technology Licensing, Llc | Real-time rendering of volumetric models with occlusive and emissive particles |
WO2018115853A1 (en) * | 2016-12-19 | 2018-06-28 | Portr Limited | Baggage transfer |
CN110007358A (en) * | 2019-04-24 | 2019-07-12 | China IPPR International Engineering Co., Ltd. | Inspection method and system for customs security check |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4590558A (en) * | 1981-12-30 | 1986-05-20 | General Electric Company | Method and apparatus for removing objects from CT images |
US5796802A (en) * | 1996-08-19 | 1998-08-18 | Analogic Corporation | Multiple angle pre-screening tomographic systems and methods |
US6317509B1 (en) * | 1998-02-11 | 2001-11-13 | Analogic Corporation | Computed tomography apparatus and method for classifying objects |
US20020198731A1 (en) * | 2001-06-26 | 2002-12-26 | Barnes Jessica M. | Method and apparatus for processing an international passenger |
US6807458B2 (en) * | 2000-09-20 | 2004-10-19 | Steve Quackenbush | Baggage transportation security system |
US20050031076A1 (en) * | 2001-04-03 | 2005-02-10 | L-3 Communications Security And Detections System | Remote baggage screening method |
US20050111618A1 (en) * | 2002-12-23 | 2005-05-26 | Sommer Edward J.Jr. | Method and apparatus for improving baggage screening examination |
US20060109949A1 (en) * | 2004-11-24 | 2006-05-25 | Tkaczyk J E | System and method for acquisition and reconstruction of contrast-enhanced, artifact-reduced ct images |
US20060273257A1 (en) * | 2001-12-21 | 2006-12-07 | Roos Charles E | Computer assisted bag screening system |
US20060274916A1 (en) * | 2001-10-01 | 2006-12-07 | L-3 Communications Security And Detection Systems | Remote data access |
US20070003122A1 (en) * | 2005-06-29 | 2007-01-04 | General Electric Company | Method for quantifying an object in a larger structure using a reconstructed image |
US20070118399A1 (en) * | 2005-11-22 | 2007-05-24 | Avinash Gopal B | System and method for integrated learning and understanding of healthcare informatics |
US7356174B2 (en) * | 2004-05-07 | 2008-04-08 | General Electric Company | Contraband detection system and method using variance data |
US20080118021A1 (en) * | 2006-11-22 | 2008-05-22 | Sandeep Dutta | Methods and systems for optimizing high resolution image reconstruction |
US20080152082A1 (en) * | 2006-08-16 | 2008-06-26 | Michel Bouchard | Method and apparatus for use in security screening providing incremental display of threat detection information and security system incorporating same |
US20080253653A1 (en) * | 2007-04-12 | 2008-10-16 | Todd Gable | Systems and methods for improving visibility of scanned images |
US7813540B1 (en) * | 2005-01-13 | 2010-10-12 | Oro Grande Technologies Llc | System and method for detecting nuclear material in shipping containers |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6370222B1 (en) * | 1999-02-17 | 2002-04-09 | Ccvs, Llc | Container contents verification |
US8031903B2 (en) * | 2001-10-01 | 2011-10-04 | L-3 Communications Security And Detection Systems, Inc. | Networked security system |
US20070168467A1 (en) * | 2006-01-15 | 2007-07-19 | Telesecurity Sciences Inc. | Method and system for providing remote access to baggage scanned images |
- 2007
- 2007-08-01: US application US11/888,481 filed, granted as US8320659B2 (status: Active)
- 2008
- 2008-07-02: PCT application PCT/US2008/069028 filed, published as WO2009017931A2 (Application Filing)
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4590558A (en) * | 1981-12-30 | 1986-05-20 | General Electric Company | Method and apparatus for removing objects from CT images |
US5796802A (en) * | 1996-08-19 | 1998-08-18 | Analogic Corporation | Multiple angle pre-screening tomographic systems and methods |
US6317509B1 (en) * | 1998-02-11 | 2001-11-13 | Analogic Corporation | Computed tomography apparatus and method for classifying objects |
US6807458B2 (en) * | 2000-09-20 | 2004-10-19 | Steve Quackenbush | Baggage transportation security system |
US7139406B2 (en) * | 2001-04-03 | 2006-11-21 | L-3 Communications Security And Detection Systems | Remote baggage screening system, software and method |
US20050031076A1 (en) * | 2001-04-03 | 2005-02-10 | L-3 Communications Security And Detections System | Remote baggage screening method |
US20020198731A1 (en) * | 2001-06-26 | 2002-12-26 | Barnes Jessica M. | Method and apparatus for processing an international passenger |
US20060274916A1 (en) * | 2001-10-01 | 2006-12-07 | L-3 Communications Security And Detection Systems | Remote data access |
US20060273257A1 (en) * | 2001-12-21 | 2006-12-07 | Roos Charles E | Computer assisted bag screening system |
US20050111618A1 (en) * | 2002-12-23 | 2005-05-26 | Sommer Edward J.Jr. | Method and apparatus for improving baggage screening examination |
US7356174B2 (en) * | 2004-05-07 | 2008-04-08 | General Electric Company | Contraband detection system and method using variance data |
US20060109949A1 (en) * | 2004-11-24 | 2006-05-25 | Tkaczyk J E | System and method for acquisition and reconstruction of contrast-enhanced, artifact-reduced ct images |
US7813540B1 (en) * | 2005-01-13 | 2010-10-12 | Oro Grande Technologies Llc | System and method for detecting nuclear material in shipping containers |
US20070003122A1 (en) * | 2005-06-29 | 2007-01-04 | General Electric Company | Method for quantifying an object in a larger structure using a reconstructed image |
US20070118399A1 (en) * | 2005-11-22 | 2007-05-24 | Avinash Gopal B | System and method for integrated learning and understanding of healthcare informatics |
US20080152082A1 (en) * | 2006-08-16 | 2008-06-26 | Michel Bouchard | Method and apparatus for use in security screening providing incremental display of threat detection information and security system incorporating same |
US20080118021A1 (en) * | 2006-11-22 | 2008-05-22 | Sandeep Dutta | Methods and systems for optimizing high resolution image reconstruction |
US20080253653A1 (en) * | 2007-04-12 | 2008-10-16 | Todd Gable | Systems and methods for improving visibility of scanned images |
Non-Patent Citations (1)
Title |
---|
Brown, Douglas R. "Advanced Technology for Marine Cargo Security." 7th Marine Transportation System Research & Technology Coordination Conference, The National Academy of Sciences, Washington, DC, 17 Nov. 2004. *
Cited By (58)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11275194B2 (en) | 2008-02-28 | 2022-03-15 | Rapiscan Systems, Inc. | Scanning systems |
US10585207B2 (en) | 2008-02-28 | 2020-03-10 | Rapiscan Systems, Inc. | Scanning systems |
US11768313B2 (en) | 2008-02-28 | 2023-09-26 | Rapiscan Systems, Inc. | Multi-scanner networked systems for performing material discrimination processes on scanned objects |
US8600149B2 (en) | 2008-08-25 | 2013-12-03 | Telesecurity Sciences, Inc. | Method and system for electronic inspection of baggage and cargo |
US20100046704A1 (en) * | 2008-08-25 | 2010-02-25 | Telesecurity Sciences, Inc. | Method and system for electronic inspection of baggage and cargo |
US9014425B2 (en) | 2010-04-21 | 2015-04-21 | Optosecurity Inc. | Method and system for use in performing security screening |
US9773173B2 (en) | 2010-04-21 | 2017-09-26 | Optosecurity Inc. | Method and system for use in performing security screening |
US10275660B2 (en) | 2010-04-21 | 2019-04-30 | Vanderlande Apc Inc. | Method and system for use in performing security screening |
WO2011130845A1 (en) * | 2010-04-21 | 2011-10-27 | Optosecurity Inc. | Method and system for use in performing security screening |
US9036899B2 (en) * | 2010-06-11 | 2015-05-19 | Katholieke Universiteit Leuven | Method for quantifying local bone changes |
US20130094740A1 (en) * | 2010-06-11 | 2013-04-18 | Bart Vandenberghe | Method for quantifying local bone changes |
DE102011004871A1 (en) * | 2011-02-28 | 2012-08-30 | Siemens Aktiengesellschaft | System for performing centralized security examination of objects, e.g. baggage at an airport, having a data management system that relays findings between remote inspection units and evaluation units |
US20190162873A1 (en) * | 2011-09-07 | 2019-05-30 | Rapiscan Systems, Inc. | Distributed Analysis X-Ray Inspection Methods and Systems |
US9111331B2 (en) * | 2011-09-07 | 2015-08-18 | Rapiscan Systems, Inc. | X-ray inspection system that integrates manifest data with imaging/detection processing |
US20130101172A1 (en) * | 2011-09-07 | 2013-04-25 | Shehul Sailesh Parikh | X-ray inspection system that integrates manifest data with imaging/detection processing |
US11099294B2 (en) * | 2011-09-07 | 2021-08-24 | Rapiscan Systems, Inc. | Distributed analysis x-ray inspection methods and systems |
US10830920B2 (en) | 2011-09-07 | 2020-11-10 | Rapiscan Systems, Inc. | Distributed analysis X-ray inspection methods and systems |
US10422919B2 (en) * | 2011-09-07 | 2019-09-24 | Rapiscan Systems, Inc. | X-ray inspection system that integrates manifest data with imaging/detection processing |
US9632206B2 (en) * | 2011-09-07 | 2017-04-25 | Rapiscan Systems, Inc. | X-ray inspection system that integrates manifest data with imaging/detection processing |
US20210405243A1 (en) * | 2011-09-07 | 2021-12-30 | Rapiscan Systems, Inc. | Distributed Analysis X-Ray Inspection Methods and Systems |
US10509142B2 (en) * | 2011-09-07 | 2019-12-17 | Rapiscan Systems, Inc. | Distributed analysis x-ray inspection methods and systems |
US20170299765A1 (en) * | 2011-09-07 | 2017-10-19 | Rapiscan Systems, Inc. | X-Ray Inspection System That Integrates Manifest Data With Imaging/Detection Processing |
US9123119B2 (en) * | 2011-12-07 | 2015-09-01 | Telesecurity Sciences, Inc. | Extraction of objects from CT images by sequential segmentation and carving |
US20130170723A1 (en) * | 2011-12-07 | 2013-07-04 | Telesecurity Sciences, Inc. | Extraction of objects from ct images by sequential segmentation and carving |
CN104488002A (en) * | 2012-03-20 | 2015-04-01 | Siemens Corporation | Luggage visualization and virtual unpacking |
US10019833B2 (en) * | 2012-03-20 | 2018-07-10 | Siemens Corporation | Luggage visualization and virtual unpacking |
US20150022522A1 (en) * | 2012-03-20 | 2015-01-22 | Siemens Corporation | Luggage Visualization and Virtual Unpacking |
WO2013158640A1 (en) * | 2012-04-16 | 2013-10-24 | Neurologica Corp. | Wireless imaging system |
WO2013158655A1 (en) * | 2012-04-16 | 2013-10-24 | Neurologica Corp. | Imaging system with rigidly mounted fiducial markers |
US9173620B2 (en) | 2012-04-16 | 2015-11-03 | Neurologica Corp. | Imaging system with rigidly mounted fiducial markers |
US9691167B1 (en) | 2012-06-11 | 2017-06-27 | University Of Central Florida Research Foundation, Inc. | Systems, apparatus and methods for collecting and storing raw scan data and software for performing data processing, image reconstruction and interpretation |
US9235889B1 (en) * | 2012-06-11 | 2016-01-12 | University Of Central Florida Research Foundation, Inc. | Systems, apparatus and methods for collecting and storing raw scan data and software for performing data processing, image reconstruction and interpretation |
US9405990B2 (en) * | 2014-08-19 | 2016-08-02 | Morpho Detection, Llc | X-ray diffraction imaging system with signal aggregation across voxels containing objects and method of operating the same |
US11599844B2 (en) | 2014-12-22 | 2023-03-07 | International Bridge, Inc. | Parcel shipping screening and validation |
US10726380B2 (en) | 2014-12-22 | 2020-07-28 | International Bridge, Inc. | Parcel shipping screening and validation |
US10768338B2 (en) | 2016-02-22 | 2020-09-08 | Rapiscan Systems, Inc. | Systems and methods for detecting threats and contraband in cargo |
US10302807B2 (en) | 2016-02-22 | 2019-05-28 | Rapiscan Systems, Inc. | Systems and methods for detecting threats and contraband in cargo |
US11287391B2 (en) | 2016-02-22 | 2022-03-29 | Rapiscan Systems, Inc. | Systems and methods for detecting threats and contraband in cargo |
US10331979B2 (en) * | 2016-03-24 | 2019-06-25 | Telesecurity Sciences, Inc. | Extraction and classification of 3-D objects |
US20180300668A1 (en) * | 2017-04-18 | 2018-10-18 | International Bridge, Inc. | Item shipping screening and validation |
US10417602B2 (en) * | 2017-04-18 | 2019-09-17 | International Bridge, Inc. | Item shipping screening and validation |
US11334839B2 (en) | 2017-04-18 | 2022-05-17 | International Bridge, Inc. | Item shipping screening and validation |
US10782441B2 (en) * | 2017-04-25 | 2020-09-22 | Analogic Corporation | Multiple three-dimensional (3-D) inspection renderings |
US20180308255A1 (en) * | 2017-04-25 | 2018-10-25 | Analogic Corporation | Multiple Three-Dimensional (3-D) Inspection Renderings |
JP2019007771A (en) * | 2017-06-21 | 2019-01-17 | Hitachi, Ltd. | Deposit inspection method |
WO2018235320A1 (en) * | 2017-06-21 | 2018-12-27 | Hitachi, Ltd. | Adhered matter inspection method |
CN108226196A (en) * | 2018-02-22 | 2018-06-29 | Qingdao Smart Cloud Valley Intelligent Technology Co., Ltd. | Intelligent X-ray machine detection system and detection method |
CN112204623A (en) * | 2018-06-01 | 2021-01-08 | eBay Inc. | Rendering three-dimensional digital model generation |
US11636650B2 (en) * | 2018-09-24 | 2023-04-25 | K2M, Inc. | System and method for isolating anatomical features in computerized tomography data |
US20210398350A1 (en) * | 2018-09-24 | 2021-12-23 | K2M, Inc. | System And Method For Isolating Anatomical Features In Computerized Tomography Data |
CN111352166A (en) * | 2018-12-21 | 2020-06-30 | Nuctech Company Limited | Method, apparatus, system, device and storage medium for remote baggage inspection |
CN109785446A (en) * | 2019-02-28 | 2019-05-21 | Tsinghua University | Image identification system and its method |
CN114615907A (en) * | 2019-10-01 | 2022-06-10 | John Peter Parkins | Safety device for luggage |
US11594001B2 (en) * | 2020-01-20 | 2023-02-28 | Rapiscan Systems, Inc. | Methods and systems for generating three-dimensional images that enable improved visualization and interaction with objects in the three-dimensional images |
US20210225088A1 (en) * | 2020-01-20 | 2021-07-22 | Rapiscan Systems, Inc. | Methods and Systems for Generating Three-Dimensional Images that Enable Improved Visualization and Interaction with Objects in the Three-Dimensional Images |
US20210239875A1 (en) * | 2020-01-30 | 2021-08-05 | Hitachi, Ltd. | Alert output timing control apparatus, alert output timing control method, and non-transitory computer readable storage medium |
WO2021199158A1 (en) * | 2020-03-30 | 2021-10-07 | NEC Corporation | Information processing device, information processing method, and recording medium |
CN113252712A (en) * | 2021-04-22 | 2021-08-13 | Maxvision Technology Co., Ltd. | Early machine-inspection baggage risk interception system and method |
Also Published As
Publication number | Publication date |
---|---|
WO2009017931A4 (en) | 2009-09-24 |
WO2009017931A3 (en) | 2009-07-23 |
US8320659B2 (en) | 2012-11-27 |
WO2009017931A2 (en) | 2009-02-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8320659B2 (en) | Method for customs inspection of baggage and cargo | |
US8600149B2 (en) | Method and system for electronic inspection of baggage and cargo | |
US20070297560A1 (en) | Method and system for electronic unpacking of baggage and cargo | |
Rogers et al. | Automated x-ray image analysis for cargo security: Critical review and future promise | |
US7333589B2 (en) | System and method for CT scanning of baggage | |
US7702068B2 (en) | Contraband detection systems and methods | |
US20200051017A1 (en) | Systems and methods for image processing | |
US11106930B2 (en) | Classifying compartments at security checkpoints by detecting a shape of an object | |
CN109074889A (en) | System and method for detecting dangerous material and contraband in cargo | |
US7613316B2 (en) | Methods and apparatus for detecting objects in baggage | |
EP2676128B1 (en) | System and method for multi-scanner x-ray inspection | |
US20060022140A1 (en) | Methods and apparatus for detection of contraband using terahertz radiation | |
US20070159185A1 (en) | Security scanners with capacitance and magnetic sensor arrays | |
CN110208295A (en) | X-ray inspection system that integrates manifest data with imaging/detection processing |
CN101558327A (en) | System and method for integrating explosive detection systems | |
US20200394442A1 (en) | Screening technique for prohibited objects at security checkpoints | |
US7839971B2 (en) | System and method for inspecting containers for target material | |
US20100092057A1 (en) | Method and system for identifying a containment vessel | |
US20080253653A1 (en) | Systems and methods for improving visibility of scanned images | |
US20090226032A1 (en) | Systems and methods for reducing false alarms in detection systems | |
Chouai et al. | CH-Net: Deep adversarial autoencoders for semantic segmentation in X-ray images of cabin baggage screening at airports | |
JP4376900B2 (en) | Security scanner with capacitance and magnetic sensor array | |
Shutko et al. | Two-Dimensional Spectral Detector for Baggage Inspection X-Ray System | |
US20230169619A1 (en) | Two-stage screening technique for prohibited objects at security checkpoints using image segmentation | |
Mamchur et al. | Analysis of the state of the art on non-intrusive object-screening techniques. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TELESECURITY SCIENCES, INC., NEVADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SONG, SAMUEL MOON-HO;BOYD, DOUGLAS PERRY;REEL/FRAME:019702/0249 Effective date: 20070730 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2553); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 12 |