WO2006022660A2 - Computer assisted bag screening system - Google Patents

Computer assisted bag screening system

Info

Publication number
WO2006022660A2
Authority
WO
WIPO (PCT)
Prior art keywords
series of articles
discrete objects
images
Prior art date
Application number
PCT/US2004/024607
Other languages
French (fr)
Other versions
WO2006022660A3 (en)
Inventor
Charles E. Roos
Edward J. Sommer, Jr.
Original Assignee
Roos Charles E
Sommer Edward J Jr
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Roos Charles E, Sommer Edward J Jr filed Critical Roos Charles E
Priority to PCT/US2004/024607 priority Critical patent/WO2006022660A2/en
Priority to EP04821375A priority patent/EP1685574A4/en
Priority to PCT/US2004/029616 priority patent/WO2005086616A2/en
Publication of WO2006022660A2 publication Critical patent/WO2006022660A2/en
Publication of WO2006022660A3 publication Critical patent/WO2006022660A3/en

Classifications

    • G01V5/20


Abstract

A method and apparatus (1) are disclosed for screening articles (6) utilizing a computerized touch sensitive screen or other computerized pointing device for operator identification and electronic marking of objects within the article to be further examined. An operator positioned at a computerized touch sensitive screen views electronic images of the articles to be screened as they are conveyed past a sensor array (3s) which transmits sequences of images of the series either directly or through a computer (7) to the touch sensitive display screen (4). The operator manually 'touches' objects within the articles displayed on the screen to be identified on coupled interactive screens, thereby registering the objects selected within the computer. The computer then provides sequential information identifying actions to be taken and information and data to be recorded.

Description

PCT PATENT APPLICATION
Attorney Docket No. A398306.7WO
TITLE OF THE INVENTION:
Computer Assisted Bag Screening System
INVENTORS:
Charles E. Roos, a citizen of the United States and a resident of Nashville, Tennessee
Edward J. Sommer, Jr., a citizen of the United States and a resident of Nashville, Tennessee
CROSS REFERENCE TO RELATED APPLICATIONS:
This application relates to U.S. Provisional Application Serial No. 60/342,290 filed December 21, 2001 and U.S. Utility Application Serial No. 10/328,328 filed December 23, 2002.
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT: None
BACKGROUND OF THE INVENTION:
Current bag screening methods at entry points to secure areas in airports and other security sensitive installations are essentially manual systems that take little advantage of today's computerized technologies. Typically, situated at the entry to an airport concourse, for instance, is a security checkpoint station having x-ray machines for inspection of carry-on baggage. An inspector who examines the x-ray images of bags passing through a unit is stationed at each x-ray machine. If an inspector sees a potential threat object he/she typically stops image inspection and verbally alerts search personnel stationed at the end of the x-ray machine conveyor to remove the bag in question to search for the object. Often the search person is called over to the x-ray image monitor to look at the object on the screen to help in the search of the baggage. In the meantime, flow of traffic through the security checkpoint halts while this communication is going on. Lines at these x-ray machines are notoriously long and can be quite slow, especially with today's increased threats of terrorism.
Additionally there are no detailed records kept of people, baggage, and objects passing through security checkpoints in our airports or at other security sensitive installations. With the thousands of people, bags, packages, etc. being screened daily at a checkpoint it is impossible for security personnel to accurately remember and keep track of who or what passed through their checkpoint, even just a few minutes earlier.
The inventors have, in a previous invention, utilized computerized pointer technology, such as a touch screen, to select and define the spatial position of objects selected by an inspector from within a mixture of objects (U.S. Patent 6,124,560). The present invention extends these capabilities to provide not only location of selected objects to a computer but additionally to provide instant electronic communications among security checkpoint personnel of the presence, appearance, and location of selected objects and provides that this information along with the images of the selected objects within their mixture of objects and associated data be electronically stored in a searchable computerized database. The present invention further provides that computerized algorithms can utilize this information to learn the characteristics of selected objects over time so to allow the computer to automatically help in the selection of similar objects.
1. Field of the Invention
The inventive computer assisted bag screening system provides means for rapid identification of potentially objectionable inclusions within bags, packages, wrappings, or other containers. The use of this technology can rapidly screen such objects with minimal change to existing facilities. The screening system uses computer technology and a touch screen, or other pointing device, that may enable operators to select several objects per minute without stress. The touch screen or pointing device provides an electronic registration within a computer of objects selected by an operator.
Security personnel have high job turnover rates. Bag screeners are often untrained, poorly paid and highly stressed. These screeners are typically expected to inspect up to 12 bags per minute. The repetitive tasks and lack of experienced screeners have caused lapses of attention. There need to be systematic checks on performance, better training and reduced stress for the bag screeners before it will be feasible to check all of the checked bags as required by the recent law signed by the President. The innovative imaging and touch screen security and communications system of the present invention can be applied at locations using x-rays or other imaging technology for inspection and screening of objects and people entering into a secure area. It is believed that the system can significantly improve efficiency and increase traffic flow through a security checkpoint, for instance at an airport or other security sensitive installation (such as a power plant, military base, federal building, etc.), by computerizing and streamlining communications between image inspection personnel, bag search personnel, and supervisory personnel. Furthermore, the system will provide a searchable computerized database of all x-ray images, associated data, search alerts and search results for people and objects passing through the security checkpoint. The information in the database will tag x-ray or other type images containing potential threat objects with the actual identities of the objects, allowing computerized image processing correlations to be developed to provide computer aided identification of potential threat objects to assist the image inspector. The database will be usable for investigating security lapses and for analyzing activities at a security checkpoint and throughout a local area network linking security checkpoints.
The use of the new computer assisted screening system permits the separation of the image inspection activity from the dirt, noise and confusion of existing security checkpoints. It makes the work of screening less repetitive and the higher productivity permits the payment of higher wages. There is no requirement that image inspectors be in the same location or even the same area of the country as the bags, packages, people, etc. that are being checked. The image inspectors can be assigned in response to the changing traffic loads at different security checkpoints. The remote inspection also provides a means for supervisors to constantly monitor, train and inspect the performance of the bag screeners.
When the image inspecting operator sees any object illuminated by the inventive screening system that merits further attention, the operator touches the object on the touch screen. The bag can then be sent to an area for further inspection, as by being diverted to an alternative conveyor or search area. The results of the search can be entered into a computerized database through touching appropriate identifications on a menu presented on a touch screen to the searcher or by using another type computerized pointing device such as a mouse or light pen with a computer monitor.
The new screening system at airports, power plants, and other restricted locations enables inspectors doing the detailed investigations to have the benefit of seeing all of the touch screen identified suspicious objects highlighted on their monitor screens. The system provides that all image scans, associated data, and screener touch screen responses are saved to a computerized database thereby providing a searchable computerized record of objects, people, etc. passing through security checkpoints and allowing later analysis of activities at security checkpoints.
The new system further enables objects that are identified by a first bag screener to be automatically referred to a supervisor for a second opinion. The supervisor may either clear the bag for loading or the bag can be diverted to a special area for a detailed inspection.
The present invention can make use of artificial intelligence to highlight objects that are similar to those selected by the human operator. After the screener has selected certain objects several times, the computer is able to recognize certain shapes and to highlight them. It is the aim of the screening system to make the job of screening much more productive and to provide a preliminary scan of the large percentage of the baggage that does not need a more detailed inspection.
While the touch screen or other pointing device interface, computer, and artificial intelligence programs in the present invention make bag screening much faster and less stressful, they do not replace the human operator. Additionally, the system provides a means to monitor the performance of image inspectors and provides a quick way to get a second opinion if any suspicious object is seen. Utilizing remote computerized pointing device monitoring, such as with a touch screen, supervisors may run on-line checks to ensure that the screening operators are providing reliable identification of targeted objects. If the supervisor, viewing the same images as a primary image screener, selects 100 objects and 85 are also found by the primary screener, then the relative detection efficiency of the primary screener is 85%, or 85/100. An important feature is that the computer, having learned to identify threat objects in images through its artificial intelligence program, can similarly perform a check of image inspector efficiency and automatically report its results. It is a significant feature that measurement of such image inspection efficiency, by a supervisor or by the computer, can also be performed off-line using stored images with stored touch screen responses retrieved from the computerized database of checkpoint activities.
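As a worked illustration of the relative detection efficiency check described above, the short sketch below compares a supervisor's selections against a primary screener's selections over the same set of images. The object identifiers and the 85-of-100 figures are hypothetical, chosen only to reproduce the example in the text; this is an illustrative sketch, not code from the application.

```python
# Illustrative sketch: relative detection efficiency of a primary screener,
# measured against a supervisor who reviewed the same images.
def relative_detection_efficiency(supervisor_hits, screener_hits):
    """Fraction of the supervisor's selected objects also selected by the screener."""
    supervisor_hits = set(supervisor_hits)
    screener_hits = set(screener_hits)
    if not supervisor_hits:
        return None  # undefined when the reference set is empty
    return len(supervisor_hits & screener_hits) / len(supervisor_hits)

# Hypothetical data reproducing the 85/100 example: the supervisor flags
# objects 0..99, the primary screener independently flags 85 of them.
supervisor = range(100)
screener = range(85)
print(f"relative detection efficiency: "
      f"{relative_detection_efficiency(supervisor, screener):.0%}")
# -> relative detection efficiency: 85%
```

The same calculation applies whether the reference selections come from a supervisor, from the computer's learned model, or from stored responses replayed off-line.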
Another significant feature of the invention is that it can be used to monitor and record trainee inspector responses to images presented to them during training. As described above, the efficiency of trainees for image inspection can be measured by the instructor or by the computer. Real images from a database of images from a security checkpoint can be used to train the new inspectors. The new inspectors' responses to these images can be compared to the stored responses of the security personnel manning the security checkpoint in addition to the responses of the trainee instructor. Additionally, computerized images of threat objects can be projected into an image, such as is done by the new "Threat Image Projection System" or TIPS system recently introduced by the FAA for x-ray image inspection personnel testing and training. Interfacing with TIPS, the inspectors' touch screen responses can be monitored for efficiency of detection, both in pre-job training and during on-the-job training and monitoring.
SUMMARY OF THE INVENTION
The system provides essentially instantaneous touch screen communications among security personnel with regard to inspection by x-ray or other type imaging generated at security inspection points. Among the advantages offered by the system are:
Increases throughput at a security inspection station while increasing effectiveness of the inspection process.
Allows a second coincident real-time visual inspection of images by a supervisor or second inspector.
Enables rapid automated communications between image touch screen inspectors and object (i.e. bag) search personnel.
Enables rapid automated communications between supervisory personnel and image touch screen inspectors and object search personnel.
Provides rapid automated feedback to inspection and supervisory personnel from search personnel as to status of a bag search.
Provides electronic registration of location and appearance of selected objects allowing storage of images with associated data.
Provides a searchable computerized record of inspection station activity.
Provides for a computer to utilize algorithms to learn the characteristics of selected objects over time, enabling the computer to automatically help in the selection of similar objects.
Provides computerized methods to measure the detection efficiency of image inspectors.
Provides a training tool for security personnel.
DESCRIPTION OF THE DRAWINGS
Figure 1 is a perspective view of a preferred embodiment of the present invention.
Figure 2 is a block diagram showing the interactive screens and screening apparatus of the present invention.
Figure 3 is a pictorial view of interactive screens of the apparatus of the invention.
Figure 4 is a pictorial view of a second view of interactive screens of the apparatus of the invention.
Figure 5 is a pictorial view of a third view of interactive screens of the apparatus of the invention.
Figure 6 is a pictorial view of a particular interactive screen of the apparatus of the invention.
Figure 7 is a block diagram of inspection stations and the interactive apparatus for multiline screening of articles according to the invention.
Figure 8 is a block diagram showing the software interfaces of the apparatus of the invention.
Figure 9 is a block diagram showing the hardware components of the software interfaces of Figure 8.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings and Figure 1 in particular, a screening system 1 advances one or more articles 6 such as baggage, parcels and mail along a conveyor 2 to be inspected at an inspection zone 3z which is irradiated with electromagnetic radiation from radiation source 5. In the illustrated embodiment the preferred source for irradiation is an x-ray tube (i.e., x-ray radiation); however, other suitable radiation sources 5 can include, for example, a Klystron tube (microwave radiation), UV lamp (ultraviolet radiation), IR lamp (infrared radiation), and radio-nuclide source (gamma rays). The conveyor 2 may be a belt conveyor, a slide, a vibrating pan conveyor, a free fall trajectory, or any other means for conveying materials as is effective for transferring the article 6 through the inspection zone 3z. A radiation sensor array 3s (for example, the x-ray receiver incorporated in high precision bag x-ray inspection systems currently available) is positioned to view the inspection zone 3z so as to provide data arrays corresponding to measurements of electromagnetic radiation emanating from inspection zone 3z and from any article 6 being conveyed through the inspection zone 3z. The electromagnetic radiation emanating from article 6 in the illustrated embodiment is x-rays, which are projected and detected by sensor array 3s, the output of which is supplied to computer processing unit 7. In alternative embodiments, the detected radiation may be reflected radiation, radiation transmitted through article 6, radiation emitted from article 6 through fluorescence, or any other form of radiation resulting from interaction of article 6 (or objects within the article) with the incident radiation from source 5. The choice of irradiation is dependent upon the material characteristics of the article and objects of interest and which type of radiation is responsive to such characteristics. Sensor array 3s is shown positioned below the conveyor 2, although in practice it may be located at any position required by the type of radiation and to give the desired view. Sensor array 3s is selected to be sensitive in the wavelength range of x-ray electromagnetic radiation transiting from article 6 within the inspection zone 3z when irradiated by source 5. In the present embodiment utilizing x-rays, a single source 5 is sufficient; however, other types of radiation may require the single source 5 or multiple sources 5a. The geometry of sensor array 3s is determined by the application (herein x-rays for bag screening). For instance, the sensor array 3s may be a linear array of sensors or an area array of sensors. It may physically span the full width and/or length of the inspection zone 3z, or it may be more compact and use optics to scan the width and/or length of the inspection zone, such as with a CCD camera. Sensor array 3s may be positioned on the same side of the inspection zone 3z as is irradiation source 5, or it may be positioned on the opposite side of inspection zone 3z from source 5, or positioned at any other location with respect to irradiation source 5 and inspection zone 3z. The effective wavelength ranges of sensor array 3s and source 5 may each be in any one of the microwave wavelength range, the ultraviolet wavelength range, the visible light wavelength range, the infrared wavelength range, the x-ray wavelength range, or the gamma ray wavelength range, or any combination thereof.
Sensor array 3s may be fitted with special filters which allow only certain wavelengths of electromagnetic radiation to reach sensor array 3s for measurement.
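The preceding paragraphs enumerate several station-level choices: radiation source, sensor geometry, sensor placement, wavelength bands, and optional filters. Purely as an illustration of how such a configuration might be captured in software, the sketch below records those choices in a small data structure; the field names and enumerated values are assumptions for the example, not terms defined by the application.

```python
# Illustrative sketch: one way to record the inspection-station options described
# above. All names and values here are hypothetical, not defined by the patent.
from dataclasses import dataclass
from enum import Enum

class SourceType(Enum):
    XRAY_TUBE = "x-ray"          # preferred source in the illustrated embodiment
    KLYSTRON = "microwave"
    UV_LAMP = "ultraviolet"
    IR_LAMP = "infrared"
    RADIONUCLIDE = "gamma"

class SensorGeometry(Enum):
    LINEAR_ARRAY = "linear"
    AREA_ARRAY = "area"
    SCANNED_CCD = "scanned ccd"  # compact sensor using optics to scan the zone

@dataclass
class InspectionStationConfig:
    source: SourceType = SourceType.XRAY_TUBE
    sensor_geometry: SensorGeometry = SensorGeometry.LINEAR_ARRAY
    sensor_below_conveyor: bool = True       # Figure 1 shows the array below the belt
    wavelength_bands: tuple = ("x-ray",)     # any combination of bands may be sensed
    filters: tuple = ()                      # optional wavelength-selective filters

config = InspectionStationConfig()
print(config)
```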
In the present embodiment, sensor array 3s data corresponding to electromagnetic x-ray measurements emanating from the inspection zone 3z and from article 6 within the inspection zone 3z are transmitted from sensor array 3s to computer 7 and/or touch sensitive screen 4 over transmitting cables 8 or by wireless means. Control of sensor array 3s operation may also be provided by computer 7 over transmitting cables 8 or by wireless transmission. Sensor array 3s data received by computer 7 are processed for analog to digital conversion, if not already digital by nature of sensor array 3s, and micro-computer processed into digitized electronic images which are transmitted over cables 8 to touch sensitive screen 4, which is capable of electronically registering the coordinates on the screen of a manual touch by a human operator. Touch screen 4 displays the digitized images corresponding to the sensor data from sensor array 3s of the inspection zone 3z and the article 6 to be screened within the inspection zone 3z. Alternatively, the sensor array 3s can transmit the images directly to the touch sensitive screen 4.
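A minimal sketch of the data path just described follows, assuming a line-scan sensor whose rows are accumulated into a frame, rescaled to 8-bit grey levels for display, and a touch whose screen coordinates are mapped back to image pixels. The array sizes, the linear scaling, and the coordinate mapping are illustrative assumptions only.

```python
# Illustrative sketch of the digitization path: raw sensor rows are stacked into an
# image, converted to 8-bit for the touch screen, and a registered touch is mapped
# back to a pixel location in that image.
import numpy as np

SENSOR_PIXELS = 512          # detectors across the inspection zone (hypothetical)
ROWS_PER_IMAGE = 400         # line scans accumulated per displayed frame (hypothetical)

def digitize(rows_of_raw_counts):
    """Stack raw line-scan rows and rescale to an 8-bit image for display."""
    frame = np.vstack(rows_of_raw_counts).astype(float)
    lo, hi = frame.min(), frame.max()
    scaled = (frame - lo) / (hi - lo + 1e-9)      # normalize to 0..1
    return (255 * scaled).astype(np.uint8)

def touch_to_pixel(touch_xy, screen_size, image_shape):
    """Map a registered touch (x, y) on the screen to (row, col) in the image."""
    x, y = touch_xy
    w, h = screen_size
    rows, cols = image_shape
    return int(y / h * rows), int(x / w * cols)

rows = [np.random.randint(0, 4096, SENSOR_PIXELS) for _ in range(ROWS_PER_IMAGE)]
image = digitize(rows)
print(image.shape, image.dtype)                             # (400, 512) uint8
print(touch_to_pixel((640, 300), (1280, 1024), image.shape))  # e.g. (117, 256)
```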
The human operator (i.e., Inspector, Supervisor, etc.) views the electronic images on touch screen 4 and manually touches the image of any significant material object within an article 6 being inspected, in this case object image I, which the operator wishes to be identified for further action, such as direct visual inspection, which may require the article 6 to be removed from the stream of articles 6 on conveyor 2. Preselected categories of significant information associated with the operator's touch, such as spatial coordinates describing the location, prior identifications of similar articles, or a highlight of the object on the touch sensitive screen 4, may be registered by touch sensitive screen 4 and transmitted over cables 8 to computer 7. Computer 7 associates the touch screen information of the registered touch with corresponding information in the computer 7 and further associates the information on other touch screens 4i or other computer monitors in the system. Computer 7 then electronically tracks the location of selected article 6 as it is further conveyed along conveyor 2 (as to other connected conveyors for further inspection). Computer 7 may contain a pre-compiled pattern database or identification and pattern recognition algorithms which can perform learning of selections by an operator as the operator makes the selections. Such identification and pattern recognition algorithms may be accomplished by computerized neural networks or other such pattern recognition computer code. Identification by pattern recognition of the objects can be performed by using, for example, edge enhancement and image contour extraction techniques. Further details of pattern recognition and its interaction with robotic systems are described in the published text titled "Robot Vision," Berthold Klaus Paul Horn, MIT Press, Cambridge, Mass. (1991), the disclosure of which is incorporated herein, in its entirety.
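To make the registration-and-association step concrete, the following sketch shows one possible record for a registered touch and a simple distribution of that record to the other display screens in the system. The screen identifiers (4A, 4B, 4S) follow the figures, but the record fields and the broadcast mechanism are assumptions for illustration.

```python
# Illustrative sketch: registering an operator touch as a region-of-interest record
# and keeping the coupled screens (4A, 4B, 4S) in step. Record layout is assumed.
import time
from dataclasses import dataclass, asdict

@dataclass
class RoiRecord:
    article_id: str            # identifier of the bag/article 6 being inspected
    screen_id: str             # screen on which the touch was registered
    pixel_row: int             # touch location within the displayed image
    pixel_col: int
    timestamp: float           # time of the touch, usable for conveyor tracking

class ScreenNetwork:
    """Associates a registered touch with every other screen in the system."""
    def __init__(self, screen_ids):
        self.screens = {sid: [] for sid in screen_ids}

    def register_touch(self, record: RoiRecord):
        for sid in self.screens:
            self.screens[sid].append(asdict(record))

net = ScreenNetwork(["4A", "4B", "4S"])
net.register_touch(RoiRecord("bag-0001", "4A", 117, 256, time.time()))
print(net.screens["4B"][0]["article_id"])   # -> bag-0001
```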
Learning the recognized object patterns can be performed using known neural network or other pattern recognition and learning systems. Neural network techniques for identifying and learning patterns are described in, for example, the published text "Neural Networks for Pattern Recognition," Christopher M. Bishop, Oxford University Press, New York (1995) (hereafter "Bishop"), the disclosure of which is incorporated herein, in its entirety. Bishop, chapters 3-6, describes neural network techniques including single and multiple layer perceptrons as well as different error functions that can be used for training neural networks. Bishop, chapters 9 and 10, describes techniques for learning and generalization in a neural network. Such computerized learning systems are sometimes referred to as artificial intelligence systems.
In this case an operator will initially make touch screen selections of suspicious objects within a mixture in a package or bag to be further examined. As the operator makes selections, the associated electronic images will be processed through the computer algorithms with the imaging patterns distinctive to the selected objects noted by the algorithms. As similar objects are repetitively selected by the operator the computer algorithms associate the distinctive properties of the imaging patterns with objects to be selected for extraction and begin to electronically select similar patterns for extraction without input from the human operator. In this way the computerized system learns those objects to be further examined or diverted and after sufficient learning experience may be able to perform certain screening without input from the operator.
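As a rough sketch of the learning behaviour described above, the example below trains a small neural-network classifier on feature vectors from operator-selected and operator-ignored regions, and only begins proposing selections on its own once it has seen enough examples. The feature representation, the scikit-learn classifier, and the experience threshold are assumptions for illustration, not the application's specified method.

```python
# Illustrative sketch (assumptions, not the patent's specified algorithm): learn from
# the operator's touch-screen selections and, after enough examples, begin proposing
# similar regions automatically. Features are random stand-ins for real ROI
# descriptors (e.g., edge/contour statistics from the x-ray image).
import numpy as np
from sklearn.neural_network import MLPClassifier

MIN_EXAMPLES = 50          # hypothetical experience threshold before auto-selection

class SelectionLearner:
    def __init__(self):
        self.features, self.labels = [], []
        self.model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000)

    def record(self, roi_features, selected_by_operator: bool):
        self.features.append(roi_features)
        self.labels.append(int(selected_by_operator))

    def maybe_train(self):
        if len(self.labels) >= MIN_EXAMPLES and len(set(self.labels)) > 1:
            self.model.fit(np.array(self.features), np.array(self.labels))
            return True
        return False

    def propose(self, roi_features, threshold=0.9):
        """True when the learned model is confident the operator would select this ROI."""
        prob = self.model.predict_proba([roi_features])[0][1]
        return prob >= threshold

rng = np.random.default_rng(0)
learner = SelectionLearner()
for _ in range(60):                                            # simulated operator history
    x = rng.normal(size=8)
    learner.record(x, selected_by_operator=x[0] > 0.5)         # toy selection rule
if learner.maybe_train():
    print(learner.propose(rng.normal(size=8)))
```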
The choice of using a touch screen 4 for making the selection of objects I to be further examined or extracted from the mixture is a matter of preference. Similar pointing devices interfaced to a display screen could be used such as a computer mouse, a track ball, a joystick, a touch pad, a light pen, or other such device. The inventors have chosen the touch screen 4 as the preferred pointing device based upon their intensive studies of some of these various types of devices for screening applications.
Figure 2 illustrates a bag screening system 1 with multiple security checkpoints for a building site, such as a public building being any structure from a courthouse to an airport. The system 1 incorporates a computer processing unit 7 connected to a plurality of screening units 3s, each of which in the illustrated embodiment is a conventional bag x-ray system. Alternative or additional screening units based on other energy levels/frequencies may be incorporated to take advantage of transmission, reflection or absorption of such energies by particular materials for which the screening is intended. In the present embodiment, an article 6 of baggage, mail or the like is placed on a conveyor 2 to transit the screening area (the irradiation zone of the x-ray) and the image of the transiting article 6 is displayed on screens 4.
The screens of Figure 3 show the status of the touch screens 4 as an x-ray image of an article of baggage 6 is first displayed. If there are no regions of interest (ROIs) found by the inspector at Touch Screen 4A (as perhaps located at the initial inspection site, whether at a security area on an airport concourse or in a remote location in the bag handling area such as prior to aircraft loading) or by the Supervisor at the Supervisory Touch Screen 4S, then the status of the touch screens 4 remains unchanged until the bag 6 passes out of the x-ray machine 3 and a new article of baggage 6 is introduced.
Figure 4 illustrates the operation of the screening system 1 after an Inspector at Touch Screen 4A touches a suspicious object I on the screen. An identifying circle C marks the region of interest on screen 4A and the image is repeated on screens 4B and 4S. The image on Touch Screen 4B remains on the screen as a guide for continued searching until the bag 6 in question is opened, searched or otherwise evaluated and the suspicious object I is determined to be acceptable or non-acceptable cargo. A commercial bag/luggage tracking system, of a type already known to those skilled in the art but integrated to include the scanning capabilities of the present invention, ensures that the baggage article 6 which requires further search is extracted from the baggage flow, as by being diverted to a dedicated conveyor, thereby preventing continuation through the system to cargo loading.
In the meantime, while the contaminated baggage article is rerouted for further examination, Figure 5 illustrates the continued flow of additional baggage articles through the x-ray system for inspection. Noteworthy is that, in the method of the present invention, Touch Screen 4A and the Supervisory Touch Screen 4S show the image of the current article of baggage 6 transiting the system while also showing a smaller inset screen Ai until the previously identified contaminated baggage article has been cleared. Touch Screen 4B continues to hold the image of the contaminated baggage article until the article is opened, searched, evaluated and the suspicious object is identified as acceptable or non-acceptable cargo. In keeping with the inventive method, Touch Screen 4B displays an insert note An with the message "Searched?", which is maintained until the article of baggage has been processed and a determination made of its acceptability. Once the bag article is cleared at the station having Touch Screen 4B, and the message "Searched?" is touched, the screens 4 revert to the image of the currently viewed article, and the inset images Ai disappear. Touch Screen 4B may alternatively be provided with a Categorization Screen as is illustrated in Figure 6.
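The screen behaviour just described (the held image and "Searched?" note on 4B, the inset image Ai on 4A and 4S) can be read as a small state machine. The sketch below models it that way; the state names and transitions are an illustrative reading of Figures 3 through 5, not code from the application.

```python
# Illustrative sketch of the screen flow in Figures 3-5: a touch on 4A causes 4B to
# hold the marked image with a "Searched?" note while 4A and 4S show the current bag
# plus an inset; touching "Searched?" after clearance returns all screens to normal.
class ScreeningScreens:
    def __init__(self):
        self.current_image = None     # bag now transiting the x-ray machine (4A, 4S)
        self.held_image = None        # marked image held on Touch Screen 4B
        self.inset_on_4A_4S = False   # inset screen Ai shown while a search is pending

    def show_new_bag(self, image):
        self.current_image = image

    def touch_roi(self, image_with_circle):
        self.held_image = image_with_circle      # 4B holds the image, shows "Searched?"
        self.inset_on_4A_4S = True

    def touch_searched(self):
        # searcher has opened/evaluated the bag and touched the "Searched?" note
        self.held_image = None
        self.inset_on_4A_4S = False

screens = ScreeningScreens()
screens.show_new_bag("bag-0002.png")
screens.touch_roi("bag-0001-circled.png")
print(screens.inset_on_4A_4S)   # True: a search is pending
screens.touch_searched()
print(screens.inset_on_4A_4S)   # False: screens revert to the current bag only
```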
With the categorization method included, the present system may incorporate the evaluation or categorization of the noted object I. In the present embodiment, the following evaluations may be made: NO RESTRICTED OBJECT FOUND; FIREARM; KNIFE; SHARP, POINTED OBJECT; SPRAY CAN; BOMB; EXPLOSIVE; AMMUNITION. Other designations may be added or substituted without departing from the scope of the present invention. When the evaluation is made, it may be entered by merely touching the appropriate line (as is well known in the art), and that evaluation may be stored in the computer 7, which provides the overall control of system components and compilation and exercise of data received and output. The intent of the system is to digitize and store in a database all x-ray images and all regions of interest (ROIs) marked via the touch screen, including those not selected on a touch screen for search. The images are all date and time stamped and identified as to the individual security station (e.g., on a concourse or in a bag handling area). Additionally, corresponding digitized photographic images of the articles, digitized photographic images of the person who presented the bag or articles for inspection, and other pertinent information may also be stored along with the digitized x-ray images. This procedure provides a searchable history of all articles and people carrying the articles entered into the system, touch screen inspector responses, and search results should review be needed at any time for verification of security. The database of images and corresponding identification information provides a capability for processing the data to compile correlations, associations, and histories of objects, and individuals connected thereto, entering a particular secure premises through the system. As should be understood, the body of collected information may be used in investigative issues and for predictive purposes. Additionally, having digitized images allows the application of image processing which can be used for computerized examination of images for identification of suspicious objects. This may assist the visual inspection of the images by security personnel. It is a collateral use of the inventive system to be a learning/teaching tool enabling human inspectors to identify suspicious objects to the system through the touch screen interface as the system "learns" the characteristics of particular objects and itself alerts the human inspectors to non-acceptable objects.
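To make the database function concrete, here is a minimal sketch of how the date- and time-stamped images, station identifiers, touch-screen ROIs, categorizations, and search outcomes described above could be stored, using SQLite purely as an example back end. The table and column names are assumptions, not taken from the application.

```python
# Illustrative sketch: a searchable store for scans, regions of interest, and search
# results. SQLite is used only as an example; the schema is hypothetical.
import sqlite3

schema = """
CREATE TABLE scans (
    scan_id     INTEGER PRIMARY KEY,
    station_id  TEXT NOT NULL,          -- e.g. concourse or bag-handling station
    scanned_at  TEXT NOT NULL,          -- date and time stamp
    xray_image  BLOB NOT NULL,          -- digitized x-ray image
    photo_image BLOB                    -- optional photo of the article / presenter
);
CREATE TABLE rois (
    roi_id        INTEGER PRIMARY KEY,
    scan_id       INTEGER REFERENCES scans(scan_id),
    inspector     TEXT,
    pixel_row     INTEGER,
    pixel_col     INTEGER,
    category      TEXT,                 -- e.g. FIREARM, KNIFE, NO RESTRICTED OBJECT FOUND
    search_result TEXT                  -- outcome entered by the search personnel
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
conn.execute(
    "INSERT INTO scans (station_id, scanned_at, xray_image) VALUES (?, ?, ?)",
    ("concourse-B-3", "2004-07-30T09:15:00", b"\x00"),   # hypothetical example record
)
conn.execute(
    "INSERT INTO rois (scan_id, inspector, pixel_row, pixel_col, category) "
    "VALUES (1, 'inspector-4A', 117, 256, 'KNIFE')",
)
print(conn.execute("SELECT category FROM rois WHERE scan_id = 1").fetchall())
```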
Figure 7 illustrates the hardware interconnection of multiple scanners according to the present invention. There can be as many Inspection Stations and Supervisor Stations on the local area network (LAN) as is desired. The LAN can be within one installation, such as an airport or public building, or over a wide area such as being distributed over a geographic region, such as a state or country. In the context of the single or multiple area disposition, a Supervisor Station may become an Image Inspection Station, wherein image inspection takes place remotely from the entrance station (as with the screening station at the security check point at an airport). In that case, the Image Inspector 1 may be located at Supervisor Station 1 (which now becomes an inspection station) with another Supervisor Station located on the LAN which will provide supervisory functions. In such an embodiment, the Touch Screen for Image Inspector 1 is unmanned.
Figure 8 shows a block diagram of software interfaces for implementing the preferred embodiment of the invention depicted in Figure 1. In Figure 8 the sensor array 3s sends data through A/D converter 46 to microprocessor 47. A human operator touches 48 the touch-sensitive screen 4 on the displayed image to identify an object I (Figure 4) for further examination. This input to touch-sensitive screen 4 is tagged 49 by microprocessor 47 as a selection icon 51 in microprocessor 47 host memory and on touch screen 4. The object is further identified on associated touch screens such as 4A, 4B and 4S. Depending upon the implemented embodiment, additional subroutines for identification and data storage may be implemented into and by the computer 7, such as those described in connection with Figure 6.

Figure 9 shows a block diagram of a hardware implementation for the preferred embodiment of Figures 1 and 8. In this case an x-ray screen 56 is used to display images derived from sensor array 3s (Figures 1 and 8) and interfaces to computer 7 (which incorporates microprocessor 47) through a frame grabber card 57. Images from the x-ray screen 56 are displayed as digitized or analog electronic images on the touch screen 4. User touch input 48 (Figure 8) is sent back to computer 7 and processed in conjunction with the electronic images and those additional steps of identification and sequencing described and illustrated in connection with Figures 2 through 7.
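A minimal sketch of the touch-tagging step of Figure 8, converting a touch point 48 into a selection icon 51 held in host memory, is given below in Python. The rectangular icon size and the in-memory list structure are assumptions made for illustration only.

```python
# Sketch of tagging a touch on the displayed x-ray image as a selection icon,
# roughly following the touch-input path of Figure 8. Icon size and storage
# structure are assumptions.

SELECTION_ICONS = []          # selection icons held in host memory (item 51)
ICON_HALF_SIZE = 20           # half-width of the marker rectangle, in pixels

def tag_touch(x, y, image_width, image_height):
    """Convert a touch point (item 48) into a rectangular selection icon."""
    icon = {
        "x0": max(0, x - ICON_HALF_SIZE),
        "y0": max(0, y - ICON_HALF_SIZE),
        "x1": min(image_width - 1, x + ICON_HALF_SIZE),
        "y1": min(image_height - 1, y + ICON_HALF_SIZE),
    }
    SELECTION_ICONS.append(icon)   # registered in memory and redrawn on 4, 4A, 4B, 4S
    return icon

# Example: operator touches near the centre of a 640 x 480 displayed image.
print(tag_touch(320, 240, 640, 480))
```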
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
Claims

What is claimed is:

1. A method for inspecting a series of articles containing discrete objects to select for further examination discernable objects within the series comprising steps of:
(a) conveying a series of articles containing discrete objects into and through an inspection zone;
(b) irradiating said series of articles containing discrete objects with incident electromagnetic radiation while in said inspection zone;
(c) measuring the electromagnetic radiation emanating from said irradiated series of articles containing discrete objects;
(d) processing said measured electromagnetic radiation to produce one or more electronic images of irradiated articles within said irradiated series of articles containing discrete objects and presenting said electronic images for visual display;
(e) interactively selecting, from said visual display, one or more discrete objects discernable within the images of said series of articles containing one or more discrete objects; and
(f) displaying said selections on additional displays of said images of said series of articles containing discrete objects.
2. The method of claim 1 wherein said selections are visually marked on the images of said series of articles containing discrete objects presented on said displays.
3. The method of claim 1 wherein said selections are electronically registered within a computer.
4. The method of claim 3 wherein said selections are stored in a searchable database of a computer along with said electronic images of said series of articles containing discrete objects.
5. The method of claim 1 wherein said selecting takes place on a visual display at a location remote from the location where said irradiating occurs.
6. The method of claim 3 wherein said electronically registered selections are processed by computer algorithms such that the algorithms learn over time the characteristics of similar selected objects.
7. The method of claim 6 wherein said algorithms utilize said learned characteristics to perform computerized selection of similar objects.
8. A method for inspecting a series of articles containing discrete objects and selecting for further examination discernable objects within the series comprising steps of:
(a) conveying a series of articles containing discrete objects into and through an inspection zone;
(b) irradiating said series of articles containing discrete objects with incident electromagnetic radiation while in said inspection zone;
(c) measuring the electromagnetic radiation emanating from said irradiated series of articles containing discrete objects;
(d) processing said measured electromagnetic radiation to produce one or more electronic images of said irradiated series of articles containing discrete objects and presenting said electronic images as two separate identical visual displays;
(e) first interactively selecting, from a first visual display, first selected objects discernable within the images of said series of articles containing discrete objects;
(f) second interactively selecting, from a second visual display, second selected objects discernable within the images of said series of articles containing discrete objects; and
(g) comparing said first selected objects with said second selected objects to determine relative efficiencies of said first interactively selecting and said second interactively selecting.
9. The method according to claim 8 wherein said second visual display is located remotely from said first visual display.
10. A method for inspecting a series of articles containing discrete objects and selecting for further examination discernable objects within the series comprising steps of:
(a) conveying a series of articles containing discrete objects into and through an inspection zone;
(b) irradiating said series of articles containing discrete objects with incident electromagnetic radiation while in said inspection zone;
(c) measuring the electromagnetic radiation emanating from said irradiated series of articles containing discrete objects;
(d) processing said measured electromagnetic radiation to produce one or more electronic images of said irradiated series of articles containing discrete objects and presenting said electronic images for visual display;
(e) first interactively selecting, from said visual display, selected objects discernable within the images of said series of articles containing discrete objects;
(f) electronically registering said selected objects within a computer;
(g) applying computerized algorithms which have learned characteristics of prior interactively selected objects to process said electronic images and make computerized selections of objects displaying characteristics of prior interactively selected objects;
(h) comparing said electronically registered first interactively selected objects to said computerized selections of objects to determine relative efficiency of said first interactively selecting to the computerized selecting.
11. The method according to claim 4 wherein said selections along with said electronic images of said series of articles containing discrete objects are stored in a searchable database along with digitized photographic images of individuals associated with said series of articles containing discrete objects.
12. The method according to claim 11 wherein said selections, electronic images, and digitized photographic images are stored in a computer searchable database along with other information associated with said series and said individuals.
13. The method according to claim 4 where said searchable database is analyzed to compile correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
14. The method according to claim 11 where said searchable database is analyzed to compile correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
15. The method according to claim 12 where said searchable database is analyzed to compile correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
16. An apparatus for inspecting a series of articles containing discrete objects to select for further examination discernable objects within the series comprising:
(a) a conveyor for conveying a series of articles containing discrete objects into and through an inspection zone;
(b) an electromagnetic radiation source that irradiates said series of articles containing discrete objects within said inspection zone with incident electromagnetic radiation;
(c) a sensor that examines said series of articles containing discrete objects within said inspection zone and measures electromagnetic radiation emanating from said series of articles containing discrete objects;
(d) a computer that processes said measured electromagnetic radiation to produce electronic images of irradiated articles within said irradiated series of articles containing discrete objects as they pass through said inspection zone and a display coupled to said computer that presents a visual display of said electronic images; and
(e) a human operator interface to said visual display such that an operator may interactively select from said images selected objects to be further examined; and
(f) means for transmitting said images with said interactive selections to additional visual displays.
17. The apparatus of claim 16 wherein said selections are visually marked on the images of said series of articles containing discrete objects presented on said displays.
18. The apparatus of claim 16 wherein said selections are electronically registered within a computer.
19. The apparatus of claim 18 wherein said selections are stored in a searchable database in said computer along with said electronic images of said series of articles containing discrete objects.
20. The apparatus of claim 16 wherein said human operator interface and said visual display is located remote from said inspection zone.
21. The apparatus of claim 18 wherein said electronically registered selections are processed by computer algorithms that learn over time the characteristics of similar selected objects.
22. The apparatus of claim 21 wherein said algorithms utilize said learned characteristics to perform computerized selection of similar objects.
23. Apparatus for inspecting a series of articles containing discrete objects and selecting for further examination discernable objects within the series comprising: (a) a conveyor that conveys a series of articles containing discrete objects into and through an inspection zone;
(b) an electromagnetic radiation source that irradiates said series of articles containing discrete objects within said inspection zone with incident electromagnetic radiation;
(c) a sensor that examines said series of articles containing discrete objects within said inspection zone and measures electromagnetic radiation emanating from said series of articles containing discrete objects;
(d) a microprocessor that processes said measured electromagnetic radiation to produce electronic images of said irradiated series of articles containing discrete objects as they pass through said inspection zone and two separate displays coupled to said microprocessor that present identical visual displays of said electronic images;
(e) a human operator interface to each of said visual displays such that an operator positioned at each display may interactively select from said images selected objects to be further examined; and
(f) means to compare objects selected at each of the two visual displays to determine relative selection efficiencies of operators utilizing the said interface at each display to select objects to be further examined.
24. The apparatus of claim 23 wherein said second visual display is located remotely from said first visual display.
25. Apparatus for inspecting a series of articles containing discrete objects and selecting for further examination discernable objects within the series comprising:
(a) a conveyor that conveys a series of articles containing discrete objects into and through an inspection zone;
(b) an electromagnetic radiation source that irradiates said series of articles containing discrete objects within said inspection zone with incident electromagnetic radiation; (c) a sensor that examines said series of articles containing discrete objects within said inspection zone and measures electromagnetic radiation emanating from said series of articles containing discrete objects;
(d) a microprocessor that processes said measured electromagnetic radiation to produce electronic images of said irradiated series of articles containing discrete objects as they pass through said inspection zone and a display coupled to said microprocessor that presents a visual display of said electronic images;
(e) a human operator interface to said visual display such that an operator may interactively select from said images selected objects to be further examined;
(f) a computer for electronically registering said selected objects;
(g) computerized algorithms which can learn characteristics of prior interactively selected objects to process said electronic images and make computerized selections of objects displaying characteristics of prior interactively selected objects; and
(h) means to compare said electronically registered first interactively selected objects to said computerized selections of objects to determine relative efficiency of said human operator selecting to the computerized selecting.
26. The apparatus of claim 25 wherein said selections along with said electronic images of said series of articles containing discrete objects are stored in a searchable database along with digitized photographic images of individuals associated with said series of articles containing discrete objects.
27. The apparatus of claim 26 wherein said selections, electronic images, and digitized photographic images are stored in a searchable database along with other information associated with said series and said individuals.
28. The apparatus of claim 26 further comprising means to analyze said searchable database to compile at least one of correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
29. The apparatus of claim 26 further comprising means to analyze said searchable database to compile at least one of correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
30. The apparatus of claim 28 further comprising means to analyze said searchable database to compile at least one of correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
31. A method for inspecting a series of articles containing discrete objects to select for further examination an article or articles within the series comprising steps of: a) conveying a series of articles containing discrete objects into and through an inspection zone; b) irradiating said series of articles containing discrete objects with incident electromagnetic radiation while in said inspection zone; c) measuring the electromagnetic radiation emanating from said irradiated series of articles containing discrete objects; d) processing said measured electromagnetic radiation to produce one or more electronic images of irradiated articles within said irradiated series of articles containing discrete objects and presenting said electronic images for visual display; e) interactively selecting, from said visual display, one or more articles within the images of said series of articles containing discrete objects; and f) displaying said selections on additional displays of said images of said series of articles containing discrete objects.
32. The method of claim 31 wherein said selections are visually marked on the images of said series of articles containing discrete objects presented on said displays.
33. The method of claim 31 wherein said selections are electronically registered within a computer.
34. The method of claim 33 wherein said selections are stored in a searchable database of a computer along with said electronic images of said series of articles containing discrete objects.
35. The method of claim 31 wherein said selecting takes place on a visual display at a location remote from the location where said irradiating occurs.
36. The method according to claim 34 wherein said selections along with said electronic images of said series of articles containing discrete objects are stored in a searchable database along with digitized photographic images of individuals associated with said series of articles containing discrete objects.
37. The method according to claim 36 wherein said selections, electronic images, and digitized photographic images are stored in a computer searchable database along with other information associated with said series and said individuals.
38. The method according to claim 34 where said searchable database is analyzed to compile correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
39. The method according to claim 36 where said searchable database is analyzed to compile correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
40. The method according to claim 37 where said searchable database is analyzed to compile correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
41. An apparatus for inspecting a series of articles containing discrete objects to select for further examination articles within the series comprising: a) a conveyor for conveying a series of articles containing discrete objects into and through an inspection zone; b) an electromagnetic radiation source that irradiates said series of articles containing discrete objects within said inspection zone with incident electromagnetic radiation; c) a sensor that examines said series of articles containing discrete objects within said inspection zone and measures electromagnetic radiation emanating from said series of articles containing discrete objects; d) a computer that processes said measured electromagnetic radiation to produce electronic images of irradiated articles within said irradiated series of articles containing discrete objects as they pass through said inspection zone and a display coupled to said computer that presents a visual display of said electronic images; e) a human operator interface to said visual display such that an operator may interactively select from said images selected articles to be further examined; f) means that transmits said images with said interactive selections to additional visual displays.
42. The apparatus of claim 41 wherein said selections are visually marked on the images of said series of articles containing discrete objects presented on said displays.
43. The apparatus of claim 41 wherein said selections are electronically registered within a computer.
44. The apparatus of claim 43 wherein said selections are stored in a searchable database in said computer along with said electronic images of said series of articles containing discrete objects.
45. The apparatus of claim 41 wherein said human operator interface and said visual display is located remote from said inspection zone.
46. The apparatus of claim 44 wherein said selections along with said electronic images of said series of articles containing discrete objects are stored in a searchable database in said computer along with digitized photographic images of individuals associated with said series of articles containing discrete objects.
47. The apparatus of claim 46 wherein said selections, electronic images, and digitized photographic images are stored in said searchable database in said computer along with other information associated with said series and said individuals.
48. The apparatus of claim 44 further comprising means to analyze said searchable database to compile at least one of correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
49. The apparatus of claim 46 further comprising means to analyze said searchable database to compile at least one of correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
50. The apparatus of claim 47 further comprising means to analyze said searchable database to compile at least one of correlations, associations, and histories of objects and individuals related thereto, entering a particular secure premises.
51. A method for inspecting a series of articles selected from the group consisting of baggage, luggage, and packages, the method comprising steps of: a) conveying a series of articles at least into and through an inspection zone; b) irradiating an article of said series of articles with electromagnetic radiation while in said inspection zone; c) collecting at least some of said electromagnetic radiation of said irradiating step; d) providing electronic image data indicative of said article and a content thereof based on said electromagnetic radiation of said collecting step; e) determining if said article includes an object selected from the group consisting of a firearm, a knife, a sharp object, a pointed object, a spray can, a bomb, an explosive composition, ammunition, and combinations thereof, wherein said determining step comprises utilizing said electronic image data; f) electronically tracking a location of said article; and g) separating said article from a remainder of said series of articles upon determining a presence of a said object.
52. The method of claim 51 wherein said separating step is accomplished electronically.
53. The method of claim 51 wherein said separating step is performed manually.
54. The method of claim 51 wherein said determining step is accomplished electronically.
55. The method of claim 54 wherein said separating step is accomplished electronically.
56. The method of claim 54 wherein said separating step is performed manually.
57. The method of claim 51 wherein said determining step comprises comparing said electronic image data with predetermined image characteristics data of at least one said object.
58. The method of claim 51 further comprising displaying at least first and second visual images of said article based on said electronic image data of said providing step.
59. The method of claim 58 wherein the first visual image is displayed at a remote location from said second visual image.
60. The method of claim 59 wherein the first and second images are displayed in real time.
61. The method of claim 51 further comprising: displaying a visual image of said article based on said electronic image data of said providing step, wherein said displaying step occurs at a first location; and forwarding at least one of said visual image and said electronic image data to a second location remote from said first location.
62. A system for inspecting a series of articles selected from the group consisting of baggage, luggage, and packages, the system comprising: a) a conveyor for conveying a series of articles; b) an inspection area located along at least a portion of said conveyor, wherein said inspection area comprises: i. an electromagnetic radiation source for irradiating an article that is conveyed through said inspection area; and ii. a sensor that collects electromagnetic radiation utilized to irradiate said article and that converts said electromagnetic radiation into electrical signals; c) a computer that processes said electrical signals to produce electronic image data indicative of said article; and d) a first station comprising: i. a first display electrically interconnected with said computer, wherein said first display provides at least one visual image based on said electronic image data, and wherein said at least one visual image is indicative of said article and a content thereof; and ii. a first operator interface electrically interconnected with said computer, wherein said first operator interface enables a first operator to select an image from said at least one image, said image being representative of said article.
63. The system of claim 62 further comprising an electronic tracking system for tracking said article relative to said conveyor.
64. The system of claim 63 further comprising automated means for separating said article from a remainder of said series of articles.
65. The system of claim 62 further comprising means for separating said article from a remainder of said series of articles.
66. The system of claim 65 wherein said means for separating is automated.
67. The system of claim 62 further comprising a database of electronic image data indicative of at least one object selected from the group consisting of a firearm, a knife, a sharp object, a pointed object, a spray can, a bomb, an explosive composition, ammunition, and combinations thereof.
68. The system of claim 62 further comprising a second station remote from said first station, said second station comprising: i. a second display for providing said at least one visual image based on said electronic image data, wherein said at least one visual image is indicative of said article and said content thereof; and ii. a second operator interface to enable a second operator to interactively select an image from said at least one image, said image being representative of said article.
69. The system of claim 68 further comprising automated means for separating said article from a remainder of said series of articles, wherein both of said first and second stations are electrically interconnected with said automated means.
70. The system of claim 69 further comprising an electronic tracking system for tracking said article relative to said conveyor, wherein said first and second stations are electrically interconnected with said electronic tracking system.
PCT/US2004/024607 2003-09-10 2004-07-30 Computer assisted bag screening system WO2006022660A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/US2004/024607 WO2006022660A2 (en) 2004-07-30 2004-07-30 Computer assisted bag screening system
EP04821375A EP1685574A4 (en) 2003-09-10 2004-09-10 Method and apparatus for improving baggage screening examination
PCT/US2004/029616 WO2005086616A2 (en) 2003-09-10 2004-09-10 Method and apparatus for improving baggage screening examination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2004/024607 WO2006022660A2 (en) 2004-07-30 2004-07-30 Computer assisted bag screening system

Publications (2)

Publication Number Publication Date
WO2006022660A2 true WO2006022660A2 (en) 2006-03-02
WO2006022660A3 WO2006022660A3 (en) 2007-03-01

Family

ID=35967946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/024607 WO2006022660A2 (en) 2003-09-10 2004-07-30 Computer assisted bag screening system

Country Status (1)

Country Link
WO (1) WO2006022660A2 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6707879B2 (en) * 2001-04-03 2004-03-16 L-3 Communications Security And Detection Systems Remote baggage screening system, software and method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3164862A1 (en) * 2014-07-01 2017-05-10 Smiths Heimann GmbH Projection of hazardous items into x-ray images of inspection objects
JP2019527848A (en) * 2016-11-24 2019-10-03 ヌクテック カンパニー リミテッド Security inspection simulation training apparatus, method and system
EP3547286A4 (en) * 2016-11-24 2020-06-17 Nuctech Company Limited Simulated training apparatus, method and system for security check

Also Published As

Publication number Publication date
WO2006022660A3 (en) 2007-03-01

Similar Documents

Publication Publication Date Title
US7012256B1 (en) Computer assisted bag screening system
US7286634B2 (en) Method and apparatus for improving baggage screening examination
US8031903B2 (en) Networked security system
US10275660B2 (en) Method and system for use in performing security screening
US6218943B1 (en) Contraband detection and article reclaim system
RU2399955C2 (en) System for detecting multiple threats
WO2005086616A2 (en) Method and apparatus for improving baggage screening examination
CN105510992B (en) The method for tracing and system of target item
US20050198226A1 (en) Security system with distributed computing
US11544533B2 (en) Network of intelligent machines
CN112612066B (en) Personnel security inspection method and personnel security inspection system
US20090313078A1 (en) Hybrid human/computer image processing method
WO2006022660A2 (en) Computer assisted bag screening system
US20230153657A1 (en) Network of intelligent machines
WO2010140943A1 (en) Concurrent multi-person security screening system
Schatzki et al. Visual detection of particulates in x-ray images of processed meat products
Siegel et al. Robotics Systems for Deployment of Explosive Detection Sensors and Instruments
Fobes et al. Aviation security human factors test and evaluation master plan for the airport demonstration
Snyder Test and Evaluation Plan for the Checkpoint Evaluation at the Hartsfield Atlanta International Airport
Micklich et al. Evaluation and analysis of non-intrusive techniques for detecting illicit substances

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase