WO1989000741A1 - Identification information storage and retrieval - Google Patents

Identification information storage and retrieval

Info

Publication number
WO1989000741A1
WO1989000741A1 PCT/US1988/002503 US8802503W WO8900741A1 WO 1989000741 A1 WO1989000741 A1 WO 1989000741A1 US 8802503 W US8802503 W US 8802503W WO 8900741 A1 WO8900741 A1 WO 8900741A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
bit stream
pixel values
individual
signature
Prior art date
Application number
PCT/US1988/002503
Other languages
French (fr)
Inventor
William F. Huxley
John C. Correa
Gary R. Greenfield
Joseph P. Cossette
Original Assignee
Etc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Etc. filed Critical Etc.
Priority to BR888807142A priority Critical patent/BR8807142A/en
Publication of WO1989000741A1 publication Critical patent/WO1989000741A1/en
Priority to DK143589A priority patent/DK143589A/en
Priority to KR1019890700521A priority patent/KR890702156A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K3/00Methods or arrangements for printing of data in the shape of alphanumeric or other characters from a record carrier, e.g. interpreting, printing-out from a magnetic tape
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/20Individual registration on entry or exit involving the use of a pass
    • G07C9/22Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G07C9/24Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder by means of a handwritten signature
    • G07C9/247Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder by means of a handwritten signature electronically, e.g. by comparing signal of hand-writing with a reference signal from the pass
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/20Individual registration on entry or exit involving the use of a pass
    • G07C9/22Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G07C9/25Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G07C9/253Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition visually
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/20Individual registration on entry or exit involving the use of a pass
    • G07C9/22Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G07C9/25Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G07C9/257Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition electronically


Abstract

A method for compressing a face image (62) into a digital bit stream; the method includes the steps of converting the image (62) to an array of raw pixel values; reducing the image (62) to generate a smaller array of pixel values (72); reducing the gray scale range of the pixel values in the smaller array (74); deriving data words indicative of the pixel values (74); and encoding the data words of the bit stream in accordance with the code of the type in which more frequently occurring data words are encoded as shorter sets of bits (76).

Description

Identification Information Storage and Retrieval
Background of the Invention
This invention relates to digitally storing and retrieving information identifying individuals, e.g., images of their faces and signatures.
Such information may be used, for example, when a purchaser gives a credit card to a merchant to buy something. Conventionally, the merchant may check the purchaser's signature against the signature on the back of the card and may try to confirm the purchaser's identity by comparing his face with the picture on his driver's license.
One digital technique for confirming a person's identity, proposed in United States Patent 4,636,622, issued January 13, 1987, involves first digitally storing the person's fingerprint in a central location. When an identification needs to be made, the person's fingerprint is again taken on a machine capable of converting the fingerprint to digital data. The data is sent to the central station where it is compared to the stored fingerprint. A signal is sent back to the remote station if the fingerprints match.
Summary of the Invention
A general feature of the invention provides a method for compressing a face image into a digital bit stream; the method includes the steps of converting the image to an array of raw pixel values; reducing the image to generate a smaller array of pixel values; reducing the gray scale range of the pixel values in the smaller array; deriving data words indicative of the pixel values; and encoding the data words of the bit stream in accordance with a code of the type in which more frequently occurring data words are encoded as shorter sets of bits.
Preferred embodiments include the following features.
The method may be adapted for confirming, at multiple locations, the identities of a predetermined group of individuals; the method would then further include the steps of capturing information about the face image that identifies each individual for compressing into the digital bit stream; recording the compressed bit stream on a pocket-size recording medium to be provided to each individual; and providing apparatus at each location for recovering the image from the recorded bit stream, for confirming the identities of the individuals. The recording medium may be a magnetic strip, or a printed strip, in either case part of a plastic card. The image may include a signature image. In the case of central storage of the bit streams, each individual may be provided with a key to the storage address at which the compressed bit stream for the individual is recorded at the central location, and the bit stream is delivered to the local apparatus on the basis of the key information. The key is stored as bits on a pocket-size recording medium to be provided to the individual.
The bit stream is no longer than 5,000 bits. The image is reduced to the smaller array to a degree specified by a user based on viewing the image. Each data word, prior to encoding, is the difference between two of the reduced gray-scale pixel values. The method also includes remapping the pixel values to a coarser range of pixel values. The remapping is more sensitive to tones in the middle of the scale than at the extremes of the scale. As a result, individuals can be rapidly and accurately identified based on their face images and signature images. The face and signature images are stored on a card that can be easily carried by the individual and yet be difficult to counterfeit.
Alternatively, access by the individual to the face and signature images can be restricted by storing all of the images centrally. The total amount of bits required to store the face and signature images is small, reducing the amount of time needed to transmit them and the amount of space needed to store them.
Other advantages and features will become apparent from the following description of the preferred embodiment, and from the claims.
Description of the Preferred Embodiment
We first briefly describe the drawings.
Fig. 1 is a block diagram of an identification system.
Fig. 2 is a block diagram of a capture terminal or unit.
Fig. 3 is a diagram of a captured frame illustrating cropping.
Fig. 4 is a flow chart of the routine for processing a face image. Fig. 5 is a diagram of the scaling process.
Fig. 6 is a flow chart of the routine for processing a signature.
Fig. 7 shows the make-up of an identification packet. Fig. 8 is a block diagram of a display terminal or unit.
Figs. 9, 10 are flow charts respectively of the routines for displaying a face image and a signature. Fig. 11 is a diagram of the bilinear interpolation process.
Fig. 12 is a diagram of a credit card.
Structure
Referring to Fig. 1, in a system 10 for digitally storing and retrieving identification information (e.g., face images and signatures of individuals) the images and signatures are captured by so-called capture terminals, e.g., the terminal labeled 12. To identify an individual whose image and signature have been captured, a display terminal, e.g., terminal 14, displays a previously stored image and signature which purportedly correspond to the individual who is to be identified. Capture terminal 12 is connected by a dedicated telephone line 16 to a central station 18. Line 16 carries digital data representing a captured image and signature to be held in conventional storage 20 in central station 18. Central station 18 includes conventional communication apparatus 22 for receiving and sending the digital data and a conventional data processing device 24 for supervising the communication apparatus and processing the stored data, among other things. Display terminal 14 also is connected to central station 18 by a conventional dial-up telephone line 26 for receiving data representing stored images and signatures for use in identifying individuals. A combined capture terminal and display terminal 28, capable of performing both the capture and display functions, is connected to central station 18 via a dial-up line 30. Another display terminal 32 is connected via a satellite link 34 to central station 18. Free-standing capture and display units 36, 38 may also be provided. Capture unit 36 is not connected to central station 18 and instead stores the image and signature information, e.g., on the magnetic stripe of a conventional credit card. Display unit 38 similarly is not connected to central station 18 and displays a signature and image directly from the credit card stripe.
A wide variety of combinations of terminals and units (beyond the exemplary combination shown in Fig. 1) may be provided, including large numbers of capture terminals, display terminals, capture units, and display units, with communication to the central station being over any combination of available communication links. Such a system has a broad range of applications. One example is a system for identifying credit card users. In such a system, the image and signature of each authorized credit card holder would be captured either on the credit card itself or in the central station storage device. Each time an individual presents a credit card for use, the merchant would display the purported image and signature of the individual to verify his identity. The display could be generated either directly from the credit card stripe or from data delivered from the central station. For this purpose other information stored on the credit card may also be useful. For example, an identifying number stored on the card may be used at the central station to point to the stored image and signature that are to be returned to the display terminal.
Referring to Fig. 2, in a capture terminal 12 or a capture unit 36, a conventional monochrome video camera 40 (e.g., model number supplied by ) is positioned to capture a face image of an individual 42 with the aid of appropriate lighting (not shown). A second monochrome video camera 44 (model........) is positioned to capture an image of the signature 46 of individual 42. Signature 46 may be written on a piece of paper and then held in position by a carrier (not shown).
The output of each camera 40, 44 is a conventional raster scanned analog composite NTSC video signal. The NTSC video signals are delivered to a conventional frame grabber 48. A microprocessor controls the operation of cameras 40, 44, and frame grabber 48 (under the direction of a user working at an interactive terminal 52 connected to the microprocessor) so that the face image and the signature image are delivered to frame grabber 48 at the proper times. Interactive terminal 52 may be a personal computer monitor and keyboard which is menu driven.
Frame grabber 48 samples and digitizes the NTSC signals and for each image delivers to a data storage 54 a stream of 307,200 raw pixel values in 480 rows of 640 columns each. Each raw pixel value is eight bits long and the values are generated by the frame grabber so that the set of pixel values for the image are reasonably well distributed among all 256 possible eight-bit pixel values. The arrays of raw pixel values representing both the face image and the signature are processed to generate a compressed bit stream that captures both the image and the signature in a relatively small number of bits, small enough to be stored on the magnetic stripe of a conventional credit card or to be sent over a communication line in a very short time period. The processing of the raw pixel values is done by microprocessor 50 under control of a program stored in conventional program storage, based on input from the user.
Referring to Fig. 3, the raw pixel values generated by camera 40 include information about both the face of the individual and the background surrounding the face. Thus the frame 60 of raw pixel values may be larger than a cropped frame 62 which is still large enough to include all of the information of interest. Referring to Fig. 4, the first step in processing the face image is to crop (70) the frame to include only those pixel values within cropped frame 62. The height and width of cropped frame 62 are indicated by the user via interactive terminal 52 in a conventional manner. For example, the boundary lines of cropped frame 62 can be adjusted using conventional cursor controls until the proper boundary lines are reached. Thus the user can dictate both the aspect ratio of the cropped frame (ratio of height to width) and the total number of pixels remaining in the cropped frame (determined by the absolute magnitudes of the height and width). The primary effect of cropping is to eliminate from the raw pixels those which are irrelevant to the face image so that no effort will be wasted on subsequently processing the irrelevant pixels.
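To make the cropping step concrete, here is a minimal sketch in Python (not the patent's implementation; the frame shape matches the 480 x 640 array described above, but the crop boundaries and variable names are invented for illustration):

```python
import numpy as np

# A full captured frame of 480 x 640 eight-bit raw pixel values, as produced
# by the frame grabber; random data stands in for a real face image here.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)

# Hypothetical boundaries the user might settle on with the cursor controls.
top, bottom, left, right = 40, 440, 160, 480

# Keep only the pixel values inside the cropped frame 62.
cropped = frame[top:bottom, left:right]
print(cropped.shape)   # (400, 320): height, width (and hence aspect ratio) set by the user
```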
The next step is to scale the image (72), that is, to reduce the number of pixels in each dimension by a scaling factor. The user selects the scaling factor via the interactive terminal 52 and the same scaling factor is applied to both dimensions. The scaling is accomplished by a simple averaging process. Referring to Fig. 5, for example, for a scaling factor of 4, each region 80 containing 4 x 4 = 16 pixels 81 in the cropped frame 82 is mapped into a single pixel 84 in the scaled image 86 by averaging the sixteen pixel values of the pixels in region 80.
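A minimal sketch of the block-averaging scaling just described, assuming the cropped dimensions are trimmed to an exact multiple of the scaling factor (function and variable names are illustrative, not from the patent):

```python
import numpy as np

def scale_by_averaging(cropped: np.ndarray, factor: int) -> np.ndarray:
    """Reduce both dimensions by `factor`, replacing each factor x factor
    region with the average of its pixel values (the Fig. 5 process)."""
    h, w = cropped.shape
    h, w = h - h % factor, w - w % factor            # trim so the regions tile exactly
    regions = cropped[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return regions.mean(axis=(1, 3)).astype(np.uint8)

# A scaling factor of 4 maps each 4 x 4 = 16-pixel region to a single pixel.
scaled = scale_by_averaging(np.full((400, 320), 128, dtype=np.uint8), 4)
print(scaled.shape)   # (100, 80)
```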
Referring again to Fig. 4, the pixel values of the scaled image are then remapped (74) from 8 bits to 4 bits. The remapping is done by assigning tones in the 8-bit pixel value range to tones in the remapped 4-bit pixel range as follows. The remapping is center biased, i.e., more sensitive to tones in the middle of the scale than at the extremes of the scale. The remapping table is as follows:
[The remapping table is reproduced only as a figure in the original document.]
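Because the patent's remapping table survives only as a figure, the sketch below merely illustrates one way a center-biased 8-bit-to-4-bit remapping could look: the bin breakpoints are narrow near mid-gray and wide at the extremes, but the specific values are assumptions, not the patent's table.

```python
import numpy as np

# Hypothetical center-biased breakpoints: 16 bins over [0, 255] that are wide
# at the dark and light extremes and narrow around mid-gray, so middle tones
# receive finer quantization.  These are NOT the values from the patent's table.
BREAKPOINTS = np.array([0, 48, 80, 100, 112, 120, 124, 126, 128,
                        130, 132, 136, 144, 156, 176, 208, 256])

def remap_8_to_4(pixels: np.ndarray) -> np.ndarray:
    """Map 8-bit pixel values to 4-bit values (0..15) using the bins above."""
    return np.digitize(pixels, BREAKPOINTS[1:-1]).astype(np.uint8)

print(remap_8_to_4(np.array([0, 127, 128, 255])))   # [ 0  7  8 15]
```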
Next, the cropped, scaled, remapped pixel values are subjected to data compression (76) in accordance with a Shannon-Fano code of the following form. (An example of Shannon-Fano coding is set forth on p. 741 of Pratt, Digital Image Processing, Wiley, New York, 1978.) First the difference between each pixel value and the preceding pixel value (where the pixel values are taken one after another, row by row) is calculated. The output of the coding process is a stream of bits that represent code words in the code based on the magnitude of the calculated difference in the pixel values. If the difference is zero (the most likely situation) the code word is a single "1" bit. If the difference is +1, -1, or +2, the code word is one of three different 3-bit code words (respectively 001, 010, and 011). If the difference is any other value, the code word is a particular 3-bit code word (000) followed by a 4-bit value equal to the magnitude of the actual difference.
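A sketch of an encoder following the code as described in the text (a single "1" for a zero difference, 001/010/011 for +1/-1/+2, and 000 plus a 4-bit magnitude otherwise). The starting reference value and the bit-string representation are assumptions made for readability; note that, as described, the escape code carries only the magnitude of the difference.

```python
def encode_face_pixels(pixels_4bit):
    """Difference-code a row-by-row sequence of 4-bit pixel values with the
    variable-length code described above.  Returns the code words as a string
    of '0'/'1' characters."""
    short_codes = {0: "1", 1: "001", -1: "010", 2: "011"}
    bits = []
    previous = 0                              # assumed initial reference value
    for value in pixels_4bit:
        diff = value - previous
        if diff in short_codes:
            bits.append(short_codes[diff])
        else:
            # Escape code followed by the 4-bit magnitude of the difference
            # (the sign is not carried, per the description in the text).
            bits.append("000" + format(abs(diff) & 0xF, "04b"))
        previous = value
    return "".join(bits)

# Differences from the assumed starting value: +5, 0, +1, -1, +2, +5
print(encode_face_pixels([5, 5, 6, 5, 7, 12]))
# -> 0000101 1 001 010 011 0000101   (spaces added here for readability)
```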
The result of the coding process is a relatively short stream of bits, e.g., less than 5,000 bits. The actual number of bits depends on each stage of the processing, including the cropping, scaling factor, remapping technique, and coding. There is a tradeoff between image quality and the number of bits in the final stream. For example, the number of bits can be reduced by more sharply scaling the image, but the image quality is reduced. In most cases, it should be possible to obtain recognition quality face images with a final bit stream that is short enough to be recorded on the magnetic stripe of a conventional credit card.
Referring to Fig. 6, in processing the signature image, the image is first cropped (90) in the same way as for the face image. The next step (unlike the face image processing) is to threshold the cropped image (92), that is, to convert each 8-bit pixel value to a 1-bit black or white value. The threshold value for choosing between black or white is selectable by the operator using the interactive terminal 52 (Fig. 1).
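A minimal sketch of the thresholding step; the polarity (whether ink maps to 1 or 0) and the example threshold value are assumptions, since the patent only says each pixel becomes black or white:

```python
import numpy as np

def threshold_signature(cropped: np.ndarray, threshold: int) -> np.ndarray:
    """Map each 8-bit pixel to a 1-bit value: 1 (ink) below the
    operator-selected threshold, 0 (paper) otherwise."""
    return (cropped < threshold).astype(np.uint8)

binary = threshold_signature(np.array([[10, 200], [90, 130]], dtype=np.uint8), 128)
print(binary)   # [[1 0]
                #  [1 0]]
```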
Next, the thresholded image is scaled (94) in a manner similar to the face image except that microprocessor 50 automatically determines the scaling factor in order to compress the total number of pixel values to a target number.
The scaled pixel values are then subjected to a two-stage data compression. The first stage is two-dimensional run length encoding (96) by the so-called run length predictive differential quantizing technique (described in Huang et al., eds., Picture Bandwidth Compression, Gordon and Breach, New York, 1972). The purpose is to identify two-dimensional regions that are to be black. First, a line is run length encoded in a conventional manner. Then the difference between the run length codes for that line and the run length codes for the next line are determined. The line-to-line difference values obtained in that process are then Shannon-Fano coded (98) using the same code as for the face image, except that the entry which follows a 000 code is either a position difference value or a run length difference value. The length for each entry is calculated during signature compression, so the lengths may be distinct for a given signature and each length may vary from one signature to the next. The result is a coded bit stream representing the signature. The bit stream may be as short as, e.g., 1400 bits, but the actual number may vary depending on the cropping, thresholding, scaling, and coding. Similar tradeoffs of image quality and bit stream length (as for the face image) exist.
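The run length predictive differential quantizing technique itself is in the Huang et al. reference; the sketch below conveys only the flavor of the idea, describing each line by its black/white transition positions and then differencing those positions against the previous line. It is not the published algorithm, and the handling of lines with differing numbers of runs is deliberately simplified.

```python
import numpy as np

def transitions(row: np.ndarray) -> list:
    """Column positions where a 1-bit row changes value (a simple run-length
    description of the row)."""
    return [int(p) for p in np.flatnonzero(np.diff(row.astype(np.int8))) + 1]

def line_to_line_differences(image: np.ndarray) -> list:
    """For each line after the first, difference its transition positions
    against those of the previous line (truncated to the shorter list).
    These small differences are what would then be Shannon-Fano coded."""
    diffs = []
    prev = transitions(image[0])
    for row in image[1:]:
        cur = transitions(row)
        diffs.append([c - p for c, p in zip(cur, prev)])
        prev = cur
    return diffs

img = np.array([[0, 0, 1, 1, 1, 0, 0, 0],
                [0, 0, 0, 1, 1, 1, 0, 0],
                [0, 0, 0, 1, 1, 1, 0, 0]], dtype=np.uint8)
print(line_to_line_differences(img))   # [[1, 1], [0, 0]]
```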
Referring to Fig. 7, after the compressed signature and image bit streams are generated, an identification packet 90 is assembled. The packet includes a header (which among other things may contain a packet identifier, an issuer identifier, a description of the contents and size of the packet, and an identification of the coding technique used); the signature bit stream; the face image bit stream; a compressed version of textual information entered by the user (e.g., a credit limit for the credit card holder); and a check sum for the entire packet for use in later checking the integrity of the packet.
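A sketch of how such a packet might be laid out in code. The field order follows the description above, but the field widths, the use of zlib for the textual information, the coding-technique identifier, and the 16-bit additive check sum are all assumptions made for illustration:

```python
import struct
import zlib

def build_identification_packet(signature_bits: bytes,
                                face_bits: bytes,
                                text: bytes,
                                packet_id: int,
                                issuer_id: int) -> bytes:
    """Assemble a packet in the spirit of Fig. 7: header, signature bit stream,
    face image bit stream, compressed text, and a trailing check sum."""
    compressed_text = zlib.compress(text)
    header = struct.pack(">IIBHHH",
                         packet_id,              # packet identifier
                         issuer_id,              # issuer identifier
                         1,                      # coding-technique identifier (assumed)
                         len(signature_bits),    # sizes describing the contents
                         len(face_bits),
                         len(compressed_text))
    body = header + signature_bits + face_bits + compressed_text
    checksum = sum(body) & 0xFFFF                # simple 16-bit check sum (assumed)
    return body + struct.pack(">H", checksum)

# 175 bytes = 1400 signature bits and 600 bytes = 4800 face-image bits.
packet = build_identification_packet(b"\x5a" * 175, b"\xa5" * 600,
                                     b"CREDIT LIMIT 5000", packet_id=1, issuer_id=42)
print(len(packet), "bytes")
```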
Referring again to Figs. 1, 2, in the case of a stand alone capture unit, the identification packet is then delivered from data storage 54 (where it was stored by microprocessor 50) to a magnetic stripe recorder which records the packet on the magnetic stripe 57 of plastic card 59.
Alternatively, in the case of a capture terminal that is connected to a central station, the identification packet is passed from data storage 54 to a conventional communication unit 61 capable of communication with the communication device 22 in the central station. At the central station the packet is stored in storage 20 at a location where it can be later retrieved on the basis of the identifier included in the header. That same identifier may also be recorded as bits on a credit card in some manner so that it can later be retrieved.
Referring to Fig. 8, in a display terminal or unit 14, 38, the display packet is loaded into a data storage unit 100 either from magnetic stripe 59 via a magnetic stripe reader 102 (in the case of a standalone display unit 38) or from the central office via a communication unit 104 (in the case of a display terminal 14). In each case the loading of the packet is controlled by a microprocessor 106 (based on a program stored in program storage 108) and under direction of a user acting through a keyboard 110. Microprocessor 106, using generally the reverse procedures of those used in the capture terminal or unit, regenerates the cropped face image and cropped signature pixel arrays and from them generates a combined display pixel array for loading into a conventional frame buffer 112. A conventional video monitor 114 (including driver) uses the array in the frame buffer to produce a display of the face image and signature of the individual whose identity is being checked. Alternatively, the user may instruct the microprocessor to display the textual information from the packet. Referring to Fig. 9, in regenerating the face image, microprocessor 106 first Shannon-Fano decodes (120) the face image bit stream to recover an array of 4-bit pixel values. Next the 4-bit pixel values are mapped back to 8-bit pixel values according to the following table.
4-Bit Pixel Value: 0  1  2  3  4  5  6  7  8  9  10  11  12  13  14  15
[The corresponding 8-bit pixel values appear only as a figure in the original document.]
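Since the inverse table itself survives only as a figure, the sketch below shows one common way such a table could be reconstructed: map each 4-bit value back to roughly the midpoint of its 8-bit bin, reusing the hypothetical breakpoints from the earlier remapping sketch (again, not the patent's actual values).

```python
import numpy as np

# Hypothetical breakpoints reused from the 8-bit -> 4-bit remapping sketch above.
BREAKPOINTS = np.array([0, 48, 80, 100, 112, 120, 124, 126, 128,
                        130, 132, 136, 144, 156, 176, 208, 256])

# Map each 4-bit value back to roughly the midpoint of its 8-bit bin.
INVERSE_TABLE = ((BREAKPOINTS[:-1] + BREAKPOINTS[1:] - 1) // 2).astype(np.uint8)

def expand_4_to_8(pixels_4bit: np.ndarray) -> np.ndarray:
    return INVERSE_TABLE[pixels_4bit]

print(expand_4_to_8(np.array([0, 7, 8, 15])))   # [ 23 126 128 231]
```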
The image is then magnified by adding more pixel values between the existing pixel values using bilinear interpolation (122).
Referring to Fig. 11, in bilinear interpolation each new pixel 130 in the magnified pixel array 132 is calculated as a weighted average of the four adjacent pixel values 134, 136, 138, 140 in the original pixel array 142, the weighting being based on the distance of the new pixel 130 from each of the original pixels as placed in the magnified array 132. Bilinear interpolation is discussed in Castleman, Digital Image Processing, Prentice-Hall, 1979.
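A compact sketch of bilinear magnification as described above and in Castleman; the integer magnification factor and the edge handling (clamping at the borders) are simplifying assumptions:

```python
import numpy as np

def bilinear_magnify(image: np.ndarray, factor: int) -> np.ndarray:
    """Magnify a 2-D array by `factor`; each new pixel is a weighted average of
    the four surrounding original pixels, weighted by its fractional distance
    from them."""
    h, w = image.shape
    ys = np.linspace(0, h - 1, h * factor)        # positions of new rows in old coordinates
    xs = np.linspace(0, w - 1, w * factor)        # positions of new columns
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
    img = image.astype(float)
    top = img[np.ix_(y0, x0)] * (1 - fx) + img[np.ix_(y0, x1)] * fx
    bottom = img[np.ix_(y1, x0)] * (1 - fx) + img[np.ix_(y1, x1)] * fx
    return (top * (1 - fy) + bottom * fy).astype(np.uint8)

# Magnify a 2 x 2 test image to 4 x 4; interior values are interpolated.
print(bilinear_magnify(np.array([[0, 16], [32, 48]], dtype=np.uint8), 2))
```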
Following magnification, the pixels are subjected to contrast enhancement (123), e.g., using a convolutional filtering technique of the kind described in the Pratt reference at page 322. Referring to Fig. 10, to regenerate the signature image from the compressed signature bit stream, microprocessor 106 first Shannon-Fano decodes (140) the bit stream and then run length decodes it (142) to recover the thresholded scaled pixel array. The array is similarly magnified using bilinear interpolation (144).
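The specific contrast-enhancement filter is in the Pratt reference; as a stand-in, the sketch below applies a generic 3 x 3 sharpening kernel by direct convolution. The kernel values are an assumption, chosen only to show the kind of convolutional filtering meant:

```python
import numpy as np

# A generic sharpening kernel (coefficients sum to 1, so flat areas are unchanged).
KERNEL = np.array([[ 0, -1,  0],
                   [-1,  5, -1],
                   [ 0, -1,  0]], dtype=float)

def enhance_contrast(image: np.ndarray) -> np.ndarray:
    """Convolve the image with KERNEL, replicating edge pixels at the border."""
    padded = np.pad(image.astype(float), 1, mode="edge")
    out = np.zeros(image.shape, dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += KERNEL[dy, dx] * padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    return np.clip(out, 0, 255).astype(np.uint8)

print(enhance_contrast(np.full((4, 4), 128, dtype=np.uint8)))   # flat region stays at 128
```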
Referring to Fig. 12, in one technique for recording the identification packet on a conventional credit card 150, the usual 1/4"-wide magnetic stripe 152 is treated as having four tracks, three of which comply with ANSI standards, with tracks 1 and 2 (recorded at 210 bits per inch (bpi)) and track 3 used for conventional purposes. Track 0 is then split into two half-tracks 154, 156 and recorded at 2600 bits per inch, with flux changes recording zero-valued bits on one half-track and one-valued bits on the other half-track. Alternatively, all four tracks may be recorded outside the ANSI standards at a higher bit rate in order to include all of the conventionally recorded information plus the identification packet.
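As a rough capacity check (the stripe length is an assumption based on a standard-size card; the patent quotes only the recording densities):

```python
STRIPE_LENGTH_IN = 3.375     # assumed usable stripe length on a standard-size card
HIGH_DENSITY_BPI = 2600      # density quoted for the split track 0

positions = STRIPE_LENGTH_IN * HIGH_DENSITY_BPI
print(round(positions))      # ~8775 bit positions along the split track

# Even at one packet bit per position (zeros on one half-track, ones on the
# other, as described), roughly 8,700 bits fit -- enough for a face image of
# under 5,000 bits plus a signature of about 1,400 bits and a header.
```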
Operation
In the example of a system used for identifying credit card holders, each individual to whom a card is issued would be required to have his "picture" and an image of his signature captured and stored using a capture terminal. The compressed identification packet would either be stored directly on the issued credit card or be stored in the central station, or both, with the key to retrieving the stored packet also being recorded on the credit card (perhaps on another track).
When an individual presents the card to buy something, the merchant would run the stripe through the stripe reader in the display terminal. The person's picture and signature would be displayed to the merchant as well as information such as the person's credit limit. The identification packet would either be read directly from the credit card, or would be retrieved over a telephone line from the central station based on the storage location read from the card and sent to the central station.
Other embodiments are within the following claims. For example, the system may be used in a wide variety of applications other than the credit card example described. And other types of identification information may be captured and retrieved, including fingerprint information. The cropping process could be done by providing a zoom lens on the video camera and having the user adjust the lens so that the face image fills the entire screen. Then the original raw pixel values would not include any unnecessary background pixels. The identification information could be stored in other ways than on the magnetic stripe of a credit card, e.g., in a hologram on the card, or on an optically readable stripe.
Other data compression techniques could be used, with the particular technique being indicated in the identification packet header, so that the display process can be performed in the appropriate way.
The information which identifies the identification packet may be stored independently of the packet itself, e.g., elsewhere on the card, so that the packet and the information identifying it act as lock and key to prevent use of the card in a manner which does not involve detecting both the lock and key.
Other techniques may be used to process the signature image for redisplay. Following the run-length decoding step, the thresholded scaled pixel array may be scaled up (without interpolation) or scaled up and edge-smoothed.

Claims

Claims
1. A method for compressing a face image into a digital bit stream including the steps of converting the image to an array of raw pixel values, reducing the image to generate a smaller array of pixel values, reducing the gray-scale range of the pixel values in the smaller array, deriving data words indicative of said pixel values, and encoding said data words as said bit stream in accordance with a code of the type in which more frequently occurring data words are encoded as shorter sets of bits.
2. The method of claim 1 wherein said bit stream is no longer than 5000 bits.
3. The method of claim 1 wherein said image is reduced to said smaller array to a degree specified by a user based on viewing said image.
4. The method of claim 1 wherein each said data word comprises the difference between two of the reduced gray-scale pixel values.
5. The method of claim 1 further comprising remapping said pixel values to a coarser range of pixel values.
6. The method of claim 5 wherein said remapping is more sensitive to tones in the middle of the scale than at the extremes of the scale.
7. The method of claim 1 adapted for use in confirming, at multiple locations, the identities of a predetermined group of individuals, further comprising capturing information about said face image that identifies each individual for compressing into said digital bit stream, recording said digital bit stream on a pocket-size recording medium to be provided to each said individual, and providing apparatus at each said location for recovering said image from said recorded bit stream for confirming the identity of said individual.
8. The method of claim 7 wherein said recording medium comprises a magnetic strip.
9. The method of claim 7 wherein said recording medium comprises a printed strip.
10. The method of claim 7 wherein said strip is part of a plastic card.
11. The method of claim 7 wherein said face image further comprises a signature image.
12. The method of claim 1 adapted for use in confirming, at multiple locations, the identities of a predetermined group of individuals, further comprising capturing information about said face image that identifies each individual for compressing into said digital bit stream, capturing information about a signature image that identifies each individual and converting said signature information for inclusion in said digital bit stream, recording said digital bit streams for said individuals at a central location, and providing local apparatus at each said location for receiving a said bit stream for a selected said individual from said central location and recovering said face and signature images from said bit stream to confirm the identity of said individual.
13. The method of claim 12 further comprising providing each said individual with a key to the storage address at which said digital bit stream for said individual is recorded at said central location, and delivering said bit stream to said local apparatus on the basis of said key information.
14. The method of claim 13 wherein said key is stored as bits on a pocket-size recording medium to be provided to said individual.
PCT/US1988/002503 1987-07-22 1988-07-22 Identification information storage and retrieval WO1989000741A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
BR888807142A BR8807142A (en) 1987-07-22 1988-07-22 STORAGE AND RECOVERY OF IDENTIFICATION INFORMATION
DK143589A DK143589A (en) 1987-07-22 1989-03-22 PROCEDURES FOR STORING AND RETURNING IDENTIFICATION INFORMATION
KR1019890700521A KR890702156A (en) 1987-07-22 1989-03-23 Storage and retrieval of identification information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US7649087A 1987-07-22 1987-07-22
US076,490 1987-07-22

Publications (1)

Publication Number Publication Date
WO1989000741A1 true WO1989000741A1 (en) 1989-01-26

Family

ID=22132364

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1988/002503 WO1989000741A1 (en) 1987-07-22 1988-07-22 Identification information storage and retrieval

Country Status (7)

Country Link
EP (1) EP0330684A4 (en)
JP (1) JPH02501098A (en)
KR (1) KR890702156A (en)
AU (1) AU2126088A (en)
BR (1) BR8807142A (en)
DK (1) DK143589A (en)
WO (1) WO1989000741A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0428233A1 (en) * 1989-11-15 1991-05-22 N.V. Nederlandsche Apparatenfabriek NEDAP Method and apparatus for producing admission tickets
WO1992003804A1 (en) * 1990-08-14 1992-03-05 John Mclean & Sons (Electrical) Dingwall Ltd Document security system
EP0493243A1 (en) * 1990-12-28 1992-07-01 France Telecom Method for the identification and authentification of data characterizing an individual
EP0533891A1 (en) * 1991-03-20 1993-03-31 PROKOSKI, Francine J. Method for identifying individuals from analysis of elemental shapes derived from biosensor data
EP0543893A1 (en) * 1990-07-03 1993-06-02 SOLTESZ, John, A. System for the secure storage and transmission of data
US5355411A (en) * 1990-08-14 1994-10-11 Macdonald John L Document security system
EP0638880A1 (en) * 1993-08-10 1995-02-15 Audio DigitalImaging, Inc. A method of verifying fake-proof video identification data
US5473740A (en) * 1993-12-29 1995-12-05 International Business Machines Corporation Method and apparatus for interactively indicating image boundaries in digital image cropping
WO1999036889A2 (en) * 1998-01-16 1999-07-22 Nexus Corporation S.A. Transaction system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4424587A (en) * 1980-04-16 1984-01-03 Scantron Gmbh & Co. Elektronische Lesegerate Kg Method and apparatus for the identification of articles

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3603552A1 (en) * 1985-02-06 1986-08-07 Rca Corp., Princeton, N.J. METHOD AND DEVICE FOR REDUCING IMAGE DATA
JPS61217879A (en) * 1985-03-25 1986-09-27 Matsushita Electric Works Ltd Picture collation system
GB8509001D0 (en) * 1985-04-09 1985-05-15 Strain J Optical data storage card
FR2592197A1 (en) * 1985-12-19 1987-06-26 Nixon John Method of identifying a person, especially a person requesting a service such as, for example, a banking transaction, with the aid of an identification card, device for implementing the method, identification cards usable for the abovementioned method and method for producing such cards
US4754487A (en) * 1986-05-27 1988-06-28 Image Recall Systems, Inc. Picture storage and retrieval system for various limited storage mediums

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4424587A (en) * 1980-04-16 1984-01-03 Scantron Gmbh & Co. Elektronische Lesegerate Kg Method and apparatus for the identification of articles

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP0330684A4 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5181786A (en) * 1989-11-15 1993-01-26 N.V. Nederlandsche Apparatenfabriek Nedap Method and apparatus for producing admission tickets
EP0428233A1 (en) * 1989-11-15 1991-05-22 N.V. Nederlandsche Apparatenfabriek NEDAP Method and apparatus for producing admission tickets
EP0543893A4 (en) * 1990-07-03 1994-03-30 John A. Soltesz
EP0543893A1 (en) * 1990-07-03 1993-06-02 SOLTESZ, John, A. System for the secure storage and transmission of data
AU664022B2 (en) * 1990-08-14 1995-11-02 John Mclean & Sons (Electrical) Dingwall Ltd. Document security system
US5355411A (en) * 1990-08-14 1994-10-11 Macdonald John L Document security system
WO1992003804A1 (en) * 1990-08-14 1992-03-05 John Mclean & Sons (Electrical) Dingwall Ltd Document security system
FR2671210A1 (en) * 1990-12-28 1992-07-03 Villa Pierre METHOD FOR IDENTIFYING AND AUTHENTICATING INFORMATION CHARACTERIZING AN INDIVIDUAL
US5228094A (en) * 1990-12-28 1993-07-13 France Telecom Etablissement Autonome De Droit Public (Centre National D'etudes Des Telecommunications Process of identifying and authenticating data characterizing an individual
EP0493243A1 (en) * 1990-12-28 1992-07-01 France Telecom Method for the identification and authentification of data characterizing an individual
EP0533891A1 (en) * 1991-03-20 1993-03-31 PROKOSKI, Francine J. Method for identifying individuals from analysis of elemental shapes derived from biosensor data
EP0533891A4 (en) * 1991-03-20 1993-07-07 Francine J. Prokoski Method for identifying individuals from analysis of elemental shapes derived from biosensor data
EP0638880A1 (en) * 1993-08-10 1995-02-15 Audio DigitalImaging, Inc. A method of verifying fake-proof video identification data
US5473740A (en) * 1993-12-29 1995-12-05 International Business Machines Corporation Method and apparatus for interactively indicating image boundaries in digital image cropping
WO1999036889A2 (en) * 1998-01-16 1999-07-22 Nexus Corporation S.A. Transaction system
WO1999036889A3 (en) * 1998-01-16 2000-02-17 Nexus Corp S A Transaction system
US7058603B1 (en) 1998-01-16 2006-06-06 Nexus Corporation Sa. Transaction system

Also Published As

Publication number Publication date
BR8807142A (en) 1989-10-17
KR890702156A (en) 1989-12-23
AU2126088A (en) 1989-02-13
DK143589D0 (en) 1989-03-22
EP0330684A4 (en) 1989-12-13
DK143589A (en) 1989-05-19
JPH02501098A (en) 1990-04-12
EP0330684A1 (en) 1989-09-06

Similar Documents

Publication Publication Date Title
JP2766200B2 (en) Method and apparatus for compressing, storing and retrieving images on magnetic transaction card
US4754487A (en) Picture storage and retrieval system for various limited storage mediums
US6192138B1 (en) Apparatus and method for embedding/unembedding supplemental information
EP0543893B1 (en) System for the secure storage and transmission of data
US4703347A (en) Individuality discriminating system
EP0130415A2 (en) Data compression for graphics images
US6463185B1 (en) Information recording medium and information reproducing apparatus
CN1327391C (en) Image characteristic identification signal generation apparatus and image characteristic identification signal generation method
WO1989000741A1 (en) Identification information storage and retrieval
EP0154766B1 (en) A method for processing gray scale picture elements to facilitate data compression
AU664022B2 (en) Document security system
JP2981133B2 (en) How to reduce document page image size
JPH07220092A (en) Method and apparatus for encoding of data by predetermined value
JPH0564134A (en) Digital recording electronic still camera having character input means
CN1212723C (en) Digital image processing method and device
JPS63167569A (en) Picture file device
JPS62222364A (en) Person collative identification device
JPH07236110A (en) Digital still camera
JPH0541800A (en) Picture decoding processor and decoding method
JPH0777972A (en) Image processor
Himmel Signature and facial image compression by boundary encoding
JPH04249476A (en) Picture data compressing device
JPH04151975A (en) Digital electronic still camera
JPS62130465A (en) Personal collation device
JPH0654290A (en) Data compressing and filing system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BB BG BR DK FI HU JP KP KR LK MC MG MW NO RO SD SU

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE

WWE Wipo information: entry into national phase

Ref document number: 1988906636

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1988906636

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1988906636

Country of ref document: EP