US20160228039A1 - Biometric apparatus and method for touch-sensitive devices - Google Patents

Biometric apparatus and method for touch-sensitive devices


Publication number
US20160228039A1
Authority
US
United States
Prior art keywords
contact
data
user
body part
sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/131,659
Inventor
Waleed Sami Haddad
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Biogy Inc
Original Assignee
Biogy Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Biogy Inc filed Critical Biogy Inc
Priority to US15/131,659
Publication of US20160228039A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/117 Identification of persons
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H04L 63/0861 Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/06 Authentication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/06 Authentication
    • H04W 12/065 Continuous authentication

Definitions

  • Embodiments of the disclosure are directed to a biometric system and method for use in devices that include a touch sensor.
  • Method embodiments of the disclosure involve sensing a contact event occurring between a body part of a user and a touch sensor and storing, for the contact event, a sequence of data frames each comprising contact data associated with a different portion of the user's body part.
  • Method embodiments further involve generating biometric signature data using the sequence of data frames.
  • Apparatus embodiments of the disclosure include a touch sensor configured to sense a contact event occurring between a body part of the user and the touch sensor.
  • the touch sensor is configured to produce contact data in response to the sensed contact event.
  • a processor is coupled to the touch sensor and memory.
  • the processor is configured to store in the memory a sequence of data frames each comprising contact data associated with a different portion of the user's body part.
  • the processor is further configured to generate biometric signature data using the sequence of data frames.
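The stored sequence of data frames described above can be sketched with simple data structures. The following is a minimal Python illustration, assuming a binary 2-D sensor readout and reducing each frame to its contact area; the names (`ContactFrame`, `generate_signature`) and the feature choice are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContactFrame:
    timestamp: float       # clock time at capture
    grid: List[List[int]]  # 2-D touch-sensor readout (0 = no contact)

@dataclass
class ContactEvent:
    frames: List[ContactFrame] = field(default_factory=list)

    def add_frame(self, frame: ContactFrame) -> None:
        self.frames.append(frame)

def generate_signature(event: ContactEvent) -> List[int]:
    """Reduce the frame sequence to a simple feature vector:
    total contact area per frame, in capture order."""
    return [sum(sum(row) for row in f.grid) for f in event.frames]

event = ContactEvent()
event.add_frame(ContactFrame(0.00, [[0, 1], [0, 0]]))
event.add_frame(ContactFrame(0.05, [[1, 1], [1, 0]]))
signature = generate_signature(event)
print(signature)  # [1, 3] -- contact area grows as the finger presses down
```

A real signature would of course retain far more of the spatial and temporal detail; the point of the sketch is only the frame-sequence structure.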
  • FIGS. 1-5 are flow diagrams showing biometric processes in accordance with various embodiments of the disclosure.
  • FIG. 6 is a flow diagram showing biometric processes including generation of visual and/or audio instructions in accordance with various embodiments of the disclosure.
  • FIG. 7 is a flow diagram showing biometric processes involving an enrollment process for generating a biometric signature and post-generation use of the biometric signature in accordance with various embodiments of the disclosure.
  • FIG. 8 is a flow diagram showing biometric processes involving unlocking of an electronic device using a biometric signature in accordance with various embodiments of the disclosure.
  • FIGS. 9-11 and 13 are flow diagrams showing biometric processes involving a triggering algorithm in accordance with various embodiments of the disclosure.
  • FIGS. 12 and 14 are diagrams depicting aspects of biometric processes involving the triggering algorithms of FIGS. 11 and 13 , respectively, in accordance with various embodiments of the disclosure.
  • FIG. 15 is a block diagram of a system which includes an electronic device configured to implement a biometric system and method according to various embodiments of the disclosure.
  • Embodiments of the disclosure relate to a biometric system and method for use with a wide variety of systems and devices.
  • a biometric system and method according to embodiments of the disclosure find particular utility when incorporated into mobile electronic devices, such as portable communication devices.
  • There is growing interest in biometrics as a component of an effective security protocol, and in particular in a biometric device that is easily implementable on mobile phones.
  • Available fingerprint readers for mobile devices include swipe sensors and a few area sensors.
  • Voice and face recognition applications can be used with the microphone or camera available on most phones; however, these work poorly, and would require additional, or much better, biometric hardware to be added to the phone in order to be reliable and accurate enough to provide serious security.
  • User acceptance is also a very important factor in biometrics, and many users are wary of the familiar biometric systems, in particular of fingerprint readers.
  • A new biometric system that is easy to use, simple to implement using existing hardware on mobile phones and tablets, and different enough from the common biometrics in use today could fill the need for an effective way of verifying a user's identity without a long technology development cycle or the need for hardware manufacturers to build a new device into their products. This would allow the added security of biometrics to be put in place on the very short timescale needed to support the rising security needs of mobile transactions.
  • Embodiments of the inventions described herein use modern touch screen technology to acquire the patterns of one, two, three or four of the user's fingers pressed onto the screen together, or in a particular sequence.
  • a time-dependent pattern is used, not just a simple static finger geometry pattern.
  • a rapid series of “frames” of the user's finger geometry is recorded as he/she presses down onto the screen.
  • this time-dependent pattern has additional and useful complexity. This time-dependence actually allows each user to develop a special technique of hand placement, perhaps with additional conscious movements that only he/she knows, in a sense adding a password-like component to the pattern.
  • the approach is not limited to hand or finger geometry. It can be used with other body parts, most notably the pinna of the ear. This can be seen as potentially desirable since most touch screen phones are constructed in such a way that when the user holds the phone to talk, he/she will inevitably press his/her ear against the touch screen somewhat consistently every time the phone is used. It may be possible to successfully enroll the ear print pattern in the same way as the time-dependent hand print pattern would be, and then to use this ear print pattern as another biometric signature.
  • Latent prints are a problem with some fingerprint systems because the user may leave a clear image of his/her fingerprint on the sensor (or possibly elsewhere) that can be copied using one of several methods, and used to create a false fingerprint, or, in some cases, the latent print left on a sensor can be induced to trigger the sensor again through the application of heat, cold, illumination, or something else, depending on the sensor technology.
  • the issue of latent prints is essentially rendered inconsequential by the element of time-dependence in the approaches described herein.
  • The spatial dimension of the data will be quite low-resolution with current state-of-the-art touchscreens, and will therefore not, by itself, contain sufficient information. With the time dimension added, however, the dataset becomes effectively three-dimensional and contains a great deal of information that can be used to differentiate a large number of users, as well as enough information density to overcome the natural variations in biometric signatures from which all biometric systems suffer. If the touchscreen also has the ability to measure pressure as a function of position on the screen, as a pressure-sensitive touch screen does, then yet another dimension can be added to the biometric signature: that of pressure. The data set is then effectively four-dimensional, having two spatial dimensions and pressure, all as a function of time.
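The dimensionality argument above can be made concrete with a small sketch. The grid size, frame contents, and the per-cell pressure model below are illustrative assumptions only:

```python
# Each frame: rows x cols of binary contact values at one instant.
frames = [
    [[0, 0], [1, 0]],  # t0: first fingertip touch
    [[0, 1], [1, 1]],  # t1: more of the finger pad lands
    [[1, 1], [1, 1]],  # t2: full contact
]

# Stacking the frames yields a 3-D dataset indexed (t, y, x).
depth, rows, cols = len(frames), len(frames[0]), len(frames[0][0])
print((depth, rows, cols))  # (3, 2, 2)

# With a pressure-sensitive screen each cell also carries a pressure
# reading, making the dataset effectively 4-D: (t, y, x) -> (contact, pressure).
# The pressure values here are fabricated for illustration.
pressure_frames = [
    [[(v, 0.2 * t * v) for v in row] for row in frame]
    for t, frame in enumerate(frames)
]
print(pressure_frames[2][0][0])  # (contact, pressure) at t=2, cell (0, 0)
```

Even at this toy resolution, the time axis multiplies the information content of each placement; a real touchscreen would supply far larger frames at a much higher rate.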
  • Referring to FIG. 1, there is illustrated a flow diagram showing biometric processes in accordance with various embodiments.
  • the biometric method depicted in FIG. 1 involves sensing 110 a contact event involving a body part of a user, and producing 112 contact data.
  • the contact data preferably includes spatial data associated with the 2-D geometry of the user's body part and temporal data associated with the development of a contact pattern for the contact event.
  • the method of FIG. 1 further involves generating 114 biometric signature data using the contact data. It is noted that content of the contact data can be different for different embodiments of the disclosure.
  • a typical contact event involves an intentional touching of a touch sensitive device by a user.
  • the user may place one or more fingers (or palm, for example) on a touch sensor of the touch sensitive device, which can define a contact event.
  • the user may use one or more fingers to swipe across a region of the touch sensor, which can define a contact event.
  • Static (stationary) and dynamic (moving) contact events are contemplated. It is noted that, in the case of a static contact event, the contact pattern still develops over time, since the area of contact between the user's body part and the touch sensor changes between initial contact and the stationary state.
  • FIG. 2 illustrates a flow diagram showing various biometric processes in accordance with other embodiments.
  • the biometric method involves sensing 120 a contact event involving a body part of a user, and producing 122 contact data.
  • the contact data preferably includes data indicative of a time-dependent pattern of the geometry of different portions of the user's body part as the contact event evolves over time.
  • the method of FIG. 2 further involves generating 124 biometric signature data using the contact data.
  • the flow diagram illustrated in FIG. 3 shows various biometric processes in accordance with some embodiments.
  • the biometric method shown in FIG. 3 involves sensing 130 a contact event involving a body part of a user, and producing 132 contact data.
  • the contact data according to this embodiment preferably includes data indicative of a chronology of frames of data of the contact event (referred to herein as “contact data frames”).
  • the method of FIG. 3 further involves generating 134 biometric signature data using the contact data.
  • FIG. 4 illustrates a flow diagram showing various biometric processes in accordance with further embodiments.
  • the biometric method involves sensing 140 a contact event involving a body part of a user, and producing 142 a sequence of data frames each comprising contact data associated with a different portion of the user's body part as the contact pattern develops.
  • the method of FIG. 4 also involves generating 144 biometric signature data using the contact data.
  • the biometric method involves sensing 150 a contact event involving a body part of a user, and sensing 152 pressure resulting from the contact event.
  • Contact data is produced 154 for this contact event from which a contact pattern can be developed.
  • the contact data in FIG. 5 preferably includes spatial data associated with the 2-D geometry of the user's body part, pressure data associated with the touch, and temporal data associated with the development of a contact and pressure profile for the contact event.
  • the method of FIG. 5 also involves generating 154 biometric signature data using the contact data.
  • FIG. 6 illustrates a flow diagram showing various biometric processes in accordance with some embodiments.
  • the biometric method involves generating 160 visual and/or audio instructions to aid a user in creating or validating a biometric signature.
  • the method also involves sensing 162 a contact event involving a body part of a user in accordance with the generated instructions, and producing 164 contact data indicative of spatial dimensions of the user's body part as a function of time.
  • the method of FIG. 6 further involves generating 166 biometric signature data using the contact data.
  • the flow diagram of FIG. 7 illustrates an enrollment process for generating a biometric signature and post-generation use of the biometric signature in accordance with various embodiments of the disclosure.
  • a user establishes his or her biometric signature which, once created, can be used to provide secured access to electronic devices, applications, websites, and other secured systems and software processes (e.g., bank transactions via a mobile phone or tablet).
  • the enrollment process shown in FIG. 7 involves the user placing 170 a body part on a touch sensor.
  • the user may select which body part (e.g., hand or ear) and number of body parts to be used to create the user's biometric signature.
  • the user may choose to use three fingers that are placed near the center of the touch sensor.
  • the touch sensor may be pressed against the pinna of the user's ear, which may be convenient for secured use of mobile phones that incorporate a touch sensor.
  • the enrollment process involves sensing 171 a contact event involving the selected body part(s), and producing 172 contact data.
  • the contact data preferably includes data indicative of a time-dependent pattern of the geometry of different portions of the user's body part as the contact event evolves over time.
  • the method further involves generating 173 a biometric signature for the user using the contact data.
  • the processes of blocks 170 - 173 may be repeated 174 to enhance the reliability (e.g., stability, repeatability) of the user's biometric signature.
  • the enrollment process concludes with storing 175 the user's biometric signature for subsequent use.
  • the biometric signature can be stored locally on a mobile electronic device owned by the user, on a remote server or both locally and remotely. The user's biometric signature is now available for use with various secured applications, websites, services, systems, and devices that require user authentication.
  • a biometric algorithm of the present disclosure is operating on an electronic device and that the device is operating an application that requires user authentication prior to allowing use or access to the device and/or application.
  • the method of FIG. 7 further involves attempting 176 to execute a secured application on the electronic device, which may be a mobile phone, tablet or PC, for example.
  • Attempting to execute the application involves creating a contact pattern for the user in the same manner as when generating the user's biometric signature. It is understood that the touch sensor and/or electronic device used to generate the biometric signature can be the same as, or different from, those used to produce the contact pattern.
  • a touch sensor of the electronic device senses 177 contact with a body part of the user and produces 178 contact data in a manner previously described.
  • the body part(s) brought into contact with the touch sensor of the electronic device is the same as that/those used to create the user's biometric signature.
  • the just-produced contact pattern is compared 180 to the user's biometric signature to authenticate the user. This comparison can be performed by the electronic device, by a remote system communicatively coupled to the electronic device, or cooperatively by both entities.
  • the user is granted access 182 to the application only if the contact pattern matches the biometric signature.
  • FIG. 8 illustrates a flow diagram showing various biometric processes in accordance with some embodiments.
  • a procedure for unlocking 190 an electronic device is initiated. It is assumed in the illustrative embodiment of FIG. 8 that a biometric algorithm of the present disclosure is operating on the electronic device and that the device is operating an application that requires user authentication prior to allowing use or access to the device and/or application.
  • the unlocking procedure involves interrupting 192 a normal password prompt typically generated by the application.
  • a biometric signature verification procedure of the present disclosure is initiated 194 .
  • Contact data resulting from sensing contact between a user's body part and a touch sensor of the electronic device is acquired, from which a time-dependent contact pattern is generated. This time-dependent contact pattern is used by the biometric algorithm to verify 196 authenticity of the user.
  • the biometric algorithm passes 198 back control to the application and the electronic device is unlocked only if the verification procedure is successful. If the verification procedure is unsuccessful (i.e., the contact pattern constructed from the contact data does not match the user's biometric signature), a signal indicative of such failure is generated 200 , and the electronic device is maintained in the locked state. In response to the generated signal, a message indicating the unsuccessful verification is communicated to the user, typically via a visual and/or audio message.
  • various embodiments of the disclosure provide for detection 201 of a triggering event which, when detected, causes initiation of biometric signature data acquisition.
  • Detection 201 of a triggering event causes initiation 202 of a procedure for detecting a contact pattern of a user's body part relative to a touch sensor that results from a contact event with the touch sensor.
  • the method further involves recording 204 contact pattern data comprising spatial dimensions of the user's body part in contact with the touch sensor as a function of time.
  • the method also involves terminating 206 the detection and recording processes after either successful completion or failure of the contact pattern detection procedure. It is understood that a variety of triggering methodologies are contemplated, and that a triggering protocol can be implemented for one or both of the biometric signature and contact pattern generation processes.
  • the addition of the time dimension adds a complexity that is both desirable, and potentially excessive, however this complexity can be quantized and controlled through creative design of the data acquisition algorithms.
  • The excessive complexity can come from the fact that a user may change the "speed" with which he/she executes the hand placement on the touchscreen, effectively shortening or lengthening the extent of the dataset in the time dimension depending on whether the placement is faster or slower. This would be the case if each "frame" of the pattern is acquired according to an independent clock that runs on the device, collecting the data in fixed time steps in the same way a movie camera, for example, acquires a sequence of images of a scene. This can make matching with the enrolled pattern difficult.
  • One way to handle this is to develop algorithms that can compress or stretch the dataset in the time dimension as part of the pattern-matching component of the biometric algorithm suite.
  • Such a method may be based on existing “morphing” techniques, wavelet transforms, or other known methods, or it may be developed specifically for use in this application. This is one viable approach, however, it is not likely to be the fastest or most efficient, and may have other problems.
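One way such time compression or stretching could be sketched is plain linear resampling of a per-frame measurement onto a fixed number of points, so that fast and slow placements become directly comparable before matching. The target length and the sample sequences below are illustrative assumptions:

```python
def resample(series, target_len):
    """Linearly interpolate `series` onto `target_len` evenly spaced points."""
    if target_len == 1:
        return [float(series[0])]
    n = len(series)
    out = []
    for i in range(target_len):
        pos = i * (n - 1) / (target_len - 1)  # fractional index into series
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(series[lo] * (1 - frac) + series[hi] * frac)
    return out

slow = [0, 1, 2, 3, 4, 5, 6, 7, 8]  # slow placement: many frames
fast = [0, 2, 4, 6, 8]              # fast placement: fewer frames

# After resampling, both placements trace the same underlying ramp.
print(resample(slow, 5))  # [0.0, 2.0, 4.0, 6.0, 8.0]
print(resample(fast, 5))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```

Linear resampling is the simplest member of the family hinted at above; morphing or wavelet-based methods would allow nonlinear warping of the time axis as well.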
  • the time aspect of the user's hand placement need not be linear in this case, as the triggering algorithm will “examine” the pattern as it develops in time, and trigger the capture of the “movie frames” of the pattern automatically based on some criteria of the image (pattern) itself.
  • Examples of this specialized triggering algorithm are described below.
  • FIG. 10 illustrates a flow diagram showing various biometric processes including a triggering methodology in accordance with various embodiments.
  • FIG. 10 shows a number of different processes involving generating a contact pattern in response to a triggering event and validating this contact pattern against a pre-established biometric signature for the user.
  • The triggering processes described herein rely on storing a sequence of data "points", which are taken during the enrollment process or are predetermined somewhat arbitrarily.
  • the thresholds for triggering are based on these stored data points (for example, a series of different total mass count levels as is described below).
  • The decision to trigger capturing of a frame of contact data is made based on the first data point (the i-th data point); once that is done, the next triggering event is based on the second data point (the (i+1)-th data point), and so forth.
  • the triggering and contact data generation methodologies shown in FIG. 10 can be implemented for one or both of the biometric signature and contact pattern generation processes.
  • the processes shown in FIG. 10 include sensing 210 a contact pattern evolving in real-time for a contact event involving a body part of a user.
  • the method also involves acquiring 212 data from the contact event that is used by a triggering algorithm, which is typically running on an electronic device operable by the user. If the acquired data meets or exceeds a triggering threshold 214 , a triggering event is declared and a frame of contact data is captured 216 at a current clock time Tx. If the acquired data fails to meet or exceed the triggering threshold 214 , the data acquisition process of block 212 continues.
  • Each triggering event 214 results in the capturing 216 of an additional frame of contact data for the contact event.
  • the sequence of captured data frames defines a contact pattern that evolves as the contact event evolves.
  • a validation operation occurs to verify whether or not the developing or developed contact pattern corresponds to the pre-established biometric signature of the user.
  • each captured contact data frame is added 218 to a developing contact pattern which, when sufficiently formed, is subsequently subjected to validation.
  • each captured frame of contact data to be added to a developing contact pattern is subjected to validation 220 against the pre-established biometric signature of the user.
  • If a captured contact data frame fails this validation against the biometric signature, the contact data frame would be considered invalid.
  • a test 224 is made to determine if enough contact data has been collected for the developing contact pattern. If not, the clock continues to run 228 and the data acquisition process in block 212 continues. If a sufficient amount of contact data has been collected for the developing contact pattern, the contact pattern is compared 226 to the user's pre-established biometric signature. If determined to be invalid, the procedure of FIG. 10 is terminated 232 and an invalidity signal is generated. If determined valid, the procedure of FIG. 10 is terminated 234 and a signal confirming validity is generated. It is understood that the validation processes of blocks 226 , 230 , and 232 would not be implicated when creating the user's biometric signature in accordance with the methodology shown in FIG. 10 .
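The acquire-trigger-capture-validate loop of FIG. 10 can be sketched under simplifying assumptions: the "acquired data" is a single scalar per clock tick, the thresholds come from a stored list, and validation is a plain tolerance check. All names and values are illustrative, not from the patent:

```python
def capture_pattern(readings, thresholds):
    """Capture one frame each time the evolving reading first meets
    the next stored threshold; return the captured (time, value) frames."""
    frames = []
    i = 0  # index of the next triggering threshold
    for t, value in enumerate(readings):
        if i < len(thresholds) and value >= thresholds[i]:
            frames.append((t, value))  # frame of contact data at clock time t
            i += 1
        if i == len(thresholds):       # enough contact data collected
            break
    return frames

def validate(pattern, signature, tolerance=1):
    """Compare captured values against the enrolled signature values."""
    if len(pattern) != len(signature):
        return False
    return all(abs(v - s) <= tolerance for (_, v), s in zip(pattern, signature))

readings = [2, 5, 9, 14, 20, 27, 33]  # evolving contact measure per tick
thresholds = [5, 15, 30]              # stored triggering data points
pattern = capture_pattern(readings, thresholds)
print(pattern)                         # [(1, 5), (4, 20), (6, 33)]
print(validate(pattern, [5, 20, 33]))  # True
```

Note how each triggering event consumes one stored data point, so the capture rate is driven by the developing pattern rather than by the clock; the clock time `t` is merely recorded alongside each frame.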
  • Verification of a biometric signature can involve a number of different validation techniques.
  • One approach involves comparing a temporal order of data frames of a developing or developed contact pattern to that of the biometric signature.
  • Another approach involves comparing characteristics of the time-dependent spatial data of a contact pattern with those of the biometric signature. It is to be understood that there are many ways to use the contact information collected in space and time, including arranging the 3-D data (2-D in space, and 1-D in time) into a new format of 2-D data that can be tested for verification using existing pattern recognition methods.
  • the biometric signature can be created from this data in a number of different ways.
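One simple instance of the reshaping idea above: flatten each 2-D frame into a row, so the (time x Y x X) dataset becomes an ordinary 2-D matrix (time x pixels) that existing pattern recognition methods can consume. The frame contents are illustrative:

```python
frames = [
    [[0, 1], [0, 0]],
    [[1, 1], [1, 0]],
    [[1, 1], [1, 1]],
]

# Each row of the matrix is one flattened frame; column j traces
# the value of pixel j over time.
matrix = [[cell for row in frame for cell in row] for frame in frames]
for r in matrix:
    print(r)
# [0, 1, 0, 0]
# [1, 1, 1, 0]
# [1, 1, 1, 1]
```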
  • triggering of screen image capture can be based on “total mass” count (TMC) of the image as a function of time. Triggering can be quantized based on the total amount of “mass” or ratio of bright to dark pixels of the screen that are filled. As the user places his/her hand on the touchscreen, the area filled by the parts of the screen that are covered by the hand is added up to provide a “total mass” count (TMC) as a function of time.
  • This TMC can be based on either a binarized version of the image, or the grayscale version. It may be the case that using the binarized version of the image to calculate the TMC will be more reliable for triggering purposes, but this will depend on the properties of the screen, and the user's hand.
  • Triggering on the capture of image data of the user's hand during placement will then be set to occur upon reaching specific values of the TMC in sequence as the user places his/her hand on the screen during verification.
  • Triggering of sequential "frames" of the touchscreen image of the user's hand will then depend not on time, but on the image data itself as it is captured.
  • the TMC values used for the triggering can be simply arbitrary values from low to high, or a set of other, perhaps non-linearly increasing values of the TMC.
  • the set of TMC triggering values may be determined at the time of enrollment by the user, and can be based on the TMC values reached over a linear time sequence during the enrollment process, after which the originally-used time sequence can be abandoned, and triggering will be based on the predetermined TMC values. Otherwise, any other favorable sequence of TMC values can be predetermined, and used for the triggering process, if such a set of values is known from testing to deliver reliable triggering results over a variety of users.
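The TMC scheme above can be sketched as follows: the TMC is the number of "filled" pixels in the binarized frame, and the triggering values are the TMC levels reached during an enrollment placement. The sample frames and the binarization threshold are illustrative assumptions:

```python
def tmc(frame, threshold=128):
    """Total mass count: number of pixels at or above the binarization
    threshold (a grayscale frame, binarized on the fly)."""
    return sum(1 for row in frame for px in row if px >= threshold)

# Enrollment: frames captured on a fixed clock; record the TMC per frame.
enrollment_frames = [
    [[0, 200], [0, 0]],
    [[180, 220], [0, 0]],
    [[255, 230], [200, 0]],
]
trigger_values = [tmc(f) for f in enrollment_frames]
print(trigger_values)  # [1, 2, 3]

# Verification: capture a frame whenever the live TMC reaches the next
# stored value; clock time no longer matters, only the image data.
live_tmcs = [0, 1, 1, 2, 2, 2, 3]
captured_at = []
i = 0
for t, m in enumerate(live_tmcs):
    if i < len(trigger_values) and m >= trigger_values[i]:
        captured_at.append(t)
        i += 1
print(captured_at)  # [1, 3, 6]
```

After enrollment the original fixed-time sequence can be discarded, exactly as described above: only the TMC levels themselves drive subsequent captures.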
  • FIG. 11 illustrates a flow diagram showing various biometric processes including a triggering methodology in accordance with some embodiments.
  • FIG. 12 illustrates an evolving contact pattern relative to predetermined triggering thresholds.
  • triggering of screen image capture can be based on sequential filling of spatial regions of the screen.
  • The filling of various regions of the screen can be used as triggering events. This can be done by dividing the screen area into a number of regions prior to verification. The division can be made arbitrarily, for example on a 2-D grid with rectangular regions that are equal to, or larger than, the fundamental spatial resolution of the screen, or using any other arbitrary, pre-determined segmentation desired, such as concentric arc-shaped segments.
  • the arbitrary segmentation can be done, and stored in the device memory prior to the use of the application by any user, or prior to its use by each user. It can also be done after enrollment by the user, thereby making the segmentation map unique for each user.
  • The triggering scheme can be optimized for each user, and for the way each user places his/her fingers on the screen. This would be done via an algorithm that analyzes the finger contact pattern during enrollment and stores information about the time sequence in which various regions of the screen are "filled" as the user enrolls. The exact time at which a region is filled does not matter, only the relative order in which regions get filled.
  • a start command would initiate the process, and the device will wait to capture a full screen image until the first triggering region is filled. Once this happens, the device again waits until the next screen region in the sequence is filled, and triggers another full-screen image capture. This process continues until verification is terminated.
  • the termination point can also be triggered either by the filling of all the designated regions, or when the final region in the sequence is filled. It can also be terminated early if the designated regions are filled out of sequence. This can be used as part of the verification process.
  • A user's verification attempt can be rejected early if the placement fills the regions in a sequence that is too different from the sequence (or sequences, since enrollment will usually require more than one placement for reliability) stored during enrollment. If the user has shifted his/her position up, down, left or right along the screen during verification compared with the original location during enrollment, a simple translation shift can be applied when analyzing the filling of the screen regions in order to compensate. A similar sort of translation shift must be used when comparing the actual enrollment pattern with any verification pattern as well.
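The region-fill ordering above can be sketched as follows: the screen is divided into a grid of regions, and the order in which regions are first touched during placement is compared with the order stored at enrollment. The region size, the "filled" rule (first touch fills a region), and the sample touch coordinates are all illustrative assumptions:

```python
REGION = 2  # each region is a 2x2 block of sensor cells (illustrative)

def region_of(x, y):
    return (x // REGION, y // REGION)

def fill_order(touches):
    """Return the regions in the order they are first touched."""
    order = []
    for x, y in touches:
        r = region_of(x, y)
        if r not in order:
            order.append(r)
    return order

enrolled = fill_order([(0, 0), (1, 0), (2, 1), (3, 3)])
attempt  = fill_order([(1, 1), (0, 1), (3, 0), (2, 2)])
print(enrolled)             # [(0, 0), (1, 0), (1, 1)]
print(attempt == enrolled)  # True: same regions, filled in the same order

# A placement that fills the regions out of sequence can be rejected
# early, before a full pattern comparison is ever run.
out_of_sequence = fill_order([(3, 3), (0, 0)])
print(out_of_sequence == enrolled)  # False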
  • FIG. 13 illustrates a flow diagram showing various biometric processes including a triggering methodology in accordance with some embodiments.
  • FIG. 15 is a block diagram of a system 300 which includes an electronic device configured to implement a biometric system and method according to various embodiments of the disclosure.
  • the system 300 shown in FIG. 15 includes a touch sensor 302 coupled to a processor 304 .
  • the touch sensor 302 may include one or more user-actuatable buttons 303 .
  • the touch sensor may be fabricated according to various technologies, including capacitive, resistive, force, and acoustic technologies, for example.
  • the touch sensor 302 may optionally incorporate an integral or separate pressure sensor capable of sensing pressure at various locations of the sensor surface.
  • the processor 304 includes a clock 305 which may be separate from the main clocks of the processor 304 .
  • the clock 305 may be dedicated to perform clocking functions for the biometric signature algorithms 307 stored in a memory 306 coupled to the processor 304 .
  • a speaker and/or microphone unit 308 may be included.
  • the system 300 may further include one or more wired and/or wireless communication units 310 , such as one or more radios (e.g., cellular, Wi-Fi), transceivers (e.g., Bluetooth), and hardwire interfaces (e.g., Ethernet).
  • the communication unit(s) 310 are coupled to the processor 304 and provide communicative coupling to external systems and networks, such as the Internet 312 .
  • the processor 304 may communicate with a remote server 314 , for example, via the Internet 312 or other communication link.
  • biometric data can be transferred between the processor 304 /memory 306 and the remote server 314 .
  • the processor 304 and the remote server 314 may operate cooperatively during one or more biometric processes described hereinabove.
  • Components of the system 300 shown in FIG. 15 can be incorporated in a variety of electronic devices, such as a mobile phone, tablet, PC, and the like.

Abstract

A touch sensor is configured to sense a contact event occurring between a body part of a user and the touch sensor. The touch sensor is configured to produce contact data in response to the sensed contact event. A processor is coupled to the touch sensor and memory. The processor is configured to store in the memory a sequence of data frames each comprising contact data associated with a different portion of the user's body part. The processor is further configured to generate biometric signature data using the sequence of data frames.

Description

    RELATED PATENT DOCUMENTS
  • This application is a continuation of U.S. patent application Ser. No. 13/651,408, filed Oct. 13, 2012, which claims the benefit of Provisional Patent Application Ser. Nos. 61/546,838, filed on Oct. 13, 2011, and 61/563,138, filed on Nov. 23, 2011, to which priority is claimed and which are hereby incorporated herein by reference.
  • SUMMARY
  • Embodiments of the disclosure are directed to a biometric system and method for use in devices that include a touch sensor. Method embodiments of the disclosure involve sensing a contact event occurring between a body part of a user and a touch sensor and storing, for the contact event, a sequence of data frames each comprising contact data associated with a different portion of the user's body part. Method embodiments further involve generating biometric signature data using the sequence of data frames.
  • Apparatus embodiments of the disclosure include a touch sensor configured to sense a contact event occurring between a body part of the user and the touch sensor. The touch sensor is configured to produce contact data in response to the sensed contact event. A processor is coupled to the touch sensor and memory. The processor is configured to store in the memory a sequence of data frames each comprising contact data associated with a different portion of the user's body part. The processor is further configured to generate biometric signature data using the sequence of data frames.
  • These and other features can be understood in view of the following detailed discussion and the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1-5 are flow diagrams showing biometric processes in accordance with various embodiments of the disclosure;
  • FIG. 6 is a flow diagram showing biometric processes including generation of visual and/or audio instructions in accordance with various embodiments of the disclosure;
  • FIG. 7 is a flow diagram showing biometric processes involving an enrollment process for generating a biometric signature and post-generation use of the biometric signature in accordance with various embodiments of the disclosure;
  • FIG. 8 is a flow diagram showing biometric processes involving unlocking of an electronic device using a biometric signature in accordance with various embodiments of the disclosure;
  • FIGS. 9-11 and 13 are flow diagrams showing biometric processes involving a triggering algorithm in accordance with various embodiments of the disclosure;
  • FIGS. 12 and 14 are diagrams depicting aspects of biometric processes involving the triggering algorithms of FIGS. 11 and 13, respectively, in accordance with various embodiments of the disclosure; and
  • FIG. 15 is a block diagram of a system which includes an electronic device configured to implement a biometric system and method according to various embodiments of the disclosure.
  • DESCRIPTION
  • In the following description of the illustrated embodiments, references are made to the accompanying drawings forming a part hereof, and in which are shown by way of illustration, various embodiments by which the invention may be practiced. It is to be understood that other embodiments may be utilized, and structural and functional changes may be made without departing from the scope of the present invention.
  • Embodiments of the disclosure relate to a biometric system and method for use with a wide variety of systems and devices. A biometric system and method according to embodiments of the disclosure find particular utility when incorporated into mobile electronic devices, such as portable communication devices.
  • In the context of portable communication devices, there is a rapidly increasing need for a biometric system that can be implemented on common, currently available mobile devices, mainly mobile phones and tablet computers. This need is becoming critical as the use of mobile devices for conducting transactions, in particular financial transactions, increases while the sophistication, and therefore the risk, of cyber crime grows.
  • Despite the apparent trajectory for the use of mobile devices, and the obvious advantages of biometrics as a component of an effective security protocol, there is no readily available biometric device that is easily implementable on mobile phones. There are small, inexpensive fingerprint readers (swipe sensors, and a few area sensors) appearing on some phone models, as well as a few voice and face recognition applications that can be used with the microphone or camera available on most phones. However, these work poorly and would require additional, or much better, biometric hardware to be added to the phone in order to be reliable and accurate enough to provide serious security. User acceptance is also a very important factor in biometrics, and many users are wary of the familiar biometric systems, in particular of fingerprint readers.
  • A new biometric system that is easy to use, simple to implement using existing hardware on mobile phones and tablets, and is different enough from any of the common biometrics that are used now could fill the need for an effective way of verifying a user's identity without a long technology development cycle, or the need for hardware manufacturers to build a new device into their products. This would allow the added security of biometrics to be put in place on the very short timescale needed to support the rising security needs of mobile transactions.
  • Embodiments of the inventions described herein use modern touch screen technology to acquire the patterns of one, two, three, or four of the user's fingers pressed onto the screen together, or in a particular sequence. To increase the sophistication and the amount of usable biometric information, a time-dependent pattern is used, not just a simple static finger geometry pattern. In other words, a rapid series of “frames” of the user's finger geometry is recorded as he/she presses down onto the screen. Because of the natural 3-D geometry of a user's fingers and hand, this time-dependent pattern has additional and useful complexity. This time-dependence allows each user to develop a special technique of hand placement, perhaps with additional conscious movements that only he/she knows, in a sense adding a password-like component to the pattern.
  • Since the user's hand geometry is unique, and not controlled by the user, there will always be involuntary components to the biometric signature that are unknown, even to the user, both spatially (the hand geometry pattern in 2-D on the screen), as well as the involuntary time dependence (the way the pattern changes over time as the user presses down on the screen). This is critical for biometric security because it cannot be given away, even by the user himself/herself.
  • The additional possibility that the user can also make extra voluntary movements during hand placement creates even more uniqueness, and therefore valuable complexity to the biometric “signature” that will be used by the system. Furthermore, the approach is not limited to hand or finger geometry. It can be used with other body parts, most notably the pinna of the ear. This can be seen as potentially desirable since most touch screen phones are constructed in such a way that when the user holds the phone to talk, he/she will inevitably press his/her ear against the touch screen somewhat consistently every time the phone is used. It may be possible to successfully enroll the ear print pattern in the same way as the time-dependent hand print pattern would be, and then to use this ear print pattern as another biometric signature. In principle, other body parts could also be used, such as the lips, knuckles, a part of the arms or legs, foot, etc. These are not likely to be as practical as the hand, or even the pinna of the ear, but they are mentioned here for completeness.
  • One significant advantage of the added element of time-dependence of the biometric signature is that there will not be any possibility for hackers to “lift” a so-called “latent print.” Latent prints are a problem with some fingerprint systems because the user may leave a clear image of his/her fingerprint on the sensor (or possibly elsewhere) that can be copied using one of several methods, and used to create a false fingerprint, or, in some cases, the latent print left on a sensor can be induced to trigger the sensor again through the application of heat, cold, illumination, or something else, depending on the sensor technology. The issue of latent prints is essentially rendered inconsequential by the element of time-dependence in the approaches described herein.
  • This approach is chosen because the resolution of current touch screens is suitable for such use, but they are not capable of resolving finer patterns, such as those of a fingerprint. The use of time dependence in the biometric embodiments disclosed herein represents a unique capability not found in conventional biometrics. The temporal component of the data differentiates it from other biometrics, and opens up a number of possibilities for the user to participate in creating the “password” or “key”, as well as for increasing the complexity of the biometric signature.
  • The spatial dimension of the data will be quite low-resolution with current state-of-the-art touchscreens, and will therefore not by itself contain sufficient information. With the time dimension added, however, the dataset will effectively be three-dimensional, and will contain a great deal of information that can be used to differentiate a large number of users, as well as provide enough information density to overcome the natural variations in biometric signatures from which all biometric systems suffer. If the touchscreen also has the ability to measure pressure as a function of position on the screen, as a pressure-sensitive touch screen does, then yet another dimension can be added to the biometric signature: that of pressure. The data set would then effectively be four-dimensional, having two spatial dimensions and pressure, all as a function of time.
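As a hypothetical illustration of the dimensionality described above, the captured data can be organized as a time sequence of 2-D coverage grids, with an optional parallel pressure grid per frame. The class and method names below are illustrative only and are not part of the disclosed embodiments.

```python
class ContactRecording:
    """Time sequence of low-resolution contact frames. Each frame is a 2-D
    grid of 0/1 coverage values, with an optional parallel grid of pressure
    values of the same shape (the fourth dimension, when available)."""

    def __init__(self):
        self.coverage = []   # list of 2-D grids, one per captured frame
        self.pressure = []   # optional list of 2-D grids, same shape

    def add_frame(self, coverage_grid, pressure_grid=None):
        self.coverage.append(coverage_grid)
        if pressure_grid is not None:
            self.pressure.append(pressure_grid)

    def dimensions(self):
        """(frames, rows, cols) of the spatio-temporal data set."""
        if not self.coverage:
            return (0, 0, 0)
        return (len(self.coverage),
                len(self.coverage[0]),
                len(self.coverage[0][0]))
```

Stacking frames this way makes the dataset effectively three-dimensional (two spatial dimensions plus time), or four-dimensional when the per-frame pressure grid is present.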
  • Turning now to FIG. 1, there is illustrated a flow diagram showing biometric processes in accordance with various embodiments. The biometric method depicted in FIG. 1 involves sensing 110 a contact event involving a body part of a user, and producing 112 contact data. The contact data preferably includes spatial data associated with the 2-D geometry of the user's body part and temporal data associated with the development of a contact pattern for the contact event. The method of FIG. 1 further involves generating 114 biometric signature data using the contact data. It is noted that content of the contact data can be different for different embodiments of the disclosure.
  • A typical contact event involves an intentional touching of a touch sensitive device by a user. For example, the user may place one or more fingers (or palm, for example) on a touch sensor of the touch sensitive device, which can define a contact event. By way of further example, the user may use one or more fingers to swipe across a region of the touch sensor, which can define a contact event. It is understood that a wide variety of static (stationary) and dynamic (moving) contact events are contemplated. It is noted that, in the case of a static contact event, a resulting contact event still involves development of a contact pattern over time, since the area of contact between the user's body part and the touch sensor changes between initial contact and a stationary state.
  • FIG. 2 illustrates a flow diagram showing various biometric processes in accordance with other embodiments. In FIG. 2, the biometric method involves sensing 120 a contact event involving a body part of a user, and producing 122 contact data. The contact data preferably includes data indicative of a time-dependent pattern of the geometry of different portions of the user's body part as the contact event evolves over time. The method of FIG. 2 further involves generating 124 biometric signature data using the contact data.
  • The flow diagram illustrated in FIG. 3 shows various biometric processes in accordance with some embodiments. The biometric method shown in FIG. 3 involves sensing 130 a contact event involving a body part of a user, and producing 132 contact data. The contact data according to this embodiment preferably includes data indicative of a chronology of frames of data of the contact event (referred to herein as “contact data frames”). The method of FIG. 3 further involves generating 134 biometric signature data using the contact data.
  • FIG. 4 illustrates a flow diagram showing various biometric processes in accordance with further embodiments. In FIG. 4, the biometric method involves sensing 140 a contact event involving a body part of a user, and producing 142 a sequence of data frames each comprising contact data associated with a different portion of the user's body part as the contact pattern develops. The method of FIG. 4 also involves generating 144 biometric signature data using the contact data.
  • Embodiments of the disclosure can acquire other and/or additional data for purposes of implementing various biometric processes. According to the embodiment shown in FIG. 5, the biometric method involves sensing 150 a contact event involving a body part of a user, and sensing 152 pressure resulting from the contact event. Contact data is produced 154 for this contact event from which a contact pattern can be developed. The contact data in FIG. 5 preferably includes spatial data associated with the 2-D geometry of the user's body part, pressure data associated with the touch, and temporal data associated with the development of a contact and pressure profile for the contact event. The method of FIG. 5 also involves generating 154 biometric signature data using the contact data.
  • FIG. 6 illustrates a flow diagram showing various biometric processes in accordance with some embodiments. In FIG. 6, the biometric method involves generating 160 visual and/or audio instructions to aid a user in creating or validating a biometric signature. The method also involves sensing 162 a contact event involving a body part of a user in accordance with the generated instructions, and producing 164 contact data indicative of spatial dimensions of the user's body part as a function of time. The method of FIG. 6 further involves generating 166 biometric signature data using the contact data.
  • The flow diagram of FIG. 7 illustrates an enrollment process for generating a biometric signature and post-generation use of the biometric signature in accordance with various embodiments of the disclosure. During the enrollment process, a user establishes his or her biometric signature which, once created, can be used to provide secured access to electronic devices, applications, websites, and other secured systems and software processes (e.g., bank transactions via a mobile phone or tablet).
  • The enrollment process shown in FIG. 7 (blocks 170-175) involves the user placing 170 a body part on a touch sensor. The user may select which body part (e.g., hand or ear) and the number of body parts to be used to create the user's biometric signature. For example, the user may choose to use 3 fingers that are placed near the center of the touch sensor. In another example, the touch sensor may be pressed against the pinna of the user's ear, which may be convenient for secured use of mobile phones that incorporate a touch sensor.
  • With the body part being placed on the touch sensor, the enrollment process involves sensing 171 a contact event involving the selected body part(s), and producing 172 contact data. The contact data according to this embodiment preferably includes data indicative of a time-dependent pattern of the geometry of different portions of the user's body part as the contact event evolves over time. The method further involves generating 173 a biometric signature for the user using the contact data. The processes of blocks 170-173 may be repeated 174 to enhance the reliability (e.g., stability, repeatability) of the user's biometric signature. The enrollment process concludes with storing 175 the user's biometric signature for subsequent use. The biometric signature can be stored locally on a mobile electronic device owned by the user, on a remote server or both locally and remotely. The user's biometric signature is now available for use with various secured applications, websites, services, systems, and devices that require user authentication.
  • In the post-enrollment use example illustrated in FIG. 7 (blocks 176-182), it is assumed that a biometric algorithm of the present disclosure is operating on an electronic device and that the device is operating an application that requires user authentication prior to allowing use or access to the device and/or application. At a time subsequent to the enrollment process, the method of FIG. 7 further involves attempting 176 to execute a secured application on the electronic device, which may be a mobile phone, tablet or PC, for example. Attempting to execute the application involves creating a contact pattern for the user in the same manner as when generating the user's biometric signature. It is understood that the touch sensor and/or electronic device used to generate the biometric signature can be the same as, or different from, those used to produce the contact pattern.
  • With continued reference to FIG. 7, a touch sensor of the electronic device senses 177 contact with a body part of the user and produces 178 contact data in a manner previously described. The body part(s) brought into contact with the touch sensor of the electronic device is the same as that/those used to create the user's biometric signature. The just-produced contact pattern is compared 180 to the user's biometric signature to authenticate the user. This comparison can be performed by the electronic device, by a remote system communicatively coupled to the electronic device, or cooperatively by both entities. The user is granted access 182 to the application only if the contact pattern matches the biometric signature.
  • FIG. 8 illustrates a flow diagram showing various biometric processes in accordance with some embodiments. In FIG. 8, a procedure for unlocking 190 an electronic device is initiated. It is assumed in the illustrative embodiment of FIG. 8 that a biometric algorithm of the present disclosure is operating on the electronic device and that the device is operating an application that requires user authentication prior to allowing use or access to the device and/or application. The unlocking procedure involves interrupting 192 a normal password prompt typically generated by the application. Instead, a biometric signature verification procedure of the present disclosure is initiated 194. Contact data resulting from sensing contact between a user's body part and a touch sensor of the electronic device is acquired, from which a time-dependent contact pattern is generated. This time-dependent contact pattern is used by the biometric algorithm to verify 196 authenticity of the user.
  • The biometric algorithm passes 198 control back to the application, and the electronic device is unlocked only if the verification procedure is successful. If the verification procedure is unsuccessful (i.e., the contact pattern constructed from the contact data does not match the user's biometric signature), a signal indicative of such failure is generated 200, and the electronic device is maintained in the locked state. In response to the generated signal, a message indicating the unsuccessful verification is communicated to the user, typically via a visual and/or audio message.
  • In accordance with the biometric processes shown in FIG. 9, various embodiments of the disclosure provide for detection 201 of a triggering event which, when detected, causes initiation of biometric signature data acquisition. Detection 201 of a triggering event causes initiation 202 of a procedure for detecting a contact pattern of a user's body part relative to a touch sensor that results from a contact event with the touch sensor. The method further involves recording 204 contact pattern data comprising spatial dimensions of the user's body part in contact with the touch sensor as a function of time. The method also involves terminating 206 the detection and recording processes after either successful completion or failure of the contact pattern detection procedure. It is understood that a variety of triggering methodologies are contemplated, and that a triggering protocol can be implemented for one or both of the biometric signature and contact pattern generation processes.
  • The addition of the time dimension adds a complexity that is both desirable and potentially excessive; however, this complexity can be quantized and controlled through creative design of the data acquisition algorithms. The excessive complexity can come from the fact that a user may change the “speed” with which he/she executes the hand placement on the touchscreen, effectively shortening or lengthening the extent of the dataset in the time dimension depending on whether the placement is faster or slower. This would be the case if each “frame” of the pattern is acquired according to an independent clock that runs on the device, collecting the data in fixed time steps in the same way a movie camera, for example, acquires a sequence of images of a scene. This can make matching with the enrolled pattern difficult. One way to handle this is to develop algorithms that can compress or stretch the dataset in the time dimension as part of the pattern-matching component of the biometric algorithm suite. Such a method may be based on existing “morphing” techniques, wavelet transforms, or other known methods, or it may be developed specifically for use in this application. This is one viable approach; however, it is not likely to be the fastest or most efficient, and may have other problems.
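One simple, hypothetical instance of compressing or stretching the dataset in the time dimension is linear resampling of a time series (for example, a total-mass-count trace) to a fixed number of points. The disclosure contemplates more sophisticated methods such as morphing or wavelet transforms; the following is only a minimal sketch, and the function name is illustrative.

```python
def resample(trace, n):
    """Linearly resample a time series to n points, compressing or
    stretching the time axis so that placements executed at different
    speeds can be compared on a common time base."""
    if n == 1 or len(trace) == 1:
        return [trace[0]] * n
    out = []
    step = (len(trace) - 1) / (n - 1)
    for i in range(n):
        x = i * step              # fractional index into the original trace
        lo = int(x)
        hi = min(lo + 1, len(trace) - 1)
        frac = x - lo
        out.append(trace[lo] * (1 - frac) + trace[hi] * frac)
    return out
```

After resampling both the enrolled trace and the verification trace to the same length, a fast placement and a slow placement of the same hand produce directly comparable sequences.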
  • An alternate, and possibly better, approach would be to quantize the time dimension of the data during acquisition based on the pattern itself, and not on an independent clock. This would require specialized triggering algorithms to “step” the acquisition of the pattern on the screen as a function of time while the user is making the placement. Since the only important thing in the time dimension is which aspects of the pattern appear first, second, third, and so on, quantizing time through special triggering based on the fundamental properties of the pattern itself as it develops during the placement means that the time axis of the data set will be controlled in real time, during the placement, by the user him/herself in an automated and unintentional fashion. The time aspect of the user's hand placement need not be linear in this case, as the triggering algorithm will “examine” the pattern as it develops in time, and trigger the capture of the “movie frames” of the pattern automatically based on some criteria of the image (pattern) itself. There are several possible designs for this specialized triggering algorithm, examples of which are described below.
  • FIG. 10 illustrates a flow diagram showing various biometric processes including a triggering methodology in accordance with various embodiments. FIG. 10 shows a number of different processes involving generating a contact pattern in response to a triggering event and validating this contact pattern against a pre-established biometric signature for the user. In general terms, the triggering processes described herein rely on storing of a sequence of data “points”, which are taken during the enrollment process or are predetermined somewhat arbitrarily. During use of a secured electronic device, for example, when the user places his/her hand on the touch sensor (e.g., screen), the thresholds for triggering are based on these stored data points (for example, a series of different total mass count levels as is described below). As the placement occurs, the decision to trigger capturing of a frame of contact data is made based on the first data point (ith data point), then, once that is done, the next triggering event is based on the second data point (ith+1 data point), and so forth. As discussed previously, the triggering and contact data generation methodologies shown in FIG. 10 can be implemented for one or both of the biometric signature and contact pattern generation processes.
  • The processes shown in FIG. 10 include sensing 210 a contact pattern evolving in real-time for a contact event involving a body part of a user. The method also involves acquiring 212 data from the contact event that is used by a triggering algorithm, which is typically running on an electronic device operable by the user. If the acquired data meets or exceeds a triggering threshold 214, a triggering event is declared and a frame of contact data is captured 216 at a current clock time Tx. If the acquired data fails to meet or exceed the triggering threshold 214, the data acquisition process of block 212 continues.
  • Each triggering event 214 results in the capturing 216 of an additional frame of contact data for the contact event. Over time, the sequence of captured data frames defines a contact pattern that evolves as the contact event evolves. At some stage of the procedure, a validation operation occurs to verify whether or not the developing or developed contact pattern corresponds to the pre-established biometric signature of the user. In some embodiments, each captured contact data frame is added 218 to a developing contact pattern which, when sufficiently formed, is subsequently subjected to validation. In other embodiments, each captured frame of contact data to be added to a developing contact pattern is subjected to validation 220 against the pre-established biometric signature of the user. By way of example, if a given frame of captured contact data is determined to be out of sequence relative to its expected position within the contact data frame sequence of the biometric signature, this contact data frame would be considered invalid.
  • As the contact data capturing routine continues, a test 224 is made to determine if enough contact data has been collected for the developing contact pattern. If not, the clock continues to run 228 and the data acquisition process in block 212 continues. If a sufficient amount of contact data has been collected for the developing contact pattern, the contact pattern is compared 226 to the user's pre-established biometric signature. If determined to be invalid, the procedure of FIG. 10 is terminated 232 and an invalidity signal is generated. If determined valid, the procedure of FIG. 10 is terminated 234 and a signal confirming validity is generated. It is understood that the validation processes of blocks 226, 230, and 232 would not be implicated when creating the user's biometric signature in accordance with the methodology shown in FIG. 10.
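The triggering-and-capture loop of FIG. 10 can be sketched, under simplifying assumptions, as a walk over a stream of samples that captures a frame each time the acquired value reaches the next stored trigger threshold in sequence. The data representation and names here are hypothetical and stand in for the stored data points described above.

```python
def capture_frames(samples, thresholds):
    """Walk a stream of (time, value, frame) samples and capture a frame each
    time the acquired value reaches the next stored trigger threshold.
    Returns the captured (time, frame) pairs and whether every threshold in
    the stored sequence was reached."""
    captured = []
    i = 0  # index of the next threshold to reach (ith data point)
    for t, value, frame in samples:
        if i < len(thresholds) and value >= thresholds[i]:
            captured.append((t, frame))
            i += 1  # subsequent triggering is based on the (i+1)th data point
    return captured, i == len(thresholds)
```

A placement that never reaches the final stored threshold leaves the sequence incomplete, corresponding to the failure branch of the procedure; the completed sequence of captured frames forms the developing contact pattern submitted for validation.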
  • Verification of a biometric signature can involve a number of different validation techniques. One approach involves comparing a temporal order of data frames of a developing or developed contact pattern to that of the biometric signature. Another approach involves comparing characteristics of the time-dependent spatial data of a contact pattern with those of the biometric signature. It is to be understood that there are many ways to use the contact information collected in space and time, including arranging the 3-D data (2-D in space, and 1-D in time) into a new format of 2-D data that can be tested for verification using existing pattern recognition methods. In fact, the biometric signature can be created from this data in a number of different ways.
  • According to some embodiments, triggering of screen image capture can be based on “total mass” count (TMC) of the image as a function of time. Triggering can be quantized based on the total amount of “mass” or ratio of bright to dark pixels of the screen that are filled. As the user places his/her hand on the touchscreen, the area filled by the parts of the screen that are covered by the hand is added up to provide a “total mass” count (TMC) as a function of time. This TMC can be based on either a binarized version of the image, or the grayscale version. It may be the case that using the binarized version of the image to calculate the TMC will be more reliable for triggering purposes, but this will depend on the properties of the screen, and the user's hand.
  • Capture of image data of the user's hand during placement will then be triggered upon reaching specific values of the TMC in sequence as the user places his/her hand on the screen during verification. Thus, triggering of sequential “frames” of the touchscreen image of the user's hand will not depend on the time, but upon the image data itself being captured. The TMC values used for the triggering can be simply arbitrary values from low to high, or a set of other, perhaps non-linearly increasing, values of the TMC. The set of TMC triggering values may be determined at the time of enrollment by the user, and can be based on the TMC values reached over a linear time sequence during the enrollment process, after which the originally-used time sequence can be abandoned, and triggering will be based on the predetermined TMC values. Otherwise, any other favorable sequence of TMC values can be predetermined and used for the triggering process, if such a set of values is known from testing to deliver reliable triggering results over a variety of users.
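A minimal sketch of the TMC computation follows, assuming each frame is a grid of grayscale values normalized to [0, 1]; both the binarized and grayscale variants mentioned above are shown, and the binarization threshold is illustrative.

```python
def tmc_binarized(frame, threshold=0.5):
    """TMC from a binarized version of the image: count the pixels whose
    grayscale value meets the threshold, i.e. the screen area covered by
    the hand."""
    return sum(1 for row in frame for v in row if v >= threshold)

def tmc_grayscale(frame):
    """TMC from the grayscale image directly: sum the pixel values, so
    partially-covered pixels contribute fractionally."""
    return sum(v for row in frame for v in row)
```

Which variant triggers more reliably will depend, as noted above, on the properties of the screen and of the user's hand.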
  • FIG. 11 illustrates a flow diagram showing various biometric processes including a triggering methodology in accordance with some embodiments. In FIG. 11, the biometric method involves sensing 240 a contact pattern evolving in real-time for a contact event involving a user's body part. While the contact pattern continues to evolve 242 (e.g., for i=1 to N), the total mass count (ith TMC) of the contact pattern image is determined 244. If the ith TMC exceeds an ith predetermined threshold 246, a triggering event is declared 250, and an ith frame of contact data is captured 252 at the current clock time Tx. If not, the clock continues to run and the processes beginning at block 242 continue. The processes above continue 254 until a sufficient amount of contact data (e.g., i=N) has been captured, at which point a validation process may be initiated. FIG. 12 illustrates an evolving contact pattern relative to predetermined triggering thresholds. When the ith TMC of the contact pattern exceeds the ith triggering threshold, an ith frame of contact data is captured at the current clock time Tx, thereby creating a sequence (i=1 to N) of contact data frames.
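The capture loop of FIG. 11 can be sketched as below. This is a hedged illustration: the `frame_stream` of (time, frame) pairs, the injected `tmc_fn`, and the policy of testing one threshold per incoming frame are all assumptions about how a concrete implementation might be organized.

```python
def capture_by_tmc(frame_stream, tmc_thresholds, tmc_fn):
    """Capture a sequence of contact-data frames, declaring the ith
    triggering event when the evolving contact pattern's TMC first
    exceeds the ith predetermined threshold. frame_stream yields
    (time, frame) pairs; returns the captured (time, frame) pairs."""
    captures = []
    i = 0
    for t, frame in frame_stream:
        if i >= len(tmc_thresholds):
            break                         # i = N frames captured; stop
        if tmc_fn(frame) > tmc_thresholds[i]:
            captures.append((t, frame))   # capture ith frame at Tx = t
            i += 1
    return captures
```

Note that capture is driven by the image data itself reaching each TMC value, not by the clock: the times `Tx` are merely recorded alongside the frames.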
  • According to other embodiments, triggering of screen image capture can be based on sequential filling of spatial regions of the screen. Instead of using TMC to determine the points in time at which each new image frame is captured during the hand placement on the screen, the filling of various regions of the screen can be used as triggering events. This can be done by dividing up the screen area into a number of regions prior to verification. This can be done arbitrarily, for example by dividing the area up on a 2-D grid, with rectangular regions that are equal to, or larger than, the fundamental spatial resolution of the screen, or using any other arbitrary, pre-determined segmentation desired, such as concentric arc-shaped segments, etc. The arbitrary segmentation can be done, and stored in the device memory, prior to the use of the application by any user, or prior to its use by each user. It can also be done after enrollment by the user, thereby making the segmentation map unique for each user.
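The simplest segmentation mentioned above, a rectangular 2-D grid, can be sketched as follows. The function name and the (x0, y0, x1, y1) tuple convention are illustrative assumptions; any other pre-determined segmentation (concentric arcs, etc.) would replace this step.

```python
def grid_segments(width, height, nx, ny):
    """Divide a width-by-height touch sensor into an nx-by-ny grid
    of rectangular regions, each returned as (x0, y0, x1, y1) with
    exclusive upper bounds. Each region should be at least as large
    as the sensor's fundamental spatial resolution."""
    w, h = width // nx, height // ny
    return [(ix * w, iy * h, (ix + 1) * w, (iy + 1) * h)
            for iy in range(ny) for ix in range(nx)]
```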
  • By choosing the shape and size of segments of the screen to examine during later verification attempts based on how the user placed his/her fingers during enrollment, the triggering scheme can be optimized for each user, and for the way each user places his/her fingers on the screen. This would be done via an algorithm that analyzes the finger contact pattern during enrollment, and stores information about the time sequence in which various regions of the screen are “filled” as the user enrolls. The exact time at which a region is filled does not matter, only the relative order in which the regions get filled.
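Extracting that relative fill order during enrollment can be sketched as below. The `fill_fraction` criterion (a segment counts as filled once that fraction of its pixels is covered) is an illustrative, tunable assumption; absolute timestamps are deliberately discarded, since only the order matters.

```python
import numpy as np

def enrollment_fill_order(frames, segments, fill_fraction=0.8):
    """Derive the relative order in which screen segments are
    'filled' as the user's hand comes down during enrollment.
    frames is a time-ordered sequence of binary 2-D contact images;
    segments are (x0, y0, x1, y1) rectangles. Returns segment
    indices in the order they first became filled."""
    order, filled = [], set()
    for frame in frames:
        for idx, (x0, y0, x1, y1) in enumerate(segments):
            if idx in filled:
                continue
            region = frame[y0:y1, x0:x1]
            if region.mean() >= fill_fraction:  # segment just filled
                filled.add(idx)
                order.append(idx)
    return order
```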
  • During verification, first, a start command would initiate the process, and the device will wait to capture a full screen image until the first triggering region is filled. Once this happens, the device again waits until the next screen region in the sequence is filled, and triggers another full-screen image capture. This process continues until verification is terminated. The termination point can also be triggered either by the filling of all the designated regions, or when the final region in the sequence is filled. It can also be terminated early if the designated regions are filled out of sequence. This can be used as part of the verification process.
  • A user's verification attempt can be rejected early if the placement fills the regions in a sequence that is too different from the sequence (or sequences, since enrollment will usually require more than one placement for reliability) stored during enrollment. If the user has shifted his/her position up, down, left or right along the screen during verification compared with the original location during enrollment, a simple translation shift can be applied when analyzing the filling of the screen regions in order to compensate for this. A similar sort of translation shift must be used when comparing the actual enrollment pattern with any verification pattern as well.
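Two small helpers sketch the compensation and early-rejection steps above; both function names, and the use of a simple position-wise mismatch count as the sequence distance, are illustrative assumptions (the rejection tolerance would have to be chosen empirically).

```python
def shift_segments(segments, dx, dy):
    """Apply a simple translation to the segment map, compensating
    for a hand placement shifted up/down/left/right relative to
    its location during enrollment."""
    return [(x0 + dx, y0 + dy, x1 + dx, y1 + dy)
            for x0, y0, x1, y1 in segments]

def sequence_mismatch(enrolled_order, observed_order):
    """Count positions where the observed fill sequence deviates
    from the enrolled one; an attempt can be rejected early once
    this exceeds a chosen tolerance."""
    return sum(1 for a, b in zip(enrolled_order, observed_order)
               if a != b)
```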
  • FIG. 13 illustrates a flow diagram showing various biometric processes including a triggering methodology in accordance with some embodiments. In FIG. 13, the biometric method involves sensing 260 a contact pattern evolving in real-time for a contact event involving a user's body part. While the contact pattern continues to evolve 262 (e.g., for i=1 to N), the filling of designated touch sensor segments due to being covered by the user's body part is examined 264. A test is made to determine 266 if a current (ith) segment is properly filled relative to a predetermined ith threshold. If not, the clock continues to run 268 and the processes beginning at block 262 continue. If the current segment is properly filled, a triggering event is declared 270 and an ith frame of contact data is captured 272 at the current clock time Tx, thereby creating a sequence of contact data frames. The processes above continue 274 until a sufficient amount of contact data has been captured, at which point a validation process may be initiated. FIG. 14 illustrates an evolving contact pattern relative to predetermined triggering thresholds. When filling of an ith segment during contact pattern evolution exceeds an ith triggering threshold, an ith frame of contact data is captured at the current clock time Tx, thereby creating a sequence of contact data frames (i=1 to N).
  • FIG. 15 is a block diagram of a system 300 which includes an electronic device configured to implement a biometric system and method according to various embodiments of the disclosure. The system 300 shown in FIG. 15 includes a touch sensor 302 coupled to a processor 304. The touch sensor 302 may include one or more user-actuatable buttons 303. The touch sensor may be fabricated according to various technologies, including capacitive, resistive, force, and acoustic technologies, for example. The touch sensor 302 may optionally incorporate an integral or separate pressure sensor capable of sensing pressure at various locations of the sensor surface. The processor 304 includes a clock 305 which may be separate from the main clocks of the processor 304. The clock 305 may be dedicated to perform clocking functions for the biometric signature algorithms 307 stored in a memory 306 coupled to the processor 304. A speaker and/or microphone unit 308 may be included.
  • The system 300 may further include one or more wired and/or wireless communication units 310, such as one or more radios (e.g., cellular, Wi-Fi), transceivers (e.g., Bluetooth), and hardwire interfaces (e.g., Ethernet). The communication unit(s) 310 are coupled to the processor 304 and provide communicative coupling to external systems and networks, such as the Internet 312. The processor 304 may communicate with a remote server 314, for example, via the Internet 312 or other communication link. As discussed previously, biometric data can be transferred between the processor 304/memory 306 and the remote server 314. The processor 304 and the remote server 314 may operate cooperatively during one or more biometric processes described hereinabove. Components of the system 300 shown in FIG. 15 can be incorporated in a variety of electronic devices, such as a mobile phone, tablet, PC, and the like.
  • The foregoing description of the example embodiments has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the inventive concepts to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Any or all features of the disclosed embodiments can be applied individually or in any combination; they are not meant to be limiting, but purely illustrative. It is intended that the scope be limited not by this detailed description, but rather determined by the claims appended hereto.

Claims (21)

1. A method, comprising:
sensing a contact event occurring between a body part of a user and a touch sensor;
storing, for the contact event, a sequence of data frames each comprising contact data associated with a different portion of the user's body part; and
generating biometric signature data using the sequence of data frames.
2. The method of claim 1, wherein each of the data frames comprises spatial dimensions corresponding to the geometry of a portion of the user's body part.
3. The method of claim 1, wherein the sequence of data frames defines a time-dependent pattern of geometry for different portions of the user's body part.
4. The method of claim 1, wherein the sequence of data frames comprises a sequence of geometric images for the contact event.
5. The method of claim 1, wherein each of the data frames comprises:
an involuntary spatial component associated with a contact pattern that evolves during the contact event; and
an involuntary time component associated with the manner in which the contact pattern changes over time during the contact event.
6. The method of claim 1, further comprising:
measuring pressure exerted at different locations of the touch sensor by the user's body part during the contact event; and
generating the biometric signature data using the sequence of data frames and the measured pressure.
7. The method of claim 1, wherein each frame of data is captured in response to a triggering event other than a synchronous clock signal.
8. The method of claim 1, wherein each frame of data is captured in response to a triggering event, the triggering event comprising exceeding a predetermined total mass count threshold, the total mass count corresponding to an area of the touch screen covered by a portion of the body part at a particular time.
9. The method of claim 1, wherein each frame of data is captured in response to a triggering event, the triggering event comprising filling of a designated segment of the touch sensor to a predetermined level.
10. The method of claim 1, wherein the biometric signature data is generated during an enrollment process, and the method further comprises:
sensing a subsequent contact event occurring between the body part of the user and the same or different touch sensor;
storing, for the subsequent contact event, a sequence of subsequent data frames each comprising contact data associated with a different portion of the user's body part;
generating a contact pattern using the subsequent sequence of data frames; and
validating or invalidating the contact pattern using the biometric signature data.
11. The method of claim 1, further comprising:
prior to performing a user-selected function, verifying the biometric signature data of the user; and
performing the user-selected function only upon successful verification of the biometric signature data.
12. The method of claim 1, wherein the method is performed by a mobile electronic device.
13. An apparatus, comprising:
a touch sensor configured to sense a contact event occurring between a body part of a user and the touch sensor, the touch sensor configured to produce contact data in response to the sensed contact event; and
a processor coupled to the touch sensor and memory, the processor configured to store in the memory a sequence of data frames each comprising contact data associated with a different portion of the user's body part, the processor further configured to generate biometric signature data using the sequence of data frames.
14. The apparatus of claim 13, wherein each of the data frames comprises spatial dimensions corresponding to the geometry of a portion of the user's body part.
15. The apparatus of claim 13, wherein the sequence of data frames defines:
a time-dependent pattern of geometry for different portions of the user's body part; or
a sequence of geometric images for the contact event.
16. The apparatus of claim 13, wherein each of the data frames comprises:
an involuntary spatial component associated with a contact pattern that evolves during the contact event; and
an involuntary time component associated with the manner in which the contact pattern changes over time during the contact event.
17. The apparatus of claim 13, further comprising a pressure sensor configured to sense pressure resulting from the contact event, the processor further configured to measure pressure exerted at different locations of the touch sensor by the user's body part during the contact event and generate the biometric signature data using the sequence of data frames and the measured pressure.
18. The apparatus of claim 13, wherein each frame of data is captured in response to a triggering event other than a synchronous clock signal.
19. The apparatus of claim 13, wherein each frame of data is captured in response to a triggering event, the triggering event comprising exceeding a predetermined total mass count threshold, the total mass count corresponding to an area of the touch screen covered by a portion of the body part at a particular time.
20. The apparatus of claim 13, wherein each frame of data is captured in response to a triggering event, the triggering event comprising filling of a designated segment of the touch sensor to a predetermined level.
21-23. (canceled)
US15/131,659 2011-10-13 2016-04-18 Biometric apparatus and method for touch-sensitive devices Abandoned US20160228039A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/131,659 US20160228039A1 (en) 2011-10-13 2016-04-18 Biometric apparatus and method for touch-sensitive devices

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201161546838P 2011-10-13 2011-10-13
US201161563138P 2011-11-23 2011-11-23
US13/651,408 US9314193B2 (en) 2011-10-13 2012-10-13 Biometric apparatus and method for touch-sensitive devices
US15/131,659 US20160228039A1 (en) 2011-10-13 2016-04-18 Biometric apparatus and method for touch-sensitive devices

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/651,408 Continuation US9314193B2 (en) 2011-10-13 2012-10-13 Biometric apparatus and method for touch-sensitive devices

Publications (1)

Publication Number Publication Date
US20160228039A1 true US20160228039A1 (en) 2016-08-11

Family

ID=48082560

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/651,408 Active US9314193B2 (en) 2011-10-13 2012-10-13 Biometric apparatus and method for touch-sensitive devices
US15/131,659 Abandoned US20160228039A1 (en) 2011-10-13 2016-04-18 Biometric apparatus and method for touch-sensitive devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/651,408 Active US9314193B2 (en) 2011-10-13 2012-10-13 Biometric apparatus and method for touch-sensitive devices

Country Status (4)

Country Link
US (2) US9314193B2 (en)
EP (1) EP2766795B1 (en)
BR (1) BR112014008859B1 (en)
WO (1) WO2013056195A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150304745A1 (en) * 2012-03-16 2015-10-22 Nokia Corporation A sound producing vibrating surface
US20160239652A1 (en) * 2013-10-22 2016-08-18 The Regents Of The University Of California Identity authorization and authentication
US9223955B2 (en) 2014-01-30 2015-12-29 Microsoft Corporation User-authentication gestures
US9268928B2 (en) 2014-04-06 2016-02-23 International Business Machines Corporation Smart pen system to restrict access to security sensitive devices while continuously authenticating the user
US10423769B2 (en) * 2014-06-12 2019-09-24 Maxell, Ltd. Information processing device, application software start-up system, and application software start-up method
US9904775B2 (en) * 2014-10-31 2018-02-27 The Toronto-Dominion Bank Systems and methods for authenticating user identity based on user-defined image data
TWI592845B (en) * 2015-08-28 2017-07-21 晨星半導體股份有限公司 Method and associated controller for adaptively adjusting touch-control threshold
US9886098B2 (en) * 2015-12-08 2018-02-06 Georgia Tech Research Corporation Personality identified self-powering keyboard
GB2546459B (en) 2017-05-10 2018-02-28 Tomlinson Martin Data verification

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6289114B1 (en) * 1996-06-14 2001-09-11 Thomson-Csf Fingerprint-reading system
US20050213799A1 (en) * 2002-12-19 2005-09-29 Casio Computer Co., Ltd. Pressure activated fingerprint input apparatus
US20100046810A1 (en) * 2008-08-20 2010-02-25 Fujitsu Limited Fingerprint image acquiring device, fingerprint authenticating apparatus, fingerprint image acquiring method, and fingerprint authenticating method

Family Cites Families (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9323489D0 (en) 1993-11-08 1994-01-05 Ncr Int Inc Self-service business system
US5862246A (en) 1994-06-20 1999-01-19 Personal Information & Entry Access Control, Incorporated Knuckle profile identity verification system
US6202151B1 (en) 1997-05-09 2001-03-13 Gte Service Corporation System and method for authenticating electronic transactions using biometric certificates
US6898709B1 (en) 1999-07-02 2005-05-24 Time Certain Llc Personal computer system and methods for proving dates in digital data files
US7409557B2 (en) 1999-07-02 2008-08-05 Time Certain, Llc System and method for distributing trusted time
US6895507B1 (en) 1999-07-02 2005-05-17 Time Certain, Llc Method and system for determining and maintaining trust in digital data files with certifiable time
US6948069B1 (en) 1999-07-02 2005-09-20 Time Certain, Llc Method and system for determining and maintaining trust in digital image files with certifiable time
US6688891B1 (en) 1999-08-27 2004-02-10 Inter-Tares, Llc Method and apparatus for an electronic collaborative education process model
US7704147B2 (en) 1999-10-06 2010-04-27 Igt Download procedures for peripheral devices
US7819750B2 (en) 1999-10-06 2010-10-26 Igt USB software architecture in a gaming machine
US7290072B2 (en) 1999-10-06 2007-10-30 Igt Protocols and standards for USB peripheral communications
US6996538B2 (en) 2000-03-07 2006-02-07 Unisone Corporation Inventory control system and methods
JP4321944B2 (en) * 2000-04-27 2009-08-26 富士通株式会社 Personal authentication system using biometric information
US8087988B2 (en) 2001-06-15 2012-01-03 Igt Personal gaming device and method of presenting a game
US20040236699A1 (en) 2001-07-10 2004-11-25 American Express Travel Related Services Company, Inc. Method and system for hand geometry recognition biometrics on a fob
US7360689B2 (en) 2001-07-10 2008-04-22 American Express Travel Related Services Company, Inc. Method and system for proffering multiple biometrics for use with a FOB
DE10135527A1 (en) 2001-07-20 2003-02-13 Infineon Technologies Ag Mobile station for mobile communications system with individual protection code checked before access to requested service or data is allowed
US6694045B2 (en) 2002-01-23 2004-02-17 Amerasia International Technology, Inc. Generation and verification of a digitized signature
US7197167B2 (en) 2001-08-02 2007-03-27 Avante International Technology, Inc. Registration apparatus and method, as for voting
US6935951B2 (en) 2001-09-04 2005-08-30 Igt Electronic signature capability in a gaming machine
US20030117281A1 (en) 2001-12-21 2003-06-26 Timur Sriharto Dynamic control containment unit
US7898385B2 (en) 2002-06-26 2011-03-01 Robert William Kocher Personnel and vehicle identification system using three factors of authentication
US8412623B2 (en) 2002-07-15 2013-04-02 Citicorp Credit Services, Inc. Method and system for a multi-purpose transactional platform
US20040161728A1 (en) 2003-02-14 2004-08-19 Benevento Francis A. Distance learning system
CA2527829C (en) 2003-05-30 2016-09-27 Privaris, Inc. A man-machine interface for controlling access to electronic devices
CA2438220C (en) 2003-08-06 2011-11-08 Click-Into Inc. Identification of a person based on ultra-sound scan analyses of hand bone geometry
US20050206501A1 (en) 2004-03-16 2005-09-22 Michael Farhat Labor management system and method using a biometric sensing device
JP2007529797A (en) 2004-03-19 2007-10-25 フンベル ローガー All-in-one key or control software card in mobile phones for wireless bicycle keys, cars, houses, RFID tags with authentication and payment functions
US8175345B2 (en) * 2004-04-16 2012-05-08 Validity Sensors, Inc. Unitized ergonomic two-dimensional fingerprint motion tracking device and method
US7496509B2 (en) 2004-05-28 2009-02-24 International Business Machines Corporation Methods and apparatus for statistical biometric model migration
US7314164B2 (en) 2004-07-01 2008-01-01 American Express Travel Related Services Company, Inc. System for biometric security using a smartcard
US20060016868A1 (en) 2004-07-01 2006-01-26 American Express Travel Related Services Company, Inc. Method and system for hand geometry recognition biometrics on a smartcard
US7680263B2 (en) 2004-07-29 2010-03-16 Nortel Networks Limited Agent detector, with optional agent recognition and log-in capabilities, and optional portable call history storage
US7298873B2 (en) 2004-11-16 2007-11-20 Imageware Systems, Inc. Multimodal biometric platform
US20060203880A1 (en) 2005-03-11 2006-09-14 Batcho Ronald F Sr Water stream comfort indication device
US20070011463A1 (en) 2005-07-06 2007-01-11 International Business Machines Corporation Method, system, and computer program product for providing authentication and entitlement services
US7708191B2 (en) 2005-07-28 2010-05-04 Edwin Vega Telebanking apparatus for transferring money or cash value between two parties in the same country or across national borders, for paying bills and browsing the internet
US20070048723A1 (en) 2005-08-19 2007-03-01 Caveon, Llc Securely administering computerized tests over a network
KR101184171B1 (en) 2005-09-29 2012-09-18 엘지전자 주식회사 Control Apparatus and method for Portable terminal of Biometrics touch pad
US8033515B2 (en) 2005-10-17 2011-10-11 Hewlett-Packard Development Company, L.P. System for mounting devices to a display
US9064359B2 (en) 2005-12-02 2015-06-23 Modiv Media, Inc. System for queue and service management
US7630522B2 (en) 2006-03-08 2009-12-08 Microsoft Corporation Biometric measurement using interactive display systems
US20070273658A1 (en) 2006-05-26 2007-11-29 Nokia Corporation Cursor actuation with fingerprint recognition
US7740243B1 (en) 2006-06-09 2010-06-22 Brian Kean Lottery poker game
US7995802B2 (en) 2007-01-22 2011-08-09 International Business Machines Corporation Apparatus and methods for verifying identity using biometric information collected during a pre-enrollment phase
US7900259B2 (en) 2007-03-16 2011-03-01 Prevari Predictive assessment of network risks
US8063889B2 (en) 2007-04-25 2011-11-22 Honeywell International Inc. Biometric data collection system
US20110002461A1 (en) 2007-05-11 2011-01-06 Validity Sensors, Inc. Method and System for Electronically Securing an Electronic Biometric Device Using Physically Unclonable Functions
US20080285814A1 (en) 2007-05-18 2008-11-20 James Martin Di Carlo Independent third party real time identity and age verification process employing biometric technology
US20080316045A1 (en) 2007-06-20 2008-12-25 Mobile Aspects Intelligent medical material cart
US11441919B2 (en) 2007-09-26 2022-09-13 Apple Inc. Intelligent restriction of device operations
US8468211B2 (en) 2007-10-30 2013-06-18 Schlage Lock Company Llc Communication and synchronization in a networked timekeeping environment
US20090184932A1 (en) 2008-01-22 2009-07-23 Apple Inc. Portable Device Capable of Initiating Disengagement from Host System
US8713655B2 (en) 2008-04-21 2014-04-29 Indian Institute Of Technology Method and system for using personal devices for authentication and service access at service outlets
JP5491043B2 (en) 2009-02-25 2014-05-14 京セラ株式会社 Data processing device with biometric authentication function
US8786575B2 (en) 2009-05-18 2014-07-22 Empire Technology Development LLP Touch-sensitive device and method
KR101116621B1 (en) 2009-06-29 2012-03-05 (주)아이티헬스 Biological signal sensing system using touch pad
US8754746B2 (en) 2009-11-16 2014-06-17 Broadcom Corporation Hand-held gaming device that identifies user based upon input from touch sensitive panel
KR101196759B1 (en) 2010-03-29 2012-11-05 에스케이플래닛 주식회사 Portable terminal and method for changing owner mode automatically thereof


Also Published As

Publication number Publication date
US20130094719A1 (en) 2013-04-18
EP2766795B1 (en) 2020-05-27
EP2766795A1 (en) 2014-08-20
EP2766795A4 (en) 2015-04-08
US9314193B2 (en) 2016-04-19
BR112014008859A2 (en) 2017-04-18
WO2013056195A1 (en) 2013-04-18
BR112014008859B1 (en) 2021-06-22

Similar Documents

Publication Publication Date Title
US9314193B2 (en) Biometric apparatus and method for touch-sensitive devices
US11216546B2 (en) Method for fingerprint authentication using force value
TWI592854B (en) Fingerprint authentication using touch sensor data
EP3317811B1 (en) Fingerprint authentication with template updating
EP3482331B1 (en) Obscuring data when gathering behavioral data
US20170154177A1 (en) Dynamic graphic eye-movement authentication system and method using face authentication or hand authentication
JP2014502763A (en) User identification using biokinematic input
KR20160114608A (en) User-authentication gestures
WO2016115786A1 (en) Terminal unlocking method and device, terminal and computer storage medium
CN107408208B (en) Method and fingerprint sensing system for analyzing a biometric of a user
JP2023549934A (en) Method and apparatus for user recognition
Conti et al. Usability analysis of a novel biometric authentication approach for android-based mobile devices
KR102065912B1 (en) Apparatus and method for obtaining image for user authentication using sensing pressure
CN112492090A (en) Continuous identity authentication method fusing sliding track and dynamic characteristics on smart phone
Akhtar et al. Multitrait Selfie: Low-Cost Multimodal Smartphone User Authentication
CN105701392B (en) Information processing method and electronic equipment
JP2017091276A (en) Operation permission determination device, operation permission determination system, operation permission determination method, and operation permission determination program
TWI646474B (en) Forged-physiological-characteristic filtering device of identity authentication system
CN111353139A (en) Continuous authentication method and device, electronic equipment and storage medium
CN112204571A (en) Method for authenticating user
CN114026614B (en) Method and system for enrolling fingerprints
Srivastva et al. Biometrics based identification techniques (BIT)
CN114519892A (en) Challenge-response method for biometric authentication

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION