US20160012273A1 - Efficient Texture Comparison - Google Patents

Info

Publication number
US20160012273A1
Authority
US
United States
Prior art keywords
fingerprint
template
pattern
map
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/855,073
Inventor
Wayne C. Westerman
Byron B. Han
Craig A. Marciniak
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc
Priority to US14/855,073 (published as US20160012273A1)
Publication of US20160012273A1
Priority to US15/669,789 (published as US9846799B2)
Legal status: Abandoned

Classifications

    • G06K9/00013
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06K9/00067
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1347Preprocessing; Feature extraction
    • G06V40/1359Extracting features related to ridge properties; Determining the fingerprint type, e.g. whorl or loop

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Input (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A scannable object is sensed and scanned. A map is constructed based on the scan results. The map is compared to one or more stored templates. Results of the comparison are provided. In some implementations, a secured processor may construct the map and may provide reduced resolution (and/or other versions that contain less information) versions of the map and/or the stored templates to one or more other processors. The one or more other processors may determine a match-set based on matching between the reduced resolution map and stored templates. The secured processor may then identify whether or not a match exists between the map and any stored template based on the match-set.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 13/797,970, filed Mar. 12, 2013, entitled “Efficient Texture Comparison,” which claims the benefit under 35 U.S.C. §119(e) to U.S. Provisional Patent Application No. 61/649,210 filed May 18, 2012, entitled “Efficient Texture Comparison,” both of which are incorporated by reference in their entirety as if fully disclosed herein.
  • TECHNICAL FIELD
  • Embodiments described herein relate generally to a device and process for efficient texture pattern comparison and matching, and more specifically to fingerprint matching on a portable device.
  • BACKGROUND DESCRIPTION
  • Fingerprint sensing technology has become widespread and is often used to provide secure access to sensitive electronic devices and/or data. Generally, capacitive fingerprint sensors may be used to determine an image of a fingerprint by measuring the capacitance at each pixel of a capacitive sensor. The higher the capacitance, the nearer the surface of an adjacent or overlying finger is to the pixel; thus, fingerprint ridges produce a higher capacitance in the underlying pixel than do fingerprint valleys. There are other types of fingerprint sensors as well, such as optical sensors.
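  • As a rough, non-authoritative illustration of the ridge/valley relationship described above (not code from the patent), the following Python sketch thresholds a grid of per-pixel capacitance readings into a binary ridge/valley image; the array shape, the median threshold, and the function name are assumptions made for the example.
```python
import numpy as np

def capacitance_to_ridge_map(cap_image: np.ndarray) -> np.ndarray:
    """Label each pixel as ridge (1) or valley (0).

    Illustrative assumption: ridges yield higher capacitance than valleys,
    so a simple per-image threshold (here, the median) separates the two
    classes. A real sensor would need calibration and noise filtering.
    """
    threshold = np.median(cap_image)
    return (cap_image > threshold).astype(np.uint8)

# Toy example: an 8x8 "capacitance frame" with a vertical ridge stripe.
frame = np.zeros((8, 8))
frame[:, 3:5] = 1.0
print(capacitance_to_ridge_map(frame))
```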
  • Typically, fingerprint sensors have been tied to relatively powerful computers, such as PCs or laptops, or incorporated in specialty devices specifically designed for fast processing and sufficient battery life of the scanner.
  • Portable user devices, such as smart phones and tablets, are increasingly common and include a growing number of features and functions. Such devices continue to become more powerful and more power-efficient, but they still have comparatively limited computational resources and an ongoing concern over battery consumption.
  • Accordingly, there is a need for an improved functionality in highly mobile devices, and a need for a computationally efficient implementation of the improved functionality.
  • SUMMARY
  • The present disclosure provides systems, methods, and apparatuses for efficient texture comparison. A scannable object may be sensed and scanned. A map may be constructed based on the scan results. The map may be compared to one or more stored templates. Results of the comparison may be provided.
  • In some implementations, a secured processor may construct the map and may provide reduced resolution (and/or other versions that contain less information) versions of the map and/or the stored templates to one or more other processors. The one or more other processors may determine a match-set based on matching between the reduced resolution map and stored templates. The secured processor may then identify whether or not a match exists between the map and any stored template based on the match-set.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 depicts a block diagram of a sample capacitive sensing array.
  • FIG. 2 depicts a sample electronic device incorporating the embodiment of a capacitive sensing array.
  • FIG. 3 is a cross-sectional view taken along line 4-4 of FIG. 2, showing the embodiment of a capacitive sensing array incorporated into a stack-up with an input device.
  • FIG. 4 is an exemplary process for efficiently matching a scanned pattern according to one exemplary embodiment.
  • FIG. 5 is an exemplary system for efficiently matching a scanned pattern according to one exemplary embodiment.
  • FIG. 6 is an exemplary process for efficiently and securely matching a scanned pattern according to one exemplary embodiment.
  • DETAILED DESCRIPTION
  • Generally, embodiments discussed herein may provide efficient and secure texture sensing on a device, such as a smart phone. For example, a smart phone touch screen can be configured with a fingerprint sensor (e.g., a capacitive sensor) over part or all of the touch screen interface, the device housing, and/or other device inputs.
  • The present disclosure provides systems, methods, and apparatuses for efficient texture comparison. A scannable object may be sensed and scanned. A map may be constructed based on the scan results. The map may be compared to one or more stored templates. Results of the comparison may be provided. It should be appreciated that embodiments described herein may be used with any suitable fingerprint sensor, including swipe or strip sensors, two-dimensional array sensors, and the like.
  • In some implementations, a secured processor may construct the map and may provide reduced resolution (and/or other versions that contain less information) versions of the map and/or the stored templates to one or more other processors. The one or more other processors may determine a match-set based on matching between the reduced resolution map and stored templates. The secured processor may then identify whether or not a match exists between the map and any stored template based on the match-set.
  • FIG. 2 depicts an electronic device 200 that may incorporate a fingerprint sensor, e.g., a capacitive sensor. The electronic device may be a mobile telephone, a tablet computing device, a notebook computer, a personal digital assistant, a desktop computer, a portable media player, and the like. The sensor pad may be placed anywhere on device 200, such as below an input mechanism, e.g., button 210, an input and/or output mechanism, e.g., screen 220, and/or a casing/housing, e.g., device housing 230 of the electronic device. The sensor may occupy part of an area (e.g., part of button 210), a whole area (e.g., all of screen 220), or an area that spans part/all of more than one of the areas. For example, a sensor may cover screen 220 and extend past the edge, covering all or part of forehead area 232 and/or chin area 234. Essentially, any portion of the electronic device's enclosure may house the fingerprint sensor.
  • In certain exemplary embodiments, the device can include a separate attachment, such as external scan accessory 240. Accessory 240 is shown connected to device I/O port 245, which could be via a flexible wire connection or a rigid connection (e.g., simulating an extension of the device housing via a fastening mechanism, such as a snap-together interface). In other exemplary embodiments, this connection can be wireless, via a proprietary protocol or a common protocol (e.g., Bluetooth, WiFi, GSM/CDMA/4G, etc.).
  • In certain exemplary embodiments, as mentioned above, the sensor may be included within the device housing, display, or other area, such as input button 210. FIG. 3 illustrates one exemplary embodiment of a fingerprint scanner/sensor disposed beneath button 210. FIG. 3 is a cross-sectional view of the electronic device of FIG. 2, taken along line 4-4 of FIG. 2, which may include the layers: cover dielectric 301, ink 302, liquid adhesive 303, silicon TSV (3um pass) 304, solder 305, flex 306, air gap 307, stiffener 308, adhesive 309, flex 310, tact 311, shim 312, adhesive 313, and bracket 314. As shown in FIG. 3, the fingerprint sensor chip (including both sensor pad and drive ring) may be positioned beneath the button (e.g., 210), which may be the cover dielectric 301. As illustrated, the top layer cover dielectric 301 is concave, as exemplary button 210 may be concave. A similar illustration, with differing dimensions and features, could show a flat screen in this layer extending to a housing, etc. In the exemplary embodiments of a button sensor, an ink layer and/or adhesive may be placed between the button's bottom surface and the sensor chip's top surface. The adhesive may bond the chip to the button, for example. One or more solder balls may affix the fingerprint sensor chip to a flex conductor. The solder balls may generally be placed near the center of the fingerprint sensor chip to reduce the likelihood of cracking due to stress.
  • The exemplary scanner shown in FIG. 3, accessory 240, and/or any other configuration incorporating a texture sensor/scanner with a user device may include a capacitive sensor (e.g., the same as, similar to, or different than the capacitive sensor shown in FIG. 1); alternatively, any number of other types of sensors capable of sensing a texture/pattern of an adjacent or proximate object (e.g., an optical sensor) can be used in one or more exemplary embodiments.
  • Regardless of the location or configuration of the sensor, the exemplary device, including the exemplary sensor, can execute an exemplary process for matching a scanned texture with stored templates. FIG. 4 illustrates one such exemplary process. The exemplary process may start at 410 by sensing or detecting a scannable object. This may be a low power state, where power consumption is reduced while waiting for an object to be sensed. A scannable object can be one close to the device scanner or in contact with the device scanner. In some exemplary embodiments, an object may be “scannable” if it has a texture that can be detected, while in other exemplary embodiments an object may be scannable based on proximity, with the texture (or lack thereof) detected later in the exemplary process.
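  • A minimal sketch of how the sense-then-scan portion of this flow (steps 410 and 420) could look in code, assuming a hypothetical sensor object with object_present() and scan() methods; neither method nor the polling scheme comes from the patent.
```python
import time

def wait_and_scan(sensor, poll_interval_s: float = 0.05):
    """Idle in a low-power loop until an object is detected (410),
    then perform the full scan (420) and return the raw sensor results
    for map construction (430). The sensor interface is an assumption."""
    while not sensor.object_present():      # cheap, low-power detection
        time.sleep(poll_interval_s)
    return sensor.scan()                    # full-resolution scan
```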
  • Once a scannable and/or proximate object has been detected, the exemplary process (e.g., using the exemplary device and sensor) can scan the object at 420. The sensor results, which may vary depending on the type of sensor used (e.g., capacitive, optical, etc.), can then be used to construct a map associated with (e.g., descriptive of) the scanned features of the object's texture at 430.
  • One such exemplary map can include a ridge flow map or direction map, which represents the direction of ridge flow within the scanned fingerprint image. As just one example of how a ridge flow map can be computed and stored: the exemplary map may contain a grid of integer directions, where each cell in the grid represents, e.g., an 8×8 pixel neighborhood in the image. Ridge flow angles can be quantized into, e.g., 16 integer bi-directional units equally spaced on a semicircle. In this example, starting with vertical direction 0, direction units can increase clockwise and represent incremental jumps of 11.25 degrees, stopping at direction 15 which is 11.25 degrees shy of vertical. Using this scheme, direction 8 is horizontal. A value of −1 in this map represents a neighborhood where no valid ridge flow was determined. Other exemplary methods of producing a ridge flow map are also possible, including different sizes, value ranges, matrix configurations, etc. Further, other map types are also possible, such as a quality map, contrast map, etc.
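  • A minimal sketch of the quantization scheme just described, assuming the per-cell ridge angles have already been estimated (the 8×8 neighborhood averaging and the angle-estimation step itself are outside the sketch):
```python
from typing import Optional

import numpy as np

def quantize_ridge_angle(angle_deg: Optional[float]) -> int:
    """Quantize a ridge-flow angle into the 16-level scheme described above.

    Angles are measured clockwise from vertical and are bi-directional,
    so theta and theta + 180 map to the same unit (each unit spans 11.25
    degrees). None (no valid ridge flow) maps to -1.
    """
    if angle_deg is None:
        return -1
    return int(round((angle_deg % 180.0) / 11.25)) % 16

def ridge_flow_map(cell_angles: np.ndarray) -> np.ndarray:
    """Build a direction map from per-cell angles; NaN means no valid
    ridge flow. Each cell is assumed to summarize an 8x8 pixel neighborhood."""
    out = np.full(cell_angles.shape, -1, dtype=np.int8)
    valid = ~np.isnan(cell_angles)
    out[valid] = np.round((cell_angles[valid] % 180.0) / 11.25).astype(int) % 16
    return out

print(quantize_ridge_angle(0))       # 0  -> vertical
print(quantize_ridge_angle(90))      # 8  -> horizontal
print(quantize_ridge_angle(168.75))  # 15 -> 11.25 degrees shy of vertical
```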
  • FIG. 5 shows an exemplary system that can be used to execute one or more exemplary processes. The exemplary system can include a sensor 540, which can be sensor 100, sensor 240, the sensor of FIG. 3, or any number of other exemplary sensors. This sensor can include a separate encryption/security feature/module (not shown) or send data to processor block 500 without a separate security module. Processor block 500 can include an application processor (AP) 510 and a secure enclave processor (SEP) 520. Each of these can itself comprise multiple processors or multiple cores, or both can reside on the same physical processor. The application processor 510 can be a general processor, responsible for many of the processing tasks of the device it resides within. The secure enclave processor 520 can be specially and/or specifically designed/configured to perform cryptographic tasks, such as encrypting data associated with an authorized user's fingerprint/ID-pattern.
  • Processor block 500 can be connected to sensor 540 by any number of wired or wireless connections, using any number of transmission protocols, such as a serial peripheral interface (SPI). Processor block 500 can also be connected to a data storage repository 550, which can include any number of mediums (e.g., magnetic material, solid state memory, etc.). Data repository 550 can include a secure data repository 555, which can include encrypted data, e.g., data associated with an authorized user's fingerprint/ID-pattern. Secure repository 555 can be separate from the main repository 550 or a part of the main repository 550. In the example of fingerprint patterns (e.g., maps based on a scanned fingerprint pattern), the repository can store files for multiple authorized users, files for multiple fingers (e.g., 10) of each user, multiple files for each finger, etc.
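  • One way the stored data just described could be organized, written as a hedged Python sketch; the class and field names are assumptions for illustration, not structures defined in the patent.
```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TemplateRecord:
    """One enrolled template held in secure repository 555 (illustrative)."""
    user_id: str
    finger_id: int                 # e.g., 0-9 for up to ten enrolled fingers
    encrypted_ridge_map: bytes     # ciphertext; only the SEP can decrypt it
    reduced_pattern: bytes         # optional precomputed low-resolution pattern

@dataclass
class SecureRepository:
    """Multiple users, multiple fingers per user, multiple files per finger."""
    records: List[TemplateRecord] = field(default_factory=list)

    def templates_for(self, user_id: str) -> List[TemplateRecord]:
        return [r for r in self.records if r.user_id == user_id]
```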
  • In a first exemplary operation, sensor 540 can scan a texture of an object. This texture can be translated into an associated map by sensor 540, AP 510, or SEP 520. The SEP 520 can then retrieve encrypted templates (e.g., based on patterns associated with authorized users), and match the translated map with the encrypted templates. The SEP (e.g., via the AP, operating system (OS), and input/output devices (I/O)) can then provide a result, such as maintaining the screen lock (no match), or unlocking the device. Personal settings associated with the particular authorized user can also be pre-loaded at unlock.
  • The SEP 520 may have fewer computational resources than the more general-purpose AP 510, and thus be somewhat slower. In order to provide efficient and faster matching, certain exemplary embodiments may push some or all of the matching operations to the AP 510. The AP 510 can identify a match and provide a result, or identify the match so that a result can be provided. In one exemplary embodiment, the SEP 520 may decrypt the match templates and pass them to the AP 510 for match processing. While the SEP 520 may still be needed for encryption/decryption (as the AP 510 may be unsecured), the process can be greatly sped up, since the SEP 520 only has to perform the tasks it was designed for (encryption/decryption), while the more powerful AP 510 performs the more computationally intensive matching procedures.
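  • A hedged sketch of the division of labor just described: the SEP decrypts the stored templates and hands the plaintext maps to the faster AP, which runs the computationally intensive matching. The sep/ap objects, their method names, and the repository layout are illustrative assumptions, not APIs from the patent.
```python
def match_with_ap(sep, ap, scanned_map, repository):
    """SEP handles decryption; AP handles the expensive full-map matching."""
    decrypted = [sep.decrypt(rec.encrypted_ridge_map) for rec in repository.records]
    match_index = ap.full_match(scanned_map, decrypted)   # index of match, or None
    return repository.records[match_index] if match_index is not None else None
```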
  • A potential drawback of the above-described exemplary embodiment is that the AP 510 is unsecured or partially unsecured, and certain exemplary template maps may contain sufficient information that a malicious unauthorized user (e.g., someone who steals the device) could reverse engineer a template to construct a pattern that could unlock the device (e.g., sufficiently mimic an authorized user's fingerprint pattern). For example, an unauthorized user could intercept a decrypted template from the unsecured AP 510 and use the template data to construct an artificial object with associated properties (e.g., properties that, when scanned, would produce data matching the intercepted template).
  • To overcome this potential security drawback, another exemplary embodiment of the present disclosure can include a process of collapsing the full maps into a sort of checksum, hash function, or histogram. For example, each encrypted ridge map template can have some lower resolution pattern computed and associated with the ridge map. One exemplary pattern could be a histogram of, e.g., the most common angles (e.g., a 2 dimensional (2D) array of common angles). The exemplary pattern could include in each slot an average value over a respective vector of the map. The exemplary pattern could include in each slot a sum of the values over a respective vector of the map. The exemplary pattern could include the smallest or largest value within a respective vector of the map, or could be a difference between a largest and a smallest value within the respective vector of the map. The exemplary pattern could simply be a particular vector, e.g., the pattern is merely the Nth vector of the map. Exemplary patterns can include more than one vector. For example, for an N by N map, the exemplary pattern could be the four edge vectors (e.g., the 1st and Nth column, and the 1st and Nth row), or any other sampling, positions, or calculated reduction. Numerous other exemplary embodiments are also possible, and any other exemplary pattern calculation can be used, where the exemplary pattern includes enough associated information to narrow the candidate list, while omitting enough associated information that the unsecured pattern cannot or cannot easily be reverse engineered into a matching texture.
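  • The sketch below shows a few of the reductions listed above (a direction histogram, the four edge vectors, and per-row averages), assuming a quantized ridge map like the one described earlier; the function name and mode strings are assumptions for illustration.
```python
import numpy as np

def reduced_pattern(ridge_map: np.ndarray, mode: str = "histogram") -> np.ndarray:
    """Collapse a full ridge map into a lower-resolution pattern.

    Any reduction works so long as it narrows the candidate list without
    letting the full map be reconstructed from the pattern alone.
    """
    valid = ridge_map[ridge_map >= 0].astype(int)   # drop -1 "no ridge flow" cells
    if mode == "histogram":
        # Histogram of the 16 quantized directions over the whole map.
        return np.bincount(valid, minlength=16)
    if mode == "edges":
        # The four edge vectors: first/last row and first/last column.
        return np.concatenate([ridge_map[0, :], ridge_map[-1, :],
                               ridge_map[:, 0], ridge_map[:, -1]])
    if mode == "row_means":
        # One slot per row vector: the average direction in that row.
        masked = np.where(ridge_map >= 0, ridge_map.astype(float), np.nan)
        return np.nanmean(masked, axis=1)
    raise ValueError(f"unknown mode: {mode}")
```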
  • In an exemplary process for this exemplary embodiment, a scanned object can have a ridge map calculated from the scanner input, e.g., in the SEP 520. This encrypted ridge map can then have an unencrypted pattern calculated (according to the implemented protocol) and sent to the AP 510. This pattern can be compared to patterns associated with the stored encrypted templates, which can be calculated in real time or, preferably, be stored to reduce computation. Several of the templates may be different but have the same or similar associated patterns, since two different templates may have the same or similar values in the areas used to determine the lower resolution patterns. Thus, the AP 510 may return multiple positive results (or a single match, or no matches) for the scanned pattern being compared. The SEP 520 can then access the encrypted ridge maps associated with any patterns identified by the AP 510 as matching. The SEP 520 can then compare the ridge map of the scanned pattern against this small subset of possible matches, instead of the entire library of possible matches. This exemplary embodiment can therefore greatly speed up the computation of map matching by leveraging the powerful AP, while maintaining encrypted security of the stored ridge maps.
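  • A hedged end-to-end sketch of this two-stage flow, reusing reduced_pattern() and the repository sketch from earlier; sep and ap stand in for the secure enclave and application processors, and their method names are assumptions.
```python
def two_stage_match(sep, ap, scanned_ridge_map, repository):
    """Coarse match on the AP narrows candidates; full match stays on the SEP."""
    # Stage 1 (SEP): compute an unencrypted low-resolution pattern of the scan.
    probe = reduced_pattern(scanned_ridge_map)

    # Stage 2 (AP): coarse comparison against the stored patterns only; the AP
    # never sees a full ridge map, just the reduced patterns.
    candidates = ap.match_patterns(
        probe, [(i, rec.reduced_pattern) for i, rec in enumerate(repository.records)])

    # Stage 3 (SEP): decrypt and fully match only the narrowed subset.
    for i in candidates:                      # zero, one, or several indices
        rec = repository.records[i]
        template_map = sep.decrypt(rec.encrypted_ridge_map)
        if sep.full_match(scanned_ridge_map, template_map):
            return rec.user_id                # e.g., unlock and load user settings
    return None                               # no match: keep the device locked
```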
  • As mentioned earlier, any number of other exemplary embodiments are also possible, and the above example is presented with certain specific implementations (e.g., using ridge maps for patterns) for illustration purposes, but could be applied to any number of other exemplary embodiments having other exemplary implementations.
  • FIG. 6 illustrates an exemplary embodiment of this exemplary process. At 610, the exemplary process can sense or detect an object to scan. At 620, the exemplary process scans the object. Secure process 604 then constructs an input map based on the scan results at 630. The secure process 604 can then construct a lower resolution pattern at 635. Secure process 604 can then load, determine, or otherwise provide stored template patterns (associated with stored template maps) and the input pattern to a process 602, which can be unsecured, partially secured, secured with a different protocol, or secured in the same manner as process 604. Process 602 can then run a match comparison of the input pattern and the received template patterns at 650. At 655, process 602 can provide the identity of possible match results to secured process 604. This can be a pointer, an identification, or the actual matching pattern. The secure process 604 can then run (e.g., at 660) a full match comparison of the input map and the stored templates associated with those possible matches identified at 655. Finally, the exemplary process can provide the results at 670.
  • Although embodiments have been described herein with respect to particular configurations and sequences of operations, it should be understood that alternative embodiments may add, omit, or change elements, operations and the like. Accordingly, the embodiments disclosed herein are meant to be examples and not limitations.

Claims (21)

1-20. (canceled)
21. A system, comprising:
a fingerprint sensor configured to capture fingerprint data;
a secure processor operably connected to the fingerprint sensor, the secure processor configured to:
construct a fingerprint map based on the fingerprint data;
generate a lower resolution fingerprint pattern that represents the fingerprint map; and
generate a lower resolution template pattern that represents a fingerprint template; and
a second processor operably connected to the secure processor, the second processor configured to:
compare the lower resolution fingerprint pattern with the lower resolution template pattern to determine if the lower resolution fingerprint pattern matches the lower resolution template pattern; and
provide a result of the comparison to the secure processor.
22. The system of claim 21, wherein the secure processor is configured to determine if the fingerprint map matches the fingerprint template when the result of the comparison indicates the lower resolution fingerprint pattern matches the lower resolution template pattern.
23. The system of claim 21, further comprising a memory operably connected to the secure processor, the memory storing the fingerprint template.
24. The system of claim 23, wherein the secure processor is configured to encrypt the fingerprint template and store the encrypted fingerprint template in the memory.
25. The system of claim 24, wherein the secure processor is configured to decrypt the fingerprint template prior to generating the lower resolution template pattern that represents the fingerprint template.
26. The system of claim 21, wherein the second processor comprises an unsecured processor.
27. The system of claim 21, wherein the fingerprint sensor, the memory, the secure processor, and the second processor are included in an electronic device.
28. The system of claim 21, wherein the memory, the secure processor, and the second processor are included in an electronic device and the fingerprint sensor is operably connected to the electronic device.
29. The system of claim 21, wherein:
the reduced resolution fingerprint pattern comprises at least one of a checksum generated from the fingerprint map, a hash generated from the fingerprint map, or a histogram generated from the fingerprint map; and
the reduced resolution template pattern comprises at least one of a checksum generated from the fingerprint template, a hash generated from the fingerprint template, or a histogram generated from the fingerprint template.
30. A method, comprising:
producing, by a secure processor, a fingerprint pattern based on a fingerprint map, the fingerprint pattern containing less information than the fingerprint map;
producing, by the secure processor, one or more template patterns based on associated fingerprint templates, each template pattern containing less information than an associated fingerprint template;
transmitting the fingerprint pattern and the one or more template patterns from the secure processor to a second processor;
identifying, by the second processor, at least one template pattern that matches the fingerprint pattern;
identifying, to the secure processor, the at least one template pattern that matches the fingerprint pattern; and
determining, by the secure processor, if the fingerprint map associated with the fingerprint pattern matches at least one fingerprint template associated with each identified template pattern.
31. The method of claim 30, further comprising:
encrypting, by the secure processor, the one or more fingerprint templates; and
storing the encrypted one or more fingerprint templates in a memory.
32. The method of claim 31, further comprising:
prior to producing the one or more template patterns, reading, by the secure processor, the one or more encrypted fingerprint templates from the memory; and
decrypting, by the secure processor, the one or more fingerprint templates.
33. The method of claim 31, wherein the memory comprises a secure memory.
34. The method of claim 30, wherein the fingerprint pattern comprises at least one of a checksum generated from the fingerprint map, a hash generated from the fingerprint map, or a histogram generated from the fingerprint map.
35. The method of claim 30, wherein each template pattern comprises at least one of a checksum generated from the fingerprint template, a hash generated from the fingerprint template, or a histogram generated from the fingerprint template.
36. A system, comprising:
a fingerprint sensor configured to capture fingerprint data;
a secure processor operably connected to the fingerprint sensor, the secure processor configured to:
construct a fingerprint map based on the fingerprint data;
produce a fingerprint pattern based on the fingerprint map, the fingerprint pattern containing less information than the fingerprint map; and
produce one or more template patterns based on associated fingerprint templates, each template pattern containing less information than an associated fingerprint template;
a second processor operably connected to the secure processor, the second processor configured to:
compare the fingerprint pattern with each template pattern;
identify at least one template pattern that matches the fingerprint pattern; and
provide a result of the comparison to the secure processor,
wherein, based on the result of the comparison, the secure processor is configured to determine if the fingerprint map associated with the fingerprint pattern matches at least one fingerprint template associated with each identified template pattern.
37. The system of claim 36, wherein the secure processor is configured to:
encrypt each fingerprint template and store the encrypted fingerprint template in a memory operably connected to the secure processor; and
decrypt the associated fingerprint templates prior to generating the one or more template patterns.
38. The system of claim 36, wherein the fingerprint sensor, the memory, the secure processor, and the second processor are included in an electronic device.
39. The system of claim 36, wherein the memory, the secure processor, and the second processor are included in an electronic device and the fingerprint sensor is operably connected to the electronic device.
40. The system of claim 36, wherein:
the fingerprint pattern comprises at least one of a checksum generated from the fingerprint map, a hash generated from the fingerprint map, or a histogram generated from the fingerprint map; and
each template pattern comprises at least one of a checksum generated from the fingerprint template, a hash generated from the fingerprint template, or a histogram generated from the fingerprint template.
US14/855,073 2012-05-18 2015-09-15 Efficient Texture Comparison Abandoned US20160012273A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/855,073 US20160012273A1 (en) 2012-05-18 2015-09-15 Efficient Texture Comparison
US15/669,789 US9846799B2 (en) 2012-05-18 2017-08-04 Efficient texture comparison

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261649210P 2012-05-18 2012-05-18
US13/797,970 US9135496B2 (en) 2012-05-18 2013-03-12 Efficient texture comparison
US14/855,073 US20160012273A1 (en) 2012-05-18 2015-09-15 Efficient Texture Comparison

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/797,970 Continuation US9135496B2 (en) 2012-05-18 2013-03-12 Efficient texture comparison

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/669,789 Continuation-In-Part US9846799B2 (en) 2012-05-18 2017-08-04 Efficient texture comparison

Publications (1)

Publication Number Publication Date
US20160012273A1 true US20160012273A1 (en) 2016-01-14

Family

ID=49581341

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/797,970 Expired - Fee Related US9135496B2 (en) 2012-05-18 2013-03-12 Efficient texture comparison
US14/855,073 Abandoned US20160012273A1 (en) 2012-05-18 2015-09-15 Efficient Texture Comparison

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/797,970 Expired - Fee Related US9135496B2 (en) 2012-05-18 2013-03-12 Efficient texture comparison

Country Status (1)

Country Link
US (2) US9135496B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9715616B2 (en) 2012-06-29 2017-07-25 Apple Inc. Fingerprint sensing and enrollment
US9846799B2 (en) 2012-05-18 2017-12-19 Apple Inc. Efficient texture comparison
US10068120B2 (en) 2013-03-15 2018-09-04 Apple Inc. High dynamic range fingerprint sensing
WO2019184973A1 (en) * 2018-03-30 2019-10-03 维沃移动通信有限公司 Fingerprint scanning method and mobile terminal
US10810449B2 (en) 2016-01-04 2020-10-20 Samsung Electronics Co., Ltd. Electronic device and method of operating same
US12112567B2 (en) * 2022-05-10 2024-10-08 Egis Technology Inc. Biometric detection sensor and signal processing method thereof and electronic product

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9286457B2 (en) 2004-06-14 2016-03-15 Rodney Beatson Method and system for providing password-free, hardware-rooted, ASIC-based authentication of a human to a mobile device using biometrics with a protected, local template to release trusted credentials to relying parties
US9135496B2 (en) 2012-05-18 2015-09-15 Apple Inc. Efficient texture comparison
US9202099B2 (en) 2012-06-29 2015-12-01 Apple Inc. Fingerprint sensing and enrollment
JP5484529B2 (en) * 2012-08-07 2014-05-07 ホシデン株式会社 Component module and component module manufacturing method
KR102204765B1 (en) 2012-10-14 2021-01-19 시냅틱스 인코포레이티드 Fingerprint sensor and button combinations and methods of making same
US9651513B2 (en) * 2012-10-14 2017-05-16 Synaptics Incorporated Fingerprint sensor and button combinations and methods of making same
US9111125B2 (en) 2013-02-08 2015-08-18 Apple Inc. Fingerprint imaging and quality characterization
GB2513900B (en) * 2013-05-10 2015-06-03 G4S Monitoring Technologies Ltd Person identification system
US11068875B2 (en) 2013-12-30 2021-07-20 Apple, Inc. Person-to-person payments using electronic devices
CN103761509B (en) * 2014-01-03 2017-04-12 甘肃农业大学 Alignment-free fingerprint matching method based on encrypted circuit and computing circuit
CN103780736A (en) * 2014-01-17 2014-05-07 惠州Tcl移动通信有限公司 Patchwork fingerprint processing method based on mobile terminal and mobile terminal
KR102171082B1 (en) * 2014-02-06 2020-10-28 삼성전자주식회사 Method for processing fingerprint and an electronic device thereof
US10546293B2 (en) * 2014-05-29 2020-01-28 Apple Inc. Apparatuses and methods for using a random authorization number to provide enhanced security for a secure element
US10649497B2 (en) * 2014-07-23 2020-05-12 Apple Inc. Adaptive processes for improving integrity of surfaces
US10282614B2 (en) 2016-02-18 2019-05-07 Microsoft Technology Licensing, Llc Real-time detection of object scanability
CN105959287A (en) * 2016-05-20 2016-09-21 中国银联股份有限公司 Biological feature based safety certification method and device
US20180101669A1 (en) * 2016-10-11 2018-04-12 Qualcomm Incorporated Device to perform secure biometric authentication
TWI631479B (en) * 2017-01-24 2018-08-01 創智能科技股份有限公司 Fingerprint verification method and electronic device
CN110366726A (en) * 2017-03-09 2019-10-22 指纹卡有限公司 The method of user for registering user and for authenticating electronic equipment
WO2018164630A1 (en) * 2017-03-09 2018-09-13 Fingerprint Cards Ab Methods for enrolling a user and for authentication of a user of an electronic device
EP3631665A4 (en) * 2017-05-23 2021-03-03 Fingerprint Cards AB Method and electronic device for authenticating a user
EP3655874B1 (en) 2017-09-20 2024-02-07 Fingerprint Cards Anacatum IP AB Method and electronic device for authenticating a user
US11379813B2 (en) 2018-01-02 2022-07-05 Newstore Inc. System and method for point of sale transactions using wireless device with security circuit
US11171951B2 (en) * 2018-06-07 2021-11-09 Paypal, Inc. Device interface output based on biometric input orientation and captured proximate data
WO2019245437A1 (en) 2018-06-19 2019-12-26 Fingerprint Cards Ab Method and electronic device for authenticating a user
WO2019120324A2 (en) 2019-03-29 2019-06-27 Alibaba Group Holding Limited Cryptography chip with identity verification
CN114553439B (en) 2019-03-29 2023-06-30 创新先进技术有限公司 Encryption key management based on identity information
KR102234825B1 (en) 2019-03-29 2021-04-02 어드밴스드 뉴 테크놀로지스 씨오., 엘티디. Secure execution of cryptographic operations
WO2019120322A2 (en) * 2019-03-29 2019-06-27 Alibaba Group Holding Limited Managing cryptographic keys based on identity information
CN116615914A (en) 2020-12-22 2023-08-18 指纹卡安娜卡敦知识产权有限公司 Fingerprint sensor with column readout

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040034597A1 (en) * 2000-07-07 2004-02-19 Alain Durand System and method for managing micropayment transactions, corresponding client terminal and trader equipment
US20040215615A1 (en) * 2001-06-29 2004-10-28 Alf Larsson Method and device for positioning a finger when verifying a person's identity
US20050286746A1 (en) * 2004-06-25 2005-12-29 Silvester Kelan C Biometric identification data protection
US20110010558A1 (en) * 2007-12-24 2011-01-13 Simone Baldan Biometrics based identification

Family Cites Families (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3863219A (en) 1973-10-09 1975-01-28 Ibm Data preprocessing system for character recognition systems
US5828773A (en) 1996-01-26 1998-10-27 Harris Corporation Fingerprint sensing method with finger position indication
KR100595926B1 (en) 1998-01-26 2006-07-05 웨인 웨스터만 Method and apparatus for integrating manual input
US6690830B1 (en) 1998-04-29 2004-02-10 I.Q. Bio Metrix, Inc. Method and apparatus for encoding/decoding image data
US6788340B1 (en) 1999-03-15 2004-09-07 Texas Instruments Incorporated Digital imaging control with selective intensity resolution enhancement
US6795569B1 (en) 1999-05-11 2004-09-21 Authentec, Inc. Fingerprint image compositing method and associated apparatus
US7225172B2 (en) 1999-07-01 2007-05-29 Yeda Research And Development Co. Ltd. Method and apparatus for multivariable analysis of biological measurements
AU3071001A (en) 1999-12-23 2001-07-09 National University Of Singapore, The Wavelet-enhanced automated fingerprint identification system
US6546152B1 (en) 2000-05-04 2003-04-08 Syscan Technology (Shenzhen) Co. Limited Method and apparatus for providing images in portable 2-D scanners
SE515239C2 (en) 2000-05-15 2001-07-02 Ericsson Telefon Ab L M Method for generating a composite image and apparatus for detecting fingerprints
JP3780830B2 (en) 2000-07-28 2006-05-31 NEC Corporation Fingerprint identification method and apparatus
WO2002039714A2 (en) 2000-11-08 2002-05-16 Digimarc Corporation Content authentication and recovery using digital watermarks
US7020305B2 (en) 2000-12-06 2006-03-28 Microsoft Corporation System and method providing improved head motion estimations for animation
US6677932B1 (en) 2001-01-28 2004-01-13 Finger Works, Inc. System and method for recognizing touch typing under limited tactile feedback conditions
US6570557B1 (en) 2001-02-10 2003-05-27 Finger Works, Inc. Multi-touch system and method for emulating modifier keys via fingertip chords
US7331523B2 (en) 2001-07-13 2008-02-19 Hand Held Products, Inc. Adaptive optical image reader
US7616787B2 (en) 2003-10-01 2009-11-10 Authentec, Inc. Methods for finger biometric processing and associated finger biometric sensors
US7746375B2 (en) 2003-10-28 2010-06-29 Koninklijke Philips Electronics N.V. Digital camera with panorama or mosaic functionality
US8077935B2 (en) 2004-04-23 2011-12-13 Validity Sensors, Inc. Methods and apparatus for acquiring a swiped fingerprint image
US8358815B2 (en) 2004-04-16 2013-01-22 Validity Sensors, Inc. Method and apparatus for two-dimensional finger motion tracking and control
US8131026B2 (en) 2004-04-16 2012-03-06 Validity Sensors, Inc. Method and apparatus for fingerprint image reconstruction
US7574022B2 (en) 2004-05-20 2009-08-11 Atrua Technologies Secure system and method of creating and processing partial finger images
AU2005302945B2 (en) 2004-11-15 2012-07-19 Nec Corporation Living body feature input device
JP2006238410A (en) 2005-01-31 2006-09-07 Fuji Photo Film Co Ltd Imaging device
KR100747446B1 (en) 2005-03-07 2007-08-09 LG Electronics Inc. Fingerprint recognition device and method for a mobile terminal
US8145656B2 (en) * 2006-02-07 2012-03-27 Mobixell Networks Ltd. Matching of modified visual and audio media
WO2008091361A2 (en) 2006-06-19 2008-07-31 Authentec, Inc. Finger sensing device with spoof reduction features and associated methods
US7804984B2 (en) 2006-07-31 2010-09-28 Lumidigm, Inc. Spatial-spectral fingerprint spoof detection
US8154628B2 (en) 2006-09-14 2012-04-10 Mitsubishi Electric Corporation Image processing apparatus and imaging apparatus and method
US8098906B2 (en) 2006-10-10 2012-01-17 West Virginia University Research Corp., Wvu Office Of Technology Transfer & Wvu Business Incubator Regional fingerprint liveness detection systems and methods
US8408456B2 (en) 2006-12-04 2013-04-02 Verizon Services Organization Inc. Systems and methods for controlling access to media content by detecting one or more user fingerprints
US7876310B2 (en) 2007-01-03 2011-01-25 Apple Inc. Far-field input identification
JP4930109B2 (en) 2007-03-06 2012-05-16 Sony Corporation Solid-state imaging device and imaging apparatus
WO2009011661A1 (en) * 2007-07-18 2009-01-22 Agency For Science, Technology And Research Method and device for determining a similarity value between minutiae templates
KR101082626B1 (en) 2007-11-09 2011-11-10 Fujitsu Limited Biological information reading device, biological information reading method, and computer-readable recording medium storing a biological information reading program
US20110274356A1 (en) 2008-11-12 2011-11-10 University Of Utah Research Foundation Image pattern recognition
US8605960B2 (en) 2009-03-02 2013-12-10 Avago Technologies General Ip (Singapore) Pte. Ltd. Fingerprint sensing device
US8170346B2 (en) 2009-03-14 2012-05-01 Ludwig Lester F High-performance closed-form single-scan calculation of oblong-shape rotation angles from binary images of arbitrary size using running sums
WO2011053255A1 (en) 2009-10-30 2011-05-05 Agency For Science, Technology And Research Methods, devices, and computer readable mediums for processing a digital picture
JP5644773B2 (en) * 2009-11-25 2014-12-24 NEC Corporation Apparatus and method for collating face images
WO2011143605A1 (en) 2010-05-13 2011-11-17 Ultra-Scan Corporation Ultrasonic area-array sensor with area-image merging
US8705813B2 (en) 2010-06-21 2014-04-22 Canon Kabushiki Kaisha Identification device, identification method, and storage medium
JP5406990B2 (en) 2010-07-13 2014-02-05 Shoei Co., Ltd. Input device using touch panel and input method thereof
BR112013001537B8 (en) 2010-07-19 2021-08-24 Risst Ltd Fingerprint sensors and systems incorporating fingerprint sensors
KR101760258B1 (en) 2010-12-21 2017-07-21 Samsung Electronics Co., Ltd. Face recognition apparatus and method thereof
KR101962445B1 (en) 2011-08-30 2019-03-26 Samsung Electronics Co., Ltd. Mobile terminal having touch screen and method for providing user interface
US20130083074A1 (en) 2011-10-03 2013-04-04 Nokia Corporation Methods, apparatuses and computer program products utilizing hovering, in part, to determine user interface orientation
US8515139B1 (en) 2012-03-15 2013-08-20 Google Inc. Facial feature detection
US9600709B2 (en) 2012-03-28 2017-03-21 Synaptics Incorporated Methods and systems for enrolling biometric data
US8903141B2 (en) 2012-05-03 2014-12-02 Authentec, Inc. Electronic device including finger sensor having orientation based authentication and related methods
US9135496B2 (en) 2012-05-18 2015-09-15 Apple Inc. Efficient texture comparison
US9202099B2 (en) 2012-06-29 2015-12-01 Apple Inc. Fingerprint sensing and enrollment
US9436864B2 (en) 2012-08-23 2016-09-06 Apple Inc. Electronic device performing finger biometric pre-matching and related methods
US9111125B2 (en) 2013-02-08 2015-08-18 Apple Inc. Fingerprint imaging and quality characterization
NO20131423A1 (en) 2013-02-22 2014-08-25 Idex Asa Integrated fingerprint sensor

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040034597A1 (en) * 2000-07-07 2004-02-19 Alain Durand System and method for managing micropayment transactions, corresponding client terminal and trader equipment
US20040215615A1 (en) * 2001-06-29 2004-10-28 Alf Larsson Method and device for positioning a finger when verifying a person's identity
US20050286746A1 (en) * 2004-06-25 2005-12-29 Silvester Kelan C Biometric identification data protection
US20110010558A1 (en) * 2007-12-24 2011-01-13 Simone Baldan Biometrics based identification

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9846799B2 (en) 2012-05-18 2017-12-19 Apple Inc. Efficient texture comparison
US9715616B2 (en) 2012-06-29 2017-07-25 Apple Inc. Fingerprint sensing and enrollment
US10068120B2 (en) 2013-03-15 2018-09-04 Apple Inc. High dynamic range fingerprint sensing
US10810449B2 (en) 2016-01-04 2020-10-20 Samsung Electronics Co., Ltd. Electronic device and method of operating same
WO2019184973A1 (en) * 2018-03-30 2019-10-03 Vivo Mobile Communication Co., Ltd. Fingerprint scanning method and mobile terminal
US11348363B2 (en) 2018-03-30 2022-05-31 Vivo Mobile Communication Co., Ltd. Fingerprint scanning method and mobile terminal
US12112567B2 (en) * 2022-05-10 2024-10-08 Egis Technology Inc. Biometric detection sensor and signal processing method thereof and electronic product

Also Published As

Publication number Publication date
US9135496B2 (en) 2015-09-15
US20130308838A1 (en) 2013-11-21

Similar Documents

Publication Publication Date Title
US9135496B2 (en) Efficient texture comparison
US9436864B2 (en) Electronic device performing finger biometric pre-matching and related methods
CN111444528B (en) Data security protection method, device and storage medium
TWI471809B (en) Electronic device including finger sensor having orientation based authentication and related methods
US9111125B2 (en) Fingerprint imaging and quality characterization
US9176614B2 (en) Adapative sensing component resolution based on touch location authentication
US9047512B2 (en) Contact lenses
US7519204B2 (en) Fingerprint recognition system
US10642317B2 (en) Clasp with integrated biometric sensor to authenticate a user of a dual-housing device
US20180173937A1 (en) Multi-resolution fingerprint sensor
US12039023B2 (en) Systems and methods for providing a continuous biometric authentication of an electronic device
CN105303177A (en) Camera capable of fingerprint identification and corresponding terminal
US9942226B2 (en) NFC package for storing biometric information and electronic device
KR20200098935A (en) Display and electronic device including the same
US20200074133A1 (en) Portable device with fingerprint pattern recognition module
CN101515322B (en) Image sensing device and electronic equipment using the image sensing device
US9846799B2 (en) Efficient texture comparison
CN106709457A (en) Task execution method based on fingerprint, and mobile terminal
US8896559B2 (en) Wire-array pressure and movement sensor
US20160335469A1 (en) Portable Device with Security Module
US12143483B2 (en) Split processing of biometric data
TW202009794A (en) Portable device with fingerprint recognition module
US11366887B2 (en) Biometric authentication
US20180357463A1 (en) Portable Device with Fingerprint Pattern Recognition Module
Jansen et al. Fingerprint identification and mobile handheld devices: Overview and implementation

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE