US12470552B1 - Secure data processing using data packages generated by edge devices - Google Patents
Secure data processing using data packages generated by edge devices
- Publication number
- US12470552B1 (application No. US 17/874,803)
- Authority
- US
- United States
- Prior art keywords
- user
- data
- computing
- computing device
- user data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/0861—Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/083—Network architectures or network communication protocols for network security for authentication of entities using passwords
Definitions
- the present disclosure relates to secure data processing using data packages generated by edge devices.
- Client applications can access resources from servers.
- applications utilize authenticating information that may be used to permit access to information related to a user.
- gathering authenticating information is an inherently insecure and therefore challenging process.
- the present techniques can be utilized for secure data processing using data packages generated by edge devices. Obtaining authenticating information is an inherently insecure and therefore challenging process. To address these issues, the systems and methods described herein can scan biometric signatures of a user of a computing device and encrypt user data using data derived from or otherwise based on the biometric signature and a device identifier of the computing device. A digital key can then be generated that grants access to the encrypted user data. A trusted computing device can verify the encrypted user data based on the digital key and can provide a security token corresponding to the encrypted data. The encrypted user data and the security token can be used to generate a data package, which if altered would invalidate the security token. A second digital key corresponding to the data package can then be generated and transmitted with the data package to other computing systems for processing. These techniques can be performed by an edge-computing device, thereby solving the aforementioned security issues relating to insecure gathering of authenticating information.
- the method may be performed, for example, by one or more processors of a computing device.
- the method can include scanning, using a sensor of the computing device, a physical feature of a user of the computing device.
- the method can include generating, by the computing device and based on the scan of the physical feature, a biometric signature corresponding to the physical feature.
- the method can include generating user data based on inputs detected via one or more input devices of the computing device.
- the inputs can include information on the user.
- the method can include retrieving a device identifier corresponding to the computing device.
- the method can include encrypting, based on the biometric signature and the device identifier, the user data to generate encrypted user data.
- the method can include generating a first digital key granting access to the encrypted user data and transmitting the first digital key to a first computing system.
- the method can include receiving, from the first computing system, a security token corresponding to the user data.
- the security token can indicate that at least a subset of the user data is valid.
- the method can include storing, at the computing device, the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device. An attempt to alter the data package can invalidate the security token.
- the method can include generating a second digital key corresponding to the data package.
- the method can include transmitting the data package and the second digital key to a second computing system for use to provide the user a service (e.g., access to a virtual or physical space).
- the data package can be transmitted in response to detecting a user input indicating that the information on the user is to be provided to the second computing system. In some implementations of the method, the data package can be transmitted to the second computing system in response to a request from the second computing system. In some implementations of the method, encrypting the user data based on the biometric signature and the device identifier can include using data based on (e.g., derived from) both the biometric signature and the device identifier to encrypt the user data.
- the physical feature can be a facial feature.
- the biometric signature can be based on facial biometric data detected using the sensor.
- the physical feature can be a voice of the user.
- the sensor can be a microphone of the computing device.
- the user inputs can include an image captured using an image sensor of the computing device.
- the method can further include analyzing the image to determine image integrity.
- analyzing the image can include using at least one of a non-visible light filter or a neural network.
- the user data can include information extracted from an image captured using an image sensor of the computing device.
- scanning the physical feature of the user can include using a plurality of filters to determine authenticity of the physical feature.
- the plurality of filters corresponds to a plurality of non-visible light frequencies.
- determining authenticity of the physical feature can include determining a likelihood that an appearance of the user is forged.
- generating the user data can include performing a hashing operation on the information on the user.
- the method can further include detecting, using a location sensor of the computing device, a geolocation of the computing device corresponding to where the inputs are detected by the computing device.
- the user data can be generated to include the geolocation.
- retrieving the device identifier can include requesting the device identifier from an operating system of the computing device.
- the first and second digital keys can be first and second public keys.
- the system can include a computing device comprising one or more processors configured by machine-readable instructions.
- the system can scan, using a sensor of a computing device, a physical feature of a user of the computing device.
- the system can cause the computing device to generate, based on the scan of the physical feature, a biometric signature corresponding to the physical feature.
- the system can generate user data based on inputs detected via one or more input devices of the computing device. The inputs can include information on the user.
- the system can retrieve a device identifier corresponding to the computing device.
- the system can encrypt, based on the biometric signature and the device identifier, the user data to generate encrypted user data.
- the system can generate a first digital key granting access to the encrypted user data and transmit the first digital key to a first computing system.
- the system can receive, from the first computing system, a security token corresponding to the user data.
- the security token can indicate that at least a subset of the user data is valid.
- the system can store, at the computing device, the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device. An attempt to alter the data package can invalidate the security token.
- the system can generate a second digital key corresponding to the data package.
- the system can transmit the data package and the second digital key to a second computing system for use to provide the user a service.
- the data package can be transmitted in response to detecting a user input indicating that the information on the user is to be provided to the second computing system.
- encrypting the user data based on the biometric signature and the device identifier can include using data based on (e.g., derived from) both the biometric signature and the device identifier to encrypt the user data.
- FIG. 1 is a block diagram of an example system for secure data processing using data packages generated by edge devices, in accordance with one or more example implementations.
- FIG. 2 provides example digital identity element categories that can be maintained in identity databanks and utilized in the techniques described herein, in accordance with one or more example implementations.
- FIG. 3 is a flow diagram of an example method for secure data processing using data packages generated by edge devices, in accordance with one or more example implementations.
- FIG. 4 is a component diagram of an example computing system suitable for use in the various arrangements described herein, in accordance with one or more example implementations.
- Various embodiments described herein relate to systems and methods for secure data processing using data packages generated by edge devices. Obtaining authenticating information is an inherently insecure and therefore challenging process. To address these issues, the systems and methods described herein can scan biometric signatures of a user of a computing device and encrypt user data using data derived from or otherwise based on the biometric signature and a device identifier of the computing device. A digital key can then be generated that grants access to the encrypted user data. A trusted computing device can verify the encrypted user data based on the digital key and can provide a security token corresponding to the encrypted data. The encrypted user data and the security token can be used to generate a data package, which if altered would invalidate the security token. A second digital key corresponding to the data package can then be generated and transmitted with the data package to other computing systems for processing. These techniques can be performed by an edge-computing device, thereby solving the aforementioned security issues relating to insecure gathering of authenticating information.
- Embodiments of the present techniques can serve as a basis for a wide range of authentication procedures utilizing the data packages, including government services, healthcare and effectively any service that implements authentication of users.
- a single, unique digital identity construct offers a number of advantages, and utilizing the data packages described herein to securely capture the information for that data construct enables increased efficiency through edge computing.
- Traditional passwords may be replaced with identity-based authentication systems that utilize the data packages described herein.
- a common framework for establishing trusted identities for individuals, entities (e.g., organizations), and devices can be achieved (something useful for, e.g., the developing Internet of Things). Secure, context-specific identity validation or confirmation for common services such as hotel check-in, financial institutions, social services, car rental, online authentication, etc., can be achieved.
- the claimed technology increases the security and efficiency for the transmission or sharing of physical documents, which may otherwise be easily falsified, altered, or tampered with.
- Edge devices as used herein are computing devices that are closer to the peripheries of networks, and often closer to sources of data, than a central server that receives data from multiple computing devices may be.
- a typical edge device may be a mobile device. Additional information may be found at https://spatten.mit.edu/ (“SpAtten: Efficient Natural Language Processing”).
- the system 100 may include a trusted computing system 102 , a user device 103 , and a secondary computing system 104 .
- Each of the trusted computing system 102 , the secondary computing system 104 , and the user device 103 can be in communication with one another via the network 101 .
- the network 101 can facilitate communications among the trusted computing system 102 , the user device 103 , and the secondary computing system 104 over, for example, the internet or another network via any of a variety of network protocols such as Ethernet, Bluetooth, Cellular, or Wi-Fi.
- Each device in system 100 may include one or more processors, memories, network interfaces, and user interfaces.
- the memory may store programming logic that, when executed by the processor, controls the operation of the corresponding computing device.
- the memory may also store data in databases.
- the network interfaces allow the computing devices to communicate wirelessly or otherwise.
- the various components of devices in system 100 may be implemented via hardware (e.g., circuitry), software (e.g., executable code), or any combination thereof.
- the trusted computing system 102 can include at least one processor and a memory (e.g., a processing circuit).
- the memory can store processor-executable instructions that, when executed by a processor, cause the processor to perform one or more of the operations described herein.
- the processor may include a microprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), etc., or combinations thereof.
- the memory may include, but is not limited to, electronic, optical, magnetic, or any other storage or transmission device capable of providing the processor with program instructions.
- the memory may further include a floppy disk, CD-ROM, DVD, magnetic disk, memory chip, ASIC, FPGA, read-only memory (ROM), random-access memory (RAM), electrically erasable programmable ROM (EEPROM), erasable programmable ROM (EPROM), flash memory, optical media, or any other suitable memory from which the processor can read instructions.
- the instructions may include code from any suitable computer programming language.
- the trusted computing system 102 can include one or more computing devices or servers that can perform various functions as described herein.
- the trusted computing system 102 can include any or all of the components and perform any or all of the functions of the computer system 400 described herein in conjunction with FIG. 4 .
- the trusted computing system 102 may be a computing system of a trusted entity, such as a government entity or a trusted and independent third party, which maintains information that is known to correspond to one or more users (sometimes referred to as “verified” or “ground truth” information).
- the trusted computing system 102 may be maintained or operated by non-financial institutions and may be associated with government agencies, social media platforms, or user databases, among others.
- the trusted computing system 102 may include one or more network interfaces that facilitate communication with other computing systems of the system 100 via the network 101 .
- the system 100 may include multiple trusted computing systems 102 , which may be controlled or operated by a single entity or multiple entities.
- the user device 103 can include at least one processor and a memory (e.g., a processing circuit).
- the memory can store processor-executable instructions that, when executed by a processor, cause the processor to perform one or more of the operations described herein.
- the processor may include a microprocessor, an ASIC, an FPGA, etc., or combinations thereof.
- the memory may include, but is not limited to, electronic, optical, magnetic, or any other storage or transmission device capable of providing the processor with program instructions.
- the memory may further include a floppy disk, CD-ROM, DVD, magnetic disk, memory chip, ASIC, FPGA, ROM, RAM, EEPROM, EPROM, flash memory, optical media, or any other suitable memory from which the processor can read instructions.
- the instructions may include code from any suitable computer programming language.
- the user device 103 can include one or more computing devices or servers that can perform various functions as described herein.
- the user device 103 can include any or all of the components and perform any or all of the functions of the computer system 400 described herein in conjunction with FIG. 4.
- the user device 103 may include mobile or non-mobile devices, such as smartphones, tablet computing devices, wearable computing devices (e.g., a smartwatch, smart optical wear, etc.), personal computing devices (e.g., laptops or desktops), voice-activated digital assistance devices (e.g., smart speakers having chat bot capabilities), portable media devices, vehicle information systems, or the like.
- the user device 103 may access one or more software applications running locally or remotely.
- the user device 103 may operate as a “thin client” device, which presents user interfaces for applications that execute remotely (e.g., at the trusted computing system 102 , the secondary computing system(s) 104 , etc.).
- the user device 103 can be associated with a respective device identifier.
- the identifier may be a universally unique identifier (UUID), a globally unique identifier (GUID), a media access control (MAC) address, an internet protocol (IP) address, a device serial number, a serial number of a component of the user device 103 , a predetermined or randomly generated value associated with the user device 103 , or any type of identifier that identifies the user device 103 or the components thereof.
- Input from the user received via the user device 103 may be communicated to the server executing the remote application, which may provide additional information to the user device 103 or execute further operations in response to the user input.
- a user may access any of the computing devices of the system 100 through various user devices 103 at the same time or at different times.
- the user may access one or more computing systems of the system 100 via a digital assistance device 103 while also accessing one or more computing systems of the system 100 using a wearable computing device 103 (e.g., a smart watch).
- the user may access one or more computing systems of the system 100 via a digital assistance device 103 and later access the system 100 via a vehicle information system 103 , via desktop computing system, or a laptop computing system.
- the user device 103 can execute a client application 118 , which may provide one or more user interfaces and receive user input via one or more input/output (I/O) devices.
- the client application 118 may be provided by or be associated with the trusted computing system 102 or the secondary computing system 104 .
- the client application 118 may be a web-based application that is retrieved from the trusted computing system 102 or the secondary computing system 104 and displayed in a web browser executing at the user device 103.
- the client application 118 can execute locally at the user device 103 and may communicate information with the secondary computing systems 104 or the trusted computing system 102 via the network 101 .
- the client application 118 can access one or more device identifiers using an application programming interface (API) of an operating system of the user device 103 .
- the client application 118 can access a predetermined region of memory where the user device 103 stores one or more device identifiers.
- the client application 118 may present one or more user interfaces, for example, in response to user input or interactions with displayed interactive user interface elements.
- the user interfaces may include user interfaces that capture user information from one or more sensors 126 , as described herein.
- the user interfaces may include text or other instructions that direct the user of the user device 103 to capture one or more images of the user, place their finger on a fingerprint scanner, or provide other types of biometric input.
- the user interfaces can include interactive elements that enable a user to provide various user data 120 , send requests, or to navigate between user interfaces of the client application 118 .
- the client application 118 can be used, for example, to generate one or more data packages 124 using the techniques described herein.
- the user data 120 that is obtained by the client application 118 can include any type of information relating to the user, including biometric information such as images of the user's face (or parts thereof), fingerprint scans, one or more voice samples, an iris scan (or an image of the user's eye), palm or finger vein patterns, retinal scans, or the like. Additionally, the user data 120 can include one or more documents that include user information, such as a driver's license of the user, a passport of the user, or any other type of identifying document.
- Non-identifying information that is associated with the user may also be included, such as records of activities (e.g., interactions, websites visited, applications executed or launched, physical or virtual locations, etc.) performed using the user device, records of offline activities (e.g., transaction records, historic records of user location over periods of time, etc.), or other types of information that may be associated with the user.
- the user data 120 can be stored in one or more data structures, with each portion of the user data 120 being indexed by a corresponding label or tag value, which can be used to access the respective portion of the user data 120 .
- One or more portions of the user data 120 can be encrypted (or decrypted) using a respective digital key 122 .
- the client application 118 can generate one or more digital keys 122 using the techniques described herein.
- the digital keys can be, for example, encryption or decryption keys. Some examples include symmetric encryption/decryption keys or asymmetric encryption/decryption keys (e.g., a private key and a public key).
- the digital keys 122 can be generated to encrypt information communicated by the user device 103 via the client application 118 to improve the security of the information in transit.
- the digital keys 122 can be used to protect encrypted information such that the encrypted information cannot be accessed unless using a corresponding decryption key. Key sharing algorithms can be utilized by the user device 103 to share one or more of the digital keys 122 with other computing systems.
- the key sharing algorithms can include, but are not limited to, the Rivest-Shamir-Adleman (RSA) algorithm, the Diffie-Hellman algorithm, the elliptic-curve Diffie-Hellman (ECDH) algorithm, the ephemeral Diffie-Hellman algorithm, the elliptic-curve ephemeral Diffie-Hellman (ECDHE) algorithm, and the pre-shared key (PSK) algorithm, among others.
- the digital keys 122 can be generated using any suitable encryption/decryption key generation algorithm.
- respective digital keys 122 can be generated for user-selected portions of the user data 120 , or portions of the user data 120 selected by the client application 118 .
- digital keys 122 can be generated for particular portions of the user data 120 and for particular trusted computing systems 102 .
- certain portions of the user data 120 can be encrypted using a first digital key 122 for a first trusted computing system 102
- other portions of the user data 120 may be encrypted using a second digital key 122 for a second trusted computing system 102 . This enables the first computing system 102 and the second computing system 102 to access only the respective portions of the user data 120 corresponding to their respective key, while the rest of the user data 120 remains encrypted and inaccessible.
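- As an illustrative sketch of this selective encryption (assuming a Python environment with the cryptography package; the portion names, key variables, and example values are hypothetical and not taken from the disclosure), distinct portions of the user data 120 could be encrypted under distinct digital keys 122 as follows:

```python
import json
from cryptography.fernet import Fernet

user_data = {
    "name": "Jane Doe",             # portion intended for a first trusted system
    "passport_number": "X1234567",  # portion intended for a second trusted system
}

# One symmetric digital key per trusted computing system.
key_for_system_a = Fernet.generate_key()
key_for_system_b = Fernet.generate_key()

encrypted_portions = {
    "system_a": Fernet(key_for_system_a).encrypt(
        json.dumps({"name": user_data["name"]}).encode()),
    "system_b": Fernet(key_for_system_b).encrypt(
        json.dumps({"passport_number": user_data["passport_number"]}).encode()),
}

# The first system can recover only its own portion; the other portion
# remains encrypted and inaccessible to it.
recovered = json.loads(Fernet(key_for_system_a).decrypt(encrypted_portions["system_a"]))
```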
- the client application 118 can generate one or more data packages 124 using the techniques described herein.
- the data packages 124 can include any of the user data 120 that is encrypted using one or more corresponding digital keys 122 .
- Corresponding decryption keys generated as part of the digital keys 122 can be stored for each data package in association with the data package 124 , and shared with a corresponding trusted computing system 102 using an appropriate key sharing algorithm, as described herein.
- the key sharing algorithms can be performed prior to sharing the data package 124 with one or more trusted computing systems 102 .
- the data packages 124 can serve as containers or secure enclaves for information and may be generated by the client application based on a secure token provided by the trusted computing system 102 .
- the trusted computing system 102 can generate a secure token upon verifying that the encrypted user data 120 is authentic.
- the secure token can be provided to the client application 118 , which can then generate a data package that includes the secure token and the encrypted user data 120 .
- the client application generates the data package by encrypting the user data 120 based on the secure token.
- the data package can be generated such that any change to the data package will cause verification of the data package using the security token to fail.
- Additional digital keys 122 may also be generated to further encrypt the data packages 124 , using the techniques described herein.
- the user device 103 can include one or more sensors 126 .
- the sensors 126 can include one or more biometric sensors or ambient sensors, or any other type of sensor capable of capturing information about a user or an environment in which the user is present.
- the sensors 126 can include components that capture ambient sights and sounds (such as cameras and microphones), and that allow the user to provide inputs (e.g., a touchscreen, stylus, force sensor for sensing pressure on a display screen, and biometric components such as a fingerprint reader, a heart monitor that detects cardiovascular signals, an iris scanner, and so forth).
- the sensors 126 may include one or more location sensors to enable the user device 103 to determine its location relative to, for example, other physical objects or relative to geographic locations.
- Example location sensors include global positioning system (GPS) devices and other navigation and geolocation devices, digital compasses, gyroscopes and other orientation sensors, as well as proximity sensors or other sensors that allow the user device 103 to detect the presence and relative distance of nearby objects and devices.
- the trusted computing system 102 can include a database 106 , which may store user profiles 108 .
- the user profiles 108 may each be associated with a corresponding user and may include corresponding trusted user data 110 .
- the trusted user data 110 can be any data that is confirmed to be truthful or correct as relating to the user to which the respective user profile 108 corresponds.
- the trusted user data 110 can include any information about the user.
- the trusted user data 110 may include, but is not limited to, personally identifying data (e.g., name and social security number), psychographics data (e.g., personality, values, opinions, attitudes, interests, and lifestyles), transactional data (e.g., preferred products, purchase history, transaction history), demographic data (e.g., address, age, education), financial data (e.g., income, assets, credit score), biometric information (e.g., images of the user's face, fingerprint scans, one or more voice samples, an iris scan (or an image of the user's eye), palm or finger vein patterns, retinal scans, etc.), or other user or account data that is maintained or otherwise accessible to the trusted computing system 102 .
- the trusted computing system 102 can receive the information for the trusted user data 110 from trusted sources, such as in-person meetings with the user, government agencies, or other sources of truth.
- the trusted computing system 102 can utilize the trusted user data 110 in a user profile 108 to verify the authenticity of one or more portions of the encrypted user data 120 received from the user device 103. Verifying the information may include performing a decryption technique using a corresponding digital key 122 shared with the trusted computing system 102 by the user device 103 using an appropriate key sharing algorithm. After decrypting the encrypted user data 120 (or portions of the user data 120 to which the decryption key(s) correspond), the trusted computing system 102 can compare the information in the decrypted data to the trusted user data 110. The trusted computing system 102 can identify what portions of the decrypted user data 120 match (or reasonably correspond to) corresponding portions of the trusted user data 110.
- the security token can be any type of value that can be used to verify the integrity of the encrypted user data and may be utilized with a generated data package 124 to prevent data tampering.
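- A minimal sketch of how the trusted computing system 102 might perform this verification and issue a security token is shown below; the field-matching logic and the HMAC-based token construction are assumptions made for illustration, not the only construction contemplated by the disclosure:

```python
import hmac, hashlib, json
from typing import Optional
from cryptography.fernet import Fernet

def verify_and_issue_token(encrypted_user_data: bytes, shared_key: bytes,
                           trusted_profile: dict, server_secret: bytes) -> Optional[bytes]:
    # Decrypt with the digital key 122 shared by the user device 103.
    decrypted = json.loads(Fernet(shared_key).decrypt(encrypted_user_data))
    # Identify which submitted fields match the trusted (ground-truth) profile.
    matching = {k for k, v in decrypted.items() if trusted_profile.get(k) == v}
    if not matching:
        return None   # nothing could be verified, so no token is issued
    # The token binds the trusted system's secret to the exact encrypted
    # payload, so any later alteration of that payload invalidates the token.
    return hmac.new(server_secret, encrypted_user_data, hashlib.sha256).digest()
```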
- the user devices 103 can communicate with the trusted computing system 102 to carry out the techniques described herein.
- the client application 118 of the user device 103 can communicate with the trusted computing system 102 via the secure API 112 .
- the trusted computing system 102 can maintain and provide access to the secure API 112 to various authorized computing systems, such as the user device 103 via the network 101 .
- the secure API 112 can be an API, such as a web-based API corresponding to a particular network address, uniform resource identifier (URI), or uniform resource locator (URL), among others.
- the secure API 112 can be a client-based API, a server API (SAPI), or an Internet Server API (ISAPI).
- Various protocols may be utilized to access the secure API 112 , including a representational state transfer (REST) API, a simple object access protocol (SOAP) API, a Common Gateway Interface (CGI) API, or extensions thereof.
- the secure API 112 may be implemented in part using a network transfer protocol, such as the hypertext transfer protocol (HTTP), the secure hypertext transfer protocol (HTTPS), the file transfer protocol (FTP), the secure file transfer protocol (FTPS), each of which may be associated with a respective URI or URL.
- the secure API can be secured utilizing one or more encryption techniques, such that the secure API prevents data tampering or data leakage.
- the secondary computing system 104 can include at least one processor and a memory (e.g., a processing circuit).
- the memory can store processor-executable instructions that, when executed by the processor, cause the processor to perform one or more of the operations described herein.
- the processor may include a microprocessor, an ASIC, an FPGA, etc., or combinations thereof.
- the memory may include, but is not limited to, electronic, optical, magnetic, or any other storage or transmission device capable of providing the processor with program instructions.
- the memory may further include a floppy disk, CD-ROM, DVD, magnetic disk, memory chip, ASIC, FPGA, ROM, RAM, EEPROM, EPROM, flash memory, optical media, or any other suitable memory from which the processor can read instructions.
- the instructions may include code from any suitable computer programming language.
- the secondary computing system 104 can include one or more computing devices or servers that can perform various functions as described herein.
- the secondary computing system 104 can include any or all of the components and perform any or all of the functions of the computer system 400 described herein in conjunction with FIG. 4.
- the secondary computing system 104 can execute one or more secondary applications 116 based on the data packages 124 received from the user device 103 .
- the data packages 124 can include a security token generated by the trusted computing system 102 that is invalidated in the event of any tampering of the data package 124 . Therefore, if a verification for the data package 124 performed by the secondary computing system 104 is satisfied, the secondary computing system 104 can rely on the information in the data package 124 being both valid and secure. Therefore, using the valid and secure data in the data package, the secondary computing system 104 can execute one or more secondary applications 116 that correspond to any type of functionality that might utilize the user data 120 .
- the user data 120 in the data package can be used to provide recommendations for retirement products, personal loans, home equity loans, or other financial products.
- the secondary applications 116 can select loans to recommend to the user that have monthly, semimonthly, or periodic payments that fall within the user's available periodic cash flow.
- the secondary computing system 104 can utilize a second digital key 122 generated based on the device identifier and biometric information obtained via the user device 103 to decrypt the data package 124 .
- the security token (which may be provided to the secondary computing system 104 with the data package) can be first verified with the trusted computing system 102 as corresponding to the device identifier from which the data package 124 was obtained.
- the integrity of the data package 124 can be verified using the security token to confirm that the data package 124 has not been modified.
- the data package 124 can be decrypted using a digital key 122 obtained from the user device 103 using a suitable key sharing algorithm.
- the secondary applications can then utilize the data extracted from the decrypted data package 124 to perform further operations, with the assurance that the information extracted from the data package 124 is authentic.
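- The following sketch illustrates how the secondary computing system 104 might process a received data package 124; the verify_token callable stands in for the remote token check with the trusted computing system 102, and all field names are hypothetical assumptions:

```python
import json
from typing import Callable, Optional
from cryptography.fernet import Fernet

def handle_package(package: dict, second_digital_key: bytes,
                   verify_token: Callable[[bytes, str, bytes], bool]) -> Optional[dict]:
    # 1. verify_token stands in for the remote check with the trusted
    #    computing system 102 (was this token issued for this device and payload?).
    if not verify_token(package["security_token"], package["device_id"],
                        package["payload"]):
        return None  # token invalid or package tampered with
    # 2. Decrypt the payload with the second digital key shared by the user device 103.
    user_data = json.loads(Fernet(second_digital_key).decrypt(package["payload"]))
    # 3. Secondary applications 116 can now treat user_data as authentic.
    return user_data
```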
- the secondary computing system 104 may also maintain an identity construct for the user, and can update the identity construct based on the information extracted from the received data package 124 .
- Referring to FIG. 2 in the context of the components of FIG. 1, illustrated is an example categorization 200 of identity elements that may be present in the identity construct of the user.
- Information extracted from the data packages 124 received from a user device can be stored in a data structure that may be indexed by or associated with one or more categories.
- the data points of activities or other user data 120 can be sorted by the secondary computing system 104 into categories, cumulatively constituting the basis for a fundamental digital identity.
- “geolocation” may include, for example, elements related to where a user has been; “personal data” may include, for example, name and birthdate; “health history” may include, for example, information that might be found in health records; “romance/marriage” may include, for example, information on significant others and spouses; “work history” may include, for example, information on places and dates of employment and titles held; “charity/volunteer” may include information on, for example, charitable contributions or volunteering activities; “online posts/pics” may include, for example, textual posts and pictures/videos/other media submitted to social networking accounts; “hobbies” may include, for example, leisure or other non-employment related activities; “education” may include, for example, information on schools attended and degrees earned; “faith/religion” may include, for example, information on churches attended or religious activities; “travel” may include, for example, information on places visited; “transactions” may include, for example, information on purchases; “legal history” may include, for example, records of legal proceedings.
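- A minimal sketch of an identity construct indexed by the categories of FIG. 2 (the structure and the example values are assumptions used only for illustration) might look like the following:

```python
# Hypothetical identity construct keyed by the FIG. 2 categories.
identity_construct = {
    "geolocation":   [{"place": "Denver, CO", "timestamp": "2022-07-01"}],
    "personal data": [{"name": "Jane Doe", "birthdate": "1990-01-01"}],
    "work history":  [{"employer": "Acme Corp", "title": "Engineer"}],
}

def update_identity(construct: dict, category: str, element: dict) -> None:
    # Fold a data point extracted from a received data package 124 into the
    # category it belongs to.
    construct.setdefault(category, []).append(element)
```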
- Referring to FIG. 3, illustrated is a flow diagram of an example method 300 for secure data processing using data packages (e.g., the data packages 124) generated by edge devices (e.g., the user device 103), in accordance with one or more example implementations.
- the method 300 can be a computer-implemented method.
- the method 300 may be implemented, for example, using any of the computing systems described herein, including the user device 103 , the secondary computing system 104 , the trusted computing system 102 , or the computing system 400 described in connection with FIG. 4 . In some implementations, additional, fewer, or different operations may be performed.
- the method 300 can include scanning a physical feature of a user of a computing device (e.g., the user device 103 ) using a sensor (e.g., a sensor 126 ) of the computing device.
- the physical feature can be scanned, for example, using a camera or another type of biometric scanning device of the computing device.
- the computing device may include components that capture ambient sights and sounds (such as cameras and microphones), and that allow the user to provide inputs (e.g., a touchscreen, stylus, force sensor for sensing pressure on a display screen, biometric components such as a fingerprint reader, a heart monitor that detects cardiovascular signals, an iris scanner, and so forth).
- the image or video capture devices of the user device 103 that capture video or images can include devices that capture non-visible light, such as infrared (IR) or ultraviolet (UV) light.
- User interfaces of an application (e.g., the client application 118) executing on the computing device can prompt the user to provide biometric inputs for generating encrypted user data (e.g., the user data 120).
- the physical feature scanned using the sensor can be, but is not necessarily limited to, a picture of the user's face, a fingerprint of the user, a heart rate or heart rate pattern of the user, an iris scan of the user, a retinal scan of the user, or the like.
- the physical feature is a voice of the user (e.g., a voice print).
- the computing device can include one or more microphones that can capture a voice of the user.
- the user interfaces on the computing device can prompt the user to speak predetermined phrases or predetermined or desired portions of the user data (e.g., name, address, date of birth, etc.).
- the voice of the user can be applied to a natural language processing (NLP) model (e.g., which may be trained using machine-learning techniques by the secondary computing device 104 ).
- the NLP model may be executed by the computing device to extract one or more words or phrases spoken by the user.
- scanning the physical feature of the user can include applying one or more filters to determine authenticity or validity of the physical feature.
- the filters may be filters that are applied to determine whether the biometric data is in fact provided by the user, and not “spoofed” by a malicious actor attempting to fraudulently impersonate the user.
- a malicious actor may spoof the output of a camera to the application to provide a pre-obtained or pre-existing photo of the user that the malicious actor is attempting to impersonate.
- the computing device can gather additional information, such as IR images or UV images from additional sensors on the computing device, and cross-reference the images obtained via the camera (which may be spoofed) with the UV data or IR data captured from additional sensors.
- the computing device can execute one or more filters over the captured data to identify one or more anomalies.
- the computing device can apply one or more IR filters, UV filters, or other non-visible light frequency filters to analyze the integrity of the image of the user's face.
- the aforementioned verification techniques can be utilized to determine a score that indicates a likelihood that an appearance of the user is forged. For example, by cross-referencing UV or IR images captured at about the same time that the user's face is captured by a visible-light camera of the computing device, the computing device can detect the presence of one or more anomalies in the visible-light image. The size and number of the detected anomalies can influence the score. For example, larger anomalies or a larger number of anomalies can indicate a larger score (and therefore a higher likelihood that the image is fraudulent). Voice data or other types of biometric data can also be applied to similar filters or anomaly detection models that are trained using machine-learning techniques to detect potentially fraudulent biometric data.
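- As a hedged sketch of this cross-referencing (the threshold and weighting values below are assumptions, not values from the disclosure), a forgery-likelihood score could be computed from aligned visible-light and IR captures as follows:

```python
import numpy as np

def forgery_score(visible: np.ndarray, infrared: np.ndarray,
                  pixel_threshold: float = 0.35) -> float:
    # Both inputs are aligned grayscale captures taken at about the same time.
    v = visible.astype(float) / 255.0
    ir = infrared.astype(float) / 255.0
    anomalies = np.abs(v - ir) > pixel_threshold   # per-pixel anomaly mask
    anomaly_fraction = float(anomalies.mean())     # proportion of anomalous area
    # Larger or more numerous anomalies -> higher score -> more likely forged.
    return min(1.0, anomaly_fraction * 5.0)
```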
- the anomaly detection model can be executed using the biometric data as input and can generate a score indicating the likelihood that the biometric data has been forged or is fraudulent.
- the anomaly detection model can be trained using supervised learning, unsupervised learning, semi-supervised learning, or other machine-learning techniques to calculate the score.
- machine learning models can include neural networks (e.g., a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN) such as a long short-term memory (LSTM) model, combinations thereof, etc.), regression models (e.g., linear regression, support vector machine (SVM), logistic regression, polynomial regression, ridge regression, Lasso regression, Bayesian linear regression, etc.), or other types of classifiers (e.g., naïve Bayes, decision trees, k-nearest neighbor (KNN), extreme gradient boosting (XGBoost) models, etc.).
- the aforementioned machine-learning models may also be utilized for any type of machine-learning or artificial intelligence (AI) performed task described herein.
- multiple machine-learning models may be executed in a machine-learning pipeline to perform various operations described herein.
- the method 300 can include generating a biometric signature corresponding to the physical feature.
- the biometric signature may be a signature, or a reduced version, of the scan performed in step 305 .
- the biometric signature may be generated by performing one or more feature extraction techniques, for example, to reduce the size of the biometric data provided by the user. Reducing the size of the data provided by the user allows for increased efficiency when utilizing the biometric signature to perform further processing steps, such as data encryption.
- the biometric signature can further be generated to conform to a predetermined data size or format that is compatible with the encryption techniques described herein.
- the physical feature detected using the biometric techniques described herein can be a facial feature.
- An example facial feature may relate to aspects (e.g., shape or outline) of a user's eyes, nose, lips, etc.
- the facial feature may be, or may be based on, a facial image.
- the biometric signature can be a reduced dataset that preserves the unique or identifying features of the user's face.
- the biometric signature may be generated using one or more feature extraction techniques, such as edge detection, bounding-box detection, or detection of particular features on the user's face (e.g., position and shape of eyes, nose, mouth, ears, eyebrows, etc.), and their relative positions or distances from one another. This information can be stored as the biometric signature for the user, which may be utilized in subsequent processing steps.
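- A minimal sketch of such feature extraction is shown below; the landmark input (e.g., from a face-landmark model) and the normalization, quantization, and hashing choices are illustrative assumptions that yield a fixed-size biometric signature:

```python
import hashlib
import numpy as np

def biometric_signature(landmarks: np.ndarray) -> bytes:
    # landmarks: (N, 2) array of detected facial keypoints (eyes, nose, mouth, ...).
    center = landmarks.mean(axis=0)
    scale = max(float(np.linalg.norm(landmarks - center, axis=1).max()), 1e-6)
    normalized = (landmarks - center) / scale            # translation/scale invariant
    quantized = np.round(normalized * 100).astype(int)   # reduce sensitivity to noise
    # Hash the reduced representation into a fixed-size value compatible with
    # the encryption steps described later in the method.
    return hashlib.sha256(quantized.tobytes()).digest()
```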
- the method 300 can include generating user data (e.g., the user data 120 ) based on inputs detected via one or more input devices of the computing device.
- the inputs can include information about the user.
- the application executing on the computing device can prompt the user to provide information relating to the user.
- the information may be provided verbally and transcribed by the computing device by executing an NLP model or another type of trained speech-to-text processing model over a voice input recorded using a microphone or another type of audio input.
- the user inputs can include an image captured using an image sensor of the computing device.
- the application may prompt the user to capture one or more images of various documents (e.g., driver's license, medical documents, utility bills, etc.) that include identifying information about the user.
- the images can be stored in the memory of the computing device and can be utilized in an image processing function or algorithm that can extract pertinent information relating to the user from the image.
- the computing device can execute a trained artificial intelligence model to identify regions of an image that are likely to correspond to pertinent details (e.g., blocks of text, etc.).
- Natural language processing operations (e.g., executing additional machine-learning models or other types of image-to-text algorithms like optical character recognition) can then be applied to the identified regions.
- optical character recognition can be used to extract sections of text from the image(s), and then regular expression (regex) rules can be applied to the sections of text to identify and extract the user data.
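- As an illustrative sketch (using pytesseract as an assumed OCR backend; the regular-expression patterns and field names are hypothetical), extraction of user data from a captured document image might look like the following:

```python
import re
import pytesseract                 # assumed OCR backend
from PIL import Image

def extract_user_fields(image_path: str) -> dict:
    text = pytesseract.image_to_string(Image.open(image_path))
    fields = {}
    dob = re.search(r"\b(\d{2}/\d{2}/\d{4})\b", text)        # e.g., 01/31/1990
    if dob:
        fields["date_of_birth"] = dob.group(1)
    dl = re.search(r"\bDL[#:\s]*([A-Z0-9]{6,12})\b", text)    # hypothetical license pattern
    if dl:
        fields["license_number"] = dl.group(1)
    return fields
```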
- Geolocation data may also be detected using a location sensor of the computing device.
- the location sensor can generate a geolocation of the computing device corresponding to where the inputs are detected by the computing device.
- the sensors 126 may include one or more location sensors to enable the user device 103 to determine its location relative to, for example, other physical objects or relative to geographic locations.
- Example location sensors include global positioning system (GPS) devices and other navigation and geolocation devices, digital compasses, gyroscopes and other orientation sensors, as well as proximity sensors or other sensors that allow the user device 103 to detect the presence and relative distance of nearby objects and devices.
- the user data is generated to comprise the geolocation, which may be stored in association with the data provided via user input.
- the computing device may generate a hash value of a portion of or all of the user data.
- the hash value may be utilized as a measure to detect whether some or all of the user data has been changed after its original generation.
- the hash can be a safeguard against potential data tampering and may be utilized as an initial verification process when attempting to process the user data. For example, prior to processing or utilizing the user data in further downstream processing operations, the hash value of the user data can be recalculated and compared with the hash value generated when the user data itself was generated. If there are differences between the hash values, the user data may have been changed by another party (e.g., a hacker or another entity) prior to the attempted utilization of the user data.
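- A minimal sketch of this hash-based tamper check is shown below; the JSON canonicalization choice and the example fields are assumptions made so the hash is deterministic:

```python
import hashlib, json

def hash_user_data(user_data: dict) -> str:
    # Serialize deterministically (sorted keys) so the same data always
    # produces the same hash value.
    canonical = json.dumps(user_data, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

user_data = {"name": "Jane Doe", "date_of_birth": "01/31/1990"}  # hypothetical
stored_hash = hash_user_data(user_data)   # computed when the data is generated

def is_untampered(data: dict, expected_hash: str) -> bool:
    # Recompute before downstream processing; a mismatch indicates the data
    # may have been altered after its original generation.
    return hash_user_data(data) == expected_hash
```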
- the method 300 can include retrieving a device identifier corresponding to the computing device.
- the identifier may be a UUID, a GUID, a MAC address, an IP address, a device serial number, a serial number of a component of the computing device, a predetermined or randomly generated value associated with the computing device, or the like.
- the computing device can retrieve the one or more device identifiers using an API of an operating system of the computing device.
- a predetermined region of memory of the computing device that stores one or more device identifiers can be accessed to retrieve the device identifier.
- the device identifier can then be used in subsequent processing steps to encrypt the user data.
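- As one hedged example of retrieving a device identifier (using the MAC-derived value returned by Python's uuid.getnode(); the disclosure also allows serial numbers, UUIDs, or other operating-system-provided identifiers):

```python
import uuid

def get_device_identifier() -> str:
    # uuid.getnode() returns a 48-bit value derived from a network interface's
    # MAC address on most platforms (or a random value when none is available).
    return format(uuid.getnode(), "012x")
```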
- the method 300 can include encrypting the user data to generate encrypted user data.
- the encrypted user data can be generated, for example, by utilizing the biometric signature and the device identifier to generate a private key.
- the private key may be a hash value of the biometric signature concatenated with the device identifier, or may be individual hash values of the biometric signature and the device identifier added together to form a single key value.
- salt values or additional data padding may be added to the biometric signature and the device identifiers to generate the private key (e.g., one of the digital keys 122 ).
- the private key can then be utilized in a corresponding encryption and key generation algorithm to encrypt the user data.
- the computing device can execute a suitable encryption algorithm, such as an asymmetric encryption algorithm or a symmetric encryption algorithm to generate the encrypted user data.
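- A minimal sketch of this encryption step is shown below; deriving the key with SHA-256 over the concatenated biometric signature, device identifier, and salt, and encrypting with AES-GCM, are illustrative choices, since the disclosure allows any suitable symmetric or asymmetric algorithm:

```python
import os, hashlib, json
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_user_data(user_data: dict, biometric_signature: bytes,
                      device_identifier: str, salt: bytes) -> dict:
    # Derive a 256-bit symmetric key from the biometric signature concatenated
    # with the device identifier and a salt value.
    key = hashlib.sha256(biometric_signature + device_identifier.encode() + salt).digest()
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, json.dumps(user_data).encode(), None)
    return {"nonce": nonce, "ciphertext": ciphertext, "salt": salt}
```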
- the method 300 can include generating a first digital key (e.g., a digital key 122 ) granting access to the encrypted user data and transmitting the digital key to a first computing system.
- the first digital key can be utilized to decrypt the encrypted user data.
- the first digital key can be an asymmetric decryption key (e.g., a public key) that is generated as part of the encryption algorithm executed at step 325 .
- the first digital key may be generated as a symmetric decryption key as part of the encryption process executed at step 325 .
- respective digital keys can be generated for user-selected or predetermined portions of the encrypted user data.
- digital keys can be generated for particular portions of the user data and for particular trusted computing systems (e.g., particular trusted computing systems 102 ).
- certain portions of the user data can be encrypted using a first set of encryption keys (e.g., each of which may be derived from a combination of the device identifier and the biometric signature) for a first trusted computing system, and other portions of the user data may be encrypted by a second set of encryption keys (e.g., also derived from the device identifier and the biometric signature) that correspond to a second trusted computing system.
- Corresponding decryption keys can be generated for each encryption key. This enables the first trusted computing system and the second trusted computing system to access only the respective portions of the user data corresponding to their respective decryption key, while the rest of the user data remains encrypted and inaccessible.
- the first digital key(s) (e.g., the decryption keys for the encrypted user data) can be stored in association with the encrypted user data, and can be transmitted to the trusted computing system(s) using a suitable key sharing algorithm.
- Key sharing algorithms can be any algorithm that may be utilized by the computing device to share one or more of the digital keys with other computing systems.
- the key sharing algorithms can include, but are not limited to, the RSA algorithm, the Diffie-Hellman algorithm, the ECDH algorithm, the ephemeral Diffie-Hellman algorithm, the ECDHE algorithm, and the PSK algorithm, among others.
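- As an illustrative sketch of key sharing (X25519 ECDH with an HKDF step is chosen here for brevity; the disclosure also lists RSA, Diffie-Hellman, ECDHE, and PSK), both parties can derive a common wrapping key for transporting a digital key:

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

device_private = X25519PrivateKey.generate()   # held by the user device 103
server_private = X25519PrivateKey.generate()   # held by the trusted computing system 102

# Only public keys are exchanged; both sides then derive the same secret,
# which can wrap (protect) the first digital key while it is in transit.
device_secret = device_private.exchange(server_private.public_key())
server_secret = server_private.exchange(device_private.public_key())
assert device_secret == server_secret

wrap_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"digital-key-transport").derive(device_secret)
```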
- the computing device can receive, from a first computing system (e.g., the trusted computing system 102), a generated security token corresponding to the user data.
- the security token can indicate that at least a subset of the user data is valid.
- the trusted computing system can utilize the trusted user data maintained in a database to verify the authenticity of one or more portions of the encrypted user data received from the computing device. Verifying the information may include performing a decryption technique using a corresponding digital key shared using an appropriate key sharing algorithm. After decrypting the encrypted user data (or portions of the user data to which the decryption key(s) correspond), the trusted computing system can compare the information in the decrypted data to the trusted user data corresponding to the requesting user.
- the trusted computing system can identify what portions of the decrypted user data match (or almost match, e.g., match within a tolerance threshold) corresponding portions of the trusted user data. These matching portions of the user data can be used to generate a security token.
- the security token may be generated without using the matching portions of the user data.
- the security token can be any type of value that can be used to verify the integrity of the encrypted user data, and may be utilized with the data package 124 to prevent data tampering.
- the trusted computing system can generate a hash value, a random value, or another type of token value using a token generation algorithm.
- a copy of the security token can be stored in association with the one or more portions of the user data to which the security token corresponds (e.g., for further verification purposes in response to a request from a secondary computing device).
- the security token can be transmitted to the computing device.
- the computing system can then utilize the security token to generate a data package.
- the method 300 can include storing, at the computing device, the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device.
- the data package can be stored or generated such that attempts to alter the data package invalidate the security token.
- the security token can itself be a hash value of the encrypted user data, which in some implementations may also include a predetermined salt value or other deterministic information.
- the data package can include the encrypted user data and the security token, such that any tampering of the data package would cause a verification process of the encrypted user data using the security token to fail.
- the security token may include a hash (or a partial hash) of some or all of the encrypted user data.
- the data packages can serve as containers or secure enclaves for information, and may be generated by the client application based on a secure token provided by the trusted computing system.
- the data package can be generated such that any change to the data package will cause verification of the data package using the security token to fail.
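- A minimal sketch of assembling and verifying such a tamper-evident data package is shown below; treating the security token as an HMAC over the encrypted payload is one of the constructions the description contemplates, and the helper names are assumptions:

```python
import hmac, hashlib

def build_data_package(encrypted_user_data: bytes, security_token: bytes) -> dict:
    # The package binds the encrypted payload to the token issued for it.
    return {"payload": encrypted_user_data, "token": security_token}

def package_is_valid(package: dict, token_secret: bytes) -> bool:
    # token_secret is held by the trusted computing system that issued the token.
    expected = hmac.new(token_secret, package["payload"], hashlib.sha256).digest()
    # Any change to the payload (or to the token) makes this comparison fail.
    return hmac.compare_digest(expected, package["token"])
```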
- the method 300 can include generating a second digital key corresponding to the data package.
- the second digital key can be generated when encrypting the data package to transmit the data package to a secondary computing device (e.g., the secondary computing device 104 ).
- Encrypting the data package can generate an encrypted data package.
- the encrypted data package can be generated using similar processes to those used to generate the encrypted user data.
- the biometric signature and the device identifier can be utilized to generate a private encryption key for the data package.
- the private key may be a hash value of the biometric signature concatenated with the device identifier, or may be individual hash values of the biometric signature and the device identifier added together to form a single key value.
- salt values or additional data padding may be added to the biometric signature and the device identifiers to generate the private encryption key.
- the private encryption key can then be utilized in a corresponding encryption and key generation algorithm to encrypt the data package.
- the computing device can generate a corresponding decryption key (e.g., which may be a public decryption key) that may be used to decrypt the encrypted data package.
- the second digital key can be an asymmetric decryption key (e.g., a public key) that is generated as part of the encryption algorithm used to encrypt the data package.
- the second digital key may be generated as a symmetric decryption key as part of the encryption process.
- the decryption key can be stored in association with the data package at the computing device.
- the method 300 can include transmitting the data package and the second digital key to a second computing system (e.g., the secondary computing system) for use in executing one or more secondary applications (e.g., the secondary applications 116 ).
- the data package can be transmitted, for example, in response to detecting a user input at the application executing at the computing device.
- the user input can indicate that information relating to the user (e.g., the user data) is to be provided to the second computing system.
- the data package can be transmitted to the second computing system in response to a request from the second computing system.
- the data package can be transmitted to the second computing system via a network.
- the second digital key can be transmitted to the second computing system using a suitable key sharing algorithm.
- Key sharing algorithms can be any algorithm that may be utilized by the computing device to share one or more of the digital keys with other computing systems.
- the key sharing algorithms can include, but are not limited to, the RSA algorithm, the Diffie-Hellman algorithm, the ECDH algorithm, the ephemeral Diffie-Hellman algorithm, the ECDHE algorithm, and the PSK algorithm, among others.
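As a hedged example of one listed key sharing algorithm, the following sketch performs an ECDH exchange with the `cryptography` package and stretches the shared secret into a symmetric key with HKDF; the curve, hash, and context label are illustrative choices, not requirements of the disclosure.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral key pair and exchanges public keys.
device_private = ec.generate_private_key(ec.SECP256R1())
secondary_private = ec.generate_private_key(ec.SECP256R1())

# Both sides derive the same shared secret from their own private key and the
# peer's public key, then stretch it into a symmetric key with HKDF.
shared_secret = device_private.exchange(ec.ECDH(), secondary_private.public_key())
shared_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"data-package-key-share",  # illustrative context label
).derive(shared_secret)
```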
- FIG. 4 is a component diagram of an example computing system suitable for use in the various implementations described herein, according to an example implementation.
- the computing system 400 may implement an example trusted computing system 102, user device 103, secondary computing system 104, or various other example systems and devices described in the present disclosure.
- the computing system 400 includes a bus 402 or other communication component for communicating information and a processor 404 coupled to the bus 402 for processing information.
- the computing system 400 also includes main memory 406, such as a RAM or other dynamic storage device, coupled to the bus 402 for storing information and instructions to be executed by the processor 404.
- Main memory 406 can also be used for storing position information, temporary variables, or other intermediate information during execution of instructions by the processor 404 .
- the computing system 400 may further include a read only memory (ROM) 408 or other static storage device coupled to the bus 402 for storing static information and instructions for the processor 404 .
- a storage device 410 such as a solid-state device, magnetic disk, or optical disk, is coupled to the bus 402 for persistently storing information and instructions.
- the computing system 400 may be coupled via the bus 402 to a display 414, such as a liquid crystal display or active matrix display, for displaying information to a user.
- An input device 412, such as a keyboard including alphanumeric and other keys, may be coupled to the bus 402 for communicating information and command selections to the processor 404.
- the input device 412 has a touch screen display.
- the input device 412 can include any type of biometric sensor, or a cursor control, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 404 and for controlling cursor movement on the display 414 .
- the computing system 400 may include a communications adapter 416 , such as a networking adapter.
- Communications adapter 416 may be coupled to bus 402 and may be configured to enable communications with a computing or communications network 101 and/or other computing systems.
- any type of networking configuration may be achieved using communications adapter 416, such as wired (e.g., via Ethernet), wireless (e.g., via Wi-Fi, Bluetooth), satellite (e.g., via GPS), pre-configured, ad-hoc, LAN, WAN, and the like.
- the processes of the illustrative implementations that are described herein can be achieved by the computing system 400 in response to the processor 404 executing an implementation of instructions contained in main memory 406 .
- Such instructions can be read into main memory 406 from another computer-readable medium, such as the storage device 410 .
- Execution of the implementation of instructions contained in main memory 406 causes the computing system 400 to perform the illustrative processes described herein.
- One or more processors in a multi-processing implementation may also be employed to execute the instructions contained in main memory 406 .
- hard-wired circuitry may be used in place of or in combination with software instructions to implement illustrative implementations. Thus, implementations are not limited to any specific combination of hardware circuitry and software.
- a “circuit” may include hardware structured to execute the functions described herein.
- each respective “circuit” may include machine-readable media for configuring the hardware to execute the functions described herein.
- the circuit may be embodied as one or more circuitry components including, but not limited to, processing circuitry, network interfaces, peripheral devices, input devices, output devices, sensors, etc.
- a circuit may take the form of one or more analog circuits, electronic circuits (e.g., integrated circuits (IC), discrete circuits, system on a chip (SOC) circuits), telecommunication circuits, hybrid circuits, and any other type of “circuit.”
- the “circuit” may include any type of component for accomplishing or facilitating achievement of the operations described herein.
- a circuit as described herein may include one or more transistors, logic gates (e.g., NAND, AND, NOR, OR, XOR, NOT, XNOR), resistors, multiplexers, registers, capacitors, inductors, diodes, wiring, and so on.
- the “circuit” may also include one or more processors communicatively coupled to one or more memory or memory devices.
- the one or more processors may execute instructions stored in the memory or may execute instructions otherwise accessible to the one or more processors.
- the one or more processors may be embodied in various ways.
- the one or more processors may be constructed in a manner sufficient to perform at least the operations described herein.
- the one or more processors may be shared by multiple circuits (e.g., circuit A and circuit B may comprise or otherwise share the same processor, which, in some example implementations, may execute instructions stored, or otherwise accessed, via different areas of memory).
- the one or more processors may be structured to perform or otherwise execute certain operations independent of one or more co-processors.
- two or more processors may be coupled via a bus to enable independent, parallel, pipelined, or multi-threaded instruction execution.
- Each processor may be implemented as one or more general-purpose processors, ASICs, FPGAs, digital signal processors (DSPs), or other suitable electronic data processing components structured to execute instructions provided by memory.
- the one or more processors may take the form of a single core processor, multi-core processor (e.g., a dual core processor, triple core processor, and/or quad core processor), microprocessor, etc.
- the one or more processors may be external to the apparatus, for example the one or more processors may be a remote processor (e.g., a cloud based processor).
- the one or more processors may be internal and/or local to the apparatus.
- a given circuit or components thereof may be disposed locally (e.g., as part of a local server, a local computing system) or remotely (e.g., as part of a remote server such as a cloud based server).
- a “circuit” as described herein may include components that are distributed across one or more locations.
- An exemplary system for implementing the overall system or portions of the implementations might include general purpose computing devices in the form of computers, including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit.
- Each memory device may include non-transient volatile storage media, non-volatile storage media, non-transitory storage media (e.g., one or more volatile and/or non-volatile memories), etc.
- the non-volatile media may take the form of ROM, flash memory (e.g., flash memory such as NAND, 3D NAND, NOR, 3D NOR), EEPROM, MRAM, magnetic storage, hard discs, optical discs, etc.
- the volatile storage media may take the form of RAM, TRAM, ZRAM, etc. Combinations of the above are also included within the scope of machine-readable media.
- machine-executable instructions comprise, for example, instructions and data that cause a general purpose computer, special purpose computer, or special purpose processing machine to perform a certain function or group of functions.
- Each respective memory device may be operable to maintain or otherwise store information relating to the operations performed by one or more associated circuits, including processor instructions and related data (e.g., database components, object code components, script components), in accordance with the example implementations described herein.
- input devices may include any type of input device including, but not limited to, a keyboard, a keypad, a mouse, joystick, or other input devices performing a similar function.
- output devices may include any type of output device including, but not limited to, a computer monitor, printer, facsimile machine, or other output devices performing a similar function.
Abstract
Disclosed are example methods, systems, and devices that allow for secure data processing using data packages generated by edge devices. The techniques include generating a biometric signature using information captured by a computing device, and encrypting user data obtained via the computing device using the biometric signature and a device identifier of the computing device. A security token can be generated and utilized by the computing device to generate a data package, which is configured such that any change to the data package would cause a validation process of the data package using the security token to fail. The data package can be encrypted using various digital keys and provided to secondary computing systems.
Description
The present disclosure relates to secure data processing using data packages generated by edge devices.
Client applications can access resources from servers. In many cases, applications utilize authenticating information that may be used to permit access to information related to a user. However, gathering authenticating information is an inherently insecure and therefore challenging process.
The present techniques can be utilized for secure data processing using data packages generated by edge devices. Obtaining authenticating information is an inherently insecure and therefore challenging process. To address these issues, the systems and methods described herein can scan biometric signatures of a user of a computing device and encrypt user data using, derived from, or otherwise based on the biometric signature and a device identifier of the computing device. A digital key can then be generated that grants access to the encrypted user data. A trusted computing device can verify the encrypted user data based on the digital key and can provide a security token corresponding to the encrypted data. The encrypted user data and the security token can be used to generate a data package, which if altered would invalidate the security token. A second digital key corresponding to the data package can then be generated and transmitted with the data package to other computing systems for processing. These techniques can be performed by an edge-computing device, thereby solving the aforementioned security issues relating to insecure gathering.
One aspect of the present disclosure relates to a method for generating authentication data packages based on physical features. The method may be performed, for example, by one or more processors of a computing device. The method can include scanning, using a sensor of the computing device, a physical feature of a user of the computing device. The method can include generating, by the computing device and based on the scan of the physical feature, a biometric signature corresponding to the physical feature. The method can include generating user data based on inputs detected via one or more input devices of the computing device. The inputs can include information on the user. The method can include retrieving a device identifier corresponding to the computing device. The method can include encrypting, based on the biometric signature and the device identifier, the user data to generate encrypted user data. The method can include generating a first digital key granting access to the encrypted user data and transmitting the first digital key to a first computing system. The method can include receiving, from the first computing system, a security token corresponding to the user data. The security token can indicate that at least a subset of the user data is valid. The method can include storing, at the computing device, the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device. An attempt to alter the data package can invalidate the security token. The method can include generating a second digital key corresponding to the data package. The method can include transmitting the data package and the second digital key to a second computing system for use in providing the user a service (e.g., access to a virtual or physical space).
In some implementations of the method, the data package can be transmitted in response to detecting a user input indicating that the information on the user is to be provided to the second computing system. In some implementations of the method, the data package can be transmitted to the second computing system in response to a request from the second computing system. In some implementations of the method, encrypting the user data based on the biometric signature and the device identifier can include using data based on (e.g., derived from) both the biometric signature and the device identifier to encrypt the user data.
In some implementations, the physical feature can be a facial feature. In some implementations, the biometric signature can be based on facial biometric data detected using the sensor. In some implementations, the physical feature can be a voice of the user. In some implementations, the sensor can be a microphone of the computing device. In some implementations, the user inputs can include an image captured using an image sensor of the computing device. In some implementations, the method can include further including analyzing the image to determine image integrity. In some implementations, analyzing the image can include using at least one of a non-visible light filter or a neural network.
In some implementations, the user data can include information extracted from an image captured using an image sensor of the computing device. In some implementations, scanning the physical feature of the user can include using a plurality of filters to determine authenticity of the physical feature. In some implementations, the plurality of filters corresponds to a plurality of non-visible light frequencies. In some implementations, determining authenticity of the physical feature can include determining a likelihood that an appearance of the user is forged. In some implementations, generating the user data can include performing a hashing operation on the information on the user.
In some implementations, the method can include further including detecting, using a location sensor of the computing device, a geolocation of the computing device corresponding to where the inputs are detected by the computing device. In some implementations, the user data can be generated to include the geolocation. In some implementations, retrieving the device identifier can include requesting the device identifier from an operating system of the computing device. In some implementations, the first and second digital keys can be first and second public keys.
Another aspect of the present disclosure relates to a system configured for generating authentication data packages based on physical features. The system can include a computing device comprising one or more processors configured by machine-readable instructions. The system can scan, using a sensor of the computing device, a physical feature of a user of the computing device. The system can cause the computing device to generate, based on the scan of the physical feature, a biometric signature corresponding to the physical feature. The system can generate user data based on inputs detected via one or more input devices of the computing device. The inputs can include information on the user. The system can retrieve a device identifier corresponding to the computing device. The system can encrypt, based on the biometric signature and the device identifier, the user data to generate encrypted user data. The system can generate a first digital key granting access to the encrypted user data and transmit the first digital key to a first computing system. The system can receive, from the first computing system, a security token corresponding to the user data. The security token can indicate that at least a subset of the user data is valid. The system can store, at the computing device, the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device. An attempt to alter the data package can invalidate the security token. The system can generate a second digital key corresponding to the data package. The system can transmit the data package and the second digital key to a second computing system for use in providing the user a service.
In some implementations, the data package can be transmitted in response to detecting a user input indicating that the information on the user is to be provided to the second computing system. In some implementations, encrypting the user data based on the biometric signature and the device identifier can include using data based on (e.g., derived from) both the biometric signature and the device identifier to encrypt the user data.
These and other aspects and implementations are discussed in detail below. The foregoing information and the following detailed description include illustrative examples of various aspects and implementations and provide an overview or framework for understanding the nature and character of the claimed aspects and implementations. The drawings provide illustration and a further understanding of the various aspects and implementations, and are incorporated in and constitute a part of this specification. Aspects can be combined, and it will be readily appreciated that features described in the context of one aspect of the invention can be combined with other aspects. Aspects can be implemented in any convenient form, for example, by appropriate computer programs, which may be carried on appropriate carrier media (computer readable media), which may be tangible carrier media (e.g., disks) or intangible carrier media (e.g., communications signals). Aspects may also be implemented using any suitable apparatus, which may take the form of programmable computers running computer programs arranged to implement the aspect. As used in the specification and in the claims, the singular form of ‘a,’ ‘an,’ and ‘the’ include plural referents unless the context clearly dictates otherwise.
The accompanying drawings are not intended to be drawn to scale. Like reference numbers and designations in the various drawings indicate like elements. For purposes of clarity, not every component may be labeled in every drawing. In the drawings:
Below are detailed descriptions of various concepts related to, and implementations of, techniques, approaches, methods, apparatuses, and systems for secure data processing using data packages generated by edge devices. The various concepts introduced above and discussed in detail below may be implemented in any of numerous ways, as the described concepts are not limited to any particular manner of implementation. Examples of specific implementations and applications are provided primarily for illustrative purposes.
Various embodiments described herein relate to systems and methods for secure data processing using data packages generated by edge devices. Obtaining authenticating information is an inherently insecure and therefore challenging process. To address these issues, the systems and methods described herein can scan biometric signatures of a user of a computing device and encrypt user data using, derived from, or otherwise based on the biometric signature and a device identifier of the computing device. A digital key can then be generated that grants access to the encrypted user data. A trusted computing device can verify the encrypted user data based on the digital key and can provide a security token corresponding to the encrypted data. The encrypted user data and the security token can be used to generate a data package, which if altered would invalidate the security token. A second digital key corresponding to the data package can then be generated and transmitted with the data package to other computing systems for processing. These techniques can be performed by an edge-computing device, thereby solving the aforementioned security issues relating to insecure gathering.
Embodiments of the present techniques can serve as a basis for a wide range of authentication procedures utilizing the data packages, including government services, healthcare, and effectively any service that implements authentication of users. A single, unique digital identity construct offers a number of advantages, and utilizing the data packages described herein to securely capture the information for that data construct enables increased efficiency through edge computing. Traditional passwords may be replaced with identity-based authentication systems that utilize the data packages described herein. A common framework for establishing trusted identities for individuals, entities (e.g., organizations), and devices can be achieved (something useful for, e.g., the developing Internet of Things). Secure, context-specific identity validation or confirmation for common services such as hotel check-in, financial institutions, social services, car rental, online authentication, etc., can be achieved. Furthermore, the claimed technology increases the security and efficiency of the transmission or sharing of physical documents, which may otherwise be easily falsified, altered, or tampered with.
Edge devices, as used herein, are computing devices that are closer to the peripheries of networks, and often closer to sources of data, than a central server that receives data from multiple computing devices may be. A typical edge device may be a mobile device. Additional information may be found at https://spatten.mit.edu/ (“SpAtten: Efficient Natural Language Processing”).
Referring to FIG. 1 , illustrated is a block diagram of an example system 100 for secure data processing using data packages generated by edge devices, in accordance with one or more example implementations. The system 100 may include a trusted computing system 102, a user device 103, and a secondary computing system 104. Each of the trusted computing system 102, the secondary computing system 104, and the user device 103 can be in communication with one another via the network 101. The network 101 can facilitate communications among the trusted computing system 102, the user device 103, and the secondary computing system 104 over, for example, the internet or another network via any of a variety of network protocols such as Ethernet, Bluetooth, Cellular, or Wi-Fi.
Each device in system 100 may include one or more processors, memories, network interfaces, and user interfaces. The memory may store programming logic that, when executed by the processor, controls the operation of the corresponding computing device. The memory may also store data in databases. The network interfaces allow the computing devices to communicate wirelessly or otherwise. The various components of devices in system 100 may be implemented via hardware (e.g., circuitry), software (e.g., executable code), or any combination thereof.
The trusted computing system 102 can include at least one processor and a memory (e.g., a processing circuit). The memory can store processor-executable instructions that, when executed by a processor, cause the processor to perform one or more of the operations described herein. The processor may include a microprocessor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), etc., or combinations thereof. The memory may include, but is not limited to, electronic, optical, magnetic, or any other storage or transmission device capable of providing the processor with program instructions. The memory may further include a floppy disk, CD-ROM, DVD, magnetic disk, memory chip, ASIC, FPGA, read-only memory (ROM), random-access memory (RAM), electrically erasable programmable ROM (EEPROM), erasable programmable ROM (EPROM), flash memory, optical media, or any other suitable memory from which the processor can read instructions. The instructions may include code from any suitable computer programming language. The trusted computing system 102 can include one or more computing devices or servers that can perform various functions as described herein. The trusted computing system 102 can include any or all of the components and perform any or all of the functions of the computer system 400 described herein in conjunction with FIG. 4 .
The trusted computing system 102 may be a computing system of a trusted entity, such as a government entity or a trusted and independent third party, which maintains information that is known to correspond to one or more users (sometimes referred to as “verified” or “ground truth” information). For example, the trusted computing system 102 may be maintained or operated by non-financial institutions and may be associated with government agencies, social media platforms, or user databases, among others. The trusted computing system 102 may include one or more network interfaces that facilitate communication with other computing systems of the system 100 via the network 101. In some implementations, the system 100 may include multiple trusted computing systems 102, which may be controlled or operated by a single entity or multiple entities.
The user device 103 can include at least one processor and a memory (e.g., a processing circuit). The memory can store processor-executable instructions that, when executed by a processor, cause the processor to perform one or more of the operations described herein. The processor may include a microprocessor, an ASIC, an FPGA, etc., or combinations thereof. The memory may include, but is not limited to, electronic, optical, magnetic, or any other storage or transmission device capable of providing the processor with program instructions. The memory may further include a floppy disk, CD-ROM, DVD, magnetic disk, memory chip, ASIC, FPGA, ROM, RAM, EEPROM, EPROM, flash memory, optical media, or any other suitable memory from which the processor can read instructions. The instructions may include code from any suitable computer programming language. The user device 103 can include one or more computing devices or servers that can perform various functions as described herein. The user device 103 can include any or all of the components and perform any or all of the functions of the computer system 400 described herein in conjunction with FIG. 4 .
The user device 103 may include mobile or non-mobile devices, such as smartphones, tablet computing devices, wearable computing devices (e.g., a smartwatch, smart optical wear, etc.), personal computing devices (e.g., laptops or desktops), voice-activated digital assistance devices (e.g., smart speakers having chat bot capabilities), portable media devices, vehicle information systems, or the like. The user device 103 may access one or more software applications running locally or remotely. The user device 103 may operate as a “thin client” device, which presents user interfaces for applications that execute remotely (e.g., at the trusted computing system 102, the secondary computing system(s) 104, etc.). The user device 103 can be associated with a respective device identifier. The identifier may be a universally unique identifier (UUID), a globally unique identifier (GUID), a media access control (MAC) address, an internet protocol (IP) address, a device serial number, a serial number of a component of the user device 103, a predetermined or randomly generated value associated with the user device 103, or any type of identifier that identifies the user device 103 or the components thereof.
Input from the user received via the user device 103 may be communicated to the server executing the remote application, which may provide additional information to the user device 103 or execute further operations in response to the user input. In some examples, a user may access any of the computing devices of the system 100 through various user devices 103 at the same time or at different times. For example, the user may access one or more computing systems of the system 100 via a digital assistance device 103 while also accessing one or more computing systems of the system 100 using a wearable computing device 103 (e.g., a smart watch). In other examples, the user may access one or more computing systems of the system 100 via a digital assistance device 103 and later access the system 100 via a vehicle information system 103, a desktop computing system, or a laptop computing system.
The user device 103 can execute a client application 118, which may provide one or more user interfaces and receive user input via one or more input/output (I/O) devices. The client application 118 may be provided by or be associated with the trusted computing system 102 or the secondary computing system 104. In some implementations, the client application 118 may be a web-based application that is retrieved and displayed in a web-browser executing at the trusted computing system 102 or the secondary computing system 104. In some implementations, the client application 118 can execute locally at the user device 103 and may communicate information with the secondary computing systems 104 or the trusted computing system 102 via the network 101. The client application 118 can access one or more device identifiers using an application programming interface (API) of an operating system of the user device 103. In some implementations, the client application 118 can access a predetermined region of memory where the user device 103 stores one or more device identifiers.
The client application 118 may present one or more user interfaces, for example, in response to user input or interactions with displayed interactive user interface elements. The user interfaces may include user interfaces that capture user information from one or more sensors 126, as described herein. For example, the user interfaces may include text or other instructions that direct the user of the user device 103 to capture one or more images of the user, place their finger on a fingerprint scanner, or provide other types of biometric input. Additionally, the user interfaces can include interactive elements that enable a user to provide various user data 120, send requests, or to navigate between user interfaces of the client application 118. The client application 118 can be used, for example, to generate one or more data packages 124 using the techniques described herein.
The user data 120 that is obtained by the client application 118 can include any type of information relating to the user, including biometric information such as images of the user's face (or parts thereof), fingerprint scans, one or more voice samples, an iris scan (or an image of the user's eye), palm or finger vein patterns, retinal scans, or the like. Additionally, the user data 120 can include one or more documents that include user information, such as a driver's license of the user, a passport of the user, or any other type of identifying document. Non-identifying information that is associated with the user may also be included, such as records of activities (e.g., interactions, websites visited, applications executed or launched, physical or virtual locations, etc.) performed using the user device, records of offline activities (e.g., transaction records, historic records of user location over periods of time, etc.), or other types of information that may be associated with the user. The user data 120 can be stored in one or more data structures, with each portion of the user data 120 being indexed by a corresponding label or tag value, which can be used to access the respective portion of the user data 120. One or more portions of the user data 120 can be encrypted (or decrypted) using a respective digital key 122.
The client application 118 can generate one or more digital keys 122 using the techniques described herein. The digital keys can be, for example, encryption or decryption keys. Some examples include symmetric encryption/decryption keys or asymmetric encryption/decryption keys (e.g., a private key and a public key). The digital keys 122 can be generated to encrypt information communicated by the user device 103 via the client application 118 to improve the security of the information in transit. The digital keys 122 can be used to protect encrypted information such that the encrypted information cannot be accessed without a corresponding decryption key. Key sharing algorithms can be utilized by the user device 103 to share one or more of the digital keys 122 with other computing systems. The key sharing algorithms can include, but are not limited to, the Rivest-Shamir-Adleman (RSA) algorithm, the Diffie-Hellman algorithm, the elliptic-curve Diffie-Hellman (ECDH) algorithm, the ephemeral Diffie-Hellman algorithm, the elliptic-curve ephemeral Diffie-Hellman (ECDHE) algorithm, and the pre-shared key (PSK) algorithm, among others. The digital keys 122 can be generated using any suitable encryption/decryption key generation algorithm.
In an embodiment, respective digital keys 122 can be generated for user-selected portions of the user data 120, or portions of the user data 120 selected by the client application 118. In some implementations, digital keys 122 can be generated for particular portions of the user data 120 and for particular trusted computing systems 102. For example, certain portions of the user data 120 can be encrypted using a first digital key 122 for a first trusted computing system 102, and other portions of the user data 120 may be encrypted using a second digital key 122 for a second trusted computing system 102. This enables the first trusted computing system 102 and the second trusted computing system 102 to access only the respective portions of the user data 120 corresponding to their respective keys, while the rest of the user data 120 remains encrypted and inaccessible.
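A minimal sketch of the portion-specific encryption described above might use one symmetric key per portion (Fernet keys standing in for the digital keys 122), so that each trusted computing system 102 can decrypt only the portion whose key it receives; the field names and values are hypothetical.

```python
from cryptography.fernet import Fernet

user_data = {
    "name": b"Jane Example",        # portion shared with the first trusted system
    "address": b"123 Example Way",  # portion shared with the second trusted system
}

# One key per portion/recipient; each trusted system receives only its own key.
keys = {label: Fernet.generate_key() for label in user_data}
encrypted_portions = {
    label: Fernet(keys[label]).encrypt(value) for label, value in user_data.items()
}

# A recipient holding only the "name" key cannot decrypt the "address" portion.
assert Fernet(keys["name"]).decrypt(encrypted_portions["name"]) == b"Jane Example"
```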
The client application 118 can generate one or more data packages 124 using the techniques described herein. The data packages 124 can include any of the user data 120 that is encrypted using one or more corresponding digital keys 122. Corresponding decryption keys generated as part of the digital keys 122 can be stored for each data package in association with the data package 124, and shared with a corresponding trusted computing system 102 using an appropriate key sharing algorithm, as described herein. The key sharing algorithms can be performed prior to sharing the data package 124 with one or more trusted computing systems 102. The data packages 124 can serve as containers or secure enclaves for information and may be generated by the client application based on a security token provided by the trusted computing system 102. As described in further detail herein, the trusted computing system 102 can generate a security token upon verifying that the encrypted user data 120 is authentic. The security token can be provided to the client application 118, which can then generate a data package that includes the security token and the encrypted user data 120. In an embodiment, the client application generates the data package by encrypting the user data 120 based on the security token. The data package can be generated such that any change to the data package will cause verification of the data package using the security token to fail. Additional digital keys 122 may also be generated to further encrypt the data packages 124, using the techniques described herein.
The user device 103 can include one or more sensors 126. The sensors 126 can include one or more biometric sensors or ambient sensors, or any other type of sensor capable of capturing information about a user or an environment in which the user is present. The sensors 126 can include components that capture ambient sights and sounds (such as cameras and microphones), and that allow the user to provide inputs (e.g., a touchscreen, stylus, force sensor for sensing pressure on a display screen, and biometric components such as a fingerprint reader, a heart monitor that detects cardiovascular signals, an iris scanner, and so forth). The sensors 126 may include one or more location sensors to enable the user device 103 to determine its location relative to, for example, other physical objects or relative to geographic locations. Example location sensors include global positioning system (GPS) devices and other navigation and geolocation devices, digital compasses, gyroscopes and other orientation sensors, as well as proximity sensors or other sensors that allow the user device 103 to detect the presence and relative distance of nearby objects and devices.
The trusted computing system 102 can include a database 106, which may store user profiles 108. The user profiles 108 may each be associated with a corresponding user and may include corresponding trusted user data 110. The trusted user data 110 can be any data that is confirmed to be truthful or correct as relating to the user to which the respective user profile 108 corresponds. The trusted user data 110 can include any information about the user. For example, the trusted user data 110 may include, but is not limited to, personally identifying data (e.g., name and social security number), psychographics data (e.g., personality, values, opinions, attitudes, interests, and lifestyles), transactional data (e.g., preferred products, purchase history, transaction history), demographic data (e.g., address, age, education), financial data (e.g., income, assets, credit score), biometric information (e.g., images of the user's face, fingerprint scans, one or more voice samples, an iris scan (or an image of the user's eye), palm or finger vein patterns, retinal scans, etc.), or other user or account data that is maintained or otherwise accessible to the trusted computing system 102. The trusted computing system 102 can receive the information for the trusted user data 110 from trusted sources, such as in-person meetings with the user, government agencies, or other sources of truth.
The trusted computing system 102 can utilize the trusted user data 110 in a user profile 108 to verify the authenticity of one or more portions of the encrypted user data 120 received from the user device 103. Verifying the information may include performing a decryption technique using a corresponding digital key 122 shared with the trusted computing system 102 by the user device 103 using an appropriate key sharing algorithm. After decrypting the encrypted user data 120 (or portions of the user data 120 to which the decryption key(s) correspond), the trusted computing system 102 can compare the information in the decrypted data to the trusted user data 110. The trusted computing system 102 can identify which portions of the decrypted user data 120 match (or reasonably correspond to) corresponding portions of the trusted user data 110. These matching portions of the user data 120 can be used to generate a security token. The security token can be any type of value that can be used to verify the integrity of the encrypted user data and may be utilized with a generated data package 124 to prevent data tampering. The user devices 103 can communicate with the trusted computing system 102 to carry out the techniques described herein.
In an embodiment, the client application 118 of the user device 103 can communicate with the trusted computing system 102 via the secure API 112. The trusted computing system 102 can maintain and provide access to the secure API 112 to various authorized computing systems, such as the user device 103, via the network 101. The secure API 112 can be an API, such as a web-based API, corresponding to a particular network address, such as a uniform resource identifier (URI) or a uniform resource locator (URL), among others. The secure API 112 can be a client-based API, a server API (SAPI), or an Internet Server API (ISAPI). Various protocols may be utilized to access the secure API 112, including a representational state transfer (REST) API, a simple object access protocol (SOAP) API, a Common Gateway Interface (CGI) API, or extensions thereof. The secure API 112 may be implemented in part using a network transfer protocol, such as the hypertext transfer protocol (HTTP), the secure hypertext transfer protocol (HTTPS), the file transfer protocol (FTP), or the secure file transfer protocol (FTPS), each of which may be associated with a respective URI or URL. The secure API 112 can be secured utilizing one or more encryption techniques, such that the secure API prevents data tampering or data leakage.
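For illustration only, a REST-style call to a secure API over HTTPS might resemble the following sketch using `requests`; the endpoint URL, payload fields, and response shape are hypothetical and are not defined by the disclosure.

```python
import requests

# Hypothetical endpoint and payload; the actual secure API 112 is not specified here.
SECURE_API_URL = "https://trusted.example.com/api/v1/verify"

response = requests.post(
    SECURE_API_URL,
    json={
        "device_identifier": "example-device-id",        # hypothetical identifier
        "encrypted_user_data": "deadbeef",               # hex-encoded ciphertext (illustrative)
    },
    timeout=10,  # HTTPS provides transport encryption; the payload stays encrypted end to end
)
response.raise_for_status()
security_token = response.json().get("security_token")  # hypothetical response field
```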
The secondary computing system 104 can include at least one processor and a memory (e.g., a processing circuit). The memory can store processor-executable instructions that, when executed by the processor, cause the processor to perform one or more of the operations described herein. The processor may include a microprocessor, an ASIC, an FPGA, etc., or combinations thereof. The memory may include, but is not limited to, electronic, optical, magnetic, or any other storage or transmission device capable of providing the processor with program instructions. The memory may further include a floppy disk, CD-ROM, DVD, magnetic disk, memory chip, ASIC, FPGA, ROM, RAM, EEPROM, EPROM, flash memory, optical media, or any other suitable memory from which the processor can read instructions. The instructions may include code from any suitable computer programming language. The secondary computing system 104 can include one or more computing devices or servers that can perform various functions as described herein. The secondary computing system 104 can include any or all of the components and perform any or all of the functions of the computer system 400 described herein in conjunction with FIG. 4.
The secondary computing system 104 can execute one or more secondary applications 116 based on the data packages 124 received from the user device 103. The data packages 124 can include a security token generated by the trusted computing system 102 that is invalidated in the event of any tampering of the data package 124. Therefore, if a verification for the data package 124 performed by the secondary computing system 104 is satisfied, the secondary computing system 104 can rely on the information in the data package 124 being both valid and secure. Therefore, using the valid and secure data in the data package, the secondary computing system 104 can execute one or more secondary applications 116 that correspond to any type of functionality that might utilize the user data 120. For example, the user data 120 in the data package can be used to provide recommendations for retirement products, personal loans, home equity loans, or other financial products. The secondary applications 116 can select loans to recommend to the user that have monthly, semimonthly, or periodic payments that fall within the user's available periodic cash flow.
To verify the data package 124, the secondary computing system 104 can utilize a second digital key 122 generated based on the device identifier and biometric information obtained via the user device 103 to decrypt the data package 124. Prior to decryption, the security token (which may be provided to the secondary computing system 104 with the data package) can be first verified with the trusted computing system 102 as corresponding to the device identifier from which the data package 124 was obtained. Upon verifying the security token, the integrity of the data package 124 can be verified using the security token to confirm that the data package 124 has not been modified. Upon determining that the data package 124 has not been modified, the data package 124 can be decrypted using a digital key 122 obtained from the user device 103 using a suitable key sharing algorithm. The secondary applications can then utilize the data extracted from the decrypted data package 124 to perform further operations, with the assurance that the information extracted from the data package 124 is authentic.
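The verify-then-decrypt ordering described above can be sketched as follows, under the simplifying assumptions that the security token is an HMAC over the encrypted data package and that the round trip to the trusted computing system 102 for token confirmation is handled elsewhere; the helper name, key handling, and cipher choice are illustrative.

```python
import hmac
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def process_received_package(encrypted_package: bytes, nonce: bytes,
                             security_token: str, token_key: bytes,
                             decryption_key: bytes) -> bytes:
    """Hypothetical secondary-system flow: check integrity with the token,
    then decrypt the data package with the key shared by the user device."""
    # 1. Verify the package has not been modified (token assumed to be an HMAC
    #    over the encrypted package bytes).
    expected = hmac.new(token_key, encrypted_package, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, security_token):
        raise ValueError("data package failed verification; possible tampering")

    # 2. Only after verification, decrypt the package contents.
    return AESGCM(decryption_key).decrypt(nonce, encrypted_package, None)
```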
The secondary computing system 104 may also maintain an identity construct for the user, and can update the identity construct based on the information extracted from the received data package 124. Referring to FIG. 2 in the context of the components of FIG. 1 , illustrated is an example categorization 200 of identity elements that may be present in the identity construct of the user. Information extracted from the data packages 124 received from a user device can be stored in a data structure that may be indexed by or associated with one or more categories. The data points of activities or other user data 120 can be sorted by the secondary computing system 104 into categories, cumulatively constituting the basis for a fundamental digital identity.
As non-exhaustive examples: “geolocation” may include, for example, elements related to where a user has been; “personal data” may include, for example, name and birthdate; “health history” may include, for example, information that might be found in health records; “romance/marriage” may include, for example, information on significant others and spouses; “work history” may include, for example, information on places and dates of employments and titles held; “charity/volunteer” may include information on, for example, charitable contributions or volunteering activities; “online posts/pics” may include, for example, textual posts and pictures/videos/other media submitted to social networking accounts; “hobbies” may include, for example, leisure or other non-employment related activities; “education” may include, for example, information on schools attended and degrees earned; “faith/religion” may include, for example, information on churches attended or religious activities; “travel” may include, for example, information on places visited; “transactions” may include, for example, information on purchases; “legal history” may include, for example, information on legal proceedings; “financial” may include, for example, information on financial accounts; “art/music” may include, for example, information on attendance at concerts and types of art and music purchased or otherwise enjoyed by a user; “state/government” may include, for example, information on licenses; “news/reports” may include, for example, information in broadcasts, publications, or reports that mention a user; and “family/friends” may include, for example, information on children, siblings, and persons with whom the user spends time or otherwise associates.
Referring to FIG. 3 , illustrated is a flow diagram of an example method 300 for secure data processing using data packages (e.g., the data packages 124) generated by edge devices (e.g., the user device 103), in accordance with one or more example implementations. The method 300 can be a computer-implemented method. The method 300 may be implemented, for example, using any of the computing systems described herein, including the user device 103, the secondary computing system 104, the trusted computing system 102, or the computing system 400 described in connection with FIG. 4 . In some implementations, additional, fewer, or different operations may be performed. It will be appreciated that the order or flow of operations indicated by the flow diagrams and arrows with respect to the methods described herein is not meant to be limiting. For example, in one implementation, two or more of the operations of method 300 may be performed simultaneously, or one or more operations may be performed as an alternative to another operation.
At step 305, the method 300 can include scanning a physical feature of a user of a computing device (e.g., the user device 103) using a sensor (e.g., a sensor 126) of the computing device. The physical feature can be scanned, for example, using a camera or another type of biometric scanning device of the computing device. For example, the computing device may include components that capture ambient sights and sounds (such as cameras and microphones), and that allow the user to provide inputs (e.g., a touchscreen, stylus, force sensor for sensing pressure on a display screen, biometric components such as a fingerprint reader, a heart monitor that detects cardiovascular signals, an iris scanner, and so forth). In some implementations, the image or video capture devices of the user device 103 that capture video or images can include devices that capture non-visible light, such as infrared (IR) or ultraviolet (UV) light. User interfaces on an application (e.g., the client application 118) executing on the computing device can prompt the user to provide biometric inputs for generating encrypted user data (e.g., the user data 120). The physical feature scanned using the sensor can be, but is not necessarily limited to, a picture of the user's face, a fingerprint of the user, a heart rate or heart rate pattern of the user, an iris scan of the user, a retinal scan of the user, or the like.
In an embodiment, the physical feature is a voice of the user (e.g., a voice print). For example, the computing device can include one or more microphones that can capture a voice of the user. The user interfaces on the computing device can prompt the user to speak predetermined phrases or predetermined or desired portions of the user data (e.g., name, address, date of birth, etc.). The voice of the user may be applied to a natural language processing (NLP) model (e.g., which may be trained using machine-learning techniques by the secondary computing device 104). The NLP model may be executed by the computing device to extract one or more words or phrases spoken by the user.
Additionally, scanning the physical feature of the user can include applying one or more filters to determine authenticity or validity of the physical feature. The filters may be filters that are applied to determine whether the biometric data is in fact provided by the user, and not “spoofed” by a malicious actor attempting to fraudulently impersonate the user. For example, a malicious actor may spoof the output of a camera to the application to provide a pre-obtained or pre-existing photo of the user that the malicious actor is attempting to impersonate. To circumvent these fraudulent activities, the computing device can gather additional information, such as IR images or UV images from additional sensors on the computing device, and cross-reference the images obtained via the camera (which may be spoofed) with the UV data or IR data captured from additional sensors. Additionally or alternatively, the computing device can execute one or more filters over the captured data to identify one or more anomalies. For example, the computing device can apply one or more IR filters, UV filters, or other non-visible light frequency filters to analyze the integrity of the image of the user's face.
The aforementioned verification techniques can be utilized to determine a score that indicates a likelihood that an appearance of the user is forged. For example, by cross-referencing UV or IR images captured at about the same time that the user's face is captured by a visible-light camera of the computing device, the computing device can detect the presence of one or more anomalies in the visible-light image. The size and number of the detected anomalies can influence the score. For example, larger anomalies or a larger number of anomalies can indicate a larger score (and therefore a higher likelihood that the image is fraudulent). Voice data or other types of biometric data can also be applied to similar filters or anomaly detection models that are trained using machine-learning techniques to detect potentially fraudulent biometric data. The anomaly detection model can be executed using the biometric data as input and can generate a score indicating the likelihood that the biometric data has been forged or is fraudulent. The anomaly detection model can be trained using supervised learning, unsupervised learning, semi-supervised learning, or other machine-learning techniques to calculate the score. Some examples of machine learning models can include neural networks (e.g., a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN) such as a long-short term memory (LSTM) model, combinations thereof, etc.), regression models (e.g., linear regression, sparse vector machine (SVM), logistic regression, polynomial regression, ridge regression, Lasso regression, Bayesian linear regression, etc.), or other types of classifiers (e.g., naïve Bayes, decision trees, k-nearest neighbor (KNN), extreme gradient boosting (XGBoost) models, etc.). The aforementioned machine-learning models may also be utilized for any type of machine-learning or artificial intelligence (AI) performed task described herein. In some implementations, multiple machine-learning models may be executed in a machine-learning pipeline to perform various operations described herein.
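As a toy illustration of the cross-referencing idea (not the disclosed anomaly detection model), a forgery score could be computed as the fraction of pixels where a visible-light capture and an infrared capture disagree beyond a threshold; the threshold, the scoring rule, and the random stand-in frames are assumptions.

```python
import numpy as np

def forgery_score(visible_gray: np.ndarray, infrared_gray: np.ndarray,
                  diff_threshold: float = 0.35) -> float:
    """Return a 0..1 score; higher means more (or larger) anomalies and a
    higher likelihood that the visible-light capture is forged."""
    visible = visible_gray.astype(np.float64) / 255.0
    infrared = infrared_gray.astype(np.float64) / 255.0
    anomalies = np.abs(visible - infrared) > diff_threshold
    return float(anomalies.mean())

# Example: two random 64x64 "captures" standing in for real sensor frames.
rng = np.random.default_rng(0)
visible = rng.integers(0, 256, (64, 64), dtype=np.uint8)
infrared = rng.integers(0, 256, (64, 64), dtype=np.uint8)
print(f"forgery score: {forgery_score(visible, infrared):.2f}")
```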
At step 310, the method 300 can include generating a biometric signature corresponding to the physical feature. The biometric signature may be a signature, or a reduced version, of the scan performed in step 305. The biometric signature may be generated by performing one or more feature extraction techniques, for example, to reduce the size of the biometric data provided by the user. Reducing the size of the data provided by the user allows for increased efficiency when utilizing the biometric signature to perform further processing steps, such as data encryption. The biometric signature can further be generated to conform to a predetermined data size or format that is compatible with the encryption techniques described herein. For example, the physical feature detected using the biometric techniques described herein can be a facial feature. An example facial feature may relate to aspects (e.g., shape or outline) of a user's eyes, nose, lips, etc. The facial feature may be, or may be based on, a facial image. The biometric signature can be a reduced dataset that preserves the unique or identifying features of the user's face. As such, the biometric signature may be generated using one or more feature extraction techniques, such as edge detection, bounding-box detection, or detection of particular features on the user's face (e.g., position and shape of eyes, nose, mouth, ears, eyebrows, etc.), and their relative positions or distances from one another. This information can be stored as the biometric signature for the user, which may be utilized in subsequent processing steps.
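A simplified sketch of reducing extracted facial features to a fixed-size biometric signature appears below; the landmark set, normalization, quantization, and final hashing step are illustrative assumptions (production systems typically use fuzzy extractors or biometric templates rather than a bare hash).

```python
import hashlib
import numpy as np

def biometric_signature(landmarks: np.ndarray, decimals: int = 2) -> bytes:
    """Reduce (x, y) landmark coordinates to a fixed-size signature by
    normalizing pairwise distances and hashing the quantized result."""
    center = landmarks.mean(axis=0)
    scale = np.linalg.norm(landmarks - center, axis=1).max()
    normalized = (landmarks - center) / scale          # translation/scale invariant
    distances = np.linalg.norm(
        normalized[:, None, :] - normalized[None, :, :], axis=-1
    )
    quantized = np.round(distances, decimals)          # tolerate small scan noise
    return hashlib.sha256(quantized.tobytes()).digest()

# Example: five hypothetical landmarks (eyes, nose tip, mouth corners).
landmarks = np.array([[120, 80], [180, 82], [150, 120], [130, 160], [170, 158]], float)
signature = biometric_signature(landmarks)
```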
At step 315, the method 300 can include generating user data (e.g., the user data 120) based on inputs detected via one or more input devices of the computing device. The inputs can include information about the user. For example, the application executing on the computing device can prompt the user to provide information relating to the user. In an embodiment, the information may be provided verbally and transcribed by the computing device by executing an NLP model or another type of trained speech-to-text processing model over a voice input recorded using a microphone or another type of audio input.
The user inputs can include an image captured using an image sensor of the computing device. For example, the application may prompt the user to capture one or more images of various documents (e.g., driver's license, medical documents, utility bills, etc.) that include identifying information about the user. The images can be stored in the memory of the computing device and can be utilized in an image processing function or algorithm that can extract pertinent information relating to the user from the image. For example, the computing device can execute a trained artificial intelligence model to identify regions of an image that are likely to correspond to pertinent details (e.g., blocks of text, etc.). Natural language processing operations (e.g., executing additional machine-learning models or other types of image-to-text algorithms like optical character recognition) can be utilized to extract the information about the user. In some implementations, optical character recognition can be used to extract sections of text from the image(s), and then regular expression (regex) rules can be applied to the sections of text to identify and extract the user data. Geolocation data may also be detected using a location sensor of the computing device. The location sensor can generate a geolocation of the computing device corresponding to where the inputs are detected by the computing device. The sensors 126 may include one or more location sensors to enable the user device 103 to determine its location relative to, for example, other physical objects or relative to geographic locations. Example location sensors include global positioning system (GPS) devices and other navigation and geolocation devices, digital compasses, gyroscopes and other orientation sensors, as well as proximity sensors or other sensors that allow the user device 103 to detect the presence and relative distance of nearby objects and devices. In an embodiment, the user data is generated to comprise the geolocation, which may be stored in association with the data provided via user input.
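The OCR-plus-regex extraction can be sketched as follows. The field names, patterns, and sample text are illustrative assumptions; the OCR step itself (producing `ocr_text`) is assumed to have been performed by a separate image-to-text component.

```python
# Minimal sketch: apply regex rules to OCR output to extract user data fields.
import re

OCR_FIELD_PATTERNS = {
    "name": re.compile(r"Name[:\s]+([A-Z][A-Za-z'\-]+(?:\s[A-Z][A-Za-z'\-]+)+)"),
    "date_of_birth": re.compile(r"DOB[:\s]+(\d{2}/\d{2}/\d{4})"),
    "license_number": re.compile(r"(?:DL|Lic)[#:\s]+([A-Z0-9]{6,12})"),
}

def extract_user_data(ocr_text: str) -> dict[str, str]:
    """Return whichever fields the regex rules can locate in the OCR'd text."""
    fields = {}
    for field, pattern in OCR_FIELD_PATTERNS.items():
        match = pattern.search(ocr_text)
        if match:
            fields[field] = match.group(1)
    return fields

user_data = extract_user_data("Name: Jane Doe\nDOB: 01/02/1990\nDL# D1234567")
```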
Additionally, the computing device may generate a hash value of a portion of or all of the user data. The hash value may be utilized as a measure to detect whether some or all of the user data has been changed after its original generation. The hash can be a safeguard against potential data tampering and may be utilized as an initial verification process when attempting to process the user data. For example, prior to processing or utilizing the user data in further downstream processing operations, the hash value of the user data can be recalculated and compared with the hash value generated when the user data itself was generated. If there are differences between the hash values, the user data may have been changed by another party (e.g., a hacker or another entity) prior to the attempted utilization of the user data.
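One way to realize this check is to hash a canonical serialization of the user data at generation time and recompute it before any later use, as in the sketch below; canonical JSON is an illustrative choice of encoding.

```python
# Minimal sketch: fingerprint the user data with a hash so later tampering can be detected.
import hashlib
import json

def hash_user_data(user_data: dict) -> str:
    canonical = json.dumps(user_data, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

user_data = {"name": "Jane Doe", "date_of_birth": "01/02/1990"}
original_hash = hash_user_data(user_data)

# ...later, before any downstream processing...
if hash_user_data(user_data) != original_hash:
    raise ValueError("user data changed after generation; possible tampering")
```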
At step 320, the method 300 can include retrieving a device identifier corresponding to the computing device. The identifier may be a UUID, a GUID, a MAC address, an IP address, a device serial number, a serial number of a component of the computing device, a predetermined or randomly generated value associated with the computing device, or the like. The computing device can retrieve the one or more device identifiers using an API of an operating system of the computing device. In some implementations, a predetermined region of memory of the computing device that stores one or more device identifiers can be accessed to retrieve the device identifier. The device identifier can then be used in subsequent processing steps to encrypt the user data.
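A simple illustration of obtaining such an identifier appears below; it uses Python's standard-library `uuid.getnode()`, which returns a hardware (MAC-derived) value on most platforms. A production implementation would more likely query the operating system's dedicated device-identifier API, which is platform-specific and not shown here.

```python
# Minimal sketch: derive a device identifier from the host's MAC-derived node value.
import uuid

def get_device_identifier() -> str:
    return f"{uuid.getnode():012x}"  # 48-bit node value formatted as 12 hex characters

device_id = get_device_identifier()
```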
At step 325, the method 300 can include encrypting the user data to generate encrypted user data. The encrypted user data can be generated, for example, by utilizing the biometric signature and the device identifier to generate a private key. The private key may be a hash value of the biometric signature concatenated with the device identifier, or may be individual hash values of the biometric signature and the device identifier added together to form a single key value. In addition, salt values or additional data padding may be added to the biometric signature and the device identifier to generate the private key (e.g., one of the digital keys 122). The private key can then be utilized in a corresponding encryption and key generation algorithm to encrypt the user data. To encrypt the user data, the computing device can execute a suitable encryption algorithm, such as an asymmetric encryption algorithm or a symmetric encryption algorithm, to generate the encrypted user data.
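A symmetric variant of this derivation and encryption is sketched below, assuming the third-party `cryptography` package. The concatenate-and-hash key derivation mirrors the hash-based approach described above; the salt value and placeholder inputs are illustrative.

```python
# Minimal sketch: derive a symmetric key from the biometric signature and device
# identifier, then encrypt the user data with it.
import base64
import hashlib
import json
from cryptography.fernet import Fernet

def derive_key(biometric_signature: bytes, device_identifier: str, salt: bytes = b"") -> bytes:
    material = biometric_signature + device_identifier.encode("utf-8") + salt
    digest = hashlib.sha256(material).digest()
    return base64.urlsafe_b64encode(digest)  # 32-byte digest encoded as a valid Fernet key

def encrypt_user_data(user_data: dict, key: bytes) -> bytes:
    plaintext = json.dumps(user_data, sort_keys=True).encode("utf-8")
    return Fernet(key).encrypt(plaintext)

key = derive_key(b"\x01" * 32, "a1b2c3d4e5f6", salt=b"example-salt")  # placeholder inputs
encrypted_user_data = encrypt_user_data({"name": "Jane Doe"}, key)
```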
At step 330, the method 300 can include generating a first digital key (e.g., a digital key 122) granting access to the encrypted user data and transmitting the digital key to a first computing system. The first digital key can be utilized to decrypt the encrypted user data. For example, the first digital key can be an asymmetric decryption key (e.g., a public key) that is generated as part of the encryption algorithm executed at step 325. Alternatively, the first digital key may be generated as a symmetric decryption key as part of the encryption process executed at step 325. In an embodiment, respective digital keys can be generated for user-selected or predetermined portions of the encrypted user data. In some implementations, digital keys can be generated for particular portions of the user data and for particular trusted computing systems (e.g., particular trusted computing systems 102).
For example, certain portions of the user data can be encrypted using a first set of encryption keys (e.g., each of which may be derived from a combination of the device identifier and the biometric signature) for a first trusted computing system, and other portions of the user data may be encrypted by a second set of encryption keys (e.g., also derived from the device identifier and the biometric signature) that correspond to a second trusted computing system. Corresponding decryption keys can be generated for each encryption key. This enables the first trusted computing system and the second trusted computing system to access only the respective portions of the user data corresponding to their respective decryption key, while the rest of the user data remains encrypted and inaccessible.
The first digital key(s) (e.g., the decryption keys for the encrypted user data) can be stored in association with the encrypted user data, and can be transmitted to the trusted computing system(s) using a suitable key sharing algorithm. Key sharing algorithms can be any algorithm that may be utilized by the computing device to share one or more of the digital keys with other computing systems. The key sharing algorithms can include, but are not limited to, the RSA algorithm, the Diffie-Hellman algorithm, the ECDH algorithm, the ephemeral Diffie-Hellman algorithm, the ECDHE algorithm, and the PSK algorithm, among others.
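As one example of such an exchange, the sketch below performs an elliptic-curve Diffie-Hellman (ECDH) agreement with an HKDF step, using the third-party `cryptography` package. Both key pairs are generated locally only for illustration; in practice the peer's public key would arrive over the network.

```python
# Minimal sketch: ECDH key agreement plus HKDF to derive a key-wrapping key.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral EC key pair and exchanges public keys.
device_private = ec.generate_private_key(ec.SECP256R1())
trusted_private = ec.generate_private_key(ec.SECP256R1())

# Each side computes the same shared secret from its private key and the peer's public key.
shared_secret = device_private.exchange(ec.ECDH(), trusted_private.public_key())

# Derive a wrapping key from the shared secret; it can then protect the first digital key in transit.
wrapping_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"digital-key-wrapping",
).derive(shared_secret)
```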
The computing system can receive the generated security token from a first computing system (e.g., the trusted computing system 102), which can correspond to the user data. The security token can indicate that at least a subset of the user data is valid. As described herein, the trusted computing system can utilize the trusted user data maintained in a database to verify the authenticity of one or more portions of the encrypted user data received from the computing device. Verifying the information may include performing a decryption technique using a corresponding digital key shared using an appropriate key sharing algorithm. After decrypting the encrypted user data (or portions of the user data to which the decryption key(s) correspond), the trusted computing system can compare the information in the decrypted data to the trusted user data corresponding to the requesting user.
The trusted computing system can identify which portions of the decrypted user data match (or nearly match, e.g., match within a tolerance threshold) the corresponding portions of the trusted user data. These matching portions of the user data can be used to generate a security token. In an embodiment, the security token may be generated without using the matching portions of the user data. The security token can be any type of value that can be used to verify the integrity of the encrypted user data, and may be utilized with the data package 124 to prevent data tampering. To generate the security token, the trusted computing system can generate a hash value, a random value, or another type of token value using a token generation algorithm. A copy of the security token can be stored in association with the one or more portions of the user data to which the security token corresponds (e.g., for further verification purposes in response to a request from a secondary computing device). Upon verifying the encrypted user data, the trusted computing system can transmit the security token to the computing device. The computing device can then utilize the security token to generate a data package.
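One concrete way the trusted computing system could issue such a token is with a keyed hash (HMAC) over the verified encrypted data, as sketched below; the secret name and scheme are illustrative assumptions rather than the specific token generation algorithm of this disclosure.

```python
# Minimal sketch: the trusted system issues a token bound to the verified encrypted data.
import hashlib
import hmac

TRUSTED_SYSTEM_SECRET = b"held-only-by-the-trusted-computing-system"  # illustrative secret

def issue_security_token(encrypted_user_data: bytes) -> bytes:
    return hmac.new(TRUSTED_SYSTEM_SECRET, encrypted_user_data, hashlib.sha256).digest()

security_token = issue_security_token(b"...encrypted user data bytes...")
```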
At step 335, the method 300 can include storing, at the computing device, the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device. The data package can be stored or generated such that attempts to alter the data package invalidate the security token. For example, the security token can itself be a hash value of the encrypted user data, which in some implementations may also incorporate a predetermined salt value or other deterministic information. The data package can include the encrypted user data and the security token, such that any tampering with the data package would cause a verification process of the encrypted user data using the security token to fail. For example, the security token may include a hash (or a partial hash) of some or all of the encrypted user data. If the encrypted user data is changed, its recomputed hash will no longer match the hash included in the security token, indicating potential data tampering. The data packages can serve as containers or secure enclaves for information, and may be generated by the client application based on a security token provided by the trusted computing system. The data package can be generated such that any change to the data package will cause verification of the data package using the security token to fail.
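The tamper-evidence property can be illustrated with the simple hash-based variant mentioned above, in which the token is a hash of the encrypted user data; any modification of the package causes the recomputed hash to diverge from the stored token.

```python
# Minimal sketch: assemble the data package and verify it against a hash-based token.
import hashlib
import hmac

def build_data_package(encrypted_user_data: bytes, security_token: bytes) -> dict:
    return {"encrypted_user_data": encrypted_user_data, "security_token": security_token}

def package_is_intact(package: dict) -> bool:
    recomputed = hashlib.sha256(package["encrypted_user_data"]).digest()
    return hmac.compare_digest(recomputed, package["security_token"])

encrypted = b"...encrypted user data bytes..."
package = build_data_package(encrypted, hashlib.sha256(encrypted).digest())
assert package_is_intact(package)          # untouched package verifies
package["encrypted_user_data"] += b"x"     # simulate tampering
assert not package_is_intact(package)      # verification now fails
```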
At step 340, the method 300 can include generating a second digital key corresponding to the data package. The second digital key can be generated when encrypting the data package to transmit the data package to a secondary computing device (e.g., the secondary computing device 104). Encrypting the data package can generate an encrypted data package. The encrypted data package can be generated using similar processes to those used to generate the encrypted user data. For example, the biometric signature and the device identifier can be utilized to generate a private encryption key for the data package. The private key may be a hash value of the biometric signature concatenated with the device identifier, or may be individual hash values of the biometric signature and the device identifier added together to form a single key value. In addition, salt values or additional data padding may be added to the biometric signature and the device identifier to generate the private encryption key. The private encryption key can then be utilized in a corresponding encryption and key generation algorithm to encrypt the data package.
As part of the encryption process, the computing device can generate a corresponding decryption key (e.g., which may be a public decryption key) that may be used to decrypt the encrypted data package. For example, the second digital key can be an asymmetric decryption key (e.g., a public key) that is generated as part of the encryption algorithm used to encrypt the data package. Alternatively, the second digital key may be generated as a symmetric decryption key as part of the encryption process. The decryption key can be stored in association with the data package at the computing device.
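The symmetric-key variant can be sketched as follows: a fresh key is generated, the data package is encrypted under it, and that key becomes the second digital key shared with the secondary computing system. The `cryptography` package and the JSON serialization of the package are illustrative assumptions.

```python
# Minimal sketch: generate a package key, encrypt the data package, keep the key for sharing.
import json
from cryptography.fernet import Fernet

second_digital_key = Fernet.generate_key()

def encrypt_data_package(package: dict, key: bytes) -> bytes:
    # Serialize bytes fields as hex so the package is JSON-encodable.
    package_bytes = json.dumps(
        {k: v.hex() if isinstance(v, bytes) else v for k, v in package.items()}
    ).encode("utf-8")
    return Fernet(key).encrypt(package_bytes)

encrypted_package = encrypt_data_package(
    {"encrypted_user_data": b"...", "security_token": b"..."}, second_digital_key
)
```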
At step 345, the method 300 can include transmitting the data package and the second digital key to a second computing system (e.g., the secondary computing system) for use in executing one or more secondary applications (e.g., the secondary applications 116). The data package can be transmitted, for example, in response to detecting a user input at the application executing at the computing device. The user input can indicate that information relating to the user (e.g., the user data) is to be provided to the second computing system. Alternatively, the data package can be transmitted to the second computing system in response to a request from the second computing system. The data package can be transmitted to the second computing system via a network. The second digital key can be transmitted to the second computing system using a suitable key sharing algorithm. Key sharing algorithms can be any algorithm that may be utilized by the computing device to share one or more of the digital keys with other computing systems. The key sharing algorithms can include, but are not limited to, the RSA algorithm, the Diffie-Hellman algorithm, the ECDH algorithm, the ephemeral Diffie-Hellman algorithm, the ECDHE algorithm, and the PSK algorithm, among others.
The computing system 400 includes a bus 402 or other communication component for communicating information and a processor 404 coupled to the bus 402 for processing information. The computing system 400 also includes main memory 406, such as a RAM or other dynamic storage device, coupled to the bus 402 for storing information and instructions to be executed by the processor 404. Main memory 406 can also be used for storing position information, temporary variables, or other intermediate information during execution of instructions by the processor 404. The computing system 400 may further include a read only memory (ROM) 408 or other static storage device coupled to the bus 402 for storing static information and instructions for the processor 404. A storage device 410, such as a solid-state device, magnetic disk, or optical disk, is coupled to the bus 402 for persistently storing information and instructions.
The computing system 400 may be coupled via the bus 402 to a display 414, such as a liquid crystal display or active matrix display, for displaying information to a user. An input device 412, such as a keyboard including alphanumeric and other keys, may be coupled to the bus 402 for communicating information and command selections to the processor 404. In another implementation, the input device 412 has a touch screen display. The input device 412 can include any type of biometric sensor, or a cursor control such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 404 and for controlling cursor movement on the display 414.
In some implementations, the computing system 400 may include a communications adapter 416, such as a networking adapter. Communications adapter 416 may be coupled to bus 402 and may be configured to enable communications with a computing or communications network 101 and/or other computing systems. In various illustrative implementations, any type of networking configuration may be achieved using communications adapter 416, such as wired (e.g., via Ethernet), wireless (e.g., via Wi-Fi, Bluetooth), satellite (e.g., via GPS), pre-configured, ad-hoc, LAN, WAN, and the like.
According to various implementations, the processes of the illustrative implementations that are described herein can be achieved by the computing system 400 in response to the processor 404 executing an implementation of instructions contained in main memory 406. Such instructions can be read into main memory 406 from another computer-readable medium, such as the storage device 410. Execution of the implementation of instructions contained in main memory 406 causes the computing system 400 to perform the illustrative processes described herein. One or more processors in a multi-processing implementation may also be employed to execute the instructions contained in main memory 406. In alternative implementations, hard-wired circuitry may be used in place of or in combination with software instructions to implement illustrative implementations. Thus, implementations are not limited to any specific combination of hardware circuitry and software.
The implementations described herein have been described with reference to drawings. The drawings illustrate certain details of specific implementations that implement the systems, methods, and programs described herein. However, describing the implementations with drawings should not be construed as imposing on the disclosure any limitations that may be present in the drawings.
It should be understood that no claim element herein is to be construed under the provisions of 35 U.S.C. § 112(f), unless the element is expressly recited using the phrase “means for.”
As used herein, the term “circuit” may include hardware structured to execute the functions described herein. In some implementations, each respective “circuit” may include machine-readable media for configuring the hardware to execute the functions described herein. The circuit may be embodied as one or more circuitry components including, but not limited to, processing circuitry, network interfaces, peripheral devices, input devices, output devices, sensors, etc. In some implementations, a circuit may take the form of one or more analog circuits, electronic circuits (e.g., integrated circuits (IC), discrete circuits, system on a chip (SOC) circuits), telecommunication circuits, hybrid circuits, and any other type of “circuit.” In this regard, the “circuit” may include any type of component for accomplishing or facilitating achievement of the operations described herein. For example, a circuit as described herein may include one or more transistors, logic gates (e.g., NAND, AND, NOR, OR, XOR, NOT, XNOR), resistors, multiplexers, registers, capacitors, inductors, diodes, wiring, and so on.
The “circuit” may also include one or more processors communicatively coupled to one or more memory or memory devices. In this regard, the one or more processors may execute instructions stored in the memory or may execute instructions otherwise accessible to the one or more processors. In some implementations, the one or more processors may be embodied in various ways. The one or more processors may be constructed in a manner sufficient to perform at least the operations described herein. In some implementations, the one or more processors may be shared by multiple circuits (e.g., circuit A and circuit B may comprise or otherwise share the same processor, which, in some example implementations, may execute instructions stored, or otherwise accessed, via different areas of memory). Alternatively or additionally, the one or more processors may be structured to perform or otherwise execute certain operations independent of one or more co-processors.
In other example implementations, two or more processors may be coupled via a bus to enable independent, parallel, pipelined, or multi-threaded instruction execution. Each processor may be implemented as one or more general-purpose processors, ASICs, FPGAs, digital signal processors (DSPs), or other suitable electronic data processing components structured to execute instructions provided by memory. The one or more processors may take the form of a single core processor, multi-core processor (e.g., a dual core processor, triple core processor, and/or quad core processor), microprocessor, etc. In some implementations, the one or more processors may be external to the apparatus, for example the one or more processors may be a remote processor (e.g., a cloud based processor). Alternatively or additionally, the one or more processors may be internal and/or local to the apparatus. In this regard, a given circuit or components thereof may be disposed locally (e.g., as part of a local server, a local computing system) or remotely (e.g., as part of a remote server such as a cloud based server). To that end, a “circuit” as described herein may include components that are distributed across one or more locations.
An exemplary system for implementing the overall system or portions of the implementations might include general purpose computing devices in the form of computers, including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit. Each memory device may include non-transient volatile storage media, non-volatile storage media, non-transitory storage media (e.g., one or more volatile and/or non-volatile memories), etc. In some implementations, the non-volatile media may take the form of ROM, flash memory (e.g., flash memory such as NAND, 3D NAND, NOR, 3D NOR), EEPROM, MRAM, magnetic storage, hard discs, optical discs, etc. In other implementations, the volatile storage media may take the form of RAM, TRAM, ZRAM, etc. Combinations of the above are also included within the scope of machine-readable media. In this regard, machine-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions. Each respective memory device may be operable to maintain or otherwise store information relating to the operations performed by one or more associated circuits, including processor instructions and related data (e.g., database components, object code components, script components), in accordance with the example implementations described herein.
It should also be noted that the term “input devices,” as described herein, may include any type of input device including, but not limited to, a keyboard, a keypad, a mouse, joystick, or other input devices performing a similar function. Comparatively, the term “output device,” as described herein, may include any type of output device including, but not limited to, a computer monitor, printer, facsimile machine, or other output devices performing a similar function.
Any foregoing references to currency or funds are intended to include fiat currencies, non-fiat currencies (e.g., precious metals), and math-based currencies (often referred to as cryptocurrencies). Examples of math-based currencies include Bitcoin, Litecoin, Dogecoin, and the like.
It should be noted that although the diagrams herein may show a specific order and composition of method steps, it is understood that the order of these steps may differ from what is depicted. For example, two or more steps may be performed concurrently or with partial concurrence. Also, some method steps that are performed as discrete steps may be combined, steps being performed as a combined step may be separated into discrete steps, the sequence of certain processes may be reversed or otherwise varied, and the nature or number of discrete processes may be altered or varied. The order or sequence of any element or apparatus may be varied or substituted according to alternative implementations. Accordingly, all such modifications are intended to be included within the scope of the present disclosure as defined in the appended claims. Such variations will depend on the machine-readable media and hardware systems chosen and on designer choice. It is understood that all such variations are within the scope of the disclosure. Likewise, software and web implementations of the present disclosure could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various database searching steps, correlation steps, comparison steps, and decision steps.
The foregoing description of implementations has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from this disclosure. The implementations were chosen and described in order to explain the principles of the disclosure and its practical application to enable one skilled in the art to utilize the various implementations with various modifications as are suited to the particular use contemplated. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and implementation of the implementations without departing from the scope of the present disclosure as expressed in the appended claims.
Claims (20)
1. An edge-computing method, comprising:
scanning, using a sensor of a computing device, a physical feature of a user of the computing device;
generating, based on the scan of the physical feature, a biometric signature corresponding to the physical feature;
generating user data based on inputs detected via one or more input devices of the computing device, the inputs comprising information on the user;
retrieving a device identifier corresponding to the computing device;
encrypting, based on the biometric signature and the device identifier, the user data to generate encrypted user data;
generating a first digital key granting access to the encrypted user data and transmitting the first digital key to a first computing system;
receiving, from the first computing system, a security token corresponding to the user data, the security token indicating that at least a subset of the user data is valid;
storing, at the computing device, the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device, wherein attempts to alter the data package invalidate the security token;
generating a second digital key corresponding to the data package; and
transmitting the data package and the second digital key to a second computing system to provide the user a service.
2. The edge-computing method of claim 1, wherein the data package is transmitted in response to detecting a user input indicating that the information on the user is provided to the second computing system.
3. The edge-computing method of claim 1, wherein the data package is transmitted to the second computing system in response to a request from the second computing system.
4. The edge-computing method of claim 1, wherein encrypting the user data based on the biometric signature and the device identifier comprises using data based on both the biometric signature and the device identifier to encrypt the user data.
5. The edge-computing method of claim 1, wherein the physical feature is a facial feature, and wherein the biometric signature is based on facial biometric data detected using the sensor.
6. The edge-computing method of claim 1, wherein the physical feature is a voice of the user, and wherein the sensor is a microphone of the computing device.
7. The edge-computing method of claim 1, wherein the inputs comprise an image captured using an image sensor of the computing device.
8. The edge-computing method of claim 7, further comprising analyzing the image to determine image integrity, wherein analyzing the image comprises at least one of (i) using a non-visible light filter, or (ii) a neural network.
9. The edge-computing method of claim 1, wherein the user data comprises information extracted from an image captured using an image sensor of the computing device.
10. The edge-computing method of claim 1, wherein scanning the physical feature of the user comprises using a plurality of filters to determine authenticity of the physical feature.
11. The edge-computing method of claim 10, wherein the plurality of filters correspond to a plurality of non-visible light frequencies.
12. The edge-computing method of claim 10, wherein determining authenticity of the physical feature comprises determining a likelihood that an appearance of the user is forged.
13. The edge-computing method of claim 1, wherein generating the user data comprises performing a hashing operation on the information of the user.
14. The edge-computing method of claim 1, further comprising detecting, using a location sensor of the computing device, a geolocation of the computing device corresponding to where the inputs are detected by the computing device.
15. The edge-computing method of claim 14, wherein the user data is generated to comprise the geolocation.
16. The edge-computing method of claim 1, wherein retrieving the device identifier comprises requesting the device identifier from an operating system of the computing device.
17. The edge-computing method of claim 1, wherein the first and second digital keys are first and second public keys.
18. A computing device comprising:
a memory storing instructions; and
one or more processors configured to execute the instructions to:
scan, using a sensor of the computing device, a physical feature of a user of the computing device;
generate, based on the scan of the physical feature, a biometric signature corresponding to the physical feature;
generate user data based on inputs detected via one or more input devices of the computing device, the inputs comprising information on the user;
retrieve a device identifier corresponding to the computing device;
encrypt, based on the biometric signature and the device identifier, the user data to generate encrypted user data;
generate a first digital key granting access to the encrypted user data and transmit the first digital key to a first computing system;
receive, from the first computing system, a security token corresponding to the user data, the security token indicating that at least a subset of the user data is valid;
store the encrypted user data and the security token as part of a data package that is secured and validated in a storage device of the computing device, wherein attempts to alter the data package invalidate the security token;
generate a second digital key corresponding to the data package; and
transmit, based on a user input, the data package and the second digital key to a second computing system to provide the user a service.
19. The computing device of claim 18, wherein the computing device is configured to transmit the data package in response to detecting the user input indicating that the information on the user is provided to the second computing system.
20. The computing device of claim 18, wherein encrypting the user data based on the biometric signature and the device identifier comprises using data based on both the biometric signature and the device identifier to encrypt the user data.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/874,803 US12470552B1 (en) | 2022-07-27 | 2022-07-27 | Secure data processing using data packages generated by edge devices |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/874,803 US12470552B1 (en) | 2022-07-27 | 2022-07-27 | Secure data processing using data packages generated by edge devices |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US12470552B1 true US12470552B1 (en) | 2025-11-11 |
Family
ID=97602777
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/874,803 Active 2044-05-01 US12470552B1 (en) | 2022-07-27 | 2022-07-27 | Secure data processing using data packages generated by edge devices |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US12470552B1 (en) |
Citations (148)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030086341A1 (en) | 2001-07-20 | 2003-05-08 | Gracenote, Inc. | Automatic identification of sound recordings |
| US20060129478A1 (en) | 2004-12-10 | 2006-06-15 | Payday One Xl, Llc | Automated Short Term Loans |
| US7133846B1 (en) | 1995-02-13 | 2006-11-07 | Intertrust Technologies Corp. | Digital certificate support system, methods and techniques for secure electronic commerce transaction and rights management |
| US20070078908A1 (en) | 2005-05-17 | 2007-04-05 | Santu Rohatgi | Method and system for child safety |
| US20080022370A1 (en) | 2006-07-21 | 2008-01-24 | International Business Corporation | System and method for role based access control in a content management system |
| US20080120302A1 (en) | 2006-11-17 | 2008-05-22 | Thompson Timothy J | Resource level role based access control for storage management |
| US20090089205A1 (en) | 2007-09-29 | 2009-04-02 | Anthony Jeremiah Bayne | Automated qualifying of a customer to receive a cash loan at an automated teller machine |
| US20090089107A1 (en) | 2007-09-27 | 2009-04-02 | Robert Lee Angell | Method and apparatus for ranking a customer using dynamically generated external data |
| US7673797B2 (en) | 2006-12-13 | 2010-03-09 | Ncr Corporation | Personalization of self-checkout security |
| WO2011016710A1 (en) | 2009-08-05 | 2011-02-10 | Mimos Berhad | Method for baby-parent identification |
| US8234387B2 (en) | 2003-06-05 | 2012-07-31 | Intertrust Technologies Corp. | Interoperable systems and methods for peer-to-peer service orchestration |
| US20120237908A1 (en) | 2008-04-01 | 2012-09-20 | William Fitzgerald | Systems and methods for monitoring and managing use of mobile electronic devices |
| US8446275B2 (en) | 2011-06-10 | 2013-05-21 | Aliphcom | General health and wellness management method and apparatus for a wellness application using data from a data-capable band |
| CA2478548C (en) | 1999-07-20 | 2014-03-11 | Diebold, Incorporated | Automated banking machine system and development method |
| US8731977B1 (en) | 2013-03-15 | 2014-05-20 | Red Mountain Technologies, LLC | System and method for analyzing and using vehicle historical data |
| US8756153B1 (en) | 1999-08-10 | 2014-06-17 | Gofigure Payments, Llc | System and method for mobile payment at point of sale |
| US20140200885A1 (en) | 2008-02-21 | 2014-07-17 | Snell Limited | Audio visual signature, method of deriving a signature, and method of comparing audio-visual data background |
| US8831972B2 (en) | 2007-04-03 | 2014-09-09 | International Business Machines Corporation | Generating a customer risk assessment using dynamic customer data |
| US8965803B2 (en) | 2005-02-04 | 2015-02-24 | The Invention Science Fund I, Llc | Virtual world reversion rights |
| US20150112732A1 (en) | 2013-10-22 | 2015-04-23 | Esurance Insurance Services, Inc. | Identifying a user as part of a household |
| US9087058B2 (en) | 2011-08-03 | 2015-07-21 | Google Inc. | Method and apparatus for enabling a searchable history of real-world user experiences |
| US9094388B2 (en) | 2013-05-01 | 2015-07-28 | Dmitri Tkachev | Methods and systems for identifying, verifying, and authenticating an identity |
| US20150220999A1 (en) | 2009-01-21 | 2015-08-06 | Truaxis, Inc. | Method and system to dynamically adjust offer spend thresholds and personalize offer criteria specific to individual users |
| US9177257B2 (en) | 2012-08-30 | 2015-11-03 | International Business Machines Corporation | Non-transitory article of manufacture and system for providing a prompt to user for real-time cognitive assistance |
| US20150317728A1 (en) | 2014-05-05 | 2015-11-05 | BeSmartee, Inc. | Mortgage synthesis and automation |
| US20160050557A1 (en) | 2014-08-14 | 2016-02-18 | Samsung Electronics Co., Ltd. | Method and apparatus for profile download of group devices |
| WO2016083987A1 (en) | 2014-11-25 | 2016-06-02 | Ideco Biometric Security Solutions (Proprietary) Limited | Method of and system for obtaining proof of authorisation of a transaction |
| US20160162882A1 (en) | 2014-12-08 | 2016-06-09 | Guy LaMonte McClung, III | Digital money choice and eWallet selection |
| US20160224773A1 (en) | 2012-05-15 | 2016-08-04 | Bphav, Llc | Biometric authentication system |
| US9443298B2 (en) | 2012-03-02 | 2016-09-13 | Authentect, Inc. | Digital fingerprinting object authentication and anti-counterfeiting system |
| US20160335629A1 (en) | 2014-01-20 | 2016-11-17 | Euroclear Sa/Nv | Rights transfer and verification |
| US9519783B2 (en) | 2014-04-25 | 2016-12-13 | Bank Of America Corporation | Evaluating customer security preferences |
| US20170012992A1 (en) | 2014-02-10 | 2017-01-12 | Level 3 Communications, Llc | Authentication system and method |
| US9558397B2 (en) | 2011-08-11 | 2017-01-31 | At&T Intellectual Property I, L.P. | Method and apparatus for automated analysis and identification of a person in image and video content |
| US20170063831A1 (en) | 2015-08-24 | 2017-03-02 | International Business Machines Corporation | Authentication of a user and of access to the user's information |
| US20170063946A1 (en) | 2015-08-31 | 2017-03-02 | Ayla Networks, Inc. | Data streaming service for an internet-of-things platform |
| US20170111351A1 (en) | 2012-09-19 | 2017-04-20 | Secureauth Corporation | Mobile multifactor single-sign-on authentication |
| US20170230351A1 (en) | 2014-08-08 | 2017-08-10 | Identitrade Ab | Method and system for authenticating a user |
| US9734290B2 (en) | 2011-12-16 | 2017-08-15 | Neela SRINIVAS | System and method for evidence based differential analysis and incentives based healthcare policy |
| US20170236037A1 (en) | 2013-04-11 | 2017-08-17 | Digimarc Corporation | Methods for object recognition and related arrangements |
| US9864992B1 (en) | 2001-09-21 | 2018-01-09 | Open Invention Network, Llc | System and method for enrolling in a biometric system |
| US20180068103A1 (en) | 2015-03-20 | 2018-03-08 | Aplcomp Oy | Audiovisual associative authentication method, related system and device |
| US10024684B2 (en) | 2014-12-02 | 2018-07-17 | Operr Technologies, Inc. | Method and system for avoidance of accidents |
| US20180205546A1 (en) | 2016-12-31 | 2018-07-19 | Assetvault Limited | Systems, methods, apparatuses for secure management of legal documents |
| US10044700B2 (en) | 2014-12-23 | 2018-08-07 | Mcafee, Llc | Identity attestation of a minor via a parent |
| US10075445B2 (en) | 2015-04-28 | 2018-09-11 | Xiaomi Inc. | Methods and devices for permission management |
| US10102491B2 (en) | 2014-05-27 | 2018-10-16 | Genesys Telecommunications Laboratories, Inc. | System and method for bridging online customer experience |
| US10110608B2 (en) | 2016-01-07 | 2018-10-23 | Google Llc | Authorizing transaction on a shared device using a personal device |
| US10127378B2 (en) | 2014-10-01 | 2018-11-13 | Kalman Csaba Toth | Systems and methods for registering and acquiring E-credentials using proof-of-existence and digital seals |
| US10142362B2 (en) | 2016-06-02 | 2018-11-27 | Zscaler, Inc. | Cloud based systems and methods for determining security risks of users and groups |
| US10181032B1 (en) | 2017-07-17 | 2019-01-15 | Sift Science, Inc. | System and methods for digital account threat detection |
| WO2019013818A1 (en) | 2017-07-14 | 2019-01-17 | Hitachi Data Systems Corporation | Method, apparatus, and system for controlling user access to a data storage system |
| US10210527B2 (en) | 2015-06-04 | 2019-02-19 | Chronicled, Inc. | Open registry for identity of things including social record feature |
| US10218510B2 (en) * | 2015-06-01 | 2019-02-26 | Branch Banking And Trust Company | Network-based device authentication system |
| US20190095916A1 (en) | 2013-03-18 | 2019-03-28 | Fulcrum Ip Corporation | Systems and methods for a private sector monetary authority |
| US20190098500A1 (en) * | 2008-10-13 | 2019-03-28 | Microsoft Technology Licensing, Llc | Simple protocol for tangible security |
| US20190149539A1 (en) | 2017-11-15 | 2019-05-16 | Citrix Systems, Inc. | Secure Authentication Of A Device Through Attestation By Another Device |
| US20190163889A1 (en) | 2013-08-23 | 2019-05-30 | Morphotrust Usa, Llc | System and Method for Identity Management |
| US10313336B2 (en) | 2010-07-15 | 2019-06-04 | Proxense, Llc | Proximity-based system for object tracking |
| WO2019123291A1 (en) | 2017-12-20 | 2019-06-27 | Wani Nikhilesh Manoj | System and method for user authentication using biometric data |
| US20190205939A1 (en) | 2017-12-31 | 2019-07-04 | OneMarket Network LLC | Using Machine Learned Visitor Intent Propensity to Greet and Guide a Visitor at a Physical Venue |
| US10362027B2 (en) | 2014-12-29 | 2019-07-23 | Paypal, Inc. | Authenticating activities of accounts |
| US10387695B2 (en) | 2013-11-08 | 2019-08-20 | Vattaca, LLC | Authenticating and managing item ownership and authenticity |
| US20190296913A1 (en) * | 2018-03-26 | 2019-09-26 | Ca, Inc. | System and method for dynamic grid authentication |
| US10454913B2 (en) * | 2014-07-24 | 2019-10-22 | Hewlett Packard Enterprise Development Lp | Device authentication agent |
| US20190334724A1 (en) | 2015-08-11 | 2019-10-31 | Vescel, Llc | Authentication through verification of an evolving identity credential |
| US20190342276A1 (en) | 2018-05-07 | 2019-11-07 | Capital One Services, Llc | Methods and processes for utilizing information collected for enhanced verification |
| US10505965B2 (en) | 2011-10-18 | 2019-12-10 | Mcafee, Llc | User behavioral risk assessment |
| US20200036709A1 (en) * | 2018-06-15 | 2020-01-30 | Proxy, Inc. | Secure biometric credential authorization methods and apparatus |
| US10552596B2 (en) | 2017-12-20 | 2020-02-04 | International Business Machines Corporation | Biometric authentication |
| US10572778B1 (en) | 2019-03-15 | 2020-02-25 | Prime Research Solutions LLC | Machine-learning-based systems and methods for quality detection of digital input |
| US10614302B2 (en) | 2016-05-26 | 2020-04-07 | Alitheon, Inc. | Controlled authentication of physical objects |
| US10664581B2 (en) | 2012-03-19 | 2020-05-26 | Tencent Technology (Shenzhen) Company Limited | Biometric-based authentication method, apparatus and system |
| US20200211031A1 (en) | 2017-08-14 | 2020-07-02 | Rajeev Shant PATIL | System and method for automated processing of applications |
| US20200236113A1 (en) * | 2019-01-18 | 2020-07-23 | Anchor Labs, Inc. | Secure account access |
| US10740767B2 (en) | 2016-06-28 | 2020-08-11 | Alitheon, Inc. | Centralized databases storing digital fingerprints of objects for collaborative authentication |
| US20200266985A1 (en) | 2017-11-09 | 2020-08-20 | nChain Holdings Limited | System for securing verification key from alteration and verifying validity of a proof of correctness |
| US10757097B2 (en) | 2017-08-28 | 2020-08-25 | T-Mobile Usa, Inc. | Temporal identity vaulting |
| US10778676B1 (en) | 2016-06-21 | 2020-09-15 | Wells Fargo Bank, N.A. | Biometric reference template record |
| US20200311678A1 (en) | 2017-09-22 | 2020-10-01 | nChain Holdings Limited | Smart contract execution using distributed coordination |
| US20200320619A1 (en) | 2019-04-08 | 2020-10-08 | LendingPoint LLC | Systems and methods for detecting and preventing fraud in financial institution accounts |
| US10834084B2 (en) | 2018-07-20 | 2020-11-10 | International Business Machines Corporation | Privileged identity authentication based on user behaviors |
| US20200374311A1 (en) | 2013-03-15 | 2020-11-26 | Socure Inc. | Risk assessment using social networking data |
| US10855679B2 (en) | 2016-05-18 | 2020-12-01 | Vercrio, Inc. | Automated scalable identity-proofing and authentication process |
| US20200380598A1 (en) | 2019-05-30 | 2020-12-03 | Jpmorgan Chase Bank, N.A. | Systems and methods for digital identity verification |
| US20210029100A1 (en) * | 2019-07-23 | 2021-01-28 | Cyberark Software Ltd. | Identity verification based on electronic file fingerprinting data |
| US20210027061A1 (en) | 2019-07-25 | 2021-01-28 | Hangzhou Glority Software Limited | Method and system for object identification |
| US10938828B1 (en) | 2020-09-17 | 2021-03-02 | Sailpoint Technologies, Inc. | System and method for predictive platforms in identity management artificial intelligence systems using analysis of network identity graphs |
| US10943003B2 (en) | 2018-10-16 | 2021-03-09 | International Business Machines Corporation | Consented authentication |
| US20210089637A1 (en) | 2019-09-20 | 2021-03-25 | Micron Technology, Inc. | Methods and apparatus for persistent biometric profiling |
| US10963670B2 (en) | 2019-02-06 | 2021-03-30 | Alitheon, Inc. | Object change detection and measurement using digital fingerprints |
| US20210104008A1 (en) | 2016-08-12 | 2021-04-08 | Alitheon, Inc. | Event-driven authentication of physical objects |
| US10977353B2 (en) | 2018-09-18 | 2021-04-13 | International Business Machines Corporation | Validating authorized activities approved by a guardian |
| US20210110004A1 (en) | 2019-10-15 | 2021-04-15 | Alitheon, Inc. | Rights management using digital fingerprints |
| US20210134434A1 (en) | 2019-11-05 | 2021-05-06 | American Heart Association, Inc. | System and Method for Improving Food Selections |
| US11044267B2 (en) | 2016-11-30 | 2021-06-22 | Agari Data, Inc. | Using a measure of influence of sender in determining a security risk associated with an electronic message |
| US11048894B2 (en) | 2011-06-14 | 2021-06-29 | Ark Ideaz, Inc. | Authentication systems and methods |
| US11048794B1 (en) | 2019-02-05 | 2021-06-29 | Wells Fargo Bank, N.A. | Multifactor identity authentication via cumulative dynamic contextual identity |
| US20210202067A1 (en) | 2016-12-15 | 2021-07-01 | Conquer Your Addiction Llc | Dynamic and adaptive systems and methods for rewarding and/or disincentivizing behaviors |
| US11055390B1 (en) | 2009-06-03 | 2021-07-06 | James F. Kragh | Identity validation and verification system and associated methods |
| US11057366B2 (en) | 2018-08-21 | 2021-07-06 | HYPR Corp. | Federated identity management with decentralized computing platforms |
| US11068909B1 (en) | 2016-02-19 | 2021-07-20 | Alitheon, Inc. | Multi-level authentication |
| US11075904B2 (en) | 2019-03-04 | 2021-07-27 | Visa International Service Association | Biometric interaction manager |
| US20210234693A1 (en) * | 2020-01-23 | 2021-07-29 | Bank Of America Corporation | Intelligent decryption based on user and data profiling |
| US20210234673A1 (en) * | 2020-01-23 | 2021-07-29 | Bank Of America Corporation | Intelligent encryption based on user and data profiling |
| US20210231706A1 (en) | 2018-06-07 | 2021-07-29 | Sangyang PAK | Integrated pogo pin enabling integrated housing |
| US20210240837A1 (en) | 2020-02-04 | 2021-08-05 | Pindrop Security, Inc. | Dynamic account risk assessment from heterogeneous events |
| US11089014B2 (en) | 2015-06-26 | 2021-08-10 | Cecelumen, Llc | Methods and apparatus for allowing users to control use and/or sharing of images and/or biometric data |
| US11093789B2 (en) | 2017-12-12 | 2021-08-17 | Tusimple, Inc. | Method and apparatus for object re-identification |
| US20210258155A1 (en) | 2018-12-07 | 2021-08-19 | Nike, Inc. | System and method for providing cryptographically secured digital assets |
| US11100503B2 (en) * | 2018-02-07 | 2021-08-24 | Mastercard International Incorporated | Systems and methods for use in managing digital identities |
| US20210279475A1 (en) | 2016-07-29 | 2021-09-09 | Unifai Holdings Limited | Computer vision systems |
| US11128467B2 (en) | 2017-02-06 | 2021-09-21 | Northern Trust Corporation | Systems and methods for digital identity management and permission controls within distributed network nodes |
| US11127092B2 (en) | 2017-05-18 | 2021-09-21 | Bank Of America Corporation | Method and system for data tracking and exchange |
| US20210297259A1 (en) * | 2020-03-19 | 2021-09-23 | Arista Networks, Inc. | Network device authentication |
| US11151550B2 (en) | 2015-08-25 | 2021-10-19 | Paypal, Inc. | Token service provider for electronic/mobile commerce transactions |
| US20210326467A1 (en) | 2018-04-13 | 2021-10-21 | Sophos Limited | Dynamic multi-factor authentication |
| US20210325427A1 (en) | 2011-11-30 | 2021-10-21 | The Nielsen Company (Us), Llc | Multiple meter detection and processing using motion data |
| US11157907B1 (en) | 2017-04-26 | 2021-10-26 | Wells Fargo Bank, N.A. | Transaction validation and fraud mitigation |
| US11163931B2 (en) | 2013-04-15 | 2021-11-02 | Autoconnect Holdings Llc | Access and portability of user profiles stored as templates |
| US20210366014A1 (en) | 2017-08-08 | 2021-11-25 | Netorus, Inc. | Method of generating and accessing product-related information |
| US20210366586A1 (en) | 2018-07-02 | 2021-11-25 | Kelly Dell Tyler | Enterprise Consumer Safety System |
| US11200306B1 (en) | 2021-02-25 | 2021-12-14 | Telcom Ventures, Llc | Methods, devices, and systems for authenticating user identity for location-based deliveries |
| US11205011B2 (en) | 2018-09-27 | 2021-12-21 | Amber Solutions, Inc. | Privacy and the management of permissions |
| US20210399895A1 (en) * | 2018-08-24 | 2021-12-23 | Powch, LLC | Systems and Methods for Single-Step Out-of-Band Authentication |
| US20220004616A1 (en) * | 2016-07-29 | 2022-01-06 | Trusona, Inc. | Anti-replay authentication systems and methods |
| US11223646B2 (en) | 2020-01-22 | 2022-01-11 | Forcepoint, LLC | Using concerning behaviors when performing entity-based risk calculations |
| US20220086141A1 (en) * | 2019-07-23 | 2022-03-17 | Capital One Services, Llc | First factor contactless card authentication system and method |
| US11327992B1 (en) | 2018-04-30 | 2022-05-10 | Splunk Inc. | Authenticating a user to access a data intake and query system |
| US11405189B1 (en) * | 2021-11-18 | 2022-08-02 | James E. Bennison | Systems and methods for trustworthy electronic authentication using a computing device |
| US20220292396A1 (en) | 2021-03-15 | 2022-09-15 | Yandex Europe Ag | Method and system for generating training data for a machine-learning algorithm |
| US11451532B2 (en) | 2019-01-25 | 2022-09-20 | Dell Products L.P. | Behavioral biometrics and machine learning to secure website logins |
| US11461298B1 (en) | 2021-08-20 | 2022-10-04 | ActionIQ, Inc. | Scoring parameter generation for identity resolution |
| DE102021108925A1 (en) | 2021-04-09 | 2022-10-13 | Rea Elektronik Gmbh | Device and method for checking a marking of a product |
| US20220345451A1 (en) * | 2012-02-01 | 2022-10-27 | Amazon Technologies, Inc. | Resetting managed security credentials |
| US11509477B1 (en) * | 2014-12-30 | 2022-11-22 | Idemia Identity & Security USA LLC | User data validation for digital identifications |
| US11522867B2 (en) | 2020-03-31 | 2022-12-06 | LendingClub Bank, National Association | Secure content management through authentication |
| US20230097761A1 (en) * | 2021-09-24 | 2023-03-30 | Apple Inc. | Techniques for secure data reception using a user device |
| US20230334476A1 (en) * | 2019-03-20 | 2023-10-19 | Capital One Services, Llc | Using a contactless card to securely share personal data stored in a blockchain |
| US20230403144A1 (en) * | 2022-05-27 | 2023-12-14 | Keychainx Ag | Non-fungible token (nft) generation for secure applications |
| US20240039537A1 (en) | 2022-07-26 | 2024-02-01 | Stmicroelectronics International N.V. | High-voltage fault protection circuit |
| US20240064135A1 (en) | 2020-10-22 | 2024-02-22 | Acuant, Inc. | Identity Proofing and Portability on Blockchain |
| US20240185660A1 (en) * | 2019-04-22 | 2024-06-06 | Soloinsight, Inc. | System and method for providing credential activation layered security |
| US20240214194A1 (en) | 2022-12-22 | 2024-06-27 | Artema Labs, Inc | Systems and Methods for Facilitating Interactions between Tokens and Digital Wallets |
| US12034719B2 (en) | 2020-12-04 | 2024-07-09 | TruU, Inc. | Context-based risk assessment for an identity verification system |
| US20240256878A1 (en) | 2023-01-31 | 2024-08-01 | Walmart Apollo, Llc | Deep learning entity matching system using weak supervision |
| US20240340314A1 (en) | 2023-04-04 | 2024-10-10 | Lookout, Inc. | System for generating samples to generate machine learning models to facilitate detection of suspicious digital identifiers |
| US20240346085A1 (en) | 2018-11-12 | 2024-10-17 | Nant Holdings Ip, Llc | Curation And Provision Of Digital Content |
Patent Citations (152)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7133846B1 (en) | 1995-02-13 | 2006-11-07 | Intertrust Technologies Corp. | Digital certificate support system, methods and techniques for secure electronic commerce transaction and rights management |
| CA2478548C (en) | 1999-07-20 | 2014-03-11 | Diebold, Incorporated | Automated banking machine system and development method |
| US8756153B1 (en) | 1999-08-10 | 2014-06-17 | Gofigure Payments, Llc | System and method for mobile payment at point of sale |
| US20030086341A1 (en) | 2001-07-20 | 2003-05-08 | Gracenote, Inc. | Automatic identification of sound recordings |
| US9864992B1 (en) | 2001-09-21 | 2018-01-09 | Open Invention Network, Llc | System and method for enrolling in a biometric system |
| US8234387B2 (en) | 2003-06-05 | 2012-07-31 | Intertrust Technologies Corp. | Interoperable systems and methods for peer-to-peer service orchestration |
| US20060129478A1 (en) | 2004-12-10 | 2006-06-15 | Payday One Xl, Llc | Automated Short Term Loans |
| US8965803B2 (en) | 2005-02-04 | 2015-02-24 | The Invention Science Fund I, Llc | Virtual world reversion rights |
| US20070078908A1 (en) | 2005-05-17 | 2007-04-05 | Santu Rohatgi | Method and system for child safety |
| US20080022370A1 (en) | 2006-07-21 | 2008-01-24 | International Business Corporation | System and method for role based access control in a content management system |
| US20080120302A1 (en) | 2006-11-17 | 2008-05-22 | Thompson Timothy J | Resource level role based access control for storage management |
| US7673797B2 (en) | 2006-12-13 | 2010-03-09 | Ncr Corporation | Personalization of self-checkout security |
| US8831972B2 (en) | 2007-04-03 | 2014-09-09 | International Business Machines Corporation | Generating a customer risk assessment using dynamic customer data |
| US20090089107A1 (en) | 2007-09-27 | 2009-04-02 | Robert Lee Angell | Method and apparatus for ranking a customer using dynamically generated external data |
| US20090089205A1 (en) | 2007-09-29 | 2009-04-02 | Anthony Jeremiah Bayne | Automated qualifying of a customer to receive a cash loan at an automated teller machine |
| US20140200885A1 (en) | 2008-02-21 | 2014-07-17 | Snell Limited | Audio visual signature, method of deriving a signature, and method of comparing audio-visual data background |
| US20120237908A1 (en) | 2008-04-01 | 2012-09-20 | William Fitzgerald | Systems and methods for monitoring and managing use of mobile electronic devices |
| US20190098500A1 (en) * | 2008-10-13 | 2019-03-28 | Microsoft Technology Licensing, Llc | Simple protocol for tangible security |
| US20150220999A1 (en) | 2009-01-21 | 2015-08-06 | Truaxis, Inc. | Method and system to dynamically adjust offer spend thresholds and personalize offer criteria specific to individual users |
| US11055390B1 (en) | 2009-06-03 | 2021-07-06 | James F. Kragh | Identity validation and verification system and associated methods |
| WO2011016710A1 (en) | 2009-08-05 | 2011-02-10 | Mimos Berhad | Method for baby-parent identification |
| US10313336B2 (en) | 2010-07-15 | 2019-06-04 | Proxense, Llc | Proximity-based system for object tracking |
| US8446275B2 (en) | 2011-06-10 | 2013-05-21 | Aliphcom | General health and wellness management method and apparatus for a wellness application using data from a data-capable band |
| US11048894B2 (en) | 2011-06-14 | 2021-06-29 | Ark Ideaz, Inc. | Authentication systems and methods |
| US9087058B2 (en) | 2011-08-03 | 2015-07-21 | Google Inc. | Method and apparatus for enabling a searchable history of real-world user experiences |
| US9558397B2 (en) | 2011-08-11 | 2017-01-31 | At&T Intellectual Property I, L.P. | Method and apparatus for automated analysis and identification of a person in image and video content |
| US10505965B2 (en) | 2011-10-18 | 2019-12-10 | Mcafee, Llc | User behavioral risk assessment |
| US20210325427A1 (en) | 2011-11-30 | 2021-10-21 | The Nielsen Company (Us), Llc | Multiple meter detection and processing using motion data |
| US9734290B2 (en) | 2011-12-16 | 2017-08-15 | Neela SRINIVAS | System and method for evidence based differential analysis and incentives based healthcare policy |
| US20220345451A1 (en) * | 2012-02-01 | 2022-10-27 | Amazon Technologies, Inc. | Resetting managed security credentials |
| US9443298B2 (en) | 2012-03-02 | 2016-09-13 | Authentect, Inc. | Digital fingerprinting object authentication and anti-counterfeiting system |
| US10664581B2 (en) | 2012-03-19 | 2020-05-26 | Tencent Technology (Shenzhen) Company Limited | Biometric-based authentication method, apparatus and system |
| US20160224773A1 (en) | 2012-05-15 | 2016-08-04 | Bphav, Llc | Biometric authentication system |
| US9177257B2 (en) | 2012-08-30 | 2015-11-03 | International Business Machines Corporation | Non-transitory article of manufacture and system for providing a prompt to user for real-time cognitive assistance |
| US20170111351A1 (en) | 2012-09-19 | 2017-04-20 | Secureauth Corporation | Mobile multifactor single-sign-on authentication |
| US8731977B1 (en) | 2013-03-15 | 2014-05-20 | Red Mountain Technologies, LLC | System and method for analyzing and using vehicle historical data |
| US20200374311A1 (en) | 2013-03-15 | 2020-11-26 | Socure Inc. | Risk assessment using social networking data |
| US20190095916A1 (en) | 2013-03-18 | 2019-03-28 | Fulcrum Ip Corporation | Systems and methods for a private sector monetary authority |
| US20170236037A1 (en) | 2013-04-11 | 2017-08-17 | Digimarc Corporation | Methods for object recognition and related arrangements |
| US11163931B2 (en) | 2013-04-15 | 2021-11-02 | Autoconnect Holdings Llc | Access and portability of user profiles stored as templates |
| US9094388B2 (en) | 2013-05-01 | 2015-07-28 | Dmitri Tkachev | Methods and systems for identifying, verifying, and authenticating an identity |
| US20190163889A1 (en) | 2013-08-23 | 2019-05-30 | Morphotrust Usa, Llc | System and Method for Identity Management |
| US20150112732A1 (en) | 2013-10-22 | 2015-04-23 | Esurance Insurance Services, Inc. | Identifying a user as part of a household |
| US10387695B2 (en) | 2013-11-08 | 2019-08-20 | Vattaca, LLC | Authenticating and managing item ownership and authenticity |
| US20160335629A1 (en) | 2014-01-20 | 2016-11-17 | Euroclear Sa/Nv | Rights transfer and verification |
| US20170012992A1 (en) | 2014-02-10 | 2017-01-12 | Level 3 Communications, Llc | Authentication system and method |
| US9519783B2 (en) | 2014-04-25 | 2016-12-13 | Bank Of America Corporation | Evaluating customer security preferences |
| US20150317728A1 (en) | 2014-05-05 | 2015-11-05 | BeSmartee, Inc. | Mortgage synthesis and automation |
| US10102491B2 (en) | 2014-05-27 | 2018-10-16 | Genesys Telecommunications Laboratories, Inc. | System and method for bridging online customer experience |
| US10454913B2 (en) * | 2014-07-24 | 2019-10-22 | Hewlett Packard Enterprise Development Lp | Device authentication agent |
| US20170230351A1 (en) | 2014-08-08 | 2017-08-10 | Identitrade Ab | Method and system for authenticating a user |
| US20160050557A1 (en) | 2014-08-14 | 2016-02-18 | Samsung Electronics Co., Ltd. | Method and apparatus for profile download of group devices |
| US10127378B2 (en) | 2014-10-01 | 2018-11-13 | Kalman Csaba Toth | Systems and methods for registering and acquiring E-credentials using proof-of-existence and digital seals |
| WO2016083987A1 (en) | 2014-11-25 | 2016-06-02 | Ideco Biometric Security Solutions (Proprietary) Limited | Method of and system for obtaining proof of authorisation of a transaction |
| US10024684B2 (en) | 2014-12-02 | 2018-07-17 | Operr Technologies, Inc. | Method and system for avoidance of accidents |
| US20160162882A1 (en) | 2014-12-08 | 2016-06-09 | Guy LaMonte McClung, III | Digital money choice and eWallet selection |
| US10044700B2 (en) | 2014-12-23 | 2018-08-07 | Mcafee, Llc | Identity attestation of a minor via a parent |
| US10362027B2 (en) | 2014-12-29 | 2019-07-23 | Paypal, Inc. | Authenticating activities of accounts |
| US11509477B1 (en) * | 2014-12-30 | 2022-11-22 | Idemia Identity & Security USA LLC | User data validation for digital identifications |
| US20180068103A1 (en) | 2015-03-20 | 2018-03-08 | Aplcomp Oy | Audiovisual associative authentication method, related system and device |
| US10075445B2 (en) | 2015-04-28 | 2018-09-11 | Xiaomi Inc. | Methods and devices for permission management |
| US10218510B2 (en) * | 2015-06-01 | 2019-02-26 | Branch Banking And Trust Company | Network-based device authentication system |
| US10210527B2 (en) | 2015-06-04 | 2019-02-19 | Chronicled, Inc. | Open registry for identity of things including social record feature |
| US11089014B2 (en) | 2015-06-26 | 2021-08-10 | Cecelumen, Llc | Methods and apparatus for allowing users to control use and/or sharing of images and/or biometric data |
| US20190334724A1 (en) | 2015-08-11 | 2019-10-31 | Vescel, Llc | Authentication through verification of an evolving identity credential |
| US20170063831A1 (en) | 2015-08-24 | 2017-03-02 | International Business Machines Corporation | Authentication of a user and of access to the user's information |
| US11151550B2 (en) | 2015-08-25 | 2021-10-19 | Paypal, Inc. | Token service provider for electronic/mobile commerce transactions |
| US20170063946A1 (en) | 2015-08-31 | 2017-03-02 | Ayla Networks, Inc. | Data streaming service for an internet-of-things platform |
| US10110608B2 (en) | 2016-01-07 | 2018-10-23 | Google Llc | Authorizing transaction on a shared device using a personal device |
| US11068909B1 (en) | 2016-02-19 | 2021-07-20 | Alitheon, Inc. | Multi-level authentication |
| US10855679B2 (en) | 2016-05-18 | 2020-12-01 | Vercrio, Inc. | Automated scalable identity-proofing and authentication process |
| US10614302B2 (en) | 2016-05-26 | 2020-04-07 | Alitheon, Inc. | Controlled authentication of physical objects |
| US10142362B2 (en) | 2016-06-02 | 2018-11-27 | Zscaler, Inc. | Cloud based systems and methods for determining security risks of users and groups |
| US10778676B1 (en) | 2016-06-21 | 2020-09-15 | Wells Fargo Bank, N.A. | Biometric reference template record |
| US10740767B2 (en) | 2016-06-28 | 2020-08-11 | Alitheon, Inc. | Centralized databases storing digital fingerprints of objects for collaborative authentication |
| US20210279475A1 (en) | 2016-07-29 | 2021-09-09 | Unifai Holdings Limited | Computer vision systems |
| US20220004616A1 (en) * | 2016-07-29 | 2022-01-06 | Trusona, Inc. | Anti-replay authentication systems and methods |
| US20210104008A1 (en) | 2016-08-12 | 2021-04-08 | Alitheon, Inc. | Event-driven authentication of physical objects |
| US11044267B2 (en) | 2016-11-30 | 2021-06-22 | Agari Data, Inc. | Using a measure of influence of sender in determining a security risk associated with an electronic message |
| US20210202067A1 (en) | 2016-12-15 | 2021-07-01 | Conquer Your Addiction Llc | Dynamic and adaptive systems and methods for rewarding and/or disincentivizing behaviors |
| US20180205546A1 (en) | 2016-12-31 | 2018-07-19 | Assetvault Limited | Systems, methods, apparatuses for secure management of legal documents |
| US11128467B2 (en) | 2017-02-06 | 2021-09-21 | Northern Trust Corporation | Systems and methods for digital identity management and permission controls within distributed network nodes |
| US11157907B1 (en) | 2017-04-26 | 2021-10-26 | Wells Fargo Bank, N.A. | Transaction validation and fraud mitigation |
| US11127092B2 (en) | 2017-05-18 | 2021-09-21 | Bank Of America Corporation | Method and system for data tracking and exchange |
| WO2019013818A1 (en) | 2017-07-14 | 2019-01-17 | Hitachi Data Systems Corporation | Method, apparatus, and system for controlling user access to a data storage system |
| US10181032B1 (en) | 2017-07-17 | 2019-01-15 | Sift Science, Inc. | System and methods for digital account threat detection |
| US20210366014A1 (en) | 2017-08-08 | 2021-11-25 | Netorus, Inc. | Method of generating and accessing product-related information |
| US20200211031A1 (en) | 2017-08-14 | 2020-07-02 | Rajeev Shant PATIL | System and method for automated processing of applications |
| US10757097B2 (en) | 2017-08-28 | 2020-08-25 | T-Mobile Usa, Inc. | Temporal identity vaulting |
| US20200311678A1 (en) | 2017-09-22 | 2020-10-01 | nChain Holdings Limited | Smart contract execution using distributed coordination |
| US20200266985A1 (en) | 2017-11-09 | 2020-08-20 | nChain Holdings Limited | System for securing verification key from alteration and verifying validity of a proof of correctness |
| US20190149539A1 (en) | 2017-11-15 | 2019-05-16 | Citrix Systems, Inc. | Secure Authentication Of A Device Through Attestation By Another Device |
| US11093789B2 (en) | 2017-12-12 | 2021-08-17 | Tusimple, Inc. | Method and apparatus for object re-identification |
| WO2019123291A1 (en) | 2017-12-20 | 2019-06-27 | Wani Nikhilesh Manoj | System and method for user authentication using biometric data |
| US10552596B2 (en) | 2017-12-20 | 2020-02-04 | International Business Machines Corporation | Biometric authentication |
| US20190205939A1 (en) | 2017-12-31 | 2019-07-04 | OneMarket Network LLC | Using Machine Learned Visitor Intent Propensity to Greet and Guide a Visitor at a Physical Venue |
| US11100503B2 (en) * | 2018-02-07 | 2021-08-24 | Mastercard International Incorporated | Systems and methods for use in managing digital identities |
| US20190296913A1 (en) * | 2018-03-26 | 2019-09-26 | Ca, Inc. | System and method for dynamic grid authentication |
| US20210326467A1 (en) | 2018-04-13 | 2021-10-21 | Sophos Limited | Dynamic multi-factor authentication |
| US11327992B1 (en) | 2018-04-30 | 2022-05-10 | Splunk Inc. | Authenticating a user to access a data intake and query system |
| US20190342276A1 (en) | 2018-05-07 | 2019-11-07 | Capital One Services, Llc | Methods and processes for utilizing information collected for enhanced verification |
| US20210231706A1 (en) | 2018-06-07 | 2021-07-29 | Sangyang PAK | Integrated pogo pin enabling integrated housing |
| US20200036709A1 (en) * | 2018-06-15 | 2020-01-30 | Proxy, Inc. | Secure biometric credential authorization methods and apparatus |
| US20210366586A1 (en) | 2018-07-02 | 2021-11-25 | Kelly Dell Tyler | Enterprise Consumer Safety System |
| US10834084B2 (en) | 2018-07-20 | 2020-11-10 | International Business Machines Corporation | Privileged identity authentication based on user behaviors |
| US11057366B2 (en) | 2018-08-21 | 2021-07-06 | HYPR Corp. | Federated identity management with decentralized computing platforms |
| US20210399895A1 (en) * | 2018-08-24 | 2021-12-23 | Powch, LLC | Systems and Methods for Single-Step Out-of-Band Authentication |
| US10977353B2 (en) | 2018-09-18 | 2021-04-13 | International Business Machines Corporation | Validating authorized activities approved by a guardian |
| US11205011B2 (en) | 2018-09-27 | 2021-12-21 | Amber Solutions, Inc. | Privacy and the management of permissions |
| US10943003B2 (en) | 2018-10-16 | 2021-03-09 | International Business Machines Corporation | Consented authentication |
| US20240346085A1 (en) | 2018-11-12 | 2024-10-17 | Nant Holdings Ip, Llc | Curation And Provision Of Digital Content |
| US20210258155A1 (en) | 2018-12-07 | 2021-08-19 | Nike, Inc. | System and method for providing cryptographically secured digital assets |
| US20200236113A1 (en) * | 2019-01-18 | 2020-07-23 | Anchor Labs, Inc. | Secure account access |
| US11451532B2 (en) | 2019-01-25 | 2022-09-20 | Dell Products L.P. | Behavioral biometrics and machine learning to secure website logins |
| US11514155B1 (en) | 2019-02-05 | 2022-11-29 | Wells Fargo Bank, N.A. | Multifactor identity authentication via cumulative dynamic contextual identity |
| US11048794B1 (en) | 2019-02-05 | 2021-06-29 | Wells Fargo Bank, N.A. | Multifactor identity authentication via cumulative dynamic contextual identity |
| US11669611B1 (en) | 2019-02-05 | 2023-06-06 | Wells Fargo Bank, N.A. | Multifactor identity authentication via cumulative dynamic contextual identity |
| US11290448B1 (en) | 2019-02-05 | 2022-03-29 | Wells Fargo Bank, N.A. | Multifactor identity authentication via cumulative dynamic contextual identity |
| US10963670B2 (en) | 2019-02-06 | 2021-03-30 | Alitheon, Inc. | Object change detection and measurement using digital fingerprints |
| US11075904B2 (en) | 2019-03-04 | 2021-07-27 | Visa International Service Association | Biometric interaction manager |
| US10572778B1 (en) | 2019-03-15 | 2020-02-25 | Prime Research Solutions LLC | Machine-learning-based systems and methods for quality detection of digital input |
| US20230334476A1 (en) * | 2019-03-20 | 2023-10-19 | Capital One Services, Llc | Using a contactless card to securely share personal data stored in a blockchain |
| US20200320619A1 (en) | 2019-04-08 | 2020-10-08 | LendingPoint LLC | Systems and methods for detecting and preventing fraud in financial institution accounts |
| US20240185660A1 (en) * | 2019-04-22 | 2024-06-06 | Soloinsight, Inc. | System and method for providing credential activation layered security |
| US20200380598A1 (en) | 2019-05-30 | 2020-12-03 | Jpmorgan Chase Bank, N.A. | Systems and methods for digital identity verification |
| US20210029100A1 (en) * | 2019-07-23 | 2021-01-28 | Cyberark Software Ltd. | Identity verification based on electronic file fingerprinting data |
| US20220086141A1 (en) * | 2019-07-23 | 2022-03-17 | Capital One Services, Llc | First factor contactless card authentication system and method |
| US20210027061A1 (en) | 2019-07-25 | 2021-01-28 | Hangzhou Glority Software Limited | Method and system for object identification |
| US20210089637A1 (en) | 2019-09-20 | 2021-03-25 | Micron Technology, Inc. | Methods and apparatus for persistent biometric profiling |
| US20210110004A1 (en) | 2019-10-15 | 2021-04-15 | Alitheon, Inc. | Rights management using digital fingerprints |
| US20210134434A1 (en) | 2019-11-05 | 2021-05-06 | American Heart Association, Inc. | System and Method for Improving Food Selections |
| US11223646B2 (en) | 2020-01-22 | 2022-01-11 | Forcepoint, LLC | Using concerning behaviors when performing entity-based risk calculations |
| US20210234693A1 (en) * | 2020-01-23 | 2021-07-29 | Bank Of America Corporation | Intelligent decryption based on user and data profiling |
| US20210234673A1 (en) * | 2020-01-23 | 2021-07-29 | Bank Of America Corporation | Intelligent encryption based on user and data profiling |
| US20210240837A1 (en) | 2020-02-04 | 2021-08-05 | Pindrop Security, Inc. | Dynamic account risk assessment from heterogeneous events |
| US20210297259A1 (en) * | 2020-03-19 | 2021-09-23 | Arista Networks, Inc. | Network device authentication |
| US11522867B2 (en) | 2020-03-31 | 2022-12-06 | LendingClub Bank, National Association | Secure content management through authentication |
| US10938828B1 (en) | 2020-09-17 | 2021-03-02 | Sailpoint Technologies, Inc. | System and method for predictive platforms in identity management artificial intelligence systems using analysis of network identity graphs |
| US20240064135A1 (en) | 2020-10-22 | 2024-02-22 | Acuant, Inc. | Identity Proofing and Portability on Blockchain |
| US12034719B2 (en) | 2020-12-04 | 2024-07-09 | TruU, Inc. | Context-based risk assessment for an identity verification system |
| US11200306B1 (en) | 2021-02-25 | 2021-12-14 | Telcom Ventures, Llc | Methods, devices, and systems for authenticating user identity for location-based deliveries |
| US20220292396A1 (en) | 2021-03-15 | 2022-09-15 | Yandex Europe Ag | Method and system for generating training data for a machine-learning algorithm |
| DE102021108925A1 (en) | 2021-04-09 | 2022-10-13 | Rea Elektronik Gmbh | Device and method for checking a marking of a product |
| US20240185596A1 (en) | 2021-04-09 | 2024-06-06 | Rea Elektronik Gmbh | Device and method for checking a marking of a product |
| US11461298B1 (en) | 2021-08-20 | 2022-10-04 | ActionIQ, Inc. | Scoring parameter generation for identity resolution |
| US20230097761A1 (en) * | 2021-09-24 | 2023-03-30 | Apple Inc. | Techniques for secure data reception using a user device |
| US11405189B1 (en) * | 2021-11-18 | 2022-08-02 | James E. Bennison | Systems and methods for trustworthy electronic authentication using a computing device |
| US20230403144A1 (en) * | 2022-05-27 | 2023-12-14 | Keychainx Ag | Non-fungible token (nft) generation for secure applications |
| US20240039537A1 (en) | 2022-07-26 | 2024-02-01 | Stmicroelectronics International N.V. | High-voltage fault protection circuit |
| US20240214194A1 (en) | 2022-12-22 | 2024-06-27 | Artema Labs, Inc | Systems and Methods for Facilitating Interactions between Tokens and Digital Wallets |
| US20240256878A1 (en) | 2023-01-31 | 2024-08-01 | Walmart Apollo, Llc | Deep learning entity matching system using weak supervision |
| US20240340314A1 (en) | 2023-04-04 | 2024-10-10 | Lookout, Inc. | System for generating samples to generate machine learning models to facilitate detection of suspicious digital identifiers |
Non-Patent Citations (2)
| Title |
|---|
| Jain, et al., "A Blockchain-Based Distributed Network for Secure Credit Scoring," 2019 5th International Conference on Signal Processing, Computing and Control (ISPCC), pp. 306-12, Oct. 2019; ISBN-13: 978-1-7281-3988-3. |
| Zhang, Yan, et al., "Real-time Machine Learning Prediction of an Agent-Based Model for Urban Decision-making," URL: https://ifaamas.org/Proceedings/aamas2018/pdfs/p2171.pdf (Jul. 10-15, 2018). |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230129693A1 (en) | | Transaction authentication and verification using text messages and a distributed ledger |
| US11936648B2 (en) | | Methods and apparatus for allowing users to control use and/or sharing of images and/or biometric data |
| US12067106B2 (en) | | Multifactor identity authentication via cumulative dynamic contextual identity |
| US11882118B2 (en) | | Identity verification and management system |
| CN110462658B (en) | | System and method for providing digital identity records to verify the identity of a user |
| CN120092241A (en) | | Systems and methods for blockchain-based non-fungible token (NFT) authentication |
| WO2020106803A1 (en) | | Methods, systems, and storage media for managing patient information using a blockchain network |
| US12200141B2 (en) | | Systems and methods for conducting remote attestation |
| US20250094988A1 (en) | | Distributed ledger technology utilizing cardless payments |
| CA3057398A1 (en) | | Securely performing cryptographic operations |
| US12361104B2 (en) | | Circumference based biometric authentication |
| US20250111367A1 (en) | | Systems and methods for facilitating biometric authentication using quantum cryptography and/or blockchain |
| Tiwari et al. | | Emerging biometric modalities and integration challenges |
| US20250330471A1 (en) | | Secure digital authorization based on identity elements of users and/or linkage definitions identifying shared digital assets |
| US12470552B1 (en) | | Secure data processing using data packages generated by edge devices |
| US12200132B1 (en) | | Secure multi-verification of biometric data in a distributed computing environment |
| US12284172B1 (en) | | Secure generation of authentication datasets from network activity |
| Mahmood et al. | | A Privacy-Preserving E-Voting System using Federated Learning and CNNs for Secure Fingerprint and Biometric Verification |
| Al-Rubaie | | Towards privacy-aware mobile-based continuous authentication systems |
| CN115203737A (en) | | Method and electronic device for displaying data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |