Jiang et al., 2023 - Google Patents
AnonPSI: An anonymity assessment framework for PSI
- Document ID: 15723813811597057167
- Authors: Jiang B; Du J; Yan Q
- Publication year: 2023
- Publication venue: arXiv preprint arXiv:2311.18118
Snippet
Private Set Intersection (PSI) is a widely used protocol that enables two parties to securely compute a function over the intersected part of their shared datasets and has been a significant research focus over the years. However, recent studies have highlighted its …
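The snippet describes PSI only at a high level. As an illustration of the idea, here is a toy sketch of one classical construction (DDH-style PSI), in which each party masks hashed items with a secret exponent so that only intersecting items collide. Both parties are simulated in a single process, and the group parameters are illustrative only, not a secure or authoritative implementation of any protocol in the document.

```python
# Toy sketch of Diffie-Hellman-style Private Set Intersection (PSI).
# Both parties are simulated in one process; parameters are NOT secure.
import hashlib
import secrets

# Toy group: multiplicative group mod a Mersenne prime (illustrative only).
P = 2**127 - 1

def h2g(item: str) -> int:
    """Hash an item into the group (random-oracle-style assumption)."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def psi(items_a, items_b):
    """Simulate both parties; A learns which of its items B also holds."""
    a = secrets.randbelow(P - 2) + 1  # A's secret exponent
    b = secrets.randbelow(P - 2) + 1  # B's secret exponent
    # A -> B: each of A's items hashed and raised to A's secret.
    a_masked = [pow(h2g(x), a, P) for x in items_a]
    # B -> A: A's masked items raised to B's secret (order preserved),
    # so each equals H(x)^(a*b); plus B's own items masked as H(y)^b.
    a_ab = [pow(m, b, P) for m in a_masked]
    b_masked = [pow(h2g(y), b, P) for y in items_b]
    # A raises B's masked items to its own secret, giving H(y)^(a*b),
    # then compares: equal underlying items produce equal group elements.
    b_ab = {pow(m, a, P) for m in b_masked}
    return {x for x, m in zip(items_a, a_ab) if m in b_ab}
```

In this sketch A learns the intersection but not B's other items, since it only ever sees H(y)^b and H(y)^(ab); the inference attacks the paper studies exploit exactly what the intersection (or its size) still reveals.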
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
- G06F21/6254—Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G06F17/30286—Information retrieval; Database structures therefor; File system structures therefor in structured data stores
- G06F17/30386—Retrieval requests
- G06F17/30424—Query processing
- G06F17/30533—Other types of queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/604—Tools and structures for managing or administering access control systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F7/00—Methods or arrangements for processing data by operating upon the order or content of the data handled
- G06F7/58—Random or pseudo-random number generators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computer systems utilising knowledge based models
- G06N5/04—Inference methods or devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computer systems utilising knowledge based models
- G06N5/02—Knowledge representation
- G06N5/022—Knowledge engineering, knowledge acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
- G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communication
- H04L9/08—Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
- H04L9/0816—Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06Q—DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
Similar Documents
| Publication | Title |
|---|---|
| Gursoy et al. | Secure and utility-aware data collection with condensed local differential privacy |
| Choudhury et al. | Anonymizing data for privacy-preserving federated learning |
| Evans et al. | Statistically valid inferences from privacy-protected data |
| Long et al. | Towards measuring membership privacy |
| Chen et al. | Correlated network data publication via differential privacy |
| Dwork | A firm foundation for private data analysis |
| Jiang et al. | AnonPSI: An anonymity assessment framework for PSI |
| EP1950684A1 (en) | Anonymity measuring device |
| Wang et al. | A privacy-friendly approach to data valuation |
| Wang et al. | High utility k-anonymization for social network publishing |
| Gadotti et al. | When the signal is in the noise: Exploiting Diffix's sticky noise |
| Sei et al. | Privacy-preserving collaborative data collection and analysis with many missing values |
| Choudhury et al. | A syntactic approach for privacy-preserving federated learning |
| Task et al. | What should we protect? Defining differential privacy for social network analysis |
| Alvim et al. | On privacy and accuracy in data releases |
| Pessach et al. | Fairness-driven private collaborative machine learning |
| Bauer et al. | Towards realistic membership inferences: The case of survey data |
| Liu et al. | Differential privacy performance evaluation under the condition of non-uniform noise distribution |
| Liu et al. | AUDIO: An Integrity Auditing Framework of Outlier-Mining-as-a-Service Systems |
| Zhan et al. | Will Sharing Metadata Leak Privacy? |
| Xia et al. | Heterogeneous differential privacy for vertically partitioned databases |
| Jeba et al. | Classifying and evaluating privacy-preserving techniques based on protection methods: A comprehensive study |
| Bogdanov et al. | K-anonymity versus PSI3 for depersonalization and security assessment of large data structures |
| Szűcs | Random Response Forest for Privacy‐Preserving Classification |
| Li et al. | -privacy: Bounding Privacy Leaks in Privacy Preserving Data Mining |