WO2024069956A1 - Learning device, learning system, learning method, and computer-readable medium - Google Patents
Learning device, learning system, learning method, and computer-readable medium
- Publication number
- WO2024069956A1 (PCT/JP2022/036759)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- learning
- secure communication
- learning device
- organization
- information terminal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/04—Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks
- H04L63/0428—Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the data content is protected, e.g. by encrypting or encapsulating the payload
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
Definitions
- the present disclosure relates to a learning device, a learning system, a learning method, and a computer-readable medium.
- Patent Document 1 discloses a technology that uses machine learning to build an AI (Artificial Intelligence) model (also called a local model) that is personalized for each user.
- a better performing AI model (also called a global model) can be constructed by integrating multiple local AI models.
- the server can construct local and global models.
- one of the objectives that the embodiments disclosed in this specification aim to achieve is to provide a learning device, a learning system, a learning method, and a computer-readable medium that can build a global model when the networks of multiple organizations are not constantly connected.
- a learning device comprising: a communication establishment means for establishing secure communication with an information terminal disposed on the network of each organization; an acquisition means for acquiring a data set for each organization from the information terminal using the secure communication; a learning means for training a local model on the data set; and an integration means for integrating multiple local models trained on multiple data sets.
- a learning system comprising: an information terminal disposed on the network of each organization; and a learning device, wherein the learning device establishes secure communication with the information terminal, acquires a data set for each organization from the information terminal using the secure communication, trains a local model on the data set, and integrates multiple local models trained on multiple data sets.
- a learning method in which a computer establishes secure communication with an information terminal disposed on the network of each organization, acquires a data set for each organization from the information terminal using the secure communication, trains a local model on the data set, and integrates multiple local models trained on multiple data sets.
- a non-transitory computer-readable medium storing a program that causes a computer to execute: a process of establishing secure communication with an information terminal disposed on the network of each organization; a process of acquiring a data set for each organization from the information terminal using the secure communication; a process of training a local model on the data set; and a process of integrating multiple local models trained on multiple data sets.
- the present disclosure provides a learning device, a learning system, a learning method, and a computer-readable medium that can build a global model when the networks of multiple organizations are not constantly connected.
- FIG. 1 is a block diagram showing the configuration of a learning device according to a first embodiment.
- FIG. 2 is a schematic diagram showing the configuration of a learning system according to a second embodiment.
- FIG. 3 is a block diagram showing the configuration of a learning device according to the second embodiment.
- FIG. 4 is a flowchart showing a flow of an operation for generating a local model.
- FIG. 5 is a block diagram showing the configuration of a learning system according to a third embodiment.
- the learning device 1 includes a communication establishment unit 11, an acquisition unit 12, a learning unit 13, and an integration unit 14.
- the learning device 1 is connected to a public network (not shown).
- the networks of each organization are connected to the public network.
- An information terminal (not shown) is disposed in the network of each organization.
- the information terminal is a repository in which data sets owned by each organization are accumulated.
- the communication establishment unit 11 establishes secure communication between information terminals arranged in the networks of each organization.
- the communication establishment unit 11 may establish secure communication at a predetermined timing.
- the communication establishment unit 11 may establish secure communication based on the progress of learning of a local model, which will be described later.
- the communication establishment unit 11 connects the learning device 1 to each organization's network via a VPN (Virtual Private Network).
- the confidentiality of the communication between the learning device 1 and the information terminal is maintained by encryption and encapsulation.
- secure communication is established between the learning device 1 and the information terminal.
- the communication establishment unit 11 may establish secure communication using a technology other than VPN.
- the communication establishment unit 11 may control communication using a protocol that includes encryption (e.g., SSL/TLS, SSH (Secure Shell), FTPS (File Transfer Protocol over SSL/TLS)).
- the acquisition unit 12 acquires a data set for each organization from the information terminal using secure communication.
- the learning unit 13 trains the local model on the dataset.
- the integration unit 14 integrates multiple local models trained on multiple data sets.
- the learning device 1 includes a processor, a memory, and a storage device, which are not shown in the figure.
- the storage device also stores a computer program that implements the processing of the learning method according to this embodiment.
- the processor then loads the computer program from the storage device into the memory and executes the computer program. In this way, the processor realizes the functions of a communication establishment unit 11, an acquisition unit 12, a learning unit 13, and an integration unit 14.
- the communication establishment unit 11, the acquisition unit 12, the learning unit 13, and the integration unit 14 may each be realized by dedicated hardware.
- some or all of the components of each device may be realized by general-purpose or dedicated circuits, processors, etc., or a combination of these. These may be configured by a single chip, or may be configured by multiple chips connected via a bus. Some or all of the components of each device may be realized by a combination of the above-mentioned circuits, etc., and programs.
- the processor is, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or an FPGA (field-programmable gate array).
- the multiple information processing devices, circuits, etc. may be arranged in a centralized or distributed manner.
- the information processing devices, circuits, etc. may be realized as a client-server system, cloud computing system, etc., in which each is connected via a communication network.
- the functions of the learning device 1 may be provided in the form of SaaS (Software as a Service).
- the learning device establishes secure communication with an information terminal connected to the network of each organization, and acquires a data set using the secure communication. Therefore, according to the first embodiment, a global model can be constructed in a case where the networks of multiple organizations are not constantly connected.
- Embodiment 2 is a specific example of embodiment 1.
- Fig. 2 is a schematic diagram showing the configuration of a learning system 100 according to embodiment 2.
- the learning system 100 includes an information terminal 2a, an information terminal 2b, an information terminal 2c, a VPN device 3a, a VPN device 3b, a VPN device 3c, and a learning device 4.
- the learning device 4 is a specific example of the learning device 1 described above.
- Organization A's network Na is equipped with an information terminal 2a and a VPN device 3a.
- Organization B's network Nb is equipped with an information terminal 2b and a VPN device 3b.
- Organization C's network Nc is equipped with an information terminal 2c and a VPN device 3c.
- Information terminal 2a stores a data set owned by organization A.
- Information terminal 2b stores a data set owned by organization B.
- Information terminal 2c stores a data set owned by organization C.
- the number of organizations is not limited to three.
- the number of organizations may be two, four or more.
- Each organization is, for example, a pharmaceutical manufacturer or a chemical manufacturer.
- the dataset is a dataset of compounds.
- Each record included in the dataset of compounds lists structural information and property information of the compound.
- the structure of the compound is expressed by a fixed-length bit string, and each bit of the bit string indicates the presence or absence of a specific structure (e.g., benzene ring).
- the property information is expressed as a property value (e.g., a tensile strength value).
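A record of the compound data set described above (a fixed-length structural bit string plus a property value) might be represented as follows; the 1024-bit fingerprint length and the bit assignments are illustrative assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass

# Each bit of the fixed-length bit string marks the presence or absence of a
# specific substructure (e.g., bit 0 = benzene ring). The length and bit
# meanings here are hypothetical.
FINGERPRINT_BITS = 1024

@dataclass
class CompoundRecord:
    fingerprint: list[int]   # fixed-length 0/1 vector (structural information)
    property_value: float    # property information, e.g. a tensile strength value

    def __post_init__(self):
        # Enforce the fixed length and the 0/1 encoding of the bit string.
        assert len(self.fingerprint) == FINGERPRINT_BITS
        assert set(self.fingerprint) <= {0, 1}
```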
- data generated daily in the research and development work of organization A is accumulated in information terminal 2a.
- the dataset is not limited to a dataset of compounds, and may be a dataset of any thing.
- the network N may be a LAN (Local Area Network) or a network in which multiple LANs are connected.
- the network N is connected to a public network PN such as the Internet.
- the VPN device 3 is a VPN server or a router compatible with VPN.
- the IP (Internet Protocol) address of the learning device 4 may be set in advance in the VPN device 3.
- the VPN may be an Internet VPN, an IP-VPN, or a wide area Ethernet.
- when the VPN devices 3a, 3b, and 3c are not to be distinguished from one another, they may simply be referred to as the VPN device 3.
- FIG. 3 is a block diagram illustrating the configuration of the learning device 4.
- the learning device 4 is connected to the network PN.
- the learning device 4 includes a communication establishment unit 41, an acquisition unit 42, a learning unit 43, and an integration unit 44.
- the learning device 4 has storage that stores local models La, Lb, and Lc.
- Local model La is a local model trained on a dataset owned by organization A.
- Local model Lb is a local model trained on a dataset owned by organization B.
- Local model Lc is a local model trained on a dataset owned by organization C.
- Local models La, Lb, and Lc are repeatedly updated by the learning unit 43. When local models La, Lb, and Lc are not to be distinguished from one another, they may be simply referred to as local model L.
- the communication establishment unit 41 is a specific example of the communication establishment unit 11 described above.
- the communication establishment unit 41 establishes secure communication with the information terminal 2. Specifically, the communication establishment unit 41 connects to a VPN device 3 such as a VPN server via the public network PN, and requests a VPN connection from the VPN device 3. First, a TCP/IP connection is established between the learning device 4 and the VPN device 3. Then, the learning device 4 is authenticated, and a VPN session is established between the learning device 4 and the VPN device 3. After the acquisition unit 42 acquires the data set, the communication establishment unit 41 terminates the VPN session.
- the learning device 4 may be connected to the network N by a remote access VPN.
- the timing at which the communication establishment unit 41 establishes secure communication (i.e., the timing at which the learning device 4 connects to the network N via VPN) will be described later, because the timing may be related to the progress of processing in the learning unit 43.
- the timing at which secure communication is established with the information terminal 2a, the timing at which secure communication is established with the information terminal 2b, and the timing at which secure communication is established with the information terminal 2c may be different from each other.
- the acquisition unit 42 is a specific example of the acquisition unit 12 described above.
- the acquisition unit 42 acquires a data set from the information terminal 2 after the learning device 4 is connected to the network N via VPN.
- the learning unit 43 is a specific example of the learning unit 13 described above.
- the learning unit 43 trains the corresponding local model L on the data set acquired by the acquisition unit 42.
- the integration unit 44 is a specific example of the integration unit 14 described above.
- the integration unit 44 integrates the local models La, Lb, and Lc learned by the learning unit 43.
- the integrated model is called a global model.
- the integration unit 44 may integrate the local models La, Lb, and Lc at a predetermined timing (e.g., once a day, once every few months).
- the global model has higher performance than the local models La, Lb, and Lc.
- the integration unit 44 may also perform a process of integrating the local models La, Lb, and Lc after the learning of the local models La, Lb, and Lc is completed.
- the integration unit 44 may generate a global model by, for example, taking the arithmetic average of the model parameters of the local model La, the model parameters of the local model Lb, and the model parameters of the local model Lc. Note that the method of integrating the model parameters is not limited to the arithmetic average.
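The arithmetic-average integration described above can be sketched as follows, treating each local model as a mapping from parameter names to parameter vectors (this representation is an assumption; as noted, a weighted average by data set size would be one alternative integration method):

```python
def integrate(models: list[dict[str, list[float]]]) -> dict[str, list[float]]:
    """Integrate local models La, Lb, Lc, ... into a global model by taking
    the arithmetic average of each model parameter, element by element."""
    n = len(models)
    return {
        name: [sum(m[name][i] for m in models) / n
               for i in range(len(models[0][name]))]
        for name in models[0]
    }
```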
- the learning device 4 distributes the global model to the information terminals 2a, 2b, and 2c. For example, after the process of generating the global model is completed, the learning device 4 may connect to the networks Na, Nb, and Nc in order via VPN and transmit the global model to the information terminals 2a, 2b, and 2c.
- the learning device 4 may also connect to the network N via VPN in response to a request from each information terminal 2 and transmit the global model to the information terminal 2.
- Each information terminal 2 can import the global model at any time.
- Organizations A, B, and C will be able to utilize a high-performance global model that links data sets owned by multiple organizations.
- Constructing multiple local models L and integrating the multiple local models L is also called federated learning.
- the learning device 4 is performing federated learning.
- constructing a local model L in local terminals such as the information terminals 2a, 2b, and 2c is also sometimes called federated learning.
- the learning device 4 constructs the local model L.
- the learning device 4 sequentially repeats the process of establishing secure communication, the process of acquiring a dataset, and the process of training a local model from the acquired dataset. This makes it possible to improve the performance of the global model based on the datasets accumulated daily in each information terminal 2. Note that the process of integrating multiple local models may be performed at any time.
- the communication establishment unit 41 may establish secure communication at a predetermined timing.
- the predetermined timing may be once every few months, or once every few days.
- the communication establishment unit 41 may also establish secure communication in response to receiving a request from each information terminal 2. For example, the information terminal 2 transmits a request when the amount of accumulated data sets reaches or exceeds a predetermined amount.
- the communication establishment unit 41 may establish the next secure communication based on the progress of learning the dataset to the local model L.
- the dataset is divided into multiple batches, and the local model L learns the multiple batches in sequence.
- the process of dividing the dataset into batches and learning the multiple batches is repeated a predetermined number of times.
- the predetermined number of times is set so that the model parameters of the local model L converge.
- the predetermined number of times needs to be set small enough so as not to cause overfitting.
- the communication establishment unit 41 may establish the next secure communication when the model parameters of the local model have converged.
- the progress of learning may be represented by the number of learning iterations or the number of learned batches. For example, if a dataset is divided into five batches and learning is repeated for ten iterations, the next secure communication may be established when learning is complete, that is, when the tenth iteration is finished.
- the communication establishment unit 41 may also establish the next secure communication when learning is nearing completion, for example, when the fourth batch of the tenth iteration is completed.
- the communication establishment unit 41 may sequentially establish secure communication with the information terminals 2a, 2b, and 2c when the learning progress of the local model La, the learning progress of the local model Lb, and the learning progress of the local model Lc exceed a threshold value. Also, the communication establishment unit 41 may establish secure communication with the corresponding information terminal 2 when the learning progress of any of the local models L exceeds a threshold value.
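The progress-based timing in the five-batch, ten-iteration example above can be sketched as follows; the convention that progress is the fraction of completed batch updates (counting completed iterations plus completed batches of the current iteration) is an assumption of this sketch, not the disclosure's:

```python
def learning_progress(iteration: int, batch: int,
                      n_iterations: int = 10, n_batches: int = 5) -> float:
    """Progress of local-model training as a fraction of all batch updates.
    `iteration` = completed full iterations, `batch` = completed batches of
    the current iteration (5 batches x 10 iterations, as in the example)."""
    return (iteration * n_batches + batch) / (n_iterations * n_batches)

def should_establish_next_communication(iteration: int, batch: int,
                                        threshold: float = 1.0) -> bool:
    """Threshold 1.0 connects when the tenth iteration finishes; a threshold
    of 0.98 connects at the fourth batch of the tenth iteration."""
    return learning_progress(iteration, batch) >= threshold
```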
- the communication establishment unit 41 may establish the next secure communication based on the progress of the process of integrating multiple local models L. If the process in the integration unit 44 is not a simple arithmetic average or if there are a large number of organizations, the process in the integration unit 44 may take a long time. It is efficient if the next process can be started after the process in the integration unit 44 is completed.
- Secure computation is a technique for performing computational processing on data while keeping it encrypted, and known secure computation techniques include multi-party computation (MPC) and homomorphic encryption.
- FIG. 4 is a flowchart showing the process flow for generating a local model L. It is assumed that the learning device 4 stores an initial local model L (step S101).
- in step S102, the communication establishment unit 41 of the learning device 4 determines whether it is time to establish secure communication. If it is not yet time to establish secure communication (NO in step S102), the process returns to step S102.
- the communication establishment unit 41 establishes secure communication between the information terminal 2 and the learning device 4, and the acquisition unit 42 acquires a data set from the information terminal 2 (step S103). After that, the communication establishment unit 41 ends the secure communication.
- in step S103, multiple data sets may be acquired.
- secure communication is established between the information terminal 2a and the learning device 4, the acquisition unit 42 acquires the data set from the information terminal 2a, and the communication establishment unit 41 terminates the secure communication.
- secure communication is established between the information terminal 2b and the learning device 4, the acquisition unit 42 acquires the data set from the information terminal 2b, and the communication establishment unit 41 terminates the secure communication.
- secure communication is established between the information terminal 2c and the learning device 4, the acquisition unit 42 acquires the data set from the information terminal 2c, and the communication establishment unit 41 terminates the secure communication.
- a data set may be acquired from any of the information terminals 2a, 2b, and 2c.
- the learning unit 43 trains the local model L on the data set acquired in step S103, and updates the local model L (step S104). If multiple data sets are acquired in step S103, multiple local models L may be updated in step S104. After updating the local model L, the process returns to step S102. Note that the process of integrating multiple local models L may be performed at any timing.
- the learning device connects to each organization's network via VPN at an appropriate communication timing and acquires the data set of that organization. This allows the data set to be received securely and a local model to be constructed at an appropriate timing.
- the secure communication is not limited to communication via a VPN.
- the secure communication may be communication via any secure communication protocol (e.g., an encryption protocol).
- the data set may be sent from the information terminal 2 to the learning device 4 by email using a secure communication protocol (e.g., S/MIME).
- the device equipped with the integration unit 44 that integrates the global model may be different from the device equipped with the learning unit 43 that constructs the local model L.
- the device equipped with the integration unit 44 may establish secure communication (e.g., SSL) with the device equipped with the learning unit 43 to acquire the local model L.
- the third embodiment is a specific example of the second embodiment.
- the learning device according to the third embodiment integrates model parameters of local models by secure computation.
- Fig. 5 is a block diagram showing the configuration of a learning system 100a according to the third embodiment. Comparing Fig. 2 with Fig. 5, a server group 5 has been added.
- the server group 5 includes multiple secure computation servers 51. Note that the number of secure computation servers 51 is not limited to three. However, when performing secure computation, it is preferable that the number of secure computation servers 51 is three or more.
- the server group 5 integrates the local model La, the local model Lb, and the local model Lc, and transmits the result of the secure computation to the learning device 4.
- the integration unit 44 of the learning device 4 divides the model parameters of the local model La into multiple shares (e.g., three) and transmits the multiple shares to the multiple secure computation servers 51.
- the integration unit 44 divides the model parameters of the local model Lb into multiple shares and transmits the multiple shares to the multiple secure computation servers 51.
- the integration unit 44 divides the model parameters of the local model Lc into multiple shares and transmits the multiple shares to the multiple secure computation servers 51.
- Each secure computation server 51 uses the received shares to perform secure computation to calculate the global model.
- the local model cannot be known from the shares, and computation using the shares can be considered secure computation.
- Multiple secure computation servers 51 may cooperate to perform multi-party computation (MPC). Since the amount of computation required to integrate the local model L is sufficiently small, it is considered that the server group 5 can perform secure computation in a reasonable amount of time.
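The share-based integration described above can be sketched with additive secret sharing over a prime field; the modulus, the fixed-point scaling, and the function names below are illustrative assumptions. Each model parameter is split so that any subset of fewer than all shares reveals nothing about it, and each secure computation server 51 only ever sums the shares it holds:

```python
import random

PRIME = 2**61 - 1   # field modulus (illustrative choice)
SCALE = 10**6       # fixed-point scaling for real-valued model parameters

def share(value: float, n_servers: int = 3) -> list[int]:
    """Split one model parameter into additive shares; any proper subset of
    the shares is uniformly random and reveals nothing about the parameter."""
    x = int(round(value * SCALE)) % PRIME
    shares = [random.randrange(PRIME) for _ in range(n_servers - 1)]
    shares.append((x - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> float:
    """Recombine shares (or per-server share sums) into a plaintext value."""
    x = sum(shares) % PRIME
    if x > PRIME // 2:   # map back to the signed range
        x -= PRIME
    return x / SCALE

def server_side_sum(shares_from_each_model: list[int]) -> int:
    """What one secure computation server does: sum the share it holds from
    La, Lb, Lc, ... without ever seeing a plaintext local model."""
    return sum(shares_from_each_model) % PRIME
```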
- the third embodiment also has the same effect as the second embodiment. Furthermore, according to the third embodiment, the calculation for integrating the local models into the global model can be kept confidential.
- the above-mentioned program includes a set of instructions (or software code) for causing the computer to perform one or more functions described in the embodiments when the program is loaded into the computer.
- the program may be stored on a non-transitory computer-readable medium or a tangible storage medium.
- computer-readable medium or tangible storage medium may include random-access memory (RAM), read-only memory (ROM), flash memory, solid-state drive (SSD) or other memory technology, CD-ROM, digital versatile disc (DVD), Blu-ray (registered trademark) disc or other optical disk storage, magnetic cassette, magnetic tape, magnetic disk storage or other magnetic storage device.
- the program may be transmitted on a transitory computer-readable medium or a communication medium.
- the transitory computer-readable medium or communication medium may include electrical, optical, acoustic, or other forms of propagated signals.
- a part or all of the above-described embodiments can be described as, but is not limited to, the following supplementary notes.
- Appendix 1 A learning device comprising: a communication establishment means for establishing secure communication with an information terminal disposed on the network of each organization; an acquisition means for acquiring a data set for each organization from the information terminal using the secure communication; a learning means for training a local model on the data set; and an integration means for integrating a plurality of local models trained on a plurality of data sets.
- Appendix 2 The learning device according to Appendix 1, wherein the communication establishment means establishes a next secure communication based on a progress of learning of the local model.
- a learning system comprising: an information terminal disposed on the network of each organization; and a learning device, wherein the learning device establishes secure communication with the information terminal, acquires a data set for each organization from the information terminal using the secure communication, trains a local model on the data set, and integrates multiple local models trained on multiple data sets.
- the learning device establishes a next secure communication based on a progress of learning in the local model.
- the computer Establish secure communications between information terminals located on each organization's network, Acquiring a data set for each organization from an information terminal using the secure communication; training a local model on said dataset; A learning method that integrates multiple local models trained on multiple datasets.
- Reference Signs List: 1, 4 Learning device; 11, 41 Communication establishment unit; 12, 42 Acquisition unit; 13, 43 Learning unit; 14, 44 Integration unit; 2, 2a, 2b, 2c Information terminal; 3, 3a, 3b, 3c VPN device; 5 Server group; 51 Secure computation server; 100, 100a Learning system; N, Na, Nb, Nc Network; PN Public network
Abstract
Description
A communication establishment means for establishing secure communication with an information terminal disposed on the network of each organization;
an acquisition means for acquiring a data set for each organization from the information terminal using the secure communication;
a learning means for training a local model on the data set; and
an integration means for integrating a plurality of local models trained on a plurality of data sets
are provided.
A learning system comprising:
an information terminal disposed on the network of each organization; and
a learning device, wherein
the learning device
establishes secure communication with the information terminal,
acquires a data set for each organization from the information terminal using the secure communication,
trains a local model on the data set, and
integrates a plurality of local models trained on a plurality of data sets.
A computer
establishes secure communication with an information terminal disposed on the network of each organization,
acquires a data set for each organization from the information terminal using the secure communication,
trains a local model on the data set, and
integrates a plurality of local models trained on a plurality of data sets.
A program is stored that causes a computer to execute:
a process of establishing secure communication with an information terminal disposed on the network of each organization;
a process of acquiring a data set for each organization from the information terminal using the secure communication;
a process of training a local model on the data set; and
a process of integrating a plurality of local models trained on a plurality of data sets.
FIG. 1 is a block diagram showing the configuration of a learning device 1 according to the first embodiment. The learning device 1 includes a communication establishment unit 11, an acquisition unit 12, a learning unit 13, and an integration unit 14. The learning device 1 is connected to a public network (not shown). The networks of the respective organizations are connected to the public network. An information terminal (not shown) is disposed in the network of each organization. The information terminal is a repository in which data sets owned by each organization are accumulated.
The second embodiment is a specific example of the first embodiment. FIG. 2 is a schematic diagram showing the configuration of a learning system 100 according to the second embodiment. The learning system 100 includes information terminals 2a, 2b, and 2c, VPN devices 3a, 3b, and 3c, and a learning device 4. The learning device 4 is a specific example of the learning device 1 described above.
The device equipped with the integration unit 44 that integrates the global model may be different from the device equipped with the learning unit 43 that constructs the local model L. In this case, the device equipped with the integration unit 44 may establish secure communication (e.g., SSL) with the device equipped with the learning unit 43 and acquire the local model L. This makes it possible to secure not only the communication between the repository in which the data sets are accumulated (e.g., the information terminal 2) and the local model L, but also the communication between the local model L and the global model.
The third embodiment is a specific example of the second embodiment. The learning device according to the third embodiment integrates the model parameters of the local models by secure computation. FIG. 5 is a block diagram showing the configuration of a learning system 100a according to the third embodiment. Comparing FIG. 2 with FIG. 5, a server group 5 has been added.
(Appendix 1)
A learning device comprising:
a communication establishment means for establishing secure communication with an information terminal disposed on the network of each organization;
an acquisition means for acquiring a data set for each organization from the information terminal using the secure communication;
a learning means for training a local model on the data set; and
an integration means for integrating a plurality of local models trained on a plurality of data sets.
(Appendix 2)
The learning device according to Appendix 1, wherein the communication establishment means establishes a next secure communication based on a progress of learning of the local model.
(Appendix 3)
The learning device according to Appendix 1, wherein the communication establishment means establishes the next secure communication when model parameters of the local model have converged.
(Appendix 4)
The learning device according to Appendix 1, wherein the communication establishment means establishes the secure communication at a predetermined timing.
(Appendix 5)
The learning device according to Appendix 1, wherein the communication establishment means establishes the secure communication in response to receiving a request from each information terminal.
(Appendix 6)
The learning device according to Appendix 5, wherein the request is transmitted by the information terminal when the amount of data of the data set accumulated in the information terminal exceeds a predetermined amount.
(Appendix 7)
The learning device according to Appendix 1, wherein the communication establishment means establishes a next secure communication based on a progress of a process of integrating the plurality of local models.
(Appendix 8)
The learning device according to Appendix 7, wherein the integration means integrates the plurality of local models using a secure computation technique.
(Appendix 9)
The learning device according to any one of Appendices 1 to 8, wherein the communication establishment means establishes the secure communication by connecting the learning device to the network via a VPN (Virtual Private Network).
(Appendix 10)
A learning system comprising:
an information terminal disposed on the network of each organization; and
a learning device, wherein
the learning device
establishes secure communication with the information terminal,
acquires a data set for each organization from the information terminal using the secure communication,
trains a local model on the data set, and
integrates a plurality of local models trained on a plurality of data sets.
(Appendix 11)
The learning system according to Appendix 10, wherein the learning device establishes a next secure communication based on a progress of learning in the local model.
(Appendix 12)
A learning method in which a computer
establishes secure communication with an information terminal disposed on the network of each organization,
acquires a data set for each organization from the information terminal using the secure communication,
trains a local model on the data set, and
integrates a plurality of local models trained on a plurality of data sets.
(Appendix 13)
A non-transitory computer-readable medium storing a program for causing a computer to execute:
a process of establishing secure communication with an information terminal disposed on the network of each organization;
a process of acquiring a data set for each organization from the information terminal using the secure communication;
a process of training a local model on the data set; and
a process of integrating a plurality of local models trained on a plurality of data sets.
Claims (13)
- A learning device comprising:
a communication establishment means for establishing secure communication with an information terminal arranged in the network of each organization;
an acquisition means for acquiring, using the secure communication, a dataset for each organization from the information terminal;
a learning means for training a local model on the dataset; and
an integration means for integrating a plurality of local models trained on a plurality of datasets.
- The learning device according to claim 1, wherein the communication establishment means establishes the next secure communication based on the progress of training of the local model.
- The learning device according to claim 1, wherein the communication establishment means establishes the next secure communication when model parameters of the local model have converged.
- The learning device according to claim 1, wherein the communication establishment means establishes the secure communication at a predetermined timing.
- The learning device according to claim 1, wherein the communication establishment means establishes the secure communication in response to receiving a request from each information terminal.
- The learning device according to claim 5, wherein the request is transmitted when the amount of data of the dataset accumulated in the information terminal exceeds a predetermined amount.
- The learning device according to claim 1, wherein the communication establishment means establishes the next secure communication based on the progress of the process of integrating the plurality of local models.
- The learning device according to claim 7, wherein the integration means integrates the plurality of local models using secure computation technology.
- The learning device according to any one of claims 1 to 8, wherein the communication establishment means establishes the secure communication by connecting the learning device to the network via a VPN (Virtual Private Network).
- A learning system comprising:
an information terminal arranged in the network of each organization; and
a learning device,
wherein the learning device:
establishes secure communication with the information terminal,
acquires a dataset for each organization from the information terminal using the secure communication,
trains a local model on the dataset, and
integrates a plurality of local models trained on a plurality of datasets.
- The learning system according to claim 10, wherein the learning device establishes the next secure communication based on the progress of training in the local model.
- A learning method in which a computer:
establishes secure communication with an information terminal arranged in the network of each organization,
acquires a dataset for each organization from the information terminal using the secure communication,
trains a local model on the dataset, and
integrates a plurality of local models trained on a plurality of datasets.
- A non-transitory computer-readable medium storing a program for causing a computer to execute:
a process of establishing secure communication with an information terminal arranged in the network of each organization;
a process of acquiring a dataset for each organization from the information terminal using the secure communication;
a process of training a local model on the dataset; and
a process of integrating a plurality of local models trained on a plurality of datasets.
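Claims 2 and 3 condition the next secure communication on training progress, specifically on convergence of the local model's parameters. A minimal sketch of such a trigger follows; the parameter-change threshold used as the convergence test is an assumption (the claims do not fix a criterion), and the channel-establishment callback is a hypothetical stand-in.

```python
# Hypothetical sketch of the trigger in claims 2 and 3: establish the next
# secure communication once the local model's parameters converge. The
# tolerance-based convergence test is an assumed, illustrative criterion.

def parameters_converged(prev_params, params, tol=1e-3):
    """Treat the model as converged when no parameter moved more than tol."""
    return all(abs(p - q) <= tol for p, q in zip(prev_params, params))

def maybe_establish_next_communication(prev_params, params, establish):
    """Invoke the (hypothetical) channel-establishment callback on convergence."""
    if parameters_converged(prev_params, params):
        establish()
        return True
    return False

established = []
# Parameters still moving noticeably: no new channel is established.
maybe_establish_next_communication([0.5, 1.0], [0.4, 1.2],
                                   lambda: established.append("channel"))
# Parameters essentially unchanged: the next secure communication is set up.
maybe_establish_next_communication([0.5, 1.0], [0.5, 1.0004],
                                   lambda: established.append("channel"))
```

Claim 4's fixed-schedule variant would simply replace the convergence test with a timer; claim 5's variant would instead react to a request from the information terminal.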
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/874,190 US20250373585A1 (en) | 2022-09-30 | 2022-09-30 | Learning apparatus, learning system, learning method, and computer readable medium |
| JP2024549044A JPWO2024069956A5 (ja) | 2022-09-30 | | Learning device, learning system, learning method, and program |
| PCT/JP2022/036759 WO2024069956A1 (ja) | 2022-09-30 | 2022-09-30 | Learning device, learning system, learning method, and computer-readable medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/036759 WO2024069956A1 (ja) | 2022-09-30 | 2022-09-30 | Learning device, learning system, learning method, and computer-readable medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024069956A1 true WO2024069956A1 (ja) | 2024-04-04 |
Family
ID=90476702
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/036759 Ceased WO2024069956A1 (ja) | Learning device, learning system, learning method, and computer-readable medium | 2022-09-30 | 2022-09-30 |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US20250373585A1 (ja) |
| WO (1) | WO2024069956A1 (ja) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015061304A (ja) * | 2013-09-20 | 2015-03-30 | Fujitsu Limited | Communication control device, communication control method, and communication control program |
| JP2017163360A (ja) * | 2016-03-09 | 2017-09-14 | Fujitsu Limited | Data management method and data management system |
| JP2018147261A (ja) * | 2017-03-06 | 2018-09-20 | KDDI Corporation | Model integration device, model integration system, method, and program |
| JP2019526851A (ja) * | 2016-07-18 | 2019-09-19 | Nant Holdings IP, LLC | Distributed machine learning systems, apparatus, and methods |
| JP2020046928A (ja) * | 2018-09-19 | 2020-03-26 | Canon Inc. | Information processing device, information processing system, information processing method, and program |
2022
- 2022-09-30 WO PCT/JP2022/036759 patent/WO2024069956A1/ja not_active Ceased
- 2022-09-30 US US18/874,190 patent/US20250373585A1/en active Pending
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015061304A (ja) * | 2013-09-20 | 2015-03-30 | Fujitsu Limited | Communication control device, communication control method, and communication control program |
| JP2017163360A (ja) * | 2016-03-09 | 2017-09-14 | Fujitsu Limited | Data management method and data management system |
| JP2019526851A (ja) * | 2016-07-18 | 2019-09-19 | Nant Holdings IP, LLC | Distributed machine learning systems, apparatus, and methods |
| JP2018147261A (ja) * | 2017-03-06 | 2018-09-20 | KDDI Corporation | Model integration device, model integration system, method, and program |
| JP2020046928A (ja) * | 2018-09-19 | 2020-03-26 | Canon Inc. | Information processing device, information processing system, information processing method, and program |
Also Published As
| Publication number | Publication date |
|---|---|
| US20250373585A1 (en) | 2025-12-04 |
| JPWO2024069956A1 (ja) | 2024-04-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN111611610B (zh) | Federated learning information processing method, system, storage medium, program, and terminal | |
| Zhang et al. | Federated learning for the internet of things: Applications, challenges, and opportunities | |
| Zhang et al. | SafeCity: Toward safe and secured data management design for IoT-enabled smart city planning | |
| Nguyen et al. | Impact of network delays on Hyperledger Fabric | |
| CN110135986B (zh) | Searchable encrypted file data method based on blockchain smart contracts | |
| CN111901309A (zh) | Data security sharing method, system, and device | |
| CN113537495B (zh) | Federated learning-based model training system, method, apparatus, and computer device | |
| EP3566389A1 (en) | Distributed privacy-preserving verifiable computation | |
| Manju Bala et al. | Blockchain-based IoT architecture for software-defined networking | |
| US10873455B2 (en) | Techniques for encryption key rollover synchronization in a network | |
| Pajic et al. | Topological conditions for in-network stabilization of dynamical systems | |
| CN110310176B (zh) | Data encryption method and device based on a blockchain network | |
| CN112235266A (zh) | Data processing method, apparatus, device, and storage medium | |
| US11212083B2 (en) | Slave secure sockets layer proxy system | |
| CN117581505A (zh) | Method and system for multi-user quantum key distribution and management | |
| CN115481441A (zh) | Differential privacy protection method and apparatus for federated learning | |
| Danner et al. | Robust fully distributed minibatch gradient descent with privacy preservation | |
| US20240177018A1 (en) | Systems and Methods for Differentially Private Federated Machine Learning for Large Models and a Strong Adversary | |
| US20200267138A1 (en) | Centrally managing data for distributed identity-based firewalling | |
| CN116057528A (zh) | Method and apparatus for supporting secure data routing | |
| WO2019186484A1 (zh) | Protocol configuration system, device and method in industrial cloud | |
| WO2024069956A1 (ja) | Learning device, learning system, learning method, and computer-readable medium | |
| WO2024069957A1 (ja) | Learning device, learning system, learning method, and computer-readable medium | |
| Khan et al. | IoT and blockchain integration challenges | |
| CN114760023A (zh) | Model training method, apparatus, and storage medium based on federated learning | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22961015; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 18874190; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024549044; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 22961015; Country of ref document: EP; Kind code of ref document: A1 |
| | WWP | Wipo information: published in national office | Ref document number: 18874190; Country of ref document: US |