WO2025108195A1 - Model determination method and apparatus, and communication device
- Publication number
- WO2025108195A1 (application PCT/CN2024/132393)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- model
- terminal device
- reference signal
- scene information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/14—Network analysis or design
- H04L41/145—Network analysis or design involving simulating, designing, planning or modelling of a network
- H04L41/16—Arrangements for maintenance, administration or management of data switching networks using machine learning or artificial intelligence
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W24/00—Supervisory, monitoring or testing arrangements
- H04W24/06—Testing, supervising or monitoring using simulated traffic
Definitions
- the present application belongs to the field of communication technology, and specifically relates to a model determination method, device and communication equipment.
- model supervision is an important part of the lifecycle management of machine learning models.
- the existing model supervision methods based on model input/output only indicate whether the current model is valid, but do not indicate which machine learning model should be applied to the environment in which the current terminal device is located.
- as the terminal device moves, some data corresponding to the terminal device, such as location information and communication data, will change, and the data characteristics will change accordingly. If the environment in which the terminal device is currently located does not match the machine learning model running in the terminal device or the network-side device, the efficiency and accuracy of data processing will be affected, and thus the communication quality of the terminal device will be affected.
- the embodiments of the present application provide a model determination method, apparatus, and communication device, which can solve the problem in the related art that it is impossible to indicate which machine learning model should be applied to the environment in which the current terminal device is located.
- a model determination method comprising:
- the first device determines a first model based on the first information
- the first device activates the first model
- the first information includes any one of the following:
- Scene information of the terminal device and a mapping relationship between the machine learning model and the scene information; or
- Model information of the machine learning model associated with the scene information in which the terminal device is located;
- wherein the first device is the terminal device or a network side device.
- a data transmission method comprising:
- the second device sends sixth information to the first device, where the sixth information includes at least one of the following:
- third information, where the third information is used to indicate an association relationship between position coordinates and scene information;
- a first indication, where the first indication is used to indicate scene information of a terminal device;
- a second indication, where the second indication is used to indicate a mapping relationship between the machine learning model and the scene information;
- a third indication, where the third indication is used to indicate a machine learning model associated with the scene information in which the terminal device is located.
- a model determination device which is applied to a first device, and the device includes:
- a model determination module configured to determine a first model based on the first information
- a model activation module used for activating the first model
- the first information includes any one of the following:
- Scene information of the terminal device and a mapping relationship between the machine learning model and the scene information; or
- Model information of the machine learning model associated with the scene information in which the terminal device is located;
- wherein the first device is the terminal device or a network side device.
- a data transmission apparatus which is applied to a second device, and the apparatus includes:
- the information sending module is used to send sixth information to the first device, where the sixth information includes at least one of the following:
- third information, where the third information is used to indicate an association relationship between position coordinates and scene information;
- a first indication, where the first indication is used to indicate scene information of a terminal device;
- a second indication, where the second indication is used to indicate a mapping relationship between the machine learning model and the scene information;
- a third indication, where the third indication is used to indicate a machine learning model associated with the scene information in which the terminal device is located.
- a communication device which includes a processor and a memory, wherein the memory stores a program or instruction that can be run on the processor, and when the program or instruction is executed by the processor, the steps of the model determination method as described in the first aspect are implemented, or the steps of the data transmission method as described in the second aspect are implemented.
- a model determination system comprising: a first device and a second device, wherein the first device can be used to execute the steps of the model determination method as described in the first aspect above, and the second device can be used to execute the steps of the data transmission method as described in the second aspect above.
- a readable storage medium on which a program or instruction is stored.
- the program or instruction is executed by a processor, the steps of the model determination method as described in the first aspect are implemented, or the steps of the data transmission method as described in the second aspect are implemented.
- a chip comprising a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is used to run a program or instruction to implement the method described in the first aspect, or to implement the method described in the second aspect.
- a computer program/program product is provided, wherein the computer program/program product is stored in a storage medium and is executed by at least one processor to implement the steps of the method described in the first aspect or the second aspect.
- a model determination apparatus/device is provided, wherein the apparatus/device is configured to execute the steps of the model determination method as described in the first aspect.
- a data transmission apparatus/device is provided, wherein the apparatus/device is configured to execute the steps of the data transmission method as described in the second aspect.
- the embodiment of the present application associates the machine learning model with the scene.
- the first device can determine which model should be applied in the current environment based on the scene information of the terminal device and the mapping relationship between the machine learning model and the scene information; or, the first device can directly determine the machine learning model associated with the scene information of the current terminal device based on the model information in the first information, and activate the model.
- the first device can determine which model should be applied based on the first information, thereby ensuring that during the movement of the terminal device, the machine learning model running in the terminal device or the network-side device can always adapt to the scene of the terminal device, thereby ensuring the accuracy and efficiency of data processing.
- FIG1 is a block diagram of a wireless communication system to which an embodiment of the present application can be applied;
- FIG2 is a flow chart of a model determination method in an embodiment of the present application.
- FIG3 is a schematic diagram of the structure of a neural network model in an embodiment of the present application.
- FIG4 is a schematic diagram of a neuron in an embodiment of the present application.
- FIG5 is a schematic diagram of a flow chart of a model determination method in an embodiment of the present application.
- FIG6 is a flow chart of another model determination method in an embodiment of the present application.
- FIG7 is a flow chart of a model determination method in an embodiment of the present application.
- FIG8 is a flow chart of another model determination method in an embodiment of the present application.
- FIG9 is a flow chart of a data transmission method in an embodiment of the present application.
- FIG10 is a structural block diagram of a model determination device in an embodiment of the present application.
- FIG11 is a structural block diagram of a data transmission device in an embodiment of the present application.
- FIG12 is a structural block diagram of a communication device in an embodiment of the present application.
- FIG13 is a block diagram of a terminal device in an embodiment of the present application.
- FIG14 is a structural block diagram of a network side device in an embodiment of the present application.
- FIG15 is a structural block diagram of another network-side device in an embodiment of the present application.
- "first", "second", etc. in the specification and claims of the present application are used to distinguish similar objects, and are not used to describe a specific order or sequence. It should be understood that the terms used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be implemented in an order other than those illustrated or described here; the objects distinguished by "first" and "second" are generally of the same type, and the number of objects is not limited.
- for example, the first object can be one or more.
- "and/or" in the specification and claims represents at least one of the connected objects, and the character "/" generally represents that the objects associated with each other are in an "or" relationship.
- LTE Long Term Evolution
- LTE-A Long Term Evolution-Advanced
- CDMA Code Division Multiple Access
- TDMA Time Division Multiple Access
- FDMA Frequency Division Multiple Access
- OFDMA Orthogonal Frequency Division Multiple Access
- SC-FDMA Single-carrier Frequency Division Multiple Access
- NR new radio
- FIG1 shows a block diagram of a wireless communication system applicable to an embodiment of the present application.
- the wireless communication system includes a terminal device 11 and a network side device 12.
- the terminal device 11 may be a mobile phone, a tablet computer (Tablet Personal Computer), a laptop computer or notebook computer, a personal digital assistant (Personal Digital Assistant, PDA), a handheld computer, a netbook, an ultra-mobile personal computer (UMPC), a mobile Internet device (Mobile Internet Device, MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device (Wearable Device), a vehicle-mounted device (VUE), a pedestrian terminal (PUE), a smart home device (a home appliance with wireless communication functions, such as a refrigerator, television, washing machine or furniture), a game console, a personal computer (PC), a teller machine, a self-service machine or another terminal-side device; wearable devices include smart watches, smart bracelets, smart headphones, smart glasses, smart jewelry, etc.
- the network side device 12 may include access network equipment or core network equipment, wherein the access network device 12 may also be called wireless access network equipment, wireless access network (Radio Access Network, RAN), wireless access network function or wireless access network unit.
- the access network device 12 may include a base station, a WLAN access point or a WiFi node, etc.
- the base station may be called a Node B, an evolved Node B (eNB), an access point, a base transceiver station (Base Transceiver Station, BTS), a radio base station, a radio transceiver, a basic service set (Basic Service Set, BSS), an extended service set (Extended Service Set, ESS), a home Node B, a home evolved Node B, a transmitting and receiving point (Transmitting Receiving Point, TRP) or some other suitable term in the field.
- the base station is not limited to specific technical vocabulary. It should be noted that in the embodiments of the present application, only the base station in the NR system is taken as an example for introduction, and the specific type of the base station is not limited.
- the core network equipment may include but is not limited to at least one of the following: core network nodes, core network functions, mobility management entity (Mobility Management Entity, MME), access and mobility management function (Access and Mobility Management Function, AMF), session management function (Session Management Function, SMF), user plane function (User Plane Function, UPF), policy control function (Policy Control Function, PCF), policy and charging rules function (Policy and Charging Rules Function, PCRF), edge application server discovery function (Edge Application Server Discovery Function, EASDF), etc.
- MME Mobility Management Entity
- AMF Access and Mobility Management Function
- SMF Session Management Function
- UPF User Plane Function
- PCF Policy Control Function
- PCRF Policy and Charging Rules Function
- EASDF Edge Application Server Discovery Function
- the embodiment of the present application provides a model determination method.
- Referring to FIG. 2, a flow chart of a model determination method provided by an embodiment of the present application is shown. The method is applied to a first device and may specifically include the following steps:
- Step 201 A first device determines a first model based on first information.
- Step 202 The first device activates the first model.
- the first information includes any one of the following:
- Scene information of the terminal device and a mapping relationship between the machine learning model and the scene information; or
- Model information of the machine learning model associated with the scene information in which the terminal device is located;
- wherein the first device is the terminal device or a network side device.
- the first device may be a terminal device or a network side device.
- the terminal device may include a conventional terminal device and/or a positioning reference unit.
- the conventional terminal device may be the terminal device 11 in Figure 1.
- PRU Positioning Reference Unit
- the PRU can send a positioning reference signal (Positioning reference signal, PRS) to the transmission and receiving point (Transmission and Receiving Point, TRP), so that the TRP can measure and report the uplink (Up-Link, UL) positioning measurement values of the PRU from a known position, such as RTOA, UL-AoA, gNB Rx-Tx time difference, etc.
- PRS Positioning reference signal
- TRP Transmission and Receiving Point
- the location server may compare the PRU measurements with expected measurements at known PRU locations to determine correction terms for other nearby target devices, and then correct the DL and/or UL location measurements of the other target devices based on the correction terms.
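- To make the correction step concrete, the following is a minimal sketch (illustrative only; the function names, variable names and numeric values are hypothetical and not part of the application) of how a location server could derive a correction term from a PRU measurement at a known position and apply it to a nearby target device's measurement.

```python
# Hypothetical sketch of PRU-based measurement correction (illustrative only).

def correction_term(pru_measured: float, pru_expected: float) -> float:
    """The correction is the difference between the measurement expected at the
    PRU's known position and the measurement actually reported for the PRU
    (e.g., an RTOA or a gNB Rx-Tx time difference)."""
    return pru_expected - pru_measured

def correct_measurement(target_measured: float, correction: float) -> float:
    """Apply the PRU-derived correction term to a nearby target device's
    raw positioning measurement."""
    return target_measured + correction

# Example values (arbitrary units): the PRU reports 105.2 while 100.0 is
# expected at its known location; a nearby target's raw measurement is 212.7.
corr = correction_term(pru_measured=105.2, pru_expected=100.0)
print(correct_measurement(212.7, corr))  # 207.5
```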
- the network side device can be the access network device in Figure 1, such as a base station or a newly defined artificial intelligence processing node on the access network side, or it can be the core network device in Figure 1, such as a network data analysis function (Network Data Analytics Function, NWDAF), a positioning management function (Location Management Function, LMF), or a newly defined processing node on the core network side, or it can be a combination of the above multiple nodes.
- NWDAF Network Data Analytics Function
- LMF Location Management Function
- the machine learning model can be trained by a network-side device, and the network-side device sends the trained machine learning model to the terminal device through model transfer/delivery.
- the network-side device records the association between the model identifier of each machine learning model and the scenario information.
- the machine learning model is trained by a third-party server, which sends the trained machine learning model to the terminal device and/or network side device, and sends the association between the model identifier of the machine learning model and the scene information to the terminal device and/or network side device.
- the machine learning model in the embodiment of the present application can be an artificial intelligence (AI) model, such as any one of a fully connected neural network, a convolutional neural network, a decision tree, a support vector machine, and a Bayesian classifier.
- AI artificial intelligence
- the neural network may include one or more input layers, one or more hidden layers, and an output layer.
- the data to be processed [X1, X2...Xn] are respectively input into the neural network from the corresponding input layer, and the output result Y is obtained after being processed by the input layer, the hidden layer, and the output layer.
- the neural network is composed of neurons, and a schematic diagram of the neuron is shown in Figure 4.
- a1, a2,...aK represent inputs
- w represents weights (i.e., multiplicative coefficients)
- b represents biases (i.e., additive coefficients)
- σ(·) represents the activation function.
- Common activation functions include Sigmoid (mapping variables between 0 and 1), tanh (translation and contraction of Sigmoid), linear rectification function/rectified linear unit (Rectified Linear Unit, ReLU), etc.
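- As a minimal illustration of the neuron described above (a sketch, not part of the application; the input values, weights and bias are hypothetical), the neuron output is the activation of the weighted sum of the inputs plus the bias:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: maps any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified linear unit: max(0, x)."""
    return np.maximum(0.0, x)

def neuron(a, w, b, activation=sigmoid):
    """Single neuron: weighted sum of the inputs a1..aK (weights w, i.e.
    multiplicative coefficients) plus the bias b (additive coefficient),
    passed through an activation function."""
    return activation(np.dot(w, a) + b)

a = np.array([0.5, -1.2, 3.0])   # inputs a1..aK (hypothetical values)
w = np.array([0.8, 0.1, -0.4])   # weights
b = 0.2                          # bias
print(neuron(a, w, b))           # sigmoid output in (0, 1)
print(neuron(a, w, b, relu))     # ReLU output
```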
- the model training process is introduced as follows:
- the parameters of the neural network can be optimized by the gradient optimization algorithm.
- the gradient optimization algorithm is a type of algorithm that minimizes or maximizes an objective function (sometimes also called a loss function), and the objective function is often a mathematical combination of the model parameters and the data. For example, given data X and its corresponding label Y, a neural network model f(·) can be constructed; the predicted output f(x) can then be obtained from the input x, and the difference between the predicted value and the true value, (f(x) - Y), can be calculated, which is the loss function.
- the optimization goal of the gradient optimization algorithm is to find the appropriate w (i.e. weight) and b (i.e. bias) to minimize the value of the above loss function, and the smaller the loss value, the closer the model is to the actual situation.
- the common optimization algorithms are basically based on the error back propagation (BP) algorithm.
- BP error back propagation
- the basic idea of the BP algorithm is that the learning process consists of two processes: the forward propagation of the signal and the back propagation of the error.
- the input sample is transmitted from the input layer, processed by each hidden layer layer by layer, and then transmitted to the output layer. If the actual output of the output layer does not match the expected output, it will enter the back propagation stage of the error.
- Error back propagation is to propagate the output error layer by layer through the hidden layer to the input layer in some form, and distribute the error to all units in each layer, so as to obtain the error signal of each layer unit, and this error signal is used as the basis for correcting the weights of each unit.
- This process of signal forward propagation and error back propagation, in which the weights of each layer are adjusted, is repeated.
- the process of continuous adjustment of weights is the learning and training process of the network. This process continues until the error of the network output is reduced to an acceptable level, or until the pre-set number of learning times is reached.
- these optimization algorithms calculate the derivative/partial derivative of the current neuron based on the error/loss obtained by the loss function, add the influence of the learning rate, the previous gradient/derivative/partial derivative, etc., get the gradient, and pass the gradient to the previous layer.
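- The following sketch (illustrative only; the toy data and hyperparameters are assumptions, not part of the application) shows this optimization idea on a simple model f(x) = w·x + b with a squared-error loss: the forward pass produces the prediction, the loss is computed from the error, and the gradients with respect to w and b are propagated back to update the parameters.

```python
import numpy as np

# Toy data (an assumption for illustration): the true relation is Y = 2*X + 1.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=100)
Y = 2.0 * X + 1.0 + 0.01 * rng.standard_normal(100)

w, b = 0.0, 0.0          # parameters to optimize
lr = 0.1                 # learning rate

for step in range(200):
    pred = w * X + b                 # forward propagation: f(x)
    err = pred - Y                   # difference between prediction and label
    loss = np.mean(err ** 2)         # loss function
    grad_w = 2.0 * np.mean(err * X)  # dLoss/dw, propagated back to the weight
    grad_b = 2.0 * np.mean(err)      # dLoss/db, propagated back to the bias
    w -= lr * grad_w                 # gradient step on w
    b -= lr * grad_b                 # gradient step on b

print(round(w, 2), round(b, 2))      # approximately 2.0 and 1.0
```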
- the machine learning model may also be referred to as an AI unit, an AI model, an ML (machine learning) model, an ML unit, an AI structure, an AI function, an AI characteristic, a neural network, a neural network function, etc.; alternatively, the AI unit/AI model may also refer to a processing unit capable of implementing specific algorithms, formulas, processing procedures, capabilities, etc.
- the AI unit/AI model may be a processing method, algorithm, function, module or unit for a specific data set, or the AI unit/AI model may be a processing method, algorithm, function, module or unit running on AI/ML related hardware such as a GPU, NPU, TPU, ASIC, etc., and the embodiments of the present application do not specifically limit this.
- the specific data set includes the input and/or output of the AI unit/AI model.
- the identifier of the AI unit/AI model may be an AI model identifier, an AI structure identifier, an AI algorithm identifier, or an identifier of a specific data set associated with the AI unit/AI model, or an identifier of a specific scenario, environment, channel feature, or device related to the AI/ML, or an identifier of a function, feature, capability, or module related to the AI/ML, which is not specifically limited in the embodiments of the present application.
- the first device may determine the first model based on the first information. For example, the first device determines the machine learning model associated with the scene information of the terminal device according to the scene information of the terminal device and the mapping relationship between the machine learning model and the scene information, and determines the model as the first model. Alternatively, in the case where the first information includes model information of the machine learning model associated with the scene information of the terminal device, the first device may directly determine the machine learning model indicated by the model information as the first model.
- the first device can activate the first model, thereby using the first model to process the data generated by the terminal device in the current scenario.
- the machine learning model running in the first device can be used to process the data corresponding to the terminal device, such as determining the location information of the terminal device, analyzing the communication quality of the cell where the terminal device is located, and performing access control on the terminal device, etc.
- the machine learning model running in the first device may not meet the data processing requirements of the scene where the terminal device is currently located.
- the terminal device can determine the first model that matches the scene where the terminal device is currently located based on the first information and activate it.
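- As a minimal sketch of this determination logic (illustrative only; the scene identifiers, model identifiers and function names are hypothetical), the first device either looks the model up from the scene information and the scene-to-model mapping, or takes the model information directly when the first information already carries it:

```python
# Hypothetical mapping relationship between scene information and machine
# learning models, e.g. delivered by a network side device or a third-party server.
scene_to_model = {
    "scene_indoor_factory": "model_A",
    "scene_urban_macro": "model_B",
    "scene_highway": "model_C",
}

def determine_first_model(first_info: dict) -> str:
    """Case 1: the first information carries scene information plus the
    scene-to-model mapping. Case 2: it directly carries the model information
    of the model associated with the scene in which the terminal device is located."""
    if "model_info" in first_info:
        return first_info["model_info"]
    return first_info["mapping"][first_info["scene_info"]]

# Case 1: scene information + mapping relationship.
print(determine_first_model({"scene_info": "scene_urban_macro",
                             "mapping": scene_to_model}))   # model_B
# Case 2: model information directly included in the first information.
print(determine_first_model({"model_info": "model_C"}))     # model_C
```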
- the scene information of the terminal device may include, but is not limited to, the scene ID, scene information, scene category, area ID, area information, area category, data set ID, data set information, data set category, etc. of the scene in which the terminal device is located.
- the granularity of the scene or area or data set may be a cell.
- the scene ID, area ID, and data set ID may be associated with the physical cell identifier (PCI) of one or more cells, so as to determine the scene ID, area ID, and data set ID corresponding to the first device according to the cell in which the first device is located.
- the granularity of the scene or area or data set may also be smaller than the cell.
- a scene may be a factory building, a building, or even a floor in a building in a cell.
- a machine learning model can correspond to one or more scenarios, regions, or datasets.
- the first device can determine that the currently running machine learning model can no longer meet the computing requirements of the current scene. In other words, in the scene currently located by the first device, the machine learning model currently running in the first device is an invalid model. In this case, the first device can switch the running machine learning model to the first model.
- the embodiment of the present application associates the machine learning model with the scene.
- the first device can determine which model should be applied in the current environment based on the scene information of the terminal device and the mapping relationship between the machine learning model and the scene information; or, the first device can directly determine the machine learning model associated with the scene information of the current terminal device based on the model information in the first information, and activate the model.
- the first device can determine which model should be applied based on the first information, thereby ensuring that during the movement of the terminal device, the machine learning model running in the terminal device or the network-side device can always adapt to the scene of the terminal device, thereby ensuring the accuracy and efficiency of data processing.
- the method further includes:
- the first device obtains scene information where the terminal device is located
- the first device obtains a mapping relationship between a machine learning model and scene information.
- the mapping relationship between the machine learning model and the scene information can be generated by a network-side device or a third-party server that trains the model.
- the mapping relationship between the machine learning model and the scene information can be sent to the first device by a network-side device or a third-party server that trains the model;
- the network-side device can locally generate a mapping relationship between the machine learning model and the scene information based on the model training process;
- the network-side device can read the mapping relationship between the machine learning model and the scene information from the third-party server.
- the first device may also obtain the scene information of the terminal device in a variety of ways.
- the first device obtains the scene information in which the terminal device is located, including:
- Step S11 The first device acquires second information, where the second information is used to indicate communication information of the terminal device, and the second information is associated with scene information where the terminal device is located;
- Step S12 The first device determines the scene information of the terminal device according to the second information.
- the second information is associated with the scene in which the first device is located, for example, the second information is associated with the scene ID, area ID, data set ID, scene category, area category, data set category and other information in which the first device is currently located.
- the second information may include the cell ID, reference signal ID, transmission and reception point ID, area ID, tracking area (Tracking Area) ID and the like corresponding to the first device.
- the first device can determine the scene information of the terminal device based on the second information.
- the first device acquiring the second information includes:
- the first device measures the reference signal and determines second information based on the measurement result.
- Referring to FIG. 5, a schematic flow chart of a model determination method provided by an embodiment of the present application is shown.
- if the first device is a terminal device, the first device can determine the second information by measuring a reference signal.
- the first device acquires the second information, including:
- the first device receives second information sent by the terminal device.
- the second information can be generated by the terminal device according to the measurement result of the reference signal and then sent to the network side device.
- the network side device itself does not need to perform any measurement operation on the reference signal.
- the first device determines the scene information of the terminal device according to the second information, including:
- the first device acquires an association relationship between the communication information and the scene information
- the first device determines the scene information of the terminal device according to the second information and the association between the communication information and the scene information.
- the association between the communication information and the scene may be sent by the second device to the first device, or may be specified by the protocol.
- if the first device is a network side device, such as an access network device, the second device may be a core network device;
- if the first device is a terminal device, the second device may be a network side device or a high-level terminal device.
- after the first device acquires the second information, it can determine the scene information of the terminal device based on the communication information indicated by the second information and the association between the communication information and the scene information.
- if the first device is a network side device, then after the network side device receives the second information reported by the terminal device, it can determine the scene information of the terminal device based on the communication information indicated by the second information and the association between the communication information and the scene information, and then further combine the association between the machine learning model and the scene information to determine the first model and activate it.
- if the first device is a terminal device, the terminal device can determine the scene information of the terminal device based on the communication information indicated by the second information and the association between the communication information and the scene information, and then further combine the association between the machine learning model and the scene information to determine the first model and activate it.
- the terminal device can also send the second information to the network side device, and the network side device determines the scene information of the terminal device based on the second information and indicates it to the terminal device.
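- A minimal sketch (illustrative only; the identifiers and the association table are hypothetical) of mapping the communication information carried in the second information, such as a cell ID or a reference signal ID, to scene information through the association between communication information and scene information:

```python
# Hypothetical association between communication information and scene
# information, e.g. sent by the second device or specified by the protocol.
cell_to_scene = {101: "scene_indoor_factory", 102: "scene_urban_macro"}
rs_to_scene = {"csi-rs-7": "scene_highway"}

def scene_from_second_info(second_info: dict):
    """Determine the scene information of the terminal device from the
    communication information (cell ID, reference signal ID, ...) carried
    in the second information."""
    if second_info.get("cell_id") in cell_to_scene:
        return cell_to_scene[second_info["cell_id"]]
    if second_info.get("rs_id") in rs_to_scene:
        return rs_to_scene[second_info["rs_id"]]
    return None  # no associated scene found; fall back to other methods

print(scene_from_second_info({"cell_id": 102}))       # scene_urban_macro
print(scene_from_second_info({"rs_id": "csi-rs-7"}))  # scene_highway
```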
- the first device determines the scene information of the terminal device according to the second information, including:
- Step S21 The first device sends the second information to a network side device
- Step S22 The first device receives a first indication sent by the network side device; the first indication is used to indicate scene information of the first device.
- if the first device is a terminal device, the terminal device can determine the second information by measuring the reference signal and send the second information to the network side device.
- the network side device determines the scene information in which the terminal device is currently located based on the second information and the association between the communication information and the scene information, and indicates the scene information to the terminal device through a first indication.
- the terminal device determines and activates the first model based on the scene information indicated by the first indication and the association between the machine learning model and the scene information.
- the network side device determines the scene information in which the terminal device is currently located based on the communication information indicated by the second information and the association between the communication information and the scene information, and further determines the machine learning model associated with the scene information in which the terminal device is currently located based on the association between the machine learning model and the scene information, that is, the first model, and indicates the first model to the terminal device through a third indication.
- the first device obtains the scene information of the terminal device, including:
- the first device receives a first indication and a second indication sent by a second device, wherein the first indication is used to indicate scene information of the terminal device; and the second indication is used to indicate a mapping relationship between a machine learning model and the scene information.
- the scene information of the terminal device and the mapping relationship between the machine learning model and the scene information can also be indicated to the first device by the second device.
- the second device in the embodiment of the present application can be a network side device or a high-level terminal device.
- if the first device is a network side device, such as an access network device, the second device can be a core network device;
- if the first device is a terminal device, the second device can be a network side device or a high-level terminal device.
- the first indication and the second indication may be carried in the same signaling, and the second device simultaneously sends the first indication and the second indication to the first device through a certain signaling, and the first device determines and activates the first model based on the received first indication and the second indication.
- the second device may send the first indication to the first device through one signaling, and send the second indication to the first device through other signaling.
- the signaling carrying the first indication and/or the second indication may include but is not limited to: Radio Resource Control (RRC) signaling, Radio Link Control (RLC) signaling, Media Access Control (MAC) signaling, LTE Positioning Protocol (LPP) signaling, NR Positioning Protocol A (NRPPa) signaling, downlink control information (DCI), etc.
- RRC Radio Resource Control
- RLC Radio Link Control
- MAC Media Access Control
- LPP LTE Positioning Protocol
- NRPPa NR Positioning Protocol A
- DCI downlink control information
- the first indication is sent by the second device to the first device
- the machine learning model is trained by a third-party server
- the third-party server sends the second indication to the first device.
- the method further includes:
- the first device receives a third indication sent by the second device, where the third indication is used to indicate a machine learning model associated with scene information in which the terminal device is located.
- the third indication may be sent by the second device to the first device.
- the first device may directly determine the machine learning model indicated by the third indication as the first model to be activated.
- the second device can determine the scene information of the terminal device based on the location information of the terminal device, and then determine the machine learning model that matches the scene currently located by the terminal device based on the association between the scene information and the machine learning model, generate a third indication and send it to the first device.
- the first device can send the scene information of the terminal device to the second device, and the second device determines the machine learning model associated with the scene currently located by the terminal device based on the scene information and the association between the machine learning model and the scene information, and generates a third indication and sends it to the first device.
- the first device measures the reference signal, and determines the second information based on the measurement result, including:
- Step S31 The first device measures a first reference signal when receiving a fourth indication; the fourth indication is used to instruct the first device to measure at least one reference signal;
- Step S32 The first device determines the second information based on a first measurement result of the first reference signal.
- if the first device is a terminal device, the terminal device may measure the first reference signal when receiving the fourth indication, and determine the second information based on a first measurement result of the first reference signal.
- the fourth indication may be sent by the second device to the first device, or may be sent by other devices to the first device. In another possible application scenario, the fourth indication may also be automatically triggered by a higher layer of the first device when certain measurement conditions are met.
- the first reference signal may include but is not limited to: a positioning reference signal (PRS), a downlink channel state information reference signal (Channel-State-Information Reference Signal, CSI-RS), an uplink sounding reference signal (Sounding Reference Signal, SRS), a synchronization signal block (Synchronization Signal Block, SSB), a time-frequency tracking reference signal (Tracking Reference Signal, TRS), etc.
- the first device measures the reference signal, and determines the second information based on the measurement result, including:
- Step S41 The first device receives a second reference signal sent by a reference point
- Step S42 The first device measures the second reference signal, and determines second information based on a second measurement result of the second reference signal.
- if the first device is a terminal device, the terminal device may also measure a second reference signal received from a receiving reference point, and determine the second information based on a second measurement result of the second reference signal.
- the second information may include a cell ID, a reference signal ID, a receiving reference point ID, a scene ID, an area ID, a tracking area ID, etc. corresponding to the terminal device.
- the second information includes at least one of the following:
- first communication information, where the first communication information is used to indicate a communication resource of the terminal device;
- second communication information, where the second communication information includes a fifth parameter, and the fifth parameter is used to indicate the communication area where the terminal device is located.
- the first communication information includes at least one of the following: reference signal information and communication indicator information.
- the reference signal information includes at least one of the following:
- a first parameter, where the first parameter is used to indicate a reference signal resource;
- a second parameter, where the second parameter is used to indicate reference signal measurement information;
- a third parameter, where the third parameter is used to indicate reference signal reporting information.
- the first parameter may be a reference signal resource ID, a reference signal resource set ID, etc.
- the second parameter may be a reference signal measurement ID, a reference signal measurement configuration ID, etc.
- the third parameter may be a reference signal reporting ID, a reference signal reporting configuration ID.
- the communication indicator information includes at least one of the following: a fourth parameter, where the fourth parameter can be a statistical value or representation of signal quality, such as signal-to-noise ratio (SNR), signal-to-interference-plus-noise ratio (SINR), reference signal received power (RSRP), reference signal received quality (RSRQ), signal power, noise power, interference power, etc.; or such as L1-RSRP, L1-SINR, L1-RSRQ, L3-RSRP, L3-SINR, L3-RSRQ, etc.
- SNR signal-to-noise ratio
- SINR signal-to-interference-plus-noise ratio
- RSRP reference signal received power
- RSRQ reference signal received quality
- the beam information may include information such as a beam index and a beam direction.
- the first device measures the reference signal, and determines the second information based on the measurement result, including:
- Step S51 The first device measures a reference signal to obtain a measurement result
- Step S52 The first device determines a target reference signal resource according to the measurement result, and determines second information according to resource information of the target reference signal resource.
- the target reference signal resource includes at least one of the following:
- A1: N first target reference signal resources among the reference signal resources configured for each transmission/reception point, where the reference signal received power of the N first target reference signal resources is greater than the reference signal received power of the other reference signal resources of the same transmission/reception point, and N is a positive integer;
- A2: second target reference signal resources, among the reference signal resources configured for each transmission/reception point, whose reference signal received power is greater than or equal to a preset threshold;
- A3: all reference signal resources configured at each transmission/reception point.
- if the first device is a terminal device, it can screen out, from the reference signal resources configured for each transmission and reception point, N first target reference signal resources whose reference signal received power is greater than that of the other reference signal resources of the same transmission and reception point, and determine the second information based on resource information of the first target reference signal resources, such as a reference signal ID, a reference signal measurement ID, a reference signal reporting ID and other information.
- the terminal device may also filter out a second target reference signal resource whose reference signal received power is greater than or equal to a preset threshold from each reference signal resource configured by each transmission and reception point, thereby determining the second information according to resource information of the second target reference signal resource, such as reference signal ID, reference signal measurement ID, reference signal reporting ID, etc.
- the preset threshold may be indicated by a network side device or may be specified by a protocol, which is not specifically limited in the embodiments of the present application.
- the terminal device determines the second information based on the reference signal resources configured for each transmitting and receiving point, without screening the reference signal resources.
- the terminal device can screen the target reference signal resources based on any one of items A1 to A3, and determine the second information based on the screened target reference signal resources, and then determine the scene information of the terminal device.
- the determined scene information is adapted to the reference signal resources configured at the sending and receiving points, and meets the specific reference signal receiving power, thereby ensuring the reliability of the determined scene information, which is conducive to improving the reliability of the first model finally determined, thereby ensuring that during the movement of the terminal device, the machine learning model running in the terminal device can always adapt to the reference signal resources configured at the sending and receiving points.
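- A minimal sketch (illustrative only; the measurement values and identifiers are hypothetical) of the screening options A1 to A3: keeping the N strongest reference signal resources per transmission/reception point, keeping the resources whose reference signal received power meets a preset threshold, or using all configured resources:

```python
# Hypothetical per-TRP reference signal measurements: {trp_id: {rs_id: rsrp_dbm}}.
measurements = {
    "trp1": {"rs1": -85.0, "rs2": -92.5, "rs3": -78.2},
    "trp2": {"rs4": -101.0, "rs5": -88.7},
}

def top_n_per_trp(meas, n):
    """A1: the N resources with the highest reference signal received power per TRP."""
    return {trp: sorted(rs, key=rs.get, reverse=True)[:n] for trp, rs in meas.items()}

def above_threshold(meas, threshold_dbm):
    """A2: resources whose RSRP is greater than or equal to a preset threshold."""
    return {trp: [r for r, p in rs.items() if p >= threshold_dbm] for trp, rs in meas.items()}

def all_configured(meas):
    """A3: all reference signal resources configured at each TRP (no screening)."""
    return {trp: list(rs) for trp, rs in meas.items()}

print(top_n_per_trp(measurements, n=1))      # {'trp1': ['rs3'], 'trp2': ['rs5']}
print(above_threshold(measurements, -90.0))  # {'trp1': ['rs1', 'rs3'], 'trp2': ['rs5']}
print(all_configured(measurements))
```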
- the resource information includes at least one of the following: reference signal received power, reference signal resource representation, beam identification, beam direction, etc.
- the terminal device can determine the second information based on such resource information of the target reference signal resource (including at least one of items A1 to A3).
- the first device obtains the scene information of the terminal device, including:
- Step S61 The first device obtains location information of the terminal device, where the location information is associated with scene information where the terminal device is located;
- Step S62 The first device determines the scene information of the terminal device according to the location information and the association between the location coordinates and the scene information.
- in addition to determining the scene information of the terminal device based on the second information, the first device can also determine the scene information of the terminal device based on the location information of the terminal device and the association between the location coordinates and the scene information.
- the location information of the terminal device can be determined by the terminal device based on an AI model or other positioning methods, such as satellite positioning, GPS positioning system, Beidou positioning system, Bluetooth positioning, radar positioning, and other positioning methods based on mobile communication networks, such as positioning methods based on NR systems, LTE systems, etc.
- the association relationship between the location coordinates and the scene information can be determined by a network side device, can be specified by a protocol, or can be sent by a second device to a first device.
- the embodiments of the present application do not specifically limit this.
- the second device can be a network side device or a high-level terminal device.
- if the first device is an access network device, such as a base station, the second device can be a core network device;
- if the first device is a terminal device, the second device can be a network side device or a high-level terminal device.
- the method further includes:
- the first device receives third information sent by the second device, where the third information is used to indicate an association relationship between the location coordinates and the scene information.
- the second device may also indicate the association relationship between the location coordinates and the scene information to the first device through the third information.
- after the first device receives the third information, it can determine the scene information of the terminal device based on the location information of the terminal device and the association relationship between the location coordinates and the scene information, and then determine the first model based on the scene information and the association relationship between the machine learning model and the scene information.
- the first device sends the determined scene information, such as scene ID, area ID, data set ID, etc., to the second device, and the second device determines a machine learning model that matches the scene information reported by the first device and indicates it to the first device.
- the first device obtains the location information of the terminal device, including:
- the first device determines current location information based on positioning technology
- the first device receives fourth information sent by the terminal device, where the fourth information is used to indicate location information of the terminal device.
- the location information of the terminal device can be determined by the terminal device itself according to an AI model or other positioning methods, such as satellite positioning, GPS positioning system, Beidou positioning system, Bluetooth positioning, radar positioning, and other positioning methods based on mobile communication networks, such as positioning methods based on NR systems, LTE systems, etc.
- the scene information can be determined by combining the association between the location coordinates and the scene information.
- the terminal device reports the location information to the network side device through the fourth information, and the network side device determines the scene information of the terminal device based on the location information of the terminal device and the association between the location coordinates and the scene, and indicates the determined scene information to the terminal device through the first indication.
- the terminal device After the terminal device determines the scene information, it can further determine the first model in combination with the association between the machine learning model and the scene information.
- the network-side device determines the scene information of the terminal device based on the location information reported by the terminal device, and further determines the machine learning model associated with the scene information of the terminal device in combination with the association between the machine learning model and the scene information, that is, the first model, and indicates the first model to the terminal device through a third indication.
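- A minimal sketch (illustrative only; the region boundaries and scene names are hypothetical) of determining scene information from the terminal device's location coordinates using an association between coordinate regions and scene information:

```python
# Hypothetical association between position coordinates and scene information:
# each scene is described by an axis-aligned region (x_min, x_max, y_min, y_max).
coord_to_scene = [
    ((0.0, 50.0, 0.0, 30.0), "scene_factory_floor_1"),
    ((0.0, 50.0, 30.0, 60.0), "scene_factory_floor_2"),
    ((50.0, 500.0, 0.0, 500.0), "scene_outdoor_campus"),
]

def scene_from_location(x, y):
    """Return the scene information associated with location (x, y), or None
    if the location falls outside every known region."""
    for (x_min, x_max, y_min, y_max), scene in coord_to_scene:
        if x_min <= x < x_max and y_min <= y < y_max:
            return scene
    return None

print(scene_from_location(12.3, 45.6))   # scene_factory_floor_2
print(scene_from_location(120.0, 80.0))  # scene_outdoor_campus
```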
- Referring to FIG. 8, a flow chart of another model determination method provided by an embodiment of the present application is shown.
- the location information of the terminal device can be reported to the first device by the terminal device through the fourth information.
- the terminal device can simultaneously report the method for obtaining the location information and the reliability or confidence to the first device.
- after receiving the location information reported by the terminal device, the network side device can determine the scene information of the terminal device based on the location information and the association between the location coordinates and the scene information. Further, the network side device can determine the first model associated with the scene information of the terminal device based on the association between the machine learning model and the scene information.
- the first device activating the first model includes:
- the first device deactivates the second model and activates the first model.
- the first device can deactivate the second model and activate the first model.
- the first model and the second model in the present application are not limited to a certain AI model.
- the first model and the second model in the present application may include one or more AI models, or may be an AI function, and an AI function may be associated with one or more AI models.
- deactivating the second model may be to simultaneously deactivate one or more AI models included in the second model, or to simultaneously deactivate one or more AI functions referred to by the second model.
- activating the first model may be to simultaneously activate one or more AI models included in the first model, or to simultaneously activate one or more AI functions referred to by the first model.
- the deactivation operation and the activation operation may be independent of each other.
- if the first model determined by the first device based on the first information includes the machine learning model currently running in the first device, then the machine learning model currently running in the first device is valid and no deactivation operation is required for it. In this case, the normal operation of the currently running machine learning model can be maintained, and the AI models and/or AI functions included in the first model, other than the currently running machine learning model, can then be activated.
- if the AI models and/or AI functions included in the first model only include the AI models and/or AI functions currently running in the first device, then there is no need to perform the deactivation and activation operations.
- if no AI model or AI function is currently running in the first device, then the deactivation operation cannot be performed and is not needed, and only the activation operation is performed.
- the embodiment of the present application deactivates the second model and activates the first model when the currently running second model does not match the first model, thereby ensuring that the machine learning model running in the first device can always adapt to the scene in which the terminal device is located during the movement of the terminal device, thereby ensuring the accuracy and efficiency of data processing.
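- A minimal sketch (illustrative only; the model identifiers and function name are hypothetical) of the switching logic: only the AI models/functions of the second model that are not part of the first model are deactivated, and only those of the first model that are not already running are activated:

```python
def switch_models(running: set, first_model: set):
    """Return the sets of AI models/functions to deactivate and to activate.
    Models that are both currently running and included in the first model
    stay active, so no operation is needed for them."""
    to_deactivate = running - first_model
    to_activate = first_model - running
    return to_deactivate, to_activate

# Currently running second model vs. newly determined first model (hypothetical IDs).
running = {"model_A", "model_B"}
first_model = {"model_B", "model_C"}

deact, act = switch_models(running, first_model)
print(deact)  # {'model_A'}  -> deactivate
print(act)    # {'model_C'}  -> activate
```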
- the method further includes:
- the first device sends fifth information to the network side device.
- the fifth information includes at least one of the following: a model identifier of the deactivated second model, a model identifier of the activated first model, and an activation time of the first model.
- the model identifier of the deactivated second model, the model identifier of the first model to be activated, and at least one of the activation time of the first model can be sent to the network side device through the fifth information.
- the activation time of the first model is used to indicate the time of model switching, for example, the model switching is performed after M time units, including deactivating the second model and activating the first model.
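- A minimal sketch (illustrative only; the field and class names are hypothetical) of the content of the fifth information: the model identifier of the deactivated second model, the model identifier of the activated first model, and the activation time expressed in time units:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FifthInformation:
    """Hypothetical container for the fifth information reported to the network."""
    deactivated_model_id: Optional[str]  # model identifier of the deactivated second model
    activated_model_id: str              # model identifier of the activated first model
    activation_time_units: int           # perform the switch after M time units

msg = FifthInformation(deactivated_model_id="model_A",
                       activated_model_id="model_C",
                       activation_time_units=4)
print(msg)
```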
- the embodiment of the present application provides a model determination method, which associates the machine learning model with the scene.
- the first device can determine which model should be applied in the current environment based on the scene information of the terminal device and the mapping relationship between the machine learning model and the scene information; or, the first device can directly determine the machine learning model associated with the scene information of the current terminal device based on the model information in the first information, and activate the model.
- the first device can determine which model should be applied based on the first information, thereby ensuring that during the movement of the terminal device, the machine learning model running in the terminal device or the network side device can always adapt to the scene of the terminal device, thereby ensuring the accuracy and efficiency of data processing.
- the embodiment of the present application provides a data transmission method.
- Referring to FIG. 9, a flow chart of a data transmission method provided by an embodiment of the present application is shown. The method is applied to a second device and may specifically include the following steps:
- Step 501 The second device sends sixth information to the first device.
- the sixth information includes at least one of the following:
- third information, where the third information is used to indicate an association relationship between position coordinates and scene information;
- a first indication, where the first indication is used to indicate scene information in which the terminal device is located;
- a second indication, where the second indication is used to indicate a mapping relationship between the machine learning model and the scene information;
- a third indication, where the third indication is used to indicate a machine learning model associated with the scene information in which the terminal device is located.
- the second device in the embodiment of the present application can be a network side device or a high-level terminal device.
- If the first device is a network side device, such as an access network device, the second device can be a core network device; if the first device is a terminal device, the second device can be a network side device or a high-level terminal device.
- the third information is used to indicate the association between the location coordinates and the scene information.
- the second device can indicate the association between the location coordinates and the scene information to the first device through the third information.
- After the first device receives the third information, it can determine the scene information of the terminal device based on the location information of the terminal device and the association between the location coordinates and the scene information, and then determine the first model based on that scene information and the association between the machine learning model and the scene information.
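- As an illustration only, this two-step lookup could look as follows; representing the third information as rectangular areas is an assumed encoding, since the application does not fix how the association between position coordinates and scene information is expressed:

```python
# Hypothetical two-step lookup: location -> scene -> model. The rectangular-area
# representation of the third information and all identifiers are assumptions.
def scene_from_position(x, y, areas):
    """areas: list of (x_min, y_min, x_max, y_max, scene_id) tuples."""
    for x_min, y_min, x_max, y_max, scene_id in areas:
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return scene_id
    return "default_scene"

areas = [(0.0, 0.0, 50.0, 80.0, "factory_hall"),
         (50.0, 0.0, 200.0, 80.0, "open_yard")]
scene_to_model = {"factory_hall": "model_A", "open_yard": "model_B", "default_scene": "model_C"}

# Step 1: location information + third information -> scene information.
scene = scene_from_position(12.5, 30.0, areas)   # "factory_hall"
# Step 2: scene information + model/scene mapping -> first model.
print(scene_to_model[scene])                     # "model_A"
```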
- the first indication may include, but is not limited to, a scenario ID, scenario information, scenario category, area ID, area information, area category, dataset ID, dataset information, dataset category, etc. of the scenario in which the first device is located.
- the granularity of the scenario, area, or dataset may be a cell.
- the scenario ID, area ID, and dataset ID may be associated with the Physical Cell Identifier (PCI) of one or more cells, so as to determine the scenario ID, area ID, and dataset ID corresponding to the first device according to the cell in which the first device is located.
- the granularity of the scenario, area, or dataset may also be smaller than the cell.
- a scenario may be a factory building, a building, or even a floor in a building in a cell.
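- A minimal sketch of such an association, assuming scenario IDs are keyed by PCI (all values below are invented for illustration), is:

```python
# Hypothetical association between PCIs and scenario IDs; one scenario ID may
# be associated with several cells, and finer-than-cell granularity would need
# additional information beyond the PCI alone.
pci_to_scenario = {
    101: "scenario_dense_urban",
    102: "scenario_dense_urban",          # two cells mapped to one scenario
    305: "scenario_factory_building_7",
}

def scenario_for_cell(serving_pci: int) -> str:
    # Look up the scenario ID corresponding to the cell in which the first
    # device is currently located.
    return pci_to_scenario.get(serving_pci, "scenario_unknown")

print(scenario_for_cell(102))  # scenario_dense_urban
```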
- a machine learning model may correspond to one or more scenarios, areas, or datasets.
- the machine learning model can be trained by the second device, and the second device records the association between the model identifier of each machine learning model and the scene information.
- the machine learning model is trained by a third-party server, and the third-party server sends the trained machine learning model to the first device, and sends the association between the machine learning model and the scene information to the first device and/or the second device.
- the second device can send the association relationship between the machine learning model and the scene information to the first device through the second indication, so that the first device determines the first model based on the second indication.
- Alternatively, the second device may determine the machine learning model associated with the scene information where the terminal device is located based on that scene information and the mapping relationship between the machine learning model and the scene information, and indicate the model information of the determined model to the first device through a third indication.
- the third indication may include model information of the machine learning model associated with the scene information where the terminal device is located, such as a model identifier.
- An embodiment of the present application provides a data transmission method. Through the sixth information, the second device can send to the first device at least one of the association relationship between the location coordinates and the scene information, the scene information of the terminal device, the mapping relationship between the machine learning model and the scene information, and the model information of the machine learning model associated with the scene information of the terminal device, so that the first device can determine, based on the received sixth information, which model should be applied in the scene in which the terminal device is currently located.
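- A hedged sketch of a possible container for the sixth information is given below; all field names are assumptions, and every field is optional because the sixth information only needs to carry at least one of the four elements:

```python
# Hypothetical container for the sixth information sent by the second device to
# the first device; field names and example values are invented for this sketch.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class SixthInformation:
    third_information: Optional[Dict[Tuple[float, float], str]] = None  # position coordinates -> scene info
    first_indication: Optional[str] = None                              # scene info of the terminal device
    second_indication: Optional[Dict[str, str]] = None                  # scene info -> model identifier
    third_indication: Optional[str] = None                              # model associated with the current scene

msg = SixthInformation(first_indication="scenario_factory_building_7",
                       second_indication={"scenario_factory_building_7": "model_A"})
print(msg)
```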
- the model determination method provided in the embodiment of the present application can be executed by a model determination device.
- the model determination device provided in the embodiment of the present application is described by taking the model determination method executed by the model determination device as an example.
- the embodiment of the present application provides a model determination device.
- a structural block diagram of a model determination device provided by the embodiment of the present application is shown, and the device can be applied to a first device.
- the device can specifically include:
- a model determination module 601 is used to determine a first model based on first information
- the first information includes any one of the following:
- Scene information of the terminal device and a mapping relationship between the machine learning model and the scene information;
- wherein the first device is the terminal device or a network side device;
- Model information of the machine learning model associated with the scene information in which the terminal device is located.
- the device further comprises:
- a scene information acquisition module used to acquire scene information of the terminal device
- the first relationship acquisition module is used to obtain the mapping relationship between the machine learning model and the scene information.
- the scene information acquisition module includes:
- a first acquisition submodule used to acquire second information, where the second information is used to indicate communication information of the terminal device, and the second information is associated with scene information where the terminal device is located;
- the first determination submodule is used to determine the scene information of the terminal device according to the second information.
- the first acquisition submodule includes:
- the measuring unit is used to measure the reference signal and determine the second information based on the measurement result.
- the first acquisition submodule includes:
- the first receiving unit is used to receive second information sent by the terminal device.
- the first determining submodule includes:
- a first acquisition unit, used to acquire an association relationship between the communication information and the scene information;
- the first determination unit is used to determine the scene information of the terminal device according to the second information and the association relationship between the communication information and the scene information.
- the first determining submodule includes:
- a first sending unit configured to send the second information to a network side device
- the second receiving unit is used to receive a first indication sent by the network side device; the first indication is used to indicate the scene information of the first device.
- the scene information acquisition module includes:
- a second acquisition submodule is used to acquire the location information of the terminal device, where the location information is associated with the scene information where the terminal device is located;
- the second determination submodule is used to determine the scene information of the terminal device according to the position information and the association relationship between the position coordinates and the scene information.
- the scene information acquisition module further includes:
- the first receiving submodule is used to receive third information sent by the second device, where the third information is used to indicate the association relationship between the position coordinates and the scene information.
- the second acquisition submodule includes:
- a second determining unit configured to determine current location information based on positioning technology when the first device is a terminal device
- the third receiving unit is used to receive fourth information sent by the terminal device when the first device is a network side device, and the fourth information is used to indicate the location information of the terminal device.
- the scene information acquisition module includes:
- the second receiving submodule is used to receive a first indication and a second indication sent by a second device, wherein the first indication is used to indicate the scene information of the terminal device; and the second indication is used to indicate the mapping relationship between the machine learning model and the scene information.
- the device further comprises:
- a third indication receiving module is used to receive a third indication sent by a second device, where the third indication is used to indicate a machine learning model associated with scene information in which the terminal device is located.
- the measuring unit is specifically used to:
- measure a first reference signal indicated by a fourth indication, where the fourth indication is used to instruct the first device to measure at least one reference signal; and
- determine the second information based on a first measurement result of the first reference signal.
- the measuring unit is specifically used to:
- measure a second reference signal, and determine the second information based on a second measurement result of the second reference signal.
- the second information includes at least one of the following:
- a first parameter, where the first parameter is used to indicate a communication resource of the terminal device;
- first communication information;
- second communication information;
- a fifth parameter, where the fifth parameter is used to indicate the communication area where the terminal device is located.
- the first communication information includes at least one of the following:
- the reference signal information includes at least one of the following:
- a second parameter, where the second parameter is used to indicate a reference signal resource;
- a third parameter, where the third parameter is used to indicate reference signal measurement information;
- a fourth parameter, where the fourth parameter is used to indicate reference signal reporting information.
- the communication indicator information includes at least one of the following:
- the measuring unit is specifically used to:
- the target reference signal resource includes at least one of the following:
- N first target reference signal resources among the reference signal resources configured for each transmission/reception point, where the reference signal received power of the N first target reference signal resources is greater than the reference signal received power of the other reference signal resources of the same transmission/reception point, and N is a positive integer (a minimal selection sketch is given after this list);
- reference signal resources configured for each transmission/reception point.
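- The per-transmission/reception-point selection can be illustrated by the following sketch, assuming measurements are available as (TRP, resource, RSRP) tuples; the function name and data layout are hypothetical:

```python
# Hypothetical selection of the N strongest reference signal resources per
# transmission/reception point (TRP); the tuple layout and identifiers are
# invented for illustration.
from collections import defaultdict

def select_target_resources(measurements, n):
    """measurements: iterable of (trp_id, resource_id, rsrp_dbm) tuples."""
    per_trp = defaultdict(list)
    for trp_id, resource_id, rsrp in measurements:
        per_trp[trp_id].append((rsrp, resource_id))
    selected = {}
    for trp_id, items in per_trp.items():
        items.sort(reverse=True)                      # strongest RSRP first
        selected[trp_id] = [res for _, res in items[:n]]
    return selected

meas = [("trp0", "csi-rs#0", -87.0), ("trp0", "csi-rs#1", -92.5),
        ("trp1", "csi-rs#2", -80.1), ("trp1", "csi-rs#3", -95.4)]
print(select_target_resources(meas, n=1))
# {'trp0': ['csi-rs#0'], 'trp1': ['csi-rs#2']}
```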
- the resource information includes at least one of the following:
- the model activation module includes:
- the model activation submodule is used to deactivate the second model and activate the first model when the currently running second model does not match the first model.
- the device further comprises:
- a fifth information sending module is used to send fifth information to the network side device, where the fifth information includes at least one of the following:
- the model identifier of the deactivated second model, the model identifier of the first model to be activated, and the activation time of the first model.
- the model determination device in the embodiment of the present application can be an electronic device, such as an electronic device with an operating system, or a component in the electronic device, such as an integrated circuit or a chip.
- the model determination device provided in the embodiment of the present application can implement the various processes implemented in the aforementioned method embodiment and achieve the same technical effect. To avoid repetition, it will not be repeated here.
- the embodiment of the present application provides a data transmission device.
- a structural block diagram of a data transmission device provided by the embodiment of the present application is shown, and the device can be applied to a second device.
- the device may specifically include:
- the information sending module 701 is used to send sixth information to the first device.
- The sixth information includes at least one of the following:
- third information, where the third information is used to indicate an association relationship between position coordinates and scene information;
- a first indication, where the first indication is used to indicate scene information of a terminal device;
- a second indication, where the second indication is used to indicate a mapping relationship between the machine learning model and the scene information;
- a third indication, where the third indication is used to indicate a machine learning model associated with the scene information in which the terminal device is located.
- the data transmission device in the embodiment of the present application may be an electronic device, such as an electronic device with an operating system, or may be a component in the electronic device, such as an integrated circuit or a chip.
- the data transmission device provided in the embodiment of the present application can implement the various processes implemented in the aforementioned method embodiment and achieve the same technical effect. To avoid repetition, it will not be described here.
- An embodiment of the present application further provides a communication device 900, including a processor 901 and a memory 902, where the memory 902 stores programs or instructions that can be run on the processor 901. For example, when the communication device 900 is a network side device, the program or instruction is executed by the processor 901 to implement the various steps of the aforementioned model determination method embodiment, or to implement the various steps of the aforementioned data transmission method embodiment, and the same technical effect can be achieved.
- When the communication device 900 is a terminal device, the program or instruction is executed by the processor 901 to implement the various steps of the aforementioned model determination method embodiment, or to implement the various steps of the aforementioned data transmission method embodiment, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
- FIG. 13 is a schematic diagram of the hardware structure of a terminal device implementing an embodiment of the present application.
- The terminal device 1000 includes, but is not limited to, at least some of the following components: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009 and a processor 1010.
- the terminal device 1000 can also include a power supply (such as a battery) for supplying power to each component, and the power supply can be logically connected to the processor 1010 through a power management system, so as to manage charging, discharging, and power consumption management through the power management system.
- The terminal device structure shown in FIG. 13 does not constitute a limitation on the terminal device, and the terminal device may include more or fewer components than shown in the figure, or combine certain components, or arrange components differently, which will not be described in detail here.
- The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042, and the graphics processing unit 10041 processes image data of static pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
- the display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, etc.
- the user input unit 1007 includes a touch panel 10071 and at least one of other input devices 10072.
- the touch panel 10071 is also called a touch screen.
- the touch panel 10071 may include two parts: a touch detection device and a touch controller.
- Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as a volume control key, a switch key, etc.), a trackball, a mouse, and a joystick, which will not be repeated here.
- After receiving data from the network side device, the radio frequency unit 1001 can transmit the data to the processor 1010 for processing; in addition, the radio frequency unit 1001 can send uplink data to the network side device.
- the RF unit 1001 includes but is not limited to an antenna, an amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, etc.
- the memory 1009 can be used to store software programs or instructions and various data.
- the memory 1009 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, wherein the first storage area may store an operating system, an application program or instruction required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
- the memory 1009 may include a volatile memory or a non-volatile memory, or the memory 1009 may include both volatile and non-volatile memories.
- the non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory.
- the volatile memory may be a random access memory (RAM), a static random access memory (SRAM), a dynamic random access memory (DRAM), a synchronous dynamic random access memory (SDRAM), a double data rate synchronous dynamic random access memory (DDRSDRAM), an enhanced synchronous dynamic random access memory (ESDRAM), a synchronous link dynamic random access memory (SLDRAM) or a direct rambus random access memory (DRRAM).
- the memory 1009 in the embodiment of the present application includes but is not limited to these and any other suitable types of memory.
- the processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor and a modem processor, wherein the application processor mainly processes operations related to an operating system, a user interface, and application programs, and the modem processor mainly processes wireless communication signals, such as a baseband processor. It is understandable that the modem processor may not be integrated into the processor 1010.
- the embodiment of the present application also provides a network side device, including a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run a program or instruction to implement the steps of the aforementioned method embodiment.
- the network side device embodiment corresponds to the aforementioned network side device method embodiment, and each implementation process and implementation method of the network side device in the aforementioned method embodiment can be applied to the network side device embodiment, and can achieve the same technical effect.
- The embodiment of the present application further provides a network side device. As shown in FIG. 14, the network side device 1100 includes: an antenna 111, a radio frequency device 112, a baseband device 113, a processor 114 and a memory 115.
- the antenna 111 is connected to the radio frequency device 112.
- the radio frequency device 112 receives information through the antenna 111 and sends the received information to the baseband device 113 for processing.
- the baseband device 113 processes the information to be sent and sends it to the radio frequency device 112, and the radio frequency device 112 processes the received information and sends it out through the antenna 111.
- the method executed by the network-side device in the above embodiment may be implemented in the baseband device 113, which includes a baseband processor.
- The baseband device 113 may include, for example, at least one baseband board on which a plurality of chips are arranged. As shown in FIG. 14, one of the chips is, for example, a baseband processor, which is connected to the memory 115 through a bus interface to call a program in the memory 115 and execute the network device operations shown in the above method embodiment.
- the network side device may also include a network interface 116, which is, for example, a common public radio interface (CPRI).
- The network side device 1100 of the embodiment of the present application also includes instructions or programs stored in the memory 115 and executable on the processor 114.
- The processor 114 calls the instructions or programs in the memory 115 to execute the methods executed by the modules shown in FIG. 10 or FIG. 11 and achieve the same technical effect. To avoid repetition, details are not repeated here.
- the embodiment of the present application also provides a network side device.
- the network side device 1200 includes: a processor 1201, a network interface 1202 and a memory 1203.
- the network interface 1202 is, for example, a common public radio interface (CPRI).
- The network side device 1200 of the embodiment of the present application also includes instructions or programs stored in the memory 1203 and executable on the processor 1201.
- The processor 1201 calls the instructions or programs in the memory 1203 to execute the methods executed by the modules shown in FIG. 10 or FIG. 11 and achieves the same technical effect. To avoid repetition, details are not repeated here.
- An embodiment of the present application also provides a readable storage medium, on which a program or instruction is stored. When the program or instruction is executed by a processor, the various processes of the aforementioned method embodiment are implemented and the same technical effect can be achieved. To avoid repetition, it will not be repeated here.
- the processor is the processor in the terminal device described in the above embodiment.
- the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
- An embodiment of the present application further provides a chip, which includes a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is used to run programs or instructions to implement the various processes of the aforementioned method embodiment and achieve the same technical effect. To avoid repetition, it will not be repeated here.
- the chip mentioned in the embodiments of the present application can also be called a system-level chip, a system chip, a chip system or a system-on-chip chip, etc.
- the embodiments of the present application further provide a computer program/program product, which is stored in a storage medium and is executed by at least one processor to implement the various processes of the aforementioned method embodiments and can achieve the same technical effects. To avoid repetition, they are not described here.
- An embodiment of the present application also provides a model determination system, including: a first device and a second device, wherein the first device can be used to execute the steps of the model determination method described in the first aspect above, and the second device can be used to execute the steps of the data transmission method described in the second aspect above.
- the technical solution of the present application can be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), and includes a number of instructions for a terminal (which can be a mobile phone, computer, server, air conditioner, or network device, etc.) to execute the methods described in each embodiment of the present application.
Abstract
The present application relates to the field of communication technology, and discloses a model determination method and apparatus, and a communication device. The model determination method in embodiments of the present application comprises the following steps: a first device determines a first model based on first information; and the first device activates the first model. The first information comprises any one of the following: scene information in which a terminal device is located, and a mapping relationship between a machine learning model and the scene information, the first device being the terminal device or a network side device; and model information of the machine learning model associated with the scene information in which the terminal device is located.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311550325.2A CN120021208A (zh) | 2023-11-20 | 2023-11-20 | 模型确定方法、装置及通信设备 |
| CN202311550325.2 | 2023-11-20 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025108195A1 true WO2025108195A1 (fr) | 2025-05-30 |
Family
ID=95704667
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2024/132393 Pending WO2025108195A1 (fr) | 2023-11-20 | 2024-11-15 | Procédé et appareil de détermination de modèle, et dispositif de communication |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN120021208A (fr) |
| WO (1) | WO2025108195A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019240771A1 (fr) * | 2018-06-12 | 2019-12-19 | Nokia Technologies Oy | Positionnement de dispositif d'utilisateur |
| CN113938232A (zh) * | 2020-07-13 | 2022-01-14 | 华为技术有限公司 | 通信的方法及通信装置 |
| CN114826941A (zh) * | 2022-04-27 | 2022-07-29 | 中国电子科技集团公司第五十四研究所 | 一种无线通信网络ai模型配置方法 |
| CN116234000A (zh) * | 2021-11-30 | 2023-06-06 | 维沃移动通信有限公司 | 定位方法及通信设备 |
| CN116567806A (zh) * | 2022-01-29 | 2023-08-08 | 维沃移动通信有限公司 | 基于人工智能ai模型的定位方法及通信设备 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN120021208A (zh) | 2025-05-20 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24893353; Country of ref document: EP; Kind code of ref document: A1 |