
WO2024148935A1 - Lifecycle management supporting AI/ML for air interface enhancement


Info

Publication number
WO2024148935A1
Authority
WO
WIPO (PCT)
Prior art keywords
functionality
model
procedure
lcm
network entity
Prior art date
Legal status
Pending
Application number
PCT/CN2023/129763
Other languages
English (en)
Inventor
Jianfeng Wang
Bingchao LIU
Congchi ZHANG
Hongmei Liu
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to PCT/CN2023/129763
Publication of WO2024148935A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/16 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks using machine learning or artificial intelligence

Definitions

  • the present disclosure relates to wireless communications, and more specifically to a user equipment (UE) , a network entity, a processor for wireless communication, methods, and a computer readable medium for lifecycle management (LCM) supporting artificial intelligence/machine learning (AI/ML) for air interface enhancement.
  • a wireless communications system may include one or multiple network communication devices, such as base stations, which may be otherwise known as an eNodeB (eNB) , a next-generation NodeB (gNB) , or other suitable terminology.
  • Each network communication device, such as a base station, may support wireless communications for one or multiple user communication devices, which may be otherwise known as user equipment (UE), or other suitable terminology.
  • the wireless communications system may support wireless communications with one or multiple user communication devices by utilizing resources of the wireless communication system (e.g., time resources (e.g., symbols, slots, subframes, frames, or the like) or frequency resources (e.g., subcarriers, carriers)).
  • the wireless communications system may support wireless communications across various radio access technologies including third generation (3G) radio access technology, fourth generation (4G) radio access technology, fifth generation (5G) radio access technology, among other suitable radio access technologies beyond 5G (e.g., sixth generation (6G) ) .
  • AI Artificial Intelligence
  • ML Machine Learning
  • CV computer vision
  • NLP natural language processing
  • DL Deep Learning
  • NN multi-layered neural networks
  • the present disclosure relates to a user equipment (UE) , a network entity, a processor for wireless communication, methods, and a computer readable medium for LCM supporting AI/ML for air interface enhancement.
  • Embodiments of the disclosure propose a unified LCM framework to support all potential use cases with AI/ML.
  • a UE comprising a processor; and a transceiver coupled to the processor, wherein the processor is configured to: transmit, via the transceiver and to a network entity, an indication on whether model identification is supported for an artificial intelligence/machine learning (AI/ML) functionality; and perform a functionality-based lifecycle management (LCM) procedure of the AI/ML functionality, wherein in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID) -based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • a network entity comprising: a processor; and a transceiver coupled to the processor, wherein the processor is configured to: receive, via the transceiver and from a user equipment (UE) , an indication on whether model identification is supported for an artificial intelligence/machine learning (AI/ML) functionality; and perform a functionality-based lifecycle management (LCM) procedure of the AI/ML functionality, wherein in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID) -based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • a processor for wireless communication comprising: at least one memory; and a controller coupled with the at least one memory and configured to cause the controller to: transmit, to a network entity, an indication on whether model identification is supported for an artificial intelligence/machine learning (AI/ML) functionality; and perform a functionality-based lifecycle management (LCM) procedure of the AI/ML functionality, wherein in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID)-based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • A method performed by a user equipment (UE), the method comprising: transmitting, to a network entity, an indication on whether model identification is supported for an artificial intelligence/machine learning (AI/ML) functionality; and performing a functionality-based lifecycle management (LCM) procedure of the AI/ML functionality, wherein in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID)-based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • A method performed by a network entity, the method comprising: receiving, from a user equipment (UE), an indication on whether model identification is supported for an artificial intelligence/machine learning (AI/ML) functionality; and performing a functionality-based lifecycle management (LCM) procedure of the AI/ML functionality, wherein in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID)-based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • a computer readable medium having instructions stored thereon, the instructions, when executed by a processor of an apparatus, causing the apparatus to perform the method according to the fourth or the fifth aspect of the disclosure.
  • the indication may comprise one bit indicating existence of at least one AI/ML model associated with the AI/ML functionality to be identified.
  • the indication may comprise an identifier of at least one AI/ML model associated with the AI/ML functionality, wherein the identifier is a model ID or a temporary ID.
  • the indication is included in information regarding the AI/ML functionality.
  • the indication may be transmitted to the network entity together with the information regarding the AI/ML functionality during a functionality identification procedure.
  • the information regarding the AI/ML functionality may comprise at least one of the following: configurations of an AI/ML-enabled feature or a feature group; and application conditions.
  • the information regarding the AI/ML functionality may be included in dedicated UE capability or UE assistance information.
  • the functionality-based LCM procedure may further comprise a model identification procedure.
  • the UE may determine a model ID of the at least one AI/ML model based on the indication; and may exchange, using the model ID, information regarding the at least one AI/ML model with the network entity during the model identification procedure.
  • the network entity may determine a model ID of the at least one AI/ML model based on the indication; and may exchange, using the model ID, information regarding the at least one AI/ML model with the UE during the model identification procedure.
  • information regarding the at least one AI/ML model may comprise at least one of the following: descriptions; configurations; or application conditions.
  • the model identification procedure may be enabled after the AI/ML functionality has been activated.
  • the model-ID-based LCM procedure may be enabled after the model identification procedure.
  • each of the functionality-based LCM procedure and the model-ID-based LCM procedure may comprise at least one of the following operations: activation; switching; selection; updating; or de-activation.
  • the UE may apply, in response to one of the operations on the AI/ML functionality, the same operation on the at least one AI/ML model of the AI/ML functionality.
  • the network entity may apply, in response to one of the operations on the AI/ML functionality, the same operation on the at least one AI/ML model of the AI/ML functionality.
  • the AI/ML functionality is at UE side
  • the at least one AI/ML model is a UE-side model or a UE-part model of a two-sided model.
  • FIG. 1 illustrates an example of a wireless communications system in which some embodiments of the present disclosure can be implemented.
  • FIGS. 2A-2C illustrate examples of manageable units of some LCM schemes.
  • FIGS. 3A-3C illustrate examples of three types of model identification procedures.
  • FIGS. 4A-4C illustrate examples of signalling in the model identification procedures shown in FIGS. 3A-3C.
  • FIG. 5 illustrates an example of a process flow of a unified framework for LCM in accordance with some example embodiments of the present disclosure.
  • FIG. 6 illustrates another example of a process flow of a unified framework for LCM in accordance with some example embodiments of the present disclosure.
  • FIG. 7 illustrates an example of a functionality identification procedure in accordance with some example embodiments of the present disclosure.
  • FIG. 8 illustrates a schematic diagram of an example of indication on AI/ML models of an AI/ML functionality to be or not be identified and managed in accordance with some example embodiments of the present disclosure.
  • FIG. 9 illustrates an example of a functionality activation procedure in accordance with some example embodiments of the present disclosure.
  • FIG. 10 illustrates an example of a LCM procedure including operations on an AI/ML functionality and models in accordance with some example embodiments of the present disclosure.
  • FIG. 11 illustrates an example of a device that is suitable for implementing some embodiments of the present disclosure.
  • FIG. 12 illustrates an example of a processor that is suitable for implementing some embodiments of the present disclosure.
  • FIG. 13 illustrates a flowchart of a method that is performed by a user equipment in accordance with aspects of the present disclosure.
  • FIG. 14 illustrates a flowchart of a method that is performed by a network entity in accordance with aspects of the present disclosure.
  • references in the present disclosure to “one embodiment,” “an example embodiment,” “an embodiment,” “some embodiments,” and the like indicate that the embodiment(s) described may include a particular feature, structure, or characteristic, but it is not necessary that every embodiment includes the particular feature, structure, or characteristic. Moreover, such phrases do not necessarily refer to the same embodiment(s). Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • The terms “first” and “second” may be used herein to describe various elements, but these elements should not be limited by these terms. These terms are only used to distinguish one element from another element. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of embodiments.
  • the term “and/or” includes any and all combinations of one or more of the listed terms. In some examples, values, procedures, or apparatuses are referred to as “best, ” “lowest, ” “highest, ” “minimum, ” “maximum, ” or the like. It will be appreciated that such descriptions are intended to indicate that a selection among many used functional alternatives can be made, and such selections need not be better, smaller, higher, or otherwise preferable to other selections.
  • the term “includes” and its variants are to be read as open terms that mean “includes, but is not limited to. ”
  • the term “based on” is to be read as “based at least in part on. ”
  • The terms “one embodiment” and “an embodiment” are to be read as “at least one embodiment.”
  • the term “another embodiment” is to be read as “at least one other embodiment. ”
  • the use of an expression such as “A and/or B” can mean either “only A” or “only B” or “both A and B. ”
  • Other definitions, explicit and implicit, may be included below.
  • the functionality-based LCM procedure may include a functionality identification procedure, which refers to a process/method of identifying an AI/ML functionality for the common understanding between the network (NW) and the UE, where information regarding the AI/ML functionality may be shared during functionality identification.
  • the model-ID-based LCM procedure may include a model identification procedure, which refers to a process/method of identifying an AI/ML model for the common understanding between the NW and the UE, where information regarding the AI/ML model may be shared during model identification.
  • The first issue is that the procedures discussed so far are defined independently for functionality-based LCM and model-ID-based LCM, which have different requirements during the identification procedure.
  • The second issue is that the information to be shared and aligned during the identification procedure differs depending on the management purpose, e.g., general functionality application conditions versus details on individual models.
  • The third issue is that, although the detailed information exchange during the identification procedures needs to be defined for each use case, the basic signaling and procedures to enable identification and LCM are expected to be common. While detailed operations in both LCM schemes, such as monitoring and fallback, are always discussed per sub-use case, there are still common operations, such as activation and deactivation, to be further studied.
  • The UE may transmit, to a network entity, an indication on whether model identification is supported for an AI/ML functionality.
  • the network entity may receive the indication.
  • The UE and the network entity may perform a functionality-based LCM procedure of the AI/ML functionality with each other. If the model identification is supported for the AI/ML functionality as indicated, the functionality-based LCM procedure may comprise a model-ID-based LCM procedure.
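As a minimal illustration of this exchange, the following Python sketch shows the functionality-based LCM procedure used as the baseline, with the model-ID-based LCM procedure embedded only when the UE has indicated that model identification is supported. The function name and the step list are illustrative assumptions, not signaling defined by the disclosure.

```python
# Illustrative sketch only: names and steps are assumptions for explanation.
from typing import List


def functionality_based_lcm(model_identification_supported: bool) -> List[str]:
    """Run the functionality-based LCM baseline; embed model-ID-based LCM if indicated."""
    steps = [
        "functionality identification (UE indication included)",
        "functionality activation",
        "functionality-based operations (switching/selection/updating)",
    ]
    if model_identification_supported:
        # Only enabled when the UE's indication says models exist to be identified.
        steps += [
            "model identification",
            "model-ID-based LCM (activation/switching/selection/updating/de-activation)",
        ]
    steps.append("functionality de-activation (deactivates any activated models)")
    return steps


if __name__ == "__main__":
    for step in functionality_based_lcm(model_identification_supported=True):
        print("-", step)
```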
  • FIG. 1 illustrates an example of a wireless communications system 100 in which some embodiments of the present disclosure can be implemented.
  • the wireless communications system 100 may include one or more network entities 102 (also referred to as network equipment (NE) ) , one or more UEs 104, a core network 106, and a packet data network 108.
  • the wireless communications system 100 may support various radio access technologies.
  • the wireless communications system 100 may be a 4G network, such as an LTE network or an LTE-Advanced (LTE-A) network.
  • the wireless communications system 100 may be a 5G network, such as an NR network.
  • the wireless communications system 100 may be a combination of a 4G network and a 5G network, or other suitable radio access technology including Institute of Electrical and Electronics Engineers (IEEE) 802.11 (Wi-Fi) , IEEE 802.16 (WiMAX) , IEEE 802.20.
  • The wireless communications system 100 may support radio access technologies beyond 5G. Additionally, the wireless communications system 100 may support technologies, such as time division multiple access (TDMA), frequency division multiple access (FDMA), or code division multiple access (CDMA), etc.
  • the one or more network entities 102 may be dispersed throughout a geographic region to form the wireless communications system 100.
  • One or more of the network entities 102 described herein may be or include or may be referred to as a network node, a base station, a network element, a radio access network (RAN) , a base transceiver station, an access point, a NodeB, an eNodeB (eNB) , a next-generation NodeB (gNB) , or other suitable terminology.
  • a network entity 102 and a UE 104 may communicate via a communication link 110, which may be a wireless or wired connection.
  • a network entity 102 and a UE 104 may perform wireless communication (e.g., receive signaling, transmit signaling) over a Uu interface.
  • a network entity 102 in the form of a satellite can directly communicate with a UE 104 using the NR/LTE Uu interface.
  • the satellite may be a transparent satellite or a regenerative satellite.
  • a base station on earth may communicate with a UE via the satellite.
  • the base station may be on board and directly communicate with the UE.
  • a network entity 102 may provide a geographic coverage area 112 for which the network entity 102 may support services (e.g., voice, video, packet data, messaging, broadcast, etc. ) for one or more UEs 104 within the geographic coverage area 112.
  • a network entity 102 and a UE 104 may support wireless communication of signals related to services (e.g., voice, video, packet data, messaging, broadcast, etc. ) according to one or multiple radio access technologies.
  • a network entity 102 may be moveable, for example, a satellite associated with a non-terrestrial network.
  • different geographic coverage areas 112 associated with the same or different radio access technologies may overlap, but the different geographic coverage areas 112 may be associated with different network entities 102.
  • Information and signals described herein may be represented using any of a variety of different technologies and techniques.
  • data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
  • the one or more UEs 104 may be dispersed throughout a geographic region of the wireless communications system 100.
  • a UE 104 may include or may be referred to as a mobile device, a wireless device, a remote device, a remote unit, a handheld device, or a subscriber device, or some other suitable terminology.
  • the UE 104 may be referred to as a unit, a station, a terminal, or a client, among other examples.
  • the UE 104 may be referred to as an Internet-of-Things (IoT) device, an Internet-of-Everything (IoE) device, or machine-type communication (MTC) device, among other examples.
  • a UE 104 may be stationary in the wireless communications system 100.
  • a UE 104 may be mobile in the wireless communications system 100.
  • the one or more UEs 104 may be devices in different forms or having different capabilities. Some examples of UEs 104 are illustrated in FIG. 1.
  • a UE 104 may be capable of communicating with various types of devices, such as the network entities 102, other UEs 104, or network equipment (e.g., the core network 106, the packet data network 108, a relay device, an integrated access and backhaul (IAB) node, or another network equipment) , as shown in FIG. 1.
  • a UE 104 may support communication with other network entities 102 or UEs 104, which may act as relays in the wireless communications system 100.
  • a UE 104 may also be able to support wireless communication directly with other UEs 104 over a communication link 114.
  • a UE 104 may support wireless communication directly with another UE 104 over a device-to-device (D2D) communication link.
  • the communication link 114 may be referred to as a sidelink.
  • a UE 104 may support wireless communication directly with another UE 104 over a PC5 interface.
  • a network entity 102 may support communications with the core network 106, or with another network entity 102, or both.
  • a network entity 102 may interface with the core network 106 through one or more backhaul links 116 (e.g., via an S1, N2, or another network interface).
  • the network entities 102 may communicate with each other over the backhaul links 116 (e.g., via an X2, Xn, or another network interface) .
  • the network entities 102 may communicate with each other directly (e.g., between the network entities 102) .
  • the network entities 102 may communicate with each other indirectly (e.g., via the core network 106).
  • one or more network entities 102 may include subcomponents, such as an access network entity, which may be an example of an access node controller (ANC) .
  • An ANC may communicate with the one or more UEs 104 through one or more other access network transmission entities, which may be referred to as radio heads, smart radio heads, or transmission-reception points (TRPs).
  • a network entity 102 may be configured in a disaggregated architecture, which may be configured to utilize a protocol stack physically or logically distributed among two or more network entities 102, such as an integrated access backhaul (IAB) network, an open RAN (O-RAN) (e.g., a network configuration sponsored by the O-RAN Alliance) , or a virtualized RAN (vRAN) (e.g., a cloud RAN (C-RAN) ) .
  • a network entity 102 may include one or more of a central unit (CU) , a distributed unit (DU) , a radio unit (RU) , a RAN Intelligent Controller (RIC) (e.g., a Near-Real Time RIC (Near-RT RIC) , a Non-Real Time RIC (Non-RT RIC) ) , a Service Management and Orchestration (SMO) system, or any combination thereof.
  • An RU may also be referred to as a radio head, a smart radio head, a remote radio head (RRH) , a remote radio unit (RRU) , or a transmission reception point (TRP) .
  • One or more components of the network entities 102 in a disaggregated RAN architecture may be co-located, or one or more components of the network entities 102 may be located in distributed locations (e.g., separate physical locations) .
  • one or more network entities 102 of a disaggregated RAN architecture may be implemented as virtual units (e.g., a virtual CU (VCU) , a virtual DU (VDU) , a virtual RU (VRU) ) .
  • Split of functionality between a CU, a DU, and an RU may be flexible and may support different functionalities depending upon which functions (e.g., network layer functions, protocol layer functions, baseband functions, radio frequency functions, and any combinations thereof) are performed at a CU, a DU, or an RU.
  • a functional split of a protocol stack may be employed between a CU and a DU such that the CU may support one or more layers of the protocol stack and the DU may support one or more different layers of the protocol stack.
  • the CU may host upper protocol layer (e.g., a layer 3 (L3), a layer 2 (L2)) functionality and signaling (e.g., Radio Resource Control (RRC), service data adaptation protocol (SDAP), Packet Data Convergence Protocol (PDCP)).
  • the CU may be connected to one or more DUs or RUs, and the one or more DUs or RUs may host lower protocol layers, such as a layer 1 (L1) (e.g., physical (PHY) layer) or an L2 (e.g., radio link control (RLC) layer, medium access control (MAC) layer) functionality and signaling, and may each be at least partially controlled by the CU.
  • a functional split of the protocol stack may be employed between a DU and an RU such that the DU may support one or more layers of the protocol stack and the RU may support one or more different layers of the protocol stack.
  • the DU may support one or multiple different cells (e.g., via one or more RUs) .
  • a functional split between a CU and a DU, or between a DU and an RU may be within a protocol layer (e.g., some functions for a protocol layer may be performed by one of a CU, a DU, or an RU, while other functions of the protocol layer are performed by a different one of the CU, the DU, or the RU) .
  • a CU may be functionally split further into CU control plane (CU-CP) and CU user plane (CU-UP) functions.
  • a CU may be connected to one or more DUs via a midhaul communication link (e.g., F1, F1-c, F1-u)
  • a DU may be connected to one or more RUs via a fronthaul communication link (e.g., open fronthaul (FH) interface)
  • a midhaul communication link or a fronthaul communication link may be implemented in accordance with an interface (e.g., a channel) between layers of a protocol stack supported by respective network entities 102 that are in communication via such communication links.
  • the core network 106 may support user authentication, access authorization, tracking, connectivity, and other access, routing, or mobility functions.
  • the core network 106 may be an evolved packet core (EPC) , or a 5G core (5GC) , which may include a control plane entity that manages access and mobility (e.g., a mobility management entity (MME) , an access and mobility management functions (AMF) ) and a user plane entity that routes packets or interconnects to external networks (e.g., a serving gateway (S-GW) , a Packet Data Network (PDN) gateway (P-GW) , or a user plane function (UPF) ) .
  • control plane entity may manage non-access stratum (NAS) functions, such as mobility, authentication, and bearer management (e.g., data bearers, signal bearers, etc. ) for the one or more UEs 104 served by the one or more network entities 102 associated with the core network 106.
  • the core network 106 may communicate with the packet data network 108 over one or more backhaul links 116 (e.g., via an S1, N2, or another network interface).
  • the packet data network 108 may include an application server 118.
  • one or more UEs 104 may communicate with the application server 118.
  • a UE 104 may establish a session (e.g., a protocol data unit (PDU) session, or the like) with the core network 106 via a network entity 102.
  • the core network 106 may route traffic (e.g., control information, data, and the like) between the UE 104 and the application server 118 using the established session (e.g., the established PDU session) .
  • the PDU session may be an example of a logical connection between the UE 104 and the core network 106 (e.g., one or more network functions of the core network 106) .
  • the network entities 102 and the UEs 104 may use resources of the wireless communications system 100 (e.g., time resources (e.g., symbols, slots, subframes, frames, or the like) or frequency resources (e.g., subcarriers, carriers) ) to perform various operations (e.g., wireless communications) .
  • the network entities 102 and the UEs 104 may support different resource structures.
  • the network entities 102 and the UEs 104 may support different frame structures.
  • the network entities 102 and the UEs 104 may support a single frame structure.
  • the network entities 102 and the UEs 104 may support various frame structures (i.e., multiple frame structures) .
  • the network entities 102 and the UEs 104 may support various frame structures based on one or more numerologies.
  • One or more numerologies may be supported in the wireless communications system 100, and a numerology may include a subcarrier spacing and a cyclic prefix.
  • For example, a first numerology may be associated with a first subcarrier spacing (e.g., 15 kHz) and a normal cyclic prefix. The first numerology associated with the first subcarrier spacing (e.g., 15 kHz) may utilize one slot per subframe.
  • a time interval of a resource may be organized according to frames (also referred to as radio frames) .
  • Each frame may have a duration, for example, a 10 millisecond (ms) duration.
  • each frame may include multiple subframes.
  • each frame may include 10 subframes, and each subframe may have a duration, for example, a 1 ms duration.
  • each frame may have the same duration.
  • each subframe of a frame may have the same duration.
  • a time interval of a resource may be organized according to slots.
  • a subframe may include a number (e.g., quantity) of slots.
  • the number of slots in each subframe may also depend on the one or more numerologies supported in the wireless communications system 100.
  • Each slot may include a number (e.g., quantity) of symbols (e.g., OFDM symbols) .
  • the number (e.g., quantity) of slots for a subframe may depend on a numerology.
  • For a normal cyclic prefix, a slot may include 14 symbols.
  • For an extended cyclic prefix (e.g., applicable for 60 kHz subcarrier spacing), a slot may include 12 symbols.
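As a worked example of the framing relations above, the following sketch computes slots per subframe and symbols per slot. It assumes the standard NR numerology convention (the number of slots per 1 ms subframe doubles with each doubling of the subcarrier spacing relative to 15 kHz), which goes slightly beyond what this document states explicitly.

```python
# Worked example of the numerology relations: 10 ms frame, 1 ms subframe,
# 2**mu slots per subframe, 14 symbols per slot (normal CP) or 12 (extended CP).
def slots_per_subframe(subcarrier_spacing_khz: int) -> int:
    # mu = log2(SCS / 15 kHz); 15 kHz -> 1 slot, 30 kHz -> 2, 60 kHz -> 4, 120 kHz -> 8
    mu = {15: 0, 30: 1, 60: 2, 120: 3}[subcarrier_spacing_khz]
    return 2 ** mu


def symbols_per_slot(extended_cp: bool = False) -> int:
    # 14 symbols per slot for a normal cyclic prefix, 12 for an extended cyclic prefix.
    return 12 if extended_cp else 14


if __name__ == "__main__":
    for scs in (15, 30, 60, 120):
        print(f"{scs} kHz SCS: {slots_per_subframe(scs)} slot(s) per 1 ms subframe, "
              f"{slots_per_subframe(scs) * 10} slots per 10 ms frame")
    print("symbols per slot (normal CP):", symbols_per_slot())
    print("symbols per slot (extended CP, e.g., 60 kHz):", symbols_per_slot(extended_cp=True))
```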
  • an electromagnetic (EM) spectrum may be split, based on frequency or wavelength, into various classes, frequency bands, frequency channels, etc.
  • the wireless communications system 100 may support one or multiple operating frequency bands, such as frequency range designations FR1 (410 MHz – 7.125 GHz), FR2 (24.25 GHz – 52.6 GHz), FR3 (7.125 GHz – 24.25 GHz), FR4 (52.6 GHz – 114.25 GHz), FR4a or FR4-1 (52.6 GHz – 71 GHz), and FR5 (114.25 GHz – 500 GHz).
  • the network entities 102 and the UEs 104 may perform wireless communications over one or more of the operating frequency bands.
  • FR1 may be used by the network entities 102 and the UEs 104, among other equipment or devices for cellular communications traffic (e.g., control information, data) .
  • FR2 may be used by the network entities 102 and the UEs 104, among other equipment or devices for short-range, high data rate capabilities.
  • FR1 may be associated with one or multiple numerologies (e.g., at least three numerologies) .
  • FR2 may be associated with one or multiple numerologies (e.g., at least 2 numerologies) .
  • the functionality-based LCM and model-ID-based LCM on the UE-side/UE-part models have the same purpose, i.e., to support the AI/ML-related operations, but with different manageable units.
  • The manageable unit of functionality-based LCM is ‘functionality’, which is defined as a set of configuration parameters of an AI/ML-enabled Feature/FG, while the unit of model-ID-based LCM is an AI/ML model and the related configuration parameters of the applied AI/ML models. Accordingly, there could be different schemes to apply the LCM: functionality-based LCM only, model-ID-based LCM only, and joint functionality-based and model-ID-based LCM.
  • FIG. 2A illustrates an example of manageable units of functionality-based LCM.
  • the manageable unit is ‘functionality’ , which refers to an AI/ML-enabled Feature/Feature Group (FG) enabled by configuration (s) , and the configuration (s) is (are) supported based on conditions indicated by UE capability.
  • UE may report such information to NW for management.
  • The deployed AI/ML models used for inference at the UE are not visible to the NW, since it is not necessary for the NW to have knowledge of the deployed and currently used models.
  • The NW may provide the assistance information for the applicable functionality according to the configurations provided during the identification procedure, and the UE can select the model by itself.
  • The proprietary information of the models may be well protected, and this approach is beneficial for the one-sided model, especially a model from local training.
  • FIG. 2B illustrates an example of manageable units of model-ID-based LCM.
  • the manageable unit is ‘model’ , whose description and application conditions are reported during model identification.
  • UE may report such information to NW for management.
  • The deployed AI/ML models (e.g., at least a descriptive representation) used for inference at the UE are visible to the NW, and the NW may provide the assistance information, if needed, and operate the model directly. The NW can select the model for activation, monitoring and de-activation. In this sense, the models’ information is well disclosed. This approach is beneficial for two-sided models and one-sided models with model transfer from the NW.
  • FIG. 2C illustrates an example of manageable units of joint functionality/model-ID-based LCM where models are grouped by functionality.
  • The manageable unit may be ‘model group’, identified by a ‘functionality’, whose descriptions, configurations, and application conditions are indicated in a hierarchical way, i.e., some in functionality identification and others in individual model identification.
  • UE may report such information to NW for management.
  • The NW may provide the assistance information and operate the models directly, in the same way as shown in FIG. 2B.
  • The NW may select a group of models within a functionality, for example for monitoring. This approach is beneficial for the two-sided model and for multiple models.
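The three manageable-unit options above can be pictured in data-structure form with the following sketch. The class and field names are assumptions for illustration, not specified information elements: a functionality with an empty model list corresponds to functionality-based LCM only (models invisible to the NW), standalone models correspond to model-ID-based LCM only, and models grouped under a functionality correspond to the joint scheme of FIG. 2C.

```python
# Illustrative data-structure sketch of the manageable units; names are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class AimlModel:
    model_id: str                      # aligned model ID or local temporary ID
    description: str = ""              # e.g., required model input / potential model output
    application_conditions: Dict[str, str] = field(default_factory=dict)


@dataclass
class AimlFunctionality:
    name: str                          # AI/ML-enabled Feature/FG, e.g., "Functionality A"
    configurations: Dict[str, str] = field(default_factory=dict)
    application_conditions: Dict[str, str] = field(default_factory=dict)
    # Empty for functionality-based LCM only; populated for the joint scheme (FIG. 2C).
    models: List[AimlModel] = field(default_factory=list)


csi_pred = AimlFunctionality(
    name="Functionality A",
    configurations={"report_config": "periodic"},
    models=[AimlModel("model-1", "CSI history in, predicted CSI out"),
            AimlModel("model-2", "CSI history in, predicted CSI out")],
)
print(f"{csi_pred.name}: {len(csi_pred.models)} identified model(s) grouped for LCM")
```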
  • A general model identification procedure for UE-side models and/or UE parts of two-sided models may be categorized as Type A and Type B1/B2, as illustrated in FIGS. 3A-3C.
  • For Type A, as shown in FIG. 3A, the model is identified without air interface signaling; for Type B1 and B2, as shown in FIGS. 3B and 3C, the model is identified over the air interface, initiated by either the NW or the UE.
  • The potential specification impacts and the information to be shared over the air interface are discussed in the following, including one-sided (UE-sided) models and two-sided models. The main objective is to share the relevant information with the NW during the identification procedure.
  • FIG. 4A illustrates an example of signalling in a Type A model identification procedure.
  • the model ID can be directly used for the following model-ID-based LCM.
  • the applicable models are reported using the model IDs, and a model can be activated directly based on the model ID.
  • FIG. 4B illustrates an example of signalling in a Type B1 model identification procedure.
  • The procedure would include a model ID request initiated by the UE and an assignment from the NW, together with the corresponding model description, as illustrated in FIG. 4B.
  • The basic procedure in this type may include the UE’s request for model IDs for its local models, which can be realized via sending a request with a temporary ID and some model descriptions.
  • The NW confirms the identification/registration, assigns the model ID(s) to the requested models, and requests further potential information if needed.
  • After the model identification, the assigned model ID(s) and the relevant information about the models, i.e., the model descriptions, have been aligned for the future model-ID-based LCM.
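A minimal sketch of this Type B1 exchange is given below: the UE requests model IDs for its local models using temporary IDs and short descriptions, and the NW confirms by assigning model IDs. The message shapes and the numeric ID format are assumptions for illustration only.

```python
# Illustrative Type B1 exchange: temporary IDs from the UE, model IDs assigned by the NW.
from typing import Dict, List


def ue_build_request(local_models: Dict[str, str]) -> List[dict]:
    """Build a model-ID request: one entry per local model, keyed by a temporary ID."""
    return [{"temp_id": temp_id, "description": desc}
            for temp_id, desc in local_models.items()]


def nw_assign_model_ids(request: List[dict], next_id: int = 100) -> Dict[str, int]:
    """NW confirms identification/registration and assigns a model ID per temporary ID."""
    return {entry["temp_id"]: next_id + i for i, entry in enumerate(request)}


if __name__ == "__main__":
    local = {"tmp-a": "beam prediction, set B -> set A",
             "tmp-b": "beam prediction, wide -> narrow"}
    req = ue_build_request(local)
    assigned = nw_assign_model_ids(req)
    # After this exchange both sides refer to the models by the assigned model IDs
    # in the subsequent model-ID-based LCM.
    print(assigned)   # e.g., {'tmp-a': 100, 'tmp-b': 101}
```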
  • FIG. 4C illustrates an example of signalling in a Type B2 model identification procedure.
  • The procedure may include a model request initiated by the UE and model transfer from the NW, together with the model ID(s) and model description, as illustrated in FIG. 4C.
  • The basic procedure in this type may include the UE’s request for some AI/ML models for some applications, together with relevant requirements and its capabilities.
  • The NW may confirm the request and transfer the model with a model ID(s), together with the model description if needed.
  • After the model identification, the assigned model ID(s) and the relevant information about the models, i.e., the model descriptions, have been aligned for the future model-ID-based LCM.
  • A set of signalling to support a unified LCM framework is proposed to support AI/ML over the air interface.
  • The ‘functionality identification’ and ‘functionality-based LCM’ are selected as the baseline, where the indication on whether to support ‘model identification’ and ‘model-ID-based LCM’ or not will be explicitly included during functionality identification.
  • the signals to support such hierarchical procedures are further proposed.
  • FIG. 5 illustrates an example of a process flow 500 of a unified framework for LCM in accordance with some example embodiments of the present disclosure.
  • the process flow 500 may involve a UE 501 and a network entity (e.g. a base station, such as gNB) 502.
  • the process flow 500 may be applied to the wireless communications system 100 with reference to FIG. 1, for example, the UE 501 may be any of UEs 104, and the network entity 502 may be any of the network entities 102. It would be appreciated that the process flow 500 may be applied to other communication scenarios.
  • the UE 501 may transmit, to the network entity 502, an indication 515 on whether model identification is supported for an AI/ML functionality. Accordingly, at 520, the network entity 502 may receive the indication 515 from the UE 501.
  • the AI/ML functionality may refer to an AI/ML-enabled Feature/Feature Group (FG) enabled by configuration (s) , where configuration (s) is (are) supported based on conditions indicated by UE capability.
  • the functionality-based LCM may operate based on, at least, one configuration of AI/ML-enabled Feature/FG or specific configurations of an AI/ML-enabled Feature/FG.
  • the AI/ML functionality may be associated with AI/ML model (s) .
  • the indication 515 may comprise one bit indicating existence of at least one AI/ML model associated with the AI/ML functionality to be identified.
  • the indication 515 may comprise identifier (s) of the AI/ML model (s) associated with the AI/ML functionality.
  • the identifier (s) may be model ID (s) aligned with the network entity 502, or temporary ID (s) which could be used to derive model ID (s) during model identification for Type B1.
  • the indication 515 may be included in information regarding the AI/ML functionality and transmitted to the network entity 502 during a functionality identification procedure.
  • the UE may transmit the information regarding the AI/ML functionality and the indication in dedicated UE capability or UE assistance information.
  • the information regarding the AI/ML functionality may comprise configurations of an AI/ML-enabled feature or a feature group. Additionally or alternatively, the information regarding the AI/ML functionality may comprise application conditions.
  • the UE 501 and the network entity 502 may perform a functionality-based LCM procedure of the AI/ML functionality.
  • For the LCM of the AI/ML functionality, the functionality identification and the basic functionality-based LCM (e.g., operations on the functionality) are selected as the baseline. If the model identification is supported for the AI/ML functionality and the corresponding indication is explicitly transmitted to the network entity 502, the UE 501 and the network entity 502 may perform a model-ID-based LCM procedure of the at least one AI/ML model associated with the AI/ML functionality during the functionality-based LCM procedure 530. Details will be described with reference to FIGS. 6 to 10 hereafter.
  • FIG. 6 illustrates another example of a process flow 600 of a unified framework for LCM in accordance with some example embodiments of the present disclosure.
  • the process flow 600 may be an example implementation of the process flow 500 as shown in FIG. 5.
  • the process flow 600 is based on the framework of functionality identification 610 and functionality-based LCM 620, wherein the model identification 623 and model-ID-based LCM 624 are embedded.
  • the process flow 600 comprises a functionality identification procedure 610.
  • As defined in 3GPP, the functionality identification procedure 610 is a process/method of identifying an AI/ML functionality for the common understanding between the NW and the UE, and the relevant information regarding the AI/ML functionality may be shared during functionality identification.
  • configurations of an AI/ML-enabled feature or a feature group and application conditions associated with the AI/ML functionality may be transmitted between the UE 501 and the network entity 502.
  • an indication on whether to support model identification and model-ID-based LCM or not may be included in the information regarding the AI/ML functionality during the functionality identification procedure 610.
  • the process flow 600 may further comprise a functionality-based LCM 620 following the functionality identification procedure 610.
  • the basic LCM with the manageable unit of functionality is used. If the support on model identification and model-ID-based LCM is indicated, the model identification and following LCM on AI/ML models associated with the functionality may be triggered.
  • the functionality-based LCM may comprise a functionality activation procedure 621.
  • the functionality activation procedure 621 may be initiated by either UE 501 or the network entity 502. If the condition (s) pre-defined and aligned during functionality identification is satisfied, the functionality activation procedure 621 is initiated.
  • For NW-side models, the activation of functionalities may not be specified.
  • For UE-side models, the activation of functionalities needs to be indicated or configured by the network entity 502, or be done by the UE 501, in which case the UE may need to report to the network entity 502.
  • the process flow 600 may further comprise functionality-based operations 622.
  • the basic LCM with the manageable unit of functionality is used, such as functionality switching, selection, updating, and the like.
  • the process flow 600 may further comprise an AI/ML model identification procedure 623 and an AI/ML model-ID-based LCM 624.
  • The models in the functionality need to be further identified, at least for the case that more than one model may be deployed for a functionality, i.e., to identify at least one AI/ML model in the functionality for the common understanding between the network entity 502 and the UE 501.
  • the UE 501 may further report the number of models in the functionality and corresponding descriptions of the models. The descriptions may at least include the required model input and the potential model output, which can be reported per model or per functionality.
  • Additional conditions associated with the models may be reported by UE 501, e.g., by uplink control information (UCI) or medium access control (MAC) control element (MAC-CE) .
  • the network entity 502 may provide an appropriate configuration to the UE 501 for inference.
  • One example is the required number of RS transmissions in time domain for temporal domain CSI or beam prediction.
  • the basic LCM with the manageable unit of model is used, such as model activation, model selection, model switching, model updating and model deactivation.
  • The process flow 600 may further comprise an AI/ML functionality de-activation procedure 625. If the monitored performance has degraded or is about to degrade, and/or the applicable condition is no longer satisfied, the functionality would be de-activated or disabled, which also deactivates all models activated in this functionality.
  • The deactivation of a functionality may be performed by the UE 501 but needs to be reported to the network entity 502, at least for the case that the ground truth is first available at the UE side. The detailed procedures with new designs in each procedure are described below.
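A minimal sketch of this de-activation rule is shown below. The state object, field names, and threshold logic are hypothetical; the point is only that de-activating a functionality also de-activates the models activated within it, and that a UE-side de-activation is reported to the network entity.

```python
# Illustrative de-activation rule for procedure 625; names are assumptions.
from dataclasses import dataclass, field
from typing import List


@dataclass
class FunctionalityState:
    active: bool = True
    activated_models: List[str] = field(default_factory=list)


def monitor_and_deactivate(state: FunctionalityState,
                           performance_degraded: bool,
                           applicable_condition_met: bool) -> bool:
    """De-activate the functionality (and its models) if needed; return True if de-activated."""
    if state.active and (performance_degraded or not applicable_condition_met):
        state.active = False
        state.activated_models.clear()   # de-activating the functionality also de-activates its models
        return True                      # the UE reports the de-activation to the network entity
    return False


state = FunctionalityState(activated_models=["model-1"])
deactivated = monitor_and_deactivate(state, performance_degraded=True, applicable_condition_met=True)
print("active:", state.active, "models:", state.activated_models, "de-activated:", deactivated)
```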
  • FIG. 7 illustrates an example of a functionality identification procedure 700 in accordance with some example embodiments of the present disclosure.
  • the functionality identification procedure 700 may be an example of the functionality identification procedure 610 illustrated in FIG. 6.
  • some relevant information regarding the AI/ML functionality may be shared and aligned, mainly including the conditions to activate the AI/ML functionality.
  • the functionality identification procedure 700 may include two signaling interactions.
  • the network entity 502 queries the AI/ML functionalities at UE side.
  • the network entity 502 may send an enquiry message on the information of the functionalities via a signaling, e.g., radio resource control (RRC) signaling such as an extended UE capability message, UECapabilityEnquiry or a new AI-related message, UEAICapabilityEnquiry or UEAIInformationRequest for AI-related information request.
  • RRC radio resource control
  • the UE 501 may report the AI/ML functionalities to the network entity 502.
  • The functionalities would be reported to the network entity 502 in UE capability (e.g., the UECapabilityInformation RRC message) as AI-related feature groups, or for example in UE assistance information (UAI) (e.g., the UEAssistanceInformation RRC message) or other signaling, e.g., dedicated RRC parameters or a dedicated MAC CE format.
  • Step 0 is not always necessary; thus, it can be optional in this disclosure.
  • The indication on whether or not model identification and model-ID-based LCM are supported for the model(s) associated with the functionality may be included together with other information to be shared during functionality identification, e.g., application conditions.
  • FIG. 8 illustrates a schematic diagram of an example of indication on AI/ML models of an AI/ML functionality to be or not be identified and managed in accordance with some example embodiments of the present disclosure.
  • There may be N models in an AI/ML functionality, where K models need to be identified and managed (or assisted) by the network, and the remaining models do not need such identification and management.
  • The AI/ML models to be identified within the associated functionality will be used in the following model-ID-based LCM to share more information between the network entity 502 and the UE 501, as explained below.
  • the UE 501 may use one bit to indicate there are models to be identified.
  • the information of each model, including model ID will be further provided during the model identification procedure.
  • the UE 501 may directly indicate the models to be identified.
  • The models in the AI/ML functionality may be explicitly indicated, together with an identifier (ID). If Type A is considered, the model IDs are aligned between the two sides, and the identifier(s) may be the model IDs. Otherwise (i.e., for Type B1 and B2), a temporary ID may be used to locally identify a model, which will be assigned a model ID during the following model identification.
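The two indication alternatives above can be pictured with the following sketch, a hypothetical encoding rather than a specified information element: either a single existence bit, or an explicit list in which each of the K models to be identified carries a model ID (Type A, IDs already aligned) or a temporary ID (Type B1/B2, to be replaced by an assigned model ID during model identification).

```python
# Hypothetical encoding of the indication included during functionality identification.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ModelEntry:
    id_value: int
    is_temporary: bool          # True for Type B1/B2 (replaced by an assigned model ID later)


@dataclass
class ModelIdentificationIndication:
    # Alternative 1: a single bit indicating that models exist to be identified.
    models_exist: Optional[bool] = None
    # Alternative 2: explicit identifiers of the K models to be identified
    # (the remaining N-K models are neither identified nor managed by the network).
    model_entries: List[ModelEntry] = field(default_factory=list)


one_bit = ModelIdentificationIndication(models_exist=True)
explicit = ModelIdentificationIndication(model_entries=[ModelEntry(1, is_temporary=False),
                                                        ModelEntry(2, is_temporary=True)])
print(one_bit, explicit, sep="\n")
```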
  • FIG. 9 illustrates an example of a functionality activation procedure 900 in accordance with some example embodiments of the present disclosure.
  • The functionality activation procedure 900 may be an example of the functionality activation procedure 621 illustrated in FIG. 6. To manage the functionality, it is necessary to activate the functionality before any operation on it.
  • the functionality activation procedure 900 may include the following interactions between the network entity 502 and the UE 501.
  • the UE 501 may initiate to activate a functionality. If the condition to activate the functionality is satisfied by UE’s monitoring/assessment, the UE 501 may ask the network entity 502 to assist or confirm the functionality activation via dedicated signaling, e.g., RRC or MAC CE signaling.
  • the network entity 502 may activate the functionality. If the condition to activate the functionality is satisfied by the network entity 502’s monitoring or assessment, the network entity 502 may ask the UE 501 to activate the functionality via dedicated signaling, e.g., RRC or MAC CE signaling.
  • the UE 501 may confirm the activation.
  • The corresponding functionality may be activated if the other relevant conditions are satisfied, e.g., battery or computation load. This step can be implicitly included within other signaling after the reception of the activation from the network entity 502. Then, the functionality is activated for the LCM operations, which are used by the network entity 502 to manage the functionality and AI/ML models in the UE 501.
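The activation handshake can be summarized with the following sketch, where either side may initiate and the UE confirms only if its other relevant conditions (e.g., battery or computation load) are satisfied. The function names and return strings are illustrative assumptions, not specified signaling.

```python
# Illustrative functionality activation handshake (FIG. 9); names are assumptions.
def ue_local_conditions_ok(battery_ok: bool = True, computation_load_ok: bool = True) -> bool:
    return battery_ok and computation_load_ok


def activate_functionality(initiator: str, activation_condition_met: bool) -> str:
    """initiator: 'UE' (UE-initiated request) or 'NW' (network-initiated activation)."""
    if not activation_condition_met:
        return "no activation: pre-defined condition not satisfied"
    if initiator == "UE":
        # UE asks the NW to assist or confirm activation via dedicated signaling (RRC / MAC CE).
        return "UE request sent; waiting for NW confirmation"
    # NW-initiated: the UE confirms only when the other relevant conditions are satisfied.
    if ue_local_conditions_ok():
        return "functionality activated; UE confirmation (possibly implicit in other signaling)"
    return "activation deferred: UE-side conditions (battery/computation load) not satisfied"


print(activate_functionality("NW", activation_condition_met=True))
```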
  • After the functionality is activated, the model identification procedure (e.g., the procedure 623) may be enabled.
  • the model identification procedure may comprise the basic Type A, B1 or B2 identifications as described with reference to FIGS. 4A-4C.
  • the model identification procedure may include additional features in each type considering the proposal in this disclosure.
  • the UE 501 may determine model ID (s) of AI/ML model (s) of the functionality based on the indication, and exchange, using the model ID, information regarding the AI/ML model (s) with the network entity 502 during the model identification procedure.
  • For Type A, the model ID(s) indicated from the UE 501 to the network entity 502 may be used for model identification.
  • the model ID (s) is (are) explicitly indicated during functionality identification.
  • further finer information, e.g., application conditions or model description information, for the model may be exchanged using the model IDs during the model identification procedure.
  • For Type B1, the information provided during functionality identification, for example, one bit indicating the existence of models or the temporary ID(s) of the model(s) to be identified, can be used for model identification.
  • A model in this functionality needs to be assigned a temporary ID, which may be determined during identification by the network entity based on the number of models.
  • The further finer information, e.g., application conditions or model description information, for the model can be exchanged using the assigned model IDs.
  • The UE and the network entity may perform model identification for all of the models of the functionality.
  • For Type B2, there is no impact. Since this type is for model transfer from the network entity to the UE, the proposal for the UE-side/part model in this disclosure has no impact on this type of model identification.
  • For the model-ID-based LCM, the related operations, e.g., model switching, model selection, model updating and model deactivation, may be done as discussed in 3GPP.
  • additional operations on the AI/ML functionality and associated models are included in the unified LCM framework, including functionality switching, updating and de-activation.
  • FIG. 10 illustrates an example of a LCM procedure 1000 including operations on an AI/ML functionality and models in accordance with some example embodiments of the present disclosure.
  • the functionality switching may include two operations: to de-activate current activated functionality and activate another functionality.
  • the functionality de-activation means to de-activate an activated functionality. If multiple models are identified within a functionality, all the models of this functionality may be deactivated when a functionality is deactivated.
  • An operation on the AI/ML functionality applies the same operation on the at least one AI/ML model of the AI/ML functionality. For example, once the functionality is switched or updated, the models in the functionality may need to be identified again as above. In addition, if the functionality is switched back without being updated, i.e., the functionality was previously activated and has not been updated, the identification of the models in this functionality may not be necessary.
  • The models in the functionality may be deactivated, no matter whether they are activated or not.
  • The functionality may be deactivated, and so are the models in it.
  • In general, any operation on the functionality also applies to the models in the functionality.
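This cascading behaviour can be sketched as follows, using hypothetical state objects rather than specified signaling: an operation applied to the functionality is applied to its models, switching de-activates the current functionality and activates another, and switching back to a functionality that was not updated does not require re-identifying its models.

```python
# Illustrative sketch of operations cascading from a functionality to its models.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Functionality:
    name: str
    models: List[str] = field(default_factory=list)
    active: bool = False
    models_identified: bool = False
    updated_since_identification: bool = False


def apply_operation(func: Functionality, operation: str) -> None:
    """Apply an LCM operation to the functionality and, implicitly, to its models."""
    if operation == "activate":
        func.active = True
        # Re-identification is only needed if the models are new or the functionality was updated.
        if not func.models_identified or func.updated_since_identification:
            func.models_identified = True
            func.updated_since_identification = False
    elif operation == "deactivate":
        func.active = False            # all models in the functionality are de-activated too
    elif operation == "update":
        func.updated_since_identification = True


def switch(current: Functionality, target: Functionality) -> None:
    apply_operation(current, "deactivate")
    apply_operation(target, "activate")


a = Functionality("Functionality A", models=["model-1", "model-2"])
b = Functionality("Functionality B", models=["model-3"])
apply_operation(a, "activate")
switch(a, b)   # A de-activated (with its models), B activated and its models identified
switch(b, a)   # switching back: A was not updated, so no re-identification is needed
print(a.active, a.models_identified, b.active)
```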
  • The present disclosure proposes a unified LCM framework which uses functionality-based LCM as the baseline and indicates the models within the functionality for possible identification and model-ID-based LCM after activation.
  • the relevant signaling to support such unified LCM framework can be realized via dedicated RRC signaling and/or MAC CE in specifications for all potential use cases.
  • FIG. 11 illustrates an example of a device that is suitable for implementing some embodiments of the present disclosure.
  • the device 1100 may be an example of a UE 104 or network entity 102 as described herein.
  • the device 1100 may support wireless communication with one or more network entities 102, UEs 104, or any combination thereof.
  • the device 1100 may include components for bi-directional communications including components for transmitting and receiving communications, such as a processor 1102, a memory 1104, a transceiver 1106, and, optionally, an I/O controller 1108. These components may be in electronic communication or otherwise coupled (e.g., operatively, communicatively, functionally, electronically, electrically) via one or more interfaces (e.g., buses) .
  • the processor 1102, the memory 1104, the transceiver 1106, or various combinations thereof or various components thereof may be examples of means for performing various aspects of the present disclosure as described herein.
  • the processor 1102, the memory 1104, the transceiver 1106, or various combinations or components thereof may support a method for performing one or more of the operations described herein.
  • the processor 1102, the memory 1104, the transceiver 1106, or various combinations or components thereof may be implemented in hardware (e.g., in communications management circuitry) .
  • the hardware may include a processor, a digital signal processor (DSP) , an application-specific integrated circuit (ASIC) , a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic, discrete hardware components, or any combination thereof configured as or otherwise supporting a means for performing the functions described in the present disclosure.
  • the processor 1102 and the memory 1104 coupled with the processor 1102 may be configured to perform one or more of the functions described herein (e.g., executing, by the processor 1102, instructions stored in the memory 1104) .
  • the processor 1102 may support wireless communication at the device 1100 in accordance with examples as disclosed herein.
  • the device 1100 may be an example of a UE 104.
  • the processor 1102 may be configured to, or operable to, support a means for transmitting, to a network entity, an indication on whether model identification is supported for an AI/ML functionality; and a means for performing a functionality-based LCM procedure of the AI/ML functionality, wherein, in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID) -based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • the device 1100 may be an example of a network entity 102, e.g., a base station.
  • the processor 1102 may be configured to, or operable to, support a means for receiving, from a UE, an indication on whether model identification is supported for an AI/ML functionality; and a means for performing a functionality-based LCM procedure of the AI/ML functionality, wherein, in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID) -based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • the processor 1102 may include an intelligent hardware device (e.g., a general-purpose processor, a DSP, a CPU, a microcontroller, an ASIC, an FPGA, a programmable logic device, a discrete gate or transistor logic component, a discrete hardware component, or any combination thereof) .
  • the processor 1102 may be configured to operate a memory array using a memory controller.
  • a memory controller may be integrated into the processor 1102.
  • the processor 1102 may be configured to execute computer-readable instructions stored in a memory (e.g., the memory 1104) to cause the device 1100 to perform various functions of the present disclosure.
  • the memory 1104 may include random access memory (RAM) and read-only memory (ROM) .
  • the memory 1104 may store computer-readable, computer-executable code including instructions that, when executed by the processor 1102, cause the device 1100 to perform various functions described herein.
  • the code may be stored in a non-transitory computer-readable medium such as system memory or another type of memory.
  • the code may not be directly executable by the processor 1102 but may cause a computer (e.g., when compiled and executed) to perform functions described herein.
  • the memory 1104 may include, among other things, a basic I/O system (BIOS) which may control basic hardware or software operation such as the interaction with peripheral components or devices.
  • the I/O controller 1108 may manage input and output signals for the device 1100.
  • the I/O controller 1108 may also manage peripherals not integrated into the device 1100.
  • the I/O controller 1108 may represent a physical connection or port to an external peripheral.
  • the I/O controller 1108 may utilize a known operating system.
  • the I/O controller 1108 may be implemented as part of a processor, such as the processor 1102.
  • a user may interact with the device 1100 via the I/O controller 1108 or via hardware components controlled by the I/O controller 1108.
  • the device 1100 may include a single antenna 1110. However, in some other implementations, the device 1100 may have more than one antenna 1110 (i.e., multiple antennas) , including multiple antenna panels or antenna arrays, which may be capable of concurrently transmitting or receiving multiple wireless transmissions.
  • the transceiver 1106 may communicate bi-directionally, via the one or more antennas 1110, wired, or wireless links as described herein.
  • the transceiver 1106 may represent a wireless transceiver and may communicate bi-directionally with another wireless transceiver.
  • the transceiver 1106 may also include a modem to modulate the packets, to provide the modulated packets to one or more antennas 1110 for transmission, and to demodulate packets received from the one or more antennas 1110.
  • the transceiver 1106 may include one or more transmit chains, one or more receive chains, or a combination thereof.
  • a transmit chain may be configured to generate and transmit signals (e.g., control information, data, packets) .
  • the transmit chain may include at least one modulator for modulating data onto a carrier signal, preparing the signal for transmission over a wireless medium.
  • the at least one modulator may be configured to support one or more techniques such as amplitude modulation (AM) , frequency modulation (FM) , or digital modulation schemes like phase-shift keying (PSK) or quadrature amplitude modulation (QAM) .
  • the transmit chain may also include at least one power amplifier configured to amplify the modulated signal to an appropriate power level suitable for transmission over the wireless medium.
  • the transmit chain may also include one or more antennas 1110 for transmitting the amplified signal into the air or wireless medium.
  • a receive chain may be configured to receive signals (e.g., control information, data, packets) over a wireless medium.
  • the receive chain may include one or more antennas 1110 for receiving the signal over the air or wireless medium.
  • the receive chain may include at least one amplifier (e.g., a low-noise amplifier (LNA) ) configured to amplify the received signal.
  • the receive chain may include at least one demodulator configured to demodulate the received signal and obtain the transmitted data by reversing the modulation technique applied during transmission of the signal.
  • the receive chain may include at least one decoder for decoding and processing the demodulated signal to recover the transmitted data (a simplified modulation example follows).
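As a concrete, simplified illustration of the modulation and demodulation steps mentioned for the transmit and receive chains, the sketch below maps bit pairs onto a QPSK (4-QAM) constellation and recovers them with hard decisions. It omits coding, pulse shaping, amplification, and channel effects, and the function names are chosen only for this example.

    # Gray-coded QPSK (4-QAM) mapping: the first bit sets the sign of the real part,
    # the second bit the sign of the imaginary part (bit 0 -> +1, bit 1 -> -1).
    QPSK_MAP = {
        (0, 0): complex(+1, +1),
        (0, 1): complex(+1, -1),
        (1, 0): complex(-1, +1),
        (1, 1): complex(-1, -1),
    }

    def qpsk_modulate(bits):
        assert len(bits) % 2 == 0, "QPSK carries two bits per symbol"
        return [QPSK_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

    def qpsk_demodulate(symbols):
        # Hard decision: the signs of the real and imaginary parts recover the bits.
        bits = []
        for s in symbols:
            bits += [0 if s.real > 0 else 1, 0 if s.imag > 0 else 1]
        return bits

    assert qpsk_demodulate(qpsk_modulate([0, 1, 1, 0])) == [0, 1, 1, 0]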
  • FIG. 12 illustrates an example of a processor 1200 that is suitable for implementing some embodiments of the present disclosure.
  • the processor 1200 may be an example of a processor configured to perform various operations in accordance with examples as described herein.
  • the processor 1200 may include a controller 1202 configured to perform various operations in accordance with examples as described herein.
  • the processor 1200 may optionally include at least one memory 1204. Additionally, or alternatively, the processor 1200 may optionally include one or more arithmetic-logic units (ALUs) 1206.
  • One or more of these components may be in electronic communication or otherwise coupled (e.g., operatively, communicatively, functionally, electronically, electrically) via one or more interfaces (e.g., buses) .
  • the processor 1200 may be a processor chipset and include a protocol stack (e.g., a software stack) executed by the processor chipset to perform various operations (e.g., receiving, obtaining, retrieving, transmitting, outputting, forwarding, storing, determining, identifying, accessing, writing, reading) in accordance with examples as described herein.
  • the processor chipset may include one or more cores, one or more caches (e.g., memory local to or included in the processor chipset (e.g., the processor 1200) or other memory (e.g., random access memory (RAM), read-only memory (ROM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), static RAM (SRAM), ferroelectric RAM (FeRAM), magnetic RAM (MRAM), resistive RAM (RRAM), flash memory, phase change memory (PCM), and others)).
  • the controller 1202 may be configured to manage and coordinate various operations (e.g., signaling, receiving, obtaining, retrieving, transmitting, outputting, forwarding, storing, determining, identifying, accessing, writing, reading) of the processor 1200 to cause the processor 1200 to support various operations in accordance with examples as described herein.
  • the controller 1202 may operate as a control unit of the processor 1200, generating control signals that manage the operation of various components of the processor 1200. These control signals include enabling or disabling functional units, selecting data paths, initiating memory access, and coordinating timing of operations.
  • the controller 1202 may be configured to fetch (e.g., obtain, retrieve, receive) instructions from the memory 1204 and determine subsequent instruction (s) to be executed to cause the processor 1200 to support various operations in accordance with examples as described herein.
  • the controller 1202 may be configured to track memory address of instructions associated with the memory 1204.
  • the controller 1202 may be configured to decode instructions to determine the operation to be performed and the operands involved.
  • the controller 1202 may be configured to interpret the instruction and determine control signals to be output to other components of the processor 1200 to cause the processor 1200 to support various operations in accordance with examples as described herein.
  • the controller 1202 may be configured to manage flow of data within the processor 1200.
  • the controller 1202 may be configured to control transfer of data between registers, arithmetic logic units (ALUs) , and other functional units of the processor 1200.
  • the memory 1204 may include one or more caches (e.g., memory local to or included in the processor 1200 or other memory, such as RAM, ROM, DRAM, SDRAM, SRAM, MRAM, flash memory, etc.). In some implementations, the memory 1204 may reside within or on a processor chipset (e.g., local to the processor 1200). In some other implementations, the memory 1204 may reside external to the processor chipset (e.g., remote to the processor 1200).
  • the memory 1204 may store computer-readable, computer-executable code including instructions that, when executed by the processor 1200, cause the processor 1200 to perform various functions described herein.
  • the code may be stored in a non-transitory computer-readable medium such as system memory or another type of memory.
  • the controller 1202 and/or the processor 1200 may be configured to execute computer-readable instructions stored in the memory 1204 to cause the processor 1200 to perform various functions (e.g., functions or tasks supporting LCM of AI/ML functionalities and models for air interface enhancement).
  • the processor 1200 and/or the controller 1202 may be coupled with or to the memory 1204; the processor 1200, the controller 1202, and the memory 1204 may be configured to perform various functions described herein.
  • the processor 1200 may include multiple processors and the memory 1204 may include multiple memories. One or more of the multiple processors may be coupled with one or more of the multiple memories, which may, individually or collectively, be configured to perform various functions herein.
  • the one or more ALUs 1206 may be configured to support various operations in accordance with examples as described herein.
  • the one or more ALUs 1206 may reside within or on a processor chipset (e.g., the processor 1200) .
  • the one or more ALUs 1206 may reside external to the processor chipset (e.g., the processor 1200) .
  • One or more ALUs 1206 may perform one or more computations such as addition, subtraction, multiplication, and division on data.
  • one or more ALUs 1206 may receive input operands and an operation code, which determines an operation to be executed.
  • One or more ALUs 1206 may be configured with a variety of logical and arithmetic circuits, including adders, subtractors, shifters, and logic gates, to process and manipulate the data according to the operation. Additionally, or alternatively, the one or more ALUs 1206 may support logical operations such as AND, OR, exclusive-OR (XOR) , not-OR (NOR) , and not-AND (NAND) , enabling the one or more ALUs 1206 to handle conditional operations, comparisons, and bitwise operations.
  • the processor 1200 may support wireless communication in accordance with examples as disclosed herein.
  • the processor 1200 may be implemented at a UE 104.
  • the processor 1200 may be configured to, or operable to, support a means for transmitting, to a network entity, an indication on whether model identification is supported for an AI/ML functionality; and a means for performing a functionality-based LCM procedure of the AI/ML functionality, wherein, in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID) -based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • the processor 1200 may be implemented at a network entity 102, e.g., a base station.
  • the processor 1200 may be configured to, or operable to, support a means for receiving, from a UE, an indication on whether model identification is supported for an AI/ML functionality; and a means for performing a functionality-based LCM procedure of the AI/ML functionality, wherein, in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-identifier (ID) -based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • FIG. 13 illustrates a flowchart of a method 1300 performed by a UE in accordance with aspects of the present disclosure.
  • the operations of the method 1300 may be implemented by a device or its components as described herein.
  • the operations of the method 1300 may be performed by a UE 104 as described herein.
  • the device may execute a set of instructions to control the function elements of the device to perform the described functions. Additionally, or alternatively, the device may perform aspects of the described functions using special-purpose hardware.
  • the method may include transmitting, to a network entity, an indication on whether model identification is supported for an AI/ML functionality.
  • the operations of 1310 may be performed in accordance with examples as described herein. In some implementations, aspects of the operations of 1310 may be performed by a UE 104 as described with reference to FIG. 1.
  • the method may include performing a functionality-based LCM procedure of the AI/ML functionality, wherein, in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-ID-based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • the operations of 1320 may be performed in accordance with examples as described herein. In some implementations, aspects of the operations of 1320 may be performed by a UE 104 as described with reference to FIG. 1.
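Purely as an illustration of method 1300, the sketch below returns the sequence of actions a UE could take: sending the indication at 1310, and then performing functionality-based LCM, with nested model-ID-based LCM when model identification is supported, at 1320. The function name and the representation of actions as strings are assumptions made only for the sketch.

    def ue_method_1300(model_identification_supported, model_ids):
        actions = []
        # 1310: transmit, to the network entity, the indication on whether model
        # identification is supported for the AI/ML functionality.
        actions.append("send indication: model identification supported = %s"
                       % model_identification_supported)
        # 1320: perform the functionality-based LCM procedure; when supported,
        # it comprises a model-ID-based LCM procedure for each associated model.
        actions.append("functionality-based LCM: activate functionality")
        if model_identification_supported:
            for model_id in model_ids:
                actions.append("model-ID-based LCM for %s" % model_id)
        return actions

    print(ue_method_1300(True, ["temp-id-0", "temp-id-1"]))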
  • FIG. 14 illustrates a flowchart of a method 1400 performed by a network entity in accordance with aspects of the present disclosure.
  • the operations of the method 1400 may be implemented by a device or its components as described herein.
  • the operations of the method 1400 may be performed by a network entity 102 as described herein.
  • the device may execute a set of instructions to control the function elements of the device to perform the described functions. Additionally, or alternatively, the device may perform aspects of the described functions using special-purpose hardware.
  • the method may include receiving, from a UE, an indication on whether model identification is supported for an AI/ML functionality.
  • the operations of 1410 may be performed in accordance with examples as described herein. In some implementations, aspects of the operations of 1410 may be performed by a network entity 102 as described with reference to FIG. 1.
  • the method may include performing a functionality-based LCM procedure of the AI/ML functionality, wherein in case that the model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-ID-based LCM procedure of at least one AI/ML model associated with the AI/ML functionality.
  • the operations of 1420 may be performed in accordance with examples as described herein. In some implementations, aspects of the operations of 1420 may be performed by a network entity 102 as described with reference to FIG. 1.
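A complementary sketch of method 1400 follows: on receiving the indication at 1410, the network entity selects the LCM mode and, when model identification is supported, assigns temporal model IDs as part of the LCM performed at 1420. The return structure and the ID format are illustrative assumptions, not specified behavior.

    def network_method_1400(indication, num_models):
        # 1410: the indication received from the UE determines the LCM mode.
        if not indication:
            return {"mode": "functionality-based LCM only", "model_ids": []}
        # 1420: functionality-based LCM comprising model-ID-based LCM; temporal
        # model IDs are assigned based on the number of models reported by the UE.
        model_ids = ["temp-id-%d" % i for i in range(num_models)]
        return {"mode": "functionality-based LCM with model-ID-based LCM",
                "model_ids": model_ids}

    print(network_method_1400(indication=True, num_models=2))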
  • a general-purpose processor may be a microprocessor, but in the alternative, the processor may be any processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration).
  • the functions described herein may be implemented in hardware, software executed by a processor, firmware, or any combination thereof. If implemented in software executed by a processor, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Other examples and implementations are within the scope of the disclosure and appended claims. For example, due to the nature of software, functions described herein may be implemented using software executed by a processor, hardware, firmware, hardwiring, or combinations of any of these. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations.
  • Computer-readable media includes both non-transitory computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a non-transitory storage medium may be any available medium that may be accessed by a general-purpose or special-purpose computer.
  • non-transitory computer-readable media may include RAM, ROM, electrically erasable programmable ROM (EEPROM) , flash memory, compact disk (CD) ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium that may be used to carry or store desired program code means in the form of instructions or data structures and that may be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor.
  • an article “a” before an element is unrestricted and understood to refer to “at least one” of those elements or “one or more” of those elements.
  • the terms “a, ” “at least one, ” “one or more, ” and “at least one of one or more” may be interchangeable.
  • a list of items indicates an inclusive list such that, for example, a list of at least one of A, B, or C means A or B or C or AB or AC or BC or ABC (i.e., A and B and C) .
  • the phrase “based on” shall not be construed as a reference to a closed set of conditions. For example, an example step that is described as “based on condition A” may be based on both a condition A and a condition B without departing from the scope of the present disclosure.
  • the phrase "based on" shall be construed in the same manner as the phrase "based at least in part on."
  • a “set” may include one or more elements.


Abstract

Various aspects of the present disclosure relate to a UE, a processor for wireless communication, a network entity, methods, and a computer-readable medium for lifecycle management (LCM) supporting AI/ML for air interface enhancement. The UE transmits, to the network entity, an indication on whether model identification is supported for an AI/ML functionality. The UE performs a functionality-based LCM procedure of the AI/ML functionality. If model identification is supported for the AI/ML functionality, the functionality-based LCM procedure comprises a model-ID-based LCM procedure of at least one AI/ML model associated with the AI/ML functionality. In this way, a unified LCM framework is proposed to support all potential AI/ML use cases.
PCT/CN2023/129763 2023-11-03 2023-11-03 Lifecycle management supporting AI/ML for air interface enhancement Pending WO2024148935A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2023/129763 WO2024148935A1 (fr) Lifecycle management supporting AI/ML for air interface enhancement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2023/129763 WO2024148935A1 (fr) Lifecycle management supporting AI/ML for air interface enhancement

Publications (1)

Publication Number Publication Date
WO2024148935A1 2024-07-18

Family

ID=91897862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/129763 Pending WO2024148935A1 (fr) Lifecycle management supporting AI/ML for air interface enhancement

Country Status (1)

Country Link
WO (1) WO2024148935A1 (fr)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109409532A (zh) * 2017-08-14 2019-03-01 Accenture Global Solutions Limited Product development based on artificial intelligence and machine learning
WO2023148010A1 (fr) * 2022-02-07 2023-08-10 Telefonaktiebolaget Lm Ericsson (Publ) Network-centric lifecycle management of AI/ML models deployed in a user equipment (UE)
WO2023148009A1 (fr) * 2022-02-07 2023-08-10 Telefonaktiebolaget Lm Ericsson (Publ) User-centric lifecycle management of AI/ML models deployed in a user equipment (UE)

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XUEMING PAN, VIVO: "Discussions on AI/ML framework", 3GPP DRAFT; R1-2302476; TYPE DISCUSSION; FS_NR_AIML_AIR, 3RD GENERATION PARTNERSHIP PROJECT (3GPP), MOBILE COMPETENCE CENTRE ; 650, ROUTE DES LUCIOLES ; F-06921 SOPHIA-ANTIPOLIS CEDEX ; FRANCE, vol. 3GPP RAN 1, no. Online; 20230417 - 20230426, 7 April 2023 (2023-04-07), Mobile Competence Centre ; 650, route des Lucioles ; F-06921 Sophia-Antipolis Cedex ; France, XP052293048 *


Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23915693

Country of ref document: EP

Kind code of ref document: A1