
WO2025129578A1 - System and method for positioning based on artificial intelligence/machine learning - Google Patents

System and method for positioning based on artificial intelligence/machine learning

Info

Publication number
WO2025129578A1
Authority
WO
WIPO (PCT)
Prior art keywords
wireless communication
positioning
prs
communication method
communication entity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CN2023/140731
Other languages
English (en)
Inventor
Cong Wang
Chuangxin JIANG
Rongwei SHI
Junchen Liu
Yu Pan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to PCT/CN2023/140731
Publication of WO2025129578A1
Legal status: Pending


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0205Details
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L27/00Modulated-carrier systems
    • H04L27/26Systems using multi-frequency codes
    • H04L27/2601Multicarrier modulation systems
    • H04L27/2602Signal structure
    • H04L27/261Details of reference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W8/00Network data management
    • H04W8/22Processing or transfer of terminal data, e.g. status or physical capabilities
    • H04W8/24Transfer of terminal data

Definitions

  • the standardization organization Third Generation Partnership Project (3GPP) is currently in the process of specifying a new Radio Interface called 5G New Radio (5G NR) as well as a Next Generation Packet Core Network (NG-CN or NGC) .
  • the 5G NR will have three main components: a 5G Access Network (5G-AN) , a 5G Core Network (5GC) , and a User Equipment (UE) .
  • 5G-AN: 5G Access Network; 5GC: 5G Core Network; UE: User Equipment
  • the elements of the 5GC are also called Network Functions
  • AI: Artificial Intelligence; ML: Machine Learning
  • the basic functions can include data collection, AI/ML model training, AI/ML model inference, model update, and model selection, among others.
  • the capability can include one or more of: the maximum number of DL-PRS Resource Sets per TRP per positioning frequency layer supported by a second wireless communication entity, the maximum number of TRPs across all positioning frequency layers for AI/ML positioning, the maximum number of supported positioning frequency layers for AI/ML positioning, the maximum number of DL-PRS Resources per DL-PRS Resource Set for AI/ML positioning, the maximum number of DL-PRS resources per positioning frequency layer for AI/ML positioning, or the maximum number of DL-PRS Resources supported by a second wireless communication entity across all frequency layers for AI/ML positioning.
  • the capability can include whether the second wireless communication entity supports reporting both of the AI/ML positioning result and the positioning result of other positioning method (s) .
  • the capability can include a combination of {N_Q, Q} , indicating that with a data set or PRS duration N_Q, the second wireless communication entity can achieve an AI/ML inference quality/confidence level/model inference latency (Q) for the second wireless communication entity’s location inference and/or for new measurement, and/or for the enhancement of existing measurement.
  • the capability can include at least one of: a processing speed of the second wireless communication entity; a maximum capacity; a ratio of remaining space to a maximum capacity; a remaining available time or space to receive PRS and/or collect data set; or a timestamp for a current state; or a model inference latency; or a model ID; or a functionality.
  • N_Q can indicate the duration of DL-PRS symbols.
  • N_Q can indicate the size of dataset (s) for positioning reference signal.
  • N_Q can indicate the size of each dataset and the number of datasets for positioning reference signal.
  • Q represents at least one of: a quality value and a quality resolution; or a confidence value and a confidence value resolution; or a latency requirement for model inference.
  • Q represents at least one of: a confidence level for LOS/NLOS identification; an inference quality for timing inference; or an inference quality for angle inference.
  • N and/or N2 can indicate the duration of DL-PRS symbols, a duration of N PRS symbols that the second wireless communication entity can process every T milliseconds (ms) and/or a duration of N2 PRS symbols that the second wireless communication entity can process in T2 ms.
  • N and/or N2 can indicate the size of dataset (s) for positioning reference signal, a size of N PRS symbols that the second wireless communication entity can process every T milliseconds (ms) and/or a size of N2 PRS symbols that the second wireless communication entity can process in T2 ms.
  • N and/or N2 can further indicate the size of each dataset and the number of datasets for positioning reference signal, a set of N PRS symbols that the second wireless communication entity can process every T milliseconds (ms) and/or a set of N2 PRS symbols that the second wireless communication entity can process in T2 ms.
  • the second or the third wireless communication entity can be configured to report the one or more sets of {N, T} for PRS processing for AI/ML positioning and other positioning method (s) .
  • the second or the third wireless communication entity can be configured to report the one or more sets of {N2, T2} for PRS processing for AI/ML positioning and other positioning method (s) .
  • the second or the third wireless communication entity can be configured to report an indicator indicating that the one or more sets of {N, T} and/or the one or more sets of {N2, T2} is for one of: PRS processing for AI/ML positioning only, PRS processing for other positioning method (s) , or PRS processing for both of the AI/ML positioning and other positioning method (s) .
  • the wireless communication method can include the first wireless communication entity requesting the second wireless communication entity to report an AI/ML positioning result together with a positioning result of other positioning method (s) .
  • a prerequisite for AI/ML-assisted positioning can include feature groups for Multi-RTT and DL-TDOA.
  • a prerequisite for AI/ML-assisted positioning can include feature groups for DL-AOD positioning.
  • the wireless communication method can include the first wireless communication entity sending a message requesting the capability-related information to the second wireless communication entity.
  • the wireless communication method can include the first wireless communication entity sending a message requesting the second wireless communication entity to reserve the occupancy for the one or more processing units.
  • the message can include at least one of: a processing speed of the second wireless communication entity; a maximum capacity; a ratio of remaining space to a maximum capacity; a remaining available time or space to receive PRS and/or collect data set; or a timestamp for a current state; or a model inference latency; or a model ID; or a functionality; or a periodicity and an offset for the second wireless communication entity to report the capability-related information; a time for the second wireless communication entity to report the capability-related information; or a response time for the second wireless communication entity to report the capability-related information.
  • the message can include at least one of: start time and/or end time, duration of the reserve, number of the reserve processing units, the identification of the reserve processing units, the periodicity of the reservation period.
  • the wireless communication method can include the first wireless communication entity sending an AI/ML positioning configuration.
  • the AI/ML positioning configuration can include at least one of a cell ID list or regional range coordinates.
  • the AI/ML-related information can include a mapping relationship between a model ID and a plurality of criteria.
  • the criteria can include at least one of: the ratio of the number of TRPs with LOS path to the total number of TRPs, the number of TRPs with LOS path, cell ID, the value and range of RSRP, the granularity of timing measurement, the quality of timing value, the initial phase offset of the transmitter, the number of sample for each input, the granularity of sampling for each input, the requirement of AI/ML positioning, the configuration of PRS and/or SRS.
  • the AI/ML positioning configuration further can include a time window to activate an AI/ML positioning procedure.
  • the IEs for AI/ML positioning can include at least one of: an indication that triggered reporting is requested for AI/ML positioning, an indication that periodic reporting is requested for AI/ML positioning, quality of service for AI/ML positioning, or Location Information Type for AI/ML positioning.
  • the AI/ML positioning configuration further can include at least one of: an AI/ML reference signal; an RSRP threshold; or an LOS or NLOS rate threshold.
  • the wireless communication method can include the first wireless communication entity sending a message including assistance data for AI/ML positioning.
  • the assistance data can include a dedicated PRS and/or a configuration of the dedicated PRS for the AI/ML positioning.
  • the assistance data can include at least one of: DL-PRS Resource Priority Subset for AI/ML positioning, model ID, Beam information of the DL-PRS for AI/ML positioning, TRP ID, positioning frequency layer for AI/ML positioning.
  • the wireless communication method can include the first wireless communication entity configuring Requested SRS Transmission Characteristics for a third wireless communication entity.
  • the Requested SRS Transmission Characteristics can include an SRS configuration for AI/ML positioning.
  • the SRS configuration can include at least one of: UL-SRS Resource Priority Subset for AI/ML positioning, model ID, Beam information of the UL-SRS for AI/ML positioning, UE ID.
  • the wireless communication method can include the first wireless communication entity receiving, from a second wireless communication entity, a report including the label.
  • the label can include at least one of: a UE location; a UE ID; or a timestamp, start time and/or duration of time indicating the location’s validity.
  • the wireless communication method can include the first wireless communication entity receiving, from the third wireless communication entity, a report including a measurement result.
  • the measurement result can include at least one of: CIR/PDP/DP; a UE ID; or a timestamp.
  • the wireless communication method can include the first wireless communication entity sending to the second and/or the third wireless communication entity, requirement or configuration for AI/ML positioning.
  • the requirement or configuration can include the requirement for datasets report.
  • the requirement for datasets report can include at least one of: Periodicity of each report, Datatype requirement, Data size requirement of each dimension, the size of data set, whether data label is required, the required quality of data label, the required confidence level of data label, the required uncertainty threshold for timing and/or angle measurement, the time stamp of the generated data, the configuration requirement of the data set, the source of data label, indicated time and/or time difference threshold.
  • the wireless communication method can include the first wireless communication entity receiving from the second and/or the third wireless communication entity, report of AI/ML positioning.
  • the report can include datasets report, wherein the datasets report can include at least one of: Datatype , Data size of each dimension, the size of data set, the quality of data label, the confidence level of data label, an indicator for data label, the time stamp of the generated data, the source of data label, the ID of the reporter, the size of data set still required.
  • the configuration for model update can include at least one of: expected update time, expected update periodicity and offsets, latest model, time stamp of the latest model, size of dataset trained for each update, duration of PRS/SRS that trained for each update, model ID and/or functionality, positioning scenario, UE ID, PRU ID, TRP ID.
  • the report can include model report, wherein the model report can include at least one of: updated model, feature and/or size of dataset that used to train the model, the duration of PRS/SRS that used to train the model, the timestamp of the report or the updated model, model ID and/or functionality, positioning scenario, UE ID, PRU ID, TRP ID.
  • the reference information can include at least one of: the reference PRS/SRS resource and/or resource set for AI/ML positioning; the reference TRP/UE for AI/ML positioning.
  • the wireless communication method can include the first wireless communication entity sending a message including configuration information of one or more time windows configured for AI/ML positioning to a second or third wireless communication entity.
  • Each of the one or more time windows is defined based on at least one of: a starting time of the time window; a duration of the time window; a periodicity of the time window; a PRS resource set ID; an SRS resource set ID; a PRS resource ID; an SRS resource ID; or a TRP ID.
  • the configuration of time window includes one or more of: start time, duration of the time window, the periodicity of the time window.
  • FIG. 1 illustrates an example cellular communication network in which techniques disclosed herein may be implemented, in accordance with an embodiment of the present disclosure
  • FIG. 2 illustrates a block diagram of an example base station and a user equipment device, in accordance with some embodiments of the present disclosure
  • FIG. 3 illustrates an example of processing for channel estimation and artificial intelligence (AI) /machine learning (ML) training or inference, in accordance with an embodiment of the present disclosure
  • FIG. 4 illustrates an example processing time for processing capability of user equipment (UE) , in accordance with an embodiment of the present disclosure
  • FIG. 5 illustrates an example of a funnel leakage process, in accordance with an embodiment of the present disclosure
  • FIG. 6 illustrates an example of a federated learning procedure for positioning, in accordance with an embodiment of the present disclosure
  • FIG. 7 illustrates an example of assigning training tasks, in accordance with an embodiment of the present disclosure
  • FIG. 8 illustrates an example of a location management function (LMF) sending a configuration to the UE, in accordance with an embodiment of the present disclosure
  • FIG. 9 illustrates an example of the LMF sending the configuration to a gNB, in accordance with an embodiment of the present disclosure
  • FIG. 10 illustrates an example of the LMF collecting measured data from different nodes, in accordance with an embodiment of the present disclosure
  • FIG. 11 illustrates a flowchart for artificial intelligence/machine learning positioning, in accordance with an embodiment of the present disclosure.
  • FIG. 1 illustrates an example wireless communication network, and/or system, 100 in which techniques disclosed herein may be implemented, in accordance with an embodiment of the present disclosure.
  • the wireless communication network 100 may be any wireless network, such as a cellular network or a narrowband Internet of things (NB-IoT) network, and is herein referred to as “network 100. ”
  • the UE transceiver 230 may be referred to herein as an "uplink" transceiver 230 that includes a radio frequency (RF) transmitter and an RF receiver each comprising circuitry that is coupled to the antenna 232.
  • a duplex switch (not shown) may alternatively couple the uplink transmitter or receiver to the uplink antenna in time duplex fashion.
  • the BS transceiver 210 may be referred to herein as a "downlink" transceiver 210 that includes an RF transmitter and an RF receiver each comprising circuitry that is coupled to the antenna 212.
  • a downlink duplex switch may alternatively couple the downlink transmitter or receiver to the downlink antenna 212 in time duplex fashion.
  • the BS 202 may be an evolved node B (eNB) , a serving eNB, a target eNB, a femto station, or a pico station, for example.
  • the UE 204 may be embodied in various types of user devices such as a mobile phone, a smart phone, a personal digital assistant (PDA) , tablet, laptop computer, wearable computing device, etc.
  • the processor modules 214 and 236 may be implemented, or realized, with a general-purpose processor, a content addressable memory, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof, designed to perform the functions described herein.
  • a processor may be realized as a microprocessor, a controller, a microcontroller, a state machine, or the like.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration.
  • the steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in firmware, in a software module executed by processor modules 214 and 236, respectively, or in any practical combination thereof.
  • the memory modules 216 and 234 may be realized as RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • memory modules 216 and 234 may be coupled to the processor modules 214 and 236, respectively, such that the processor modules 214 and 236 can read information from, and write information to, memory modules 216 and 234, respectively.
  • the memory modules 216 and 234 may also be integrated into their respective processor modules 214 and 236.
  • the memory modules 216 and 234 may each include a cache memory for storing temporary variables or other intermediate information during execution of instructions to be executed by processor modules 214 and 236, respectively.
  • Memory modules 216 and 234 may also each include non-volatile memory for storing instructions to be executed by the processor modules 214 and 236, respectively.
  • the network communication module 218 generally represents the hardware, software, firmware, processing logic, and/or other components of the base station 202 that enable bi-directional communication between base station transceiver 210 and other network components and communication nodes configured to communicate with the base station 202.
  • network communication module 218 may be configured to support internet or WiMAX traffic.
  • network communication module 218 provides an 802.3 Ethernet interface such that base station transceiver 210 can communicate with a conventional Ethernet based computer network.
  • the network communication module 218 may include a physical interface for connection to the computer network (e.g., Mobile Switching Center (MSC) ) .
  • positioning method (s) and/or legacy positioning method (s) refer to the current positioning method (s) , including DL-TDOA, DL-AOD, Multi-RTT, UL-RTOA, UL-AOA, etc.
  • Embodiment 1 UE Capability for AI/ML Positioning
  • in an AI/ML positioning procedure, if the AI/ML training and/or inference stage is executed at the UE side, a processing capability corresponding to the training and/or inference can be reported by the UE.
  • the network 100 may struggle to evaluate the AI/ML training and/or inference timeline, required time duration or the required positioning quality without the UE capability reporting. Furthermore, the UE may report the processing capability for AI/ML positioning.
  • the processing capability reported by the UE may be applicable for channel estimation, and/or AI/ML training or inference, and/or the total processing capability for both channel estimation and AI/ML training or AI/ML inference.
  • the processing capability may include a combination of {N, T} , indicating the PRS duration or data set N that a UE can process every T ms, and/or a combination of {N2, T2} , indicating the PRS duration or data set N2 that the UE can process in T2 ms.
  • a third candidate may be where N and/or N2 can be a list of {M, W} combination (s) .
  • the third candidate can indicate the UE can process M data sets with size W.
  • a format for N and/or N2 can be {M1, W1} or {M2, W2} .
  • a fourth candidate may be where N and/or N2 can be a list of {M, W} combination (s) . The fourth candidate can indicate the UE can process M data sets with size W.
  • the detailed UE report can include an indicator, indicating whether the reported {N, T} and/or {N2, T2} is for PRS processing for a legacy positioning method only, PRS processing for AI/ML positioning only, or PRS processing for both the legacy positioning method and AI/ML positioning.
  • the detailed UE report can include two indicators (one for {N, T} , another for {N2, T2} ) , indicating whether the reported {N, T} and/or {N2, T2} is for PRS processing for the legacy positioning method only, PRS processing for AI/ML positioning only, or PRS processing for both the legacy positioning method and AI/ML positioning.
  • a reporter can be the UE, PRU, or gNB. If the processing capability is reported by the gNB, the reported {N, T} and/or {N2, T2} is a processing capability of the gNB and refers to SRS processing capability instead of PRS.
  • the DL-PRS can be replaced by UL-SRS.
  • the UE can report UE processing capabilities, wherein the report includes the (expected) model quality associated with data set N.
  • the processing capability may include a combination of {N_Q, Q} , indicating that with data set N (e.g., the one or more candidates) or PRS duration N, the UE can achieve an AI/ML inference quality, confidence, or model inference latency Q for the UE’s location inference or for new measurement and/or enhancement of existing measurement (e.g., LOS/NLOS identification, timing and/or angle of measurement, likelihood of measurement) .
  • N can indicate the size of data set (s) , which means with data set size N, a UE can achieve an AI/ML inference quality or confidence Q.
  • the unit of data set (s) size can be byte, megabyte, gigabyte, etc.
  • N can indicate the duration of DL-PRS symbols N in units of ms, which means with DL-PRS symbols N, a UE can achieve an AI/ML inference quality or confidence Q.
  • AI/ML inference quality or confidence or model inference latency Q can be of one or more candidates.
  • a first candidate can be for direct AI/ML positioning, where the inference output is the UE location, and Q can include one or more of the following fields.
  • a first field can include the inference quality (Def Q1) , which includes a quality value and a quality resolution.
  • the quality value provides an estimate of uncertainty of the value for which the inference output is provided in units of meters.
  • the candidate value can be INTEGER (0... 31) .
  • the quality resolution provides the resolution used in quality value.
  • the candidate value can be 0.01, 0.1, 1, 10, 30 meters.
  • a second field can include a confidence value and a confidence value resolution (Def Q2) .
  • the confidence value provides an estimate of confidence level of the value for which the inference output is provided.
  • the candidate value can be INTEGER (0...1000) .
  • the confidence value resolution provides the resolution used in confidence value.
  • the candidate value can be 0.001, 0.01, 0.1.
  • the confidence level can be calculated by 1-(confidence value) * (confidence value resolution) .
  • a third field can include a latency requirement for model inference (Def Q3) .
  • a second candidate can be for AI/ML-assisted positioning.
  • the inference output can be new measurement and/or enhancement of existing measurement.
  • Q may include one or more of the following fields.
  • a first field can be an LOS/NLOS identification.
  • Q can include the confidence level for the LOS/NLOS identification, the detailed definition can be the same as the second field (Def Q2) for AI/ML Positioning.
  • Q can include the model inference latency for LOS/NLOS identification, the detailed definition can be the same as the third field (Def Q3) for AI/ML positioning.
  • a second field can be a timing measurement
  • Q can include the inference quality for the timing inference
  • the detailed definition can be the same as the first field (Def Q1) for AI/ML positioning.
  • Q can include the confidence level for the timing measurement, the detailed definition can be the same as the second field (Def Q2) for AI/ML positioning.
  • Q can include the model inference latency for timing measurement, the detailed definition can be the same as the third field (Def Q3) for AI/ML positioning.
  • a third field can be an angle measurement.
  • Q can include the inference quality for the angle inference, which includes a quality value and quality resolution.
  • the quality value can provide an estimate of uncertainty of the value.
  • the angle inference quality can include one or both of azimuth quality and zenith quality.
  • the candidate value can be INTEGER (0... 255) .
  • the quality resolution can provide a resolution used in quality value.
  • the candidate value can be (0.1 deg... ) .
  • Q can include the confidence level for the angle measurement, the detailed definition can be the same as the second field (Def Q2) .
  • Q can include the model inference latency for angle measurement, the detailed definition can be the same as the third field (Def Q3) .
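  • As an illustration of the quality and confidence encodings above (Def Q1 and Def Q2), the following minimal Python sketch is provided; it is not part of the disclosure, and the function names and example values are assumptions.

```python
# Sketch (assumed helper names) decoding the Q fields defined above.
# Def Q1: uncertainty in meters = quality_value * quality_resolution.
# Def Q2: confidence level      = 1 - confidence_value * confidence_value_resolution.

def uncertainty_meters(quality_value: int, quality_resolution: float) -> float:
    """Def Q1: quality_value is INTEGER (0..31); resolution is one of 0.01, 0.1, 1, 10, 30 m."""
    assert 0 <= quality_value <= 31
    return quality_value * quality_resolution

def confidence_level(confidence_value: int, confidence_value_resolution: float) -> float:
    """Def Q2: confidence_value is INTEGER (0..1000); resolution is one of 0.001, 0.01, 0.1."""
    assert 0 <= confidence_value <= 1000
    return 1 - confidence_value * confidence_value_resolution

# Example: quality value 3 at 10 m resolution -> 30 m uncertainty;
# confidence value 50 at 0.001 resolution -> confidence level 0.95.
print(uncertainty_meters(3, 10), confidence_level(50, 0.001))
```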
  • the UE processing capability {N, T} and/or {N2, T2} and/or {N, Q} can be associated with one or more model ID (s) .
  • the UE processing capability {N, T} and/or {N2, T2} and/or {N, Q} can be associated with one or more functionalities.
  • a location management function may request the UE report the AI/ML positioning result together with a legacy positioning result.
  • the UE can report UE processing capabilities.
  • the UE processing capabilities can include whether the UE supports reporting both legacy positioning results and AI/ML positioning results.
  • the UE may report an indicator 0 or 1 to indicate whether the UE can report these two results, wherein 0 denotes that the UE cannot report legacy positioning results together with AI/ML positioning results, and 1 indicates that the UE can report both legacy positioning results and AI/ML positioning results.
  • the inference output can be the timing measurement, for model monitoring.
  • the UE can support legacy timing-related positioning method (e.g., Multi-RTT and DL-TDOA positioning) .
  • the prerequisite for AI/ML-assisted positioning can include the feature groups for Multi-RTT and DL-TDOA positioning when the inference output is timing measurement.
  • the inference output can be the angle measurement, for model monitoring.
  • the UE can support legacy angle-related positioning method (e.g., DL-AOD positioning) .
  • the prerequisite for AI/ML-assisted positioning can include the feature groups for DL-AOD positioning when the inference output is angle measurement.
  • the UE can report the IE NR-DL-PRS-ResourcesCapabilityforAI, which defines the DL-PRS resources capability for AI/ML positioning.
  • the UE can include this IE only if the UE supports NR-DL-PRS-ProcessingCapabilityforAI. Otherwise, the UE does not include this IE.
  • NR-DL-PRS-ProcessingCapabilityforAI includes one or more of the following elements: maxNrOfDL-PRS-ResourceSetPerTrpPerFrequencyLayerforAI, indicates the maximum number of DL-PRS Resource Sets per TRP per positioning frequency layer supported by UE for AI/ML positioning; maxNrofTRP-AcrossFreqsforAI, indicates the maximum number of TRPs across all positioning frequency layers for AI/ML positioning; maxNrOfPosLayerforAI, indicates the maximum number of supported positioning frequency layer for AI/ML positioning; maxNrOfDL-PRS-ResourcesPerResourceSetforAI, indicates the maximum number of DL-PRS Resources per DL-PRS Resource Set for AI/ML positioning; maxNrOfDL-PRS-ResourcesPerPositioning FrequencylayerforAI, Indicates the maximum number of DL-PRS resources per positioning frequency layer for AI/ML positioning; maxNrOfDL-PRS-ResourcesAcrossPerF
  • the network can estimate or evaluate the AI/ML training and/or inference timeline, required time duration and/or the required positioning quality if the AI/ML training and/or inference part is located at the UE side. The network is also aware of whether the UE can support the report of the AI/ML positioning result together with the legacy positioning result. For example, with reported {N, T} , the estimated training time with PRS duration or data set S can be Ceiling (S/N) *T if the training time of the UE is proportional to the PRS duration or the size of the dataset.
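  • The Ceiling (S/N) *T estimate above can be written as a short sketch; this is only an illustration under the stated proportionality assumption, and the function name and example numbers are not from the disclosure.

```python
import math

def estimated_training_time_ms(S: float, N: float, T_ms: float) -> float:
    """Estimated training time for a PRS duration or dataset of size S, given a reported
    capability {N, T}: the UE can process N every T ms, and the training time is assumed
    proportional to the PRS duration / dataset size, i.e., Ceiling(S/N) * T."""
    return math.ceil(S / N) * T_ms

# Example: with {N, T} = {8, 20 ms}, a dataset of size S = 50 needs ceil(50/8) * 20 = 140 ms.
print(estimated_training_time_ms(50, 8, 20))
```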
  • Embodiment 2 Enhancement on UE Processing Capability
  • the network 100 can evaluate the estimated positioning time and/or quality.
  • a required time can be an integral multiple of T.
  • FIG. 4 illustrates an example processing time for the processing capability of the UE. In a former cycle, a collected dataset or PRS duration may be less than the max capability N, and the UE may need T1 to complete data processing, so the remaining (T-T1) may not be occupied.
  • the UE processing capability may not be fully utilized.
  • the UE can report processing capability-related information to the LMF and/or gNB.
  • the capability-related information can include one or more of the following elements.
  • a first element can be the processing speed of the UE, for example, number of bits (for dataset) per second.
  • a second element can be the maximum capacity N, wherein N can be the same as one or more of the candidates described herein.
  • a third element can be a ratio of remaining space to maximum capacity N.
  • a fourth element can be a remaining available time or space to receive DL-PRS and/or collect data set.
  • a fifth element can be a timestamp for the current state.
  • a sixth element can be the model inference latency.
  • a seventh element can be the model ID.
  • An eighth element can be the functionality. The capability-related information can be reported periodically or triggered by the LMF/gNB.
  • the LMF/gNB can send a request to the UE; the request can include the processing speed of the UE, the maximum capacity N (wherein N can be the same as one or more candidates described herein) , a ratio of remaining space to the maximum capacity N, the remaining available time or space to receive DL-PRS and/or collect a data set, the timestamp for the current state, and/or the periodicity and offset for the UE to report the processing capability-related information.
  • the request can further include the time for UE to report the processing capability-related information, indicated by a combination of system frame number, slot offset, and a symbol index with respect to the SFN initialization time, the response time for UE to report the processing capability-related information, the model inference latency, the model ID, and the functionality.
  • the processing of data set or PRS duration can be regarded as a funnel leakage process.
  • FIG. 5 illustrates an example of a funnel leakage process
  • the LMF/gNB can transmit data set to current UE based on the remaining available space to collect data set, and the UE processing capability can be fully utilized.
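  • A hypothetical sketch of the "funnel leakage" idea above follows: the UE drains its buffer at its processing speed while the LMF/gNB only delivers as much data as the reported remaining space allows, so the processing capability stays utilized. The variable names and numbers are assumptions, not values from the disclosure.

```python
def schedule_deliveries(max_capacity: float, processing_speed: float,
                        cycle_ms: float, cycles: int) -> list[float]:
    """Return the data size the LMF/gNB can deliver in each cycle, given the UE's reported
    remaining space (maximum capacity minus what is still buffered at the UE)."""
    buffered = 0.0
    deliveries = []
    for _ in range(cycles):
        remaining_space = max_capacity - buffered          # reported by the UE (Embodiment 2)
        deliveries.append(remaining_space)
        buffered += remaining_space                        # LMF/gNB fills the funnel ...
        buffered = max(0.0, buffered - processing_speed * cycle_ms)  # ... the UE leaks it out
    return deliveries

# Example: capacity 100 units, processing speed 0.8 units/ms, 50 ms reporting cycles.
print(schedule_deliveries(100, 0.8, 50, 4))   # [100.0, 40.0, 40.0, 40.0]
```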
  • Embodiment 3 UE Processing Capability on Different Processing Units
  • a first element can include an indicator for each processing unit.
  • the report can be {PU #1, idle} , {PU #2, busy} .
  • the indicator can indicate that the processing unit #1 is idle and processing unit #2 is busy.
  • a second element can include the processing capability for each processing unit, wherein the processing capability of each processing unit can be the same as {N, T} , {N2, T2} , {N, Q} as defined in embodiment #1.
  • the report can be {PU #1, {N, T} } , {PU #2, {N’, T’} } , indicating the processing capability of processing unit #1 is {N, T} and the processing capability of processing unit #2 is {N’, T’} .
  • a third element can include the processing capability for each processing unit, wherein the processing capability of each processing unit for different models can be the same as {N, T} , {N2, T2} , {N, Q} as defined in embodiment #1.
  • the report can be {PU #1, model 1: {N, T} , model 2: {N’, T’} , ...} , meaning that for processing unit #1, the processing capability of model 1 is {N, T} and the processing capability of model 2 is {N’, T’} .
  • a fourth element can include an indicator to indicate whether there are any idle processing units.
  • a fifth element can include reservation information for each processing unit.
  • the reservation information can indicate the time or period of processing units reserved by LMF/gNB already.
  • the reservation information may include a start time (e.g., a combination of system frame number, slot offset and symbol index with respect to the SFN initialization time) , a duration of the reservation (e.g., a number of consecutive slots/symbols or in units of slots or milliseconds or seconds) , a number of the reserve processing units, an identification of the reserve processing unit (s) , and a periodicity of the reservation period (e.g., indicated by a periodicity and an offset) .
  • a sixth element can be a number of idle processing units.
  • a seventh element can include, for a number of busy processing units, an indication of how long it takes for each busy processing unit to complete existing tasks.
  • the report can be {PU #2, 10 ms} , {PU #5, 8 ms} , meaning processing unit #2 will be idle in 10 ms and processing unit #5 will be idle in 8 ms.
  • An eighth element can be a number of total processing units.
  • a ninth element can be a number of available processing units that are applicable for positioning purposes.
  • a tenth element can be a number of processing units required for each AI/ML positioning model.
  • An eleventh element can be an association between the number of processing units and processing capability with processing time.
  • the required processing time for a given N can be different when the number of processing units is different.
  • the report can be {model #1, N, X, T} , which means for model #1, the UE needs T ms to process a given N with X processing units, wherein N can be the same as defined in embodiment #1.
  • a twelfth element can be an association between the number of processing units and processing capability with processing quality.
  • the required processing quality for a given N can be different when the number of processing units is different.
  • the report can be {model #1, N, X, Q} , which means for model #1, the UE can achieve quality Q for a given N with X processing units, wherein N and Q can be the same as defined in embodiment #1.
  • the capacity can be reported periodically or triggered by the LMF/gNB.
  • the LMF/gNB can send a request to UE indicating the period and offset for the report of occupancy and processing capacity for UE’s processing unit (s) .
  • the LMF/gNB can request the UE to reserve the occupancy for processing unit (s) .
  • the reservation request can be periodic or aperiodic.
  • the request may include the start time (e.g., a combination of system frame number, slot offset and symbol index with respect to the SFN initialization time) , the duration of the reservation (e.g., a number of consecutive slots/symbols or in units of slots or milliseconds or seconds) , the number of the reserve processing units, and the identification of the reserve processing unit (s) .
  • the request may include the start time (e.g., a combination of system frame number, slot offset and symbol index with respect to the SFN initialization time) , the duration of the reservation (e.g., a number of consecutive slots/symbols or in units of slots or milliseconds or seconds) , the number of the reserve processing units, the identification of the reserve processing unit (s) , and the periodicity of the reservation period (e.g., periodicity and offset) .
  • the UE can receive the request and reply to the LMF/gNB, indicating whether the request can be satisfied.
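  • The processing-unit occupancy report and reservation request discussed in this embodiment can be illustrated with the following sketch; the dictionary fields and the can_satisfy helper are assumed structures for illustration, not a normative encoding.

```python
# Assumed example structures for a per-processing-unit report and an LMF/gNB reservation request.
pu_report = {
    "occupancy": [{"pu": 1, "state": "idle"}, {"pu": 2, "state": "busy", "idle_in_ms": 10}],
    "capability_per_pu": {1: {"N": 8, "T_ms": 20}, 2: {"N": 4, "T_ms": 20}},
    "num_idle": 1,
    "num_total": 2,
}

reservation_request = {   # sent by the LMF/gNB to the UE
    "start": {"sfn": 120, "slot_offset": 3, "symbol": 0},
    "duration_slots": 40,
    "num_reserved_pus": 1,
    "reserved_pu_ids": [1],
    "periodicity": {"period_ms": 160, "offset_ms": 20},
}

def can_satisfy(report: dict, request: dict) -> bool:
    """UE-side check before replying to the LMF/gNB: are the requested PUs currently idle?"""
    idle = {entry["pu"] for entry in report["occupancy"] if entry["state"] == "idle"}
    return all(pu in idle for pu in request["reserved_pu_ids"])

print(can_satisfy(pu_report, reservation_request))  # True: PU #1 is idle
```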
  • the LMF and/or gNB can be aware of the details of the UE processing units as well as the processing capability of each processing unit. In this way, the LMF and/or gNB can assign a corresponding training and/or inference task to the UE based on the UE status. If reported to the LMF, the UE processing capabilities-related information in embodiment #1, embodiment #2 and embodiment #3 for AI/ML positioning can be included in the IE NR-DL-PRS-ProcessingCapability, which defines the common DL-PRS Processing capability.
  • the UE can report the processing capability for AI/ML positioning in a new IE (e.g., AI/ML-ProvideCapabilities, similar to NR-DL-TDOA-ProvideCapabilities, NR-DL-AoD-ProvideCapabilities, and NR-Multi-RTT-ProvideCapabilities) .
  • the IE can be used by a target device to indicate its capability to support AI/ML positioning and to provide its AI/ML positioning capabilities to the location server.
  • the LMF and UE can identify the model ID for AI/ML positioning and LMF/gNB/UE can receive the mapping relationship between model ID (e.g., for model training and/or inference) and different criteria.
  • the model ID can be associated with one or more elements.
  • A and B are predefined values.
  • A, B and C are predefined values.
  • A and B are predefined values.
  • mapping details can be included in a new IE (e.g., AI/ML-ProvideAssistanceData, similar to OTDOA-ProvideAssistanceData) .
  • the IE can be used by the location server to provide assistance data to enable UE-assisted and/or UE-based AI/ML positioning.
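  • The mapping between a model ID and the criteria above can be sketched as follows; the criteria, the thresholds (A, B) and the model IDs are placeholders chosen for illustration only.

```python
# Hypothetical model-ID-to-criteria mapping, e.g., keyed on the ratio of TRPs with an LOS path.
A, B = 0.5, 0.2   # placeholder predefined values

MODEL_CRITERIA = {
    1: lambda c: c["los_trp_ratio"] >= A,           # mostly LOS scenario
    2: lambda c: B <= c["los_trp_ratio"] < A,       # mixed LOS/NLOS scenario
    3: lambda c: c["los_trp_ratio"] < B,            # heavy NLOS scenario
}

def select_model_id(conditions: dict):
    """Pick the first model whose criteria match the current conditions, else None."""
    for model_id, criterion in MODEL_CRITERIA.items():
        if criterion(conditions):
            return model_id
    return None

print(select_model_id({"los_trp_ratio": 0.3}))  # -> 2
```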
  • Embodiment 5 The Application on Federated Learning in Positioning
  • FIG. 6 illustrates an example of a federated learning procedure for positioning.
  • Each participant (e.g., a PRU) can train the model using local data, encrypt the gradient, and upload the gradient to the server.
  • the server can aggregate the gradient of each user to update the model parameters.
  • the server can return the updated model to all participants.
  • Each participant can update the models.
  • a first element can include a time when the participant is expected to upload the updated model. The time can occur in an aperiodic model update. One or more participants are expected to upload the updated model. The time can be indicated by a combination of system frame number, slot offset, and symbol index with respect to the SFN initialization time.
  • a second element can be a periodicity and offset when the participant is expected to upload the updated model. The periodicity can be for a periodic model update.
  • a third element can be a latest model.
  • a fourth element can be a timestamp of the latest model.
  • a fifth element can be the size of dataset trained for each update. The fifth element can configure the size of datasets to train before reporting once.
  • a sixth element can be the duration of PRS/SRS trained for each update.
  • the sixth element can configure the duration of PRS/SRS to train before reporting once.
  • a seventh element can be a model ID and/or model functionality.
  • An eighth element can be a Positioning scenario (e.g., InF-DH, InF-SH, etc. ) .
  • a ninth element can be the UE ID, PRU ID and/or TRP ID.
  • a tenth element can be the participants described herein. The participants can send the model report to the server.
  • the model report can include an updated model, a feature and/or size of dataset that trains the model, the duration of PRS/SRS that trains the model, the timestamp of the report or the updated model, the model ID and/or model functionality, and the positioning scenario.
  • the participants can send the request for model update to the server.
  • the request can include the time when the server is expected to send the updated model based on the aperiodic model update, the periodicity for when the server is expected to update the model, an event trigger mode (e.g., the participants request the server to send the updated model when a given event occurs) , the model ID and/or model functionality, and the positioning scenario.
  • the server can assign different model training tasks for different participants, as shown in FIG. 7.
  • the LMF can assign model ID #1 to PRUs #1, #4 and #7, assign model ID #2 to PRUs #2, #5, #8, and assign model ID #3 to PRUs #3, #6 and #9.
  • the server can assign different model training tasks for different participants based on participants’ location, positioning scenario, or participants’ type. For example, the LMF can assign certain position models to a UE, assign several models to the PRU, and assign the remaining models to the TRP.
  • the federated learning for positioning purposes can be supported and the AI/ML positioning training tasks can be achieved on different entities without dataset transfer or delivery. In this way, the overhead of dataset transmission can be greatly reduced.
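  • The federated learning procedure of FIG. 6 can be illustrated with a minimal federated-averaging sketch; the model and gradient shapes, the learning rate, and the omission of gradient encryption are simplifications, not part of the disclosure.

```python
def local_gradient(model, local_data):
    """Placeholder for local training at one participant (e.g., a PRU); returns one gradient
    entry per model parameter (here: move each weight toward the mean of its local samples)."""
    return [sum(x) / len(x) - w for w, x in zip(model, local_data)]

def server_round(model, participants, lr=0.1):
    """One federated round: the server aggregates the uploaded gradients, updates the model
    parameters, and returns the updated model to all participants."""
    grads = [local_gradient(model, data) for data in participants]
    avg = [sum(g[i] for g in grads) / len(grads) for i in range(len(model))]
    return [w + lr * g for w, g in zip(model, avg)]

model = [0.0, 0.0]                                                   # global model parameters
participants = [[[1.0, 1.2], [0.4, 0.6]], [[0.8, 1.0], [0.6, 0.4]]]  # local datasets per participant
for _ in range(3):
    model = server_round(model, participants)
print(model)
```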
  • Embodiment 6 Positioning Area, Time, and RS Configuration
  • AI/ML positioning can be developed to enhance positioning accuracy of the legacy positioning method.
  • a positioning initiator can specify the area, scenario, and time when the AI/ML positioning can be applied for positioning purposes.
  • a location server (e.g., LMF) can specify the cell ID list and/or regional range coordinates for which AI/ML positioning is enabled.
  • LMF can send the AI/ML positioning configuration to UE.
  • the configuration information includes a Cell ID list.
  • the Cell ID list can include one or more cell IDs.
  • Each cell ID can include one or more IEs (e.g., nr-CellGlobalID, nr-physCellID, or nr-ARFCN) .
  • the configuration information can include regional range coordinates.
  • the regional range coordinates may include a range of horizontal position coordinates and vertical position coordinates (e.g., {[X1, X2] , [Y1, Y2] }) .
  • the UE and/or LMF can activate or enable the AI/ML positioning procedure when the UE camps on a cell within the Cell ID list or when UE is within the regional range coordinates.
  • the location server can specify the time when AI/ML positioning is enabled.
  • the LMF can send AI/ML positioning configuration information to the UE.
  • the configuration information includes a time window to activate or enable the AI/ML positioning procedure.
  • the time window can be configured by one or more of: the start time (indicated by a combination of system frame number, slot offset and symbol index with respect to the SFN initialization time) , the duration of the time window (given by a number of consecutive slots/symbols or in units of slots or milliseconds or seconds) , and the periodicity (indicated by a periodicity and an offset, when the reservation is periodic) .
  • the UE and/or LMF can activate or enable the AI/ML positioning procedure within the time window.
  • the location server can specify the condition when enabling AI/ML positioning.
  • the LMF can send the AI/ML positioning configuration information to the UE.
  • the configuration information includes an AI/ML reference signal.
  • the reference signal can be SSB, CSI-RS from a serving cell, SSB from a neighbor cell, or DL PRS.
  • the configuration information can include RSRP threshold and LOS or NLOS rate threshold.
  • the AI/ML reference signal can help identify whether AI/ML can be activated or enabled. For example, if the UE can accurately measure the AI/ML reference signal, the AI/ML can be activated or enabled. If the UE cannot accurately measure the AI/ML reference signal, a fallback behavior can occur. For example, the fallback behavior can be to use the legacy positioning method for positioning purposes.
  • the AI/ML can be activated or enabled if a measured RSRP value of the AI/ML reference signal is not equal to the configured RSRP threshold. In some arrangements, if the LOS or NLOS rate of the measured AI/ML reference signal is not equal to the configured LOS or NLOS rate threshold, the AI/ML can be activated or enabled.
  • the configuration information can be included in ProvideAssistanceData message.
  • the message body in a LPP message can be used by the location server to provide assistance data to a target device either in response to a request from the target device or in an unsolicited manner.
  • the LMF and the UE can identify when and/or where to activate or enable AI/ML positioning.
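  • A hypothetical sketch combining the activation conditions of this embodiment follows: AI/ML positioning is enabled only when the UE camps on a configured cell (or lies within the regional range coordinates), the current time falls inside the configured time window, and the AI/ML reference signal is measured with sufficient RSRP; otherwise the legacy method is used as a fallback. The field names, the RSRP comparison, and all values are assumptions.

```python
def ai_ml_positioning_enabled(cfg: dict, state: dict) -> bool:
    """Return True if the AI/ML positioning procedure can be activated for the current state."""
    in_area = (state["cell_id"] in cfg["cell_id_list"]
               or (cfg["x_range"][0] <= state["x"] <= cfg["x_range"][1]
                   and cfg["y_range"][0] <= state["y"] <= cfg["y_range"][1]))
    t0, dur = cfg["window_start_ms"], cfg["window_duration_ms"]
    in_window = t0 <= state["time_ms"] % cfg["window_period_ms"] < t0 + dur
    rs_ok = state["ai_ml_rs_rsrp_dbm"] >= cfg["rsrp_threshold_dbm"]   # assumed comparison
    return in_area and in_window and rs_ok    # False -> fall back to a legacy positioning method

cfg = {"cell_id_list": [101, 102], "x_range": (0, 500), "y_range": (0, 300),
       "window_start_ms": 0, "window_duration_ms": 200, "window_period_ms": 1000,
       "rsrp_threshold_dbm": -110}
print(ai_ml_positioning_enabled(cfg, {"cell_id": 101, "x": 50, "y": 40,
                                      "time_ms": 120, "ai_ml_rs_rsrp_dbm": -95}))  # True
```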
  • Embodiment 7 Enhancement on Request Location Information
  • a request location information element provides detailed information on current legacy positioning methods. Therefore, the request location information can be enhanced with AI/ML related signaling to enable AI/ML positioning.
  • the RequestLocationInformation in LPP can be enhanced by indicating the request for AI/ML positioning.
  • the LMF can request one or more IEs to the UE for AI/ML positioning.
  • the IE can include a triggeredReportingforAI.
  • the triggeredReportingforAI can indicate that triggered reporting is requested for AI/ML positioning and can include one or more subfields.
  • the subfields can include cellChange. If the cellChange field is set to True, the target device can provide requested location information each time the primary cell has changed.
  • the subfields can include a reportingDuration.
  • the reportingDuration can include a maximum duration of triggered reporting in seconds. A value of zero is interpreted as an unlimited (or "infinite" ) duration.
  • the IE can include a PeriodicalReportingforAI.
  • the PeriodicalReportingforAI can indicate that the periodic reporting is requested and can include one or more subfields.
  • the subfields can include a reportingAmount.
  • the reportingAmount can indicate a number of periodic location information reports requested.
  • the subfields can include a reportingInterval.
  • the reportingInterval can indicate the interval between location information reports and the response time requirement for the first location information report.
  • the IE can include a QosforAI.
  • the QosforAI can indicate the quality of service and can include one or more subfields.
  • the subfields can include a horizontalAccuracy.
  • the horizontalAccuracy can indicate a maximum horizontal error in the location estimated at an indicated confidence level.
  • the subfields can include a verticalCoordinateRequest that can indicate whether a vertical coordinate is required (True) or not (FALSE) .
  • the subfields can include a verticalAccuracy that can indicate the maximum vertical error in the location estimate at an indicated confidence level and is only applicable when a vertical coordinate is requested.
  • the subfields can include a responseTime.
  • the responseTime can indicate the maximum response time as measured between receipt of the RequestLocationInformation and transmission of a ProvideLocationInformation based on AI/ML.
  • the IE can include a LocationInformationTypeforAI.
  • the LocationInformationTypeforAI can indicate whether the server requires a location estimate or one or more measurements.
  • the LocationInformationTypeforAI can indicate the server requires a direct AI/ML positioning or AI/ML-assisted positioning.
  • the server can require a measurement type, (e.g., LOS/NLOS identification, or timing measurement, or angle measurement, etc. ) .
  • the LMF can indicate the required positioning method for the UE, to specify the measurement result and/or positioning result.
  • the result can be evaluated using the legacy positioning method or AI/ML positioning method.
  • the UE can report the positioning result when AI/ML is involved.
  • the reporting periodicity, reporting type, and reporting requirement can be specified.
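  • The AI/ML-related IEs listed in this embodiment can be pictured with the following illustrative structure; the field names follow the text, but the Python encoding and the default values are assumptions rather than a normative LPP definition.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TriggeredReportingForAI:
    cell_change: bool = False          # report each time the primary cell changes
    reporting_duration_s: int = 0      # 0 is interpreted as an unlimited duration

@dataclass
class PeriodicalReportingForAI:
    reporting_amount: int = 8          # number of periodic location information reports requested
    reporting_interval_s: int = 4      # interval between reports / first-report response time

@dataclass
class QosForAI:
    horizontal_accuracy_m: float = 1.0
    vertical_coordinate_request: bool = False
    vertical_accuracy_m: Optional[float] = None   # only applicable when a vertical coordinate is requested
    response_time_s: int = 10

@dataclass
class RequestLocationInformationForAI:
    location_information_type: str = "direct"     # "direct" or "assisted" AI/ML positioning
    triggered: Optional[TriggeredReportingForAI] = None
    periodical: Optional[PeriodicalReportingForAI] = None
    qos: QosForAI = field(default_factory=QosForAI)

req = RequestLocationInformationForAI(periodical=PeriodicalReportingForAI(reporting_amount=4))
print(req)
```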
  • Embodiment 8 Enhancement on Assistance Data
  • the LMF can send the assistance data to the UE and/or PRU.
  • the assistance data includes DL-PRS-ResourcePrioritySubsetforAI.
  • the DL-PRS-ResourcePrioritySubsetforAI can be associated with the nr-DL-PRS-ResourceID for the purpose of prioritization of AI/ML training dataset collection and/or label generation.
  • the DL-PRS-ResourcePrioritySubsetforAI can include a list of NR-DL-PRSResourcePriorityItemforAI.
  • the subfields can be associated with a priority indicator to indicate a priority level of the PRS within the NR-DL-PRS-BeamInfo.
  • the assistance data can include a TRP ID.
  • the TRP ID can indicate that the PRS resources transmitted by an indicated TRP can be prioritized for AI/ML dataset collection and/or label generation.
  • the assistance data can include NR-DL-PRS-PositioningFrequencyLayer.
  • the NR-DL-PRS-PositioningFrequencyLayer can indicate that PRS resources of a certain positioning frequency layer configuration can be prioritized for AI/ML dataset collection and/or label generation.
  • the UE and/or PRU can collect dataset and/or generate label with indicated PRS resource (s) .
  • the configuration information can be included in LPP or in ProvideAssistanceData.
  • a message body in an LPP message can be used by the location server to provide assistance data to the target device either in response to a request from the target device or in an unsolicited manner.
  • the LMF can configure the indicated UL SRS resource (s) for TRP/gNB to collect data and/or generate a label, similar to the assistance data to the UE and/or the PRU.
  • the configuration can include a UL-SRS-ResourcePrioritySubsetforAI.
  • the UL-SRS-ResourcePrioritySubsetforAI can be associated with an SRS ResourceID for the purpose of prioritization of the AI/ML training data collection and/or label generation.
  • the UL-SRS-ResourcePrioritySubsetforAI can include a list of NR-UL-SRSResourcePriorityItemforAI (which includes an SRS ResourceSetID and/or an SRS ResourceID) and/or a model ID.
  • the model ID can indicate SRS resources in the UL-SRS-ResourcePrioritySubsetforAI can be prioritized for the model ID.
  • the configuration can include BeamInfo.
  • the BeamInfo can indicate that the SRS resources with certain beam information are prioritized for AI/ML dataset collection and/or label generation.
  • the BeamInfo can be associated with a priority indicator and indicate the priority level of the SRS within the BeamInfo.
  • the configuration can include a UE ID.
  • the UE ID can indicate the SRS resources transmitted by the indicated UE can be prioritized for AI/ML dataset collection and/or label generation.
  • the configuration information can be included in NR-DL-PRS-AssistanceData and used by the location server to provide DL-PRS assistance data.
  • the LMF can send the PRS configuration to request the gNB to configure or update the PRS transmission.
  • the request can include a PRS configuration for AI/ML.
  • the subfields in the PRS configuration can correspond to the PRS configuration in the NRPPa.
  • the correspondence can indicate the PRS configuration used for data collection and/or label generation in AI/ML positioning procedure.
  • the request can be included in PRS CONFIGURATION REQUEST.
  • the gNB can send the configured or updated PRS configuration for AI in PRS CONFIGURATION RESPONSE, sent by the NG-RAN node to acknowledge configuring or updating the PRS transmission for AI.
  • the gNB can send the PRS configuration for AI to UE in RRC message.
  • FIG. 9 illustrates an example of the LMF sending the configuration to the gNB.
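  • The prioritization described in this embodiment can be sketched as follows: PRS resources listed in the DL-PRS-ResourcePrioritySubsetforAI, or transmitted by an indicated TRP, are measured first for AI/ML dataset collection and/or label generation. The structures and the ranking rule are assumptions for illustration.

```python
def prioritize_prs_resources(all_resources, priority_subset, preferred_trp_ids):
    """Order PRS resources so that prioritized ones are collected/labeled first."""
    priority_ids = {item["nr_dl_prs_resource_id"]: item["priority"] for item in priority_subset}

    def rank(res):
        if res["resource_id"] in priority_ids:
            return (0, priority_ids[res["resource_id"]])   # explicit priority-subset item
        if res["trp_id"] in preferred_trp_ids:
            return (1, 0)                                   # resource from a prioritized TRP
        return (2, 0)                                       # everything else
    return sorted(all_resources, key=rank)

resources = [{"resource_id": 5, "trp_id": 2}, {"resource_id": 3, "trp_id": 1},
             {"resource_id": 7, "trp_id": 3}]
subset = [{"nr_dl_prs_resource_id": 7, "priority": 1}]
print(prioritize_prs_resources(resources, subset, preferred_trp_ids={1}))
```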
  • the model training node can send the request to a data generation node based on a data collection procedure.
  • the request can include periodicity of each report (indicated by a periodicity and an offset) , a datatype requirement (e.g., CIR, PDP, or DP) , and a data size requirement of each dimension.
  • a dimension of the data input reported by the data generation node can include at least one of a number of TRP, a number of antenna port pairs, a number of samples, and/or the number of PRS/SRS resource (s) measured.
  • the request can include a size of the data set.
  • the data of the dataset can include a list of {M, [X, Y, Z, U] } combination (s) .
  • X can indicate the number of TRP,
  • Y is the number of antenna port pairs,
  • Z is the number of samples for each CIR/PDP/DP, and
  • U is the number of PRS/SRS resources.
  • the data of the dataset can include a list of {M, V} combination (s) .
  • the data of the dataset includes M data sets with size [X, V] , and V can be any one or more of the 4-dimensional features.
  • the request can include whether a data label is required.
  • the request can include a quality of data label.
  • the quality of data label can include a quality value and a quality resolution. The quality value provides an estimation of uncertainty.
  • the request can include a confidence level of the data label.
  • the confidence level can include a confidence value and a confidence value resolution.
  • the confidence value provides an estimate of confidence level of the value for which the inference output is provided.
  • the request can include an uncertainty threshold for timing and/or angle measurement.
  • the request can include a time stamp of the generated data, a configuration requirement of the data set, and the source of the data label.
  • a RAT dependent positioning method or UE based positioning method can evaluate the source of the data label.
  • the data generation node can report the collected data set with the corresponding data features.
  • the report can include: the datatype (whether the reported data is CIR, PDP, or DP of the measured PRS/SRS) ; the data size of each dimension (the dimensions of the data input reported by the data generation node can include one or more of the number of TRPs, the number of antenna port pairs, the number of samples, and/or the number of PRS/SRS resource (s) measured) ; the quality of the data label (including a quality value and a quality resolution, where the quality value provides an estimate of uncertainty) ; the confidence level of the data label (including a confidence value and a confidence value resolution, where the confidence value provides an estimate of the confidence level of the value for which the inference output is provided) ; and an indicator for the data label.
  • the indicator can indicate whether the confidence level and/or the quality of the data label meets the confidence level and/or the quality requirement.
  • the report can include the time stamp of the generated data, the configuration information of the data set, and the source of the data label.
  • the report can include an ID of the reporter. For example, if the reporter is a PRU, the report can include a PRU ID; if the reporter is a UE, the report can include a UE ID; if the reporter is a TRP, the report can include a TRP ID.
  • the report can include an indicator that can indicate whether the current reporter can collect a data set as large as required by the size of data set element.
  • the model training node can be based on the LMF (e.g., LMF-side model) , the UE (e.g., UE-side model) or the gNB (e.g., NG-RAN node-side model) .
  • the data generation or collection node can be a UE, a PRU, or a gNB. In some cases, the data generation or collection node can report a data set whose data label meets the requirement indicated by the required quality of the data label and/or the required confidence level of the data label.
  • the data generation or collection node can report a data set whose data label meets the requirement indicated by an uncertainty threshold for timing and/or angle measurement.
  • the LMF can collect measured data from different nodes.
  • the gNB can report the measurement result of SRS resource (s) to the LMF as the input of the model training. Furthermore, the data label can be reported by the UE that transmits the SRS.
  • the model input comes from different entities.
  • the data label can be associated with the measurement result of the received SRS. Therefore, when the UE reports the data label to LMF, the report can include a data label (i.e., UE’s location) , the UE ID, and a timestamp.
  • the configuration information can be included in the LPP or in the ProvideLocationInformation.
  • the target device can provide positioning measurements to the location server based on the ProvideLocationInformation message body in an LPP message.
  • the report can include a measurement result (e.g., CIR, PDP, or DP) , the UE ID, and the timestamp.
  • the configuration information can be included in the NRPPa or in MEASUREMENT RESPONSE.
  • the NG-RAN can send the configuration information to the target UE so that the target UE reports positioning measurements.
  • the LMF can identify the mapping relationship between different pairs of measurement result and data label.
  • the LMF can request the UE and the gNB to report the label/measurement result at an indicated time; the difference between the label timestamp and the measurement timestamp can be limited to within a certain threshold.
  • the report can include an indicated time (can be indicated by a combination of system frame number, slot offset and symbol index with respect to the SFN initialization time) and a time difference threshold.
  • the configuration information can be included in LPP (e.g., for UE) , in RequestLocation Information, in NRPPa (e.g., for gNB) , or in MEASUREMENT REQUEST.
  • the LMF can send the message to request the NG-RAN node to configure a positioning measurement.
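  • As one non-limiting illustration of how a location server could use the indicated time and time-difference threshold above, the Python sketch below pairs a measurement report with the label report from the same UE whose timestamp is closest, accepting the pair only if the difference is within the threshold; the record layouts are assumptions made for the example.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class MeasurementReport:
        ue_id: int
        timestamp_ms: float
        cir: list              # CIR/PDP/DP samples (placeholder)

    @dataclass
    class LabelReport:
        ue_id: int
        timestamp_ms: float
        location: tuple        # ground-truth UE location, e.g., (x, y, z)

    def pair_measurement_with_label(meas: MeasurementReport,
                                    labels: List[LabelReport],
                                    threshold_ms: float) -> Optional[LabelReport]:
        """Return the label from the same UE whose timestamp is closest to the
        measurement timestamp, provided the difference is within the threshold."""
        candidates = [lab for lab in labels if lab.ue_id == meas.ue_id]
        if not candidates:
            return None
        best = min(candidates, key=lambda lab: abs(lab.timestamp_ms - meas.timestamp_ms))
        if abs(best.timestamp_ms - meas.timestamp_ms) <= threshold_ms:
            return best
        return None

    # Example usage: the 3 ms gap is within the 5 ms threshold, so the pair is kept.
    labels = [LabelReport(ue_id=1, timestamp_ms=1000.0, location=(10.0, 5.0, 1.5))]
    meas = MeasurementReport(ue_id=1, timestamp_ms=1003.0, cir=[0.9, 0.1])
    print(pair_measurement_with_label(meas, labels, threshold_ms=5.0))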
  • the UE can detect its own movement and report the time duration during which the UE is stationary.
  • the report can include the data label, the UE ID, the start time, and the duration of time, indicating that the data label is valid within the indicated duration.
  • the start time can be indicated by a combination of system frame number, slot offset, and symbol index with respect to the SFN initialization time
  • the duration of time can be given by a number of consecutive slots/symbols, or in units of slots, milliseconds, seconds, minutes, or hours.
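  • A minimal Python sketch of how the validity window implied by such a report could be checked is given below; the conversion from (system frame number, slot offset, symbol index) to an absolute time assumes a 30 kHz subcarrier spacing (0.5 ms slots, 14 symbols per slot) and ignores cyclic-prefix differences, purely for illustration.

    SLOT_MS = 0.5            # assumption: 30 kHz subcarrier spacing
    SYMBOLS_PER_SLOT = 14

    def start_time_ms(sfn: int, slot_offset: int, symbol_index: int) -> float:
        """Approximate start time in ms relative to the SFN initialization time."""
        return sfn * 10.0 + slot_offset * SLOT_MS + symbol_index * (SLOT_MS / SYMBOLS_PER_SLOT)

    def label_valid_at(t_ms: float, sfn: int, slot_offset: int, symbol_index: int,
                       duration_ms: float) -> bool:
        """True if t_ms falls inside the reported stationary window, i.e., the
        data label is still considered valid at time t_ms."""
        t0 = start_time_ms(sfn, slot_offset, symbol_index)
        return t0 <= t_ms <= t0 + duration_ms

    # Example: label valid for 2 s starting at SFN 100, slot 4, symbol 0 (t0 = 1002 ms).
    print(label_valid_at(1500.0, sfn=100, slot_offset=4, symbol_index=0, duration_ms=2000.0))  # True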
  • Embodiment 9 Enhancement on Report
  • the output of AI/ML positioning can be the UE’s location.
  • the UE can report the inference result.
  • a LocationSource can provide the source positioning technology for the location estimate. If the UE reports the location estimation result in ProvideLocationInformation using an AI/ML model, the LocationSource field can indicate the source positioning technology for AI/ML positioning.
  • the inference output can be a new measurement and/or an enhancement of an existing measurement, for example at the UE (e.g., a UE-side or LMF-based model) or at the gNB (e.g., an NG-RAN node-side or LMF-based model) .
  • the report can include LOS/NLOS identification result.
  • the identification result can include a LOS/NLOS indicator, the confidence level for the LOS/NLOS identification, and the model inference latency for LOS/NLOS identification.
  • the report can include a timing measurement result.
  • the timing measurement result can include a TOA measurement, the quality for the timing inference, the confidence level for the timing measurement, and the model inference latency for timing measurement.
  • the report can include an angle measurement result.
  • the angle measurement can include an inference quality for the angle inference.
  • the inference quality can include a quality value and a quality resolution.
  • the angle measurement can include a confidence level for the angle measurement, and the model inference latency for angle measurement.
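  • The Python sketch below shows one possible container for such an inference report, grouping the LOS/NLOS, timing, and angle results with their quality, confidence level, and model inference latency; the field names are illustrative and are not IE names defined by any specification.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LosNlosResult:
        is_los: bool                  # LOS/NLOS indicator
        confidence: float             # confidence level for the identification
        inference_latency_ms: float   # model inference latency

    @dataclass
    class TimingResult:
        toa_ns: float                 # TOA measurement
        quality: float                # quality for the timing inference
        confidence: float
        inference_latency_ms: float

    @dataclass
    class AngleResult:
        azimuth_deg: float
        quality_value: float          # inference quality value
        quality_resolution: float     # inference quality resolution
        confidence: float
        inference_latency_ms: float

    @dataclass
    class AiMlInferenceReport:
        """Illustrative AI/ML-assisted positioning report; any subset may be present."""
        los_nlos: Optional[LosNlosResult] = None
        timing: Optional[TimingResult] = None
        angle: Optional[AngleResult] = None

    report = AiMlInferenceReport(
        los_nlos=LosNlosResult(is_los=True, confidence=0.92, inference_latency_ms=3.1),
        timing=TimingResult(toa_ns=415.7, quality=0.8, confidence=0.9, inference_latency_ms=3.1),
    )
    print(report)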
  • the UE may report the AI/ML positioning result together with the measurement result of the legacy positioning method. In some cases, for AI/ML assisted positioning, the UE may report the AI/ML inference result with the measurement result of the legacy UE-assisted positioning.
  • the target device can provide AI/ML inference result to the location server using a new IE (e.g., AI/ML-SignalInferenceInformation) .
  • the report of UE’s location may be supported with the enhancements disclosed herein.
  • the report of model inference result, the inference quality, and confidence level can be supported with the enhancements disclosed herein.
  • Embodiment 10 Enhancement on Model Monitoring
  • model monitoring may monitor inference performance of the AI/ML model and is a key function of AI/ML positioning.
  • An effective method of model monitoring is to let the UE/TRP perform PRS measurement for AI/ML positioning and for legacy positioning.
  • the configuration information may include one or more time windows.
  • Each time window can be indicated by one or more of the following elements: the start time of the window (can be indicated by a combination of subframe number, slot offset and symbol index) , the duration of the window (can be indicated by a number of consecutive slots/symbols) , the periodicity of the time window (can be indicated by a periodicity and an offset) , a DL PRS resource set ID, a DL PRS resource ID, or a TRP ID.
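  • As a sketch under simplifying assumptions (slot-level granularity, illustrative field names), the Python fragment below shows one way such a periodic time window could be represented and tested for a given slot.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PositioningTimeWindow:
        """One configured time window for AI/ML positioning (slot granularity assumed)."""
        start_slot: int                            # derived from the subframe/slot/symbol start
        duration_slots: int                        # number of consecutive slots
        periodicity_slots: Optional[int] = None    # None means a one-shot window
        offset_slots: int = 0
        dl_prs_resource_set_id: Optional[int] = None
        dl_prs_resource_id: Optional[int] = None
        trp_id: Optional[int] = None

        def contains(self, slot: int) -> bool:
            """True if the absolute slot index falls inside an occurrence of the window."""
            if self.periodicity_slots is None:
                return self.start_slot <= slot < self.start_slot + self.duration_slots
            if slot < self.start_slot:
                return False
            rel = (slot - self.start_slot - self.offset_slots) % self.periodicity_slots
            return rel < self.duration_slots

    window = PositioningTimeWindow(start_slot=0, duration_slots=4,
                                   periodicity_slots=160, offset_slots=0, trp_id=7)
    print(window.contains(162))  # True: slot 162 falls in the second occurrence (slots 160-163)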
  • the configuration information can be included in LPP or in ProvideAssistanceData. Furthermore, the mapping details can be included in a new IE (e.g., AI/ML-ProvideAssistanceData) .
  • the location server can provide assistance data to enable the UE-assisted and/or UE-based AI/ML positioning, using the AI/ML-ProvideAssistanceData.
  • the LMF can send the configuration information to the gNB; the configuration information can be included in NRPPa and in assistance information. The LMF can send the NRPPa message to transfer the assistance information.
  • the UE/PRU/TRP can perform model inference for AI/ML positioning and measurement for legacy positioning based on indicated time window (s) and indicated DL PRS/UL SRS resource.
  • the UE can get the inference output.
  • the UE can obtain the TOA measurement from the inference output and the TOA measurement obtained with the legacy positioning technology.
  • the TOA measurement results can be used to perform model monitoring for AI/ML positioning.
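  • One hedged way these paired results could drive model monitoring is sketched below in Python: the TOA inferred with the AI/ML model is compared against the TOA measured with the legacy positioning method over the same time window, and the model is flagged as degraded when the mean absolute error exceeds a threshold. The error metric and threshold are illustrative choices, not requirements of this disclosure.

    from typing import List, Tuple

    def monitor_toa_model(pairs: List[Tuple[float, float]], threshold_ns: float) -> bool:
        """pairs: (ai_toa_ns, legacy_toa_ns) obtained in the same time window on the
        same DL PRS resources.  Returns True if the model is judged degraded."""
        if not pairs:
            return False
        mean_abs_err = sum(abs(ai - legacy) for ai, legacy in pairs) / len(pairs)
        return mean_abs_err > threshold_ns

    # Example: three monitoring samples, 5 ns degradation threshold.
    samples = [(412.0, 410.5), (398.2, 401.0), (450.9, 430.0)]
    print(monitor_toa_model(samples, threshold_ns=5.0))  # True: mean absolute error is about 8.4 ns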
  • Embodiment 11 Enhancement on Model-Input Size Reduction
  • the label size of one data sample can be (label data size of one PRS/SRS resource) * (number of PRS/SRS resources needed for the model output) .
  • the size of the collected data and/or the model input can be large if the number of PRS/SRS resources, the number of TRPs, the number of antenna port pairs, and/or the number of samples is large.
  • the overhead of data transfer may be heavy with a large data size.
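  • A back-of-the-envelope Python sketch of these quantities is given below; the assumptions that the model input carries one value per (TRP, antenna port pair, sample, PRS/SRS resource) and that each value occupies four bytes are illustrative only.

    def model_input_size(num_trp: int, num_port_pairs: int, num_samples: int,
                         num_resources: int, bytes_per_value: int = 4) -> int:
        """Rough size in bytes of one CIR/PDP/DP model input."""
        return num_trp * num_port_pairs * num_samples * num_resources * bytes_per_value

    def label_size(label_bytes_per_resource: int, num_resources_for_output: int) -> int:
        """Label size of one data sample: (label data size of one PRS/SRS resource)
        * (number of PRS/SRS resources needed for the model output)."""
        return label_bytes_per_resource * num_resources_for_output

    # Example: 18 TRPs, 2 antenna port pairs, 256 samples, 8 resources -> 294912 bytes per input.
    print(model_input_size(18, 2, 256, 8))
    print(label_size(label_bytes_per_resource=12, num_resources_for_output=8))  # 96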
  • the LMF may configure the reference PRS/SRS resource and/or resource set for data collection and/or inference to the UE/gNB.
  • the UE and/or gNB can collect training and/or inference data with the CIR/PDP/DP of the indicated reference PRS/SRS resource and/or resource set, together with the CIR/PDP/DP of other PRS/SRS resources and/or resource sets expressed relative to the CIR/PDP/DP of the indicated reference PRS/SRS resource and/or resource set.
  • the gNB may configure the reference PRS resource and/or resource set for data collection and/or inference to the UE.
  • the configuration can indicate that UE can collect training and/or inference data with the CIR/PDP/DP of the indicated reference PRS resource and/or resource set, together with the CIR/PDP/DP of other PRS resource and/or resource set relative to the CIR/PDP/DP of the indicated reference PRS resource and/or resource set.
  • the configured reference PRS/SRS resource, resource set, TRP, and/or UE for data collection and/or inference can be indicated by one or more IEs.
  • the IEs can include dl-PRS-ID, nr-PhysCellID, nr-CellGlobalID, nr-ARFCN, nr-DL-PRS-ResourceID, nr-DL-PRS-ResourceSetID, nr-UL-SRS-ResourceID, nr-UL-SRS-ResourceSetID, or the UE ID.
  • if the LMF configures the reference PRS resource ID#1, PRS resource set#2, and dl-PRS-ID #1 to the UE for data collection, the UE can report data to the LMF.
  • the data can include the CIR/PDP/DP (referred to as “REF” ) of the reference PRS resource ID#1 and PRS resource set#2 of dl-PRS-ID #1, and the CIR/PDP/DP of other PRS resource ID, PRS resource set, and dl-PRS-ID combinations relative to REF.
  • the ID of reference PRS/SRS resource, resource set, TRP and/or UE can be reported by the gNB and/or the UE.
  • the ID can indicate that the reported data includes the CIR/PDP/DP of the indicated reference PRS/SRS resource ID, resource set ID, TRP ID and/or UE ID, together with the CIR/PDP/DP of other PRS/SRS resource IDs, resource set IDs, TRP IDs and/or UE IDs relative to the CIR/PDP/DP of the indicated reference PRS/SRS resource ID and/or resource set ID and/or TRP ID and/or UE ID.
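  • Under simplifying assumptions, the Python sketch below illustrates how a reporter could express the CIR/PDP/DP of non-reference resources relative to the indicated reference ( “REF” ), here simply as element-wise differences; the disclosure does not mandate this or any particular relative representation, and the delta encoding is shown only as one way the reported data volume could shrink when neighbouring resources are similar.

    from typing import Dict, List

    def encode_relative_to_reference(pdp_by_resource: Dict[int, List[float]],
                                     ref_resource_id: int) -> Dict[str, object]:
        """Report the PDP of the reference resource as-is ('REF') and every other
        resource as element-wise differences from the reference PDP."""
        ref = pdp_by_resource[ref_resource_id]
        deltas = {
            res_id: [v - r for v, r in zip(pdp, ref)]
            for res_id, pdp in pdp_by_resource.items() if res_id != ref_resource_id
        }
        return {"ref_resource_id": ref_resource_id, "REF": ref, "relative": deltas}

    # Example: PRS resource #1 is the configured reference.
    pdp = {
        1: [0.90, 0.40, 0.10, 0.02],
        2: [0.88, 0.42, 0.09, 0.02],
        3: [0.10, 0.75, 0.30, 0.05],
    }
    report = encode_relative_to_reference(pdp, ref_resource_id=1)
    print(report["relative"][2])  # small deltas, approximately [-0.02, 0.02, -0.01, 0.0]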
  • FIG. 11 illustrates a flow diagram of a method 1100 for AI/ML positioning.
  • the method 1100 may be executed by any one or more of the components and devices detailed herein in conjunction with FIGs. 1–10.
  • the method 1100 may be performed by a wireless communication node (e.g., a base station (BS) 102) , in some embodiments. Additional, fewer, or different operations may be performed in the method 1100 depending on the embodiment. At least one aspect of the operations is directed to a system, method, apparatus, or a computer-readable medium.
  • the capability can include at least one of: a processing speed of the second wireless communication entity; a maximum capacity; a ratio of remaining space to a maximum capacity; a remaining available time or space to receive PRS and/or collect data set; or a timestamp for a current state; or a model inference latency; or a model ID; or a functionality.
  • N and/or N2 can indicate the duration of DL-PRS symbols, a duration of N PRS symbols that the second wireless communication entity can process every T milliseconds (ms) and/or a duration of N2 PRS symbols that the second wireless communication entity can process in T2 ms.
  • N and/or N2 can indicate the size of dataset (s) for positioning reference signal, a size of N PRS symbols that the second wireless communication entity can process every T milliseconds (ms) and a size of N2 PRS symbols that the second wireless communication entity can process in T2 ms.
  • the second or the third wireless communication entity can be configured to report an indicator indicating that the one or more sets of {N, T} and/or the one or more sets of {N2, T2} is for one of: PRS processing for AI/ML positioning only, PRS processing for other positioning method (s) , or PRS processing for both of the AI/ML positioning and other positioning method (s) .
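  • The Python sketch below shows one way a location server might interpret the reported {N, T} and {N2, T2} pairs when deciding whether a planned PRS load fits the reported processing capability; the exact semantics of the pairs and this admission check are assumptions made for illustration.

    from dataclasses import dataclass

    @dataclass
    class PrsProcessingCapability:
        """AI/ML positioning capability reported as {N, T} and {N2, T2} pairs."""
        n_symbols: int        # N: PRS symbols the entity can process ...
        t_ms: float           # ... every T ms
        n2_symbols: int       # N2: PRS symbols the entity can process ...
        t2_ms: float          # ... within T2 ms

        def can_process(self, requested_symbols: int, within_ms: float) -> bool:
            """Rough admission check: the requested PRS load fits within at least one
            of the reported processing budgets, scaled to the requested time span."""
            budget_nt = self.n_symbols * (within_ms / self.t_ms)
            budget_n2t2 = self.n2_symbols * (within_ms / self.t2_ms)
            return requested_symbols <= max(budget_nt, budget_n2t2)

    cap = PrsProcessingCapability(n_symbols=8, t_ms=8.0, n2_symbols=64, t2_ms=160.0)
    print(cap.can_process(requested_symbols=40, within_ms=80.0))   # True  (budget of 80 symbols)
    print(cap.can_process(requested_symbols=200, within_ms=80.0))  # False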
  • the wireless communication method can include the first wireless communication entity requesting the second wireless communication entity to report an AI/ML positioning result together with a positioning result of other positioning method (s) .
  • a prerequisite for AI/ML-assisted positioning can include feature groups for Multi-RTT and DL-TDOA.
  • a prerequisite for AI/ML-assisted positioning can include feature groups for DL-AOD positioning.
  • the wireless communication method can include the first wireless communication entity sending a message requesting the capability-related information to the second wireless communication entity.
  • the wireless communication method can include the first wireless communication entity sending a message requesting the second wireless communication entity to reserve the occupancy for the one or more processing units.
  • the message can include at least one of: a processing speed of the second wireless communication entity; a maximum capacity; a ratio of remaining space to a maximum capacity; a remaining available time or space to receive PRS and/or collect data set; or a timestamp for a current state; or a model inference latency; or a model ID; or a functionality; or a periodicity and an offset for the second wireless communication entity to report the capability-related information; a time for the second wireless communication entity to report the capability-related information; or a response time for the second wireless communication entity to report the capability-related information.
  • the message can include at least one of: a start time and/or an end time, a duration of the reservation, a number of the reserved processing units, the identification of the reserved processing units, or the periodicity of the reservation period.
  • the wireless communication method can include the first wireless communication entity sending an AI/ML positioning configuration.
  • the AI/ML positioning configuration can include at least one of a cell ID list or regional range coordinates.
  • the AI/ML-related information can include a mapping relationship between a model ID and a plurality of criteria.
  • the criteria can include at least one of: the ratio of the number of TRPs with LOS path to the total number of TRPs, the number of TRPs with LOS path, cell ID, the value and range of RSRP, the granularity of timing measurement, the quality of timing value, the initial phase offset of the transmitter, the number of samples for each input, the granularity of sampling for each input, the requirement of AI/ML positioning, or the configuration of PRS and/or SRS.
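  • As a hedged illustration of how such a mapping between a model ID and a subset of these criteria might be used, the Python sketch below selects the first model whose criteria match the current conditions; the criteria fields and the matching rule are illustrative only and not part of the signaling.

    from dataclasses import dataclass
    from typing import Dict, Optional, Sequence, Tuple

    @dataclass
    class ModelCriteria:
        """Illustrative subset of the criteria that can be mapped to a model ID."""
        min_los_trp_ratio: float                 # minimum ratio of TRPs with an LOS path
        rsrp_range_dbm: Tuple[float, float]      # (min, max) RSRP in dBm
        cell_ids: Sequence[int]                  # cells where the model applies

        def matches(self, los_trp_ratio: float, rsrp_dbm: float, cell_id: int) -> bool:
            lo, hi = self.rsrp_range_dbm
            return (los_trp_ratio >= self.min_los_trp_ratio
                    and lo <= rsrp_dbm <= hi
                    and cell_id in self.cell_ids)

    def select_model(mapping: Dict[int, ModelCriteria],
                     los_trp_ratio: float, rsrp_dbm: float, cell_id: int) -> Optional[int]:
        """Return the first model ID whose criteria match the current conditions."""
        for model_id, criteria in mapping.items():
            if criteria.matches(los_trp_ratio, rsrp_dbm, cell_id):
                return model_id
        return None

    mapping = {
        1: ModelCriteria(min_los_trp_ratio=0.5, rsrp_range_dbm=(-110.0, -60.0), cell_ids=[101, 102]),
        2: ModelCriteria(min_los_trp_ratio=0.0, rsrp_range_dbm=(-140.0, -110.0), cell_ids=[101, 102]),
    }
    print(select_model(mapping, los_trp_ratio=0.7, rsrp_dbm=-95.0, cell_id=101))  # 1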
  • the AI/ML positioning configuration further can include a time window to enable AI/ML positioning procedure.
  • the IEs for AI/ML positioning can include at least one of: triggered reporting requested for AI/ML positioning, periodic reporting requested for AI/ML positioning, quality of service for AI/ML positioning, or Location Information Type for AI/ML positioning.
  • the AI/ML positioning configuration further can include at least one of: an AI/ML reference signal; an RSRP threshold; or an LOS or NLOS rate threshold.
  • the wireless communication method can include the first wireless communication entity sending a message including assistance data for AI/ML positioning.
  • the assistance data can include a dedicated PRS and/or a configuration of the dedicated PRS for the AI/ML positioning.
  • the assistance data can include at least one of: DL-PRS Resource Priority Subset for AI/ML positioning, model ID, Beam information of the DL-PRS for AI/ML positioning, TRP ID, positioning frequency layer for AI/ML positioning.
  • the wireless communication method can include the first wireless communication entity configuring Requested SRS Transmission Characteristics for a third wireless communication entity.
  • the Requested SRS Transmission Characteristics can include an SRS configuration for AI/ML positioning.
  • the SRS configuration can include at least one of: UL-SRS Resource Priority Subset for AI/ML positioning, model ID, Beam information of the UL-SRS for AI/ML positioning, UE ID.
  • the wireless communication method can include the first wireless communication entity receiving, from a second wireless communication entity, a report including the label.
  • the label can include at least one of: a UE location; a UE ID; or a timestamp, start time and/or duration of time indicating the location’s validity.
  • the wireless communication method can include the first wireless communication entity receiving, from the third wireless communication entity, a report including a measurement result.
  • the measurement result can include at least one of: CIR/PDP/DP; a UE ID; or a timestamp.
  • the wireless communication method can include the first wireless communication entity sending to the second and/or the third wireless communication entity, requirement or configuration for AI/ML positioning.
  • the requirement or configuration can include the requirement for datasets report.
  • the requirement for datasets report can include at least one of: Periodicity of each report, Datatype requirement, Data size requirement of each dimension, the size of data set, whether data label is required, the required quality of data label, the required confidence level of data label, the required uncertainty threshold for timing and/or angle measurement, the time stamp of the generated data, the configuration requirement of the data set, the source of data label, indicated time and/or time difference threshold.
  • the wireless communication method can include the first wireless communication entity receiving, from the second and/or the third wireless communication entity, a report of AI/ML positioning.
  • the report can include datasets report, wherein the datasets report can include at least one of: Datatype , Data size of each dimension, the size of data set, the quality of data label, the confidence level of data label, an indicator for data label, the time stamp of the generated data, the source of data label, the ID of the reporter, the size of data set still required.
  • the wireless communication method can include the first wireless communication entity receiving from a second or a third wireless communication entity, an inference result or AI/ML positioning result.
  • the inference result or AI/ML positioning result can include at least one of: an LOS/NLOS identification result; a timing measurement result; an angle measurement result; or a location of the second wireless communication entity.
  • the inference result or AI/ML positioning result can include confidence level and/or the model inference latency and/or the quality of measurement/inference.
  • the requirement or configuration can include the reference information for measurement report.
  • the requirement or configuration can include the configuration for model update.
  • the configuration for model update can include at least one of: expected update time, expected update periodicity and offsets, latest model, time stamp of the latest model, size of dataset trained for each update, duration of PRS/SRS that trained for each update, model ID and/or functionality, positioning scenario, UE ID, PRU ID, TRP ID.
  • the report can include model report, wherein the model report can include at least one of: updated model, feature and/or size of dataset that used to train the model, the duration of PRS/SRS that used to train the model, the timestamp of the report or the updated model, model ID and/or functionality, positioning scenario, UE ID, PRU ID, TRP ID.
  • the reference information can include at least one of: the reference PRS/SRS resource and/or resource set for AI/ML positioning; the reference TRP/UE for AI/ML positioning.
  • the wireless communication method can include the first wireless communication entity sending a message including configuration information of one or more time windows configured for AI/ML positioning to a second or third wireless communication entity.
  • Each of the one or more time windows is defined based on at least one of: a starting time of the time window; a duration of the time window; a periodicity of the time window; a PRS resource set ID; an SRS resource set ID; a PRS resource ID; an SRS resource ID; or a TRP ID.
  • the configuration of the time window can include one or more of: a start time, a duration of the time window, or a periodicity of the time window.
  • any reference to an element herein using a designation such as “first,” “second,” and so forth does not generally limit the quantity or order of those elements. Rather, these designations can be used herein as a convenient means of distinguishing between two or more elements or instances of an element. Thus, a reference to first and second elements does not mean that only two elements can be employed, or that the first element must precede the second element in some manner.
  • any of the various illustrative logical blocks, modules, processors, means, circuits, methods and functions described in connection with the aspects disclosed herein can be implemented by electronic hardware (e.g., a digital implementation, an analog implementation, or a combination of the two) , firmware, various forms of program or design code incorporating instructions (which can be referred to herein, for convenience, as “software” or a “software module” ) , or any combination of these techniques.
  • the logical blocks, modules, and circuits can be implemented within or performed by an integrated circuit (IC) , a digital signal processor (DSP) , an application specific integrated circuit (ASIC) , or a field programmable gate array (FPGA) .
  • the logical blocks, modules, and circuits can further include antennas and/or transceivers to communicate with various components within the network or within the device.
  • a general-purpose processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, or state machine.
  • a processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other suitable configuration to perform the functions described herein.
  • Computer-readable media can include both computer storage media and communication media including any medium that can be enabled to transfer a computer program or code from one place to another.
  • a storage media can be any available media that can be accessed by a computer.
  • such computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • the term “module” refers to software, firmware, hardware, and any combination of these elements for performing the associated functions described herein. Additionally, for purpose of discussion, the various modules are described as discrete modules; however, as would be apparent to one of ordinary skill in the art, two or more modules may be combined to form a single module that performs the associated functions according to embodiments of the present solution.
  • memory or other storage may be employed in embodiments of the present solution.
  • any suitable distribution of functionality between different functional units, processing logic elements or domains may be used without detracting from the present solution.
  • functionality illustrated to be performed by separate processing logic elements, or controllers may be performed by the same processing logic element, or controller.
  • references to specific functional units are only references to a suitable means for providing the described functionality, rather than indicative of a strict logical or physical structure or organization.


Abstract

At least one aspect relates to a system, a method, an apparatus, or a computer-readable medium directed to the following. The wireless communication method can include a first wireless communication entity receiving positioning-related information from a second or a third wireless communication entity. The second wireless communication entity is a user equipment (UE) or a positioning reference unit (PRU). The positioning-related information can include a capability of the second wireless communication entity with respect to artificial intelligence/machine learning (AI/ML) based positioning. The capability can include one or more sets of {N, T} and/or one or more sets of {N2, T2}.
PCT/CN2023/140731 2023-12-21 2023-12-21 Système et procédé de positionnement basé sur l'intelligence artificielle/l'apprentissage machine Pending WO2025129578A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2023/140731 WO2025129578A1 (fr) 2023-12-21 2023-12-21 Système et procédé de positionnement basé sur l'intelligence artificielle/l'apprentissage machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2023/140731 WO2025129578A1 (fr) 2023-12-21 2023-12-21 Système et procédé de positionnement basé sur l'intelligence artificielle/l'apprentissage machine

Publications (1)

Publication Number Publication Date
WO2025129578A1 true WO2025129578A1 (fr) 2025-06-26

Family

ID=96136248

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/140731 Pending WO2025129578A1 (fr) 2023-12-21 2023-12-21 Système et procédé de positionnement basé sur l'intelligence artificielle/l'apprentissage machine

Country Status (1)

Country Link
WO (1) WO2025129578A1 (fr)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023061458A1 (fr) * 2021-10-15 2023-04-20 维沃移动通信有限公司 Positioning method, terminal, and network-side device
US20230164817A1 (en) * 2021-11-24 2023-05-25 Lenovo (Singapore) Pte. Ltd. Artificial Intelligence Capability Reporting for Wireless Communication
WO2023206499A1 (fr) * 2022-04-29 2023-11-02 Apple Inc. Training and inference for AI-based positioning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GUOZENG ZHENG, ZTE: "Discussion on other aspects for AI positioning enhancement", 3GPP DRAFT; R1-2302442; TYPE DISCUSSION; FS_NR_AIML_AIR, 3RD GENERATION PARTNERSHIP PROJECT (3GPP), MOBILE COMPETENCE CENTRE ; 650, ROUTE DES LUCIOLES ; F-06921 SOPHIA-ANTIPOLIS CEDEX ; FRANCE, vol. RAN WG1, no. Online; 20230417 - 20230426, 7 April 2023 (2023-04-07), Mobile Competence Centre ; 650, route des Lucioles ; F-06921 Sophia-Antipolis Cedex ; France, XP052293017 *
LENOVO: "AI/ML Positioning use cases and associated impacts", 3GPP DRAFT; R1-2204422, 3RD GENERATION PARTNERSHIP PROJECT (3GPP), MOBILE COMPETENCE CENTRE ; 650, ROUTE DES LUCIOLES ; F-06921 SOPHIA-ANTIPOLIS CEDEX ; FRANCE, vol. RAN WG1, no. e-Meeting; 20220509 - 20220520, 29 April 2022 (2022-04-29), Mobile Competence Centre ; 650, route des Lucioles ; F-06921 Sophia-Antipolis Cedex ; France, XP052144025 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23961913

Country of ref document: EP

Kind code of ref document: A1