
WO2024170977A1 - Processing framework for noisy labels - Google Patents

Processing framework for noisy labels

Info

Publication number
WO2024170977A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
data processing
processing model
quality
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2024/050714
Other languages
French (fr)
Inventor
Sajad REZAIE
Oana-Elena Barbu
Athul Prasad
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Technologies Oy
Original Assignee
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy filed Critical Nokia Technologies Oy
Priority to CN202480012066.4A priority Critical patent/CN120731432A/en
Publication of WO2024170977A1 publication Critical patent/WO2024170977A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0205 Details
    • G01S5/0244 Accuracy or reliability of position solution or of measurements contributing thereto
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0278 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves involving statistical or probabilistic considerations

Definitions

  • Various example embodiments of the present disclosure generally relate to the field of telecommunication and, in particular, to methods, devices, apparatuses and computer readable storage media for a processing framework for noisy labels.
  • AI/ML Artificial Intelligence/Machine Learning
  • the AI/ML models have been employed for positioning of devices in a communication network.
  • a large dataset of training samples may be used to train the AI/ML models to improve the positioning accuracy.
  • the positioning information of dataset samples used as labels may not be fully accurate. Therefore, it is worth studying the processing of noisy labels of training samples for machine learning training.
  • a first device comprises at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the first device at least to perform: receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
  • In a second aspect of the present disclosure, there is provided a second device.
  • the second device comprises at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the second device at least to perform: determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
  • a method comprises: receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
  • a method comprises: determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
  • the first apparatus comprises means for receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; means for determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and means for determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
  • a second apparatus comprises means for determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and means for transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
  • a computer readable medium comprises instructions stored thereon for causing an apparatus to perform at least the method according to the third aspect.
  • a computer readable medium comprises instructions stored thereon for causing an apparatus to perform at least the method according to the fourth aspect.
  • FIG. 1 illustrates an example communication environment in which example embodiments of the present disclosure can be implemented
  • FIG. 2A illustrates a signaling chart for communication according to some example embodiments of the present disclosure
  • FIG. 2B illustrates a signaling chart for communication according to some example embodiments of the present disclosure
  • FIGS. 3AA-3EB illustrate example probability density functions (PDFs) for noisy sources of samples according to some example embodiments of the present disclosure
  • FIG. 4 illustrates a flowchart of a method implemented at a first device according to some example embodiments of the present disclosure
  • FIG. 5 illustrates a flowchart of a method implemented at a second device according to some example embodiments of the present disclosure
  • FIG. 6 illustrates a simplified block diagram of a device that is suitable for implementing example embodiments of the present disclosure.
  • FIG. 7 illustrates a block diagram of an example computer readable medium in accordance with some example embodiments of the present disclosure.
  • references in the present disclosure to “one embodiment,” “an embodiment,” “an example embodiment,” and the like indicate that the embodiment described may include a particular feature, structure, or characteristic, but it is not necessary that every embodiment includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an example embodiment, it is submitted that it is within the knowledge of one skilled in the art to affect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • “at least one of the following: <a list of two or more elements>” and “at least one of <a list of two or more elements>” and similar wording, where the list of two or more elements are joined by “and” or “or”, means at least any one of the elements, or at least any two or more of the elements, or at least all the elements.
  • step “in response to A” does not indicate that the step is performed immediately after “A” occurs and one or more intervening steps may be included.
  • circuitry may refer to one or more or all of the following:
  • circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
  • circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware.
  • circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in server, a cellular network device, or other computing or network device.
  • the term “communication network” refers to a network following any suitable communication standards, such as fifth generation (5G) systems, Long Term Evolution (LTE), LTE-Advanced (LTE-A), Wideband Code Division Multiple Access (WCDMA), High-Speed Packet Access (HSPA), Narrow Band Internet of Things (NB-IoT) and so on.
  • 5G fifth generation
  • LTE Long Term Evolution
  • LTE-A LTE-Advanced
  • WCDMA Wideband Code Division Multiple Access
  • HSPA High-Speed Packet Access
  • NB-IoT Narrow Band Internet of Things
  • the communications between a terminal device and a network device in the communication network may be performed according to any suitable generation communication protocols, including, but not limited to, the first generation (1G), the second generation (2G), 2.5G, 2.75G, the third generation (3G), the fourth generation (4G), 4.5G, the fifth generation (5G) new radio (NR) communication protocols, and/or any other protocols either currently known or to be developed in the future.
  • Embodiments of the present disclosure may be applied in various communication systems. Given the rapid development in communications, there will of course also be future type communication technologies and systems with which the present disclosure may be embodied. It should not be seen as limiting the scope of the present disclosure to only the aforementioned system.
  • the term “network device” refers to a node in a communication network via which a terminal device accesses the network and receives services therefrom.
  • the network device may refer to a base station (BS) or an access point (AP), for example, a node B (NodeB or NB), an evolved NodeB (eNodeB or eNB), a Next Generation NodeB (NR NB), a Remote Radio Unit (RRU), a radio header (RH), a remote radio head (RRH), Integrated Access and Backhaul (IAB) node, a relay, a low power node such as a femto, a pico, and so forth, depending on the applied terminology and technology.
  • the network device is allowed to be defined as part of a gNB such as, for example, in a CU/DU split, in which case the network device is defined to be either a gNB-CU or a gNB-DU.
  • terminal device refers to any end device that may be capable of wireless communication.
  • a terminal device may also be referred to as a communication device, user equipment (UE), a Subscriber Station (SS), a Portable Subscriber Station, a Mobile Station (MS), or an Access Terminal (AT).
  • UE user equipment
  • SS Subscriber Station
  • MS Mobile Station
  • AT Access Terminal
  • the terminal device may include, but is not limited to, a mobile phone, a cellular phone, a smart phone, voice over IP (VoIP) phones, wireless local loop phones, a tablet, a wearable terminal device, a personal digital assistant (PDA), portable computers, desktop computer, image capture terminal devices such as digital cameras, gaming terminal devices, music storage and playback appliances, vehicle-mounted wireless terminal devices, wireless endpoints, mobile stations, laptop-embedded equipment (LEE), laptop-mounted equipment (LME), USB dongles, smart devices, wireless customer-premises equipment (CPE), an Internet of Things (IoT) device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications (e.g., remote surgery), an industrial device and applications (e.g., a robot and/or other wireless devices operating in an industrial and/or an automated processing chain contexts), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, and the like.
  • VoIP voice over IP
  • the terminal device may also correspond to Mobile Termination (MT) part of the integrated access and backhaul (IAB) node (a.k.a. a relay node).
  • MT Mobile Termination
  • IAB integrated access and backhaul
  • a user equipment apparatus, such as a cell phone or tablet computer or laptop computer or desktop computer or mobile IoT device or fixed IoT device.
  • This user equipment apparatus can, for example, be furnished with corresponding capabilities as described in connection with the fixed and/or the wireless network node(s), as appropriate.
  • the user equipment apparatus may be the user equipment and/or a control device, such as a chipset or processor, configured to control the user equipment when installed therein. Examples of such functionalities include the bootstrapping server function and/or the home subscriber server, which may be implemented in the user equipment apparatus by providing the user equipment apparatus with software configured to cause the user equipment apparatus to perform from the point of view of these functions/nodes.
  • FIG. 1 illustrates an example communication environment 100 in which example embodiments of the present disclosure can be implemented.
  • the communication environment 100 includes a device 110-1, a device 110-2, a device 110-3, . . . , and a device 110-N, which can be collectively referred to as “device(s) 110.”
  • the communication environment also includes a device 120 and a device 130.
  • the device(s) 110, the device 120 and the device 130 can communicate with each other.
  • the device 110 may include a terminal device and the device 130 may include a network device serving the terminal device.
  • the device 120 may include a core network device.
  • the device 120 may include a device on which a location management function (LMF) can be implemented.
  • LMF location management function
  • a central ML unit may be located within the communication environment 100.
  • the central ML unit may be part of the LMF implemented on the device 120.
  • the central ML unit trains the AI/ML model for positioning by using training samples.
  • the central ML unit may be any suitable unit for data analyzing, including but not limited to a 5G network data analytics function (NWDAF).
  • NWDAF 5G network data analytics function
  • the central ML unit may collect training samples from a set of data collection devices deployed in certain locations.
  • the data collection device may include a positioning reference unit (PRU) or any other suitable data collection devices.
  • PRUs are reference units such as devices or network nodes at known locations (that is, having label information). PRUs may take measurements to generate correction data used for refining the location of other target devices in the area.
  • the device 110, the device 120 and/or device 130 may perform as the data collection device.
  • the device 110, the device 120 and/or device 130 may provide positioning measurements or estimations in addition to its/their own position(s) via radio access network (RAN) or non-RAN.
  • RAN radio access network
  • the positioning information provided by the device 110, the device 120 and/or device 130 is collected in the communication environment 100, and thus may be used to analyze the propagation properties of the communication environment 100.
  • the central ML unit may use a training dataset which may include positioning measurements from different PRUs to train a localization ML framework.
  • the trained ML framework may be deployed at network entities running ML processes and/or algorithms. Such entities may be referred to as host types.
  • Host types carrying out ML processes can be a target device to be positioned, and potentially the radio access network (e.g., the network device and/or the LMF) to obtain the positioning accuracy.
  • the communication environment 100 may include any suitable number of devices configured to implement example embodiments of the present disclosure. Although not shown, it would be appreciated that one or more additional devices may be located in the cell of the device 130, and one or more additional cells may be deployed in the communication environment 100. It is noted that although illustrated as a network device, the device 130 may be a device other than a network device. Although illustrated as a terminal device, the device 110 may be a device other than a terminal device.
  • a link from the device 130 to the device 110 is referred to as a downlink (DL), while a link from the device 110 to the device 130 is referred to as an uplink (UL).
  • DL the device 130 is a transmitting (TX) device (or a transmitter) and the device 110 is a receiving (RX) device (or a receiver).
  • RX receiving
  • UL the device 110 is a TX device (or a transmitter) and the device 130 is a RX device (or a receiver).
  • Communications in the communication environment 100 may be implemented according to any proper communication protocol(s), comprising, but not limited to, cellular communication protocols of the first generation (1G), the second generation (2G), the third generation (3G), the fourth generation (4G), the fifth generation (5G), the sixth generation (6G), and the like, wireless local network communication protocols such as Institute for Electrical and Electronics Engineers (IEEE) 802.11 and the like, and/or any other protocols currently known or to be developed in the future.
  • IEEE Institute for Electrical and Electronics Engineers
  • the communication may utilize any proper wireless communication technology, comprising but not limited to: Code Division Multiple Access (CDMA), Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Frequency Division Duplex (FDD), Time Division Duplex (TDD), Multiple-Input Multiple-Output (MIMO), Orthogonal Frequency Division Multiplexing (OFDM), Discrete Fourier Transform spread OFDM (DFT-s-OFDM) and/or any other technologies currently known or to be developed in the future.
  • CDMA Code Division Multiple Access
  • FDMA Frequency Division Multiple Access
  • TDMA Time Division Multiple Access
  • FDD Frequency Division Duplex
  • TDD Time Division Duplex
  • MIMO Multiple-Input Multiple-Output
  • OFDM Orthogonal Frequency Division Multiplexing
  • DFT-s-OFDM Discrete Fourier Transform spread OFDM
  • the AI/ML can be employed in communication systems to improve system performances.
  • the AI/ML approaches may be used in channel state information (CSI) feedback enhancement, such as overhead reduction, improved accuracy, and prediction.
  • CSI channel state information
  • the AI/ML approaches may also be used in beam management, for example, beam prediction and/or spatial-domain prediction for overhead and latency reduction, and beam selection accuracy improvement.
  • the AI/ML approaches may be employed in positioning accuracy enhancements for different scenarios, for example, scenarios with heavy non-line-of-sight (NLOS) conditions.
  • NLOS non-line-of-sight
  • the AI/ML approaches may affect physical (PHY) layer aspects, for example, the AI model lifecycle management, and dataset construction for training, validation and test for the selected use cases. It may also impact use case and collaboration level, such as new signaling, means for training and validation data assistance, assistance information, measurement, and feedback. Further, the AI/ML approaches may also affect capability indication, configuration and control procedures (training/inference), and management of data and the AI/ML model.
  • PHY physical
  • the AI/ML model may be trained with a dataset of training samples. To better train the AI/ML model, a large dataset with accurate ground truth or labels is needed.
  • the term “noisy label” may refer to an inaccurate value for a target parameter instead of the true or actual value at the measurement time.
  • noisy label refers to marking a class other than the true class for a training sample.
  • noisy label means an inaccurate value is reported for the target parameter instead of the true/actual value at the measurement time. Therefore, it is worth studying how to train models using training samples with noisy labels.
  • a first device receives first information from a second device.
  • the first information indicates a first data processing model and a first set of parameters related to the first data processing model.
  • the first device determines position-related data for a second data processing model.
  • the position-related data includes a position-related parameter for the second data processing model and first quality data of the position-related parameter.
  • the first device determines second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters. In this way, it is capable of training a data processing model with one or more noisy labels, thereby reducing effects of the noisy labels and improving the positioning accuracy.
  • FIG. 2A and FIG. 2B illustrate a signaling chart 201 and a signaling chart 202 for communication according to some example embodiments of the present disclosure.
  • the signaling charts 201 and 202 involve a first device 210 and a second device 220.
  • reference is made to FIG. 1 to describe the signaling flow 201 and the signaling flow 202.
  • embodiments of the present disclosure can be applied to any proper scenarios, for example, beam management, positioning accuracy enhancement, or CSI. Only for the purpose of illustrations, embodiments of the present disclosure are described with reference to the scenario of positioning accuracy enhancements.
  • the first device 210 may refer to or include the device 110 or device 130 shown in FIG. 1.
  • the second device 220 may refer to or include the device 120 shown in FIG. 1. It is to be understood that the first device 210 and second device 220 may refer to or include any proper devices, including but not limited to a UE, a PRU, a transmit/receive point (TRP), a gNB, a next generation (NG) radio access network (RAN) (NG-RAN) node, or a network element such as LMF. Scope of the present disclosure is not limited in this regard.
  • although only one first device 210 and one second device 220 are illustrated in FIG. 2, it would be appreciated that there may be a plurality of devices performing similar operations as described with respect to the first device 210 or the second device 220 below.
  • the first device 210 includes the device 110 such as a terminal device. In such cases, the first device 210 may transmit information or signal to the second device 220 via an LTE positioning protocol (LPP) information element (IE), such as an IE in the LPP ProvideLocationInformation. Alternatively, or in addition, in some example embodiments, the first device 210 includes the device 130 such as a network device. In such cases, the first device 210 may transmit information or signal to the second device 220 via an NR positioning protocol annex (NRPPa) IE, such as an IE in NRPPa MeasurementReport.
  • LPP LTE positioning protocol
  • IE information element
  • the first device 210 includes the device 130 such as a network device.
  • the first device 210 may transmit information or signal to the second device 220 via an NR positioning protocol annex (NRPPa) IE, such as an IE in NRPPa MeasurementReport.
  • NRPPa NR positioning protocol annex
  • the second device 220 may determine (2010) which side is to train the data processing model(s).
  • the term “data processing model” used herein may refer to an algorithm or a model that is capable of processing data.
  • the data processing model may be an AI/ML model.
  • the second device 220 may determine to train the data processing model at the first device 210.
  • the second device 220 may determine to train the data processing model by itself.
  • the second device 220 determines (2020) a data processing model (referred to as “first data processing model” hereinafter).
  • the first data processing model may be used for processing noisy labels.
  • the first data processing model may determine how quality label(s) are processed and used in the training phase.
  • quality label or the term “quality data” used herein may refer to a label or information indicating a quality of a training sample. For example, if the quality of the training sample is not below a quality threshold, the quality label of the training sample may be an accurate label. If the quality of the training sample is below the quality threshold, the quality label of the training sample may be a noisy label. It is noted that the terms “quality label” and “quality data” are used interchangeably.
  • quality labels from different sources may be processed in different ways to provide training labels which are used at the training phase.
  • the first data processing model may indicate/use one or more modes for processing noisy labels.
  • the first data processing model may indicate an index of the noisy label processing (NLP) mode.
  • the first data processing model may indicate an ALL mode (referred to as “first mode” hereinafter). In this case, all quality labels are reported without processing.
  • the first data processing model may indicate an average (AVE) mode (referred to as “second mode” hereinafter). In this case, for each training sample, a weighted average over means may be used as the training label for all training epochs.
  • AVE average
  • the first data processing model may indicate a best mode (referred to as “third mode” hereinafter).
  • in this third mode, based on the accuracy of the quality labels for each training sample, the quality label with the highest accuracy may be selected as the training label.
  • a random (RND) mode (referred to as “fourth mode” hereinafter) may be indicated or used.
  • RND random
  • fourth mode for each training sample, one of means of the training data may be randomly selected as the training label for all training epochs. Uniform or non-uniform random selection may be used depending on the sources.
  • the first data processing model may indicate an epoch-random (EPCH-RND) mode (referred to as “fifth mode” hereinafter).
  • EPCH-RND epoch-random
  • in the fifth mode, at each training epoch, one of the quality labels may be randomly selected and used as the training label for each training sample. Also, the quality label selection from different positioning sources may follow a uniform or non-uniform distribution.
  • an epoch-distribution (EPCH-DIST) mode (referred to as “sixth mode” hereinafter) may be indicated or used by the first data processing model. In this case, a training label may be randomly drawn from a probabilistic model fitted to the quality labels.
  • a probabilistic model such as Gaussian mixture model may be considered and calibrated for position of each sample.
  • a random number from the corresponding probabilistic model may be drawn and used as the training label.
  • EPCH-RND mode can be seen as a subset of EPCH-DIST, where the probabilistic model is made of Dirac delta functions.
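  • To make the six modes above concrete, the following Python sketch shows one possible per-sample implementation, assuming each positioning source reports a mean and a variance. The helper names and the inverse-variance weighting used in the AVE mode are illustrative assumptions, not the claimed procedure.

```python
# A minimal sketch of the six noisy-label processing (NLP) modes described
# above, assuming each positioning source reports a (mean, variance)
# estimate. Names such as Source and process_labels, and the use of
# inverse-variance weights in the AVE mode, are illustrative assumptions.
import random
from dataclasses import dataclass

@dataclass
class Source:
    mean: float      # position estimate (1D here for brevity; 2D/3D in practice)
    variance: float  # reported estimation variance

def process_labels(sources, mode, epochs=1, rng=None):
    """Return the training label(s) used at each epoch for one sample."""
    rng = rng or random.Random(0)
    if mode == "ALL":        # first mode: report all labels without processing
        return [[s.mean for s in sources]] * epochs
    if mode == "AVE":        # second mode: weighted average over the means
        w = [1.0 / s.variance for s in sources]
        avg = sum(wi * s.mean for wi, s in zip(w, sources)) / sum(w)
        return [avg] * epochs
    if mode == "BEST":       # third mode: highest-accuracy (lowest-variance) label
        return [min(sources, key=lambda s: s.variance).mean] * epochs
    if mode == "RND":        # fourth mode: one random mean, fixed for all epochs
        return [rng.choice(sources).mean] * epochs
    if mode == "EPCH-RND":   # fifth mode: fresh random label at every epoch
        return [rng.choice(sources).mean for _ in range(epochs)]
    if mode == "EPCH-DIST":  # sixth mode: draw from a per-sample mixture model
        return [rng.gauss(s.mean, s.variance ** 0.5)
                for s in (rng.choice(sources) for _ in range(epochs))]
    raise ValueError(f"unknown NLP mode: {mode}")

# example: three noisy sources, AVE mode
labels = process_labels([Source(1.0, 0.1), Source(1.3, 0.4), Source(0.9, 0.2)], "AVE")
```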
  • the second device 220 may determine (2030) another processing model (referred to as “third data processing model” hereinafter).
  • the third data processing model may be a noisy label loss model.
  • the third data processing model may determine whether a weighted loss function is used.
  • the third data processing model may indicate/use one or more modes for weighting noisy label loss.
  • the third data processing model may indicate/use a no weighted loss function model.
  • all training samples may have the same value/effect for training the data processing model. In this case, no weighting function/mechanism may be used for calculation of loss function.
  • the third data processing model may indicate/utilize a weighted loss function model.
  • a weighted loss function may be used.
  • the loss and, consequently, the gradients may be emphasized for some training samples.
  • conversely, the loss function may be de-emphasized in case of a low LCS for some training samples; a minimal sketch of both options is given below.
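  • As a rough illustration of these two options, the sketch below contrasts an unweighted mean-squared-error loss with a per-sample weighted one. The normalization of LCS values into weights is an assumption made for the example, not a mechanism stated in the disclosure.

```python
# Contrast of the two loss options in PyTorch. The mapping of LCS values
# into normalized per-sample weights is an illustrative assumption.
import torch

def unweighted_loss(pred, label):
    # every training sample has the same value/effect on the gradients
    return torch.mean((pred - label) ** 2)

def weighted_loss(pred, label, lcs):
    # samples with a low LCS are de-emphasized, samples with a high LCS
    # are emphasized; gradients scale with the per-sample weights
    weights = lcs / lcs.sum()
    return torch.sum(weights * (pred - label) ** 2)
```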
  • the second device 220 may determine a type of required measurement.
  • the required type of measurement may need to be recorded.
  • the type of measurement may indicate a radio measurement.
  • the radio measurement may include positioning measurement such as measurement of field NR signals collected by the first device 210 and/or the second device 220.
  • the positioning measurement may include any combination of time, angle of arrival, channel impulse response (CIR), etc.
  • the positioning measurement may be obtained after receiving a positioning signal. Examples of the positioning signal may include but not limited to a DL positioning reference signal (PRS), a UL sounding reference signal (SRS), or a sidelink (SL) positioning reference signal (SL-PRS).
  • the second device 220 may determine a format for reporting an estimation based on the measurement (for example, the estimated position).
  • the format may indicate that means and variance of the estimation need to be reported.
  • the second device 220 may transmit, to the first device 210, further information (referred to as “fourth information” hereinafter) that includes an indication of whether the processing of a first set of quality labels included in a training sample is enabled.
  • the first set of quality labels may refer to original labels of the training data which may be noisy. Details of the first set of quality labels are described later.
  • the fourth information may include a LPP ProvideAssistanceData IE.
  • the second device 220 may trigger the noisy label processing flag (NLPF) (i.e., the above-mentioned indication) in the fourth information.
  • NLPF noisy label processing flag
  • the second device 220 may transmit (2040) the type of radio measurement and the format of reporting (referred to as “fifth information” hereinafter) to the first device 210.
  • the fifth information may indicate a required radio measurement for obtaining training data.
  • the fifth information may include a LPP ProvideAssistanceData IE. It is noted that the fifth information and the fourth information may be transmitted together or separately.
  • the first device 210 may determine whether the first device 210 is capable of providing the required radio measurement.
  • if the required type of radio measurement includes global navigation satellite system (GNSS) and LIDAR measurements (that is, the second device 220 asks for GNSS and LIDAR reports),
  • the first device 210 may determine whether the GNSS and LIDAR sources are available. If the GNSS and LIDAR positioning sources become available, the first device 210 may be capable of providing the GNSS and LIDAR measurements.
  • GNSS global navigation satellite system
  • LIDAR light detection and ranging
  • the first device 210 may transmit (2050) an acknowledgment message (ACK) to the second device 220.
  • the acknowledgment message may indicate capability of providing the required radio measurement.
  • if the second device 220 asks for both GNSS and LIDAR reports, the first device 210 may send the acknowledgment message as soon as both positioning sources become available.
  • the acknowledgment message (ACK) may be a LPP ProvideLocationInformation IE.
  • the second device 220 may request a network device serving the first device 210 to allocate resources for a following report of positioning measurement.
  • the second device 220 transmits (2060) first information to the first device 210. For example, in some example embodiments, based on receiving (2050) the acknowledgment message, the second device 220 transmits (2060) the first information.
  • the first information indicates the first data processing model and the first set of parameters related to the first data processing model.
  • the first set of parameters may include a set of weights for sources of the training data.
  • weights or coefficients for each positioning source (for example, GNSS, RAT-based, and LIDAR)
  • the first device 210 determines (2070) a training sample for a second data processing model.
  • the training sample includes training data for the second data processing model and first quality data of the training data.
  • the training sample may be position-related data.
  • the position-related data may include position-related parameter and first quality data of the position-related parameter.
  • the position-related parameter may be any one or any combination of: a position, a direction of the position, or a coordinate of the position. It is noted that the position-related parameter may include any proper parameters that relate to positioning or directions.
  • the first quality data may include one or more quality labels.
  • each of the quality labels may be related to a positioning source.
  • each of the quality labels may be related to a type of measurement of the positioning source.
  • the term “position-related parameter” may be understood as a variable property related to position and not only understood as a fixed setting.
  • the training sample may be defined as a tuple (a radio measurement, label 1, label 2, . . ., label M), where label M denotes a 2D or 3D position estimation provided by positioning source M and M is an integer number.
  • the first device 210 may determine the radio measurement based on downlink (DL) positioning reference signals (PRSs).
  • DL downlink
  • PRS positioning reference signal
  • the first device 210 may determine the radio measurement based on uplink (UL) sounding reference signal (SRS)-for-positioning.
  • the radio measurement may be one or more of: TOA, AOA, or CIR. It is noted that the radio measurement may include any proper type of measurement.
  • the positioning source may include but not limited to global navigation satellite system (GNSS), radio access technology (RAT), LIDAR, Wi-Fi based positioning, ML based positioning, or the like.
  • the second device 220 may estimate its position based on the radio measurement.
  • the estimation result may include the position mean μ and variance σ².
  • the second device 220 may determine and store a mean and variance for each positioning source.
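  • One possible container for such a training sample, reflecting the tuple (a radio measurement, label 1, ..., label M) and the per-source mean and variance described above, is sketched below; the field names are hypothetical.

```python
# Hypothetical representation of a training sample: a radio measurement
# plus one (mean, variance) position estimate per positioning source M.
from dataclasses import dataclass

@dataclass
class TrainingSample:
    radio_measurement: list[float]     # e.g. TOA/AOA/CIR features
    labels: list[tuple[float, float]]  # (mean, variance) per positioning source

# e.g. two sources (say, GNSS and LIDAR) labeling the same measurement
sample = TrainingSample([0.31, 1.72, 0.05], [(10.2, 0.5), (10.6, 1.1)])
```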
  • the first device 210 determines (2080) second quality data of the training data by processing the first quality data based on the first data processing model and the first set of parameters.
  • the second quality data may refer to labels that have been improved compared with the first set of quality labels.
  • the second set of quality labels may be less noisy than the first set of quality labels.
  • the second set of quality labels may include one or more quality labels.
  • the first device 210 may obtain the second set of quality labels by aggregating or fusing the first set of quality labels based on the first data processing model and the first set of parameters.
  • the positioning estimation(s) may be processed to prepare the second set of quality labels, based on the first data processing model, the first set of parameters and estimated position mean(s) and variance(s). In this way, the quality labels may be more reliable, and the training of the data processing model can also be improved. Further, the positioning accuracy may also be improved.
  • the first data processing model may indicate one or more of the first, second, third, fourth, fifth, or sixth modes.
  • if the first mode is indicated, the first device 210 may not process the first set of quality labels.
  • if the second mode is indicated, for each training sample, the first device 210 may use the weighted average over positioning means as the second set of quality labels for all training epochs.
  • if the third mode is indicated, the most reliable label may be selected from the first set of quality labels. In this case, the most reliable label may be the second set of quality labels.
  • if the fourth mode is indicated, the first device 210 may randomly select one of the positioning means as the second set of quality labels for all training epochs.
  • if the fifth mode is indicated, the first device 210 may randomly select and use one of the first set of quality labels as the one label in the second set of quality labels for each training sample.
  • if the sixth mode is indicated, a random number from the corresponding probabilistic model may be drawn and used as the second set of quality labels.
  • the first device 210 may transmit (2100) second information to the second device 220.
  • the second information may include the second set of quality labels.
  • the second information may include the first set of quality labels.
  • the second device 220 may store (2110) the training sample and the second set of quality labels.
  • the second device 220 may train (2120) the second data processing model using the training data and the second set of quality labels.
  • the second data processing model may also be trained based on the third data processing model and a second set of parameters related to the third data processing model.
  • the second set of parameters may include coefficients for calculation of weights used in the loss function based on the accuracy and consistency of the second set of quality labels.
  • the training of the second data processing model can be improved.
  • the data processing model may be updated in time.
  • the first information may also indicate the third data processing model for loss function and the second set of parameters related to the third data processing model.
  • the first device 210 may store (2200) the training sample and the second set of quality labels.
  • the first device 210 may train (2210) the second processing model using the training data and the second set of quality labels.
  • the first device 210 may determine the weights of the loss function for the second set of labels based on the third data processing model and the second set of parameters.
  • the second data processing model may be trained based on the training data, the second set of quality labels, and the weights.
  • the second set of parameters may include coefficients for calculation of weights used in the loss function based on the accuracy and consistency of the second set of quality labels.
  • the first device 210 may transmit (2220) third information to the second device 220.
  • the third information may include quality information of the second data processing model trained using the training data and the second set of quality labels. Only as an example, the third information may indicate that accuracy of the second data processing model is 90%.
  • the training of the second data processing model can be improved. Moreover, it can reduce the burden of training the data processing model at terminal device side.
  • the first set of quality labels may only include one quality label.
  • the first device may determine weighting information of the training sample based on a variance of an estimation obtained for a source associated with the training data.
  • the weighting information indicates whether to emphasize or de-emphasize the effect of the training sample on the second data processing model.
  • the variance of position estimation may be used to emphasize or de-emphasize the effects of each training sample on the training results of the second data processing model.
  • the first device 210 may train, using the training sample, the second data processing model based on the weighting information of the training sample.
  • the first device 210 may transmit the weighting information of the training sample to the second device 220.
  • the second device 220 may train, using the training sample, the second data processing model based on the weighting information of the training sample.
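  • A minimal sketch of this single-label weighting is given below; the exponential mapping from variance to weight is an assumed choice, shown only to illustrate how a low-variance estimate emphasizes a sample and a high-variance estimate de-emphasizes it.

```python
# Assumed variance-to-weight mapping for the single-quality-label case:
# low estimation variance  -> weight near 1 (sample emphasized),
# high estimation variance -> weight near 0 (sample de-emphasized).
import math

def sample_weight(variance: float, scale: float = 1.0) -> float:
    return math.exp(-scale * variance)

print(sample_weight(0.1), sample_weight(5.0))  # ~0.905 vs ~0.007
```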
  • training samples with one or several sources of noisy labels can be pre-processed before being used in the training procedure of the AI/ML model.
  • embodiments of the present disclosure are capable of training the AI/ML model with one or more noisy labels. Further, they also provide the possibility of considering the variance or accuracy of noisy labels in the training procedure of the AI/ML model. Additionally, they also achieve emphasizing or de-emphasizing a specific source of positioning in the training process of the AI/ML model.
  • FIG. 3AA and FIG. 3AB illustrate the probability density function (PDF) of the measured noisy labels from the three sources 311, 312 and 313. A Gaussian distribution with the measured mean and variance is assumed for each positioning source. In the example of FIG. 3AA and FIG. 3AB, there are three sources that provide noisy labels, each with its own mean and variance.
  • NLP-mode AVE: an average of all three positions is calculated and reported, to be used for all the training epochs. As shown in FIG. 3BA and FIG. 3BB, the processed and reported training label using the AVE mode is marked with 320.
  • NLP-mode BEST: the label from the first positioning source is selected as the training label.
  • the processed and reported training label using the BEST mode is marked with 330.
  • NLP-mode RND: one of the noisy labels is selected randomly and used for all the training epochs. For example, as shown in FIG. 3DA and FIG. 3DB, the third positioning source is selected and used for all the training epochs.
  • the processed and reported training label using the RND mode is marked with 340.
  • NLP-mode EPCH-DIST: a Gaussian mixture model (GMM) may be fitted to the measured noisy labels, as shown in FIG. 3EA and FIG. 3EB. Considering 30 epochs of training, 30 random numbers are drawn from the GMM. As shown in FIG. 3EA and FIG. 3EB, the processed and reported training labels for the 30 training epochs using the EPCH-DIST mode are marked with 350.
  • GMM Gaussian mixture model
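  • The EPCH-DIST example above can be reproduced roughly as follows, fitting a GMM with scikit-learn and drawing one label per training epoch; the source means, variances, and sample counts are invented for illustration.

```python
# Rough reproduction of the EPCH-DIST example: fit a GMM to noisy 2D
# labels from three sources, then draw one training label per epoch.
# Source parameters and sample counts are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
noisy_labels = np.vstack([
    rng.normal(loc=mean, scale=std, size=(50, 2))
    for mean, std in [((0.0, 0.0), 0.3), ((1.0, 0.5), 0.6), ((0.5, 1.0), 0.4)]
])
gmm = GaussianMixture(n_components=3, random_state=0).fit(noisy_labels)
epoch_labels, _ = gmm.sample(n_samples=30)  # 30 epochs -> 30 drawn labels
```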
  • FIG. 4 illustrates a flowchart of a method 400 implemented at a first apparatus according to some example embodiments of the present disclosure.
  • the first apparatus may include a terminal device or a network device.
  • the first apparatus receives first information from a second apparatus.
  • the first information indicates a first data processing model and a first set of parameters related to the first data processing model.
  • the first set of parameters comprises a set of weights for sources of the training data.
  • the first data processing model indicates at least one of: a first mode where quality labels are reported without processing, a second mode where a weighted average of means of the training data is used for the processing of the first set of quality labels, a third mode where a best quality label from the first set of quality labels is selected for the processing of the first set of quality labels, a fourth mode where one of the means of the training data is randomly selected for the processing of the first set of quality labels, a fifth mode where one quality label from the first set of quality labels is randomly selected for the processing of the first set of quality labels, or a sixth mode where a random number drawn from a probabilistic model is used for the processing of the first set of quality labels.
  • the first apparatus determines position-related data for a second data processing model.
  • the position-related data comprises a position-related parameter for the second data processing model and first quality data of the position-related parameter.
  • the first apparatus determines second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
  • the first apparatus may transmit, to the second apparatus, second information comprising the second set of quality labels.
  • the first information further indicates a third data processing model for loss function and a second set of parameters related to the third data processing model.
  • the first apparatus may store the position-related data and the second quality data. In some example embodiments, the first apparatus may determine the weights of the loss function for the second quality data based on the third data processing model and the second set of parameters, and may train the second data processing model using the position-related parameter and the second quality data.
  • the first apparatus may transmit, to the second apparatus, third information comprising quality information of the second data processing model trained using the position-related parameter and the second quality data.
  • the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
  • the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
  • the first apparatus may receive, from the second apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled. In some example embodiments, the first apparatus may receive, from the second apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter. In some example embodiments, the first apparatus may transmit, to the second apparatus, an acknowledgment message indicating capability of the required radio measurement.
  • the first quality data comprises one quality label.
  • the first apparatus may determine weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model.
  • the first apparatus may train, using the position-related data, the second data processing model based on the weighting information of the position-related data.
  • the first apparatus may transmit, to the second apparatus, the weighting information of the position-related data.
  • FIG. 5 illustrates a flowchart of a method 500 implemented at a second apparatus according to some example embodiments of the present disclosure.
  • the second apparatus may include a core network device.
  • the second apparatus determines a first data processing model and a first set of parameters related to the first data processing model.
  • the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data.
  • the first set of parameters comprises a set of weights for sources of the position-related parameter.
  • the first data processing model indicates at least one of: a first mode where quality labels are reported without processing, a second mode where a weighted average of means of the position-related parameter is used for the processing of the first set of quality labels, a third mode where a best quality label from the first set of quality labels is selected for the processing of the first set of quality labels, a fourth mode where one of the means of the position-related parameter is randomly selected for the processing of the first set of quality labels, a fifth mode where one quality label from the first set of quality labels is randomly selected for the processing of the first set of quality labels, or a sixth mode where a random number drawn from a probabilistic model is used for the processing of the first set of quality labels.
  • the second apparatus transmits, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
  • the second apparatus may determine a third data processing model for loss function and a second set of parameters related to the third data processing model.
  • the second apparatus may transmit, to the first apparatus, the first information that further indicates a third data processing model for loss function and a second set of parameters related to the third data processing model.
  • the second apparatus may receive, from the first apparatus, second information comprising the second quality data.
  • the second apparatus may store the position-related data and the second quality data.
  • the second apparatus may determine weights for the second quality data based on the third data processing model and the second set of parameters.
  • the second apparatus may train the second data processing model based on the position-related parameter, the second quality data, and the weights.
  • the second apparatus may receive, from the first apparatus, third information comprising quality information of a trained second data processing model.
  • the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
  • the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
  • the second apparatus may transmit, to the first apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled.
  • the second apparatus may transmit, to the first apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter.
  • the second apparatus may receive, from the first apparatus, an acknowledgment message indicating capability of the required radio measurement.
  • the first quality data comprises one quality label.
  • the second apparatus may determine weighting information of the training sample based on a variance of an estimation obtained for a source associated with the position-related parameter.
  • the weighting information may indicate whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model.
  • the second apparatus may receive the weighting information from the first apparatus. In some example embodiments, the second apparatus may train, using the position-related data, the second data processing model based on the weighting information of the position-related data.
  • a first apparatus capable of performing the method 400 may comprise means for performing the respective operations of the method 400.
  • the means may be implemented in any suitable form.
  • the means may be implemented in a circuitry or software module.
  • the first apparatus may be implemented as or included in the device 110 or the device 130 in FIG. 1.
  • the first apparatus comprises means for receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; means for determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and means for determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
  • the first apparatus comprises means for transmitting, to the second apparatus, second information comprising the second quality data.
  • the first information further indicates a third data processing model for loss function and a second set of parameters related to the third data processing model.
  • the first apparatus comprises means for storing the position-related data and the second quality data; means for determining weights for the second quality data based on the third data processing model and the second set of parameters; and means for training the second data processing model based on the position-related parameter, the second quality data, and the weights.
  • the first apparatus comprises means for transmitting, to the second apparatus, third information comprising quality information of the second data processing model trained using the position-related parameter and the second quality data.
  • the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
  • the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
  • the first apparatus comprises means for receiving, from the second apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled.
  • the first apparatus comprises means for receiving, from the second apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter; and means for transmitting, to the second apparatus, an acknowledgment message indicating capability of the required radio measurement.
  • the first set of parameters comprises a set of weights for sources of the position-related parameter.
  • the first quality data comprises one quality label.
  • the first apparatus comprises means for determining weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model.
  • the first apparatus comprises means for training, using the position-related data, the second data processing model based on the weighting information of the position-related data; or means for transmitting, to the second apparatus, the weighting information of the position-related data.
  • the first apparatus comprises a terminal device or a network device.
  • the second apparatus comprises a core network device.
  • a second apparatus capable of performing any of the method 500 may comprise means for performing the respective operations of the method 500.
  • the means may be implemented in any suitable form.
  • the means may be implemented in a circuitry or software module.
  • the second apparatus may be implemented as or included in the device 120 in FIG. 1.
  • the second apparatus comprises means for determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and means for transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
  • the second apparatus comprises means for receiving, from the first apparatus, second information comprising the second quality data.
  • the second apparatus comprises means for storing the position-related data and the second quality data; and means for training, using the position-related parameter and the second quality data, the second data processing model based on a third data processing model for a loss function and a second set of parameters related to the third data processing model.
  • the second apparatus comprises means for determining a third data processing model for a loss function and a second set of parameters related to the third data processing model.
  • the means for transmitting the first information comprises: means for transmitting, to the first apparatus, the first information that further indicates a third data processing model for a loss function and a second set of parameters related to the third data processing model.
  • the second apparatus comprises means for receiving, from the first apparatus, third information comprising quality information of the second data processing model trained using the position-related parameter and the second quality data.
  • the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
  • the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
  • the second apparatus comprises means for transmitting, to the first apparatus, fourth information comprising an indication of whether the processing of the first quality data is enabled.
  • the second apparatus comprises means for transmitting, to the first apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter; and means for receiving, from the first apparatus, an acknowledgment message indicating a capability of performing the required radio measurement.
  • the first set of parameters comprises a set of weights for sources of the position-related parameter.
  • the first quality data comprises one quality label.
  • the second apparatus comprises means for determining weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model; or means for receiving the weighting information from the first apparatus.
  • the second apparatus comprises means for training, using the position-related data, the second data processing model based on the weighting information of the position-related data.
  • FIG. 6 is a simplified block diagram of a device 600 that is suitable for implementing example embodiments of the present disclosure.
  • the device 600 may be provided to implement a communication device, for example, the device 110, the device 120, or the device 130 in FIG. 1.
  • the device 600 includes one or more processors 610, one or more memories 620 coupled to the processor 610, and one or more communication modules 640 coupled to the processor 610.
  • the communication module 640 is for bidirectional communications.
  • the communication module 640 has one or more communication interfaces to facilitate communication with one or more other modules or devices.
  • the communication interfaces may represent any interface that is necessary for communication with other network elements.
  • the communication module 640 may include at least one antenna.
  • the processor 610 may be of any type suitable to the local technical network and may include one or more of the following: general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples.
  • the device 600 may have multiple processors, such as an application specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
  • the memory 620 may include one or more non-volatile memories and one or more volatile memories.
  • the non-volatile memories include, but are not limited to, a Read Only Memory (ROM) 624, an electrically programmable read only memory (EPROM), a flash memory, a hard disk, a compact disc (CD), a digital video disk (DVD), an optical disk, a laser disk, and other magnetic storage and/or optical storage.
  • the volatile memories include, but are not limited to, a random access memory (RAM) 622 and other volatile memories that do not retain data during a power-down duration.
  • a computer program 630 includes computer executable instructions that are executed by the associated processor 610.
  • the instructions of the program 630 may include instructions for performing operations/acts of some example embodiments of the present disclosure.
  • the program 630 may be stored in the memory, e.g., the ROM 624.
  • the processor 610 may perform any suitable actions and processing by loading the program 630 into the RAM 622.
  • the example embodiments of the present disclosure may be implemented by means of the program 630 so that the device 600 may perform any process of the disclosure as discussed with reference to FIG. 2A-FIG. 5.
  • the example embodiments of the present disclosure may also be implemented by hardware or by a combination of software and hardware.
  • the program 630 may be tangibly contained in a computer readable medium which may be included in the device 600 (such as in the memory 620) or other storage devices that are accessible by the device 600.
  • the device 600 may load the program 630 from the computer readable medium to the RAM 622 for execution.
  • the computer readable medium may include any types of non-transitory storage medium, such as ROM, EPROM, a flash memory, a hard disk, CD, DVD, and the like.
  • non-transitory is a limitation of the medium itself (i.e., tangible, not a signal) as opposed to a limitation on data storage persistency (e.g., RAM vs. ROM).
  • FIG. 7 shows an example of the computer readable medium 700, which may be in the form of a CD, DVD or other optical storage disk.
  • the computer readable medium 700 has the program 630 stored thereon.
  • various embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representations, it is to be understood that the block, apparatus, system, technique or method described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • Some example embodiments of the present disclosure also provide at least one computer program product tangibly stored on a computer readable medium, such as a non-transitory computer readable medium.
  • the computer program product includes computer-executable instructions, such as those included in program modules, being executed in a device on a target physical or virtual processor, to carry out any of the methods as described above.
  • program modules include routines, programs, libraries, objects, classes, components, data structures, or the like that perform particular tasks or implement particular abstract data types.
  • the functionality of the program modules may be combined or split between program modules as desired in various embodiments.
  • Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules may be located in both local and remote storage media.
  • Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages.
  • the program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
  • the computer program code or related data may be carried by any suitable carrier to enable the device, apparatus or processor to perform various processes and operations as described above.
  • Examples of the carrier include a signal, computer readable medium, and the like.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • NWDAF Network Data Analytics Function
  • NLOS Non-Line-of-Sight

Abstract

According to some example embodiments of the present disclosure, there is provided a solution for improving labels of training samples. In this solution, a first device receives first information from a second device. The first information indicates a first data processing model and a first set of parameters related to the first data processing model. The first device determines position-related data for a second data processing model. The position-related data includes a position-related parameter for the second data processing model and first quality data of the position-related parameter. The first device determines second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters. In this way, a data processing model can be trained with one or more noisy labels, thereby reducing the effects of the noisy labels and improving the positioning accuracy.

Description

PROCESSING FRAMEWORK FOR NOISY LABELS
RELATED APPLICATION
[0001] This application claims priority to FI Application No. 20235184 filed February 16, 2023, which is incorporated herein by reference in its entirety.
FIELD
[0002] Various example embodiments of the present disclosure generally relate to the field of telecommunication and in particular, to methods, devices, apparatuses and computer readable storage medium for processing framework of noisy labels.
BACKGROUND
[0003] In the telecommunication industry, Artificial Intelligence/Machine Learning (AI/ML) models have been employed to improve the performance of telecommunication systems. For example, AI/ML models have been employed for positioning of devices in a communication network. In this case, a large dataset of training samples may be used to train the AI/ML models to improve the positioning accuracy. However, the positioning information of dataset samples used as labels may not be fully accurate. Therefore, it is worth studying how to process noisy labels of training samples for machine learning training.
SUMMARY
[0004] In a first aspect of the present disclosure, there is provided a first device. The first device comprises at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the first device at least to perform: receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
[0005] In a second aspect of the present disclosure, there is provided a second device. The second device comprises at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the second device at least to perform: determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
[0006] In a third aspect of the present disclosure, there is provided a method. The method comprises: receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
[0007] In a fourth aspect of the present disclosure, there is provided a method. The method comprises: determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
[0008] In a fifth aspect of the present disclosure, there is provided a first apparatus. The first apparatus comprises means for receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; means for determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and means for determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
[0009] In a sixth aspect of the present disclosure, there is provided a second apparatus. The second apparatus comprises means for determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and means for transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
[0010] In a seventh aspect of the present disclosure, there is provided a computer readable medium. The computer readable medium comprises instructions stored thereon for causing an apparatus to perform at least the method according to the third aspect.
[0011] In an eighth aspect of the present disclosure, there is provided a computer readable medium. The computer readable medium comprises instructions stored thereon for causing an apparatus to perform at least the method according to the fourth aspect.
[0012] It is to be understood that the Summary section is not intended to identify key or essential features of embodiments of the present disclosure, nor is it intended to be used to limit the scope of the present disclosure. Other features of the present disclosure will become easily comprehensible through the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] Some example embodiments will now be described with reference to the accompanying drawings, where:
[0014] FIG. 1 illustrates an example communication environment in which example embodiments of the present disclosure can be implemented;
[0015] FIG. 2A illustrates a signaling chart for communication according to some example embodiments of the present disclosure;
[0016] FIG. 2B illustrates a signaling chart for communication according to some example embodiments of the present disclosure;
[0017] FIGS. 3AA-3EB illustrate example probability density functions (PDFs) for noisy sources of samples according to some example embodiments of the present disclosure;
[0018] FIG. 4 illustrates a flowchart of a method implemented at a first device according to some example embodiments of the present disclosure;
[0019] FIG. 5 illustrates a flowchart of a method implemented at a second device according to some example embodiments of the present disclosure;
[0020] FIG. 6 illustrates a simplified block diagram of a device that is suitable for implementing example embodiments of the present disclosure; and
[0021] FIG. 7 illustrates a block diagram of an example computer readable medium in accordance with some example embodiments of the present disclosure.
[0022] Throughout the drawings, the same or similar reference numerals represent the same or similar element.
DETAILED DESCRIPTION
[0023] Principles of the present disclosure will now be described with reference to some example embodiments. It is to be understood that these embodiments are described only for the purpose of illustration and to help those skilled in the art understand and implement the present disclosure, without suggesting any limitation as to the scope of the disclosure. The disclosure described herein can be implemented in various manners other than the ones described below.
[0024] In the following description and claims, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
[0025] References in the present disclosure to “one embodiment,” “an embodiment,” “an example embodiment,” and the like indicate that the embodiment described may include a particular feature, structure, or characteristic, but it is not necessary that every embodiment includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an example embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
[0026] It shall be understood that although the terms “first” and “second” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish functionalities of various elements. As used herein, the term “and/or” includes any and all combinations of one or more of the listed terms.
[0027] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises”, “comprising”, “has”, “having”, “includes” and/or “including”, when used herein, specify the presence of stated features, elements, and/or components etc., but do not preclude the presence or addition of one or more other features, elements, components and/ or combinations thereof.
[0028] As used herein, “at least one of the following: <a list of two or more elements>” and “at least one of <a list of two or more elements>” and similar wording, where the list of two or more elements are joined by “and” or “or”, means at least any one of the elements, or at least any two or more of the elements, or at least all the elements.
[0029] As used herein, unless stated explicitly, performing a step “in response to A” does not indicate that the step is performed immediately after “A” occurs and one or more intervening steps may be included.
[0031] As used in this application, the term “circuitry” may refer to one or more or all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and
(b) combinations of hardware circuits and software, such as (as applicable):
(i) a combination of analog and/or digital hardware circuit(s) with software/firmware and
(ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions) and
(c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
[0032] This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or a portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
[0033] As used herein, the term “communication network” refers to a network following any suitable communication standards, such as fifth generation (5G) systems, Long Term Evolution (LTE), LTE-Advanced (LTE-A), Wideband Code Division Multiple Access (WCDMA), High-Speed Packet Access (HSPA), Narrow Band Internet of Things (NB-IoT) and so on. Furthermore, the communications between a terminal device and a network device in the communication network may be performed according to any suitable generation communication protocols, including, but not limited to, the first generation (1G), the second generation (2G), 2.5G, 2.75G, the third generation (3G), the fourth generation (4G), 4.5G, the fifth generation (5G) new radio (NR) communication protocols, and/or any other protocols either currently known or to be developed in the future. Embodiments of the present disclosure may be applied in various communication systems. Given the rapid development in communications, there will of course also be future types of communication technologies and systems with which the present disclosure may be embodied. The scope of the present disclosure should not be seen as limited to only the aforementioned systems.
[0034] As used herein, the term “network device” refers to a node in a communication network via which a terminal device accesses the network and receives services therefrom. The network device may refer to a base station (BS) or an access point (AP), for example, a node B (NodeB or NB), an evolved NodeB (eNodeB or eNB), a Next Generation NodeB (NR NB), a Remote Radio Unit (RRU), a radio header (RH), a remote radio head (RRH), an Integrated Access and Backhaul (IAB) node, a relay, a low power node such as a femto, a pico, and so forth, depending on the applied terminology and technology. The network device is allowed to be defined as part of a gNB, such as for example in a CU/DU split, in which case the network device is defined to be either a gNB-CU or a gNB-DU.
[0035] The term “terminal device” refers to any end device that may be capable of wireless communication. By way of example rather than limitation, a terminal device may also be referred to as a communication device, user equipment (UE), a Subscriber Station (SS), a Portable Subscriber Station, a Mobile Station (MS), or an Access Terminal (AT). The terminal device may include, but is not limited to, a mobile phone, a cellular phone, a smart phone, a voice over IP (VoIP) phone, a wireless local loop phone, a tablet, a wearable terminal device, a personal digital assistant (PDA), a portable computer, a desktop computer, image capture terminal devices such as digital cameras, gaming terminal devices, music storage and playback appliances, vehicle-mounted wireless terminal devices, wireless endpoints, mobile stations, laptop-embedded equipment (LEE), laptop-mounted equipment (LME), USB dongles, smart devices, wireless customer-premises equipment (CPE), an Internet of Things (IoT) device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications (e.g., remote surgery), an industrial device and applications (e.g., a robot and/or other wireless devices operating in industrial and/or automated processing chain contexts), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, and the like. The terminal device may also correspond to the Mobile Termination (MT) part of an integrated access and backhaul (IAB) node (a.k.a. a relay node). In the following description, the terms “terminal device”, “communication device”, “terminal”, “user equipment” and “UE” may be used interchangeably.
[0036] Although functionalities described herein can be performed, in various example embodiments, in a fixed and/or a wireless network node, in other example embodiments, functionalities may be implemented in a user equipment apparatus (such as a cell phone or tablet computer or laptop computer or desktop computer or mobile IoT device or fixed IoT device). This user equipment apparatus can, for example, be furnished with corresponding capabilities as described in connection with the fixed and/or the wireless network node(s), as appropriate. The user equipment apparatus may be the user equipment and/or a control device, such as a chipset or processor, configured to control the user equipment when installed therein. Examples of such functionalities include the bootstrapping server function and/or the home subscriber server, which may be implemented in the user equipment apparatus by providing the user equipment apparatus with software configured to cause the user equipment apparatus to perform from the point of view of these functions/nodes.
Example Environment
[0037] FIG. 1 illustrates an example communication environment 100 in which example embodiments of the present disclosure can be implemented. The communication environment 100 includes a device 110-1, a device 110-2, a device 110-3, . . . , and a device 110-N, which can be collectively referred to as “device(s) 110.” The communication environment also includes a device 120 and a device 130. The device(s) 110, the device 120 and the device 130 can communicate with each other.
[0038] In the example of FIG. 1, the device 110 may include a terminal device and the device 130 may include a network device serving the terminal device. The device 120 may include a core network device. For example, the device 120 may include a device on which a location management function (LMF) can be implemented.
[0039] In some example embodiments, a central ML unit (also referred to as a “central unit”) may be located within the communication environment 100. For example, the central ML unit may be part of the LMF implemented on the device 120. The central ML unit trains the AI/ML model for positioning by using training samples. The central ML unit may be any suitable unit for data analysis, including but not limited to a 5G network data analytics function (NWDAF).
[0040] In some example embodiments, the central ML unit may collect training samples from a set of data collection devices deployed in certain locations. The data collection device may include a positioning reference unit (PRU) or any other suitable data collection device. The PRUs are reference units, such as devices or network nodes, at known locations (that is, having label information). PRUs may take measurements to generate correction data used for refining the location of other target devices in the area.
[0041] In some example embodiments, the device 110, the device 120 and/or the device 130 may act as the data collection device. For example, the device 110, the device 120 and/or the device 130 may provide positioning measurements or estimations, in addition to its/their own position(s), via a radio access network (RAN) or non-RAN. The positioning information provided by the device 110, the device 120 and/or the device 130 is collected in the communication environment 100 and thus may be used to analyze the propagation properties of the communication environment 100.
[0042] The central ML unit may use a training dataset, which may include positioning measurements from different PRUs, to train a localization ML framework. The trained ML framework may be deployed at network entities running ML processes and/or algorithms. Such entities may be referred to as host types. Host types carrying out ML processes can be a target device to be positioned, and potentially the radio access network (e.g., the network device and/or the LMF), to obtain the positioning accuracy.
[0043] It is to be understood that the number of devices and their connections shown in FIG. 1 are only for the purpose of illustration without suggesting any limitation. The communication environment 100 may include any suitable number of devices configured to implement example embodiments of the present disclosure. Although not shown, it would be appreciated that one or more additional devices may be located in the cell of the device 130, and one or more additional cells may be deployed in the communication environment 100. It is noted that although illustrated as a network device, the device 130 may be a device other than a network device. Although illustrated as a terminal device, the device 110 may be a device other than a terminal device.
[0044] In the following, for the purpose of illustration, some example embodiments are described with the device 110 operating as a terminal device and the device 130 operating as a network device. However, in some example embodiments, operations described in connection with a terminal device may be implemented at a network device or other device, and operations described in connection with a network device may be implemented at a terminal device or other device.
[0045] In some example embodiments, if the device 110 is a terminal device and the device 130 is a network device, a link from the device 130 to the device 110 is referred to as a downlink (DL), while a link from the device 110 to the device 130 is referred to as an uplink (UL). In DL, the device 130 is a transmitting (TX) device (or a transmitter) and the device 110 is a receiving (RX) device (or a receiver). In UL, the device 110 is a TX device (or a transmitter) and the device 130 is a RX device (or a receiver).
[0046] Communications in the communication environment 100 may be implemented according to any proper communication protocol(s), comprising, but not limited to, cellular communication protocols of the first generation (1G), the second generation (2G), the third generation (3G), the fourth generation (4G), the fifth generation (5G), the sixth generation (6G), and the like, wireless local network communication protocols such as Institute for Electrical and Electronics Engineers (IEEE) 802.11 and the like, and/or any other protocols currently known or to be developed in the future. Moreover, the communication may utilize any proper wireless communication technology, comprising but not limited to: Code Division Multiple Access (CDMA), Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Frequency Division Duplex (FDD), Time Division Duplex (TDD), Multiple-Input Multiple-Output (MIMO), Orthogonal Frequency Division Multiple (OFDM), Discrete Fourier Transform spread OFDM (DFT-s-OFDM) and/or any other technologies currently known or to be developed in the future.
[0047] As mentioned above, AI/ML can be employed in communication systems to improve system performance. For example, AI/ML approaches may be used in channel state information (CSI) feedback enhancement, such as overhead reduction, improved accuracy, and prediction. The AI/ML approaches may also be used in beam management, for example, beam prediction in the time and/or spatial domain for overhead and latency reduction, and beam selection accuracy improvement. In addition, the AI/ML approaches may be employed in positioning accuracy enhancements for different scenarios, for example, scenarios with heavy non-line-of-sight (NLOS) conditions.
[0048] Moreover, the AI/ML approaches may affect physical (PHY) layer aspects, for example, AI model lifecycle management and dataset construction for training, validation and test for the selected use cases. They may also impact the use case and collaboration level, such as new signaling, means for training and validation data assistance, assistance information, measurement, and feedback. Further, the AI/ML approaches may also affect capability indication, configuration and control procedures (training/inference), and management of data and AI/ML models.
[0049] The AI/ML model may be trained with a dataset of training samples. To better train the AI/ML model, a large dataset with accurate ground truth or labels is needed. The term “noisy label” may refer to an inaccurate value for a target parameter instead of the true or actual value at the measurement time. However, in many applications, it is difficult to obtain accurate labels (also referred to as golden labels) for the training samples. For example, in classification problems, a noisy label refers to marking a class other than the true class for a training sample. In regression problems, a noisy label means that an inaccurate value is reported for the target parameter instead of the true/actual value at the measurement time. Therefore, it is worth studying how to train models using training samples with noisy labels.
[0050] In one approach, training a classifier/regressor that is robust to noisy labels has been investigated in several studies, including updating the loss function with a neighbor consistency regularization. In some applications, there are several sources that provide (noisy) labels for each training sample. Majority voting is a popular technique to provide labels in a classification problem. An approach has also been proposed to refine a label aggregator (besides the classifier) with the help of a small, trusted dataset with golden labels. In another track, a sample selection framework has been proposed, which is able to identify samples with noisy labels and provide a clean dataset. For training a regressor, averaging over the multiple noisy labels is the most common way of reducing the effects of noisy labels. However, all the introduced solutions assume that at most one noisy label is available, and none of them proposes an approach to deal with several noisy labels (from different labelling sources) per sample.
[0051] Supervised ML training needs true (golden) labels to properly train the ML model weights. For the AI/ML-based positioning task, it is likely that one or several noisy labels, instead of the golden positioning label, would be available from different positioning sources. As there is no straightforward way of training an AI/ML model with noisy labels, exploiting the noisy labels in the training procedure is an open problem. Based on the quality of the measured signal/channel, each source may also report the variance of its positioning estimation. This extra information may also be used in the training process.
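For concreteness, the two conventional baselines mentioned above (majority voting for classification, plain averaging for regression) can be sketched in a few lines of Python. This is a minimal illustration only; the function names and data values are assumptions made for the example and are not part of the disclosed embodiments.

```python
import numpy as np

def majority_vote(class_labels):
    """Classification case: aggregate class labels from several sources
    by majority voting."""
    values, counts = np.unique(np.asarray(class_labels), return_counts=True)
    return values[np.argmax(counts)]

def average_label(position_labels):
    """Regression case: reduce several noisy position labels to one
    training label by plain averaging."""
    return np.mean(np.asarray(position_labels, dtype=float), axis=0)

# Three hypothetical positioning sources report noisy 2D positions for one sample.
labels = [[10.2, 4.9], [9.8, 5.1], [10.4, 5.3]]
print(average_label(labels))     # -> approximately [10.13, 5.1]
print(majority_vote([1, 2, 1]))  # -> 1
```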
Work Principle and Example Signaling for Communication
[0052] As discussed above, it is challenging to manage the training procedure and provide actual training labels based on the available noisy labels. For example, to train an ML-enabled positioning framework in an environment, a training dataset including several samples measured in the environment is needed. A training sample is made of the measurements of field NR signals and the position of the target device at the time of measurement. However, as the obtained positions used as labels are typically inaccurate (since they are most often estimated, and not known in advance), we call them noisy labels. Also, for a sample, it may be possible to obtain noisy labels from several sources. Therefore, embodiments of the present disclosure propose a label aggregation framework, where training samples with one or several sources of noisy labels are pre-processed before being used in the data collection and/or training procedure.
[0053] According to some example embodiments of the present disclosure, there is provided a solution for improving labels of training samples. In this solution, a first device receives first information from a second device. The first information indicates a first data processing model and a first set of parameters related to the first data processing model. The first device determines position-related data for a second data processing model. The position-related data includes a position-related parameter for the second data processing model and first quality data of the position-related parameter. The first device determines second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters. In this way, a data processing model can be trained with one or more noisy labels, thereby reducing the effects of the noisy labels and improving the positioning accuracy.
[0054] Example embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings.
[0055] Reference is now made to FIG. 2A and FIG. 2B, which illustrate a signaling chart 201 and a signaling chart 202 for communication according to some example embodiments of the present disclosure. As shown in FIG. 2A and FIG. 2B, the signaling charts 201 and 202 involve a first device 210 and a second device 220. For the purpose of discussion, reference is made to FIG. 1 to describe the signaling flow 201 and the signaling flow 202. It is noted that embodiments of the present disclosure can be applied to any proper scenarios, for example, beam management, positioning accuracy enhancement, or CSI. Only for the purpose of illustration, embodiments of the present disclosure are described with reference to the scenario of positioning accuracy enhancements.
[0056] In some example embodiments, the first device 210 may refer to or include the device 110 or the device 130 shown in FIG. 1. The second device 220 may refer to or include the device 120 shown in FIG. 1. It is to be understood that the first device 210 and the second device 220 may refer to or include any proper devices, including but not limited to a UE, a PRU, a transmit/receive point (TRP), a gNB, a next generation (NG) radio access network (RAN) (NG-RAN) node, or a network element such as an LMF. The scope of the present disclosure is not limited in this regard.
[0057] Although one first device 210 and one second device 220 are illustrated in FIG. 2A and FIG. 2B, it would be appreciated that there may be a plurality of devices performing similar operations as described with respect to the first device 210 or the second device 220 below.
[0058] In some example embodiments, the first device 210 includes the device 110, such as a terminal device. In such cases, the first device 210 may transmit information or signals to the second device 220 via an LTE positioning protocol (LPP) information element (IE), such as an IE in the LPP ProvideLocationInformation. Alternatively, or in addition, in some example embodiments, the first device 210 includes the device 130, such as a network device. In such cases, the first device 210 may transmit information or signals to the second device 220 via an NR positioning protocol annex (NRPPa) IE, such as an IE in the NRPPa MeasurementReport. It is to be understood that the IEs described hereinafter are only for the purpose of illustration, without suggesting any limitation. The devices may transmit information with any suitable IE or other information format. The scope of the present disclosure is not limited in this regard.
[0059] The second device 220 may determine (2010) which side is to train a data processing model (or models). The term “data processing model” used herein may refer to an algorithm or a model that is capable of processing data. For example, the data processing model may be an AI/ML model. In some example embodiments, the second device 220 may determine that the data processing model is to be trained at the first device 210. Alternatively, the second device 220 may determine to train the data processing model by itself.
[0060] The second device 220 determines (2020) a data processing model (referred to as the “first data processing model” hereinafter). For example, the first data processing model may be used for processing noisy labels. The first data processing model may determine how quality label(s) are processed and used in the training phase. The term “quality label” or “quality data” used herein may refer to a label or information indicating a quality of a training sample. For example, if the quality of the training sample is not below a quality threshold, the quality label of the training sample may be an accurate label. If the quality of the training sample is below the quality threshold, the quality label of the training sample may be a noisy label. It is noted that the terms “quality label” and “quality data” are used interchangeably.
[0061] In some example embodiments, quality labels from different sources (such as positioning sources) may be processed in different ways to provide the training labels which are used at the training phase. The first data processing model may indicate/use one or more modes for processing noisy labels. By way of example, the first data processing model may indicate an index of the noisy label processing (NLP) mode. For example, the first data processing model may indicate an ALL mode (referred to as the “first mode” hereinafter). In this case, all quality labels are reported without processing. Alternatively, or in addition, the first data processing model may indicate an average (AVE) mode (referred to as the “second mode” hereinafter). In this case, for each training sample, a weighted average over the means may be used as the training label for all training epochs. The first data processing model may indicate a best mode (referred to as the “third mode” hereinafter). In this third mode, based on the accuracy of the quality labels for each training sample, the quality label with the highest accuracy may be selected as the training label. In some other example embodiments, a random (RND) mode (referred to as the “fourth mode” hereinafter) may be indicated or used. In this case, for each training sample, one of the means of the training data may be randomly selected as the training label for all training epochs. Uniform or non-uniform random selection may be used depending on the sources.
[0062] Alternatively, or in addition, the first data processing model may indicate an epoch-random (EPCH-RND) mode (referred to as the “fifth mode” hereinafter). In the fifth mode, at each training epoch, one of the quality labels may be randomly selected and used as the training label for each training sample. Also, the quality label selection from different positioning sources may follow a uniform or non-uniform distribution. In some example embodiments, an epoch-distribution (EPCH-DIST) mode (referred to as the “sixth mode” hereinafter) may be indicated or used by the first data processing model. In this case, the quality labels are processed by drawing from a calibrated distribution. For example, a probabilistic model such as a Gaussian mixture model may be considered and calibrated for the position of each sample. Thus, at each training epoch, a random number from the corresponding probabilistic model (distribution) may be drawn and used as the training label. It is noted that the EPCH-RND mode can be seen as a subset of EPCH-DIST, where the probabilistic model is made of Dirac delta functions.
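The per-epoch label draw of the fifth and sixth modes can be illustrated with the following minimal sketch, assuming a per-sample Gaussian mixture has already been calibrated over the noisy positioning sources; the helper name and all values are illustrative assumptions, not part of the claimed embodiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_epoch_label(component_means, component_vars, component_weights):
    """Sixth (EPCH-DIST) mode: draw one training label for the current
    epoch from a per-sample Gaussian mixture. With near-zero component
    variances (Dirac deltas) this reduces to the fifth (EPCH-RND) mode."""
    k = rng.choice(len(component_weights), p=component_weights)
    mean = np.asarray(component_means[k], dtype=float)
    std = np.sqrt(float(component_vars[k]))
    return rng.normal(loc=mean, scale=std)

# One sample, three sources: means, variances and calibrated mixture weights.
mu = [[10.2, 4.9], [9.8, 5.1], [10.4, 5.3]]
var = [1.0, 4.0, 0.25]
w = [0.3, 0.2, 0.5]
label = draw_epoch_label(mu, var, w)  # redrawn at every training epoch
```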
[0063] In some example embodiments, the second device 220 may determine (2030) another processing model (referred to as the “third data processing model” hereinafter). For example, the third data processing model may be a noisy label loss model. The third data processing model may determine whether a weighted loss function is used or not. The third data processing model may indicate/use one or more modes for weighting the noisy label loss. For example, the third data processing model may indicate/use a no weighted loss function model. For example, all training samples may have the same value/effect for training the data processing model. In this case, no weighting function/mechanism may be used for the calculation of the loss function. Alternatively, the third data processing model may indicate/utilize a weighted loss function model. For example, as the accuracy of the labels and the consistency of the labels reported from different sources are not the same for all training samples, a weighted loss function may be used. In this case, for samples with a high label consistency score (LCS), the loss and consequently the gradients may be emphasized, whereas the loss function may be de-emphasized for training samples with a low LCS.
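A minimal sketch of the weighted loss function model is given below. The mapping from label consistency to a weight is one plausible choice, with `coeffs` standing in for the second set of parameters; the disclosure does not prescribe this exact form, and all names here are hypothetical.

```python
import numpy as np

def lcs_weights(per_sample_spread, coeffs=(1.0, 1.0)):
    """Map a per-sample label consistency measure to a loss weight:
    consistent samples (small spread across sources) are emphasized,
    inconsistent ones de-emphasized. `coeffs` plays the role of the
    second set of parameters configured by the second device."""
    a, b = coeffs
    return a / (1.0 + b * np.asarray(per_sample_spread, dtype=float))

def weighted_mse(pred, label, weights):
    """Weighted loss: per-sample squared position error scaled by its
    LCS-derived weight (all-ones weights recover the no-weighted-loss model)."""
    pred = np.asarray(pred, dtype=float)
    label = np.asarray(label, dtype=float)
    w = np.asarray(weights, dtype=float)
    per_sample = np.sum((pred - label) ** 2, axis=-1)
    return float(np.sum(w * per_sample) / np.sum(w))
```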
[0064] The second device 220 may determine a type of required measurement. The required type of measurement may need to be recorded. For example, the type of measurement may indicate a radio measurement. For example, the radio measurement may include a positioning measurement, such as a measurement of field NR signals collected by the first device 210 and/or the second device 220. The positioning measurement may include any combination of time, angle of arrival, channel impulse response (CIR), etc. The positioning measurement may be obtained after receiving a positioning signal. Examples of the positioning signal may include, but are not limited to, a DL positioning reference signal (PRS), a UL sounding reference signal (SRS), or a sidelink (SL) positioning reference signal (SL-PRS). Alternatively, or in addition, the second device 220 may determine a format of reporting an estimation based on the measurement (for example, the estimated position). For example, the format may indicate that the mean and variance of the estimation need to be reported.
[0065] In some example embodiments, the second device 220 may transmit, to the first device 210, further information (referred to as “fourth information” hereinafter) that includes an indication of whether the processing of a first set of quality labels included in a training sample is enabled. The first set of quality labels may refer to the original labels of the training data, which may be noisy. Details of the first set of quality labels are described later. For example, the fourth information may include an LPP ProvideAssistanceData IE. The second device 220 may trigger the noisy label processing flag (NLPF) (i.e., the above-mentioned indication) in the fourth information. For example, an IE “NLPF = 1/0” may also be configured in the fourth information. This IE is to enable/disable noisy label processing at the first device 210 or the second device 220. If the NLPF is off (i.e., NLPF = 0), only one of the labeling sources (the default one) may be reported or used, and all the other sources may be deactivated. If the NLPF is set to 1, the NLP mode determines the procedure for exploiting the captured noisy label(s)/position(s). Alternatively, if only one source is used, the accuracy of labeling may be included explicitly or implicitly. Alternatively, if one source is used but different positioning algorithms are used to provide positioning labels (with the same or different measurements), the NLPF may also be set to 1.
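The NLPF gating described above can be illustrated as follows; the helper function and source names are hypothetical, chosen only to make the on/off behavior concrete.

```python
def active_label_sources(nlpf, sources, default="GNSS"):
    """NLPF gating: with NLPF = 0 only the default labeling source stays
    active; with NLPF = 1 all sources feed the configured NLP mode."""
    if nlpf == 0:
        return [default]
    return list(sources)

print(active_label_sources(0, ["GNSS", "RAT", "LIDAR"]))  # ['GNSS']
print(active_label_sources(1, ["GNSS", "RAT", "LIDAR"]))  # ['GNSS', 'RAT', 'LIDAR']
```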
[0066] The second device 220 may transmit (2040) the type of radio measurement and the format of reporting (referred to as “fifth information” hereinafter) to the first device 210. For example, the fifth information may indicate a required radio measurement for obtaining training data. In some example embodiments, the fifth information may include an LPP ProvideAssistanceData IE. It is noted that the fifth information and the fourth information may be transmitted together or separately.
[0067] In some example embodiments, after receiving the fifth information, the first device 210 may determine whether the first device 210 is capable of providing the required radio measurement. By way of example, if the required type of radio measurement includes global navigation satellite system (GNSS) and LIDAR measurements (that is, the second device 220 asks for GNSS and LIDAR reports), the first device 210 may determine whether the GNSS and LIDAR sources are available. If the GNSS and LIDAR positioning sources become available, the first device 210 may be capable of providing the GNSS and LIDAR measurements.
[0068] If the first device 210 is capable of providing the required radio measurement, the first device 210 may transmit (2050) an acknowledgment message (ACK) to the second device 220. The acknowledgment message may indicate a capability of providing the required radio measurement. For example, in case the second device 220 asks for both GNSS and LIDAR reports, the first device 210 may send the acknowledgment message as soon as both positioning sources become available. In some example embodiments, the acknowledgment message (ACK) may be an LPP ProvideLocationInformation IE. In addition, in some example embodiments, based on receiving (2050) the acknowledgment message, the second device 220 may request a network device serving the first device 210 to allocate resources for a following report of positioning measurements.
[0069] The second device 220 transmits (2060) first information to the first device 210. For example, in some example embodiments, based on receiving (2050) the acknowledgment message, the second device 220 transmits (2060) the first information. The first information indicates the first data processing model and the first set of parameters related to the first data processing model.
[0070] In some example embodiments, the first set of parameters may include a set of weights for sources of the training data. For example, weights or coefficients for each positioning source (for example, GNSS, RAT-based, and LIDAR) may be included in the first set of parameters and may be used in the first data processing model.
[0071] The first device 210 determines (2070) a training sample for a second data processing model. The training sample includes training data for the second data processing model and first quality data of the training data. The training sample may be position-related data. In this case, the position-related data may include a position-related parameter and first quality data of the position-related parameter. For example, the position-related parameter may be any one or any combination of: a position, a direction of the position, or a coordinate of the position. It is noted that the position-related parameter may include any proper parameters that relate to positioning or directions. The first quality data may include one or more quality labels. For example, each of the quality labels may be related to a positioning source. Alternatively, each of the quality labels may be related to a type of measurement of the positioning source. It is noted that the term “position-related parameter” may be understood as a variable property related to position and not only as a fixed setting.
[0072] In some example embodiments, the training sample may be defined as a tuple (a radio measurement, label 1, label 2, . . ., label M), where label M denotes a 2D or 3D position estimation provided by positioning source M and M is an integer. In some example embodiments, if the first device 210 is a terminal device, the first device 210 may determine the radio measurement based on downlink (DL) positioning reference signals (PRSs). Alternatively, if the first device 210 is a network device, the first device 210 may determine the radio measurement based on an uplink (UL) sounding reference signal (SRS)-for-positioning. As mentioned above, the radio measurement may be one or more of: TOA, AOA, or CIR. It is noted that the radio measurement may include any proper type of measurement. The positioning source may include, but is not limited to, a global navigation satellite system (GNSS), radio access technology (RAT), LIDAR, Wi-Fi based positioning, ML based positioning, or the like.
[0073] In some example embodiments, the second device 220 may estimate its position based on the radio measurement. For example, the estimation result may include the position mean μ and variance σ². In some example embodiments, if there are several sources of positioning, the second device 220 may determine and store a mean and variance for each positioning source.

[0074] The first device 210 determines (2080) second quality data of the training data by processing the first quality data based on the first data processing model and the first set of parameters. In other words, the second quality data may refer to the labels that have been improved compared with the first set of quality labels. For example, the second set of quality labels may be less noisy than the first set of quality labels. The second set of quality labels may include one or more quality labels. In some example embodiments, the first device 210 may obtain the second set of quality labels by aggregating or fusing the first set of quality labels based on the first data processing model and the first set of parameters. In some other example embodiments, the positioning estimation(s) may be processed to prepare the second set of quality labels, based on the first data processing model, the first set of parameters, and the estimated position mean(s) and variance(s). In this way, the quality labels may be more reliable, and the training of the data processing model can also be improved. Further, the positioning accuracy may also be improved.
[0075] As mentioned above, the first data processing model may indicate one or more of the first, second, third, fourth, fifth or sixth modes. For example, if the first mode is indicated, the first device 210 may not process the first set of quality labels. In some example embodiments, if the second mode is indicated, for each training sample, the first device 210 may use the weighted average over the positioning means as the second set of quality labels for all training epochs. Alternatively, or in addition, if the third mode is indicated, the most reliable label may be selected from the first set of quality labels. In this case, the most reliable label may be the second set of quality labels. In some other embodiments, if the fourth mode is indicated, for each training sample, the first device 210 may randomly select one of the positioning means as the second set of quality labels for all training epochs. Alternatively, if the fifth mode is indicated, at each training epoch, the first device 210 may randomly select and use one of the first set of quality labels as the one label in the second set of quality labels for each training sample. In some example embodiments, if the sixth mode is indicated, at each training epoch, a random number may be drawn from the corresponding probabilistic model and used as the second set of quality labels.
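The six modes can be summarized with a small dispatch function. The sketch below is a minimal Python illustration under stated assumptions: only the mode names AVE, BEST, RND and EPCH-DIST appear later in the document, so "RAW" (first mode) and "EPCH-SRC" (fifth mode) are invented identifiers, and the choice of variance as the reliability criterion for BEST is also an assumption.

```python
import random
import numpy as np

def process_labels(mode, means, variances, weights, gmm=None):
    """Derive the second set of quality labels from the first (noisy) set.

    means: array of shape (M, D), one D-dimensional position per source.
    variances: length-M reliability measure per source (assumption for BEST).
    gmm: probabilistic model fitted to the noisy labels (used by EPCH-DIST).
    """
    means = np.asarray(means, dtype=float)
    if mode == "RAW":        # first mode: report labels without processing
        return means
    if mode == "AVE":        # second mode: weighted average over positioning means,
        w = np.asarray(weights, dtype=float)       # reused for all training epochs
        return (w @ means) / w.sum()
    if mode == "BEST":       # third mode: most reliable label (smallest variance)
        return means[int(np.argmin(variances))]
    if mode == "RND":        # fourth mode: draw one source once, reuse every epoch
        return means[random.randrange(len(means))]
    if mode == "EPCH-SRC":   # fifth mode: call once per epoch to re-draw a source
        return means[random.randrange(len(means))]
    if mode == "EPCH-DIST":  # sixth mode: call once per epoch to sample from the
        return gmm.sample(1)[0][0]                 # fitted probabilistic model
    raise ValueError(f"unknown NLP mode: {mode}")
```

Note that RND and EPCH-SRC perform the same draw; they differ only in when the function is called (once overall versus once per epoch), which the comments indicate.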
[0076] As mentioned above, which side trains the second data processing model may be determined by the second device 220. Referring to FIG. 2A, if the second data processing model is trained by the second device 220, the first device 210 may transmit (2100) second information to the second device 220. The second information may include the second set of quality labels. In some example embodiments, if the fourth or fifth mode is indicated, the second information may include the first set of quality labels. In this case, the second device 220 may store (2110) the training sample and the second set of quality labels. The second device 220 may train (2120) the second data processing model using the training data and the second set of quality labels. In some example embodiments, the second data processing model may also be trained based on the third data processing model and a second set of parameters related to the third data processing model. For example, the second set of parameters may include coefficients for the calculation of weights used in the loss function based on the accuracy and consistency of the second set of quality labels. Advantageously, the training of the second data processing model can be improved. Moreover, the data processing model may be updated in a timely manner.
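As one possible reading of the weighted loss, the sketch below scales per-sample errors by weights derived from label variance through two coefficients; the mapping w_i = a / (b + var_i) is an assumption, since the disclosure does not fix the exact formula.

```python
import numpy as np

def weighted_mse_loss(predictions, labels, label_variances, coefficients):
    """Weighted loss sketch: samples with less reliable labels count less.

    coefficients: (a, b) from the second set of parameters; the mapping
    w_i = a / (b + var_i) is an assumption for illustration.
    """
    a, b = coefficients
    w = a / (b + np.asarray(label_variances, dtype=float))
    err = np.sum((np.asarray(predictions) - np.asarray(labels)) ** 2, axis=-1)
    return float(np.sum(w * err) / np.sum(w))
```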
[0077] If the second data processing model is trained by the first device 210, the first information (received at 2060) may also indicate the third data processing model for the loss function and the second set of parameters related to the third data processing model. Referring to FIG. 2B, the first device 210 may store (2200) the training sample and the second set of quality labels. The first device 210 may train (2210) the second data processing model using the training data and the second set of quality labels. The first device 210 may determine the weights of the loss function for the second set of quality labels based on the third data processing model and the second set of parameters. In this case, the second data processing model may be trained based on the training data, the second set of quality labels, and the weights. For example, the second set of parameters may include coefficients for the calculation of weights used in the loss function based on the accuracy and consistency of the second set of quality labels. The first device 210 may transmit (2220) third information to the second device 220. The third information may include quality information of the second data processing model trained using the training data and the second set of quality labels. Only as an example, the third information may indicate that the accuracy of the second data processing model is 90%. Advantageously, the training of the second data processing model can be improved. Moreover, it can reduce the burden of training the data processing model at the terminal device side.
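A possible (purely illustrative) shape of the third information is sketched below; the field names are hypothetical and not defined by the disclosure.

```python
# Hypothetical third-information payload reporting model quality (step 2220).
third_information = {
    "model_id": "second-data-processing-model",   # assumed identifier
    "metric": "positioning_accuracy",
    "value": 0.90,                                # e.g. 90% accuracy, as in the example
}
```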
[0078] In some example embodiments, the first set of quality labels may include only one quality label. In this case, the first device 210 may determine weighting information of the training sample based on a variance of an estimation obtained for a source associated with the training data. The weighting information indicates whether to emphasize or de-emphasize the effect of the training sample on the second data processing model. In other words, the variance of the position estimation may be used to emphasize or de-emphasize the effect of each training sample on the training results of the second data processing model. In some example embodiments, if the second data processing model is trained by the first device 210, the first device 210 may train, using the training sample, the second data processing model based on the weighting information of the training sample. Alternatively, if the second data processing model is trained by the second device 220, the first device 210 may transmit the weighting information of the training sample to the second device 220. In this case, the second device 220 may train, using the training sample, the second data processing model based on the weighting information of the training sample.
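For the single-label case, one natural instantiation (an assumption, not the mandated rule) is inverse-variance weighting, sketched below.

```python
import numpy as np

def sample_weights(variances, eps=1e-6):
    """Inverse-variance weighting sketch: samples whose source reported a
    high position-estimate variance are de-emphasized in training."""
    w = 1.0 / (np.asarray(variances, dtype=float) + eps)
    return w / w.sum()   # normalized per-sample weights
```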
[0079] According to the embodiments described with reference to FIG. 2A and FIG. 2B, training samples with one or several sources of noisy labels can be pre-processed before being used in the training procedure of the AI/ML model. Moreover, embodiments of the present disclosure are capable of training the AI/ML model with one or more noisy labels. Further, they also provide the possibility of considering the variance or accuracy of noisy labels in the training procedure of the AI/ML model. Additionally, they also achieve emphasizing or de-emphasizing a specific source of positioning in the training process of the AI/ML model.
[0080] FIG. 3AA and FIG. 3AB illustrate the probability density function (PDF) of the measured noisy labels from the three sources 311, 312 and 313. A Gaussian distribution with the measured mean and variance is assumed for each positioning source. In the example of FIG. 3AA and FIG. 3AB, the three sources provide noisy labels with source-specific means μ₁, μ₂, μ₃ and covariances Σ₁, Σ₂, Σ₃.
[0081] For NLP-mode=AVE: an average of all three positions is calculated and reported, which will be used for all the training epochs. As shown in FIG. 3BA and FIG. 3BB, the processed and reported training label using the AVE mode is marked with 320.
[0082] For NLP-mode=BEST: the label from the first positioning source is selected as the training label, which will be used for all the training epochs. As shown in FIG. 3CA and FIG. 3CB, the processed and reported training label using the BEST mode is marked with 330.
[0083] For NLP-mode=RND: one of the noisy labels is selected randomly and used for all the training epochs. For example, as shown in FIG. 3DA and FIG. 3DB, the label from the third positioning source is selected and used for all the training epochs. The processed and reported training label using the RND mode is marked with 340.
[0084] For NLP-mode=EPCH-DIST: a Gaussian mixture model (GMM) is assumed for the position and fitted to the measured noisy labels, as shown in FIG. 3EA and FIG. 3EB. Considering 30 epochs of training, 30 random numbers are drawn from the GMM. As shown in FIG. 3EA and FIG. 3EB, the processed and reported training labels for the 30 epochs of training using the EPCH-DIST mode are marked with 350.
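The EPCH-DIST example can be reproduced with scikit-learn's GaussianMixture, as in the hedged sketch below; the numeric labels, the component count, and the covariance type are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Fit a GMM to the noisy 2D labels of one training sample (values invented),
# then draw one training label per epoch for 30 epochs (EPCH-DIST mode).
noisy_labels = np.array([[1.0, 2.0], [1.2, 1.9], [0.8, 2.3]])
gmm = GaussianMixture(n_components=3, covariance_type="full").fit(noisy_labels)
epoch_labels, _ = gmm.sample(30)   # shape (30, 2): one label per training epoch
```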
Example Methods
[0085] FIG. 4 illustrates a flowchart of a method 400 implemented at a first apparatus according to some example embodiments of the present disclosure. For example, the first apparatus may include a terminal device or a network device.
[0086] At block 410, the first apparatus receives, from a second apparatus, first information. The first information indicates a first data processing model and a first set of parameters related to the first data processing model. In some example embodiments, the first set of parameters comprises a set of weights for sources of the position-related parameter. In some example embodiments, the first data processing model indicates at least one of: a first mode where quality labels are reported without processing, a second mode where a weighted average of means of the position-related parameter is used for the processing of the first set of quality labels, a third mode where a best quality label from the first set of quality labels is selected for the processing of the first set of quality labels, a fourth mode where one of the means of the position-related parameter is randomly selected for the processing of the first set of quality labels, a fifth mode where one quality label from the first set of quality labels is randomly selected for the processing of the first set of quality labels, or a sixth mode where a random number drawn from a corresponding probabilistic model is used for the processing of the first set of quality labels.

[0087] At block 420, the first apparatus determines position-related data for a second data processing model. The position-related data comprises a position-related parameter for the second data processing model and first quality data of the position-related parameter.
[0088] At block 430, the first apparatus determines second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters. In some example embodiments, the first apparatus may transmit, to the second apparatus, second information comprising the second quality data.
[0089] In some example embodiments, the first information further indicates a third data processing model for the loss function and a second set of parameters related to the third data processing model. In some example embodiments, the first apparatus may store the position-related data and the second quality data. The first apparatus may determine the weights of the loss function for the second quality data based on the third data processing model and the second set of parameters. In some example embodiments, the first apparatus may train the second data processing model using the position-related parameter and the second quality data. The first apparatus may determine the weights for the second quality data based on the third data processing model and the second set of parameters.
[0090] In some example embodiments, the first apparatus may transmit, to the second apparatus, third information comprising quality information of the second data processing model trained using the position-related parameter and the second quality data. In some example embodiments, the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data. In some example embodiments, the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
[0091] In some example embodiments, the first apparatus may receive, from the second apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled. In some example embodiments, the first apparatus may receive, from the second apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter. In some example embodiments, the first apparatus may transmit, to the second apparatus, an acknowledgment message indicating capability of providing the required radio measurement.
[0092] In some example embodiments, the first quality data comprises one quality label. In some example embodiments, the first apparatus may determine weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model. In some example embodiments, the first apparatus may train, using the position-related data, the second data processing model based on the weighting information of the position-related data. In some example embodiments, the first apparatus may transmit, to the second apparatus, the weighting information of the position-related data.
[0093] FIG. 5 illustrates a flowchart of a method 500 implemented at a second apparatus according to some example embodiments of the present disclosure. For example, the second apparatus may include a core network device.
[0094] At block 510, the second apparatus determines a first data processing model and a first set of parameters related to the first data processing model. The first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data. In some example embodiments, the first set of parameters comprises a set of weights for sources of the position-related parameter. In some example embodiments, the first data processing model indicates at least one of: a first mode where quality labels are reported without processing, a second mode where a weighted average of means of the position-related parameter is used for the processing of the first set of quality labels, a third mode where a best quality label from the first set of quality labels is selected for the processing of the first set of quality labels, a fourth mode where one of the means of the position-related parameter is randomly selected for the processing of the first set of quality labels, a fifth mode where one quality label from the first set of quality labels is randomly selected for the processing of the first set of quality labels, or a sixth mode where a random number drawn from a corresponding probabilistic model is used for the processing of the first set of quality labels.

[0095] At block 520, the second apparatus transmits, to a first apparatus, first information indicating the first data processing model and the first set of parameters. In some example embodiments, the second apparatus may determine a third data processing model for the loss function and a second set of parameters related to the third data processing model. In some example embodiments, the second apparatus may transmit, to the first apparatus, the first information that further indicates the third data processing model for the loss function and the second set of parameters related to the third data processing model.
[0096] In some example embodiments, the second apparatus may receive, from the first apparatus, second information comprising the second quality data. In some example embodiments, the second apparatus may store the position-related data and the second quality data. In some example embodiments, the second apparatus may determine weights for the second quality data based on the third data processing model and the second set of parameters. In some example embodiments, the second apparatus may train the second data processing model based on the position-related parameter, the second quality data, and the weights.
[0097] In some example embodiments, the second apparatus may receive, from the first apparatus, third information comprising quality information of a trained second data processing model. In some example embodiments, the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data. In some example embodiments, the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
[0098] In some example embodiments, the second apparatus may transmit, to the first apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled. In some example embodiments, the second apparatus may transmit, to the first apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter. In some example embodiments, the second apparatus may receive, from the first apparatus, an acknowledgment message indicating capability of providing the required radio measurement.
[0099] In some example embodiments, the first quality data comprises one quality label.
In some example embodiments, the second apparatus may determine weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter. The weighting information may indicate whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model.
[00100] In some example embodiments, the second apparatus may receive the weighting information from the first apparatus. In some example embodiments, the second apparatus may train, using the position-related data, the second data processing model based on the weighting information of the position-related data.
Example Apparatus, Device and Medium
[00101] In some example embodiments, a first apparatus capable of performing any of the method 400 (for example, the device 110 or the device 130 in FIG. 1) may comprise means for performing the respective operations of the method 400. The means may be implemented in any suitable form. For example, the means may be implemented in a circuitry or software module. The first apparatus may be implemented as or included in the device 110 or the device 130 in FIG. 1.
[00102] In some example embodiments, the first apparatus comprises means for receiving, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model; means for determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and means for determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
[00103] In some example embodiments, the first apparatus comprises means for transmitting, to the second apparatus, second information comprising the second quality data.
[00104] In some example embodiments, the first information further indicates a third data processing model for the loss function and a second set of parameters related to the third data processing model. In some example embodiments, the first apparatus comprises means for storing the position-related data and the second quality data; means for determining weights for the second quality data based on the third data processing model and the second set of parameters; and means for training the second data processing model based on the position-related parameter, the second quality data, and the weights.
[00105] In some example embodiments, the first apparatus comprises means for transmitting, to the second apparatus, third information comprising quality information of the second data processing model trained using the position-related parameter and the second quality data.
[00106] In some example embodiments, the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
[00107] In some example embodiments, the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
[00108] In some example embodiments, the first apparatus comprises means for receiving, from the second apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled.
[00109] In some example embodiments, the first apparatus comprises means for receiving, from the second apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter; and means for transmitting, to the second apparatus, an acknowledgment message indicating capability of providing the required radio measurement.
[00110] In some example embodiments, the first set of parameters comprises a set of weights for sources of the position-related parameter.
[00111] In some example embodiments, the first quality data comprises one quality label. In some example embodiments, the first apparatus comprises means for determining weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model. In some example embodiments, the first apparatus comprises means for training, using the position-related data, the second data processing model based on the weighting information of the position-related data; or means for transmitting, to the second apparatus, the weighting information of the position-related data.

[00112] In some example embodiments, the first apparatus comprises a terminal device or a network device, and the second apparatus comprises a core network device.
[00113] In some example embodiments, a second apparatus capable of performing any of the method 500 (for example, the device 120 in FIG. 1) may comprise means for performing the respective operations of the method 500. The means may be implemented in any suitable form. For example, the means may be implemented in a circuitry or software module. The second apparatus may be implemented as or included in the device 120 in FIG. 1.
[00114] In some example embodiments, the second apparatus comprises means for determining a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and means for transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
[00115] In some example embodiments, the second apparatus comprises means for receiving, from the first apparatus, second information comprising the second quality data.
[00116] In some example embodiments, the second apparatus comprises means for storing the position-related data and the second quality data; and means for training, using the position-related parameter and the second quality data, the second data processing model based on a third data processing model for the loss function and a second set of parameters related to the third data processing model.
[00117] In some example embodiments, the second apparatus comprises means for determining a third data processing model for the loss function and a second set of parameters related to the third data processing model. In some example embodiments, the means for transmitting the first information comprises: means for transmitting, to the first apparatus, the first information that further indicates the third data processing model for the loss function and the second set of parameters related to the third data processing model.

[00118] In some example embodiments, the second apparatus comprises means for receiving, from the first apparatus, third information comprising quality information of the second data processing model trained using the position-related parameter and the second quality data.
[00119] In some example embodiments, the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
[00120] In some example embodiments, the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
[00121] In some example embodiments, the second apparatus comprises means for transmitting, to the first apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled.
[00122] In some example embodiments, the second apparatus comprises means for transmitting, to the first apparatus, fifth information indicating a required radio measurement for obtaining the position-related parameter; and means for receiving, from the first apparatus, an acknowledgment message indicating capability of providing the required radio measurement.
[00123] In some example embodiments, the first set of parameters comprises a set of weights for sources of the position-related parameter.
[00124] In some example embodiments, the first quality data comprises one quality label.
[00125] In some example embodiments, the second apparatus comprises means for determining weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model; or means for receiving the weighting information from the first apparatus. In some example embodiments, the second apparatus comprises means for training, using the position-related data, the second data processing model based on the weighting information of the position-related data.
[00126] FIG. 6 is a simplified block diagram of a device 600 that is suitable for implementing example embodiments of the present disclosure. The device 600 may be provided to implement a communication device, for example, the device 110, the device 120, or the device 130 in FIG. 1. As shown, the device 600 includes one or more processors 610, one or more memories 620 coupled to the processor 610, and one or more communication modules 640 coupled to the processor 610.
[00127] The communication module 640 is for bidirectional communications. The communication module 640 has one or more communication interfaces to facilitate communication with one or more other modules or devices. The communication interfaces may represent any interface that is necessary for communication with other network elements. In some example embodiments, the communication module 640 may include at least one antenna.
[00128] The processor 610 may be of any type suitable to the local technical network and may include one or more of the following: general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples. The device 600 may have multiple processors, such as an application specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
[00129] The memory 620 may include one or more non-volatile memories and one or more volatile memories. Examples of the non-volatile memories include, but are not limited to, a Read Only Memory (ROM) 624, an electrically programmable read only memory (EPROM), a flash memory, a hard disk, a compact disc (CD), a digital video disk (DVD), an optical disk, a laser disk, and other magnetic storage and/or optical storage. Examples of the volatile memories include, but are not limited to, a random access memory (RAM) 622 and other volatile memories that will not last in the power-down duration.
[00130] A computer program 630 includes computer executable instructions that are executed by the associated processor 610. The instructions of the program 630 may include instructions for performing operations/acts of some example embodiments of the present disclosure. The program 630 may be stored in the memory, e.g., the ROM 624. The processor 610 may perform any suitable actions and processing by loading the program 630 into the RAM 622.

[00131] The example embodiments of the present disclosure may be implemented by means of the program 630 so that the device 600 may perform any process of the disclosure as discussed with reference to FIG. 2A-FIG. 5. The example embodiments of the present disclosure may also be implemented by hardware or by a combination of software and hardware.
[00132] In some example embodiments, the program 630 may be tangibly contained in a computer readable medium which may be included in the device 600 (such as in the memory 620) or other storage devices that are accessible by the device 600. The device 600 may load the program 630 from the computer readable medium to the RAM 622 for execution. In some example embodiments, the computer readable medium may include any types of non-transitory storage medium, such as ROM, EPROM, a flash memory, a hard disk, CD, DVD, and the like. The term “non-transitory,” as used herein, is a limitation of the medium itself (i.e., tangible, not a signal) as opposed to a limitation on data storage persistency (e.g., RAM vs. ROM).
[00133] FIG. 7 shows an example of the computer readable medium 700 which may be in form of CD, DVD or other optical storage disk. The computer readable medium 700 has the program 630 stored thereon.
[00134] Generally, various embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representations, it is to be understood that the block, apparatus, system, technique or method described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
[00135] Some example embodiments of the present disclosure also provide at least one computer program product tangibly stored on a computer readable medium, such as a non-transitory computer readable medium. The computer program product includes computer-executable instructions, such as those included in program modules, being executed in a device on a target physical or virtual processor, to carry out any of the methods as described above. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, or the like that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules may be located in both local and remote storage media.
[00136] Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
[00137] In the context of the present disclosure, the computer program code or related data may be carried by any suitable carrier to enable the device, apparatus or processor to perform various processes and operations as described above. Examples of the carrier include a signal, computer readable medium, and the like.
[00138] The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
[00139] Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are contained in the above discussions, these should not be construed as limitations on the scope of the present disclosure, but rather as descriptions of features that may be specific to particular embodiments. Unless explicitly stated, certain features that are described in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, unless explicitly stated, various features that are described in the context of a single embodiment may also be implemented in a plurality of embodiments separately or in any suitable sub-combination.
[00140] Although the present disclosure has been described in language specific to structural features and/or methodological acts, it is to be understood that the present disclosure defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
A list of Abbreviations
[00141] LMF Location Management Function
[00142] PRS Positioning Reference Signal
[00143] SRS Sounding Reference Signal
[00144] PRU Positioning Reference Unit
[00145] TRP Transmit Receive Point
[00146] GNSS Global Navigation Satellite System
[00147] IE Information Element
[00148] NR New Radio
[00149] NRPPa NR Positioning Protocol Annex
[00150] TA Target Accuracy
[00151] LCS label consistency score
[00152] CIR Channel Impulse Response
[00153] 2D Two Dimensional
[00154] 3D Three Dimensional
[00155] RAT Radio Access Technology
[00156] UE User Equipment
[00157] 5G Fifth Generation
[00158] LTE Long Term Evolution
[00159] LTE-A LTE-Advanced
[00160] LPP LTE Positioning Protocol
[00161] WCDMA Wideband Code Division Multiple Access
[00162] BS Base Station
[00163] AP Access Point
[00164] eNodeB Evolved NodeB
[00165] gNB/NR NB Next Generation NodeB
[00166] Tx Transmitting
[00167] Rx Receiving
[00168] DL Downlink
[00169] UL Uplink
[00170] SL Sidelink
[00171] SL-PRS Sidelink Positioning Reference Signal
[00172] Al Artificial Intelligence
[00173] ML Machine Learning
[00174] NWDAF Network Data Analytics Function
[00175] NLOS Non-Line-of-Sight
[00176] RAN Radio Access Network
[00177] NG-RAN Next Generation Radio Access Network

Claims

WHAT IS CLAIMED IS:
1. A first apparatus comprising:
at least one processor; and
at least one memory storing instructions that, when executed by the at least one processor, cause the first apparatus at least to:
receive, from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model;
determine position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and
determine second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
2. The first apparatus of claim 1, wherein the first apparatus is further caused to: transmit, to the second apparatus, second information comprising the second quality data.
3. The first apparatus of claim 1, wherein the first apparatus is further caused to: transmit, to the second apparatus, third information comprising quality information of the second data processing model trained using the position-related parameter and the second quality data.
4. The first apparatus of claim 3, wherein the first information further indicates a third data processing model for a loss function and a second set of parameters related to the third data processing model, wherein the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
5. The first apparatus of any of claims 3-4, wherein the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
6. The first apparatus of any of claims 1-5, wherein the first apparatus is further caused to: receive, from the second apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled.
7. The first apparatus of any of claims 1-6, wherein the first set of parameters comprises a set of weights for sources of the position-related parameter.
8. The first apparatus of any of claims 1-7, wherein the first quality data comprises one quality label, and wherein the first apparatus is further caused to: determine weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model; and wherein the first apparatus is further caused to perform one of: training, using the position-related data, the second data processing model based on the weighting information of the position-related data; or transmitting, to the second apparatus, the weighting information of the position-related data.
9. The first apparatus of any of claims 1-8, wherein the first apparatus comprises a terminal device or a network device, and wherein the second apparatus comprises a core network device.
10. A second apparatus comprising:
at least one processor; and
at least one memory storing instructions that, when executed by the at least one processor, cause the second apparatus at least to:
determine a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and
transmit, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
11. The second apparatus of claim 10, wherein the second apparatus is further caused to: receive, from the first apparatus, second information comprising the second quality data.
12. The second apparatus of claim 11, wherein the second apparatus is further caused to: receive, from the first apparatus, third information comprising quality information of a second data processing model trained using the position-related parameter and the second quality data.
13. The second apparatus of claim 10, wherein the second apparatus is further caused to: determine a third data processing model for loss function and a second set of parameters related to the third data processing model, and wherein transmitting the first information comprises: transmitting, to the first apparatus, the first information that further indicates a third data processing model for loss function and a second set of parameters related to the third data processing model.
14. The second apparatus of claim 13, wherein the second set of parameters comprises a set of coefficients that is used for determining weights in the loss function based on the second quality data.
15. The second apparatus of any of claims 12-14, wherein the third data processing model indicates one of: a no weighted loss function model, or a weighted loss function model.
16. The second apparatus of any of claims 10-15, wherein the second apparatus is further caused to: transmit, to the first apparatus, fourth information comprising an indication for indicating whether the processing of the first quality data is enabled.
17. The second apparatus of any of claims 10-16, wherein the first set of parameters comprises a set of weights for sources of the position-related parameter.
18. The second apparatus of any of claims 10-17, wherein the first quality data comprises one quality label, and wherein the second apparatus is further caused to: determine weighting information of the position-related data based on a variance of an estimation obtained for a source associated with the position-related parameter, wherein the weighting information indicates whether to emphasize or de-emphasize the effect of the position-related data on the second data processing model; or receive the weighting information from the first apparatus; and wherein the second apparatus is further caused to: train, using the position-related data, the second data processing model based on the weighting information of the position-related data.
19. A method, comprising:
receiving, at a first apparatus and from a second apparatus, first information indicating a first data processing model and a first set of parameters related to the first data processing model;
determining position-related data that comprises a position-related parameter for a second data processing model and first quality data of the position-related parameter; and
determining second quality data of the position-related parameter by processing the first quality data based on the first data processing model and the first set of parameters.
20. A method, comprising:
determining, at a second apparatus, a first data processing model and a first set of parameters related to the first data processing model, wherein the first data processing model and the first set of parameters are used to process first quality data of a position-related parameter in position-related data to obtain second quality data of the position-related parameter; and
transmitting, to a first apparatus, first information indicating the first data processing model and the first set of parameters.
PCT/IB2024/050714 2023-02-16 2024-01-25 Processing framework for noisy labels Ceased WO2024170977A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202480012066.4A CN120731432A (en) 2023-02-16 2024-01-25 Processing architecture for noisy labels

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20235184 2023-02-16
FI20235184 2023-02-16

Publications (1)

Publication Number Publication Date
WO2024170977A1 true WO2024170977A1 (en) 2024-08-22

Family

ID=89768294

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2024/050714 Ceased WO2024170977A1 (en) 2023-02-16 2024-01-25 Processing framework for noisy labels

Country Status (2)

Country Link
CN (1) CN120731432A (en)
WO (1) WO2024170977A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021211399A1 (en) * 2020-04-14 2021-10-21 Qualcomm Incorporated Neural network based line of sight detection for positioning
WO2022087137A1 (en) * 2020-10-21 2022-04-28 Intel Corporation Prs and srs configuration for nr positioning
WO2022155244A2 (en) * 2021-01-12 2022-07-21 Idac Holdings, Inc. Methods and apparatus for training based positioning in wireless communication systems
US20230043111A1 (en) * 2020-04-22 2023-02-09 Vivo Mobile Communication Co., Ltd. Positioning method, communications device, and network device


Also Published As

Publication number Publication date
CN120731432A (en) 2025-09-30

Similar Documents

Publication Publication Date Title
WO2021092813A1 (en) Accurate sidelink positioning reference signal transmission timing
US20240152814A1 (en) Training data characterization and optimization for a positioning task
WO2023077390A1 (en) Positioning enhancement for near-far field scenario
WO2024170977A1 (en) Processing framework for noisy labels
EP4594775A1 (en) Training sample evaluation in positioning
US20250358774A1 (en) Weighting positioning measurements
WO2024031575A1 (en) Phase noise determination for positioning
US20240276425A1 (en) Robust sample aggregation for ml positioning
WO2025020141A1 (en) Semi-supervised learning with data augmentation
WO2024229708A1 (en) Mechanism for model monitoring
US20250056288A1 (en) Obtaining measured information and predicted information related to ai/ ml model
US20250056477A1 (en) Model monitoring for positioning
EP4417996A1 (en) Devices, methods and apparatuses for transforming sampled signal
US20250113328A1 (en) Sidelink positioning reference signal measurement based on a plurality of measurement samples
US20250039832A1 (en) Enhancement on sidelink positioning
US12181594B2 (en) Positioning target device
WO2024229840A1 (en) Signaling framework for ai/ml based semi-supervised learning
WO2025022184A1 (en) Performance monitoring of channel classification
WO2021142841A1 (en) Fallback reference signal configuration
WO2023050430A1 (en) Positioning measurement reporting
WO2025171904A1 (en) Providing ground truth labels
WO2024199896A1 (en) Enhancements on model monitoring
WO2023070243A1 (en) Panel selection for positioning
WO2024032895A1 (en) Data augmentation for machine learning training in positioning
WO2025209693A1 (en) Method and device for pathloss prediction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 24702624; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 202480012066.4; Country of ref document: CN)
WWE Wipo information: entry into national phase (Ref document number: 202547086373; Country of ref document: IN)
NENP Non-entry into the national phase (Ref country code: DE)
WWP Wipo information: published in national office (Ref document number: 202480012066.4; Country of ref document: CN)
WWP Wipo information: published in national office (Ref document number: 202547086373; Country of ref document: IN)