
WO2025166609A1 - Devices and methods for sensing and positioning fusion - Google Patents

Devices and methods for sensing and positioning fusion

Info

Publication number
WO2025166609A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing
target
positioning
result
communication device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CN2024/076493
Other languages
French (fr)
Inventor
Gang Wang
Jin Yang
Jinhui WEN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to PCT/CN2024/076493
Publication of WO2025166609A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30: Services specially adapted for particular environments, situations or purposes
    • H04W4/38: Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02: Position-fixing by co-ordinating two or more direction or position line determinations, or two or more distance determinations, using radio waves
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02: Services making use of location information
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/20: Services signalling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel

Definitions

  • Example embodiments of the present disclosure generally relate to the field of communication techniques and in particular, to devices and methods for sensing and positioning fusion.
  • a first communication device comprising: a processor configured to cause the first communication device to: obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and perform a fusion of the sensing result and the positioning result.
  • a fifth communication device comprising: a processor configured to cause the fifth communication device to: receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
  • a communication method performed by a first communication device. The method comprises: obtaining a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and performing a fusion of the sensing result and the positioning result.
  • a communication method performed by a fifth communication device.
  • the method comprises: receiving, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmitting, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
  • a computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, causing the at least one processor to carry out the method according to the third or fourth aspect.
  • FIG. 1A illustrates an example communication environment in which embodiments of the present disclosure can be implemented
  • FIG. 1B illustrates another example communication environment in which embodiments of the present disclosure can be implemented
  • FIG. 2 illustrates a schematic diagram of example sensing modes in accordance with some embodiments of the present disclosure
  • FIG. 3 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure
  • FIG. 4A illustrates a signaling flow of an example process of sensing and positioning fusion performed at a sensing function (SF) device in accordance with some embodiments of the present disclosure
  • FIG. 4B illustrates a signaling flow of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure
  • FIG. 4C illustrates a signaling flow of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure
  • FIG. 5 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure
  • FIG. 6 illustrates a signaling flow of an example process of sensing and positioning fusion performed at a Location Management Function (LMF) device in accordance with some embodiments of the present disclosure
  • FIG. 7 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure
  • FIG. 8 illustrates a signaling flow of an example process of sensing and positioning fusion performed at an Application Function (AF) device in accordance with some embodiments of the present disclosure
  • FIG. 9 illustrates a signaling flow of an example process for the sensing and positioning fusion in accordance with some embodiments of the present disclosure
  • FIG. 10 illustrates a flowchart of a method implemented at a first communication device according to some example embodiments of the present disclosure
  • FIG. 11 illustrates a flowchart of a method implemented at a fifth communication device according to some example embodiments of the present disclosure
  • FIG. 12 illustrates a simplified block diagram of an apparatus that is suitable for implementing example embodiments of the present disclosure.
  • terminal device refers to any device having wireless or wired communication capabilities.
  • Examples of the terminal device include, but are not limited to, user equipment (UE), personal computers, desktops, mobile phones, cellular phones, smart phones, personal digital assistants (PDAs), portable computers, tablets, wearable devices, internet of things (IoT) devices, Ultra-reliable and Low Latency Communications (URLLC) devices, Internet of Everything (IoE) devices, machine type communication (MTC) devices, devices on vehicles for V2X communication (where X means pedestrian, vehicle, or infrastructure/network), devices for Integrated Access and Backhaul (IAB), spaceborne or airborne vehicles in Non-terrestrial Networks (NTN) including satellites and High Altitude Platforms (HAPs) encompassing Unmanned Aircraft Systems (UAS), eXtended Reality (XR) devices including different types of realities such as Augmented Reality (AR), Mixed Reality (MR) and Virtual Reality (VR), and unmanned aerial vehicles (UAVs).
  • the ‘terminal device’ can further have a ‘multicast/broadcast’ feature, to support public safety and mission critical applications, V2X applications, transparent IPv4/IPv6 multicast delivery, IPTV, smart TV, radio services, software delivery over wireless, group communications and IoT applications. It may also incorporate one or multiple Subscriber Identity Modules (SIMs), also known as Multi-SIM.
  • the term “terminal device” can be used interchangeably with a UE, a mobile station, a subscriber station, a mobile terminal, a user terminal or a wireless device.
  • network device refers to a device which is capable of providing or hosting a cell or coverage where terminal devices can communicate.
  • Examples of a network device include, but are not limited to, a Node B (NodeB or NB), an evolved NodeB (eNodeB or eNB), a next generation NodeB (gNB), a transmission reception point (TRP), a remote radio unit (RRU), a radio head (RH), a remote radio head (RRH), an IAB node, a low power node such as a femto node or a pico node, a reconfigurable intelligent surface (RIS), and the like.
  • the terminal device or the network device may have Artificial Intelligence (AI) or Machine Learning (ML) capability. Such capability generally includes a model which has been trained from numerous collected data for a specific function, and which can be used to predict some information.
  • the terminal device or the network device may work on several frequency ranges, e.g., FR1 (e.g., 450 MHz to 6000 MHz), FR2 (e.g., 24.25 GHz to 52.6 GHz), frequency bands larger than 100 GHz, as well as Terahertz (THz) bands. It can further work on licensed, unlicensed, or shared spectrum.
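  • As a purely illustrative aid (not part of the disclosure), the frequency ranges above can be expressed as a small classifier; the boundaries below simply restate the example values given in this paragraph.

```python
def frequency_range(freq_mhz: float) -> str:
    """Classify a carrier frequency (in MHz) into the example ranges above."""
    if 450.0 <= freq_mhz <= 6000.0:
        return "FR1"
    if 24250.0 <= freq_mhz <= 52600.0:
        return "FR2"
    if freq_mhz > 100000.0:
        return "above 100 GHz / THz"
    return "other"

# Example: a 3.5 GHz carrier falls in FR1, a 28 GHz carrier in FR2.
assert frequency_range(3500.0) == "FR1"
assert frequency_range(28000.0) == "FR2"
```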
  • the terminal device may have more than one connection with the network devices under Multi-Radio Dual Connectivity (MR-DC) application scenario.
  • the terminal device or the network device can work on full duplex, flexible duplex and cross division duplex modes.
  • the embodiments of the present disclosure may be performed in test equipment, e.g., signal generator, signal analyzer, spectrum analyzer, network analyzer, test terminal device, test network device, channel emulator.
  • the terminal device may be connected with a first network device and a second network device.
  • One of the first network device and the second network device may be a master node and the other one may be a secondary node.
  • the first network device and the second network device may use different radio access technologies (RATs) .
  • the first network device may be a first RAT device and the second network device may be a second RAT device.
  • For example, the first RAT device may be an eNB and the second RAT device may be a gNB.
  • Information related with different RATs may be transmitted to the terminal device from at least one of the first network device or the second network device.
  • first information may be transmitted to the terminal device from the first network device and second information may be transmitted to the terminal device from the second network device directly or via the first network device.
  • information related with configuration for the terminal device configured by the second network device may be transmitted from the second network device via the first network device.
  • Information related with reconfiguration for the terminal device configured by the second network device may be transmitted to the terminal device from the second network device directly or via the first network device.
  • the singular forms ‘a’, ‘an’ and ‘the’ are intended to include the plural forms as well, unless the context clearly indicates otherwise.
  • the term ‘includes’ and its variants are to be read as open terms that mean ‘includes, but is not limited to.’
  • the term ‘based on’ is to be read as ‘at least in part based on.’
  • the terms ‘one embodiment’ and ‘an embodiment’ are to be read as ‘at least one embodiment.’
  • the term ‘another embodiment’ is to be read as ‘at least one other embodiment.’
  • the terms ‘first,’ ‘second,’ and the like may refer to different or same objects. Other definitions, explicit and implicit, may be included below.
  • values, procedures, or apparatus may be referred to as ‘best,’ ‘lowest,’ ‘highest,’ ‘minimum,’ ‘maximum,’ or the like. It will be appreciated that such descriptions are intended to indicate that a selection among many functional alternatives can be made, and such selections need not be better, smaller, higher, or otherwise preferable to other selections.
  • the term “resource, ” “transmission resource, ” “uplink resource, ” or “downlink resource” may refer to any resource for performing a communication, such as a resource in time domain, a resource in frequency domain, a resource in space domain, a resource in code domain, or any other resource enabling a communication, and the like.
  • a resource in both frequency domain and time domain will be used as an example of a transmission resource for describing some example embodiments of the present disclosure. It is noted that example embodiments of the present disclosure are equally applicable to other resources in other domains.
  • performing a step “in response to A” does not indicate that the step is performed immediately after “A” occurs and one or more intervening steps may be included.
  • 3rd Generation Partnership Project (3GPP) sensing data may refer to data derived from 3GPP radio signals that are impacted (e.g., reflected, refracted, diffracted) by an object or environment of interest for sensing purposes, and optionally processed within the 5th Generation Mobile Communication Technology (5G) system.
  • 5G wireless sensing may refer to a 5G System (5GS) feature providing capabilities to obtain information about characteristics of an environment and/or objects within the environment (e.g., shape, size, orientation, speed, location, distances or relative motion between objects, etc.) using New Radio (NR) radio frequency signals, which, in some cases, can be extended by information created via previously specified functionalities in the Evolved Packet Core (EPC) and/or the Evolved UMTS Terrestrial Radio Access Network (E-UTRAN).
  • non-3GPP sensing data may refer to data provided by non-3GPP sensors (e.g., video, LiDAR, sonar) about an object or environment of interest for sensing purposes.
  • sensing assistance information may refer to information that is provided to 5G system and can be used to derive sensing result.
  • the sensing assistance information may be, for example, map information, area information, a user equipment (UE) Identity (ID) attached to or in the proximity of the sensing target, UE position information, UE velocity information etc.
  • sensing contextual information may refer to information that is exposed with the sensing results by 5G system to a trusted third party which provides context to the conditions under which the sensing results were derived.
  • the sensing contextual information may include, for example, map information, area information, time of capture, UE location and ID.
  • the sensing contextual information can be required in scenarios where the sensing result is to be combined with data from other sources outside the 5GS.
  • sensing group may refer to a set of sensing transmitters and sensing receivers whose locations are known and whose sensing data can be collected synchronously.
  • sensing transmitter may be the entity that sends out the sensing signal which the sensing service will use in its operation.
  • A sensing transmitter is an NR RAN node or a UE.
  • A sensing transmitter can be located in the same entity as, or a different entity from, the sensing receiver.
  • sensing signals may refer to transmissions on the 3GPP radio interface that can be used for sensing purposes.
  • the sensing signals may refer to NR radio frequency signals which, in some cases, may be extended by information created via previously specified functionalities in EPC and/or E-UTRAN.
  • sensing result may refer to processed 3GPP sensing data requested by a service consumer.
  • target sensing service area may refer to a Cartesian location area that needs to be sensed by deriving characteristics of an environment and/or objects within the environment with certain sensing service quality from the impacted (e.g., reflected, refracted, diffracted) 3GPP radio signals. This includes both indoor and outdoor environments.
  • a sensing function (SF) device is a device having a core network function to trigger sensing, collect the sensing result/report, and expose the sensing result/report to a third party which is in or out of the 3GPP scope.
  • a sensing management function (SEMF) device is a device having a new RAN function between the sensing function device and a network device to manage the sensing operation, including selecting a suitable network device, relaying the sensing request from the sensing function device to the network device, and relaying the sensing result/report from the network device to the sensing function device.
  • Integrated Sensing and Communication (ISAC) is considered a promising topic for future wireless network extension. According to the requirements of ISAC communication/sensing, how to identify and report a target in the network needs to be resolved.
  • FIG. 1B illustrates another example communication environment 100B in which embodiments of the present disclosure can be implemented.
  • the communication environment 100B is an implementation of the communication environment 100A.
  • In the communication environment 100B, a plurality of communication devices communicate with each other.
  • the communication environment 100B comprises, in addition to the LMF device 110, the SF device 120, and the AF device 130, a NEF device 140 and an Access and Mobility Management Function (AMF) device 150.
  • the communication environment 100B further comprises a terminal device 101, a network device 102, and a sensing management function (SEMF) device 103.
  • the NEF device 140 may be used to communicate information/data between the AF device 130 and the LMF device 110 or between the AF device 130 and the SF device 120.
  • the AMF device 150, also referred to as an AMF node, plays a pivotal role in managing the control plane functions related to user access and mobility. Specifically, the AMF node acts as an anchor point for control plane signaling for connected communication devices (e.g., UEs) in the environment 100B, ensuring smooth and secure user access while managing their mobility throughout the network.
  • the network device 102 and the terminal device 101 are in a radio access network (RAN) .
  • the terminal device 101 may communicate with the network device 102.
  • the network device 102 may communicatively connect with the SEMF device 103.
  • the sensing management function device 103 may be connected to the SF device 120.
  • the SEMF device 103 may be for example implemented at an Operation Administration and Maintenance (OAM) device or the AMF device 150.
  • the OAM device or the AMF device 150 is just one option for acting as the SEMF device 103.
  • the SEMF device 103 may be implemented as a node or device with a new function.
  • in the embodiments below, the OAM is shown as an example; the SEMF can replace the OAM as another option even where the SEMF is not explicitly shown.
  • In the following, the terminal device 101 is described as operating as a UE, and the network device 102 as operating as a base station. However, operations described in connection with a terminal device may be implemented at a network device or other device, and operations described in connection with a network device may be implemented at a terminal device or other device.
  • A link from the network device 102 to the terminal device 101 is referred to as a downlink (DL), and a link from the terminal device 101 to the network device 102 is referred to as an uplink (UL). In the DL, the network device 102 is a transmitting (TX) device (or a transmitter) and the terminal device 101 is a receiving (RX) device (or a receiver). In the UL, the terminal device 101 is a TX device (or a transmitter) and the network device 102 is a RX device (or a receiver).
  • the communications in the communication environment 100A and/or 100B may conform to any suitable standards including, but not limited to, Global System for Mobile Communications (GSM), Long Term Evolution (LTE), LTE-Evolution, LTE-Advanced (LTE-A), New Radio (NR), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), GSM EDGE Radio Access Network (GERAN), Machine Type Communication (MTC) and the like.
  • Examples of the communication protocols include, but are not limited to, the first generation (1G), the second generation (2G), 2.5G, 2.75G, the third generation (3G), the fourth generation (4G), 4.5G, the fifth generation (5G) communication protocols, 5.5G, 5G-Advanced networks, or the sixth generation (6G) networks.
  • the communication environment 100A or 100B may include any suitable number of devices configured to implement example embodiments of the present disclosure. Although not shown, it is to be understood that one or more additional devices may be located in the cell, and one or more additional cells may be deployed in the communication environment.
  • There are generally two types of sensing modes defined based on the Tx/Rx nodes of the sensing signal, namely, mono-static and bi-static. These types comprise six specific modes: Sensing Mode 1, which is gNB mono-static sensing; Sensing Mode 2, which is gNB-to-UE bi-static sensing; Sensing Mode 3, which is gNB-to-gNB bi-static sensing; Sensing Mode 4, which is UE mono-static sensing; Sensing Mode 5, which is UE-to-gNB bi-static sensing; and Sensing Mode 6, which is UE-to-UE bi-static sensing.
  • FIG. 2 illustrates schematic diagrams of six example sensing modes in accordance with some example embodiments of the present disclosure.
  • In Sensing Mode 1, a sensing signal for sensing a target 230 is transmitted by a network device 210 and received or measured by the network device 210 itself.
  • In Sensing Mode 2, a sensing signal for sensing the target 230 is transmitted by the network device 210 and received or measured by a terminal device 220.
  • In Sensing Mode 3, as indicated by 203, a sensing signal for sensing the target 230 is transmitted by the network device 210 and received or measured by another network device 212.
  • In Sensing Mode 4, a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by the terminal device 220 itself.
  • In Sensing Mode 5, a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by the network device 210.
  • In Sensing Mode 6, a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by another terminal device 222.
  • The sensing modes illustrated in FIG. 2 are examples only, and there may be many other sensing modes. It would be appreciated that more than one second communication device may be involved in a sensing service. It can be seen from the sensing modes in FIG. 2 that there may be various combinations of the devices which are to measure a sensing signal, as summarized in the sketch below.
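  • For illustration only, the Tx/Rx combinations of the six sensing modes can be tabulated as follows; the `SensingMode` and `NodeType` names are illustrative and not defined by the disclosure or any specification.

```python
from dataclasses import dataclass
from enum import Enum

class NodeType(Enum):
    GNB = "gNB"
    UE = "UE"

@dataclass(frozen=True)
class SensingMode:
    mode: int
    tx: NodeType        # node type transmitting the sensing signal
    rx: NodeType        # node type receiving/measuring the sensing signal

    @property
    def is_monostatic(self) -> bool:
        # Mono-static: the same node transmits and receives (modes 1 and 4).
        return self.mode in (1, 4)

# The six modes described in connection with FIG. 2.
SENSING_MODES = [
    SensingMode(1, NodeType.GNB, NodeType.GNB),  # gNB mono-static
    SensingMode(2, NodeType.GNB, NodeType.UE),   # gNB-to-UE bi-static
    SensingMode(3, NodeType.GNB, NodeType.GNB),  # gNB-to-gNB bi-static
    SensingMode(4, NodeType.UE, NodeType.UE),    # UE mono-static
    SensingMode(5, NodeType.UE, NodeType.GNB),   # UE-to-gNB bi-static
    SensingMode(6, NodeType.UE, NodeType.UE),    # UE-to-UE bi-static
]
```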
  • the focus of the study is to define channel modelling aspects to support object detection and/or tracking.
  • the study may aim at a common modelling framework capable of detecting and/or tracking the following example objects and enabling them to be distinguished from unintended objects: UAVs, humans indoors and outdoors, automotive vehicles (at least outdoors), automated guided vehicles (e.g., in indoor factories), and objects creating hazards on roads/railways, with a minimum size dependent on frequency.
  • All of the six sensing modes may be considered (i.e. TRP-TRP bistatic, TRP monostatic, TRP-UE bistatic, UE-TRP bistatic, UE-UE bistatic, UE monostatic) .
  • Frequencies from 0.5 to 52.6 GHz are the primary focus, with the assumption that the modelling approach should scale to 100 GHz. (If significant problems are identified with scaling above 52.6 GHz, the range above 52.6 GHz can be deprioritized. )
  • In the following, an object is also referred to as a target; a target may be identified in the network by a 3GPP ID, e.g., a SUPI/IMSI.
  • Embodiments of the present disclosure provide a solution for positioning and sensing fusion.
  • the 3GPP ID of a target is used to correlate a positioning result and a sensing result to track the target.
  • the time stamp and location in the positioning results may be fused with the sensing results to identify the target and its trajectory.
  • the fusion may be performed at a first communication device implementing a network logical function (for instance, referred to as Positioning and Sensing Fusion Function) .
  • the first communication device obtains a sensing result of a target in a sensing area and a positioning result of the target.
  • the positioning result indicates at least one position of the target in the sensing area.
  • the first communication device may perform a fusion of the sensing result and the positioning result.
  • the fusion may be performed in a variety of ways.
  • the first communication device may determine a trajectory of the target with timestamps from the sensing result, and may determine, from the positioning result, at least one time point corresponding to the at least one position of the target. Then, the first communication device performs the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
  • the result of the fusion may include various information, including, for example but not limited to, trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, an identification of the target, and/or the like.
  • the first communication device may be an LMF device, an SF device, an AF device, or other suitable device or node. More details will be discussed with reference to FIGS. 3 to 9.
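  • The disclosure specifies only that the fusion is based on the trajectory, the timestamps, the at least one position and the at least one time point; it does not prescribe a concrete algorithm. The following Python sketch shows one plausible timestamp-based association under assumed data structures; the thresholds `max_time_gap` and `max_distance` are purely illustrative. A sensed trajectory is attributed to the target (identified by its 3GPP ID) only if every positioning fix lies close to the trajectory in both time and space.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class TrajectoryPoint:        # one sample of the sensed trajectory
    timestamp: float          # time stamp from the sensing result
    position: Tuple[float, float]

@dataclass
class PositioningFix:         # one position/time point of the positioning result
    timestamp: float
    position: Tuple[float, float]

def fuse(trajectory: List[TrajectoryPoint],
         fixes: List[PositioningFix],
         max_time_gap: float = 1.0,       # seconds (illustrative threshold)
         max_distance: float = 5.0        # meters (illustrative threshold)
         ) -> Optional[List[TrajectoryPoint]]:
    """Attribute the sensed trajectory to the target if it matches the fixes.

    For every positioning fix, the trajectory point closest in time is
    located; the trajectory is identified as the target's only if every
    fix lies within the time and distance thresholds of that point.
    """
    if not trajectory or not fixes:
        return None
    for fix in fixes:
        nearest = min(trajectory, key=lambda p: abs(p.timestamp - fix.timestamp))
        dt = abs(nearest.timestamp - fix.timestamp)
        dx = nearest.position[0] - fix.position[0]
        dy = nearest.position[1] - fix.position[1]
        if dt > max_time_gap or (dx * dx + dy * dy) ** 0.5 > max_distance:
            return None       # sensed trajectory does not belong to this target
    return trajectory         # fused result: trajectory correlated with the target
```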
  • For brevity, the LMF device, the SF device, the AF device, the AMF device, and the NEF device are sometimes referred to as “LMF”, “SF”, “AF”, “AMF”, and “NEF”, respectively.
  • the target may be a terminal device, which may be sometimes referred to as UE as an example.
  • FIG. 3 illustrates a signaling flow 300 of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure.
  • the first communication device fusing the positioning result and the sensing result may be a SF device, e.g., the SF device 120.
  • the SF 120 obtains a positioning result and a sensing result and performs a fusion of them. Specifically, the LMF device 110 transmits (305) the positioning result. The positioning result indicates at least one position of the target in the sensing area. The SF 120 receives (310) the positioning result from the LMF device 110.
  • the SF 120 performs a sensing procedure to obtain (315) the sensing result.
  • the “sensing result” may refer to processed 3GPP sensing data requested by a service consumer.
  • the sensing result may include any desired information that can be derived from the measurement result (s) of the sensing signal (s) .
  • the sensing result may include a distance of a target, a size of the target, a velocity of the target, a position of the target, a moving direction of the target, a surrounding environment of the target, real-time map, or the like.
  • the SF 120 may first obtain the positioning result and then the sensing result. Alternatively, the SF 120 may first obtain the sensing result and then the positioning result. As a further alternative, the SF 120 may obtain the positioning result and the sensing result in parallel. There is no limitation on the temporal sequence of obtaining the positioning result and the sensing result.
  • the SF 120 performs (320) a fusion of the positioning result and the sensing result.
  • the SF 120 may determine a trajectory of the target with timestamps from the sensing result.
  • the SF 120 may determine, from the positioning result, at least one time point corresponding to the at least one position of the target.
  • the SF device 120 may perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
  • the SF device 120 transmits (325) the result of the fusion to the AF device 130.
  • the AF device 130 receives (330) the result of the fusion and may obtain the trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory.
  • the result of the fusion indicates the identification of the target as well, and the AF device 130 would know to which target the result of the fusion corresponds.
  • the above fusing procedure may be initiated by the AF device 130, by transmitting a positioning and sensing fusion request.
  • the SF 120 receives the positioning and sensing fusion request from the AF device 130 directly, or indirectly via the NEF device 140 (the AF device and the NEF device are collectively referred to as the “second communication device” in some embodiments).
  • the positioning and sensing fusion request may indicate that the sensing result and the positioning result of the target are to be fused.
  • the positioning and sensing fusion request may comprise an identification of the target.
  • the first communication device then transmits, to the second communication device, a positioning and sensing fusion response indicating the result of the fusion.
  • the identification may be an ID of the terminal device 101, e.g., a Subscription Permanent Identifier (SUPI) or an International Mobile Subscriber Identification Number (IMSI).
  • the first communication device (the SF device 120 in this case) transmits, to the LMF device 110, a positioning service request to trigger a positioning procedure for the target. Then, the SF device 120 may receive the positioning result of the target from the LMF device 110.
  • the first communication device (the SF device 120 in this case) transmits, to the SEMF device 103 or a network device 102, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target.
  • the SF device 120 may then receive the sensing result of the target from the SEMF device 103 or the network device 102.
  • the first position may also be referred to as a first location, which may be determined in various ways.
  • the first position may be obtained from the positioning result received from the LMF device 110.
  • the first position may be obtained from the positioning and sensing fusion request.
  • the SF device 120 may transmit a location request for the target to an AMF device 150 and may receive information about the first position from the AMF device 150.
  • the SF device 120 may transmit, to the LMF device 110, a further positioning service request for the first position of the target and receive information about the first position from the LMF device 110.
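  • A minimal sketch of the “sensing area covering a first position” notion follows, assuming an axis-aligned Cartesian area and a margin derived from the target's estimated movement; both the shape and the margin model are assumptions, not specified by the disclosure.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SensingArea:
    """Axis-aligned Cartesian area (an illustrative simplification)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def covers(self, position: Tuple[float, float]) -> bool:
        x, y = position
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def area_around(first_position: Tuple[float, float], margin: float) -> SensingArea:
    """Build a sensing area covering the first position, widened by a margin
    that accounts for the target's estimated movement."""
    x, y = first_position
    return SensingArea(x - margin, x + margin, y - margin, y + margin)

# Example: an area around the first position, sized so the target is
# unlikely to leave it within the sensing interval.
area = area_around((100.0, 200.0), margin=50.0)
assert area.covers((120.0, 180.0))
```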
  • the AF device 130 sends the Positioning and Sensing Fusion request to the SF device 120 via the NEF, which includes the target 3GPP ID and the report granularity requirement (real time or regularly).
  • the SF device 120 triggers the LMF device 110 to perform positioning of the target.
  • the LMF device 110 reports the positioning result to the SF device 120.
  • the SF device 120 starts sensing in the area indicated by the positioning result, and the SF device 120 receives the sensing result of the area with time stamps.
  • the SF device 120 triggers the LMF device 110 to perform multiple positionings of the target, and the SF device 120 receives the positioning results with time stamps.
  • the SF device 120 fuses the sensing result and the multiple positioning results based on the time stamps to identify the target and obtain the target trajectory.
  • the target trajectory is reported to the AF device 130 in real time or regularly according to the requirement from the AF device 130.
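  • Concretely, the request and response exchanged in this flow carry at least the target 3GPP ID, the report granularity requirement, and the fused trajectory. A sketch of these information elements follows; the disclosure does not define an encoding, so all field and type names below are illustrative.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional, Tuple

class ReportGranularity(Enum):
    REAL_TIME = "real time"
    REGULAR = "regularly"      # periodic reporting

@dataclass
class PositioningAndSensingFusionRequest:
    target_3gpp_id: str                      # e.g., SUPI or IMSI
    granularity: ReportGranularity
    period_seconds: Optional[float] = None   # only meaningful for REGULAR

@dataclass
class PositioningAndSensingFusionResponse:
    target_3gpp_id: str
    # Fused target trajectory as (timestamp, x, y) samples.
    trajectory: List[Tuple[float, float, float]] = field(default_factory=list)

# Example request: track the target and report regularly, every 10 minutes.
req = PositioningAndSensingFusionRequest("imsi-001010123456789",
                                         ReportGranularity.REGULAR,
                                         period_seconds=600.0)
```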
  • FIG. 4A illustrates a signaling flow 400A of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure.
  • the embodiments of FIG. 4A will be discussed with respect to FIG. 1A and FIG. 1B.
  • the LMF device 110 provides accurate UE location.
  • the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 399 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may, for example, include the 3GPP ID of the UE.
  • the AF device 130 transmits a Positioning and Sensing Fusion request 401 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
  • the transmission of the Positioning and Sensing Fusion request 401 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
  • the NEF device 140 sends a Positioning and Sensing Fusion request 402 to the SF device 120, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
  • the SF device 120 transmits a Positioning Request 403 to the LMF device 110 to trigger positioning to the target.
  • the 3GPP ID of the target may be included in the Positioning Request 403 from the SF device 120 to the LMF device 110.
  • the LMF device 110 initiates a positioning procedure 404 to position the target with the 3GPP ID.
  • the LMF device 110 reports the positioning result, for example, by transmitting a positioning report 405 to the SF device 120.
  • the result includes a first position (also referred to as target location) and the timestamp corresponding to the first position.
  • the SF device 120 may trigger a further positioning procedure 407, for example, by triggering the LMF device 110 to perform another positioning or multiple positionings of the target. With the further positioning procedure 407, the SF device 120 receives the positioning results with the target location and corresponding time stamps.
  • the SF device 120 sends a Positioning and Sensing Fusion response 409 to the NEF device 140, which includes the result of the fusion, for example, the target trajectory and the 3GPP ID in real time or regularly according to the reporting time requirement from the AF device 130.
  • the NEF device 140 sends a further Positioning and Sensing Fusion response 410 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 409.
  • FIG. 4B illustrates a signaling flow 400B of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure.
  • FIG. 4B will be discussed with respect to FIG. 1A and FIG. 1B.
  • the AMF device 150 provides the first position of the target, which may be a UE or a terminal device, for example, the terminal device 101.
  • the first position may be determined based on the Tracking Area (TA) list when the UE is in idle state, or may be the cell ID when the UE is in active state.
  • the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 420 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may, for example, include the 3GPP ID of the UE.
  • the AF device 130 transmits a Positioning and Sensing Fusion request 421 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
  • the transmission of the Positioning and Sensing Fusion request 421 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a flying plan.
  • the NEF device 140 sends a Positioning and Sensing Fusion request 422 to the SF device 120, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
  • the SF device 120 transmits a UE location request 423 to the AMF device 150 to request the location of the target.
  • the 3GPP ID of the target may be included in the UE location request 423.
  • the AMF device 150 transmits the stored UE location as the first position 424 of the target to the SF device 120.
  • the first position may be determined based on the Tracking Area (TA) list when the UE is in idle state, or may be the cell ID when the UE is in active state. Alternatively, based on the configured policy, the AMF device 150 may initiate a location request procedure towards the RAN to obtain the UE location.
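  • The granularity choice just described (TA list for an idle UE, cell ID for an active UE) can be sketched as follows; `UeContext` and the function name are illustrative stand-ins, not an AMF API.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional, Tuple

class UeState(Enum):
    IDLE = "idle"
    ACTIVE = "active"

@dataclass
class UeContext:               # state the AMF is assumed to hold for the UE
    state: UeState
    ta_list: List[str]         # Tracking Areas of the registered (idle) UE
    cell_id: Optional[str]     # serving cell ID when the UE is active

def coarse_first_position(ctx: UeContext) -> Tuple[str, object]:
    """Return the coarsest location the AMF can report without new signaling:
    cell-level for an active UE, TA-level for an idle UE."""
    if ctx.state is UeState.ACTIVE and ctx.cell_id is not None:
        return ("cell_id", ctx.cell_id)
    return ("ta_list", ctx.ta_list)
```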
  • the SF device 120 starts sensing in the area of the target location in a sensing procedure 426.
  • the area may be wide and cover the target location, determined according to the target location and the estimated movement of the target.
  • the SF device 120 may generate the sensing result of the area with time stamps.
  • the SF device 120 may trigger a further positioning procedure 427, for example, by triggering the LMF device 110 to perform another positioning or multiple positionings of the target. With the further positioning procedure 427, the SF device 120 receives the positioning results with the target location and corresponding time stamps.
  • the SF device 120 performs a fusion 428 based on the sensing result, the positions of the target and the time stamps, to identify the target and obtain the target trajectory.
  • the SF device 120 may trigger a further positioning procedure 447, for example, by triggering the LMF device 110 to perform another positioning or multiple positionings of the target. With the further positioning procedure 447, the SF device 120 obtains the positioning results with positions of the target 101 and corresponding time stamps.
  • the sensing procedure 446 may be performed before or after or in parallel with the positioning procedure 447.
  • FIG. 5 illustrates a signaling flow 500 of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure.
  • the first communication device fusing the positioning result and the sensing result may be a LMF device, e.g., the LMF device 110.
  • the LMF device 110 performs a positioning procedure for the target to obtain the positioning result of the target.
  • the LMF device 110 transmits, to the SF device 120, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target.
  • the LMF device 110 then receives the sensing result of the target from the SF device 120.
  • the LMF device 110 may obtain the first position from the positioning result.
  • the LMF device 110 may obtain the first position from a positioning and sensing fusion request, which is, for instance, received from the AF device 130 directly or indirectly.
  • the LMF device 110 performs (520) a fusion of the positioning result and the sensing result.
  • the LMF device 110 may determine a trajectory of the target with timestamps from the sensing result.
  • the LMF device 110 may determine, from the positioning result, at least one time point corresponding to the at least one position of the target.
  • the LMF device 110 may perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
  • FIG. 6 illustrates a signaling flow 600 of an example process of sensing and positioning fusion performed at the LMF device 110 in accordance with some embodiments of the present disclosure.
  • the embodiments of FIG. 6 will be discussed with respect to FIG. 1A and FIG. 1B.
  • the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 599 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may, for example, include the 3GPP ID of the UE.
  • the position of the UE, i.e., the first position of the target (also referred to as the UE location or target location), may be indicated via the Positioning and Sensing Fusion Request 599.
  • the AF device 130 transmits a Positioning and Sensing Fusion request 601 to the NEF device 140, which includes the target 3GPP ID, the report granularity requirement or report time requirement (real time or regularly) , and/or the like.
  • the transmission of the Positioning and Sensing Fusion request 601 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
  • the NEF device 140 transmits a Positioning and Sensing Fusion request 602 to the LMF device 110. Based on the Positioning and Sensing Fusion request 602, the LMF device 110 initiates a positioning procedure 603 for the target using the 3GPP ID. In this way, the LMF device 110 obtains the location of the target 101 (i.e., the first position) and the corresponding timestamp.
  • the LMF device 110 may transmit a Sensing Service request 604 to the SF device 120 for sensing to the area covering the first position based on the estimated movement of the target and the first position.
  • the SF device 120 initiates sensing in the area of the target location in a sensing procedure 605.
  • the area may be wide and cover the target location, determined according to the target location and the estimated movement of the target.
  • the SF device 120 may generate the sensing result of the area with time stamps.
  • the SF device 120 may transmit a Sensing Response 606 to the LMF device 110 to report the Sensing Result with time stamps.
  • the LMF device 110 performs a fusion 607 of the sensing result and the positioning results based on the sensing result, the target location and the time stamps to identify the target and obtain the target trajectory. If needed, the LMF device 110 may perform another positioning or multiple positionings of the target to obtain more target location information.
  • the LMF device 110 then transmits a Positioning and Sensing Fusion response 608 to the NEF device 140, which includes the target trajectory and the 3GPP ID in real time or regularly according to the requirement from the AF device 130.
  • the NEF device 140 sends a further Positioning and Sensing Fusion response 609 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 608.
  • the AF device 130 may transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target, and receive the positioning result of the target from the third communication device.
  • the third communication device may be for example the LMF device 110, the NEF device 140, or other suitable device.
  • the AF device 130 may transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target. Then, the AF device 130 may receive the sensing result of the target from the fourth communication device.
  • the fourth communication device may be for example the SF device 120, the NEF device 140, or other suitable device.
  • the first position of the target 101 may be obtained from the positioning result.
  • the first position may be obtained from position information received from the target 101.
  • the first position may be determined at the AF device based on a path plan or a flying plan.
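  • The three sources of the first position listed above can be combined at the AF, for example as sketched below. The preference order and the waypoint-based path plan model are assumptions; the disclosure lists the sources without ranking them, and all names here are illustrative.

```python
from typing import List, Optional, Tuple

Position = Tuple[float, float]

def resolve_first_position(positioning_result: Optional[Position] = None,
                           ue_reported_position: Optional[Position] = None,
                           path_plan: Optional[List[Tuple[float, Position]]] = None,
                           now: Optional[float] = None) -> Position:
    """Pick the first position of the target from whichever source is available."""
    if positioning_result is not None:
        return positioning_result               # from the positioning procedure
    if ue_reported_position is not None:
        return ue_reported_position             # position information from the target
    if path_plan is not None and now is not None:
        # A path/flying plan is modeled as (timestamp, position) waypoints;
        # take the waypoint closest in time to "now".
        return min(path_plan, key=lambda wp: abs(wp[0] - now))[1]
    raise ValueError("no source available for the first position")
```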
  • the AF 130 obtains a positioning result and a sensing result and performs a fusion of them.
  • the SF 120 performs a sensing procedure to obtain the sensing result and transmits (705) the sensing result to the AF 130.
  • the sensing result may include any desired information that can be derived from the measurement result (s) of the sensing signal (s) .
  • the sensing result may include a distance of a target, a size of the target, a velocity of the target, a position of the target, a moving direction of the target, a surrounding environment of the target, real-time map, or the like.
  • the LMF device 110 performs a positioning procedure to obtain the positioning result and transmits (715) the positioning result to the AF 130.
  • the positioning result indicates at least one position of the target in the sensing area.
  • the AF 130 receives (710) the sensing result from the SF device 120 and receives (720) the positioning result from the LMF device 110.
  • FIG. 8 illustrates a signaling flow 800 of an example process of sensing and positioning fusion performed at the AF device 130 in accordance with some embodiments of the present disclosure.
  • FIG. 8 will be discussed with respect to FIG. 1A and FIG. 1B.
  • the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 799 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may, for example, include the 3GPP ID of the UE.
  • the position of the UE, i.e., the first position of the target (also referred to as the UE location or target location), may be indicated via the Positioning and Sensing Fusion Request 799.
  • the AF device 130 transmits a Positioning and Sensing Fusion request 801 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement or report time requirement (real time or regularly, for example, every 10 minutes).
  • the transmission of the Positioning and Sensing Fusion request 801 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
  • the NEF device 140 transmits a Positioning and Sensing Fusion request 802 to the LMF device 110. Based on the Positioning and Sensing Fusion request 802, the LMF device 110 initiates a positioning procedure 803 for the target using the 3GPP ID. In this way, the LMF device 110 obtains the location of the target 101 (i.e., the first position) and the corresponding timestamp.
  • the LMF device 110 may skip the positioning procedure 803. Alternatively, in some embodiments, the LMF device 110 may initiate a UE location request to the AMF device 150 to obtain the first position of the target 101, for example, a TA list or a cell ID.
  • the LMF device 110 transmits a Positioning service response 804 to the NEF device 140 according to the location report granularity, including the target location and the timestamp. Then, the NEF device 140 transmits a Positioning service response 805 to the AF device 130.
  • the AF device 130 sends a sensing service request 806 to the NEF device 140 including the area covering the target location and the sensing report granularity, e.g., real time or every 1 minute. The area should consider the target location and the estimated movement of the target. Then the NEF device 140 transmits a further sensing service request 807 to the SF device 120.
  • the AF device 130 may send the UE location (i.e., the first position of the target 101 or the UE 101) to the SF device 120, which may be obtained from UE or based on the path/flying plan.
  • the SF device 120 initiates sensing in the area of the target location in a sensing procedure 808.
  • the area may be wide and cover the target location, determined according to the target location and the estimated movement of the target.
  • the SF device 120 may generate the sensing result of the area with time stamps.
  • the SF device 120 may transmit a Sensing Response 809 to the NEF 140 to report the Sensing Result with time stamps.
  • the NEF 140 then transmits a further Sensing Response 810 to the AF device 130 to report the same.
  • the AF device 130 may obtain the Sensing Result with time stamps.
  • the AF device 130 performs the fusion 811 of the sensing result and the positioning result to identify the target and the trajectory.
  • the timestamp, the target location and the sensing result are correlated with the 3GPP ID of the target.
  • the AF device 130 can obtain the first position/UE location from the target or UE 101 or based on the path/flying plan.
  • the positioning procedure and the sensing procedure may be performed in parallel.
  • the positioning procedure and the sensing procedure may be triggered by the application layer event, e.g. the UTM needs to track the UAV based on the flying plan.
  • embodiments of the present disclosure provide a communication device (also referred to as “fifth communication device” for purpose of discussion) .
  • the fifth communication device receives, from an AF device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device.
  • the positioning and sensing fusion request comprises an identification of the target.
  • the fifth communication device then transmits, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
  • the fifth communication device receives, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result.
  • the fifth communication device then transmits, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
  • the fifth communication device may be a Network Exposure Function (NEF) device, e.g., the NEF device 140.
  • the sixth communication device may be a sensing function device, e.g., the SF device 120, or an LMF device, e.g., the LMF device 110.
  • FIG. 9 illustrates a signaling flow 900 of an example process for the sensing and positioning fusion in accordance with some embodiments of the present disclosure.
  • FIG. 9 will be discussed with respect to FIG. 1A and FIG. 1B.
  • the AF 130 transmits (905) a positioning and sensing fusion request to the NEF device 140.
  • the positioning and sensing fusion request indicates that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device 901.
  • the positioning and sensing fusion request comprises an identification of the target.
  • the identification may be an ID of the terminal device 101, e.g., SUPI or IMSI.
  • the NEF device 140 receives (910) the positioning and sensing fusion request and transmits a further positioning and sensing fusion request to the sixth communication device 901, which may be the LMF 110 or the SF device 120, for example.
  • This request indicates that the sensing result of the target and the positioning result of the target are to be fused at the sixth communication device.
  • the sixth communication device 901 may have the knowledge that the fusion is to be performed by itself.
  • the sixth communication device 901 transmits (925) a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result.
  • the NEF device 140 receives (930), from the sixth communication device 901, the positioning and sensing fusion response and transmits (935) the result of the fusion of the sensing result and the positioning result to the AF 130, for example via a further positioning and sensing fusion response or another message.
  • the AF device 130 receives (940) the further positioning and sensing fusion response and thus obtains the result of the fusion.
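  • The relay role of the NEF device 140 in flow 900 can be summarized by the following toy model; all class and method names are illustrative and do not correspond to defined service operations.

```python
class SixthDevice:
    """Stand-in for the SF or LMF device that performs the fusion."""

    def handle_fusion_request(self, request: dict) -> dict:
        # The positioning procedure, sensing procedure and fusion happen
        # here (details omitted); the response carries the fusion result.
        return {"target_3gpp_id": request["target_3gpp_id"],
                "trajectory": []}

class NefDevice:
    """Toy model of the NEF: it relays requests and responses unchanged."""

    def __init__(self, sixth_device: SixthDevice):
        self.sixth_device = sixth_device

    def handle_fusion_request(self, request: dict) -> dict:
        # Steps 910-935: forward a further positioning and sensing fusion
        # request to the sixth device and relay its response back to the AF.
        return self.sixth_device.handle_fusion_request(request)

# Steps 905 and 940: the AF sends the request and obtains the fusion result.
nef = NefDevice(SixthDevice())
result = nef.handle_fusion_request({"target_3gpp_id": "supi-example"})
```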
  • Embodiments of the present disclosure may have some of the following impacts.
  • Table 1 shows an example of some impacts.
  • Table 2 shows an example of some other impacts.
  • embodiments of the present disclosure may lead to the creation of a new ISAC SA2 Technical Specification (TS), including the introduction of the Positioning and Sensing Fusion function and the procedures described above.
  • FIG. 10 illustrates a flowchart of a communication method 1000 implemented at a first communication device in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1000 will be described from the perspective of the first communication device.
  • the first communication device is further caused to: determine a trajectory of the target with timestamps from the sensing result; determine, from the positioning result, at least one time point corresponding to the at least one position of the target; and perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
  • a result of the fusion comprises at least one of: trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, or an identification of the target.
  • the first communication device comprises a sensing function device or a Location Management Function (LMF) device
  • the first communication device is further caused to: receive, from a second communication device, a positioning and sensing fusion request indicating that the sensing result and the positioning result of the target are to be fused, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the second communication device, a positioning and sensing fusion response indicating a result of the fusion.
  • LMF Location Management Function
  • the second communication device comprises an Application Function (AF) device or a Network Exposure Function (NEF) device.
  • AF Application Function
  • NEF Network Exposure Function
  • the first communication device comprises the sensing function device, and wherein the first communication device is further caused to: transmit, to a sensing management function device or a network device, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive, from the sensing management function device or the network device, the sensing result of the target.
  • the first position is obtained from the positioning result received from a Location Management Function (LMF) device, or wherein the first position is obtained from the positioning and sensing fusion request.
  • LMF Location Management Function
  • the first communication device is further caused to: transmit a location request for the target to an Access and Mobility Management Function (AMF) device; and receive information about the first position from the AMF device.
  • AMF Access and Mobility Management Function
  • the first communication device is further caused to: transmit, to a Location Management Function (LMF) device, a further positioning service request for the first position of the target; and receive information about the first position from the LMF device.
  • LMF Location Management Function
  • the first communication device is further caused to perform at least one of: obtaining the first position from the positioning result; or obtaining the first position from the positioning and sensing fusion request.
  • the first communication device comprises an Application Function (AF) device
  • the first communication device is further caused to: transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target; and receive the positioning result of the target from the third communication device.
  • AF Application Function
  • the third communication device comprises a Location Management Function (LMF) device or a Network Exposure Function (NEF) device.
  • LMF Location Management Function
  • NEF Network Exposure Function
  • the first communication device comprises an Application Function (AF) device
  • the first communication device is further caused to: transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive the sensing result of the target from the fourth communication device.
  • AF Application Function
  • the fourth communication device comprises a sensing function device or a Network Exposure Function (NEF) device.
  • NEF Network Exposure Function
  • the first position is obtained from the positioning result, or wherein the first position is obtained from position information received from the target, or wherein the first position is determined at the AF device based on a path plan.
  • FIG. 11 illustrates a flowchart of a communication method 1100 implemented at a fifth communication device in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1100 will be described from the perspective of the fifth communication device.
  • the fifth communication device receives, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device.
  • the positioning and sensing fusion request comprises an identification of the target.
  • the fifth communication device transmits, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
  • the fifth communication device is further caused to: receive, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result; and transmit, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
  • the fifth communication device comprises a Network Exposure Function (NEF) device
  • the sixth communication device comprises a sensing function device or a Location Management Function (LMF) device.
  • NEF Network Exposure Function
  • LMF Location Management Function
  • FIG. 12 is a simplified block diagram of a device 1200 that is suitable for implementing embodiments of the present disclosure.
  • the device 1200 can be considered as a further example implementation of any of the devices as shown in FIGS. 1A and 1B. Accordingly, the device 1200 can be implemented at or as at least a part of the LMF device 110, the SF device 120 or the AF device 130.
  • the device 1200 includes a processor 1210, a memory 1220 coupled to the processor 1210, a suitable transceiver 1240 coupled to the processor 1210, and a communication interface coupled to the transceiver 1240.
  • the memory 1220 stores at least a part of a program 1230.
  • the transceiver 1240 may be for bidirectional communications or a unidirectional communication based on requirements.
  • the transceiver 1240 may include at least one of a transmitter 1242 and a receiver 1244.
  • the transmitter 1242 and the receiver 1244 may be functional modules or physical entities.
  • the transceiver 1240 has at least one antenna to facilitate communication, though in practice an access node mentioned in this application may have several antennas.
  • the communication interface may represent any interface that is necessary for communication with other network elements, such as X2/Xn interface for bidirectional communications between eNBs/gNBs, S1/NG interface for communication between a Mobility Management Entity (MME) /Access and Mobility Management Function (AMF) /SGW/UPF and the eNB/gNB, Un interface for communication between the eNB/gNB and a relay node (RN) , or Uu interface for communication between the eNB/gNB and a terminal device.
  • MME Mobility Management Entity
  • AMF Access and Mobility Management Function
  • RN relay node
  • the program 1230 is assumed to include program instructions that, when executed by the associated processor 1210, enable the device 1200 to operate in accordance with the embodiments of the present disclosure, as discussed herein with reference to FIGS. 1A to 11.
  • the embodiments herein may be implemented by computer software executable by the processor 1210 of the device 1200, or by hardware, or by a combination of software and hardware.
  • the processor 1210 may be configured to implement various embodiments of the present disclosure.
  • a combination of the processor 1210 and memory 1220 may form processing means 1250 adapted to implement various embodiments of the present disclosure.
  • the memory 1220 may be of any type suitable to the local technical network and may be implemented using any suitable data storage technology, such as a non-transitory computer readable storage medium, semiconductor based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples. While only one memory 1220 is shown in the device 1200, there may be several physically distinct memory modules in the device 1200.
  • the processor 1210 may be of any type suitable to the local technical network, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples.
  • the device 1200 may have multiple processors, such as an application specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
  • a first communication device comprising a circuitry.
  • the circuitry is configured to: obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and perform a fusion of the sensing result and the positioning result.
  • the circuitry may be configured to perform any method implemented by the first communication device as discussed above.
  • a fifth communication device comprising a circuitry.
  • the circuitry is configured to: receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
  • the circuitry may be configured to perform any method implemented by the fifth communication device as discussed above.
  • circuitry used herein may refer to hardware circuits and/or combinations of hardware circuits and software.
  • the circuitry may be a combination of analog and/or digital hardware circuits with software/firmware.
  • the circuitry may be any portions of hardware processors with software including digital signal processor (s) , software, and memory (ies) that work together to cause an apparatus, such as a terminal device or a network device, to perform various functions.
  • the circuitry may be hardware circuits and/or processors, such as a microprocessor or a portion of a microprocessor, that require software/firmware for operation, but the software may not be present when it is not needed for operation.
  • the term circuitry also covers an implementation of merely a hardware circuit or processor (s) or a portion of a hardware circuit or processor (s) and its (or their) accompanying software and/or firmware.
  • various embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it will be appreciated that the blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

Embodiments of the present disclosure provide a solution for sensing and positioning fusion. In a solution, a first communication device obtains a sensing result of a target in a sensing area and a positioning result of the target. The positioning result indicates at least one position of the target in the sensing area. The first communication device performs a fusion of the sensing result and the positioning result.

Description

DEVICES AND METHODS FOR SENSING AND POSITIONING FUSION
FIELDS
Example embodiments of the present disclosure generally relate to the field of communication techniques and in particular, to devices and methods for sensing and positioning fusion.
BACKGROUND
Integrated Sensing and Communication (ISAC) is considered as a promising topic for future wireless network extension. ISAC involves the simultaneous use of radio frequency (RF) signals for both sensing and communication purposes. This integration can lead to improved spectrum efficiency, reduced latency, and enhanced reliability in various applications.
SUMMARY
In a first aspect, there is provided a first communication device comprising: a processor configured to cause the first communication device to: obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and perform a fusion of the sensing result and the positioning result.
In a second aspect, there is provided a fifth communication device comprising: a processor configured to cause the fifth communication device to: receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
In a third aspect, there is provided a communication method performed by a first communication device. The method comprises: obtaining a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least  one position of the target in the sensing area; and performing a fusion of the sensing result and the positioning result.
In a fourth aspect, there is provided a communication method performed by a fifth communication device. The method comprises: receiving, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmitting, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
In a fifth aspect, there is provided a computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, causing the at least one processor to carry out the method according to the third, or fourth aspect.
Other features of the present disclosure will become easily comprehensible through the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
Through the more detailed description of some example embodiments of the present disclosure in the accompanying drawings, the above and other objects, features and advantages of the present disclosure will become more apparent, wherein:
FIG. 1A illustrates an example communication environment in which embodiments of the present disclosure can be implemented;
FIG. 1B illustrates another example communication environment in which embodiments of the present disclosure can be implemented;
FIG. 2 illustrates a schematic diagram of example sensing modes in accordance with some embodiments of the present disclosure;
FIG. 3 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure;
FIG. 4A illustrates a signaling flow of an example process of sensing and positioning fusion performed at a sensing function (SF) device in accordance with some  embodiments of the present disclosure;
FIG. 4B illustrates a signaling flow of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure;
FIG. 4C illustrates a signaling flow of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure;
FIG. 5 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure;
FIG. 6 illustrates a signaling flow of an example process of sensing and positioning fusion performed at a Location Management Function (LMF) device in accordance with some embodiments of the present disclosure;
FIG. 7 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure;
FIG. 8 illustrates a signaling flow of an example process of sensing and positioning fusion performed at an Application Function (AF) device in accordance with some embodiments of the present disclosure;
FIG. 9 illustrates a signaling flow of an example process for the sensing and positioning fusion in accordance with some embodiments of the present disclosure;
FIG. 10 illustrates a flowchart of a method implemented at a first communication device according to some example embodiments of the present disclosure;
FIG. 11 illustrates a flowchart of a method implemented at a fifth communication device according to some example embodiments of the present disclosure;
FIG. 12 illustrates a simplified block diagram of an apparatus that is suitable for implementing example embodiments of the present disclosure.
Throughout the drawings, the same or similar reference numerals represent the same or similar element.
DETAILED DESCRIPTION
Principle of the present disclosure will now be described with reference to some  example embodiments. It is to be understood that these embodiments are described only for the purpose of illustration and help those skilled in the art to understand and implement the present disclosure, without suggesting any limitation as to the scope of the disclosure. Embodiments described herein can be implemented in various manners other than the ones described below.
In the following description and claims, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skills in the art to which this disclosure belongs.
As used herein, the term ‘terminal device’ refers to any device having wireless or wired communication capabilities. Examples of the terminal device include, but are not limited to, user equipment (UE), personal computers, desktops, mobile phones, cellular phones, smart phones, personal digital assistants (PDAs), portable computers, tablets, wearable devices, internet of things (IoT) devices, Ultra-reliable and Low Latency Communications (URLLC) devices, Internet of Everything (IoE) devices, machine type communication (MTC) devices, devices on vehicle for V2X communication where X means pedestrian, vehicle, or infrastructure/network, devices for Integrated Access and Backhaul (IAB), Space borne vehicles or Air borne vehicles in Non-terrestrial networks (NTN) including Satellites and High Altitude Platforms (HAPs) encompassing Unmanned Aircraft Systems (UAS), eXtended Reality (XR) devices including different types of realities such as Augmented Reality (AR), Mixed Reality (MR) and Virtual Reality (VR), the unmanned aerial vehicle (UAV) commonly known as a drone which is an aircraft without any human pilot, devices on high speed train (HST), or image capture devices such as digital cameras, sensors, gaming devices, music storage and playback appliances, or Internet appliances enabling wireless or wired Internet access and browsing and the like. The ‘terminal device’ can further have a ‘multicast/broadcast’ feature, to support public safety and mission critical, V2X applications, transparent IPv4/IPv6 multicast delivery, IPTV, smart TV, radio services, software delivery over wireless, group communications and IoT applications. It may also incorporate one or multiple Subscriber Identity Modules (SIM), also known as Multi-SIM. The term “terminal device” can be used interchangeably with a UE, a mobile station, a subscriber station, a mobile terminal, a user terminal or a wireless device.
The term “network device” refers to a device which is capable of providing or hosting a cell or coverage where terminal devices can communicate. Examples of a  network device include, but not limited to, a Node B (NodeB or NB) , an evolved NodeB (eNodeB or eNB) , a next generation NodeB (gNB) , a transmission reception point (TRP) , a remote radio unit (RRU) , a radio head (RH) , a remote radio head (RRH) , an IAB node, a low power node such as a femto node, a pico node, a reconfigurable intelligent surface (RIS) , and the like.
The terminal device or the network device may have Artificial intelligence (AI) or Machine learning capability. It generally includes a model which has been trained from numerous collected data for a specific function, and can be used to predict some information.
The terminal device or the network device may work on several frequency ranges, e.g., FR1 (e.g., 450 MHz to 6000 MHz), FR2 (e.g., 24.25 GHz to 52.6 GHz), frequency bands larger than 100 GHz as well as Tera Hertz (THz). It can further work on licensed/unlicensed/shared spectrum. The terminal device may have more than one connection with the network devices under the Multi-Radio Dual Connectivity (MR-DC) application scenario. The terminal device or the network device can work in full duplex, flexible duplex and cross division duplex modes.
The embodiments of the present disclosure may be performed in test equipment, e.g., signal generator, signal analyzer, spectrum analyzer, network analyzer, test terminal device, test network device, channel emulator. In some embodiments, the terminal device may be connected with a first network device and a second network device. One of the first network device and the second network device may be a master node and the other one may be a secondary node. The first network device and the second network device may use different radio access technologies (RATs) . In some embodiments, the first network device may be a first RAT device and the second network device may be a second RAT device. In some embodiments, the first RAT device is eNB and the second RAT device is gNB. Information related with different RATs may be transmitted to the terminal device from at least one of the first network device or the second network device. In some embodiments, first information may be transmitted to the terminal device from the first network device and second information may be transmitted to the terminal device from the second network device directly or via the first network device. In some embodiments, information related with configuration for the terminal device configured by the second network device may be transmitted from the second network device via the first network device. Information related with reconfiguration for the terminal device configured by the second network device may be transmitted to the terminal device from the second network device directly or via the first network device.
As used herein, the singular forms ‘a’ , ‘an’ and ‘the’ are intended to include the plural forms as well, unless the context clearly indicates otherwise. The term ‘includes’ and its variants are to be read as open terms that mean ‘includes, but is not limited to. ’ The term ‘based on’ is to be read as ‘at least in part based on. ’ The term ‘one embodiment’ and ‘an embodiment’ are to be read as ‘at least one embodiment. ’ The term ‘another embodiment’ is to be read as ‘at least one other embodiment. ’ The terms ‘first, ’ ‘second, ’ and the like may refer to different or same objects. Other definitions, explicit and implicit, may be included below.
In some examples, values, procedures, or apparatus are referred to as ‘best, ’ ‘lowest, ’ ‘highest, ’ ‘minimum, ’ ‘maximum, ’ or the like. It will be appreciated that such descriptions are intended to indicate that a selection among many used functional alternatives can be made, and such selections need not be better, smaller, higher, or otherwise preferable to other selections.
As used herein, the term “resource, ” “transmission resource, ” “uplink resource, ” or “downlink resource” may refer to any resource for performing a communication, such as a resource in time domain, a resource in frequency domain, a resource in space domain, a resource in code domain, or any other resource enabling a communication, and the like. In the following, unless explicitly stated, a resource in both frequency domain and time domain will be used as an example of a transmission resource for describing some example embodiments of the present disclosure. It is noted that example embodiments of the present disclosure are equally applicable to other resources in other domains.
As used herein, unless stated explicitly, performing a step “in response to A” does not indicate that the step is performed immediately after “A” occurs and one or more intervening steps may be included.
As used herein, the term “3rd Generation Partnership Project (3GPP) sensing data” may refer to data derived from 3GPP radio signals that are impacted (e.g., reflected, refracted, diffracted) by an object or environment of interest for sensing purposes, and optionally processed within the 5th Generation Mobile Communication Technology (5G) system.
As used herein, the term “5G Wireless sensing” may refer to 5G System (5GS) feature providing capabilities to get information about characteristics of an environment and/or objects within the environment (e.g., shape, size, orientation, speed, location, distances or relative motion between objects, etc. ) using New Radio (NR) radio frequency signals, which, in some cases, can be extended by information created via previously  specified functionalities in Evolved Packet Core (EPC) and/or Evolved UMTS Terrestrial Radio Access Network (E-UTRAN) .
As used herein, the term “non-3GPP sensing data” may refer to data provided by non-3GPP sensors (e.g., video, LiDAR, sonar) about an object or environment of interest for sensing purposes.
The term “sensing assistance information” may refer to information that is provided to 5G system and can be used to derive sensing result. The sensing assistance information may be, for example, map information, area information, a user equipment (UE) Identity (ID) attached to or in the proximity of the sensing target, UE position information, UE velocity information etc.
The term “sensing contextual information” may refer to information that is exposed with the sensing results by 5G system to a trusted third party which provides context to the conditions under which the sensing results were derived. The sensing contextual information may include, for example, map information, area information, time of capture, UE location and ID. The sensing contextual information can be required in scenarios where the sensing result is to be combined with data from other sources outside the 5GS.
The term “sensing group” may refer to a set of sensing transmitters and sensing receivers whose locations are known and whose sensing data can be collected synchronously.
The term “sensing transmitter” may be the entity that sends out the sensing signal which the sensing service will use in its operation. A Sensing transmitter is an NR RAN node or a UE. A Sensing transmitter can be located in the same or different entity as the Sensing receiver.
The term “sensing receiver” may be an entity that receives the sensing signal which the sensing service will use in its operation. A sensing receiver is an NR RAN node or a UE. A Sensing receiver can be located in the same or different entity as the Sensing transmitter.
The term “sensing signals” may refer to transmissions on the 3GPP radio interface that can be used for sensing purposes. The sensing signals may refer to NR radio frequency signals which, in some cases, may be extended by information created via  previously specified functionalities in EPC and/or E-UTRAN.
The term “sensing result” may refer to processed 3GPP sensing data requested by a service consumer.
The term “target sensing service area” may refer to a cartesian location area that needs to be sensed by deriving characteristics of an environment and/or objects within the environment with certain sensing service quality from the impacted (e.g., reflected, refracted, diffracted) 3GPP radio signals. This includes both indoor and outdoor environments.
As used herein, a sensing function (SF) device is a device having a core network function to trigger sensing, collect sensing result/report, and expose the sensing result/report to the 3rd party which is in or out of 3GPP scope.
As used herein, a sensing management function (SEMF) device is a device having a new RAN function between sensing function device and a network device to manage the sensing operation, including selecting a suitable network device, relaying the sensing request from the sensing function device to the network device, relaying the sensing result/report from the network device to the sensing function device.
As discussed above, ISAC is considered as a promising topic for future wireless network extension. According to the requirements of ISAC communication/sensing, how to identify and report a target in the network needs to be resolved.
Principles and implementations of the present disclosure will be described in detail below with reference to the figures.
FIG. 1A illustrates a schematic diagram of an example communication environment 100A in which example embodiments of the present disclosure can be implemented. In the communication environment 100A, a plurality of communication devices, including a Location Management Function (LMF) device 110, a sensing function (SF) device 120, and an Application Function (AF) device 130, can communicate with each other.
The LMF device 110 may be a location server or a device or a node implementing a function responsible for managing the location information of terminal devices, e.g., UEs. The SF device 120 may be a device or a node having a core network function to trigger sensing, collect sensing result/report, and expose the sensing result/report to the 3rd party which is in or out of 3GPP scope.
The AF device 130 may refer to a device or a node implementing an Application Function which is a network function that may interact with other nodes, such as, the Policy Control Function (PCF) and Session Management Function (SMF) , to provide specific services or applications. It may influence the network's behavior by requesting certain Quality of Service (QoS) parameters, policy rules, or data usage reports based on the requirements of the application it serves.
It is to be understood that the number of devices and their connections shown in FIG. 1A are only for the purpose of illustration without suggesting any limitation. For example, the LMF device 110 and/or the SF device 120 may directly connect to the AF device 130, or may connect to the AF device 130 via a Network Exposure Function (NEF) device which is not shown in FIG. 1A. The communication environment 100A may include any suitable number of devices configured to implement example embodiments of the present disclosure.
FIG. 1B illustrates another example communication environment 100B in which embodiments of the present disclosure can be implemented. The communication environment 100B is an implementation of the communication environment 100A.
In the communication environment 100B, a plurality of communication devices communicate with each other. As shown, the communication environment 100B comprises, in addition to the LMF device 110, the SF device 120 and the AF device 130, a NEF device 140 and an Access and Mobility Management Function (AMF) device 150. The communication environment 100B further comprises a terminal device 101, a network device 102, and a sensing management function (SEMF) device 103.
The NEF device 140 may be used to communicate information/data between the AF device 130 and the LMF device 110 or between the AF device 130 and the SF device 120.
The AMF device 150, also referred to as an AMF node, plays a pivotal role in managing the control plane functions related to user access and mobility. Specifically, the AMF node acts as an anchor point for control plane signaling for connected communication devices (e.g., UEs) in the environment 100B, ensuring smooth and secure user access while managing their mobility throughout the network.
As used herein, a sensing management function (SEMF) device 103 is a device having a new RAN function between sensing function device and a network device to manage the sensing operation, including selecting a suitable network device, relaying the  sensing request from the sensing function device to the network device, relaying the sensing result/report from the network device to the sensing function device.
The network device 102 and the terminal device 101 are in a radio access network (RAN) . The terminal device 101 may communicate with the network device 102. The network device 102 may communicatively connect with the SEMF device 103. The sensing management function device 103 may be connected to the SF device 120. The SEMF device 103 may be for example implemented at an Operation Administration and Maintenance (OAM) device or the AMF device 150.
It is to be noted that the OAM device or the AMF device 150 is just one option for acting as the SEMF device 103. In another option, the SEMF device 103 may be implemented as a node or device with a new function. In the following embodiments, the OAM device is shown as an example, while the SEMF device may replace the OAM device as another option even where the SEMF device is not explicitly shown.
In the following, for the purpose of illustration, some example embodiments are described with the terminal device 101 operating as a UE and the network device 102 operating as a base station. However, in some example embodiments, operations described in connection with a terminal device may be implemented at a network device or other device, and operations described in connection with a network device may be implemented at a terminal device or other device.
In some example embodiments, where the terminal device 101 operates as a terminal device and the network device 102 operates as a network device, a link from the network device 102 to the terminal device 101 is referred to as a downlink (DL), while a link from the terminal device 101 to the network device 102 is referred to as an uplink (UL). In DL, the network device 102 is a transmitting (TX) device (or a transmitter) and the terminal device 101 is a receiving (RX) device (or a receiver). In UL, the terminal device 101 is a TX device (or a transmitter) and the network device 102 is a RX device (or a receiver).
The communications in the communication environment 100A and/or 100B may conform to any suitable standards including, but not limited to, Global System for Mobile Communications (GSM), Long Term Evolution (LTE), LTE-Evolution, LTE-Advanced (LTE-A), New Radio (NR), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), GSM EDGE Radio Access Network (GERAN), Machine Type Communication (MTC) and the like. The embodiments of the present disclosure may be performed according to any generation communication protocols either currently known or to be developed in the future. Examples of the communication protocols include, but are not limited to, the first generation (1G), the second generation (2G), 2.5G, 2.75G, the third generation (3G), the fourth generation (4G), 4.5G, the fifth generation (5G) communication protocols, 5.5G, 5G-Advanced networks, or the sixth generation (6G) networks.
It is to be understood that the number of devices and their connections shown in FIGS. 1A and 1B are only for the purpose of illustration without suggesting any limitation. The communication environment 100A or 100B may include any suitable number of devices configured to implement example embodiments of the present disclosure. Although not shown, it is to be understood that one or more additional devices may be located in the cell, and one or more additional cells may be deployed in the communication environment.
There are generally two types of sensing modes defined based on Tx/Rx node of sensing signal, namely, monostatic and bi-static. These types of sensing modes include 6 specific modes, namely, Sensing Mode 1 which is a gNB mono-static sensing, Sensing Mode 2 which is gNB-to-UE bi-static sensing, Sensing Mode 3 which is gNB-to-gNB bi-static sensing, Sensing Mode 4 which is UE mono-static sensing, Sensing Mode 5 which is UE-to-gNB bi-static sensing, and Sensing Mode 6 which is UE-to-UE bi-static sensing.
FIG. 2 illustrates schematic diagrams of six example sensing modes in accordance with some example embodiments of the present disclosure. As shown in FIG. 2, in Sensing Mode 1, as indicated by 201, a sensing signal for sensing a target 230 is transmitted by a network device 210 and received or measured by the network device 210 itself. In Sensing Mode 2, as indicated by 202, a sensing signal for sensing the target 230 is transmitted by the network device 210 and received or measured by a terminal device 220. In Sensing Mode 3, as indicated by 203, a sensing signal for sensing the target 230 is transmitted by the network device 210 and received or measured by another network device 212.
In Sensing Mode 4, as indicated by 204, a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by the terminal device 220 itself. In Sensing Mode 5, as indicated by 205, a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by the network device 210. In Sensing Mode 6, as indicated by 206, a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by another terminal device 222.
It would be appreciated that the sensing modes illustrated in FIG. 2 are examples only and there may be many other sensing modes. It would be appreciated that more than one second communication device may be involved in a sensing service. It can be seen from the sensing modes in FIG. 2 that there may be various combinations of the devices which are to measure a sensing signal.
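For readers who want the six modes in one place, the sketch below encodes them in Python for illustration only; the class and member names are assumptions made here and are not part of the present disclosure.

```python
from enum import Enum


class Node(Enum):
    GNB = "gNB"
    UE = "UE"


class SensingMode(Enum):
    """The six sensing modes as (sensing transmitter, sensing receiver, monostatic)."""
    MODE_1 = (Node.GNB, Node.GNB, True)   # gNB mono-static sensing
    MODE_2 = (Node.GNB, Node.UE, False)   # gNB-to-UE bi-static sensing
    MODE_3 = (Node.GNB, Node.GNB, False)  # gNB-to-gNB bi-static sensing
    MODE_4 = (Node.UE, Node.UE, True)     # UE mono-static sensing
    MODE_5 = (Node.UE, Node.GNB, False)   # UE-to-gNB bi-static sensing
    MODE_6 = (Node.UE, Node.UE, False)    # UE-to-UE bi-static sensing

    @property
    def transmitter(self) -> Node:
        return self.value[0]

    @property
    def receiver(self) -> Node:
        return self.value[1]

    @property
    def monostatic(self) -> bool:
        return self.value[2]
```

For example, SensingMode.MODE_5.transmitter is Node.UE and SensingMode.MODE_5.receiver is Node.GNB, matching the mode definitions given above.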
Regarding the study on channel modelling for Integrated Sensing And Communication (ISAC) for NR, the focus of the study is to define channel modelling aspects to support object detection and/or tracking. The study may aim at a common modelling framework capable of detecting and/or tracking the following example objects and of enabling them to be distinguished from unintended objects: UAVs, humans indoors and outdoors, automotive vehicles (at least outdoors), automated guided vehicles (e.g. in indoor factories), and objects creating hazards on roads/railways, with a minimum size dependent on frequency.
All of the six sensing modes, as discussed with respect to FIG. 2, may be considered (i.e. TRP-TRP bistatic, TRP monostatic, TRP-UE bistatic, UE-TRP bistatic, UE-UE bistatic, UE monostatic) .
Frequencies from 0.5 to 52.6 GHz are the primary focus, with the assumption that the modelling approach should scale to 100 GHz. (If significant problems are identified with scaling above 52.6 GHz, the range above 52.6 GHz can be deprioritized. ) 
For the above use cases, sensing modes and frequencies:
● Identify details of the deployment scenarios corresponding to the above use cases.
● Define channel modelling details for sensing, taking into account relevant measurements, including: modelling of sensing targets and background environment, including, for example (if needed by the above use cases), radar cross-section (RCS), mobility and clutter/scattering patterns; spatial consistency.
Currently, as for an object (also referred to as a target) having a 3GPP ID (e.g. SUPI/IMSI) , there is a need to detect and/or track the object and to enable it to be  distinguished from unintended objects.
Embodiments of the present disclosure provide a solution for positioning and sensing fusion. In the solution, the 3GPP ID of a target is used to correlate a positioning result and a sensing result to track the target. The time stamp and location in the positioning results may be fused with the sensing results to identify the target and its trajectory. The fusion may be performed at a first communication device implementing a network logical function (for instance, referred to as Positioning and Sensing Fusion Function) .
Specifically, the first communication device obtains a sensing result of a target in a sensing area and a positioning result of the target. The positioning result indicates at least one position of the target in the sensing area. Then, the first communication device may perform a fusion of the sensing result and the positioning result.
It is to be understood that there is no temporal sequence requirement on obtaining the positioning result and the sensing result. The SF 120 may first obtain the positioning result and then the sensing result. Alternatively, the SF 120 may first obtain the sensing result and then the positioning result. As a further alternative, the SF 120 may obtain the positioning result and the sensing result in parallel.
The fusion may be performed in a variety of ways. In some embodiments, the first communication device may determine a trajectory of the target with timestamps from the sensing result, and may determine, from the positioning result, at least one time point corresponding to the at least one position of the target. Then, the first communication device performs the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
The result of the fusion may include various information, including, for example but not limited to, the trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, an identification of the target, and/or the like.
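As a concrete illustration of such a timestamp-based fusion, consider the following sketch. It is only one hypothetical realization under simplifying assumptions (2D positions, nearest-in-time association, and invented thresholds); none of the names below are defined by the present disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TrajectoryPoint:
    timestamp: float  # seconds, from the sensing result
    x: float          # metres, in some common reference frame
    y: float


@dataclass
class PositioningFix:
    timestamp: float  # time point from the positioning result
    x: float
    y: float


def fuse(sensed: List[TrajectoryPoint],
         fixes: List[PositioningFix],
         target_id: str,
         max_gap_s: float = 1.0,
         max_dist_m: float = 10.0) -> Optional[dict]:
    """Attribute a sensed trajectory to a target via its positioning fixes.

    Each positioning fix is associated with the nearest-in-time point of the
    sensed trajectory; if most fixes also lie close in space, the trajectory
    is attributed to the target identified by target_id.
    """
    if not sensed or not fixes:
        return None
    matched = 0
    for fix in fixes:
        nearest = min(sensed, key=lambda p: abs(p.timestamp - fix.timestamp))
        if abs(nearest.timestamp - fix.timestamp) > max_gap_s:
            continue
        if ((nearest.x - fix.x) ** 2 + (nearest.y - fix.y) ** 2) ** 0.5 <= max_dist_m:
            matched += 1
    if matched / len(fixes) >= 0.8:  # illustrative association threshold
        return {"target_id": target_id, "trajectory": sensed, "positions": fixes}
    return None
```

The nearest-in-time association is merely the simplest choice; an implementation could equally interpolate the sensed trajectory between timestamps before comparing positions.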
The first communication device may be a LMF device, a SF device, an AF device or other suitable device or node. More details will be discussed with reference to FIGS. 3 to 9.
For purpose of discussion, in embodiments of the present disclosure, the LMF  device is sometimes referred to as “LMF” for short, the SF device is sometimes referred to as “SF” , the AF device is sometimes referred to as “AF” , the AMF device is sometimes referred to as “AMF” , the NEF device is sometimes referred to as “NEF” . In addition, the target may be a terminal device, which may be sometimes referred to as UE as an example.
FIG. 3 illustrates a signaling flow 300 of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 3 will be discussed with respect to FIG. 1A and FIG. 1B. In the embodiments of FIG. 3, the first communication device fusing the positioning result and the sensing result may be a SF device, e.g., the SF device 120.
As shown in FIG. 3, the SF 120 obtains a positioning result and a sensing result and performs a fusion of them. Specifically, the LMF device 110 transmits (305) the positioning result. The positioning result indicates at least one position of the target in the sensing area. The SF 120 receives (310) the positioning result from the LMF device 110.
The SF 120 performs a sensing procedure to obtain (315) the sensing result. The “sensing result” may refer to processed 3GPP sensing data requested by a service consumer. The sensing result may include any desired information that can be derived from the measurement result (s) of the sensing signal (s). As some examples, the sensing result may include a distance of a target, a size of the target, a velocity of the target, a position of the target, a moving direction of the target, a surrounding environment of the target, a real-time map, or the like.
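Purely for illustration, a sensing result carrying such information might be represented as in the sketch below; the field names and units are assumptions, not a normative format.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class SensingResult:
    # Illustrative fields only; the actual content depends on what the
    # service consumer requested.
    timestamps: List[float] = field(default_factory=list)       # seconds
    positions: List[Tuple[float, float]] = field(default_factory=list)
    distance_m: Optional[float] = None
    size_m: Optional[float] = None
    velocity_mps: Optional[float] = None
    moving_direction_deg: Optional[float] = None
```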
It is to be understood that there is no temporal sequence requirement on obtaining the positioning result and the sensing result. The SF 120 may first obtain the positioning result and then the sensing result. Alternatively, the SF 120 may first obtain the sensing result and then the positioning result. As a further alternative, the SF 120 may obtain the positioning result and the sensing result in parallel. There is no limitation on the temporal sequence of obtaining the positioning result and the sensing result.
Then, the SF 120 performs (320) a fusion of the positioning result and the sensing result. In some embodiments, the SF 120 may determine a trajectory of the target with timestamps from the sensing result. The SF 120 may determine, from the positioning result, at least one time point corresponding to the at least one position of the target. Then, the SF device 120 may perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
The SF device 120 transmits (325) the result of the fusion to the AF device 130. The AF device 130 receives (330) the result of the fusion and may obtain the trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory. In some embodiments, the result of the fusion indicates the identification of the target as well, and the AF device 130 would know to which target the result of the fusion corresponds.
In some embodiments, the above fusing procedure may be initiated by the AF device 130, by transmitting a positioning and sensing fusion request. In this case, the SF 120 receives the positioning and sensing fusion request from the AF device 130 directly, or indirectly via the NEF device 140 (which are collectively referred to as the “second communication device” in some embodiments). The positioning and sensing fusion request may indicate that the sensing result and the positioning result of the target are to be fused. In some embodiments, the positioning and sensing fusion request may comprise an identification of the target. The first communication device then transmits, to the second communication device, a positioning and sensing fusion response indicating the result of the fusion. For example, the identification may be an ID of the terminal device 101, e.g., a SUbscription Permanent Identifier (SUPI) or an International Mobile Subscriber Identity (IMSI).
In addition, in some cases, the positioning and sensing fusion request may include a reporting time requirement. The reporting time requirement may include, for example, reporting the result of the fusion in a certain period (i.e., regularly) or in real time. In some embodiments, the first communication device (the SF device 120 in this case) determines the reporting time requirement from the positioning and sensing fusion request and transmits the positioning and sensing fusion response according to the reporting time requirement.
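A hypothetical encoding of such a request and its reporting requirement is sketched below; the field names (e.g. report_mode) are invented for illustration and do not come from the disclosure.

```python
from dataclasses import dataclass
from enum import Enum


class ReportMode(Enum):
    REAL_TIME = "real_time"
    REGULAR = "regular"      # report once per reporting period


@dataclass
class PositioningAndSensingFusionRequest:
    target_id: str                # 3GPP ID of the target, e.g. SUPI or IMSI
    report_mode: ReportMode
    report_period_s: float = 0.0  # used only when report_mode is REGULAR


def should_report(req: PositioningAndSensingFusionRequest,
                  elapsed_since_last_report_s: float) -> bool:
    """Decide whether a fusion result is due, per the reporting requirement."""
    if req.report_mode is ReportMode.REAL_TIME:
        return True
    return elapsed_since_last_report_s >= req.report_period_s
```

Under this sketch, REAL_TIME yields a response for every fusion update, while REGULAR batches updates once per reporting period.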
In some embodiments, the first communication device (the SF device 120 in this case) transmits, to the LMF device 110, a positioning service request to trigger a positioning procedure for the target. Then, the SF device 120 may receive the positioning result of the target from the LMF device 110.
In some embodiments, the first communication device (the SF device 120 in this case) transmits, to the SEMF device 103 or a network device 102, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a  first position associated with the target. The SF device 120 may then receive the sensing result of the target from the SEMF device 103 or the network device 102.
The first position may also be referred to as a first location, and may be determined in various ways. In some embodiments, the first position may be obtained from the positioning result received from the LMF device 110. Alternatively, the first position may be obtained from the positioning and sensing fusion request. As a further alternative, the SF device 120 may transmit a location request for the target to an AMF device 150 and may receive information about the first position from the AMF device 150. Still further, the SF device 120 may transmit, to the LMF device 110, a further positioning service request for the first position of the target and receive information about the first position from the LMF device 110.
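These alternatives could be combined into a simple fallback chain, sketched below with hypothetical helper names (query_amf_location, request_lmf_positioning) standing in for the corresponding signaling; the order of the alternatives is an assumption.

```python
from typing import Callable, Optional, Tuple

Position = Tuple[float, float]


def obtain_first_position(
        from_fusion_request: Optional[Position],
        from_positioning_result: Optional[Position],
        query_amf_location: Callable[[str], Optional[Position]],
        request_lmf_positioning: Callable[[str], Optional[Position]],
        target_id: str) -> Optional[Position]:
    """Try each source of the first position in turn (illustrative order)."""
    if from_fusion_request is not None:        # carried in the fusion request
        return from_fusion_request
    if from_positioning_result is not None:    # already received from the LMF
        return from_positioning_result
    pos = query_amf_location(target_id)        # coarse location from the AMF
    if pos is not None:
        return pos
    return request_lmf_positioning(target_id)  # trigger a fresh LMF positioning
```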
In some implementations, the AF device 130 sends the Positioning and Sensing Fusion request to the SF device 120 via the NEF, which includes the target 3GPP ID and the report granularity requirement (real time or regularly). The SF device 120 triggers the LMF device 110 to position the target, and the LMF device 110 reports the positioning result to the SF device 120. The SF device 120 then starts sensing in the area of the positioning result and receives the sensing result of the area with time stamps. The SF device 120 also triggers the LMF device 110 to perform multiple positionings of the target and receives the positioning results with time stamps. The SF device 120 fuses the sensing result and the multiple positioning results based on the time stamps to identify the target and obtain the target trajectory. The target trajectory is reported to the AF device 130 in real time or regularly according to the requirement from the AF device 130.
More details about the sensing and positioning fusion performed at the SF device 120 will be discussed with respect to FIGS. 4A to 4C. FIG. 4A illustrates a signaling flow 400A of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 4A will be discussed with respect to FIG. 1A and FIG. 1B. In the embodiments of FIG. 4A, the LMF device 110 provides accurate UE location.
Optionally, the target, for example, a UE or the terminal device 101 sends a Positioning and Sensing Fusion Request 399 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may for example include the 3GPP ID of the UE.
The AF device 130 transmits a Positioning and Sensing Fusion request 401 to  the NEF device 140, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) . The transmission of the Positioning and Sensing Fusion request 401 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
The NEF device 140 sends a Positioning and Sensing Fusion request 402 to the SF device 120, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
The SF device 120 transmits a Positioning Request 403 to the LMF device 110 to trigger positioning to the target. The 3GPP ID of the target may be included in the Positioning Request 403 from the SF device 120 to the LMF device 110.
The LMF device 110 initiates a positioning procedure 404 to position the target with the 3GPP ID.
The LMF device 110 reports the positioning result, for example, by transmitting a positioning report 405 to the SF device 120. The result includes a first position (also referred to as target location) and the timestamp corresponding to the first position.
The SF device 120 starts sensing in the area of the target location in a sensing procedure 406. The area may be wide and cover the target location, and may be determined according to the target location and the estimated movement of the target. The SF device 120 may generate the sensing result of the area with time stamps.
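One simple way to pick such an area is to grow a bounding box around the reported location by the distance the target could travel before sensing starts. The sketch below is illustrative only; the maximum speed and margin are assumed values, not parameters from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Area:
    x_min: float
    x_max: float
    y_min: float
    y_max: float


def sensing_area(target_x: float, target_y: float,
                 position_age_s: float,
                 max_speed_mps: float = 30.0,  # assumed movement bound
                 margin_m: float = 50.0) -> Area:
    """Bounding box guaranteed to cover the target under the speed bound."""
    reach = max_speed_mps * position_age_s + margin_m
    return Area(target_x - reach, target_x + reach,
                target_y - reach, target_y + reach)
```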
The SF device 120 may trigger a further positioning procedure 407, for example, by triggering the LMF device 110 to perform another positioning, or multiple positionings, of the target. With the further positioning procedure 407, the SF device 120 receives the positioning results with the target location and corresponding time stamps.
The SF device 120 performs a fusion 408 of the sensing result and the multiple positioning results, based on the sensing result, the target location and the time stamps, to identify the target and obtain the target trajectory.
The SF device 120 sends a Positioning and Sensing Fusion response 409 to the NEF device 140, which includes the result of the fusion, for example, the target trajectory and the 3GPP ID in real time or regularly according to the reporting time requirement from the AF device 130.
The NEF device 140 sends a further Positioning and Sensing Fusion response 410 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 409.
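Taken together, steps 401 to 410 amount to the following control flow at the SF device 120. The sketch below is a hypothetical paraphrase that reuses the TrajectoryPoint, PositioningFix and fuse sketches given earlier; the callable parameters stand in for the corresponding signaling and are not defined by the present disclosure.

```python
from typing import Callable, List, Optional


def handle_fusion_request_at_sf(
        target_id: str,
        position_once: Callable[[str], "PositioningFix"],
        sense_around: Callable[[float, float], List["TrajectoryPoint"]],
        position_series: Callable[[str], List["PositioningFix"]],
        send_response: Callable[[dict], None]) -> None:
    """Illustrative orchestration of flow 400A at the SF device (steps 403-409)."""
    first = position_once(target_id)         # steps 403-405: initial positioning
    sensed = sense_around(first.x, first.y)  # step 406: sense around that location
    fixes = position_series(target_id)       # step 407: further positioning fixes
    result: Optional[dict] = fuse(sensed, fixes, target_id)  # step 408: fusion
    if result is not None:
        send_response(result)                # step 409: response toward NEF/AF
```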
FIG. 4B illustrates a signaling flow 400B of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 4B will be discussed with respect to FIG. 1A and FIG. 1B.
In the embodiments of FIG. 4B, the AMF device 150 provides the first position of the target, which may be a UE or a terminal device, for example, the terminal device 101. The first position may be determined based on the Tracking Area (TA) list when the UE is in idle state, or may be the cell ID when the UE is in active state.
Optionally, the target, for example, a UE or the terminal device 101 sends a Positioning and Sensing Fusion Request 420 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may for example include the 3GPP ID of the UE.
The AF device 130 transmits a Positioning and Sensing Fusion request 421 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly). The transmission of the Positioning and Sensing Fusion request 421 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a flying plan.
The NEF device 140 sends a Positioning and Sensing Fusion request 422 to the SF device 120, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
The SF device 120 transmits a UE location request 423 to the AMF device 150 to trigger positioning to the target. The 3GPP ID of the target may be included in the UE location request 423.
The AMF device 150 transmits the stored UE location as the first position 424 of the target to the SF device 120. The first position may be determined based on the Tracking Area (TA) list when the UE is in idle state, or may be the cell ID when the UE is in active state. Alternatively, based on a configured policy, the AMF device 150 may initiate a location request procedure toward the RAN to obtain the UE location.
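The state-dependent behavior described here can be summarized as in the sketch below; ta_list_centroid and cell_centroid are hypothetical helpers mapping a TA list or cell ID to a representative position, and are not defined by the disclosure.

```python
from enum import Enum
from typing import Callable, List, Optional, Tuple

Position = Tuple[float, float]


class UeState(Enum):
    IDLE = "idle"
    ACTIVE = "active"


def amf_stored_location(state: UeState,
                        ta_list: List[str],
                        cell_id: Optional[str],
                        ta_list_centroid: Callable[[List[str]], Position],
                        cell_centroid: Callable[[str], Position]) -> Position:
    """Coarse UE location as the AMF would report it (illustrative only)."""
    if state is UeState.IDLE:
        return ta_list_centroid(ta_list)  # TA-list granularity when idle
    assert cell_id is not None
    return cell_centroid(cell_id)         # cell granularity when active
```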
The SF device 120 starts sensing in the area of the target location in a sensing procedure 426. The area may be wide and cover the target location, and may be determined according to the target location and the estimated movement of the target. The SF device 120 may generate the sensing result of the area with time stamps.
The SF device 120 may trigger a further positioning procedure 427, for example, by triggering the LMF device 110 to perform another positioning, or multiple positionings, of the target. With the further positioning procedure 427, the SF device 120 receives the positioning results with the target location and corresponding time stamps.
Then, the SF device 120 performs a fusion 428 based on the sensing result, the positions of the target and the time stamps, to identify the target and obtain the target trajectory.
The SF device 120 sends a Positioning and Sensing Fusion response 429 to the NEF device 140, which includes the result of the fusion, for example, the target trajectory and the 3GPP ID in real time or regularly according to the reporting time requirement from the AF device 130.
The NEF device 140 sends a further Positioning and Sensing Fusion response 430 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 429.
FIG. 4C illustrates a signaling flow 400C of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 4C will be discussed with respect to FIG. 1A and FIG. 1B. In the embodiments, the AF device 130 provides the first position of the terminal device 101.
Optionally, the target, for example, a UE or the terminal device 101, sends a Positioning and Sensing Fusion request 440 to the AF device 130, which includes the 3GPP ID of the UE and its location. For instance, the location may be the real time Global Navigation Satellite System (GNSS) /Global Positioning System (GPS) location or the location in the path/flying plan.
The AF device 130 transmits the Positioning and Sensing Fusion request 441 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly). The UE location (first position of the target 101) may be included in this Positioning and Sensing Fusion request 441, and may be obtained from the UE or from the path or flying plan.
The NEF device 140 sends a Positioning and Sensing Fusion request 442 to the SF device 120, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) . The Positioning and Sensing Fusion request 442 may include the UE location, that is, the first position of the target 101.
Based on the Positioning and Sensing Fusion request 442, the SF device 120 initiates a sensing procedure 446 for sensing the target 101. For instance, the SF device 120 starts sensing in the area of the target location in a sensing procedure 446. The area may be wide and cover the target location, being determined according to the target location and the estimated movement of the target 101. The SF device 120 may generate the sensing result of the area with time stamps. As such, the SF device 120 obtains the sensing result of the target 101.
The SF device 120 may trigger a further positioning procedure 447, for example, by triggering the LMF device 110 to perform one or more further positioning procedures for the target. With the further positioning procedure 447, the SF device 120 obtains the positioning results with positions of the target 101 and corresponding time stamps.
It is to be understood that there is no temporal sequence requirement between the positioning procedure 447 and the sensing procedure 446. The sensing procedure 446 may be performed before, after, or in parallel with the positioning procedure 447.
Then, the SF device 120 performs a fusion 448 based on the sensing result, the positions of the target and the time stamps, to identify the target and obtain the target trajectory.
The SF device 120 sends a Positioning and Sensing Fusion response 449 to the NEF device 140, which includes the result of the fusion, for example, the target trajectory and the 3GPP ID in real time or regularly according to the reporting time requirement from the AF device 130.
The NEF device 140 sends a further Positioning and Sensing Fusion response 450 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 449.
FIG. 5 illustrates a signaling flow 500 of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 5 will be discussed with respect to FIG. 1A and FIG. 1B. In the embodiments of FIG. 5, the first communication device fusing the positioning result and the sensing result may be an LMF device, e.g., the LMF device 110.
In embodiments of FIG. 5, the LMF device 110 performs a positioning procedure for the target to obtain the positioning result of the target. To obtain the sensing result, in some embodiments, the LMF device 110 transmits, to the SF device 120, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target. The LMF device 110 then receives the sensing result of the target from the SF device 120.
In some embodiments, the LMF device 110 may obtain the first position from the positioning result. Alternatively, the LMF device 110 may obtain the first position from a positioning and sensing fusion request, which is, for instance, received from the AF device 130 directly or indirectly.
As shown in FIG. 5, the LMF device 110 obtains a positioning result and a sensing result and performs a fusion of them. Specifically, in the signaling flow 500, the SF device 120 performs a sensing procedure and transmits (505) the sensing result to the LMF device 110. The LMF device 110 receives (510) the sensing result from the SF device 120. The sensing result may include any desired information that can be derived from the measurement result (s) of the sensing signal (s). As some examples, the sensing result may include a distance of a target, a size of the target, a velocity of the target, a position of the target, a moving direction of the target, a surrounding environment of the target, a real-time map, or the like.
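As a purely illustrative data model, one entry of such a sensing result could be represented as follows; every field name below is an assumption made for readability, and any subset of the fields may be present in a given entry.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class SensingResultEntry:
        # Hypothetical structure for illustration only.
        timestamp: float                                # time stamp of the measurement
        position: Optional[Tuple[float, float]] = None  # position of the sensed object
        distance_m: Optional[float] = None              # distance of the sensed object
        size_m: Optional[float] = None                  # size of the sensed object
        velocity_mps: Optional[float] = None            # velocity of the sensed object
        heading_deg: Optional[float] = None             # moving direction of the sensed object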
The LMF device 110 performs (515) a positioning procedure to obtain the positioning result. The positioning result indicates at least one position of the target in the sensing area.
It is to be understood that there is no temporal sequence requirement on obtaining the positioning result and the sensing result. The LMF device 110 may first obtain the positioning result and then the sensing result. Alternatively, the LMF device 110 may first obtain the sensing result and then the positioning result. As a further alternative, the LMF device 110 may obtain the positioning result and the sensing result in parallel. There is no limitation on the temporal sequence of obtaining the positioning  result and the sensing result.
Then, the LMF device 110 performs (520) a fusion of the positioning result and the sensing result. In some embodiments, the LMF device 110 may determine a trajectory of the target with timestamps from the sensing result. The LMF device 110 may determine, from the positioning result, at least one time point corresponding to the at least one position of the target. Thus, the LMF device 110 may perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
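Beyond identifying the target, the fusion may also refine the sensed trajectory with the positioning fixes, which are typically sparser but bound to the identity of the target. The sketch below blends each positioning fix into the trajectory sample nearest in time using a fixed weight; the weighting scheme is an assumption chosen for illustration, not a prescribed algorithm.

    def refine_trajectory(trajectory, fixes, weight=0.5):
        # trajectory: list of (timestamp, (x, y)) samples from the sensing result
        # fixes:      list of (time_point, (x, y)) from the positioning result
        # weight:     how strongly a fix pulls the nearest trajectory sample (0..1)
        refined = list(trajectory)
        for t_fix, p_fix in fixes:
            # Find the trajectory sample closest in time to this positioning fix.
            i = min(range(len(refined)), key=lambda k: abs(refined[k][0] - t_fix))
            t, p = refined[i]
            refined[i] = (t, (p[0] + weight * (p_fix[0] - p[0]),
                              p[1] + weight * (p_fix[1] - p[1])))
        return refined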
The LMF device 110 transmits (525) the result of the fusion to the AF device 130. The AF device 130 receives (530) the result of the fusion and may obtain the trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory. In some embodiments, the result of the fusion indicates the identification of the target as well, and the AF device 130 would know to which target the result of the fusion corresponds.
FIG. 6 illustrates a signaling flow 600 of an example process of sensing and positioning fusion performed at an LMF device, e.g., the LMF device 110, in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 6 will be discussed with respect to FIG. 1A and FIG. 1B.
Optionally, the target, for example, a UE or the terminal device 101 sends a Positioning and Sensing Fusion Request 599 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may for example include the 3GPP ID of the UE. The position of the UE, i.e., the first position of the target (also referred to as UE location or target location) , may be indicated via the Positioning and Sensing Fusion Request 599.
The AF device 130 transmits a Positioning and Sensing Fusion request 601 to the NEF device 140, which includes the target 3GPP ID, the report granularity requirement or report time requirement (real time or regularly) , and/or the like. The transmission of the Positioning and Sensing Fusion request 601 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
The NEF device 140 transmits a Positioning and Sensing Fusion request 602 to the LMF device 110. Based on the Positioning and Sensing Fusion request 602, the LMF device 110 initiates a positioning procedure 603 for the target using the 3GPP ID. In this way, the LMF device 110 obtains the location of the target 101 (i.e., the first position) and the corresponding timestamp.
It is to be understood that, if the first position is included in the Positioning and Sensing Fusion request 602, the LMF device 110 may skip the positioning procedure 603. Alternatively, in some embodiments, the LMF device 110 may initiate a UE location request to the AMF device 150 to obtain the first position of the target 101, for example, TA list or Cell ID.
The LMF device 110 may transmit a Sensing Service request 604 to the SF device 120 for sensing in the area covering the first position, the area being determined based on the first position and the estimated movement of the target.
The SF device 120 initiates sensing in the area of the target location in a sensing procedure 605. The area may be wide and cover the target location, being determined according to the target location and the estimated movement of the target. The SF device 120 may generate the sensing result of the area with time stamps.
Then, the SF device 120 may transmit a Sensing Response 606 to the LMF device 110 to report the Sensing Result with time stamps.
The LMF device 110 performs a fusion 607 of the sensing result and the positioning result, based on the sensing result, the target location and the time stamps, to identify the target and obtain the target trajectory. If needed, the LMF device 110 may perform one or more further positioning procedures for the target to obtain more target location information.
The LMF device 110 then transmits a Positioning and Sensing Fusion response 608 to the NEF device 140, which includes the target trajectory and the 3GPP ID in real time or regularly according to the requirement from the AF device 130.
The NEF device 140 sends a further Positioning and Sensing Fusion response 609 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 608.
FIG. 7 illustrates a signaling flow 700 of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 7 will be discussed with respect to FIG. 1A and FIG. 1B. In the embodiments of FIG. 7, the first communication device fusing the positioning result and the sensing result may be an AF device, e.g., the AF device 130.
To obtain the positioning result, the AF device 130 may transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target, and receive the positioning result of the target from the third communication device. The third communication device may be for example the LMF device 110, the NEF device 140, or other suitable device.
To obtain the sensing result, the AF device 130 may transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target. Then, the AF device 130 may receive the sensing result of the target from the fourth communication device. The fourth communication device may be for example the SF device 120, the NEF device 140, or other suitable device.
In some embodiments, the first position of the target 101 may be obtained from the positioning result. Alternatively, the first position may be obtained from position information received from the target 101. As a further alternative, the first position may be determined at the AF device based on a path plan or a flying plan.
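When the first position is determined at the AF device based on a path plan or a flying plan, it can, for example, be read off the plan at the time of interest. A minimal sketch follows, assuming the plan is a list of time-stamped waypoints sorted by time; the representation is hypothetical.

    def position_from_plan(plan, t):
        # plan: list of (timestamp, (x, y, z)) waypoints, sorted by timestamp
        # Returns the expected position at time t by linear interpolation.
        if t <= plan[0][0]:
            return plan[0][1]
        for (t0, p0), (t1, p1) in zip(plan, plan[1:]):
            if t0 <= t <= t1:
                w = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
                return tuple(a + w * (b - a) for a, b in zip(p0, p1))
        return plan[-1][1]  # past the last waypoint: hold the final position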
As shown in FIG. 7, the AF device 130 obtains a positioning result and a sensing result and performs a fusion of them. Specifically, in the signaling flow 700, the SF device 120 performs a sensing procedure to obtain the sensing result and transmits (705) the sensing result to the AF device 130. The sensing result may include any desired information that can be derived from the measurement result (s) of the sensing signal (s). As some examples, the sensing result may include a distance of a target, a size of the target, a velocity of the target, a position of the target, a moving direction of the target, a surrounding environment of the target, a real-time map, or the like.
The LMF device 110 performs a positioning procedure to obtain the positioning result and transmits (715) the positioning result to the AF 130. The positioning result indicates at least one position of the target in the sensing area. The AF 130 receives (710) the sensing result from the SF device 120 and receives (720) the positioning result from the LMF device 110.
It is to be understood that there is no temporal sequence requirement on obtaining the positioning result and the sensing result. Upon receiving the positioning  result and the sensing result, the AF device 130 performs (725) a fusion of the positioning result and the sensing result. In some embodiments, the AF device 130 may determine a trajectory of the target with timestamps from the sensing result. The AF device 130 may determine, from the positioning result, at least one time point corresponding to the at least one position of the target. Then, the AF device 130 may perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
FIG. 8 illustrates a signaling flow 800 of an example process of sensing and positioning fusion performed at an AF device, e.g., the AF device 130, in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 8 will be discussed with respect to FIG. 1A and FIG. 1B.
Optionally, the target, for example, a UE or the terminal device 101 sends a Positioning and Sensing Fusion Request 799 to the AF device 130 to trigger a positioning and sensing fusion for the UE, which may for example include the 3GPP ID of the UE. The position of the UE, i.e., the first position of the target (also referred to as UE location or target location) , may be indicated via the Positioning and Sensing Fusion Request 799.
The AF device 130 transmits a Positioning and Sensing Fusion request 801 to the NEF device 140, which includes the target 3GPP ID, the report granularity requirement or report time requirement (real time or regularly) , for example per 10 minutes. The transmission of the Positioning and Sensing Fusion request 801 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
The NEF device 140 transmits a Positioning and Sensing Fusion request 802 to the LMF device 110. Based on the Positioning and Sensing Fusion request 802, the LMF device 110 initiates a positioning procedure 803 for the target using the 3GPP ID. In this way, the LMF device 110 obtains the location of the target 101 (i.e., the first position) and the corresponding timestamp.
It is to be understood that, if the first position is included in the Positioning and Sensing Fusion request 802, the LMF device 110 may skip the positioning procedure 803. Alternatively, in some embodiments, the LMF device 110 may initiate a UE location request to the AMF device 150 to obtain the first position of the target 101, for example, TA list or Cell ID.
The LMF device 110 transmits a Positioning service response 804 to the NEF device 140 according to the location report granularity, including the target location and the timestamp. Then, the NEF device 140 transmits a Positioning service response 805 to the AF device 130.
To obtain the sensing result, the AF device 130 sends a sensing service request 806 to the NEF device 140 including the area covering the target location and the sensing report granularity, e.g. real time or per 1 minute. The area should consider the target location and the estimated movement of the target. Then the NEF device 140 transmits a further sensing service request 807 to the SF device 120. The AF device 130 may send the UE location (i.e., the first position of the target 101 or the UE 101) to the SF device 120, which may be obtained from the UE or based on the path/flying plan.
The SF device 120 initiates sensing in the area of the target location in a sensing procedure 808. The area may be wide and cover the target location, being determined according to the target location and the estimated movement of the target. The SF device 120 may generate the sensing result of the area with time stamps.
Then, the SF device 120 may transmit a Sensing Response 809 to the NEF device 140 to report the Sensing Result with time stamps. The NEF device 140 then transmits a further Sensing Response 810 to the AF device 130 to report the same. Thus, the AF device 130 may obtain the Sensing Result with time stamps.
The AF device 130 performs the fusion 811 of the sensing result and the positioning result to identify the target and the trajectory. The timestamp, the target location and the sensing result are correlated with the 3GPP ID of the target.
It is to be understood that the above steps are illustrated merely as examples, rather than suggesting any limitation. If the AF device 130 can obtain the first position (UE location) from the target or UE 101 or based on the path/flying plan, the positioning procedure and the sensing procedure may be performed in parallel.
In some embodiments, the positioning procedure and the sensing procedure may be triggered by the application layer event, e.g. the UTM needs to track the UAV based on the flying plan.
In addition to the above, embodiments of the present disclosure provide a communication device (also referred to as “fifth communication device” for purpose of  discussion) . The fifth communication device receives, from an AF device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device. The positioning and sensing fusion request comprises an identification of the target. The fifth communication device then transmits, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
In some embodiments, the fifth communication device receives, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result. The fifth communication device then transmits, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
In some embodiments, the fifth communication device may be a Network Exposure Function (NEF) device, e.g., the NEF device 140. The sixth communication device may be a sensing function device, e.g., the SF device 120, or an LMF device, e.g., the LMF device 110.
FIG. 9 illustrates a signaling flow 900 of an example process for the sensing and positioning fusion in accordance with some embodiments of the present disclosure. For purpose of discussion, the embodiments of FIG. 9 will be discussed with respect to FIG. 1A and FIG. 1B.
In the signaling flow 900, the AF 130 transmits (905) a positioning and sensing fusion request to the NEF device 140. The positioning and sensing fusion request indicates that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device 901. The positioning and sensing fusion request comprises an identification of the target. For example, the identification may be an ID of the terminal device 101, e.g., SUPI or IMSI.
The NEF device 140 receives (910) the positioning and sensing fusion request and transmits (915) a further positioning and sensing fusion request to the sixth communication device 901, which may be the LMF device 110 or the SF device 120, for example. This request indicates that the sensing result of the target and the positioning result of the target are to be fused at the sixth communication device. Thus, upon receiving (920) the request, the sixth communication device 901 may have the knowledge that the fusion is to be performed by itself.
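The role of the fifth communication device (here the NEF device 140) in this flow is essentially that of a validating relay between the AF device and the fusing device. A schematic sketch follows, with hypothetical function and field names reusing the request record sketched earlier.

    def relay_fusion_request_at_nef(request, route_to_sixth_device):
        # request: positioning and sensing fusion request from the AF device,
        #          carrying at least the identification of the target
        # route_to_sixth_device: callable that delivers a further request to the
        #          sixth communication device (e.g., SF or LMF device) and returns
        #          its positioning and sensing fusion response
        if not request.target_3gpp_id:
            raise ValueError("the fusion request must identify the target")
        response = route_to_sixth_device(request)  # further fusion request
        return response  # relayed back to the AF device as a further response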
The sixth communication device 901 transmits (925) a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result. The NEF device 140 receives (930), from the sixth communication device 901, the positioning and sensing fusion response and transmits (935) the result of the fusion of the sensing result and the positioning result to the AF device 130, for example via a further positioning and sensing fusion response or another message.
The AF device 130 receives (940) the further positioning and sensing fusion response and thus obtains the result of the fusion.
Embodiments of the present disclosure may have the following impacts. Table 1 shows an example of some impacts.
Table 1
Table 2 shows an example of some other impacts.
Table 2
As an alternative, or in addition, embodiments of the present disclosure may lead to the creation of a new ISAC SA2 Technical Specification (TS), including the introduction of the Positioning and Sensing Fusion function and the procedures described above.
FIG. 10 illustrates a flowchart of a communication method 1000 implemented at  a first communication device in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1000 will be described from the perspective of the first communication device.
At block 1010, the first communication device obtains a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area.
At block 1020, the first communication device performs a fusion of the sensing result and the positioning result.
In some example embodiments, the first communication device is further caused to: determine a trajectory of the target with timestamps from the sensing result; determine, from the positioning result, at least one time point corresponding to the at least one position of the target; and perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
In some example embodiments, a result of the fusion comprises at least one of: trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, or an identification of the target.
In some example embodiments, the first communication device comprises a sensing function device or a Location Management Function (LMF) device, and the first communication device is further caused to: receive, from a second communication device, a positioning and sensing fusion request indicating that the sensing result and the positioning result of the target are to be fused, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the second communication device, a positioning and sensing fusion response indicating a result of the fusion.
In some example embodiments, the second communication device comprises an Application Function (AF) device or a Network Exposure Function (NEF) device.
In some example embodiments, the first communication device is further caused to: determine a reporting time requirement from the positioning and sensing fusion request; and transmit the positioning and sensing fusion response according to the reporting time requirement.
In some example embodiments, the first communication device comprises the sensing function device, and wherein the first communication device is further caused to:  transmit, to a Location Management Function (LMF) device, a positioning service request to trigger a positioning procedure for the target; and receive, from the LMF device, the positioning result of the target.
In some example embodiments, the first communication device comprises the sensing function device, and wherein the first communication device is further caused to: transmit, to a sensing management function device or a network device, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive, from the sensing management function device or the network device, the sensing result of the target.
In some example embodiments, the first position is obtained from the positioning result received from a Location Management Function (LMF) device, or wherein the first position is obtained from the positioning and sensing fusion request.
In some example embodiments, the first communication device is further caused to: transmit a location request for the target to an Access and Mobility Management Function (AMF) device; and receive information about the first position from the AMF device.
In some example embodiments, the first communication device is further caused to: transmit, to a Location Management Function (LMF) device, a further positioning service request for the first position of the target; and receive information about the first position from the LMF device.
In some example embodiments, the first communication device comprises the LMF device, and the first communication device is further caused to: perform a positioning procedure for the target to obtain the positioning result of the target.
In some example embodiments, the first communication device comprises the LMF device, and the first communication device is further caused to: transmit, to a sensing function device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive the sensing result of the target from the sensing function device.
In some example embodiments, the first communication device is further caused to perform at least one of: obtaining the first position from the positioning result; or obtaining the first position from the positioning and sensing fusion request.
In some example embodiments, the first communication device comprises an Application Function (AF) device, and the first communication device is further caused to: transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target; and receive the positioning result of the target from the third communication device.
In some example embodiments, the third communication device comprises a Location Management Function (LMF) device or a Network Exposure Function (NEF) device.
In some example embodiments, the first communication device comprises an Application Function (AF) device, and the first communication device is further caused to: transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive the sensing result of the target from the fourth communication device.
In some example embodiments, the fourth communication device comprises a sensing function device or a Network Exposure Function (NEF) device.
In some example embodiments, the first position is obtained from the positioning result, or wherein the first position is obtained from position information received from the target, or wherein the first position is determined at the AF device based on a path plan.
FIG. 11 illustrates a flowchart of a communication method 1100 implemented at a fifth communication device in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1100 will be described from the perspective of the fifth communication device.
At block 1110, the fifth communication device receives, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device. The positioning and sensing fusion request comprises an identification of the target.
At block 1120, the fifth communication device transmits, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
In some example embodiments, the fifth communication device is further caused to: receive, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result; and transmit, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
In some example embodiments, the fifth communication device comprises a Network Exposure Function (NEF) device, and the sixth communication device comprises a sensing function device or a Location Management Function (LMF) device.
FIG. 12 is a simplified block diagram of a device 1200 that is suitable for implementing embodiments of the present disclosure. The device 1200 can be considered as a further example implementation of any of the devices as shown in FIG. 1. Accordingly, the device 1200 can be implemented at or as at least a part of the LMF device 110, the SF device 120 or the AF device 130.
As shown, the device 1200 includes a processor 1210, a memory 1220 coupled to the processor 1210, a suitable transceiver 1240 coupled to the processor 1210, and a communication interface coupled to the transceiver 1240. The memory 1220 stores at least a part of a program 1230. The transceiver 1240 may be used for bidirectional communications or unidirectional communication, based on requirements. The transceiver 1240 may include at least one of a transmitter 1242 and a receiver 1244. The transmitter 1242 and the receiver 1244 may be functional modules or physical entities. The transceiver 1240 has at least one antenna to facilitate communication, though in practice an Access Node mentioned in this application may have several antennas. The communication interface may represent any interface that is necessary for communication with other network elements, such as an X2/Xn interface for bidirectional communications between eNBs/gNBs, an S1/NG interface for communication between a Mobility Management Entity (MME)/Access and Mobility Management Function (AMF)/SGW/UPF and the eNB/gNB, an Un interface for communication between the eNB/gNB and a relay node (RN), or a Uu interface for communication between the eNB/gNB and a terminal device.
The program 1230 is assumed to include program instructions that, when executed by the associated processor 1210, enable the device 1200 to operate in accordance with the embodiments of the present disclosure, as discussed herein with  reference to FIGS. 1A to 11. The embodiments herein may be implemented by computer software executable by the processor 1210 of the device 1200, or by hardware, or by a combination of software and hardware. The processor 1210 may be configured to implement various embodiments of the present disclosure. Furthermore, a combination of the processor 1210 and memory 1220 may form processing means 1250 adapted to implement various embodiments of the present disclosure.
The memory 1220 may be of any type suitable to the local technical network and may be implemented using any suitable data storage technology, such as a non-transitory computer readable storage medium, semiconductor based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples. While only one memory 1220 is shown in the device 1200, there may be several physically distinct memory modules in the device 1200. The processor 1210 may be of any type suitable to the local technical network, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples. The device 1200 may have multiple processors, such as an application specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
According to embodiments of the present disclosure, a first communication device comprising a circuitry is provided. The circuitry is configured to: obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and perform a fusion of the sensing result and the positioning result. According to embodiments of the present disclosure, the circuitry may be configured to perform any method implemented by the first communication device as discussed above.
According to embodiments of the present disclosure, a fifth communication device comprising a circuitry is provided. The circuitry is configured to: receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused. According to embodiments of the present disclosure,  the circuitry may be configured to perform any method implemented by the fifth communication device as discussed above.
The term “circuitry” used herein may refer to hardware circuits and/or combinations of hardware circuits and software. For example, the circuitry may be a combination of analog and/or digital hardware circuits with software/firmware. As a further example, the circuitry may be any portions of hardware processors with software including digital signal processor (s) , software, and memory (ies) that work together to cause an apparatus, such as a terminal device or a network device, to perform various functions. In a still further example, the circuitry may be hardware circuits and or processors, such as a microprocessor or a portion of a microprocessor, that requires software/firmware for operation, but the software may not be present when it is not needed for operation. As used herein, the term circuitry also covers an implementation of merely a hardware circuit or processor (s) or a portion of a hardware circuit or processor (s) and its (or their) accompanying software and/or firmware.
According to embodiments of the present disclosure, a first communication apparatus is provided. The first communication apparatus comprises means for obtaining a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and means for performing a fusion of the sensing result and the positioning result. In some embodiments, the first apparatus may comprise means for performing the respective operations of the method 1000. In some example embodiments, the first apparatus may further comprise means for performing other operations in some example embodiments of the method 1000. The means may be implemented in any suitable form. For example, the means may be implemented in a circuitry or software module.
According to embodiments of the present disclosure, a fifth communication apparatus is provided. The fifth communication apparatus comprises means for receiving, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and means for transmitting, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused. In some embodiments, the fifth apparatus may comprise means for performing the respective operations of the method 1100. In some example embodiments, the fifth apparatus may further comprise means for performing other operations in some example embodiments of the method 1100. The means may be implemented in any suitable form. For example, the means may be implemented in a circuitry or software module.
In summary, embodiments of the present disclosure provide the following aspects.
In an aspect, it is proposed a first communication device comprising: a processor configured to cause the first communication device to: obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and perform a fusion of the sensing result and the positioning result.
In some embodiments, the first communication device is further caused to: determine a trajectory of the target with timestamps from the sensing result; determine, from the positioning result, at least one time point corresponding to the at least one position of the target; and perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
In some embodiments, a result of the fusion comprises at least one of: trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, or an identification of the target.
In some embodiments, the first communication device comprises a sensing function device or a Location Management Function (LMF) device, and the first communication device is further caused to: receive, from a second communication device, a positioning and sensing fusion request indicating that the sensing result and the positioning result of the target are to be fused, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the second communication device, a positioning and sensing fusion response indicating a result of the fusion.
In some embodiments, the second communication device comprises an Application Function (AF) device or a Network Exposure Function (NEF) device.
In some embodiments, the first communication device is further caused to: determine a reporting time requirement from the positioning and sensing fusion request; and transmit the positioning and sensing fusion response according to the reporting time requirement.
In some embodiments, the first communication device comprises the sensing function device, and wherein the first communication device is further caused to: transmit, to a sensing management function device or a network device, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive, from the sensing management function device or the network device, the sensing result of the target.
In some embodiments, the first position is obtained from the positioning result received from a Location Management Function (LMF) device, or wherein the first position is obtained from the positioning and sensing fusion request.
In some embodiments, the first communication device is further caused to: transmit a location request for the target to an Access and Mobility Management Function (AMF) device; and receive information about the first position from the AMF device.
In some embodiments, the first communication device is further caused to: transmit, to a Location Management Function (LMF) device, a further positioning service request for the first position of the target; and receive information about the first position from the LMF device.
In some embodiments, the first communication device comprises the LMF device, and the first communication device is further caused to: perform a positioning procedure for the target to obtain the positioning result of the target.
In some embodiments, the first communication device comprises the LMF device, and the first communication device is further caused to: transmit, to a sensing function device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive the sensing result of the target from the sensing function device.
In some embodiments, the first communication device is further caused to perform at least one of: obtaining the first position from the positioning result; or obtaining the first position from the positioning and sensing fusion request.
In some embodiments, the first communication device comprises an Application Function (AF) device, and the first communication device is further caused to: transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target; and receive the positioning result of the target from the third communication device.
In some embodiments, the third communication device comprises a Location Management Function (LMF) device or a Network Exposure Function (NEF) device.
In some embodiments, the first communication device comprises an Application Function (AF) device, and the first communication device is further caused to: transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive the sensing result of the target from the fourth communication device.
In some embodiments, the fourth communication device comprises a sensing function device or a Network Exposure Function (NEF) device.
In some embodiments, the first position is obtained from the positioning result, or wherein the first position is obtained from position information received from the target, or wherein the first position is determined at the AF device based on a path plan.
In an aspect, it is proposed a fifth communication device comprising: a processor configured to cause the fifth communication device to: receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
In some embodiments, the fifth communication device is further caused to: receive, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result; and  transmit, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
In some embodiments, the fifth communication device comprises a Network Exposure Function (NEF) device, and the sixth communication device comprises a sensing function device or a Location Management Function (LMF) device.
In an aspect, a first communication device comprises: at least one processor; and at least one memory coupled to the at least one processor and storing instructions thereon, the instructions, when executed by the at least one processor, causing the device to perform the method implemented by the first communication device discussed above.
In an aspect, a fifth communication device comprises: at least one processor; and at least one memory coupled to the at least one processor and storing instructions thereon, the instructions, when executed by the at least one processor, causing the device to perform the method implemented by the fifth communication device discussed above.
In an aspect, a computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, causing the at least one processor to perform the method implemented by the first communication device discussed above.
In an aspect, a computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, causing the at least one processor to perform the method implemented by the fifth communication device discussed above.
In an aspect, a computer program comprising instructions, the instructions, when executed on at least one processor, causing the at least one processor to perform the method implemented by the first communication device discussed above.
In an aspect, a computer program comprising instructions, the instructions, when executed on at least one processor, causing the at least one processor to perform the method implemented by the fifth communication device discussed above.
Generally, various embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other  computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it will be appreciated that the blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
The present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer readable storage medium. The computer program product includes computer-executable instructions, such as those included in program modules, being executed in a device on a target real or virtual processor, to carry out the process or method as described above with reference to FIGS. 1 to 12. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, or the like that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules may be located in both local and remote storage media.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
The above program code may be embodied on a machine readable medium, which may be any tangible medium that may contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine readable medium may be a machine readable signal medium or a machine readable storage medium. A machine readable medium may include but not limited to an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the machine readable storage medium would include an electrical connection having one or  more wires, a portable computer diskette, a hard disk, a random access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM or Flash memory) , an optical fiber, a portable compact disc read-only memory (CD-ROM) , an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are contained in the above discussions, these should not be construed as limitations on the scope of the present disclosure, but rather as descriptions of features that may be specific to particular embodiments. Certain features that are described in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable sub-combination.
Although the present disclosure has been described in language specific to structural features and/or methodological acts, it is to be understood that the present disclosure defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (25)

  1. A first communication device comprising:
    a processor configured to cause the first communication device to:
    obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and
    perform a fusion of the sensing result and the positioning result.
  2. The device of claim 1, wherein the first communication device is further caused to:
    determine a trajectory of the target with timestamps from the sensing result;
    determine, from the positioning result, at least one time point corresponding to the at least one position of the target; and
    perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
  3. The device of claim 1, wherein a result of the fusion comprises at least one of:
    trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, or
    an identification of the target.
  4. The device of any of claims 1 to 3, wherein the first communication device comprises a sensing function device or a Location Management Function (LMF) device, and the first communication device is further caused to:
    receive, from a second communication device, a positioning and sensing fusion request indicating that the sensing result and the positioning result of the target are to be fused, the positioning and sensing fusion request comprising an identification of the target; and
    transmit, to the second communication device, a positioning and sensing fusion response indicating a result of the fusion.
  5. The device of claim 4, wherein the second communication device comprises an Application Function (AF) device or a Network Exposure Function (NEF) device.
  6. The device of claim 4 or 5, wherein the first communication device is further caused to:
    determine a reporting time requirement from the positioning and sensing fusion request; and
    transmit the positioning and sensing fusion response according to the reporting time requirement.
  7. The device of any of claims 4 to 6, wherein the first communication device comprises the sensing function device, and wherein the first communication device is further caused to:
    transmit, to a Location Management Function (LMF) device, a positioning service request to trigger a positioning procedure for the target; and
    receive, from the LMF device, the positioning result of the target.
  8. The device of any of claims 4 to 7, wherein the first communication device comprises the sensing function device, and wherein the first communication device is further caused to:
    transmit, to a sensing management function device or a network device, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and
    receive, from the sensing management function device or the network device, the sensing result of the target.
  9. The device of claim 8, wherein the first position is obtained from the positioning result received from a Location Management Function (LMF) device, or
    wherein the first position is obtained from the positioning and sensing fusion request.
  10. The device of claim 8, wherein the first communication device is further caused to:
    transmit a location request for the target to an Access and Mobility Management Function (AMF) device; and
    receive information about the first position from the AMF device.
  11. The device of claim 8, wherein the first communication device is further caused to:
    transmit, to a Location Management Function (LMF) device, a further positioning service request for the first position of the target; and
    receive information about the first position from the LMF device.
  12. The device of any of claims 4 to 6, wherein the first communication device  comprises the LMF device, and the first communication device is further caused to:
    perform a positioning procedure for the target to obtain the positioning result of the target.
  13. The device of any of claims 4 to 12, wherein the first communication device comprises the LMF device, and the first communication device is further caused to:
    transmit, to a sensing function device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and
    receive the sensing result of the target from the sensing function device.
  14. The device of claim 13, wherein the first communication device is further caused to perform at least one of:
    obtaining the first position from the positioning result; or
    obtaining the first position from the positioning and sensing fusion request.
  15. The device of any of claims 1 to 3, wherein the first communication device comprises an Application Function (AF) device, and the first communication device is further caused to:
    transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target; and
    receive the positioning result of the target from the third communication device.
  16. The device of claim 15, wherein the third communication device comprises a Location Management Function (LMF) device or a Network Exposure Function (NEF) device.
  17. The device of any of claims 1 to 3 and 15-16, wherein the first communication device comprises an Application Function (AF) device, and the first communication device is further caused to:
    transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and
    receive the sensing result of the target from the fourth communication device.
  18. The device of claim 17, wherein the fourth communication device comprises a sensing function device or a Network Exposure Function (NEF) device.
  19. The device of claim 17 or 18, wherein the first position is obtained from the positioning result, or
    wherein the first position is obtained from position information received from the target, or
    wherein the first position is determined at the AF device based on a path plan.
  20. A fifth communication device comprising:
    a processor configured to cause the fifth communication device to:
    receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and
    transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
  21. The device of claim 20, wherein the fifth communication device is further caused to:
    receive, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result; and
    transmit, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
  22. The device of claim 20, wherein the fifth communication device comprises a Network Exposure Function (NEF) device, and the sixth communication device comprises a sensing function device or a Location Management Function (LMF) device.
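A minimal sketch of the relay role defined by claims 20 to 22: the fifth communication device (here an NEF) accepts the AF's fusion request, issues the further request toward the sixth communication device, and relays the response back. The message classes and the in-memory call standing in for real signaling are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class FusionRequest:
    target_id: str  # identification of the target (claim 20)


@dataclass
class FusionResponse:
    target_id: str
    fused_track: List[Tuple[float, Tuple[float, float, float]]]


class SixthDevice:
    """Stands in for a sensing function device or an LMF (claim 22)."""

    def handle_fusion_request(self, request: FusionRequest) -> FusionResponse:
        # In a real deployment, the sensing result and the positioning
        # result of the target would be fused here.
        return FusionResponse(request.target_id, fused_track=[(0.0, (1.0, 2.0, 0.0))])


class NefRelay:
    """The fifth communication device: relays requests and responses (claims 20-21)."""

    def __init__(self, sixth_device: SixthDevice) -> None:
        self.sixth_device = sixth_device

    def on_af_request(self, request: FusionRequest) -> FusionResponse:
        further_request = FusionRequest(request.target_id)              # claim 20
        response = self.sixth_device.handle_fusion_request(further_request)
        return FusionResponse(response.target_id, response.fused_track)  # claim 21


relay = NefRelay(SixthDevice())
print(relay.on_af_request(FusionRequest("target-001")))
```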
  23. A communication method implemented at a first communication device, comprising:
    obtaining a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and
    performing a fusion of the sensing result and the positioning result.
  24. A communication method implemented at a fifth communication device, comprising:
    receiving, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and
    transmitting, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
  25. A computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, causing the at least one processor to perform the method according to claim 23 or 24.
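Claims 23 to 25 do not prescribe a particular fusion algorithm. As a toy illustration only, the fusion step of claim 23 could be as simple as associating co-timed sensing and positioning samples and blending them with a fixed weight; the data layout and the weighting below are assumptions, not the claimed method.

```python
from typing import Dict, List, Tuple

Coord = Tuple[float, float, float]


def fuse(sensing: Dict[float, Coord], positioning: Dict[float, Coord],
         sensing_weight: float = 0.5) -> List[Tuple[float, Coord]]:
    """Weighted per-axis blend of co-timed sensing and positioning samples."""
    fused: List[Tuple[float, Coord]] = []
    for t in sorted(sensing.keys() & positioning.keys()):  # common timestamps
        s, p = sensing[t], positioning[t]
        blended = tuple(
            sensing_weight * sv + (1.0 - sensing_weight) * pv
            for sv, pv in zip(s, p)
        )
        fused.append((t, blended))
    return fused


sensing_result = {0.0: (10.2, 19.8, 1.4), 1.0: (11.1, 20.1, 1.5)}
positioning_result = {0.0: (10.0, 20.0, 1.5), 1.0: (11.0, 20.0, 1.5)}
print(fuse(sensing_result, positioning_result))
```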
PCT/CN2024/076493 2024-02-06 2024-02-06 Devices and methods for sensing and positioning fusion Pending WO2025166609A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2024/076493 WO2025166609A1 (en) 2024-02-06 2024-02-06 Devices and methods for sensing and positioning fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2024/076493 WO2025166609A1 (en) 2024-02-06 2024-02-06 Devices and methods for sensing and positioning fusion

Publications (1)

Publication Number Publication Date
WO2025166609A1

Family

ID=96698959

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2024/076493 Pending WO2025166609A1 (en) 2024-02-06 2024-02-06 Devices and methods for sensing and positioning fusion

Country Status (1)

Country Link
WO (1) WO2025166609A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023093644A1 (en) * 2021-11-25 2023-06-01 Vivo Mobile Communication Co., Ltd. Wireless sensing method and apparatus, and network side device and terminal
CN116347327A (en) * 2021-12-24 2023-06-27 Vivo Mobile Communication Co., Ltd. Location perception method, perception measurement method, device, terminal and network side equipment
WO2023116754A1 (en) * 2021-12-24 2023-06-29 Vivo Mobile Communication Co., Ltd. Target positioning sensing method and apparatus, communication device, and storage medium
WO2023116753A1 (en) * 2021-12-24 2023-06-29 Vivo Mobile Communication Co., Ltd. Positioning sensing method and apparatus, and related device

Similar Documents

Publication Publication Date Title
EP2878161B1 (en) Enhancing positioning in multi-plmn deployments
EP4557882A1 (en) Electronic device and method for wireless communication system, and storage medium
US20220377698A1 (en) Methods for communication, terminal device, network device, and computer readable media
CN118235488A (en) Device positioning
WO2025166609A1 (en) Devices and methods for sensing and positioning fusion
WO2025160820A1 (en) Devices and methods for target identification and report
WO2025145453A1 (en) Devices and methods for performing sensing process
WO2025175567A1 (en) Devices and methods for sensing coordination and fusion
WO2025086292A1 (en) Devices and methods for communication
WO2024239174A1 (en) Devices and methods for communication
WO2025160841A1 (en) Devices and methods for communication
WO2025199971A1 (en) Devices and methods for power allocation for sensing
US20250126590A1 (en) 2025-04-17 Apparatus, methods for apparatus and computer program products for location function including non-terrestrial access point
WO2024239294A1 (en) Devices and methods of communication
WO2025007277A1 (en) Devices and methods for communication
WO2024239295A1 (en) Devices, methods, and medium for communication
WO2025152175A1 (en) Devices and methods for communication
WO2025179596A1 (en) Devices and methods for communication
WO2024108445A1 (en) Methods, devices and medium for communication
WO2025060055A1 (en) Devices and methods for integrated sensing and communication
WO2025137968A1 (en) Devices and methods for sensing service management
WO2025156171A1 (en) Devices and methods for communication
WO2025000541A1 (en) Devices and methods for communication
WO2025199970A1 (en) Devices and methods for sensing resource allocation
WO2025199995A1 (en) Devices, methods, and medium for communication

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24922852

Country of ref document: EP

Kind code of ref document: A1