WO2025166609A1 - Devices and methods for fusing sensing and positioning results - Google Patents
- Publication number
- WO2025166609A1 (PCT/CN2024/076493)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensing
- target
- positioning
- result
- communication device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/20—Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
Definitions
- Example embodiments of the present disclosure generally relate to the field of communication techniques and in particular, to devices and methods for sensing and positioning fusion.
- ISAC: Integrated Sensing and Communication
- RF: radio frequency
- a first communication device comprising: a processor configured to cause the first communication device to: obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and perform a fusion of the sensing result and the positioning result.
- a fifth communication device comprising: a processor configured to cause the fifth communication device to: receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
- a communication method performed by a first communication device. The method comprises: obtaining a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and performing a fusion of the sensing result and the positioning result.
- a communication method performed by a fifth communication device.
- the method comprises: receiving, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmitting, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
- a computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, causing the at least one processor to carry out the method according to the third or fourth aspect.
- FIG. 1A illustrates an example communication environment in which embodiments of the present disclosure can be implemented
- FIG. 1B illustrates another example communication environment in which embodiments of the present disclosure can be implemented
- FIG. 2 illustrates schematic diagrams of example sensing modes in accordance with some embodiments of the present disclosure
- FIG. 3 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure
- FIG. 4A illustrates a signaling flow of an example process of sensing and positioning fusion performed at a sensing function (SF) device in accordance with some embodiments of the present disclosure
- FIG. 4B illustrates a signaling flow of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure
- FIG. 4C illustrates a signaling flow of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure
- FIG. 5 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure
- FIG. 6 illustrates a signaling flow of an example process of sensing and positioning fusion performed at a Location Management Function (LMF) device in accordance with some embodiments of the present disclosure
- FIG. 7 illustrates a signaling flow of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure
- FIG. 8 illustrates a signaling flow of an example process of sensing and positioning fusion performed at an Application Function (AF) device in accordance with some embodiments of the present disclosure
- FIG. 9 illustrates a signaling flow of an example process for the sensing and positioning fusion in accordance with some embodiments of the present disclosure
- FIG. 10 illustrates a flowchart of a method implemented at a first communication device according to some example embodiments of the present disclosure
- FIG. 11 illustrates a flowchart of a method implemented at a fifth communication device according to some example embodiments of the present disclosure
- FIG. 12 illustrates a simplified block diagram of an apparatus that is suitable for implementing example embodiments of the present disclosure.
- terminal device refers to any device having wireless or wired communication capabilities.
- examples of the terminal device include, but are not limited to, user equipment (UE), personal computers, desktops, mobile phones, cellular phones, smart phones, personal digital assistants (PDAs), portable computers, tablets, wearable devices, internet of things (IoT) devices, Ultra-reliable and Low Latency Communications (URLLC) devices, Internet of Everything (IoE) devices, machine type communication (MTC) devices, devices on a vehicle for V2X communication where X means pedestrian, vehicle, or infrastructure/network, devices for Integrated Access and Backhaul (IAB), spaceborne or airborne vehicles in Non-terrestrial networks (NTN) including satellites and High Altitude Platforms (HAPs) encompassing Unmanned Aircraft Systems (UAS), eXtended Reality (XR) devices covering different types of realities such as Augmented Reality (AR), Mixed Reality (MR) and Virtual Reality (VR), and unmanned aerial vehicles (UAVs)
- the ‘terminal device’ can further have a ‘multicast/broadcast’ feature, to support public safety and mission critical, V2X applications, transparent IPv4/IPv6 multicast delivery, IPTV, smart TV, radio services, software delivery over wireless, group communications and IoT applications. It may also incorporate one or multiple Subscriber Identity Modules (SIMs), also known as Multi-SIM.
- the term “terminal device” can be used interchangeably with a UE, a mobile station, a subscriber station, a mobile terminal, a user terminal or a wireless device.
- network device refers to a device which is capable of providing or hosting a cell or coverage where terminal devices can communicate.
- examples of a network device include, but are not limited to, a Node B (NodeB or NB), an evolved NodeB (eNodeB or eNB), a next generation NodeB (gNB), a transmission reception point (TRP), a remote radio unit (RRU), a radio head (RH), a remote radio head (RRH), an IAB node, a low power node such as a femto node or a pico node, a reconfigurable intelligent surface (RIS), and the like.
- the terminal device or the network device may have Artificial Intelligence (AI) or Machine Learning (ML) capability. Such capability generally includes a model that has been trained on numerous collected data for a specific function and can be used to predict some information.
- the terminal device or the network device may work on several frequency ranges, e.g., FR1 (e.g., 450 MHz to 6000 MHz), FR2 (e.g., 24.25 GHz to 52.6 GHz), frequency bands larger than 100 GHz, as well as Terahertz (THz). It can further work on licensed, unlicensed, or shared spectrum.
- the terminal device may have more than one connection with the network devices under Multi-Radio Dual Connectivity (MR-DC) application scenario.
- the terminal device or the network device can work on full duplex, flexible duplex and cross division duplex modes.
- the embodiments of the present disclosure may be performed in test equipment, e.g., signal generator, signal analyzer, spectrum analyzer, network analyzer, test terminal device, test network device, channel emulator.
- the terminal device may be connected with a first network device and a second network device.
- One of the first network device and the second network device may be a master node and the other one may be a secondary node.
- the first network device and the second network device may use different radio access technologies (RATs) .
- the first network device may be a first RAT device and the second network device may be a second RAT device.
- the first RAT device is eNB and the second RAT device is gNB.
- Information related with different RATs may be transmitted to the terminal device from at least one of the first network device or the second network device.
- first information may be transmitted to the terminal device from the first network device and second information may be transmitted to the terminal device from the second network device directly or via the first network device.
- information related with configuration for the terminal device configured by the second network device may be transmitted from the second network device via the first network device.
- Information related with reconfiguration for the terminal device configured by the second network device may be transmitted to the terminal device from the second network device directly or via the first network device.
- the singular forms ‘a’, ‘an’ and ‘the’ are intended to include the plural forms as well, unless the context clearly indicates otherwise.
- the term ‘includes’ and its variants are to be read as open terms that mean ‘includes, but is not limited to.’
- the term ‘based on’ is to be read as ‘at least in part based on.’
- the terms ‘one embodiment’ and ‘an embodiment’ are to be read as ‘at least one embodiment.’
- the term ‘another embodiment’ is to be read as ‘at least one other embodiment.’
- the terms ‘first,’ ‘second,’ and the like may refer to different or same objects. Other definitions, explicit and implicit, may be included below.
- values, procedures, or apparatus may be referred to as ‘best,’ ‘lowest,’ ‘highest,’ ‘minimum,’ ‘maximum,’ or the like. It will be appreciated that such descriptions are intended to indicate that a selection among many functional alternatives can be made, and such selections need not be better, smaller, higher, or otherwise preferable to other selections.
- the term “resource, ” “transmission resource, ” “uplink resource, ” or “downlink resource” may refer to any resource for performing a communication, such as a resource in time domain, a resource in frequency domain, a resource in space domain, a resource in code domain, or any other resource enabling a communication, and the like.
- a resource in both frequency domain and time domain will be used as an example of a transmission resource for describing some example embodiments of the present disclosure. It is noted that example embodiments of the present disclosure are equally applicable to other resources in other domains.
- performing a step “in response to A” does not indicate that the step is performed immediately after “A” occurs and one or more intervening steps may be included.
- 3rd Generation Partnership Project (3GPP) sensing data may refer to data derived from 3GPP radio signals that are impacted (e.g., reflected, refracted, diffracted) by an object or environment of interest for sensing purposes, and optionally processed within the 5th Generation Mobile Communication Technology (5G) system.
- wireless sensing may refer to a 5G System (5GS) feature providing capabilities to get information about characteristics of an environment and/or objects within the environment (e.g., shape, size, orientation, speed, location, distances or relative motion between objects, etc.) using New Radio (NR) radio frequency signals, which, in some cases, can be extended by information created via previously specified functionalities in Evolved Packet Core (EPC) and/or Evolved UMTS Terrestrial Radio Access Network (E-UTRAN).
- non-3GPP sensing data may refer to data provided by non-3GPP sensors (e.g., video, LiDAR, sonar) about an object or environment of interest for sensing purposes.
- sensing assistance information may refer to information that is provided to 5G system and can be used to derive sensing result.
- the sensing assistance information may be, for example, map information, area information, a user equipment (UE) Identity (ID) attached to or in the proximity of the sensing target, UE position information, UE velocity information etc.
- sensing contextual information may refer to information that is exposed with the sensing results by 5G system to a trusted third party which provides context to the conditions under which the sensing results were derived.
- the sensing contextual information may include, for example, map information, area information, time of capture, UE location and ID.
- the sensing contextual information can be required in scenarios where the sensing result is to be combined with data from other sources outside the 5GS.
- sensing group may refer to a set of sensing transmitters and sensing receivers whose locations are known and whose sensing data can be collected synchronously.
- sensing transmitter may be the entity that sends out the sensing signal which the sensing service will use in its operation.
- a Sensing transmitter is an NR RAN node or a UE.
- a Sensing transmitter can be located in the same or different entity as the Sensing receiver.
- sensing signals may refer to transmissions on the 3GPP radio interface that can be used for sensing purposes.
- the sensing signals may refer to NR radio frequency signals which, in some cases, may be extended by information created via previously specified functionalities in EPC and/or E-UTRAN.
- sensing result may refer to processed 3GPP sensing data requested by a service consumer.
- target sensing service area may refer to a cartesian location area that needs to be sensed by deriving characteristics of an environment and/or objects within the environment with certain sensing service quality from the impacted (e.g., reflected, refracted, diffracted) 3GPP radio signals. This includes both indoor and outdoor environments.
- a sensing function (SF) device is a device having a core network function to trigger sensing, collect the sensing result/report, and expose the sensing result/report to a third party which may be in or out of the 3GPP scope.
- a sensing management function (SEMF) device is a device having a new RAN function between the sensing function device and a network device to manage the sensing operation, including selecting a suitable network device, relaying the sensing request from the sensing function device to the network device, and relaying the sensing result/report from the network device to the sensing function device.
- ISAC is considered a promising topic for future wireless network extension. According to the requirements of ISAC communication/sensing, how to identify and report a target in the network needs to be resolved.
- FIG. 1B illustrates another example communication environment 100B in which embodiments of the present disclosure can be implemented.
- the communication environment 100B is an implementation of the communication environment 100A.
- in the communication environment 100B, a plurality of communication devices communicate with each other.
- the communication environment 100B comprises, in addition to the LMF device 110, the SF device 120, and the AF device 130, a Network Exposure Function (NEF) device 140 and an Access and Mobility Management Function (AMF) device 150.
- the communication environment 100B further comprises a terminal device 101, a network device 102, and a sensing management function (SEMF) device 103.
- the NEF device 140 may be used to communicate information/data between the AF device 130 and the LMF device 110 or between the AF device 130 and the SF device 120.
- the AMF device 150, also referred to as an AMF node, plays a pivotal role in managing the control plane functions related to user access and mobility. Specifically, the AMF node acts as an anchor point for control plane signaling for connected communication devices (e.g., UEs) in the environment 100B, ensuring smooth and secure user access while managing their mobility throughout the network.
- the network device 102 and the terminal device 101 are in a radio access network (RAN) .
- the terminal device 101 may communicate with the network device 102.
- the network device 102 may communicatively connect with the SEMF device 103.
- the sensing management function device 103 may be connected to the SF device 120.
- the SEMF device 103 may be for example implemented at an Operation Administration and Maintenance (OAM) device or the AMF device 150.
- the OAM device or the AMF device 150 is just one option for acting as the SEMF device 103.
- the SEMF device 103 may alternatively be implemented as a node or device with a new function.
- in the embodiments, OAM is used as an illustrative example, and the SEMF can replace OAM as another option in these cases.
- terminal device 101 operating as a UE
- network device 102 operating as a base station
- operations described in connection with a terminal device may be implemented at a network device or other device
- operations described in connection with a network device may be implemented at a terminal device or other device.
- a link from the network device 102 to the terminal device 101 is referred to as a downlink (DL)
- a link from the terminal device 101 to the network device 102 is referred to as an uplink (UL)
- the network device 102 is a transmitting (TX) device (or a transmitter)
- the terminal device 101 is a receiving (RX) device (or a receiver)
- the terminal device 101 is a TX device (or a transmitter) and the network device 102 is a RX device (or a receiver) .
- the communications in the communication environment 100A and/or 100B may conform to any suitable standards including, but not limited to, Global System for Mobile Communications (GSM), Long Term Evolution (LTE), LTE-Evolution, LTE-Advanced (LTE-A), New Radio (NR), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), GSM EDGE Radio Access Network (GERAN), Machine Type Communication (MTC) and the like.
- Examples of the communication protocols include, but not limited to, the first generation (1G) , the second generation (2G) , 2.5G, 2.75G, the third generation (3G) , the fourth generation (4G) , 4.5G, the fifth generation (5G) communication protocols, 5.5G, 5G-Advanced networks, or the sixth generation (6G) networks.
- the communication environment 100A or 100B may include any suitable number of devices configured to implement example embodiments of the present disclosure. Although not shown, it is to be understood that one or more additional devices may be located in the cell, and one or more additional cells may be deployed in the communication environment.
- sensing modes There are generally two types of sensing modes defined based on the Tx/Rx node of the sensing signal, namely, mono-static and bi-static. These types cover six specific modes, namely, Sensing Mode 1 which is gNB mono-static sensing, Sensing Mode 2 which is gNB-to-UE bi-static sensing, Sensing Mode 3 which is gNB-to-gNB bi-static sensing, Sensing Mode 4 which is UE mono-static sensing, Sensing Mode 5 which is UE-to-gNB bi-static sensing, and Sensing Mode 6 which is UE-to-UE bi-static sensing.
- FIG. 2 illustrates schematic diagrams of six example sensing modes in accordance with some example embodiments of the present disclosure.
- Sensing Mode 1 a sensing signal for sensing a target 230 is transmitted by a network device 210 and received or measured by the network device 210 itself.
- Sensing Mode 2 a sensing signal for sensing the target 230 is transmitted by the network device 210 and received or measured by a terminal device 220.
- Sensing Mode 3 as indicated by 203, a sensing signal for sensing the target 230 is transmitted by the network device 210 and received or measured by another network device 212.
- Sensing Mode 4 a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by the terminal device 220 itself.
- Sensing Mode 5 a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by the network device 210.
- Sensing Mode 6 a sensing signal for sensing the target 230 is transmitted by the terminal device 220 and received or measured by another terminal device 222.
- sensing modes illustrated in FIG. 2 are examples only and there may be many other sensing modes. It would be appreciated that more than one second communication device may be involved in a sensing service. It can be seen from the sensing modes in FIG. 2 that there may be various combinations of the devices which are to measure a sensing signal.
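For illustration only, the six sensing modes above can be tabulated by the node types that transmit and receive the sensing signal. The names and structure in this sketch are assumptions for illustration, not taken from any specification:

```python
from enum import Enum

class Node(Enum):
    GNB = "gNB"   # a network device, e.g., the network device 210 or 212
    UE = "UE"     # a terminal device, e.g., the terminal device 220 or 222

# mode number -> (Tx node type, Rx node type, monostatic?)
# Mono-static means the same entity transmits and receives the sensing signal.
SENSING_MODES = {
    1: (Node.GNB, Node.GNB, True),   # gNB mono-static
    2: (Node.GNB, Node.UE,  False),  # gNB-to-UE bi-static
    3: (Node.GNB, Node.GNB, False),  # gNB-to-gNB bi-static
    4: (Node.UE,  Node.UE,  True),   # UE mono-static
    5: (Node.UE,  Node.GNB, False),  # UE-to-gNB bi-static
    6: (Node.UE,  Node.UE,  False),  # UE-to-UE bi-static
}

def is_monostatic(mode: int) -> bool:
    """True when the sensing signal is received by its own transmitter."""
    return SENSING_MODES[mode][2]
```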
- the focus of the study is to define channel modelling aspects to support object detection and/or tracking.
- the study may aim at a common modelling framework capable of detecting and/or tracking the following example objects and to enable them to be distinguished from unintended objects: UAVs, humans indoors and outdoors, automotive vehicles (at least outdoors) , automated guided vehicles (e.g. in indoor factories) , and objects creating hazards on roads/railways, with a minimum size dependent on frequency.
- All of the six sensing modes may be considered (i.e. TRP-TRP bistatic, TRP monostatic, TRP-UE bistatic, UE-TRP bistatic, UE-UE bistatic, UE monostatic) .
- Frequencies from 0.5 to 52.6 GHz are the primary focus, with the assumption that the modelling approach should scale to 100 GHz. (If significant problems are identified with scaling above 52.6 GHz, the range above 52.6 GHz can be deprioritized. )
- an object, also referred to as a target, may be identified by a 3GPP ID (e.g., SUPI/IMSI)
- Embodiments of the present disclosure provide a solution for positioning and sensing fusion.
- the 3GPP ID of a target is used to correlate a positioning result and a sensing result to track the target.
- the time stamp and location in the positioning results may be fused with the sensing results to identify the target and its trajectory.
- the fusion may be performed at a first communication device implementing a network logical function (for instance, referred to as Positioning and Sensing Fusion Function) .
- the first communication device obtains a sensing result of a target in a sensing area and a positioning result of the target.
- the positioning result indicates at least one position of the target in the sensing area.
- the first communication device may perform a fusion of the sensing result and the positioning result.
- the fusion may be performed in a variety of ways.
- the first communication device may determine a trajectory of the target with timestamps from the sensing result, and may determine, from the positioning result, at least one time point corresponding to the at least one position of the target. Then, the first communication device performs the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
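One simple way to realize this time-based fusion is to pair each positioning fix with the nearest-in-time sample of the sensed trajectory. The sketch below is illustrative only; the matching rule and the `max_gap` tolerance are assumptions, not taken from the disclosure:

```python
from bisect import bisect_left

def fuse_by_timestamp(trajectory, fixes, max_gap=0.5):
    """Pair each positioning fix with the nearest-in-time trajectory sample.

    trajectory: list of (timestamp, x, y) derived from the sensing result.
    fixes:      list of (timestamp, x, y) from the positioning result.
    max_gap:    assumed tolerance in seconds; a fix with no trajectory
                sample within max_gap is left unmatched.
    Returns a list of (fix, trajectory_sample) pairs.
    """
    traj = sorted(trajectory)                  # sort samples by timestamp
    times = [t for t, _, _ in traj]
    pairs = []
    for fix in fixes:
        i = bisect_left(times, fix[0])
        # candidate samples: just before and just after the fix time
        cands = [j for j in (i - 1, i) if 0 <= j < len(times)]
        if not cands:
            continue
        j = min(cands, key=lambda k: abs(times[k] - fix[0]))
        if abs(times[j] - fix[0]) <= max_gap:
            pairs.append((fix, traj[j]))
    return pairs
```

Matched pairs can then be used to attach the target's identity (known from the positioning side) to the sensed trajectory.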
- the result of the fusion may include various information, including, for example, but not limited to, trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, an identification of the target, and/or the like.
- the first communication device may be a LMF device, a SF device, an AF device or other suitable device or node. More details will be discussed with reference to FIGS. 3 to 9.
- the LMF device is sometimes referred to as “LMF” for short
- the SF device is sometimes referred to as “SF”
- the AF device is sometimes referred to as “AF”
- the AMF device is sometimes referred to as “AMF”
- the NEF device is sometimes referred to as “NEF”
- the target may be a terminal device, which may be sometimes referred to as UE as an example.
- FIG. 3 illustrates a signaling flow 300 of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure.
- the first communication device fusing the positioning result and the sensing result may be a SF device, e.g., the SF device 120.
- the SF 120 obtains a positioning result and a sensing result and performs a fusion of them. Specifically, the LMF device 110 transmits (305) the positioning result. The positioning result indicates at least one position of the target in the sensing area. The SF 120 receives (310) the positioning result from the LMF device 110.
- the SF 120 performs a sensing procedure to obtain (315) the sensing result.
- the “sensing result” may refer to processed 3GPP sensing data requested by a service consumer.
- the sensing result may include any desired information that can be derived from the measurement result (s) of the sensing signal (s) .
- the sensing result may include a distance of the target, a size of the target, a velocity of the target, a position of the target, a moving direction of the target, a surrounding environment of the target, a real-time map, or the like.
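As a sketch only, such a sensing result could be carried in a record like the following; the field names and units are assumptions for illustration, not a 3GPP-defined structure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensingResult:
    """Illustrative sensing-result record; field names are assumptions."""
    timestamp: float                                  # capture time, seconds
    distance_m: Optional[float] = None                # distance to the target
    size_m: Optional[float] = None                    # estimated target size
    velocity_mps: Optional[float] = None              # target velocity
    position: Optional[Tuple[float, float]] = None    # position in the sensing area
    moving_direction_deg: Optional[float] = None      # heading of the target
```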
- the SF 120 may first obtain the positioning result and then the sensing result. Alternatively, the SF 120 may first obtain the sensing result and then the positioning result. As a further alternative, the SF 120 may obtain the positioning result and the sensing result in parallel. There is no limitation on the temporal sequence of obtaining the positioning result and the sensing result.
- the SF 120 performs (320) a fusion of the positioning result and the sensing result.
- the SF 120 may determine a trajectory of the target with timestamps from the sensing result.
- the SF 120 may determine, from the positioning result, at least one time point corresponding to the at least one position of the target.
- the SF device 120 may perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
- the SF device 120 transmits (325) the result of the fusion to the AF device 130.
- the AF device 130 receives (330) the result of the fusion and may obtain the trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory.
- the result of the fusion indicates the identification of the target as well, and the AF device 130 would know to which target the result of the fusion corresponds.
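For illustration, the result of the fusion delivered to the AF device could be modeled as a record bundling the target identification with the fused trajectory; this is a sketch with assumed names, not a defined message format:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FusionResult:
    """Illustrative fusion-result record; names are assumptions."""
    target_id: str   # identification of the target, e.g., a 3GPP ID
    # trajectory points as (timestamp, x, y), fused from the sensing
    # result and the positioning result
    trajectory: List[Tuple[float, float, float]] = field(default_factory=list)

    def positions(self):
        """Positions of the target along the trajectory, without timestamps."""
        return [(x, y) for _, x, y in self.trajectory]
```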
- the above fusing procedure may be initiated by the AF device 130, by transmitting a positioning and sensing fusion request.
- the SF 120 receives the positioning and sensing fusion request from the AF device 130 directly, or indirectly via the NEF device 140 (which are collectively referred to as the “second communication device” in some embodiments).
- the positioning and sensing fusion request may indicate that the sensing result and the positioning result of the target are to be fused.
- the positioning and sensing fusion request may comprise an identification of the target.
- the first communication device then transmits, to the second communication device, a positioning and sensing fusion response indicating the result of the fusion.
- the identification may be an ID of the terminal device 101, e.g., a Subscription Permanent Identifier (SUPI) or an International Mobile Subscriber Identification Number (IMSI).
- the first communication device (the SF device 120 in this case) transmits, to the LMF device 110, a positioning service request to trigger a positioning procedure for the target. Then, the SF device 120 may receive the positioning result of the target from the LMF device 110.
- the first communication device (the SF device 120 in this case) transmits, to the SEMF device 103 or a network device 102, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target.
- the SF device 120 may then receive the sensing result of the target from the SEMF device 103 or the network device 102.
- the first position may also be referred to as a first location, which may be determined in various ways.
- the first position may be obtained from the positioning result received from the LMF device 110.
- the first position may be obtained from the positioning and sensing fusion request.
- the SF device 120 may transmit a location request for the target to an AMF device 150 and may receive information about the first position from the AMF device 150.
- the SF device 120 may transmit, to the LMF device 110, a further positioning service request for the first position of the target and receive information about the first position from the LMF device 110.
- the AF device 130 sends the Positioning and Sensing Fusion request to the SF device 120 via NEF, which includes the target 3GPP ID and the report granularity requirement (real time or regularly) .
- the SF device 120 triggers the LMF device 110 for positioning to the target.
- the LMF device 110 reports the positioning result to the SF device 120.
- the SF device 120 starts sensing in the area indicated by the positioning result and receives the sensing result of the area with time stamps.
- the SF device 120 triggers the LMF device 110 to perform multiple positioning procedures for the target and receives the positioning results with time stamps.
- the SF device 120 fuses the sensing result and the multiple positioning results based on the time stamps to identify the target and obtain the target trajectory.
- the target trajectory is reported to the AF device 130 in real time or regularly, according to the requirement from the AF device 130.
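The timestamp-based fusion described above may be sketched, for illustration only, as follows. The data layout (anonymous sensed tracks keyed by a track ID, positioning fixes as timestamped 2D positions) and the nearest-timestamp matching criterion are assumptions for this sketch, not part of the disclosed signaling.

```python
import math

def fuse_tracks_with_fixes(sensed_tracks, positioning_fixes):
    """Match anonymous sensed tracks against the timestamped positioning
    fixes of a known target (3GPP ID) and return the best-matching track
    as the target trajectory.

    sensed_tracks: {track_id: [(timestamp, (x, y)), ...]}
    positioning_fixes: [(timestamp, (x, y)), ...]
    """
    def sample_nearest(track, ts):
        # track sample whose timestamp is closest to the positioning fix
        return min(track, key=lambda sample: abs(sample[0] - ts))

    best_id, best_err = None, float("inf")
    for track_id, track in sensed_tracks.items():
        errors = []
        for ts, (x, y) in positioning_fixes:
            _, (tx, ty) = sample_nearest(track, ts)
            errors.append(math.hypot(tx - x, ty - y))
        mean_err = sum(errors) / len(errors)
        if mean_err < best_err:
            best_id, best_err = track_id, mean_err
    # identified target track and its trajectory
    return best_id, sensed_tracks[best_id]
```

The track whose samples lie closest, on average, to the positioning fixes at the matching time stamps is taken to be the target; its samples then form the target trajectory.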
- FIG. 4A illustrates a signaling flow 400A of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure.
- the embodiments of FIG. 4A will be discussed with respect to FIG. 1A and FIG. 1B.
- the LMF device 110 provides accurate UE location.
- the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 399 to the AF device 130 to trigger a positioning and sensing fusion for the UE; the request may, for example, include the 3GPP ID of the UE.
- the AF device 130 transmits a Positioning and Sensing Fusion request 401 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
- the transmission of the Positioning and Sensing Fusion request 401 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
- the NEF device 140 sends a Positioning and Sensing Fusion request 402 to the SF device 120, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
- the SF device 120 transmits a Positioning Request 403 to the LMF device 110 to trigger positioning to the target.
- the 3GPP ID of the target may be included in the Positioning Request 403 from the SF device 120 to the LMF device 110.
- the LMF device 110 initiates a positioning procedure 404 to position the target with the 3GPP ID.
- the LMF device 110 reports the positioning result, for example, by transmitting a positioning report 405 to the SF device 120.
- the result includes a first position (also referred to as target location) and the timestamp corresponding to the first position.
- the SF device 120 may trigger a further positioning procedure 407, for example, by triggering the LMF device 110 to perform one or more further positioning procedures for the target. With the further positioning procedure 407, the SF device 120 receives the positioning results with the target locations and corresponding time stamps.
- the SF device 120 sends a Positioning and Sensing Fusion response 409 to the NEF device 140, which includes the result of the fusion, for example, the target trajectory and the 3GPP ID in real time or regularly according to the reporting time requirement from the AF device 130.
- the NEF device 140 sends a further Positioning and Sensing Fusion response 410 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 409.
- FIG. 4B illustrates a signaling flow 400B of an example process of sensing and positioning fusion performed at a SF device in accordance with some embodiments of the present disclosure.
- FIG. 4B will be discussed with respect to FIG. 1A and FIG. 1B.
- the AMF device 150 provides the first position of the target, which may be a UE or a terminal device, for example, the terminal device 101.
- the first position may be determined based on the Tracking Area (TA) list when the UE is in idle state, or may be the cell ID when the UE is in active state.
- the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 420 to the AF device 130 to trigger a positioning and sensing fusion for the UE; the request may, for example, include the 3GPP ID of the UE.
- the AF device 130 transmits a Positioning and Sensing Fusion request 421 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
- the transmission of the Positioning and Sensing Fusion request 421 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a flying plan.
- the NEF device 140 sends a Positioning and Sensing Fusion request 422 to the SF device 120, which includes the target 3GPP ID and the report granularity requirement (e.g. real time or regularly) .
- the SF device 120 transmits a UE location request 423 to the AMF device 150 to trigger positioning to the target.
- the 3GPP ID of the target may be included in the UE location request 423.
- the AMF device 150 transmits the stored UE location as the first position 424 of the target to the SF device 120.
- the first position may be determined based on the Tracking Area (TA) list when the UE is in idle state, or may be the cell ID when the UE is in active state. Alternatively, based on the configured policy, the AMF device 150 may initiate a location request procedure toward the RAN to obtain the UE location.
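The selection of the stored coarse location described above may be sketched as follows; the function name, state labels, and returned field names are illustrative assumptions, not a 3GPP-defined interface.

```python
def coarse_first_position(ue_state, ta_list=None, cell_id=None):
    """Return the coarse UE location an AMF can provide without a new
    positioning procedure: the Tracking Area list for an idle UE, or
    the serving cell ID for an active UE."""
    if ue_state == "idle":
        return {"granularity": "ta_list", "value": ta_list}
    if ue_state == "active":
        return {"granularity": "cell_id", "value": cell_id}
    raise ValueError(f"unknown UE state: {ue_state!r}")
```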
- the SF device 120 starts sensing in the area of the target location in a sensing procedure 426.
- the area may be wide and cover the target location, determined according to the target location and the estimated movement of the target.
- the SF device 120 may generate the sensing result of the area with time stamps.
- the SF device 120 may trigger a further positioning procedure 427, for example, by triggering the LMF device 110 to perform one or more further positioning procedures for the target. With the further positioning procedure 427, the SF device 120 receives the positioning results with the target locations and corresponding time stamps.
- the SF device 120 performs a fusion 428 based on the sensing result, the positions of the target and the time stamps, to identify the target and obtain the target trajectory.
- the SF device 120 may trigger a further positioning procedure 447, for example, by triggering the LMF device 110 to perform one or more further positioning procedures for the target. With the further positioning procedure 447, the SF device 120 obtains the positioning results with positions of the target 101 and corresponding time stamps.
- the sensing procedure 446 may be performed before, after, or in parallel with the positioning procedure 447.
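A sensing area covering the first position and the estimated movement of the target may, under a simple circular-area assumption, be derived as follows; the function name, parameters, and the circular model are illustrative assumptions only.

```python
def sensing_area(first_position, position_timestamp, now, est_speed, margin=100.0):
    """Circular sensing area centred on the first position, widened by
    the distance the target may have covered since the position was
    taken (est_speed in m/s), plus a fixed margin in metres."""
    elapsed = max(0.0, now - position_timestamp)
    return {"center": first_position, "radius": est_speed * elapsed + margin}
```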
- FIG. 5 illustrates a signaling flow 500 of an example process of sensing and positioning fusion in accordance with some embodiments of the present disclosure.
- the first communication device fusing the positioning result and the sensing result may be a LMF device, e.g., the LMF device 110.
- the LMF device 110 performs a positioning procedure for the target to obtain the positioning result of the target.
- the LMF device 110 transmits, to the SF device 120, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target.
- the LMF device 110 then receives the sensing result of the target from the SF device 120.
- the LMF device 110 may obtain the first position from the positioning result.
- the LMF device 110 may obtain the first position from a positioning and sensing fusion request, which is, for instance, received from the AF device 130 directly or indirectly.
- the LMF device 110 performs (520) a fusion of the positioning result and the sensing result.
- the LMF device 110 may determine a trajectory of the target with timestamps from the sensing result.
- the LMF device 110 may determine, from the positioning result, at least one time point corresponding to the at least one position of the target.
- the LMF device 110 may perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
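One possible reading of this fusion step (interpolating the sensed trajectory at the positioning time points and checking agreement with the positioned locations) may be sketched as follows; the linear interpolation and the tolerance check are illustrative assumptions, not the disclosed procedure itself.

```python
import math

def interpolate(trajectory, t):
    """Linearly interpolate a sensed trajectory [(ts, (x, y)), ...],
    sorted by timestamp, at time t; clamp outside the sensed window."""
    if t <= trajectory[0][0]:
        return trajectory[0][1]
    if t >= trajectory[-1][0]:
        return trajectory[-1][1]
    for (t0, (x0, y0)), (t1, (x1, y1)) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

def matches_target(trajectory, positioning_fixes, tolerance):
    """Accept the sensed trajectory as the target's if, at every
    positioning time point, the interpolated sensed position lies
    within `tolerance` metres of the positioned location."""
    return all(
        math.dist(interpolate(trajectory, ts), position) <= tolerance
        for ts, position in positioning_fixes)
```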
- FIG. 6 illustrates a signaling flow 600 of an example process of sensing and positioning fusion performed at the LMF device 110 in accordance with some embodiments of the present disclosure.
- the embodiments of FIG. 6 will be discussed with respect to FIG. 1A and FIG. 1B.
- the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 599 to the AF device 130 to trigger a positioning and sensing fusion for the UE; the request may, for example, include the 3GPP ID of the UE.
- the position of the UE, i.e., the first position of the target (also referred to as the UE location or target location) , may be indicated via the Positioning and Sensing Fusion Request 599.
- the AF device 130 transmits a Positioning and Sensing Fusion request 601 to the NEF device 140, which includes the target 3GPP ID, the report granularity requirement or report time requirement (real time or regularly) , and/or the like.
- the transmission of the Positioning and Sensing Fusion request 601 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
- the NEF device 140 transmits a Positioning and Sensing Fusion request 602 to the LMF device 110. Based on the Positioning and Sensing Fusion request 602, the LMF device 110 initiates a positioning procedure 603 for the target using the 3GPP ID. In this way, the LMF device 110 obtains the location of the target 101 (i.e., the first position) and the corresponding timestamp.
- the LMF device 110 may transmit a Sensing Service request 604 to the SF device 120 for sensing to the area covering the first position based on the estimated movement of the target and the first position.
- the SF device 120 initiates sensing in the area of the target location in a sensing procedure 605.
- the area may be wide and cover the target location, determined according to the target location and the estimated movement of the target.
- the SF device 120 may generate the sensing result of the area with time stamps.
- the SF device 120 may transmit a Sensing Response 606 to the LMF device 110 to report the Sensing Result with time stamps.
- the LMF device 110 performs a fusion 607 of the sensing result and the positioning result, based on the sensing result, the target location, and the time stamps, to identify the target and obtain the target trajectory. If needed, the LMF device 110 may perform one or more further positioning procedures for the target to obtain more target location information.
- the LMF device 110 then transmits a Positioning and Sensing Fusion response 608 to the NEF device 140, which includes the target trajectory and the 3GPP ID in real time or regularly according to the requirement from the AF device 130.
- the NEF device 140 sends a further Positioning and Sensing Fusion response 609 to the AF device 130, including the result of the fusion received from the Positioning and Sensing Fusion response 608.
- the AF device 130 may transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target, and receive the positioning result of the target from the third communication device.
- the third communication device may be for example the LMF device 110, the NEF device 140, or other suitable device.
- the AF device 130 may transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target. Then, the AF device 130 may receive the sensing result of the target from the fourth communication device.
- the fourth communication device may be for example the SF device 120, the NEF device 140, or other suitable device.
- the first position of the target 101 may be obtained from the positioning result.
- the first position may be obtained from position information received from the target 101.
- the first position may be determined at the AF device based on a path plan or a flying plan.
- the AF 130 obtains a positioning result and a sensing result and performs a fusion of them.
- the SF 120 performs a sensing procedure to obtain the sensing result and transmits (705) the sensing result to the AF 130.
- the sensing result may include any desired information that can be derived from the measurement result (s) of the sensing signal (s) .
- the sensing result may include a distance of a target, a size of the target, a velocity of the target, a position of the target, a moving direction of the target, a surrounding environment of the target, real-time map, or the like.
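The sensing-result fields listed above may, for illustration, be grouped in a simple container; the field names and types below are assumptions for this sketch, not a 3GPP-defined format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensingResult:
    """Illustrative grouping of the sensing-result fields listed above."""
    timestamp: float
    distance: Optional[float] = None                  # range to the target
    size: Optional[float] = None                      # estimated target size
    velocity: Optional[float] = None                  # speed estimate
    position: Optional[Tuple[float, float]] = None    # sensed position
    moving_direction: Optional[float] = None          # heading, in degrees
```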
- the LMF device 110 performs a positioning procedure to obtain the positioning result and transmits (715) the positioning result to the AF 130.
- the positioning result indicates at least one position of the target in the sensing area.
- the AF 130 receives (710) the sensing result from the SF device 120 and receives (720) the positioning result from the LMF device 110.
- FIG. 8 illustrates a signaling flow 800 of an example process of sensing and positioning fusion performed at the AF device 130 in accordance with some embodiments of the present disclosure.
- FIG. 8 will be discussed with respect to FIG. 1A and FIG. 1B.
- the target, for example a UE such as the terminal device 101, sends a Positioning and Sensing Fusion Request 799 to the AF device 130 to trigger a positioning and sensing fusion for the UE; the request may, for example, include the 3GPP ID of the UE.
- the position of the UE, i.e., the first position of the target (also referred to as the UE location or target location) , may be indicated via the Positioning and Sensing Fusion Request 799.
- the AF device 130 transmits a Positioning and Sensing Fusion request 801 to the NEF device 140, which includes the target 3GPP ID and the report granularity requirement or report time requirement (real time or regularly, for example every 10 minutes) .
- the transmission of the Positioning and Sensing Fusion request 801 may be triggered by an application layer event, e.g., Unmanned Aircraft System (UAS) Traffic Management (UTM) needs to track an Unmanned Aerial Vehicle (UAV) based on a path plan or a flying plan.
- the NEF device 140 transmits a Positioning and Sensing Fusion request 802 to the LMF device 110. Based on the Positioning and Sensing Fusion request 802, the LMF device 110 initiates a positioning procedure 803 for the target using the 3GPP ID. In this way, the LMF device 110 obtains the location of the target 101 (i.e., the first position) and the corresponding timestamp.
- the LMF device 110 may skip the positioning procedure 803. Alternatively, in some embodiments, the LMF device 110 may initiate a UE location request to the AMF device 150 to obtain the first position of the target 101, for example, a TA list or a cell ID.
- the LMF device 110 transmits a Positioning service response 804 to the NEF device 140 according to the location report granularity, including the target location and the timestamp. Then, the NEF device 140 transmits a Positioning service response 805 to the AF device 130.
- the AF device 130 sends a sensing service request 806 to the NEF device 140, including the area covering the target location and the sensing report granularity, e.g., real time or per minute. The area should account for the target location and the estimated movement of the target. Then the NEF device 140 transmits a further sensing service request 807 to the SF device 120.
- the AF device 130 may send the UE location (i.e., the first position of the target 101 or the UE 101) to the SF device 120, which may be obtained from UE or based on the path/flying plan.
- the SF device 120 initiates sensing in the area of the target location in a sensing procedure 808.
- the area may be wide and cover the target location, determined according to the target location and the estimated movement of the target.
- the SF device 120 may generate the sensing result of the area with time stamps.
- the SF device 120 may transmit a Sensing Response 809 to the NEF 140 to report the Sensing Result with time stamps.
- the NEF 140 then transmits a further Sensing Response 810 to the AF device 130 to report the same.
- the AF device 130 may obtain the Sensing Result with time stamps.
- the AF device 130 performs the fusion 811 of the sensing result and the positioning result to identify the target and the trajectory.
- the timestamp, the target location and the sensing result are correlated with the 3GPP ID of the target.
- the AF device 130 can obtain the first position/UE location from the target or UE 101 or based on the path/flying plan.
- the positioning procedure and the sensing procedure may be performed in parallel.
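The parallel execution mentioned above may be sketched as follows; the two callables stand in for the actual network procedures, and the thread-pool approach is an illustrative assumption only.

```python
from concurrent.futures import ThreadPoolExecutor

def run_in_parallel(positioning_procedure, sensing_procedure):
    """Run the positioning and sensing procedures concurrently and
    return (positioning_result, sensing_result)."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        positioning = pool.submit(positioning_procedure)
        sensing = pool.submit(sensing_procedure)
        # both procedures run concurrently; gather both results
        return positioning.result(), sensing.result()
```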
- the positioning procedure and the sensing procedure may be triggered by the application layer event, e.g. the UTM needs to track the UAV based on the flying plan.
- embodiments of the present disclosure provide a communication device (also referred to as a "fifth communication device" for the purpose of discussion) .
- the fifth communication device receives, from an AF device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device.
- the positioning and sensing fusion request comprises an identification of the target.
- the fifth communication device then transmits, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
- the fifth communication device receives, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result.
- the fifth communication device then transmits, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
- the fifth communication device may be a Network Exposure Function (NEF) device, e.g., the NEF device 140.
- the sixth communication device may be a sensing function device, e.g. the SF device 120 or a LMF device, e.g., the LMF device 110.
- FIG. 9 illustrates a signaling flow 900 of an example process for the sensing and positioning fusion in accordance with some embodiments of the present disclosure.
- FIG. 9 will be discussed with respect to FIG. 1A and FIG. 1B.
- the AF 130 transmits (905) a positioning and sensing fusion request to the NEF device 140.
- the positioning and sensing fusion request indicates that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device 901.
- the positioning and sensing fusion request comprises an identification of the target.
- the identification may be an ID of the terminal device 101, e.g., SUPI or IMSI.
- the NEF device 140 receives (910) the positioning and sensing fusion request and transmits a further positioning and sensing fusion request to the sixth communication device 901, which may be the LMF 110 or the SF device 120, for example.
- This request indicates that the sensing result of the target and the positioning result of the target are to be fused at the sixth communication device.
- the sixth communication device 901 thus knows that the fusion is to be performed by itself.
- the sixth communication device 901 transmits (925) a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result.
- the NEF device 140 receives (930) , from the sixth communication device 901, the positioning and sensing fusion response and transmits (935) the result of the fusion of the sensing result and the positioning result to the AF 130, for example via a further positioning and sensing fusion response or another message.
- the AF device 130 receives (940) the further positioning and sensing fusion response and thus obtains the result of the fusion.
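The relay role of the NEF device in this flow may be sketched as follows; the `handle_fusion_request` interface and the dictionary-based messages are illustrative assumptions, not a 3GPP API.

```python
def relay_fusion_request(fusing_device, target_id, report_granularity):
    """NEF relay role: forward the fusion request from the AF to the
    device performing the fusion, and return that device's response so
    it can be forwarded back to the AF as the further response."""
    request = {"target_id": target_id,
               "report_granularity": report_granularity}
    return fusing_device.handle_fusion_request(request)

class StubFusingDevice:
    """Stands in for the sixth communication device (e.g., SF or LMF)."""
    def handle_fusion_request(self, request):
        return {"target_id": request["target_id"],
                "trajectory": [(0.0, (0.0, 0.0))]}
```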
- Embodiments of the present disclosure may have some of the following impacts.
- Table 1 shows an example of some impacts.
- Table 2 shows an example of some other impacts.
- embodiments of the present disclosure may lead to the creation of a new ISAC SA2 TS, including the introduction of the Positioning and Sensing Fusion function and the procedures described above.
- FIG. 10 illustrates a flowchart of a communication method 1000 implemented at a first communication device in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1000 will be described from the perspective of the first communication device.
- the first communication device is further caused to: determine a trajectory of the target with timestamps from the sensing result; determine, from the positioning result, at least one time point corresponding to the at least one position of the target; and perform the fusion based on the trajectory, the timestamps, the at least one position and the at least one time point.
- a result of the fusion comprises at least one of: trajectory information of the target indicating a trajectory of the target and one or more positions of the target related to the trajectory, or an identification of the target.
- the first communication device comprises a sensing function device or a Location Management Function (LMF) device
- the first communication device is further caused to: receive, from a second communication device, a positioning and sensing fusion request indicating that the sensing result and the positioning result of the target are to be fused, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the second communication device, a positioning and sensing fusion response indicating a result of the fusion.
- the second communication device comprises an Application Function (AF) device or a Network Exposure Function (NEF) device.
- the first communication device comprises the sensing function device, and wherein the first communication device is further caused to: transmit, to a sensing management function device or a network device, a sensing request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive, from the sensing management function device or the network device, the sensing result of the target.
- the first position is obtained from the positioning result received from a Location Management Function (LMF) device, or wherein the first position is obtained from the positioning and sensing fusion request.
- the first communication device is further caused to: transmit a location request for the target to an Access and Mobility Management Function (AMF) device; and receive information about the first position from the AMF device.
- the first communication device is further caused to: transmit, to a Location Management Function (LMF) device, a further positioning service request for the first position of the target; and receive information about the first position from the LMF device.
- the first communication device is further caused to perform at least one of: obtaining the first position from the positioning result; or obtaining the first position from the positioning and sensing fusion request.
- the first communication device comprises an Application Function (AF) device
- the first communication device is further caused to: transmit, to a third communication device, a positioning service request to trigger a positioning procedure for the target; and receive the positioning result of the target from the third communication device.
- the third communication device comprises a Location Management Function (LMF) device or a Network Exposure Function (NEF) device.
- the first communication device comprises an Application Function (AF) device
- the first communication device is further caused to: transmit, to a fourth communication device, a sensing service request to trigger a sensing procedure for the target in the sensing area, the sensing area covering a first position associated with the target; and receive the sensing result of the target from the fourth communication device.
- the fourth communication device comprises a sensing function device or a Network Exposure Function (NEF) device.
- the first position is obtained from the positioning result, or wherein the first position is obtained from position information received from the target, or wherein the first position is determined at the AF device based on a path plan.
- FIG. 11 illustrates a flowchart of a communication method 1100 implemented at a fifth communication device in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1100 will be described from the perspective of the fifth communication device.
- the fifth communication device receives, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device.
- the positioning and sensing fusion request comprises an identification of the target.
- the fifth communication device transmits, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
- the fifth communication device is further caused to: receive, from the sixth communication device, a positioning and sensing fusion response indicating a result of the fusion of the sensing result and the positioning result; and transmit, to the AF device, a further positioning and sensing fusion response indicating the result of the fusion.
- the fifth communication device comprises a Network Exposure Function (NEF) device
- the sixth communication device comprises a sensing function device or a Location Management Function (LMF) device.
- FIG. 12 is a simplified block diagram of a device 1200 that is suitable for implementing embodiments of the present disclosure.
- the device 1200 can be considered as a further example implementation of any of the devices as shown in FIG. 1. Accordingly, the device 1200 can be implemented at or as at least a part of the LMF device 110, the SF device 120 or the AF device 130.
- the device 1200 includes a processor 1210, a memory 1220 coupled to the processor 1210, a suitable transceiver 1240 coupled to the processor 1210, and a communication interface coupled to the transceiver 1240.
- the memory 1220 stores at least a part of a program 1230.
- the transceiver 1240 may be for bidirectional communications or a unidirectional communication based on requirements.
- the transceiver 1240 may include at least one of a transmitter 1242 and a receiver 1244.
- the transmitter 1242 and the receiver 1244 may be functional modules or physical entities.
- the transceiver 1240 has at least one antenna to facilitate communication, though in practice an Access Node mentioned in this application may have several antennas.
- the communication interface may represent any interface that is necessary for communication with other network elements, such as X2/Xn interface for bidirectional communications between eNBs/gNBs, S1/NG interface for communication between a Mobility Management Entity (MME) /Access and Mobility Management Function (AMF) /SGW/UPF and the eNB/gNB, Un interface for communication between the eNB/gNB and a relay node (RN) , or Uu interface for communication between the eNB/gNB and a terminal device.
- the program 1230 is assumed to include program instructions that, when executed by the associated processor 1210, enable the device 1200 to operate in accordance with the embodiments of the present disclosure, as discussed herein with reference to FIGS. 1A to 11.
- the embodiments herein may be implemented by computer software executable by the processor 1210 of the device 1200, or by hardware, or by a combination of software and hardware.
- the processor 1210 may be configured to implement various embodiments of the present disclosure.
- a combination of the processor 1210 and memory 1220 may form processing means 1250 adapted to implement various embodiments of the present disclosure.
- the memory 1220 may be of any type suitable to the local technical network and may be implemented using any suitable data storage technology, such as a non-transitory computer readable storage medium, semiconductor based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples. While only one memory 1220 is shown in the device 1200, there may be several physically distinct memory modules in the device 1200.
- the processor 1210 may be of any type suitable to the local technical network, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples.
- the device 1200 may have multiple processors, such as an application specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
- a first communication device comprising a circuitry.
- the circuitry is configured to: obtain a sensing result of a target in a sensing area and a positioning result of the target, the positioning result indicating at least one position of the target in the sensing area; and perform a fusion of the sensing result and the positioning result.
- the circuitry may be configured to perform any method implemented by the first communication device as discussed above.
- a fifth communication device comprising circuitry.
- the circuitry is configured to: receive, from an Application Function (AF) device, a positioning and sensing fusion request indicating that a sensing result of a target and a positioning result of the target are to be fused at a sixth communication device, the positioning and sensing fusion request comprising an identification of the target; and transmit, to the sixth communication device, a further positioning and sensing fusion request indicating that the sensing result and the positioning result are to be fused.
- the circuitry may be configured to perform any method implemented by the fifth communication device as discussed above.
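The request flow at the fifth communication device can be sketched as follows. The dict-based message format, the field names, and the `send` transport primitive are all assumptions made for this example; they are not defined by the disclosure.

```python
# Illustrative sketch of the fifth communication device: it receives a
# positioning-and-sensing fusion request from an Application Function (AF)
# device and forwards a further fusion request toward the (sixth) device
# that will perform the fusion. Message layout is hypothetical.

def handle_af_fusion_request(request: dict, send) -> None:
    """Relay an AF fusion request to the fusing (sixth) device.

    `request` carries at least the identification of the target whose
    sensing and positioning results are to be fused; `send(device, msg)`
    is an assumed transport primitive.
    """
    target_id = request["target_id"]  # identification of the target (required)
    further_request = {
        "type": "positioning_sensing_fusion_request",
        "target_id": target_id,
        # Optional hint copied through if the AF supplied it:
        "sensing_area": request.get("sensing_area"),
    }
    send(request["fusion_device"], further_request)

if __name__ == "__main__":
    sent = []
    handle_af_fusion_request(
        {"target_id": "ue-42", "fusion_device": "device-6", "sensing_area": "zone-A"},
        send=lambda device, msg: sent.append((device, msg)),
    )
    print(sent[0])  # the forwarded request preserves the target identification
```

The key point the sketch illustrates is that the fifth device does not fuse anything itself; it validates and relays the request, preserving the target identification end to end.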
- the term "circuitry" as used herein may refer to hardware circuits and/or combinations of hardware circuits and software.
- the circuitry may be a combination of analog and/or digital hardware circuits with software/firmware.
- the circuitry may be any portions of hardware processors with software, including digital signal processor(s), software, and memory(ies), that work together to cause an apparatus, such as a terminal device or a network device, to perform various functions.
- the circuitry may be hardware circuits and/or processors, such as a microprocessor or a portion of a microprocessor, that requires software/firmware for operation, but the software may not be present when it is not needed for operation.
- the term "circuitry" also covers an implementation of merely a hardware circuit or processor(s), or a portion of a hardware circuit or processor(s), and its (or their) accompanying software and/or firmware.
- various embodiments of the present disclosure may be implemented in hardware or special-purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it will be appreciated that the blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special-purpose circuits or logic, general-purpose hardware or controllers or other computing devices, or some combination thereof.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
Embodiments of the present disclosure propose a solution for fusing sensing and positioning results. In one solution, a first communication device obtains a sensing result of a target in a sensing area and a positioning result of the target. The positioning result indicates at least one position of the target in the sensing area. The first communication device performs a fusion of the sensing result and the positioning result.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2024/076493 WO2025166609A1 (fr) | 2024-02-06 | 2024-02-06 | Devices and methods for fusion of sensing and positioning results |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2024/076493 WO2025166609A1 (fr) | 2024-02-06 | 2024-02-06 | Devices and methods for fusion of sensing and positioning results |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025166609A1 (fr) | 2025-08-14 |
Family
ID=96698959
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2024/076493 (WO2025166609A1, pending) | Devices and methods for fusion of sensing and positioning results | 2024-02-06 | 2024-02-06 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025166609A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023093644A1 (fr) * | 2021-11-25 | 2023-06-01 | 维沃移动通信有限公司 | Wireless sensing method and apparatus, network-side device, and terminal |
| CN116347327A (zh) * | 2021-12-24 | 2023-06-27 | 维沃移动通信有限公司 | Positioning sensing method, sensing measurement method, apparatus, terminal and network-side device |
| WO2023116754A1 (fr) * | 2021-12-24 | 2023-06-29 | 维沃移动通信有限公司 | Target positioning sensing method and apparatus, communication device, and storage medium |
| WO2023116753A1 (fr) * | 2021-12-24 | 2023-06-29 | 维沃移动通信有限公司 | Positioning sensing method and apparatus, and related device |
2024
- 2024-02-06: WO application PCT/CN2024/076493 (WO2025166609A1), status pending
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023093644A1 (fr) * | 2021-11-25 | 2023-06-01 | 维沃移动通信有限公司 | Wireless sensing method and apparatus, network-side device, and terminal |
| CN116347327A (zh) * | 2021-12-24 | 2023-06-27 | 维沃移动通信有限公司 | Positioning sensing method, sensing measurement method, apparatus, terminal and network-side device |
| WO2023116754A1 (fr) * | 2021-12-24 | 2023-06-29 | 维沃移动通信有限公司 | Target positioning sensing method and apparatus, communication device, and storage medium |
| WO2023116753A1 (fr) * | 2021-12-24 | 2023-06-29 | 维沃移动通信有限公司 | Positioning sensing method and apparatus, and related device |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP2878161B1 (fr) | Improving location in multi-PLMN deployments | |
| EP4557882A1 (fr) | Electronic device and method for wireless communication system, and storage medium | |
| US20220377698A1 (en) | Methods for communication, terminal device, network device, and computer readable media | |
| CN118235488A (zh) | Device positioning | |
| WO2025166609A1 (fr) | Devices and methods for fusion of sensing and positioning results | |
| WO2025160820A1 (fr) | Devices and methods for target identification and reporting | |
| WO2025145453A1 (fr) | Devices and methods for performing a sensing process | |
| WO2025175567A1 (fr) | Devices and methods for sensing coordination and fusion | |
| WO2025086292A1 (fr) | Devices and methods of communication | |
| WO2024239174A1 (fr) | Devices and methods of communication | |
| WO2025160841A1 (fr) | Devices and methods of communication | |
| WO2025199971A1 (fr) | Devices and methods for power allocation for sensing | |
| US20250126590A1 (en) | Apparatus, methods, for apparatus and computer program products for location function including non-terestrial access point | |
| WO2024239294A1 (fr) | Devices and methods of communication | |
| WO2025007277A1 (fr) | Devices and methods of communication | |
| WO2024239295A1 (fr) | Devices, methods, and medium of communication | |
| WO2025152175A1 (fr) | Devices and methods of communication | |
| WO2025179596A1 (fr) | Devices and methods of communication | |
| WO2024108445A1 (fr) | Methods, devices and medium for communication | |
| WO2025060055A1 (fr) | Devices and methods for integrated sensing and communication (ISAC) | |
| WO2025137968A1 (fr) | Devices and methods for sensing service detection | |
| WO2025156171A1 (fr) | Devices and methods of communication | |
| WO2025000541A1 (fr) | Devices and methods of communication | |
| WO2025199970A1 (fr) | Devices and methods for sensing resource allocation | |
| WO2025199995A1 (fr) | Devices, methods, and medium of communication |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 24922852; Country of ref document: EP; Kind code of ref document: A1 |