
WO2025065476A1 - Sensing scheduling


Info

Publication number
WO2025065476A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensing
request
union
group
transmit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CN2023/122475
Other languages
French (fr)
Inventor
Xiao Bing Leng
Li Yang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Shanghai Bell Co Ltd
Nokia Solutions and Networks Oy
Nokia Technologies Oy
Original Assignee
Nokia Shanghai Bell Co Ltd
Nokia Solutions and Networks Oy
Nokia Technologies Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Shanghai Bell Co Ltd, Nokia Solutions and Networks Oy, Nokia Technologies Oy filed Critical Nokia Shanghai Bell Co Ltd
Priority to PCT/CN2023/122475
Publication of WO2025065476A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • G01S 7/006 Transmission of data between radar, sonar or lidar systems and remote stations using shared front-end circuitry, e.g. antennas
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S 13/878 Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 24/00 Supervisory, monitoring or testing arrangements
    • H04W 24/10 Scheduling measurement reports; Arrangements for measurement reports
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/025 Services making use of location information using location based information parameters
    • H04W 4/026 Services making use of location information using location based information parameters using orientation information, e.g. compass

Definitions

  • Embodiments of the present disclosure generally relate to the field of telecommunication, and in particular, to devices, methods, apparatuses and computer readable storage medium for sensing scheduling.
  • JCAS Joint Communication and Sensing
  • ISAC Integrated Sensing and Communication
  • a sensing radar in the JCAS system may scan its surroundings to measure distances and/or velocities of static or dynamic objects, such as bicycles, vehicles, buildings, pedestrians, and so on.
  • A BS and a UE in the JCAS system can integrate the sensing radar function.
  • example embodiments of the present disclosure provide devices, methods, apparatuses and computer readable storage medium for sensing scheduling.
  • a first device may comprise at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the first device to: receive, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area.
  • the first device is further caused to: i) transmit, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receive resource configuration information for separate sensing from the second device.
  • a second device may comprise at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the second device to: receive, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area.
  • the second device is further caused to: i) receive, from the first device, a confirmation message for the first request, or ii) transmit a first resource configuration information for separate sensing to the first device.
  • a third device may comprise at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the third device to: transmit, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area.
  • the third device is further caused to receive, from the second device, a confirmation message for the first request, and include the first device in the sensing device group to perform the union sensing.
  • a method implemented at a first device comprises: receiving, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area.
  • the method further comprises: i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving resource configuration information for separate sensing from the second device.
  • a method implemented at a second device comprises: receiving, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; transmitting the first request to the first device; and i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
  • a method implemented at a third device comprises: transmitting, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; receiving, from the second device, a confirmation message for the first request; and including the first device in the sensing device group to perform the union sensing.
  • an apparatus comprising: means for receiving, at a first device and from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and means for i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving resource configuration information for separate sensing from the second device.
  • an apparatus comprises: means for receiving, at a second device and from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for transmitting the first request to the first device; and means for i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
  • an apparatus comprising: means for transmitting, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for receiving, from the second device, a confirmation message for the first request; and means for including the first device in the sensing device group to perform the union sensing.
  • a non-transitory computer readable medium comprising program instructions for causing an apparatus to perform at least the method according to any of the fourth to sixth aspects.
  • a first device comprising a receiving circuitry configured to receive, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and i) a transmitting circuitry configured to transmit a confirmation message for the first request to the second device, or ii) a receiving circuitry configured to, in the case that the first device refrains from joining the sensing device group, receive resource configuration information for separate sensing from the second device.
  • a second device comprising a receiving circuitry configured to receive, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; a transmitting circuitry configured to transmit the first request to the first device; and i) a receiving circuitry configured to receive, from the first device, a confirmation message for the first request, or ii) a transmitting circuitry configured to transmit a first resource configuration information for separate sensing to the first device.
  • a third device comprising: a transmitting circuitry configured to transmit, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; a receiving circuitry configured to receive, from the second device, a confirmation message for the first request; and an including circuitry configured to include the first device in the sensing device group to perform the union sensing.
  • Fig. 1A illustrates an example network environment in which example embodiments of the present disclosure may be implemented
  • Fig. 1B illustrates another example network environment in which example embodiments of the present disclosure may be implemented
  • Fig. 2 illustrates an example signaling process for sensing scheduling according to example embodiments of the present disclosure
  • Fig. 3 illustrates an example signaling process for joining UE sensing radar into a sensing device group according to example embodiments of the present disclosure
  • Fig. 4 illustrates an example signaling process for parallel union sensing according to example embodiments of the present disclosure
  • Fig. 5 illustrates an example signaling process for serial union sensing according to example embodiments of the present disclosure
  • Figs. 6A to 6E illustrate example situations of determination of the scanning scopes for members in the sensing device group according to example embodiments of the present disclosure
  • Fig. 7 illustrates an example of general flowchart for sensing scheduling according to example embodiments of the present disclosure
  • Fig. 8 illustrates an example module integrated in the first device according to example embodiments of the present disclosure
  • Fig. 9 illustrates an example module integrated in the third device according to example embodiments of the present disclosure.
  • Fig. 10 illustrates an example flowchart of a method implemented at a first device according to example embodiments of the present disclosure
  • Fig. 11 illustrates an example flowchart of a method implemented at a second device according to example embodiments of the present disclosure
  • Fig. 12 illustrates an example flowchart of a method implemented at a third device according to example embodiments of the present disclosure
  • Fig. 13 illustrates an example simplified block diagram of an apparatus that is suitable for implementing embodiments of the present disclosure.
  • Fig. 14 illustrates an example block diagram of an example computer readable medium in accordance with some embodiments of the present disclosure.
  • references in the present disclosure to “one embodiment,” “an embodiment,” “an example embodiment,” and the like indicate that the embodiment described may include a particular feature, structure, or characteristic, but it is not necessary that every embodiment includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to affect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • The terms “first” and “second” etc. may be used herein to describe various elements, but these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of example embodiments.
  • the term “and/or” includes any and all combinations of one or more of the listed terms.
  • circuitry may refer to one or more or all of the following:
  • circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware.
  • circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in server, a cellular network device, or other computing or network device.
  • the term “communication network” refers to a network following any suitable communication standards, such as long term evolution (LTE), LTE-advanced (LTE-A), wideband code division multiple access (WCDMA), high-speed packet access (HSPA), narrow band Internet of things (NB-IoT) and so on.
  • LTE long term evolution
  • LTE-A LTE-advanced
  • WCDMA wideband code division multiple access
  • HSPA high-speed packet access
  • NB-IoT narrow band Internet of things
  • the communications between a terminal device and a network device in the communication network may be performed according to communication protocols of any suitable generation, including, but not limited to, the third generation (3G), the fourth generation (4G), 4.5G, the fifth generation (5G) communication protocols, 5G-A, and/or beyond.
  • Embodiments of the present disclosure may be applied in various communication systems. Given the rapid development in communications, there will of course also be future type communication technologies and systems with which the present disclosure may be applied.
  • the term “network device” refers to a node in a communication network via which a terminal device accesses the network and receives services therefrom.
  • the network device may refer to a base station (BS) or an access point (AP), for example, a node B (NodeB or NB), an evolved NodeB (eNodeB or eNB), a NR NB (also referred to as a gNB), a remote radio unit (RRU), a radio header (RH), a remote radio head (RRH), a relay, a low power node such as a femto, a pico, and so forth, depending on the applied terminology and technology.
  • BS base station
  • AP access point
  • NodeB or NB node B
  • eNodeB or eNB evolved NodeB
  • NR NB also referred to as a gNB
  • RRU remote radio unit
  • RH radio header
  • the term “terminal device” refers to any end device that may be capable of wireless communication.
  • a terminal device may also be referred to as a communication device, user equipment (UE), a subscriber station (SS), a portable subscriber station, a mobile station (MS), or an access terminal (AT).
  • UE user equipment
  • SS subscriber station
  • MS mobile station
  • AT access terminal
  • the terminal device may include, but is not limited to, a mobile phone, a cellular phone, a smart phone, voice over IP (VoIP) phones, wireless local loop phones, a tablet, a wearable terminal device, a personal digital assistant (PDA), portable computers, desktop computer, image capture terminal devices such as digital cameras, gaming terminal devices, music storage and playback appliances, vehicle-mounted wireless terminal devices, wireless endpoints, mobile stations, laptop-embedded equipment (LEE), laptop-mounted equipment (LME), USB dongles, smart devices, wireless customer-premises equipment (CPE), an Internet of things (IoT) device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications (e.g., remote surgery), an industrial device and applications (e.g., a robot and/or other wireless devices operating in an industrial and/or an automated processing chain context), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, and the like.
  • JCAS Joint Communication and Sensing
  • ISAC Integrated Sensing and Communication
  • the term “union sensing” refers to a group of sensing devices performing respective sensing procedures coordinated by a sensing management entity, in order to achieve efficient resource usage.
  • the members of the sensing device group may comprise any device which is configured with sensing capability, for example, a UE, a BS, an access point or IoT devices and so on.
  • a sensing radar in the JCAS system may have six sensing modes, including monostatic sensing radar modes (transmitter and receiver at the same BS or UE) and bistatic sensing radar modes (transmitter and receiver at different BSs, transmitter and receiver at different UEs, and transmitter and receiver at BS/UE and UE/BS respectively).
  • a sensing radar may be installed on a moving platform (for example, a vehicle or a drone), thereby becoming a mobile sensing radar.
  • the sensing radar may request radio resources from a network device which the sensing radar accesses. However, at some places, for example, intersections, many vehicles installed with sensing radars may gather at certain times.
  • Some vehicles in one direction are passing, and some vehicles in another direction are waiting. These sensing radars may consume too much radio resource of the BS they access and may impact normal communication services. How to improve UE sensing radar resource utilization efficiency and minimize the impact to communication services is very important for a JCAS system.
  • a scheme for sensing scheduling is provided.
  • a first device receives a first request for requesting the first device to join a sensing device group from a second device.
  • This sensing device group is configured to perform a union sensing in a sensing area.
  • the first device may transmit a confirmation message for the first request to the second device, such that the first device may be scheduled with resources for the union sensing.
  • alternatively, the first device receives resource configuration information for separate sensing from the second device.
  • a centralized controlling and scheduling mechanism can organize those sensing radars (for example, adjacent sensing radars) to unitedly sense a specific area.
  • other UE sensing radars may avoid repeatedly scanning that area as much as possible and try to scan the remaining dead zones.
  • a UE sensing radar’s sensing result can be shared in the sensing group, which makes each sensing radar ‘see’ a larger scope.
  • the sensing and communication resources in the network can be efficiently utilized without any performance degradation.
  • the above scheme focuses on the UE monostatic sensing radar, which can be applied to vehicle autonomous driving and assisted driving. Compared with cameras and lidar, a JCAS sensing radar can work well in poor weather conditions.
  • a vehicle monostatic sensing radar can detect vehicle-surrounding objects for autonomous navigation and collision avoidance. Without any limitation, the above scheme may also be applied to any other sensing radar mode.
  • FIG. 1A illustrates an example network environment 100 in which example embodiments of the present disclosure may be implemented.
  • the environment 100, which may be a part of a communication network, includes terminal devices and network devices.
  • the network environment 100 may include a first device 110, a second device 120 and a third device 130.
  • the first device 110 may be a terminal device that is integrated into a vehicle.
  • the first device 110 may also refer to the vehicle 110 as shown in Fig. 1A.
  • the second device 120 may be a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB).
  • the second device 120 may be a Wi-Fi access point or a BS.
  • the third device 130 may be a core network (CN) device configured with sensing management functionality or a sensing management entity (SME) .
  • CN core network
  • SME sensing management entity
  • the first device 110, the second device 120 and the third device 130 may be any other devices having the similar sensing requirements or functionalities.
  • the network environment 100 further includes a sensing device group 140.
  • the sensing device group 140 may include devices 140-1, 140-2 and 140-3.
  • the third device 130 may, via the second device 120, schedule the (sensing) devices in the sensing device group 140 to unitedly perform a sensing procedure in a corresponding sensing area, for example, the area in the block as shown in Fig. 1A.
  • the third device 130 may invite the first device 110 to join the sensing device group 140 for performing union sensing.
  • the network environment 100 may include any suitable number of network devices and/or terminal devices adapted for implementing embodiments of the present disclosure. Although not shown, it would be appreciated that one or more terminal devices may be located in the network environment 100.
  • Communications in the network environment 100 may be implemented according to any proper communication protocol(s), comprising, but not limited to, the third generation (3G), the fourth generation (4G), the fifth generation (5G), 5G-Advanced or beyond (6G), wireless local network communication protocols such as institute for electrical and electronics engineers (IEEE) 802.11 and the like, and/or any other protocols currently known or to be developed in the future.
  • IEEE institute for electrical and electronics engineers
  • the communication may utilize any proper wireless communication technology, comprising but not limited to: multiple-input multiple-output (MIMO), orthogonal frequency division multiplexing (OFDM), time division multiplexing (TDM), frequency division multiplexing (FDM), code division multiplexing (CDM), Bluetooth, ZigBee, machine type communication (MTC), enhanced mobile broadband (eMBB), massive machine type communication (mMTC), ultra-reliable low latency communication (URLLC), carrier aggregation (CA), dual connectivity (DC), and new radio unlicensed (NR-U) technologies.
  • MIMO multiple-input multiple-output
  • OFDM orthogonal frequency division multiplexing
  • TDM time division multiplexing
  • FDM frequency division multiplexing
  • CDM code division multiplexing
  • MTC machine type communication
  • eMBB enhanced mobile broadband
  • mMTC massive machine type communication
  • URLLC ultra-reliable low latency communication
  • Fig. 1B illustrates another example network environment in which example embodiments of the present disclosure may be implemented
  • In Fig. 1B, there are three cars in the same area, i.e., the intersection in Fig. 1B. Therefore, a very large area may be repeatedly scanned by these three cars’ sensing radars. Moreover, to avoid interfering with each other, these UE sensing radars should occupy different sensing resources (time/frequency/space), i.e., orthogonal resources. However, this may lead to too much radio resource being used for sensing, so that communication users cannot obtain enough radio resource. With the scheme discussed with reference to the following embodiments, the resources for sensing and communication can be utilized efficiently.
  • Fig. 2 illustrates an example signaling process 200 for sensing scheduling according to example embodiments of the present disclosure.
  • the signaling process 200 will be described with reference to Fig. 1A. It would be appreciated that although the signaling process 200 has been described in the communication environment 100 of Fig. 1A, the process 200 may be likewise applied to other communication scenarios.
  • the third device 130 transmits (210) a first request 215 for requesting the first device 110 to join a sensing device group 140 to the second device 120.
  • This sensing device group 140 is configured to perform a union sensing in a sensing area.
  • the third device 130 may determine or initiate (201) the sensing device group based on one or more of: a distribution of one or more sensing devices and environment information on the sensing area.
  • the environment information includes at least one of a building distribution, facility distribution or a road arrangement within the sensing area.
  • the third device 130 may collect sensing device positions to determine union sensing areas.
  • the third device 130 may autonomously select a union sensing area according to vehicle distribution.
  • an area gathering a lot of vehicles may be defined as a union sensing area.
  • the sensing area can be jointly sensed by sensing devices of the sensing device group (for example, located in the union sensing area) .
  • the sensing device group may be a temporary group or permanent group.
  • the operator of the JCAS may directly configure a union sensing area for some special areas, for example, intersections.
  • a union sensing area always corresponds to a union sensing group.
  • a union sensing group always corresponds to a sensing area to be unitedly sensed (which may be also referred to as a union sensing area) .
  • the union sensing area may be specified with a center position, size, shape and building contour information.
  • the union sensing area may also be referred to as a sensing area associated with a sensing device group.
  • the union sensing group may be empty, have only one sensing device (for example, a UE sensing radar), or have multiple sensing devices.
  • the third device 130 may determine a permanent sensing device group for an intersection, since the accident rate at this intersection is high. If no sensing device or vehicle is located in the intersection at a certain time, the permanent sensing device group may have no member.
  • the third device 130 may maintain the initiated sensing device group dynamically. For example, the third device 130 may cancel a temporary sensing device group or remove a member from the sensing device group. In some embodiments, when most sensing devices (or vehicles) have left a union sensing area, the third device 130 may cancel the corresponding union sensing group. In an example, the third device 130 may determine whether a first number of sensing devices in the sensing device group is below a number threshold. If the first number is below the number threshold, the third device 130 may cancel the sensing device group.
  • the third device 130 may monitor all members’ positions in the sensing device group. When a sensing device leaves its union sensing area, the third device 130 may remove this sensing device from the sensing device group. In an example, the third device 130 may monitor a position of at least one sensing device in the sensing device group. Then, the third device 130 may determine, based on the monitoring, whether the at least one sensing device leaves from the sensing area. If the at least one sensing device leaves from the sensing area, the third device 130 may remove the at least one sensing device from the sensing device group.
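  • As an illustration of this maintenance logic only, the following minimal Python sketch (not part of the disclosure; the circular area model, the names and the threshold are illustrative assumptions) removes members that have left the sensing area and cancels a temporary group whose membership falls below a threshold:

```python
import math
from dataclasses import dataclass, field

@dataclass
class SensingDevice:
    device_id: str
    x: float
    y: float

@dataclass
class UnionSensingArea:
    center_x: float
    center_y: float
    radius: float  # the area is modelled as a circle purely for illustration

    def contains(self, dev: SensingDevice) -> bool:
        return math.hypot(dev.x - self.center_x, dev.y - self.center_y) <= self.radius

@dataclass
class SensingDeviceGroup:
    area: UnionSensingArea
    permanent: bool = False
    members: list = field(default_factory=list)

def maintain_group(group: SensingDeviceGroup, number_threshold: int) -> bool:
    """Drop members that left the union sensing area; return False when a
    temporary group should be cancelled because too few members remain."""
    group.members = [m for m in group.members if group.area.contains(m)]
    if not group.permanent and len(group.members) < number_threshold:
        return False
    return True

# Example: one of two members has driven away from the intersection.
area = UnionSensingArea(0.0, 0.0, 50.0)
group = SensingDeviceGroup(area, members=[SensingDevice("ue-1", 10.0, 5.0),
                                          SensingDevice("ue-2", 500.0, 0.0)])
print(maintain_group(group, number_threshold=1), [m.device_id for m in group.members])
```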
  • the third device 130 may determine (208) whether to transmit the first request towards the first device 110 based on pose information of the first device 110.
  • the pose information of the first device 110 may include at least one of a position or an orientation of the first device 110.
  • the third device 130 may obtain the pose information of terminal devices and determine whether to transmit the first request to one or more of the terminal devices.
  • the third device 130 may monitor positions of the devices having sensing requirements and match their positions with union sensing areas. If a device comes into a first union sensing area, the third device 130 may invite it to join the union sensing group corresponding to the first union sensing area.
  • the third device 130 may determine whether a first area to be sensed by the first device is associated with the sensing area. If the first area is associated with the sensing area (for example, the first area is a portion of the sensing area or is adjacent to the sensing area) , the third device 130 may transmit the first request towards the first device 110 via the second device 120.
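  • One possible shape of this position matching is sketched below in Python (the area table and the callback are hypothetical, and a circular containment test stands in for whatever geometry the third device actually uses):

```python
import math

# Hypothetical table: each union sensing area is (center_x, center_y, radius).
union_areas = {
    "intersection-1": (0.0, 0.0, 50.0),
    "intersection-2": (300.0, 120.0, 40.0),
}

def matching_area(pos):
    """Return the id of the first union sensing area containing pos, if any."""
    x, y = pos
    for area_id, (cx, cy, r) in union_areas.items():
        if math.hypot(x - cx, y - cy) <= r:
            return area_id
    return None

def maybe_invite(device_id, pos, send_first_request):
    """If the device's position falls inside a union sensing area, transmit the
    first request inviting it to join the corresponding sensing device group."""
    area_id = matching_area(pos)
    if area_id is not None:
        send_first_request(device_id, area_id)
    return area_id

# Example: a device at (10, -5) lies inside intersection-1, so it is invited.
maybe_invite("ue-42", (10.0, -5.0), lambda d, a: print(f"invite {d} to {a}"))
```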
  • the first device 110 may transmit (202) a second request 203 for sensing to the second device 120, in order to obtain the resource configuration for sensing.
  • the second request may include at least one of pose information or sensing capability information of the first device.
  • the sensing capability information may include a field of view (FOV) of the first device 110.
  • the second device 120 may transmit or forward the second request 203 to the third device 130.
  • the third device 130 may determine (208) whether to transmit the first request 215 based on the pose information of the first device in the second request 203. The determination manner may be the same as mentioned above.
  • the third device 130 may obtain the pose information of the first device 110 in any other manner, in addition to receiving the second request 203.
  • the first device 110 may also obtain the pose information from a location management function (LMF) server.
  • LMF location management function
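  • For illustration only, the second request of the process 200 could be modelled with fields like the following (the structure and field names are assumptions made for this sketch, not a defined message format of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    heading_deg: float  # orientation of the radar boresight

@dataclass
class SensingCapability:
    max_fov_deg: float  # maximum field of view the radar can scan

@dataclass
class MonostaticSensingRequest:
    """Stand-in for the 'second request' carrying pose and capability."""
    device_id: str
    pose: Pose
    capability: SensingCapability

req = MonostaticSensingRequest("ue-42", Pose(10.0, -5.0, 90.0), SensingCapability(120.0))
print(req)
```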
  • the second device 120 may transmit (230) the first request 235 to the first device 110.
  • the first device 110 receives (240) the first request 235 accordingly.
  • the first device 110 may determine (241) whether to join the sensing device group. For example, if the first device 110 has a higher privacy protection requirement, the first device 110 may refrain (243) from joining the sensing device group.
  • the first device 110 may transmit a rejection message for the first request to the second device 120.
  • the second device 120 may be aware that the first device 110 declines to share the sensing information.
  • the second device 120 may transmit resource configuration information for separate sensing to the first device 110. Then, the first device 110 may perform the sensing procedure by itself without sharing its sensing result.
  • the first device 110 may voluntarily agree to join the sensing device group 140.
  • the first device 110 transmits (245) a confirmation message 247 for the first request to the second device 120.
  • After receiving the confirmation message 247, the second device 120 transmits (255) the confirmation message 257 to the third device 130.
  • the confirmation message 257 may be the same as the confirmation message 247, or may be determined based on the confirmation message 247.
  • the third device 130 receives (258) the confirmation message 257 accordingly.
  • the third device 130 may be aware that the first device 110 voluntarily joins the sensing device group 140 for performing a union sensing in the union sensing area. Then, the third device 130 includes the first device 110 in the sensing device group 140 to perform the union sensing. To perform the union sensing, the third device 130 may plan the scanning scope for one or more members of the sensing group 140, in order to achieve an efficient union scanning for the union sensing area. Only for discussion purposes, the joining procedure of the sensing device (radar) is further discussed with reference to Fig. 3.
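  • The confirm-or-reject branch of this handshake can be summarized with a small sketch (hypothetical names; a real implementation would act on signalling messages rather than strings):

```python
def handle_first_request_outcome(group, device_id, response, configure_separate_sensing):
    """Act on the device's answer to the joining-group request: a confirmation
    adds it to the sensing device group, while a rejection falls back to
    separate scheduling without sharing the sensing result."""
    if response == "confirm":
        group.append(device_id)  # include the device in the union sensing
        return "union"
    configure_separate_sensing(device_id)  # device senses on its own resources
    return "separate"

group = ["ue-1", "ue-2"]
print(handle_first_request_outcome(group, "ue-42", "confirm", lambda d: None))
print(handle_first_request_outcome(group, "ue-7", "reject",
                                   lambda d: print(f"separate config for {d}")))
```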
  • Fig. 3 illustrates an example signaling process 300 for joining UE sensing radar into a sensing device group according to example embodiments of the present disclosure.
  • the UE sensing radars 110 may be the first device 110 as shown in Figs. 1 and 2
  • the 4G/5G BS or WiFi AP 120 may be the second device 120 as shown in Figs. 1 and 2
  • SME 130 may be the third device 130 as shown in Figs. 1 and 2.
  • the SME 130 can autonomously generate some union sensing groups for those areas with a lot of vehicles.
  • the UE sensing radars 110 may transmit a UE monostatic sensing request 320 (which may be the second request as shown in Fig. 2) with pose and capability information to the BS 120.
  • BS 120 transmits the UE monostatic sensing request 330 to the SME 130.
  • the request messages 320 and 330 may include the UE’s current pose information, i.e., position and orientation, and its capability information, e.g., maximum FOV.
  • the SME 130 may determine (340) whether the UE sensing radar is located in a union sensing area. If the UE sensing radar is located in a union sensing area, the SME will ask the UE sensing radar whether it wishes to join the union sensing group. As shown in Fig. 3, the third device 130 may transmit the joining group request 350 (which may be the first request as shown in Fig. 2) to the BS 120. Then, the BS 120 may transmit the joining group request 360 based on the joining group request 350. In turn, if the vehicle voluntarily joins (370) the union sensing group, it will send the confirmation message 380 (which may be the confirmation message as shown in Fig. 2) to the BS 120. Then, BS 120 may transmit the joining group confirmation 390 to the third device 130. Once a UE sensing radar joins a union sensing group, the SME 130 may plan its sensing scope and share the union sensing result with it.
  • the third device 130 may schedule the sensing devices (i.e., the first device 110 and other devices in the sensing device group 140) to perform the union sensing periodically.
  • the union sensing area can be scanned by the sensing device group at a certain sensing frequency, such as 5 Hz.
  • the union sensing procedure is performed five times per second.
  • the union sensing procedure may be performed at any other sensing frequency.
  • the third device 130 may need to obtain the current pose information on each member of the sensing device group, in order to plan the scanning scopes for one or more members.
  • the third device 130 may transmit (259), via the second device 120, a third request 260 for current information towards the first device 110. That is, the third device 130 may transmit the third request to the second device 120. Moreover, the second device 120 may transmit the received third request to the first device 110. In turn, after receiving the third request 260, the first device 110 may transmit (262), via the second device 120, the pose information 263 of the first device 110 towards the third device 130. In an example, the first device 110 may transmit a pose report of at least one of a position or an orientation of the first device 110.
  • the third device 130 may determine (265) at least one scanning scope for one or more members in the sensing device group 140.
  • the scanning scope may be further determined based on at least one of the following: FOVs of one or more devices in the sensing device group 140, environment information on the sensing area corresponding to the sensing device group 140, one or more dynamic objects within the sensing area; or dead zones of the one or more devices in the sensing device group 140.
  • the dead zones may be determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  • the determination of the scanning scopes is further discussed with reference to Figs. 6A to 6E, and is not discussed here.
  • the union sensing 266 can be performed in parallel or serially. In some embodiments, the union sensing may be performed simultaneously by the members in the sensing device group 140 (including the first device 110) in parallel.
  • the third device 130 may receive one or more pose reports of the one or more sensing devices of the sensing device group 140.
  • the pose report of the one or more pose reports may indicate at least one of a position or an orientation of a sensing device of the one or more sensing devices.
  • the third device 130 may determine a plurality of scanning scopes for the sensing area at one time. At least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices.
  • the third device 130 may transmit (267) an indication 268 of the plurality of scanning scopes to the second device 120.
  • the second device 120 may determine (270) resource configuration information for one or more members of the sensing device group 140. For example, if the third device 130 has determined the scanning scope for each member of the sensing device group 140, the second device 120 may determine the resource configuration information for each member based on the corresponding scanning scope.
  • the third device 130 may also determine that a portion of the device sensing group 140 is to perform the union sensing, and determine the scanning scope of each member in the portion of the sensing device group 140.
  • the second device 120 may also allocate the resource configuration information to the member(s) in the portion of the sensing device group 140 (that is, the resource configuration information is not transmitted to each member of the whole sensing device group 140). Then, the second device 120 may transmit the resource configuration information to the corresponding device (for example, the first device 110) in the sensing device group 140.
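  • One naive way the second device could derive orthogonal resources from the indicated scanning scopes is sketched below (a time-slot-only allocation with assumed names; the actual time/frequency/space scheduling is up to the BS implementation):

```python
def schedule_union_sensing(scanning_scopes, slots_per_period=10):
    """Give each scheduled member an orthogonal time slot so that the sensing
    transmissions do not interfere; the frequency and space dimensions could
    be split analogously."""
    if len(scanning_scopes) > slots_per_period:
        raise ValueError("not enough sensing slots in this period")
    return {
        device_id: {"time_slot": slot, "scope_deg": scope}
        for slot, (device_id, scope) in enumerate(scanning_scopes.items())
    }

# (start, end) scanning angles in degrees, as planned by the SME.
scopes = {"car1": (300, 60), "car2": (60, 180), "car3": (180, 300)}
print(schedule_union_sensing(scopes))
```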
  • Fig. 4 illustrates an example signaling process 400 for parallel union sensing according to example embodiments of the present disclosure.
  • the UE sensing radars 110 may be the first device 110 as shown in Figs. 1 and 2
  • the 4G/5G BS or WiFi AP 120 may be the second device 120 as shown in Figs. 1 and 2
  • SME 130 may be the third device 130 as shown in Figs. 1 and 2.
  • SME 130 will request all group members in the sensing device group 140 to provide their current poses.
  • SME 130 uses UE sensing radars’ poses, capabilities, and buildings’ contours to plan all UE sensing radars’ scanning scopes.
  • SME 130 may only consider those dead zones sheltered by static objects and ignore those dynamic dead zones.
  • Once SME 130 completes sensing scope planning, it will inform BS 120 to schedule radio resource for all group members.
  • BS 120 will configure time/frequency/space resource for those union sensing radars according to their sensing scopes.
  • Those union sensing radars will simultaneously scan their sensing scopes. Then, the sensing results will be reported to the SME 130.
  • SME 130 will combine those sensing results into a union sensing result, which will be broadcast to all group members (and other devices which subscribe to the sensing service).
  • SME 130 transmits the request 410 for the current poses to BS 120.
  • BS 120 transmits the request 420 for the current poses to UEs 110.
  • the UE sensing radars 110 may transmit the report 430 of poses to BS 120.
  • the UE sensing radars 110 are only examples of members in the sensing device group 140, and the UE sensing radars may also be the devices 140-1, 140-2, 140-3 and so on.
  • BS 120 transmits the report 435 of poses to SME 130.
  • SME 130 may plan (440) scopes for these members.
  • SME 130 transmits reports 445 of the scopes determined for these members to BS 120.
  • BS 120 may schedule (450) time/frequency/space resources for union sensing. Then, BS 120 may transmit the UE sensing radars’ resource configuration 455 to the corresponding UE sensing radar. UE sensing radar 110 may perform (460) a sensing procedure for the scanning scope planned for this UE sensing radar based on the received resource configuration 455. Then, UE sensing radar 110 transmits the sensing result report 465 of this UE sensing radar 110 to BS 120. BS 120 receives sensing result reports of members in the sensing device group 140. BS 120 transmits the received sensing result reports to SME 130. SME 130 combines all sensing results based on the sensing result reports to generate a union sensing result.
  • SME 130 may broadcast the union sensing result 480 to the BS 120.
  • BS 120 may broadcast the union sensing result 480 to members in the sensing device group 140.
  • other devices which do not join the sensing device group 140 may also subscribe to the sensing service.
  • BS 120 may also transmit the union sensing result to these other devices.
  • the members in the sensing device group 140 may be configured with a first charging policy.
  • Other devices may be configured with a second charging policy. The first charging policy may apply a lower charge than the second charging policy, since the members made contributions to the union sensing result.
  • the union sensing 266 may be performed serially.
  • the determination of a scanning scope may depend on a sensing result for a previous scanning scope.
  • the third device 130 may determine (265) a first scanning scope to be sensed by a fourth device of the one or more sensing devices in the sensing device group 140.
  • the fourth device may be the first device 110 or any other device in the sensing device group 140.
  • the third device 130 may transmit (267) an indication 268 of the first scanning scope to the second device 120.
  • the second device 120 may determine (270) a first resource configuration for the fourth device. Then, the second device 120 may transmit the first resource configuration to the fourth device (not shown in Fig. 2; the fourth device may be the first device 110).
  • the fourth device performs the sensing procedure based on the first resource configuration. Then, the fourth device transmits (278) a first sensing result 280 of the fourth device to the second device 120. The second device 120 transmits the first sensing result of the fourth device to the third device 130. Further, based on the first sensing result and the one or more pose reports, the third device 130 may determine a second scanning scope to be sensed by a fifth device of the one or more sensing devices. Without any limitation, the fifth device may also be the first device or any other device in the sensing device group 140. Similarly, the third device 130 may transmit an indication of the second scanning scope to the second device 120. The second device 120 may determine second resource configuration information for the fifth device based on the indication of the second scanning scope.
  • the second device 120 may transmit the second resource configuration information to the fifth device.
  • the fifth device may transmit a second sensing result of the fifth device towards the third device 130.
  • the third device 130 may iteratively determine other scanning scopes based on the sensing results (for example, the first or second sensing result) and schedule members in the sensing device group 140 to perform the sensing procedure, until the union sensing area is sensed completely.
  • the third device 130 may combine the sensing results to generate a union sensing result.
  • the third device 130 may broadcast (284) the union sensing result to the members in the sensing device group 140 and/or other devices.
  • the serial union sensing is further discussed with reference to Fig. 5.
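  • The serial loop can be sketched as follows (a simplification that measures coverage in degrees only and assumes the largest remaining FOV is always the best next choice, whereas the SME's real planning also uses the previous sensing results):

```python
def serial_union_sensing(radar_fovs, total_deg=360.0, scan_fn=None):
    """Pick the radar with the largest usable FOV, let it scan, subtract the
    covered scope, and repeat until the union sensing area is fully sensed."""
    remaining = total_deg
    available = dict(radar_fovs)  # radar id -> maximum FOV in degrees
    order = []
    while remaining > 0 and available:
        radar_id = max(available, key=available.get)  # largest scope first
        covered = min(available.pop(radar_id), remaining)
        if scan_fn is not None:
            scan_fn(radar_id, covered)  # its result can refine later planning
        order.append((radar_id, covered))
        remaining -= covered
    return order, remaining

order, left = serial_union_sensing({"car1": 120, "car2": 120, "car3": 120})
print(order, left)  # car1, car2 and car3 each take 120 deg; nothing is left
```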
  • Fig. 5 illustrates an example signaling process 500 for serial union sensing according to example embodiments of the present disclosure.
  • the serial grouping sensing may be performed by selecting a UE sensing radar as the first sensing radar, for example, the sensing radar that has the maximum scanning scope in the union sensing group.
  • SME 130 may plan the first sensing radar’s sensing scope according to its pose, capability, nearby buildings’ contours and traffic status. Then the planned sensing scope will be sent to BS 120 for radio resource scheduling. BS 120 will configure the first sensing radar with time/frequency/space resource and command the first sensing radar to scan its sensing scope. The first sensing radar’s sensing result will be sent to SME 130. SME 130 will use the first sensing radar’s result to plan other sensing radars’ scanning scopes.
  • When SME 130 plans the second sensing radar’s scanning scope, it will plan it to scan those areas that are out of the first sensing radar’s scanning scope and those areas sheltered by some moving objects detected by the first sensing radar.
  • the second sensing radar should be one which can scan the maximum area of dead zones of the first sensing radar.
  • SME 130 may plan other sensing radars until all the union sensing area is scanned. Finally, SME 130 will combine all sensing results into the union sensing result and broadcast it to all group members.
  • In Fig. 5, the UE sensing radars 140-1 and 140-2 may be the members in the sensing device group 140 as shown in Fig. 1, the 4G/5G BS or WiFi AP 120 may be the second device 120 as shown in Figs. 1 and 2, and SME 130 may be the third device 130 as shown in Figs. 1 and 2.
  • SME 130 transmits a request 501 for the current poses.
  • BS 120 transmits the request 503 to the members in the sensing device group 140.
  • the members transmit their pose reports 505 to BS 120.
  • BS 120 transmits the pose reports 507 to SME 130.
  • SME 130 plans a first radar’s scope 509, and transmits a report 511 of the first radar’s scope to BS 120.
  • BS 120 schedules (513) time/frequency/space resources for the first radar.
  • BS 120 transmits the first sensing radar configuration 515 to the first radar 140-1 (which may be the above fourth device) .
  • the first radar 140-1 scans (517) its sensing scope accordingly.
  • the first radar 140-1 transmits the first radar’s sensing report 519 to BS 120.
  • BS 120 transmits the first radar’s sensing report 521 to SME 130.
  • SME 130 plans (523) the second radar’s scanning scope.
  • the second sensing radar 140-2 may scan (529) its sensing scope.
  • the second sensing radar 140-2 may transmit the second radar’s sensing report 531 to BS 120.
  • BS 120 may transmit the second radar’s sensing report 533 to SME 130.
  • SME 130 may iteratively perform the above steps until the union sensing area is scanned.
  • SME 130 may broadcast (284) the union sensing result to the members in the sensing device group 140 and/or other devices.
  • the devices in the sensing device group 140 may sense different parts of the union sensing area.
  • the union sensing area may be periodically scanned and the union sensing result may be broadcast to group members or subscribed by non-group vehicles.
  • the third device 130 may select all or a part of the UE sensing radars to sense those fixed and moving objects.
  • the third device 130 may plan these sensing radars’ scanning scopes according to their poses, capabilities, surrounding environment and traffic status.
  • the union sensing has two work modes: one is parallel union sensing and the other is serial union sensing. In parallel union sensing, those UE sensing radars will perform sensing at the same time.
  • Figs. 6A to 6E illustrate example situations of determination of the scanning scopes for members in the sensing device group according to example embodiments of the present disclosure.
  • Figs. 6A and 6B illustrate example situations of a rural intersection.
  • a rural intersection is usually wide and is not seriously sheltered by surrounding buildings. So, SME 130 doesn’t need to consider surrounding buildings.
  • One simple method is that SME 130 plans the car1 sensing scope at 120°, and plans car2 and car3 only to scan the dead zone left by car1, shown as the shaded area in Fig. 6A.
  • SME 130 will plan the sensing scope of car2 at its far left 30°, and plan the sensing scope of car3 at its far right 30°. Now the full intersection is scanned by the union sensing radar group without any repeatedly scanned area. Another simple method is to balance all union sensing radars’ scanning scopes as in Fig. 6B. Alternatively, SME 130 may also plan these three cars’ sensing scopes at their central 80°.
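  • Both options reduce to simple arithmetic, as the sketch below shows. It assumes, purely for illustration, that the scope of interest around car1 spans 180°; the drawings of Figs. 6A and 6B use their own geometry, so the exact angles differ:

```python
def plan_dead_zone_split(area_deg, primary_fov, helper_ids):
    """Fig. 6A style: car1 scans its full FOV and the helpers split the
    leftover dead zone evenly, so nothing is scanned twice."""
    helper_scope = max(area_deg - primary_fov, 0.0) / len(helper_ids)
    return {"car1": primary_fov, **{h: helper_scope for h in helper_ids}}

def plan_balanced(area_deg, car_ids):
    """Fig. 6B style: all cars get equal scanning scopes."""
    return {c: area_deg / len(car_ids) for c in car_ids}

print(plan_dead_zone_split(180.0, 120.0, ["car2", "car3"]))  # car2/car3: 30 deg each
print(plan_balanced(180.0, ["car1", "car2", "car3"]))        # 60 deg each
```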
  • Fig. 6C illustrates an example of an urban intersection near buildings.
  • urban areas usually have many buildings along the streets. These buildings will impact union sensing planning.
  • each corner has a building.
  • the SME 130 can plan the car1 sensing scope at 120°. Besides its rear dead zone 1, car1 has two other dead zones (2 and 3) sheltered by surrounding buildings.
  • When planning car2 and car3 sensing scopes, the SME 130 will make use of map information to obtain surrounding building contours and then calculate car2 and car3 sensing scopes.
  • SME 130 plans car2 and car3 sensing scopes at 45°.
  • Fig. 6D illustrates an example of an urban intersection with heavy traffic.
  • an urban intersection may have a lot of moving objects.
  • SME 130 may plan the car1 sensing scope at its 120°. Unlike the above three cases, SME 130 may perform dynamic planning for car2 and car3. That means that SME 130 uses the car1 sensing result to plan the car2 and car3 sensing scopes.
  • Besides dead zones 1, 2 and 3, there exists a dynamic dead zone 4, which is sheltered by moving objects, i.e., riders sheltering each other.
  • SME 130 may request car2 and car3 to sense those riders in different directions. So, SME 130 enlarges the car2 and car3 sensing scopes to about 75° and 80° respectively.
  • Fig. 6E illustrates an example of a road section with heavy traffic.
  • a busy road section may have a lot of moving objects.
  • SME 130 plans car1 sensing scope at its maximum 120°.
  • the other two cars (2 and 3) shelter some areas, i.e., car1’s dead zone 2 and dead zone 3.
  • SME 130 can plan the car2 and car3 sensing scopes to about 30° and 80° respectively for those dynamic dead zones.
  • SME 130 may adopt both static and dynamic sensing planning for union sensing group.
  • SME 130 may plan all vehicle sensing scopes at the same time (for example, the parallel union sensing and the steps as discussed in Figs. 6A, 6B and 6C) .
  • SME 130 may plan all vehicle sensing scopes one by one.
  • the vehicle sensing planning may use previous vehicles’ sensing results (for example, the serial union sensing and the steps discussed in Fig. 6D).
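  • Underlying all five situations is the same bookkeeping: subtract the angular intervals already covered from the area of interest, and hand the remaining dead zones to the next radars. A minimal sketch (angles as (start, end) degree pairs with no wraparound handling, for illustration only):

```python
def uncovered_intervals(area, covered):
    """Subtract the covered angular intervals from the area interval; the
    result is the list of remaining dead zones."""
    gaps = [area]
    for c_start, c_end in sorted(covered):
        next_gaps = []
        for g_start, g_end in gaps:
            if c_end <= g_start or c_start >= g_end:
                next_gaps.append((g_start, g_end))    # no overlap with cover
                continue
            if c_start > g_start:
                next_gaps.append((g_start, c_start))  # gap left of the cover
            if c_end < g_end:
                next_gaps.append((c_end, g_end))      # gap right of the cover
        gaps = next_gaps
    return gaps

# car1 covers 60..180 deg of a 0..360 deg area; the dead zones go to car2/car3.
print(uncovered_intervals((0, 360), [(60, 180)]))  # [(0, 60), (180, 360)]
```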
  • a centralized controlling and scheduling mechanism can organize those sensing radars (for example, adjacent sensing radars) to unitedly sense a specific area. As such, the consumed resource can be reduced without performance degradation.
  • Fig. 7 illustrates an example of general flowchart 700 for sensing scheduling according to example embodiments of the present disclosure.
  • the basic idea and steps of the disclosure are presented in general.
  • SME 130 can unite a group of vehicle UE monostatic sensing radars to sense a specific area, e.g., an intersection, and share the union sensing result among these vehicles. That not only makes each vehicle ‘see’ a larger scope, but also avoids repeated scanning and saves radio resource.
  • SME 130 can autonomously select a union sensing area according to vehicle distribution. An area gathering a lot of vehicles can be defined as a union sensing area. The area will be jointly sensed by UE sensing radars located in it. The union sensing area can be scanned at some frequency, e.g. 5 Hz.
  • When a vehicle with a UE monostatic sensing radar comes into the union sensing area, SME 130 will ask it whether to join a union sensing group. If the UE sensing radar voluntarily agrees to join the group, it will be scheduled together with other sensing radars in the union group. Otherwise, if the vehicle doesn’t agree to join the group, it will be separately scheduled. As it independently senses all its surroundings, it may occupy more radio resource and lead to higher sensing cost.
  • SME 130 will plan sensing scope for each UE sensing radar based on its current position, sensing field of view (FOV) , placement on the vehicle, surrounding environment, traffic status, etc.
  • the objectives of union sensing planning are maximization of scanning area and minimization of repeatedly scanning area.
  • the SME can adopt different sensing planning methods, such as the situations shown in Figs. 6A to 6E.
  • After SME 130 completes union sensing planning, it will inform the BS to schedule radio resource for those union sensing radars. All sensing results from the union sensing radars will be sent to SME 130. SME 130 will combine them into one union sensing result. The same moving object sensed by different sensing radars will be merged. In each union sensing period, when SME 130 collects all vehicles’ sensing results and combines them together, it will broadcast the union sensing result to all group members. The vehicles without sensing radars can also subscribe to the union sensing results. The operators can adopt different charging policies for group members and non-group members, e.g., free or lower charge for group members than non-group members.
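  • The "merge the same moving object" step could look like the following sketch (distance-threshold association is an assumption made here; the SME could equally use tracking or probabilistic fusion):

```python
import math

def merge_detections(reports, merge_dist=1.0):
    """Combine per-radar sensing results into one union result: detections
    from different radars closer than merge_dist are treated as the same
    object and their positions are averaged."""
    merged = []
    for x, y in (det for report in reports for det in report):
        for obj in merged:
            cx, cy, n = obj
            if math.hypot(x - cx, y - cy) < merge_dist:
                obj[0] = (cx * n + x) / (n + 1)  # running average of position
                obj[1] = (cy * n + y) / (n + 1)
                obj[2] = n + 1
                break
        else:
            merged.append([x, y, 1])
    return [(x, y) for x, y, _ in merged]

# car1 and car2 both see the object near (10, 5); the union result keeps one copy.
print(merge_detections([[(10.0, 5.0), (30.0, 2.0)], [(10.4, 5.2)]]))
```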
  • a corresponding functionality module may be integrated at the first device 110 or the third device 130.
  • Fig. 8 illustrates an example module integrated in the first device according to example embodiments of the present disclosure.
  • the first device 110 may include a union sensing control module 810, an uplink forward module 820 and a downlink forward module 830.
  • the JCAS UE (which may be the first device 110) may have a new function to voluntarily join a union sensing group as shown in Fig. 8.
  • the UE owner can enable or disable the union sensing function.
  • When the SME asks the UE sensing radar whether to join a union sensing group, it can answer the SME according to the UE owner’s settings (a sketch of this decision follows below) . This function also forwards the union sensing result to the vehicle navigation system, and forwards the UE’s own sensing result to the SME for the union sensing group.
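  • A hedged sketch of that UE-side decision; the setting name union_sensing_enabled is hypothetical.

    def handle_union_sensing_invitation(owner_settings):
        """Answer the SME's join request according to the UE owner's settings."""
        if owner_settings.get("union_sensing_enabled", False):
            return "CONFIRM"   # scheduled together with the union sensing group
        return "REJECT"        # falls back to separately scheduled sensing

    print(handle_union_sensing_invitation({"union_sensing_enabled": True}))  # CONFIRM
    print(handle_union_sensing_invitation({}))                               # REJECT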
  • Fig. 9 illustrates an example module integrated in the third device according to example embodiments of the present disclosure.
  • the third device 130 may include a vehicle distribution analysis module 910, a position match and union sensing invitation module 920, a union sensing area/group database module 930, a union sensing planning module 940 and a union sensing combination and broadcast module 950.
  • SME 130 should have several new functions to generate union sensing groups, plan union sensing scopes, and combine and broadcast union sensing results.
  • SME 130 collects vehicle positions to determine union sensing areas. According to the vehicle distribution, SME 130 can generate a union sensing group for those areas where vehicles gather. When most vehicles have left a union sensing area, SME 130 can also remove the union sensing group.
  • a union sensing group may be a temporary or permanent group.
  • the JCAS operator can also directly configure a union sensing area for some special areas, e.g. intersections.
  • a union sensing area always corresponds to a union sensing group.
  • a union sensing area will be described by its center position, size, shape and building contour information.
  • a union sensing group may be empty, contain only one UE sensing radar, or contain multiple UE sensing radars.
  • SME 130 will monitor all UE sensing radars’ positions. When a sensing radar leaves its union sensing area, the SME will remove it from the group. In general, the more vehicle sensing radars the union sensing group has, the better the radio resource efficiency and the larger the sensing scope that is ‘seen’ . SME 130 will monitor vehicle positions and match them with union sensing areas.
  • when a vehicle position matches a union sensing area, SME 130 will invite it to join the corresponding union sensing group (a sketch of this matching step follows below) .
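  • A sketch of the position match and invitation step, assuming square union sensing areas described by a centre position and size as above; send_first_request is a stand-in for the invitation delivered via the BS.

    def inside(area, pos):
        """Containment test for a square area given by centre and edge size."""
        (cx, cy), half = area["center"], area["size"] / 2
        return abs(pos[0] - cx) <= half and abs(pos[1] - cy) <= half

    def match_and_invite(areas, vehicle_id, pos, send_first_request):
        """Match a reported vehicle position against the union sensing areas."""
        for area in areas:
            group = area.setdefault("group", set())
            if inside(area, pos) and vehicle_id not in group:
                send_first_request(vehicle_id, area)  # invite into the group
                return area
        return None  # no match: the vehicle stays separately scheduled

    area = {"center": (50.0, 50.0), "size": 100.0}
    match_and_invite([area], "ue-1", (40.0, 60.0),
                     lambda ue, a: print("invite", ue, "to area at", a["center"]))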
  • the union sensing group will periodically sense the union sensing area.
  • the SME will plan all sensing radars’ scanning scopes according to their current poses (positions and orientations) , sensing radar capabilities (FOV) , building contours and shelter status (a coverage sketch follows below) .
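  • The stated planning objectives (maximise the scanned area, minimise repeated scanning, remove dead zones as far as possible) can be checked with a simple coverage model, sketched below. Modelling each radar as a circular sector, and the range and grid-step values, are assumptions for illustration only.

    import math

    def plan_quality(radars, area, step=5.0):
        """Score a candidate plan; radars are (x, y, heading_rad, fov_rad, range_m)."""
        (cx, cy), half = area["center"], area["size"] / 2
        scanned = overlap = dead = 0
        x = cx - half
        while x <= cx + half:
            y = cy - half
            while y <= cy + half:
                n = 0
                for rx, ry, heading, fov, rng in radars:
                    d = math.hypot(x - rx, y - ry)
                    off = (math.atan2(y - ry, x - rx) - heading + math.pi) % (2 * math.pi) - math.pi
                    if d <= rng and abs(off) <= fov / 2:
                        n += 1
                scanned += n >= 1          # cells seen at least once
                overlap += max(0, n - 1)   # repeatedly scanned cells to minimise
                dead += n == 0             # dead-zone cells to minimise
                y += step
            x += step
        return scanned, overlap, dead

    area = {"center": (0.0, 0.0), "size": 60.0}
    plan = [(-20.0, 0.0, 0.0, math.radians(120), 50.0),     # radar facing east
            (20.0, 0.0, math.pi, math.radians(120), 50.0)]  # radar facing west
    print(plan_quality(plan, area))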
  • the sensing scope information will be sent to the BS for sensing resource scheduling.
  • SME 130 will combine all union sensing radars’ results to generate a union sensing result, which will cover the whole union sensing area and be shared by all group members.
  • the disclosure can be implemented via future JCAS UE, BS and SME.
  • the UE will be installed on a vehicle to provide both communication and sensing functions. Its sensing function will work as a monostatic radar to scan surrounding fixed or moving objects for navigation positioning and obstacle avoidance.
  • the BS will schedule radio resources (time/frequency/space) for the UE sensing radars.
  • the SME will manage and control those UE sensing radars. In this disclosure, the SME will organize those adjacent UE sensing radars to unitedly scan a special area, avoiding repeated scanning while removing sensing dead zones as much as possible. As such, the disclosure can improve JCAS radio resource efficiency and prevent the sensing function from impacting the communication user experience.
  • Fig. 10 shows a flowchart of an example method 1000 implemented at a first device (for example, the first device 110) in accordance with some embodiments of the present disclosure.
  • the method 1000 will be described from the perspective of the first device 110 with reference to Fig. 1.
  • the first device 110 receives, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area.
  • the first device 110 transmits, to the second device, a confirmation message for the first request.
  • the first device 110 receives resource configuration information for separate sensing from a second device.
  • the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
  • the first device 110 may further: transmit, to the second device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device.
  • the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
  • the first request is determined based on the pose information of the first device and is transmitted by the third device.
  • the first device 110 transmits the confirmation message, and the first device may further: receive, from the second device, a third request for current pose information of one or more sensing devices in the sensing device group; and transmit, to the second device, a pose report of at least one of a position or an orientation of the first device.
  • the first device transmits the confirmation message, and the first device may further: receive, from the second device, a resource configuration information for the union sensing, wherein the resource configuration for the union sensing indicates a set of sensing resources that are determined based on the pose report.
  • the set of sensing resources are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  • the first device may further: perform, based on the resource configuration information for the union sensing, a sensing procedure to obtain a sensing result of the first device; transmit the sensing result to the second device; and receive, from the second device, a union sensing result that is at least partially based on the sensing result of the first device.
  • based on that the first device transmits the confirmation message, the first device is configured with a first charging policy; or based on that the first device refrains from joining the sensing device group, the first device is configured with a second charging policy (a sketch of this selection follows below) .
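  • A trivial sketch of selecting between the two charging policies; the policy contents (tariff values) are invented for the example.

    def charging_policy(is_group_member: bool) -> dict:
        # First charging policy for group members, second for non-members.
        if is_group_member:
            return {"policy": "first", "rate_per_result": 0.0}
        return {"policy": "second", "rate_per_result": 1.0}

    print(charging_policy(True), charging_policy(False))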
  • the first device may receive the resource configuration for separate sensing by: transmitting, to the second device, a rejection message for the first request.
  • the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
  • a second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ;
  • a third device comprises a core network (CN) device or a sensing management entity (SME) .
  • Fig. 11 shows a flowchart of an example method 1100 implemented at a second device (for example, the second device 120) in accordance with some embodiments of the present disclosure.
  • the method 1100 will be described from the perspective of the second device 120 with reference to Fig. 1.
  • the second device 120 receives, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area.
  • the second device 120 transmits the first request to the first device.
  • the second device 120 receives, from the first device, a confirmation message for the first request.
  • the second device 120 transmits a first resource configuration information for separate sensing to the first device.
  • the confirmation message is received, and the second device may further transmit the confirmation message to the third device.
  • the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
  • the second device may further receive, from the first device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device; and transmit the second request to the third device.
  • the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
  • the first request is determined based on the pose information of the first device and is transmitted by the third device.
  • the second device may further receive, from a third device, a third request for current pose information of one or more sensing devices in the sensing device group; transmit the third request to the first device; receive, from the first device, a pose report of at least one of a position or an orientation of the first device; and transmit the pose report to the third device.
  • the second device may further receive, from a third device, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; determine, based on the plurality of the scanning scopes, a plurality of resource configuration information for the union sensing; and transmit, to the sensing device of the one or more sensing devices, a resource configuration information of the plurality of the resource configuration information, wherein the resource configuration information is for sensing the at least one of the plurality of the scanning scopes.
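  • One hypothetical way the second device could derive a resource configuration from a scanning scope; the fields (beam_ids, time_slots) and the beam-width and slot parameters are invented to illustrate the step, not taken from the disclosure.

    def scope_to_resource_config(scope_deg, beam_width_deg=30.0, slots_per_beam=2):
        """Map an angular scanning scope to a beam/slot resource configuration."""
        n_beams = max(1, round(scope_deg / beam_width_deg))
        return {"beam_ids": list(range(n_beams)),        # beams sweeping the scope
                "time_slots": n_beams * slots_per_beam}  # slots the BS schedules

    print(scope_to_resource_config(120.0))  # a 120-degree scope -> 4 beams, 8 slots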
  • the second device may further receive one or more sensing results from the one or more sensing devices; transmit, to the third device, the one or more sensing results; and receive, from the third device, a union sensing result that is based on the one or more sensing results.
  • the second device may further receive, from the third device, a first indication of the first scanning scope associated with a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; determine, based on the first indication of the first scanning scope, a first resource configuration information; and transmit, to the fourth device, the first resource configuration information.
  • the second device may further receive, from the fourth device, a first sensing result of the fourth device; transmit, to the third device, the first sensing result; receive, from the third device, a second indication of the second scanning scope associated with a fifth device of the one or more sensing devices, wherein the second scanning scope is determined based on the first sensing result; determine, based on the second indication of the second scanning scope, a second resource configuration information; and transmit, to the fifth device, the second resource configuration information.
  • the second device may further: receive, from the fifth device, a second sensing result; transmit, to the third device, the second sensing result; and receive, from the third device, a union sensing result that is at least based on the first sensing result and the second sensing result (the serial flow across these three items is sketched below) .
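  • A minimal sketch of that serial flow, in which each radar’s scanning scope is planned only after the previous radar’s result arrives; plan_scope and do_sensing are hypothetical stand-ins for the planning step at the third device and the scheduled sensing step.

    def serial_union_sensing(radars, plan_scope, do_sensing):
        """Plan each radar's scope only after all earlier results are available."""
        results = []
        for radar in radars:
            scope = plan_scope(radar, results)  # may shrink as results accumulate
            results.append(do_sensing(radar, scope))
        return results  # combined afterwards into the union sensing result

    # Toy usage: each later radar is given a smaller scope to scan.
    print(serial_union_sensing(
        ["radar-1", "radar-2", "radar-3"],
        plan_scope=lambda radar, results: 360 - 120 * len(results),
        do_sensing=lambda radar, scope: (radar, scope)))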
  • the second device may further: transmit the union sensing result to the one or more sensing devices; and transmit the union sensing result to a further device which is not included in the sensing device group.
  • the one or more sensing devices are configured with a first charging policy; and the further device is configured with a second charging policy.
  • the scanning scopes are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  • the second device may transmit the resource configuration information for separate sensing by: receiving, from the first device, a rejection message for the first request.
  • the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
  • a second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ;
  • a third device comprises a core network (CN) device or a sensing management entity (SME) .
  • Fig. 12 shows a flowchart of an example method 1200 implemented at a third device (for example, the third device 130) in accordance with some embodiments of the present disclosure.
  • the method 1200 will be described from the perspective of the third device 130 with reference to Fig. 1.
  • the third device 130 transmits, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area.
  • the third device 130 receives, from the second device, a confirmation message for the first request.
  • the third device 130 includes the first device in the sensing device group to perform the union sensing.
  • the sensing device group is determined by the third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area.
  • the third device may further: obtain pose information of the first device, wherein the pose information comprises at least one of a position or an orientation of the first device.
  • the third device may further receive, from the second device, a second request for sensing, wherein the second request is transmitted from the first device and the second request comprises at least one of pose information or sensing capability information of the first device.
  • the third device may transmit the first request by: determining, based on the pose information of the first device, whether a first area to be sensed by the first device is associated with the sensing area; based on determining that the first area is associated with the sensing area, transmitting the first request.
  • the third device may further: monitor a position of at least one sensing device in the sensing device group; determine, based on the monitoring, whether the at least one sensing device leaves the sensing area; and based on determining that the at least one sensing device leaves the sensing area, remove the at least one sensing device from the sensing device group.
  • the third device may further: determine whether a first number of sensing devices in the sensing device group is below a number threshold; based on determining that the first number is below the number threshold, cancel the sensing device group.
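  • A sketch of the two preceding maintenance steps (removing members that left the area, cancelling undersized groups); MIN_GROUP_SIZE is an assumed number threshold.

    MIN_GROUP_SIZE = 2  # assumed threshold below which the group is cancelled

    def maintain_group(group, positions, area_contains):
        """Drop members that left the sensing area; cancel undersized groups."""
        group["members"] = {m for m in group["members"] if area_contains(positions[m])}
        if len(group["members"]) < MIN_GROUP_SIZE:
            return None  # sensing device group cancelled
        return group

    group = {"members": {"ue-1", "ue-2", "ue-3"}}
    positions = {"ue-1": (0, 0), "ue-2": (500, 500), "ue-3": (10, 10)}
    print(maintain_group(group, positions,
                         lambda p: abs(p[0]) < 100 and abs(p[1]) < 100))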
  • the third device may further: determine a permanent sensing device group for the sensing area.
  • the third device may further: transmit, to the second device, a third request for current pose information of one or more sensing devices in the sensing device group; receive, from the second device, one or more pose reports of the one or more sensing devices, wherein a pose report of the one or more pose reports indicates at least one of a position or an orientation of a sensing device of the one or more sensing devices.
  • the third device may further: determine, based on the one or more pose reports, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; and transmit, to the second device, an indication of the plurality of scanning scopes.
  • the third device may further: receive, from the second device, one or more sensing results of the one or more sensing devices; determine, based on the one or more sensing results, a first union sensing result; and transmit, to the second device, the first union sensing result.
  • the third device may further: determine, based on the one or more pose reports, a first scanning scope to be sensed by a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; and transmit, to the second device, an indication of the first scanning scope.
  • the third device may further: receive, from the second device, a first sensing result of the fourth device; determine, based on the first sensing result and the one or more pose reports, a second scanning scope to be sensed by a fifth device of the one or more sensing devices; and transmit, to the second device, an indication of the second scanning scope.
  • the third device may further: receive, from the second device, a second sensing result of the fifth device; determine, based on at least the first sensing result and the second sensing result, a second union sensing result; and transmit, to the second device, the second union sensing result.
  • the scanning scope is determined by the third device further based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  • the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
  • the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ;
  • the third device comprises a core network (CN) device or a sensing management entity (SME) .
  • an apparatus capable of performing any of the method 1000 may include means for receiving, at a first device and from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and means for i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving a resource configuration information for separate sensing from a second device.
  • the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
  • the apparatus may further include means for transmitting, to the second device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device.
  • the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
  • the first request is determined based on the pose information of the first device and is transmitted by the third device.
  • the apparatus includes means for transmitting the confirmation message, and the apparatus may further include means for receiving, from the second device, a third request for current pose information of one or more sensing devices in the sensing device group; and means for transmitting, to the second device, a pose report of at least one of a position or an orientation of the first device.
  • the apparatus includes means for transmitting the confirmation message, and the apparatus may further include means for receiving, from the second device, a resource configuration information for the union sensing, wherein the resource configuration for the union sensing indicates a set of sensing resources that are determined based on the pose report.
  • the set of sensing resources are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  • the apparatus may further include means for performing, based on the resource configuration information for the union sensing, a sensing procedure to obtain a sensing result of the first device; means for transmitting the sensing result to the second device; and means for receiving, from the second device, a union sensing result that is at least partially based on the sensing result of the first device.
  • based on that the apparatus transmits the confirmation message, the first device is configured with a first charging policy; or based on that the first device refrains from joining the sensing device group, the first device is configured with a second charging policy.
  • the first device may receive the resource configuration for separate sensing by: transmitting, to the second device, a rejection message for the first request.
  • the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
  • a second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ;
  • a third device comprises a core network (CN) device or a sensing management entity (SME) .
  • an apparatus capable of performing any of the method 1100 may include means for receiving, at a second device and from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for transmitting the first request to the first device; and means for i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
  • the confirmation message is received, and the second device may further transmit the confirmation message to the third device.
  • the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
  • the apparatus may further include means for receiving, from the first device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device; and means for transmitting the second request to the third device.
  • the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
  • the first request is determined based on the pose information of the first device and is transmitted by the third device.
  • the apparatus may further include means for receiving, from a third device, a third request for current pose information of one or more sensing devices in the sensing device group; means for transmitting the third request to the first device; means for receiving, from the first device, a pose report of at least one of a position or an orientation of the first device; and means for transmitting the pose report to the third device.
  • the apparatus may further include means for receiving, from a third device, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; means for determining, based on the plurality of the scanning scopes, a plurality of resource configuration information for the union sensing; and means for transmitting, to the sensing device of the one or more sensing devices, a resource configuration information of the plurality of the resource configuration information, wherein the resource configuration information is for sensing the at least one of the plurality of the scanning scopes.
  • the apparatus may further include means for receiving one or more sensing results from the one or more sensing devices; means for transmitting, to the third device, the one or more sensing results; and means for receiving, from the third device, a union sensing result that is based on the one or more sensing results.
  • the apparatus may further include means for receiving, from the third device, a first indication of the first scanning scope associated with a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; means for determining, based on the first indication of the first scanning scope, a first resource configuration information; and means for transmitting, to the fourth device, the first resource configuration information.
  • the apparatus may further include means for receiving, from the fourth device, a first sensing result of the fourth device; means for transmitting, to the third device, the first sensing result; means for receiving, from the third device, a second indication of the second scanning scope associated with a fifth device of the one or more sensing devices, wherein the second scanning scope is determined based on the first sensing result; means for determining, based on the second indication of the second scanning scope, a second resource configuration information; and means for transmitting, to the fifth device, the second resource configuration information.
  • the apparatus may further include means for receiving, from the fifth device, a second sensing result; means for transmitting, to the third device, the second sensing result; and means for receiving, from the third device, a union sensing result that is at least based on the first sensing result and the second sensing result.
  • the apparatus may further include means for transmitting the union sensing result to the one or more sensing devices; and means for transmitting the union sensing result to a further device which is not included in the sensing device group.
  • the one or more sensing devices are configured with a first charging policy; and the further device is configured with a second charging policy.
  • the scanning scopes are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  • the means for transmitting the resource configuration information for separate sensing includes: means for receiving, from the first device, a rejection message for the first request.
  • the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
  • a second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ;
  • a third device comprises a core network (CN) device or a sensing management entity (SME) .
  • an apparatus capable of performing any of the method 1200 may include means for transmitting, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for receiving, from the second device, a confirmation message for the first request; and means for including the first device in the sensing device group to perform the union sensing.
  • the sensing device group is determined by the third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area.
  • the apparatus may further include means for obtaining pose information of the first device, wherein the pose information comprises at least one of a position or an orientation of the first device.
  • the apparatus may further include means for receiving, from the second device, a second request for sensing, wherein the second request is transmitted from the first device and the second request comprises at least one of pose information or sensing capability information of the first device.
  • the means for transmitting the first request includes: means for determining, based on the pose information of the first device, whether a first area to be sensed by the first device is associated with the sensing area; based on determining that the first area is associated with the sensing area, means for transmitting the first request.
  • the apparatus may further include means for monitoring a position of at least one sensing device in the sensing device group; means for determining, based on the monitoring, whether the at least one sensing device leaves the sensing area; and means for removing, based on determining that the at least one sensing device leaves the sensing area, the at least one sensing device from the sensing device group.
  • the apparatus may further include means for determining whether a first number of sensing devices in the sensing device group is below a number threshold; and means for cancelling, based on determining that the first number is below the number threshold, the sensing device group.
  • the apparatus may further include means for determining a permanent sensing device group for the sensing area.
  • the apparatus may further include means for transmitting, to the second device, a third request for current pose information of one or more sensing devices in the sensing device group; means for receiving, from the second device, one or more pose reports of the one or more sensing devices, wherein a pose report of the one or more pose report indicates at least one of a position or an orientation of a sensing device of the one or more sensing devices.
  • the apparatus may further include means for determining, based on the one or more pose reports, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; and means for transmitting, to the second device, an indication of the plurality of scanning scopes.
  • the apparatus may further include means for receiving, from the second device, one or more sensing results of the one or more sensing devices; means for determining, based on the one or more sensing results, a first union sensing result; and means for transmitting, to the second device, the first union sensing result.
  • the apparatus may further include means for determining, based on the one or more pose reports, a first scanning scope to be sensed by a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; and means for transmitting, to the second device, an indication of the first scanning scope.
  • the apparatus may further include means for receiving, from the second device, a first sensing result of the fourth device; means for determining, based on the first sensing result and the one or more pose reports, a second scanning scope to be sensed by a fifth device of the one or more sensing devices; and means for transmitting, to the second device, an indication of the second scanning scope.
  • the apparatus may further include means for receiving, from the second device, a second sensing result of the fifth device; means for determining, based on at least the first sensing result and the second sensing result, a second union sensing result; and means for transmitting, to the second device, the second union sensing result.
  • the scanning scope is determined by the third device further based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  • the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
  • the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ;
  • the third device comprises a core network (CN) device or a sensing management entity (SME) .
  • Fig. 13 is a simplified block diagram of a device 1300 that is suitable for implementing embodiments of the present disclosure.
  • the device 1300 may be provided to implement the communication device, for example the first device 110 to the third device 130 as shown in Fig. 1A.
  • the device 1300 includes one or more processors 1310, one or more memories 1320 coupled to the processor 1310, and one or more transmitters and/or receivers (TX/RX) 1340 coupled to the processor 1310.
  • the TX/RX 1340 is for bidirectional communications.
  • the TX/RX 1340 has at least one antenna to facilitate communication.
  • the communication interface may represent any interface that is necessary for communication with other network elements.
  • the processor 1310 may be of any type suitable to the local technical network and may include one or more of the following: general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples.
  • the device 1300 may have multiple processors, such as an application specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
  • the memory 1320 may include one or more non-volatile memories and one or more volatile memories.
  • the non-volatile memories include, but are not limited to, a read only memory (ROM) 1324, an electrically programmable read only memory (EPROM) , a flash memory, a hard disk, a compact disc (CD) , a digital video disk (DVD) , and other magnetic storage and/or optical storage.
  • the volatile memories include, but are not limited to, a random access memory (RAM) 1322 and other volatile memories that do not retain their contents during power-down.
  • a program 1330 includes executable instructions that are executed by the associated processor 1310.
  • the program 1330 may be stored in the ROM 1324.
  • the processor 1310 may perform any suitable actions and processing by loading the program 1330 into the RAM 1322.
  • the embodiments of the present disclosure may be implemented by means of the program so that the device 1300 may perform any process of the disclosure as discussed with reference to Figs. 2 to 12.
  • the embodiments of the present disclosure may also be implemented by hardware or by a combination of software and hardware.
  • the program 1330 may be tangibly contained in a readable storage medium which may be included in the device 1300 (such as in the memory 1320) or other storage devices that are accessible by the device 1300.
  • the device 1300 may load the program 1330 from the storage medium to the RAM 1322 for execution.
  • the storage medium may include any types of tangible non-volatile storage, such as ROM, EPROM, a flash memory, a hard disk, CD, DVD, and the like.
  • Fig. 14 shows an example of the storage medium 1400 in the form of a CD or DVD.
  • the storage medium has the processor instructions 1330 stored therein.
  • various embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representations, it is to be understood that the block, apparatus, system, technique or method described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • the present disclosure also provides at least one program product tangibly stored on a non-transitory readable storage medium.
  • the program product includes executable instructions, such as those included in program modules, being executed in a device on a target real or virtual processor, to carry out process 200, the method 1000, 1100 or 1200 as described above with reference to Fig. 2 to Fig. 12.
  • program modules include routines, programs, libraries, objects, classes, components, data structures, or the like that perform particular tasks or implement particular abstract data types.
  • the functionality of the program modules may be combined or split between program modules as desired in various embodiments.
  • Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules may be located in both local and remote storage media.
  • Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
  • program codes or related data may be carried by any suitable carrier to enable the device, apparatus or processor to perform various processes and operations as described above.
  • Examples of the carrier include a signal, readable storage medium, and the like.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • a readable storage medium may include but not limited to an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random-access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM or Flash memory) , an optical fiber, a portable compact disc read-only memory (CD-ROM) , an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • non-transitory is a limitation of the medium itself (i.e., tangible, not a signal) as opposed to a limitation on data storage persistency (e.g., RAM vs. ROM) .

Abstract

Embodiments of the present disclosure disclose devices, methods and apparatuses for sensing scheduling. In the embodiments, a first device receives, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area. Then, the first device: i) transmits, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receives a resource configuration information for separate sensing from the second device. In this way, the radio resource efficiency of the communication and sensing system can be improved.

Description

SENSING SCHEDULING
FIELD
Embodiments of the present disclosure generally relate to the field of telecommunication, and in particular, to devices, methods, apparatuses and computer readable storage medium for sensing scheduling.
BACKGROUND
With the development of communication technology, the sensing capability can also be supported, enabled or integrated in a communication system. For example, Joint Communication and Sensing (JCAS) has become one of the hottest topics for the next generation mobile network. In some cases, JCAS may also be referred to as Integrated Sensing and Communication (ISAC) . It makes use of communication system resources and infrastructure (spectrum, hardware, sites, and so on) to implement a radar sensing function. Specifically, a sensing radar in the JCAS system may scan its surroundings to measure distances and/or velocities of static or dynamic objects, such as bicycles, vehicles, buildings, pedestrians, and so on. Both the BS and the UE in a JCAS system can integrate the sensing radar function.
In turn, with the increasing demand for sensing in the JCAS system, the efficient utilization of communication and/or sensing resources becomes a key aspect.
SUMMARY
In general, example embodiments of the present disclosure provide devices, methods, apparatuses and computer readable storage medium for sensing scheduling.
In a first aspect, there is provided a first device. The first device may comprise at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the first device to: receive, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area. The first device is further caused to: i) transmit, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receive a resource configuration information for separate sensing from a second device.
In a second aspect, there is provided a second device. The second device may comprise at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the second device to: receive, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area. The second device is further caused to: i) receive, from the first device, a confirmation message for the first request, or ii) transmit a first resource configuration information for separate sensing to the first device.
In a third aspect, there is provided a third device. The third device may comprise at least one processor; and at least one memory storing instructions that, when executed by the at least one processor, cause the third device to: transmit, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area. The third device is further caused to receive, from the second device, a confirmation message for the first request, and include the first device in the sensing device group to perform the union sensing.
In a fourth aspect, there is provided a method implemented at a first device. The method comprises: receiving, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area. The method further comprises: i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving a resource configuration information for separate sensing from a second device.
In a fifth aspect, there is provided a method implemented at a second device. The method comprises: receiving, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; transmitting the first request to the first device; and i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
In a sixth aspect, there is provided a method implemented at a third device. The method comprises: transmitting, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; receiving, from the second device, a confirmation message for the first request; and including the first device in the sensing device group to perform the union sensing.
In a seventh aspect, there is provided an apparatus. The apparatus comprises: means for receiving, at a first device and from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and means for i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving a resource configuration information for separate sensing from a second device.
In an eighth aspect, there is provided an apparatus. The apparatus comprises: means for receiving, at a second device and from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for transmitting the first request to the first device; and means for i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
In a ninth aspect, there is provided an apparatus. The apparatus comprises: means for transmitting, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for receiving, from the second device, a confirmation message for the first request; and means for including the first device in the sensing device group to perform the union sensing.
In a tenth aspect, there is provided a non-transitory computer readable medium comprising program instructions for causing an apparatus to perform at least the method according to the fourth aspect to the sixth aspect.
In an eleventh aspect, there is provided a first device. The first device comprises a receiving circuitry configured to receive, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and i) a transmitting circuitry configured to transmit a confirmation message for the first request to the second device, or ii) a receiving circuitry configured to receive, in the case that the first device refrains from joining the sensing device group, a resource configuration information for separate sensing from a second device.
In a twelfth aspect, there is provided a second device. The second device comprises a receiving circuitry configured to receive, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; a transmitting circuitry configured to transmit the first request to the first device; and i) a receiving circuitry configured to receive, from the first device, a confirmation message for the first request, or ii) a transmitting circuitry configured to transmit a first resource configuration information for separate sensing to the first device.
In a thirteenth aspect, there is provided a third device. The third device comprises: a transmitting circuitry configured to transmit, to a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; a receiving circuitry configured to receive, from the second device, a confirmation message for the first request; and an including circuitry configured to include the first device in the sensing device group to perform the union sensing.
It is to be understood that the summary section is not intended to identify key or essential features of embodiments of the present disclosure, nor is it intended to be used to limit the scope of the present disclosure. Other features of the present disclosure will become easily comprehensible through the following description.
BRIEF DESCRIPTION OF THE DRAWINGS
Some example embodiments will now be described with reference to the accompanying drawings, where:
Fig. 1A illustrates an example network environment in which example embodiments of the present disclosure may be implemented;
Fig. 1B illustrates another example network environment in which example embodiments of the present disclosure may be implemented;
Fig. 2 illustrates an example signaling process for sensing scheduling according to example embodiments of the present disclosure;
Fig. 3 illustrates an example signaling process for joining UE sensing radar into a sensing device group according to example embodiments of the present disclosure;
Fig. 4 illustrates an example signaling process for parallel union sensing according to example embodiments of the present disclosure;
Fig. 5 illustrates an example signaling process for serial union sensing according to example embodiments of the present disclosure;
Figs. 6A to 6E illustrate example situations of determination of the scanning scopes for members in the sensing device group according to example embodiments of the present disclosure;
Fig. 7 illustrates an example of general flowchart for sensing scheduling according to example embodiments of the present disclosure;
Fig. 8 illustrates an example module integrated in the first device according to example embodiments of the present disclosure;
Fig. 9 illustrates an example module integrated in the third device according to example embodiments of the present disclosure;
Fig. 10 illustrates an example flowchart of a method implemented at a first device according to example embodiments of the present disclosure;
Fig. 11 illustrates an example flowchart of a method implemented at a second device according to example embodiments of the present disclosure;
Fig. 12 illustrates an example flowchart of a method implemented at a third device according to example embodiments of the present disclosure;
Fig. 13 illustrates an example simplified block diagram of an apparatus that is suitable for implementing embodiments of the present disclosure; and
Fig. 14 illustrates an example block diagram of an example computer readable medium in accordance with some embodiments of the present disclosure.
Throughout the drawings, the same or similar reference numerals represent the same or similar element.
DETAILED DESCRIPTION
Principle of the present disclosure will now be described with reference to some example embodiments. It is to be understood that these embodiments are described only for the purpose of illustration and help those skilled in the art to understand and implement the present disclosure, without suggesting any limitation as to the scope of the disclosure. The disclosure described herein may be implemented in various manners other than the ones described below.
In the following description and claims, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skills in the art to which the present disclosure belongs.
References in the present disclosure to “one embodiment, ” “an embodiment, ” “an example embodiment, ” and the like indicate that the embodiment described may include a particular feature, structure, or characteristic, but it is not necessary that every embodiment includes the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to affect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
It may be understood that although the terms “first” and “second” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term “and/or” includes any and all combinations of one or more of the listed terms.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a” , “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” , “comprising” , “has” , “having” , “includes” and/or “including” , when used herein, specify the presence of stated features, elements, and/or components etc., but do not preclude the presence or addition of one or more other features, elements, components and/or combinations thereof.
As used in this application, the term “circuitry” may refer to one or more or all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and
(b) combinations of hardware circuits and software, such as (as applicable) :
(i) a combination of analog and/or digital hardware circuit (s) with software/firmware and
(ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and
(c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.
As used herein, the term “communication network” refers to a network following any suitable communication standards, such as long term evolution (LTE) , LTE-advanced (LTE-A) , wideband code division multiple access (WCDMA) , high-speed packet access (HSPA) , narrow band Internet of things (NB-IoT) and so on. Furthermore, the communications between a terminal device and a network device in the communication network may be performed according to any suitable generation communication protocols, including, but not limited to, the third generation (3G) , the fourth generation (4G) , 4.5G, the fifth generation (5G) communication protocols, 5G-A, and/or beyond. Embodiments of the present disclosure may be applied in various communication systems. Given the rapid development in communications, there will of course also be future type communication technologies and systems with which the present disclosure may be embodied. It should not be seen as limiting the scope of the present disclosure to only the aforementioned system.
As used herein, the term “network device” refers to a node in a communication network via which a terminal device accesses the network and receives services therefrom. The network device may refer to a base station (BS) or an access point (AP) , for example, a node B (NodeB or NB) , an evolved NodeB (eNodeB or eNB) , a NR NB (also referred to as a gNB) , a remote radio unit (RRU) , a radio header (RH) , a remote radio head (RRH) , a relay, a low power node such as a femto, a pico, and so forth, depending on the applied terminology and technology.
The term “terminal device” refers to any end device that may be capable of wireless communication. By way of example rather than limitation, a terminal device may also be referred to as a communication device, user equipment (UE), a subscriber station (SS), a portable subscriber station, a mobile station (MS), or an access terminal (AT). The terminal device may include, but is not limited to, a mobile phone, a cellular phone, a smart phone, voice over IP (VoIP) phones, wireless local loop phones, a tablet, a wearable terminal device, a personal digital assistant (PDA), portable computers, desktop computers, image capture terminal devices such as digital cameras, gaming terminal devices, music storage and playback appliances, vehicle-mounted wireless terminal devices, wireless endpoints, mobile stations, laptop-embedded equipment (LEE), laptop-mounted equipment (LME), USB dongles, smart devices, wireless customer-premises equipment (CPE), an Internet of things (IoT) device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications (e.g., remote surgery), an industrial device and applications (e.g., a robot and/or other wireless devices operating in industrial and/or automated processing chain contexts), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, and the like. In the following description, the terms “terminal device”, “communication device”, “terminal”, “user equipment” and “UE” may be used interchangeably.
The terms “Joint Communication and Sensing (JCAS) ” and “Integrated Sensing and Communication (ISAC) ” used herein can be used interchangeably without any limitation.
In this disclosure, the term “union sensing” refers to a scheme in which a group of sensing devices perform respective sensing procedures coordinated by a sensing management entity, in order to achieve efficient resource usage. The members of the sensing device group may comprise any device which is configured with sensing capability, for example, a UE, a BS, an access point or an IoT device and so on.
As mentioned above, the efficient utilization of communication and/or sensing resources becomes a key aspect. Only for discussion purposes and without any limitation, some example situations regarding sensing and communication resource allocation are discussed below.
Generally, a sensing radar in the JCAS system may have six sensing modes, including monostatic sensing radar modes (transmitter and receiver at the same BS or UE) and bistatic sensing radar modes (transmitter and receiver at different BSs, transmitter and receiver at different UEs, and transmitter and receiver at a BS/UE and a UE/BS, respectively). In some cases, a sensing radar may be installed on a moving platform (for example, a vehicle or a drone), and therefore it becomes a mobile sensing radar. The sensing radar may request radio resource from a network device to which the sensing radar accesses. However, at some places, for example, intersections, many vehicles installed with sensing radars may gather at certain times. Some vehicles in one direction are passing, and some vehicles in another direction are waiting. These sensing radars may consume too much radio resource of their access BS and may impact normal communication services. How to improve UE sensing radar resource utilization efficiency and minimize the impact to communication services is very important for the JCAS system.
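Only for illustration purposes, the six sensing modes mentioned above may be enumerated as in the following Python sketch; the labels are illustrative names and are not defined by the disclosure.

```python
from enum import Enum

class SensingMode(Enum):
    """Illustrative labels for the six JCAS sensing radar modes."""
    BS_MONOSTATIC = 1   # transmitter and receiver at the same BS
    UE_MONOSTATIC = 2   # transmitter and receiver at the same UE
    BS_BS_BISTATIC = 3  # transmitter and receiver at different BSs
    UE_UE_BISTATIC = 4  # transmitter and receiver at different UEs
    BS_UE_BISTATIC = 5  # transmitter at a BS, receiver at a UE
    UE_BS_BISTATIC = 6  # transmitter at a UE, receiver at a BS
```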
In fact, when several adjacent UE sensing radars sense their surroundings, they may repeatedly scan some common areas (which is further discussed with reference to Fig. 1B). In most cases, their sensing results are the same. To avoid interference between each other, these sensing radars should occupy different radar resources (time/frequency/space), i.e., orthogonal resources. That will lead to too much radio resource being used for sensing, such that communication users cannot obtain enough radio resource.
In view of the above and in order to improve the performance of a communication and sensing system, a scheme for sensing scheduling is provided. In this scheme, a first device receives, from a second device, a first request for requesting the first device to join a sensing device group. This sensing device group is configured to perform a union sensing in a sensing area. Then, the first device may transmit a confirmation message for the first request to the second device, such that the first device may be scheduled with resources for the union sensing. In addition, in the case that the first device refrains from joining the sensing device group, the first device receives resource configuration information for separate sensing from the second device.
In this way, a centralized controlling and scheduling mechanism can organize those sensing radars (for example, adjacent sensing radars) to unitedly sense a specific area. When a sensing radar has scanned a common area, other UE sensing radars may avoid repeatedly scanning that area as much as possible and instead try to scan the remaining dead zones. In addition, a UE sensing radar’s sensing result can be shared in the sensing group. That can make each sensing radar ‘see’ a larger scope. As such, the sensing and communication resources in the network can be efficiently utilized without any performance degradation. The above scheme focuses on UE monostatic sensing radar, which can be applied for vehicle autonomous driving and assisted driving. Compared with cameras and lidar, a JCAS sensing radar can work well in poor weather conditions. A vehicle monostatic sensing radar can detect vehicle surrounding objects for autonomous navigation and collision avoidance. Without any limitation, the above scheme may also be applied to any other sensing radar mode.
Principles and embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. Fig. 1A illustrates an example network environment 100 in which example embodiments of the present disclosure may be implemented. The environment 100, which may be a part of a communication network, includes terminal devices and network devices.
As illustrated in Fig. 1A, the network environment 100 may include a first device 110, a second device 120 and a third device 130. In some embodiments, the first device 110 may be a terminal device that is integrated into a vehicle. For discussion simplicity, the first device 110 may also be referred to as the vehicle 110 as shown in Fig. 1A. In some embodiments, the second device 120 may be a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB). For example, the second device 120 may be a Wi-Fi access point or a BS. In some embodiments, the third device 130 may be a core network (CN) device configured with sensing management functionality or a sensing management entity (SME). Without any limitation, the first device 110, the second device 120 and the third device 130 may be any other devices having similar sensing requirements or functionalities. In addition, the network environment 100 further includes a sensing device group 140. The sensing device group 140 may include devices 140-1, 140-2 and 140-3. In an example, the third device 130 may, via the second device 120, schedule the (sensing) devices in the sensing device group 140 to unitedly perform a sensing procedure in a corresponding sensing area, for example, the area in the block as shown in Fig. 1A. As discussed in the following, once the first device 110 is to sense the area associated with the sensing device group 140, the third device 130 may invite the first device 110 to join the sensing device group 140 for performing union sensing.
It is to be understood that the number of devices in Fig. 1A is given only for the purpose of illustration without suggesting any limitations. The network environment 100 may include any suitable number of network devices and/or terminal devices adapted for implementing embodiments of the present disclosure. Although not shown, it would be appreciated that one or more terminal devices may be located in the network environment 100.
Communications in the network environment 100 may be implemented according to any proper communication protocol(s), comprising, but not limited to, the third generation (3G), the fourth generation (4G), the fifth generation (5G), 5G-Advanced or beyond (6G), wireless local network communication protocols such as Institute of Electrical and Electronics Engineers (IEEE) 802.11 and the like, and/or any other protocols currently known or to be developed in the future. Moreover, the communication may utilize any proper wireless communication technology, comprising but not limited to: multiple-input multiple-output (MIMO), orthogonal frequency division multiplexing (OFDM), time division multiplexing (TDM), frequency division multiplexing (FDM), code division multiplexing (CDM), Bluetooth, ZigBee, machine type communication (MTC), enhanced mobile broadband (eMBB), massive machine type communication (mMTC), ultra-reliable low latency communication (URLLC), carrier aggregation (CA), dual connectivity (DC), and new radio unlicensed (NR-U) technologies.
Fig. 1B illustrates another example network environment in which example embodiments of the present disclosure may be implemented.
As shown in Fig. 1B, there are three cars in the same area, i.e., the intersection in Fig. 1B. Therefore, a very large area may be repeatedly scanned by these three cars’ sensing radars. In turn, to avoid interference with each other, these UE sensing radars should occupy different sensing resources (time/frequency/space), i.e., orthogonal resources. However, this may lead to too much radio resource being used for sensing, such that communication users cannot obtain enough radio resource. In turn, with the scheme discussed with reference to the following embodiments, the resources for sensing and communication can be utilized efficiently.
Fig. 2 illustrates an example signaling process 200 for sensing scheduling according to example embodiments of the present disclosure. For the purpose of discussion, the signaling process 200 will be described with reference to Fig. 1A. It would be appreciated that although the signaling process 200 has been described in the communication environment 100 of Fig. 1A, the signaling process 200 may likewise be applied to other communication scenarios.
In the signaling process 200, the third device 130 transmits (210) a first request 215 for requesting the first device 110 to join a sensing device group 140 to the second device 120. This sensing device group 140 is configured to perform a union sensing in a sensing area. In some embodiments, the third device 130 may determine or initiate (201) the sensing device group based on one or more of: a distribution of one or more sensing devices and environment information on the sensing area. The environment information includes at least one of a building distribution, a facility distribution or a road arrangement within the sensing area. In an example, the third device 130 may collect sensing device positions to determine union sensing areas. Specifically, the third device 130 may autonomously select a union sensing area according to vehicle distribution. For example, an area gathering a lot of vehicles may be defined as a union sensing area. In turn, after determining the sensing device group for the union sensing area, the sensing area can be jointly sensed by sensing devices of the sensing device group (for example, located in the union sensing area). In addition or alternatively, the sensing device group may be a temporary group or a permanent group. For example, the operator of the JCAS may directly configure a union sensing area for some special areas, for example, intersections. A union sensing area always corresponds to a union sensing group. Furthermore, a union sensing group always corresponds to a sensing area to be unitedly sensed (which may also be referred to as a union sensing area). In an example, the union sensing area may be given with a center position, size, shape and building contour information. In this disclosure, the union sensing area may also be referred to as a sensing area associated with a sensing device group.
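Only for discussion purposes, the following Python sketch illustrates one possible way for the third device 130 to derive union sensing areas from a vehicle distribution, here by counting reported positions per grid cell. The grid-based method, the cell size and the vehicle-count threshold are all illustrative assumptions; the disclosure does not mandate a specific algorithm.

```python
from collections import Counter

def find_union_sensing_areas(positions, cell_size=50.0, min_vehicles=3):
    """Derive candidate union sensing areas from (x, y) vehicle positions.

    A grid cell that gathers at least min_vehicles vehicles is treated as a
    candidate union sensing area with a center position and size.
    """
    cells = Counter((int(x // cell_size), int(y // cell_size)) for x, y in positions)
    areas = []
    for (cx, cy), count in cells.items():
        if count >= min_vehicles:
            center = ((cx + 0.5) * cell_size, (cy + 0.5) * cell_size)
            areas.append({"center": center, "size": cell_size, "vehicles": count})
    return areas
```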
In addition, the union sensing group may be empty, may have only one sensing device (for example, a UE sensing radar), or may have multiple sensing devices. For example, the third device 130 may determine a permanent sensing device group for an intersection, since the accident rate at this intersection is high. If no sensing device or vehicle is located in the intersection at a certain time, the permanent sensing device group may have no member.
In addition, the third device 130 may maintain the initiated sensing device group dynamically. For example, the third device 130 may cancel a temporary sensing device group or remove a member from the sensing device group. In some embodiments, when most sensing devices (or vehicles) have left a union sensing area, the third device 130 may cancel the corresponding union sensing group. In an example, the third device 130 may determine whether a first number of sensing devices in the sensing device group is below a number threshold. If the first number is below the number threshold, the third device 130 may cancel the sensing device group.
In addition or alternatively, in some embodiments, the third device 130 may monitor all members’ positions in the sensing device group. When a sensing device leaves its union sensing area, the third device 130 may remove this sensing device from the sensing device group. In an example, the third device 130 may monitor a position of at least one sensing device in the sensing device group. Then, the third device 130 may determine, based on the monitoring, whether the at least one sensing device has left the sensing area. If the at least one sensing device has left the sensing area, the third device 130 may remove the at least one sensing device from the sensing device group.
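A minimal sketch of the group maintenance rules described above is given below, assuming a simple dictionary representation of a group and a caller-supplied membership test; all names and the number threshold are illustrative assumptions.

```python
def maintain_group(group, area_contains, number_threshold=1):
    """Apply the maintenance rules: drop members that have left the union
    sensing area, and cancel a temporary group whose membership falls below
    a threshold."""
    group["members"] = [m for m in group["members"] if area_contains(m["position"])]
    if group["temporary"] and len(group["members"]) < number_threshold:
        group["cancelled"] = True
    return group
```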
Still referring to Fig. 2, with respect to the first request, the third device 130 may determine (208) whether to transmit the first request towards the first device 110 based on pose information of the first device 110. The pose information of the first device 110 may include at least one of a position or an orientation of the first device 110. In an example, the third device 130 may obtain the pose information of terminal devices and determine whether to transmit the first request to one or more of the terminal devices. In an example, the third device 130 may monitor positions of the devices having sensing requirements and match their positions with union sensing areas. If a device comes into a first union sensing area, the third device 130 may invite it to join the union sensing group corresponding to the first union sensing area. In an example, based on the pose information of the first device, the third device 130 may determine whether a first area to be sensed by the first device is associated with the sensing area. If the first area is associated with the sensing area (for example, the first area is a portion of the sensing area or is adjacent to the sensing area), the third device 130 may transmit the first request towards the first device 110 via the second device 120.
In addition or alternatively, the first device 110 may transmit (202) a second request 203 for sensing to the second device 120, in order to obtain the resource configuration for sensing. Moreover, the second request may include at least one of pose information or sensing capability information of the first device. The sensing capability information may include a field of view (FOV) of the first device 110. Then, the second device 120 may transmit or forward the second request 203 to the third device 130. In this case, after receiving (204) the second request, the third device 130 may determine (208) whether to transmit the first request 215 based on the pose information of the first device in the second request 203. The determination manner may be the same as mentioned above. Without any limitation, it is to be understood that the third device 130 may obtain the pose information of the first device 110 in any other manner, in addition to receiving the second request 203. For example, the third device 130 may also obtain the pose information of the first device 110 from a location management function (LMF) server.
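Only for illustration purposes, the invitation decision may be sketched as below, assuming the union sensing area is approximated by a circle and the first area to be sensed by a range around the device; the overlap test is a simplifying assumption rather than a method defined by the disclosure.

```python
import math

def should_invite(device_position, device_sensing_range, area_center, area_radius):
    """Return True if the area to be sensed by the device overlaps the union
    sensing area, i.e., the first request (joining invitation) should be sent."""
    distance = math.hypot(device_position[0] - area_center[0],
                          device_position[1] - area_center[1])
    return distance <= area_radius + device_sensing_range
```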
Still referring to Fig. 2, after receiving (220) the first request 215, the second device 120 may transmit (230) the first request 235 to the first device 110. The first device 110 receives (240) the first request 235 accordingly. After receiving the first request 235, the first device 110 may determine (241) whether to join the sensing device group. For example, if the first device 110 has a higher privacy protection requirement, the first device 110 may refrain (243) from joining the sensing device group. In this case, the first device 110 may transmit a rejection message for the first request to the second device 120. Upon receiving the rejection message, the second device 120 may be aware that the first device 110 declines to share its sensing information. In this case, the second device 120 may transmit resource configuration information for separate sensing to the first device 110. Then, the first device 110 may perform the sensing procedure by itself without sharing its sensing result.
Alternatively, the first device 110 may voluntarily agree to join the sensing device group 140. In this case, the first device 110 transmits (245) a confirmation message 247 for the first request to the second device 120. After receiving the confirmation message 247, the second device 120 transmits (255) the confirmation message 257 to the third device 130. The confirmation message 257 may be the same as the confirmation message 247, or may be determined based on the confirmation message 247. The third device 130 receives (258) the confirmation message 257 accordingly.
After receiving the confirmation message 257, the third device 130 may be aware that the first device 110 voluntarily joins the sensing device group 140 for performing a union sensing in the union sensing area. Then, the third device 130 adds the first device 110 into the sensing device group 140 to perform the union sensing. To perform the union sensing, the third device 130 may plan the scanning scope for one or more members of the sensing device group 140, in order to achieve efficient union scanning of the union sensing area. Only for discussion purposes, the joining procedure of the sensing device (radar) is further discussed with reference to Fig. 3.
Fig. 3 illustrates an example signaling process 300 for joining UE sensing radar into a sensing device group according to example embodiments of the present disclosure. In Fig. 3, the UE sensing radars 110 may be the first device 110 as shown in Figs. 1 and 2, the 4G/5G BS or WiFi AP 120 may be the second device 120 as shown in Figs. 1 and 2, and SME 130 may be the third device 130 as shown in Figs. 1 and 2.
As mentioned above, according to vehicle distribution, as shown by 330, the SME 130 can autonomously generate some union sensing groups for those areas with a lot of vehicles. In addition, as shown in Fig. 3, the UE sensing radar 110 may transmit a UE monostatic sensing request 320 with pose and capability information to the BS 120. The BS 120 transmits the UE monostatic sensing request 330 to the SME 130. In an example, when a UE sensing radar 110 makes a sensing request to the SME 130, the request messages 320 and 330 may include its current pose information, i.e., position and orientation, and its capability information, e.g., maximum FOV. When the SME 130 receives the sensing request 330, according to the UE sensing radar’s position, the SME 130 may determine (340) whether the UE sensing radar is located in a union sensing area. If the UE sensing radar is located in a union sensing area, the SME will ask the UE sensing radar whether it wishes to join the union sensing group. As shown in Fig. 3, the third device 130 may transmit the joining group request 350 (which may be the first request as shown in Fig. 2) to the BS 120. Then, the BS 120 may transmit the joining group request 360 based on the joining group request 350. In turn, if the vehicle voluntarily joins (370) the union sensing group, it sends a confirmation message 380 (which may be the confirmation message as shown in Fig. 2) to the BS 120. Then, the BS 120 may transmit the joining group confirmation 390 to the third device 130. Once a UE sensing radar joins a union sensing group, the SME 130 may plan its sensing scope and share the union sensing result with it.
Referring back to Fig. 2, the third device 130 may schedule the sensing devices (i.e., the first device 110 and other devices in the sensing device group 140) to perform the union sensing periodically. For example, the union sensing area can be scanned by the sensing device group at a certain sensing frequency, such as 5 Hz. In this case, the union sensing procedure is performed five times per second. Without any limitation, the union sensing procedure may be performed at any other sensing frequency. Before performing the union sensing, the third device 130 may need to obtain the current pose information on each member of the sensing device group, in order to plan the scanning scopes for one or more members.
In some embodiments, the third device 130 may transmit (259), via the second device 120, a third request 260 for current pose information towards the first device 110. That is, the third device 130 may transmit the third request to the second device 120. Moreover, the second device 120 may transmit the received third request to the first device 110. In turn, after receiving the third request 260, the first device 110 may transmit (262), via the second device 120, the pose information 263 of the first device 110 towards the third device 130. In an example, the first device 110 may transmit a pose report of at least one of a position or an orientation of the first device 110.
Then, after obtaining pose information on one or more members in the sensing device group 140 (for example, the pose information 263), the third device 130 may determine (265) at least one scanning scope for one or more members in the sensing device group 140. In addition, the scanning scope may be further determined based on at least one of the following: FOVs of one or more devices in the sensing device group 140, environment information on the sensing area corresponding to the sensing device group 140, one or more dynamic objects within the sensing area; or dead zones of the one or more devices in the sensing device group 140. The dead zones may be determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects. For discussion clarity, the determination of the scanning scopes is further discussed with reference to Figs. 6A to 6E, and is not detailed here.
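Before turning to those figures, a coarse sketch of scope planning is given below. It keeps only angular bookkeeping around the area center and ignores building contours and dynamic objects, so it is merely a simplified assumption of how the listed inputs might be combined, not the planning method itself.

```python
def plan_scopes(members, sector_deg=10):
    """Assign each member the still-uncovered angular sectors inside its FOV.

    members: list of dicts with "id", "fov" and "orientation" in degrees.
    Returns a mapping from member id to the list of assigned sectors.
    """
    uncovered = set(range(0, 360, sector_deg))          # sectors still to scan
    plan = {}
    for m in sorted(members, key=lambda m: -m["fov"]):  # widest FOV first
        start = m["orientation"] - m["fov"] / 2
        visible = {s for s in uncovered if (s - start) % 360 < m["fov"]}
        plan[m["id"]] = sorted(visible)
        uncovered -= visible                            # avoid repeated scanning
    return plan
```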
In addition, depending on the manner of determining the scanning scopes, the union sensing 266 can be performed in parallel or serially. In some embodiments, the union sensing may be performed simultaneously, in parallel, by the members in the sensing device group 140 (including the first device 110). As mentioned above, by transmitting the third request 260, the third device 130 may receive one or more pose reports of the one or more sensing devices of the sensing device group 140. A pose report of the one or more pose reports may indicate at least one of a position or an orientation of a sensing device of the one or more sensing devices. Then, based on the one or more pose reports, the third device 130 may determine, at one time, a plurality of scanning scopes for the sensing area. At least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices.
The third device 130 may transmit (267) an indication 268 of the plurality of scanning scopes to the second device 120. In turn, after receiving (269) the indication 268 of the plurality of scanning scopes, the second device 120 may determine (270) resource configuration information for one or more members of the sensing device group 140. For example, if the third device 130 has determined the scanning scope for each member of the sensing device group 140, the second device 120 may determine the resource configuration information for each member based on the corresponding scanning scope. Alternatively, the third device 130 may also determine that a portion of the sensing device group 140 is to perform the union sensing, and determine the scanning scope of each member in the portion of the sensing device group 140. In this case, the second device 120 may also allocate the resource configuration information to the member(s) in the portion of the sensing device group 140 (that is, the resource configuration information is not transmitted to each member of the whole sensing device group 140). Then, the second device 120 may transmit the resource configuration information to the corresponding device (for example, the first device 110) in the sensing device group 140.
For discussion clarity, the parallel union sensing is further discussed with reference to Fig. 4.
Fig. 4 illustrates an example signaling process 400 for parallel union sensing according to example embodiments of the present disclosure. In Fig. 4, the UE sensing radars 110 may be the first device 110 as shown in Figs. 1 and 2, the 4G/5G BS or WiFi AP 120 may be the second device 120 as shown in Figs. 1 and 2, and SME 130 may be the third device 130 as shown in Figs. 1 and 2.
In the union sensing as shown in Fig. 4, SME 130 will request all group members in the sensing device group 140 to provide their current poses. SME 130 uses the UE sensing radars’ poses, capabilities, and buildings’ contours to plan all UE sensing radars’ scanning scopes. In this parallel mode, SME 130 may only consider those dead zones sheltered by static objects and ignore those dynamic dead zones. When SME 130 completes the sensing scope planning, it will inform BS 120 to schedule radio resource for all group members. BS 120 will configure time/frequency/space resources for those union sensing radars according to their sensing scopes. Those union sensing radars will simultaneously scan their sensing scopes. Then, the sensing results will be reported to the SME 130. Finally, SME 130 will combine those sensing results into a union sensing result, which will be broadcast to all group members (and other devices which subscribe to the sensing service).
In the example of Fig. 4, SME 130 transmits the request 410 for the current poses to BS 120. BS 120 transmits the request 420 for the current poses to the UEs 110. Accordingly, the UE sensing radars 110 may transmit the report 430 of poses to BS 120. It is to be understood that the UE sensing radars 110 are only examples of members in the sensing device group 140, and the UE sensing radars may also be the devices 140-1, 140-2, 140-3 and so on. Then, BS 120 transmits the report 435 of poses to SME 130. Based on the one or more reports from members of the sensing device group 140, SME 130 may plan (440) scopes for these members. SME 130 transmits reports 445 of the scopes determined for these members to BS 120. Based on the scopes determined for these members, BS 120 may schedule (450) time/frequency/space resources for the union sensing. Then, BS 120 may transmit the UE sensing radars’ resource configuration 455 to the corresponding UE sensing radar. A UE sensing radar 110 may perform (460) a sensing procedure for the scanning scope planned for this UE sensing radar based on the received resource configuration 455. Then, the UE sensing radar 110 transmits the sensing result report 465 of this UE sensing radar 110 to BS 120. BS 120 receives the sensing result reports of the members in the sensing device group 140. BS 120 transmits the received sensing result reports to SME 130. SME 130 combines all sensing results based on the sensing result reports to generate a union sensing result. Then, SME 130 may broadcast the union sensing result 480 to the BS 120. BS 120 may broadcast the union sensing result 480 to the members in the sensing device group 140. In addition, other devices which do not join the sensing device group 140 may also subscribe to the sensing service. In this case, BS 120 may also transmit the union sensing result to these other devices. In some embodiments, the members in the sensing device group 140 may be configured with a first charging policy. Other devices may be configured with a second charging policy. The first charging policy may be lower than the second charging policy, since the members made contributions to the union sensing result.
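The parallel mode of Fig. 4 may be summarized by the following sketch; the sme, bs and member objects and their methods are assumptions standing in for the signaling of Fig. 4, not defined interfaces.

```python
def parallel_union_sensing_round(sme, bs, members):
    """One parallel union sensing round: poses, planning, scheduling,
    simultaneous scanning, combination and broadcast."""
    poses = {m.id: m.report_pose() for m in members}    # pose reports (430/435)
    scopes = sme.plan_scopes(poses)                     # scope planning (440)
    configs = bs.schedule_resources(scopes)             # time/frequency/space (450)
    results = [m.scan(configs[m.id]) for m in members]  # simultaneous scans (460)
    union_result = sme.combine_results(results)         # one union sensing result
    bs.broadcast(union_result, members)                 # share with the group (480)
    return union_result
```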
Referring back to Fig. 2, alternatively, as mentioned above, the union sensing 266 may be performed serially. In some embodiments, the determination of a scanning scope may depend on a sensing result for a previous scanning scope.
In some embodiments, after obtaining the one or more pose reports from the members in the sensing device group 140, the third device 130 may determine (265) a first scanning scope to be sensed by a fourth device of the one or more sensing devices in the sensing device group 140. Without any limitation, the fourth device may be the first device 110 or any other device in the sensing device group 140. Then, the third device 130 may transmit (267) an indication 268 of the first scanning scope to the second device 120. After receiving (269) the indication 268, the second device 120 may determine (270) a first resource configuration for the fourth device. Then, the second device 120 may transmit the first resource configuration to the fourth device (not separately shown in Fig. 2; the fourth device may be the first device 110). The fourth device performs the sensing procedure based on the first resource configuration. Then, the fourth device transmits (278) a first sensing result 280 of the fourth device to the second device 120. The second device 120 transmits the first sensing result of the fourth device to the third device 130. Further, based on the first sensing result and the one or more pose reports, the third device 130 may determine a second scanning scope to be sensed by a fifth device of the one or more sensing devices. Without any limitation, the fifth device may also be the first device or any other device in the sensing device group 140. Similarly, the third device 130 may transmit an indication of the second scanning scope to the second device 120. The second device 120 may determine second resource configuration information for the fifth device based on the indication of the second scanning scope. Then, the second device 120 may transmit the second resource configuration information to the fifth device. Then, the fifth device may transmit a second sensing result of the fifth device towards the third device 130. In turn, the third device 130 may iteratively determine other scanning scopes based on the sensing results (for example, the first or second sensing result) and schedule members in the sensing device group 140 to perform the sensing procedure, until the union sensing area is sensed completely. Then, the third device 130 may combine the sensing results to generate a union sensing result.
As mentioned above, similarly, the third device 130 may broadcast (284) the union sensing result to the members in the sensing device group 140 and/or other devices. For discussion clarity, the serial union sensing is further discussed with reference to Fig. 5.
Fig. 5 illustrates an example signaling process 500 for serial union sensing according to example embodiments of the present disclosure.
Compared to the above parallel union sensing, the serial grouping sensing may be performed by selecting a UE sensing radar as the first sensing radar, for example, a sensing radar that has the maximum scanning scope in the union sensing group. SME 130 may plan the first sensing radar’s sensing scope according to its pose, capability, nearby buildings’ contours and traffic status. Then the planned sensing scope will be sent to BS 120 for radio resource scheduling. BS 120 will configure the first sensing radar with time/frequency/space resources and command the first sensing radar to scan its sensing scope. The first sensing radar’s sensing result will be sent to SME 130. SME 130 will use the first sensing radar’s result to plan the other sensing radars’ scanning scopes. When SME 130 plans the second sensing radar’s scanning scope, it will plan it to scan those areas that are out of the first sensing radar’s scanning scope and those areas sheltered by some moving objects detected by the first sensing radar. In some embodiments, the second sensing radar should be one which can scan the maximum area of the dead zones of the first sensing radar. Once the second sensing radar is selected, its sensing scope planning, radio scheduling, configuration and sensing operations are similar to those of the first sensing radar. Similarly, SME 130 may plan other sensing radars until the whole union sensing area is scanned. Finally, SME 130 will combine all sensing results into the union sensing result and broadcast it to all group members. In Fig. 5, the UE sensing radars 140-1 and 140-2 may be the members in the sensing device group 140 as shown in Fig. 1, the 4G/5G BS or WiFi AP 120 may be the second device 120 as shown in Figs. 1 and 2, and SME 130 may be the third device 130 as shown in Figs. 1 and 2.
In the example of Fig. 5, SME 130 transmits a request 501 for the current poses. BS 120 transmits the request 503 to the members in the sensing device group 140. The members transmit their pose reports 505 to BS 120. BS 120 transmits the pose reports 507 to SME 130. Then, SME 130 plans (509) the first radar’s scope, and transmits a report 511 of the first radar scope to BS 120. BS 120 schedules (513) time/frequency/space resources for the first radar. Then, BS 120 transmits the first sensing radar configuration 515 to the first radar 140-1 (which may be the above fourth device). The first radar 140-1 scans (517) its sensing scope accordingly. Then, the first radar 140-1 transmits the first radar’s sensing report 519 to BS 120. BS 120 transmits the first radar’s sensing report 521 to SME 130. At least based on the first radar’s sensing report 521, SME 130 plans (523) the second radar’s scanning scope. Then, similarly, by means of the report 525 of the second radar scope, the scheduling 526 of time/frequency/space resources for the second radar, and the second radar configuration 527, the second sensing radar 140-2 (which may be the above fifth device) may scan (529) its sensing scope. Then, the second sensing radar 140-2 may transmit the second radar’s sensing report 531 to BS 120. BS 120 may transmit the second radar’s sensing report 533 to SME 130. As mentioned above, SME 130 may iteratively perform the above steps until the union sensing area is scanned. Then, SME 130 may broadcast the union sensing result to the members in the sensing device group 140 and/or other devices.
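Only for discussion purposes, the serial mode of Figs. 2 and 5 may be sketched as follows. The interfaces are illustrative assumptions, and the radar ordering here simply prefers the widest FOV, whereas the disclosure also describes selecting the radar that can scan the maximum area of the previous radar’s dead zones.

```python
def serial_union_sensing_round(sme, bs, members):
    """One serial union sensing round: radars are scheduled one by one, and
    each new scope is planned from the results gathered so far, so that
    dynamic dead zones can be targeted."""
    results = []
    remaining = sorted(members, key=lambda m: m.fov, reverse=True)  # widest first
    while remaining and not sme.area_fully_scanned(results):
        radar = remaining.pop(0)
        scope = sme.plan_scope(radar, results)            # uses previous results
        config = bs.schedule_resources({radar.id: scope})
        results.append(radar.scan(config[radar.id]))
    union_result = sme.combine_results(results)
    bs.broadcast(union_result, members)
    return union_result
```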
Referring back to Fig. 2, in view of the above, regarding the scanning scopes, the devices in the sensing device group 140 may sense different parts of the union sensing area. In addition, the union sensing area may be periodically scanned, and the union sensing result may be broadcast to group members or subscribed to by non-group vehicles. In each union sensing period, the third device 130 may select all or a part of the UE sensing radars to sense those fixed and moving objects. The third device 130 may plan these sensing radars’ scanning scopes according to their poses, capabilities, surrounding environment and traffic status. As mentioned above, the union sensing has two work modes: one is parallel union sensing and the other is serial union sensing. In parallel union sensing, those UE sensing radars will perform sensing at the same time. In serial union sensing mode, those UE sensing radars perform sensing one by one. One sensing radar’s scanning scope will be planned according to the previous sensing radars’ sensing results. The third device 130 will try to remove those dynamic dead zones which are sheltered by those moving objects.
Figs. 6A to 6E illustrate example situations of determination of the scanning scopes for members in the sensing device group according to example embodiments of the present disclosure.
Figs. 6A and 6B illustrate example situations at a rural intersection. As shown in Fig. 6A, a rural intersection is usually wide and is not seriously sheltered by surrounding buildings. So, SME 130 does not need to consider surrounding buildings. There are three cars (or sensing devices in three cars) in the union sensing group. Car1 is passing through the intersection, and car2 and car3 are waiting. Here it is assumed that they are all equipped with front sensing radars with 120° FOV. One simple method is that SME 130 plans the car1 sensing scope at 120°, and plans car2 and car3 only to scan the dead zone left by car1, shown as the shadowed area in Fig. 6A. SME 130 will plan the sensing scope of car2 at its far left 30°, and plan the sensing scope of car3 at its far right 30°. Now the full intersection is scanned by the union sensing radar group without any repeatedly scanned area. Another simple method is to balance all union sensing radars’ scanning scopes as in Fig. 6B. For example, SME 130 may plan these three cars’ sensing scopes at their central 80°.
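The two planning methods of Figs. 6A and 6B may be written down as below, using the angles from the example; the angle tuples are relative to each car’s boresight, and the scene geometry itself is not modelled here.

```python
def plan_rural_intersection(balanced=False):
    """Return (start, end) scanning scopes in degrees for the three cars."""
    if balanced:
        # Fig. 6B: balance the load; each car scans its central 80 degrees.
        return {"car1": (-40, 40), "car2": (-40, 40), "car3": (-40, 40)}
    # Fig. 6A: car1 uses its full 120-degree FOV, while car2 and car3 only
    # cover the dead zone car1 leaves, at their far left / far right 30 degrees.
    return {"car1": (-60, 60), "car2": (-60, -30), "car3": (30, 60)}
```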
Fig. 6C illustrates an example of an urban intersection near buildings. As shown in Fig. 6C, urban areas always have many buildings along the streets. These buildings will impact the union sensing planning. As shown, each corner has a building. At first, SME 130 can plan the car1 sensing scope at 120°. Besides its rear dead zone 1, car1 has two other dead zones (2 and 3) sheltered by surrounding buildings. When SME 130 plans the car2 and car3 sensing scopes, it will make use of map information to obtain the surrounding building contours and then calculate the car2 and car3 sensing scopes. In this example, SME 130 plans the car2 and car3 sensing scopes at 45°.
Fig. 6D illustrates an example of an urban intersection with heavy traffic. As shown in Fig. 6D, sometimes an urban intersection may have a lot of moving objects. SME 130 may plan the car1 sensing scope at its 120°. Unlike the above three cases, SME 130 may perform dynamic planning for car2 and car3. That means that SME 130 uses the car1 sensing result to plan the car2 and car3 sensing scopes. Besides dead zones 1, 2 and 3, there exists a dynamic dead zone 4, which is sheltered by moving objects, i.e., riders sheltering each other. In order to sense those riders, SME 130 may request car2 and car3 to sense those riders from different directions. So, SME 130 enlarges the car2 and car3 sensing scopes to about 75° and 80°, respectively.
Fig. 6E illustrates an example of a road section with heavy traffic. As shown in Fig. 6E, in some cases, a busy road section may have a lot of moving objects. SME 130 plans the car1 sensing scope at its maximum 120°. The other two cars (2 and 3) shelter some areas, i.e., car1 dead zone 2 and dead zone 3. Based on the car1 sensing result, SME 130 can plan the car2 and car3 sensing scopes to about 30° and 80°, respectively, for those dynamic dead zones.
In some embodiments, SME 130 may adopt both static and dynamic sensing planning for a union sensing group. Consider the example of a road with heavy traffic as shown in Fig. 6E. In static sensing planning, SME 130 may plan all vehicle sensing scopes at the same time (for example, the parallel union sensing and the steps as discussed in Figs. 6A, 6B and 6C). In turn, in dynamic sensing planning, SME 130 may plan all vehicle sensing scopes one by one. The vehicle sensing planning may use previous vehicles’ sensing results (for example, the serial union sensing and the steps as discussed in Fig. 6D).
In view of the above, the sensing scheduling for the JCAS or ISAC system has been discussed with reference to Figs. 2 to 6E. In this way, a centralized controlling and scheduling mechanism can organize those sensing radars (for example, adjacent sensing radars) to unitedly sense a specific area. As such, the consumed resources can be reduced without performance degradation.
Fig. 7 illustrates an example of general flowchart 700 for sensing scheduling according to example embodiments of the present disclosure. In Fig. 7, the basic idea and steps of the disclosure are presented in general.
As shown in Fig. 7, for vehicle navigation and collision avoidance applications, SME 130 can unite a group of vehicle UE monostatic sensing radars to sense a specific area, e.g., an intersection, and share the union sensing result among these vehicles. That not only makes each vehicle ‘see’ a larger scope, but also avoids repeated scanning and saves radio resource. SME 130 can autonomously select a union sensing area according to vehicle distribution. An area gathering a lot of vehicles can be defined as a union sensing area. The area will be jointly sensed by the UE sensing radars located in it. The union sensing area can be scanned at some frequency, e.g., 5 Hz. When a vehicle with a UE monostatic sensing radar comes into the union sensing area, SME 130 will ask it whether to join a union sensing group. If the UE sensing radar voluntarily agrees to join the group, it will be scheduled together with other sensing radars in the union group. Otherwise, if the vehicle does not agree to join the group, it will be separately scheduled. As it independently senses all its surroundings, it may occupy more radio resource and incur a higher sensing cost.
In each union sensing period, SME 130 will plan a sensing scope for each UE sensing radar based on its current position, sensing field of view (FOV), placement on the vehicle, surrounding environment, traffic status, etc. The objectives of union sensing planning are maximization of the scanned area and minimization of the repeatedly scanned area. For different union sensing areas, the SME can adopt different sensing planning methods, such as for the situations shown in Figs. 6A to 6E.
After SME 130 completes the union sensing planning, it will inform the BS to schedule radio resource for those union sensing radars. All sensing results from the union sensing radars will be sent to SME 130. SME 130 will combine them into one union sensing result. The same moving object sensed by different sensing radars will be merged. In each union sensing period, when SME 130 collects all vehicles’ sensing results and combines them together, it will broadcast the union sensing result to all group members. The vehicles without sensing radars can also subscribe to the union sensing results. The operators can adopt different charging policies for group members and non-group members, e.g., free or lower charges for group members than for non-group members.
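Only for illustration purposes, the merging of the same moving object sensed by different radars may be sketched as below; the distance threshold and the report structure are assumptions, not values or formats defined by the disclosure.

```python
import math

def combine_results(reports, merge_distance=1.0):
    """Combine per-radar detection lists into one union sensing result,
    merging detections that lie closer than merge_distance."""
    union_result = []
    for report in reports:
        for obj in report:
            for kept in union_result:
                if math.dist(obj["position"], kept["position"]) <= merge_distance:
                    kept["seen_by"].update(obj["seen_by"])   # same object, merge
                    break
            else:
                union_result.append({"position": obj["position"],
                                     "seen_by": set(obj["seen_by"])})
    return union_result
```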
In addition, for achieving the above steps at the first device 110 and the third device 130, a corresponding functionality module may be integrated at the first device 110 or the third device 130.
Fig. 8 illustrates an example module integrated in the first device according to example embodiments of the present disclosure.
As shown in Fig. 8, the first device 110 may include a union sensing control module 810, an uplink forward module 820 and a downlink forward module 830. Specifically, to implement the above sensing scheduling, the JCAS UE (which may be the first device 110) may have a new function to voluntarily join a union sensing group, as shown in Fig. 8. The UE owner can enable or disable the union sensing function. When the SME asks the UE sensing radar whether to join a union sensing group, the UE can answer the SME according to the UE owner’s settings. This function also forwards the union sensing result to the vehicle navigation system, and forwards the UE’s sensing result to the SME for the union sensing group.
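The join decision of the union sensing control module 810 may be sketched as below; the message names loosely mirror the signaling of Figs. 2 and 3 and are illustrative, not defined message formats.

```python
def handle_joining_request(owner_enabled_union_sensing: bool) -> str:
    """Answer the SME's joining group request according to the owner's setting."""
    if owner_enabled_union_sensing:
        return "JOINING_GROUP_CONFIRMATION"   # join; scheduled with the group
    return "JOINING_GROUP_REJECTION"          # refuse; separately scheduled
```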
Fig. 9 illustrates an example module integrated in the third device according to example embodiments of the present disclosure.
As shown in Fig. 9, the third device 130 may include a vehicle distribution analysis module 910, a position match and union sensing invitation module 920, a union sensing area/group database module 930, a union sensing planning module 940 and a union sensing combination and broadcast module 950. Similarly, to implement the above sensing scheduling, SME 130 should have several new functions to generate union sensing groups, plan union sensing scopes, and combine and broadcast union sensing results. SME 130 collects vehicle positions to determine union sensing areas. According to vehicle distribution, SME 130 can generate a union sensing group for those vehicle gathering areas. When most vehicles have left a union sensing area, SME 130 can also remove the union sensing group. A union sensing group may be a temporary or permanent group. The JCAS operator can also directly configure a union sensing area for some special areas, e.g., intersections. A union sensing area always corresponds to a union sensing group. A union sensing area will be given with center position, size, shape and building contour information. A union sensing group may be empty, have only one UE sensing radar, or have multiple UE sensing radars. SME 130 will monitor all UE sensing radars’ positions. When a sensing radar leaves its union sensing area, the SME will remove it from the group. In general, the more vehicle sensing radars the union sensing group has, the better the radio resource efficiency obtained, and the larger the sensing scope that is ‘seen’. SME 130 will monitor vehicle positions and match them with union sensing areas. If a vehicle comes into a union sensing area, SME 130 will invite it to join the union sensing group. The union sensing group will periodically sense the union sensing area. At each union sensing, the SME will plan all sensing radars’ scanning scopes according to their current poses (positions and orientations), sensing radar capabilities (FOV), building contours and shelter status. The sensing scope information will be sent to the BS for sensing resource scheduling. SME 130 will combine all union sensing radars’ results to generate a union sensing result, which will cover the whole union sensing area and be shared by all group members.
In some embodiments, the disclosure can be implemented via future JCAS UEs, BSs and SMEs. The UE will be installed on a vehicle to provide both communication and sensing functions. Its sensing function will work as a monostatic radar to scan surrounding fixed or moving objects for navigation positioning and obstacle avoidance. The BS will schedule radio resource (time/frequency/space) for the UE sensing radar. The SME will manage and control those UE sensing radars. In this disclosure, the SME will organize those adjacent UE sensing radars to unitedly scan a specific area to avoid repeated scanning and meanwhile remove sensing dead zones as much as possible. As such, the disclosure can improve JCAS radio resource efficiency and ensure that the sensing function does not impact communication user experience.
Fig. 10 shows a flowchart of an example method 1000 implemented at a first device (for example, the first device 110) in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1000 will be described from the perspective of the first device 110 with reference to Fig. 1.
At 1010, the first device 110 receives, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area. At 1020, the first device 110 transmits, to the second device, a confirmation message for the first request. Alternatively, at 1030, in the case that the first device refrains from joining the sensing device group, the first device 110 receives resource configuration information for separate sensing from the second device.
In some embodiments, the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
In some embodiments, the first device 110 may further: transmit, to the second device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device.
In some embodiments, at least one of the following: the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
In some embodiments, the first request is determined based on the pose information of the first device and is transmitted by the third device.
In some embodiments, the first device 110 transmits the confirmation message, and the first device may further: receive, from the second device, a third request for current pose information of one or more sensing devices in the sensing device group; and transmit, to the second device, a pose report of at least one of a position or an orientation of the first device.
In some embodiments, the first device transmits the confirmation message, and the first device may further: receive, from the second device, resource configuration information for the union sensing, wherein the resource configuration information for the union sensing indicates a set of sensing resources that are determined based on the pose report.
In some embodiments, the set of sensing resources are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
In some embodiments, the first device may further: perform, based on the resource configuration information for the union sensing, a sensing procedure to obtain a sensing result of the first device; transmit the sensing result to the second device; and receive, from the second device, a union sensing result that is at least partially based on the sensing result of the first device.
In some embodiments, based on that the first device transmits the confirmation message, the first device is configured with a first charging policy; or based on that the first device refrains from joining the sensing device group, the first device is configured with a second charging policy.
In some embodiments, the first device may receive the resource configuration for separate sensing by: transmitting, to the second device, a rejection message for the first request.
In some embodiments, at least one of: the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle; a second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ; or a third device comprises a core network (CN) device or a sensing management entity (SME) .
Fig. 11 shows a flowchart of an example method 1100 implemented at a second device (for example, the second device 120) in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1100 will be described from the perspective of the second device 120 with reference to Fig. 1.
At 1110, the second device 120 receives, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area. At 1120, the second device 120 transmits the first request to the first device. At 1130, the second device 120 receives, from the first device, a confirmation message for the first request. Alternatively, at 1140, the second device 120 transmits first resource configuration information for separate sensing to the first device.
In some embodiments, the confirmation message is received, and the second device may further transmit the confirmation message to the third device.
In some embodiments, the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
In some embodiments, the second device may further receive, from the first device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device; and transmit the second request to the third device.
In some embodiments, at least one of the following: the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
In some embodiments, the first request is determined based on the pose information of the first device and is transmitted by the third device.
In some embodiments, the second device may further receive, from a third device, a third request for current pose information of one or more sensing devices in the sensing device group; transmit the third request to the first device; receive, from the first device, a pose report of at least one of a position or an orientation of the first device; and transmit the pose report to the third device.
In some embodiments, the second device may further receive, from a third device, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; determine, based on the plurality of the scanning scopes, a plurality of resource configuration information for the union sensing; and transmit, to the sensing device of the one or more sensing devices, a resource configuration information of the plurality of the resource configuration information, wherein the resource configuration information is for sensing the at least one of the plurality of the scanning scopes.
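As a non-limiting sketch of how the second device might derive the plurality of resource configuration information from the received scanning scopes, consider the following Python fragment; the ScanningScope and ResourceConfig fields, and the mapping from sector width to bandwidth, are illustrative assumptions rather than part of the disclosure.

from dataclasses import dataclass

@dataclass
class ScanningScope:
    device_id: str
    azimuth_deg: tuple   # (start, end) of the angular sector to be scanned
    max_range_m: float   # maximum sensing range for this scope

@dataclass
class ResourceConfig:
    device_id: str
    beam_sector: tuple
    bandwidth_hz: float
    slot_period_ms: int

def derive_configurations(scopes):
    configs = []
    for scope in scopes:
        width = abs(scope.azimuth_deg[1] - scope.azimuth_deg[0])
        configs.append(ResourceConfig(
            device_id=scope.device_id,
            beam_sector=scope.azimuth_deg,
            # Wider sectors receive proportionally more bandwidth in this
            # sketch; a real mapping would depend on the radio technology.
            bandwidth_hz=1e6 * max(1.0, width / 30.0),
            slot_period_ms=10,
        ))
    return configs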
In some embodiments, the second device may further receive one or more sensing results from the one or more sensing devices; transmit, to the third device, the one or more sensing results; and receive, from the third device, a union sensing result that is based on the one or more sensing results.
In some embodiments, the second device may further receive, from the third device, a first indication of the first scanning scope associated with a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; determine, based on the first indication of the first scanning scope, a first resource configuration information; and transmit, to the fourth device, the first resource configuration information.
In some embodiments, the second device may further receive, from the fourth device, a first sensing result of the fourth device; transmit, to the third device, the first sensing result; receive, from the third device, a second indication of the second scanning scope associated with a fifth device of the one or more sensing devices, wherein the second scanning scope is determined based on the first sensing result; determine, based on the second indication of the second scanning scope, a second resource configuration information; and transmit, to the fifth device, the second resource configuration information.
In some embodiments, the second device may further: receive, from the fifth device, a second sensing result; transmit, to the third device, the second sensing result; and receive, from the third device, a union sensing result that is at least based on the first sensing result and the second sensing result.
In some embodiments, the second device may further: transmit the union sensing result to the one or more sensing devices; and transmit the union sensing result to a further device which is not included in the sensing device group.
In some embodiments, the one or more sensing devices are configured with a first charging policy; and the further device is configured with a second charging policy.
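A minimal sketch of such a policy selection, assuming only that group membership is known at the configuring entity, might look as follows; the function and policy labels are placeholders, not part of the disclosure.

def select_charging_policy(device_id, group_members):
    # Devices contributing to the union sensing fall under the first
    # charging policy; devices merely consuming the union sensing result
    # fall under the second charging policy.
    if device_id in group_members:
        return "first_charging_policy"
    return "second_charging_policy"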
In some embodiments, the scanning scopes are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
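As a non-limiting illustration of the dead-zone determination described above, the following Python sketch rasterizes the sensing area onto a two-dimensional grid; occlusion is simplified here to excluding obstacle cells themselves (a real implementation would also shadow cells behind obstacles), and all names are hypothetical.

import math

def visible_cells(device, grid, obstacles):
    # device = (x, y, heading_deg, fov_deg, max_range)
    x0, y0, heading, fov, max_range = device
    seen = set()
    for (x, y) in grid:
        dist = math.hypot(x - x0, y - y0)
        bearing = math.degrees(math.atan2(y - y0, x - x0))
        off_axis = abs((bearing - heading + 180.0) % 360.0 - 180.0)
        if dist <= max_range and off_axis <= fov / 2.0 and (x, y) not in obstacles:
            seen.add((x, y))
    return seen

def dead_zones(devices, grid, static_obstacles, dynamic_obstacles):
    # Static obstacles model the building/facility distribution; dynamic
    # obstacles model moving objects within the sensing area.
    obstacles = static_obstacles | dynamic_obstacles
    covered = set()
    for device in devices:
        covered |= visible_cells(device, grid, obstacles)
    return set(grid) - covered   # cells no group member can sense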
In some embodiments, the second device may transmit the resource configuration information for separate sensing by: receiving, from the first device, a rejection message for the first request.
In some embodiments, at least one of: the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle; the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB); or the third device comprises a core network (CN) device or a sensing management entity (SME).
Fig. 12 shows a flowchart of an example method 1200 implemented at a third device (for example, the third device 130) in accordance with some embodiments of the present disclosure. For the purpose of discussion, the method 1200 will be described from the perspective of the third device 130 with reference to Fig. 1.
At 1210, the third device 130 transmits, to a second device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area. At 1220, the third device 130 receives, from the second device, a confirmation message for the first request. At 1230, the third device 130 includes the first device in the sensing device group to perform the union sensing.
In some embodiments, the sensing device group is determined by the third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area.
In some embodiments, the third device may further: obtain pose information of the  first device, wherein the pose information comprises at least one of a position or an orientation of the first device.
In some embodiments, the third device may further receive, from the second device, a second request for sensing, wherein the second request is transmitted from the first device and the second request comprises at least one of pose information or sensing capability information of the first device.
In some embodiments, the third device may transmit the first request by: determining, based on the pose information of the first device, whether a first area to be sensed by the first device is associated with the sensing area; based on determining that the first area is associated with the sensing area, transmitting the first request.
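As a non-limiting example of this association test, the first area sensed by the first device can be approximated as a disc around its reported position and the sensing area as an axis-aligned rectangle; the geometric model below is an assumption made purely for illustration.

def area_associated(device_pos, sensing_range, area_min, area_max):
    # Clamp the device position onto the rectangle; the disc intersects
    # the rectangle exactly when the distance to that closest point does
    # not exceed the sensing range.
    px, py = device_pos
    cx = min(max(px, area_min[0]), area_max[0])
    cy = min(max(py, area_min[1]), area_max[1])
    return (px - cx) ** 2 + (py - cy) ** 2 <= sensing_range ** 2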
In some embodiments, the third device may further: monitor a position of at least one sensing device in the sensing device group; determine, based on the monitoring, whether the at least one sensing device leaves the sensing area; and based on determining that the at least one sensing device leaves the sensing area, remove the at least one sensing device from the sensing device group.
In some embodiments, the third device may further: determine whether a first number of sensing devices in the sensing device group is below a number threshold; based on determining that the first number is below the number threshold, cancel the sensing device group.
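The two maintenance embodiments above can be combined into a single periodic routine at the third device, sketched below under the assumption that member positions and an area-membership test are available; the function is illustrative only.

def maintain_group(group, positions, area_contains, min_members):
    # group: set of device identifiers; positions: identifier -> (x, y)
    for device_id in list(group):
        if not area_contains(positions[device_id]):
            group.discard(device_id)   # the device left the sensing area
    if len(group) < min_members:
        group.clear()                  # cancel the sensing device group
        return None
    return group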
In some embodiments, the third device may further: determine a permanent sensing device group for the sensing area.
In some embodiments, the third device may further: transmit, to the second device, a third request for current pose information of one or more sensing devices in the sensing device group; and receive, from the second device, one or more pose reports of the one or more sensing devices, wherein a pose report of the one or more pose reports indicates at least one of a position or an orientation of a sensing device of the one or more sensing devices.
In some embodiments, the third device may further: determine, based on the one or more pose reports, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; and transmit, to the second device, an indication of the plurality of scanning scopes.
In some embodiments, the third device may further: receive, from the second device,  one or more sensing results of the one or more sensing devices; determine, based on the one or more sensing results, a first union sensing result; and transmit, to the second device, the first union sensing result.
In some embodiments, the third device may further: determine, based on the one or more pose reports, a first scanning scope to be sensed by a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; and transmit, to the second device, an indication of the first scanning scope.
In some embodiments, the third device may further: receive, from the second device, a first sensing result of the fourth device; determine, based on the first sensing result and the one or more pose reports, a second scanning scope to be sensed by a fifth device of the one or more sensing devices; and transmit, to the second device, an indication of the second scanning scope.
In some embodiments, the third device may further: receive, from the second device, a second sensing result of the fifth device; determine, based on at least the first sensing result and the second sensing result, a second union sensing result; and transmit, to the second device, the second union sensing result.
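As a non-limiting sketch of the iterative scheduling described in the preceding embodiments, the third device can plan the second scanning scope only after the first sensing result arrives; the helper callables (plan_scope, send_scope, await_result, merge) are hypothetical hooks standing in for the signaling toward the second device.

def iterative_union_sensing(plan_scope, send_scope, await_result, merge, pose_reports):
    # First iteration: the first scanning scope is planned from the pose
    # reports alone and assigned to the fourth device.
    first_scope = plan_scope(pose_reports, prior_result=None)
    send_scope(first_scope)
    first_result = await_result(first_scope)

    # Second iteration: the first sensing result steers the second scanning
    # scope, which is assigned to the fifth device.
    second_scope = plan_scope(pose_reports, prior_result=first_result)
    send_scope(second_scope)
    second_result = await_result(second_scope)

    # The union sensing result is based on at least both partial results.
    return merge([first_result, second_result])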
In some embodiments, the scanning scopes are determined by the third device further based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
In some embodiments, at least one of: the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle; the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ; or the third device comprises a core network (CN) device or a sensing management entity (SME) .
In some embodiments, an apparatus capable of performing any of the method 1000 (for example, the first device 110) may include means for receiving, at a first device and from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and means for i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving resource configuration information for separate sensing from the second device.
In some embodiments, the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
In some embodiments, the apparatus may further include means for transmitting, to the second device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device.
In some embodiments, at least one of the following: the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
In some embodiments, the first request is determined based on the pose information of the first device and is transmitted by the third device.
In some embodiments, the apparatus includes means for transmitting the confirmation message, and the apparatus may further include means for receiving, from the second device, a third request for current pose information of one or more sensing devices in the sensing device group; and means for transmitting, to the second device, a pose report of at least one of a position or an orientation of the first device.
In some embodiments, the apparatus includes means for transmitting the confirmation message, and the apparatus may further include means for receiving, from the second device, a resource configuration information for the union sensing, wherein the resource configuration for the union sensing indicates a set of sensing resources that are determined based on the pose report.
In some embodiments, the set of sensing resources are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
In some embodiments, the apparatus may further include means for performing, based on the resource configuration information for the union sensing, a sensing procedure to obtain a sensing result of the first device; means for transmitting the sensing result to the second device; and means for receiving, from the second device, a union sensing result that is at least partially based on the sensing result of the first device.
In some embodiments, based on that the apparatus transmits the confirmation message, the apparatus is configured with a first charging policy; or based on that the apparatus refrains from joining the sensing device group, the apparatus is configured with a second charging policy.
In some embodiments, the means for receiving the resource configuration for separate sensing includes: means for transmitting, to the second device, a rejection message for the first request.
In some embodiments, at least one of: the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle; the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB); or the third device comprises a core network (CN) device or a sensing management entity (SME).
In some embodiments, an apparatus capable of performing any of the method 1100 (for example, the second device 120) may include means for receiving, at a second device and from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for transmitting the first request to the first device; and means for i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
In some embodiments, the confirmation message is received, and the apparatus may further include means for transmitting the confirmation message to the third device.
In some embodiments, the sensing device group is determined by a third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area, wherein the third device is configured for a sensing management.
In some embodiments, the apparatus may further include means for receiving, from the first device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device; and means for transmitting the second request to the third device.
In some embodiments, at least one of the following: the pose information comprises at least one of a position or an orientation of the first device; or the sensing capability information comprises a field of view (FOV) of the first device.
In some embodiments, the first request is determined based on the pose information of the first device and is transmitted by the third device.
In some embodiments, the apparatus may further include means for receiving, from a third device, a third request for current pose information of one or more sensing devices in the sensing device group; means for transmitting the third request to the first device; means for receiving, from the first device, a pose report of at least one of a position or an orientation of the first device; and means for transmitting the pose report to the third device.
In some embodiments, the apparatus may further include means for receiving, from a third device, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; means for determining, based on the plurality of the scanning scopes, a plurality of resource configuration information for the union sensing; and means for transmitting, to the sensing device of the one or more sensing devices, a resource configuration information of the plurality of the resource configuration information, wherein the resource configuration information is for sensing the at least one of the plurality of the scanning scopes.
In some embodiments, the apparatus may further include means for receiving one or more sensing results from the one or more sensing devices; means for transmitting, to the third device, the one or more sensing results; and means for receiving, from the third device, a union sensing result that is based on the one or more sensing results.
In some embodiments, the apparatus may further include means for receiving, from the third device, a first indication of the first scanning scope associated with a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; means for determining, based on the first indication of the first scanning scope, a first resource configuration information; and means for transmitting, to the fourth device, the first resource configuration information.
In some embodiments, the apparatus may further include means for receiving, from the fourth device, a first sensing result of the fourth device; means for transmitting, to the third device, the first sensing result; means for receiving, from the third device, a second indication of the second scanning scope associated with a fifth device of the one or more sensing devices, wherein the second scanning scope is determined based on the first sensing result; means for determining, based on the second indication of the second scanning scope, a second resource configuration information; and means for transmitting, to the fifth device, the second resource configuration information.
In some embodiments, the apparatus may further include means for receiving, from the fifth device, a second sensing result; means for transmitting, to the third device, the second sensing result; and means for receiving, from the third device, a union sensing result that is at least based on the first sensing result and the second sensing result.
In some embodiments, the apparatus may further include means for transmitting the union sensing result to the one or more sensing devices; and means for transmitting the union sensing result to a further device which is not included in the sensing device group.
In some embodiments, the one or more sensing devices are configured with a first charging policy; and the further device is configured with a second charging policy.
In some embodiments, the scanning scopes are further determined based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
In some embodiments, the means for transmitting the resource configuration information for separate sensing includes: means for receiving, from the first device, a rejection message for the first request.
In some embodiments, at least one of: the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle; the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB); or the third device comprises a core network (CN) device or a sensing management entity (SME).
In some embodiments, an apparatus capable of performing any of the method 1200 (for example, the third device 130) may include means for transmitting, to a second device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area; means for receiving, from the second device, a confirmation message for the first request; and means for including the first device in the sensing device group to perform the union sensing.
In some embodiments, the sensing device group is determined by the third device based on at least one of the following: a distribution of one or more sensing devices; or environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area.
In some embodiments, the apparatus may further include means for obtaining pose information of the first device, wherein the pose information comprises at least one of a position or an orientation of the first device.
In some embodiments, the apparatus may further include means for receiving, from the second device, a second request for sensing, wherein the second request is transmitted from the first device and the second request comprises at least one of pose information or sensing capability information of the first device.
In some embodiments, the means for transmitting the first request includes: means for determining, based on the pose information of the first device, whether a first area to be sensed by the first device is associated with the sensing area; and means for transmitting the first request based on determining that the first area is associated with the sensing area.
In some embodiments, the apparatus may further include means for monitoring a position of at least one sensing device in the sensing device group; means for determining, based on the monitoring, whether the at least one sensing device leaves the sensing area; and means for removing, based on determining that the at least one sensing device leaves the sensing area, the at least one sensing device from the sensing device group.
In some embodiments, the apparatus may further include means for determining whether a first number of sensing devices in the sensing device group is below a number threshold; and means for cancelling, based on determining that the first number is below the number threshold, the sensing device group.
In some embodiments, the apparatus may further include means for determining a  permanent sensing device group for the sensing area.
In some embodiments, the apparatus may further include means for transmitting, to the second device, a third request for current pose information of one or more sensing devices in the sensing device group; and means for receiving, from the second device, one or more pose reports of the one or more sensing devices, wherein a pose report of the one or more pose reports indicates at least one of a position or an orientation of a sensing device of the one or more sensing devices.
In some embodiments, the apparatus may further include means for determining, based on the one or more pose reports, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; and means for transmitting, to the second device, an indication of the plurality of scanning scopes.
In some embodiments, the apparatus may further include means for receiving, from the second device, one or more sensing results of the one or more sensing devices; means for determining, based on the one or more sensing results, a first union sensing result; and means for transmitting, to the second device, the first union sensing result.
In some embodiments, the apparatus may further include means for determining, based on the one or more pose reports, a first scanning scope to be sensed by a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; and means for transmitting, to the second device, an indication of the first scanning scope.
In some embodiments, the apparatus may further include means for receiving, from the second device, a first sensing result of the fourth device; means for determining, based on the first sensing result and the one or more pose reports, a second scanning scope to be sensed by a fifth device of the one or more sensing devices; and means for transmitting, to the second device, an indication of the second scanning scope.
In some embodiments, the apparatus may further include means for receiving, from the second device, a second sensing result of the fifth device; means for determining, based on at least the first sensing result and the second sensing result, a second union sensing result; and means for transmitting, to the second device, the second union sensing result.
In some embodiments, the scanning scopes are determined by the third device further based on at least one of the following: FOVs of one or more devices in the sensing device group; environment information on the sensing area; one or more dynamic objects within the sensing area; or dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
In some embodiments, at least one of: the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle; the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ; or the third device comprises a core network (CN) device or a sensing management entity (SME) .
Fig. 13 is a simplified block diagram of a device 1300 that is suitable for implementing embodiments of the present disclosure. The device 1300 may be provided to implement the communication device, for example the first device 110 to the third device 130 as shown in Fig. 1a. As shown, the device 1300 includes one or more processors 1310, one or more memories 1320 coupled to the processor 1310, and one or more transmitters and/or receivers (TX/RX) 1340 coupled to the processor 1310.
The TX/RX 1340 is for bidirectional communications. The TX/RX 1340 has at least one antenna to facilitate communication. The communication interface may represent any interface that is necessary for communication with other network elements.
The processor 1310 may be of any type suitable to the local technical network and may include one or more of the following: general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multicore processor architecture, as non-limiting examples. The device 1300 may have multiple processors, such as an application specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
The memory 1320 may include one or more non-volatile memories and one or more volatile memories. Examples of the non-volatile memories include, but are not limited to, a read only memory (ROM) 1324, an electrically programmable read only memory (EPROM) , a flash memory, a hard disk, a compact disc (CD) , a digital video disk (DVD) , and other magnetic storage and/or optical storage. Examples of the volatile memories include, but are not limited to, a random access memory (RAM) 1322 and other volatile memories that will not last in the power-down duration.
A program 1330 includes executable instructions that are executed by the associated processor 1310. The program 1330 may be stored in the ROM 1324. The processor 1310 may perform any suitable actions and processing by loading the program 1330 into the RAM 1322.
The embodiments of the present disclosure may be implemented by means of the program so that the device 1300 may perform any process of the disclosure as discussed with reference to Figs. 2 to 12. The embodiments of the present disclosure may also be implemented by hardware or by a combination of software and hardware.
In some embodiments, the program 1330 may be tangibly contained in a readable storage medium which may be included in the device 1300 (such as in the memory 1320) or other storage devices that are accessible by the device 1300. The device 1300 may load the program 1330 from the storage medium to the RAM 1322 for execution. The storage medium may include any types of tangible non-volatile storage, such as ROM, EPROM, a flash memory, a hard disk, CD, DVD, and the like. Fig. 14 shows an example of the storage medium 1400 in form of CD or DVD. The storage medium has the processor instructions 1330 stored therein.
Generally, various embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representations, it is to be understood that the block, apparatus, system, technique or method described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
The present disclosure also provides at least one program product tangibly stored on a non-transitory readable storage medium. The program product includes executable instructions, such as those included in program modules, being executed in a device on a target real or virtual processor, to carry out process 200, the method 1000, 1100 or 1200 as described above with reference to Fig. 2 to Fig. 12. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, or the like that  perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules may be located in both local and remote storage media.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present disclosure, the program codes or related data may be carried by any suitable carrier to enable the device, apparatus or processor to perform various processes and operations as described above. Examples of the carrier include a signal, readable storage medium, and the like.
The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include but not limited to an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random-access memory (RAM) , a read-only memory (ROM) , an erasable programmable read-only memory (EPROM or Flash memory) , an optical fiber, a portable compact disc read-only memory (CD-ROM) , an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. The term “non-transitory, ” as used herein, is a limitation of the medium itself (i.e., tangible, not a signal) as opposed to a limitation on data storage persistency (e.g., RAM vs. ROM) .
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.  Likewise, while several specific implementation details are contained in the above discussions, these should not be construed as limitations on the scope of the present disclosure, but rather as descriptions of features that may be specific to particular embodiments. Certain features that are described in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable sub-combination.
Although the present disclosure has been described in language specific to structural features and/or methodological acts, it is to be understood that the present disclosure defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (52)

  1. A first device comprising
    at least one processor; and
    at least one memory storing instructions that, when executed by the at least one processor, cause the first device at least to:
    receive, from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and
    i) transmit, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receive resource configuration information for separate sensing from the second device.
  2. The first device of claim 1, wherein the sensing device group is determined by a third device based on at least one of the following:
    a distribution of one or more sensing devices; or
    environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area,
    wherein the third device is configured for a sensing management.
  3. The first device of claim 1 or 2, wherein the first device is further caused to:
    transmit, to the second device, a second request for sensing,
    wherein the second request comprises at least one of pose information or sensing capability information of the first device.
  4. The first device of claim 3, wherein at least one of the following:
    the pose information comprises at least one of a position or an orientation of the first device; or
    the sensing capability information comprises a field of view (FOV) of the first device.
  5. The first device of claim 3 or 4, wherein the first request is determined based on the pose information of the first device and is transmitted by the third device.
  6. The first device of any of claims 1 to 5, wherein the first device is caused to transmit the confirmation message, and wherein the first device is further caused to:
    receive, from the second device, a third request for current pose information of one or more sensing devices in the sensing device group; and
    transmit, to the second device, a pose report of at least one of a position or an orientation of the first device.
  7. The first device of claim 6, wherein the first device is caused to transmit the confirmation message, and wherein the first device is further caused to:
    receive, from the second device, a resource configuration information for the union sensing, wherein the resource configuration for the union sensing indicates a set of sensing resources that are determined based on the pose report.
  8. The first device of claim 7, wherein the set of sensing resources are further determined based on at least one of the following:
    FOVs of one or more devices in the sensing device group;
    environment information on the sensing area;
    one or more dynamic objects within the sensing area; or
    dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  9. The first device of claim 7 or 8, wherein the first device is further caused to:
    perform, based on the resource configuration information for the union sensing, a sensing procedure to obtain a sensing result of the first device;
    transmit the sensing result to the second device; and
    receive, from the second device, a union sensing result that is at least partially based on the sensing result of the first device.
  10. The first device of any of claims 1 to 9, wherein:
    based on that the first device transmits the confirmation message, the first device is configured with a first charging policy; or
    based on that the first device refrains from joining the sensing device group, the first device is configured with a second charging policy.
  11. The first device of any of claims 1 to 10, wherein the first device is caused to receive the resource configuration for separate sensing by:
    transmitting, to the second device, a rejection message for the first request.
  12. The first device of any of claims 1 to 11, wherein at least one of:
    the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
    the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB); or
    the third device comprises a core network (CN) device or a sensing management entity (SME).
  13. A second device comprising
    at least one processor; and
    at least one memory storing instructions that, when executed by the at least one processor, cause the second device at least to:
    receive, from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area;
    transmit the first request to the first device; and
    i) receive, from the first device, a confirmation message for the first request, or ii) transmit a first resource configuration information for separate sensing to the first device.
  14. The second device of claim 13, wherein the confirmation message is received, and wherein the second device is further caused to:
    transmit the configuration message to the third device.
  15. The second device of claim 13 or 14, wherein the sensing device group is determined by a third device based on at least one of the following:
    a distribution of one or more sensing devices; or
    environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area,
    wherein the third device is configured for a sensing management.
  16. The second device of any of claims 13 to 15, wherein the second device is further caused to:
    receive, from the first device, a second request for sensing, wherein the second request comprises at least one of pose information or sensing capability information of the first device; and
    transmit the second request to the third device.
  17. The second device of claim 16, wherein at least one of the following:
    the pose information comprises at least one of a position or an orientation of the first device; or
    the sensing capability information comprises a field of view (FOV) of the first device.
  18. The second device of claim 16 or 17, wherein the first request is determined based on the pose information of the first device and is transmitted by the third device.
  19. The second device of any of claims 13 to 18, wherein the second device is further caused to:
    receive, from a third device, a third request for current pose information of one or more sensing devices in the sensing device group;
    transmit the third request to the first device;
    receive, from the first device, a pose report of at least one of a position or an orientation of the first device; and
    transmit the pose report to the third device.
  20. The second device of claim 19, wherein the second device is further caused to:
    receive, from a third device, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices;
    determine, based on the plurality of the scanning scopes, a plurality of resource configuration information for the union sensing; and
    transmit, to the sensing device of the one or more sensing devices, a resource configuration information of the plurality of the resource configuration information, wherein  the resource configuration information is for sensing the at least one of the plurality of the scanning scopes.
  21. The second device of claim 20, wherein the second device is further caused to:
    receive one or more sensing results from the one or more sensing devices;
    transmit, to the third device, the one or more sensing results;
    and receive, from the third device, a union sensing result that is based on the one or more sensing results.
  22. The second device of claim 19, wherein the second device is further caused to:
    receive, from the third device, a first indication of the first scanning scope associated with a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices;
    determine, based on the first indication of the first scanning scope, a first resource configuration information; and
    transmit, to the fourth device, the first resource configuration information.
  23. The second device of claim 22, wherein the second device is further caused to:
    receive, from the fourth device, a first sensing result of the fourth device;
    transmit, to the third device, the first sensing result;
    receive, from the third device, a second indication of the second scanning scope associated with a fifth device of the one or more sensing devices, wherein the second scanning scope is determined based on the first sensing result;
    determine, based on the second indication of the second scanning scope, a second resource configuration information; and
    transmit, to the fifth device, the second resource configuration information.
  24. The second device of claim 23, wherein the second device is further caused to:
    receive, from the fifth device, a second sensing result;
    transmit, to the third device, the second sensing result; and
    receive, from the third device, a union sensing result that is at least based on the first sensing result and the second sensing result.
  25. The second device of claim 21 or 24, wherein the second device is further caused to:
    transmit the union sensing result to the one or more sensing devices; and
    transmit the union sensing result to a further device which is not included in the sensing device group.
  26. The second device of claim 25, wherein:
    the one or more sensing devices are configured with a first charging policy; and
    the further device is configured with a second charging policy.
  27. The second device of claim 20, 22 or 23, wherein the scanning scopes are further determined based on at least one of the following:
    FOVs of one or more devices in the sensing device group;
    environment information on the sensing area;
    one or more dynamic objects within the sensing area; or
    dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  28. The second device of claim 22, wherein the second device is caused to transmit the resource configuration information for separate sensing by:
    receiving, from the first device, a rejection message for the first request.
  29. The second device of any of claims 13 to 28, wherein at least one of:
    the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
    the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB); or
    the third device comprises a core network (CN) device or a sensing management entity (SME).
  30. A third device comprising
    at least one processor; and
    at least one memory storing instructions that, when executed by the at least one processor, cause the third device at least to:
    transmit, to a second device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area;
    receive, from the second device, a confirmation message for the first request; and
    include the first device in the sensing device group to perform the union sensing.
  31. The third device of claim 30, wherein the sensing device group is determined by the third device based on at least one of the following:
    a distribution of one or more sensing devices; or
    environment information on the sensing area, the environment information comprising at least one of a building distribution, facility distribution or a road arrangement within the sensing area.
  32. The third device of claim 30 or 31, wherein the third device is further caused to:
    obtain pose information of the first device, wherein the pose information comprises at least one of a position or an orientation of the first device.
  33. The third device of any of claims 30 to 32, wherein the third device is further caused to:
    receive, from the second device, a second request for sensing, wherein the second request is transmitted from the first device and the second request comprises at least one of pose information or sensing capability information of the first device.
  34. The third device of claim 32 or 33, wherein the third device is caused to transmit the first request by:
    determining, based on the pose information of the first device, whether a first area to be sensed by the first device is associated with the sensing area;
    based on determining that the first area is associated with the sensing area, transmitting the first request.
  35. The third device of any of claims 30 to 34, wherein the third device is further caused to:
    monitor a position of at least one sensing device in the sensing device group; and
    determine, based on the monitoring, whether the at least one sensing device leaves the sensing area; and
    based on determining that the at least one sensing device leaves the sensing area, remove the at least one sensing device from the sensing device group.
  36. The third device of any of claims 30 to 35, wherein the third device is further caused to:
    determine whether a first number of sensing devices in the sensing device group is below a number threshold;
    based on determining that the first number is below the number threshold, cancel the sensing device group.
  37. The third device of any of claims 30 to 35, wherein the third device is further caused to:
    determine a permanent sensing device group for the sensing area.
  38. The third device of any of claims 30 to 37, wherein the third device is further caused to:
    transmit, to the second device, a third request for current pose information of one or more sensing devices in the sensing device group;
    and receive, from the second device, one or more pose reports of the one or more sensing devices, wherein a pose report of the one or more pose reports indicates at least one of a position or an orientation of a sensing device of the one or more sensing devices.
  39. The third device of claim 38, wherein the third device is further caused to:
    determine, based on the one or more pose reports, a plurality of scanning scopes for the sensing area, wherein at least one of the plurality of scanning scopes is to be sensed by a sensing device of the one or more sensing devices; and
    transmit, to the second device, an indication of the plurality of scanning scopes.
  40. The third device of claim 39, wherein the third device is further caused to:
    receive, from the second device, one or more sensing results of the one or more sensing devices;
    determine, based on the one or more sensing results, a first union sensing result; and
    transmit, to the second device, the first union sensing result.
  41. The third device of claim 38, wherein the third device is further caused to:
    determine, based on the one or more pose reports, a first scanning scope to be sensed by a fourth device of the one or more sensing devices, wherein the fourth device and the first device are the same device or different devices; and
    transmit, to the second device, an indication of the first scanning scope.
  42. The third device of claim 41, wherein the third device is further caused to:
    receive, from the second device, a first sensing result of the fourth device;
    determine, based on the first sensing result and the one or more pose reports, a second scanning scope to be sensed by a fifth device of the one or more sensing devices; and
    transmit, to the second device, an indication of the second scanning scope.
  43. The third device of claim 42, wherein the third device is further caused to:
    receive, from the second device, a second sensing result of the fifth device;
    determine, based on at least the first sensing result and the second sensing result, a second union sensing result; and
    transmit, to the second device, the second union sensing result.
  44. The third device of any of claims 39, 42 or 43, wherein the scanning scopes are determined by the third device further based on at least one of the following:
    FOVs of one or more devices in the sensing device group;
    environment information on the sensing area;
    one or more dynamic objects within the sensing area; or
    dead zones of the one or more devices, wherein the dead zones are determined based on the FOVs of the one or more devices, the environment information and/or the one or more dynamic objects.
  45. The third device of any of claims 30 to 44, wherein at least one of:
    the first device comprises a terminal device that is integrated on a vehicle, and the first device is further configured to transmit a union sensing result to a navigation system for the vehicle;
    the second device comprises a network device, an access point, a base station, an evolved node B (eNB) or a next generation node B (gNB) ; or
    the third device comprises a core network (CN) device or a sensing management entity (SME) .
  46. A method comprising:
    receiving, at a first device and from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and
    i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving resource configuration information for separate sensing from the second device.
  47. A method comprising:
    receiving, at a second device and from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area;
    transmitting the first request to the first device; and
    i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
  48. A method comprising:
    transmitting, at a third device and to a second device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area;
    receiving, from the second device, a confirmation message for the first request; and
    including the first device in the sensing device group to perform the union sensing.
  49. An apparatus comprising:
    means for receiving, at a first device and from a second device, a first request for requesting the first device to join a sensing device group that is configured to perform a union sensing in a sensing area; and
    means for i) transmitting, to the second device, a confirmation message for the first request, or ii) in the case that the first device refrains from joining the sensing device group, receiving resource configuration information for separate sensing from the second device.
  50. An apparatus comprising:
    means for receiving, at a second device and from a third device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area;
    means for transmitting the first request to the first device; and
    means for i) receiving, from the first device, a confirmation message for the first request, or ii) transmitting a first resource configuration information for separate sensing to the first device.
  51. An apparatus comprising:
    means for transmitting, to a second device, a first request for requesting a first device to join a sensing device group that is configured to perform a union sensing in a sensing area;
    means for receiving, from the second device, a confirmation message for the first request; and
    means for including the first device in the sensing device group to perform the union sensing.
  52. A non-transitory computer readable medium comprising program instructions stored thereon for performing at least the method of any of claims 46 to 48.