
WO2025089335A1 - Information processing apparatus, communication apparatus, and information processing method - Google Patents

Information processing apparatus, communication apparatus, and information processing method

Info

Publication number
WO2025089335A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
sensing
sensor
service
detection data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/037899
Other languages
English (en)
Inventor
Shinichiro Tsuda
Current Assignee
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of WO2025089335A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/02 Protecting privacy or anonymity, e.g. protecting personally identifiable information [PII]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W12/00 Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/08 Access security
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information

Definitions

  • the present disclosure relates to an information processing apparatus, a communication apparatus, and an information processing method.
  • a location service using location information of a communication apparatus (for example, a terminal apparatus) is provided.
  • Many communication apparatuses include sensors (for example, a positioning sensor such as a GPS sensor).
  • An information processing apparatus that performs processing related to a service may acquire data detected by a sensor of the communication apparatus (hereinafter, also referred to as detection data) from the communication apparatus.
  • an information processing apparatus such as an application server may acquire location information (for example, detection data detected by a positioning sensor) of a terminal apparatus from the terminal apparatus.
  • The service performed using detection data carries a risk of invading the privacy of a person related to the communication apparatus (for example, a user of the terminal apparatus), for example through leakage of the detection data. Therefore, a predetermined privacy management method may be supported in providing a service.
  • For example, a privacy management method referred to as UE LCS privacy (terminal LCS privacy) may be supported in order to protect privacy against acquisition of location information of the terminal apparatus.
  • the present disclosure proposes an information processing apparatus, a communication apparatus, and an information processing method capable of achieving highly advanced privacy protection.
  • an information processing apparatus includes: a reception unit that receives a request related to a sensing service; a first acquisition unit that acquires information related to privacy related to detection data of one or a plurality of sensors included in one or a plurality of communication apparatuses, for each sensor; and a second acquisition unit that acquires detection data of the sensor for which access is determined to be permitted based on the information related to privacy, as data related to the sensing service.
  • Fig. 1 is a diagram illustrating an outline of the present embodiment.
  • Fig. 2 is a diagram illustrating a configuration of a communication system according to the present embodiment.
  • Fig. 3 is a diagram illustrating a configuration example of a server according to an embodiment of the present disclosure.
  • Fig. 4 is a diagram illustrating a configuration of a management apparatus according to the present embodiment.
  • Fig. 5 is a diagram illustrating a configuration of a base station according to the present embodiment.
  • Fig. 6 is a diagram illustrating a configuration of a terminal apparatus according to the present embodiment.
  • Fig. 7 is a diagram illustrating a configuration example of a network architecture of 5G system (5GS).
  • Fig. 8 is a diagram illustrating a configuration example of a reference architecture related to a location service of 5GS.
  • Fig. 9 is a diagram illustrating a configuration of LPP for control in an NG-RAN and positioning in a user plane.
  • Fig. 10 is a diagram illustrating an example of LPP session processing.
  • Fig. 11A is a diagram illustrating an example of a procedure prepared in LPP.
  • Fig. 11B is a diagram illustrating an example of a procedure prepared in LPP.
  • Fig. 12A is a diagram illustrating an example of a procedure prepared in NRPPa.
  • Fig. 12B is a diagram illustrating an example of a procedure prepared in NRPPa.
  • Fig. 13 is a sequence diagram illustrating a basic procedure of a location service.
  • Fig. 14A is a diagram illustrating a configuration example of a sensing function in the core network CN.
  • Fig. 14B is a diagram illustrating a configuration example of a sensing function in the core network CN.
  • Fig. 14C is a diagram illustrating a configuration example of a sensing function in the core network CN.
  • Fig. 15 is a diagram illustrating a configuration example of a reference architecture related to a sensing service.
  • Fig. 16 is a diagram illustrating a configuration for control and sensing result detection in a user plane.
  • Fig. 17 is a diagram illustrating an example of SP session processing.
  • Fig. 18A is a diagram illustrating an example of a procedure that can be supported by SP.
  • Fig. 18B is a diagram illustrating an example of a procedure that can be supported by SP.
  • Fig. 19A is a diagram illustrating an example of a procedure that can be supported by NSPPa of the present embodiment.
  • Fig. 19B is a diagram illustrating an example of a procedure that can be supported by NSPPa of the present embodiment.
  • Fig. 20A is a diagram illustrating a configuration example of sensing privacy information.
  • Fig. 20B is a diagram illustrating a configuration example of sensing privacy information.
  • Fig. 21 is a sequence diagram illustrating an example of processing related to a sensing service.
  • Fig. 22 is a sequence diagram illustrating an example of processing of a sensing service.
  • Fig. 23 is a flowchart illustrating an example of target sensor specifying processing.
  • Fig. 24 is a flowchart illustrating an example of permission list generation processing.
  • Fig. 25 is a diagram illustrating a configuration example of capability information.
  • Fig. 26 is a diagram illustrating another configuration example of capability information.
  • Fig. 27 is a flowchart illustrating an example of capability information acquisition processing.
  • Fig. 28 is a diagram illustrating an example of detection data collection processing for sensor fusion.
  • Fig. 29 is a sequence diagram illustrating another example of processing of the sensing service.
  • a plurality of components having substantially the same functional configuration will be distinguished by attaching different numbers after the same reference numerals.
  • For example, a plurality of configurations having substantially the same functional configuration are distinguished as necessary, such as terminal apparatuses 40-1, 40-2, and 40-3.
  • However, when it is not particularly necessary to distinguish between the plurality of components having substantially the same functional configuration, only the same reference numeral is given.
  • For example, in a case where it is not necessary to particularly distinguish the terminal apparatuses 40-1, 40-2, and 40-3, they are simply referred to as the terminal apparatus 40.
  • the expression “at least one of” with a list of elements is to be construed as an expression with the listed elements as options.
  • For example, "at least one of A, B, and C" represents "A", "B", "C", "A and B", "A and C", "B and C", or "A, B, and C".
  • "At least one of A, B, or C" and "at least one of A, B, and/or C" are construed similarly to "at least one of A, B, and C".
  • A, B, and C are all optional expressions (for example, word, phrase, term, or item).
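  • The convention above can be checked mechanically: "at least one of" a list of options covers every non-empty subset of those options. A small Python sketch (illustrative only, not part of the disclosure):

```python
from itertools import combinations

def at_least_one_of(*options):
    """Enumerate every selection allowed by "at least one of" a list:
    all non-empty subsets of the listed options."""
    for r in range(1, len(options) + 1):
        for subset in combinations(options, r):
            yield subset

# "At least one of A, B, and C" permits exactly the seven selections
# enumerated in the text above.
print(list(at_least_one_of("A", "B", "C")))
```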
  • One or a plurality of embodiments (including exemplary embodiments and modifications) described below can each be implemented independently. On the other hand, at least some of the plurality of embodiments described below may be combined with at least some of other embodiments as appropriate.
  • the plurality of embodiments may include novel features different from each other. Accordingly, the plurality of embodiments can contribute to achieving or solving different objects or problems, and can exhibit different effects.
  • The first standard of the fifth generation mobile communication system (typically abbreviated as 5G) was formulated in 2018 as Rel-15.
  • 5G is a radio access technology (RAT) that can support various use cases including enhanced Mobile BroadBand (eMBB), massive Machine Type Communications (mMTC), and Ultra-Reliable and Low Latency Communications (URLLC).
  • 3G and later systems including 5G support a location service using location information of a communication apparatus (for example, a terminal apparatus).
  • the location service defines various approaches related to positioning including a RAT-independent approach.
  • Many communication apparatuses include sensors (for example, a positioning sensor such as a GPS sensor).
  • the information processing apparatus that performs processing related to the service may acquire data detected by a sensor of the communication apparatus (hereinafter, also referred to as detection data) from the communication apparatus.
  • an information processing apparatus such as an application server may acquire location information (for example, detection data detected by a positioning sensor) of a communication apparatus (for example, terminal apparatus) from the communication apparatus.
  • The service performed using detection data carries a risk of invading the privacy of a person related to the communication apparatus (for example, a user of the terminal apparatus), for example through leakage of the detection data.
  • the person related to the communication apparatus is not limited to a direct user of the communication apparatus, and includes an indirect user of the communication apparatus.
  • the person related to the communication apparatus may be a user who receives a service from the communication apparatus via another apparatus connected to the communication apparatus.
  • the user is not limited to the direct or indirect user of the communication apparatus.
  • the person related to the communication apparatus may be a person having some information detected or recorded in the communication apparatus.
  • the person related to the communication apparatus includes all persons having a possibility of suffering privacy invasion due to leakage of detection data or the like.
  • a predetermined privacy management method may be supported in providing the service.
  • For example, a privacy management method referred to as UE LCS privacy (terminal LCS privacy) may be supported in order to protect privacy against acquisition of location information of the terminal apparatus.
  • Furthermore, Integrated Sensing and Communication (ISAC) has been studied for mobile communication systems.
  • the communication apparatus including the sensor is not limited to a terminal apparatus, and may be an apparatus other than the terminal apparatus such as a base station or a road-side unit (RSU), for example.
  • a terminal apparatus or a base station provides not only a communication function but also a radio frequency (RF)-based sensing function. That is, the mobile communication system is expected to be able to acquire information from various sensors in the terminal apparatus or the base station via wireless communication and expected to provide a sensing service that can produce a new added value such as sensor fusion using artificial intelligence or machine learning.
  • terminal LCS privacy may be supported in order to protect privacy against acquisition of location information of the terminal apparatus.
  • However, terminal LCS privacy only relates to privacy management at the granularity of a service.
  • Moreover, a sensing service based on ISAC will handle information other than the location information of the terminal apparatus. Therefore, there is a concern that a privacy management method performed at the granularity of a service would provide insufficient privacy protection in the future.
  • PTL 1 (Japanese Laid-open Patent Publication No. 2001-359169) discloses an approach of protecting privacy of a user of a mobile terminal by identifying location information of the mobile terminal in units of base points.
  • PTL 1 does not mention protection of privacy against various sensors having different types and purposes of data as acquisition targets.
  • the present embodiment proposes to solve the above problem as follows.
  • Fig. 1 is a diagram illustrating an outline of the present embodiment.
  • the communication system of the present embodiment is a cellular communication system having a plurality of communication apparatuses (for example, the terminal apparatus) wirelessly connected to each other.
  • the communication system includes a base station and an information processing apparatus, and provides a wireless communication service to the plurality of communication apparatuses.
  • the information processing apparatus is a core network, for example.
  • the communication system may include a server separately from the base station and the information processing apparatus.
  • the server is an application server that provides various services to the terminal apparatus, for example.
  • At least one of the plurality of communication apparatuses includes one or a plurality of sensors.
  • the sensor included in the communication apparatus may be a sensor that detects location information, such as a positioning sensor, or may be a sensor such as a camera and/or LiDAR sensor.
  • the sensor included in the communication apparatus may be a sensor that detects at least one of a color of an object, a velocity of the object, an acceleration of the object, a temperature of the object, a reflectance of the object, an image, and a shape of the object.
  • the communication apparatus may include other types of sensors.
  • Services provided by the communication system according to the present embodiment include a sensing service.
  • the sensing service is, for example, a service performed based on data detected by one or a plurality of sensors of one or a plurality of communication apparatuses.
  • data detected by one or a plurality of sensors is referred to as detection data.
  • the detection data can be rephrased as sensing data, sensing information, and/or sensing results.
  • the sensing service may be rephrased as acquisition of detection data by a sensor.
  • the sensing service is typically a service of providing detection data of one or a plurality of sensors.
  • the sensing service is not limited to the service of providing detection data.
  • the sensing service may be a service of providing processing executed using detection data, or may be a service of providing information generated using detection data.
  • the sensing service may be a service of providing processing based on image data or shape data detected by a camera, LiDAR, or the like (for example, an automated driving service of the vehicle), or may be a service of providing information generated based on location information detected by a positioning sensor (for example, a service of providing information related to an area/facility specified by detection data).
  • the sensing service is not limited to these services.
  • the sensing service may be a service provided based on a sensor that detects the color of the object, the velocity of the object, the acceleration of the object, the temperature of the object, and the reflectance of the object.
  • the sensing service is not limited to a service performed by directly using data detected by a sensor.
  • the sensing service may be a service performed indirectly using data detected by a sensor.
  • the sensing service may be a service performed using data obtained by processing detection data of one or a plurality of sensors (for example, an analysis result based on detection data or data obtained by combining a plurality of pieces of detection data). Note that data obtained by processing detection data of one or a plurality of sensors can also be regarded as a type of detection data.
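  • As a toy illustration of such processed detection data, combining a plurality of per-sensor readings into one result might look like the following sketch. The sensor ids, the (x, y) reading format, and the simple averaging rule are all assumptions for illustration, not the disclosed method:

```python
from statistics import fmean

def fuse_location_estimates(detections):
    """Combine several per-sensor location readings into one fused
    estimate (here, a plain average). As noted in the text, the fused
    result can itself be regarded as a type of detection data.

    `detections` maps a hypothetical sensor id to an (x, y) reading.
    """
    xs = [x for x, _ in detections.values()]
    ys = [y for _, y in detections.values()]
    return (fmean(xs), fmean(ys))

readings = {"gnss-1": (10.0, 20.0), "lidar-1": (12.0, 18.0)}
print(fuse_location_estimates(readings))  # → (11.0, 19.0)
```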
  • the detection data may be data detected using a radio wave in the same band as the resource used for wireless communication.
  • the detection data may be data detected using a resource used in cellular communication such as LTE or NR, or may be data detected using a resource used in Wi-fi communication, Bluetooth communication, or the like.
  • An information processing apparatus (for example, a core network) included in the communication system performs processing related to the sensing service based on a request from a server (for example, an application server).
  • the apparatus that requests the sensing service is not limited to the server.
  • the apparatus that requests the sensing service may be a communication apparatus other than the server, for example, at least one of a terminal apparatus, a core network, and a base station.
  • the information processing apparatus that performs the processing related to the sensing service is not limited to the core network.
  • the information processing apparatus that performs the processing related to the sensing service may be a server, a base station, or other communication apparatuses (for example, the terminal apparatus).
  • the information processing apparatus acquires information related to privacy of one or a plurality of sensors included in one or a plurality of communication apparatuses (hereinafter, the information is referred to as sensing privacy information).
  • the sensing privacy information includes information related to access to detection data of one or a plurality of sensors included in the communication apparatus, for each sensor.
  • The information related to access is, for example, information indicating, for each sensor, whether to permit access to its detection data.
  • the information related to the access may be information indicating whether to permit access to the detection data for each data related to the sensor.
  • the information related to the access may be information indicating whether to permit access to detection data for each sensing data.
  • The information related to access may be regarded as identical to the sensing privacy information (information related to privacy), or the information included in the sensing privacy information may be regarded as information related to access.
  • the sensing privacy information may include information related to duplication of the detection data.
  • the sensing privacy information may include information related to conditions of duplication of the detection data.
  • the sensing privacy information may include information related to the number of times of duplication of detection data and/or whether to permit duplication of detection data.
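  • Taken together, the sensing privacy information described above could be modeled per sensor roughly as follows. This is a minimal sketch; the field names (access_permitted, max_duplications, and so on) and the apparatus id format are illustrative assumptions, not part of any specification:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class SensorPrivacyEntry:
    """Per-sensor privacy entry: whether access to this sensor's
    detection data is permitted, plus optional duplication limits."""
    access_permitted: bool
    duplication_permitted: bool = False
    max_duplications: Optional[int] = None

@dataclass
class SensingPrivacyInfo:
    """Sensing privacy information for one communication apparatus,
    keyed per sensor as described in the text."""
    apparatus_id: str
    sensors: Dict[str, SensorPrivacyEntry] = field(default_factory=dict)

info = SensingPrivacyInfo(
    apparatus_id="UE-40-1",
    sensors={
        "positioning": SensorPrivacyEntry(access_permitted=True,
                                          duplication_permitted=True,
                                          max_duplications=1),
        "camera": SensorPrivacyEntry(access_permitted=False),
    },
)
print(info.sensors["camera"].access_permitted)  # → False
```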
  • the information processing apparatus determines a sensor for which access to the detection data is permitted based on the sensing privacy information (for example, information related to access included in the sensing privacy information). Subsequently, the information processing apparatus acquires data detected by the sensor for which access is determined to be permitted. Subsequently, the information processing apparatus transmits the acquired detection data to the server.
  • With this arrangement, the information processing apparatus can decide whether access to detection data is permitted not in units of services but in units of sensors. That is, the information processing apparatus can perform privacy management with finer granularity than before. As a result, it is possible to sufficiently protect the privacy of a person related to the communication apparatus.
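  • The determine-permission, acquire, and transmit steps above can be sketched as follows. The dictionary layout and the read_sensor callback are hypothetical, introduced only to make the flow concrete:

```python
def permitted_sensors(privacy_info):
    """Return the ids of sensors whose detection data may be accessed,
    per the sensing privacy information (per sensor, not per service)."""
    return [sid for sid, entry in privacy_info.items()
            if entry["access_permitted"]]

def collect_detection_data(privacy_info, read_sensor):
    """Acquire detection data only from permitted sensors; the result
    would then be transmitted to the requesting server.
    `read_sensor` is a hypothetical callback reading one sensor."""
    return {sid: read_sensor(sid) for sid in permitted_sensors(privacy_info)}

privacy = {
    "positioning": {"access_permitted": True},
    "camera": {"access_permitted": False},
}
data = collect_detection_data(privacy, lambda sid: f"<data from {sid}>")
print(data)  # → {'positioning': '<data from positioning>'}
```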
  • Fig. 2 is a diagram illustrating a configuration example of the communication system 1 according to the present embodiment.
  • the communication system 1 includes a server 10, a management apparatus 20, a base station 30, and a terminal apparatus 40. With individual wireless communication apparatuses constituting the communication system 1 operating in cooperation with each other, the communication system 1 provides a user with a wireless network capable of mobile communication (mobile network).
  • the wireless network of the present embodiment may be, for example, a cellular network including a radio access network RAN and a core network CN.
  • the mobile network may include a terminal apparatus 40.
  • the wireless communication apparatus is an apparatus having a wireless communication function, and in the example of Fig. 2, the apparatus corresponds to the base station 30 and the terminal apparatus 40.
  • the communication system 1 may include a plurality of servers 10, a plurality of management apparatuses 20, a plurality of base stations 30, and a plurality of terminal apparatuses 40.
  • The communication system 1 includes a server 10-1 and a server 10-2 as the server 10, and includes a management apparatus 20-1 and a management apparatus 20-2 as the management apparatus 20.
  • The communication system 1 includes a base station 30-1, a base station 30-2, and a base station 30-3 as the base station 30, and includes a terminal apparatus 40-1, a terminal apparatus 40-2, and a terminal apparatus 40-3 as the terminal apparatus 40.
  • an apparatus included in the communication system 1 may be referred to as a network apparatus.
  • the terminal apparatus 40 may be configured to connect to the network using a radio access technology (RAT) such as long term evolution (LTE), new radio (NR), Beyond 5G (B5G), 6G, Wi-Fi, or Bluetooth (registered trademark).
  • the terminal apparatus 40 may be configured to be able to use different radio access technologies (wireless communication method).
  • the terminal apparatus 40 may be configured to be able to use NR and Wi-Fi.
  • the terminal apparatus 40 may be configured to be able to use different cellular communication technologies (for example, LTE and NR, B5G, or 6G).
  • the terminal apparatus 40 may be referred to as user equipment (UE) 40.
  • LTE and NR are a type of cellular communication technology, and enable mobile communication of terminal apparatuses by using cellular arrangement of a plurality of areas covered by base stations.
  • 6G is also assumed to be a type of cellular communication technology, in which arranging a plurality of areas covered by base stations in a cellular manner is expected to enable mobile communication of terminal apparatuses.
  • LTE includes LTE-advanced (LTE-A), LTE-advanced pro (LTE-A Pro), and evolved universal terrestrial radio access (EUTRA).
  • NR includes new radio access technology (NRAT) and further EUTRA (FEUTRA).
  • NR may further include 5G-Advanced.
  • a single base station 30 may manage a plurality of cells.
  • a cell corresponding to LTE may be referred to as an LTE cell
  • a cell corresponding to NR may be referred to as an NR cell.
  • NR is the next generation (fifth generation) radio access technology subsequent to LTE (fourth generation communication including LTE-Advanced and LTE-Advanced Pro).
  • the NR is a radio access technology that can support various use cases including enhanced mobile broadband (eMBB), massive machine type communications (mMTC), and Ultra-Reliable and Low Latency Communications (URLLC).
  • NR is standardized in Rel-15 of 3GPP (registered trademark) as a technical framework supporting usage scenarios, requirements, deployment scenarios, and the like in these use cases.
  • B5G and 6G are required to simultaneously achieve a plurality of axes: high speed and large capacity, low latency and high reliability, and massive simultaneous connections.
  • 6G is a next-generation mobile communication technology succeeding NR, that is, the 5G system (5GS) of the fifth generation mobile communication.
  • the 6G can be a cellular communication technology similarly to the 5G (NR).
  • 6G includes a radio access technology and a network technology between a base station, a core network, and a data network.
  • the 6G includes a technology for sophistication (referred to as extreme connectivity) of each technology of eMBB, mMTC, and URLLC, which have been defined as a main use case or a requirement in the NR.
  • the 6G also includes new technologies in new aspects.
  • For example, 6G can include technologies related to artificial intelligence (cognitive network, AI-native air interface), sensing (radar sensing, network as a sensor), and terahertz communication.
  • the wireless network described above or below may support at least one of radio access technologies (RAT) such as LTE, NR, B5G, and 6G.
  • the radio access method used by the communication system 1 is not limited to LTE, NR, or 6G, and may be other radio access methods such as wideband code division multiple access (W-CDMA) and code division multiple access 2000 (cdma2000), for example.
  • the base station 30 may be a terrestrial station or a non-terrestrial station. That is, the communication system illustrated in Fig. 2 may be a non-terrestrial network.
  • the non-terrestrial station may be a satellite station or an aircraft station.
  • the wireless network may be a Bent-pipe (Transparent) mobile satellite communication system.
  • the terrestrial station (also referred to as a terrestrial base station) refers to a base station or a relay station installed on the ground.
  • The "ground" represents not only land but also a terrestrial location in a broad sense including underground, above-water, and underwater. Note that, in the following description, a "terrestrial station" may be referred to as a "gateway".
  • the base station in LTE may be referred to as Evolved Node B (eNodeB) or eNB.
  • NR base stations may be referred to as gNodeB or gNB.
  • 6G base stations may be referred to as 6G NodeB (6GNB).
  • The RAN of LTE may be referred to as E-UTRAN.
  • The RAN of NR may be referred to as NG-RAN.
  • The RAN of 6G may be referred to as 6G-RAN.
  • the terminal apparatus is a type of communication apparatus, and is also referred to as a mobile station or a terminal.
  • the terminal apparatus 40 may be connectable to the network using a radio access technology (wireless communication method) other than LTE, NR, B5G, 6G, Wi-Fi, or Bluetooth.
  • the terminal apparatus 40 may be connectable to the network by using low power wide area (LPWA) communication.
  • the terminal apparatus 40 may be connectable to the network using wireless communication of a proprietary standard.
  • LPWA communication is wireless communication that enables low-power wide-range communication.
  • For example, LPWA wireless is Internet of Things (IoT) wireless communication using specified low-power radio (for example, the 920 MHz band) or an industry-science-medical (ISM) band.
  • LPWA radio may include LTE-M operating in a cellular frequency band and/or Cellular IoT (C-IoT) typified by NB-IoT.
  • the LPWA communication used by the terminal apparatus 40 may conform to the LPWA standard.
  • the LPWA standard may be, for example, at least one of ELTRES, ZETA, SIGFOX, LoRaWAN, LTE-M, and NB-IoT. Needless to say, the LPWA standard is not to be limited thereto, and may be other LPWA standards.
  • Each wireless communication apparatus in Fig. 2 may be considered as an apparatus in a logical sense. That is, a part of each wireless communication apparatus may be implemented by a virtual machine (VM), a container such as Docker, or the like, and they may be implemented on physically the same piece of hardware.
  • the concept of the "wireless communication apparatus” includes not only a portable mobile apparatus (terminal apparatus) such as a mobile terminal but also an apparatus installed in a structure or a mobile body.
  • the structure or a mobile body itself may be regarded as a wireless communication apparatus.
  • the wireless communication apparatus conceptually includes not only the terminal apparatus 40 but also the base station 30.
  • the wireless communication apparatus is a type of processing apparatus and information processing apparatus.
  • the wireless communication apparatus can be rephrased as a transmission apparatus or a reception apparatus.
  • The resource may indicate, for example, at least one of Frequency, Time, Resource Element (including REG, CCE, and CORESET), Resource Block, Bandwidth Part, Component Carrier, Symbol, Sub-Symbol, Slot, Mini-Slot, Subslot, Subframe, Frame, PRACH occasion, Occasion, Code, Multi-access physical resource, Multi-access signature, and Subcarrier Spacing (Numerology). That is, the "resource" or the "radio resource" described above or below may be rephrased as at least one of the above examples.
  • each wireless communication apparatus illustrated below is just an example.
  • the configuration of each wireless communication apparatus may differ from the configuration below.
  • the server 10 is an information processing apparatus (computer) that provides various services to the terminal apparatus 40.
  • the server 10 is an information processing apparatus that executes processing related to a sensing service.
  • the server 10 may be an application server or a web server.
  • the server 10 may be a cloud server or an edge server.
  • the server 10 may be a PC server, a midrange server, or a mainframe server.
  • the server 10 may be an information processing apparatus that performs data processing (edge processing) near the user or the terminal.
  • the server 10 may be an information processing apparatus (computer) provided close to or built in a base station.
  • the server 10 may have a function as a core network.
  • the server 10 may be an apparatus that functions as the management apparatus 20.
  • the server 10 may naturally be an information processing apparatus that performs cloud computing.
  • the server 10 of the present embodiment can function as an application function.
  • the server 10 is connected to another communication apparatus (for example, the management apparatus 20) via a network N.
  • a plurality of networks N may be provided.
  • the network N is a public network such as the Internet, for example.
  • the network N is not limited to the Internet, but may be other networks such as a local area network (LAN), a wide area network (WAN), a cellular network, a fixed-line telephone network, or a regional Internet protocol (IP) network, for example.
  • the network N may include a wired network or a wireless network.
  • Fig. 3 is a diagram illustrating a configuration example of the server 10 according to the embodiment of the present disclosure.
  • the server 10 includes a communication unit 11, a storage unit 12, and a control unit 13.
  • the configuration illustrated in Fig. 3 is a functional configuration, and the hardware configuration may be different from this.
  • the functions of the server 10 may be installed in a distributed manner in a plurality of physically separated configurations.
  • the server 10 may include a plurality of information processing apparatuses.
  • the server 10 does not necessarily include all the configurations described above or below.
  • the server 10 may have a configuration other than the configuration described above or below.
  • the server 10 may include a sensor unit having a configuration similar to that of the sensor unit (sensor unit 34 or sensor unit 46) included in the base station 30 or the terminal apparatus 40.
  • the communication unit 11 is a communication interface for communicating with other apparatuses.
  • the communication unit 11 is a network interface.
  • An example of the communication unit 11 is a local area network (LAN) interface such as a Network Interface Card (NIC).
  • the communication unit 11 may be a wired interface, or may be a wireless interface. Under the control of the control unit 13, the communication unit 11 communicates with the management apparatus 20, the base station 30, the terminal apparatus 40, and another server 10.
  • the storage unit 12 is a data readable/writable storage device such as dynamic random access memory (DRAM), static random access memory (SRAM), a flash drive, or a hard disk.
  • the control unit 13 is a controller that controls individual units of the server 10.
  • the control unit 13 may be implemented by a processor such as a central processing unit (CPU) or a micro processing unit (MPU), for example.
  • the control unit 13 may be implemented by the processor executing various programs stored in the storage device inside the server 10, using random access memory (RAM) or the like as a work area.
  • the control unit 13 may be implemented by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
  • the control unit 13 may be implemented by a Graphics Processing Unit (GPU).
  • the CPU, MPU, ASIC, FPGA, and GPU can all be regarded as controllers.
  • the control unit 13 may include a plurality of physically separated objects.
  • the control unit 13 may include a plurality of semiconductor chips.
  • the control unit 13 includes at least one block of a reception unit 131, an acquisition unit 132, a generation unit 133, a saving unit 134, a request unit 135, an update unit 136, a setting unit 137, and a processing unit 138.
  • the control unit 13 may include a plurality of these blocks, or may include only one block each.
  • the control unit 13 may include at least one of a first acquisition unit, a second acquisition unit, and a third acquisition unit as the acquisition unit 132.
  • the control unit 13 may include at least one of a first request unit and a second request unit as the request unit 135.
  • Individual blocks (reception unit 131 to processing unit 138) constituting the control unit 13 are functional blocks individually indicating functions of the control unit 13. These functional blocks may be software blocks or hardware blocks. For example, each of the functional blocks described above may be one software module realized by software (including a microprogram) or one circuit block on a semiconductor chip (die). Needless to say, each of the functional blocks may be formed as one processor or one integrated circuit. Note that the control unit 13 may be configured in a functional unit different from the above-described functional block. The functional block may be configured by using any method. The operation of the control unit 13 may be the same as the operation of the control unit (control unit 23, control unit 33, or control unit 43) included in the management apparatus 20, the base station 30, or the terminal apparatus 40, respectively.
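As an illustration of the software-block realization described above, the control unit can be sketched as a controller composed of independent functional blocks, each of which is one software module. The class and method names below are hypothetical and do not appear in the disclosure.

```python
from dataclasses import dataclass, field

# Illustrative sketch: each functional block is one software module and the
# control unit composes them. All names here are hypothetical.

class ReceptionUnit:
    """Cf. reception unit 131: accepts a message from the communication unit."""
    def receive(self, message: dict) -> dict:
        return message                      # parsing/validation would go here

class AcquisitionUnit:
    """Cf. acquisition unit 132: extracts the payload, e.g. sensing data."""
    def acquire(self, message: dict) -> dict:
        return message.get("payload", {})

@dataclass
class ControlUnit:
    """A controller built from functional blocks; each block could equally
    be realized as a hardware block or a circuit block on a semiconductor chip."""
    reception_unit: ReceptionUnit = field(default_factory=ReceptionUnit)
    acquisition_unit: AcquisitionUnit = field(default_factory=AcquisitionUnit)

    def handle(self, message: dict) -> dict:
        received = self.reception_unit.receive(message)
        return self.acquisition_unit.acquire(received)

ctrl = ControlUnit()
result = ctrl.handle({"payload": {"sensor": "camera", "value": 1}})
```

Because the blocks only interact through method calls, any one of them can be swapped for a hardware implementation without changing the composition.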
  • the management apparatus 20 is an information processing apparatus (computer) that manages a wireless network.
  • the management apparatus 20 is an information processing apparatus that manages communication of the base station 30.
  • the management apparatus 20 may be an apparatus constituting the core network CN.
  • the management apparatus 20 may be an apparatus having a function as a Mobility Management Entity (MME).
  • the management apparatus 20 may be an apparatus having a function as an Access and Mobility Management Function (AMF) and/or a Session Management Function (SMF).
  • the MME, the AMF, and the SMF are control plane network function nodes in the core network CN.
  • the management apparatus 20 may be an apparatus having a function as a 6G control plane network function (6G CPNF).
  • the 6G CPNF includes one or a plurality of logical nodes.
  • the functions of the management apparatus 20 are not to be limited to MME, AMF, SMF, or 6G CPNF.
  • the management apparatus 20 may be an apparatus having a function as a Network Slice Selection Function (NSSF), an Authentication Server Function (AUSF), a Policy Control Function (PCF), or Unified Data Management (UDM).
  • the management apparatus 20 may be an apparatus having a function as a Home Subscriber Server (HSS).
  • the management apparatus 20 may have a function of a gateway.
  • the management apparatus 20 may have a function as a Serving Gateway (S-GW) or a Packet Data Network Gateway (P-GW).
  • the management apparatus 20 may have a function as a User Plane Function (UPF).
  • the management apparatus 20 may have a plurality of UPFs.
  • the management apparatus 20 may be an apparatus having a function as a 6G control user plane network function (6G UPNF).
  • the core network CN includes a plurality of network functions. Each network function may be integrated into one physical apparatus or distributed across a plurality of physical apparatuses. That is, the management apparatus 20 can be deployed across a plurality of apparatuses in a distributed arrangement. Furthermore, this distributed arrangement may be controlled dynamically.
  • the core network CN may be constituted with one management apparatus 20 or may be constituted with a plurality of management apparatuses.
  • the base station 30 and the management apparatus 20 constitute one network, and provide a wireless communication service to the terminal apparatus 40.
  • the management apparatus 20 is connected to the Internet, and the terminal apparatus 40 can use various services provided over the Internet via the base station 30.
  • the management apparatus 20 does not necessarily have to be an apparatus constituting the core network CN.
  • in a case where the core network CN is a core network of Wideband Code Division Multiple Access (W-CDMA) or Code Division Multiple Access 2000 (cdma2000), the management apparatus 20 may be an apparatus that functions as a Radio Network Controller (RNC).
  • Fig. 4 is a diagram illustrating a configuration example of the management apparatus 20 according to the present embodiment.
  • the management apparatus 20 includes a communication unit 21, a storage unit 22, and a control unit 23.
  • the configuration illustrated in Fig. 4 is a functional configuration, and the hardware configuration may be different from this.
  • the functions of the management apparatus 20 may be implemented in a statically or dynamically distributed form in a plurality of physically separated configurations.
  • the management apparatus 20 may be constituted with a plurality of server apparatuses.
  • the management apparatus 20 does not necessarily have all the configurations described above or below.
  • the management apparatus 20 may have a configuration other than the configuration described above or below.
  • the management apparatus 20 may include a sensor unit having a configuration similar to that of the sensor unit (sensor unit 34 or sensor unit 46) included in the base station 30 or the terminal apparatus 40.
  • the communication unit 21 is a communication interface for communicating with a wireless communication apparatus (for example, the base station 30).
  • the communication unit 21 may be a network interface or a device connection interface.
  • the communication unit 21 may be a local area network (LAN) interface such as a network interface card (NIC), or may be a universal serial bus (USB) interface including a USB host controller, a USB port, and the like.
  • the communication unit 21 may be a wired interface, or may be a wireless interface.
  • the communication unit 21 is controlled by the control unit 23.
  • the storage unit 22 is a readable/writable storage device such as DRAM, SRAM, a flash drive, or a hard disk.
  • the storage unit 22 stores, for example, a connection state of the terminal apparatus 40.
  • the storage unit 22 stores a Radio Resource Control (RRC) state, an EPS Connection Management (ECM) state, or a 5G System Connection Management (CM) state of the terminal apparatus 40.
  • the storage unit 22 may function as a unit referred to as "home memory" that stores location information of the terminal apparatus 40.
  • the control unit 23 is a controller that controls individual parts of the management apparatus 20.
  • the control unit 23 may be implemented by, for example, a processor such as a CPU or an MPU.
  • the control unit 23 is implemented by the processor executing various programs stored in the storage device inside the management apparatus 20 using RAM or the like as a work area.
  • the control unit 23 may be implemented by an integrated circuit such as an ASIC or an FPGA.
  • the control unit 23 may be implemented by a GPU.
  • the CPU, MPU, ASIC, FPGA, and GPU can all be regarded as controllers.
  • the control unit 23 may include a plurality of physically separated objects.
  • the control unit 23 may include a plurality of semiconductor chips.
  • the control unit 23 includes at least one block of a reception unit 231, an acquisition unit 232, a generation unit 233, a saving unit 234, a request unit 235, an update unit 236, a setting unit 237, and a processing unit 238.
  • the control unit 23 may include a plurality of these blocks, or may include only one block each.
  • the control unit 23 may include at least one of a first acquisition unit, a second acquisition unit, and a third acquisition unit as the acquisition unit 232.
  • the control unit 23 may include at least one of a first request unit and a second request unit as the request unit 235.
  • Individual blocks (reception unit 231 to processing unit 238) constituting the control unit 23 are functional blocks individually indicating functions of the control unit 23.
  • These functional blocks may be software blocks or hardware blocks.
  • each of the functional blocks described above may be one software module realized by software (including a microprogram) or one circuit block on a semiconductor chip (die). Needless to say, each of the functional blocks may be formed as one processor or one integrated circuit.
  • the control unit 23 may be configured in a functional unit different from the above-described functional block.
  • the functional block may be configured by using any method.
  • the operation of the control unit 23 may be the same as the operation of the control unit (control unit 13, control unit 33, or control unit 43) included in the server 10, the base station 30, or the terminal apparatus 40, respectively.
  • the base station 30 is a wireless communication apparatus that performs wireless communication with other wireless communication apparatuses (for example, the terminal apparatus 40 or another base station 30).
  • the base station 30 may wirelessly communicate with the terminal apparatus 40 via a relay station, or may directly wirelessly communicate with the terminal apparatus 40.
  • the base station 30 is an apparatus corresponding to a radio base station (Base Station, Node B, eNB, gNB, or 6GNB, etc.) or a radio access point.
  • the base station 30 may be a radio relay station.
  • the base station 30 may be an optical link apparatus referred to as a Remote Radio Head (RRH).
  • the base station 30 may be a receiving station such as a Field Pickup Unit (FPU).
  • the base station 30 may be an Integrated Access and Backhaul (IAB) donor node or an IAB relay node that provides a radio access channel and a radio backhaul channel by using time division multiplexing, frequency division multiplexing, or space division multiplexing.
  • the radio access technology used by the base station 30 may be a cellular communication technology.
  • the radio access technology used by the base station 30 may be a wireless LAN technology.
  • the radio access technology used by the base station 30 may be a low power wide area (LPWA) communication technology.
  • the radio communication used by the base station 30 may be radio communication using a millimeter wave or radio communication using a terahertz wave (THz wave).
  • the wireless communication used by the base station 30 may be wireless communication using radio waves or wireless communication (optical wireless communication) using infrared rays or visible light.
  • the base station 30 may be capable of Non-Orthogonal Multiple Access (NOMA) communication with the terminal apparatus 40.
  • NOMA communication refers to communication (transmission, reception, or both) using non-orthogonal resources.
  • the base station 30 may be capable of performing NOMA communication with another base station 30.
  • the base station 30 may be capable of performing mutual communication with the core network via a base station-core network interface (for example, NG Interface, S1 Interface, or the like). This interface may be implemented as a wired or wireless interface. Furthermore, the base station may be capable of performing mutual communication with another base station via an inter-base station interface (for example, Xn Interface, X2 Interface, F1 Interface, or the like). This interface may also be implemented as a wired or wireless interface.
  • the base station (also referred to as a base station apparatus) conceptually includes not only a donor base station but also a relay base station (also referred to as a relay station).
  • the relay base station may be any one of RF Repeater, Smart Repeater, and Intelligent Surface.
  • the base station may conceptually include a road-side unit (RSU).
  • a base station may conceptually include not only a structure having a function of a base station but also an apparatus installed in the structure.
  • Examples of the structure include a building such as a high-rise building, a house, a steel tower, a station facility, an airport facility, a harbor facility, an office building, a school building, a hospital, a factory, a commercial facility, or a stadium.
  • the structure conceptually includes not only buildings but also non-building structures such as tunnels, bridges, dams, fences, and steel columns, as well as facilities such as cranes, gates, and windmills.
  • the structure conceptually includes not only land-based (ground-based, in a narrow sense) structures or underground structures but also structures on the water, such as a jetty or a mega-float, and underwater structures such as an ocean observation facility.
  • the base station can also be rephrased as an information processing apparatus.
  • the base station 30 may be a donor station or a relay station.
  • the base station 30 may be a fixed station or a mobile station.
  • the mobile station is a wireless communication apparatus (for example, a base station) configured to be movable.
  • the base station 30 may be an apparatus installed on a mobile body, or may be a mobile body itself.
  • a relay station having mobility can be regarded as the base station 30 as a mobile station.
  • an apparatus designed to have mobility, such as an Unmanned Aerial Vehicle (UAV) typified by a drone or a smartphone, that has a function of a base station (at least a part of the functions of a base station) also corresponds to the base station 30 as a mobile station.
  • the mobile body may be a mobile terminal such as a smartphone or a mobile phone.
  • the mobile body may be a mobile body that moves on the land (ground in a narrow sense) (for example, a vehicle such as an automobile, a motorcycle, a bus, a truck, a motorbike, a train, or a linear motor car), or a mobile body (for example, subway) that moves under the ground (for example, through a tunnel).
  • the mobile body may be a mobile body that moves on the water (for example, a ship such as a passenger ship, a cargo ship, or a hovercraft), or a mobile body that moves underwater (for example, a submersible ship such as a submersible boat, a submarine, or an unmanned submarine).
  • the mobile body may be a mobile body that moves in the atmosphere (for example, an aircraft such as an airplane, an airship, or a drone).
  • the base station 30 may be a terrestrial base station (terrestrial station) installed on the ground.
  • the base station 30 may be a base station disposed on a structure on the ground, or may be a base station installed in a mobile body moving on the ground.
  • the base station 30 may be an antenna installed in a structure such as a building and a signal processing apparatus connected to the antenna.
  • the base station 30 may be a structure or a mobile body itself.
  • the "ground" represents not only a land (ground in a narrow sense) but also a terrestrial location in a broad sense including underground, above-water, and underwater.
  • the base station 30 is not limited to a terrestrial base station.
  • the base station 30 may be an aircraft station. From the perspective of a satellite station, an aircraft station located on the earth is a terrestrial station.
  • the base station 30 is not limited to a terrestrial station.
  • the base station 30 may be a non-terrestrial base station (non-terrestrial station) capable of floating in the air or space.
  • the base station 30 may be an aircraft station or a satellite station.
  • the satellite station is a satellite station capable of floating outside the atmosphere.
  • the satellite station may be an apparatus mounted on a space mobile body such as an artificial satellite, or may be a space mobile body itself.
  • a space mobile body is a mobile body that moves outside the atmosphere.
  • the space mobile body may be at least one of a satellite, a spacecraft, a space station, and a probe. Needless to say, the space mobile body may be an artificial celestial body other than these.
  • the satellite serving as the satellite station may be any of a low earth orbiting (LEO) satellite, a medium earth orbiting (MEO) satellite, a geostationary earth orbiting (GEO) satellite, or a highly elliptical orbiting (HEO) satellite.
  • the satellite station may be an apparatus mounted on a low earth orbiting satellite, a medium earth orbiting satellite, a geostationary earth orbiting satellite, or a highly elliptical orbiting satellite.
  • the aircraft station is a wireless communication apparatus capable of floating in the atmosphere, such as an aircraft.
  • the aircraft station may be an apparatus mounted on an aircraft or the like, or may be an aircraft itself.
  • the aircraft conceptually includes not only heavy aircraft such as an airplane or a glider but also light aircraft such as a balloon or an airship.
  • the aircraft conceptually includes not only a heavy aircraft or a light aircraft but also a rotorcraft such as a helicopter or an auto-gyro.
  • the aircraft station or an aircraft equipped with an aircraft station may be an unmanned aerial vehicle such as a drone.
  • the unmanned aerial vehicle conceptually includes an unmanned aircraft system (UAS) and a tethered UAS.
  • the unmanned aircraft conceptually includes also a Lighter-than-Air (LTA) unmanned aircraft system (UAS) and a Heavier-than-Air (HTA) unmanned aircraft system (UAS).
  • the unmanned aircraft conceptually includes also High Altitude unmanned aircraft system (UAS) platforms (HAPs).
  • the coverage of the base station 30 may be relatively large such as a macro cell or relatively small such as a pico cell.
  • the coverage of the base station 30 may be extremely small such as a femto cell.
  • the base station 30 may have a beamforming function. In this case, the base station 30 may form a cell or a service area for each beam.
  • the base station 30 may have a function of delivering a desired wave to a predetermined pinpoint target position by additionally considering distance information from an antenna of the base station 30, in addition to beamforming that provides directivity to a beam. This function may be referred to as beam focusing or point forming. Furthermore, the base station 30 may be configured to acquire detection data by performing sensing using a beam.
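The difference between conventional beamforming (direction only) and beam focusing or point forming (direction plus distance) can be illustrated numerically for a uniform linear array: weights matched to the exact per-element distances concentrate energy at a near-field point, while direction-only weights lose gain there. The array geometry, carrier frequency, and target position below are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

# Illustrative parameters (not from the disclosure).
c = 3e8                                  # speed of light [m/s]
fc = 28e9                                # carrier frequency [Hz]
lam = c / fc                             # wavelength [m]
N = 16                                   # number of antenna elements
d = lam / 2                              # element spacing [m]
x = (np.arange(N) - (N - 1) / 2) * d     # element positions along the array

def farfield_weights(theta):
    """Conventional beamforming: phase depends only on the direction theta."""
    return np.exp(-1j * 2 * np.pi / lam * x * np.sin(theta)) / np.sqrt(N)

def focusing_weights(px, pz):
    """Beam focusing / point forming: phase depends on the exact distance
    from each element to the target point (px, pz)."""
    dist = np.sqrt((px - x) ** 2 + pz ** 2)
    return np.exp(1j * 2 * np.pi / lam * dist) / np.sqrt(N)

def gain_at(w, px, pz):
    """Coherent array gain obtained at point (px, pz) with weights w."""
    dist = np.sqrt((px - x) ** 2 + pz ** 2)
    return abs(w @ np.exp(-1j * 2 * np.pi / lam * dist))

px, pz = 0.05, 0.3          # target 30 cm away: within this array's near field
theta = np.arctan2(px, pz)  # direction of the target seen from the array
g_focus = gain_at(focusing_weights(px, pz), px, pz)
g_far = gain_at(farfield_weights(theta), px, pz)
```

With distance taken into account, the full coherent gain of sqrt(N) is reached at the target point, whereas the direction-only weights fall short because of the residual near-field phase curvature.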
  • Fig. 5 is a diagram illustrating a configuration example of the base station 30 according to the present embodiment.
  • the base station 30 includes a wireless communication unit 31, a storage unit 32, a control unit 33, and a sensor unit 34.
  • the configuration illustrated in Fig. 5 is a functional configuration, and the hardware configuration may be different from this.
  • the functions of the base station 30 may be implemented in a distributed form in a plurality of physically separated configurations.
  • the base station 30 does not necessarily include all the configurations described above or below.
  • the base station 30 does not have to include the sensor unit 34.
  • the base station 30 may have a configuration other than the configuration described above or below.
  • the wireless communication unit 31 is a signal processing unit for performing wireless communication with other wireless communication apparatuses (for example, at least one of the terminal apparatus 40 and another base station 30).
  • the wireless communication unit 31 may be referred to as a radio transceiver or simply a transceiver.
  • the wireless communication unit 31 may be a transceiver conforming to a specification defined in a Technical Specification (TS) of the 3rd Generation Partnership Project (3GPP) (hereinafter referred to as a 3GPP transceiver).
  • the 3GPP transceiver may be a 3G transceiver, a 4G (LTE) transceiver, a 5G (NR) transceiver, or a transceiver of 5G or later generation.
  • the wireless communication unit 31 is controlled by the control unit 33.
  • the wireless communication unit 31 may support one or a plurality of radio access methods.
  • the wireless communication unit 31 may support at least one of NR, LTE, Beyond 5G (B5G), and 6G.
  • the wireless communication unit 31 may support W-CDMA, cdma2000, and the like in addition to NR, LTE, B5G, and 6G.
  • the wireless communication unit 31 may support an automatic retransmission technology such as Hybrid Automatic Repeat reQuest (HARQ). A part or all of the processing executed by the wireless communication unit 31 may be executed by the control unit 33.
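The HARQ-style automatic retransmission mentioned above can be sketched with a toy stop-and-wait simulation using chase combining, in which each retransmission of the same BPSK symbols is added into a soft buffer, raising the effective SNR. This is a didactic sketch, not the 3GPP-defined HARQ procedure; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def harq_chase(bits, snr_db, max_tx=4):
    """Stop-and-wait HARQ with chase combining on BPSK symbols.
    Every retransmission carries the same symbols; the receiver adds them
    into a soft buffer and takes a hard decision on the combined values
    (no FEC here, purely for illustration)."""
    tx = 1.0 - 2.0 * bits                          # BPSK: 0 -> +1, 1 -> -1
    sigma = 10.0 ** (-snr_db / 20.0)               # noise std for this SNR
    soft = np.zeros_like(tx)
    for attempt in range(1, max_tx + 1):
        soft += tx + sigma * rng.standard_normal(tx.shape)  # combine copies
        decoded = (soft < 0).astype(int)
        if np.array_equal(decoded, bits):          # stand-in for a CRC pass
            return attempt, True                   # "ACK"
    return max_tx, False                           # "NACK" after max_tx tries

bits = rng.integers(0, 2, size=64)
attempts, ok = harq_chase(bits, snr_db=10.0)
```

In a real system the combining is done on coded soft values (incremental redundancy is also possible), and the ACK/NACK is derived from a CRC check after channel decoding.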
  • the wireless communication unit 31 includes a transmission processing unit 311, a reception processing unit 312, and an antenna 313. Alternatively, at least one of the transmission processing unit 311, the reception processing unit 312, and the antenna 313 may be regarded as the wireless communication unit 31.
  • the wireless communication unit 31 may include a plurality of the transmission processing units 311, a plurality of the reception processing units 312, and a plurality of the antennas 313. In a case where the wireless communication unit 31 supports a plurality of radio access methods, individual portions of the wireless communication unit 31 can be configured separately for each of the radio access methods.
  • the transmission processing unit 311 and the reception processing unit 312 may be configured separately for LTE, NR, B5G, and 6G.
  • the antenna 313 may include a plurality of antenna elements, for example, a plurality of patch antennas.
  • the wireless communication unit 31 may have a beamforming function.
  • the wireless communication unit 31 may have a polarization beamforming function using vertically polarized waves (V-polarized waves) and horizontally polarized waves (H-polarized waves) (or may have a polarization beamforming function using dual polarization in polarization directions of 45 degrees and -45 degrees with the vertical direction).
  • the transmission processing unit 311 performs transmission processing of downlink control information and downlink data.
  • the transmission processing unit 311 codes the downlink control information and the downlink data input from the control unit 33 by using a coding method such as block coding, convolutional coding, or turbo coding.
  • the coder may perform coding using a polar code or a Low Density Parity Check (LDPC) code.
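As one concrete instance of the convolutional coding option mentioned above, a rate-1/2, constraint-length-3 encoder can be sketched as follows. The textbook generator pair (7, 5) in octal is an illustrative choice, not taken from the disclosure.

```python
def conv_encode(bits, g=(0b111, 0b101)):
    """Rate-1/2 convolutional encoder with constraint length 3.
    The default generators are (7, 5) in octal; the newest bit occupies
    the most significant position of the 3-bit register."""
    state = 0                                    # two memory bits
    out = []
    for b in bits:
        reg = (b << 2) | state                   # newest bit in the MSB
        for poly in g:
            out.append(bin(reg & poly).count("1") % 2)   # parity of the taps
        state = reg >> 1                         # shift the register
    return out

coded = conv_encode([1, 0, 1, 1])
```

Each input bit yields two coded bits, so the output is twice as long as the input; a Viterbi decoder would recover the input from the coded stream.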
  • the transmission processing unit 311 modulates the coded bits by a predetermined modulation method (for example, BPSK, QPSK, 16QAM, 64QAM, 256QAM, or a higher order multi-level modulation scheme). In this case, the signal points on the constellation do not necessarily have to be equidistant.
  • the constellation may be a non-uniform constellation (NUC).
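The modulation step described above can be illustrated with a minimal Gray-mapped QPSK modulator and hard-decision demodulator. The mapping is a common textbook convention, not the exact 3GPP constellation tables, and it assumes a uniform (equidistant) constellation rather than a NUC.

```python
import numpy as np

def qpsk_mod(bits):
    """Map bit pairs to unit-energy QPSK symbols (Gray mapping):
    the first bit selects the sign of I, the second the sign of Q."""
    b = np.asarray(bits).reshape(-1, 2)
    i = 1 - 2 * b[:, 0]          # 0 -> +1, 1 -> -1
    q = 1 - 2 * b[:, 1]
    return (i + 1j * q) / np.sqrt(2)

def qpsk_demod(symbols):
    """Hard-decision demapping: the signs of I and Q recover the bit pair."""
    s = np.asarray(symbols)
    out = np.empty((s.size, 2), dtype=int)
    out[:, 0] = (s.real < 0)
    out[:, 1] = (s.imag < 0)
    return out.reshape(-1)

bits = np.array([0, 0, 0, 1, 1, 0, 1, 1])
syms = qpsk_mod(bits)
recovered = qpsk_demod(syms)
```

A practical receiver would compute soft log-likelihood ratios instead of hard decisions before feeding the channel decoder.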
  • the transmission processing unit 311 multiplexes the modulation symbols of the individual channels with the downlink reference signal and maps the multiplexed signal onto predetermined resource elements. Subsequently, the transmission processing unit 311 performs various types of signal processing on the multiplexed signal. For example, the transmission processing unit 311 performs processing such as conversion to the frequency domain using fast Fourier transform, addition of a guard interval (cyclic prefix), generation of a baseband digital signal, conversion to an analog signal, quadrature modulation, up-conversion, removal of extra frequency components, and power amplification.
  • the signal generated by the transmission processing unit 311 is transmitted from the antenna 313.
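The chain above (multiplexing onto resource elements, transform, guard-interval addition) can be reduced to a minimal OFDM-style baseband sketch. The FFT size and cyclic-prefix length are illustrative, and the conventional inverse-FFT realization is used here to generate the time-domain signal.

```python
import numpy as np

N_FFT = 64        # illustrative FFT size
N_CP = 16         # illustrative guard-interval (cyclic prefix) length

def ofdm_symbol(resource_elements):
    """Convert one column of frequency-domain resource elements into a
    time-domain baseband signal and prepend the guard interval."""
    assert len(resource_elements) == N_FFT
    time = np.fft.ifft(resource_elements)
    return np.concatenate([time[-N_CP:], time])   # cyclic prefix + body

# Fill the resource elements with illustrative QPSK symbols.
rng = np.random.default_rng(1)
grid = ((1 - 2 * rng.integers(0, 2, N_FFT))
        + 1j * (1 - 2 * rng.integers(0, 2, N_FFT))) / np.sqrt(2)
tx = ofdm_symbol(grid)
```

Copying the tail of the symbol to its front makes the channel's linear convolution look circular at the receiver, which is what lets a single FFT undo the channel per subcarrier.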
  • the reception processing unit 312 processes an uplink signal received via the antenna 313. For example, the reception processing unit 312 performs processing on the uplink signal, such as down-conversion, removal of unnecessary frequency components, amplification level control, orthogonal demodulation, conversion to a digital signal, removal of the guard interval (cyclic prefix), and frequency-domain signal extraction using fast Fourier transform.
  • Subsequently, the reception processing unit 312 demultiplexes an uplink channel such as a physical uplink shared channel (PUSCH) or a physical uplink control channel (PUCCH) and an uplink reference signal from the signal that has undergone these processing procedures.
  • the reception processing unit 312 demodulates a received signal using a modulation scheme such as binary phase shift keying (BPSK) or quadrature phase shift keying (QPSK) for the modulation symbol of the uplink channel.
  • the modulation scheme used in the demodulation may be 16 quadrature amplitude modulation (QAM), 64 QAM, or 256 QAM. In this case, the signal points on the constellation do not necessarily have to be equidistant.
  • the constellation may be a non-uniform constellation (NUC).
  • the reception processing unit 312 performs decoding processing on the coded bits of the demodulated uplink channel.
  • the decoded uplink data and uplink control information are output to the control unit 33.
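The reception chain described above (guard-interval removal, return to the frequency domain, demodulation) can be sketched as a noiseless round trip through a single OFDM symbol. Dimensions and the QPSK mapping are illustrative assumptions.

```python
import numpy as np

N_FFT, N_CP = 64, 16     # illustrative dimensions

# Transmit side: build one QPSK-loaded OFDM symbol for the round trip.
rng = np.random.default_rng(2)
bits = rng.integers(0, 2, 2 * N_FFT)
b = bits.reshape(-1, 2)
freq = ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)
time = np.fft.ifft(freq)
rx = np.concatenate([time[-N_CP:], time])      # symbol with guard interval

# Receive side: remove the guard interval, return to the frequency
# domain with an FFT, and hard-demodulate the QPSK symbols.
no_cp = rx[N_CP:]
eq = np.fft.fft(no_cp)
demod = np.empty((N_FFT, 2), dtype=int)
demod[:, 0] = eq.real < 0
demod[:, 1] = eq.imag < 0
recovered = demod.reshape(-1)
```

With a real channel, a per-subcarrier equalization step would sit between the FFT and the demapper, using the demultiplexed reference signal for channel estimation.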
  • the antenna 313 is an antenna apparatus that performs mutual conversion of a current and a radio wave.
  • the antenna 313 may be configured by one antenna element, for example, one patch antenna.
  • the antenna 313 may include a plurality of antenna elements (for example, a plurality of patch antennas).
  • the wireless communication unit 31 may have a beamforming function.
  • the wireless communication unit 31 may control the directivity of a radio signal using a plurality of antenna elements to generate a directional beam.
  • the antenna 313 may be a dual polarized antenna.
  • the wireless communication unit 31 may use, in radio signal transmission, vertically polarized waves (V-polarized waves) and horizontally polarized waves (H-polarized waves) (or dual polarized waves in polarization direction at 45 degrees and -45 degrees with the vertical direction).
  • the wireless communication unit 31 may control directivity of a radio signal transmitted using vertically polarized waves and horizontally polarized waves (or dual polarization in polarization direction at 45 degrees and -45 degrees with vertical direction).
  • the wireless communication unit 31 may transmit and receive spatially multiplexed signals via a plurality of layers including a plurality of antenna elements.
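Controlling the directivity of a radio signal with a plurality of antenna elements, as described above, amounts to applying a phase weight to each element. The sketch below computes such weights for a uniform linear array; the geometry, function name, and sign convention are illustrative assumptions, not a description of the wireless communication unit 31.

```python
import cmath
import math

def ula_steering_weights(n_elements: int, spacing_wl: float, angle_deg: float) -> list:
    """Per-element phase weights that steer a uniform linear array toward
    angle_deg (0 degrees = broadside); spacing_wl is the element spacing
    in wavelengths (typically 0.5)."""
    theta = math.radians(angle_deg)
    return [cmath.exp(-2j * math.pi * spacing_wl * n * math.sin(theta))
            for n in range(n_elements)]

broadside = ula_steering_weights(4, 0.5, 0.0)   # all weights equal: no steering
steered = ula_steering_weights(4, 0.5, 30.0)    # progressive phase shift
```

Each weight has unit magnitude: only the phase differs across elements, which is what tilts the directional beam.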
  • the storage unit 32 is a readable/writable storage device such as DRAM, SRAM, a flash drive, or a hard disk.
  • the control unit 33 is a controller that controls individual parts of the base station 30.
  • the control unit 33 controls the wireless communication unit to perform wireless communication with another wireless communication apparatus (for example, the terminal apparatus 40 or another base station 30).
  • the control unit 33 may be implemented by a processor such as a CPU or an MPU. Specifically, the control unit 33 is realized by a processor executing various programs stored in a storage device inside the base station 30 using RAM or the like as a work area.
  • the control unit 33 may be implemented by an integrated circuit such as an ASIC or an FPGA.
  • the control unit 33 may be implemented by a GPU.
  • the CPU, MPU, ASIC, FPGA, and GPU can all be regarded as controllers.
  • the control unit 33 may include a plurality of physically separated objects.
  • the control unit 33 may include a plurality of semiconductor chips.
  • the control unit 33 includes at least one block of a notification unit 331, a reception unit 332, a transmission unit 333, a determination unit 334, and an acquisition unit 335.
  • the control unit 33 may include a plurality of these blocks, or may include only one block each.
  • Individual blocks (the notification unit 331 to acquisition unit 335) constituting the control unit 33 are functional blocks individually indicating functions of the control unit 33.
  • These functional blocks may be software blocks or hardware blocks.
  • each of the functional blocks described above may be one software module realized by software (including a microprogram) or one circuit block on a semiconductor chip (die). Needless to say, each of the functional blocks may be formed as one processor or one integrated circuit.
  • the control unit 33 may be configured in a functional unit different from the above-described functional block.
  • the functional block may be configured by using any method.
  • the operation of the control unit 33 may be the same as the operation of the control unit (control unit 13, control unit 23, or control unit 43) included in the server 10, the management apparatus 20, or the terminal apparatus 40, respectively.
  • the sensor unit 34 includes one or a plurality of sensors that detect various data related to the base station 30.
  • one or a plurality of sensors included in the sensor unit 34 may include a sensor that performs detection regarding the surroundings of the base station 30.
  • one or a plurality of sensors included in the sensor unit 34 may include at least one of a geomagnetic sensor, an illuminance sensor, a distance measuring sensor (for example, a time of flight (ToF) sensor), an atmospheric pressure sensor, a temperature sensor, an optical sensor, a sound sensor, and an image sensor.
  • the sensor unit 34 (or one or a plurality of sensors included in the sensor unit 34) may be configured to perform sensing using the above-described beamforming function and acquire detection data.
  • the sensor included in the sensor unit 34 is not limited to a sensor that performs detection regarding the surroundings of the base station 30.
  • the one or the plurality of sensors included in the sensor unit 34 may include a sensor that performs detection regarding the location or the attitude of the base station 30.
  • one or a plurality of sensors included in the sensor unit 34 may include an acceleration sensor and/or a gyro sensor.
  • one or a plurality of sensors included in the sensor unit 34 may include a six degrees of freedom (6DoF) sensor or a three degrees of freedom (3DoF) sensor.
  • one or a plurality of sensors included in the sensor unit 34 may include a positioning sensor such as a global navigation satellite system (GNSS) sensor.
  • the GNSS sensor may be a global positioning system (GPS) sensor, a GLONASS sensor, a Galileo sensor, or a quasi-zenith satellite system (QZSS) sensor.
  • one or a plurality of sensors included in the sensor unit 34 may include a device/component using the sensor.
  • one or a plurality of sensors included in the sensor unit 34 may include at least one of a camera (for example, a visible light camera, an infrared camera, a light field camera, or the like), Light Detection and Ranging (LiDAR), a radar (for example, a microwave radar, a millimeter wave radar, or the like), a microphone, and an image device.
  • the image device is a device including one or a plurality of sensors. Devices/components configured using sensors can also be considered a type of sensor.
  • one or a plurality of sensing functions implemented by devices/components included in the base station 30 may be regarded as one or a plurality of sensors included in the base station 30.
  • one or a plurality of sensing functions of the wireless communication unit 31 may be regarded as one or a plurality of sensors included in the base station 30.
  • one or a plurality of sensing functions of the wireless communication unit 31 may include a radio frequency (RF)-based sensing function (for example, an RF-based sensing function supported by a 3GPP transceiver).
  • the wireless communication unit 31 (for example, a 3GPP transceiver) may be regarded as the sensor unit 34 (or a sensor included in the sensor unit 34).
  • sensors may be divided into physical sensors and logical sensors. That is, the sensor described above or below may indicate a physical sensor or a logical sensor.
  • the physical sensor may be at least one of the examples of sensors described above or below.
  • the physical sensor may be at least one of a geomagnetic sensor, an illuminance sensor, a distance measuring sensor (for example, a time of flight (ToF) sensor), an atmospheric pressure sensor, a temperature sensor, an optical sensor, a sound sensor, an image sensor, an acceleration sensor, a gyro sensor, a six degrees of freedom (6DoF) sensor, a three degrees of freedom (3DoF) sensor, a positioning sensor (for example, a global navigation satellite system (GNSS) sensor such as a global positioning system (GPS) sensor, a GLONASS sensor, a Galileo sensor, or a Quasi-Zenith Satellite System (QZSS) sensor), an Inertial Measurement Unit (IMU), a camera (for example, a visible light camera, an infrared camera, a light field camera, or the like), LiDAR (Light Detection And Ranging), a radar (for example, a microwave radar, a millimeter wave radar, or the like), a microphone, or an image device.
  • the logical sensor may be an entity related to a sensor defined in a standard (for example, 3GPP Technical Standard).
  • One logical sensor may be associated with one or more physical sensors (including a plurality of sensors of the same type and a plurality of sensors of different types). Additionally or alternatively, a plurality of logical sensors may be associated with a plurality of physical sensors.
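The physical/logical sensor distinction above can be pictured as a simple association: one logical sensor exposes detection data gathered from one or more physical sensors. The classes and names below are hypothetical illustrations and do not correspond to any 3GPP-defined entity.

```python
class PhysicalSensor:
    """A concrete detecting device, e.g. an accelerometer or a gyro sensor."""
    def __init__(self, sensor_type: str, reading: float):
        self.sensor_type = sensor_type
        self._reading = reading

    def read(self) -> float:
        return self._reading

class LogicalSensor:
    """A standard-level entity associated with one or more physical sensors
    (possibly of the same type, possibly of different types)."""
    def __init__(self, name: str, physical_sensors: list):
        self.name = name
        self.physical_sensors = physical_sensors

    def detect(self) -> dict:
        # Aggregate detection data from all associated physical sensors.
        return {p.sensor_type: p.read() for p in self.physical_sensors}

motion = LogicalSensor("motion", [
    PhysicalSensor("accelerometer", 9.8),
    PhysicalSensor("gyroscope", 0.02),
])
data = motion.detect()
```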
  • the base station 30 may be configured by a set of a plurality of physical or logical apparatuses.
  • the base station 30 in the present embodiment may be classified into a plurality of apparatuses such as a Baseband Unit (BBU) and a Radio Unit (RU).
  • the base station 30 may be construed as a set of the plurality of apparatuses.
  • the base station may be either one or both of the BBU and the RU.
  • the BBU and the RU may be connected to each other via a predetermined interface (for example, an enhanced Common Public Radio Interface (eCPRI)).
  • the RU may be referred to as a Remote Radio Unit (RRU) or a Radio DoT (RD).
  • the RU may support a gNB Distributed Unit (gNB-DU) described below.
  • the BBU may support a gNB Central Unit (gNB-CU) described below.
  • the RU may be an apparatus integrally formed with an antenna.
  • An antenna of the base station 30, for example, an antenna integrally formed with an RU, may employ an Advanced Antenna System and support MIMO (for example, FD-MIMO) or beamforming.
  • the antenna of the base station 30 may include 64 transmitting antenna ports and 64 receiving antenna ports.
  • the antenna mounted on the RU may be an antenna panel including one or more antenna elements, and the RU may include one or more antenna panels.
  • the RU may be equipped with two types of antenna panels: a horizontally polarized antenna panel and a vertically polarized antenna panel.
  • the RU may be equipped with two types of antenna panels, that is, a right-handed circularly polarized antenna panel and a left-handed circularly polarized antenna panel, or an antenna panel with a polarization direction of 45 degrees with the vertical direction and an antenna panel with a polarization direction of -45 degrees with the vertical direction.
  • a plurality of antennas having the plurality of polarization directions may be mounted on one antenna panel.
  • the RU may form and control an independent beam for each antenna panel.
  • the plurality of base stations 30 may be connected to each other.
  • One or the plurality of base stations 30 may be included in a Radio Access Network (RAN). That is, the base station 30 may be simply referred to as a RAN, a RAN node, an Access Network (AN), an AN node, or the like.
  • RAN in LTE is sometimes referred to as Enhanced Universal Terrestrial RAN (EUTRAN).
  • RAN in NR may be referred to as NGRAN.
  • RAN in 6G may be referred to as 6GRAN.
  • RAN in W-CDMA (UMTS) may be referred to as UTRAN.
  • the base station 30 in LTE may be referred to as Evolved Node B (eNodeB) or eNB. That is, EUTRAN includes one or a plurality of eNodeB (eNB). NR base stations 30 may be referred to as gNodeB or gNB. At this time, NGRAN contains one or a plurality of gNBs. A 6G base station may be referred to as a 6GNodeB, a 6gNodeB, a 6GNB, or a 6gNB. At this time, 6GRAN contains one or a plurality of 6GNBs. EUTRAN may include gNB (en-gNB) connected to the core network (EPC) in LTE communication systems (EPS). NGRAN may include an ng-eNB connected to the core network 5GC in a 5G communication system (5GS).
  • When the base station 30 is an eNB, a gNB, a 6GNB, or the like, the base station 30 may be referred to as 3GPP access. When the base station 30 is a radio access point, the base station 30 may be referred to as non-3GPP access.
  • the base station 30 may be an optical link apparatus referred to as a Remote Radio Head (RRH).
  • the base station 30 may be a combination of the gNB-CU and the gNB-DU described above, or may be any of the gNB-CU and the gNB-DU.
  • the gNB-CU hosts a plurality of upper layers (for example, Radio Resource Control (RRC), Service Data Adaptation Protocol (SDAP), and Packet Data Convergence Protocol (PDCP)) in an access stratum.
  • the gNB-DU hosts a plurality of lower layers (for example, Radio Link Control (RLC), Medium Access Control (MAC), and Physical Layer (PHY)) in an access stratum. That is, among messages/information to be described below, RRC signaling (semi-static notification) may be generated by the gNB-CU, while MAC CE and DCI (dynamic notification) may be generated by the gNB-DU.
  • some configurations such as IE: cellGroupConfig may be generated by the gNB-DU, while the remaining configurations may be generated by the gNB-CU, for example. These configurations may be transmitted and received through an F1 interface described below.
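The CU/DU functional split described above can be summarized as a layer-to-unit mapping. The sketch below is only a mnemonic for which unit hosts (and therefore generates signaling for) each access-stratum layer; the function name is illustrative.

```python
# gNB-CU hosts the upper access-stratum layers, gNB-DU the lower ones.
CU_LAYERS = {"RRC", "SDAP", "PDCP"}
DU_LAYERS = {"RLC", "MAC", "PHY"}

def hosting_unit(layer: str) -> str:
    """Return which unit hosts the given access-stratum layer."""
    if layer in CU_LAYERS:
        return "gNB-CU"   # e.g. RRC signaling (semi-static notification)
    if layer in DU_LAYERS:
        return "gNB-DU"   # e.g. MAC CE and DCI (dynamic notification)
    raise ValueError(f"unknown layer: {layer}")
```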
  • the base station 30 may be configured to be able to communicate with another base station.
  • When the plurality of base stations 30 are eNBs, these base stations 30 may be connected by an X2 interface.
  • When the plurality of base stations 30 are gNBs or a combination of an ng-eNB and a gNB, these base stations 30 may be connected by an Xn interface.
  • When the plurality of base stations 30 are a combination of a gNB-CU and a gNB-DU, these base stations 30 may be interconnected by the F1 interface described above.
  • a message/information (for example, RRC signaling, MAC control element (MAC CE), Downlink Control Information (DCI), or the like) to be described below may be transmitted among the plurality of base stations 30 via these inter-base station interfaces (for example, an X2 interface, an Xn interface, an F1 interface, or the like).
  • the cell provided by the base station 30 may be referred to as a serving cell.
  • the serving cell conceptually includes a primary cell (PCell) and a secondary cell (SCell).
  • the PCell provided by a Master Node (MN) and zero, one, or more SCells may be referred to as a Master Cell Group.
  • the dual connectivity may be at least one of the EUTRA-EUTRA dual connectivity, the EUTRA-NR dual connectivity (ENDC), the EUTRA-NR dual connectivity with 5GC, the NR-EUTRA dual connectivity (NEDC), the NR-NR dual connectivity, the NR-6G dual connectivity, and the 6G-NR dual connectivity. Needless to say, the dual connectivity is not limited thereto.
  • the serving cell may include a Primary Secondary Cell or Primary SCG (Secondary Cell Group) Cell (PSCell).
  • the radio link failure is also detected by the PCell and the PSCell, but is not detected by the SCell (need not be detected). In this manner, since the PCell and the PSCell have a special role in the serving cell, these cells are also referred to as Special Cells (SpCells).
  • One cell may be associated with one downlink component carrier and one uplink component carrier.
  • the system bandwidth corresponding to one cell may be divided into a plurality of bandwidth parts (BWPs).
  • one or a plurality of BWPs may be configured for the terminal apparatus 40, and one BWP may be used for the terminal apparatus 40 as an active BWP.
  • Radio resources (for example, a frequency band, a numerology (subcarrier spacing), and a slot format (slot configuration)) usable by the terminal apparatus 40 may be different for each cell, each component carrier, or each BWP.
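The BWP concept above can be sketched as configuration data: several BWPs are configured for a terminal, and exactly one of them is active at a time. The field and class names below are illustrative assumptions, not 3GPP information elements.

```python
from dataclasses import dataclass

@dataclass
class BWP:
    bwp_id: int
    start_rb: int                 # first resource block of this BWP
    num_rbs: int                  # bandwidth in resource blocks
    subcarrier_spacing_khz: int   # numerology may differ per BWP

class TerminalBWPConfig:
    """One or a plurality of BWPs configured for a terminal; one is active."""
    def __init__(self, configured: list):
        self.configured = {b.bwp_id: b for b in configured}
        self.active_id = None

    def activate(self, bwp_id: int) -> BWP:
        if bwp_id not in self.configured:
            raise ValueError("BWP not configured for this terminal")
        self.active_id = bwp_id
        return self.configured[bwp_id]

cfg = TerminalBWPConfig([
    BWP(bwp_id=0, start_rb=0, num_rbs=24, subcarrier_spacing_khz=15),
    BWP(bwp_id=1, start_rb=24, num_rbs=48, subcarrier_spacing_khz=30),
])
active = cfg.activate(1)   # this BWP is now used as the active BWP
```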
  • the terminal apparatus 40 is a wireless communication apparatus that performs wireless communication with another wireless communication apparatus (for example, the base station 30 or another terminal apparatus 40).
  • the terminal apparatus 40 can be referred to as user equipment (UE) 40.
  • the terminal apparatus 40 can be implemented by employing any form of information processing apparatus (computer).
  • the terminal apparatus 40 may be a mobile terminal such as a mobile phone, a smart device (smartphone or tablet), a personal digital assistant (PDA), or a laptop PC.
  • the terminal apparatus 40 may be a communication module that is connected to an information processing apparatus (for example, an imaging apparatus with no wireless communication function) and provides the information processing apparatus with a wireless communication function.
  • the terminal apparatus 40 may be an imaging apparatus (for example, a camcorder) having a wireless communication function.
  • the terminal apparatus 40 may be a motorcycle, a moving relay vehicle, or the like, equipped with a communication device such as a field pickup unit (FPU).
  • the terminal apparatus 40 may be a machine to machine (M2M) device or an Internet of Things (IoT) device.
  • the terminal apparatus 40 may be a wearable device such as a smart watch.
  • the terminal apparatus 40 may be an Extended Reality (XR) device such as an Augmented Reality (AR) device, a Virtual Reality (VR) device, or a Mixed Reality (MR) device.
  • the XR device may be an eyeglass-type device such as AR glasses or MR glasses, or may be a head-mounted device such as a VR head-mounted display.
  • the terminal apparatus 40 may be a standalone device including only a portion worn on the user (for example, the eyeglass portion).
  • the terminal apparatus 40 may be a terminal-linked device including the portion worn on the user (for example, the eyeglass portion) and a terminal portion (for example, a smart device) linked with the portion worn on the user.
  • the terminal apparatus 40 may be capable of performing NOMA communication with the base station 30.
  • the terminal apparatus 40 may be able to use an automatic retransmission technology such as HARQ when communicating with the base station 30.
  • the terminal apparatus 40 may be capable of sidelink communication with another terminal apparatus 40.
  • the terminal apparatus 40 may be capable of using an automatic retransmission technology such as HARQ when performing sidelink communication.
  • the terminal apparatus 40 may be able to perform NOMA communication when performing sidelink communication with another terminal apparatus 40.
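The automatic retransmission idea mentioned above (HARQ and similar schemes) can be reduced to a toy stop-and-wait loop: retransmit until acknowledged or until a retry budget is exhausted. This is a deliberately simplified model, not the 3GPP HARQ procedure (which also combines soft information across attempts).

```python
def transmit_with_retx(send, max_attempts: int = 4):
    """send() models one transmission attempt and returns True on ACK,
    False on NACK. Returns the number of transmissions used, or None if
    delivery failed within max_attempts."""
    for attempt in range(1, max_attempts + 1):
        if send():
            return attempt
    return None

# Example: the first two transmissions fail, the third is acknowledged.
outcomes = iter([False, False, True])
attempts_used = transmit_with_retx(lambda: next(outcomes))
```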
  • the terminal apparatus 40 may be able to perform LPWA communication with another wireless communication apparatus such as the base station 30.
  • the wireless communication used by the terminal apparatus 40 may be wireless communication using millimeter waves.
  • the wireless communication (including sidelink communication) used by the terminal apparatus 40 may be wireless communication using radio waves or wireless communication using infrared rays or visible light, namely, optical wireless communication.
  • the terminal apparatus 40 may be a movable wireless communication apparatus, that is, a mobile apparatus. Furthermore, the terminal apparatus 40 may be a wireless communication apparatus installed on a mobile body, or may be the mobile body itself.
  • the terminal apparatus 40 may be a vehicle that moves on a road, such as an automobile, a bus, a truck, or a motorcycle, or a vehicle of a train that travels on a track, or may be a wireless communication apparatus mounted on the vehicle.
  • the mobile body may be a mobile terminal, or may be a mobile body that moves on land (on the ground in a narrow sense), in the ground, on water, or under water.
  • the mobile body may be a mobile body that moves inside the atmosphere, such as an aircraft, airship, balloon, or a helicopter, or may be a mobile body that moves outside the atmosphere, such as an artificial satellite.
  • the mobile body may be an unmanned aerial vehicle (UAV) such as a drone.
  • the terminal apparatus 40 may be a wireless communication apparatus mounted on a mobile body.
  • the terminal apparatus 40 may be capable of performing communication while being simultaneously connected to a plurality of base stations 30 or a plurality of cells (for example, a PCell and an SCell) by using technologies such as carrier aggregation (CA), dual connectivity (DC), and multi-connectivity (MC).
  • the terminal apparatus 40 and the plurality of base stations 30 can communicate with each other by a Coordinated Multi-Point Transmission and Reception (CoMP) technology via cells of different base stations 30.
  • the terminal apparatus 40 may be a relay terminal that relays communication to a remote terminal.
  • Fig. 6 is a diagram illustrating a configuration of the terminal apparatus 40 according to the present embodiment.
  • the terminal apparatus 40 includes a wireless communication unit 41, a storage unit 42, a control unit 43, an input unit 44, an output unit 45, and a sensor unit 46.
  • the configuration illustrated in Fig. 6 is a functional configuration, and the hardware configuration may be different from this.
  • the functions of the terminal apparatus 40 may be implemented in a distributed manner in a plurality of physically separated configurations.
  • the terminal apparatus 40 does not necessarily have all the configurations described above or below.
  • the terminal apparatus 40 may omit at least one of the input unit 44, the output unit 45, and the sensor unit 46.
  • the terminal apparatus 40 may have a configuration other than the configuration described above or below.
  • the terminal apparatus 40 may have a beamforming function.
  • the terminal apparatus 40 may be configured to acquire detection data by performing sensing using a beam.
  • the wireless communication unit 41 is a signal processing unit for performing wireless communication with other wireless communication apparatuses (for example, the base station 30 and another terminal apparatus 40).
  • the wireless communication unit 41 may be referred to as a radio transceiver or simply a transceiver.
  • the wireless communication unit 41 may be a transceiver (hereinafter, referred to as a 3GPP transceiver) of a standard defined by a technical specification (TS) of 3GPP.
  • the 3GPP transceiver may be a 3G transceiver, a 4G (LTE) transceiver, a 5G (NR) transceiver, or a transceiver of 5G or later generation.
  • the wireless communication unit 41 is controlled by the control unit 43, for example.
  • the wireless communication unit 41 may support one or a plurality of radio access methods.
  • the wireless communication unit 41 may support at least one of NR, LTE, Beyond 5G (B5G), and 6G.
  • the wireless communication unit 41 may support W-CDMA, cdma2000, and the like in addition to NR, LTE, B5G, and 6G.
  • the wireless communication unit 41 may support an automatic retransmission technology such as Hybrid Automatic Repeat reQuest (HARQ). A part or all of the processing executed by the wireless communication unit 41 may be executed by the control unit 43.
  • the wireless communication unit 41 includes a transmission processing unit 411, a reception processing unit 412, and an antenna 413. At least one of the transmission processing unit 411, the reception processing unit 412, and the antenna 413 may be regarded as the wireless communication unit 41.
  • the wireless communication unit 41 may include a plurality of the transmission processing units 411, a plurality of the reception processing units 412, and a plurality of the antennas 413. In a case where the wireless communication unit 41 supports a plurality of radio access methods, individual portions of the wireless communication unit 41 may be configured separately for each of the radio access methods.
  • the transmission processing unit 411 and the reception processing unit 412 may be configured separately for LTE, NR, B5G, and 6G.
  • the antenna 413 may include a plurality of antenna elements, for example, a plurality of patch antennas.
  • the wireless communication unit 41 may have a beamforming function.
  • the wireless communication unit 41 may have a polarization beamforming function using vertically polarized waves (V-polarized waves) and horizontally polarized waves (H-polarized waves) (or may have a polarization beamforming function using dual polarization in polarization directions of 45 degrees and -45 degrees with the vertical direction).
  • the storage unit 42 is a readable/writable storage device such as DRAM, SRAM, a flash drive, or a hard disk.
  • the control unit 43 is a controller that controls individual parts of the terminal apparatus 40.
  • the control unit 43 controls the wireless communication unit to perform wireless communication with another wireless communication apparatus (for example, the base station 30 or another terminal apparatus 40).
  • the control unit 43 may be implemented by a processor such as a CPU or an MPU.
  • the control unit 43 is implemented by the processor executing various programs stored in the storage device inside the terminal apparatus 40 using RAM or the like as a work area.
  • the control unit 43 may be implemented by an integrated circuit such as an ASIC or an FPGA.
  • the control unit 43 may be implemented by a GPU.
  • the CPU, MPU, ASIC, FPGA, and GPU can all be regarded as controllers.
  • the control unit 43 may include a plurality of physically separated objects.
  • the control unit 43 may include a plurality of semiconductor chips.
  • Individual blocks (the notification unit 431 to acquisition unit 435) constituting the control unit 43 are functional blocks individually indicating functions of the control unit 43.
  • These functional blocks may be software blocks or hardware blocks.
  • each of the functional blocks described above may be one software module realized by software (including a microprogram) or one circuit block on a semiconductor chip (die). Needless to say, each of the functional blocks may be formed as one processor or one integrated circuit.
  • the control unit 43 may be configured in a functional unit different from the above-described functional block.
  • the functional block may be configured by using any method.
  • the operation of the control unit 43 may be the same as the operation of the control unit (control unit 13, control unit 23, or control unit 33) included in the server 10, the management apparatus 20, or the base station 30, respectively.
  • the input unit 44 is an input apparatus that receives various inputs from the outside.
  • the input unit 44 is an operation apparatus for the user to perform various operations, including apparatuses such as a keyboard, a mouse, operation keys and a voice input apparatus.
  • A touch panel is also included in the input unit 44. In this case, the user performs various operations by touching the screen with a finger or a stylus.
  • the output unit 45 is an apparatus that performs various outputs such as sound, light, vibration, and an image to the outside.
  • the output unit 45 includes a display unit that displays various types of information.
  • the display unit is implemented by a display apparatus such as a liquid crystal display and an organic electro-luminescence (EL) display, for example.
  • the display unit may be integrated with the input unit 44.
  • When the terminal apparatus 40 is an XR device, the terminal apparatus 40 may be a transmission type device that projects an image onto glass, or may be a retina projection type device that directly projects an image onto the retina of the user.
  • the output unit 45 performs various outputs to the user under the control of the control unit 43.
  • the sensor unit 46 includes one or a plurality of sensors that detect various data related to the terminal apparatus 40.
  • one or a plurality of sensors included in the sensor unit 46 may include a sensor that performs detection regarding the location or posture of the terminal apparatus 40.
  • one or a plurality of sensors included in the sensor unit 46 may include an acceleration sensor and/or a gyro sensor.
  • one or a plurality of sensors included in the sensor unit 46 may include a 6DoF sensor or a 3DoF sensor.
  • One or a plurality of sensors included in the sensor unit 46 may include a positioning sensor (for example, a GNSS sensor).
  • the GNSS sensor may be a GPS sensor, a GLONASS sensor, a Galileo sensor, or a QZSS sensor.
  • the sensor unit 46 (or one or a plurality of sensors included in the sensor unit 46) may be configured to perform sensing using the above-described beamforming function and acquire detection data.
  • the sensor included in the sensor unit 46 is not limited to a sensor that performs detection regarding the location or posture of the terminal apparatus 40.
  • the one or the plurality of sensors included in the sensor unit 46 may include a sensor that performs detection regarding the surroundings of the terminal apparatus 40.
  • one or a plurality of sensors included in the sensor unit 46 may include at least one of a geomagnetic sensor, an illuminance sensor, a distance measuring sensor (for example, a ToF sensor), an atmospheric pressure sensor, a temperature sensor, an optical sensor, a sound sensor, and an image sensor.
  • One or a plurality of sensors included in the sensor unit 46 may include a sensor unit combining a plurality of sensors.
  • one or a plurality of sensors included in the sensor unit 46 may include an inertial measurement unit (IMU) configured by combining a plurality of sensors among a positioning sensor (for example, a GNSS sensor), an acceleration sensor, and a gyro sensor.
  • a sensor unit can also be regarded as a type of sensor.
  • one or a plurality of sensing functions implemented by devices/components included in the terminal apparatus 40 may be regarded as one or a plurality of sensors included in the terminal apparatus 40.
  • one or a plurality of sensing functions of the wireless communication unit 41 may be regarded as one or a plurality of sensors included in the terminal apparatus 40.
  • one or a plurality of sensing functions of the wireless communication unit 41 may include an RF-based sensing function (for example, an RF-based sensing function supported by a 3GPP transceiver).
  • the wireless communication unit 41 (for example, a 3GPP transceiver) may be regarded as the sensor unit 46 (or a sensor included in the sensor unit 46).
  • sensors may be divided into physical sensors and logical sensors. That is, the sensor described above or below may indicate a physical sensor or a logical sensor.
  • a network architecture of a fifth generation mobile communication system (5G) will be described below as an example of a network architecture applicable to the communication system 1.
  • the network architecture applied to the communication system 1 is not limited to the network architecture of the fifth generation mobile communication system (5G).
  • the network architecture applied to the communication system 1 may be a network architecture of a fourth generation mobile communications system (4G) or a network architecture of a sixth generation mobile communications system (6G).
  • the network architecture applied to the communication system 1 may be a Beyond 5G (B5G) network architecture. Needless to say, the network architecture applied to the communication system 1 may be a network architecture of another RAT.
  • Fig. 7 is a diagram illustrating a configuration example of a network architecture of 5G system (5GS).
  • the 5G core network is also referred to as 5G Core (5GC)/Next Generation Core (NGC).
  • the core network CN of the present embodiment includes, for example, one or a plurality of management apparatuses 20.
  • the core network CN of 5G is also referred to as a 5GC/NGC.
  • the core network CN is connected to user equipment (UE) 40 via a radio access network (RAN)/access network (AN) 510.
  • An example of the UE 40 is the terminal apparatus 40 in the communication system 1.
  • An application server (AS) 10 that performs processing related to the application is connected to the 5GS via the Internet.
  • An example of the application server 10 is the server 10 in the communication system 1. With this configuration, the UE 40 can use the application via the 5G service.
  • the application server 10 can be disposed in the core network CN as a DN 530 or a part of the DN 530.
  • the application server 10 may be provided in a form of an edge server.
  • a control plane function group 540 of the 5GS includes one or a plurality of Network Functions (NFs).
  • the one or a plurality of NFs included in the control plane function group 540 may include, for example, at least one NF among an access and Mobility Management Function (AMF) 541, a Network Exposure Function (NEF) 542, a Network Repository Function (NRF) 543, a Network Slice Selection Function (NSSF) 544, a Policy Control Function (PCF) 545, a Session Management Function (SMF) 546, a Unified Data Management (UDM) 547, an Application Function (AF) 548, an Authentication Server Function (AUSF) 549, a UE radio capability Management Function (UCMF) 550, a Location Management Function (LMF) 551, and a Gateway Mobile Location Center (GMLC) 552.
  • the UDM 547 includes a Unified Data Repository (UDR) that holds and manages contractor information, and a Front End (FE) portion that processes the contractor information.
  • the AMF 541 performs mobility management.
  • the SMF 546 performs session management.
  • the UCMF 550 holds UE radio capability information corresponding to all UE radio capability IDs in the Public Land Mobile Network (PLMN).
  • the UCMF 550 is responsible for assigning each PLMN-assigned UE radio capability ID.
  • the LMF 551 and the GMLC 552 will be described below.
  • Functions and services provided by the core network CN can be used via the AF 548 prepared for the application.
  • a Service Level Agreement (SLA) with an operator being a management entity of the core network CN so as to be regarded as a trusted AF 548 for the core network CN.
  • For a third party AF 548 disposed outside the core network CN, it is typical to connect the third party AF 548 to the core network CN via the NEF 542 from the viewpoint of security.
  • the Namf is a service-based interface provided by the AMF 541.
  • the Nsmf is a service-based interface provided by the SMF 546.
  • Nnef is a service-based interface provided by the NEF 542.
  • Npcf is a service-based interface provided by the PCF 545.
  • Nudm is a service-based interface provided by the UDM 547.
  • Naf is a service-based interface provided by the AF 548.
  • Nnrf is a service-based interface provided by the NRF 543.
  • Nnssf is a service-based interface provided by the NSSF 544.
  • Nausf is a service-based interface provided by the AUSF 549.
  • Nucmf is a service-based interface provided by the UCMF 550.
  • Nlmf is a service-based interface provided by the LMF 551.
  • Ngmlc is a service-based interface provided by the GMLC 552.
  • Each NF can receive a response or a notification from a service provided by another NF. That is, each NF exchanges information with another NF by means of request/response or subscribe/notify via each service-based interface.
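  • The request/response and subscribe/notify exchange described above can be sketched as follows. This is a minimal illustrative model; the ServiceBus class and the service and event names are assumptions for illustration, not 3GPP-defined APIs.

```python
# Minimal sketch of NF-to-NF interaction over service-based interfaces:
# request/response and subscribe/notify. All names are illustrative.

class ServiceBus:
    """Toy stand-in for the 5GC service-based interface fabric."""
    def __init__(self):
        self.services = {}       # service name -> handler (request/response)
        self.subscribers = {}    # event name -> callbacks (subscribe/notify)

    def register(self, service, handler):
        self.services[service] = handler

    def request(self, service, payload):
        # Request/response: a consumer NF invokes a producer NF's service.
        return self.services[service](payload)

    def subscribe(self, event, callback):
        # Subscribe/notify: a consumer NF registers interest in an event.
        self.subscribers.setdefault(event, []).append(callback)

    def notify(self, event, payload):
        for cb in self.subscribers.get(event, []):
            cb(payload)

bus = ServiceBus()

# An AMF-like NF exposes a service on its Namf interface (illustrative name).
bus.register("Namf_Location_ProvideLocation",
             lambda req: {"ue": req["ue"], "location": "cell-1234"})

# A GMLC-like consumer NF uses the service (request/response).
response = bus.request("Namf_Location_ProvideLocation", {"ue": "UE-40"})

# Another NF subscribes to an event and receives notifications.
events = []
bus.subscribe("ue-mobility", events.append)
bus.notify("ue-mobility", {"ue": "UE-40", "event": "handover"})
```

Note that the subscribe/notify path delivers the event to every subscriber, while the request/response path returns a value to the single requester.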
  • a User Plane Function (UPF) 520 has a function of user plane processing.
  • the Data Network (DN) 530 has a function of enabling connection to a service unique to a Mobile Network Operator (MNO), the Internet, and a third party service.
  • the UPF 520 functions as a transfer processing unit of user plane data processed by the application server 10.
  • the UPF 520 also functions as a gateway connected to a RAN/AN 510.
  • each NF of the core network CN is configurable by virtualization and/or a container.
  • Each NF can be installed on a cloud server.
  • each NF can be dynamically set so as to be reconfigurable by using a Software Defined Network (SDN).
  • the RAN/AN 510 has a function of enabling connection to the RAN and connection to an AN other than the RAN.
  • the RAN/AN 510 includes a base station referred to as a gNB or an ng-eNB.
  • the RAN may also be referred to as a Next Generation-RAN (NG-RAN).
  • the functions of the RAN/AN 510 are divided into a Central Unit (CU) that processes L2/L3 functions of Packet Data Convergence Protocol (PDCP) sublayers and higher, and a Distributed Unit (DU) that processes L2/L1 functions of Radio Link Control (RLC) sublayers and lower.
  • the functions of the RAN/AN 510 can be distributed and arranged via the F1 interface.
  • The DU function is divided into: a radio unit (RU) that processes a LOW PHY sublayer and a radio portion (Radio); and a DU that processes the RLC, Medium Access Control (MAC), and HIGH PHY sublayers.
  • the RU functions can be distributed and arranged, for example, via a fronthaul compliant with an evolved Common Public Radio Interface (eCPRI).
  • Functions of the CU and/or the DU can be implemented by virtualization or a container. Functions of the CU and/or the DU can be installed on a cloud server. In the 5GS, the functions of the CU and/or the DU can be dynamically set so as to be reconfigurable by using SDN.
  • the UE 40 and the AMF 541 mutually exchange information via a reference point N1.
  • Information is exchanged between the RAN/AN 510 and the AMF 541 via a reference point N2.
  • Information is exchanged between the SMF 546 and the UPF 520 via a reference point N4.
  • the SMF 546 performs Quality of Service (QoS) control for each service data flow.
  • the QoS control of the SMF 546 is applicable to both IP and Ethernet type service data flows. With QoS control performed for each service data flow, the SMF 546 provides QoS authorized for each specific service.
  • the SMF 546 may utilize indicators such as QoS contractor information in conjunction with service-based, subscription-based, or predefined PCF internal policy rules.
  • the SMF 546 uses a Policy and Charging Control (PCC) rule related to a QoS flow, that is, a QoS-controlled data flow, to determine QoS to be authorized for a QoS flow.
  • the SMF 546 can notify the PCF 545 that the QoS flow has been deleted.
  • The SMF 546 can notify the PCF 545 that the Guaranteed Flow Bit Rate (GFBR) cannot be guaranteed.
  • a QoS reservation procedure of the QoS flow can include establishment of a UE-initiated QoS Flow.
  • QoS downgrade or upgrade can be included as a part of QoS flow change processing.
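  • The per-flow QoS control described above can be sketched as follows. This is a minimal sketch assuming a dictionary of PCC-rule-like entries; the rule fields (5qi, gfbr_kbps, guaranteed) are illustrative assumptions, not the actual PCC rule structure.

```python
# Illustrative sketch of SMF-style QoS authorization per service data
# flow using a PCC-rule-like lookup, plus notification when a GFBR
# cannot be guaranteed. All field names are assumptions.

PCC_RULES = {
    "voice": {"5qi": 1, "gfbr_kbps": 150, "guaranteed": True},
    "web":   {"5qi": 9, "gfbr_kbps": 0,   "guaranteed": False},
}

def authorize_qos_flow(service_data_flow):
    """Return the QoS authorized for a service data flow, per its PCC rule."""
    rule = PCC_RULES.get(service_data_flow)
    if rule is None:
        raise KeyError(f"no PCC rule for flow: {service_data_flow}")
    return rule

def check_gfbr(rule, measured_kbps, notify):
    """Notify (e.g., the PCF) when a guaranteed flow bit rate is not met."""
    if rule["guaranteed"] and measured_kbps < rule["gfbr_kbps"]:
        notify("GFBR cannot be guaranteed")

notifications = []
voice = authorize_qos_flow("voice")
check_gfbr(voice, measured_kbps=100, notify=notifications.append)
```

A non-guaranteed flow (such as the illustrative "web" entry) never triggers the notification, which mirrors the distinction between GBR and non-GBR flows.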
  • a location service in a fifth generation mobile communication system will be described as an example of a location service that can be supported by the communication system 1 of the present embodiment.
  • the location service in the present embodiment is not limited to the location service in 5G.
  • the location service in the present embodiment may be a location service in a fourth generation mobile communication system (4G) or a location service in a sixth generation mobile communication system (6G).
  • the location service in the present embodiment may be a location service in Beyond 5G (B5G). Needless to say, the location service in the present embodiment may be a location service using another RAT.
  • the 5G system defines a service-based architecture for supporting a location service.
  • the 5GS supports a radio access technology (RAT)-dependent positioning method and a RAT-independent positioning method.
  • the communication system 1 of the present embodiment may support at least one of the RAT-dependent positioning method and the RAT-independent positioning method.
  • the description of 5GS in the following description can be replaced with the communication system 1.
  • the description of the core network CN in the following description can be replaced with the management apparatus 20.
  • the RAT-dependent positioning method is a positioning method performed using a measurement result of a 3GPP RAT signal acquired by the target UE 40 and/or a measurement result of a 3GPP RAT signal transmitted from the target UE 40 and acquired by the access network.
  • the RAT-independent positioning method is a positioning method performed using a measurement result of a signal other than the 3GPP RAT signal acquired by the target UE 40 and/or other information.
  • a location service (LCS) client inside a Public Land Mobile Network (PLMN) or a Standalone Non-Public Network (SNPN), an LCS client outside the PLMN or the SNPN, or the AF 548 requests location information of one or a plurality of target UE 40 from an apparatus (for example, the core network CN) constituting the 5GS.
  • the apparatus constituting the 5GS reports the location information related to the request, to the LCS client or the AF 548.
  • the NF of the control plane inside the PLMN or SNPN requests location information of one or a plurality of target UE 40 from the apparatus constituting the 5GS.
  • the apparatus constituting the 5GS reports the location information related to the request, to the NF of the control plane inside the PLMN or the SNPN.
  • The 5GS also needs to be able to perform privacy verification on the target UE 40 in response to a request for location information from an LCS client outside the PLMN or SNPN, or from an AF 548 outside the PLMN or SNPN.
  • That is, the 5GS needs to be able to check, based on the LCS privacy profile of the UE 40, whether the LCS client or AF 548 is authorized to acquire the location information of the UE 40 and/or whether the LCS client or AF 548 is authorized to use the location service.
  • the UE 40 can optionally support privacy notification and verification on behalf of the user.
  • The UE 40 may perform capability signaling of the UE 40 that supports a location service, at an access stratum (AS), non-access stratum (NAS), or application (for example, the positioning protocol) level, to a serving PLMN or SNPN.
  • the LCS client or the AF 548 may request the location information of the UE 40 by at least one of the following methods (requests).
  • Location information request by network (Network Induced Location Request: NI-LR)
  • Mobile terminal end location information request (Mobile Terminated Location Request: MT-LR)
  • Mobile terminal origination location information request (Mobile Terminal Originated Location Request: MO-LR)
  • Immediate location information request (Immediate Location Request)
  • Event reservation type location information request (Deferred Location Request)
  • The location information request by the network is, for example, a request started by the serving AMF 541 for regulatory services such as an emergency call from the UE 40, or for verifying the location of the UE 40 in a country or an overseas area for NR satellite access.
  • The mobile terminal end location information request is a request transmitted by an LCS client or AF 548 outside or inside the serving PLMN, to the serving PLMN, to acquire information regarding the location of the target UE 40.
  • the mobile terminal origination location information request is a request that the UE 40 transmits to the serving PLMN in order to acquire information regarding its own location.
  • The immediate location information request is a request transmitted or started by the LCS client or the AF 548 in order to acquire information regarding the location of the target UE 40 or a group of UEs 40; a response including that information is expected within a short time, the length of which is designated using QoS.
  • This immediate location information request can be used for the location information request by the network described above, a mobile terminal end location information request, or a mobile terminal origination location information request.
  • The event reservation type location information request is a request transmitted by the LCS client or the AF 548 in order to acquire information regarding the location of the target UE 40 or a group of UEs 40 at a certain future time or upon a specific event regarding the target UE 40 or the group of UEs 40. A response is expected that includes a notification of the occurrence of the event and, where applicable, the information regarding the location.
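  • The request types above can be summarized in a small model. This is an illustrative sketch; the LocationRequest fields and the describe helper are assumptions for illustration, not part of any 3GPP interface.

```python
# Illustrative classification of the location request types described
# above: who originates the request, and whether the response is
# immediate or deferred. All names are assumptions.

from dataclasses import dataclass

@dataclass
class LocationRequest:
    kind: str        # "NI-LR", "MT-LR", or "MO-LR"
    deferred: bool   # False: immediate request, True: event reservation type

def describe(req: LocationRequest) -> str:
    origin = {
        "NI-LR": "started by the serving AMF (network induced)",
        "MT-LR": "sent by an LCS client/AF to the serving PLMN",
        "MO-LR": "sent by the UE to the serving PLMN",
    }[req.kind]
    timing = ("response expected on occurrence of a future event"
              if req.deferred
              else "response expected within a time designated using QoS")
    return f"{req.kind}: {origin}; {timing}"

line = describe(LocationRequest(kind="MT-LR", deferred=False))
```

The two axes (originator and immediate/deferred timing) are orthogonal, which is why the immediate form can combine with any of NI-LR, MT-LR, or MO-LR as stated above.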
  • Fig. 8 is a diagram illustrating a configuration example of a reference architecture related to a location service of 5GS. The drawing is based on a drawing included in the document "3GPP TS23.237". The reference architecture illustrated in Fig. 8 may be applied to the communication system 1 (for example, the core network CN) of the present embodiment.
  • the control plane function group 540 of the 5GS includes a plurality of Network Functions (NFs).
  • One or a plurality of NFs included in the control plane function group 540 may include at least one NF among the UDM 547, the AF 548, the LMF 551, the GMLC 552, and a Location Retrieval Function (LRF) 553.
  • the LRF 553 may be installed together with the GMLC 552 or may be installed separately.
  • the LRF 553 is responsible for acquisition or verification of location information.
  • the LRF 553 provides routing information and/or correlation information to the UE 40 that has started an IP Multimedia Subsystem (IMS) emergency session.
  • The AF 548 or other NFs in the same trust domain can access the location service of the GMLC 552 using the Ngmlc interface (for example, a location service of a GMLC 552 in the same PLMN or the same trust domain), or can access the event exposure service for location information of the AMF 541 using the Namf interface.
  • LCS client 560 can access the location service of GMLC 552 using Le being a reference point.
  • the external AF 548 can access the location service via NEF 542 using a Nnef interface or Common API Framework (CAPIF).
  • The LCS client 560 or AF 548 can access a location service in which the UE 40 reports location events over a connected user plane, for a periodic or triggered 5GC mobile terminal end location information request (5GC-MT-LR).
  • the GMLC 552 includes functions required to support location services.
  • One PLMN can include one or more GMLCs 552.
  • GMLC 552 is the first node that external LCS client 560 should access within the PLMN.
  • the AF 548 or other NFs can access the GMLC 552 directly or via the NEF 542.
  • the GMLC 552 can request routing information and/or privacy information for the target UE 40 from the UDM 547 via a Nudm interface.
  • the UDM 547 manages LCS privacy profiles and routing information regarding LCS subscribers. These pieces of information can be accessed from the AMF 541, the GMLC 552, or the NEF 542 via the Nudm interface.
  • the UDM 547 may include, in UE 40 subscription data, an instruction of whether the UE 40 is permitted to act as a Positioning Reference Unit (PRU) and an instruction of whether the PRU is a fixed PRU.
  • the UDM 547 may include the identification information (identifier(s)) of the LMF 551 and the instruction of the positioning of the user plane between the UE 40 and the LMF 551 in the LCS subscriber data of the UE 40.
  • the GMLC 552 transfers the request for location information to the serving AMF 541 using the Namf interface.
  • the GMLC 552 transfers the location information request to the GMLC 552 of another PLMN by using the Ngmlc interface.
  • the configuration of the privacy profile of the target UE 40 needs to be checked at the home PLMN of the UE 40 before providing an estimate of the location.
  • The LMF 551 performs overall coordination and schedule management of resources required to locate the UE 40 registered in or accessing the core network CN.
  • The LMF 551 can also calculate or verify final position and velocity estimates, and estimate the achieved accuracy.
  • the LMF 551 can directly report the estimate of the location of the target UE 40 at a periodic timing or an event-triggered timing to the GMLC 552.
  • the LMF 551 receives a location information request for the target UE 40 from the serving AMF 541 using a Nlmf interface.
  • the LMF 551 interacts with the UE 40 to exchange information regarding a position applicable to a terminal assisted (UE assisted) positioning method and/or a terminal based (UE based) positioning method.
  • Prepared UE 40 positioning modes include a terminal assisted mode, a terminal based mode, a standalone mode, and a network based mode.
  • In the terminal assisted mode, the UE 40 receives a measurement signal for positioning. Subsequently, the UE 40 transmits the measurement result to another entity (for example, the LMF 551) in order to calculate the location.
  • In the terminal based mode, the UE 40 receives a measurement signal for positioning. Subsequently, the UE 40 calculates an estimate of the location using assistance data provided by the serving PLMN.
  • In the standalone mode, the UE 40 receives a measurement signal for positioning. Subsequently, the UE 40 calculates an estimate of the location with no assistance data provided from the serving PLMN.
  • In the network based mode, the serving PLMN receives measurement signals for positioning transmitted from the target UE 40. Subsequently, the serving PLMN (for example, the LMF 551) calculates an estimate of the location.
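  • The four positioning modes above differ in who measures and who computes the location estimate, which can be sketched as follows. The function names, mode strings, and return values are illustrative assumptions, not part of any 3GPP interface.

```python
# Illustrative sketch of the four UE positioning modes: who performs
# the measurement and who computes the location estimate.

def estimate_from(measurement, assistance):
    # Toy stand-in for the actual location calculation.
    return {"measurement": measurement, "used_assistance": assistance is not None}

def locate(mode, measurement, assistance=None):
    """Return (computed_by, estimate) for a given positioning mode."""
    if mode == "terminal_assisted":
        # UE measures; another entity (e.g., the LMF) computes.
        return ("LMF", estimate_from(measurement, assistance))
    if mode == "terminal_based":
        # UE measures and computes, using serving-PLMN assistance data.
        assert assistance is not None, "terminal based mode needs assistance data"
        return ("UE", estimate_from(measurement, assistance))
    if mode == "standalone":
        # UE measures and computes with no assistance data.
        return ("UE", estimate_from(measurement, None))
    if mode == "network_based":
        # Network measures signals from the UE; the serving PLMN computes.
        return ("LMF", estimate_from(measurement, assistance))
    raise ValueError(mode)
```

The key design point mirrored here is that "terminal based" and "standalone" differ only in whether assistance data from the serving PLMN is used.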
  • Terminal LCS privacy (UE LCS privacy): In the location service, the LCS client 560 or the AF 548 may, but need not, be given the authority to acquire the location information of the UE 40.
  • the UE 40 and/or AF 548 can control which LCS clients 560 and AF 548 are permitted or not permitted to access location information of UE 40 using terminal LCS privacy.
  • the UDM 547 can save privacy settings for the UE 40 as part of the terminal subscriber data. This privacy setting can be saved as a terminal LCS privacy profile.
  • Other NFs (for example, the GMLC 552 or the NEF 542) can send an inquiry to the UDM 547 about a terminal LCS privacy profile.
  • the UE 40 and/or the AF 548 can provide and/or update a part of the terminal privacy profile by using the processing of the terminal LCS privacy profile. This makes it possible for the UE 40 and/or the AF 548 to provide privacy settings to the network.
  • the terminal LCS privacy profile is used to indicate whether LCS requests from LCS client 560 and AF 548 are permitted or denied, together with a Privacy Override Indicator (POI).
  • The Privacy Override Indicator is used to decide whether the terminal LCS privacy profile of the subscriber terminal to be positioned is to be overridden by a location service request.
  • the privacy override is applied only to a regulatory service.
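  • The privacy decision described above (permit or deny per client, with the privacy override applied only to a regulatory service) can be sketched as follows. The profile structure and argument names are illustrative assumptions, not the actual terminal LCS privacy profile format.

```python
# Illustrative sketch of a terminal LCS privacy check with a Privacy
# Override Indicator (POI). The profile structure is an assumption.

def privacy_check(profile, client_id, poi=False, regulatory=False):
    """Permit or deny an LCS request based on the terminal LCS privacy profile.

    The privacy override (POI) is applied only to a regulatory service.
    """
    if poi and regulatory:
        return True  # profile is overridden for the regulatory service
    return client_id in profile["permitted_clients"]

profile = {"permitted_clients": {"lcs-client-560"}}
allowed = privacy_check(profile, "lcs-client-560")
denied = privacy_check(profile, "unknown-af", poi=True, regulatory=False)
emergency = privacy_check(profile, "unknown-af", poi=True, regulatory=True)
```

Note that the POI alone is not sufficient to bypass the profile: the request must also belong to a regulatory service, as stated above.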
  • Fig. 9 is a diagram illustrating a configuration of LPP for control in an NG-RAN and positioning in a user plane. This drawing is based on a drawing included in the document "3GPP TS37.355".
  • the LPP is used to locate a target device (for example, UE 40) using measurements of positional relationships acquired by one or more reference sources.
  • the LPP is used as a point-to-point protocol between a location server (for example, LMF 551) and a target device.
  • the LPP is defined for Long Term Evolution (LTE), which is a fourth generation mobile communication system, and is also used in a 5G system.
  • the LPP session is used between the location server and the target device to acquire a measurement of a location relationship or an estimate of a location, or to transfer assistance data.
  • One LPP session is used to support one location information request.
  • one LPP session is used to support one mobile terminal end location information request, one mobile terminal origination location information request, or one location information request by a network.
  • a plurality of LPP sessions may be used between the same end point to support a plurality of different location information requests.
  • Each LPP session includes one or more LPP transactions, and each LPP transaction performs one operation (for example, at least one of exchange of capability information, transfer of assistance data, and transfer of location information).
  • Each LPP transaction indicates a transaction ID at the LPP protocol level to associate the messages of the transaction (for example, a request and its response) with each other.
  • Each LPP transaction involves an exchange of one or more LPP messages between the location server and the target device.
  • the LPP message provides a full set of information for invoking and responding to LPP transactions.
  • the typical format of an LPP message includes a series of common fields followed by a body.
  • the body includes unique information for a specific message type. Note that the body may be blank.
  • Each message type includes unique information for one or more positioning methods and/or common information for all positioning methods.
  • The common fields are as follows:
  • Transaction ID for identifying messages belonging to the same transaction
  • Transaction end flag indicating when the transaction has ended, such as a transaction with a periodic response
  • Sequence number enabling the receiver to detect duplicated LPP messages
  • Acknowledgement allowing an acknowledgement to be requested and/or returned for any LPP message
  • the capability indicates positioning and protocol functions related to the LPP, and a positioning method supported by the LPP.
  • Fig. 10 is a diagram illustrating an example of LPP session processing. This drawing is based on a drawing included in the document "3GPP TS37.355".
  • Endpoint A transmits an LPP message with an initial LPP transaction ID of j to endpoint B, thereby starting an LPP session (Step S11).
  • Here, endpoint A is one of the target device and the location server, and endpoint B is the other.
  • Endpoint A and endpoint B can continue the transaction initiated in Step S11 to further exchange messages (Step S12).
  • Both endpoints can initiate further transactions by transmitting additional LPP messages (Step S13).
  • The session ends with an LPP message with the final transaction ID of N exchanged between the two endpoints (Step S14).
  • All messages constituting a transaction need to contain the same transaction ID.
  • the last message sent in each transaction needs to set Information Element (IE) of endTransaction to TRUE.
  • Concurrently occurring transactions need to use different transaction IDs. Note that the transaction ID of a completed transaction can be reused at any time after it is found that a final message of a previous transaction having the same ID has been received.
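  • The transaction rules above (same transaction ID for every message of a transaction, endTransaction set on the last message, different IDs for concurrent transactions, and reuse of an ID only after completion) can be sketched as follows. The message dictionary and helper names are illustrative assumptions, not the actual LPP encoding.

```python
# Illustrative sketch of LPP transaction ID handling within a session.

class LppSession:
    def __init__(self):
        self.open_transactions = set()

    def send(self, transaction_id, body, end_transaction=False):
        # Every message of a transaction carries the same transaction ID.
        self.open_transactions.add(transaction_id)
        message = {"transactionID": transaction_id,
                   "endTransaction": end_transaction,
                   "body": body}
        if end_transaction:
            # The ID may be reused after the final message of the
            # transaction is known to have been received.
            self.open_transactions.discard(transaction_id)
        return message

    def new_transaction_id(self):
        # A concurrently occurring transaction must use a different ID.
        tid = 0
        while tid in self.open_transactions:
            tid += 1
        return tid

session = LppSession()
tid = session.new_transaction_id()    # first transaction
m1 = session.send(tid, "RequestCapabilities")
tid2 = session.new_transaction_id()   # differs while the first is open
m2 = session.send(tid, "ProvideCapabilities", end_transaction=True)
tid3 = session.new_transaction_id()   # the first ID is reusable again
```
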
  • Figs. 11A and 11B are diagrams illustrating an example of a procedure prepared in LPP. These drawings are based on the figures shown in the document "3GPP TS37.355".
  • Fig. 11A is a diagram illustrating an example of a capability information transmission procedure.
  • the location server transmits a RequestCapabilities message to the target device (Step S21).
  • the location server may indicate a type of a required capability.
  • the target device determines whether the target device supports one or a plurality of positioning methods being a target of the request included in the message. When the positioning method is supported, the target device includes capability of the device related to the supported positioning method in a response message (ProvideCapabilities message). Subsequently, the target device sets the same value as the IE of the LPP-TransactionID of the received message to the IE of the LPP-TransactionID of the response message.
  • the target device transmits the ProvideCapabilities message to the location server (Step S22).
  • This message needs to include an information element (IE) of endTransaction that has been set to TRUE.
  • Fig. 11B is a diagram illustrating another example of the capability information transmission procedure.
  • When the target device starts the transmission procedure of the ProvideCapabilities message, the target device sets a corresponding IE so as to include the capability of the device in the message for each positioning method whose capability is indicated.
  • the target device transmits a ProvideCapabilities message to the location server (Step S31).
  • This message needs to include the IE of endTransaction that has been set to TRUE. This procedure makes it possible for the target device to indicate to the location server capabilities that were not requested by the location server.
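  • The two capability transfer procedures of Figs. 11A and 11B can be sketched as follows. The SUPPORTED set and function names are illustrative assumptions; only the echoed transaction ID and the endTransaction flag mirror the procedure described above, and the dictionaries are not the actual LPP message encoding.

```python
# Illustrative sketch of the two capability transfer procedures:
# server-requested (Fig. 11A) and target-initiated (Fig. 11B).

SUPPORTED = {"A-GNSS", "NR-DL-TDOA"}   # assumed capabilities of the target

def provide_capabilities(requested, transaction_id):
    """Fig. 11A: respond only for the requested positioning methods."""
    return {
        "LPP-TransactionID": transaction_id,   # echoes the request's ID
        "endTransaction": True,                # last message of the transaction
        "capabilities": sorted(SUPPORTED & set(requested)),
    }

def unsolicited_capabilities(transaction_id):
    """Fig. 11B: target-initiated indication of all supported methods."""
    return {
        "LPP-TransactionID": transaction_id,
        "endTransaction": True,
        "capabilities": sorted(SUPPORTED),
    }

resp = provide_capabilities({"A-GNSS", "OTDOA"}, transaction_id=7)
unsolicited = unsolicited_capabilities(transaction_id=3)
```

In the server-requested form, a method that the target does not support (such as the assumed "OTDOA" above) is simply absent from the response; in the target-initiated form, all supported methods are indicated without a request.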
  • the LPP has prepared a procedure of transfer and display regarding assistance data and information regarding location.
  • The information regarding location is measurement data regarding the location and/or an estimate of the location.
  • A positioning method is specified by Positioning Method IEs. Positioning methods supported by the LPP are as follows:
  • Observed Time Difference Of Arrival (OTDOA) Positioning
  • Assisted GNSS (A-GNSS) Positioning
  • Enhanced Cell ID Positioning
  • Terrestrial Beacon System Positioning
  • Sensor based Positioning
  • WLAN-based Positioning
  • Bluetooth-based Positioning
  • NR UL Positioning
  • NR Enhanced Cell-ID (NR E-CID) Positioning
  • NR DL-Time Difference Of Arrival (NR DL-TDOA) Positioning
  • NR DL-Angle of Departure (NR DL-AoD) Positioning
  • NR Multi-Round Trip Time (NR Multi-RTT) Positioning
  • In A-GNSS Positioning, the location server provides assistance data for terminal based and/or terminal assisted A-GNSS by using an IE of A-GNSS-ProvideAssistanceData.
  • the target device can request the GNSS assistance data from the location server by using an IE of A-GNSS-RequestAssistanceData.
  • the target device provides location measurements (for example, pseudo ranges, location estimate, and velocity) to the location server together with time information by using an IE of A-GNSS-ProvideLocationInformation.
  • the target device provides GNSS signal measurement information to the location server by using an IE of GNSS-SignalMeasurementInformation.
  • When requested by the location server, the target device provides GNSS network time association to the location server.
  • This information includes a code phase, Doppler, and C/N.
  • this information optionally includes a cumulative carrier phase which is also referred to as an Accumulated DeltaRange (ADR).
  • the location server can request the location information from the target device that uses the GNSS by using an IE of A-GNSS-RequestLocationInformation.
  • The location server provides a GNSS measurement instruction to the target device by using an IE of GNSS-PositioningInstructions.
  • the IE of A-GNSS-Provide-Capabilities is used to indicate a capability of the target device to support the A-GNSS.
  • the IE of A-GNSS-Provide-Capabilities is used by the target device to provide A-GNSS location capabilities (for example, GNSS to be supported and assistance data) to the location server.
  • In Sensor based Positioning, the location server provides assistance data to the target device to assist in advanced computation at the UE (in the terminal based mode) by using an IE of Sensor-ProvideAssistanceData.
  • In WLAN-based positioning, a target device provides measurement results on one or more WLANs to a location server by using an IE of WLAN-ProvideLocationInformation.
  • the location server can request WLAN measurements from the target device by using an IE of WLAN-RequestLocationInformation.
  • the target device provides a capability for WLAN positioning to the location server by using an IE of WLAN-ProvideCapabilities.
  • the location server can request WLAN positioning capabilities information from the target device by using an IE of WLAN-RequestCapabilities.
  • the location server provides assistance data to the target device to enable terminal based and/or terminal assisted WLAN positioning by using an IE of WLAN-ProvideAssistanceData.
  • the target device can request the WLAN assistance data from the location server by using an IE of WLAN-RequestAssistanceData.
  • In Bluetooth-based positioning, a target device provides measurement results for one or more Bluetooth beacons to the location server by using an IE of BT-ProvideLocationInformation.
  • the location server can request Bluetooth measurements from the target device using an IE of BT-RequestLocationInformation.
  • The target device provides a capability for Bluetooth positioning to the location server by using an IE of BT-ProvideCapabilities.
  • the location server can request Bluetooth positioning capabilities from the target device by using an IE of BT-RequestCapabilities.
  • In NR UL Positioning, an IE of NR-UL-ProvideCapabilities is used by a target device to indicate the capability of supporting UL Sounding Reference Signals (UL SRS) for positioning to a location server and to provide the UL SRS positioning capability to the location server.
  • An IE of NR-UL-RequestCapabilities is used by the location server to request a capability of the target device to support the UL SRS for positioning, that is, to request the UL SRS positioning capabilities from the target device.
  • In NR E-CID Positioning, the target device provides NR E-CID location measurements to a location server by using an IE of NR-ECID-ProvideLocationInformation.
  • the target device provides the NR E-CID measurements to the location server by using an IE of NR-ECID-SignalMeasurementInformation.
  • the location server can request NR E-CID location measurements from the target device by using an IE of NR-ECID-RequestLocationInformation.
  • the IE of the NR-ECID-ProvideCapabilities is used to indicate capabilities (NR E-CID positioning capabilities) of the target device to support NR E-CID.
  • the IE of the NR-ECID-ProvideCapabilities is used for the target device to provide the NR E-CID positioning capabilities to the location server.
  • the location server requests a capability of a target device to support NR E-CID by using an IE of NR-ECID-RequestCapabilities. For example, the location server requests E-CID positioning capabilities from the target device by using an IE of NR-ECID-RequestCapabilities.
  • In NR DL-TDOA Positioning, the location server provides assistance data for terminal assisted and/or terminal based NR DL-TDOA by using an IE of NR-DL-TDOA-ProvideAssistanceData.
  • the target device can request the assistance data from the location server by using an IE of NR-DL-TDOA-RequestAssistanceData.
  • the target device provides NR DL-TDOA location measurements to the location server by using an IE of NR-DL-TDOA-ProvideLocationInformation.
  • the target device provides the NR DL-TDOA measurements to the location server by using an IE of NR-DL-TDOA-SignalMeasurementInformation.
  • the target device includes information regarding IE of NR-DL-TDOA-LocationInformation in the information to be provided.
  • the IE of NR-DL-TDOA-ProvideCapabilities is used to indicate a capability of the target device to support NR DL-TDOA.
  • the IE of NR-DL-TDOA-ProvideCapabilities is used for the target device to provide NR DL-TDOA positioning capabilities to the location server.
  • the IE of the NR-DL-TDOA-MeasurementCapability can be included only when the DL-TDOA measurement capability is defined and the target device supports NR-DL-PRS-ResourcesCapability for DL-TDOA.
  • the IE of the NR-DL-TDOA-RequestCapabilities is used for the location server to request a capability of a target device to support the NR DL-TDOA.
  • the IE of NR-DL-TDOA-RequestCapabilities is used for the location server to request NR DL-TDOA positioning capabilities from the target device.
  • In NR DL-AoD Positioning, the location server provides assistance data for terminal assisted and terminal based NR DL-AoD by using an IE of NR-DL-AoD-ProvideAssistanceData.
  • the target device provides NR DL-AoD measurements to the location server by using an IE of NR-DL-AoD-SignalMeasurementInformation.
  • at acquisition of the information regarding the location using NR DL-AoD, the target device provides an IE of NR-DL-AoD-LocationInformation to the location server.
  • the location server can request NR DL-AoD location measurements from the target device using an IE of NR-DL-AoD-RequestLocationInformation.
  • the IE of NR-DL-AoD-ProvideCapabilities is used by the target device to indicate support of NR DL-AoD and to provide NR DL-AoD positioning capabilities to the location server.
  • the IE of the NR-DL-AoD-MeasurementCapability can be included only when the DL-AoD measurement capability is defined and the target device supports NR-DL-PRS-ResourcesCapability for DL-AoD.
  • the IE of NR-DL-AoD-RequestCapabilities is used by the location server to request the NR DL-AoD positioning capabilities supported by the target device.
  • NR Multi-RTT Positioning: In NR Multi-RTT Positioning, the location server provides assistance data for terminal assisted NR Multi-RTT by using an IE of NR-Multi-RTT-ProvideAssistanceData.
  • the target device can request the assistance data from the location server by using an IE of NR-Multi-RTT-RequestAssistanceData.
  • the target device provides NR Multi-RTT location measurements to the location server by using an IE of NR-Multi-RTT-ProvideLocationInformation.
  • the target device provides NR Multi-RTT measurements to the location server by using an IE of NR-Multi-RTT-SignalMeasurementInformation.
  • the location server requests NR Multi-RTT location measurements from the target device by using an IE of NR-Multi-RTT-RequestLocationInformation.
  • the IE of NR-Multi-RTT-ProvideCapabilities is used by the target device to indicate support of NR Multi-RTT and to provide NR Multi-RTT positioning capabilities to the location server.
  • the IE of NR-Multi-RTT-MeasurementCapability can be included only when the Multi-RTT measurement capability is defined and the target device supports NR-DL-PRS-ResourcesCapability for Multi-RTT.
  • the IE of NR-Multi-RTT-RequestCapabilities is used by the location server to request the NR Multi-RTT positioning capabilities supported by the target device.
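The capability transfer pattern described above (a *-RequestCapabilities IE from the location server answered by a *-ProvideCapabilities IE from the target device) is common to the NR E-CID, DL-TDOA, DL-AoD, and Multi-RTT methods. The following non-normative sketch illustrates that exchange; the class names and fields are simplified hypothetical stand-ins for the LPP IEs, not the actual ASN.1 definitions.

```python
# Hypothetical sketch of the LPP capability transfer pattern.
from dataclasses import dataclass, field


@dataclass
class RequestCapabilities:
    # Positioning method the location server asks about,
    # e.g. "NR-Multi-RTT", "NR-DL-TDOA", "NR-DL-AoD", "NR-ECID".
    method: str


@dataclass
class ProvideCapabilities:
    method: str
    supported: bool
    # Included only when a measurement capability is defined for the
    # method (cf. NR-Multi-RTT-MeasurementCapability).
    measurement_capability: dict = field(default_factory=dict)


class TargetDevice:
    """Target-device side: answers capability requests from its
    locally stored capability table (method name -> capabilities)."""

    def __init__(self, capabilities):
        self._caps = capabilities

    def on_request_capabilities(self, req: RequestCapabilities) -> ProvideCapabilities:
        caps = self._caps.get(req.method)
        return ProvideCapabilities(
            method=req.method,
            supported=caps is not None,
            measurement_capability=caps or {},
        )


ue = TargetDevice({"NR-Multi-RTT": {"nr_dl_prs_resources": 4}})
resp = ue.on_request_capabilities(RequestCapabilities("NR-Multi-RTT"))
print(resp.supported)
```

A request for a method absent from the table yields `supported=False` with an empty capability set, mirroring a target device that does not support the positioning method.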
  • NR Positioning Protocol A is a protocol defined for the RAN/AN 510 to provide a service to the LMF 551.
  • the NR Positioning Protocol A includes NRPPa location information transfer procedures and NRPPa management procedures.
  • the NRPPa location information transfer procedure includes a procedure of transferring positioning-related information between the RAN/AN 510 and the LMF 551.
  • the NRPPa management procedure includes a procedure unrelated to positioning (for example, processing at the time of an error).
  • E-CID Measurement Initiation procedure
  • E-CID Measurement Failure Indication procedure
  • E-CID Measurement Report procedure
  • E-CID Measurement Termination procedure
  • Information transfer for Observed Time Difference of Arrival (OTDOA) positioning
    - OTDOA Information Exchange procedure
  • Typical error status report
    - Error Indication procedure
  • Transfer of assistance information
    - Assistance Information Control procedure
    - Assistance Information Feedback procedure
  • Transfer of positioning information
    - Positioning Information Exchange procedure
    - Positioning Information Update procedure
    - Positioning Activation procedure
    - Positioning Deactivation procedure
  • Transfer of measurement information
    - Measurement procedure
    - Measurement Update procedure
    - Measurement Report procedure
    - Measurement Abort procedure
    - Measurement Failure Indication procedure
  • Transfer of Transmission-Reception Point (TRP) information
    - TRP Information Exchange procedure
  • Transfer of Positioning Reference Signal (PRS) information
    - PRS Configuration Exchange procedure
  • Transfer of Measurement Preconfiguration information
    - Measurement Preconfiguration procedure
  • Figs. 12A and 12B are diagrams illustrating an example of a procedure prepared in NRPPa. This drawing is based on the drawing included in the document "3GPP TS38.455".
  • Fig. 12A is a diagram illustrating a positioning information exchange procedure.
  • the LMF 551 transmits a POSITIONING INFORMATION REQUEST message to the RAN/AN 510 (Step S41).
  • having received the POSITIONING INFORMATION REQUEST message, the NG-RAN node (for example, RAN/AN 510) performs the necessary settings and then transmits a POSITIONING INFORMATION RESPONSE message to the LMF 551 (Step S42).
  • the NG-RAN node can consider the information of the IE when configuring transmission of a sounding reference signal (SRS) to the UE 40.
  • the NG-RAN node needs to include the IE of the SRS configuration and the IE of the System Frame Number (SFN) Initialisation Time in the POSITIONING INFORMATION RESPONSE message.
  • Fig. 12B is a diagram illustrating a measurement procedure.
  • the LMF 551 transmits a MEASUREMENT REQUEST message to the NG-RAN node (for example, RAN/AN 510) (Step S51).
  • LMF 551 can include an IE of a TRP Measurement Request List in a MEASUREMENT REQUEST message.
  • the IE of the TRP Measurement Request List indicates a TRP at which measurement is requested.
  • the NG-RAN node configures measurement by the indicated TRP according to the information included in the message.
  • the NG-RAN node responds with a MEASUREMENT RESPONSE message (Step S52).
  • the MEASUREMENT RESPONSE message includes an IE of a TRP Measurement Response List.
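The measurement procedure of Fig. 12B can be sketched, purely as a non-normative illustration, as the following request/response handling; the dictionary keys are simplified hypothetical stand-ins for the TRP Measurement Request List and TRP Measurement Response List IEs of NRPPa.

```python
# Hypothetical sketch of the NRPPa Measurement procedure: the LMF
# lists the TRPs at which measurement is requested, and the NG-RAN
# node answers with one result entry per requested TRP.

def handle_measurement_request(trp_measurement_request_list, measure):
    """NG-RAN side: configure measurement at each indicated TRP and
    collect the results into a TRP Measurement Response List."""
    response_list = []
    for trp_id in trp_measurement_request_list:
        response_list.append({"trp_id": trp_id, "result": measure(trp_id)})
    return {"trp_measurement_response_list": response_list}


# Illustrative measurement callback (e.g. a per-TRP RTT value).
rtt_by_trp = {1: 0.42, 2: 0.37}
response = handle_measurement_request([1, 2], lambda trp: rtt_by_trp[trp])
print(response["trp_measurement_response_list"])
```

The one-entry-per-requested-TRP shape mirrors the correspondence between the TRP Measurement Request List in the MEASUREMENT REQUEST message (Step S51) and the TRP Measurement Response List in the MEASUREMENT RESPONSE message (Step S52).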
  • the communication system 1 is assumed to be a 5G system (5GS) as an example. However, the communication system 1 is not limited to 5GS.
  • the communication system 1 may be a 4G system (4GS) or a 6G system (6GS).
  • the communication system 1 may be a Beyond 5G (B5G) wireless communication system. Needless to say, the communication system 1 may be a wireless communication system other than these.
  • Fig. 13 is a sequence diagram illustrating a basic procedure of the location service. A basic operation related to the location service of communication system 1 will be described below with reference to Fig. 13.
  • the AF 548 transmits, to the NEF 542, a Nnef_EventExposure_Subscribe request message for utilizing the location service (Step S61).
  • the NEF 542 transmits an Ngmlc_Location_ProvideLocation request message to the GMLC 552 (Step S62).
  • in a case where the AF 548 can directly access the GMLC 552, the AF 548 can transmit a location service request (or an Ngmlc_Location_ProvideLocation request message) for utilizing the location service to the GMLC 552.
  • having received the Ngmlc_Location_ProvideLocation request message, the GMLC 552 starts a Nudm_SDM_Get service for the UDM 547 in order to acquire the terminal LCS privacy profile of the target UE 40 (Step S63).
  • the target UE 40 is identified by, for example, a Generic Public Subscription Identifier (GPSI) or a Subscription Permanent Identifier (SUPI).
  • the GMLC 552 checks the privacy setting of UE 40 for the location service according to the terminal LCS privacy profile of UE 40 (Step S64).
  • when the location service is permitted to access the location information of the target UE 40, the GMLC 552 starts a Nudm_UECM_Get service for the UDM 547. Subsequently, the UDM 547 responds to the GMLC 552 with the network address of the serving AMF 541 of the target UE 40 (Step S65).
  • the target UE 40 is identified by GPSI or SUPI, for example.
  • having acquired the network address of the serving AMF 541 from the UDM 547, the GMLC 552 transmits a Namf_Location_ProvidePositioningInfo request message to the serving AMF 541 (Step S66).
  • the serving AMF 541 starts a Network triggered Service request procedure for the location service (Step S67).
  • the serving AMF 541 transmits a NAS message to start a notification of the location information to the UE 40 (Step S68).
  • the target UE 40 notifies the user of the UE 40 of the request for location information according to the received NAS message. In a case where privacy verification is requested, the target UE 40 waits for the user to grant or deny permission. The target UE 40 then responds with the notification result to the AMF 541 (Step S69). In a case where privacy verification has been requested, the target UE 40 indicates whether permission has been granted or denied for the request for location information in the current location service.
  • the AMF 541 starts a Nudm_ParameterProvision_Update service in order to cause the UDM 547 to hold the acquired Location Privacy Indication information (Step S70).
  • the UDM 547 can save the updated privacy setting of the UE 40 in a Unified Data Repository (UDR) as location service privacy.
  • Location service privacy may be a subset of terminal subscriber data.
  • the AMF 541 transmits an Nlmf_Location_DetermineLocation service request to the LMF 551 (Step S71).
  • the Nlmf_Location_DetermineLocation service request may include information of a scheduled location and time.
  • the LMF 551 transmits a request for location information to the UE 40.
  • the LMF 551 can include information of a scheduled position and time in the request.
  • having received the request, the UE 40 returns current location information to the LMF 551 (Step S72).
  • the LMF 551 returns an Nlmf_location_determineLocation service response to the AMF 541 (Step S73).
  • the response may include the current location information of the UE 40.
  • the AMF 541 returns an Namf_Location_ProvidePositioningInfo response message to the GMLC 552 (Step S74). With this operation, the AMF 541 returns the current location information of the UE 40 to the GMLC 552. In a case where it is indicated in Step S73 that the location information is to be directly transmitted to the GMLC 552, the AMF 541 notifies that the current location information of the UE 40 is to be directly transmitted to the GMLC 552 by using the Namf_Location_ProvidePositioningInfo response message.
  • the GMLC 552 transmits an Ngmlc_Location_ProvideLocation response message to the NEF 542 as a response to the request message in Step S62 (Step S75).
  • the response message includes the current location information of the UE 40.
  • the GMLC 552 responds that the Ngmlc_Location_ProvideLocation request is to be denied.
  • the AF 548 executes various types of processing based on the current location information of the UE 40.
  • the AF 548 again transmits the Nnef_EventExposure_Subscribe request message to the NEF 542 (Step S61).
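The sequence of Steps S61 to S75 can be condensed, purely for illustration, into the following control flow; the function names and dictionary keys are hypothetical stand-ins for the service operations (Nudm_SDM_Get, Namf_Location_ProvidePositioningInfo, and so on), not actual 5GC APIs.

```python
# Hypothetical condensed control flow of the location service of Fig. 13.

def provide_location(target_ue, privacy_profiles, locate):
    """GMLC-side sketch: check the terminal LCS privacy profile
    (Steps S63-S64) and, if permitted, obtain the current location
    via the serving AMF/LMF (Steps S65-S73)."""
    profile = privacy_profiles.get(target_ue)          # Nudm_SDM_Get
    if profile is None or not profile.get("lcs_allowed", False):
        return {"status": "denied"}                    # denial in Step S75
    location = locate(target_ue)                       # Steps S66-S73
    return {"status": "ok", "location": location}      # Steps S74-S75


profiles = {"ue-1": {"lcs_allowed": True}, "ue-2": {"lcs_allowed": False}}
print(provide_location("ue-1", profiles, lambda ue: (35.0, 139.0)))
print(provide_location("ue-2", profiles, lambda ue: (35.0, 139.0)))
```

The early return on the privacy check mirrors the behavior in which the GMLC 552 responds that the Ngmlc_Location_ProvideLocation request is to be denied without ever contacting the serving AMF 541.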
  • <<Sensing service and sensing function>> Based on the above, the operation of the communication system 1 that can solve the problem of the present embodiment will be described in detail. Before describing the operation of the communication system 1 according to the present embodiment, a sensing service and a sensing function (SEF) will be described. Furthermore, in the description of the present embodiment, the sensing function and/or the SEF may be read as SF, SMF, SEMF, and/or sensing management function.
  • the service provided by the communication system 1 according to the present embodiment to an apparatus to be a service utilization entity includes a sensing service.
  • the sensing service is, for example, a service based on data detected by one or a plurality of sensors included in one or a plurality of communication apparatuses.
  • the sensing service is a service based on data detected by one or a plurality of sensors included in the base station 30 and/or the terminal apparatus 40.
  • One or a plurality of sensors included in one or a plurality of communication apparatuses may be, for example, one or a plurality of sensors included in the sensor unit 34 and/or the sensor unit 46 described above.
  • one or a plurality of sensors may include a sensor that detects an image and/or a shape of an object, such as a camera and/or a LiDAR.
  • the one or plurality of sensors may include a sensor that detects at least one of the color of the object, the velocity of the object, the acceleration of the object, the temperature of the object, the reflectance of the object, the distance to the object, geomagnetism, illuminance, atmospheric pressure, light, and sound.
  • the sensing service may be a service based on image data or shape data detected by the sensor (for example, a service related to automated driving of a mobile body).
  • the sensing service may be a service based on detection data obtained by a sensor that detects at least one of the color of the object, the velocity of the object, the acceleration of the object, the temperature of the object, the reflectance of the object, the distance to the object, geomagnetism, illuminance, atmospheric pressure, light, and sound.
  • one or a plurality of sensors included in one or a plurality of communication apparatuses may include a sensor of a communication apparatus different from an apparatus to be a service utilization entity (for example, an apparatus that transmits a request related to a service).
  • One or a plurality of sensors included in one or a plurality of communication apparatuses may include a sensor of an apparatus to be a service utilization entity (for example, an apparatus that transmits a request related to a service).
  • data detected by one or a plurality of sensors is referred to as detection data.
  • the detection data can be rephrased as sensing data or sensing information.
  • the sensing service is typically a service of providing detection data of one or a plurality of sensors.
  • the sensing service is a service of providing detection data used in a predetermined use case (for example, processing related to automated driving of a mobile body, processing related to automated operation of an apparatus/system, or processing related to VR content).
  • the sensing service is not limited to the service of providing detection data.
  • the sensing service may be a service of providing processing executed using one or a plurality of pieces of detection data, or may be a service of providing information generated using one or a plurality of pieces of detection data.
  • the one or plurality of sensors directly or indirectly used for the sensing service are not limited to one or a plurality of sensors included in the base station 30 or the terminal apparatus 40.
  • the one or plurality of sensors directly or indirectly used for the sensing service may be one or a plurality of sensors included in a communication apparatus other than the base station 30 or the terminal apparatus 40.
  • the sensors may be one or a plurality of sensors included in the server 10 or the management apparatus 20.
  • the service providing entity may provide the sensing service based on detection data of a plurality of sensors instead of detection data of one sensor. At this time, the service providing entity may provide the sensing service based on data obtained by fusing detection data of a plurality of sensors. With this configuration, the service providing entity can provide a sensing service with accuracy higher than accuracy that can be provided by detection data of one sensor.
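As a simple illustration of such fusion, detection data from a plurality of sensors can be combined by weighting each reading by its reported accuracy, so that the fused estimate is more accurate than any single sensor's reading. The inverse-variance weighting below is only one hypothetical fusion scheme, not a method defined by the present embodiment.

```python
# Hypothetical inverse-variance fusion of detection data from a
# plurality of sensors: readings with smaller error variance
# contribute more to the fused estimate.

def fuse_detections(readings):
    """readings: list of (value, variance) pairs from different sensors.
    Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, readings)) / total
    return fused_value, 1.0 / total


# Two distance sensors measuring the same object: the fused variance
# is smaller than either individual variance, i.e. accuracy improves.
value, variance = fuse_detections([(10.2, 0.5), (9.8, 0.25)])
print(round(value, 2), round(variance, 3))
```

The fused variance `1 / (1/0.5 + 1/0.25)` is smaller than either input variance, which is the sense in which fusing detection data of a plurality of sensors can exceed the accuracy of any one sensor.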
  • the service providing entity may provide the sensing service based on detection data of a sensor managed by an entity other than the utilization entity of the sensing service.
  • the service providing entity may provide the sensing service based on one or a plurality of pieces of detection data including detection data of a sensor managed by an entity other than the utilization entity of the sensing service.
  • the service providing entity is typically an apparatus or a system.
  • the service providing entity is not limited to an apparatus or a system, and may be, for example, a function of the apparatus/system or a software/program (for example, an application) included in the apparatus/system.
  • the service providing entity may be a person or an organization (for example, an operator such as a Mobile Network Operator (MNO)).
  • the service providing entity can be rephrased as, for example, a provider, a server, or a providing entity.
  • the service providing entity is typically a core network CN (for example, the management apparatus 20).
  • the service providing entity is not limited thereto.
  • the service providing entity may be the server 10, the base station 30, or the terminal apparatus 40.
  • the utilization entity of the sensing service is an entity that utilizes the sensing service.
  • the utilization entity of the sensing service utilizes various sensing services provided by the service providing entity.
  • a utilization entity of the sensing service may be referred to as a service utilization entity.
  • the sensing service of the present embodiment may be, for example, at least one of the following (A1) to (A4).
  • the sensing service of the present embodiment may be a service related to automated driving.
  • the service related to the automated driving may include, for example, a processing or information (data) providing service necessary for implementation of automated driving of a mobile body (for example, a vehicle such as an automobile, a flying object such as a drone, or the like).
  • the service providing entity may provide, as a service related to automated driving, detection data of one or a plurality of sensors selected based on a predetermined standard related to processing of automated driving.
  • the service providing entity may provide, as a service related to automated driving, information generated by fusing detection data of a plurality of sensors selected based on a predetermined standard related to processing of automated driving.
  • the information generated by fusing the detection data of the plurality of sensors may include, for example, at least one of information regarding a route necessary for automated driving of the mobile body, control information regarding steering, control information regarding acceleration, control information regarding braking, and high-precision three-dimensional map information.
  • the information regarding the route may include, for example, information regarding the position of the next point with respect to the current position or velocity information.
  • the sensor selected based on a predetermined standard related to the automated driving process may include, for example, at least one of: one or a plurality of sensors included in a mobile body to be controlled by the automated driving; one or a plurality of sensors included in another mobile body; and one or a plurality of sensors included in a device such as a road-side unit.
  • the sensor selected based on a predetermined standard may include a plurality of sensors of different types.
  • the sensor may include a plurality of sensors among a Global Navigation Satellite System (GNSS) sensor, an acceleration sensor, an Inertial Measurement Unit (IMU) including a gyro sensor, an image sensor/camera, Light Detection And Ranging (LiDAR), and a millimeter wave radar.
  • the sensor selected based on a predetermined standard may include one or a plurality of sensors included in the base station 30 or the terminal apparatus 40 (for example, one or a plurality of sensors included in the sensor unit 34 and/or the sensor unit 46).
  • the sensing service of the present embodiment may be a service related to automated operation of one or a plurality of apparatuses/systems (for example, apparatuses/systems in a factory, hospital, or operating room).
  • the service related to the automated operation of one or a plurality of devices/systems may include, for example, processing or information (data) providing service necessary for implementing automated operation of one or a plurality of apparatuses/systems related to a predetermined facility.
  • the predetermined facility may be a production facility installed in a factory or a facility installed in a hospital/operating room.
  • the service providing entity may provide, as a service related to automated operation, detection data of one or a plurality of sensors selected based on a predetermined standard related to processing of automated operation.
  • the service providing entity may provide, as a service related to automated operation, information generated by fusing detection data of a plurality of sensors selected based on a predetermined standard related to processing of automated operation.
  • the service providing entity may provide information (for example, control information necessary for automated operation) generated by fusing data detected by a plurality of sensors included in one or a plurality of apparatuses in a factory or a hospital/operating room.
  • the sensor selected based on a predetermined standard related to the processing of automated operation may include a plurality of sensors of different types.
  • the sensor selected based on a predetermined standard may include a plurality of sensors among a GNSS sensor, an acceleration sensor, an inertial measurement device including a gyro sensor, an illuminance sensor, an infrared camera, a light-field camera, an image sensor/camera, a Time of Flight (ToF) sensor, LiDAR, and a millimeter wave radar.
  • the sensor selected based on a predetermined standard may include one or a plurality of sensors included in the base station 30 or the terminal apparatus 40 (for example, one or a plurality of sensors included in the sensor unit 34 and/or the sensor unit 46).
  • the sensing service of the present embodiment may be a service related to XR content (for example, display content of XR such as an XR game or an XR video).
  • the service related to XR content may include, for example, a service that provides processing or information (data) necessary for implementing processing of XR content.
  • the service providing entity may provide, as a service related to XR content, detection data of one or a plurality of sensors selected based on a predetermined standard related to processing of XR content.
  • the service providing entity may provide, for example, information (for example, spatiotemporal information of XR content) generated by fusing detection data of a plurality of sensors mounted on the terminal apparatus 40 for XR (for example, XR devices such as smart glasses).
  • the sensor selected based on a predetermined standard related to the processing of XR content may include a plurality of sensors of different types.
  • the plurality of sensors may include a plurality of sensors among a Global Navigation Satellite System (GNSS) sensor, an acceleration sensor, an Inertial Measurement Unit (IMU) including a gyro sensor, a 6DoF sensor, a 3DoF sensor, a geomagnetic sensor, an infrared camera, a light field camera, an image sensor/camera, Light Detection And Ranging (LiDAR), a Time of Flight (ToF) sensor, an illuminance sensor, and a millimeter wave radar.
  • the sensor selected based on a predetermined standard may include one or a plurality of sensors included in the base station 30 or the terminal apparatus 40.
  • the sensing service of the present embodiment may be a service related to provision of detection data.
  • the sensing service of the present embodiment may be a service of providing detection data (sensing data) used in a predetermined use case (for example, processing related to automated driving of a mobile body, processing related to automated operation of an apparatus/system, or processing related to XR content).
  • the sensing service of the present embodiment may be a service in which the core network CN (for example, the management apparatus 20) provides detection data to an application of the terminal apparatus 40 or the server 10.
  • the sensing service is not limited to the services illustrated in (A1) to (A4).
  • the sensing service may include the above-described location service.
  • the sensing service of the present embodiment may include a sensing service other than the service described above or below.
  • the sensor illustrated here is an example.
  • the sensor for the sensing service is not limited to the sensor described above or below.
  • the use case of the sensing service is, for example, a utilization purpose/utilization scene of the sensing service.
  • the sensing service is a service providing the detection data
  • the use case of the sensing service is, for example, a utilization purpose/utilization scene of the detection data by the service utilization entity.
  • the use case of the sensing service of the present embodiment may be, for example, at least one of processing related to automated driving of a mobile body, processing related to automated operation of an apparatus/system, and processing related to XR content. Needless to say, the use case of the sensing service of the present embodiment is not limited thereto, and may be, for example, at least one of the following (B1) to (B26).
  • (B1) Intruder detection in smart home
  • (B2) Pedestrian/animal intrusion detection on a highway
  • (B3) Sensing for railway intrusion detection
  • (B4) Unmanned or uncrewed aerial vehicle (UAV)/vehicle/pedestrian detection near Smart Grid equipment
  • (B5) Rainfall monitoring
  • (B6) Sensing for flooding in smart cities
  • (B7) Intruder detection in surroundings of smart home
  • (B8) Sensing assisted automotive maneuvering and navigation
  • (B9) Blind spot detection
  • (B10) Vehicle sensing for Advanced Driver-Assistance Systems (ADAS)
  • (B11) Sensing for parking space determination
  • (B12) Sensing for tourist spot traffic management
  • (B13) Sensing at crossroads with/without obstacle
  • (B14) Automated Guided Vehicle (AGV) detection and tracking in factories
  • (B15) Autonomous Mobile Robot (AMR) collision avoidance in smart factories
  • (B16) UAV flight trajectory tracing
  • (B17) Network assisted sensing to avoid UAV collision
  • (B18) Sensing
  • each of the use cases illustrated here is an example.
  • the use case of the sensing service is not limited to the use cases illustrated in (B1) to (B26).
  • the use case of the sensing service of the present embodiment may include a use case other than the use case (for example, the use cases illustrated in (B1) to (B26) described above) described above or below.
  • the 5G system is expected to support a 5G wireless sensing service.
  • the 5G wireless sensing service can provide functions such as sensing of one or a plurality of objects in an environment, monitoring of environmental conditions, and sensing of motion and/or gestures of a person.
  • the 5G wireless sensing service includes at least a service for collecting and/or processing 3GPP sensing data (for example, services such as secure distribution of 3GPP sensing data and secure disclosure of sensing results to a trusted third party).
  • non-3GPP sensing data can be used.
  • 3GPP sensing data is defined as data used for sensing purposes and obtained from a 3GPP radio signal that is affected by an object or environment (for example, reflection, refraction, diffraction) and that is optionally processed within a 5G system.
  • Non-3GPP sensing data is defined as data related to an object and/or environment to be sensed and provided by a sensor (for example, an image sensor/camera, LiDAR, sonar, or the like) other than 3GPP sensing.
  • the 5G wireless sensing service by Integrated Sensing and Communication (ISAC) also assumes the introduction of a sensing function (SEF) having a service-based interface.
  • the 5G wireless sensing service may be simply referred to as a sensing service.
  • the communication system 1 (for example, the core network CN) of the present embodiment may include a sensing function (SEF) having the service-based interface.
  • Figs. 14A to 14C are diagrams illustrating configuration examples of sensing functions in the core network CN.
  • a SEF 554 is defined independently of the LMF 551.
  • the SEF 554 is incorporated in the LMF 551.
  • the SEF 554 is configured to include the LMF 551.
  • the LTE positioning protocol (LPP) is defined as a point-to-point protocol between a location server (for example, the LMF 551) and a target device (for example, the UE 40), and NRPPa is defined as a protocol between the LMF 551 and the RAN/AN 510 (for example, NG-RAN).
  • in the configuration example illustrated in Fig. 14A, a new protocol may be defined independently of LPP and NRPPa. Still, considering that ISAC includes a sensor that measures the position of the terminal apparatus, it is desirable to follow the concept of LPP and/or NRPPa.
  • Fig. 15 is a diagram illustrating a configuration example of a reference architecture related to a sensing service.
  • the reference architecture related to the sensing service may be applied to the communication system 1 (for example, the core network CN) of the present embodiment.
  • the sensing service may be rephrased as SES.
  • the control plane function group 540 includes a plurality of Network Functions (NFs).
  • NFs Network Functions
  • One or a plurality of NFs included in the control plane function group 540 may include at least one NF among the UDM 547, the AF 548, the SEF 554, a Sensing Retrieval Function (SRF) 555, and a GMSE 556.
  • the SRF 555 may be provided together with the GMSE 556 or may be installed separately.
  • the SRF 555 is responsible for acquisition or verification of information (for example, sensing information) regarding detection data.
  • the SRF 555 provides routing information and/or correlation information to the UE 40 that has started an IP Multimedia Subsystem (IMS) emergency session.
  • the AF 548 or other NFs can access the event exposure service of the location information of the AMF 541 in the same trust domain by using a Ngmse interface (for example, by using a sensing service of the GMSE 556 in a same PLMN or in a same trust domain, or by using a Namf interface).
  • the SES client 570 can access the sensing service of the GMSE 556 by using Le being a reference point.
  • the external AF 548 can access the sensing service through the NEF 542 by using a Nnef interface or a Common API Framework (CAPIF).
  • the SES client 570 or the AF 548 can access the sensing service of the UE 40 on a connected user plane for the purpose of reporting events by the UE 40 in response to a periodic or triggered 5GC sensing information request (5GC-MT-SE).
  • the GMSE 556 may include functions required to support sensing services.
  • One PLMN can include one or more GMSE 556.
  • the GMSE 556 is the first node that the external SES client 570 should access within the PLMN.
  • AF 548 or other NFs can access the GMSE 556 directly or via the NEF 542.
  • the GMSE 556 can request routing information and/or privacy information of the target UE 40 from the UDM 547 via the Nudm interface.
  • the UDM 547 manages privacy profiles and routing information of sensing service subscribers. These pieces of information can be accessed from the AMF 541, the GMSE 556, or the NEF 542 via the Nudm interface.
  • the UDM 547 may include, in subscription data of the UE 40, an instruction of whether the UE 40 is permitted to act as a sensing reference unit (SRU) and an instruction of whether the SRU is a fixed SRU.
  • the UDM 547 may include the identification information (identifier(s)) of the LMF 551 and the instruction of the positioning of the user plane between the UE 40 and the SEF 554 in the SES subscriber data of the UE 40.
  • After executing the authorization of the external SES client 570 or AF 548 and the verification of the target UE 40, the GMSE 556 transfers the request for sensing information to the serving AMF 541 using the Namf interface. Alternatively, when the target UE 40 is a roaming UE, the GMSE 556 transfers the sensing information request to the GMSE 556 of another PLMN by using the Ngmse interface.
  • the configuration of the privacy profile of the target UE 40 may be checked at the home PLMN of the UE 40 before providing an estimate of the sensing information.
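The GMSE routing behavior described above can be sketched as follows. This is an illustrative, non-normative sketch: the function and field names (for example, route_sensing_request) are assumptions, not defined by any specification.

```python
from dataclasses import dataclass

@dataclass
class SensingRequest:
    target_ue: str
    home_plmn: str  # home PLMN of the target UE

def route_sensing_request(req: SensingRequest, local_plmn: str,
                          privacy_allows: bool) -> str:
    """Decide where an authorized sensing request is forwarded."""
    # The privacy profile of the target UE is checked at its home PLMN
    # before any estimate of the sensing information is provided.
    if not privacy_allows:
        return "rejected"
    if req.home_plmn == local_plmn:
        # Non-roaming case: forward to the serving AMF (Namf interface).
        return "forward_to_serving_amf"
    # Roaming case: forward to the GMSE of the other PLMN (Ngmse interface).
    return "forward_to_peer_gmse"
```

A non-roaming request is thus handed to the serving AMF, while a roaming request is relayed between GMSEs.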
  • the SEF 554 performs overall coordination and schedule management of resources required in the UE 40 registered in or accessing the core network CN. Furthermore, the SEF 554 can also calculate or verify an estimate of the final sensing result and can estimate the achievable accuracy.
  • the SEF 554 can directly report the estimate of the sensing result of the target UE 40 at a periodic timing or an event-triggered timing to the GMSE 556.
  • the SEF 554 receives a request for sensing information for the target UE 40 from the serving AMF 541 using the Nlmf interface.
  • the SEF 554 interacts with the UE 40 to exchange information applicable to a terminal assisted (UE assisted) sensing method and/or a terminal based (UE based) sensing method.
  • the UE 40 may support at least any one of a terminal assisted mode, a terminal based mode, a standalone mode, and a network based mode.
  • the UE 40 receives a measurement signal for detecting a sensing result. Subsequently, the UE 40 transmits the measurement result to another entity (for example, SEF 554).
  • the UE 40 receives a measurement signal for detection of a sensing result. Subsequently, the UE 40 calculates an estimate of the sensing results using the assistance data provided by the serving PLMN.
  • the UE 40 receives a measurement signal for detecting a sensing result. Subsequently, the UE 40 calculates an estimate of the sensing result with no assistance data provided from the serving PLMN.
  • the serving PLMN receives a measurement signal for detection of sensing results transmitted from the target UE 40. Subsequently, the serving PLMN (for example, SEF 554) calculates an estimate of the sensing result.
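The four modes above differ in which side computes the estimate and whether assistance data from the serving PLMN is involved. The following is an illustrative sketch (the mode names and helper functions are assumptions for explanation only):

```python
from enum import Enum, auto

class SensingMode(Enum):
    TERMINAL_ASSISTED = auto()  # UE measures; the network (e.g. SEF) computes the estimate
    TERMINAL_BASED = auto()     # UE computes, using assistance data from the serving PLMN
    STANDALONE = auto()         # UE computes, with no assistance data from the serving PLMN
    NETWORK_BASED = auto()      # serving PLMN both measures and computes

def estimate_computed_by(mode: SensingMode) -> str:
    """Return which side calculates the estimate of the sensing result."""
    if mode in (SensingMode.TERMINAL_BASED, SensingMode.STANDALONE):
        return "UE"
    return "network"

def assistance_data_used(mode: SensingMode) -> bool:
    """Only the terminal based mode relies on assistance data from the serving PLMN."""
    return mode is SensingMode.TERMINAL_BASED
```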
  • Fig. 16 is a diagram illustrating a configuration for control and sensing result detection in a user plane.
  • the SP may be used to locate a target device (for example, UE 40) using measurements of positional relationships acquired by one or more reference sources.
  • the SP may be used as a point-to-point protocol between a sensing server (for example, SEF 554) and a target device.
  • the SP session may be used between the sensing server and the target device to acquire a measurement of a positional relationship or an estimate of a location, or to transfer assistance data.
  • One SP session may be used to support one sensing information request.
  • one SP session may be used to support one mobile terminal terminated sensing information request, one mobile terminal transmitted sensing information request, or one sensing information request by a network.
  • a plurality of SP sessions may be used between the same endpoints to support a plurality of different sensing information requests.
  • Each SP session may include one or more SP transactions.
  • each SP transaction may execute one operation (for example, at least one of exchange of capability information, transfer of assistance data, and transfer of sensing information).
  • the SP transaction indicates a transaction ID at the SP protocol level in order to associate exchanged messages (for example, a request and a response).
  • Each SP transaction involves exchange of one or more SP messages between the sensing server and the target device.
  • the SP message provides a full set of information for invoking and responding to the SP transaction.
  • the typical format of an SP message includes a series of common fields followed by a body.
  • the body may include unique information for a specific message type. Note that the body may be blank.
  • Each message type includes unique information for one or more sensing methods and/or common information for all sensing methods.
  • the common fields are as follows:
    - a transaction ID for identifying messages belonging to the same transaction
    - a transaction end flag indicating when the transaction has ended, such as in a transaction with a periodic response
    - a sequence number enabling the receiver to detect duplicated SP messages
    - an acknowledgement allowing an acknowledgement to be requested and/or returned for any SP message
  • the SP message types are as follows:
    - Request Capabilities
    - Provide Capabilities
    - Request Assistance Data
    - Provide Assistance Data
    - Request Sensing Information
    - Provide Sensing Information
    - Abort
    - Error
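The SP message layout (common fields followed by a type-specific body) can be sketched as follows. This is an illustrative data model, not a normative encoding; the class and field names are assumptions.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class SPMessageType(Enum):
    REQUEST_CAPABILITIES = "RequestCapabilities"
    PROVIDE_CAPABILITIES = "ProvideCapabilities"
    REQUEST_ASSISTANCE_DATA = "RequestAssistanceData"
    PROVIDE_ASSISTANCE_DATA = "ProvideAssistanceData"
    REQUEST_SENSING_INFORMATION = "RequestSensingInformation"
    PROVIDE_SENSING_INFORMATION = "ProvideSensingInformation"
    ABORT = "Abort"
    ERROR = "Error"

@dataclass
class SPMessage:
    # Common fields shared by all SP message types
    transaction_id: int    # associates messages of the same transaction
    end_transaction: bool  # set to TRUE on the last message of a transaction
    sequence_number: int   # lets the receiver detect duplicated messages
    ack_requested: bool    # an acknowledgement may be requested for any message
    msg_type: SPMessageType
    # Type-specific information; the body may be blank
    body: Optional[dict] = None
```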
  • the capability indicates positioning and protocol functions related to the SP, and a sensing method supported by the SP.
  • Fig. 17 is a diagram illustrating an example of SP session processing.
  • Endpoint A transmits an SP message with an initial SP transaction ID of j to endpoint B, thereby starting an SP session (Step S11B).
  • endpoint A is one of a target device and a sensing server
  • endpoint B is the other of the target device and the sensing server.
  • Endpoint A and endpoint B can continue the transaction initiated in Step S11B to further exchange messages (Step S12B).
  • Both endpoints can initiate further transactions by sending additional SP messages (Step S13B).
  • the session ends with an SP message carrying the final transaction ID, N, exchanged between the two endpoints (Step S14B).
  • All configuration messages within each transaction need to contain the same transaction ID.
  • the last message sent in each transaction needs to set Information Element (IE) of endTransaction to TRUE.
  • Concurrently occurring transactions need to use different transaction IDs. Note that the transaction ID of a completed transaction can be reused at any time after it is found that a final message of a previous transaction having the same ID has been received.
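The transaction-ID rules above (distinct IDs for concurrent transactions, reuse only after the final message of the previous transaction with the same ID) can be sketched as follows. The class name and methods are illustrative assumptions.

```python
class SPTransactionTracker:
    """Minimal sketch of the SP transaction-ID rules (illustrative only)."""

    def __init__(self) -> None:
        self.open_ids: set[int] = set()

    def start(self, tid: int) -> None:
        # Concurrently occurring transactions must use different transaction IDs.
        if tid in self.open_ids:
            raise ValueError(f"transaction ID {tid} already in use")
        self.open_ids.add(tid)

    def on_message(self, tid: int, end_transaction: bool) -> None:
        # All messages within a transaction must carry the same transaction ID.
        if tid not in self.open_ids:
            raise ValueError(f"unknown transaction ID {tid}")
        # Once the final message (endTransaction = TRUE) has been received,
        # the ID may be reused by a later transaction.
        if end_transaction:
            self.open_ids.remove(tid)
```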
  • Figs. 18A and 18B are diagrams illustrating an example of a procedure that can be supported by the SP.
  • Fig. 18A is a diagram illustrating an example of a capability information transmission procedure.
  • the sensing server transmits a RequestCapabilities message to the target device (Step S21B).
  • the sensing server may indicate a type of a required capability.
  • the target device determines whether it supports the one or more sensing methods targeted by the request, which may be included in the message.
  • the target device includes capability of the device related to the supported sensing method in a response message (ProvideCapabilities message). Subsequently, the target device sets the same value as the IE of the SP-TransactionID of the received message to the IE of the SP-TransactionID of the response message.
  • the target device transmits the ProvideCapabilities message to the sensing server (Step S22B).
  • This message needs to include an information element (IE) of endTransaction that has been set to TRUE.
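The response-building step above (echoing the SP-TransactionID and setting endTransaction to TRUE) can be sketched as follows. The dictionary-based message shape and the function name are illustrative assumptions, not a normative encoding.

```python
def handle_request_capabilities(request: dict, supported: dict) -> dict:
    """Build a ProvideCapabilities response for a RequestCapabilities message.

    `request` carries the requested sensing methods; `supported` maps each
    sensing method the device supports to its capability description.
    """
    requested_methods = request.get("methods", [])
    # Include capabilities only for the requested methods the device supports.
    caps = {m: supported[m] for m in requested_methods if m in supported}
    return {
        # The response echoes the SP-TransactionID of the received message.
        "SP-TransactionID": request["SP-TransactionID"],
        # The last (here, only) message of the transaction sets endTransaction.
        "endTransaction": True,
        "capabilities": caps,
    }
```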
  • Fig. 18B is a diagram illustrating another example of the capability information transmission procedure.
  • the target device starts the transmission procedure of the ProvideCapabilities message
  • the target device sets a corresponding IE so as to include the capability of the device in the message for each sensing method indicating the capability.
  • the target device transmits the ProvideCapabilities message to the sensing server (Step S31B).
  • This message needs to include the IE of endTransaction set to TRUE. This procedure makes it possible for the target device to indicate to the sensing server capabilities that the sensing server has not requested.
  • the SP provides procedures for the transfer and indication of assistance data and of information regarding location.
  • the information regarding location is measurement data regarding the location and/or an estimate of the location.
  • a sensing method is specified by Sensing Method IEs.
  • the sensing methods supported by the SP include, for example, Observed Time Difference Of Arrival (OTDOA), A-GNSS, TBS, sensor based, WLAN, Bluetooth based, NR E-CID, NR DL-TDOA, NR DL-AoD, and NR Multi-RTT sensing.
  • the sensing server can provide assistance data for terminal based and/or terminal assisted A-GNSS by using the IE of A-GNSS-ProvideAssistanceData.
  • the target device can request the GNSS assistance data from the sensing server by using an IE of A-GNSS-RequestAssistanceData.
  • the target device can provide sensing measurements (for example, pseudo ranges, sensing estimates, and velocity) together with time information to the sensing server by using an IE of A-GNSS-ProvideSensingInformation.
  • the target device can provide GNSS signal measurement information to the sensing server by using an IE of GNSS-SignalMeasurementInformation.
  • the target device can provide GNSS network time association to the sensing server.
  • This information includes a code phase, Doppler, and C/N.
  • this information optionally includes a cumulative carrier phase which is also referred to as an Accumulated DeltaRange (ADR).
  • the sensing server can calculate the location of the target device by the terminal assisted GNSS approach.
  • the sensing server can request the sensing information from the target device that uses the GNSS by using an IE of A-GNSS-RequestSensingInformation.
  • the sensing server can provide a measurement instruction of the GNSS to the target device by using an IE of GNSS-SensingInstructions.
  • the IE of A-GNSS-Provide-Capabilities may be used to indicate a capability that the target device supports A-GNSS.
  • the target device can provide A-GNSS Sensing capabilities (for example, supported GNSS and assistance data) to the sensing server.
  • the sensing server can request A-GNSS sensing capabilities from the target device by using the IE of A-GNSS-Request-Capabilities.
  • In Terrestrial Beacon System (TBS) Sensing, the target device can provide TBS sensing measurements to the sensing server by using an IE of TBS-ProvideSensingInformation.
  • the sensing server can request the information regarding the location for the TBS-based approach from the target device by using an IE of TBS-RequestSensingInformation.
  • the IE of TBS-ProvideCapabilities may be used to indicate a capability of the target device to support the TBS.
  • the target device can provide TBS sensing capabilities to the sensing server.
  • the sensing server can request the TBS sensing capabilities from the target device by using the IE of TBS-RequestCapabilities.
  • the sensing server can provide assistance data to the target device to assist in localization and/or assistance data to facilitate acquisition of the TBS signal by using the IE of TBS-ProvideAssistanceData.
  • the target device can request the TBS assistance data from the sensing server by using the IE of TBS-RequestAssistanceData.
  • a target device can provide sensing information for a sensor-based approach to a sensing server by using an IE of Sensor-ProvideSensingInformation.
  • the target device can provide UE sensor measurements to the sensing server by using the IE of Sensor-MeasurementInformation.
  • the sensing server can request information regarding the location from the target device for the sensor-based approach by using an IE of Sensor-RequestSensingInformation.
  • the target device can provide the capability for the sensor-based approach to the sensing server by using the IE of Sensor-ProvideCapabilities.
  • the sensing server can request capabilities from the target device for the sensor-based approach using the IE of Sensor-RequestCapabilities.
  • the sensing server can provide assistance data to the target device to assist in advanced computation at the UE by using the IE of Sensor-ProvideAssistanceData.
  • the sensing server can provide assistance data to the target device to assist in advanced computation at the UE in the terminal based mode.
  • the target device can request sensor assistance data from the sensing server by using the IE of Sensor-RequestAssistanceData.
  • the target device can provide measurement results for one or more WLANs to the sensing server by using an IE of WLAN-ProvideSensingInformation.
  • the sensing server can request WLAN measurements from the target device by using an IE of WLAN-RequestSensingInformation.
  • the target device can provide a capability for WLAN Sensing to the sensing server by using the IE of WLAN-ProvideCapabilities.
  • the sensing server can request WLAN sensing capabilities information from the target device by using the IE of WLAN-RequestCapabilities.
  • the sensing server can provide assistance data to the target device to enable terminal based and/or terminal assisted WLAN sensing by using the IE of WLAN-ProvideAssistanceData.
  • the target device can request the WLAN assistance data from the sensing server by using the IE of WLAN-RequestAssistanceData.
  • In Bluetooth-based Sensing, a target device can provide measurement results for one or more Bluetooth beacons to the sensing server by using an IE of BT-ProvideSensingInformation.
  • the sensing server can request Bluetooth measurements from the target device by using an IE of BT-RequestSensingInformation.
  • the target device can provide a capability for Bluetooth Sensing to the sensing server by using the IE of BT-ProvideCapabilities.
  • the sensing server can request Bluetooth sensing capabilities from the target device by using the IE of BT-RequestCapabilities.
  • the IE of NR-UL-ProvideCapabilities is used by a target device to indicate the capability of supporting UL Sounding Reference Signals (UL SRS) for sensing to a sensing server and to provide the UL SRS for the sensing capability to the sensing server.
  • the IE of the NR-UL-RequestCapabilities is used by the sensing server to request a capability of the target device to support the UL SRS for sensing and to request the UL SRS for sensing from the target device.
  • a target device can provide NR E-CID Sensing measurements to a sensing server by using an IE of NR-ECID-ProvideSensingInformation.
  • the target device can provide the NR E-CID measurements to the sensing server by using an IE of NR-ECID-SignalMeasurementInformation.
  • the sensing server can request NR E-CID Sensing measurements from the target device by using an IE of NR-ECID-RequestSensingInformation.
  • the IE of NR-ECID-ProvideCapabilities may be used to indicate capabilities (NR E-CID Sensing capabilities) of the target device to support NR E-CID.
  • the target device can provide the NR E-CID Sensing capabilities to the sensing server.
  • the sensing server requests a capability of a target device to support NR E-CID by using an IE of NR-ECID-RequestCapabilities.
  • the sensing server requests E-CID Sensing capabilities from the target device by using an IE of NR-ECID-RequestCapabilities.
  • In NR DL-TDOA Sensing, the sensing server can provide assistance data for terminal assisted and/or terminal based NR DL-TDOA by using the IE of NR-DL-TDOA-ProvideAssistanceData.
  • the target device can request the assistance data from the sensing server by using the IE of NR-DL-TDOA-RequestAssistanceData.
  • the target device can provide the NR DL-TDOA Sensing measurements to the sensing server by using an IE of NR-DL-TDOA-ProvideSensingInformation.
  • the target device can provide the NR DL-TDOA measurements to the sensing server by using an IE of NR-DL-TDOA-SignalMeasurementInformation.
  • the sensing information obtained by the NR DL-TDOA can be provided to the sensing server.
  • the target device includes the information of the IE of NR-DL-TDOA-SensingInformation in the information to be provided.
  • the sensing server requests NR DL-TDOA Sensing measurements from the target device by using an IE of NR-DL-TDOA-RequestSensingInformation.
  • the IE of the NR-DL-TDOA-ProvideCapabilities may be used to indicate a capability of the target device to support the NR DL-TDOA.
  • the target device can provide NR DL-TDOA Sensing capabilities to the sensing server.
  • the IE of the NR-DL-TDOA-MeasurementCapability can be included only when the DL-TDOA measurement capability is defined and the target device supports NR-DL-PRS-ResourcesCapability for DL-TDOA.
  • the IE of the NR-DL-TDOA-RequestCapabilities may be used by the sensing server to request a capability of a target device to support the NR DL-TDOA.
  • the IE of NR-DL-TDOA-RequestCapabilities may be used by the sensing server to request NR DL-TDOA Sensing capabilities from the target device.
  • the sensing server can provide assistance data for terminal assisted and terminal based NR DL-AoD by using the IE of the NR-DL-AoD-ProvideAssistanceData.
  • the target device can request the assistance data from the sensing server by using the IE of NR-DL-AoD-RequestAssistanceData.
  • the target device can provide the NR DL-AoD Sensing measurements to the sensing server by using an IE of NR-DL-AoD-ProvideSensingInformation.
  • the target device can provide the NR DL-AoD measurements to the sensing server by using an IE of NR-DL-AoD-SignalMeasurementInformation.
  • the target device can provide an IE of NR-DL-AoD-SensingInformation to the sensing server.
  • the sensing server can request NR DL-AoD Sensing measurements from the target device by using an IE of NR-DL-AoD-RequestSensingInformation.
  • the IE of the NR-DL-AoD-ProvideCapabilities may be used to indicate the capability of the target device to support the NR DL-AoD.
  • the target device can provide the NR DL-AoD Sensing capabilities to the sensing server by using the IE of the NR-DL-AoD-ProvideCapabilities.
  • the IE of the NR-DL-AoD-MeasurementCapability can be included only when the DL-AoD measurement capability is defined and the target device supports NR-DL-PRS-ResourcesCapability for DL-AoD.
  • the IE of the NR-DL-AoD-RequestCapabilities may be used by the sensing server to request a capability of a target device to support the NR DL-AoD.
  • the IE of the NR-DL-AoD-RequestCapabilities may be used by the sensing server to request the NR DL-AoD Sensing capabilities from the target device.
  • In NR Multi-RTT Sensing, the sensing server can provide assistance data for terminal assisted NR Multi-RTT by using the IE of NR-Multi-RTT-ProvideAssistanceData.
  • the target device can request the assistance data from the sensing server by using the IE of NR-Multi-RTT-RequestAssistanceData.
  • the target device can provide NR Multi-RTT Sensing measurements to the sensing server by using an IE of NR-Multi-RTT-ProvideSensingInformation.
  • the target device can provide the NR Multi-RTT measurements to the sensing server by using an IE of NR-Multi-RTT-SignalMeasurementInformation.
  • the sensing server requests NR Multi-RTT Sensing measurements from the target device by using an IE of NR-Multi-RTT-RequestSensingInformation.
  • the IE of NR-Multi-RTT-ProvideCapabilities may be used to indicate a capability of the target device to support NR Multi-RTT.
  • the target device can provide the NR Multi-RTT Sensing capabilities to the sensing server.
  • the IE of NR-Multi-RTT-MeasurementCapability can be included only when the Multi-RTT measurement capability is defined and the target device supports NR-DL-PRS-ResourcesCapability for Multi-RTT.
  • the IE of the NR-Multi-RTT-RequestCapabilities may be used by the sensing server to request a capability of a target device to support NR Multi-RTT.
  • the IE of the NR-Multi-RTT-RequestCapabilities may be used by the sensing server to request the NR Multi-RTT Sensing capabilities from the target device.
  • the NSPPa sensing information transfer procedure may include a transfer procedure of information related to sensing between the RAN/AN 510 and the SEF 554.
  • the NSPPa management procedure may include a procedure not related to sensing (for example, processing at the time of an error).
  • Sensing information transfer for E-CID and NR E-CID sensing
    - E-CID Measurement Initiation procedure
    - E-CID Measurement Failure Indication Procedure
    - E-CID Measurement Report procedure
    - E-CID Measurement Termination procedure
  • Information transfer for Observed Time Difference of Arrival (OTDOA) sensing
    - OTDOA Information Exchange procedure
  • Typical error status report
    - Error Indication procedure
  • Transfer assistance information
    - Assistance Information Control procedure
    - Assistance Information Feedback procedure
  • Transfer sensing information
    - Sensing Information Exchange procedure
    - Sensing Information Update procedure
    - Sensing Activation procedure
    - Sensing Deactivation procedure
  • Transfer measurement information
    - Measurement procedure
    - Measurement Update procedure
    - Measurement Report procedure
    - Measurement Abort procedure
    - Measurement Failure Indication procedure
  • Transfer of Transmission-Reception Point (TRP) information
    - TRP Information Exchange procedure
  • Transfer of Sensing Reference Signal (SRS) information
    - SRS Configuration Exchange procedure
  • Transfer Measurement Pre
  • Figs. 19A and 19B are diagrams illustrating an example of a procedure that can be supported by the NSPPa of the present embodiment.
  • Fig. 19A is a diagram illustrating a sensing information exchange procedure.
  • the SEF 554 transmits a SENSING INFORMATION REQUEST message to the RAN/AN 510 (Step S41B).
  • Having received the SENSING INFORMATION REQUEST message, the NG-RAN node (for example, RAN/AN 510) performs necessary setting, and then transmits a SENSING INFORMATION RESPONSE message to the SEF 554 (Step S42B).
  • the NG-RAN node can consider the information of the IE when configuring transmission of a sounding reference signal (SRS) to the UE 40.
  • the NG-RAN node may include the IE of the SRS configuration and the IE of the SFN (System Frame Number) Initialisation Time in the SENSING INFORMATION RESPONSE message.
  • Fig. 19B is a diagram illustrating a measurement procedure.
  • the SEF 554 transmits a MEASUREMENT REQUEST message to the NG-RAN node (for example, RAN/AN 510) (Step S51B).
  • SEF 554 can include an IE of a TRP Measurement Request List in a MEASUREMENT REQUEST message.
  • the IE of the TRP Measurement Request List indicates a TRP at which measurement is requested.
  • the NG-RAN node configures measurement by the indicated TRP according to the information included in the message.
  • the NG-RAN node responds with a MEASUREMENT RESPONSE message (Step S52B).
  • the MEASUREMENT RESPONSE message may include an IE of a TRP Measurement Response List.
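The MEASUREMENT REQUEST / MEASUREMENT RESPONSE exchange above can be sketched as follows. This is an illustrative sketch only: the function name, the dictionary keys, and the `measure` callback are assumptions, not NSPPa message definitions.

```python
from typing import Callable, List

def run_measurement_procedure(trp_request_list: List[str],
                              measure: Callable[[str], dict]) -> dict:
    """Sketch of the NG-RAN node side of the measurement procedure.

    `trp_request_list` plays the role of the TRP Measurement Request List IE;
    `measure` is a hypothetical callback performing the measurement at one TRP.
    """
    # The NG-RAN node configures measurement at each indicated TRP ...
    response_list = [{"trp_id": trp, "result": measure(trp)}
                     for trp in trp_request_list]
    # ... and responds with a MEASUREMENT RESPONSE carrying the
    # TRP Measurement Response List.
    return {"TRP-Measurement-Response-List": response_list}
```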
  • the sensor used in the processing related to the sensing service is one or a plurality of sensors included in the terminal apparatus 40 (for example, one or a plurality of sensors included in the sensor unit 46).
  • the sensor used in the processing related to the sensing service is not limited to one or a plurality of sensors included in the terminal apparatus 40.
  • the sensor used in the processing related to the sensing service may be one or a plurality of sensors included in the base station 30 (for example, one or a plurality of sensors included in the sensor unit 34).
  • the description of the terminal apparatus 40 in the following description can be replaced with a communication apparatus (for example, the base station 30) as appropriate.
  • When permitting access to the detection data of the sensor corresponding to D001, the sensor management entity (for example, the terminal apparatus 40 including the sensor or the user of the terminal apparatus 40) sets the permission information (access privilege) to "enable". This enables the service providing entity or the service utilization entity to access the detection data. In contrast, when denying access to the detection data of the sensor corresponding to D002, the sensor management entity sets the permission information (access privilege) to "disable". This prevents the service providing entity or the service utilization entity from accessing the detection data.
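As an illustrative, non-normative sketch, the enable/disable settings above can be represented as a table keyed by sensor ID (the function name and table layout are assumptions; D001 and D002 are the sensor IDs from the text):

```python
# Access-privilege table keyed by sensor ID, as set by the sensor
# management entity.
permission_table = {
    "D001": "enable",   # access to this sensor's detection data is permitted
    "D002": "disable",  # access to this sensor's detection data is denied
}

def can_access(sensor_id: str) -> bool:
    """Return True only when the sensor's access privilege is 'enable'.

    Unknown sensor IDs are treated as 'disable' (a conservative default
    assumed here for illustration).
    """
    return permission_table.get(sensor_id, "disable") == "enable"
```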
  • setting of permission information (access privilege) to data detected by the sensor corresponding to the sensor ID may be classified by conditions.
  • the conditions may be at least one of the following (C1) to (C7).
  • the condition for classifying the permission information may be a condition related to a contract concluded by the service utilization entity.
  • the condition may be a condition that access to detection data is permitted in a case where the service utilization entity has a contract for utilizing a first service, while access to the detection data is not permitted in a case where the service utilization entity has no contract for utilizing the first service.
  • the service utilization entity may be, for example, a user of the terminal apparatus 40.
  • the condition for classifying the permission information may be a condition related to a contract concluded by the service providing entity.
  • the condition may be a condition that access to the detection data is permitted in a case where the service providing entity has a contract for providing a second service to the service utilization entity, while access to the detection data is not permitted in a case where the service providing entity has no contract for providing the second service to the service utilization entity.
  • the service providing entity may be, for example, an administrator of the core network CN or an administrator of the server 10.
  • the condition for classifying the permission information may be a condition related to a use case of a sensing service.
  • the use case of the sensing service is, for example, a utilization purpose/utilization scene of the sensing service by the service utilization entity.
  • the use case of the sensing service may include the use case described above or below (for example, at least one of (B1) to (B26) described above).
  • the condition for classifying the permission information may be the use case itself of the sensing service.
  • the condition may be a condition that the access to the detection data is permitted in a case where the use case of the sensing service is a first use case, while the access to the detection data is not permitted in a case where the use case of the sensing service is a second use case.
  • the condition for classifying the permission information may be a condition related to utilization of the sensing service.
  • the condition for classifying the permission information may be a condition related to at least one of a utilization area and a utilization time zone of the sensing service.
  • the condition for classifying the permission information may be the utilization area itself.
  • the condition may be a condition that the access to the detection data is permitted in a case where the area where the sensing service is used is a first area, while the access to the detection data is not permitted in a case where the area where the sensing service is used is a second area.
  • the condition for classifying the permission information may be the utilization time zone itself.
  • the condition may be a condition that the access to the detection data is permitted when the time zone in which the sensing service is used is a first time zone, while the access to the detection data is not permitted when the time zone in which the sensing service is used is a second time zone.
  • the condition for classifying the permission information may be a type of information generated from detection data.
  • the condition may be a condition that, when the condition is a type of information generated in a service related to automated driving, access to detection data for high-precision three-dimensional map information is permitted, and access to detection data for information regarding a route necessary for automated driving of a mobile body, control information regarding steering, control information regarding acceleration, or control information regarding braking is not permitted.
  • the condition may be a condition that access to detection data for information regarding a route necessary for automated driving of a mobile body, control information on steering, control information regarding acceleration, or control information regarding braking is permitted, and access to detection data for high-precision three-dimensional map information is not permitted.
  • the condition for classifying the permission information (access privilege) may be a condition related to saving of the detection data.
  • the saving of the detection data may be recording of the detection data in nonvolatile memory such as an SSD or an HDD, or may be holding of the detection data for a time exceeding a set time.
  • the set time may be a specific time (for example, a specific date and time) or an elapsed time (for example, time from acquisition of detection data).
  • the detection data to be saved does not necessarily have to be exactly the detection data. A case of saving detection data converted by some processing can also be regarded as saving of the detection data.
  • the saving condition may be the presence or absence of saving.
  • the condition may be a condition that access to the detection data is permitted in a case where the entity that acquires the detection data does not save it beyond a set time, and access to the detection data is not permitted in a case where the entity saves it beyond the set time.
  • the condition for classifying the permission information may be a condition related to a disclosure recipient of the detection data.
  • the detection data to be disclosed does not have to be exactly the detection data.
  • a case of disclosing the detection data converted by some processing can also be regarded as disclosure of the detection data.
  • the condition for classifying the permission information may be the disclosure recipient itself of the detection data.
  • the condition may be a condition that access to the detection data is permitted in a case where the disclosure recipient of the detection data is a first disclosure recipient, while access to the detection data is not permitted in a case where the disclosure recipient of the detection data is a second disclosure recipient.
  • the disclosure recipient of the detection data is not limited to the direct disclosure recipient of the detection data (that is, the service utilization entity).
  • An entity that directly or indirectly acquires information related to detection data can also be regarded as a disclosure recipient of the detection data. That is, the disclosure recipient of the detection data may be an entity that directly or indirectly acquires information related to the detection data from the service utilization entity.
  • the disclosure recipient to be the condition may include, for example, at least one of identification information of a node or an entity in the system, identification information of User Equipment (UE), identification information of a Radio Access Network (RAN), identification information of a core network (for example, 5GC/NGC (5G Core/Next Generation Core)), identification information of a network function constituting the core network, identification information of an Application Server (AS), and identification information of an Edge Application Server (EAS).
  • the disclosure recipient serving as the condition may include at least one of an identifier and an address (for example, an IP address) of the apparatus.
  • the disclosure recipient serving as the condition may include, for example, any type classified by a difference in the depth of human relationship, such as the person himself/herself, family members, friends, and other persons.
  • the disclosure recipient as a condition may include a corporation, such as an operator or a service provider.
  • the disclosure recipient to be the condition may include a group or a range classifying at least one of a node, an entity, and an apparatus.
  • the group or the range can be set at any granularity.
  • the group or range may be set at a granularity of at least one of a country, a region, and an area.
  • the group or the range can be set according to the level of the subscription.
  • the condition for classifying the permission information may be a condition related to the type of data generated using detection data.
  • the condition for classifying the permission information may be exactly the type of the generated data.
  • the condition may be a condition that access to the detection data is permitted in a case where the type of the generated data is a first type, while access to the detection data is not permitted in a case where the type of the generated data is a second type.
  • the condition for classifying the permission information (access privilege) may be a condition related to an artificial intelligence (AI)/machine learning (ML) model used for the sensing service.
  • the condition may include a condition regarding use of distributed processing of an AI/ML model to which detection data of a sensor is input.
  • the condition may include a condition (first condition) related to the learning usage of the AI/ML model to which the detection data of the sensor is input.
  • the condition may include a condition (second condition) indicating whether saving of detection data of the sensor is permitted in learning of the AI/ML model in the learning usage satisfying the first condition regarding the learning usage.
  • the access privilege may be individually set for the conditions described above or below (for example, the conditions indicated in (C1) to (C7)), or the access privilege may be set for a plurality of optionally selected combinations.
  • the access privilege may be determined for each of the first condition to the third condition, or the access privilege may be determined for a combination of two or more conditions selected from the first condition to the third condition (for example, at least one of a combination of the first condition and the second condition, a combination of the first condition and the third condition, a combination of the second condition and the third condition, or a combination of the first condition, the second condition, and the third condition).
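The classification of permission information by individual conditions or by combinations of conditions described above can be sketched as a simple lookup. The following Python fragment is a non-normative illustration; the condition names and the most-specific-match rule are assumptions for the example, not part of the specification.

```python
# Hypothetical sketch of permission information (access privilege)
# classified by condition. A privilege may be set for an individual
# condition or for an optionally selected combination of conditions.

privilege_table = {
    frozenset({"saved_within_set_time"}): True,
    frozenset({"recipient_is_family"}): True,
    # Combined entry: even a family recipient is denied once the data
    # has been saved beyond the set time.
    frozenset({"saved_beyond_set_time", "recipient_is_family"}): False,
}

def access_permitted(active_conditions):
    """Return the privilege of the most specific matching entry
    (the largest matching condition combination), defaulting to deny."""
    matches = [key for key in privilege_table if key <= active_conditions]
    if not matches:
        return False
    best = max(matches, key=len)  # the most specific combination wins
    return privilege_table[best]

print(access_permitted(frozenset({"recipient_is_family"})))  # True
print(access_permitted(frozenset({"saved_beyond_set_time",
                                  "recipient_is_family"})))  # False
```

Under this sketch, a combination entry overrides the individual entries whenever all of its member conditions hold, which matches the idea that a privilege may be set for a plurality of optionally selected combinations.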
  • a configuration of terminal sensing privacy information for the terminal apparatus 40 can be saved in a sensing service privacy profile as part of terminal subscriber data within the UDM 547.
  • the sensing service privacy profile can be rephrased as a terminal sensing privacy profile, or simply a privacy profile.
  • the configuration of the sensing privacy information can be updated as necessary according to a request from the user of the terminal apparatus 40.
  • the one or plurality of sensors included in the terminal apparatus 40 include sensors of various positioning methods supported by the LPP described above.
  • the information regarding the above condition regarding the access privilege to the detection data may be managed as assistance information for the sensing privacy information.
  • This assistance information may be saved in a Unified Data Repository (UDR).
  • At least one of the terminal apparatus 40, the base station 30, and the road-side unit may include a 3GPP transceiver.
  • a 3GPP transceiver may support an RF-based sensing function, similar to a millimeter wave radar.
  • the sensor managed by the sensing privacy information described above may include an RF-based sensing function supported by the 3GPP transceiver.
  • the data detected by one or a plurality of sensors included in the terminal apparatus 40 can conceptually include data measured by one or a plurality of sensors. That is, the detection data (sensing data) described above or below can be rephrased as measurement data.
  • the information processing apparatus included in the communication system 1 according to the present embodiment executes processing related to a sensing service.
  • Fig. 21 is a sequence diagram illustrating an example of processing related to a sensing service.
  • An information processing apparatus (for example, the core network CN) included in the communication system performs processing related to a sensing service based on a request from another apparatus (for example, the server 10).
  • the sensing service is a service performed directly or indirectly using detection data of one or a plurality of sensors.
  • the processing related to the sensing service may be processing of the sensing service (processing executed by the service providing entity) itself or auxiliary processing of the processing of the sensing service (for example, processing for providing data/processing necessary for execution of the sensing service).
  • the service utilization entity may be the server 10 or the terminal apparatus 40. Needless to say, the service utilization entity may be an apparatus other than these (for example, the base station 30 or a core network CN (for example, the management apparatus 20)).
  • the service providing entity may be a core network CN (for example, the management apparatus 20) or the server 10. Needless to say, the service providing entity may be an apparatus other than these (for example, the base station 30 or the terminal apparatus 40).
  • the processing related to the sensing service is assumed to be a service of providing detection data of one or a plurality of sensors included in the communication apparatus from the service providing entity to the service utilization entity.
  • the communication apparatus including one or a plurality of sensors is the terminal apparatus 40 (UE illustrated in Fig. 21)
  • the service providing entity is the management apparatus 20 (CN illustrated in Fig. 21)
  • the service utilization entity is the server 10 (AS illustrated in Fig. 21).
  • processing related to the sensing service will be described with reference to the sequence diagram of Fig. 21.
  • the reception unit 231 of the management apparatus 20 receives a request related to the sensing service from the server 10 (Step S101).
  • the request related to the sensing service may be exactly the sensing service request, or a request made in association with execution of the processing of the sensing service.
  • the request related to the sensing service is a sensing service request (detection data transmission request).
  • the request related to the sensing service may include identification information for identifying a use case of the sensing service.
  • the acquisition unit 232 of the management apparatus 20 acquires information related to access to detection data of one or a plurality of sensors (Step S102).
  • the information related to the access may be stored in the sensing privacy information.
  • the sensing privacy information may be regarded as information related to access.
  • the management apparatus 20 may acquire the information related to access, from the terminal apparatus 40 including one or a plurality of sensors.
  • the request unit 235 of the management apparatus 20 requests information related to access from the terminal apparatus 40.
  • the notification unit 431 of the terminal apparatus 40 transmits information related to access to the management apparatus 20.
  • the management apparatus 20 may acquire information related to access from each of the plurality of terminal apparatuses 40.
  • the management apparatus 20 may hold the information related to access in the storage unit 22 in advance and acquire the information related to access from the storage unit 22. Furthermore, information related to access may be acquired from an apparatus other than the terminal apparatus 40 including one or a plurality of sensors. For example, the management apparatus 20 may acquire information related to access from the server 10 or the another terminal apparatus 40.
  • the management apparatus 20 determines a sensor that is a target of access to the detection data. For example, the management apparatus 20 determines the target sensor based on a use case of the sensing service. In a case where the request related to the sensing service includes identification information for identifying a use case of the sensing service, the management apparatus 20 may determine the use case of the sensing service based on the identification information.
  • the management apparatus 20 determines whether the access to the detection data of the target sensor is permitted based on the information related to the access.
  • the information related to access includes permission information indicating whether to permit access to the detection data of the sensor.
  • the management apparatus 20 may determine whether access to the detection data of the target sensor is permitted based on the permission information.
  • the generation unit 233 of the management apparatus 20 may generate a list of sensors for which access to detection data is permitted based on the information related to access. Subsequently, the management apparatus 20 may determine whether access to the detection data is permitted based on the generated list.
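The generation of a list of sensors for which access to detection data is permitted can be sketched as follows. The dictionary field names and sensor identifiers below are illustrative assumptions; the embodiment does not prescribe a concrete data model for the sensing privacy information.

```python
# Hypothetical sketch: building a permission list from sensing privacy
# information that stores a per-sensor permission flag (cf. Fig. 20A/20B).

sensing_privacy_info = {
    "gnss":   {"access_permitted": True},
    "camera": {"access_permitted": False},
    "lidar":  {"access_permitted": True},
}

# Sensors required by the use case of the sensing service.
target_sensors = ["gnss", "camera", "imu"]

def build_permission_list(privacy_info, targets):
    """Keep only target sensors whose detection data may be accessed;
    sensors absent from the privacy information are treated as denied."""
    return [s for s in targets
            if privacy_info.get(s, {}).get("access_permitted", False)]

permission_list = build_permission_list(sensing_privacy_info, target_sensors)
print(permission_list)  # ['gnss']
```

Treating a sensor with no privacy entry as denied is a conservative default chosen for this sketch; a deployment could equally fall back to a user judgment request as described above.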
  • the information related to access includes permission information indicating whether to permit access to the detection data of the sensor.
  • This permission information may be classified according to conditions, for example, as illustrated in Fig. 20B.
  • the condition may be at least one of (C1) to (C7) described above.
  • the management apparatus 20 may determine whether access to the detection data is permitted based on the permission information classified by the condition.
  • the management apparatus 20 may determine whether the access to the detection data is permitted based on a judgment result of the user rather than the information related to access. At this time, the request unit 235 of the management apparatus 20 may request the user of the terminal apparatus 40 to make a judgment related to permission of access to the detection data of the sensor via the terminal apparatus 40. More specifically, the request unit 235 of the management apparatus 20 may transmit, to the terminal apparatus 40, an instruction to request the user of the terminal apparatus 40 to make a judgment related to permission of access to the detection data of the sensor.
  • the management apparatus 20 may transmit the instruction of the judgment request together with condition information (for example, information on the detection data use condition).
  • condition information may be information corresponding to the condition indicated in at least one of (C1) to (C7) described above.
  • the management apparatus 20 may transmit the instruction together with information related to a validity period or an expiration point set in the detection data.
  • the validity period or the expiration point may be a period or a point designated according to the use case of the sensing service.
  • the reception unit 432 of the terminal apparatus 40 receives the judgment request to the user from the management apparatus 20. Subsequently, the terminal apparatus 40 sends an inquiry to the user as to whether to permit access to the detection data. In a case where the instruction has been received together with the information of the condition, the terminal apparatus 40 may send the inquiry to the user together with the information of the condition. In a case where the instruction has been received together with the information related to the validity period or the expiration point, the terminal apparatus 40 may send the inquiry to the user together with that information. Subsequently, the notification unit 431 of the terminal apparatus 40 sends a notification of the judgment result of the user to the management apparatus 20.
  • the reception unit 432 of the terminal apparatus 40 receives the request for the management information from the management apparatus 20. Subsequently, the notification unit 431 of the terminal apparatus 40 sends a notification of the management information to the management apparatus 20. When having received the information of the condition together with the request for the management information, the notification unit 431 of the terminal apparatus 40 may send a notification of the management information corresponding to the information of the condition to the management apparatus 20.
  • the management apparatus 20 determines whether access to the detection data is permitted based on the management information. Note that, in a case where the update unit 236 of the management apparatus 20 has acquired the management information from the terminal apparatus 40, the information related to the access held as a part of the subscriber information may be updated with the judgment result of the user.
  • the request unit 235 of the management apparatus 20 requests the terminal apparatus 40 to transmit the detection data for which access has been determined to be permitted (Step S103).
  • the transmission unit 433 of the terminal apparatus 40 transmits the detection data regarding the access request to the management apparatus 20.
  • the determination unit 434 of the terminal apparatus 40 may determine whether the access to the detection data related to the access request is permitted. Subsequently, in a case where the access is permitted, the detection data related to the access request may be transmitted to the management apparatus 20.
  • the determination unit 434 of the terminal apparatus 40 may determine permission or non-permission of access to the detection data of the sensor related to the access request based on permission information indicating whether to permit access to the detection data.
  • the permission information may be similar to the permission information included in the above-described information related to access.
  • the permission information may be classified according to conditions, for example, as illustrated in Fig. 20B. At this time, the condition may be at least one of (C1) to (C7) described above.
  • the determination unit 434 of the terminal apparatus 40 may determine permission or non-permission of access to the detection data based on permission information classified by conditions.
  • the access request may include information regarding a condition designated by the management apparatus 20.
  • the condition information may be information corresponding to the condition indicated in at least one of (C1) to (C7) described above.
  • the terminal apparatus 40 may determine permission or non-permission of access to the detection data based on condition information included in the access request.
  • in a case where, in Step S102, the notification unit 431 of the terminal apparatus 40 has sent a notification of the judgment result of the user to the management apparatus 20, the determination unit 434 of the terminal apparatus 40 may determine permission or non-permission of access based on the judgment result of the user rather than the permission information.
  • the acquisition unit 232 of the management apparatus 20 acquires detection data from the terminal apparatus 40 (Step S104).
  • the detection data acquired by the management apparatus 20 from the terminal apparatus 40 need not be exactly the data detected by one or a plurality of sensors of the terminal apparatus 40.
  • the detection data may be data obtained by performing predetermined processing on data detected by one or a plurality of sensors.
  • the detection data may be data obtained by converting data detected by one or a plurality of sensors into a predetermined format, or may be data obtained by performing encryption processing on data detected by one or a plurality of sensors.
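The point that the reported "detection data" may be raw sensor output after predetermined processing, such as conversion to a common format or encryption processing, can be illustrated as follows. The JSON format and the keyed SHA-256 tag below are assumptions made for the sketch (the tag stands in for real encryption/integrity protection), not processing prescribed by the embodiment.

```python
# Hypothetical sketch: converting raw sensor samples into a common
# format and attaching a keyed integrity tag before reporting them as
# "detection data". Names and formats are illustrative assumptions.

import json
import hashlib

def prepare_detection_data(sensor_id, raw_samples, key=b"shared-secret"):
    """Convert raw samples to a canonical JSON payload and attach a
    keyed hash tag (a stand-in for real encryption processing)."""
    payload = json.dumps({"sensor": sensor_id, "samples": raw_samples},
                         sort_keys=True).encode()
    tag = hashlib.sha256(key + payload).hexdigest()
    return {"payload": payload, "tag": tag}

msg = prepare_detection_data("gnss", [35.6, 139.7])
print(len(msg["tag"]))  # 64 (hex digest length of SHA-256)
```

Because the conversion is deterministic, the same samples always yield the same tag, which also makes duplicate reports easy to detect on the receiving side.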
  • the management apparatus 20 transmits information related to detection data to the server 10 (Step S105).
  • the information related to the detection data may be exactly the detection data acquired from the terminal apparatus 40 or may be information generated based on the acquired detection data.
  • the information related to the detection data may be analysis data generated based on one or a plurality of pieces of detection data, or may be combined data of a plurality of pieces of detection data.
  • the information related to the detection data may be control information for a sensing service or for use case of the sensing service (for example, control information of a mobile body for an automated driving service).
  • Generation of the information related to the detection data may use an artificial intelligence/machine learning model (AI/ML model).
  • the information related to the detection data may be data generated by inputting one or a plurality of pieces of detection data to the AI/ML model.
  • Fig. 22 is a sequence diagram illustrating an example of processing of a sensing service.
  • Fig. 22 illustrates processing in which the service providing entity acquires detection data of one or a plurality of sensors included in the terminal apparatus 40 as data related to the sensing service.
  • not all of the processing is required for implementation of the invention. In other words, each piece of processing in Fig. 22 can be performed independently.
  • the sensing service may be a service that provides detection data to a service utilization entity.
  • the data related to the sensing service may be sensor detection data itself or may be data generated by performing predetermined processing on one or a plurality of pieces of detection data.
  • the sensing service is assumed to be a service that provides detection data to the service utilization entity.
  • the service utilization entity may be the terminal apparatus 40 or the server 10.
  • the service utilization entity may be an application in the terminal apparatus 40 or an application in the server 10.
  • the service utilization entity may be the AF 548 prepared for the application in the apparatus (for example, the AF 548 prepared for the application of the terminal apparatus 40 and/or the server 10).
  • the service providing entity may be the core network CN.
  • the service providing entity may be an information processing apparatus (for example, the management apparatus 20) having at least one function of the AMF 541, the SEF 554, the UDM 547, and the NEF 542.
  • the service utilization entity is supposed to be an application of the terminal apparatus 40 and/or the server 10, and the service providing entity is supposed to be the core network CN.
  • the processing of the sensing service will be described below with reference to Fig. 22.
  • When having received a request for a sensing service from an application of the terminal apparatus 40 and/or the server 10, the AF 548 specifies a use case of the sensing service. Subsequently, the AF 548 transmits a Nnef_EventExposure_Subscribe request message for the sensing service to the NEF 542 (Step S201).
  • the Nnef_EventExposure_Subscribe request message may be regarded as a request for the sensing service.
  • the NEF 542 checks whether the AF 548 is an entity having authority to access the network function of the core network CN. In a case where the AF 548 has the access authority, the NEF 542 transmits an Nsef_Sensing_ProvideSensingInfo request message to the SEF 554 according to the Nnef_EventExposure_Subscribe request for the sensing service received from the AF 548 (Step S202). Note that, in a case where the AF 548 can directly access the SEF 554, the AF 548 may transmit a sensing service request for using a sensing service or an Nsef_Sensing_ProvideSensingInfo request message to the SEF 554.
  • the SEF 554 specifies a sensor that detects detection data necessary for the use case (hereinafter, referred to as a target sensor) based on the use case of the sensing service included in the Nsef_Sensing_ProvideSensingInfo request message (Step S203).
  • the detection data is data detected by one or a plurality of sensors included in the terminal apparatus 40.
  • the detection data may be referred to as sensing data.
  • the detection data need not be exactly the data generated by the sensor. Data obtained by converting the detection data through some processing can also be regarded as the detection data of the present embodiment.
  • the processing of specifying the target sensor (hereinafter, the processing is referred to as target sensor specifying processing) will be described in detail below.
  • the SEF 554 identifies the terminal apparatus 40 including the target sensor by, for example, a Generic Public Subscription Identifier (GPSI) or a Subscription Permanent Identifier (SUPI). Subsequently, the SEF 554 starts the Nudm_SDM_Get service for the UDM 547 in order to acquire the sensing privacy information of the target terminal apparatus 40 (Step S204).
  • the SEF 554 checks privacy of the target sensor according to the acquired sensing privacy information (Step S205). Subsequently, the SEF 554 generates a permission list including information of the sensor that is permitted to access the detection data based on the check result.
  • the processing of generating the permission list (hereinafter, it is referred to as permission list generation processing) will be described below in detail.
  • the SEF 554 may acquire information indicating the type of sensor necessary for each use case from the UDM 547.
  • in a case where the use case is an ADAS of a vehicle, examples of the types of required sensors include a GNSS sensor, an acceleration sensor, an inertial measurement device including a gyro sensor, an image sensor/camera, LiDAR, and a millimeter wave radar.
  • the SEF 554 may select a sensor to be checked for privacy from among all the sensors included in the terminal apparatus 40 based on information indicating the type of sensor needed for the use case.
  • the information related to the sensor provided in the terminal apparatus 40 can be grasped from the acquired sensing privacy information.
  • Information indicating the type of sensor necessary for each use case may be set in advance as a policy rule and saved in a Unified Data Repository (UDR).
  • the PCF 545 may receive information regarding the type of sensor needed for the requested use case from the AF 548 and generate a service-provider-specific policy rule. Furthermore, the PCF 545 may save the generated service-provider-specific policy rule in the UDR.
  • the SEF 554 can transmit, to the PCF 545, a message requesting a policy including information for identifying a use case, for example, an ID, and can acquire information indicating a type of a necessary sensor via the PCF 545.
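The policy rule mapping a use case to the sensor types it requires, as could be pre-set and saved in the UDR, can be sketched as follows. The use-case identifiers and sensor type names are illustrative assumptions for this example.

```python
# Hypothetical sketch of a policy rule that maps a use-case ID to the
# sensor types needed for that use case (cf. the ADAS example above).

policy_rules = {
    "adas": ["gnss", "acceleration", "gyro", "camera", "lidar",
             "mmwave_radar"],
    "weather_map": ["barometer", "thermometer"],
}

def sensors_to_check(use_case, terminal_sensors):
    """Select, from the sensors the terminal actually has, those whose
    privacy must be checked for the requested use case."""
    required = set(policy_rules.get(use_case, []))
    return sorted(required & set(terminal_sensors))

print(sensors_to_check("adas", {"gnss", "camera", "barometer"}))
# ['camera', 'gnss']
```

Intersecting the required types with the terminal's own sensor list mirrors the description above: the SEF selects the sensors to be privacy-checked from among all sensors included in the terminal apparatus.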
  • the SEF 554 may acquire, in Step S204, the information regarding the condition related to the access privilege as the assistance information in addition to the sensing privacy information of the target terminal apparatus 40.
  • the SEF 554 starts the Nudm_UECM_Get service for the UDM 547.
  • the UDM 547 transmits the network address of the serving AMF 541 of the terminal apparatus 40 identified by the GPSI or the SUPI to the SEF 554 (Step S206).
  • the SEF 554 transmits a Namf_Sensing_ProvideSensingInfo request to the serving AMF 541 according to the network address acquired in Step S206 (Step S207).
  • the SEF 554 may include information regarding frequency (for example, setting of a period or an event to be triggered) of acquiring detection data (sensing data) from the target sensor in the Namf_Sensing_ProvideSensingInfo request.
  • the frequency of acquiring the detection data may be requested from the AF 548 via the Nnef_EventExposure_Subscribe request message for the sensing service in Step S201.
  • the frequency of acquiring the detection data may be set based on a policy rule for a sensing service managed by the PCF 545.
  • the serving AMF 541 starts the procedure of the Network triggered Service request in order to perform processing of the sensing service (Step S208).
  • in a case where the Namf_Sensing_ProvideSensingInfo request is a request for provision of detection data from a plurality of terminal apparatuses 40, a procedure of a Network triggered Service request is started for the plurality of terminal apparatuses 40.
  • the plurality of terminal apparatuses 40 may be set as a group.
  • the serving AMF 541 may start the Network triggered Service request procedure for this group that has been set.
  • the serving AMF 541 transmits, to the terminal apparatus 40, the NAS message to start the notification processing of the detection data of the target sensor (Step S209).
  • the NAS message may include an instruction to request explicit permission from the user for access to the detection data of each sensor, or an instruction to request local setting of the terminal apparatus 40 regarding the access privilege to the detection data.
  • the NAS message may indicate a condition for access to detection data of each sensor.
  • the NAS message may include an instruction to request explicit permission from the user under the condition or an instruction to request local setting of the terminal apparatus 40 under the condition.
  • in a case where the Namf_Sensing_ProvideSensingInfo request includes information regarding the frequency of acquiring detection data (sensing data) from the target sensor, the serving AMF 541 can include that information in the NAS message that starts the notification of the detection data (sensing data).
  • When having acquired the information regarding the frequency of acquiring detection data (sensing data), the terminal apparatus 40 sets the frequency of data detection or measurement for each sensor.
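Applying the acquisition frequency carried in the NAS message, either as a reporting period or as a trigger event, to each target sensor can be sketched as follows. The configuration field names are illustrative assumptions for this example.

```python
# Hypothetical sketch: the terminal applies the per-sensor acquisition
# frequency (a period or a triggered event) received from the network.

def apply_acquisition_config(sensors, config):
    """Set the reporting period (seconds) or trigger event per sensor;
    sensors without an explicit entry fall back to the default period."""
    default = config.get("default_period_s", 60)
    settings = {}
    for s in sensors:
        entry = config.get("per_sensor", {}).get(s, {})
        settings[s] = {"period_s": entry.get("period_s", default),
                       "event": entry.get("event")}
    return settings

cfg = {"default_period_s": 30,
       "per_sensor": {"camera": {"event": "motion_detected"}}}
print(apply_acquisition_config(["gnss", "camera"], cfg))
```

Modeling the frequency as "period or event" reflects the description above, where the request may carry a setting of a period or an event to be triggered.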
  • the terminal apparatus 40 transmits a NAS message as a response to the serving AMF 541 as necessary according to the received NAS message (Step S210).
  • the NAS message to be a response may include explicit permission from the user of the terminal apparatus 40 or a local setting of the terminal apparatus 40 (setting of an access privilege to detection data of each sensor).
  • the serving AMF 541 starts Nudm_ParameterProvision_Update for sensing privacy information in order to save the setting of the access privilege received from the terminal apparatus 40 in the UDM 547 (Step S211).
  • the UDM 547 saves the updated access privilege setting in the UDR as part of the terminal subscriber data (sensing privacy information).
  • the UDM 547 updates the sensing privacy information saved in the UDR with the updated access privilege setting.
  • the serving AMF 541 transmits a Namf_Sensing_ProvideSensingInfo response message to the SEF 554 as a response to the Namf_Sensing_ProvideSensingInfo request in Step S207 (Step S212).
  • the serving AMF 541 may use this response message to inform the SEF 554 of the information of the sensors (for example, sensor ID) permitted to access the detection data.
  • When having accepted the access requested in Step S209, the terminal apparatus 40 transmits detection data of the target sensor to the SEF 554 (Step S213).
  • the SEF 554 acquires detection data of the target sensor from the plurality of terminal apparatuses 40.
  • the terminal apparatus 40 may set a pre-designated validity period or expiration point in the detection data. For example, the validity period or expiration point is designated by the SEF 554 according to the use case.
  • the service utilization entity can use the detection data only within the set validity period or before the expiration point.
  • the terminal apparatus 40 may add information instructing deletion/discard of the detection data after the lapse of the validity period or expiration point. The SEF 554 or the service utilization entity needs to erase/discard the detection data beyond the validity period or expiration point.
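The handling of a validity period or expiration point set in the detection data, including the obligation on the SEF or the service utilization entity to erase/discard expired data, can be sketched as follows. The tagging scheme is an illustrative assumption for this example.

```python
# Hypothetical sketch: tagging detection data with an expiry derived
# from either a validity period or an absolute expiration point, and
# purging records whose expiry has passed.

import time

def tag_with_expiry(data, validity_period_s=None, expiration_point=None,
                    now=None):
    """Attach an absolute expiry timestamp derived from either form."""
    now = time.time() if now is None else now
    expiry = (expiration_point if expiration_point is not None
              else now + validity_period_s)
    return {"data": data, "expires_at": expiry}

def purge_expired(records, now=None):
    """Drop records whose expiry has passed, as the SEF or the service
    utilization entity is required to do."""
    now = time.time() if now is None else now
    return [r for r in records if r["expires_at"] > now]

records = [tag_with_expiry("a", validity_period_s=10, now=100),
           tag_with_expiry("b", expiration_point=105, now=100)]
print([r["data"] for r in purge_expired(records, now=107)])  # ['a']
```

Normalizing both forms to one absolute timestamp keeps the purge check uniform regardless of whether the terminal designated a relative period or an absolute point.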
  • the SEF 554 transmits an Nsef_Sensing_ProvideSensingInfo response message to the NEF 542 as a response to the Nsef_Sensing_ProvideSensingInfo request in Step S202 (Step S214).
  • the Nsef_Sensing_ProvideSensingInfo response message includes detection data acquired from the terminal apparatus 40 including the target sensor.
  • Having received the Nsef_Sensing_ProvideSensingInfo response message, the NEF 542 transmits the Nnef_EventExposure_Subscribe response message to the AF 548 as a response to the Nnef_EventExposure_Subscribe request of Step S201 (Step S215).
  • the Nnef_EventExposure_Subscribe response message includes detection data acquired from the terminal apparatus 40 including the target sensor.
  • Namf_Sensing_ProvideSensingInfo is a service-based interface for a service that provides detection data (sensing data acquired by the AMF 541 from the terminal apparatus 40 via the base station 30 or sensing data acquired from the base station 30).
  • the name of this service-based interface is an example, and may be another name.
  • the base station 30 has a service-based interface (for example, Nran_Sensing_ProvideSensingInfo) corresponding to a service-based architecture and prepared for a service that provides detection data (sensing data acquired from the terminal apparatus 40 or sensing data of the base station 30).
  • the SEF 554 may directly request the base station 30 to provide the sensing data by using this service-based interface.
  • the SEF 554 may control to transmit a part or all of the sensing data (detection data) acquired from the terminal apparatus 40 to the application server 10 via the user plane. At that time, the SEF 554 starts a network triggered PDU session establishment procedure to the terminal apparatus 40 equipped with the target sensor.
  • the network transmits a device trigger request message to an application of the terminal apparatus 40.
  • the payload of the device trigger request message includes information related to which application of the terminal apparatus 40 triggers a PDU session establishment request. Based on this information, the application of the terminal apparatus 40 triggers a PDU session establishment request for establishing a session with the application server 10.
  • the information related to which application of the terminal apparatus 40 triggers the PDU session establishment request is transmitted from the AF 548 to the SEF 554 via the Nnef_EventExposure_Subscribe request message and the Nsef_Sensing_ProvideSensingInfo request message for the sensing service described above.
  • the information related to which application of the terminal apparatus 40 triggers the PDU session establishment request may include information indicating which sensor's sensing data (detection data) included in the terminal apparatus 40 is to be transmitted to the application server 10 via the user plane.
  • the SEF 554 may determine which sensor's sensing data included in the terminal apparatus 40 is to be transmitted to the application server 10 via the user plane based on the size of the sensing data. For example, the SEF 554 determines to transmit sensing data having a data size equal to or greater than (or greater than) a preset threshold to the application server 10 via the user plane.
  • the SEF 554 may determine which sensor's sensing data included in the terminal apparatus 40 is to be transmitted to the application server 10 via the user plane based on QoS required for the sensing data.
  • For example, the SEF 554 determines to transmit sensing data requiring QoS of a preset condition (for example, the packet delay being equal to or less than (or less than) a predetermined Packet Delay Budget (PDB)) to the application server 10 via the user plane.
  • the SEF 554 can acquire the above-described threshold or condition from the PCF 545, for example, as part of a policy rule for the sensing service.
  • Information regarding QoS required for sensing data is transmitted from the AF 548 to the SEF 554 via the Nnef_EventExposure_Subscribe request message and the Nsef_Sensing_ProvideSensingInfo request message for the sensing service described above, for example.
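The transport-path decision described above can be sketched as follows; the function name, units, and threshold values are illustrative assumptions, not defined in this specification.

```python
from typing import Optional

def choose_transport(data_size_bytes: int,
                     required_delay_ms: Optional[float],
                     size_threshold_bytes: int,
                     pdb_ms: float) -> str:
    """Return the delivery path for one piece of sensing data.

    The user plane is chosen when the data size is at or above the threshold,
    or when the required packet delay is at or below the PDB condition.
    """
    if data_size_bytes >= size_threshold_bytes:
        return "user_plane"
    if required_delay_ms is not None and required_delay_ms <= pdb_ms:
        return "user_plane"
    return "control_plane"

# A large LiDAR point cloud goes over the user plane; a small,
# delay-tolerant barometer reading can stay on the control plane.
print(choose_transport(5_000_000, None, 1_000_000, 20.0))  # user_plane
print(choose_transport(200, None, 1_000_000, 20.0))        # control_plane
```

The threshold and PDB values would in practice come from the PCF as part of a policy rule, as described above.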
  • the sensing data is sensing data to be transmitted to the application server 10 via the user plane determined by the SEF 554.
  • the PCF 545 generates a policy rule for the QoS flow of the sensing data to be transmitted to the application server 10 via the user plane, based on the information regarding the QoS flow acquired from the SEF 554.
  • the PCF 545 saves the generated policy rule in the UDR.
  • Fig. 23 is a flowchart illustrating an example of target sensor specifying processing. Specifically, Fig. 23 is a flowchart illustrating an example of the target sensor specifying processing illustrated in Step S203. Hereinafter, the target sensor specifying processing will be described with reference to Fig. 23.
  • the SEF 554 receives a request related to a specific sensing service from an application being a service utilization entity via the AF 548 and/or the NEF 542 (Step S301).
  • the sensing service may be a service that provides detection data to a service utilization entity.
  • the request related to the sensing service may be a request for detection data.
  • the utilization purpose/utilization scene of the sensing service may be referred to as a use case of the sensing service.
  • the use case of the sensing service is, for example, a utilization purpose/utilization scene of the detection data by the service utilization entity.
  • At least one of (B1) to (B26) described above may be prepared as the use case of the sensing service.
  • an application that is a service utilization entity is an Advanced Driver-Assistance System (ADAS) application.
  • the ADAS application may request detection data for Sensing Assisted Automotive Maneuvering and Navigation as a request related to the sensing service.
  • the SEF 554 may manage these use cases by ID.
  • the application may request a specific sensing service by using the ID.
  • the SEF 554 specifies a target sensing service (or use case) in accordance with a request from the application (Step S302).
  • the SEF 554 specifies a terminal (terminal apparatus 40) and/or a base station as a target for requesting necessary detection data according to the specified sensing service (or use case) (Step S303).
  • the SEF 554 determines a target point and specifies one terminal (terminal apparatus 40) and/or a base station at the determined point.
  • the SEF 554 determines a target area and specifies a group including one or a plurality of terminal apparatuses 40 and/or one or a plurality of base stations 30 existing in the determined area.
  • the specified sensing service is a sensing service related to provision of detection data for Sensing Assisted Automotive Maneuvering and Navigation.
  • the SEF 554 specifies a group including some or all vehicles in the area to be controlled and/or some or all base stations (Road-Side Units (RSUs)) in the area to be controlled.
  • the specified sensing service is a sensing service related to provision of detection data for Automated Guided Vehicles (AGV) detection and tracking in factories.
  • the SEF 554 specifies a group including a part or all of the automatic guided vehicles in the area to be controlled and/or a part or all of the base stations in the area to be controlled.
  • the specified sensing service is a sensing service related to provision of detection data for Autonomous Mobile Robots (AMR) collision avoidance in smart factories.
  • the SEF 554 specifies a group including a part or all of the autonomous mobile robots in the area to be controlled and/or a part or all of the base stations in the area to be controlled.
  • the specified sensing service is a sensing service related to provision of detection data for network assisted sensing to avoid UAV collision.
  • the SEF 554 specifies a group of some or all UAVs in the area to be controlled and/or some or all base stations in the area to be controlled.
  • the SEF 554 specifies a type of a sensor that provides necessary detection data according to the specified sensing service (or use case) (Step S304).
  • the SEF 554 may specify a sensor for positioning as a sensor that provides necessary detection data. At this time, the SEF 554 may specify the sensors of all the positioning methods described in <4-4>. Alternatively, the SEF 554 may specify sensors of some positioning methods with higher accuracy, such as A-GNSS positioning, with a constraint on measurement accuracy.
  • the SEF 554 may specify an image sensor and/or a camera as a sensor that provides necessary detection data.
  • the SEF 554 may specify a barometer as a sensor that provides necessary detection data.
  • the SEF 554 ends the target sensor specifying processing.
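The target specifying flow of Steps S301 to S304 can be sketched as follows, under the assumption that the SEF manages use cases by ID and maps each use case to target apparatus types and sensor types; all table contents and names here are hypothetical.

```python
# Hypothetical mapping from use case ID to target apparatus types and sensor types.
USE_CASE_TABLE = {
    "sensing_assisted_maneuvering": {
        "targets": {"vehicle", "rsu_base_station"},
        "sensors": ["positioning_a_gnss", "image_camera", "lidar"],
    },
    "agv_detection_and_tracking": {
        "targets": {"agv", "factory_base_station"},
        "sensors": ["positioning_a_gnss", "image_camera"],
    },
}

def specify_targets(use_case_id, apparatuses_in_area):
    """Steps S302-S304: resolve a use case ID into a target group and sensor types."""
    profile = USE_CASE_TABLE[use_case_id]
    group = [a["id"] for a in apparatuses_in_area if a["type"] in profile["targets"]]
    return {"group": group, "sensor_types": profile["sensors"]}

area = [
    {"id": "veh-1", "type": "vehicle"},
    {"id": "rsu-7", "type": "rsu_base_station"},
    {"id": "ue-9", "type": "pedestrian_ue"},   # not a target for this use case
]
print(specify_targets("sensing_assisted_maneuvering", area)["group"])  # ['veh-1', 'rsu-7']
```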
  • Fig. 24 is a flowchart illustrating an example of permission list generation processing. Specifically, Fig. 24 is a flowchart illustrating an example of permission list generation processing illustrated in Step S205. Hereinafter, the permission list generation processing will be described with reference to Fig. 24.
  • the SEF 554 checks the sensing privacy information of the specified terminal apparatus 40 and/or base station 30 (Step S401).
  • the SEF 554 checks terminal sensing privacy information in the terminal sensing privacy profile regarding the terminal apparatus 40.
  • the SEF 554 checks base station sensing privacy information regarding the base station 30.
  • the PLMN operator to which the base station 30 belongs may manage the base station sensing privacy information as a part of Operations, Administration and Maintenance (OAM) for each base station, for example.
  • the SEF 554 selects one sensor from one or a plurality of sensors specified in Step S304 by referring to the sensing privacy information (Step S402).
  • the SEF 554 judges whether the access privilege to the detection data of the selected sensor is "enable" (Step S403).
  • the SEF 554 judges whether the access privilege is "enable" based on the condition of the sensing service.
  • When the access privilege is "enable" (Step S403: Yes), the SEF 554 adds the selected sensor to the permission list (Step S404). Subsequently, the SEF 554 proceeds to the processing of Step S405.
  • When the access privilege is not "enable" (Step S403: No), the SEF 554 proceeds to the processing of Step S405.
  • the SEF 554 judges whether check has been performed for all the target sensors (Step S405). In a case where the check has not been performed for all the target sensors (Step S405: No), the SEF 554 changes the sensors (Step S402), and executes the processing of Step S403 and subsequent steps again.
  • In a case where the check has been performed for all the target sensors (Step S405: Yes), the SEF 554 ends the processing.
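The permission list generation loop of Steps S402 to S405 can be sketched as follows; the data shapes, sensor IDs, and condition keys are illustrative assumptions.

```python
def build_permission_list(sensing_privacy, service_conditions):
    """Steps S402-S405: keep sensors whose access privilege is "enable"
    under the conditions of the requested sensing service."""
    permission_list = []
    for sensor_id, privilege in sensing_privacy.items():
        if isinstance(privilege, dict):
            # privilege classified by condition (e.g. per use case), with a default
            value = privilege.get(service_conditions.get("use_case"),
                                  privilege.get("default"))
        else:
            value = privilege
        if value == "enable":                  # Step S403
            permission_list.append(sensor_id)  # Step S404
    return permission_list

privacy = {
    "D001": "enable",
    "D003": {"sensing_assisted_maneuvering": "enable", "default": "disable"},
    "D004": "disable",
}
print(build_permission_list(privacy, {"use_case": "sensing_assisted_maneuvering"}))
# ['D001', 'D003']
```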
  • the SEF 554 has been described as one function. However, a plurality of functions constituting the SEF 554 may be installed as distributed functions.
  • the distributed installation may include distributed setting of each function in a plurality of apparatuses and performing distributed processing.
  • the SEF 554 may include, for example, at least one of the following functions.
  • sensing privacy information capable of setting an access privilege to detection data for each sensor is introduced in a sensing service.
  • the information that can be acquired from the target terminal apparatus 40 includes capability information defined for each positioning method.
  • A-GNSS Positioning is defined as follows.
  • a setting regarding privacy of one or a plurality of sensors is included in conventional capability information.
  • the capability information of the second embodiment can be regarded as sensing privacy information (information related to access). This makes it possible for the communication system 1 to newly support the function of privacy management while continuously using the conventional procedure.
  • Fig. 25 is a diagram illustrating a configuration example of capability information.
  • a sensor ID for identifying a sensor is assigned to one or a plurality of sensors included in the terminal apparatus 40.
  • the capability information is classified into a type, a support function, and an access privilege, for example.
  • the support functions include the conventional capability information (for example, gnss-SupportList in A-GNSS Positioning, nr-DL-TDOA-Mode in NR DL-TDOA Positioning).
  • definitions as the type of the sensor include use of the sensors (for example, Positioning (GNSS), Positioning (NR DL-TDOA), Image sensor/Camera, LiDAR, Barometer).
  • a function supported by a newly added sensor is defined in addition to the conventional capability information.
  • For an image sensor/camera, an image size of 35 mm is defined.
  • For LiDAR, a detection distance of 250 m at a reflectance of 80% is defined.
  • For a barometer, a measurement range of 930 to 1070 hPa is defined.
  • the access privilege to the data detected by the sensor corresponding to the sensor ID is set to "enable" or "disable".
  • the service providing entity can judge whether the service providing entity can access the data detected by the target sensor.
  • the setting of the access privilege can be changed according to conditions, similarly to Fig. 20B.
  • the condition may be at least one of the following (D1) to (D7). Details of these conditions may be similar to the conditions (C1) to (C7) described in <7-1. Sensing privacy information>.
  • the condition for classifying the permission information (access privilege) may be a condition related to a contract concluded by the service utilization entity.
  • the condition for classifying the permission information (access privilege) may be a condition related to a contract concluded by the service providing entity.
  • the condition for classifying the permission information (access privilege) may be a condition related to a use case of a sensing service.
  • the use case of the sensing service may be the use case (for example, at least one of (B1) to (B26)) described above or below.
  • the condition for classifying the permission information may be a condition related to utilization of the sensing service.
  • the condition for classifying the permission information may be a condition related to at least one of a utilization area and a utilization time zone of the sensing service.
  • the condition for classifying the permission information may be a condition related to saving of the detection data.
  • the saving of the detection data may be recording of the detection data in nonvolatile memory such as an SSD or an HDD, or may be holding of the detection data for a time exceeding a set time.
  • the set time may be a specific time (for example, a specific date and time) or an elapsed time (for example, time from acquisition of detection data).
  • the saving condition may be the presence or absence of saving.
  • the condition for classifying the permission information may be a condition related to a disclosure recipient of the detection data.
  • the disclosure recipient to be the condition may include, for example, at least one of identification information of a node or an entity in the system, identification information of UE, identification information of RAN, identification information of a core network, identification information of a network function constituting the core network, identification information of an Application Server (AS), and identification information of EAS.
  • the disclosure recipient serving as the condition may include at least one of an identifier and an address (for example, an IP address) of the apparatus.
  • the disclosure recipient serving as the condition may include, for example, any type classified by a difference in the depth of human relationship, such as the person himself/herself, family members, friends, and other persons.
  • the disclosure recipient as a condition may include a corporation.
  • the corporation includes an operator and a service provider.
  • the disclosure recipient to be the condition may include a group or a range classifying at least one of a node, an entity, and an apparatus.
  • the group or the range can be set at any granularity.
  • the group or range may be set at a granularity of at least one of a country, a region, and an area.
  • the group or the range can be set according to the level of the subscription.
  • the condition for classifying the permission information (access privilege) may be a condition related to the type of data generated using detection data.
  • the condition for classifying the permission information (access privilege) may be a condition related to the AI/ML model used for the sensing service.
  • the condition may include a condition regarding use of distributed processing of an AI/ML model to which detection data of a sensor is input.
  • the condition may include a condition (first condition) related to the learning usage of the AI/ML model to which the detection data of the sensor is input.
  • the condition may include a condition (second condition) indicating whether saving of detection data of the sensor is permitted in learning of the AI/ML model in the learning usage satisfying the first condition regarding the learning usage.
  • the access privilege may be individually set for the conditions exemplified here, or the access privilege may be set for a plurality of optionally selected combinations.
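A Fig. 25-style capability entry (sensor ID, type, support function, access privilege) can be sketched minimally as below; the field names are illustrative, and only the example values that appear in this text (such as sensor ID "D004" and "Detection Range_250m@80%") are reused.

```python
from dataclasses import dataclass

@dataclass
class SensorCapability:
    sensor_id: str      # e.g. "D004"
    sensor_type: str    # e.g. "LiDAR"
    support: str        # e.g. "Detection Range_250m@80%"
    access: str         # access privilege: "enable" or "disable"

capabilities = [
    SensorCapability("D001", "Positioning (GNSS)", "gnss-SupportList", "enable"),
    SensorCapability("D004", "LiDAR", "Detection Range_250m@80%", "disable"),
]

# The service providing entity can filter the sensors whose data it may access:
permitted = [c.sensor_id for c in capabilities if c.access == "enable"]
print(permitted)  # ['D001']
```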
  • the communication system 1 of the present embodiment may generate, as a sensing service, secondary information necessary for controlling the ADAS and/or the automated driving by fusing data detected by a plurality of sensors (for example, a plurality of sensors illustrated in Fig. 25).
  • the information regarding images detected by an image sensor/camera and/or LiDAR may include information regarding privacy of the owner of the sensor. Therefore, it is considered necessary to perform not only privacy management for individual sensor data but also privacy management for secondary information generated by sensor fusion.
  • FIG. 26 is a diagram illustrating another configuration example of capability information.
  • "location (position)" and "image (picture)" are set as secondary information types for the image sensor/camera and the LiDAR, and different access privileges are set for each type.
  • image data detected by the image sensor/camera is processed as secondary information together with data detected by other sensors.
  • the owner of the terminal apparatus 40 may set "enable" as the access privilege.
  • the owner of the terminal apparatus 40 may set "disable" as the access privilege in consideration of privacy.
  • the setting of the access privilege can be changed according to a condition. That is, the condition may be at least one of the following (D1) to (D7). Details of these conditions may be similar to (D1) to (D7) described above.
  • (D1) Condition related to contract
  (D2) Condition related to use case of sensing service
  (D3) Condition related to utilization of sensing service
  (D4) Condition related to saving of detection data
  (D5) Condition related to disclosure recipient of detection data
  (D6) Condition related to type of generated data
  (D7) Condition related to AI/ML model
  • the access privilege may be individually set for the conditions exemplified here, or the access privilege may be set for a plurality of optionally selected combinations.
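The secondary-information-dependent access privilege of Fig. 26 can be sketched as a small lookup; the LiDAR values follow the example in this text (location: "enable", image: "disable"), while the function and table names are hypothetical.

```python
# Access privilege per (sensor ID, secondary information type), following the
# Fig. 26 LiDAR example: location is "enable", image is "disable".
SECONDARY_ACCESS = {
    "D004": {"location": "enable", "image": "disable"},
}

def can_access(sensor_id, secondary_type):
    """True when the sensor's detection data may be used to generate
    the given type of secondary information."""
    return SECONDARY_ACCESS.get(sensor_id, {}).get(secondary_type) == "enable"

print(can_access("D004", "location"))  # True
print(can_access("D004", "image"))     # False
```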
  • the capability information of the sensor may further include information such as the number of various sensors in addition to the information regarding the type, support function, and access privilege described above.
  • Fig. 27 is a flowchart illustrating an example of capability information acquisition processing.
  • In Fig. 27, UE represents the terminal apparatus 40, and AS is an abbreviation of Application Server.
  • the server 10 is an information processing apparatus that performs processing of an application related to a sensing service.
  • the capability information acquisition processing will be described with reference to Fig. 27.
  • the server 10 transmits the RequestCapabilities message to the target terminal apparatus 40 (Step S501).
  • Having received the RequestCapabilities message, the terminal apparatus 40 specifies the corresponding capability information with reference to the capability information (for example, the capability information illustrated in Fig. 25) of the sensor managed by the terminal apparatus itself. Subsequently, the terminal apparatus 40 transmits the specified capability information to the server 10 (Step S502).
  • the terminal apparatus 40 may transmit a ProvideCapabilities message including information of a sensor ID (for example, "D004" illustrated in Fig. 25) and information of the support function (for example, "Detection Range_250m@80%" illustrated in Fig. 25) to the server 10.
  • Detection Range_250m@80% indicates that the detection distance is 250 m for an object having a reflectance of 80%.
  • the server 10 may transmit a RequestCapabilities message including designation of a type of secondary information to the terminal apparatus 40 (Step S501).
  • the secondary information is, for example, information generated by fusing detection data of a plurality of sensors (for example, location information or image information).
  • Having received the RequestCapabilities message including designation of the type of secondary information, the terminal apparatus 40 specifies the corresponding capability information with reference to the capability information (for example, the capability information illustrated in Fig. 26) of the sensor managed by the terminal apparatus itself. Subsequently, the terminal apparatus 40 transmits the specified capability information to the server 10 (Step S502).
  • For example, assume that the type of the sensor included in the RequestCapabilities message is LiDAR, the type of the requested capability information is access privilege, and the type of the secondary information is location information.
  • In this case, the terminal apparatus 40 transmits a ProvideCapabilities message including information of a sensor ID (for example, "D004" illustrated in Fig. 26) and information of an access privilege (for example, "enable" illustrated in Fig. 26) to the server 10.
  • As another example, assume that the type of the sensor included in the RequestCapabilities message is LiDAR, the type of the requested capability information is access privilege, and the type of the secondary information is image information.
  • In this case, the terminal apparatus 40 transmits a ProvideCapabilities message including information of a sensor ID (for example, "D004" illustrated in Fig. 26) and information of an access privilege (for example, "disable" illustrated in Fig. 26) to the server 10.
  • the server 10 determines a sensor for which access to detection data is permitted based on access privilege information (information related to access) included in the capability information. Subsequently, the server 10 acquires detection data of the sensor for which access is determined to be permitted as data related to the sensing service (for example, data used for processing of an application related to the sensing service). The server 10 executes processing of the application based on the acquired data.
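The terminal side of the Fig. 27 RequestCapabilities/ProvideCapabilities exchange can be simulated as follows; the message fields and the capability table are illustrative assumptions built from the example values in the text.

```python
# Terminal-side capability table keyed by (sensor type, secondary info type).
TERMINAL_CAPS = {
    ("LiDAR", "location"): {"sensor_id": "D004", "access": "enable"},
    ("LiDAR", "image"):    {"sensor_id": "D004", "access": "disable"},
}

def handle_request_capabilities(sensor_type, capability_type, secondary_type):
    """Build the ProvideCapabilities payload for one RequestCapabilities message."""
    if capability_type != "access_privilege":
        raise ValueError("only access privilege is modeled in this sketch")
    entry = TERMINAL_CAPS[(sensor_type, secondary_type)]
    return {"sensor_id": entry["sensor_id"], "access": entry["access"]}

print(handle_request_capabilities("LiDAR", "access_privilege", "location"))
# {'sensor_id': 'D004', 'access': 'enable'}
```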
  • the SEF 554 may start, in Step S204, the procedure of the capability information request illustrated in Fig. 27 instead of acquiring the terminal sensing privacy information of the target terminal apparatus 40 from the UDM 547 using the Nudm_SDM_Get service. With this operation, the SEF 554 may acquire the capability information of the sensor from the target terminal apparatus 40.
  • information regarding privacy is added as one piece of capability information of the sensor. This makes it possible for the communication system 1 to newly support the function of privacy management while using the procedure prepared in the conventional LPP or NRPPa. Furthermore, in the second embodiment, the privacy setting of data detected by each sensor can be changed according to the type of secondary information. This enables effective use of data in consideration of privacy.
  • Sensor fusion has attracted attention as one of main technologies for realizing automated driving of a vehicle.
  • Sensor fusion is a technology of automatically analyzing and combining data obtained from a plurality of sensors that satisfy a certain standard, using computer technology, to generate information necessary for decision-making and estimation.
  • Data analysis and/or combining may use an artificial intelligence (AI) technology.
  • An automated driving unit mounted on a vehicle controls automated driving by applying sensor fusion to a plurality of sensors mounted on the same vehicle. While using a plurality of sensors to complement information to be a blind spot that cannot be acquired by a single sensor, the automated driving unit analyzes and combines overlapping information simultaneously detected by the plurality of sensors. This enables highly reliable information provision. That is, in a case where the sensor fusion is applied to the automated driving of the vehicle, information necessary for controlling the automated driving is provided from a plurality of sensors. However, since the number of sensors mounted on one vehicle is limited, completely removing blind spots by sensor fusion using only the sensors mounted on the own vehicle is considered to be difficult.
  • the communication system 1 makes it possible, in a sensing service related to driving/control of the mobile body (for example, a vehicle such as an automobile, a flying object such as a drone, or the like), to utilize one or a plurality of sensors mounted on an apparatus different from a mobile body to be a target of the sensing service.
  • the communication system 1 makes it possible, in a sensing service (for example, automated driving of a mobile body, steering assistance of the mobile body, or navigation) related to driving/control of the mobile body, to utilize one or a plurality of sensors provided in an apparatus (for example, the base station 30 and/or the road-side unit) around the mobile body and/or one or a plurality of sensors provided in another mobile body. This makes it possible to reduce the occurrence of blind spots.
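Blind-spot complementation by fusing the own vehicle's detections with those of a surrounding apparatus (for example, an RSU) can be illustrated as below. This is a simplified sketch, not the algorithm of this specification; all names and the merge radius are assumptions.

```python
def fuse_detections(own, external, merge_radius_m=1.0):
    """Merge two detection lists, dropping external objects that duplicate an
    object the vehicle already sees; the remainder fill the vehicle's blind spots."""
    fused = list(own)
    for obj in external:
        duplicate = any(abs(obj["x"] - o["x"]) <= merge_radius_m and
                        abs(obj["y"] - o["y"]) <= merge_radius_m for o in fused)
        if not duplicate:
            fused.append(obj)
    return fused

own_vehicle = [{"x": 10.0, "y": 2.0}]                  # seen by the vehicle itself
rsu = [{"x": 10.3, "y": 2.2}, {"x": 40.0, "y": -3.0}]  # second object is occluded
print(len(fuse_detections(own_vehicle, rsu)))  # 2
```

The first RSU object is treated as an overlapping observation of the vehicle's own detection, while the second, occluded object is added from outside, reducing the blind spot.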
  • the sensor fusion may utilize the AI technology.
  • the AI technology may be data analysis and/or data combining using an artificial intelligence/machine learning model (AI/ML model).
  • Regarding the AI/ML model, an information processing apparatus (for example, at least one of the server 10, the management apparatus 20, the base station 30, and the terminal apparatus 40) that performs sensor fusion processing may store an AI/ML model for sensor fusion processing.
  • the information processing apparatus may execute sensor fusion processing (analysis and/or combining of detection data) using the AI/ML model.
  • the AI/ML model may be referred to as a learning model, a trained model, or simply a model.
  • the learning model is a machine learning model generated by learning processing by machine learning.
  • the machine learning is one of approaches of artificial intelligence that allows a computer to perform recognition, decision, or estimation similar to human, for example.
  • the machine learning process includes two processes of training processing and judgement processing.
  • the training processing is processing for training (for example, training using training data) the learning model.
  • In the training processing, an information processing apparatus (hereinafter, referred to as a training apparatus) trains a learning model (for example, a neural network model) using training data.
  • the weighting factor of each edge is optimized in the process of training.
  • the model extracted as a result of the training processing may be referred to as a trained model.
  • the judgement processing is processing for recognition, decision, or estimation.
  • In the judgement processing, an information processing apparatus that performs judgment (a judgment apparatus) inputs unknown data (for example, sensor information) to a learning model (trained model) in order to obtain, for example, information necessary for complementing a blind spot and controlling automated driving.
  • the learning model outputs a result of recognition, decision, or estimation for unknown data as a result of arithmetic processing.
  • the judgment apparatus may be the same apparatus as the training apparatus or may be a different apparatus.
  • the learning model is a neural network model, for example.
  • the neural network model includes an input layer, a hidden layer (or intermediate layer), and an output layer, each including a plurality of nodes, the nodes being connected to each other via edges.
  • Each layer has a function referred to as an activation function, and each edge is weighted.
  • the learning model has one or a plurality of intermediate layers (or a hidden layer).
  • learning of the learning model includes, for example, setting the number of hidden layers (or intermediate layers), the number of nodes in each layer, the weight of each edge, or the like.
  • the neural network model may be a deep learning model.
  • the use of a trained deep learning model that has been trained using vast amounts of data improves the accuracy of recognition, decision, or estimation.
  • Examples of typical algorithms used in deep learning include the following.
  • the algorithm of the AI/ML model used in the present embodiment may be at least one of the following:
  • DNN (Deep Neural Network)
  • CNN (Convolutional Neural Network)
  • RNN (Recurrent Neural Network)
  • Fully-connected neural network
  • LSTM (Long Short-Term Memory)
  • Autoencoder
  • In the DNN, the number of hidden layers is two or more.
  • the DNN can have higher accuracy in recognition, decision, or estimation than the accuracy of a conventional neural network having one hidden layer.
  • In the CNN, the hidden layers include individual layers referred to as convolution layers and pooling layers.
  • In a convolution layer, filtering by a convolution operation is performed to extract data referred to as a feature map.
  • In a pooling layer, information of the feature map output from the convolution layer is compressed to implement down-sampling.
  • the RNN has a network structure in which a value of the hidden layer is recursively input to the hidden layer, and processes short-period time-series data, for example.
  • In the fully-connected neural network, all the intermediate layers are fully connected layers. That is, all the nodes between the respective intermediate layers are connected to each other.
  • the fully-connected neural network has been mainly applied in the field of voice recognition.
  • In the LSTM, the influence of far-past outputs can be held by introducing a parameter referred to as a memory cell, which holds the state of the intermediate layer, into the intermediate-layer output of the RNN. That is, the LSTM can process time-series data over a longer period than the RNN.
  • the autoencoder extracts a low-dimensional feature capable of reproducing input data by unsupervised learning.
  • the autoencoder is effective for noise removal, dimensionality reduction, and the like.
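Judgement processing with a trained feedforward model (weighted edges and an activation function per layer, as described above) can be illustrated with a minimal NumPy forward pass; the weights here are arbitrary placeholders, not a real trained model.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def forward(x, weights, biases):
    """Input layer -> hidden layers (activation per layer) -> output layer;
    the weighted edges are the entries of each weight matrix."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    return weights[-1] @ h + biases[-1]

rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
biases = [np.zeros(4), np.zeros(2)]
out = forward(np.array([0.1, 0.2, 0.3]), weights, biases)
print(out.shape)  # (2,)
```

Training would optimize the entries of `weights` and `biases`; judgement is just this forward computation on unknown input data.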
  • the learning model is not limited to a neural network model.
  • the trained model may be a model based on reinforcement learning.
  • In reinforcement learning, the model is trained through trial and error to take an action (setting) that maximizes value.
  • the learning model may be a logistic regression model.
  • the learning model may include a plurality of models.
  • the learning model may include a plurality of neural network models.
  • the learning model may include, for example, a plurality of neural network models selected from among the above-described plurality of neural network models (for example, DNN, CNN, RNN, LSTM, and the like).
  • the plurality of neural network models may be in a dependent relationship or a parallel relationship.
  • the learning model can be rephrased as an AI model, an ML model, or a trained model.
  • the learning model may be simply referred to as a model.
  • Fig. 28 is a diagram illustrating an example of detection data collection processing for sensor fusion.
  • the sensing service is assumed to be a service of providing detection data for automated driving service (automated driving control) of a mobile body.
  • the mobile body to be subjected to the automated driving service is assumed to be a vehicle such as an automobile, but is not limited to the vehicle, and may be a flying object such as a drone, for example.
  • the description of a first vehicle in the following description can be replaced with a first mobile body or a first communication apparatus.
  • the description of a second vehicle in the following description can be replaced with a second mobile body or a second communication apparatus.
  • the detection data collection processing is initiated when the service provider providing the automated driving service requests the service providing entity to provide detection data (that is, detection data for sensor fusion) for the automated driving service.
  • the service utilization entity is assumed to be the application server (server 10) managed by the service provider, and the service providing entity is assumed to be the core network CN (management apparatus 20).
  • the core network CN has functions of the AMF 541, the SEF 554, the UDM 547, and the NEF 542.
  • the management apparatus 20 includes at least the SEF 554.
  • Prior to execution of the detection data collection processing, the first vehicle (terminal apparatus 40) requests the server 10 managed by the service provider to register in the automated driving service.
  • When having received a request for registration in the automated driving service from the first vehicle (terminal apparatus 40), the server 10 requests the sensing service from the management apparatus 20 via the AF 548.
  • the management apparatus 20 receives the sensing service request from the server 10 via the AF 548 (Step S601).
  • the request for the sensing service corresponds to Step S201 illustrated in Fig. 22.
  • the request includes information that identifies the target terminal apparatus 40 (that is, the first vehicle) and information that identifies a use case (for example, Sensing Assisted Automotive Maneuvering and Navigation).
  • the management apparatus 20 acquires the location information of the first vehicle (terminal apparatus 40) which is a target vehicle (Step S602).
  • the acquisition of the location information of the first vehicle (terminal apparatus 40) can use the location service illustrated in Fig. 13.
  • the SEF 554 may transmit an Ngmlc_Location_ProvideLocation request message to the GMLC 552 so as to acquire the location information of the first vehicle (the terminal apparatus 40).
  • the management apparatus 20 (for example, SEF 554) sets an area/region including the surroundings of the first vehicle (terminal apparatus 40) (Step S603).
  • the range and/or size of the area/region may be determined, for example, by the server 10 (service provider) in consideration of the range of control for the first vehicle (terminal apparatus 40).
  • the server 10 may include information of the range and/or size of the area/region in the request for the sensing service.
  • the management apparatus 20 (for example, SEF 554) specifies one or a plurality of communication apparatuses located in the set area/region (Step S604).
  • the management apparatus 20 (for example, SEF 554) checks the access privilege of the sensor for each communication apparatus specified in Step S604 according to Steps S204 to S205 in Fig. 22 (Step S605).
  • the management apparatus 20 acquires the detection data from the sensor for which access to the detection data is permitted for each communication apparatus specified in Step S604 according to Steps S206 to S213 illustrated in Fig. 22 (Step S606).
  • the management apparatus 20 transmits the acquired detection data to the server 10 via the AF 548 (Step S607).
  • the SEF 554 transmits the detection data acquired from the sensor for which access to the data is permitted to the AF 548 according to Steps S214 to S215 illustrated in Fig. 22.
  • the AF 548 transmits the received detection data to the server 10.
  • the management apparatus 20 ends the detection data collection processing.
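The collection flow of Steps S602 to S606 described above can be sketched as follows. This is a non-normative illustration: the data structures, function names, and the circular-area model are assumptions for the example, not 3GPP-defined APIs.

```python
import math

def collect_detection_data(target_id, vehicles, privileges, radius=100.0):
    """Sketch of Steps S602-S606: locate the target vehicle, set an
    area/region around it, and collect detection data only from sensors
    whose access privilege is "enable"."""
    cx, cy = vehicles[target_id]["location"]                     # Step S602
    def in_area(v):                                              # Step S603
        x, y = v["location"]
        return math.hypot(x - cx, y - cy) <= radius
    collected = {}
    for vid, v in vehicles.items():                              # Step S604
        if not in_area(v):
            continue
        for sensor_id, data in v["sensors"].items():
            if privileges.get((vid, sensor_id)) == "enable":     # Step S605
                collected[(vid, sensor_id)] = data               # Step S606
    return collected
```

The collected dictionary corresponds to the detection data forwarded to the server 10 in Step S607.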
  • the sensing service provided by the management apparatus 20 to the server 10 may include a sensor fusion processing execution service.
  • the server 10 (service provider) may request the management apparatus 20 to perform sensor fusion processing as a part or all of the sensing service.
  • the SEF 554 of the management apparatus 20 may execute the sensor fusion processing. That is, the SEF 554 may have a function of executing sensor fusion processing.
  • the SEF 554 may apply computation using an AI/Machine Learning model (AI/ML model) to the sensor fusion processing.
  • the SEF 554 may assign the sensor fusion processing to a network data analytics function (NWDAF) (not illustrated).
  • NWDAF is a network function that provides the 5GS with a network data analysis function.
  • the NWDAF has an Analytics logical function (AnLF) and a Model Training logical function (MTLF).
  • the AnLF is a logical function of the NWDAF that executes inference using an AI/ML model, acquires analysis information, and discloses analysis information.
  • the analysis information is, for example, past event statistical information or prediction information.
  • the MTLF is a logical function of the NWDAF that executes training of the AI/ML model and discloses the learning result.
  • the information indicating the type of sensor necessary for each use case described above can be set based on the type of data input to the AI/ML model.
  • Based on the type of data input to the AI/ML model, the NWDAF provides the PCF 545 with information of the type of sensor necessary for the use case of sensor fusion; the PCF 545 can then generate information indicating the type of sensor necessary for the use case of sensor fusion based on the information acquired from the NWDAF.
  • the SEF 554 designates the Analytics ID corresponding to the sensor fusion processing, and assigns the sensor fusion processing to the NWDAF.
  • Since detection data of a plurality of sensors is combined in the use case of sensor fusion, information specific to each sensor is diluted. Therefore, in the use case of sensor fusion, there is a low risk of disclosing private elements included in the detection data.
  • the user of the terminal apparatus 40 can set the access privilege to the detection data of each sensor to "enable".
  • the management apparatus 20 acquires sensing privacy information in order to check the access privilege to the detection data.
  • the access privilege is classified according to the condition.
  • One of the conditions may include the presence or absence of use of distributed processing of the AI/ML model in the sensor fusion processing.
  • the AI/ML model may be an AI/ML model for sensor fusion processing.
  • In the distributed processing of the AI/ML model, part of the processing (for example, processing up to a part of the intermediate layer) is executed by the terminal apparatus 40, for example, and the result of the part of the processing is provided to the SEF 554.
  • Because only the result of the partial processing, rather than the detection data itself, is provided, the private information included in the detection data is diluted. Accordingly, the user of the terminal apparatus 40 can set the access privilege to the detection data to "enable" under the condition that the distributed processing of the AI/ML model is used.
  • the SEF 554 may divide the AI/ML model and provide the terminal apparatus 40 with a model on the front side of the divided AI/ML model.
  • the AI/ML model may be an AI/ML model for sensor fusion processing. With this configuration, arithmetic processing related to the AI/ML model is distributed to the terminal apparatus 40 and the SEF 554, or the terminal apparatus 40 and the NWDAF.
  • the SEF 554 may provide all of the AI/ML model to the terminal apparatus 40 and notify only a division point to the terminal apparatus 40.
  • the terminal apparatus 40 processes only the layer in front of the notified division point, and transmits the result of the arithmetic processing of the intermediate layer to the SEF 554.
  • This also realizes distributed processing of the AI/ML model.
  • If the divided model itself were provided, the SEF 554 would be required to transmit the divided AI/ML model to the terminal apparatus 40 each time.
  • the present method can reduce the transmission load of the AI/ML model of the SEF 554.
  • the SEF 554 may dynamically determine the division point of the AI/ML model in consideration of at least one of the following.
  • Communication quality between the terminal apparatus 40 and the base station 30
  • Type of QoS flow to be used for the uplink allocated for transmission of the result of the arithmetic processing of the AI/ML model (for example, 5QI)
  • QoS monitoring result on a target QoS flow, notified from the core network CN
  • Quality of QoS flows generated from QoS monitoring results (for example, a delay of the UL packet)
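One illustrative heuristic for determining the division point from the factors listed above (not a mechanism specified in this description) is to select the earliest layer whose intermediate output fits within an uplink budget derived from the communication quality and QoS monitoring results:

```python
def choose_division_point(layer_output_bytes, uplink_budget_bytes):
    """Illustrative heuristic: the terminal apparatus 40 processes layers up
    to the first layer whose intermediate output fits the uplink budget, then
    transmits that intermediate result for the remaining layers.

    `layer_output_bytes[i]` is the size of the output of layer i; the budget
    would be derived from communication quality / QoS monitoring results."""
    for i, size in enumerate(layer_output_bytes):
        if size <= uplink_budget_bytes:
            return i
    # No intermediate output fits: process up to the final layer locally.
    return len(layer_output_bytes) - 1
```

Under this sketch, a better uplink (larger budget) moves the division point earlier, reducing the computation load on the terminal apparatus 40.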
  • the access privilege is classified according to the condition.
  • One of the conditions may include a learning usage (first condition) of the AI/ML model.
  • the use of the AI/ML model serving as the condition may be the learning usage of the AI/ML model for the sensor fusion processing.
  • In a case where the detection data is provided for the learning with the AI/ML model, the detection data itself is not disclosed; it is the AI/ML model that is disclosed. Therefore, private information included in the detection data has a low disclosure risk, and the user of the terminal apparatus 40 can set the access privilege to the detection data to "enable" under the condition of the learning usage with the AI/ML model.
  • the terminal apparatus 40 may add a condition (second condition) as to whether to permit saving of the detection data.
  • the SEF 554 or the NWDAF refers to the sensing privacy information of the target terminal apparatus 40 and acquires detection data of the sensor having access privilege "enable" under the condition of the learning usage of the AI/ML model.
  • the SEF 554 or the NWDAF may save detection data of a sensor for which data saving is permitted, in the UDR or the Analytics Data Repository Function (ADRF).
  • the SEF 554 or the NWDAF may determine whether the saving of the detection data is permitted based on the above-described second condition. Subsequently, the SEF 554 or the NWDAF may use the saved detection data for relearning of the AI/ML model and/or learning of another AI/ML model.
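The two checks described above (access privilege under the learning-usage condition, and the separate saving permission) can be sketched as follows. The shape of the sensing privacy information, a per-sensor map with `learning` and `save` entries, is an assumption for illustration.

```python
def select_sensors_for_learning(sensor_ids, privacy_info):
    """Return (sensors usable for AI/ML learning, subset whose detection
    data may be saved in the UDR/ADRF). `privacy_info` maps sensor id to a
    settings dict; its structure is hypothetical."""
    usable = [s for s in sensor_ids
              if privacy_info.get(s, {}).get("learning") == "enable"]
    savable = [s for s in usable if privacy_info.get(s, {}).get("save", False)]
    return usable, savable
```

Only the `savable` subset would be stored for relearning or for training another AI/ML model.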
  • the fourth embodiment will describe an operation of the communication system 1 in a case where an RF-based sensing function (for example, RF-based sensing function supported by 3GPP transceivers) is included in one or a plurality of sensors.
  • a 3GPP transceiver can support an RF-based sensing function as a 5G wireless sensing service.
  • the RF-based sensing function supported by the 3GPP transceiver can support various modes of operations such as: a mode of monostatic sensing in which a transmitter and a receiver of sensing are disposed in a same entity/apparatus (for example, the terminal apparatus 40 or the base station 30); a mode of bistatic sensing in which the transmitter and the receiver of sensing are disposed in different entities/apparatuses (for example, the terminal apparatus 40 and the base station 30); and a mode of multistatic sensing by a plurality of transmitters and receivers of sensing.
  • Reflections of sensing signals transmitted from the sensing transmitter are received by the sensing receiver and processed to obtain characteristics (for example, position) of the sensed object and its environment.
  • the velocity of the detected object can be estimated by measuring a parameter such as a Doppler shift of the reception signal.
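The Doppler relation mentioned above can be made concrete. For monostatic sensing, the signal traverses the path to the object and back, so the shift is twice the one-way value: f_d = 2 v f_c / c for radial velocity v and carrier frequency f_c. A small sketch (parameter names are illustrative):

```python
def radial_velocity(doppler_shift_hz, carrier_freq_hz, c=299_792_458.0):
    """Estimate the radial velocity of a detected object from the measured
    Doppler shift (monostatic case: two-way propagation doubles the shift)."""
    return doppler_shift_hz * c / (2.0 * carrier_freq_hz)
```

Bistatic and multistatic geometries would require a different relation that accounts for the transmitter-object-receiver angles.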
  • Fig. 29 is a sequence diagram illustrating another example of processing of the sensing service. Specifically, Fig. 29 is a diagram illustrating an example of a procedure of a sensing service in a case where an RF-based sensing function is included in one or a plurality of sensors. Since Steps S201 to S204 are the same as those in Fig. 22, the description thereof is omitted.
  • the SEF 554 checks privacy of the target sensor according to the acquired sensing privacy information (Step S205). Subsequently, the SEF 554 generates a permission list including information of the sensor for which access to detection data is permitted. In a case where an RF-based sensing function (for example, an RF-based sensing function supported by 3GPP transceivers) is included in the target sensor, the SEF 554 checks the UE radio capability information of the terminal apparatus 40 with the UCMF 550. Subsequently, the SEF 554 may confirm that the transceiver (for example, a 3GPP transceiver) included in the terminal apparatus 40 supports the RF-based sensing function.
  • the UE radio capability information may include, for example, information indicating that a 3GPP transceiver mounted on the terminal apparatus 40 supports an RF-based sensing function by monostatic sensing.
  • the UE radio capability information may include, for example, information indicating that a 3GPP transceiver mounted on the terminal apparatus 40 supports an RF-based sensing function of operating as a sensing transmitter in an operation mode of bistatic sensing or multistatic sensing.
  • the UE radio capability information may include, for example, information indicating that a 3GPP transceiver mounted on the terminal apparatus 40 supports an RF-based sensing function of operating as a sensing receiver in an operation mode of bistatic sensing or multistatic sensing.
  • the terminal sensing privacy information for the RF-based sensing function supported by the 3GPP transceiver may include information (for example, information indicating presence or absence of an access privilege) regarding access for each condition of the operation modes of monostatic sensing, bistatic sensing, and multistatic sensing.
  • the user of the terminal apparatus 40 can set the access privilege to "enable" under the condition of the multistatic sensing operation mode of the RF-based sensing function supported by the 3GPP transceiver mounted on the terminal apparatus 40.
  • the terminal sensing privacy information for the RF-based sensing function supported by the transceiver of the 3GPP may include information (for example, information indicating presence or absence of an access privilege) regarding access for each condition of whether to operate as a sensing transmitter or a sensing receiver.
  • the SEF 554 configures a group of communication apparatuses including a communication apparatus (for example, the terminal apparatus 40 and/or the base station 30) equipped with a 3GPP transceiver that supports the RF-based sensing function and is permitted to access sensing data. Furthermore, within this group, the SEF 554 determines an operation mode for each communication apparatus (for example, the terminal apparatus 40 and/or the base station 30), namely, monostatic sensing, bistatic sensing, or multistatic sensing.
  • the SEF 554 sets the operation mode of monostatic sensing for a communication apparatus (for example, the terminal apparatus 40 or the base station 30) permitted to operate as both a sensing transmitter and a sensing receiver.
  • the SEF 554 sets bistatic sensing for a first communication apparatus permitted to operate as a sensing transmitter and a second communication apparatus permitted to operate as a sensing receiver.
  • the SEF 554 can also set multistatic sensing for a plurality of first communication apparatuses and a plurality of second communication apparatuses.
  • the SEF 554 sets the bistatic sensing for the first terminal apparatus 40 permitted to operate as a sensing transmitter and the second terminal apparatus 40 permitted to operate as a sensing receiver.
  • the SEF 554 can also set multistatic sensing for the plurality of first terminal apparatuses 40 and the plurality of second terminal apparatuses 40.
  • the SEF 554 sets bistatic sensing for the terminal apparatus 40 permitted to operate as a sensing transmitter and the base station 30 permitted to operate as a sensing receiver.
  • the SEF 554 can also set multistatic sensing for a plurality of terminal apparatuses 40 and a plurality of base stations 30.
  • the SEF 554 sets bistatic sensing for the base station 30 permitted to operate as a sensing transmitter and the terminal apparatus 40 permitted to operate as a sensing receiver.
  • the SEF 554 can also set multistatic sensing for a plurality of base stations 30 and a plurality of terminal apparatuses 40.
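The mode-assignment rules in the preceding bullets could be sketched as follows. Here `capabilities` maps each communication apparatus to the sensing roles ("tx"/"rx") it is permitted; the grouping logic is a simplified assumption about how the SEF 554 might apply the rules.

```python
def assign_sensing_modes(capabilities):
    """Sketch of operation-mode selection: monostatic for apparatuses
    permitted both roles; bistatic for a single tx/rx pair; multistatic
    when several transmitters and/or receivers are available."""
    both = [a for a, r in capabilities.items() if r >= {"tx", "rx"}]
    tx_only = [a for a, r in capabilities.items() if r == {"tx"}]
    rx_only = [a for a, r in capabilities.items() if r == {"rx"}]
    modes = {a: "monostatic" for a in both}
    if len(tx_only) == 1 and len(rx_only) == 1:
        modes[tx_only[0]] = "bistatic-tx"
        modes[rx_only[0]] = "bistatic-rx"
    elif tx_only and rx_only:
        for a in tx_only:
            modes[a] = "multistatic-tx"
        for a in rx_only:
            modes[a] = "multistatic-rx"
    return modes
```

In practice the selection would also weigh geometry and radio conditions, which this sketch omits.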
  • the SEF 554 starts the Nudm_UECM_Get service for the UDM 547.
  • the UDM 547 transmits the network address of the serving AMF 541 of the terminal apparatus 40 identified by the GPSI or the SUPI to the SEF 554 (Step S206).
  • the SEF 554 transmits a Namf_Sensing_ProvideSensingInfo request to the serving AMF 541 according to the network address acquired in Step S206 (Step S207).
  • the SEF 554 includes an instruction to activate the RF-based sensing function in the Namf_Sensing_ProvideSensingInfo request in Step S207 for the terminal apparatus 40 including the transceiver (for example, a 3GPP transceiver) supporting the RF-based sensing function.
  • the SEF 554 may include information regarding a frequency (for example, setting of a period or an event to be triggered) of acquiring sensing data from the target sensor in the Namf_Sensing_ProvideSensingInfo request.
  • the target sensor may include the RF-based sensing function.
  • the serving AMF 541 transmits a NAS message to start the notification of the detection data of the target sensor to the terminal apparatus 40 (Step S209).
  • the AMF 541 includes the instruction to activate the RF-based sensing function in the NAS message.
  • the terminal apparatus 40 transmits a NAS message as a response to the serving AMF 541 as necessary according to the received NAS message (Step S210).
  • the NAS message to be a response may include explicit permission from the user of the terminal apparatus 40 or a local setting of the terminal apparatus 40 (setting of an access privilege to detection data of each sensor).
  • the terminal apparatus 40 may include a permission for the instruction to activate the RF-based sensing function in the NAS message to be a response.
  • the SEF 554 may transmit the Namf_Sensing_ProvideSensingInfo request including the instruction to activate the RF-based sensing function, to the AMF 541.
  • This activation instruction may be, for example, an instruction to activate the RF-based sensing function for the RAN 510 (for example, the base station 30) equipped with a 3GPP transceiver that supports the RF-based sensing function.
  • the activation instruction may further include an instruction of an operation mode of monostatic sensing, bistatic sensing, or multistatic sensing for a communication apparatus (for example, the terminal apparatus 40 or the RAN 510 (for example, the base station 30)) equipped with a 3GPP transceiver that supports the RF-based sensing function.
  • the Namf_Sensing_ProvideSensingInfo request may further include an instruction as to whether to operate as a sensing transmitter or a sensing receiver.
  • the instruction of operation modes of the monostatic sensing, bistatic sensing, or multistatic sensing for the RAN 510 may be transmitted from the SEF 554 or the AMF 541 to the RAN 510 via an NG Application Protocol (NGAP) or a service-based interface of the RAN 510.
  • the instruction of the operation mode of the monostatic sensing or the bistatic sensing for the terminal apparatus 40 may be transmitted from the AMF 541 to the terminal apparatus 40 using the NAS message in Step S209. Furthermore, the instruction of the operation mode of the multistatic sensing for the terminal apparatus 40 may be transmitted from the AMF 541 to a plurality of the terminal apparatuses 40 using the NAS message in Step S209.
  • the communication apparatus (for example, the terminal apparatus 40 or the RAN 510) executes sensing using a transmitter and a receiver of a 3GPP transceiver included in the communication apparatus.
  • When having received the instruction of the operation mode of the bistatic sensing or the multistatic sensing, and in a case where an instruction to operate as a sensing transmitter is included in the received instruction, the communication apparatus (for example, the terminal apparatus 40 and/or the RAN 510) performs settings necessary for transmitting a sensing signal by using a transmitter of its 3GPP transceiver.
  • In a case where an instruction to operate as a sensing receiver is included in the received instruction, the communication apparatus (for example, the terminal apparatus 40 and/or the RAN 510) performs settings necessary for receiving a sensing signal by using a receiver of its 3GPP transceiver.
  • the setting necessary for receiving the sensing signal is setting of a measurement window, for example.
  • the AMF 541 transmits a request for allocating radio resources for the RF-based sensing to the terminal apparatus 40 to the RAN 510 (for example, the base station 30) (Step S701).
  • the AMF 541 may include, in this request, information regarding the frequency of acquiring sensing data from the RF-based sensing function (for example, setting of a period or a trigger event).
  • the RAN 510 (for example, the base station 30) allocates radio resources for RF-based sensing (uplink and/or downlink radio resources) to the terminal apparatus 40 by using an RRCReconfiguration message (Step S702).
  • the RAN 510 may set a measurement window instead of allocating downlink radio resources.
  • the RAN 510 may set a Configured Grant (CG) in uplink radio resources for RF-based sensing.
  • the RAN 510 may set configured scheduling (CS) in downlink radio resources for RF-based sensing.
  • the RAN 510 may set a frequency of reporting the detected sensing data to the SEF 554 (for example, a period or trigger event) in addition to the allocation of the radio resources for the RF-based sensing.
  • the setting of the frequency of the report may be performed on the terminal apparatus 40 that has been instructed to use the operation mode of the monostatic sensing, or on the terminal apparatus 40 that has been instructed to operate as the sensing receiver in the operation mode of the bistatic sensing or the multistatic sensing.
  • the terminal apparatus 40 performs setting of radio resources for RF-based sensing.
  • the terminal apparatus 40 transmits an RRCReconfigurationComplete message to the RAN 510 (for example, the base station 30) (Step S703).
  • the terminal apparatus 40 activates the RF-based sensing function (Step S704).
  • the serving AMF 541 starts Nudm_ParameterProvision_Update for sensing privacy information in order to save the setting of the access privilege received from the terminal apparatus 40 in the UDM 547 (Step S211).
  • the setting of the access privilege may include setting of an access privilege to data detected by the RF-based sensing function.
  • the serving AMF 541 returns a Namf_Sensing_ProvideSensingInfo response message to the SEF 554 as a response to the Namf_Sensing_ProvideSensingInfo request in Step S207 (Step S212).
  • the serving AMF 541 may use this response message to provide the information of the sensors (for example, sensor IDs) for which access to the detection data is permitted.
  • the information of the sensor may include information of an RF-based sensing function permitted to access the detection data.
  • When having accepted the access requested in Step S209, the terminal apparatus 40 transmits detection data of the target sensor to the SEF 554 (Step S213). In a case where the terminal apparatus 40 has accepted access to the data detected by the RF-based sensing function, the terminal apparatus 40 transmits the detection data of the RF-based sensing function to the SEF 554.
  • Since the procedures on and after Step S214 are the same as those in Fig. 22, the description thereof will be omitted.
  • radio resources for sensing can be allocated as necessary to the 3GPP transceiver that supports the RF-based sensing function, enabling effective use of the radio resources. Furthermore, the RF-based sensing function is activated only as necessary, which also contributes to reducing the power consumption of the terminal apparatus 40.
  • the above embodiments have described operations related to the sensing service of the 5G system in consideration of the privacy of the detection data of one or a plurality of sensors; however, the system (communication system 1) to which the technology disclosed in each of the above-described embodiments is applied is not limited to the 5G system.
  • the system that performs the processing related to the sensing service may be a 4G system or a system beyond the 5G system (for example, a Beyond 5G (B5G) system and/or a 6G system).
  • the information processing apparatus (for example, an apparatus serving as a service providing entity) that executes processing related to the sensing service is supposed to be the core network CN (for example, the management apparatus 20).
  • the information processing apparatus that executes the processing related to the sensing service is not limited to the core network CN (for example, the management apparatus 20).
  • the information processing apparatus that executes processing related to the sensing service may be the server 10.
  • the description of each block (the reception unit 231 to the processing unit 238) constituting the control unit 23 of the management apparatus 20 described above can be appropriately replaced with the description indicating each block (the reception unit 131 to the processing unit 138) constituting the control unit 13 of the server 10.
  • the information processing apparatus that executes processing related to the sensing service may be the base station 30 or the terminal apparatus 40. Needless to say, the information processing apparatus that executes the processing related to the sensing service may be an apparatus other than these.
  • the communication apparatus including one or a plurality of sensors is supposed to be the terminal apparatus 40.
  • the communication apparatus including one or a plurality of sensors is not limited to the terminal apparatus 40.
  • the communication apparatus including one or a plurality of sensors may be the base station 30.
  • the description of each block (notification unit 431 to acquisition unit 435) constituting the control unit 43 of the terminal apparatus 40 described above can be appropriately replaced with the description indicating each block (notification unit 331 to acquisition unit 335) constituting the control unit 33 of the base station 30.
  • the communication apparatus including one or a plurality of sensors may be the server 10 or the management apparatus 20.
  • the communication apparatus including one or a plurality of sensors may be a road-side unit. Needless to say, the communication apparatus including one or a plurality of sensors may be apparatus other than these.
  • the apparatus (for example, an apparatus serving as a service utilization entity) that requests the processing related to the sensing service is supposed to be the server 10.
  • the apparatus that requests the processing related to the sensing service is not limited to the server 10.
  • the apparatus requesting the processing related to the sensing service may be the management apparatus 20, the base station 30, or the terminal apparatus 40. Needless to say, the apparatus requesting the processing related to the sensing service may be an apparatus other than these.
  • the information processing apparatus executes processing related to the sensing service.
  • the information processing apparatus receives the request related to the sensing service, and acquires detection data of the sensor for which access to the data is determined to be permitted as the data related to the sensing service, based on the information related to the access.
  • the service to which the technology disclosed in the present embodiment is applicable is not limited to the sensing service.
  • the information processing apparatus may execute the processing disclosed in the present embodiment for services other than the sensing service.
  • the information processing apparatus may receive a request related to the service, and acquire detection data of the sensor for which access to the data is determined to be permitted as data related to the service, based on the information related to the access.
  • the “service” may be a service related to a sensing service (for example, a service necessary for executing processing of the sensing service).
  • the "sensing service” in the description of the present embodiment can be replaced with “service related to sensing service” as appropriate.
  • the "service providing entity” in the description of the present embodiment can be rephrased as an entity that provides a service related to a sensing service.
  • the “service utilization entity” in the description of the present embodiment can be rephrased as an entity that utilizes a service related to a sensing service.
  • the information processing apparatus acquires information for each sensor as the information related to access.
  • the information related to the access acquired by the information processing apparatus is not limited to the information for each sensor.
  • the information related to access may be at least one of the following pieces of information.
  • one or a plurality of sensors included in one or a plurality of communication apparatuses may be grouped into one or a plurality of groups (hereinafter, referred to as a sensor group).
  • the one or plurality of sensor groups may each include one or a plurality of sensors.
  • the information related to the access may be information for each sensor group.
  • the information processing apparatus may acquire detection data for which access is determined to be permitted based on information related to access for each sensor group, as data related to a service (for example, data related to the sensing service).
  • one or a plurality of communication apparatuses including one or a plurality of sensors may be grouped into one or a plurality of groups (hereinafter, referred to as a communication apparatus group).
  • One or a plurality of communication apparatus groups may each include one or a plurality of communication apparatuses.
  • the information related to the access may be information for each communication apparatus group.
  • the information processing apparatus may acquire detection data for which access is determined to be permitted based on information related to access for each communication apparatus group, as data related to a service (for example, data related to the sensing service).
  • one or a plurality of pieces of detection data may be grouped into one or a plurality of groups (hereinafter, referred to as a detection data group).
  • One or a plurality of detection data groups may each include one or a plurality of pieces of detection data.
  • the information related to the access may be information for each detection data group.
  • the information processing apparatus may acquire detection data for which access is determined to be permitted based on information related to access for each detection data group, as data related to a service (for example, data related to the sensing service).
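The group-based variants above share one pattern: the access decision is looked up per group rather than per sensor. A minimal sketch, with illustrative mapping structures (sensor-to-group and per-group privilege):

```python
def filter_by_group_privilege(detection_data, group_of, group_privileges):
    """Keep only detection data whose sensor belongs to a group whose
    access privilege is "enable". The same logic applies whether the
    grouping is by sensor, communication apparatus, or detection data."""
    return {sensor: data for sensor, data in detection_data.items()
            if group_privileges.get(group_of.get(sensor)) == "enable"}
```

A sensor with no group assignment, or a group with no privilege entry, is treated as not permitted in this sketch.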
  • the condition for classifying the permission information is supposed to include at least one of (C1) to (C7) described above.
  • the condition information transmitted together with the judgement request or the management information to the user is supposed to be information corresponding to the condition indicated by at least one of (C1) to (C7) described above.
  • these conditions are not limited to the conditions (C1) to (C7) described above.
  • These conditions may include conditions other than (C1) to (C7) described above.
  • these conditions may include at least one of the conditions exemplified below.
  • the condition for classifying the permission information may be a condition related to a location (area/region) where the detection data is stored.
  • the condition may be a condition that access to the detection data is permitted when the location (area/region) where the detection data for the service (for example, the sensing service) is stored is a first country, and access to the detection data is not permitted when that location (area/region) is a second country.
  • the area/region is not limited to a country.
  • the area/region may be a unit area/region larger than a country, or the area/region may be a unit area/region smaller than a country.
  • the condition for classifying permission information may be a condition related to duplication of detection data.
  • the condition may be a condition that access to the detection data is permitted in a case where an entity that acquires information related to the detection data has a duplication management function (for example, a duplication restriction function such as Digital Rights Management (DRM)) regarding the detection data, and access to the detection data is not permitted in a case where the entity has no duplication management function.
  • the entity that acquires the information related to the detection data is, for example, an entity that directly or indirectly acquires information related to the detection data (for example, detection data and/or information generated based on the detection data).
  • the entity that acquires information related to detection data may be a service providing entity, a service utilization entity, or an entity that directly or indirectly acquires the information from the service utilization entity.
  • the condition related to duplication of detection data may include the permitted number of duplications of the target detection data.
  • duplication of the detection data is allowed only up to that permitted number of times.
  • the duplication management function can manage the remaining number of duplications of the target detection data so as to restrict duplication.
  • the duplication management function may be managed by a smart contract in a Decentralized Autonomous Organization (DAO) described below.
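A duplication management function of the kind described above can be sketched as a counter of remaining permitted duplications. This is an illustrative assumption, not the DRM mechanism of the specification; the identifiers and in-memory store are invented for the example.

```python
# Illustrative sketch of a duplication management (DRM-like) function that
# tracks the remaining number of permitted duplications per detection-data item.

class DuplicationManager:
    def __init__(self):
        self._remaining = {}

    def register(self, data_id, max_copies):
        """Record how many duplications of the item are permitted."""
        self._remaining[data_id] = max_copies

    def duplicate(self, data_id):
        """Permit a copy only while the remaining count is positive."""
        left = self._remaining.get(data_id, 0)
        if left <= 0:
            return False  # duplication not permitted
        self._remaining[data_id] = left - 1
        return True

mgr = DuplicationManager()
mgr.register("sample-001", max_copies=2)
results = [mgr.duplicate("sample-001") for _ in range(3)]  # [True, True, False]
```

The third attempt fails because the permitted number of duplications has been exhausted, which is the restriction the text describes.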
  • the condition for classifying the permission information may be a condition related to data integrity (DI) of detection data.
  • the condition may be a condition that access to detection data is permitted in a case where the entity who acquires the information related to the detection data has a DI function regarding the detection data, and access to the detection data is not permitted in a case where the entity has no DI function.
  • the DI function may include a function of checking whether the detection data has been altered.
  • the DI function may include a function of checking whether there is a missing and/or mismatch in the detection data. Note that the entity that acquires the information related to the detection data may be similar to the entity described in Example 2.
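The alteration check of the DI function can be illustrated with a standard cryptographic digest comparison. The use of SHA-256 here is an assumption for the sketch; the specification does not prescribe a particular algorithm.

```python
# Minimal sketch of a data-integrity (DI) check: detection data is considered
# unaltered when a stored SHA-256 digest matches a recomputed one.

import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"sensor-reading: 23.5C"
stored_digest = digest(original)  # digest saved when the data was produced

# Unaltered data passes the check; tampered data fails it.
unaltered_ok = digest(original) == stored_digest
tampered = b"sensor-reading: 99.9C"
alteration_detected = digest(tampered) != stored_digest
```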
  • the DI function can also be implemented by a distributed autonomous organization referred to as a Decentralized Autonomous Organization (DAO) constituted by a node or an entity in the system.
  • the DAO is a governance method that is based on a blockchain and does not require centralized management, and is therefore attracting attention as a new, democratic form of governance.
  • a node or entity in the system holds a governance token.
  • the governance token is a token for giving a right to participate in a judgment/determination processing in the DAO to a node or an entity that holds the token.
  • the judgment/determination processing in the system is performed by voting by nodes or entities that hold the governance token, according to a mechanism referred to as a smart contract.
  • the smart contract is implemented, for example, in the form of a program that is autonomously executable as a policy/rule for judgment/determination processing in the system.
  • the judgment/determination processing in the system includes authorization and verification regarding the access privilege of the detection data, authorization and verification regarding update of the access privilege of the detection data, authorization and verification regarding transmission of the detection data from the first node or entity to the second node or entity, and the like.
  • the smart contract verifies that the detection data has been correctly transmitted from the first node or entity to the second node or entity.
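The token-gated voting described above can be sketched as follows. This is a simplified, hypothetical model of smart-contract-style verification, not an actual blockchain contract: the node names, the majority quorum, and the vote format are assumptions for illustration.

```python
# Hypothetical sketch: only votes cast by holders of the governance token are
# counted, and a transfer of detection data is verified when a majority of the
# counted votes approve it. Votes from nodes without the token are discarded.

def verify_transfer(votes, token_holders, quorum=0.5):
    """Approve when more than `quorum` of token-holder votes are approvals."""
    valid = {voter: approve for voter, approve in votes.items()
             if voter in token_holders}  # non-holders' votes are ignored
    if not valid:
        return False
    approvals = sum(1 for ok in valid.values() if ok)
    return approvals / len(valid) > quorum

token_holders = {"node-a", "node-b", "node-c"}
votes = {"node-a": True, "node-b": True, "node-c": False, "node-x": True}
approved = verify_transfer(votes, token_holders)  # node-x's vote is discarded
```

With two of three valid approvals, the transfer would be verified; a vote from a node without the governance token has no effect on the outcome.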
  • the system may implement services other than the DI function by configuring the DAO.
  • the system may classify one or a plurality of services provided by each node or entity into a service based on a smart contract and a service not based on a smart contract.
  • the nodes or entities in the system may include at least one of the server 10, the management apparatus 20, the base station 30, and the terminal apparatus 40.
  • the nodes or entities in the system may include at least one of the communication unit 11, the storage unit 12, and the control unit 13, which constitute the server 10. Furthermore, the nodes or entities in the system may include at least one of the reception unit 131, the acquisition unit 132, the generation unit 133, the saving unit 134, the request unit 135, the update unit 136, the setting unit 137, and the processing unit 138, which constitute the control unit 13.
  • the nodes or entities in the system may include the core network CN which is a form of the management apparatus 20. Furthermore, the nodes or entities in the system may include network functions constituting the core network CN.
  • the nodes or entities in the system may include at least one of the communication unit 21, the storage unit 22, and the control unit 23, which constitute the management apparatus 20. Furthermore, the nodes or entities in the system may include at least one of the reception unit 231, the acquisition unit 232, the generation unit 233, the saving unit 234, the request unit 235, the update unit 236, the setting unit 237, and the processing unit 238, which constitute the control unit 23.
  • the nodes or entities in the system may include at least one of the wireless communication unit 31, the storage unit 32, the control unit 33, and the sensor unit 34, which constitute the base station 30. Furthermore, this may include the transmission processing unit 311, the reception processing unit 312, and the antenna 313, which constitute the wireless communication unit 31.
  • the nodes or entities in the system may include the notification unit 331, the reception unit 332, the transmission unit 333, the determination unit 334, and the acquisition unit 335 constituting the control unit 33.
  • Nodes or entities in the system may include sublayers (for example, at least one of RRC, SDAP, PDCP, RLC, MAC, and PHY) constituting the wireless communication unit 31.
  • the nodes or entities in the system may include the wireless communication unit 41, the storage unit 42, the control unit 43, the input unit 44, the output unit 45, and the sensor unit 46, which constitute the terminal apparatus 40. Furthermore, the node or entity in the system may include at least one of the transmission processing unit 411, the reception processing unit 412, and the antenna 413, which constitute the wireless communication unit 41.
  • the nodes or entities in the system may include at least one of the notification unit 431, the reception unit 432, the transmission unit 433, the determination unit 434, and the acquisition unit 435, which constitute the control unit 43.
  • Nodes or entities in the system may include sublayers (for example, at least one of RRC, SDAP, PDCP, RLC, MAC, and PHY) constituting the wireless communication unit 41.
  • the concept of services provided by each node or entity may encompass the concept of processing performed by each node or entity.
  • Each node or entity may acquire input information necessary for processing to be executed from another node or entity being a consumer of a service, and may provide a part or all of a result of the executed processing to another node or entity being a consumer.
  • the execution of the processing regarding the service by the smart contract is authorized and verified by all nodes or entities having the governance token in the system.
  • the processing performed on the service based on the smart contract by the node or entity having no governance token is ignored or discarded in the system.
  • the execution of the processing regarding the service not based on the smart contract is authorized and verified by a specific node or entity in the system.
  • the specific node or entity may be set in advance, or may be determined by a smart contract.
  • processing executed by a specific node or entity may be set in advance or may be determined by a smart contract.
  • the classification into services based on the smart contract and services not based on the smart contract is itself managed by a smart contract in the system.
  • the change in the classification is authorized and verified by the smart contract.
  • the condition for classifying permission information may be a condition related to secondary use of detection data.
  • the condition may be a condition that access to the detection data is permitted in a case where the entity that acquires information related to the detection data does not perform secondary use of the detection data, and access to the detection data is not permitted in a case where the entity performs secondary use of the detection data.
  • the condition may be a condition that access to the detection data is permitted in a case where the entity that acquires detection data does not perform tertiary use of the detection data after the secondary use of the detection data, and access to the detection data is not permitted in a case where the entity performs the tertiary use of the detection data after the secondary use of the detection data.
  • the entity that acquires the information related to the detection data may be similar to the entity described in Example 2.
  • the condition for classifying the permission information may be a condition related to a detection data disclosure method.
  • the condition may be a condition used in a case where an entity that acquires information related to detection data discloses the detection data to another entity, such that access to the detection data is permitted in a case where the detection data is disclosed after being processed (for example, with privacy protection processing), and access to the detection data is not permitted in a case where the detection data is disclosed without being processed (for example, without privacy protection processing).
  • the entity that acquires the information related to the detection data may be similar to the entity described in Example 2.
  • the condition for classifying the permission information may be a condition related to an entity disclosing detection data.
  • the condition may be a condition that access to the detection data is permitted in a case where the entity disclosing the information related to the detection data is a first entity, and access to the detection data is not permitted in a case where the entity disclosing the information related to the detection data is a second entity.
  • the disclosing entity as a condition may be a natural person or a group such as a corporation or an organization.
  • the disclosing entity as the condition may be a specific employee of a service providing entity, or may be a specific group (for example, a unit or section) constituting the service providing entity.
  • control apparatus that controls the server 10, the management apparatus 20, the base station 30, or the terminal apparatus 40 of the present embodiment may be implemented by a dedicated computer system or by a general-purpose computer system.
  • a program for executing the above-described operations is stored in a computer-readable recording medium such as an optical disk, semiconductor memory, a magnetic tape, or a flexible disk and distributed.
  • the program is installed on a computer and the above processing is executed to achieve the configuration of the control apparatus.
  • the control apparatus may be an apparatus (for example, a personal computer) external to the server 10, the management apparatus 20, the base station 30, or the terminal apparatus 40.
  • the control apparatus may be an apparatus (for example, the control unit 13, the control unit 23, the control unit 33, or the control unit 43) inside the server 10, the management apparatus 20, the base station 30, or the terminal apparatus 40.
  • the communication program can be stored in a disk device included in a server apparatus on a network such as the Internet so as to be able to be downloaded to a computer, for example.
  • the functions described above may be implemented by using operating system (OS) and application software in cooperation.
  • the portions other than the OS can be stored in a medium for distribution, or the portions other than the OS can be stored in a server apparatus so as to be downloaded to a computer, for example.
  • each component of each apparatus is illustrated as a functional concept and does not necessarily need to be physically configured as illustrated. That is, the specific form of distribution/integration of each of the apparatuses is not limited to those illustrated in the drawings, and all or a part thereof can be functionally or physically distributed or integrated into arbitrary units according to various loads and use situations. This configuration by distribution or integration may be performed dynamically.
  • the present embodiment can be implemented as any configuration constituting an apparatus or a system, for example, a processor as a large scale integration (LSI) or the like, a module using a plurality of processors or the like, a unit using a plurality of modules or the like, or a set obtained by further adding other functions to the unit (that is, a configuration of a part of the apparatus).
  • the system LSI may be referred to as a System on Chip (SOC).
  • each of the apparatuses described above or below may be construed as a processor (for example, the CPU) as a system LSI (for example, SoC) or a module that uses or constitutes the processor.
  • the present embodiment may be implemented by any configuration constituting an apparatus or a system, for example, a modem chip (baseband chip) or a Radio Frequency (RF) unit, or a combination thereof.
  • the RF unit includes at least one of an RF circuit and RF Front-end.
  • each of the apparatuses described above or below may be construed as a modem chip (baseband chip), an RF unit, or a combination thereof. Additionally or alternatively, each of the apparatuses described above or below may be construed as a module using or constituting the modem chip or the RF unit.
  • the modem chip performs signal processing related to communication in the apparatus (an apparatus described above or below).
  • the modem chip may have at least a function of a modulator or a demodulator.
  • the RF unit may have a function of at least one of an RF transceiver (RF upconverter, RF downconverter), a power amplifier, and a low noise amplifier.
  • the RF transceiver performs conversion between a baseband signal and an RF signal.
  • the power amplifier performs amplification for transmitting a signal from the antenna.
  • the low noise amplifier amplifies a weak signal received at the antenna.
  • the RF unit (in particular, the RF front-end) may include at least one of the above-described power amplifier, low noise amplifier, envelope tracker, filter, duplexer, multiplexer, antenna switch, and antenna tuner.
  • a system represents a set of a plurality of components (apparatuses, modules (components), or the like), regardless of whether all the components are in the same housing.
  • a plurality of apparatuses housed in separate housings and connected via a network or the like, and one apparatus in which a plurality of modules is housed in one housing, are both systems.
  • the present embodiment can employ a configuration of cloud computing in which one function is cooperatively shared and processed by a plurality of apparatuses via a network.
  • the information processing apparatus when having received a request related to a sensing service from a service utilization entity (for example, the server 10 or the terminal apparatus 40), acquires information related to access to detection data of one or a plurality of sensors included in one or a plurality of communication apparatuses (for example, one or more terminal apparatuses 40), for each sensor.
  • the information related to access includes, for example, information (permission information) indicating whether to permit access to the detection data for each sensor.
  • the information related to access may include, for example, information (permission information) indicating whether to permit access to the detection data for each data related to the sensor.
  • the information related to access may include, for example, information (permission information) indicating whether to permit access to the detection data for each sensing data.
  • the information processing apparatus determines a sensor for which access to the detection data is permitted based on the information related to the access. Subsequently, the information processing apparatus acquires data detected by the sensor for which access is determined to be permitted. Subsequently, the information processing apparatus transmits the acquired detection data to the service utilization entity.
  • the information processing apparatus can decide whether to permit access to information related to privacy (for example, detection data of one or a plurality of sensors) in units of sensors instead of in units of services. That is, the information processing apparatus can perform privacy management with finer granularity than before. As a result, it is possible to sufficiently protect privacy of a person related to the communication apparatus.
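The per-sensor flow summarized above can be sketched as a small pipeline: determine the sensors whose detection data may be accessed, acquire their data, and return it as data related to the service. The sensor names, readings, and the shape of the access information are illustrative assumptions, not the apparatus's actual interfaces.

```python
# Hedged sketch of per-sensor (rather than per-service) privacy management:
# only sensors individually marked as permitted contribute detection data.

def serve_sensing_request(sensors, access_info):
    """Filter sensors by per-sensor permission and read the permitted ones."""
    permitted = [s for s in sensors if access_info.get(s["name"], False)]
    return {s["name"]: s["read"]() for s in permitted}

sensors = [
    {"name": "camera", "read": lambda: "frame-0001"},
    {"name": "gnss", "read": lambda: (35.68, 139.76)},
]
access_info = {"camera": False, "gnss": True}  # decided per sensor
payload = serve_sensing_request(sensors, access_info)  # only gnss data
```

Because the decision is made per sensor, the camera's data is withheld even though the same sensing service is permitted to use the GNSS data, which is the finer privacy granularity the text describes.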
  • the communication apparatus (for example, the terminal apparatus 40) according to the present embodiment transmits capability information of the sensor to the information processing apparatus.
  • information regarding privacy is added to the information to be transmitted to the information processing apparatus as one piece of capability information.
  • (1) An information processing apparatus comprising: a reception unit that receives a request related to a sensing service; a first acquisition unit that acquires information related to privacy related to detection data of one or a plurality of sensors included in one or a plurality of communication apparatuses, for each sensor; and a second acquisition unit that acquires detection data of the sensor for which access is determined to be permitted based on the information related to privacy, as data related to the sensing service.
  • (2) The information processing apparatus according to (1), wherein the request related to the sensing service includes identification information for identifying a use case of the sensing service, and the second acquisition unit acquires detection data of the sensor, for which access has been determined to be permitted based on the information related to privacy and which corresponds to the use case indicated by the identification information, as data related to the sensing service.
  • the information related to privacy includes permission information indicating whether to permit access to the detection data of the sensor, the permission information is classified according to a condition, and the second acquisition unit acquires detection data of the sensor for which access is determined to be permitted based on the permission information classified by the condition, as data related to the sensing service.
  • the information processing apparatus according to any one of (1) to (4), further comprising a request unit that requests the communication apparatus for management information that is managed by the communication apparatus and is related to permission of access to detection data of the one or a plurality of sensors included in the communication apparatus, wherein the second acquisition unit acquires detection data of the sensor for which access is determined to be permitted based on the management information, as data related to the sensing service.
  • the request unit transmits a request for the management information together with condition information, and acquires, from the communication apparatus, the management information corresponding to the transmitted condition information.
  • the information processing apparatus further comprising an update unit that updates the information related to privacy held as a part of subscriber information with the management information.
  • a communication apparatus equipped with one or a plurality of sensors, the communication apparatus comprising: a notification unit that sends a notification of information related to privacy related to detection data of the one or the plurality of sensors for each of the sensors to an information processing apparatus that receives a request related to a sensing service; a reception unit that receives a request for access to the detection data of the sensor from the information processing apparatus; and a transmission unit that transmits, in a case where the access is permitted, detection data of the sensor regarding the request for access to the information processing apparatus, as data related to the sensing service.
  • the communication apparatus further comprising a determination unit that determines permission or non-permission of access to the detection data of the sensor related to the request for access based on permission information indicating whether to permit access to the detection data of the sensor, wherein the permission information is classified by a condition, and the determination unit determines the permission or non-permission of access based on the permission information classified according to the condition.
  • the condition for classifying the permission information includes a condition related to a contract concluded by a user who utilizes the sensing service, the contract regarding the request related to the sensing service.
  • the condition for classifying the permission information includes a condition related to a use case of the sensing service.
  • the communication apparatus according to any one of (10) to (12), wherein the condition for classifying the permission information includes a condition related to at least one of an area in which the sensing service is utilized and a time zone in which the sensing service is utilized.
  • the condition for classifying the permission information includes a condition related to a disclosure recipient of detection data of the sensor.
  • the condition for classifying the permission information includes a condition related to a type of data generated by the information processing apparatus using detection data of the sensor.
  • An information processing method comprising: receiving a request related to a sensing service; acquiring information related to privacy related to detection data of one or a plurality of sensors included in one or a plurality of communication apparatuses, for each sensor; and acquiring detection data of the sensor for which access is determined to be permitted based on the information related to privacy, as data related to the sensing service.
  • (X1) An information processing apparatus including: a reception unit that receives a request related to a service; a first acquisition unit that acquires information related to access to detection data of one or a plurality of sensors included in one or a plurality of communication apparatuses, for each sensor; and a second acquisition unit that acquires detection data of the sensor for which access is determined to be permitted based on the information related to access, as data related to the service.
  • (X2) The information processing apparatus according to (X1), wherein the request related to the service includes identification information for identifying a use case of the service, and the second acquisition unit acquires detection data of the sensor, for which access has been determined to be permitted based on the information related to access and which corresponds to the use case indicated by the identification information, as data related to the service.
  • (X3) The information processing apparatus according to (X1) or (X2), further including a generation unit that generates a list of the sensors for which access to detection data is permitted, based on the information related to access, wherein the second acquisition unit acquires detection data of the sensor for which access is determined to be permitted based on the list, as data related to the service.
  • (X4) The information processing apparatus according to any one of (X1) to (X3), wherein the information related to access includes permission information indicating whether to permit access to the detection data of the sensor, the permission information is classified according to a condition, and the second acquisition unit acquires detection data of the sensor for which access is determined to be permitted based on the permission information classified by the condition, as data related to the service.
  • (X5) The information processing apparatus according to (X4), wherein the condition for classifying the permission information includes a condition related to a contract concluded by a user who utilizes the service, the contract regarding the request related to the service.
  • (X6) The information processing apparatus according to (X4) or (X5), wherein the condition for classifying the permission information includes a condition related to the use case of the service.
  • (X7) The information processing apparatus according to any one of (X4) to (X6), wherein the condition for classifying the permission information includes a condition related to at least one of a utilization area of the service and a utilization time zone of the service.
  • (X8) The information processing apparatus according to any one of (X4) to (X7), wherein the condition for classifying the permission information includes a condition related to saving of detection data of the sensor.
  • (X9) The information processing apparatus according to any one of (X4) to (X8), wherein the condition for classifying the permission information includes a condition related to a disclosure recipient of detection data of the sensor.
  • (X10) The information processing apparatus according to any one of (X4) to (X9), wherein the condition for classifying the permission information includes a condition related to a type of data generated using detection data of the sensor.
  • (X11) The condition for classifying the permission information includes a condition related to use of distributed processing of an artificial intelligence/machine learning model (AI/ML model) used for the service, the AI/ML model being a model to which detection data of the sensor is to be input.
  • (X12) The condition for classifying the permission information includes a first condition related to a learning usage of the artificial intelligence/machine learning model (AI/ML model) used for the service, the AI/ML model being a model to which detection data of the sensor is to be input.
  • (X13) The condition for classifying the permission information further includes a second condition indicating whether saving of detection data of the sensor is permitted in learning of the AI/ML model in a learning usage satisfying the first condition.
  • (X14) The information processing apparatus according to (X13), further including a saving unit that saves detection data of the sensor in a case where saving is permitted under the second condition in the learning of the AI/ML model in the learning usage satisfying the first condition.
  • (X15) The information processing apparatus according to any one of (X1) to (X14), further including a request unit that requests a user of the communication apparatus to make a judgment related to permission of access to detection data of the sensor, wherein the second acquisition unit acquires detection data of the sensor for which access is determined to be permitted based on a result of the judgement made by the user, as data related to the service.
  • (X16) The information processing apparatus according to (X15), wherein the request unit transmits a request for a judgment related to the permission of access together with condition information, and acquires the result of the judgment made by the user under the transmitted condition, from the communication apparatus.
  • (X17) The information processing apparatus according to (X15) or (X16), further including an update unit that updates the information related to access held as a part of subscriber information with the result of judgment made by the user.
  • (X18) The information processing apparatus according to any one of (X15) to (X17), wherein the request unit transmits the request for the judgment related to the permission of access together with information related to a validity period or an expiration point set in the detection data of the sensor.
  • (X19) The information processing apparatus according to (X18), wherein the validity period or the expiration point is designated according to the use case of the service.
  • (X20) The information processing apparatus according to any one of (X1) to (X19), wherein the second acquisition unit acquires the detection data via a user plane.
  • (X21) The information processing apparatus according to (X20), wherein the second acquisition unit transmits, to the communication apparatus, a message requesting establishment of a session triggered by a network.
  • (X22) The information processing apparatus according to any one of (X1) to (X14), further including a request unit that requests the communication apparatus for management information that is managed by the communication apparatus and is related to permission of access to detection data of the one or a plurality of sensors included in the communication apparatus, wherein the second acquisition unit acquires detection data of the sensor for which access is determined to be permitted based on the management information, as data related to the service.
  • (X23) The information processing apparatus according to (X22), wherein the permission or non-permission of access is classified according to a condition, and the request unit transmits a request for the management information together with condition information, and acquires, from the communication apparatus, the management information corresponding to the transmitted condition information.
  • (X24) The information processing apparatus according to (X22) or (X23), further including an update unit that updates the information related to access held as a part of subscriber information with the management information.
  • (X25) The information processing apparatus according to any one of (X22) to (X24), wherein the request unit transmits the request for the management information together with information related to a validity period or an expiration point set in the detection data of the sensor.
  • (X26) The information processing apparatus according to (X25), wherein the validity period or the expiration point is designated according to the use case of the service.
  • (X27) The information processing apparatus according to any one of (X1) to (X26), wherein the first acquisition unit acquires information indicating a type of a sensor needed for each use case of the service, and the second acquisition unit determines the sensor to be an acquisition target of data related to the service based on the information indicating the type of the sensor.
  • (X28) The information processing apparatus according to (X27), wherein the information indicating the type of the sensor is information generated based on information related to the type of the sensor acquired from an apparatus that makes a request related to the service.
  • (X29) The information processing apparatus further including: a third acquisition unit that acquires, in a case where the use case of the service is sensor fusion, information related to location of the communication apparatus to be provided with control by the sensor fusion; a setting unit that sets an area or a region based on the information related to location; and a specifying unit that specifies a second communication apparatus included in the area or the region, wherein the first acquisition unit and the second acquisition unit acquire the information related to access and the data related to the service by using the second communication apparatus as the communication apparatus.
  • (X30) The information processing apparatus according to (X29), wherein the specifying unit specifies at least one of a base station and a road-side unit included in the area or the region, and the first acquisition unit and the second acquisition unit acquire the information related to access and the data related to the service by using at least one of the base station and the road-side unit as the communication apparatus.
  • (X31) The information processing apparatus according to (X29) or (X30), further including a processing unit that requests execution of processing of analysis and/or combining of data by the sensor fusion according to a request from the apparatus that makes a request related to the service.
  • (X32) The information processing apparatus according to (X31), wherein the processing unit requests a network data analysis function to perform the processing of analysis and/or combining of data.
  • (X33) The information processing apparatus according to (X31), wherein the processing unit requests an edge application server to perform the processing of analysis and/or combining of data.
  • (X34) The information processing apparatus according to any one of (X1) to (X31), further including a second request unit that requests the communication apparatus to activate an RF-based sensing function supported by a 3GPP transceiver based on the information related to access to the sensing function in a case where the one or the plurality of sensors of the communication apparatus includes the RF-based sensing function.
  • (X35) The information processing apparatus according to (X34), wherein the second request unit judges that the 3GPP transceiver included in the communication apparatus supports the RF-based sensing function based on radio capability information related to the communication apparatus.
  • (X36) The information processing apparatus according to (X34) or (X35), wherein the request for activation of the RF-based sensing function includes an instruction of an operation mode of monostatic sensing, bistatic sensing, or multistatic sensing for the communication apparatus.
  • (X37) The information processing apparatus according to (X36), wherein, in a case where the request for activation includes the instruction of the operation mode of the bistatic sensing or the multistatic sensing, the request for the activation further includes a second instruction of whether to act as a sensing transmitter or a sensing receiver.
  • (X38) The information processing apparatus according to any one of (X34) to (X37), wherein, in a case where the one or the plurality of sensors of the communication apparatus include the RF-based sensing function supported by the 3GPP transceiver, the second request unit requests the base station to allocate radio resources for sensing to the communication apparatus when activating the RF-based sensing function based on the information related to access to the sensing function.
  • (X39) The information processing apparatus, wherein the request related to the service is a request related to a sensing service.
  • (X40) The information processing apparatus according to (X39), wherein the sensing service is a service of providing detection data of the one or plurality of sensors.
  • (X41) The information processing apparatus according to (X39) or (X40), wherein the sensing service is a service based on the detection data of the plurality of sensors.
  • (X42) The information processing apparatus according to any one of (X39) to (X41), wherein the one or plurality of sensors include a sensor for detecting an image and/or a shape of an object, and the sensing service is a service based on image data or shape data detected by the sensor.
  • (X43) The information processing apparatus according to any one of (X1) to (X42), wherein the communication apparatus equipped with the one or plurality of sensors is an apparatus different from an apparatus that transmits a request related to the service.
  • (X44) An information processing method including: receiving a request related to a service; acquiring information related to access to detection data of one or a plurality of sensors included in one or a plurality of communication apparatuses, for each sensor; and acquiring detection data of the sensor for which access is determined to be permitted based on the information related to access, as data related to the service.
  • (X45) A program for causing a computer to function as: a reception unit that receives a request related to a service; a first acquisition unit that acquires information related to access to detection data of one or a plurality of sensors included in one or a plurality of communication apparatuses, for each sensor; and a second acquisition unit that acquires detection data of the sensor for which access is determined to be permitted based on the information related to access, as data related to the service.
  • (Y1) A communication apparatus equipped with one or a plurality of sensors, including: a notification unit that sends a notification of information related to access to detection data of the one or the plurality of sensors, for each sensor, to an information processing apparatus that receives a request related to a service; a reception unit that receives a request for access to the detection data of the sensor from the information processing apparatus; and a transmission unit that transmits, in a case where the access is permitted, detection data of the sensor regarding the request for access to the information processing apparatus as data related to the service.
  • (Y2) The communication apparatus according to (Y1), further including a determination unit that determines permission or non-permission of access to the detection data of the sensor related to the request for access based on permission information indicating whether to permit access to the detection data of the sensor, wherein the permission information is classified according to a condition, and the determination unit determines the permission or non-permission of access based on the permission information classified according to the condition.
  • (Y3) The communication apparatus according to (Y2), wherein the condition for classifying the permission information includes a condition related to a contract concluded by a user who utilizes the service, the contract regarding the request related to the service.
  • (Y4) The communication apparatus according to (Y2) or (Y3), wherein the condition for classifying the permission information includes a condition related to the use case of the service.
  • (Y5) The communication apparatus according to any one of (Y2) to (Y4), wherein the condition for classifying the permission information includes a condition related to at least one of an area in which the service is utilized and a time zone in which the service is utilized.
  • (Y6) The communication apparatus according to any one of (Y2) to (Y5), wherein the condition for classifying the permission information includes a condition related to saving of detection data of the sensor.
  • (Y7) The communication apparatus according to any one of (Y2) to (Y6), wherein the condition for classifying the permission information includes a condition related to a disclosure recipient of detection data of the sensor.
  • (Y8) The communication apparatus according to any one of (Y2) to (Y7), wherein the condition for classifying the permission information includes a condition related to a type of data generated by the information processing apparatus using detection data of the sensor.
  • (Y9) The communication apparatus according to any one of (Y2) to (Y8), wherein the condition for classifying the permission information includes a condition related to use of distributed processing of an artificial intelligence/machine learning model (AI/ML model) used for the service, the AI/ML model being a model to which detection data of the sensor is to be input.
  • (Y10) The communication apparatus according to any one of (Y2) to (Y9), wherein the condition for classifying the permission information includes a first condition related to a learning usage of an artificial intelligence/machine learning model (AI/ML model) used for the service, the AI/ML model being a model to which detection data of the sensor is to be input.
  • (Y11) The communication apparatus according to (Y10), wherein the condition for classifying the permission information further includes a second condition indicating whether saving of detection data of the sensor is permitted in learning of the AI/ML model in a learning usage satisfying the first condition.
  • (Y12) The communication apparatus according to any one of (Y2) to (Y11), wherein the request for access includes condition information designated by the information processing apparatus, and the determination unit determines permission or non-permission of access to the detection data of the sensor based on the condition information included in the request for access.
  • (Y13) The communication apparatus according to any one of (Y2) to (Y12), wherein the transmission unit transmits detection data of the sensor related to the request for access together with information of a validity period or an expiration point set in the data.
  • (Y14) The communication apparatus according to any one of (Y2) to (Y13), wherein the transmission unit transmits detection data of the sensor related to the request for access together with information instructing discard of the data after a lapse of the validity period or the expiration point set in the data.
  • (Y15) The communication apparatus according to any one of (Y2) to (Y14), wherein the reception unit receives an instruction to request a user of the communication apparatus to make a judgment related to permission of access to the detection data of the sensor, the notification unit sends a notification of a result of the judgment of the user to the information processing apparatus, and in a case where the notification of the result of the judgment has been sent to the information processing apparatus, the determination unit determines the permission or non-permission of access based on the result of the judgment of the user rather than the permission information.
  • (Y16) The communication apparatus according to any one of (Y2) to (Y15), wherein the reception unit receives an instruction for notification of management information that is managed by the communication apparatus and is related to permission of access to detection data of one or a plurality of sensors included in the communication apparatus, the notification unit sends a notification of the management information to the information processing apparatus, and in a case where the notification of the management information has been sent to the information processing apparatus, the determination unit determines the permission or non-permission of access based on the management information rather than the permission information.
  • (Y17) The communication apparatus according to (Y16), wherein the permission or non-permission of access is classified according to a condition, the instruction for notification of the management information includes condition information, and the notification unit sends a notification of the management information corresponding to the condition information to the information processing apparatus.
  • (Y18) The communication apparatus according to any one of (Y1) to (Y17), further including an acquisition unit that acquires, from a base station, a request for activation of the RF-based sensing function in a case where the one or plurality of sensors include an RF-based sensing function supported by a 3GPP transceiver.
  • (Y21) The communication apparatus according to any one of (Y18) to (Y20), wherein, in a case where the one or plurality of sensors include the RF-based sensing function supported by the 3GPP transceiver, the acquisition unit acquires, from a base station, information related to allocation of radio resources for sensing to the communication apparatus when activating the RF-based sensing function.
  • (Y22) The communication apparatus according to any one of (Y1) to (Y21), wherein the request related to the service is a request related to a sensing service.
  • (Y23) The communication apparatus according to (Y22), wherein the sensing service includes a service of providing the detection data.
  • A communication method executed by a communication apparatus equipped with one or a plurality of sensors, the method including: sending a notification of information related to access to detection data of the one or the plurality of sensors, for each sensor, to an information processing apparatus that receives a request related to a service; receiving a request for access to the detection data of the sensor from the information processing apparatus; and transmitting, in a case where the access is permitted, detection data of the sensor regarding the request for access to the information processing apparatus.
  • (Y28) A program for causing a communication apparatus equipped with one or a plurality of sensors to function as: a notification unit that sends a notification of information related to access to detection data of the one or the plurality of sensors, for each sensor, to an information processing apparatus that receives a request related to a service; a reception unit that receives a request for access to the detection data of the sensor from the information processing apparatus; and a transmission unit that transmits, in a case where the access is permitted, detection data of the sensor regarding the request for access to the information processing apparatus.
  • Reference Signs List: Communication system; 10 Server; 20 Management apparatus; 30 Base station; 40 Terminal apparatus; 11, 21 Communication unit; 31, 41 Wireless communication unit; 12, 22, 32, 42 Storage unit; 13, 23, 33, 43 Control unit; 34, 46 Sensor section; 44 Input unit; 45 Output unit; 311, 411 Transmission processing unit; 312, 412 Reception processing unit; 313, 413 Antenna; 131, 231 Reception unit; 132, 232 Acquisition unit; 133, 233 Generation unit; 134, 234 Saving unit; 135, 235 Request unit; 136, 236 Update unit; 137, 237 Setting unit; 138, 238 Processing unit; 331, 431 Notification unit; 332, 432 Reception unit; 333, 433 Transmission unit; 334, 434 Determination unit; 335, 435 Acquisition unit; 510 RAN/AN; 520 UPF; 530 DN; 540 Control plane function group; 560 LCS client; 570 SES client; 541 AMF; 542 NEF; 543 NRF; 544 NSSF; 545 PCF; 546 SMF; 547 UDM; 548 AF; 549 AUSF; 550 UCMF; 551 LMF; 552 GMLC; 553 LRF; 554 S
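The condition-classified permission model described in (Y2) through (Y7) above — per-sensor permission information keyed by conditions such as use case, area, and time zone — can be illustrated with a minimal sketch. All class and function names below are hypothetical illustrations, not part of the claimed apparatus.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass(frozen=True)
class Condition:
    """Hypothetical condition keys drawn from the classification
    conditions of (Y4) and (Y5): use case, area, and time zone."""
    use_case: str   # e.g. "sensor_fusion"
    area: str       # e.g. "downtown"
    time_zone: str  # e.g. "daytime"

@dataclass
class SensorPermission:
    """Permission information for one sensor, classified by condition (Y2)."""
    rules: Dict[Condition, bool] = field(default_factory=dict)
    default: bool = False  # deny access unless a matching rule permits it

    def is_permitted(self, condition: Condition) -> bool:
        return self.rules.get(condition, self.default)

def permitted_sensors(
    permissions: Dict[str, SensorPermission], condition: Condition
) -> List[str]:
    """Return the sensor IDs whose detection data may be accessed under the
    given condition; only these become acquisition targets for the service."""
    return [sid for sid, perm in permissions.items() if perm.is_permitted(condition)]
```

Under this sketch, a determination unit on the communication-apparatus side would evaluate `is_permitted` against the condition information carried in the access request, as in (Y12), and the information processing apparatus would acquire detection data only from the sensors returned by `permitted_sensors`.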

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

An information processing apparatus includes processing circuitry configured to receive a request related to a sensing service; acquire, for each sensor of one or a plurality of sensors included in one or a plurality of communication apparatuses, privacy information related to detection data of the sensor; and acquire detection data of a sensor, among the one or plurality of sensors, for which access is determined to be permitted based on the acquired privacy information, the detection data being acquired as data related to the sensing service.
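The flow summarized in the abstract — receive a sensing-service request, check per-sensor privacy information, and acquire detection data only where access is permitted, honoring any expiration point set in the data as in (X15) to (X17) — might be sketched as follows; every name here is an illustrative assumption, not the claimed implementation.

```python
import time
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class SensorRecord:
    sensor_id: str
    access_permitted: bool              # per-sensor privacy information
    read: Callable[[], bytes]           # yields the sensor's detection data
    expires_at: Optional[float] = None  # expiration point set in the data

def acquire_service_data(
    sensors: List[SensorRecord], now: Optional[float] = None
) -> Dict[str, bytes]:
    """Acquire detection data only from sensors whose access is determined
    to be permitted, discarding data whose expiration point has passed."""
    now = time.time() if now is None else now
    return {
        s.sensor_id: s.read()
        for s in sensors
        if s.access_permitted and (s.expires_at is None or s.expires_at > now)
    }
```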
PCT/JP2024/037899 2023-10-25 2024-10-24 Information processing apparatus, communication apparatus, and information processing method Pending WO2025089335A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023-183473 2023-10-25
JP2023183473A JP2025072968A (ja) 2023-10-25 2023-10-25 Information processing apparatus, communication apparatus, and information processing method

Publications (1)

Publication Number Publication Date
WO2025089335A1 true WO2025089335A1 (fr) 2025-05-01

Family

ID=93456092

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/037899 Pending WO2025089335A1 (fr) Information processing apparatus, communication apparatus, and information processing method

Country Status (2)

Country Link
JP (1) JP2025072968A (fr)
WO (1) WO2025089335A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001359169A 2000-06-16 2001-12-26 Fuji Xerox Co Ltd Information providing system
US20130174211A1 * 2011-12-30 2013-07-04 Nokia Corporation Method And Apparatus Providing Privacy Setting And Monitoring User Interface
WO2020178277A1 * 2019-03-04 2020-09-10 Telefonaktiebolaget Lm Ericsson (Publ) Control of the privacy of a user equipment and associated apparatuses
CN115734200A * 2021-09-01 2023-03-03 Huawei Technologies Co., Ltd. Method for sensing a terminal device, and communication apparatus
CN116097689A * 2022-09-26 2023-05-09 Beijing Xiaomi Mobile Software Co., Ltd. Sensing implementation method and apparatus, communication device, and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001359169A 2000-06-16 2001-12-26 Fuji Xerox Co Ltd Information providing system
US20130174211A1 * 2011-12-30 2013-07-04 Nokia Corporation Method And Apparatus Providing Privacy Setting And Monitoring User Interface
WO2020178277A1 * 2019-03-04 2020-09-10 Telefonaktiebolaget Lm Ericsson (Publ) Control of the privacy of a user equipment and associated apparatuses
CN115734200A * 2021-09-01 2023-03-03 Huawei Technologies Co., Ltd. Method for sensing a terminal device, and communication apparatus
EP4387286A1 * 2021-09-01 2024-06-19 Huawei Technologies Co., Ltd. Method for sensing user equipment, and communication apparatus
CN116097689A * 2022-09-26 2023-05-09 Beijing Xiaomi Mobile Software Co., Ltd. Sensing implementation method and apparatus, communication device, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"3rd Generation Partnership Project; Technical Specification Group TSG SA; Service requirements for Integrated Sensing and Communication; Stage 1 (Release 19)", no. V1.0.0, 5 September 2023 (2023-09-05), pages 1 - 19, XP052511991, Retrieved from the Internet <URL:https://ftp.3gpp.org/Specs/archive/22_series/22.137/22137-100.zip> [retrieved on 20230905] *

Also Published As

Publication number Publication date
JP2025072968A (ja) 2025-05-12

Similar Documents

Publication Publication Date Title
US20180090013A1 (en) Unmanned aircraft and operation thereof
US20220108092A1 (en) Range extension and dynamic power control for localization on commercial uhf rfid reader
Dymkova Applicability of 5G subscriber equipment and global navigation satellite systems
KR20190101923A Method for landing an unmanned aerial robot through station recognition in an unmanned aerial system, and apparatus for supporting the same
EP4583586A2 Barrier type detection using time-of-flight and received signal strength indication
WO2024029281A1 Information processing device
EP4514550A1 Detection of a state change of a physical environment using time-of-flight and received signal strength indication
WO2025089335A1 Information processing apparatus, communication apparatus, and information processing method
US11493354B2 (en) Policy based navigation control
WO2025147364A1 Cloud-based multi-vehicle path planner
US20250069255A1 (en) Rapid localization for vision-aided positioning
US12457019B2 (en) Protocol for beam width control
WO2024030260A2 Enhanced navigation mode with location sensing and map layer switching
CN118283782A Registration method and related device
US20250126590A1 Apparatus, methods, for apparatus and computer program products for location function including non-terrestrial access point
US20250253939A1 (en) System and methods for regulatory-aware access to network resources over satellites
WO2023001397A1 Methods and apparatus for determining a UAV route
WO2022018487A1 State estimation for aerial user equipments (UEs) operating in a wireless network
WO2025199729A1 Reduction of false fusion of non-traffic objects and radar stationary objects
US20250131742A1 (en) Synergized 3d object and lane/road detection with association and temporal aggregation using graph neural networks
US20250239061A1 (en) Learnable sensor signatures to incorporate modality-specific information into joint representations for multi-modal fusion
US12457026B2 (en) Beam width control
WO2025145313A1 Multi-processor-core system for high-definition map processing
US20240040401A1 (en) Protocol for beam width and power control
WO2025086088A1 Enhanced tracking for a large or multi-section vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24804622

Country of ref document: EP

Kind code of ref document: A1