WO2025159665A1 - Apparatus and method for sensing and identifying an object - Google Patents
Apparatus and method for sensing and identifying an object
- Publication number
- WO2025159665A1 (PCT/SE2024/050056)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- sensing
- identity
- positioning
- updated
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/38—Services specially adapted for particular environments, situations or purposes for collecting sensor information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/027—Services making use of location information using location based information parameters using movement velocity, acceleration information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/90—Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
Definitions
- the invention relates to an apparatus for sensing and identifying an object, a corresponding method, a corresponding computer program, and a corresponding computer readable storage medium.
- 3GPP Third Generation Partnership Project
- NR New Radio
- 5G fifth generation
- One use case for NR positioning is the emergency call.
- In 3GPP Release 16, a study was conducted to evaluate E911 requirements for positioning accuracy (latitude, longitude, and altitude), network availability, reliability, and latency. It also examined synchronization requirements and the complexity involved in meeting the positioning needs of emergency calls. Specifications followed for a variety of positioning techniques that support regulatory as well as commercial needs.
- UE User Equipment
- the object may be, for example, a person.
- the object carrying the UE is associated with the UE, e.g., being the owner or sole user of the UE.
- with sensing technology, it is possible to identify a sensed object using motion sensors or radar technology.
- An object of the invention is to improve tracking of an unidentified object.
- an apparatus for sensing and identifying an object is provided.
- the apparatus is configured to obtain positioning information and identification information related to a user equipment (UE).
- the identification information comprises a UE identity.
- the apparatus is configured to obtain sensing information related to the object.
- the apparatus is configured to determine based on the positioning information and the sensing information whether the UE is associated with the object.
- the apparatus is configured to determine, if the UE is determined to be associated with the object, a joint view based on the position information, the identification information, and sensing information.
- the apparatus is configured to assign in the joint view an object identity to the object.
- the object identity is based on the UE identity.
- the joint view enables providing sensing and identification of the object associated with the UE.
- the apparatus is configured to obtain further sensing information and/or further positioning information.
- the apparatus is configured to determine an updated joint view based on the obtained further sensing information and/or the obtained further positioning information.
- the apparatus is configured to determine a UE motion pattern based on the positioning information.
- the apparatus is configured to determine an object motion pattern based on the sensing information.
- determining whether the UE is associated with the object is based on the UE motion pattern and/or the object motion pattern.
- according to an embodiment of the first aspect, determining whether the UE is associated with the object comprises determining whether the positioning information overlaps with the sensing information.
- a method is provided.
- the method is performed by an apparatus.
- the method is for sensing and identifying an object.
- the method comprises obtaining positioning information and identification information related to a user equipment, UE.
- the identification information comprises a UE identity.
- the method comprises obtaining sensing information related to the object.
- the method comprises determining based on the positioning information and the sensing information whether the UE is associated with the object.
- the method comprises determining, if the UE is determined to be associated with the object, a joint view based on the position information, the identification information, and sensing information.
- the method comprises assigning in the joint view an object identity to the object, wherein the object identity is based on the UE identity.
- the method comprises obtaining further sensing information and/or further positioning information.
- the method comprises determining an updated joint view based on the obtained further sensing information and/or the obtained further positioning information.
- the method comprises determining a UE motion pattern based on the positioning information.
- the method comprises determining an object motion pattern based on the sensing information.
- determining whether the UE is associated with the object is based on the UE motion pattern and/or the object motion pattern.
- according to an embodiment of the second aspect, determining whether the UE is associated with the object comprises determining whether the positioning information overlaps with the sensing information.
- a computer program comprises instructions which, when executed on an apparatus, cause the apparatus to perform the method according to one or more embodiments of the second aspect.
- a computer readable storage medium comprises a computer program according to the third aspect of the invention.
- At least one or more embodiments advantageously enable sensing/positioning an object using sensing technologies.
- At least one or more embodiments advantageously enable identifying an object when an identification method is not otherwise available.
- At least one or more embodiments advantageously enable sensing an object through sensing technology.
- At least one or more embodiments advantageously enable increased accuracy of the identification of the object, as the identification is based on 3GPP based positioning.
- Identification of the object is based on a communication identifier communicated between the UE and a network node within the communication network.
- Figure 1 shows a communication network
- Figure 2 shows a method for sensing and identifying an object.
- Figure 3 shows an embodiment of a step of the method.
- Figure 4 shows an embodiment of a positioning and identification view.
- Figure 5 shows an embodiment of a step of the method.
- Figure 6 shows an embodiment of a sensing view.
- Figure 7 shows an embodiment of a joint view.
- Figure 8 shows an embodiment of a step of the method.
- Figure 9 shows an embodiment of an updated view.
- Figure 10 shows a block diagram of the apparatus.
- Figure 11 shows a block diagram of the apparatus.
- Figure 12 shows a block diagram illustrating a virtualization environment.
- a communication network 100 is provided.
- the communication network 100 may be a third generation partnership project (3GPP) cellular network, such as a third generation (3G) cellular network, a fourth generation (4G) cellular network, a fifth generation (5G) cellular network, or any future generation cellular network, such as a sixth generation (6G) cellular network.
- the communication network 100 may be an Institute of Electrical and Electronics Engineers (IEEE) communication network.
- the communication network 100 comprises an apparatus 110.
- the apparatus 110 may be a base station, such as a radio base station, a NodeB, an evolved NodeB, or an NR NodeB base station.
- the apparatus 110 implements Joint Communication and Sensing (JCAS).
- JCAS Joint Communication and Sensing
- the communication network 100 comprises a second apparatus 120.
- the second apparatus 120 may be an apparatus implementing JCAS. Alternatively, or additionally, the second apparatus 120 may be a motion sensor.
- the apparatus 110 and the second apparatus 120 communicate through 3GPP connections, wireless fidelity (Wi-Fi) connections, fibre connections, and/or Ethernet connections.
- Wi-Fi wireless fidelity
- the communication network 100 provides communication services to user equipment, one of which is shown as user equipment (UE) 130.
- UE user equipment
- the UE 130 may be a communication device used directly by an end-user (e.g., a person). Such a communication device may be a hand-held telephone, a tablet, or a laptop computer.
- the UE 130 may be a communication device that is equipped with, or is carrying, a device used directly by the end-user.
- a communication device may be a robot equipped with a device used for communication, or a vehicle equipped with a device used for communication.
- the UE 130 may be associated with an object 140.
- the object 140 may be an item that is tangible.
- the object 140 may be, but not limited to, a robot, a person, or a vehicle.
- in FIG. 2, a flowchart depicting an embodiment of a method 200 is provided.
- the method 200 is performed by the apparatus 110.
- the method 200 is for sensing and identifying the object 140.
- the method 200 comprises obtaining 210 positioning information 310 and identification information 320 related to the UE 130.
- the identification information 320 comprises a UE identity 325.
- an embodiment of the step 210 of the method 200 is provided.
- the positioning information 310 may be based on known 3GPP positioning methods.
- the 3GPP positioning methods may be 5G System Location Services (LCS), as defined, for example, in 3GPP TS 23.273 V17.8.0; positioning, as defined, for example, in 3GPP TS 37.355 V17.7.0; and sidelink positioning, as defined, for example, in 3GPP TS 38.355 V18.0.0.
- LCS 5G System Location Services
- obtaining 210 the positioning information 310 may comprise receiving, from a set of network nodes in the communication network 100, the positioning information 310.
- the set of network nodes may be base stations.
- the apparatus 110 may be part of the set of network nodes.
- the set of network nodes calculates the positioning information based on signals transmitted by the UE 130.
- the set of network nodes may perform triangulation of uplink communication sent by the UE 130 to each of the set of network nodes.
- the set of network nodes may perform triangulation as defined, for example, in location services in 3GPP Release 99 Features.
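The publication does not spell out the triangulation computation itself. As a hedged aside, a classical linearized trilateration solve over ranges from three network nodes could look like the following Python sketch; the node coordinates and measured ranges are invented for illustration and are not from the publication.

```python
import numpy as np

def trilaterate(nodes: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Least-squares 2-D position from node coordinates and node-to-UE ranges.

    Subtracting the first range equation from the others removes the quadratic
    terms, leaving a linear system in the unknown UE coordinates.
    """
    x0, y0 = nodes[0]
    a_rows, b_rows = [], []
    for (xi, yi), ri in zip(nodes[1:], ranges[1:]):
        a_rows.append([2 * (xi - x0), 2 * (yi - y0)])
        b_rows.append(ranges[0] ** 2 - ri ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
    estimate, *_ = np.linalg.lstsq(np.array(a_rows), np.array(b_rows), rcond=None)
    return estimate

nodes = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0]])  # base stations (m)
ranges = np.array([212.1, 380.8, 380.8])                    # measured ranges (m)
print(trilaterate(nodes, ranges))                           # ~ [150, 150]
```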
- obtaining 210 the positioning information 310 may comprise receiving positioning information 310 from the UE 130.
- the UE 130 calculates the positioning information 310, such as in location services in 3GPP Release 99 Features.
- the UE 130 may send uplink communication to the apparatus 110.
- the uplink communication may comprise the positioning information 310.
- the positioning information 310 may be based on GPS, map information, inertial navigations, and/or fingerprinting.
- the positioning information 310 may comprise a UE location 315 related to the UE 130.
- the UE location 315 may correspond to the location of the UE 130 at a certain point in time.
- the UE location 315 may correspond to a first geographical point defined by a latitude and a longitude.
- the UE location 315 corresponds to a first geographical area.
- the first geographical area may be defined by coordinates (e.g., x coordinates, y coordinates, z coordinates).
- the first geographical area may be, but not limited to, a room, a building, a floor in the building.
- the positioning information 310 may comprise speed information related to the UE 130.
- the speed information may be an average speed at which the UE 130 is moving, or the current speed of the UE 130.
- the positioning information 310 comprises direction information related to the UE 130.
- the direction information may be the direction in which the UE 130 is moving.
- the speed information and/or the direction information may be obtained by performing a Doppler analysis of communication signals transmitted by the UE 130. Alternatively, or additionally, the speed information and/or the direction information may be obtained by estimating the travelling speed of the UE 130, as specified in WO 2024/005682; a numeric sketch of the Doppler relation follows below.
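As a hedged numeric aside, the classical Doppler relation v = f_d · c / f_c (radial speed only) gives a feel for the magnitudes such an analysis works with; the carrier frequency and measured shift below are assumptions chosen for illustration.

```python
# Radial speed from a measured Doppler shift: v = f_d * c / f_c.
C = 299_792_458.0      # speed of light (m/s)
carrier_hz = 3.5e9     # assumed NR carrier frequency (Hz)
doppler_hz = 23.3      # assumed measured Doppler shift (Hz)

radial_speed = doppler_hz * C / carrier_hz
print(f"{radial_speed:.2f} m/s")  # ~2 m/s, i.e. walking pace
```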
- the method 200 comprises determining 215 a UE motion pattern based on the obtained positioning information 310.
- the UE motion pattern may be determined based on the speed information related to the UE 130 and/or direction information related to the UE 130.
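The publication leaves the representation of a motion pattern open. One illustrative sketch, assuming the pattern is simply a sequence of (speed, heading) samples derived from timestamped positions, is given below; the same helper applies unchanged to sensed object positions in step 225.

```python
import math

def motion_pattern(track):
    """(t seconds, x metres, y metres) samples -> list of (speed m/s, heading rad)."""
    pattern = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
        heading = math.atan2(y1 - y0, x1 - x0)  # direction of travel
        pattern.append((speed, heading))
    return pattern

ue_track = [(0.0, 0.0, 0.0), (1.0, 1.2, 0.4), (2.0, 2.4, 0.9)]
print(motion_pattern(ue_track))  # roughly constant speed, slight left turn
```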
- obtaining 210 the identification information 320 may comprise obtaining the identification information 320 through communication between the apparatus 110 and the UE 130. Alternatively, or additionally, obtaining 210 comprises receiving the identification information 320 from the UE 130. In this embodiment, the UE 130 sends the identification information 320 to the apparatus 110. In particular, the UE 130 sends uplink communication to the apparatus 110, and the uplink communication comprises the identification information 320.
- the identification information 320 may be comprised in an RRC Connection Request.
- the UE identity 325 may be an identification tag or an identification number.
- the UE identity 325 may be a 3GPP identifier, such as the identifiers defined, for example, in 3GPP TS 23.501 V18.4.0.
- the 3GPP identifier may be an International Mobile Subscriber Identity (IMSI), a Temporary Mobile Subscriber Identity (TMSI), a Globally Unique Temporary Identity (GUTI), an International Mobile Equipment Identity (IMEI), a Subscription Permanent Identifier (SUPI), or a Subscription Concealed Identifier (SUCI).
- IMSI International Mobile Subscriber Identity
- TMSI Temporary Mobile Subscriber Identity
- GUTI Globally Unique Temporary Identity
- IMEI International Mobile Equipment Identity
- SUPI Subscription Permanent Identifier
- SUCI Subscription Concealed Identifier
- the UE identity 325 may be an IEEE 802 MAC address.
- the UE identity 325 may be a phone number or a name of a person owning the UE 130.
- the step 210 of the method 200 allows the apparatus 110 to link the identified UE 130 with a position of the UE 130 based on the obtained positioning information 310 and identification information 320.
- the method 200 may comprise determining a positioning and identification view 400.
- the positioning and identification view 400 is based on the positioning information 310 obtained in the step 210 of the method 200, as described herein, and the identification information 320 obtained in the step 210 of the method 200, as described herein.
- an embodiment of the positioning and identification view 400 is provided.
- the first geographical area is illustrated by a cross “X” in the positioning and identification view 400.
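To keep the later steps concrete, the sketches below assume a minimal data model for fixes and views; all field names and identity values are illustrative assumptions, not taken from the publication.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Fix:
    """One position sample; identity is set for positioning fixes (the UE
    identity 325) and left as None for anonymous sensing detections."""
    x: float
    y: float
    timestamp: float
    identity: Optional[str] = None

# Positioning and identification view 400: latest identified fix per UE,
# keyed by a hypothetical UE identity.
view_400 = {"ue-identity-1": Fix(x=150.0, y=150.0, timestamp=0.0, identity="ue-identity-1")}
```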
- the method 200 comprises obtaining 220 sensing information 510 related to the object 140.
- an embodiment of the step 220 of the method 200 is provided.
- obtaining 220 the sensing information 510 may comprise receiving the sensing information 510 from a Joint Communication and Sensing (JCAS) unit in a network node.
- JCAS Joint Communication and Sensing
- the network node may be the apparatus 110.
- JCAS corresponds to spatial sensing of the object 140 within proximity of the network node.
- alternatively, the network node is a network node other than the apparatus 110, which performs JCAS of the object 140 and sends the sensing information 510 to the apparatus 110.
- alternatively, or additionally, obtaining 220 the sensing information 510 comprises receiving the sensing information 510 from the second apparatus 120.
- the second apparatus 120 may be configured to perform the function of a radar, of a motion sensor, and/or of a satellite.
- the sensing information 510 may be based on JCAS, on radar, on motion sensors, and/or on satellite. In particular, no communication between the object 140 and the communication network 100 is required for the object 140 to be sensed.
- the sensing information 510 may comprise an object location 515 related to the object 140.
- the object location 515 may correspond to the location of the object 140 at a certain point in time.
- the object location 515 may correspond to a second geographical point defined by a latitude and a longitude.
- the object location 515 corresponds to a second geographical area.
- the second geographical area may be defined by coordinates (e.g., x coordinates, y coordinates, z coordinates).
- the second geographical area may be, but not limited to, a room, a building, a floor in the building.
- the sensing information 510 may comprise speed information related to the object 140.
- the speed information may be an average speed at which the object 140 is moving, or the current speed of the object 140.
- the sensing information 510 comprises direction information related to the object 140.
- the direction information may be the direction in which the object 140 is moving.
- the speed information and/or the direction information related to the object 140 may be obtained by estimating the travelling speed of the object 140, as specified in WO 2024/005683.
- the method 200 comprises determining 225 an object motion pattern based on the sensing information 510.
- the object motion pattern may be determined based on the speed information related to the object 140 and/or direction information related to the object 140.
- the sensing information 510 need not comprise an identification of the object 140.
- the step 220 of the method 200 allows the apparatus 110 to obtain knowledge of the object 140, together with a position of the object 140, based on the obtained sensing information 510.
- the method 200 may comprise determining a sensing view 600.
- the sensing view 600 is based on the sensing information 510 obtained in the step 220 of the method 200, as described herein.
- an embodiment of the sensing view 600 is provided.
- the second geographical area is illustrated by a circle “O” in the sensing view 600.
- the method 200 comprises determining 230 based on the obtained positioning information 310 and the obtained sensing information 510 whether the UE 130 is associated with the object 140.
- the association of the UE 130 with the object 140 may correspond to a relationship of ownership. For example, in the event the UE 130 is a mobile phone and the object 140 is a person, the association of the UE 130 with the object 140 corresponds to the UE 130 being owned by the object 140. Alternatively, in the same example, the association of the UE 130 with the object 140 may correspond to the UE 130 being carried by the object 140.
- the step of determining 230 whether the UE is associated with the object 140 may comprise determining whether the positioning information 310 overlaps with the sensing information 510.
- determining whether the positioning information 310 overlaps with the sensing information 510 comprises determining whether a distance between the UE location 315 and the object location 515 is below or equal to a threshold.
- the distance between the UE location 315 and the object location 515 may be a geographical distance.
- the threshold may be expressed as a distance value, for example, but not limited to, centimetres, decimetres, or metres.
- determining whether the positioning information 310 overlaps with the sensing information 510 comprises determining whether the distance between the UE location 315 and the object location 515 is below or equal to the threshold during a time period.
- determining whether the positioning information 310 overlaps with the sensing information 510 comprises determining whether a part of the first geographical area overlaps with a part of the second geographical area.
- determining whether the positioning information 310 overlaps with the sensing information 510 comprises determining whether the UE location 315 and the object location 515 are in the same room or on the same floor.
- determining whether the UE 130 is associated with the object 140 is based on the UE motion pattern determined in the step 215 of the method 200, as described herein, and/or on the object motion pattern determined in the step 225 of the method 200, as described herein; a sketch of such an association test follows below.
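A hedged sketch of the association test of step 230, combining the criteria above (a distance threshold held over a time period, applied to time-aligned UE and object samples); the threshold and hold time are assumptions for illustration.

```python
import math

DIST_THRESHOLD_M = 2.0   # assumed overlap threshold
HOLD_TIME_S = 10.0       # assumed time period the overlap must persist

def associated(ue_samples, obj_samples) -> bool:
    """ue_samples / obj_samples: time-aligned lists of (t, x, y)."""
    close_since = None
    for (t, ux, uy), (_, ox, oy) in zip(ue_samples, obj_samples):
        if math.hypot(ux - ox, uy - oy) <= DIST_THRESHOLD_M:
            if close_since is None:
                close_since = t
            if t - close_since >= HOLD_TIME_S:
                return True     # overlap persisted: UE and object associated
        else:
            close_since = None  # they separated; restart the clock
    return False

ue = [(t, 0.3 * t, 0.0) for t in range(15)]
obj = [(t, 0.3 * t + 0.5, 0.1) for t in range(15)]
print(associated(ue, obj))  # True: co-moving within 2 m for over 10 s
```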
- the method 200 comprises determining 240, if the UE 130 is determined to be associated with the object 140, a joint view 700 based on the positioning information 310, the identification information 320, and the sensing information 510.
- a joint view 700 is provided.
- the cross “X” from the positioning and identification view 400 is present, as well as the circle “O” from the sensing view 600.
- the method 200 comprises assigning 250 in the joint view 700 an object identity 710 to the object 140.
- the object identity 710 is based on the UE identity 325.
- an embodiment of the object identity 710 is provided.
- the object identity 710 may be the same as the UE identity 325.
- the joint view 700 enables providing an identity of the sensed object 140; a sketch of steps 240 and 250 follows below.
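Under the same illustrative data model introduced earlier, steps 240 and 250 (merging the two views and copying the UE identity 325 onto the sensed object as the object identity 710) might be sketched as follows; the structure of the joint view is an assumption.

```python
def build_joint_view(ue_id, ue_fix, obj_fix):
    """Merge a positioning/identification fix and a sensing fix into a joint
    view 700, assigning the UE identity to the sensed object."""
    return {
        "ue": {"identity": ue_id, "fix": ue_fix},
        # object identity 710 := UE identity 325
        "object": {"identity": ue_id, "fix": obj_fix},
    }
```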
- the method 200 comprises obtaining 260 further sensing information 810 and/or further positioning information 820.
- an embodiment of the step 260 is provided.
- obtaining 260 the further sensing information 810 may comprise receiving the further sensing information 810 from a Joint Communication and Sensing (JCAS) unit in a network node.
- JCAS Joint Communication and Sensing
- the network node may be the apparatus 110.
- JCAS corresponds to spatial sensing of the object 140 within proximity of the network node.
- alternatively, the network node is a network node other than the apparatus 110, which performs JCAS of the object 140 and sends the further sensing information 810 to the apparatus 110.
- alternatively, or additionally, obtaining 260 the further sensing information 810 comprises receiving the further sensing information 810 from the second apparatus 120.
- the second apparatus 120 may be configured to perform the function of a radar, of a motion sensor, and/or of a satellite.
- the further sensing information 810 may be based on JCAS, on radar, on motion sensors, and/or on satellite. In particular, no communication between the object 140 and the communication network 100 is required for the object 140 to be sensed.
- the further sensing information 810 may comprise an updated object location 815 related to the object 140.
- the updated object location 815 may correspond to the updated location of the object 140 at a later point in time compared to the location of the object 140 as described in step 220 of the method 200.
- the updated object location 815 may correspond to an updated second geographical point defined by a latitude and a longitude.
- the updated object location 815 corresponds to an updated second geographical area.
- the updated second geographical area may be defined by coordinates (e.g., x coordinates, y coordinates, z coordinates).
- the updated second geographical area may be, but not limited to, a room, a building, a floor in the building.
- the further sensing information 810 may comprise updated speed information related to the object 140.
- the updated speed information may be an updated average speed at which the object 140 is moving, or the updated current speed of the object 140.
- the further sensing information 810 comprises updated direction information related to the object 140.
- the updated direction information may be the updated direction in which the object 140 is moving.
- the updated speed information and/or the updated direction information related to the object 140 may be obtained by estimating the updated travelling speed of the object 140, as specified in WO 2024/005683.
- the method 200 comprises determining an updated object motion pattern based on the further sensing information 810.
- the updated object motion pattern may be determined based on the updated speed information related to the object 140 and/or the updated direction information related to the object 140.
- the further sensing information 810 need not comprise an identification of the object 140.
- the further positioning information 820 may be based on known 3GPP positioning methods.
- the 3GPP positioning methods may be 5G System Location Services (LCS), as defined, for example, in 3GPP TS 23.273 V17.8.0; positioning, as defined, for example, in 3GPP TS 37.355 V17.7.0; and sidelink positioning, as defined, for example, in 3GPP TS 38.355 V18.0.0.
- LCS 5G System Location Services
- obtaining 260 the further positioning information 820 may comprise receiving, from the set of network nodes in the communication network 100, the further positioning information 820.
- the set of network nodes may be base stations.
- the apparatus 110 may be part of the set of network nodes.
- the set of network nodes calculates the further positioning information 820 based on signals transmitted by the UE 130.
- the set of network nodes may perform triangulation of uplink communication sent by the UE 130 to each of the set of network nodes.
- the set of network nodes may perform triangulation as defined, for example, in location services in 3GPP Release 99 Features.
- obtaining 260 the further positioning information 820 may comprise receiving the further positioning information 820 from the UE 130.
- the UE 130 calculates the further positioning information 820, such as in location services in 3GPP Release 99 Features.
- the UE 130 sends uplink communication to the apparatus 110.
- the uplink communication comprises the further positioning information 820.
- the further positioning information 820 may be based on GPS, map information, inertial navigations, and/or fingerprinting.
- the further positioning information 820 may comprise an updated UE location 825 related to the UE 130.
- the updated UE location 825 may correspond to the updated location of the UE 130 at a later point in time compared to the UE location 315 of the UE 130 as described in step 210 of the method 200.
- the updated UE location 825 may correspond to an updated first geographical point defined by a latitude and a longitude.
- the updated UE location 825 corresponds to an updated first geographical area.
- the updated first geographical area may be defined by coordinates (e.g., x coordinates, y coordinates, z coordinates).
- the updated first geographical area may be, but not limited to, a room, a building, a floor in the building.
- the further positioning information 820 may comprise updated speed information related to the UE 130.
- the updated speed information may be an updated average speed at which the UE 130 is moving, or the updated current speed of the UE 130.
- the further positioning information 820 comprises updated direction information related to the UE 130.
- the updated direction information may be the updated direction in which the UE 130 is moving.
- the updated speed information and/or the updated direction information may be obtained by performing a Doppler analysis of communication signals transmitted by the UE 130.
- the updated speed information and/or the updated direction information may be obtained by estimating the travelling speed of the UE 130, as specified in WO 2024/005682.
- the method 200 comprises determining an updated UE motion pattern based on the obtained further positioning information 820.
- the updated UE motion pattern may be determined based on the updated speed information related to the UE 130 and/or the updated direction information related to the UE 130.
- the method 200 comprises determining 270 an updated joint view 900 based on the obtained further sensing information 810 and/or the obtained further positioning information 820.
- the object identity 710 does not change, only the sensing and positioning information are updated over time.
- an embodiment of the updated joint view 900 is provided.
- the sensing view 600 and the positioning and identification view 400 no longer overlap. It can be determined that the sensed object 140 has parted from or left the UE 130. However, since the sensed object 140 was assigned the object identity 710 in the step 250 of the method 200, the further sensing information 810 enables establishing the identity of the sensed object 140.
- the object 140 has been sensed at the updated object location 815, and the UE 130 is positioned at the updated UE location 825.
- the object identity 710 is kept, and thus it is known that the object 140 is associated with the UE 130.
- the determination in step 270 of the updated joint view 900 is based on one or more of: the updated speed information related to the UE 130, the updated direction information related to the UE 130, the updated speed information related to the object 140, or the updated direction information related to the object 140.
- the object 140 now identified is further sensed while maintaining the object identity.
- the further sensing of the object 140 may be performed in parallel to the further positioning of the UE 130.
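A sketch of the update of step 270 under the same assumed structures: new fixes replace the old ones, while the identities assigned in step 250 are deliberately left untouched, so the object stays identified even after it parts from the UE.

```python
def update_joint_view(joint_view, new_obj_fix=None, new_ue_fix=None):
    """Apply further sensing information 810 and/or further positioning
    information 820 to a joint view; identities are never modified here."""
    if new_obj_fix is not None:
        joint_view["object"]["fix"] = new_obj_fix  # updated object location 815
    if new_ue_fix is not None:
        joint_view["ue"]["fix"] = new_ue_fix       # updated UE location 825
    return joint_view
```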
- the object 140 is sensed as a unique object. Alternatively, or additionally, the object 140 may first be sensed as a first object and then split into a plurality of objects, for example when the object 140 sensed is a person carrying a smartphone (e.g., the UE 130) within a car.
- a smartphone e.g., UE 130
- the person and the car are sensed as the same object 140 at time t1. However, when the person leaves the car, the person and the car are sensed as two objects.
- the plurality of objects is associated with the UE 130.
- the car and the person may share the same identifier (e.g., the object identity 710), as they are both associated with the smartphone; a sketch of this split follows below.
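The split case might be sketched as follows, still under the assumed structures: every track emerging from the split inherits the parent's object identity 710, since each was associated with the same smartphone.

```python
def split_object(joint_view, new_fixes):
    """One sensed object (e.g. person + car) splits into several tracks,
    all of which keep the identity of the original joint view."""
    parent_id = joint_view["object"]["identity"]
    return [{"identity": parent_id, "fix": fix} for fix in new_fixes]
```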
- the step 210 and/or the step 215 of the method 200 may be performed before, after, or in parallel to the step 220 and/or the step 225 of the method 200.
- the step 230, the step 240, and/or the step 250 of the method 200 may be performed in parallel with, simultaneously with, or after the step 210, the step 215, the step 220, and/or the step 225 of the method 200.
- the step 260 of the method 200 may be performed before, or simultaneously to the step 270 of the method 200.
- the apparatus 110 comprises a first obtaining unit 1010, a second obtaining unit 1020, a first determining unit 1030, a second determining unit 1040, an assigning unit 1050.
- the first obtaining unit 1010 is configured to cause the apparatus 110 to perform the step 210 of the method 200, as described herein.
- the second obtaining unit 1020 is configured to cause the apparatus 110 to perform the step 220 of the method 200, as described herein.
- the first determining unit 1030 is configured to cause the apparatus 110 to perform the step 230 of the method 200, as described herein.
- the second determining unit 1040 is configured to cause the apparatus 110 to perform the step 240 of the method 200, as described herein.
- the assigning unit 1050 is configured to cause the apparatus 110 to perform the step 250 of the method 200, as described herein.
- the apparatus 110 comprises a third determining unit 1015.
- the third determining unit 1015 is configured to cause the apparatus 110 to perform the step 215 of the method 200, as described herein.
- the apparatus 110 comprises a fourth determining unit 1025.
- the fourth determining unit 1025 is configured to cause the apparatus 110 to perform the step 225 of the method 200, as described herein.
- the apparatus 110 comprises a third obtaining unit 1060.
- the third obtaining unit 1060 is configured to cause the apparatus 110 to perform the step 260 of the method 200, as described herein.
- the apparatus 110 comprises a fifth determining unit 1070.
- the fifth determining unit 1070 is configured to cause the apparatus 110 to perform the step 270 of the method 200, as described herein.
- the first obtaining unit 1010, the second obtaining unit 1020, and the third obtaining unit 1060 are a same obtaining unit.
- the first determining unit 1030, the second determining unit 1040, the third determining unit 1015, the fourth determining unit 1025, and the fifth determining unit 1070 are a single determining unit.
- the first obtaining unit 1010, the third determining unit 1015, the second obtaining unit 1020, the fourth determining unit 1025, the first determining unit 1030, the second determining unit 1040, the assigning unit 1050, the third obtaining unit 1060, and the fifth determining unit 1070, illustrated in Figure 10, may be implemented as a hardware solution or as a combination of software and hardware, e.g., by one or more of: a processor or a micro-processor and adequate software and memory for storing of the software, a Programmable Logic Device (PLD) or other electronic component(s) or processing circuitry configured to perform the actions described above with regards to the method 200.
- PLD Programmable Logic Device
- the apparatus 110 comprises a processor 1110, and a computer readable storage medium 1120 in the form of a memory 1125.
- the memory 1125 contains a computer program 1130 comprising instructions executable by the processor 1110 whereby the apparatus 110 is operative to perform the steps of the method 200.
- the (non-transitory) computer readable storage media mentioned above may be an Electrically Erasable Programmable Read-Only Memory (EEPROM), a flash memory, a Field Programmable Gate Array, or a hard drive.
- EEPROM Electrically Erasable Programmable Read-Only Memory
- the processor 1110 of Figure 11 may be a single Central Processing Unit (CPU), but could also comprise two or more processing units.
- the processor 1110 of Figure 11 may include general purpose microprocessors, instruction set processors and/or related chip sets, and/or special purpose microprocessors such as Application-Specific Integrated Circuits (ASICs).
- ASICs Application-Specific Integrated Circuits
- the processor 1110 of Figure 11 may also comprise board memory for caching purposes.
- the computer program 1130 of Figure 11 may be carried by a computer program product connected to the processor 1110 of Figure 11.
- the computer program products may be or comprise a non-transitory computer readable storage medium on which computer program 1130 of Figure 11 is stored.
- the computer program products may be a flash memory, a Random-Access Memory (RAM), a Read-Only Memory (ROM), or an EEPROM, and the computer programs described above could in alternative embodiments be distributed on different computer program products in the form of memories.
- FIG 12 is a block diagram illustrating a virtualization environment QQ500 in which method steps implemented by some embodiments may be virtualized.
- virtualizing means creating virtual versions of apparatus 110 which may include virtualizing hardware platforms, storage devices and networking resources.
- virtualization can be applied to apparatus 110 described herein, or components thereof, and relates to an implementation in which at least a portion of the functionality is implemented as one or more virtual components.
- Some or all of the method steps described herein may be implemented as virtual components executed by one or more virtual machines (VMs) implemented in one or more virtual environments QQ500 hosted by one or more of hardware nodes, such as a hardware computing device that operates as a network node, UE, core network node, or host.
- VMs virtual machines
- the virtualization environment QQ500 includes components defined by the O-RAN Alliance, such as an O-Cloud environment orchestrated by a Service Management and Orchestration Framework via an O2 interface.
- Applications QQ502 (which may alternatively be called software instances, virtual appliances, network functions, virtual nodes, virtual network functions, etc.) are run in the virtualization environment QQ500 to implement some of the method steps, features, functions, and/or benefits of some of the embodiments disclosed herein.
- Hardware QQ504 includes processing circuitry, memory that stores software and/or instructions executable by hardware processing circuitry, and/or other hardware devices as described herein, such as a network interface, input/output interface, and so forth.
- Software may be executed by the processing circuitry to instantiate one or more virtualization layers QQ506 (also referred to as hypervisors or virtual machine monitors (VMMs)), provide VMs QQ508a and QQ508b (one or more of which may be generally referred to as VMs QQ508), and/or perform any of the method steps, functions, features and/or benefits described in relation with some embodiments described herein.
- the virtualization layer QQ506 may present a virtual operating platform that appears like networking hardware to the VMs QQ508.
- the VMs QQ508 comprise virtual processing, virtual memory, virtual networking or interface and virtual storage, and may be run by a corresponding virtualization layer QQ506.
- Different embodiments of the instance of a virtual appliance QQ502 may be implemented on one or more of VMs QQ508, and the implementations may be made in different ways.
- Virtualization of the hardware is in some contexts referred to as network function virtualization (NFV). NFV may be used to consolidate many network equipment types onto industry standard high volume server hardware, physical switches, and physical storage, which can be located in data centers, and customer premise equipment.
- NFV network function virtualization
- a VM QQ508 may be a software implementation of a physical machine that runs programs as if they were executing on a physical, non-virtualized machine.
- Each of the VMs QQ508, and the part of the hardware QQ504 that executes that VM, be it hardware dedicated to that VM and/or hardware shared by that VM with others of the VMs, forms a separate virtual network element.
- a virtual network function is responsible for handling specific network functions that run in one or more VMs QQ508 on top of the hardware QQ504 and corresponds to the application QQ502.
- Hardware QQ504 may be implemented in a standalone network node with generic or specific components. Hardware QQ504 may implement some functions via virtualization. Alternatively, hardware QQ504 may be part of a larger cluster of hardware (e.g. such as in a data center or CPE) where many hardware nodes work together and are managed via management and orchestration QQ510, which, among others, oversees lifecycle management of applications QQ502. In some embodiments, hardware QQ504 is coupled to one or more radio units that each include one or more transmitters and one or more receivers that may be coupled to one or more antennas.
- Radio units may communicate directly with other hardware nodes via one or more appropriate network interfaces and may be used in combination with the virtual components to provide a virtual node with radio capabilities, such as a radio access node or a base station.
- some signaling can be provided with the use of a control system QQ512 which may alternatively be used for communication between hardware nodes and radio units.
- the term “and/or” includes any and all combinations of one or more of the associated listed terms.
- the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments.
- the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
- the disclosure has been described above in reference to embodiments thereof. It should be understood that various modifications, alternatives, and additions can be made by those skilled in the art without departing from the scope of the disclosure. Therefore, the scope of the disclosure is not limited to the above particular embodiments but only defined by the claims as attached.
Abstract
An apparatus (110) for sensing and identifying an object, a corresponding method, and a corresponding computer program are provided. The apparatus (110) is configured to: obtain positioning information (310) and identification information (320) related to a user equipment, UE (130), the identification information (320) comprising a UE identity (325); obtain sensing information (510) related to the object (140); determine, based on the positioning information (310) and the sensing information (510), whether the UE (130) is associated with the object (140); determine, if the UE (130) is determined to be associated with the object (140), a joint view (700) based on the positioning information (310), the identification information (320), and the sensing information (510); and assign, in the joint view (700), an object identity (710) to the object (140), the object identity (710) being based on the UE identity (325).
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/SE2024/050056 (WO2025159665A1) | 2024-01-23 | 2024-01-23 | Apparatus and method for sensing and identifying an object |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/SE2024/050056 (WO2025159665A1) | 2024-01-23 | 2024-01-23 | Apparatus and method for sensing and identifying an object |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025159665A1 | 2025-07-31 |
Family
ID=96545379
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/SE2024/050056 (WO2025159665A1, pending) | Apparatus and method for sensing and identifying an object | 2024-01-23 | 2024-01-23 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025159665A1 |
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140206389A1 * | 2013-01-23 | 2014-07-24 | Qualcomm Incorporated | Visual identifier of third party location |
| WO2015069320A2 * | 2013-05-31 | 2015-05-14 | Andrew Llc | System and method for mobile identification and tracking in location systems |
| US20150023562A1 * | 2013-07-18 | 2015-01-22 | Golba Llc | Hybrid multi-camera based positioning |
| WO2015051814A1 * | 2013-10-07 | 2015-04-16 | Nokia Solutions And Networks Gmbh & Co. Kg | Information for object determination |
| US20190260455A1 * | 2018-02-21 | 2019-08-22 | Qualcomm Incorporated | Using image processing to assist with beamforming |
| US20220191819A1 * | 2020-12-10 | 2022-06-16 | Nokia Technologies Oy | Associating sensing information with a user |
| EP4017034A1 * | 2020-12-21 | 2022-06-22 | Deutsche Telekom AG | 5G positioning SLAM tags |
| WO2024005682A1 * | 2022-07-01 | 2024-01-04 | Telefonaktiebolaget Lm Ericsson (Publ) | Method and node for estimating the travelling speed of a wireless device |
Non-Patent Citations (5)
| Title |
|---|
| 3GPP TS 37.355 V17.7.0, "LTE Positioning Protocol (LPP) (Release 17)", 3rd Generation Partnership Project, 15 January 2024, pages 1-353, XP052576809 * |
| 3GPP TS 38.355 V18.0.0, "NR; Sidelink Positioning Protocol (SLPP); Protocol specification (Release 18)", 3rd Generation Partnership Project, 16 January 2024, pages 1-76, XP052576825 * |
| 3GPP TS 23.273 V17.8.0, "5G System (5GS) Location Services (LCS); Stage 2 (Release 17)", 3rd Generation Partnership Project, 31 March 2023, pages 1-108, XP052284119 * |
| 3GPP TS 22.261 V17.12.0, "Service requirements for the 5G system; Stage 1 (Release 17)", 3rd Generation Partnership Project, 22 December 2023, pages 1-83, XP052553071 * |
| 3GPP TS 23.501 V18.4.0, "System architecture for the 5G System (5GS); Stage 2 (Release 18)", 3rd Generation Partnership Project, 19 December 2023, pages 1-706, XP052552988 * |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 24920516; Country of ref document: EP; Kind code of ref document: A1 |