
WO2025211471A1 - Apparatus and method for transmitting and receiving signals in a non-terrestrial network - Google Patents

Apparatus and method for transmitting and receiving signals in a non-terrestrial network

Info

Publication number
WO2025211471A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
information
timing advance
specific timing
specific
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/KR2024/004188
Other languages
English (en)
Korean (ko)
Inventor
신원호
이동순
김병길
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Priority to PCT/KR2024/004188 priority Critical patent/WO2025211471A1/fr
Publication of WO2025211471A1 publication Critical patent/WO2025211471A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W48/00Access restriction; Network selection; Access point selection
    • H04W48/08Access restriction or access information delivery, e.g. discovery data delivery
    • H04W48/10Access restriction or access information delivery, e.g. discovery data delivery using broadcasted information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W56/00Synchronisation arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/02Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
    • H04W84/04Large scale networks; Deep hierarchical networks
    • H04W84/06Airborne or Satellite Networks

Definitions

  • the present disclosure relates to a device and method for transmitting and receiving signals in a non-terrestrial network. Specifically, the present disclosure relates to a device and method for estimating terminal-specific timing advance by considering Doppler shift in a non-terrestrial network.
  • Wireless access systems are widely deployed to provide various types of communication services, such as voice and data.
  • wireless access systems are multiple access systems that support communications with multiple users by sharing available system resources (e.g., bandwidth, transmission power).
  • multiple access systems include code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), orthogonal frequency division multiple access (OFDMA), and single-carrier frequency division multiple access (SC-FDMA).
  • the present disclosure provides a device and method for inducing terminal-specific Timing Advance (TA) in a non-terrestrial network.
  • a method performed by a user equipment (UE) in a communication system may include the steps of: receiving first frequency information allocated to a cell to which the UE belongs from a base station (BS) through a first node; receiving information about a cell-specific timing advance at a specific time from the BS through the first node; obtaining second frequency information of a signal transmitted from the BS through the first node based on the signal; and estimating a UE-specific timing advance for the UE based on at least one of the first frequency information, the second frequency information, and the information about the cell-specific timing advance.
  • the information about the cell-specific timing advance may include information about a rate of change of the cell-specific timing advance at the specific time.
  • the terminal-specific timing advance may be estimated based on at least one of the terminal-specific timing advance obtained at the specific time and the rate of change of the terminal-specific timing advance obtained at the specific time.
  • the rate of change of the terminal-specific timing advance can be determined based on the rate of change of the cell-specific timing advance.
  • the terminal-specific timing advance obtained at the specific time can be determined based on the location of the terminal and the location of the first node at the specific time.
  • the terminal-specific timing advance obtained at the specific time may be determined based on the terminal-specific timing advance obtained at a time earlier than the specific time.
  • information about the cell-specific timing advance may be transmitted via a System Information Block (SIB).
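The estimation described above can be sketched numerically. Everything in the helper below is illustrative: the function names, the sign conventions, and the way the Doppler-derived drift is combined with the cell-specific rate of change are assumptions for the sketch, not the patent's formulas.

```python
# Illustrative sketch only: names, sign conventions, and the way the
# Doppler-derived drift is combined with the cell-specific rate of
# change are assumptions, not the patent's formulas.
C = 299_792_458.0  # speed of light, m/s

def receding_velocity(f_alloc_hz, f_obs_hz):
    """Radial velocity of the serving node away from the terminal,
    inferred from the shift between the allocated downlink frequency
    (first frequency information) and the frequency actually observed
    (second frequency information); non-relativistic Doppler."""
    return C * (f_alloc_hz - f_obs_hz) / f_alloc_hz

def ue_specific_ta(ta_t0, ta_drift_t0, f_alloc_hz, f_obs_hz, t0, t):
    """Propagate the terminal-specific TA from the specific time t0 to
    time t. The round-trip service-link delay changes by 2*v/c seconds
    per second, so the Doppler measurement refines the broadcast
    cell-specific rate of change."""
    v = receding_velocity(f_alloc_hz, f_obs_hz)
    drift = ta_drift_t0 + 2.0 * v / C
    return ta_t0 + drift * (t - t0)
```

With no Doppler shift (observed frequency equal to the allocated frequency), the terminal-specific TA simply follows the broadcast cell-specific rate of change.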
  • a terminal operating in a communication system includes a transceiver, at least one processor, and at least one memory operably connectable to the at least one processor and storing instructions that, when executed by the at least one processor, perform operations, wherein the operations may include all steps of a method of operating the terminal according to various embodiments of the present disclosure.
  • a control device for controlling a terminal in a communication system includes at least one processor and at least one memory operably connected to the at least one processor, wherein the at least one memory stores instructions for performing operations based on being executed by the at least one processor, and the operations may include all steps of an operating method of a terminal according to various embodiments of the present disclosure.
  • one or more non-transitory computer-readable media storing one or more instructions, wherein the one or more instructions, when executed by one or more processors, perform operations, and the operations may include all steps of a method of operating a terminal according to various embodiments of the present disclosure.
  • one or more non-transitory computer-readable media storing one or more instructions, wherein the one or more instructions, when executed by one or more processors, perform operations, wherein the operations may include all steps of a method of operating a base station according to various embodiments of the present disclosure.
  • a device and method for transmitting and receiving a signal in a non-terrestrial network can be provided.
  • a terminal-specific TA can be derived by considering the Doppler shift, so that a non-terrestrial network can be configured by considering the required TA limit error.
  • Figure 1 is a diagram illustrating an example of physical channels and general signal transmission used in a 3GPP system.
  • FIG. 2 is a diagram illustrating the system structure of a New Generation Radio Access Network (NG-RAN).
  • Figure 3 is a diagram illustrating the functional division between NG-RAN and 5GC.
  • Figure 4 is a diagram illustrating an example of a 5G usage scenario.
  • Figure 9 is a schematic diagram illustrating an example of a convolutional neural network.
  • Figure 19 is a drawing showing the structure of an optical modulator.
  • FIG. 27 illustrates a communication system (1) applicable to various embodiments of the present disclosure.
  • FIG. 32 illustrates a portable device applicable to various embodiments of the present disclosure.
  • FIG. 33 illustrates a vehicle or autonomous vehicle applicable to various embodiments of the present disclosure.
  • FIG. 34 illustrates a vehicle applicable to various embodiments of the present disclosure.
  • FIG. 35 illustrates an XR device applicable to various embodiments of the present disclosure.
  • FIG. 36 illustrates a robot applicable to various embodiments of the present disclosure.
  • FIG. 37 illustrates an AI device applicable to various embodiments of the present disclosure.
  • "A or B" may mean "only A," "only B," or "both A and B." In other words, in various embodiments of the present disclosure, "A or B" may be interpreted as "A and/or B." For example, in various embodiments of the present disclosure, "A, B or C" may mean "only A," "only B," "only C," or "any combination of A, B and C."
  • a slash (/) or a comma may mean “and/or.”
  • A/B may mean “A and/or B.”
  • A/B may mean “only A,” “only B,” or “both A and B.”
  • A, B, C may mean “A, B, or C.”
  • parentheses used in various embodiments of the present disclosure may mean "for example." Specifically, when indicated as "control information (PDCCH)," "PDCCH" may be proposed as an example of "control information." In other words, "control information" in various embodiments of the present disclosure is not limited to "PDCCH," and "PDCCH" may be proposed as an example of "control information." Furthermore, even when indicated as "control information (i.e., PDCCH)," "PDCCH" may be proposed as an example of "control information."
  • Figure 1 is a diagram illustrating an example of physical channels and general signal transmission used in a 3GPP system.
  • a terminal receives information from a base station via the downlink (DL) and transmits it to the base station via the uplink (UL).
  • the information transmitted and received between the base station and the terminal includes data and various control information, and various physical channels exist depending on the type and purpose of the information being transmitted and received.
  • a terminal that has completed initial cell search can obtain more specific system information by receiving a physical downlink control channel (PDCCH) and a physical downlink shared channel (PDSCH) based on information contained in the PDCCH (S12).
  • the base station transmits a related signal to the terminal through a downlink channel described below, and the terminal receives the related signal from the base station through a downlink channel described below.
  • PDSCH (Physical Downlink Shared Channel)
  • PDSCH carries downlink data (e.g., DL-shared channel transport block, DL-SCH TB) and applies modulation methods such as Quadrature Phase Shift Keying (QPSK), 16 Quadrature Amplitude Modulation (QAM), 64 QAM, and 256 QAM.
  • Codewords are generated by encoding the TBs.
  • PDSCH can carry multiple codewords. Scrambling and modulation mapping are performed for each codeword, and modulation symbols generated from each codeword are mapped to one or more layers (Layer mapping). Each layer is mapped to resources along with a Demodulation Reference Signal (DMRS), generated as an OFDM symbol signal, and transmitted through the corresponding antenna port.
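As a toy illustration of the modulation orders listed above (the helper name and the per-layer scaling are mine, not the specification's):

```python
# Bits per modulation symbol for the PDSCH modulation schemes listed
# above; the helper name and the per-layer scaling are illustrative.
MOD_BITS = {"QPSK": 2, "16QAM": 4, "64QAM": 6, "256QAM": 8}

def pdsch_bits_per_resource_element(modulation, num_layers=1):
    """Raw (pre-coding) bits carried per resource element when the
    same modulation is used on every spatial layer."""
    return MOD_BITS[modulation] * num_layers
```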
  • the PDCCH carries downlink control information (DCI) and employs modulation methods such as QPSK.
  • a PDCCH consists of 1, 2, 4, 8, or 16 Control Channel Elements (CCEs), depending on the Aggregation Level (AL).
  • Each CCE consists of six Resource Element Groups (REGs). Each REG is defined by one OFDM symbol and one (P)RB.
  • the UE obtains DCI transmitted via the PDCCH by performing decoding (also known as blind decoding) on a set of PDCCH candidates.
  • the set of PDCCH candidates decoded by the UE is defined as a PDCCH search space set.
  • the search space set may be a common search space or a UE-specific search space.
  • the UE can obtain DCI by monitoring PDCCH candidates within one or more search space sets established by the MIB or higher layer signaling.
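The CCE/REG structure above fixes the PDCCH footprint at each aggregation level; a small sanity-check sketch (the helper name is mine):

```python
def pdcch_resource_elements(aggregation_level):
    """A PDCCH at aggregation level L spans L CCEs; each CCE is six
    REGs, and each REG is one OFDM symbol x one (P)RB, i.e. 12
    resource elements."""
    if aggregation_level not in (1, 2, 4, 8, 16):
        raise ValueError("NR aggregation levels are 1, 2, 4, 8, 16")
    return aggregation_level * 6 * 12
```

For example, aggregation level 16 occupies 16 × 6 × 12 = 1152 resource elements.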
  • the terminal transmits a related signal to the base station through the uplink channel described below, and the base station receives the related signal from the terminal through the uplink channel described below.
  • PUSCH (Physical Uplink Shared Channel)
  • PUSCH carries uplink data (e.g., UL-shared channel transport block, UL-SCH TB) and/or uplink control information (UCI), and is transmitted based on a CP-OFDM (Cyclic Prefix - Orthogonal Frequency Division Multiplexing) waveform, a DFT-s-OFDM (Discrete Fourier Transform - spread - Orthogonal Frequency Division Multiplexing) waveform, etc.
  • PUSCH transmissions can be dynamically scheduled by UL grants in DCI, or semi-statically scheduled (configured grant) based on higher layer (e.g., RRC) signaling (and/or Layer 1 (L1) signaling (e.g., PDCCH)).
  • PUSCH transmissions can be performed in a codebook-based or non-codebook-based manner.
  • PUCCH carries uplink control information, HARQ-ACK and/or scheduling request (SR), and can be divided into multiple PUCCHs depending on the PUCCH transmission length.
  • New radio access technology (new RAT, NR) is described below.
  • As more and more communication devices demand greater communication capacity, the need for improved mobile broadband communication compared to existing radio access technology (RAT) is emerging in next-generation communication.
  • massive Machine Type Communications (MTC) which connects numerous devices and objects to provide various services anytime, anywhere, is also a key issue to be considered in next-generation communication.
  • communication system design that considers reliability and latency-sensitive services/terminals is being discussed.
  • next-generation radio access technologies that take into account enhanced mobile broadband communication, massive MTC, and Ultra-Reliable and Low Latency Communication (URLLC) are being discussed, and in various embodiments of the present disclosure, such technologies are conveniently referred to as new RAT or NR.
  • FIG. 2 is a diagram illustrating the system structure of a New Generation Radio Access Network (NG-RAN).
  • the NG-RAN may include a gNB and/or an eNB that provides user plane and control plane protocol termination to the UE.
  • FIG. 2 illustrates a case where only a gNB is included.
  • the gNB and eNB are connected to each other via an Xn interface.
  • the gNB and eNB are connected to the 5th generation core network (5G Core Network: 5GC) via the NG interface.
  • the gNB is connected to the access and mobility management function (AMF) via the NG-C interface
  • the gNB is connected to the user plane function (UPF) via the NG-U interface.
  • Figure 3 is a diagram illustrating the functional division between NG-RAN and 5GC.
  • the gNB can provide functions such as inter-cell radio resource management (Inter Cell RRM), radio bearer management (RB control), connection mobility control (Connection Mobility Control), radio admission control (Radio Admission Control), measurement configuration and provision, and dynamic resource allocation.
  • the AMF can provide functions such as NAS security and idle state mobility processing.
  • the UPF can provide functions such as mobility anchoring and PDU processing.
  • the Session Management Function (SMF) can provide functions such as UE IP address allocation and PDU session control.
  • Figure 4 is a diagram illustrating an example of a 5G usage scenario.
  • the 5G usage scenario illustrated in FIG. 4 is merely exemplary, and the technical features of various embodiments of the present disclosure can also be applied to other 5G usage scenarios not illustrated in FIG. 4.
  • the three key requirement areas for 5G include (1) enhanced mobile broadband (eMBB), (2) massive machine type communication (mMTC), and (3) ultra-reliable and low latency communications (URLLC).
  • eMBB focuses on improving data speeds, latency, user density, and overall capacity and coverage of mobile broadband connections. It targets throughputs of around 10 Gbps. eMBB significantly exceeds basic mobile internet access, enabling rich interactive experiences, media and entertainment applications in the cloud, and augmented reality. Data is a key driver of 5G, and for the first time, dedicated voice services may not be available in the 5G era. In 5G, voice is expected to be handled as an application, simply using the data connection provided by the communication system. The increased traffic volume is primarily due to the increasing content size and the growing number of applications that require high data rates. Streaming services (audio and video), interactive video, and mobile internet connectivity will become more prevalent as more devices connect to the internet.
  • Cloud storage and applications are rapidly growing on mobile communication platforms, and this can be applied to both work and entertainment.
  • Cloud storage is a particular use case driving the growth of uplink data rates.
  • 5G is also used for remote work in the cloud, requiring significantly lower end-to-end latency to maintain a superior user experience when tactile interfaces are used.
  • cloud gaming and video streaming are other key factors driving the demand for mobile broadband.
  • Entertainment is essential on smartphones and tablets, regardless of location, including in highly mobile environments like trains, cars, and airplanes.
  • Another use case is augmented reality and information retrieval for entertainment, where augmented reality requires extremely low latency and instantaneous data volumes.
  • mMTC is designed to enable communication between a large number of low-cost, battery-powered devices, supporting applications such as smart metering, logistics, field, and body sensors.
  • mMTC targets a battery life of approximately 10 years and/or a population of approximately 1 million devices per square kilometer.
  • mMTC enables seamless connectivity of embedded sensors across all sectors and is one of the most anticipated 5G use cases.
  • the number of IoT devices is projected to reach 20.4 billion by 2020.
  • Industrial IoT is one area where 5G will play a key role, enabling smart cities, asset tracking, smart utilities, agriculture, and security infrastructure.
  • URLLC is ideal for vehicle communications, industrial control, factory automation, remote surgery, smart grids, and public safety applications by enabling devices and machines to communicate with high reliability, very low latency, and high availability.
  • URLLC targets latency on the order of 1 ms.
  • URLLC encompasses new services that will transform industries through ultra-reliable, low-latency links, such as remote control of critical infrastructure and autonomous vehicles. This level of reliability and latency is essential for smart grid control, industrial automation, robotics, and drone control and coordination.
  • 5G can complement fiber-to-the-home (FTTH) and cable-based broadband (or DOCSIS) by delivering streams rated at hundreds of megabits per second to gigabits per second. These high speeds may be required to deliver TV at resolutions beyond 4K (6K, 8K, and beyond), as well as virtual reality (VR) and augmented reality (AR).
  • VR and AR applications include near-immersive sports events. Certain applications may require specialized network configurations. For example, for VR gaming, a gaming company may need to integrate its core servers with the network operator's edge network servers to minimize latency.
  • Automotive is expected to be a significant new driver for 5G, with numerous use cases for in-vehicle mobile communications. For example, passenger entertainment demands both high capacity and high mobile broadband, as future users will consistently expect high-quality connectivity regardless of their location and speed.
  • Another automotive application is augmented reality dashboards.
  • An AR dashboard allows drivers to identify objects in the dark on top of what they see through the windshield. The AR dashboard overlays information to inform the driver about the distance and movement of objects.
  • wireless modules will enable vehicle-to-vehicle communication, information exchange between vehicles and supporting infrastructure, and information exchange between vehicles and other connected devices (e.g., devices accompanying pedestrians).
  • Safety systems can guide drivers to safer driving behaviors, reducing the risk of accidents.
  • the next step will be remotely controlled or autonomous vehicles, which require highly reliable and fast communication between different autonomous vehicles and/or between vehicles and infrastructure.
  • autonomous vehicles will perform all driving tasks, leaving drivers to focus solely on traffic anomalies that the vehicle itself cannot detect.
  • the technological requirements for autonomous vehicles will require ultra-low latency and ultra-high-speed reliability, increasing traffic safety to levels unattainable by humans.
  • Smart cities and smart homes, often referred to as smart societies, will be embedded with dense wireless sensor networks.
  • a distributed network of intelligent sensors will identify conditions for cost- and energy-efficient maintenance of cities or homes. Similar setups can be implemented for individual homes.
  • Temperature sensors, window and heating controllers, burglar alarms, and appliances will all be wirelessly connected. Many of these sensors typically require low data rates, low power, and low cost. However, for example, real-time HD video may be required from certain types of devices for surveillance purposes.
  • Smart grids interconnect these sensors using digital information and communication technologies to collect and act on information. This information can include the behavior of suppliers and consumers, enabling smart grids to improve efficiency, reliability, economic efficiency, sustainable production, and the automated distribution of fuels like electricity. Smart grids can also be viewed as another low-latency sensor network.
  • Telecommunications systems can support telemedicine, which provides clinical care in remote locations. This can help reduce distance barriers and improve access to health services that are otherwise unavailable in remote rural areas. It can also be used to save lives in critical care and emergency situations.
  • Mobile-based wireless sensor networks can provide remote monitoring and sensors for parameters such as heart rate and blood pressure.
  • Wireless and mobile communications are becoming increasingly important in industrial applications. Wiring is expensive to install and maintain. Therefore, the potential to replace cables with reconfigurable wireless links presents an attractive opportunity for many industries. However, achieving this requires wireless connections to operate with similar latency, reliability, and capacity to cables, while simplifying their management. Low latency and extremely low error rates are new requirements for 5G connectivity.
  • Logistics and freight tracking are important use cases for mobile communications, enabling the tracking of inventory and packages anywhere using location-based information systems. Logistics and freight tracking typically require low data rates but may require wide-range and reliable location information.
  • Next-generation communication (e.g., 6G) is described below.
  • the 6G (wireless communication) system aims to achieve (i) very high data rates per device, (ii) a very large number of connected devices, (iii) global connectivity, (iv) very low latency, (v) low energy consumption for battery-free IoT devices, (vi) ultra-reliable connectivity, and (vii) connected intelligence with machine learning capabilities.
  • the vision of the 6G system can be divided into four aspects: intelligent connectivity, deep connectivity, holographic connectivity, and ubiquitous connectivity, and the 6G system can satisfy the requirements as shown in Table 1 below.
  • Table 1 is a table showing an example of the requirements of a 6G system.
  • 6G systems can have key factors such as enhanced mobile broadband (eMBB), ultra-reliable low latency communications (URLLC), massive machine-type communication (mMTC), AI integrated communication, tactile internet, high throughput, high network capacity, high energy efficiency, low backhaul and access network congestion, and enhanced data security.
  • Figure 5 is a diagram illustrating an example of a communication structure that can be provided in a 6G system.
  • 6G systems are expected to have 50 times the simultaneous wireless connectivity of 5G systems.
  • URLLC, a key feature of 5G, will become even more crucial in 6G communications by providing end-to-end latency of less than 1 ms.
  • 6G systems will have significantly higher volumetric spectral efficiency, compared to the commonly used area spectral efficiency.
  • 6G systems can offer extremely long battery life and advanced battery technologies for energy harvesting, eliminating the need for separate charging for mobile devices in 6G systems.
  • New network characteristics in 6G may include:
  • 6G is expected to integrate with satellites to provide a global mobile network.
  • the integration of terrestrial, satellite, and airborne networks into a single wireless communications system is crucial for 6G.
  • Connected intelligence: unlike previous generations of wireless communication systems, 6G is revolutionary, upgrading the wireless evolution from "connected objects" to "connected intelligence." AI can be applied at every stage of the communication process (or at every signal processing step, as described below).
  • 6G wireless networks will transfer power to charge the batteries of devices such as smartphones and sensors. Therefore, wireless information and energy transfer (WIET) will be integrated.
  • Small cell networks: the concept of small cell networks was introduced to improve received signal quality in cellular systems by increasing throughput, energy efficiency, and spectral efficiency. Consequently, small cell networks are essential for 5G and beyond-5G (5GB) communication systems, and 6G communication systems also adopt the characteristics of small cell networks.
  • High-capacity backhaul: backhaul connections must provide high capacity to support the high-volume traffic of 6G.
  • High-speed fiber optics and free-space optics (FSO) systems may be potential solutions to this problem.
  • High-precision localization (or location-based services) through communications is a key feature of 6G wireless communication systems. Therefore, radar systems will be integrated with 6G networks.
  • Softwarization and virtualization are two critical features that form the foundation of the design process for 5GB networks, ensuring flexibility, reconfigurability, and programmability. Furthermore, billions of devices can share a common physical infrastructure.
  • AI: the most crucial and newly introduced technology for 6G systems is AI. 4G systems did not involve AI. 5G systems will support partial or very limited AI. However, 6G systems will fully support AI for automation. Advances in machine learning will create more intelligent networks for real-time communications in 6G. Incorporating AI into communications can streamline and improve real-time data transmission. AI can use numerous analyses to determine how complex target tasks should be performed. In other words, AI can increase efficiency and reduce processing delays.
  • AI-based physical layer transmission refers to the application of AI-based signal processing and communication mechanisms, rather than traditional communication frameworks, in the fundamental signal processing and communication mechanisms. For example, this may include deep learning-based channel coding and decoding, deep learning-based signal estimation and detection, deep learning-based MIMO mechanisms, and AI-based resource scheduling and allocation.
  • Machine learning can be used for channel estimation and channel tracking, as well as for power allocation and interference cancellation in the physical layer of the downlink (DL). Furthermore, machine learning can be used for antenna selection, power control, and symbol detection in MIMO systems.
  • Deep learning-based AI algorithms require a large amount of training data to optimize training parameters.
  • a large amount of training data is used offline. This means that static training on training data in specific channel environments can lead to conflicts with the dynamic characteristics and diversity of the wireless channel.
  • Machine learning refers to a series of operations that train machines to perform tasks that humans can or cannot perform. Machine learning requires data and a learning model. Data learning methods in machine learning can be broadly categorized into three types: supervised learning, unsupervised learning, and reinforcement learning.
  • Neural network training aims to minimize output errors. It involves repeatedly inputting training data into a neural network, calculating the neural network output and target error for the training data, and backpropagating the neural network error from the output layer to the input layer to update the weights of each node in the neural network to reduce the error.
  • the neural network's calculation of the input data and the backpropagation of the error can constitute a learning cycle (epoch).
  • The learning rate can be applied differently depending on the number of iterations of the neural network's learning cycle. For example, in the early stages of training a neural network, a high learning rate can be used so that the network quickly reaches a certain level of performance, thereby increasing efficiency. In the later stages of training, a low learning rate can be used to increase accuracy.
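The epoch and learning-rate behaviour described above can be sketched with a minimal gradient-descent loop. The one-parameter linear model, the target function, and the step schedule (high learning rate early, low learning rate later) are illustrative assumptions, not details from the disclosure.

```python
# Each epoch below runs a forward pass over the data and a gradient
# ("backpropagation") step; the learning rate is decayed from a high value
# (fast early progress) to a low value (fine accuracy later).

data = [(x, 2.0 * x + 1.0) for x in range(-5, 6)]  # target: y = 2x + 1

w, b = 0.0, 0.0  # trainable parameters

for epoch in range(200):
    # simple step schedule: high LR for the first epochs, lower afterwards
    lr = 0.05 if epoch < 50 else 0.005
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w  # gradient step for this one-layer model
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # parameters should approach 2.0 and 1.0
```

With the high initial rate the slope converges almost immediately; the low late rate then refines the intercept without overshooting.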
  • Learning methods may vary depending on the characteristics of the data. For example, if the goal is to accurately predict data transmitted by a transmitter in a communication system, supervised learning is preferable to unsupervised learning or reinforcement learning.
  • The learning model corresponds to the human brain. The most basic learning model is a linear model; the machine-learning paradigm that uses highly complex neural network structures, such as artificial neural networks, as learning models is called deep learning.
  • Figure 7 is a schematic diagram illustrating an example of a multilayer perceptron structure.
  • Figure 8 is a schematic diagram illustrating an example of a deep neural network.
  • Tight integration of multiple frequencies and heterogeneous communication technologies is crucial in 6G systems. As a result, users will be able to move seamlessly from one network to another without any manual configuration on their devices. The best network will be automatically selected from the available communication technologies. This will break the limitations of the cell concept in wireless communications. Currently, user movement from one cell to another in dense networks causes excessive handovers, leading to handover failures, handover delays, data loss, and the ping-pong effect. 6G cell-free communications will overcome all of these challenges and provide better QoS. Cell-free communications will be achieved through multi-connectivity and multi-tier hybrid technologies, as well as heterogeneous radios on devices.
  • WIET (Wireless Information and Energy Transfer) uses the same fields and waves as wireless communication systems. Specifically, sensors and smartphones will be charged using wireless power transfer during communication. WIET is a promising technology for extending the life of battery-powered wireless systems. Therefore, battery-less devices will be supported by 6G communications.
  • Autonomous wireless networks are capable of continuously sensing dynamically changing environmental conditions and exchanging information between different nodes.
  • sensing will be tightly integrated with communications to support autonomous systems.
  • The frequency bands expected to be used for THz wireless communication may be the D-band (110 GHz to 170 GHz) or the H-band (220 GHz to 325 GHz), which have low propagation loss due to molecular absorption in the air. Standardization discussions on THz wireless communication are centered on the IEEE 802.15 THz working group in addition to 3GPP, and standard documents issued by the IEEE 802.15 Task Groups (TG3d, TG3e) may specify or supplement the contents described in various embodiments of the present disclosure. THz wireless communication can be applied to wireless cognition, sensing, imaging, wireless communication, THz navigation, etc.
  • Figure 14 is a diagram illustrating an example of a THz communication application.
  • THz wireless communication scenarios can be categorized into macro networks, micro networks, and nanoscale networks.
  • THz wireless communication can be applied to vehicle-to-vehicle and backhaul/fronthaul connections.
  • THz wireless communication can be applied to fixed point-to-point or multi-point connections, such as indoor small cells, wireless connections in data centers, and near-field communications, such as kiosk downloads.
  • THz wireless communications can be categorized based on the methods used to generate and receive THz waves.
  • THz generation methods can be categorized as either optical or electronic-based.
  • Methods for generating THz using electronic components include a method using semiconductor components such as a resonant tunneling diode (RTD), a method using a local oscillator and a multiplier, a MMIC (Monolithic Microwave Integrated Circuits) method using an integrated circuit based on a compound semiconductor HEMT (High Electron Mobility Transistor), and a method using a Si-CMOS-based integrated circuit.
  • A multiplier (doubler, tripler, etc.) is essential, since a local oscillator alone cannot reach THz frequencies.
  • FIG. 16 is a diagram illustrating an example of a method for generating a THz signal based on an optical element.
  • Fig. 17 is a diagram illustrating an example of an optical element-based THz wireless communication transceiver.
  • Optical component-based THz wireless communication technology refers to a method of generating and modulating THz signals using optical components.
  • Optical component-based THz signal generation technology generates an ultra-high-speed optical signal using a laser and an optical modulator, and converts it into a THz signal using an ultra-high-speed photodetector. Compared to technologies that use only electronic components, this technology can easily increase the frequency, generate high-power signals, and obtain flat response characteristics over a wide frequency band.
  • optical component-based THz signal generation requires a laser diode, a wideband optical modulator, and an ultra-high-speed photodetector.
  • an optical coupler refers to a semiconductor device that transmits an electrical signal using optical waves to provide electrical isolation and coupling between circuits or systems
  • A UTC-PD (Uni-Travelling Carrier Photo-Detector) is capable of detecting light at 150 GHz or higher.
  • An EDFA is an Erbium-Doped Fiber Amplifier.
  • A PD is a photodetector.
  • An OSA (Optical Sub-Assembly) is an optical module that integrates various optical communication functions (photoelectric conversion, electro-optical conversion, etc.) into a single component.
  • A DSO is a digital storage oscilloscope.
  • Fig. 18 is a diagram illustrating the structure of a photon source-based transmitter.
  • Figure 19 is a drawing showing the structure of an optical modulator.
  • the phase of a signal can be changed by passing the optical source of a laser through an optical wave guide. At this time, data is loaded by changing the electrical characteristics through a microwave contact, etc. Therefore, the optical modulator output is formed as a modulated waveform.
  • An opto-electrical modulator (O/E converter) can generate THz pulses by optical rectification operation by a nonlinear crystal, photoelectric conversion by a photoconductive antenna, emission from a bunch of relativistic electrons, etc. Terahertz pulses generated in the above manner can have a length in units of femtoseconds to picoseconds.
  • An optical/electronic converter (O/E converter) performs down conversion by utilizing the non-linearity of the device.
  • The available bandwidth can be classified based on the oxygen attenuation of 10^2 dB/km in the spectrum up to 1 THz. Accordingly, a framework in which the available bandwidth is composed of multiple band chunks can be considered. As an example of the above framework, if the THz pulse length for one carrier is set to 50 ps, the bandwidth (BW) becomes approximately 20 GHz.
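The band-chunk sizing above can be checked with a back-of-the-envelope calculation. The usable bandwidth of one carrier is taken as roughly the inverse of its THz pulse length (BW ~ 1/T_pulse), a common first-order assumption rather than a value fixed by the disclosure; the chunk count is likewise illustrative.

```python
# One carrier's bandwidth approximated as the inverse of its pulse length.
pulse_length_s = 50e-12              # 50 ps THz pulse for one carrier
bandwidth_hz = 1.0 / pulse_length_s  # ~20 GHz, matching the text

# If roughly 1 THz of spectrum were divided into such chunks:
n_chunks = round(1e12 / bandwidth_hz)

print(round(bandwidth_hz / 1e9, 1), "GHz per chunk,", n_chunks, "chunks")
```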
  • Effective down-conversion from the infrared band (IR band) to the terahertz band (THz band) depends on how to utilize the nonlinearity of the optical/electrical converter (O/E converter).
  • a terahertz transmission and reception system can be implemented using a single optical-to-electrical converter.
  • the number of optical-to-electrical converters may be equal to the number of carriers. This phenomenon will be particularly noticeable in a multi-carrier system that utilizes multiple broadbands according to the aforementioned spectrum usage plan.
  • a frame structure for the multi-carrier system may be considered.
  • a signal down-converted using an optical-to-electrical converter may be transmitted in a specific resource region (e.g., a specific frame).
  • the frequency region of the specific resource region may include multiple chunks. Each chunk may be composed of at least one component carrier (CC).
  • In the present disclosure, "satellite" refers to a low-orbit satellite in a non-terrestrial network. However, the scope of the present invention is not limited by this expression. "Satellite" may also be used to refer to one of the typical nodes in a communication system.
  • Figure 20 is a diagram for explaining timing advance of a non-terrestrial network.
  • T_TA = (N_TA + N_TA,offset) × T_C + T_TA,common + T_TA,UE-specific
  • N_TA represents a closed-loop TA updated by TA commands.
  • N_TA,offset is a fixed offset value, which may be determined according to the frequency band and duplex mode used.
  • T_TA,common may represent the delay between a reference point (RP) and the satellite, as illustrated in FIG. 20; this value may be a common value that the base station notifies to all terminals in the cell.
  • The initial value of N_TA may be determined by the random access response message (Msg2) of the RACH procedure.
  • The value of N_TA may be updated by a TA command when the terminal is in the RRC connected (RRC_CONNECTED) state.
  • T_TA,UE-specific refers to the delay difference between the terminal and the satellite, as shown in FIG. 20, and may be a UE-specific value that the terminal directly estimates (self-estimates) based on the position information of the terminal and the satellite.
  • T_C may represent the basic time unit.
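The TA composition described above can be illustrated numerically. T_C is taken here as the NR basic time unit 1/(480 kHz × 4096); the sample component values (N_TA, N_TA,offset, the common and UE-specific delays) are illustrative assumptions, not values from the disclosure.

```python
# Numeric sketch of
#   T_TA = (N_TA + N_TA,offset) * T_C + T_TA,common + T_TA,UE-specific.
T_C = 1.0 / (480e3 * 4096)  # NR basic time unit, ~0.509 ns (assumed)

def total_ta(n_ta, n_ta_offset, t_common_s, t_ue_specific_s):
    """Total timing advance in seconds, per the formula above."""
    return (n_ta + n_ta_offset) * T_C + t_common_s + t_ue_specific_s

# Closed-loop term plus a broadcast common delay and a self-estimated
# UE-specific delay (all values illustrative).
t_ta = total_ta(n_ta=3000, n_ta_offset=25600,
                t_common_s=2e-3, t_ue_specific_s=4e-3)
print(round(t_ta * 1e3, 3), "ms")
```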
  • Figure 21 is a diagram for explaining changes in service links of a non-terrestrial network.
  • Figure 21 illustrates the orbital pattern of a LEO (Low Earth Orbit) satellite in a non-terrestrial network.
  • a service link may refer to a connection path between a terminal and a low-Earth-orbit satellite.
  • The position vector <P_x, P_y, P_z> can represent the position of the satellite with respect to the beam center.
  • The position vector <P_a, P_b, P_c> can represent the position of the terminal with respect to the beam center.
  • the speeds of the satellite and the terminal can be expressed according to the following mathematical equations 2 and 3, respectively.
  • the Doppler shift applied to the terminal can be expressed according to the following mathematical equation 4.
  • f_0 may mean the initial (emitted) frequency of the satellite signal.
  • c may mean the propagation speed of the radio wave.
  • the relative distance (l) between the terminal and the satellite can be derived as in the following mathematical equation 7.
  • the relative distance can mean the length of the service link.
  • mathematical expression 4 for the Doppler shift can be reorganized into mathematical expression 8 using the relative distance.
  • From Equation 8, we can confirm that the rate of change of the length of the service link is proportional to the Doppler shift. Meanwhile, since the length of the service link and the terminal-specific TA are proportional, the first-order differential value of the terminal-specific TA can be organized as shown in Equation 9 below.
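The proportionality just described can be sketched as follows. The standard narrowband Doppler relation f_d = -(f_0/c)·(dl/dt) is assumed, from which the first derivative of the UE-specific TA follows as dT_UE/dt = (dl/dt)/c = -f_d/f_0; the carrier frequency and link-rate numbers are illustrative.

```python
C = 299_792_458.0  # propagation speed of the radio wave, m/s

def doppler_shift(f0_hz, dl_dt_mps):
    """Doppler shift for emitted frequency f0 and link-length rate dl/dt."""
    return -(f0_hz / C) * dl_dt_mps

# LEO example: 2 GHz carrier, satellite approaching so that the service
# link shortens at 7 km/s.
fd = doppler_shift(2e9, -7000.0)
ta_rate = -fd / 2e9  # first derivative of the UE-specific TA, s per s

print(round(fd / 1e3, 1), "kHz Doppler,", ta_rate, "s/s TA rate")
```

Note the signs: a shortening link gives a positive Doppler shift and a decreasing UE-specific TA, consistent with the proportionality stated above.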
  • Figure 22 is a diagram to explain the effect of Doppler shift on a non-terrestrial network.
  • Doppler shift caused by satellite movement can increase the complexity of terminals performing downlink synchronization during the Initial Access (IA) phase. Therefore, discussions may be needed on how to perform pre-compensation on the satellite to ensure that the assigned frequency remains constant relative to the beam or cell center. Pre-compensating for Doppler shift relative to the beam or cell center can ensure sufficient synchronization performance using only the information contained in the conventional Synchronization Signal Block (SSB). Since the Doppler shift value can continuously change due to satellite movement, the pre-compensation value must be continuously updated to account for satellite mobility.
  • Figure 23 is a diagram for explaining TA changes in a non-terrestrial network.
  • satellite orbital-based TA updates can be a crucial component in maintaining uplink synchronization.
  • terminal-specific TA can be a value independently estimated by the terminal using terminal and satellite position information.
  • the 3GPP standard also defines this estimation method. The following are the satellite data formats or ephemeris data formats transmitted by base stations to inform terminals of satellite positions.
  • Format 1 provides data that can be readily used to calculate the terminal-specific TA, offering low complexity and applicability to any system. However, because Format 1 values change significantly over time, the information must be provided to terminals at short intervals to avoid significant errors.
  • Because Format 2 data do not change significantly over time, providing the information to the terminal at relatively long intervals can still avoid significant errors.
  • However, terminals must periodically convert Format 2 data to Format 1.
  • Terminals using Format 2 must therefore perform additional calculations, which can increase terminal complexity. This increased complexity can lead to increased power consumption.
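The extra computation implied by Format 2 can be illustrated with a minimal two-body sketch. Assuming, as in 3GPP NTN ephemeris signalling, that Format 1 is a position/velocity state vector and Format 2 is a set of orbital elements, converting Format 2 to Format 1 requires iteratively solving Kepler's equation; the frame rotations (perifocal to Earth-fixed) that a real terminal would also need are omitted here for brevity.

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def elements_to_state(a, e, mean_anomaly):
    """Semi-major axis a [m], eccentricity e, mean anomaly M [rad]
    -> (position, velocity) in the perifocal (orbital-plane) frame."""
    # Solve Kepler's equation M = E - e*sin(E) for E by Newton iteration
    E = mean_anomaly
    for _ in range(20):
        E -= (E - e * math.sin(E) - mean_anomaly) / (1.0 - e * math.cos(E))
    # True anomaly and orbital radius
    nu = 2.0 * math.atan2(math.sqrt(1 + e) * math.sin(E / 2),
                          math.sqrt(1 - e) * math.cos(E / 2))
    r = a * (1.0 - e * math.cos(E))
    pos = (r * math.cos(nu), r * math.sin(nu))
    # Velocity in the perifocal frame
    p = a * (1.0 - e * e)
    vx = -math.sqrt(MU / p) * math.sin(nu)
    vy = math.sqrt(MU / p) * (e + math.cos(nu))
    return pos, (vx, vy)

# Example: near-circular LEO at roughly 600 km altitude (illustrative)
pos, vel = elements_to_state(a=6_978_000.0, e=0.001, mean_anomaly=1.0)
speed = math.hypot(*vel)
print(round(speed, 1), "m/s")  # ~7.5 km/s, a typical LEO orbital speed
```

Even this stripped-down conversion involves an iterative solve per update, which is the complexity and power cost the passage attributes to Format 2.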
  • the location information of a terminal is information that the terminal independently acquires through GNSS (Global Navigation Satellite System) measurements, and the validity duration of the location information may vary depending on the speed of the terminal.
  • Terminals that do not support GNSS or terminals for which GNSS measurements are not performed normally cannot estimate terminal-specific TA. Therefore, the terminal-specific TA update method that utilizes the location information of the terminal and satellites has the problem of not being able to obtain a solid estimated value.
  • the performance of the terminal-specific TA update method may be reduced due to errors that occur when the terminal estimates its own location information or errors that occur when the terminal estimates the position of the satellite.
  • The present disclosure proposes a method for estimating the terminal-specific TA based on Doppler-shift estimation when beam-centric Doppler-shift pre-compensation is performed.
  • FIG. 24 is a diagram for explaining a terminal-specific TA derivation method applicable to the present disclosure.
  • the frequency when the terminal receives the signal transmitted from the satellite can be determined as in the following mathematical expression 11.
  • the frequency f_c and the offset allocated to the beam or cell can be determined as in Equation 12.
  • the rate of change of the terminal-specific TA at this time can be determined as in Equation 13.
  • the cell-specific TA and the first derivative of the cell-specific TA can be defined as in the following mathematical expressions 14 and 15, respectively.
  • the approximate value of the terminal-specific TA can be derived as shown in the following mathematical expression 18.
  • The differential order used for the approximation can be determined based on the maximum allowed error of the TA (the CP length). Since the CP length decreases as the SCS (subcarrier spacing) increases, a higher-order approximation must be performed to utilize a large SCS.
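The CP-versus-SCS trade-off above can be made concrete with a small sketch. The CP model (roughly 4.69 us at SCS 15 kHz, halving each time the SCS doubles) follows NR numerology; the sinusoidal TA model (amplitude A, orbital rate OMEGA), the update interval, and the Taylor-remainder error bound A·(OMEGA·DT)^(n+1)/(n+1)! are all illustrative assumptions.

```python
import math

A = 3e-3                      # assumed swing of the UE-specific TA, s
OMEGA = 2 * math.pi / 5760.0  # assumed LEO orbital rate, rad/s
DT = 100.0                    # assumed time since the last TA update, s

def cp_length_s(scs_khz):
    """Approximate normal CP length for SCS = 15 * 2**mu kHz."""
    mu = round(math.log2(scs_khz / 15))
    return 4.69e-6 / (2 ** mu)

def required_order(cp_s, max_order=6):
    """Smallest Taylor order whose remainder bound fits within the CP.
    For a sinusoidal TA the (n+1)-th derivative is bounded by A*OMEGA**(n+1),
    giving the remainder bound A*(OMEGA*DT)**(n+1)/(n+1)!."""
    for n in range(1, max_order + 1):
        remainder = A * (OMEGA * DT) ** (n + 1) / math.factorial(n + 1)
        if remainder < cp_s:
            return n
    return None

for scs in (15, 30, 60, 120):
    print(scs, "kHz: CP", round(cp_length_s(scs) * 1e6, 2), "us -> order",
          required_order(cp_length_s(scs)))
```

Under these assumptions a second-order approximation suffices up to SCS 60 kHz, while SCS 120 kHz, with its shorter CP, forces a third-order approximation, matching the qualitative claim in the text.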
  • Cell-specific TA is a value that accounts for the mobility of the satellite with respect to a specific fixed point, like the common TA, and it can be transmitted by the base station; it is therefore included in SIB19 for NTN as defined in the current standard. When the cell-specific TA and its rate of change are added, the rate of change of the terminal-specific TA can be estimated based on Equations 16 and 17.
  • T_TA,UE-specific(t_epoch) can be determined based on the following two methods.
  • First, it can be determined using the position information of the terminal and the satellite at t_epoch.
  • Second, it can be calculated by utilizing the value from the previous epoch time (t_epoch,p), as in Equation 19 below.
  • Both methods can be utilized; if the terminal cannot normally acquire GNSS information or does not support GNSS, the second method (the method using Equation 19) can be used to derive the terminal-specific TA.
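The second (GNSS-free) method can be sketched as a propagation from the previous epoch, using a rate of change derived from the measured Doppler shift (dT/dt ~ -f_d/f_0 under the narrowband assumption), in the spirit of Equation 19. Only a first-order (linear) update is shown and the sample values are illustrative; higher-order terms would follow the approximation order discussed earlier.

```python
def ue_specific_ta(ta_prev_s, f_doppler_hz, f0_hz, t_prev_s, t_now_s):
    """Propagate the UE-specific TA from the previous epoch t_prev to t_now."""
    ta_rate = -f_doppler_hz / f0_hz  # s per s, from the measured Doppler
    return ta_prev_s + ta_rate * (t_now_s - t_prev_s)

# 2 GHz carrier, +40 kHz measured Doppler (satellite approaching, so the
# service link and hence the TA are shrinking).
ta = ue_specific_ta(ta_prev_s=4.0e-3, f_doppler_hz=40e3, f0_hz=2e9,
                    t_prev_s=0.0, t_now_s=10.0)
print(round(ta * 1e3, 3), "ms")  # slightly below the previous 4.0 ms
```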
  • FIG. 25 is a diagram illustrating an example of a method for a terminal to transmit and receive signals in a system applicable to the present disclosure.
  • information about the cell-specific timing advance may be transmitted via a System Information Block (SIB).
  • the rate of change of the terminal-specific timing advance can be determined based on the rate of change of the cell-specific timing advance.
  • the terminal-specific timing advance obtained at the specific time can be determined based on the location of the terminal and the location of the first node at the specific time.
  • the terminal-specific timing advance obtained at the specific time may be determined based on the terminal-specific timing advance obtained at a time earlier than the specific time.
  • a terminal may be provided in a communication system.
  • the terminal may include a transceiver and at least one processor, wherein the at least one processor may be configured to perform the operating method of the terminal according to FIG. 25.
  • a method performed by a base station (BS) in a communication system includes a step (S2610) of transmitting, to a user equipment (UE), first frequency information allocated to a cell to which the UE belongs through a first node, a step (S2620) of transmitting, to the UE, information about a cell-specific timing advance at a specific time through the first node, and a step (S2630) of transmitting, based on a signal transmitted from the BS through the first node, second frequency information of the signal, wherein a UE-specific timing advance for the UE can be estimated based on at least one of the first frequency information, the second frequency information, and the information about the cell-specific timing advance.
  • the terminal-specific timing advance obtained at the specific time may be determined based on the terminal-specific timing advance obtained at a time earlier than the specific time.
  • a base station may be provided in a communication system.
  • The base station may include a transceiver and at least one processor, and the at least one processor may be configured to perform the operating method of the base station according to FIG. 26.
  • a device for controlling a base station in a communication system may be provided.
  • the device may include at least one processor and at least one memory operably connected to the at least one processor.
  • the at least one memory may be configured to store instructions for performing the operating method of the base station according to FIG. 26 based on instructions executed by the at least one processor.
  • one or more non-transitory computer-readable media storing one or more commands may be provided.
  • the one or more commands when executed by one or more processors, perform operations, and the operations may include an operating method of a base station according to FIG. 26.
  • FIG. 27 illustrates a communication system (1) applicable to various embodiments of the present disclosure.
  • the vehicle may include a vehicle equipped with a wireless communication function, an autonomous vehicle, a vehicle capable of performing vehicle-to-vehicle communication, etc.
  • the vehicle may include an Unmanned Aerial Vehicle (UAV) (e.g., a drone).
  • XR devices include AR (Augmented Reality)/VR (Virtual Reality)/MR (Mixed Reality) devices, and may be implemented in the form of a Head-Mounted Device (HMD), a Head-Up Display (HUD) installed in a vehicle, a television, a smartphone, a computer, a wearable device, a home appliance, digital signage, a vehicle, a robot, etc.
  • Mobile devices may include a smartphone, a smart pad, a wearable device (e.g., a smart watch, smart glasses), a computer (e.g., a laptop, etc.), etc.
  • Home appliances may include a TV, a refrigerator, a washing machine, etc.
  • IoT devices may include a sensor, a smart meter, etc.
  • a base station and a network may also be implemented as a wireless device, and a specific wireless device (200a) may act as a base station/network node to other wireless devices.
  • Wireless devices (100a to 100f) can be connected to a network (300) via a base station (200). Artificial Intelligence (AI) technology can be applied to the wireless devices (100a to 100f), and the wireless devices (100a to 100f) can be connected to an AI server (400) via the network (300).
  • the network (300) can be configured using a 3G network, a 4G (e.g., LTE) network, a 5G (e.g., NR) network, or a 6G network.
  • the wireless devices (100a to 100f) can communicate with each other via the base station (200)/network (300), but can also communicate directly (e.g., sidelink communication) without going through the base station/network.
  • vehicles can communicate directly (e.g., V2V (Vehicle to Vehicle)/V2X (Vehicle to everything) communication).
  • IoT devices can communicate directly with other IoT devices (e.g., sensors) or other wireless devices (100a to 100f).
  • Wireless communication/connection can be established between wireless devices (100a ⁇ 100f)/base stations (200), and base stations (200)/base stations (200).
  • wireless communication/connection can be achieved through various wireless access technologies (e.g., 5G NR) such as uplink/downlink communication (150a), sidelink communication (150b) (or D2D communication), and base station-to-base station communication (150c) (e.g., relay, IAB (Integrated Access Backhaul).
  • wireless devices and base stations/wireless devices, and base stations and base stations can transmit/receive wireless signals to each other.
  • wireless communication/connection can transmit/receive signals through various physical channels.
  • various configuration information setting processes for transmitting/receiving wireless signals various signal processing processes (e.g., channel encoding/decoding, modulation/demodulation, resource mapping/demapping, etc.), and resource allocation processes can be performed based on various proposals of various embodiments of the present disclosure.
  • NR supports multiple numerologies (or subcarrier spacing (SCS)) to support various 5G services.
  • an SCS of 15 kHz supports a wide area in traditional cellular bands
  • an SCS of 30 kHz/60 kHz supports dense urban areas, lower latency, and wider carrier bandwidth
  • an SCS of 60 kHz or higher supports a bandwidth greater than 24.25 GHz to overcome phase noise.
  • the NR frequency band can be defined by two types of frequency ranges (FR1, FR2).
  • the numerical values of the frequency ranges can be changed, and for example, the frequency ranges of the two types (FR1, FR2) can be as shown in Table 3 below.
  • FR1 can mean the "sub 6 GHz range”
  • FR2 can mean the "above 6 GHz range” and can be called millimeter wave (mmW).
  • the communication system (1) can support terahertz (THz) wireless communication.
  • the frequency band expected to be used for THz wireless communication may be a D-band (110 GHz to 170 GHz) or H-band (220 GHz to 325 GHz) band where propagation loss due to absorption of molecules in the air is small.
  • FIG. 28 illustrates a wireless device that can be applied to various embodiments of the present disclosure.
  • the first wireless device (100) and the second wireless device (200) can transmit and receive wireless signals through various wireless access technologies (e.g., LTE, NR).
  • ⁇ the first wireless device (100), the second wireless device (200) ⁇ can correspond to ⁇ the wireless device (100x), the base station (200) ⁇ and/or ⁇ the wireless device (100x), the wireless device (100x) ⁇ of FIG. 27.
  • a first wireless device (100) includes one or more processors (102) and one or more memories (104), and may further include one or more transceivers (106) and/or one or more antennas (108).
  • the processor (102) controls the memories (104) and/or the transceivers (106), and may be configured to implement the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document.
  • the processor (102) may process information in the memory (104) to generate first information/signal, and then transmit a wireless signal including the first information/signal via the transceiver (106).
  • the processor (102) may receive a wireless signal including second information/signal via the transceiver (106), and then store information obtained from signal processing of the second information/signal in the memory (104).
  • the memory (104) may be connected to the processor (102) and may store various information related to the operation of the processor (102). For example, the memory (104) may perform some or all of the processes controlled by the processor (102), or may store software code including commands for performing the descriptions, functions, procedures, proposals, methods, and/or operation flowcharts disclosed in this document.
  • the processor (102) and the memory (104) may be part of a communication modem/circuit/chip designed to implement a wireless communication technology (e.g., LTE, NR).
  • the transceiver (106) may be connected to the processor (102) and may transmit and/or receive wireless signals via one or more antennas (108).
  • the transceiver (106) may include a transmitter and/or a receiver.
  • the transceiver (106) may be used interchangeably with an RF (Radio Frequency) unit.
  • a wireless device may mean a communication modem/circuit/chip.
  • the second wireless device (200) includes one or more processors (202), one or more memories (204), and may further include one or more transceivers (206) and/or one or more antennas (208).
  • the processor (202) controls the memories (204) and/or the transceivers (206), and may be configured to implement the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document.
  • the processor (202) may process information in the memory (204) to generate third information/signals, and then transmit a wireless signal including the third information/signals via the transceivers (206).
  • the processor (202) may receive a wireless signal including fourth information/signals via the transceivers (206), and then store information obtained from signal processing of the fourth information/signals in the memory (204).
  • the memory (204) may be connected to the processor (202) and may store various information related to the operation of the processor (202). For example, the memory (204) may perform some or all of the processes controlled by the processor (202), or may store software code including commands for performing the descriptions, functions, procedures, proposals, methods, and/or operation flowcharts disclosed in this document.
  • the processor (202) and the memory (204) may be part of a communication modem/circuit/chip designed to implement wireless communication technology (e.g., LTE, NR).
  • the transceiver (206) may be connected to the processor (202) and may transmit and/or receive wireless signals via one or more antennas (208).
  • the transceiver (206) may include a transmitter and/or a receiver.
  • the transceiver (206) may be used interchangeably with an RF unit.
  • a wireless device may also mean a communication modem/circuit/chip.
  • one or more protocol layers may be implemented by one or more processors (102, 202).
  • one or more processors (102, 202) may implement one or more layers (e.g., functional layers such as PHY, MAC, RLC, PDCP, RRC, SDAP).
  • One or more processors (102, 202) may generate one or more Protocol Data Units (PDUs) and/or one or more Service Data Units (SDUs) according to the descriptions, functions, procedures, proposals, methods, and/or operation flowcharts disclosed in this document.
  • One or more processors (102, 202) may generate messages, control information, data, or information according to the descriptions, functions, procedures, proposals, methods, and/or operation flowcharts disclosed in this document.
  • One or more processors (102, 202) can generate signals (e.g., baseband signals) including PDUs, SDUs, messages, control information, data or information according to the functions, procedures, proposals and/or methods disclosed herein, and provide the signals to one or more transceivers (106, 206).
  • One or more processors (102, 202) can receive signals (e.g., baseband signals) from one or more transceivers (106, 206) and obtain PDUs, SDUs, messages, control information, data or information according to the descriptions, functions, procedures, proposals, methods and/or operational flowcharts disclosed herein.
  • the descriptions, functions, procedures, proposals, methods, and/or operational flowcharts disclosed in this document may be implemented using firmware or software, and the firmware or software may be implemented to include modules, procedures, functions, etc.
  • The descriptions, functions, procedures, suggestions, methods, and/or operation flowcharts disclosed in this document may be implemented using firmware or software configured to be executed by one or more processors (102, 202), or may be stored in one or more memories (104, 204) and executed by one or more processors (102, 202).
  • the descriptions, functions, procedures, suggestions, methods and/or operation flowcharts disclosed in this document may be implemented using firmware or software in the form of codes, instructions and/or sets of instructions.
  • One or more transceivers (106, 206) may convert user data, control information, wireless signals/channels, etc., processed using one or more processors (102, 202), from baseband signals into RF band signals.
  • one or more transceivers (106, 206) may include an (analog) oscillator and/or a filter.
  • the signal processing circuit (1000) may include a scrambler (1010), a modulator (1020), a layer mapper (1030), a precoder (1040), a resource mapper (1050), and a signal generator (1060).
  • the operations/functions of FIG. 30 may be performed in the processor (102, 202) and/or the transceiver (106, 206) of FIG. 28.
  • the hardware elements of FIG. 30 may be implemented in the processor (102, 202) and/or the transceiver (106, 206) of FIG. 28.
  • blocks 1010 to 1060 may be implemented in the processor (102, 202) of FIG. 28.
  • blocks 1010 to 1050 may be implemented in the processor (102, 202) of FIG. 28, and block 1060 may be implemented in the transceiver (106, 206) of FIG. 28.
  • a signal processing circuit for a received signal may include a signal restorer, a resource de-mapper, a postcoder, a demodulator, a de-scrambler, and a decoder.
  • Figure 31 illustrates another example of a wireless device applicable to various embodiments of the present disclosure.
  • the wireless device may be implemented in various forms depending on the use case/service.
  • The wireless device (100, 200) corresponds to the wireless device (100, 200) of FIG. 28 and may be composed of various elements, components, units/parts, and/or modules.
  • the wireless device (100, 200) may include a communication unit (110), a control unit (120), a memory unit (130), and additional elements (140).
  • the communication unit may include a communication circuit (112) and a transceiver(s) (114).
  • the communication circuit (112) may include one or more processors (102, 202) and/or one or more memories (104, 204) of FIG. 28.
  • the transceiver(s) (114) may include one or more transceivers (106, 206) and/or one or more antennas (108, 208) of FIG. 28.
  • the control unit (120) is electrically connected to the communication unit (110), the memory unit (130), and the additional elements (140) and controls the overall operation of the wireless device.
  • the control unit (120) may control the electrical/mechanical operation of the wireless device based on the program/code/command/information stored in the memory unit (130).
  • control unit (120) may transmit information stored in the memory unit (130) to an external device (e.g., another communication device) via a wireless/wired interface through the communication unit (110), or store information received from an external device (e.g., another communication device) via a wireless/wired interface in the memory unit (130).
  • the additional element (140) may be configured in various ways depending on the type of the wireless device.
  • the additional element (140) may include at least one of a power unit/battery, an input/output unit (I/O unit), a driving unit, and a computing unit.
  • the wireless device may be implemented in the form of a robot (Fig. 27, 100a), a vehicle (Fig. 27, 100b-1, 100b-2), an XR device (Fig. 27, 100c), a portable device (Fig. 27, 100d), a home appliance (Fig. 27, 100e), an IoT device (Fig.
  • Wireless devices may be mobile or stationary depending on the use/service.
  • FIG. 32 illustrates a mobile device applicable to various embodiments of the present disclosure.
  • the mobile device may include a smartphone, a smart pad, a wearable device (e.g., a smartwatch, smartglasses), or a portable computer (e.g., a laptop, etc.).
  • the mobile device may be referred to as a Mobile Station (MS), a User Terminal (UT), a Mobile Subscriber Station (MSS), a Subscriber Station (SS), an Advanced Mobile Station (AMS), or a Wireless Terminal (WT).
  • the portable device (100) may include an antenna unit (108), a communication unit (110), a control unit (120), a memory unit (130), a power supply unit (140a), an interface unit (140b), and an input/output unit (140c).
  • the antenna unit (108) may be configured as a part of the communication unit (110).
  • Blocks 110 to 130/140a to 140c correspond to blocks 110 to 130/140 of FIG. 31, respectively.
  • the communication unit (110) can transmit and receive signals (e.g., data, control signals, etc.) with other wireless devices and base stations.
  • the control unit (120) can control components of the mobile device (100) to perform various operations.
  • the control unit (120) can include an AP (Application Processor).
  • the memory unit (130) can store data/parameters/programs/codes/commands required for operating the mobile device (100). In addition, the memory unit (130) can store input/output data/information, etc.
  • the power supply unit (140a) supplies power to the mobile device (100) and can include a wired/wireless charging circuit, a battery, etc.
  • the interface unit (140b) can support connection between the mobile device (100) and other external devices.
  • the interface unit (140b) can include various ports (e.g., audio input/output ports, video input/output ports) for connection with external devices.
  • the input/output unit (140c) can input or output video information/signals, audio information/signals, data, and/or information input from a user.
  • the input/output unit (140c) may include a camera, a microphone, a user input unit, a display unit (140d), a speaker, and/or a haptic module.
  • the input/output unit (140c) obtains information/signals (e.g., touch, text, voice, image, video) input by the user, and the obtained information/signals can be stored in the memory unit (130).
  • the communication unit (110) converts the information/signals stored in the memory into wireless signals, and can directly transmit the converted wireless signals to other wireless devices or to a base station.
  • the communication unit (110) can receive wireless signals from other wireless devices or base stations, and then restore the received wireless signals to the original information/signals.
  • the restored information/signals can be stored in the memory unit (130) and then output in various forms (e.g., text, voice, image, video, haptic) through the input/output unit (140c).
  • FIG. 33 illustrates a vehicle or autonomous vehicle applicable to various embodiments of the present disclosure.
  • Vehicles or autonomous vehicles can be implemented as mobile robots, cars, trains, manned or unmanned aerial vehicles (AVs), ships, etc.
  • a vehicle or autonomous vehicle may include an antenna unit (108), a communication unit (110), a control unit (120), a driving unit (140a), a power supply unit (140b), a sensor unit (140c), and an autonomous driving unit (140d).
  • the antenna unit (108) may be configured as a part of the communication unit (110).
  • Blocks 110/130/140a to 140d correspond to blocks 110/130/140 of FIG. 31, respectively.
  • the communication unit (110) can transmit and receive signals (e.g., data, control signals, etc.) with external devices such as other vehicles, base stations (e.g., base stations, road side units, etc.), and servers.
  • the control unit (120) can control elements of the vehicle or autonomous vehicle (100) to perform various operations.
  • the control unit (120) can include an ECU (Electronic Control Unit).
  • the drive unit (140a) can drive the vehicle or autonomous vehicle (100) on the ground.
  • the drive unit (140a) can include an engine, a motor, a power train, wheels, brakes, a steering device, etc.
  • the power supply unit (140b) supplies power to the vehicle or autonomous vehicle (100) and can include a wired/wireless charging circuit, a battery, etc.
  • the sensor unit (140c) can obtain vehicle status, surrounding environment information, user information, etc.
  • the sensor unit (140c) may include an IMU (inertial measurement unit) sensor, a collision sensor, a wheel sensor, a speed sensor, an incline sensor, a weight detection sensor, a heading sensor, a position module, a vehicle forward/backward sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor, a temperature sensor, a humidity sensor, an ultrasonic sensor, an illuminance sensor, a pedal position sensor, etc.
  • the autonomous driving unit (140d) may implement a technology for maintaining a driving lane, a technology for automatically controlling speed such as adaptive cruise control, a technology for automatically driving along a set path, a technology for automatically setting a path and driving when a destination is set, etc.
  • the communication unit (110) can receive map data, traffic information data, etc. from an external server.
  • the autonomous driving unit (140d) can generate an autonomous driving route and driving plan based on the acquired data.
  • the control unit (120) can control the drive unit (140a) so that the vehicle or autonomous vehicle (100) moves along the autonomous driving route according to the driving plan (e.g., speed/direction control).
  • the communication unit (110) can irregularly/periodically acquire the latest traffic information data from an external server and can acquire surrounding traffic information data from surrounding vehicles.
  • the sensor unit (140c) can acquire vehicle status and surrounding environment information.
  • the autonomous driving unit (140d) can update the autonomous driving route and driving plan based on newly acquired data/information.
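The planning/driving loop described above (map and traffic data received via the communication unit (110), route and plan generation in the autonomous driving unit (140d), actuation through the drive unit (140a) under the control unit (120)) can be sketched as follows. The class and method names, and the dict-shaped traffic/sensor data, are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class DrivePlan:
    route: list     # waypoints to follow
    speed: float    # target speed (m/s)

class AutonomousVehicle:
    """Illustrative control loop; not the claimed implementation."""
    def __init__(self, comm, sensors, drive):
        self.comm, self.sensors, self.drive = comm, sensors, drive
        self.plan = None

    def make_plan(self):
        # Autonomous driving unit (140d): build a route/plan from
        # map data and traffic information obtained via the communication unit.
        map_data = self.comm.get_map()
        traffic = self.comm.get_traffic()
        self.plan = DrivePlan(route=map_data["route"], speed=traffic["safe_speed"])

    def step(self):
        # Control unit (120): command the drive unit (140a) along the plan
        # (speed/direction control), then replan when new data requires it.
        status = self.sensors.read()
        self.drive.move(self.plan.route[0], min(self.plan.speed, status["limit"]))
        if status["replan"]:
            self.make_plan()
```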
  • the vehicle (100) may include a communication unit (110), a control unit (120), a memory unit (130), an input/output unit (140a), and a position measurement unit (140b).
  • blocks 110 to 130/140a to 140b correspond to blocks 110 to 130/140 of FIG. 31, respectively.
  • the communication unit (110) can transmit and receive signals (e.g., data, control signals, etc.) with other vehicles or external devices such as base stations.
  • the control unit (120) can control components of the vehicle (100) to perform various operations.
  • the memory unit (130) can store data/parameters/programs/codes/commands that support various functions of the vehicle (100).
  • the input/output unit (140a) can output AR/VR objects based on information in the memory unit (130).
  • the input/output unit (140a) can include a HUD.
  • the position measurement unit (140b) can obtain position information of the vehicle (100).
  • the position information can include absolute position information of the vehicle (100), position information within a driving line, acceleration information, position information with respect to surrounding vehicles, etc.
  • the position measurement unit (140b) can include GPS and various sensors.
  • the XR device (100a) may include a communication unit (110), a control unit (120), a memory unit (130), an input/output unit (140a), a sensor unit (140b), and a power supply unit (140c).
  • blocks 110 to 130/140a to 140c correspond to blocks 110 to 130/140 of FIG. 31, respectively.
  • the communication unit (110) can transmit and receive signals (e.g., media data, control signals, etc.) with external devices such as other wireless devices, portable devices, or media servers.
  • the media data can include videos, images, sounds, etc.
  • the control unit (120) can control components of the XR device (100a) to perform various operations.
  • the control unit (120) can be configured to control and/or perform procedures such as video/image acquisition, (video/image) encoding, metadata generation and processing, etc.
  • the memory unit (130) can store data/parameters/programs/codes/commands required for driving the XR device (100a)/generating XR objects.
  • the input/output unit (140a) can obtain control information, data, etc.
  • the input/output unit (140a) can include a camera, a microphone, a user input unit, a display unit, a speaker, and/or a haptic module, etc.
  • the sensor unit (140b) can obtain the XR device status, surrounding environment information, user information, etc.
  • the sensor unit (140b) may include a proximity sensor, an illuminance sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, a light sensor, a microphone, and/or a radar.
  • the power supply unit (140c) supplies power to the XR device (100a) and may include a wired/wireless charging circuit, a battery, etc.
  • the memory unit (130) of the XR device (100a) may include information (e.g., data, etc.) required for creating an XR object (e.g., AR/VR/MR object).
  • the input/output unit (140a) may obtain a command to operate the XR device (100a) from the user, and the control unit (120) may operate the XR device (100a) according to the user's operating command. For example, when a user attempts to watch a movie, news, etc. through the XR device (100a), the control unit (120) may transmit content request information to another device (e.g., a mobile device (100b)) or a media server through the communication unit (110).
  • the communication unit (110) may download/stream content such as movies and news from another device (e.g., a mobile device (100b)) or a media server into the memory unit (130).
  • the control unit (120) controls and/or performs procedures such as video/image acquisition, (video/image) encoding, and metadata generation/processing for content, and can generate/output an XR object based on information about surrounding space or real objects acquired through the input/output unit (140a)/sensor unit (140b).
  • Figure 36 illustrates robots applicable to various embodiments of the present disclosure. Robots may be classified into industrial, medical, household, and military applications, depending on their intended use or field.
  • the robot (100) may include a communication unit (110), a control unit (120), a memory unit (130), an input/output unit (140a), a sensor unit (140b), and a driving unit (140c).
  • blocks 110 to 130/140a to 140c correspond to blocks 110 to 130/140 of FIG. 31, respectively.
  • the communication unit (110) can transmit and receive signals (e.g., driving information, control signals, etc.) with external devices such as other wireless devices, other robots, or control servers.
  • the control unit (120) can control components of the robot (100) to perform various operations.
  • the memory unit (130) can store data/parameters/programs/codes/commands that support various functions of the robot (100).
  • the input/output unit (140a) can obtain information from the outside of the robot (100) and output information to the outside of the robot (100).
  • the input/output unit (140a) can include a camera, a microphone, a user input unit, a display unit, a speaker, and/or a haptic module.
  • the AI device (100) may include a communication unit (110), a control unit (120), a memory unit (130), an input/output unit (140a/140b), a learning processor unit (140c), and a sensor unit (140d).
  • Blocks 110 to 130/140a to 140d correspond to blocks 110 to 130/140 of FIG. 31, respectively.
  • control unit (120) may collect history information including the operation contents of the AI device (100) or user feedback on the operation, and store the collected history information in the memory unit (130) or the learning processor unit (140c), or transmit the collected history information to an external device such as an AI server (FIG. W1, 400).
  • the collected history information may be used to update a learning model.
  • the sensor unit (140d) may include a proximity sensor, an illuminance sensor, an acceleration sensor, a magnetic sensor, a gyro sensor, an inertial sensor, an RGB sensor, an IR sensor, a fingerprint recognition sensor, an ultrasonic sensor, a light sensor, a microphone, and/or a radar.
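The history-collection flow for the AI device (operation contents and user feedback gathered by the control unit (120), buffered in the memory unit (130), and handed to the learning processor unit (140c) or an external AI server to update the learning model) might look like the minimal sketch below; all names are hypothetical.

```python
from collections import deque

class AIDeviceHistory:
    """Illustrative history buffer; not the claimed implementation."""
    def __init__(self, maxlen=1000):
        # Stand-in for the memory unit (130): bounded history buffer.
        self.memory = deque(maxlen=maxlen)

    def record(self, operation, feedback):
        # Control unit (120): collect operation contents and user feedback.
        self.memory.append({"op": operation, "feedback": feedback})

    def export_for_training(self):
        # Drain the buffer into a batch destined for the learning
        # processor unit (140c) or an AI server for a model update.
        batch = list(self.memory)
        self.memory.clear()
        return batch
```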
  • the claims described in the various embodiments of the present disclosure may be combined in various ways.
  • the technical features of the method claims of the various embodiments of the present disclosure may be combined and implemented as a device, and the technical features of the device claims of the various embodiments of the present disclosure may be combined and implemented as a method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

According to various embodiments of the present disclosure, a method performed by a user equipment (UE) in a communication system may comprise the steps of: receiving, from a base station (BS) via a first node, first frequency information assigned to a cell to which the UE belongs; receiving, from the BS via the first node, information about a cell-specific timing advance at a specific time; obtaining, based on the signal, second frequency information of a signal transmitted from the BS via the first node; and estimating a UE-specific timing advance for the UE based on the first frequency information and/or the second frequency information and/or the information about the cell-specific timing advance.
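The estimation step summarized in the abstract can be illustrated with a simplified non-terrestrial-network model: the offset between the assigned frequency (first frequency information) and the observed frequency (second frequency information) yields a Doppler-derived radial velocity, which can propagate the cell-specific timing advance reported at the specific time forward to the UE's current time. The formula and names below are an assumption for illustration only, not the claimed method.

```python
C = 299_792_458.0  # speed of light (m/s)

def radial_velocity(f_assigned_hz, f_observed_hz):
    # Doppler shift between the assigned downlink frequency and the
    # observed frequency gives the node-UE radial velocity (m/s).
    return C * (f_observed_hz - f_assigned_hz) / f_assigned_hz

def ue_specific_ta(ta_common_s, t0_s, t_s, v_radial_ms):
    # Propagate the cell-specific (common) timing advance reported at
    # the specific time t0 using the Doppler-derived range rate:
    # the round-trip delay changes by 2*v_r/c seconds per second.
    return ta_common_s + 2.0 * v_radial_ms * (t_s - t0_s) / C
```

For example, a 40 kHz shift on a 2 GHz carrier corresponds to roughly 6 km/s of radial velocity, which moves the round-trip timing advance by about 40 microseconds per second under this model.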
PCT/KR2024/004188 2024-04-01 2024-04-01 Appareil et procédé pour émettre et recevoir des signaux dans un réseau non terrestre Pending WO2025211471A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2024/004188 WO2025211471A1 (fr) 2024-04-01 2024-04-01 Appareil et procédé pour émettre et recevoir des signaux dans un réseau non terrestre

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2024/004188 WO2025211471A1 (fr) 2024-04-01 2024-04-01 Appareil et procédé pour émettre et recevoir des signaux dans un réseau non terrestre

Publications (1)

Publication Number Publication Date
WO2025211471A1 true WO2025211471A1 (fr) 2025-10-09

Family

ID=97267240

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2024/004188 Pending WO2025211471A1 (fr) 2024-04-01 2024-04-01 Appareil et procédé pour émettre et recevoir des signaux dans un réseau non terrestre

Country Status (1)

Country Link
WO (1) WO2025211471A1 (fr)

Similar Documents

Publication Publication Date Title
WO2022014728A1 (fr) Procédé et appareil pour effectuer un codage de canal par un équipement utilisateur et une station de base dans un système de communication sans fil
WO2022244903A1 (fr) Procédé et dispositif d'exécution d'un apprentissage fédéré dans un système de communication sans fil
WO2022149641A1 (fr) Procédé et appareil d'apprentissage fédéré basés sur une configuration de serveur multi-antenne et d'utilisateur à antenne unique
WO2024101461A1 (fr) Appareil et procédé permettant de régler la synchronisation de transmission et de réaliser une association entre un ap et un ue dans un système d-mimo
WO2022119021A1 (fr) Procédé et dispositif d'adaptation d'un système basé sur une classe d'apprentissage à la technologie ai mimo
WO2021235563A1 (fr) Procédé de distribution de clés quantiques prêtes à l'emploi basé sur des trajets multiples et une division de longueur d'onde, et dispositif d'utilisation de ce procédé
WO2023068714A1 (fr) Dispositif et procédé permettant de réaliser, sur la base d'informations de canal, un regroupement de dispositifs pour un aircomp, basé sur un apprentissage fédéré, d'un environnement de données non iid dans un système de communication
WO2024195920A1 (fr) Appareil et procédé pour effectuer un codage de canal sur un canal d'interférence ayant des caractéristiques de bruit non local dans un système de communication quantique
WO2023113390A1 (fr) Appareil et procédé de prise en charge de groupement d'utilisateurs de système de précodage de bout en bout dans un système de communication sans fil
WO2024150852A1 (fr) Dispositif et procédé pour exécuter une attribution de ressources quantiques basée sur une configuration d'ensemble de liaisons dans un système de communication quantique
WO2023128604A1 (fr) Procédé et dispositif pour effectuer une correction d'erreurs sur un canal de pauli asymétrique dans un système de communication quantique
WO2024101470A1 (fr) Appareil et procédé pour effectuer une modulation d'état quantique sur la base d'une communication directe sécurisée quantique dans un système de communication quantique
WO2022092351A1 (fr) Procédé et appareil permettant d'atténuer une limitation de puissance de transmission par transmission en chevauchement
WO2025211471A1 (fr) Appareil et procédé pour émettre et recevoir des signaux dans un réseau non terrestre
WO2025216338A1 (fr) Procédé et appareil d'émission et de réception de signal dans un système de communication sans fil
WO2025173806A1 (fr) Procédé et dispositif de transmission et de réception de signaux dans un système de communication sans fil
WO2025211472A1 (fr) Procédé et appareil d'émission et de réception de signal dans un système de communication sans fil
WO2025206432A1 (fr) Procédé et appareil d'émission/de réception d'un signal dans un système de communication sans fil
WO2025211464A1 (fr) Procédé et dispositif d'émission et de réception de signaux dans un système de communication sans fil
WO2025206440A1 (fr) Procédé et appareil de transmission et de réception de signaux dans un système de communication sans fil
WO2025173808A1 (fr) Dispositif et procédé de réalisation d'une commutation dans un réseau non terrestre
WO2025211467A1 (fr) Procédé et dispositif de transmission et de réception de signaux dans un système de communication sans fil
WO2025100600A1 (fr) Appareil et procédé pour effectuer une commutation dans un réseau non terrestre
WO2025206439A1 (fr) Procédé et appareil d'émission et de réception de signal dans un système de communication sans fil
WO2025249599A1 (fr) Procédé et dispositif de transmission et de réception de signaux dans un système de communication sans fil

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24934208

Country of ref document: EP

Kind code of ref document: A1