
WO2021015345A1 - Method by which a terminal performs a one-to-one sidelink connection in a wireless communication system, and terminal using the method - Google Patents

Method by which a terminal performs a one-to-one sidelink connection in a wireless communication system, and terminal using the method

Info

Publication number
WO2021015345A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
sidelink
message
connection
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2019/009201
Other languages
English (en)
Korean (ko)
Inventor
이성근
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Priority to PCT/KR2019/009201
Publication of WO2021015345A1
Legal status: Ceased

Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04W: Wireless communication networks
    • H04W 12/00: Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/06: Authentication
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/40: Services specially adapted for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 76/00: Connection management
    • H04W 76/10: Connection setup
    • H04W 76/14: Direct-mode setup
    • H04W 76/30: Connection release

Definitions

  • the present invention relates to a wireless communication system.
  • a wireless communication system is a multiple access system that supports communication with multiple users by sharing available system resources (eg, bandwidth, transmission power, etc.).
  • multiple access systems include code division multiple access (CDMA) systems, frequency division multiple access (FDMA) systems, time division multiple access (TDMA) systems, orthogonal frequency division multiple access (OFDMA) systems, single carrier frequency division multiple access (SC-FDMA) systems, multi carrier frequency division multiple access (MC-FDMA) systems, and the like.
  • FIG. 1 shows an example of a 5G usage scenario to which the technical features of the present invention can be applied.
  • the 5G usage scenario shown in FIG. 1 is merely exemplary, and the technical features of the present invention can be applied to other 5G usage scenarios not shown in FIG. 1.
  • the three main requirement areas of 5G are (1) the enhanced mobile broadband (eMBB) area, (2) the massive machine type communication (mMTC) area, and (3) the ultra-reliable and low latency communications (URLLC) area.
  • Some use cases may require multiple areas for optimization, while other use cases may focus on only one key performance indicator (KPI).
  • eMBB focuses on the overall improvement of data rate, latency, user density, capacity and coverage of mobile broadband access.
  • eMBB targets a throughput of around 10Gbps.
  • eMBB goes far beyond basic mobile Internet access, covering rich interactive work, media and entertainment applications in the cloud or augmented reality.
  • Data is one of the key drivers of 5G, and in the 5G era we may, for the first time, see no dedicated voice service.
  • voice is expected to be processed as an application program simply using the data connection provided by the communication system.
  • the main reason for the increased traffic volume is an increase in content size and an increase in the number of applications requiring high data rates.
  • Streaming services (audio and video), interactive video, and mobile Internet connections will become more prevalent as more devices connect to the Internet.
  • Cloud storage and applications are increasing rapidly in mobile communication platforms, which can be applied to both work and entertainment.
  • Cloud storage is a special use case that drives the growth of uplink data rates.
  • 5G is also used for remote work in the cloud and requires much lower end-to-end latency to maintain a good user experience when tactile interfaces are used.
  • cloud gaming and video streaming are another key factor in increasing the demand for mobile broadband capabilities.
  • Entertainment is essential on smartphones and tablets anywhere, including in highly mobile environments such as trains, cars and airplanes.
  • Another use case is augmented reality and information retrieval for entertainment.
  • augmented reality requires very low latency and instantaneously large volumes of data.
  • the mMTC is designed to enable communication between a large number of low-cost devices powered by batteries and is intended to support applications such as smart metering, logistics, field and body sensors.
  • the mMTC targets a battery life of around 10 years and/or a density of around 1 million devices per km².
  • mMTC makes it possible to seamlessly connect embedded sensors in all fields, and is one of the most anticipated 5G use cases. Potentially, IoT devices are expected to reach 20.4 billion by 2020.
  • Industrial IoT is one of the areas where 5G plays a major role in enabling smart cities, asset tracking, smart utilities, agriculture and security infrastructure.
  • URLLC is ideal for vehicle communication, industrial control, factory automation, teleoperation, smart grid and public safety applications by allowing devices and machines to communicate with high reliability, very low latency and high availability.
  • URLLC aims for a delay of the order of 1ms.
  • URLLC includes new services that will transform the industry through ultra-reliable/low-latency links such as remote control of critical infrastructure and autonomous vehicles. The level of reliability and delay is essential for smart grid control, industrial automation, robotics, drone control and coordination.
  • 5G can complement fiber-to-the-home (FTTH) and cable-based broadband (or DOCSIS) as a means of providing streams rated from hundreds of megabits per second to gigabits per second.
  • Such high speed may be required to deliver TVs in resolutions of 4K or higher (6K, 8K and higher) as well as virtual reality (VR) and augmented reality (AR).
  • VR and AR applications involve almost immersive sports events. Certain applications may require special network configuration. In the case of VR games, for example, the game company may need to integrate the core server with the network operator's edge network server to minimize latency.
  • Automotive is expected to be an important new driving force in 5G, with many use cases for mobile communication to vehicles. For example, entertainment for passengers simultaneously demands high capacity and high mobile broadband. The reason is that future users will continue to expect high-quality connections, regardless of their location and speed.
  • Another use case in the automotive sector is an augmented reality dashboard.
  • the augmented reality dashboard allows the driver to identify objects in the dark, overlaid on what they see through the front window.
  • the augmented reality dashboard superimposes information to inform the driver about the distance and movement of objects.
  • wireless modules will enable communication between vehicles, exchange of information between the vehicle and the supporting infrastructure, and exchange of information between the vehicle and other connected devices (eg, devices carried by pedestrians).
  • the safety system can lower the risk of accidents by guiding the driver through alternative courses of action to make driving safer.
  • the next step will be a remotely controlled vehicle or an autonomous vehicle.
  • This requires very reliable and very fast communication between different autonomous vehicles and/or between vehicles and infrastructure.
  • autonomous vehicles will perform all driving activities, and drivers will be forced to focus only on traffic anomalies that the vehicle itself cannot identify.
  • the technical requirements of autonomous vehicles call for ultra-low latency and ultra-high reliability to increase traffic safety to levels that cannot be achieved by humans.
  • Smart cities and smart homes referred to as smart society will be embedded with high-density wireless sensor networks.
  • a distributed network of intelligent sensors will identify the conditions for cost and energy efficient maintenance of a city or home.
  • a similar setup can be done for each household.
  • Temperature sensors, window and heating controllers, burglar alarms and appliances are all wirelessly connected. Many of these sensors typically require low data rates, low power and low cost.
  • real-time HD video may be required in certain types of devices for surveillance.
  • the smart grid interconnects these sensors using digital information and communication technologies to collect information and act accordingly. This information can include the behavior of suppliers and consumers, enabling smart grids to improve efficiency, reliability, economics, sustainability of production and the distribution of fuels such as electricity in an automated manner.
  • the smart grid can also be viewed as another low-latency sensor network.
  • the health sector has many applications that can benefit from mobile communications.
  • the communication system can support telemedicine providing clinical care from remote locations. This can help reduce barriers to distance and improve access to medical services that are not consistently available in remote rural areas. It is also used to save lives in critical care and emergencies.
  • a wireless sensor network based on mobile communication may provide remote monitoring and sensors for parameters such as heart rate and blood pressure.
  • Wireless and mobile communications are becoming increasingly important in industrial applications. Wiring is expensive to install and maintain. Thus, the possibility of replacing cables with reconfigurable wireless links is an attractive opportunity for many industries. However, achieving this requires that the wireless connection operates with a delay, reliability and capacity similar to that of the cable, and its management is simplified. Low latency and very low error probability are new requirements that need to be connected to 5G.
  • Logistics and cargo tracking is an important use case for mobile communications that enables tracking of inventory and packages from anywhere using a location-based information system. Logistics and freight tracking use cases typically require low data rates, but require a wide range and reliable location information.
  • Sidelink refers to a communication method in which a direct link is established between terminals (User Equipment, UEs), and voice or data is directly exchanged between terminals without going through a base station (BS).
  • the sidelink is being considered as a method that can solve the burden of the base station due to rapidly increasing data traffic.
  • V2X (vehicle-to-everything) refers to a communication technology in which a vehicle exchanges information with other vehicles, pedestrians, and infrastructure objects through wired/wireless communication.
  • V2X can be divided into four types: vehicle-to-vehicle (V2V), vehicle-to-infrastructure (V2I), vehicle-to-network (V2N), and vehicle-to-pedestrian (V2P).
  • V2X communication may be provided through a PC5 interface and/or a Uu interface.
  • the access technology may be referred to as new radio access technology (RAT) or new radio (NR).
  • the technical problem to be solved by the present invention is to provide a one-to-one sidelink connection method performed by a terminal in a wireless communication system and a terminal using the method.
  • a one-to-one sidelink connection method performed by a first terminal in a wireless communication system includes: receiving a one-to-one connection request message from a second terminal, wherein the one-to-one connection request message includes information on a first radio resource on which the second terminal transmits and an identifier associated with the one-to-one sidelink connection; transmitting a one-to-one connection setup message to the second terminal; and performing a sidelink operation with the second terminal.
  • to receive a message from the second terminal during the sidelink operation, the first terminal monitors only the first radio resource, and application messages transmitted and received by the first terminal during the sidelink operation are encrypted.
  • the one-to-one connection request message and the response message may be broadcast.
  • the one-to-one connection setup message may include information on a second radio resource transmitted by the first terminal.
  • the application message may be encrypted using a symmetric key shared between the first terminal and the second terminal.
  • When the first terminal transmits the one-to-one connection setup message, the first terminal determines that a one-to-one connection with the second terminal is established. When the first terminal transmits a one-to-one connection release message to the second terminal, when the one-to-one connection timer expires, or when no received message is detected on the first radio resource, the first terminal may release the one-to-one connection.
  • the first terminal may delete information related to a sidelink operation with the second terminal.
  • the first terminal measures whether the signal strength of received messages is lower than a threshold value during a signal detection time period; if there is no received message having a signal strength higher than the threshold value during the signal detection time period, the first terminal may determine that no received message is detected on the first radio resource.
  • the first terminal may periodically measure whether the signal strength of the received message is lower than a threshold value during the signal detection time period.
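  • As a rough illustration of the monitoring and release behavior described above, the following Python sketch models a first terminal that monitors only the first radio resource and releases the one-to-one connection on an explicit release message, expiry of the one-to-one connection timer, or when no received message exceeds the signal strength threshold during the signal detection time period. All class and field names, and the numeric values, are illustrative assumptions rather than part of the described method.

```python
import time
from dataclasses import dataclass, field

@dataclass
class OneToOneSidelinkConnection:
    peer_id: str                      # identifier associated with the sidelink connection
    first_radio_resource: int         # the only resource the first terminal monitors
    connection_timer_s: float = 10.0  # one-to-one connection timer (assumed value)
    detection_window_s: float = 2.0   # signal detection time period (assumed value)
    threshold_dbm: float = -100.0     # signal strength threshold (assumed value)
    last_valid_rx: float = field(default_factory=time.monotonic)
    released: bool = False

    def on_received(self, resource: int, rssi_dbm: float, is_release_msg: bool) -> None:
        if resource != self.first_radio_resource:
            return                                 # other resources are not monitored
        if is_release_msg:
            self.release()                         # explicit one-to-one connection release
        elif rssi_dbm > self.threshold_dbm:
            self.last_valid_rx = time.monotonic()  # a message was detected on the resource

    def on_detection_window_end(self) -> None:
        idle = time.monotonic() - self.last_valid_rx
        # Release if the connection timer expired or nothing was detected above the
        # threshold during the signal detection time period.
        if idle > self.connection_timer_s or idle > self.detection_window_s:
            self.release()

    def release(self) -> None:
        self.released = True  # the terminal would also delete sidelink-related information
```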
  • the one-to-one connection request message may be encrypted using an Elliptic Curve Integrated Encryption Scheme (ECIES) algorithm.
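  • A minimal sketch of ECIES-style protection for the one-to-one connection request, assuming the responder's elliptic-curve public key is already known (for example, from a certificate). It combines an ephemeral ECDH exchange, HKDF, and AES-GCM from the Python 'cryptography' package; the specific curve, KDF, and cipher are assumptions for illustration, not requirements of the described method.

```python
import os
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def ecies_encrypt(receiver_pub: ec.EllipticCurvePublicKey, plaintext: bytes) -> dict:
    eph_priv = ec.generate_private_key(ec.SECP256R1())       # ephemeral key pair
    shared = eph_priv.exchange(ec.ECDH(), receiver_pub)      # ECDH shared secret
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"sl-one-to-one").derive(shared)         # derive AES-GCM key
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    eph_pub = eph_priv.public_key().public_bytes(
        serialization.Encoding.X962, serialization.PublicFormat.UncompressedPoint)
    return {"eph_pub": eph_pub, "nonce": nonce, "ciphertext": ciphertext}

def ecies_decrypt(receiver_priv: ec.EllipticCurvePrivateKey, msg: dict) -> bytes:
    eph_pub = ec.EllipticCurvePublicKey.from_encoded_point(ec.SECP256R1(), msg["eph_pub"])
    shared = receiver_priv.exchange(ec.ECDH(), eph_pub)
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"sl-one-to-one").derive(shared)
    return AESGCM(key).decrypt(msg["nonce"], msg["ciphertext"], None)
```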
  • the identifier may be a Media Access Control (MAC) identifier (ID).
  • the one-to-one connection request message may include a reception identifier (ID) for the first terminal.
  • the one-to-one connection setup message may include a reception identifier (ID) for the second terminal.
  • the first terminal may transmit a one-to-one connection update message to the second terminal, and the one-to-one connection update message may include information on a changed certificate and information on a changed second radio resource.
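  • The information elements attributed to the request, setup, and update messages in this description can be summarized as in the sketch below; the field names and types are illustrative placeholders, not a standardized message format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OneToOneConnectionRequest:
    first_radio_resource: int    # resource on which the second terminal transmits
    connection_identifier: str   # identifier associated with the one-to-one sidelink connection
    receiver_id: str             # reception ID for the first terminal (e.g., a MAC ID)

@dataclass
class OneToOneConnectionSetup:
    second_radio_resource: int   # resource on which the first terminal transmits
    receiver_id: str             # reception ID for the second terminal

@dataclass
class OneToOneConnectionUpdate:
    changed_certificate: bytes                            # information on the changed certificate
    changed_second_radio_resource: Optional[int] = None   # information on the changed second radio resource
```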
  • the received message and the application message may be messages related to vehicle-to-everything (V2X).
  • a user equipment (UE) provided in another aspect includes a transceiver for transmitting and receiving a radio signal and a processor operating in combination with the transceiver, wherein the processor: receives a one-to-one connection request message from another terminal, the one-to-one connection request message including information on a first radio resource on which the other terminal transmits and an identifier associated with the one-to-one sidelink connection with the terminal; transmits a one-to-one connection setup message to the other terminal; and performs a sidelink operation with the other terminal.
  • to receive a message from the other terminal during the sidelink operation, the terminal monitors only the first radio resource, and application messages transmitted and received by the terminal during the sidelink operation are encrypted.
  • V2X uses a broadcast message transmission scheme, and the physical layer manages radio resources for message transmission in a large-scale unit called a resource pool.
  • the terminal does not need to monitor the entire area of the received radio resource pool in order to receive the V2X message, and only periodically monitors the specific resource area shared in advance, thereby reducing power consumption.
  • since only the target V2X device can decrypt an encrypted transmission message, confidentiality between the two V2X devices is guaranteed.
  • by using ECIES (elliptic curve integrated encryption scheme) asymmetric encryption, which has a high computational burden, only for initial connection setup, and then using a shared symmetric key, it is possible to reduce the computational burden and transmission delay.
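  • A minimal sketch of the steady-state phase described above: once the shared symmetric key is established via the initial (asymmetric, ECIES-style) setup, each application message can be protected with a cheap symmetric operation. AES-GCM is assumed here purely for illustration; this text does not specify the symmetric cipher.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def protect_app_message(shared_key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    # Per-message symmetric protection with the key shared at connection setup.
    nonce = os.urandom(12)
    return nonce, AESGCM(shared_key).encrypt(nonce, plaintext, None)

def unprotect_app_message(shared_key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    return AESGCM(shared_key).decrypt(nonce, ciphertext, None)
```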
  • FIG. 1 shows an example of a 5G usage scenario to which the technical features of the present invention can be applied.
  • FIG. 2 is a view showing a vehicle according to an embodiment of the present invention.
  • FIG. 3 is a control block diagram of a vehicle according to an embodiment of the present invention.
  • FIG. 4 is a control block diagram of an autonomous driving device according to an embodiment of the present invention.
  • FIG. 5 is a signal flow diagram of an autonomous vehicle according to an embodiment of the present invention.
  • FIG. 6 is a view showing the interior of a vehicle according to an embodiment of the present invention.
  • FIG. 7 is a block diagram referenced to describe a vehicle cabin system according to an embodiment of the present invention.
  • FIG. 8 is a diagram referenced for explaining a usage scenario of a user according to an embodiment of the present invention.
  • FIG. 9 shows the structure of an LTE system to which the present invention can be applied.
  • FIG. 10 shows a radio protocol architecture for a user plane to which the present invention can be applied.
  • FIG. 11 shows a radio protocol structure for a control plane to which the present invention can be applied.
  • FIG. 12 shows the structure of an NR system to which the present invention can be applied.
  • FIG. 13 shows functional division between NG-RAN and 5GC to which the present invention can be applied.
  • FIG. 14 shows the structure of an NR radio frame to which the present invention can be applied.
  • FIG. 15 shows a slot structure of an NR frame to which the present invention can be applied.
  • FIG. 16 shows a protocol stack for sidelink communication to which the present invention can be applied.
  • FIG. 17 shows a protocol stack for sidelink communication to which the present invention can be applied.
  • FIG. 18 shows an example of a unit of time resource in which a sidelink synchronization signal to which the present invention can be applied is transmitted.
  • FIG. 19 shows a terminal performing V2X or sidelink communication to which the present invention can be applied.
  • TM: transmission mode
  • FIG. 22 shows an example in which a transmission resource to which the present invention can be applied is selected.
  • FIG. 23 shows an example in which a PSCCH is transmitted in sidelink transmission mode 1 or 2 to which the present invention can be applied.
  • FIG. 24 shows an example in which a PSCCH is transmitted in sidelink transmission mode 3 or 4 to which the present invention can be applied.
  • FIG. 25 shows an example of physical layer processing at a transmission side to which the present invention can be applied.
  • FIG. 26 shows an example of physical layer processing at a receiving side to which the present invention can be applied.
  • FIG. 27 shows a synchronization source or a synchronization reference in V2X to which the present invention can be applied.
  • FIG. 28 shows an example of a scenario in which a BWP to which the present invention can be applied is set.
  • FIG. 29 schematically shows an example of a reference model of cellular-V2X (C-V2X).
  • FIG. 31 is a diagram showing the types of V2X operation.
  • FIG. 32 schematically shows an example of a V2X radio resource pool.
  • FIG. 33 is a flowchart illustrating an example of a radio resource reselection method of a communication device performing a V2X operation.
  • FIG. 35 shows an example of a secure message format.
  • FIG. 39 is a flowchart of a one-to-one sidelink connection method performed by a terminal according to an embodiment of the present invention.
  • FIG. 40 shows a wireless communication device according to an embodiment of the present invention.
  • FIG. 41 shows a wireless communication device according to an embodiment of the present invention.
  • FIG. 42 illustrates a transceiver of a wireless communication device according to an embodiment of the present invention.
  • FIG. 43 illustrates a transceiver of a wireless communication device according to an embodiment of the present invention.
  • FIG. 44 illustrates an operation of a wireless device related to sidelink communication according to an embodiment of the present invention.
  • FIG. 45 illustrates an operation of a network node related to a sidelink according to an embodiment of the present invention.
  • FIG. 46 shows an implementation of a wireless device and a network node according to an embodiment of the present invention.
  • A/B may mean “A and/or B”.
  • A, B may mean “A and/or B”.
  • A/B/C may mean “at least one of A, B and/or C”.
  • A, B, C may mean “at least one of A, B and/or C”.
  • FIG. 2 is a view showing a vehicle according to an embodiment of the present invention.
  • a vehicle 10 is defined as a transportation means traveling on a road or track.
  • the vehicle 10 is a concept including a car, a train, and a motorcycle.
  • the vehicle 10 may be a concept including both an internal combustion engine vehicle including an engine as a power source, a hybrid vehicle including an engine and an electric motor as a power source, and an electric vehicle including an electric motor as a power source.
  • the vehicle 10 may be a vehicle owned by an individual.
  • the vehicle 10 may be a shared vehicle.
  • the vehicle 10 may be an autonomous vehicle.
  • FIG. 3 is a control block diagram of a vehicle according to an embodiment of the present invention.
  • the vehicle 10 includes a user interface device 200, an object detection device 210, a communication device 220, a driving operation device 230, a main ECU 240, a drive control device 250, an autonomous driving device 260, a sensing unit 270, and a location data generating device 280.
  • each of the user interface device 200, the object detection device 210, the communication device 220, the driving operation device 230, the main ECU 240, the drive control device 250, the autonomous driving device 260, the sensing unit 270, and the location data generating device 280 may be implemented as an electronic device that generates an electrical signal and exchanges electrical signals with the others.
  • the user interface device 200 is a device for communicating with the vehicle 10 and a user.
  • the user interface device 200 may receive a user input and provide information generated in the vehicle 10 to the user.
  • the vehicle 10 may implement a user interface (UI) or a user experience (UX) through the user interface device 200.
  • the user interface device 200 may include an input device, an output device, and a user monitoring device.
  • the object detection device 210 may generate information on an object outside the vehicle 10.
  • the information on the object may include at least one of information on the existence of the object, location information of the object, distance information between the vehicle 10 and the object, and relative speed information between the vehicle 10 and the object.
  • the object detection device 210 may detect an object outside the vehicle 10.
  • the object detection device 210 may include at least one sensor capable of detecting an object outside the vehicle 10.
  • the object detection device 210 may include at least one of a camera, a radar, a lidar, an ultrasonic sensor, and an infrared sensor.
  • the object detection device 210 may provide data on an object generated based on a sensing signal generated by a sensor to at least one electronic device included in the vehicle.
  • the camera may generate information on an object outside the vehicle 10 by using the image.
  • the camera may include at least one lens, at least one image sensor, and at least one processor that is electrically connected to the image sensor and processes a received signal, and generates data about an object based on the processed signal.
  • the camera may be at least one of a mono camera, a stereo camera, and an AVM (Around View Monitoring) camera.
  • the camera may use various image processing algorithms to obtain position information of an object, distance information to an object, or relative speed information with respect to an object. For example, the camera may acquire distance information and relative speed information from the acquired image based on the change in the size of the object over time. For example, the camera may obtain distance information and relative speed information with respect to an object through a pinhole model, road surface profiling, or the like. For example, the camera may obtain distance information and relative speed information with respect to an object based on disparity information from a stereo image obtained from a stereo camera.
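  • A minimal sketch of the distance cues mentioned above, with assumed camera parameters: stereo disparity gives Z = f * B / d, a pinhole model relates an object's pixel height to its distance, and relative speed follows from successive distance estimates.

```python
def distance_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    # Stereo camera: Z = f * B / d (focal length in pixels, baseline in meters).
    return focal_px * baseline_m / disparity_px

def distance_from_object_height(focal_px: float, real_height_m: float, pixel_height: float) -> float:
    # Pinhole model: pixel_height = focal_px * real_height_m / Z, solved for Z.
    return focal_px * real_height_m / pixel_height

def relative_speed(z_prev_m: float, z_now_m: float, dt_s: float) -> float:
    # Negative value means the object is approaching the vehicle.
    return (z_now_m - z_prev_m) / dt_s
```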
  • the camera may be mounted in a position where field of view (FOV) can be secured in the vehicle in order to photograph the outside of the vehicle.
  • the camera may be placed in the interior of the vehicle, close to the front windshield, to acquire an image of the front of the vehicle.
  • the camera can be placed around the front bumper or radiator grille.
  • the camera may be placed close to the rear glass, in the interior of the vehicle, to acquire an image of the rear of the vehicle.
  • the camera can be placed around the rear bumper, trunk or tailgate.
  • the camera may be disposed in proximity to at least one of the side windows in the interior of the vehicle in order to acquire an image of the side of the vehicle.
  • the camera may be disposed around a side mirror, a fender, or a door.
  • the radar may generate information on an object outside the vehicle 10 using radio waves.
  • the radar may include at least one processor that is electrically connected to the electromagnetic wave transmitter, the electromagnetic wave receiver, and the electromagnetic wave transmitter and the electromagnetic wave receiver, processes a received signal, and generates data for an object based on the processed signal.
  • the radar may be implemented in a pulse radar method or a continuous wave radar method according to the principle of radio wave emission.
  • the radar may be implemented in a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method according to the signal waveform, among continuous wave radar methods.
  • the radar detects an object by means of electromagnetic waves using a time of flight (TOF) method or a phase-shift method, and detects the position of the detected object, the distance to the detected object, and the relative speed.
  • the radar may be placed at a suitable location outside of the vehicle to detect objects located in front, rear or side of the vehicle.
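  • A minimal sketch of the range computations implied by the TOF and FMCW descriptions above: time-of-flight gives R = c * t / 2, and for an FMCW chirp the beat frequency gives R = c * f_beat * T_chirp / (2 * B). Parameter values are illustrative only.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_s: float) -> float:
    # Time-of-flight: the wave travels to the target and back, so divide by 2.
    return C * round_trip_s / 2.0

def range_from_fmcw_beat(beat_hz: float, chirp_duration_s: float, sweep_bandwidth_hz: float) -> float:
    # FMCW: beat frequency f_b = 2 * R * B / (c * T), solved for range R.
    return C * beat_hz * chirp_duration_s / (2.0 * sweep_bandwidth_hz)
```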
  • the lidar may generate information on an object outside the vehicle 10 using laser light.
  • the lidar may include an optical transmitter, an optical receiver, and at least one processor that is electrically connected to the optical transmitter and the optical receiver, processes a received signal, and generates data for an object based on the processed signal.
  • the lidar may be implemented in a time of flight (TOF) method or a phase-shift method.
  • the lidar can be implemented as a driven or non-driven type. When implemented as a driven type, the lidar is rotated by a motor and can detect objects around the vehicle 10. When implemented as a non-driven type, the lidar can detect an object located within a predetermined range with respect to the vehicle by optical steering.
  • the vehicle 10 may include a plurality of non-driven lidars.
  • the lidar detects an object based on a time of flight (TOF) method or a phase-shift method by means of laser light, and can detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the lidar may be placed at an appropriate location outside the vehicle to detect objects located in front, rear or side of the vehicle.
  • the communication device 220 may exchange signals with devices located outside the vehicle 10.
  • the communication device 220 may exchange signals with at least one of an infrastructure (eg, a server, a broadcasting station), another vehicle, and a terminal.
  • the communication device 220 may include at least one of a transmission antenna, a reception antenna, a radio frequency (RF) circuit capable of implementing various communication protocols, and an RF element to perform communication.
  • the communication device may exchange signals with external devices based on C-V2X (Cellular V2X) technology.
  • C-V2X technology may include LTE-based sidelink communication and/or NR-based sidelink communication. Contents related to C-V2X will be described later.
  • the communication device can exchange signals with external devices based on DSRC (Dedicated Short Range Communications) technology, which is based on IEEE 802.11p PHY/MAC layer technology and IEEE 1609 Network/Transport layer technology, or based on the WAVE (Wireless Access in Vehicular Environment) standard.
  • DSRC technology may use a frequency in the 5.9 GHz band and may be a communication method having a data transmission rate of 3 Mbps to 27 Mbps.
  • IEEE 802.11p technology can be combined with IEEE 1609 technology to support DSRC technology (or WAVE standard).
  • the communication apparatus of the present invention can exchange signals with an external device using only either C-V2X technology or DSRC technology.
  • the communication device of the present invention may exchange signals with external devices by hybridizing C-V2X technology and DSRC technology.
  • the driving operation device 230 is a device that receives a user input for driving. In the case of the manual mode, the vehicle 10 may be driven based on a signal provided by the driving operation device 230.
  • the driving operation device 230 may include a steering input device (eg, a steering wheel), an acceleration input device (eg, an accelerator pedal), and a brake input device (eg, a brake pedal).
  • the main ECU 240 may control the overall operation of at least one electronic device provided in the vehicle 10.
  • the drive control device 250 is a device that electrically controls various vehicle drive devices in the vehicle 10.
  • the drive control device 250 may include a power train drive control device, a chassis drive control device, a door/window drive control device, a safety device drive control device, a lamp drive control device, and an air conditioning drive control device.
  • the power train drive control device may include a power source drive control device and a transmission drive control device.
  • the chassis drive control device may include a steering drive control device, a brake drive control device, and a suspension drive control device.
  • the safety device driving control device may include a safety belt driving control device for controlling the safety belt.
  • the drive control device 250 includes at least one electronic control device (e.g., a control ECU (Electronic Control Unit)).
  • the drive control device 250 may control the vehicle driving devices based on a signal received from the autonomous driving device 260.
  • the drive control device 250 may control a power train, a steering device, and a brake device based on a signal received from the autonomous driving device 260.
  • the autonomous driving device 260 may generate a path for autonomous driving based on the acquired data.
  • the autonomous driving device 260 may generate a driving plan for driving along the generated route.
  • the autonomous driving device 260 may generate a signal for controlling the movement of the vehicle according to the driving plan.
  • the autonomous driving device 260 may provide the generated signal to the driving control device 250.
  • the autonomous driving device 260 may implement at least one ADAS (Advanced Driver Assistance System) function.
  • the ADAS may implement at least one of Adaptive Cruise Control (ACC), Autonomous Emergency Braking (AEB), Forward Collision Warning (FCW), Lane Keeping Assist (LKA), Lane Change Assist (LCA), Target Following Assist (TFA), Blind Spot Detection (BSD), Adaptive High Beam Control (HBA: High Beam Assist), Auto Parking System (APS), PD collision warning system, Traffic Sign Recognition (TSR), Traffic Sign Assist (TSA), Night Vision (NV), Driver Status Monitoring (DSM), and Traffic Jam Assist (TJA).
  • the autonomous driving device 260 may perform a switching operation from an autonomous driving mode to a manual driving mode or from a manual driving mode to an autonomous driving mode. For example, the autonomous driving device 260 may switch the mode of the vehicle 10 from the autonomous driving mode to the manual driving mode, or from the manual driving mode to the autonomous driving mode, based on a signal received from the user interface device 200.
  • the sensing unit 270 may sense the state of the vehicle.
  • the sensing unit 270 may include at least one of an inertial measurement unit (IMU) sensor, a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight detection sensor, a heading sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor, a temperature sensor, a humidity sensor, an ultrasonic sensor, an illuminance sensor, and a pedal position sensor. Meanwhile, the IMU sensor may include one or more of an acceleration sensor, a gyro sensor, and a magnetic sensor.
  • the sensing unit 270 may generate state data of the vehicle based on a signal generated by at least one sensor.
  • the vehicle state data may be information generated based on data sensed by various sensors provided inside the vehicle.
  • the sensing unit 270 may generate vehicle attitude data, vehicle motion data, vehicle yaw data, vehicle roll data, vehicle pitch data, vehicle collision data, vehicle direction data, vehicle angle data, vehicle speed data, and the like.
  • the location data generating device 280 may generate location data of the vehicle 10.
  • the location data generating apparatus 280 may include at least one of a Global Positioning System (GPS) and a Differential Global Positioning System (DGPS).
  • the location data generating apparatus 280 may generate location data of the vehicle 10 based on a signal generated by at least one of GPS and DGPS.
  • the location data generating device 280 may correct the location data based on at least one of an IMU (Inertial Measurement Unit) of the sensing unit 270 and a camera of the object detection device 210.
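  • A rough sketch (an assumption, not the patent's method) of how GPS/DGPS fixes could be blended with IMU-based dead reckoning to correct the location data, using a simple complementary filter in a local 2D frame.

```python
def dead_reckon(pos_xy, vel_xy, accel_xy, dt):
    # Propagate position and velocity from IMU accelerations over a small time step.
    vx = vel_xy[0] + accel_xy[0] * dt
    vy = vel_xy[1] + accel_xy[1] * dt
    return (pos_xy[0] + vx * dt, pos_xy[1] + vy * dt), (vx, vy)

def fuse_with_gnss(predicted_xy, gnss_xy, gnss_weight=0.2):
    # Complementary filter: keep the smooth IMU prediction, nudge it toward the GNSS fix.
    return (predicted_xy[0] + gnss_weight * (gnss_xy[0] - predicted_xy[0]),
            predicted_xy[1] + gnss_weight * (gnss_xy[1] - predicted_xy[1]))
```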
  • the location data generating device 280 may be referred to as a Global Navigation Satellite System (GNSS).
  • Vehicle 10 may include an internal communication system 50.
  • a plurality of electronic devices included in the vehicle 10 may exchange signals through the internal communication system 50.
  • the signal may contain data.
  • the internal communication system 50 may use at least one communication protocol (eg, CAN, LIN, FlexRay, MOST, Ethernet).
  • FIG. 4 is a control block diagram of an autonomous driving device according to an embodiment of the present invention.
  • the autonomous driving device 260 may include a memory 140, a processor 170, an interface unit 180, and a power supply unit 190.
  • the memory 140 is electrically connected to the processor 170.
  • the memory 140 may store basic data for a unit, control data for controlling the operation of the unit, and input/output data.
  • the memory 140 may store data processed by the processor 170.
  • the memory 140 may be configured with at least one of ROM, RAM, EPROM, flash drive, and hard drive.
  • the memory 140 may store various data for the overall operation of the autonomous driving device 260, such as a program for processing or controlling the processor 170.
  • the memory 140 may be implemented integrally with the processor 170. Depending on the embodiment, the memory 140 may be classified as a sub-element of the processor 170.
  • the interface unit 180 may exchange signals with at least one electronic device provided in the vehicle 10 by wire or wirelessly.
  • the interface unit 180 may exchange a signal with at least one of the object detection device 210, the communication device 220, the driving operation device 230, the main ECU 240, the drive control device 250, the sensing unit 270, and the location data generating device 280 by wire or wirelessly.
  • the interface unit 180 may be configured with at least one of a communication module, a terminal, a pin, a cable, a port, a circuit, an element, and a device.
  • the power supply unit 190 may supply power to the autonomous driving device 260.
  • the power supply unit 190 may receive power from a power source (eg, a battery) included in the vehicle 10 and supply power to each unit of the autonomous driving device 260.
  • the power supply unit 190 may be operated according to a control signal provided from the main ECU 240.
  • the power supply unit 190 may include a switched-mode power supply (SMPS).
  • the processor 170 may be electrically connected to the memory 140, the interface unit 180, and the power supply unit 190 to exchange signals.
  • the processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • the processor 170 may be driven by power provided from the power supply unit 190.
  • the processor 170 may receive data, process data, generate a signal, and provide a signal while power is supplied by the power supply unit 190.
  • the processor 170 may receive information from another electronic device in the vehicle 10 through the interface unit 180.
  • the processor 170 may provide a control signal to another electronic device in the vehicle 10 through the interface unit 180.
  • the autonomous driving device 260 may include at least one printed circuit board (PCB).
  • the memory 140, the interface unit 180, the power supply unit 190, and the processor 170 may be electrically connected to a printed circuit board.
  • FIG. 5 is a signal flow diagram of an autonomous vehicle according to an embodiment of the present invention.
  • the processor 170 may perform a reception operation.
  • the processor 170 may receive data from at least one of the object detection device 210, the communication device 220, the sensing unit 270, and the location data generation device 280 through the interface unit 180.
  • the processor 170 may receive object data from the object detection apparatus 210.
  • the processor 170 may receive HD map data from the communication device 220.
  • the processor 170 may receive vehicle state data from the sensing unit 270.
  • the processor 170 may receive location data from the location data generating device 280.
  • the processor 170 may perform a processing/determining operation.
  • the processor 170 may perform a processing/determining operation based on the driving situation information.
  • the processor 170 may perform a processing/decision operation based on at least one of object data, HD map data, vehicle state data, and location data.
  • the processor 170 may generate driving plan data.
  • the processor 170 may generate electronic horizon data.
  • the electronic horizon data may be understood as driving plan data within a range from a point where the vehicle 10 is located to a horizon.
  • the horizon may be understood as a point in front of a preset distance from a point where the vehicle 10 is located based on a preset driving route.
  • Horizon may mean a point at which the vehicle 10 can reach after a predetermined time from a point at which the vehicle 10 is located along a preset driving route.
  • the electronic horizon data may include horizon map data and horizon pass data.
  • the horizon map data may include at least one of topology data, road data, HD map data, and dynamic data.
  • the horizon map data may include a plurality of layers.
  • the horizon map data may include a first layer matching topology data, a second layer matching road data, a third layer matching HD map data, and a fourth layer matching dynamic data.
  • the horizon map data may further include static object data.
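  • The layered horizon map data described above can be pictured as a simple container with one slot per layer, as in the sketch below; the types are illustrative placeholders rather than a defined map format.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class HorizonMapData:
    topology_layer: Dict[str, Any] = field(default_factory=dict)   # first layer: topology data
    road_layer: Dict[str, Any] = field(default_factory=dict)       # second layer: road data
    hd_map_layer: Dict[str, Any] = field(default_factory=dict)     # third layer: HD map data
    dynamic_layer: Dict[str, Any] = field(default_factory=dict)    # fourth layer: dynamic data
    static_objects: List[Dict[str, Any]] = field(default_factory=list)  # optional static object data
```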
  • Topology data can be described as a map created by connecting the center of the road.
  • the topology data is suitable for roughly indicating the position of the vehicle, and may be in the form of data mainly used in a navigation for a driver.
  • the topology data may be understood as data about road information excluding information about a lane.
  • the topology data may be generated based on data received from an external server through the communication device 220.
  • the topology data may be based on data stored in at least one memory provided in the vehicle 10.
  • the road data may include at least one of slope data of a road, curvature data of a road, and speed limit data of a road.
  • the road data may further include overtaking prohibited section data.
  • Road data may be based on data received from an external server through the communication device 220.
  • the road data may be based on data generated by the object detection apparatus 210.
  • the HD map data may include detailed lane-level topology information of the road, connection information of each lane, and feature information for localization of the vehicle (e.g., traffic signs, lane marking/attributes, road furniture, etc.).
  • the HD map data may be based on data received from an external server through the communication device 220.
  • the dynamic data may include various dynamic information that may be generated on a road.
  • the dynamic data may include construction information, variable speed lane information, road surface condition information, traffic information, moving object information, and the like.
  • the dynamic data may be based on data received from an external server through the communication device 220.
  • the dynamic data may be based on data generated by the object detection apparatus 210.
  • the processor 170 may provide map data within a range from the point where the vehicle 10 is located to the horizon.
  • the horizon pass data may be described as a trajectory that the vehicle 10 can take within a range from the point where the vehicle 10 is located to the horizon.
  • the horizon pass data may include data representing a relative probability of selecting any one road from a decision point (eg, a crossroads, a junction, an intersection, etc.).
  • the relative probability may be calculated based on the time it takes to reach the final destination. For example, at the decision point, if the time required to reach the final destination when the first road is selected is shorter than when the second road is selected, the probability of selecting the first road may be calculated to be higher than the probability of selecting the second road.
  • Horizon pass data may include a main pass and a sub pass.
  • the main path can be understood as a trajectory connecting roads with a high relative probability to be selected.
  • the sub-path may be branched at at least one decision point on the main path.
  • the sub-path may be understood as a trajectory connecting at least one road having a low relative probability of being selected from at least one decision point on the main path.
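  • A minimal sketch of the selection-probability idea above: at a decision point, roads with a shorter estimated time to the final destination receive a higher relative probability (a softmax over negative travel times is assumed here purely for illustration), and the highest-probability branch forms the main path.

```python
import math
from typing import Dict

def branch_probabilities(eta_by_road_s: Dict[str, float], temperature_s: float = 60.0) -> Dict[str, float]:
    # Shorter estimated time to the final destination -> higher relative probability.
    scores = {road: math.exp(-eta / temperature_s) for road, eta in eta_by_road_s.items()}
    total = sum(scores.values())
    return {road: s / total for road, s in scores.items()}

def main_path_choice(eta_by_road_s: Dict[str, float]) -> str:
    # The road with the highest relative probability lies on the main path.
    probs = branch_probabilities(eta_by_road_s)
    return max(probs, key=probs.get)

# Example: the road with the shorter ETA gets the higher probability.
print(branch_probabilities({"road_1": 300.0, "road_2": 420.0}))
```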
  • the processor 170 may perform a control signal generation operation.
  • the processor 170 may generate a control signal based on electronic horizon data.
  • the processor 170 may generate at least one of a powertrain control signal, a brake device control signal, and a steering device control signal based on the electronic horizon data.
  • the processor 170 may transmit the generated control signal to the driving control device 250 through the interface unit 180.
  • the drive control device 250 may transmit a control signal to at least one of the power train 251, the brake device 252, and the steering device 253.
  • FIG. 6 is a view showing the interior of a vehicle according to an embodiment of the present invention.
  • FIG. 7 is a block diagram referenced to describe a vehicle cabin system according to an embodiment of the present invention.
  • the vehicle cabin system 300 (hereinafter, the cabin system) may be defined as a convenience system for a user using the vehicle 10.
  • the cabin system 300 may be described as a top-level system including a display system 350, a cargo system 355, a seat system 360, and a payment system 365.
  • the cabin system 300 includes a main controller 370, a memory 340, an interface unit 380, a power supply unit 390, an input device 310, an imaging device 320, a communication device 330, a display system 350, a cargo system 355, a seat system 360, and a payment system 365.
  • the cabin system 300 may further include other components in addition to the components described herein, or may not include some of the described components.
  • the main controller 370 may be electrically connected to the input device 310, the communication device 330, the display system 350, the cargo system 355, the seat system 360, and the payment system 365 to exchange signals.
  • the main controller 370 may control the input device 310, the communication device 330, the display system 350, the cargo system 355, the seat system 360, and the payment system 365.
  • the main controller 370 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
  • the main controller 370 may be configured with at least one sub-controller. Depending on the embodiment, the main controller 370 may include a plurality of sub-controllers. Each of the plurality of sub-controllers may individually control devices and systems included in the grouped cabin system 300. Devices and systems included in the cabin system 300 may be grouped by function or may be grouped based on seatable seats.
  • the main controller 370 may include at least one processor 371. Although FIG. 7 illustrates that the main controller 370 includes one processor 371, the main controller 370 may include a plurality of processors. The processor 371 may be classified as one of the above-described sub-controllers.
  • the processor 371 may receive signals, information, or data from a user terminal through the communication device 330.
  • the user terminal may transmit signals, information, or data to the cabin system 300.
  • the processor 371 may specify a user based on image data received from at least one of an internal camera and an external camera included in the imaging device.
  • the processor 371 may specify a user by applying an image processing algorithm to image data.
  • the processor 371 may compare information received from the user terminal with image data to identify a user.
  • the information may include at least one of route information, body information, passenger information, luggage information, location information, preferred content information, preferred food information, disability information, and usage history information of the user.
  • the main controller 370 may include an artificial intelligence agent 372.
  • the artificial intelligence agent 372 may perform machine learning based on data acquired through the input device 310.
  • the artificial intelligence agent 372 may control at least one of the display system 350, the cargo system 355, the seat system 360, and the payment system 365 based on the machine learning result.
  • the memory 340 is electrically connected to the main controller 370.
  • the memory 340 may store basic data for a unit, control data for controlling the operation of the unit, and input/output data.
  • the memory 340 may store data processed by the main controller 370.
  • the memory 340 may be configured with at least one of ROM, RAM, EPROM, flash drive, and hard drive.
  • the memory 340 may store various data for overall operation of the cabin system 300, such as a program for processing or controlling the main controller 370.
  • the memory 340 may be implemented integrally with the main controller 370.
  • the interface unit 380 may exchange signals with at least one electronic device provided in the vehicle 10 by wire or wirelessly.
  • the interface unit 380 may be composed of at least one of a communication module, a terminal, a pin, a cable, a port, a circuit, an element, and a device.
  • the power supply unit 390 may supply power to the cabin system 300.
  • the power supply unit 390 may receive power from a power source (eg, a battery) included in the vehicle 10 and supply power to each unit of the cabin system 300.
  • the power supply unit 390 may be operated according to a control signal provided from the main controller 370.
  • the power supply unit 390 may be implemented as a switched-mode power supply (SMPS).
  • the cabin system 300 may include at least one printed circuit board (PCB).
  • the main controller 370, the memory 340, the interface unit 380, and the power supply unit 390 may be mounted on at least one printed circuit board.
  • the input device 310 may receive a user input.
  • the input device 310 may convert a user input into an electrical signal.
  • the electrical signal converted by the input device 310 may be converted into a control signal and provided to at least one of the display system 350, the cargo system 355, the seat system 360, and the payment system 365.
  • At least one processor included in the main controller 370 or the cabin system 300 may generate a control signal based on an electrical signal received from the input device 310.
  • the input device 310 may include at least one of a touch input unit, a gesture input unit, a mechanical input unit, and a voice input unit.
  • the touch input unit may convert a user's touch input into an electrical signal.
  • the touch input unit may include at least one touch sensor to detect a user's touch input.
  • the touch input unit is integrally formed with at least one display included in the display system 350, thereby implementing a touch screen. Such a touch screen may provide an input interface and an output interface between the cabin system 300 and a user.
  • the gesture input unit may convert a user's gesture input into an electrical signal.
  • the gesture input unit may include at least one of an infrared sensor and an image sensor for detecting a user's gesture input.
  • the gesture input unit may detect a user's 3D gesture input.
  • the gesture input unit may include a light output unit that outputs a plurality of infrared rays, or a plurality of image sensors.
  • the gesture input unit may detect a user's 3D gesture input through a time of flight (TOF) method, a structured light method, or a disparity method.
  • the mechanical input unit may convert a user's physical input (eg, pressing or rotating) through a mechanical device into an electrical signal.
  • the mechanical input unit may include at least one of a button, a dome switch, a jog wheel, and a jog switch. Meanwhile, the gesture input unit and the mechanical input unit may be integrally formed.
  • the input device 310 may include a gesture sensor, and may include a jog dial device formed to be retractable into a portion of a surrounding structure (for example, at least one of a seat, an armrest, and a door).
  • when the jog dial device is in a state flush with the surrounding structure, the jog dial device may function as a gesture input unit.
  • when the jog dial device protrudes from the surrounding structure, the jog dial device may function as a mechanical input unit.
  • the voice input unit may convert a user's voice input into an electrical signal.
  • the voice input unit may include at least one microphone.
  • the voice input unit may include a beamforming microphone.
  • the imaging device 320 may include at least one camera.
  • the imaging device 320 may include at least one of an internal camera and an external camera.
  • the internal camera can take an image inside the cabin.
  • the external camera may capture an image outside the vehicle.
  • the internal camera can acquire an image in the cabin.
  • the imaging device 320 may include at least one internal camera. It is preferable that the imaging device 320 includes a number of cameras corresponding to the number of passengers capable of boarding.
  • the imaging device 320 may provide an image acquired by an internal camera.
  • At least one processor included in the main controller 370 or the cabin system 300 may detect the user's motion based on the image acquired by the internal camera, generate a signal based on the detected motion, and provide the signal to at least one of the display system 350, the cargo system 355, the seat system 360, and the payment system 365.
  • the external camera may acquire an image outside the vehicle.
  • the imaging device 320 may include at least one external camera. It is preferable that the imaging device 320 includes a number of cameras corresponding to the number of boarding doors.
  • the imaging device 320 may provide an image acquired by an external camera.
  • At least one processor included in the main controller 370 or the cabin system 300 may acquire user information based on an image acquired by an external camera.
  • At least one processor included in the main controller 370 or the cabin system 300 may authenticate the user based on the user information, or may acquire the user's body information (for example, height information, weight information, etc.), fellow passenger information, the user's luggage information, and the like.
  • the communication device 330 can wirelessly exchange signals with an external device.
  • the communication device 330 may exchange signals with an external device through a network network or may directly exchange signals with an external device.
  • the external device may include at least one of a server, a mobile terminal, and another vehicle.
  • the communication device 330 may exchange signals with at least one user terminal.
  • the communication device 330 may include at least one of an antenna, a radio frequency (RF) circuit capable of implementing at least one communication protocol, and an RF element in order to perform communication.
  • the communication device 330 may use a plurality of communication protocols.
  • the communication device 330 may switch the communication protocol according to the distance to the mobile terminal.
  • the communication device may exchange signals with external devices based on C-V2X (Cellular V2X) technology.
  • C-V2X technology may include LTE-based sidelink communication and/or NR-based sidelink communication. Contents related to C-V2X will be described later.
  • the communication device may exchange signals with external devices based on Dedicated Short Range Communications (DSRC) technology, which combines the IEEE 802.11p PHY/MAC layer technology with the IEEE 1609 Network/Transport layer technology, or based on the Wireless Access in Vehicular Environment (WAVE) standard.
  • ITS Intelligent Transport System
  • DSRC technology may use a frequency of 5.9GHz band, and may be a communication method having a data transmission rate of 3Mbps ⁇ 27Mbps.
  • IEEE 802.11p technology can be combined with IEEE 1609 technology to support DSRC technology (or WAVE standard).
  • the communication apparatus of the present invention can exchange signals with an external device using only either C-V2X technology or DSRC technology.
  • the communication device of the present invention may exchange signals with external devices by hybridizing C-V2X technology and DSRC technology.
  • the display system 350 may display a graphic object.
  • the display system 350 may include at least one display device.
  • the display system 350 may include a first display device 410 that can be commonly used and a second display device 420 that can be used individually.
  • the first display device 410 may include at least one display 411 that outputs visual content.
  • the display 411 included in the first display device 410 may be implemented as at least one of a flat panel display, a curved display, a rollable display, and a flexible display.
  • the first display device 410 may include a first display 411 positioned at the rear of a seat and formed to be in and out of a cabin, and a first mechanism for moving the first display 411.
  • the first display 411 may be disposed in a slot formed in the main frame of the seat so as to be retractable.
  • the first display device 410 may further include a flexible area control mechanism.
  • the first display may be formed to be flexible, and the flexible area of the first display may be adjusted according to the user's position.
  • the first display device 410 may include a second display positioned on a ceiling in a cabin and formed to be rollable, and a second mechanism for winding or unwinding the second display.
  • the second display may be formed to enable screen output on both sides.
  • the first display device 410 may include a third display positioned on a ceiling in a cabin and formed to be flexible, and a third mechanism for bending or unfolding the third display.
  • the display system 350 may further include at least one processor that provides a control signal to at least one of the first display device 410 and the second display device 420.
  • the processor included in the display system 350 may generate a control signal based on a signal received from at least one of the main controller 370, the input device 310, the imaging device 320, and the communication device 330.
  • the display area of the display included in the first display device 410 may be divided into a first area 411a and a second area 411b.
  • the first area 411a may be defined as a content display area.
  • the first area 411a may display at least one of entertainment contents (e.g., movies, sports, shopping, music, etc.), a video conference, a food menu, and a graphic object corresponding to an augmented reality screen.
  • the first area 411a may display a graphic object corresponding to driving situation information of the vehicle 10.
  • the driving situation information may include at least one of object information outside the vehicle, navigation information, and vehicle status information.
  • the object information outside the vehicle may include information on the presence or absence of the object, location information of the object, distance information between the vehicle 300 and the object, and relative speed information between the vehicle 300 and the object.
  • the navigation information may include at least one of map information, set destination information, route information according to the destination setting, information on various objects on the route, lane information, and current location information of the vehicle.
  • the vehicle status information includes vehicle attitude information, vehicle speed information, vehicle tilt information, vehicle weight information, vehicle direction information, vehicle battery information, vehicle fuel information, vehicle tire pressure information, vehicle steering information , Vehicle interior temperature information, vehicle interior humidity information, pedal position information, vehicle engine temperature information, and the like.
  • the second area 411b may be defined as a user interface area.
  • the second area 411b may output an artificial intelligence agent screen.
  • the second area 411b may be located in an area divided by a seat frame. In this case, the user can view the content displayed in the second area 411b between the plurality of seats.
  • the first display device 410 may provide holographic content.
  • the first display device 410 may provide holographic content for each of a plurality of users so that only a user who has requested the content can view the content.
  • the second display device 420 may include at least one display 421.
  • the second display device 420 may provide the display 421 at a location where only individual passengers can check the display contents.
  • the display 421 may be disposed on the arm rest of the seat.
  • the second display device 420 may display a graphic object corresponding to the user's personal information.
  • the second display device 420 may include a number of displays 421 corresponding to the number of persons allowed to ride.
  • the second display device 420 may implement a touch screen by forming a layer structure with a touch sensor or being formed integrally with a touch sensor.
  • the second display device 420 may display a graphic object for receiving a user input for seat adjustment or room temperature adjustment.
  • the cargo system 355 may provide a product to a user according to a user's request.
  • the cargo system 355 may be operated based on an electrical signal generated by the input device 310 or the communication device 330.
  • the cargo system 355 may include a cargo box.
  • the cargo box may be concealed in a portion of the lower portion of the seat while the goods are loaded.
  • the cargo box may be exposed into the cabin.
  • the user can select a necessary product among the items loaded in the exposed cargo box.
  • the cargo system 355 may include a sliding moving mechanism and a product pop-up mechanism to expose a cargo box according to a user input.
  • the cargo system 355 may include a plurality of cargo boxes to provide various types of goods.
  • a weight sensor for determining whether each product has been provided may be built into the cargo box.
  • the seat system 360 may provide the user with a seat customized to the user.
  • the seat system 360 may be operated based on an electrical signal generated by the input device 310 or the communication device 330.
  • the seat system 360 may adjust at least one element of the seat based on the acquired user body data.
  • the seat system 360 may include a user detection sensor (eg, a pressure sensor) to determine whether the user is seated.
  • the seat system 360 may include a plurality of seats on which a plurality of users can respectively sit. Any one of the plurality of seats may be disposed to face at least another one. At least two users inside the cabin may sit facing each other.
  • the payment system 365 may provide a payment service to a user.
  • the payment system 365 may be operated based on an electrical signal generated by the input device 310 or the communication device 330.
  • the payment system 365 may calculate a price for at least one service used by the user and request that the calculated price be paid.
  • FIG. 8 is a diagram referenced for explaining a usage scenario of a user according to an embodiment of the present invention.
  • the first scenario S111 is a user's destination prediction scenario.
  • the user terminal may install an application capable of interworking with the cabin system 300.
  • the user terminal may predict the user's destination through the application, based on user's contextual information.
  • the user terminal may provide information on empty seats in the cabin through an application.
  • the second scenario S112 is a cabin interior layout preparation scenario.
  • the cabin system 300 may further include a scanning device for acquiring data on a user located outside the vehicle 300.
  • the scanning device may scan the user to obtain body data and baggage data of the user.
  • the user's body data and baggage data can be used to set the layout.
  • the user's body data may be used for user authentication.
  • the scanning device may include at least one image sensor.
  • the image sensor may acquire a user image by using light in the visible or infrared band.
  • the seat system 360 may set a layout in the cabin based on at least one of a user's body data and baggage data.
  • the seat system 360 may provide a luggage storage space or a car seat installation space.
  • the third scenario S113 is a user welcome scenario.
  • the cabin system 300 may further include at least one guide light.
  • the guide light may be disposed on the floor in the cabin.
  • the cabin system 300 may output a guide light to allow the user to sit on a preset seat among a plurality of seats.
  • the main controller 370 may implement a moving light by sequentially lighting a plurality of light sources over time from an opened door to a preset user seat.
  • the fourth scenario S114 is a seat adjustment service scenario.
  • the seat system 360 may adjust at least one element of a seat matching the user based on the acquired body information.
  • the fifth scenario S115 is a personal content providing scenario.
  • the display system 350 may receive user personal data through the input device 310 or the communication device 330.
  • the display system 350 may provide content corresponding to user personal data.
  • the sixth scenario S116 is a product provision scenario.
  • the cargo system 355 may receive user data through the input device 310 or the communication device 330.
  • the user data may include user preference data and user destination data.
  • the cargo system 355 may provide a product based on user data.
  • the seventh scenario S117 is a payment scenario.
  • the payment system 365 may receive data for price calculation from at least one of the input device 310, the communication device 330, and the cargo system 355.
  • the payment system 365 may calculate a vehicle usage price of the user based on the received data.
  • the payment system 365 may request payment from a user (eg, a user's mobile terminal) at the calculated price.
  • the eighth scenario S118 is a user's display system control scenario.
  • the input device 310 may receive a user input in at least one form and convert it into an electrical signal.
  • the display system 350 may control displayed content based on an electrical signal.
  • the ninth scenario S119 is a multi-channel artificial intelligence (AI) agent scenario for a plurality of users.
  • the artificial intelligence agent 372 may classify a user input for each of a plurality of users.
  • the artificial intelligence agent 372 may control at least one of the display system 350, the cargo system 355, the seat system 360, and the payment system 365, based on the electrical signals converted from the individual user inputs of the plurality of users.
  • a tenth scenario S120 is a scenario for providing multimedia contents targeting a plurality of users.
  • the display system 350 may provide content that all users can watch together. In this case, the display system 350 may individually provide the same sound to a plurality of users through speakers provided for each seat.
  • the display system 350 may provide content that can be individually viewed by a plurality of users. In this case, the display system 350 may provide individual sounds through speakers provided for each seat.
  • the eleventh scenario S121 is a user safety securing scenario.
  • the main controller 370 may control to output an alarm for objects around the vehicle through the display system 350.
  • the twelfth scenario S122 is a scenario for preventing the user's belongings from being lost.
  • the main controller 370 may acquire data on the user's belongings through the input device 310.
  • the main controller 370 may acquire user motion data through the input device 310.
  • the main controller 370 may determine whether the user gets off while leaving the belongings behind, based on the data on the belongings and the motion data.
  • the main controller 370 may control an alarm regarding belongings to be output through the display system 350.
  • the thirteenth scenario S123 is a getting off report scenario.
  • the main controller 370 may receive the user's getting-off data through the input device 310. After the user gets off, the main controller 370 may provide report data according to the getting off to the user's mobile terminal through the communication device 330.
  • the report data may include data on the total usage fee of the vehicle 10.
  • CDMA code division multiple access
  • FDMA frequency division multiple access
  • TDMA time division multiple access
  • OFDMA orthogonal frequency division multiple access
  • SC-FDMA single carrier frequency division multiple access
  • CDMA may be implemented with a radio technology such as universal terrestrial radio access (UTRA) or CDMA2000.
  • TDMA may be implemented with a radio technology such as global system for mobile communications (GSM)/general packet radio service (GPRS)/enhanced data rates for GSM evolution (EDGE).
  • GSM global system for mobile communications
  • GPRS general packet radio service
  • EDGE enhanced data rates for GSM evolution
  • OFDMA may be implemented with wireless technologies such as IEEE (institute of electrical and electronics engineers) 802.11 (Wi-Fi), IEEE 802.16 (WiMAX), IEEE 802.20, and E-UTRA (evolved UTRA).
  • IEEE 802.16m is an evolution of IEEE 802.16e and provides backward compatibility with a system based on IEEE 802.16e.
  • UTRA is part of a universal mobile telecommunications system (UMTS).
  • 3rd generation partnership project (3GPP) long term evolution (LTE) is a part of evolved UMTS (E-UMTS) that uses evolved-UMTS terrestrial radio access (E-UTRA), and employs OFDMA in the downlink and SC-FDMA in the uplink.
  • LTE-A (advanced) is an evolution of 3GPP LTE.
  • 5G NR is the successor technology of LTE-A, and is a new clean-slate type mobile communication system with features such as high performance, low latency, and high availability.
  • 5G NR can utilize all available spectrum resources, from low frequency bands of less than 1 GHz to intermediate frequency bands of 1 GHz to 10 GHz and high frequency (millimeter wave) bands of 24 GHz or higher.
  • LTE-A or 5G NR is mainly described, but the technical idea of the present invention is not limited thereto.
  • E-UTRAN Evolved-UMTS Terrestrial Radio Access Network
  • LTE Long Term Evolution
  • the E-UTRAN includes a base station (BS) 20 that provides a control plane and a user plane to the terminal 10.
  • the terminal 10 may be fixed or mobile, and may be referred to as other terms such as a mobile station (MS), a user terminal (UT), a subscriber station (SS), a mobile terminal (MT), and a wireless device.
  • the base station 20 refers to a fixed station communicating with the terminal 10, and may be referred to as an evolved-NodeB (eNB), a base transceiver system (BTS), an access point, and the like.
  • eNB evolved-NodeB
  • BTS base transceiver system
  • the base stations 20 may be connected to each other through an X2 interface.
  • the base station 20 is connected to an Evolved Packet Core (EPC) 30 through an S1 interface, more specifically, a Mobility Management Entity (MME) through an S1-MME and a Serving Gateway (S-GW) through an S1-U.
  • EPC Evolved Packet Core
  • MME Mobility Management Entity
  • S-GW Serving Gateway
  • the EPC 30 is composed of MME, S-GW, and P-GW (Packet Data Network-Gateway).
  • the MME has access information of the terminal or information on the capabilities of the terminal, and this information is mainly used for mobility management of the terminal.
  • S-GW is a gateway with E-UTRAN as an endpoint
  • P-GW is a gateway with PDN as an endpoint.
  • the layers of the Radio Interface Protocol between the terminal and the network can be divided into L1 (first layer), L2 (second layer), and L3 (third layer), based on the lower three layers of the Open System Interconnection (OSI) standard model, which is widely known in communication systems.
  • L2 second layer
  • L3 third layer
  • the physical layer belonging to the first layer provides an information transfer service using a physical channel
  • the radio resource control (RRC) layer located in the third layer plays the role of controlling radio resources between the terminal and the network. To this end, the RRC layer exchanges RRC messages between the terminal and the base station.
  • the user plane is a protocol stack for transmitting user data
  • the control plane is a protocol stack for transmitting control signals.
  • a physical layer provides an information transmission service to an upper layer using a physical channel.
  • the physical layer is connected to an upper layer, a medium access control (MAC) layer, through a transport channel. Data is moved between the MAC layer and the physical layer through the transport channel. Transmission channels are classified according to how and with what characteristics data is transmitted over the air interface.
  • MAC medium access control
  • the physical channel may be modulated in an Orthogonal Frequency Division Multiplexing (OFDM) scheme, and uses time and frequency as radio resources.
  • OFDM Orthogonal Frequency Division Multiplexing
  • the MAC layer provides a service to an upper layer, a radio link control (RLC) layer, through a logical channel.
  • the MAC layer provides a mapping function from a plurality of logical channels to a plurality of transport channels.
  • the MAC layer provides a logical channel multiplexing function by mapping a plurality of logical channels to a single transport channel.
  • the MAC sublayer provides a data transmission service on a logical channel.
  • the RLC layer performs concatenation, segmentation, and reassembly of RLC SDUs.
  • In order to ensure the various QoS (Quality of Service) required by a Radio Bearer (RB), the RLC layer provides three operation modes: Transparent Mode (TM), Unacknowledged Mode (UM), and Acknowledged Mode (AM).
  • TM Transparent Mode
  • UM Unacknowledged Mode
  • AM Acknowledged Mode.
  • AM RLC provides error correction through automatic repeat request (ARQ).
  • the Radio Resource Control (RRC) layer is defined only in the control plane.
  • the RRC layer is in charge of controlling logical channels, transport channels, and physical channels in relation to configuration, re-configuration, and release of radio bearers.
  • RB refers to a logical path provided by the first layer (PHY layer) and the second layer (MAC layer, RLC layer, PDCP layer) for data transmission between the terminal and the network.
  • Functions of the Packet Data Convergence Protocol (PDCP) layer in the user plane include transmission of user data, header compression, and ciphering.
  • Functions of the Packet Data Convergence Protocol (PDCP) layer in the control plane include transmission of control plane data and encryption/integrity protection.
  • Establishing the RB refers to a process of defining characteristics of a radio protocol layer and channel to provide a specific service, and setting specific parameters and operation methods for each.
  • the RB can be further divided into two types: Signaling Radio Bearer (SRB) and Data Radio Bearer (DRB).
  • SRB is used as a path for transmitting RRC messages in the control plane
  • DRB is used as a path for transmitting user data in the user plane.
  • When an RRC connection is established between the RRC layer of the UE and the RRC layer of the E-UTRAN, the UE is in the RRC_CONNECTED state; otherwise, it is in the RRC_IDLE state.
  • the RRC_INACTIVE state is additionally defined, and the terminal in the RRC_INACTIVE state can release the connection with the base station while maintaining the connection with the core network.
  • a downlink transmission channel for transmitting data from a network to a terminal there is a broadcast channel (BCH) for transmitting system information and a downlink shared channel (SCH) for transmitting user traffic or control messages.
  • BCH broadcast channel
  • SCH downlink shared channel
  • Downlink multicast or broadcast service traffic or control messages may be transmitted through a downlink SCH or through a separate downlink multicast channel (MCH).
  • RACH random access channel
  • SCH uplink shared channel
  • BCCH Broadcast Control Channel
  • PCCH Paging Control Channel
  • CCCH Common Control Channel
  • MCCH Multicast Control Channel
  • MTCH Multicast Traffic Channel
  • the physical channel is composed of several OFDM symbols in the time domain and several sub-carriers in the frequency domain.
  • One sub-frame is composed of a plurality of OFDM symbols in the time domain.
  • a resource block is a resource allocation unit and is composed of a plurality of OFDM symbols and a plurality of sub-carriers.
  • each subframe may use specific subcarriers of specific OFDM symbols (eg, the first OFDM symbol) of the corresponding subframe for the PDCCH (Physical Downlink Control Channel), that is, the L1/L2 control channel.
  • TTI Transmission Time Interval
  • FIG. 12 shows the structure of an NR system to which the present invention can be applied.
  • the NG-RAN may include a gNB and/or an eNB that provides a user plane and a control plane protocol termination to a terminal.
  • FIG. 12 illustrates a case where only the gNB is included.
  • the gNB and the eNB are connected to each other through an Xn interface.
  • the gNB and eNB are connected to the 5th generation core network (5G Core Network: 5GC) through the NG interface.
  • 5G Core Network: 5GC 5th generation core network
  • AMF access and mobility management function
  • UPF user plane function
  • FIG. 13 shows functional division between NG-RAN and 5GC to which the present invention can be applied.
  • the gNB is inter-cell radio resource management (Inter Cell RRM), radio bearer management (RB control), connection mobility control (Connection Mobility Control), radio admission control (Radio Admission Control), measurement setting and provision Functions such as (Measurement configuration & Provision) and dynamic resource allocation may be provided.
  • AMF can provide functions such as NAS security and idle state mobility processing.
  • UPF may provide functions such as mobility anchoring and PDU processing.
  • SMF Session Management Function
  • FIG. 14 shows the structure of an NR radio frame to which the present invention can be applied.
  • radio frames can be used in uplink and downlink transmission in NR.
  • the radio frame has a length of 10 ms and may be defined as two 5 ms half-frames (HF).
  • the half-frame may include five 1ms subframes (Subframe, SF).
  • a subframe may be divided into one or more slots, and the number of slots within a subframe may be determined according to a subcarrier spacing (SCS).
  • SCS subcarrier spacing
  • Each slot may include 12 or 14 OFDM(A) symbols according to a cyclic prefix (CP).
  • CP cyclic prefix
  • each slot may include 14 symbols.
  • each slot may include 12 symbols.
  • the symbol may include an OFDM symbol (or CP-OFDM symbol), an SC-FDMA symbol (or DFT-s-OFDM symbol).
  • Table 1 below illustrates the number of symbols per slot (N_symb^slot), the number of slots per frame (N_slot^frame,u), and the number of slots per subframe (N_slot^subframe,u) according to the SCS when the normal CP is used.
  • Table 2 illustrates the number of symbols per slot, the number of slots per frame, and the number of slots per subframe according to the SCS when the extended CP is used.
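The values that Tables 1 and 2 tabulate can be reproduced with a short sketch. It assumes the usual NR relationships (SCS = 15 kHz * 2^u, 14 symbols per slot for the normal CP and 12 for the extended CP, a 10 ms frame, and a 1 ms subframe); it is an illustration, not a normative derivation.

```python
def nr_slot_numbers(u: int, extended_cp: bool = False):
    """Return (symbols per slot, slots per frame, slots per subframe) for SCS index u."""
    symbols_per_slot = 12 if extended_cp else 14
    slots_per_subframe = 2 ** u                 # subframe is 1 ms; the slot shrinks as SCS grows
    slots_per_frame = 10 * slots_per_subframe   # frame is 10 ms
    return symbols_per_slot, slots_per_frame, slots_per_subframe

# Example: u = 1 corresponds to 30 kHz SCS with the normal CP
print(nr_slot_numbers(1))          # (14, 20, 2)
print(nr_slot_numbers(2, True))    # (12, 40, 4) -- the extended CP is defined for 60 kHz SCS
```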
  • When different OFDM(A) numerologies (e.g., SCS, CP length, etc.) are configured, the (absolute time) duration of a time resource (e.g., subframe, slot, or TTI) composed of the same number of symbols may be set differently.
  • TU Time Unit
  • FIG. 15 shows a slot structure of an NR frame to which the present invention can be applied.
  • a slot includes a plurality of symbols in the time domain. For example, in the case of a normal CP, one slot includes 14 symbols, but in the case of an extended CP, one slot may include 12 symbols. Alternatively, in the case of a normal CP, one slot may include 7 symbols, but in the case of an extended CP, one slot may include 6 symbols.
  • the carrier includes a plurality of subcarriers in the frequency domain.
  • Resource Block RB
  • the bandwidth part BWP
  • the carrier may include up to N (eg, 5) BWPs. Data communication can be performed through an activated BWP.
  • Each element may be referred to as a resource element (RE) in the resource grid, and one complex symbol may be mapped.
  • RE resource element
  • V2X or sidelink communication will be described.
  • FIG. 16 shows a protocol stack for sidelink communication to which the present invention can be applied. Specifically, FIG. 16A shows a user plane protocol stack of LTE, and FIG. 16B shows a control plane protocol stack of LTE.
  • FIG. 17 shows a protocol stack for sidelink communication to which the present invention can be applied. Specifically, (a) of FIG. 17 shows a user plane protocol stack of NR, and (b) of FIG. 17 shows a control plane protocol stack of NR.
  • SLSS sidelink synchronization signal
  • SLSS is a sidelink specific sequence and may include a Primary Sidelink Synchronization Signal (PSSS) and a Secondary Sidelink Synchronization Signal (SSSS).
  • PSSS Primary Sidelink Synchronization Signal
  • SSSS Secondary Sidelink Synchronization Signal
  • S-PSS Sidelink Primary Synchronization Signal
  • S-SSS Sidelink Secondary Synchronization Signal
  • the PSBCH Physical Sidelink Broadcast Channel
  • the PSBCH may be a (broadcast) channel through which basic (system) information that the terminal needs to know first before transmitting and receiving a sidelink signal is transmitted.
  • the basic information may be information related to SLSS, duplex mode (DM), TDD UL/DL configuration, resource pool related information, the type of application related to SLSS, subframe offset, broadcast information, and the like.
  • S-PSS, S-SSS and PSBCH may be included in a block format supporting periodic transmission (eg, sidelink SS/PSBCH block, hereinafter S-SSB).
  • the S-SSB may have the same numerology (i.e., SCS and CP length) as the PSCCH (Physical Sidelink Control Channel)/PSSCH (Physical Sidelink Shared Channel) in the carrier, and the transmission bandwidth may be within the (pre)configured SL BWP.
  • the frequency position of the S-SSB may be set (in advance). Therefore, the terminal does not need to perform hypothesis detection in frequency to discover the S-SSB in the carrier.
  • Each SLSS may have a physical layer sidelink synchronization ID (identity), and the value may be any one of 0 to 335.
  • a synchronization source may be identified.
  • 0, 168, and 169 may refer to global navigation satellite systems (GNSS)
  • 1 to 167 may refer to a base station
  • 170 to 335 may refer to outside coverage.
  • 0 to 167 may be values used by the network
  • 168 to 335 may be values used outside the network coverage.
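For illustration, the ID ranges listed above can be folded into a small lookup; the function name and return format are arbitrary.

```python
def slss_id_info(sl_ssid: int):
    """Classify a physical-layer sidelink synchronization ID (0..335) per the ranges above."""
    if not 0 <= sl_ssid <= 335:
        raise ValueError("sidelink synchronization ID must be in 0..335")
    if sl_ssid in (0, 168, 169):
        source = "GNSS"
    elif 1 <= sl_ssid <= 167:
        source = "base station"
    else:
        source = "out of coverage"
    coverage = "in network coverage" if sl_ssid <= 167 else "out of network coverage"
    return source, coverage

print(slss_id_info(42))    # ('base station', 'in network coverage')
print(slss_id_info(169))   # ('GNSS', 'out of network coverage')
```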
  • the unit of the time resource may mean a subframe of LTE/LTE-A, and may mean a slot in NR.
  • the specific content is based on the content presented in the 3GPP TS 36 series or 38 series document.
  • the PSBCH may be transmitted on the same time resource unit as the SLSS or on a subsequent time resource unit.
  • DMRS can be used for demodulation of PSBCH.
  • FIG. 19 shows a terminal performing V2X or sidelink communication to which the present invention can be applied.
  • the term terminal may mainly mean a user terminal.
  • the base station may also be regarded as a kind of terminal.
  • Terminal 1 may operate to select a resource unit corresponding to a specific resource in a resource pool, which means a set of resources, and transmit a sidelink signal using the corresponding resource unit.
  • Terminal 2 which is a receiving terminal, is configured with a resource pool through which terminal 1 can transmit a signal, and may detect a signal of terminal 1 in the corresponding resource pool.
  • the base station may inform the terminal of the resource pool.
  • Alternatively, another terminal may inform the terminal of the resource pool, or the resource pool may be determined as a predetermined resource.
  • the resource pool may be composed of a plurality of resource units, and each terminal may select one or a plurality of resource units and use it for transmitting its own sidelink signal.
  • the total frequency resources of the resource pool may be divided into N_F units, and the total time resources of the resource pool may be divided into N_T units. Accordingly, a total of N_F * N_T resource units may be defined in the resource pool. FIG. 20 shows an example in which the corresponding resource pool is repeated with a period of N_T subframes.
  • one resource unit (eg, Unit #0) may be periodically repeated.
  • an index of a physical resource unit to which one logical resource unit is mapped may change in a predetermined pattern over time.
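The division of a resource pool into N_F * N_T resource units, its repetition with a period of N_T subframes, and a changing logical-to-physical mapping can be sketched as below. The cyclic-shift hopping rule is an arbitrary stand-in for the predetermined pattern mentioned above, not the pattern defined in the specification.

```python
def resource_unit_index(f: int, t: int, n_f: int) -> int:
    """Logical index of the resource unit at frequency division f and time division t."""
    return t * n_f + f

def physical_time_unit(t: int, period: int, n_t: int, hop: int = 0) -> int:
    """Physical subframe of logical time division t in repetition 'period'.
    A simple cyclic shift stands in for the predetermined hopping pattern."""
    return period * n_t + (t + hop * period) % n_t

n_f, n_t = 4, 8
print(resource_unit_index(2, 5, n_f))                     # logical unit 22 of 32
print(physical_time_unit(5, period=3, n_t=n_t, hop=1))    # subframe index in the 4th repetition
```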
  • a resource pool may mean a set of resource units that can be used for transmission by a terminal to transmit a sidelink signal.
  • Resource pools can be subdivided into several types. For example, according to the content of the sidelink signal transmitted from each resource pool, the resource pool may be classified as follows.
  • SA Scheduling Assignment
  • MCS Modulation and Coding Scheme
  • The SA may be a signal including information such as a Timing Advance (TA).
  • the SA may be multiplexed with sidelink data and transmitted on the same resource unit.
  • the SA resource pool may mean a resource pool in which the SA is multiplexed with sidelink data and transmitted.
  • SA may also be referred to as a sidelink control channel.
  • the sidelink data channel may be a resource pool used by a transmitting terminal to transmit user data. If SA is multiplexed and transmitted along with sidelink data on the same resource unit, only a sidelink data channel excluding SA information may be transmitted from a resource pool for a sidelink data channel. In other words, REs, which have been used to transmit SA information on individual resource units in the SA resource pool, may still be used to transmit sidelink data in the resource pool of the sidelink data channel.
  • the discovery channel may be a resource pool for a transmitting terminal to transmit information such as its own identifier (ID). Through this, the transmitting terminal can allow the neighboring terminal to discover itself.
  • ID its own identifier
  • a method of determining the transmission timing of a sidelink signal for example, whether it is transmitted at the time of reception of the synchronization reference signal or is transmitted by applying a certain timing advance at the time of reception
  • Resource allocation method e.g., whether the base station designates the transmission resource of an individual signal to an individual transmitting terminal or whether the individual transmitting terminal selects an individual signal transmission resource by itself within the resource pool
  • signal format for example, The number of symbols occupied by each sidelink signal in one subframe, or the number of subframes used for transmission of one sidelink signal
  • the signal strength from the base station, the transmission power strength of the sidelink terminal, and the like. Resource pools may be further distinguished according to these attributes.
  • TM transmission mode
  • FIG. 21(a) shows a terminal operation related to transmission mode 1 or transmission mode 3
  • FIG. 21(b) shows a terminal operation related to transmission mode 2 or transmission mode 4.
  • the base station performs resource scheduling for terminal 1 through the PDCCH (more specifically, DCI), and terminal 1 performs sidelink/V2X communication with terminal 2 according to the corresponding resource scheduling.
  • UE 1 transmits sidelink control information (SCI) to UE 2 through a physical sidelink control channel (PSCCH)
  • PSSCH physical sidelink shared channel
  • transmission mode 1 may be applied to general sidelink communication
  • transmission mode 3 may be applied to V2X sidelink communication.
  • the terminal may schedule resources by itself. More specifically, in the case of the LTE sidelink, transmission mode 2 is applied to general sidelink communication, and the terminal may perform a sidelink operation by selecting a resource from a set resource pool by itself. Transmission mode 4 is applied to the V2X sidelink communication, and the terminal may perform a V2X sidelink operation after selecting a resource within the selection window by itself through a sensing/SA decoding process. Terminal 1 may transmit SCI to terminal 2 through PSCCH, and then transmit the SCI-based data through PSSCH.
  • the transmission mode may be abbreviated as a mode.
  • the base station can schedule sidelink resources to be used by the terminal for sidelink transmission.
  • the terminal may determine a sidelink transmission resource within a sidelink resource set by the base station/network or a sidelink resource set in advance.
  • the set sidelink resource or the preset sidelink resource may be a resource/resource pool.
  • the terminal can autonomously select a sidelink resource for transmission.
  • the terminal may help select a sidelink resource for another terminal.
  • the terminal may receive an NR configured grant for sidelink transmission.
  • the terminal can schedule sidelink transmission of another terminal.
  • mode 2 may support at least reservation of sidelink resources for blind retransmission.
  • the sensing procedure may be defined as decoding SCI from other terminals and/or sidelink measurements. Decoding the SCI in the sensing procedure may provide at least information on the sidelink resource indicated by the terminal transmitting the SCI. When the corresponding SCI is decoded, the sensing procedure may use L1 SL RSRP measurement based on SL DMRS. The resource (re) selection procedure can use the result of the sensing procedure to determine a resource for sidelink transmission.
  • a method in which transmission resources of the next packet are also reserved may be used.
  • FIG. 22 shows an example in which a transmission resource to which the present invention can be applied is selected.
  • transmission may be performed twice per MAC PDU.
  • a resource for retransmission may be reserved at a predetermined time gap.
  • the terminal can identify the transmission resources reserved by other terminals or the resources being used by other terminals through sensing within the sensing window, and, after excluding them within the selection window, may randomly select a resource from among the remaining resources with less interference.
  • the terminal may decode a PSCCH including information on the period of the reserved resources within the sensing window, and measure the PSSCH RSRP from the resources periodically determined based on the PSCCH.
  • the UE may exclude resources in which the PSSCH RSRP value exceeds the threshold value from within the selection window. Thereafter, the terminal may randomly select a sidelink resource from among the remaining resources in the selection window.
  • the terminal may determine resources with less interference (eg, resources corresponding to the lower 20%) by measuring RSSI (Received Signal Strength Indication) of periodic resources within the sensing window.
  • the terminal may randomly select a sidelink resource from among resources included in the selection window among the periodic resources. For example, if the terminal fails to decode the PSCCH, the terminal can use the above method.
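Roughly, the sensing-based selection described in the preceding paragraphs excludes candidate resources whose measured PSSCH RSRP exceeds a threshold and picks one of the survivors at random, falling back to an RSSI ranking (e.g., keeping the lower 20%) when SCI decoding fails. The sketch below uses assumed data structures and an assumed threshold purely for illustration.

```python
import random

def select_sidelink_resource(candidates, rsrp_threshold_dbm=-110.0):
    """candidates: list of dicts {'id', 'reserved', 'pssch_rsrp', 'rssi'} (assumed format)."""
    # Exclude resources reserved by other UEs whose PSSCH RSRP exceeds the threshold.
    remaining = [c for c in candidates
                 if not (c["reserved"] and c["pssch_rsrp"] > rsrp_threshold_dbm)]
    if remaining:
        return random.choice(remaining)["id"]
    # Fallback (e.g., PSCCH decoding failed): keep roughly the 20% lowest-RSSI candidates.
    ranked = sorted(candidates, key=lambda c: c["rssi"])
    keep = ranked[:max(1, len(ranked) // 5)]
    return random.choice(keep)["id"]
```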
  • FIG. 23 shows an example in which a PSCCH is transmitted in sidelink transmission mode 1 or 2 to which the present invention can be applied.
  • a first PSCCH (or SA) period may start in a time resource unit separated by a predetermined offset indicated by higher layer signaling from a specific system frame.
  • Each PSCCH period may include a PSCCH resource pool and a time resource unit pool for sidelink data transmission.
  • the PSCCH resource pool may include a last time resource unit among time resource units indicated by transmission of the PSCCH in the time resource unit bitmap from the first time resource unit of the PSCCH period.
  • a time resource unit for sidelink data transmission may be determined based on Time-Resource Pattern for Transmission (T-RPT) or Time-Resource Pattern (TRP).
  • the T-RPT may be repeatedly applied.
  • the last applied T-RPT can be applied after being truncated by the number of remaining time resource units.
  • the transmitting terminal performs transmission at a location where the T-RPT bitmap is 1 in the indicated T-RPT, and may transmit one MAC PDU four times.
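The T-RPT behavior just described (transmit wherever the bitmap is 1, repeat the bitmap across the pool, truncate the last repetition) might look like the following; the bitmap and pool size are arbitrary example values.

```python
def trpt_transmission_subframes(trpt_bitmap, pool_size):
    """Return the indices of subframes (within the data pool) used for transmission."""
    pattern = []
    while len(pattern) < pool_size:
        pattern.extend(trpt_bitmap)
    pattern = pattern[:pool_size]          # truncate the last, partially applied T-RPT
    return [i for i, bit in enumerate(pattern) if bit == 1]

# Example: an 8-bit T-RPT with four 1s, i.e. four transmissions per bitmap repetition
print(trpt_transmission_subframes([1, 0, 1, 0, 1, 0, 1, 0], pool_size=20))
```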
  • the embodiment of FIG. 23 may be applied to NR sidelink resource allocation mode 1 or mode 2.
  • FIG. 24 shows an example in which a PSCCH is transmitted in sidelink transmission mode 3 or 4 to which the present invention can be applied.
  • PSCCH and PSSCH are transmitted in the FDM scheme.
  • PSCCH and PSSCH may be transmitted in an FDM manner on different frequency resources on the same time resource. Referring to FIG. 24, PSCCH and PSSCH may not be directly adjacent as shown in FIG. 24(a), and PSCCH and PSSCH may be directly adjacent as shown in FIG. 24(b).
  • the basic unit of this transmission is a sub-channel.
  • the subchannel may be a resource unit having one or more RB sizes on a frequency axis on a predetermined time resource (eg, a time resource unit).
  • the number of RBs included in the sub-channel ie, the size of the sub-channel and the starting position on the frequency axis of the sub-channel
  • the embodiment of FIG. 24 may be applied to NR sidelink resource allocation mode 1 or mode 2.
  • CAM Cooperative Awareness Message
  • DENM Decentralized Environmental Notification Message
  • In vehicle-to-vehicle communication, a periodic message type CAM, an event triggered message type DENM, and the like may be transmitted.
  • the CAM may include basic vehicle information such as dynamic state information of the vehicle such as direction and speed, vehicle static data such as dimensions, external lighting conditions, and route history.
  • the size of the CAM can be 50-300 bytes.
  • CAM is broadcast, and the latency should be less than 100ms.
  • DENM may be a message generated in case of an unexpected situation such as a vehicle breakdown or an accident.
  • the size of the DENM can be less than 3000 bytes, and any vehicle within the transmission range can receive the message. In this case, DENM may have a higher priority than CAM.
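Purely as an illustration, the size and latency constraints quoted for CAM and DENM could be captured in a simple check; the function and parameter names are hypothetical.

```python
def check_v2x_message(msg_type: str, size_bytes: int, latency_ms: float) -> bool:
    """Rudimentary validity check against the constraints quoted in the text."""
    if msg_type == "CAM":
        return 50 <= size_bytes <= 300 and latency_ms < 100.0
    if msg_type == "DENM":
        return size_bytes < 3000
    raise ValueError("unknown message type")

print(check_v2x_message("CAM", 120, 80.0))    # True
print(check_v2x_message("DENM", 2500, 40.0))  # True
```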
  • Carrier reselection for V2X/sidelink communication may be performed in the MAC layer based on the Channel Busy Ratio (CBR) of the configured carriers and the PPPP (Prose Per-Packet Priority) of the V2X message to be transmitted.
  • CBR Channel Busy Ratio
  • PPPP Prose Per-Packet Priority
  • CBR may mean the portion of sub-channels in the resource pool for which the S-RSSI measured by the terminal is detected to exceed a preset threshold.
  • PPPP related to each logical channel may exist, and the setting of the PPPP value should reflect the latency required for both the terminal and the base station.
  • the UE may select one or more carriers among candidate carriers in increasing order from the lowest CBR.
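A compact sketch of the CBR computation and the CBR-ordered carrier choice described above; in practice the S-RSSI is measured per sub-channel over a window and CBR is evaluated per resource pool, which this illustration glosses over.

```python
def channel_busy_ratio(s_rssi_per_subchannel, threshold_dbm):
    """Fraction of sub-channels whose measured S-RSSI exceeds the threshold."""
    busy = sum(1 for r in s_rssi_per_subchannel if r > threshold_dbm)
    return busy / len(s_rssi_per_subchannel)

def reselect_carriers(cbr_per_carrier, num_needed):
    """Pick carriers in increasing order of CBR (least congested first)."""
    ordered = sorted(cbr_per_carrier, key=cbr_per_carrier.get)
    return ordered[:num_needed]

cbrs = {"carrier_A": 0.62, "carrier_B": 0.18, "carrier_C": 0.35}
print(reselect_carriers(cbrs, 2))   # ['carrier_B', 'carrier_C']
```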
  • the data unit to which the present invention can be applied may be subjected to physical layer processing at the transmitting side before being transmitted through the air interface, and the radio signal carrying the data unit may be subjected to physical layer processing at the receiving side.
  • FIG. 25 shows an example of physical layer processing at a transmission side to which the present invention can be applied.
  • Table 3 may indicate a mapping relationship between an uplink transport channel and a physical channel
  • Table 4 may indicate a mapping relationship between uplink control channel information and a physical channel.
  • Table 5 may indicate a mapping relationship between a downlink transport channel and a physical channel
  • Table 6 may indicate a mapping relationship between downlink control channel information and a physical channel.
  • Table 7 may indicate a mapping relationship between a sidelink transmission channel and a physical channel
  • Table 8 may indicate a mapping relationship between sidelink control channel information and a physical channel.
  • the transmitting side may perform encoding on a transport block (TB).
  • Data and control streams from the MAC layer may be encoded to provide transport and control services over a radio transmission link at the PHY layer.
  • the TB from the MAC layer may be encoded as a codeword at the transmitting side.
  • the channel coding scheme may be a combination of error detection, error correction, rate matching, interleaving, and transport channel or control information mapping onto/splitting from physical channels.
  • the following channel coding scheme may be used for different types of transport channels and different types of control information.
  • a channel coding scheme for each transport channel type may be shown in Table 9.
  • a channel coding scheme for each control information type may be shown in Table 10.
  • Control information | Channel coding method
    DCI | Polar code
    SCI | Polar code
    UCI | Block code, Polar code
  • the transmitting side may attach a cyclic redundancy check (CRC) sequence to the TB.
  • CRC cyclic redundancy check
  • the transmitting side can provide error detection for the receiving side.
  • the transmitting side may be a transmitting terminal, and the receiving side may be a receiving terminal.
  • a communication device may use an LDPC code to encode/decode UL-SCH and DL-SCH.
  • the NR system can support two LDPC base graphs (i.e., two LDPC base matrices).
  • the two LDPC base graphs may be LDPC base graph 1 optimized for a small TB and LDPC base graph 2 for a large TB.
  • the transmission side may select LDPC base graph 1 or 2 based on the size of the TB and the coding rate (R).
  • the coding rate may be indicated by a modulation coding scheme (MCS) index (I_MCS).
  • The MCS index may be dynamically provided to the UE by the PDCCH scheduling the PUSCH or the PDSCH. Alternatively, the MCS index may be provided to the UE by the PDCCH that (re)initializes or activates the UL configured grant type 2 or DL SPS.
  • the MCS index may be provided to the UE by RRC signaling related to UL configured grant type 1.
  • the transmission side may divide the TB to which the CRC is attached into a plurality of code blocks. In addition, the transmission side may attach an additional CRC sequence to each code block.
  • the maximum code block size for LDPC base graph 1 and LDPC base graph 2 may be 8448 bits and 3840 bits, respectively. If the TB to which the CRC is attached is not larger than the maximum code block size for the selected LDPC base graph, the transmitting side may encode the TB to which the CRC is attached with the selected LDPC base graph. Otherwise, the transmitting side may encode each code block of the TB with the selected LDPC base graph.
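The base-graph choice and code-block segmentation can be sketched as below. The thresholds used to pick base graph 2 follow the commonly cited NR rule of thumb (small TB or low coding rate) and are stated here as an assumption; per-code-block CRC overhead is ignored for simplicity.

```python
import math

MAX_CB = {1: 8448, 2: 3840}   # maximum code block sizes (bits) for base graphs 1 and 2

def select_base_graph(tb_size_bits: int, coding_rate: float) -> int:
    # Assumed simplified rule: small TBs or low coding rates use base graph 2.
    if tb_size_bits <= 292 or coding_rate <= 0.25 or (tb_size_bits <= 3824 and coding_rate <= 0.67):
        return 2
    return 1

def num_code_blocks(tb_plus_crc_bits: int, base_graph: int) -> int:
    """Number of code blocks after segmentation (per-block CRC overhead ignored here)."""
    return max(1, math.ceil(tb_plus_crc_bits / MAX_CB[base_graph]))

bg = select_base_graph(20000, 0.5)
print(bg, num_code_blocks(20000 + 24, bg))   # base graph 1, 3 code blocks
```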
  • LDPC-coded blocks may be individually rate-matched.
  • Code block concatenation may be performed to generate a codeword for transmission on a PDSCH or PUSCH.
  • PDSCH Physical Downlink Shared Channel
  • Up to two codewords (i.e., up to two TBs) may be simultaneously transmitted on the PDSCH.
  • PUSCH may be used for transmission of UL-SCH data and layer 1 and/or 2 control information.
  • the layer 1 and/or 2 control information may be multiplexed with a codeword for UL-SCH data.
  • the transmitting side may perform scrambling and modulation on the codeword.
  • the bits of the codeword can be scrambled and modulated to produce a block of complex-valued modulation symbols.
  • the transmitting side may perform layer mapping.
  • the complex-valued modulation symbols of the codeword may be mapped to one or more multiple input multiple output (MIMO) layers.
  • Codewords can be mapped to up to four layers.
  • the PDSCH can carry two codewords, and thus the PDSCH can support up to 8-layer transmission.
  • PUSCH can support a single codeword, and thus PUSCH can support up to 4-layer transmission.
  • the transmission side may perform precoding conversion.
  • the downlink transmission waveform may be a general OFDM using a cyclic prefix (CP).
  • transform precoding ie, discrete Fourier transform (DFT)
  • DFT discrete Fourier transform
  • the uplink transmission waveform may be conventional OFDM using a CP, with a transform precoding function that performs DFT spreading and that can be disabled or enabled.
  • transform precoding can be selectively applied. Transformation precoding may be to spread the uplink data in a special manner to reduce the peak-to-average power ratio (PAPR) of the waveform.
  • Transform precoding may be a form of DFT. That is, the NR system can support two options for an uplink waveform. One may be CP-OFDM (same as the DL waveform), and the other may be DFT-s-OFDM. Whether the terminal should use CP-OFDM or DFT-s-OFDM may be determined by the base station through the RRC parameter.
  • the transmitting side may perform subcarrier mapping. Layers can be mapped to antenna ports.
  • a transparent manner (non-codebook-based) mapping may be supported, and how beamforming or MIMO precoding is performed may be transparent to the terminal.
  • both non-codebook-based mapping and codebook-based mapping may be supported.
  • the transmitting side may map complex-valued modulation symbols to subcarriers in the resource block allocated to the physical channel.
  • the transmitting side may perform OFDM modulation.
  • the communication device at the transmitting side may generate a time-continuous OFDM baseband signal for the antenna port (p) and the subcarrier spacing setting (u) for OFDM symbol (l) within the TTI for the physical channel.
  • the communication device of the transmitting side may perform Inverse Fast Fourier Transform (IFFT) on a complex-valued modulation symbol mapped to a resource block of the corresponding OFDM symbol.
  • IFFT Inverse Fast Fourier Transform
  • the communication device of the transmission side may add a CP to the IFFT signal to generate an OFDM baseband signal.
  • the transmitting side may perform up-conversion.
  • the communication device at the transmitting side may up-convert the OFDM baseband signal for the antenna port (p), subcarrier spacing setting (u), and OFDM symbol (l) to the carrier frequency (f0) of the cell to which the physical channel is allocated.
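A toy numpy sketch of the transmit-side steps just described: subcarrier mapping, IFFT, CP insertion, and up-conversion. The FFT size, CP length, and carrier frequency are arbitrary illustrative values, not parameters from the specification.

```python
import numpy as np

def ofdm_modulate(symbols, fft_size=64, cp_len=16, fc=0.1, fs=1.0):
    """Map complex symbols to subcarriers, IFFT, prepend a CP, then up-convert."""
    grid = np.zeros(fft_size, dtype=complex)
    grid[:len(symbols)] = symbols                                    # naive subcarrier mapping
    time_signal = np.fft.ifft(grid) * np.sqrt(fft_size)              # OFDM modulation (IFFT)
    with_cp = np.concatenate([time_signal[-cp_len:], time_signal])   # cyclic prefix insertion
    n = np.arange(len(with_cp))
    return with_cp * np.exp(2j * np.pi * fc / fs * n)                # up-conversion to carrier

qpsk = (np.array([1, -1, 1, 1]) + 1j * np.array([1, 1, -1, 1])) / np.sqrt(2)
tx = ofdm_modulate(qpsk)
print(tx.shape)   # (80,)
```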
  • the processors 9011 and 9021 of FIG. 40 may be configured to perform encoding, scrambling, modulation, layer mapping, precoding transformation (for uplink), subcarrier mapping, and OFDM modulation.
  • FIG. 26 shows an example of physical layer processing at a receiving side to which the present invention can be applied.
  • the physical layer processing at the receiving side may basically be an inverse processing of the physical layer processing at the transmitting side.
  • the receiving side may perform frequency down-conversion.
  • the communication device of the receiving side may receive an RF signal of a carrier frequency through an antenna.
  • the transceivers 9013 and 9023 for receiving the RF signal at the carrier frequency may down-convert the carrier frequency of the RF signal to the baseband to obtain an OFDM baseband signal.
  • the receiving side may perform OFDM demodulation.
  • the communication device at the receiving side may acquire a complex-valued modulation symbol through CP separation and FFT. For example, for each OFDM symbol, the communication device at the receiving side may remove the CP from the OFDM baseband signal.
  • the communication device at the receiving side may perform FFT on the CP-removed OFDM baseband signal to obtain complex-valued modulation symbols for the antenna port (p), subcarrier spacing (u), and OFDM symbol (l).
  • the receiving side may perform subcarrier demapping.
  • Subcarrier demapping may be performed on a complex-valued modulation symbol to obtain a complex-valued modulation symbol of a corresponding physical channel.
  • the processor of the terminal may obtain a complex-valued modulation symbol mapped to a subcarrier belonging to the PDSCH among complex-valued modulation symbols received in a bandwidth part (BWP).
  • BWP bandwidth part
  • the receiving side may perform transform de-precoding.
  • transform de-precoding (e.g., IDFT) may be performed on the complex-valued modulation symbols of an uplink physical channel to which transform precoding has been applied.
  • For the downlink physical channel and for an uplink physical channel to which transform precoding is not applied, transform de-precoding may not be performed.
  • In step S114, the receiving side may perform layer demapping.
  • the complex-valued modulation symbol can be demapped into one or two codewords.
  • the receiving side may perform demodulation and descrambling.
  • the complex-value modulated symbol of the codeword can be demodulated and descrambled with bits of the codeword.
  • the receiving side may perform decoding.
  • the codeword can be decoded into TB.
  • LDPC base graph 1 or 2 may be selected based on the size of TB and coding rate (R).
  • the codeword may include one or a plurality of coded blocks. Each coded block may be decoded, with the selected LDPC base graph, into a code block to which a CRC is attached or into a TB to which a CRC is attached.
  • the CRC sequence may be removed from each of the code blocks to which the CRC is attached, and code blocks may be obtained.
  • the code blocks may be concatenated into the TB to which the CRC is attached.
  • the TB CRC sequence can be removed from the TB to which the CRC is attached, whereby the TB can be obtained.
  • TB can be delivered to the MAC layer.
  • the processors 9011 and 9021 of FIG. 40 may be configured to perform OFDM demodulation, subcarrier demapping, layer demapping, demodulation, descrambling, and decoding.
  • Time and frequency domain resources (e.g., OFDM symbol, subcarrier, carrier frequency) related to subcarrier mapping, OFDM modulation, and frequency up/down conversion may be determined based on resource allocation (e.g., an uplink grant and downlink allocation).
  • time division multiple access TDMA
  • frequency division multiples access FDMA
  • ISI inter-symbol interference
  • ICI inter-carrier interference
  • MIB-SL-V2X master information block-sidelink-V2X
  • FIG. 27 shows a synchronization source or a synchronization reference in V2X to which the present invention can be applied.
  • the terminal may be synchronized with the GNSS (global navigation satellite systems) directly, or indirectly through another terminal (in network coverage or out of network coverage) that is directly synchronized with the GNSS.
  • the UE may calculate the DFN and the subframe number using the UTC (Coordinated Universal Time) and (pre) set DFN (Direct Frame Number) offset.
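One plausible reading of the DFN derivation above is sketched below: milliseconds of UTC time are converted into 10 ms frames, the (pre)set DFN offset is added, and the result is wrapped at 1024. The exact formula is defined in the 3GPP specifications, so this is only an approximation of the idea.

```python
import time

def dfn_and_subframe(utc_seconds: float, dfn_offset: int = 0):
    """Approximate DFN (0..1023) and subframe number (0..9) from UTC time."""
    ms = int(utc_seconds * 1000)          # milliseconds since the UTC epoch
    frame = ms // 10                      # one radio frame is 10 ms
    dfn = (frame + dfn_offset) % 1024
    subframe = ms % 10                    # one subframe is 1 ms
    return dfn, subframe

print(dfn_and_subframe(time.time(), dfn_offset=0))
```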
  • the terminal may be directly synchronized with the base station or may be synchronized with another terminal that is time/frequency synchronized with the base station.
  • the base station may be an eNB or a gNB.
  • the terminal may receive synchronization information provided by the base station, and may be directly synchronized with the base station. Thereafter, the terminal may provide synchronization information to other adjacent terminals.
  • When the base station timing is set as the synchronization reference, the UE may follow, for synchronization and downlink measurement, a cell associated with the corresponding frequency (if it is within cell coverage at the frequency), or a primary cell or a serving cell (if it is outside the cell coverage at the frequency).
  • the base station may provide synchronization settings for carriers used for V2X/sidelink communication.
  • the terminal may follow the synchronization setting received from the base station. If the terminal has not detected any cell in the carrier used for the V2X/sidelink communication and has not received a synchronization setting from a serving cell, the terminal may follow a preset synchronization setting.
  • the terminal may be synchronized to another terminal that has not directly or indirectly obtained synchronization information from the base station or the GNSS.
  • the synchronization source and preference may be preset to the terminal.
  • the synchronization source and preference may be set through a control message provided by the base station.
  • the sidelink synchronization source may be associated with synchronization priority.
  • the relationship between the synchronization source and the synchronization priority may be defined as shown in Table 11.
  • Table 11 is only an example, and the relationship between the synchronization source and the synchronization priority may be defined in various forms.
  • Priority | GNSS-based synchronization | Base station-based synchronization (eNB/gNB-based synchronization)
    P0 | GNSS | Base station
    P1 | All terminals synchronized directly to GNSS | All terminals synchronized directly to the base station
    P2 | All terminals indirectly synchronized to GNSS | All terminals indirectly synchronized to the base station
    P3 | All other terminals | GNSS
    P4 | N/A | All terminals synchronized directly to GNSS
    P5 | N/A | All terminals indirectly synchronized to GNSS
    P6 | N/A | All other terminals
  • Whether to use GNSS-based synchronization or base station-based synchronization may be set (in advance).
  • the terminal can derive the transmission timing of the terminal from an available synchronization criterion having the highest priority.
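Deriving the transmission timing from the highest-priority available synchronization reference reduces to a simple selection, sketched below with an assumed availability flag per candidate.

```python
def pick_sync_reference(candidates):
    """candidates: list of (priority, name, available); a lower priority value means higher priority."""
    usable = [c for c in candidates if c[2]]
    if not usable:
        return None
    return min(usable, key=lambda c: c[0])[1]

refs = [(0, "GNSS", False),
        (1, "UE directly synchronized to GNSS", True),
        (6, "any other UE", True)]
print(pick_sync_reference(refs))   # 'UE directly synchronized to GNSS'
```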
  • Hereinafter, a bandwidth part (BWP) and a resource pool are described.
  • the reception bandwidth and the transmission bandwidth of the terminal need not be as large as the bandwidth of the cell, and the reception bandwidth and the transmission bandwidth of the terminal can be adjusted.
  • the network/base station may inform the terminal of bandwidth adjustment.
  • the terminal may receive information/settings for bandwidth adjustment from the network/base station.
  • the terminal may perform bandwidth adjustment based on the received information/settings.
  • the bandwidth adjustment may include reducing/enlarging the bandwidth, changing the position of the bandwidth, or changing the subcarrier spacing of the bandwidth.
  • bandwidth can be reduced during periods of low activity to save power.
  • the location of the bandwidth can move in the frequency domain.
  • the location of the bandwidth can be moved in the frequency domain to increase scheduling flexibility.
  • subcarrier spacing of the bandwidth may be changed.
  • the subcarrier spacing of the bandwidth can be changed to allow different services.
  • a subset of the total cell bandwidth of a cell may be referred to as a bandwidth part (BWP).
  • bandwidth adaptation (BA) may be performed by the base station/network configuring one or more BWPs for the terminal and notifying the terminal of the currently active BWP among the configured BWPs.
  • FIG. 28 shows an example of a scenario in which a BWP to which the present invention can be applied is set.
  • BWP1 having a bandwidth of 40 MHz and a subcarrier spacing of 15 kHz, BWP2 having a bandwidth of 10 MHz and a subcarrier spacing of 15 kHz, and BWP3 having a bandwidth of 20 MHz and a subcarrier spacing of 60 kHz may be set.
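  • The following is a minimal sketch of the FIG. 28 example, assuming illustrative (non-3GPP) field names, in which the network configures three BWPs and indicates which one is currently active.

```python
# Minimal sketch of BWP configuration and activation; field names are
# illustrative assumptions, not 3GPP ASN.1.
from dataclasses import dataclass

@dataclass
class Bwp:
    bandwidth_mhz: int
    scs_khz: int                          # subcarrier spacing

class Terminal:
    def __init__(self, bwps):
        self.bwps = bwps                  # BWPs configured by the base station/network
        self.active = None                # currently active BWP id

    def activate(self, bwp_id):
        """Network indicates the currently active BWP among the configured ones."""
        if bwp_id not in self.bwps:
            raise ValueError("BWP not configured")
        self.active = bwp_id

ue = Terminal({1: Bwp(40, 15), 2: Bwp(10, 15), 3: Bwp(20, 60)})
ue.activate(2)                            # e.g. shrink bandwidth to save power
print(ue.bwps[ue.active])
```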
  • the BWP can be defined for sidelink.
  • the same sidelink BWP can be used for transmission and reception.
  • a transmitting terminal may transmit a sidelink channel or a sidelink signal on a specific BWP
  • a receiving terminal may receive a sidelink channel or a sidelink signal on the specific BWP.
  • the sidelink BWP may be defined separately from the Uu BWP, and the sidelink BWP may have separate configuration signaling from the Uu BWP.
  • the terminal may receive the configuration for the sidelink BWP from the base station/network.
  • the sidelink BWP may be (pre)configured within a carrier for an out-of-coverage NR V2X terminal and for an RRC_IDLE terminal. For a UE in RRC_CONNECTED mode, at least one sidelink BWP may be activated in the carrier.
  • the resource pool may be a set of time-frequency resources that can be used for sidelink transmission and/or sidelink reception. From the terminal's point of view, the time domain resources in the resource pool may not be contiguous. A plurality of resource pools may be set (in advance) to the terminal within one carrier.
  • FIG. 29 schematically shows an example of a reference model of a cellular-V2X (CV2X).
  • the application layer provides V2X services such as road safety, drive assistance, smart traffic management, and infotainment.
  • the message sublayer plays a role of messaging according to a defined format in order to deliver information generated by an application to other V2X devices.
  • the security service guarantees authentication of the generated message by attaching an electronic signature to the message.
  • the network/transport layer allows V2X messages to be accurately delivered to V2X devices and corresponding applications.
  • the CV2X modem serves as a wireless interface for sending and receiving messages between V2X devices.
  • the V2X message generated by the CV2X application is transmitted to other V2X devices through the sidelink of the physical layer.
  • the sidelink uses one-to-many direct communication, which is a broadcast communication method.
  • Sidelink is defined as the PC5 interface in 3GPP.
  • V2X applications expect application information to be delivered to nearby V2X devices, and a V2X device delivers application information to nearby V2X devices through one-to-many direct (sidelink) communication. The communication radius is limited, so broadcast messages are delivered only to peripheral devices.
  • FIG. 31 is a diagram showing the types of V2X operation.
  • V2V vehicle-to-vehicle
  • V2P vehicle-to-pedestrian
  • V2I vehicle-to-infrastructure
  • the network may mean a roadside unit (RSU), and the RSU may be a base station of the same type as a vehicle terminal or a mobile communication network base station.
  • V2X sidelink is composed of PSCCH (Physical Sidelink Control Channel) and PSSCH (Physical Sidelink Shared Channel).
  • PSSCH transmits a V2X message as a data channel
  • PSCCH transmits Sidelink Control Information (SCI) as a control channel
  • SCI contains information on a data channel.
  • CV2X technology basically provides only a broadcast (one-to-many) based communication method in the physical layer in charge of the air interface.
  • Many V2X services require one-to-one communication between specific V2X devices using a request & response method.
  • Use cases in which one-to-one communication can be used include platooning, maneuver coordination (e.g., sharing driving information between different vehicles and/or terminals at an intersection), and traction using sidelink communication.
  • Broadcast-based CV2X communication makes it difficult to provide radio link reliability when communicating between specific V2X devices, and since unrelated V2X devices can receive the messages exchanged between specific devices and obtain their information, it is difficult to perform communication with guaranteed security or confidentiality.
  • Radio resource information used by a V2X device may be shared when an application message is transmitted between V2X devices. The target V2X device may then receive a V2X message using the radio resource information shared by the counterpart device.
  • the probability of receiving a message can be increased, and power consumption used by a monitoring operation when receiving a message can be reduced.
  • FIG. 32 schematically shows an example of a V2X radio resource pool.
  • CV2X manages radio resources by dividing them into resource pools based on time and frequency.
  • the resource pool is a set of radio resources allocated to the sidelink.
  • the resource pool is defined by a starting subchannel (e.g., startRBSubchannel in FIG. 32), a subchannel size (e.g., sizeSubChannel in FIG. 32), and the number of subchannels (e.g., numSubchannel in FIG. 32).
  • the V2X device is assigned a transmission and reception resource pool list for message transmission and reception, respectively.
  • In order for a transmitting V2X device to transmit a message, a radio resource that is not used by another V2X device must be found in the transmission resource pool list. To find unused radio resources, the V2X device measures the received signal strength indication (RSSI) of each subchannel in the transmission resource pool list. If the measured RSSI of a subchannel does not exceed the threshold, the V2X device determines that the subchannel is not used. The V2X device occupies unused subchannels to transmit its V2X message.
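  • The following is a minimal sketch of this sensing-based selection, assuming per-subchannel RSSI measurements in dBm and a (pre)set threshold; the actual sensing window and resource reservation handling are omitted.

```python
# Minimal sketch: a subchannel whose measured RSSI does not exceed the
# threshold is treated as unused and may be occupied for transmission.
from typing import Dict, List, Optional

def find_unused_subchannels(rssi_by_subchannel: Dict[int, float],
                            threshold_dbm: float) -> List[int]:
    """Return subchannel indices whose RSSI does not exceed the threshold."""
    return [sc for sc, rssi in rssi_by_subchannel.items() if rssi <= threshold_dbm]

def select_tx_subchannel(rssi_by_subchannel: Dict[int, float],
                         threshold_dbm: float) -> Optional[int]:
    """Pick one unused subchannel (here simply the one with the lowest RSSI)."""
    unused = find_unused_subchannels(rssi_by_subchannel, threshold_dbm)
    if not unused:
        return None                       # no free resource found in the pool
    return min(unused, key=lambda sc: rssi_by_subchannel[sc])

measurements = {0: -88.0, 1: -101.5, 2: -79.3, 3: -99.0}
print(select_tx_subchannel(measurements, threshold_dbm=-95.0))   # -> 1
```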
  • the occupied radio resources are used for PSSCH and PSCCH transmission.
  • the radio resources occupied by the V2X device may be used periodically, and the V2X device transmits the occupied (resource reservation) information in sidelink control information (SCI) through PSCCH.
  • SCI includes a radio resource region used by the PSSCH, a modulation coding scheme (MCS), and a usage period.
  • the receiving V2X device monitors the PSCCH for SCI detection in the list of the receiving resource pool.
  • the area to be monitored by the reception terminal may be all radio resource intervals/areas in the reception resource pool list.
  • the receiving V2X device can reduce power consumption by reducing the radio resource area to be monitored, and reception performance can be improved by identifying the radio resource location in advance.
  • FIG. 33 is a flowchart illustrating an example of a radio resource reselection method of a communication device performing a V2X operation.
  • a transmission V2X device or a transmission terminal selects a new transmission radio resource (S3310), and transmits new radio resource information to the target V2X device (S3320).
  • the new radio resource information may include information on a radio resource region and period of the PSCCH selected for SCI transmission in the transmitting V2X device.
  • the transmitting terminal randomly selects a sidelink resource reselection counter value (SL_RESOURCE_RESELECTION_COUNTER) from values of 5 or more and 15 or less (S3330), and performs a transmission operation (S3340).
  • SL_RESOURCE_RESELECTION_COUNTER a sidelink resource reselection counter value from values of 5 or more and 15 or less
  • S3340 a transmission operation
  • the terminal sets the sidelink resource reselection counter value to a value obtained by decreasing 1 from the current value (S3350).
  • It is then determined whether the counter value (that is, the sidelink resource reselection counter value) is 1 (S3360).
  • If the counter value is not 1, transmission is performed again. If the counter value is 1, the terminal selects an arbitrary value P from values greater than or equal to 0 and less than or equal to 1 (S3361).
  • In step (S3370), if P is greater than the threshold value probResourceKeep, the current resource is maintained (S3371), and a sidelink resource reselection counter value is again randomly selected from values of 5 or more and 15 or less.
  • In step (S3370), if P is not greater than the threshold value probResourceKeep, the counter value is set to 0 (S3372), and a new transmission radio resource is selected.
  • The threshold value probResourceKeep may be a value previously set by a network or the like.
  • the radio resources occupied by the transmitting V2X device cannot be continuously occupied in order to prevent collision with other V2X devices, and a resource reselection process as shown in FIG. 33 is performed.
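  • The following is a minimal sketch of the reselection flow of FIG. 33 as described above, assuming probResourceKeep is provided by the network and abstracting the actual sensing-based resource selection into a placeholder.

```python
import random

class SidelinkTransmitter:
    """Sketch of the SL_RESOURCE_RESELECTION_COUNTER handling of FIG. 33."""

    def __init__(self, prob_resource_keep: float):
        self.prob_resource_keep = prob_resource_keep      # (pre)set, e.g. by the network
        self.resource = self.select_new_resource()        # S3310
        self.counter = random.randint(5, 15)              # S3330

    def select_new_resource(self):
        # Placeholder for sensing-based selection of a new transmission resource
        # (the new radio resource information would be shared with the target, S3320).
        return {"subchannel": random.randint(0, 3), "period_ms": 100}

    def transmit_once(self):
        # PSCCH/PSSCH transmission on self.resource would happen here (S3340).
        self.counter -= 1                                  # S3350
        if self.counter != 1:                              # S3360: not yet 1 -> keep transmitting
            return
        p = random.random()                                # S3361: arbitrary value in [0, 1)
        if p > self.prob_resource_keep:                    # S3370/S3371: keep the current resource
            self.counter = random.randint(5, 15)
        else:                                              # S3372: reselect a new resource
            self.counter = 0
            self.resource = self.select_new_resource()
            self.counter = random.randint(5, 15)           # restart the cycle
```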
  • a service provided by an application can be classified by a provider service identifier (PS ID), and a PS ID for each service can follow the definition of IEEE 1609.2.
  • PS ID provider service identifier
  • one of the unused PS IDs (e.g., 0x28 in IEEE 1609.2) is newly defined for a one-to-one communication application.
  • V2X messages used by applications are defined as ASN.1 (abstract syntax notation number one) type in SAE J2735.
  • ASN.1 abstract syntax notation number one
  • In addition to the messages defined in SAE J2735, a new application message is proposed in order to establish a one-to-one V2X connection between specific V2X devices. Detailed information about this may be given in Table 12 below.
  • The one-to-one connection request message is a message from the V2X device that wants to establish a one-to-one connection, requesting a connection to the other V2X device. hid is its own ID and rid is the ID of the target V2X device; whenever the certificate is changed, the ID is changed at the same time.
  • RadioResInfo contains information on the location (time, frequency) and period of the radio resources used for V2X communication.
  • hid is the home identifier (hid) of the transmitting terminal or transmitting device.
  • rid is the remote identifier (rid) of the receiving terminal or receiving device.
  • timestamp indicates the creation time of the corresponding message, and expiryTime may indicate the validity period or expiry period of the corresponding message.
  • DestinationLayer2ID may indicate a link layer identifier identifying a device/terminal that receives a (sidelink communication) frame.
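  • The following is an illustrative sketch of the request message fields summarized in Table 12; the field names follow the description above, while the types and the exact ASN.1 encoding are assumptions.

```python
# Illustrative structure of the one-to-one connection request message fields
# (hid, rid, timestamp, expiryTime, RadioResInfo, DestinationLayer2ID).
# Types are assumptions; the normative definition would be ASN.1 in Table 12.
from dataclasses import dataclass

@dataclass
class RadioResInfo:
    time_offset: int            # time location of the radio resource
    frequency: int              # frequency location (e.g. starting subchannel)
    period_ms: int              # transmission period

@dataclass
class OneToOneConnectionRequest:
    hid: int                    # ID of the sending (home) device
    rid: int                    # ID of the target (remote) device
    timestamp: int              # creation time of the message
    expiry_time: int            # validity/expiry period of the message
    radio_res_info: RadioResInfo
    destination_layer2_id: int  # link layer ID identifying the receiving device
```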
  • HV denotes a home vehicle and RV denotes a remote vehicle, but they may denote any objects capable of performing V2X operations, such as a terminal and a remote terminal, respectively.
  • BSM basic safety message
  • RV1 transmits its own BSM (basic safety message) to RV2 and HV, respectively (S3410), and RV2 transmits its own BSM to HV and RV1, respectively (S3420).
  • the BSM transmitted by RV1 may include a public key for RV1 or a certificate for RV1 for encryption.
  • the BSM transmitted by RV2 may include a public key for RV2 or a certificate for RV2 for encryption.
  • If the HV wants to perform one-to-one communication with RV1, the HV transmits a one-to-one connection request message to RV1 and RV2, respectively (S3440).
  • the one-to-one connection request message transmitted by the HV may include a public key or certificate of the HV for encryption, a symmetric key encrypted by ECIES, and a cipher text encrypted by the symmetric key.
  • rid of the one-to-one connection request message is set to the ID of RV1.
  • the one-to-one connection request message is encrypted with a symmetric encryption key, and the symmetric encryption key is encrypted using the ECIES (asymmetric) algorithm with the public key of RV1 and the private key of the HV.
  • the encrypted symmetric encryption key can be decrypted only if it has the private key of RV1.
  • the HV transmits a security header including a certificate (HV's public key) and an encrypted symmetric encryption key.
  • RV1 decrypts the symmetric encryption key using the public key of the HV included in the one-to-one connection request message and its own private key. RV1 can obtain the contents of the one-to-one connection request message after decrypting the cipher text using the symmetric encryption key. If RV1 accepts the HV's request, it transmits a one-to-one connection setup message, and if RV1 rejects the HV's request, it transmits a one-to-one connection release message. The rid of the one-to-one connection setup message is the ID of the HV, and the RadioResInfo of the one-to-one connection setup message is the radio resource information used by RV1 when transmitting the message.
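  • The following is an ECIES-style sketch of this key handling using ECDH, HKDF, and AES-GCM from the Python 'cryptography' package; it illustrates the principle that only the holder of RV1's private key can recover the symmetric key, and it is not the exact IEEE 1609.2 ECIES construction.

```python
# ECIES-style sketch (not the exact IEEE 1609.2 construction): HV derives a
# key-encryption key from its private key and RV1's public key via ECDH+HKDF,
# wraps a fresh symmetric key with it, and encrypts the message body with that
# symmetric key. RV1 derives the same key-encryption key with its private key
# and HV's public key, unwraps the symmetric key, and decrypts the body.
# Requires the 'cryptography' package.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def kek(own_private, peer_public) -> bytes:
    """Key-encryption key from an ECDH shared secret (HKDF-SHA256)."""
    shared = own_private.exchange(ec.ECDH(), peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"one-to-one-connection").derive(shared)

# Long-term key pairs (their public keys travel in the BSM certificates).
hv_priv = ec.generate_private_key(ec.SECP256R1())
rv1_priv = ec.generate_private_key(ec.SECP256R1())

# HV side: wrap a fresh symmetric key and encrypt the request body with it.
sym_key = AESGCM.generate_key(bit_length=128)
wrap_nonce, msg_nonce = os.urandom(12), os.urandom(12)
wrapped_key = AESGCM(kek(hv_priv, rv1_priv.public_key())).encrypt(wrap_nonce, sym_key, None)
cipher_text = AESGCM(sym_key).encrypt(msg_nonce, b"one-to-one connection request", None)

# RV1 side: unwrap the symmetric key with its own private key, then decrypt.
recovered = AESGCM(kek(rv1_priv, hv_priv.public_key())).decrypt(wrap_nonce, wrapped_key, None)
print(AESGCM(recovered).decrypt(msg_nonce, cipher_text, None))
```

  • A device such as RV2, which does not hold RV1's private key, cannot derive the same key-encryption key and therefore discards the message after a decryption failure, as described above.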
  • RV2 discards the one-to-one connection request message due to decryption failure (discard) (S3450).
  • RV1 may transmit a one-to-one connection setup message to HV and RV2, respectively, as in Alternative 1 of FIG. 34, in which RV1 accepts the one-to-one connection request (S3460).
  • the one-to-one connection setup message transmitted by RV1 may include a ciphertext encrypted by a symmetric key and a specific MAC ID.
  • a specific MAC ID may be included in the MAC header.
  • the MAC ID may be DestinationLayer2ID.
  • DestinationLayer2ID may be provided to the terminal, and in this case, a mapping relationship between the DestinationLayer2ID and the sidelink service (e.g., PS ID, ITS-AID (intelligent transport system-application identifier) of a V2X application, etc.) may be provided together.
  • the terminal can distinguish which service-related message the terminal has received through the MAC ID.
  • the specific MAC ID may be a MAC ID having a mapping relationship with a one-to-one sidelink connection service.
  • RV2 confirms that the MAC IDs do not match and discards the message (S3461).
  • HV and RV1 may be provided with the specific MAC ID, which is an ID mapped to a one-to-one sidelink connection service, and RV2 may not have been provided with the specific MAC ID. Therefore, when the HV transmits a message including a specific MAC ID through a broadcast method, RV1 receiving the specific MAC ID confirms that the message is a one-to-one connection related message through the specific MAC ID. You can receive it. However, since RV2 has not been provided with the specific MAC ID, it can discard the message after confirming that the MAC ID of the message does not match the MAC ID possessed by it.
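  • The following is a minimal sketch of this MAC-ID-based filtering, assuming the provided DestinationLayer2ID-to-service mapping is kept in a simple dictionary; the example MAC ID value is illustrative.

```python
# Minimal sketch: a receiving terminal keeps the DestinationLayer2IDs it has
# been provided with (and their mapped services) and discards frames whose
# MAC ID it does not know.
ONE_TO_ONE_SERVICE = "one-to-one sidelink connection"

class Receiver:
    def __init__(self, mac_id_to_service):
        self.mac_id_to_service = mac_id_to_service   # provided (pre)configuration

    def on_frame(self, destination_layer2_id, payload):
        service = self.mac_id_to_service.get(destination_layer2_id)
        if service is None:
            return None                              # MAC ID mismatch -> discard (as RV2 does)
        return service, payload                      # deliver to the mapped service (as RV1 does)

rv1 = Receiver({0x2A: ONE_TO_ONE_SERVICE})           # 0x2A is an illustrative MAC ID
rv2 = Receiver({})                                   # RV2 was not provided the specific MAC ID
print(rv1.on_frame(0x2A, b"setup"))                  # delivered
print(rv2.on_frame(0x2A, b"setup"))                  # None -> discarded
```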
  • an application message may be transmitted and received between the HV and the RV1 through a one-to-one connection (S3462).
  • all messages are encrypted by a shared symmetric key, and the MAC ID may be shared with a specific MAC ID.
  • RV2 may discard a one-to-one connection-related message based on a mismatch of MAC IDs during a one-to-one connection process.
  • RV2 cannot decrypt the sidelink message because the sidelink message is encrypted based on the private key of HV or RV1.
  • RV2 cannot read the sidelink message based on the one-to-one connection between HV and RV1.
  • RV1 may transmit a one-to-one connection release message to HV and RV2, respectively, as shown in Alternative 2 of FIG. 34, in which RV1 rejects the one-to-one connection request (S3470).
  • the one-to-one connection release message may be OneToOneConnectionRelease of Table 12.
  • FIG. 35 shows an example of a secure message format.
  • the security message may include a security header, a safety message, and a security trailer.
  • Contents included in each of the security header, safety message, and security trailer are the same as described above, and therefore, duplicate descriptions are omitted.
  • The certificate is periodically changed for security purposes. If the certificate is changed in the HV, or if the terminal or V2X device reselects a new radio resource, a one-to-one connection update message is transmitted to change the contents (context) of the one-to-one connection.
  • the security header of the one-to-one connection update message includes the public key of the changed HV, and the new symmetric encryption key is encrypted by the ECIES algorithm.
  • RV1 obtains a symmetric encryption key in the same process as when receiving a one-to-one connection request message.
  • the one-to-one connection update message may include newly selected radio resource information. Even when the RV1's certificate and/or radio resource is changed, the connection context is changed through the same process as above.
  • hid of the one-to-one connection update message may include the ID of the changed HV. Also, if there is an expiry timer, the expiry timer is initialized with the newly received expiryTime.
  • FIG. 36 shows an example of a message transmission/reception procedure for one-to-one connection update. In FIG. 36, it is assumed that a one-to-one connection between the HV and RV1 is established, as in the case of Alternative 1 of FIG. 34, and that RV2 is in a state in which a one-to-one connection with the HV and/or RV1 is not established.
  • the HV confirms that the HV's certificate has been changed, or the HV reselects the radio resource (S3610).
  • the HV transmits a one-to-one connection update message to RV1 and RV2 (S3620).
  • the one-to-one connection update message may be as shown in Table 12.
  • the HV and RV1 in which the one-to-one connection is established update the one-to-one connection content with each other (S3630). Specifically, the HV and RV1 may transmit and receive information about a new certificate or transmit and receive information about a newly selected radio resource.
  • the established one-to-one connection may be released according to various causes to be described later.
  • In FIG. 37, it is assumed that the established connection is a one-to-one connection between the HV and RV1.
  • the established connection may be released by transmitting a one-to-one connection release message in HV or RV1.
  • FIG. 37 illustrates an example in which the HV transmits a one-to-one connection release message in the case of Alternative 1 of FIG. 34.
  • the HV transmits a one-to-one connection release message to RV1 and RV2 (S3710).
  • an implicitly set connection may also be released.
  • a process by which the terminal or the V2X device determines signal detection failure or signal loss will be described later.
  • In FIG. 37, three cases of Alternative 1 to Alternative 3 are illustrated, but it is obvious that various conditions for releasing the one-to-one connection may exist in addition to these.
  • The location (time, frequency) and period of the radio resource are shared between the HV and RV1, and the signal power of the radio resource is measured according to the period.
  • If the measured signal strength is continuously lower than a threshold, the T_Loss timer is started.
  • If the T_Loss timer expires, the terminal determines that the one-to-one connection is lost and releases the connection.
  • If, while the T_Loss timer is running, the signal strength is continuously (T_N_SYNC times) higher than the threshold, the T_Loss timer is stopped and it is determined that the one-to-one connection is restored.
  • the object determining the signal loss may be at least one of a terminal or a V2X device constituting a one-to-one connection.
  • the terminal periodically measures a radio resource signal (S3810). Thereafter, it is determined whether the timer T_Loss operates (S3820).
  • In step (S3820), if the T_Loss timer is running, it is determined whether the signal strength is greater than the threshold (S3821).
  • If the signal strength is greater than the threshold, the counter value N_sync is increased (S3832), and it is determined whether N_sync is greater than T_N_SYNC (S3842). If N_sync is greater than T_N_SYNC, the T_Loss timer is stopped (S3852) and the process returns to step (S3810). If N_sync is not greater than T_N_SYNC, the process returns to step (S3810).
  • If the signal strength is not greater than the threshold, the N_sync counter value is reset (S3831), and it is determined whether the T_Loss timer has expired (S3841). If the T_Loss timer has expired, the terminal determines that the connection or signal is lost and releases the connection (S3851). If the T_Loss timer has not expired, the process returns to step (S3810).
  • In step (S3820), if the T_Loss timer is not running, it is determined whether the signal strength is less than the threshold (S3822).
  • If the signal strength is not less than the threshold, the N_Loss counter is initialized (S3834), and the process returns to step (S3810).
  • If the signal strength is less than the threshold, the N_Loss counter is increased (S3833), and it is determined whether N_Loss is greater than T_N_Loss (S3843). If N_Loss is greater than T_N_Loss, the T_Loss timer is started (S3853) and the process returns to step (S3810). If N_Loss is not greater than T_N_Loss, the process returns to step (S3810).
  • The T_Loss timer is a timer that starts operation when the radio resource signal is not continuously received for a certain period of time. Here, as an example, the case in which the radio resource signal is not continuously received may mean that, among the signals measured during a certain period, there is no signal whose strength is higher than the threshold.
  • the N_Loss counter may be a counter for counting signals whose strength of the received radio resource signal is less than a threshold value
  • the N_sync counter may be a counter for counting signals having a strength of the received radio resource signal greater than the threshold.
  • The T_N_SYNC value is a threshold or reference value compared with the N_sync value to stop the T_Loss timer, and the T_N_LOSS value is a threshold or reference value compared with the N_Loss value to start the T_Loss timer.
  • the T_N_SYNC value and/or the T_N_LOSS value may be a value previously set by a network or the like.
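  • The following is a minimal sketch of the loss-detection flow of FIG. 38, assuming the threshold, T_N_LOSS, T_N_SYNC, and the T_Loss length are (pre)set values and modelling the T_Loss timer in units of measurement periods for simplicity.

```python
# Minimal sketch of the FIG. 38 flow: start T_Loss after the signal has been
# weak for more than T_N_LOSS periods, stop it after the signal has been strong
# for more than T_N_SYNC periods, and release the connection when it expires.
class LossDetector:
    def __init__(self, threshold_dbm, t_n_loss, t_n_sync, t_loss_periods):
        self.threshold = threshold_dbm
        self.t_n_loss = t_n_loss              # T_N_LOSS: condition to start T_Loss
        self.t_n_sync = t_n_sync              # T_N_SYNC: condition to stop T_Loss
        self.t_loss_periods = t_loss_periods  # T_Loss length, in measurement periods
        self.n_loss = 0
        self.n_sync = 0
        self.t_loss_running = False
        self.t_loss_elapsed = 0
        self.connected = True

    def on_measurement(self, rssi_dbm):
        """Called once per measurement period with the measured signal power (S3810)."""
        if not self.connected:
            return
        if self.t_loss_running:                          # S3820: timer is running
            self.t_loss_elapsed += 1
            if rssi_dbm > self.threshold:                # S3821
                self.n_sync += 1                         # S3832
                if self.n_sync > self.t_n_sync:          # S3842/S3852: link recovered
                    self.t_loss_running = False
                    self.t_loss_elapsed = 0
                    self.n_sync = 0
            else:
                self.n_sync = 0                          # S3831
                if self.t_loss_elapsed >= self.t_loss_periods:
                    self.connected = False               # S3851: release the connection
        else:                                            # S3820: timer not running
            if rssi_dbm < self.threshold:                # S3822
                self.n_loss += 1                         # S3833
                if self.n_loss > self.t_n_loss:          # S3843/S3853: start T_Loss
                    self.t_loss_running = True
                    self.t_loss_elapsed = 0
                    self.n_loss = 0
            else:
                self.n_loss = 0                          # S3834
```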
  • FIG. 39 is a flowchart of a one-to-one sidelink connection method performed by a terminal according to an embodiment of the present invention.
  • a terminal receives a one-to-one connection request message from another terminal (S3910).
  • the one-to-one connection request message may include information on the radio resources on which the other terminal transmits.
  • the terminal transmits a one-to-one connection setup message to the other terminal (S3920).
  • the terminal performs a sidelink operation with the other terminal (S3930).
  • during the sidelink operation, the terminal may monitor only the radio resources on which the other terminal transmits.
  • an application message transmitted and received by the terminal during the sidelink operation may be encrypted.
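  • The following is an illustrative end-to-end sketch of the terminal behaviour of FIG. 39, reusing the illustrative message fields sketched earlier; security processing is omitted here for brevity.

```python
# Illustrative sketch: accept a one-to-one connection request addressed to this
# terminal, answer with a setup message carrying this terminal's own radio
# resource information, and afterwards monitor only the peer's radio resource.
from types import SimpleNamespace

class OneToOneEndpoint:
    def __init__(self, own_id, own_radio_res_info):
        self.own_id = own_id
        self.own_rri = own_radio_res_info
        self.peer_rri = None                       # set once a connection is established

    def on_connection_request(self, request):
        if request.rid != self.own_id:
            return None                            # not addressed to us -> discard
        self.peer_rri = request.radio_res_info     # monitor only this resource later
        return {"type": "setup", "hid": self.own_id,
                "rid": request.hid, "radio_res_info": self.own_rri}

    def resources_to_monitor(self):
        # During the sidelink operation only the peer's resource is monitored,
        # instead of the whole reception resource pool.
        return [self.peer_rri] if self.peer_rri else []

req = SimpleNamespace(hid=0x1111, rid=0x2222,
                      radio_res_info={"subchannel": 1, "period_ms": 100})
rv1 = OneToOneEndpoint(own_id=0x2222, own_radio_res_info={"subchannel": 2, "period_ms": 100})
print(rv1.on_connection_request(req))              # setup message with RV1's own resource info
print(rv1.resources_to_monitor())                  # only the HV's advertised resource
```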
  • the invention disclosed in this specification can be used in a variety of services/environments requiring a one-to-one V2X operation.
  • the invention proposed in the present specification can be used in platooning, traction, and the like.
  • the present invention may support more efficient V2X communication in V2X operation with a legacy V2X terminal or in an environment in which V2X unicast is not supported.
  • FIG. 40 shows a wireless communication device according to an embodiment of the present invention.
  • the wireless communication system may include a first device 9010 and a second device 9020.
  • the first device 9010 may be a base station, a network node, a transmitting terminal, a receiving terminal, a wireless device, a wireless communication device, a vehicle, a vehicle equipped with an autonomous driving function, a connected car, a drone (Unmanned Aerial Vehicle, UAV), an AI (Artificial Intelligence) module, a robot, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a Mixed Reality (MR) device, a hologram device, a public safety device, an MTC device, an IoT device, a medical device, a fintech device (or financial device), a security device, a climate/environment device, a device related to 5G services, or a device related to the fourth industrial revolution field.
  • UAV Unmanned Aerial Vehicle
  • AI Artificial Intelligence
  • the second device 9020 may be a base station, a network node, a transmitting terminal, a receiving terminal, a wireless device, a wireless communication device, a vehicle, a vehicle equipped with an autonomous driving function, a connected car, a drone (Unmanned Aerial Vehicle, UAV), an AI (Artificial Intelligence) module, a robot, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a Mixed Reality (MR) device, a hologram device, a public safety device, an MTC device, an IoT device, a medical device, a fintech device (or financial device), a security device, a climate/environment device, a device related to 5G services, or a device related to the fourth industrial revolution field.
  • the terminal may be a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation system, a slate PC, a tablet PC, an ultrabook, or a wearable device (for example, a watch-type terminal (smartwatch), a glass-type terminal (smart glass), or a head mounted display (HMD)).
  • the HMD may be a display device worn on the head.
  • HMD can be used to implement VR, AR or MR.
  • a drone may be a flying vehicle that carries no human and flies by a radio control signal.
  • the VR device may include a device that implements an object or a background of a virtual world.
  • the AR device may include a device that implements an object or background of a virtual world by connecting it to an object or background of the real world.
  • the MR device may include a device that implements an object or background of a virtual world by combining it with an object or background of the real world.
  • the hologram device may include a device that implements a 360-degree stereoscopic image by recording and reproducing stereoscopic information, utilizing the light interference phenomenon generated when two laser beams meet, called holography.
  • the public safety device may include an image relay device or an image device wearable on a user's human body.
  • the MTC device and the IoT device may be devices that do not require direct human intervention or manipulation.
  • the MTC device and the IoT device may include a smart meter, a vending machine, a thermometer, a smart light bulb, a door lock, or various sensors.
  • the medical device may be a device used for the purpose of diagnosing, treating, alleviating, or preventing a disease.
  • the medical device may be a device used for the purpose of diagnosing, treating, alleviating or correcting an injury or disorder.
  • a medical device may be a device used for the purpose of examining, replacing or modifying a structure or function.
  • the medical device may be a device used for the purpose of controlling pregnancy.
  • the medical device may include a device for treatment, a device for surgery, a device for (extra-corporeal) diagnosis, a hearing aid, or a device for a procedure.
  • the security device may be a device installed to prevent a risk that may occur and maintain safety.
  • the security device may be a camera, CCTV, recorder, or black box.
  • the fintech device may be a device capable of providing financial services such as mobile payment.
  • the fintech device may include a payment device or a point of sale (POS) device.
  • the climate/environment device may include a device that monitors or predicts the climate/environment.
  • the first device 9010 may include at least one or more processors such as the processor 9011, at least one or more memories such as the memory 9012, and at least one or more transceivers such as the transceiver 9013.
  • the processor 9011 may perform the functions, procedures, and/or methods described above.
  • the processor 9011 may perform one or more protocols.
  • the processor 9011 may perform one or more layers of an air interface protocol.
  • the memory 9012 is connected to the processor 9011 and may store various types of information and/or commands.
  • the transceiver 9013 may be connected to the processor 9011 and controlled to transmit and receive wireless signals.
  • the transceiver 9013 may be connected to one or more antennas 9014-1 to 9014-n, and may be set to transmit and receive, through the one or more antennas 9014-1 to 9014-n, the user data, control information, radio signals/channels, etc. mentioned in the methods and/or operation flowcharts herein.
  • the n antennas may be the number of physical antennas or the number of logical antenna ports.
  • the second device 9020 may include at least one processor such as the processor 9021, at least one memory device such as the memory 9022, and at least one transceiver such as the transceiver 9023.
  • the processor 9021 may perform the functions, procedures, and/or methods described above.
  • the processor 9021 may implement one or more protocols.
  • the processor 9021 may implement one or more layers of an air interface protocol.
  • the memory 9022 is connected to the processor 9021 and may store various types of information and/or commands.
  • the transceiver 9023 is connected to the processor 9021 and may be controlled to transmit and receive radio signals.
  • the transceiver 9023 may be connected to one or more antennas 9024-1 to 9024-n, and may be set to transmit and receive, through the one or more antennas 9024-1 to 9024-n, the user data, control information, radio signals/channels, etc. mentioned in the methods and/or operation flowcharts herein.
  • the memory 9012 and/or the memory 9022 may be connected inside or outside the processor 9011 and/or the processor 9021, respectively, and may also be connected to other processors through various technologies such as wired or wireless connection.
  • FIG. 41 shows a wireless communication device according to an embodiment of the present invention.
  • FIG. 41 may be a diagram illustrating in more detail the first or second devices 9010 and 9020 of FIG. 40.
  • the wireless communication device in FIG. 41 is not limited to the terminal.
  • the wireless communication device may be any suitable mobile computer device configured to perform one or more implementations of the present invention, such as a vehicle communication system or device, a wearable device, a portable computer, a smart phone, or the like.
  • the terminal may include at least one processor (e.g., a DSP or microprocessor) such as the processor 9110, a transceiver 9135, a power management module 9105, one or more antennas 9140-1 to 9140-n, a battery 9155, a display 9115, a keypad 9120, a Global Positioning System (GPS) chip 9160, a sensor 9165, a memory 9130, (optionally) a subscriber identification module (SIM) card 9125, a speaker 9145, a microphone 9150, and the like.
  • the processor 9110 may be configured to perform the above-described functions, procedures, and/or methods of the present invention. According to an implementation example, the processor 9110 may perform one or more protocols, such as layers of a radio interface protocol.
  • the memory 9130 may be connected to the processor 9110 and may store information related to the operation of the processor 9110.
  • the memory 9130 may be located inside or outside the processor 9110, and may be connected to other processors through various technologies such as wired or wireless connection.
  • a user can input various types of information (eg, command information such as a phone number) by pressing a button on the keypad 9120 or using various techniques such as voice activation using the microphone 9150.
  • the processor 9110 may receive and process user information and perform an appropriate function such as dialing a phone number.
  • the processor 9110 may receive and process GPS information from the GPS chip 9160 in order to perform a function related to the location of the terminal, such as vehicle navigation and map service.
  • the processor 9110 may display various types of information and data on the display 9115 for user's reference or convenience.
  • the transceiver 9135 is connected to the processor 9110 and may transmit and receive radio signals such as RF signals.
  • the processor 9110 may control the transceiver 9135 to initiate communication and transmit a radio signal including various types of information or data such as voice communication data.
  • the transceiver 9135 may include one receiver and one transmitter to send or receive wireless signals.
  • One or more antennas 9140-1 to 9140 -n may facilitate transmission and reception of wireless signals.
  • the transceiver 9135 may forward and convert the signals to a baseband frequency for processing using the processor 9110.
  • the processed signals may be handled according to various technologies, for example converted into audible or readable information to be output through the speaker 9145.
  • the sensor 9165 may be connected to the processor 9110.
  • the sensor 9165 may include one or more sensing devices configured to detect various types of information including, but not limited to, speed, acceleration, light, vibration, proximity, position, image, and the like.
  • the processor 9110 may receive and process sensor information obtained from the sensor 9165, and may perform various types of functions such as collision avoidance and automatic driving.
  • various components may be further included in the terminal.
  • the camera may be connected to the processor 9110 and may be used for various services such as automatic driving and vehicle safety service.
  • FIG. 41 is only an example of a terminal, and implementation is not limited thereto.
  • some components (e.g., the keypad 9120, GPS chip 9160, sensor 9165, speaker 9145, and/or microphone 9150) may not be implemented in some scenarios.
  • FIG. 42 illustrates a transceiver of a wireless communication device according to an embodiment of the present invention.
  • FIG. 42 may show an example of a transceiver that may be implemented in a frequency division duplex (FDD) system.
  • FDD frequency division duplex
  • At least one processor may process data to be transmitted and may transmit a signal such as an analog output signal to the transmitter 9210.
  • the analog output signal at the transmitter 9210 may be filtered by a low pass filter (LPF) 9211, for example to remove noise caused by the preceding digital-to-analog conversion (DAC), upconverted from baseband to RF by an upconverter (e.g., a mixer) 9212, and amplified by an amplifier such as a variable gain amplifier (VGA) 9213.
  • the amplified signal may be filtered by a filter 9214, further amplified by a power amplifier (PA) 9215, routed through the duplexer 9250/antenna switch 9260, and transmitted through the antenna 9270.
  • LPF low pass filter
  • PA power amplifier
  • the antenna 9270 may receive signals in a wireless environment, and the received signals may be routed at the antenna switch 9260/duplexer 9250 and sent to the receiver 9220.
  • the signal received by the receiver 9220 may be amplified by an amplifier such as a low noise amplifier (LNA) 9223, filtered by a band pass filter 9224, and downconverted from RF to baseband by a downconverter (e.g., a mixer) 9225.
  • LNA low noise amplifier
  • the downconverted signal may be filtered by a low pass filter (LPF) 9226, amplified by an amplifier such as VGA 9272 to obtain an analog input signal, and the analog input signal may be processed by one or more processors.
  • LPF low pass filter
  • the local oscillator (LO) generator 9240 may generate transmission and reception LO signals to be provided to the upconverter 9212 and the downconverter 9225, respectively.
  • the phase locked loop (PLL) 9230 may receive control information from the processor, and may send control signals to the LO generator 9240 to transmit/receive LO signals at an appropriate frequency.
  • FIG. 43 illustrates a transceiver of a wireless communication device according to an embodiment of the present invention.
  • FIG. 43 may show an example of a transceiver that may be implemented in a time division duplex communication (TDD) system.
  • TDD time division duplex communication
  • the transmitter 9310 and the receiver 9320 of the transceiver of the TDD system may have one or more similar characteristics to the transmitter and receiver of the transceiver of the FDD system.
  • the structure of the transceiver of the TDD system will be described.
  • the signal amplified by the transmitter's power amplifier (PA) 9315 may be routed through a band select switch 9350, a band pass filter (BPF) 9360, and antenna switch(s) 9370, and transmitted through the antenna 9380.
  • PA power amplifier
  • the antenna 9380 receives signals from the wireless environment, and the received signals may be routed through the antenna switch(s) 9370, the band pass filter (BPF) 9360, and the band select switch 9350, and provided to the receiver 9320.
  • BPF band pass filter
  • the operation of the wireless device related to the sidelink described in FIG. 44 is merely an example, and sidelink operations using various techniques may be performed in the wireless device.
  • the sidelink may be a terminal-to-terminal interface for sidelink communication and/or sidelink discovery.
  • the sidelink may correspond to the PC5 interface.
  • the sidelink operation may be transmission and reception of information between terminals.
  • Sidelinks can carry various types of information.
  • the wireless device may obtain information related to the sidelink.
  • the information related to the sidelink may be one or more resource configurations.
  • Information related to the sidelink can be obtained from other wireless devices or network nodes.
  • the wireless device may decode the information related to the sidelink.
  • the wireless device may perform one or more sidelink operations based on the sidelink-related information.
  • the sidelink operation(s) performed by the wireless device may include one or more operations described herein.
  • FIG. 45 illustrates an operation of a network node related to a sidelink according to an embodiment of the present invention.
  • the operation of the network node related to the sidelink described in FIG. 45 is only an example, and sidelink operations using various technologies may be performed in the network node.
  • the network node may receive information on the sidelink from the wireless device.
  • the information on the sidelink may be sidelink UE information used to inform the network node of the sidelink information.
  • the network node may determine whether to transmit one or more commands related to the sidelink based on the received information.
  • the network node may transmit the command(s) related to the sidelink to the wireless device.
  • the wireless device may perform one or more sidelink operation(s) based on the received command.
  • Network nodes can be replaced by wireless devices or terminals.
  • a wireless device 9610 may include a communication interface 9611 for communicating with one or more other wireless devices, network nodes, and/or other elements in the network.
  • the communication interface 9611 may include one or more transmitters, one or more receivers, and/or one or more communication interfaces.
  • the wireless device 9610 may include a processing circuit 9612.
  • the processing circuit 9612 may include one or more processors such as the processor 9613 and one or more memories such as the memory 9614.
  • the processing circuit 9612 may be configured to control any of the methods and/or processes described herein and/or, for example, to cause the wireless device 9610 to perform such a method and/or process.
  • the processor 9613 may correspond to one or more processors for performing wireless device functions described herein.
  • the wireless device 9610 may include a memory 9614 configured to store data, program software code, and/or other information described herein.
  • the memory 9614 may store software code 9615 including instructions that, when executed by one or more processors such as the processor 9613, cause the processor 9613 to perform some or all of the processes according to the present invention described above.
  • one or more processors such as the processor 9613, which control one or more transceivers such as the transceiver 2223 to transmit and receive information, may perform one or more processes related to transmission and reception of information.
  • the network node 9620 may include a communication interface 9621 for communicating with one or more other network nodes, wireless devices, and/or other elements on the network.
  • the communication interface 9621 may include one or more transmitters, one or more receivers, and/or one or more communication interfaces.
  • the network node 9620 may include a processing circuit 9622.
  • the processing circuit may include a processor 9623 and a memory 9624.
  • the memory 9624 may be configured to store software code 9625 including instructions that, when executed by one or more processors such as the processor 9623, cause the processor 9623 to perform some or all of the processes in accordance with the present invention.
  • one or more processors that control one or more transceivers, such as the transceiver 2213 to transmit and receive information may perform one or more processes related to transmission and reception of information.
  • each structural element or function may be considered selectively.
  • Each of the structural elements or features may be performed without being combined with other structural elements or features. Further, some structural elements and/or features may be combined with each other to constitute implementations of the present invention.
  • the order of operations described in the implementation of the present invention may be changed. Some structural elements or features of one implementation may be included in other implementations, or may be replaced with structural elements or features corresponding to other implementations.
  • Implementations in the present invention may be made by various techniques, for example hardware, firmware, software, or combinations thereof.
  • When implemented by hardware, the method according to the implementation of the present invention may be implemented by one or more Application Specific Integrated Circuits (ASICs), one or more Digital Signal Processors (DSPs), one or more Digital Signal Processing Devices (DSPDs), one or more Programmable Logic Devices (PLDs), one or more Field Programmable Gate Arrays (FPGAs), one or more processors, one or more controllers, one or more microcontrollers, and the like.
  • implementations of the present invention may be implemented in the form of modules, procedures, functions, and the like.
  • the software code can be stored in memory and executed by a processor.
  • the memory may be located inside or outside the processor, and may transmit and receive data from the processor in various ways.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The invention relates to a method by which a first terminal performs a one-to-one sidelink connection in a wireless communication system. The method receives a one-to-one connection request message from a second terminal, the one-to-one connection request message including information on a first radio resource on which the second terminal performs transmission, transmits a one-to-one connection setup message to the second terminal if an identifier associated with a one-to-one sidelink connection with the first terminal is included in the one-to-one connection request message, and performs a sidelink operation with the second terminal. During the sidelink operation, the first terminal monitors only the first radio resource for a reception message that the first terminal receives from the second terminal, and, during the sidelink operation, an application message transmitted or received by the first terminal is encrypted.
PCT/KR2019/009201 2019-07-24 2019-07-24 Procédé par lequel un terminal exécute une connexion de liaison latérale biunivoque dans un système de communication sans fil, et terminal utilisant le procédé Ceased WO2021015345A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2019/009201 WO2021015345A1 (fr) 2019-07-24 2019-07-24 Procédé par lequel un terminal exécute une connexion de liaison latérale biunivoque dans un système de communication sans fil, et terminal utilisant le procédé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2019/009201 WO2021015345A1 (fr) 2019-07-24 2019-07-24 Procédé par lequel un terminal exécute une connexion de liaison latérale biunivoque dans un système de communication sans fil, et terminal utilisant le procédé

Publications (1)

Publication Number Publication Date
WO2021015345A1 true WO2021015345A1 (fr) 2021-01-28

Family

ID=74192955

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/009201 Ceased WO2021015345A1 (fr) 2019-07-24 2019-07-24 Procédé par lequel un terminal exécute une connexion de liaison latérale biunivoque dans un système de communication sans fil, et terminal utilisant le procédé

Country Status (1)

Country Link
WO (1) WO2021015345A1 (fr)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150056075A (ko) * 2013-11-14 2015-05-22 삼성전자주식회사 직접 통신을 수행하는 단말간 페이징 방법 및 장치
KR20150075669A (ko) * 2013-12-26 2015-07-06 전자부품연구원 차량간 통신을 위한 유저 데이터 보안 모듈을 갖는 차량 단말기
WO2016018068A1 (fr) * 2014-07-29 2016-02-04 엘지전자 주식회사 Procédé de transmission d'informations de ressources pour des communications d2d et appareil associé dans un système de communication sans fil
WO2018004322A1 (fr) * 2016-07-01 2018-01-04 엘지전자(주) Procédé d'émission et de réception de données dans un système de communication sans fil et appareil associé

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HUAWEI: "Discussion on sidelink resource allocation mode 1", R1-1903950, 3GPP TSG RAN WG1 MEETING #96BIS, 2 April 2019 (2019-04-02), Xi' an, China, XP051707065 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024073986A1 (fr) * 2023-01-19 2024-04-11 Lenovo (Beijing) Limited Procédé et appareil de transmission de liaison latérale avec de multiples positions de départ candidates
GB2639401A (en) * 2023-01-19 2025-09-24 Lenovo Beijing Ltd Method and apparatus for sidelink transmission with multiple candidate starting positions
CN116863565A (zh) * 2023-05-24 2023-10-10 山西丰鸿实业有限公司 基于密钥的智能门锁控制方法及装置


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19938319

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19938319

Country of ref document: EP

Kind code of ref document: A1