WO2025064149A1 - Protocols for drone landing - Google Patents
- Publication number
- WO2025064149A1 (PCT/US2024/043493)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- drone
- area
- user
- landing
- computing device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/654—Landing
- G05D1/6546—Emergency landing
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/20—Arrangements for acquiring, generating, sharing or displaying traffic information
- G08G5/26—Transmission of traffic-related information between aircraft and ground stations
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/54—Navigation or guidance aids for approach or landing
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/55—Navigation or guidance aids for a single aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2105/00—Specific applications of the controlled vehicles
- G05D2105/20—Specific applications of the controlled vehicles for transportation
- G05D2105/28—Specific applications of the controlled vehicles for transportation of freight
- G05D2105/285—Specific applications of the controlled vehicles for transportation of freight postal packages
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2107/00—Specific environments of the controlled vehicles
- G05D2107/10—Outdoor regulated spaces
- G05D2107/17—Spaces with priority for humans, e.g. populated areas, pedestrian ways, parks or beaches
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D2109/00—Types of controlled vehicles
- G05D2109/20—Aircraft, e.g. drones
- G05D2109/25—Rotorcrafts
- G05D2109/254—Flying platforms, e.g. multicopters
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/57—Navigation or guidance aids for unmanned aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/58—Navigation or guidance aids for emergency situations, e.g. hijacking or bird strikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/59—Navigation or guidance aids in accordance with predefined flight zones, e.g. to avoid prohibited zones
Definitions
- the present disclosure generally relates to drone landings.
- aspects of the present disclosure are related to systems and techniques for providing protocols for drone landing.
- Shipping and delivery of assets (e.g., items or goods) is an important activity for many users (e.g., customers), who may be affiliated with various different business organizations.
- Different types of vehicles can be used to deliver assets, such as trucks, airplanes, among others.
- Drones are becoming common for use within the shipping industry, in addition to other industries, for delivery of items (e.g., parcels) to users (e.g., customers).
- Certain paths where drones fly can be riskier for drone operations than other paths. For example, due to external environmental impacts (e.g., high speed winds, sudden heavy rainfall, and storms), movement of a drone along a path may be impacted. In some cases, the drone may need to change its path to a new path, which may cause damage to the drone and/or cause the drone to lose connection with a ground controller (e.g., a controller operated by an operator of the drone). As such, with some paths (e.g., paths with inclement weather conditions), there can be a high probability that a drone may experience technical issues (e.g., mechanical and/or operational failures), in which case the drone may be required to perform an emergency landing on private or public property. As such, a generalized protocol for landing drones, such as for performing emergency landings or for delivering items, can be beneficial.
- a first computing device associated with a drone includes at least one memory and at least one processor coupled to the at least one memory and configured to: receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determine a path for the drone to travel to the area, based on receiving the authorization message; and cause the drone to follow the path to travel to the area and to land in the area.
- a method of landing a drone includes: receiving, by a first computing device associated with the drone from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determining, by the first computing device associated with the drone, a path for the drone to travel to the area, based on receiving the authorization message; and causing, by the first computing device associated with the drone, the drone to follow the path to travel to the area and to land in the area.
- a non-transitory computer-readable storage medium of a first computing device comprises instructions stored thereon which, when executed by at least one processor, cause the at least one processor to: receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land a drone in an area associated with the user for an unplanned landing of the drone; determine a path for the drone to travel to the area, based on receiving the authorization message; and cause the drone to follow the path to travel to the area and to land in the area.
- a first computing device associated with a drone includes: means for receiving, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; means for determining a path for the drone to travel to the area, based on receiving the authorization message; and means for causing the drone to follow the path to travel to the area and to land in the area.
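- As an illustration of the device-side flow summarized above (receive an authorization message, determine a path, and cause the drone to follow it and land), the following is a minimal Python sketch; the message fields and helper functions (AuthorizationMessage, plan_route, command_drone) are illustrative assumptions, not the implementation described in this disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

Waypoint = Tuple[float, float, float]  # (latitude, longitude, altitude) -- assumed representation


@dataclass
class AuthorizationMessage:
    user_id: str
    area_location: Waypoint  # location of the area associated with the user
    authorized: bool         # True if the user authorizes the unplanned landing


def plan_route(start: Waypoint, end: Waypoint) -> List[Waypoint]:
    """Placeholder route planner: fly level to above the area, then descend to land."""
    return [start, (end[0], end[1], start[2]), end]


def command_drone(path: List[Waypoint]) -> None:
    """Placeholder for the link that causes the drone to follow the path and land."""
    print(f"commanding drone to follow {len(path)} waypoints and land at {path[-1]}")


def handle_authorization(msg: AuthorizationMessage, drone_position: Waypoint) -> List[Waypoint]:
    """Determine a path to the authorized area and cause the drone to follow it."""
    if not msg.authorized:
        raise ValueError("user did not authorize landing in the area")
    landing_point = (msg.area_location[0], msg.area_location[1], 0.0)
    path = plan_route(drone_position, landing_point)
    command_drone(path)
    return path


# Example: the user authorizes an unplanned landing in the area (hypothetical coordinates)
msg = AuthorizationMessage(user_id="user-1", area_location=(37.001, -122.001, 0.0), authorized=True)
handle_authorization(msg, drone_position=(37.0, -122.0, 60.0))
```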
- in another illustrative example, a drone includes at least one transceiver, at least one memory, and at least one processor coupled to the at least one transceiver and the at least one memory.
- the at least one processor is configured to: broadcast, via the at least one transceiver, an authorization request message to an area for landing the drone; receive, via the at least one transceiver, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and cause the drone to travel along the path and land in the area.
- a method of landing a drone includes: broadcasting, by the drone, an authorization request message to an area for landing the drone; receiving, by the drone, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and traveling, by the drone, along the path and landing in the area.
- a non-transitory computer-readable storage medium of a drone comprises instructions stored thereon which, when executed by at least one processor, cause the at least one processor to: broadcast, via at least one transceiver, an authorization request message to an area for landing the drone; receive, via the at least one transceiver, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and cause the drone to travel along the path and land in the area.
- in another illustrative example, a drone includes: means for broadcasting an authorization request message to an area for landing the drone; means for receiving a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and means for causing the drone to travel along the path and land in the area.
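- The drone-side exchange summarized above can be sketched as follows; the broadcast callable, the path inbox, and the timeout are illustrative assumptions rather than a defined interface.

```python
import queue
from typing import Callable, List, Optional, Tuple

Waypoint = Tuple[float, float, float]  # (latitude, longitude, altitude) -- assumed representation


def follow_path_and_land(path: List[Waypoint]) -> None:
    """Placeholder flight control: travel along the waypoints and land at the last one."""
    for waypoint in path:
        print(f"flying to {waypoint}")
    print("landing")


def request_unplanned_landing(broadcast: Callable[[dict], None],
                              path_inbox: "queue.Queue",
                              area_id: str,
                              timeout_s: float = 30.0) -> Optional[List[Waypoint]]:
    """Broadcast an authorization request for the area and wait for a path to follow."""
    broadcast({"type": "AUTH_REQUEST", "area_id": area_id})  # e.g., over sidelink and/or WWAN
    try:
        path = path_inbox.get(timeout=timeout_s)  # path derived from the user's authorization
    except queue.Empty:
        return None  # no authorization received; the caller may try another area
    follow_path_and_land(path)
    return path


# Example: simulate an authorization/path arriving shortly after the broadcast
inbox: "queue.Queue" = queue.Queue()
inbox.put([(37.0, -122.0, 30.0), (37.001, -122.001, 0.0)])
request_unplanned_landing(lambda m: print("broadcast:", m), inbox, area_id="area-7", timeout_s=1.0)
```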
- Aspects generally include a method, apparatus, system, computer program product, non-transitory computer-readable medium, user device, user equipment, wireless communication device, and/or processing system as substantially described with reference to and as illustrated by the drawings and specification.
- Some aspects include a device having a processor configured to perform one or more operations of any of the methods summarized above. Further aspects include processing devices for use in a device configured with processor-executable instructions to perform operations of any of the methods summarized above. Further aspects include a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a device to perform operations of any of the methods summarized above. Further aspects include a device having means for performing functions of any of the methods summarized above.
- FIG. 1 is a block diagram illustrating an example of user equipment (UE), in accordance with aspects of the present disclosure.
- FIG. 2 is a diagram illustrating an example wireless communications system, in accordance with aspects of the present disclosure.
- FIG. 3 is a diagram illustrating an example implementation of a system-on-a-chip (SOC), in accordance with aspects of the present disclosure.
- FIG. 4 is a diagram illustrating an example of an unmanned aerial vehicle (UAV), in accordance with aspects of the present disclosure.
- FIG. 5 is a diagram illustrating an example of airspace of an area, in accordance with aspects of the present disclosure.
- FIG. 6 is a diagram illustrating an example of a system for providing pre-authorization for unplanned drone landings, in accordance with aspects of the present disclosure.
- FIG. 7 is a diagram illustrating an example of a web browser for pre-authorization for unplanned drone landings, in accordance with aspects of the present disclosure.
- FIG. 8 is a diagram illustrating an example of a map for unplanned drone landings, in accordance with aspects of the present disclosure.
- FIG. 9 is a diagram illustrating an example of a system for dynamic real-time authorizing of drone landings, in accordance with aspects of the present disclosure.
- FIG. 10 is a flow chart illustrating an example of a process for wireless communications at a computing device associated with a drone, in accordance with aspects of the present disclosure.
- FIG. 11 is a flow chart illustrating an example of a process for wireless communications at a drone, in accordance with aspects of the present disclosure.
- FIG. 12 is a block diagram illustrating an example of a computing system, which may be employed by the disclosed systems and techniques, in accordance with aspects of the present disclosure.
- drones are popular for use within the shipping industry, and within other industries, for delivery of items (e.g., parcels) to users (e.g., customers).
- Certain paths where drones fly may be riskier for drone operations than other paths. For example, due to external environmental impacts (e.g., high speed winds, sudden heavy rain fall, and storms), a drone’s journey on a path can be impacted.
- the drone may need to change its path to a new or updated path, which can cause damage to the drone and/or cause the drone to lose connection with a ground controller (e.g., a controller operated by an operator of the drone).
- a drone may experience technical issues (e.g., mechanical and/or operational failures) and, as such, the drone may be required to perform an emergency landing on private or public property.
- Flying a drone over private property can be legal in some places, such as places where there is no need for permission to fly drones over private property. However, some states and local municipalities (e.g., in the United States and/or other countries) may have their own laws regarding flying drones over private property.
- the Federal Aviation Administration (FAA) is responsible for civil aviation safety within the United States and regulates the United States airspace. Since the FAA is a federal agency, FAA rules can preempt state and local municipality laws. According to the FAA, the airspace over the United States is divided into two categories: controlled airspace and uncontrolled airspace. Controlled airspace is defined to be located above 400 feet from the ground, and uncontrolled airspace is defined to be located within 400 feet from the ground. The FAA has rules regarding where commercial and recreational drone pilots may and may not fly their drones within the controlled airspace. As such, in the industry today, a generalized protocol for landing drones on private property, such as for performing emergency landings or for delivering items, can be beneficial.
- systems, apparatuses, processes (also referred to as methods), and computer-readable media are described herein for providing protocols for drone landing.
- the systems and techniques involve receiving a pre-authorization from a user (e.g., land owner) for a drone to land in an area (e.g., land) associated with the user, or receiving a dynamic on-the-spot (e.g., real-time) authorization from the user for the drone to land in the area associated with the user.
- the area associated with the user can include a property associated with the user, such as property (e.g., land and/or a house) owned by, leased by, or otherwise under control of the user.
- the systems and techniques involve the drone landing, such as for an emergency landing, in an area without any authorization from a user (e.g., land owner) associated with the area (e.g., land), where an administrator (e.g., a company) associated with the drone can provide some compensation (e.g., money and/or free services) to the user associated with the area (e.g., a land owner of a parcel of land) for the landing of the drone.
- the computing device associated with the drone may provide compensation information (e.g., information related to a compensatory payment and/or services) to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
- a pre-authorization for a drone to land in an area can be previously provided by a user associated with the area (e.g., a land owner of a parcel of land).
- for example, an administrator (e.g., a company) associated with the drone may provide an online sign-up option (e.g., via a web application running on a computing device, such as a computer, associated with the user, or a mobile application running on a computing device, such as a smart phone, associated with the user) for the user (e.g., land owner) to provide consent (e.g., authorization) for a drone associated with the administrator (e.g., company) to land in the area.
- the user may provide restrictions for the landing of the drone in the area.
- the restrictions may include, but are not limited to, a restriction disallowing operation of a camera (e.g., indicating no operation of the camera) on the drone while in the area, a restriction disallowing operation of a sensor(s) (e.g., indicating no operation of the sensor(s)) on the drone while in the area, a speed restriction for the drone (e.g., indicating a reduced speed of the drone and/or drone blades of the drone) while in the area, and/or a restriction indicating no landing of the drone in the area when people are present within the area.
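- The following is a minimal sketch of how such landing restrictions could be represented and checked on the drone; the field names and the people-presence check are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class LandingRestrictions:
    camera_allowed: bool = True              # False: no operation of the camera while in the area
    sensors_allowed: bool = True             # False: no operation of the sensor(s) while in the area
    max_speed_mps: Optional[float] = None    # reduced speed for the drone/drone blades while in the area
    land_when_people_present: bool = True    # False: no landing when people are present within the area


def may_land(restrictions: LandingRestrictions, people_detected: bool) -> bool:
    """Check the people-presence restriction before committing to a landing."""
    return restrictions.land_when_people_present or not people_detected


# Example: a user who disallows camera use and landings while people are present
r = LandingRestrictions(camera_allowed=False, land_when_people_present=False)
print(may_land(r, people_detected=True))   # False
print(may_land(r, people_detected=False))  # True
```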
- a computing device (e.g., server) associated with the administrator (e.g., company) can generate a map (e.g., an emergency map) for unplanned drone landings.
- when a drone needs to make an emergency landing in an area where an administrator (e.g., company) associated with the drone has received a pre-authorization to land from a user associated with the area (e.g., a land owner of a parcel of land), the drone can send (e.g., transmit) a landing notification message to a computing device associated with the user (e.g., a smart phone of the land owner) to notify the user of the emergency landing of the drone in the area.
- when a drone needs to make an emergency landing in an area, and an administrator (e.g., company) associated with the drone has not received a pre-authorization from a user (e.g., land owner) associated with the area, the drone can broadcast (e.g., transmit) an authorization request message to the area (e.g., for reception by any computing devices in the area) requesting consent (e.g., authorization) for the drone to fly over and land in the area.
- a computing device (e.g., server) associated with the administrator (e.g., company) associated with the drone may command the drone to broadcast the authorization request message to the area so that any computing devices in the area can receive the authorization request message.
- the drone may wirelessly broadcast the authorization request message to the area, such as via a sidelink communication and/or a wireless wide area network (WWAN) communication.
- the authorization request message may include additional information including, but not limited to, images of a portion of the area (e.g., land) for landing of the drone, a height of the ground of the area for the landing, parcel information related to the area for the landing, a speed of the drone, and/or nearby object detection information.
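- As a rough illustration of the kind of payload such an authorization request message could carry, the following sketch uses the fields listed above; the structure and field names are assumptions, not a defined message format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class AuthorizationRequest:
    drone_id: str
    images: List[bytes] = field(default_factory=list)   # images of the portion of the area for landing
    ground_height_m: float = 0.0                         # height of the ground of the area for the landing
    parcel_info: str = ""                                # parcel information related to the area
    drone_speed_mps: float = 0.0                         # current speed of the drone
    nearby_objects: List[Tuple[float, float]] = field(default_factory=list)  # detected objects (bearing, range)


# Example request broadcast by a drone needing to land (hypothetical values)
request = AuthorizationRequest(drone_id="drone-42", ground_height_m=12.5, drone_speed_mps=6.0)
print(request)
```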
- when a computing device (e.g., smart phone) associated with the user (e.g., land owner) receives the authorization request message, the user can provide, via the computing device associated with the user, authorization for the landing in the area to the drone itself and/or to a computing device (e.g., server) associated with the administrator (e.g., company) associated with the drone.
- the user may assign someone else as an agent with authority to provide consent (e.g., authorization) on behalf of the user (e.g., land owner) for the landing of the drone in the area.
- when the authorization is provided to the computing device (e.g., server) associated with the administrator (e.g., company), the computing device (e.g., server) associated with the administrator (e.g., company) can perform route planning for the drone to travel to the area and to land in the area.
- the camera or the system associated with the camera can send a notification to the computing device of the user (e.g., via a web application running on the computing device, a mobile application running on the computing device, etc.) to provide authorization for landing the drone in the area.
- the user can then provide user input to the computing device to provide consent (e.g., authorization) for the drone to fly over and land in the area associated with the user (e.g., the land owner of the parcel of land).
- the camera or the system associated with the camera can send a notification to the computing device of the user (e.g., via a web application running on the computing device, a mobile application running on the computing device, etc.) to automatically provide consent (e.g., authorization) for the drone to fly over and land in the area associated with the user (e.g., the land owner of the parcel of land).
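- A minimal sketch of this notification path (notify the user's device, then either wait for manual consent or apply a pre-configured auto-consent policy) is shown below; the callables and the auto_consent flag are illustrative assumptions.

```python
from typing import Callable


def handle_drone_request(notify_user: Callable[[str], bool],
                         send_authorization: Callable[[bool], None],
                         auto_consent: bool) -> bool:
    """Return True if authorization was granted (automatically or by the user)."""
    if auto_consent:
        granted = True  # the user pre-configured automatic consent for drone landings
    else:
        granted = notify_user("A drone requests authorization to land in your area")
    send_authorization(granted)  # sent to the drone itself and/or the administrator's server
    return granted


# Example: manual consent path, with placeholder callables
granted = handle_drone_request(notify_user=lambda text: True,
                               send_authorization=lambda ok: print("authorization sent:", ok),
                               auto_consent=False)
print(granted)
```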
- FIG. 1 is a block diagram of a UE 100, in accordance with aspects of the present disclosure.
- UE 100 may be an example of any of the UEs 212, 213, 214 (as shown in FIG. 2), or network device 620 of FIG. 6 (e.g., a computing device or UE shown in the form of a smart phone), or network device 920 of FIG. 9 (e.g., a computing device or UE shown in the form of a smart phone).
- UE 100 may include a computing platform including a processor 110, memory 111 including software (SW) 112, one or more sensors 113, a transceiver interface 114 for a transceiver 115 (that includes a wireless transceiver 140 and a wired transceiver 150), and a user interface 116.
- the processor 110, the memory 111, the sensor(s) 113, the transceiver interface 114, and the user interface 116 may be communicatively coupled to each other by a bus 120 (which may be configured, e.g., for optical and/or electrical communication).
- One or more of the components shown may be omitted from the UE 100.
- the description may refer only to the processor 110 performing a function, but this includes other implementations such as where the processor 110 executes software and/or firmware.
- the description may refer to the processor 110 performing a function as shorthand for one or more of the processors 130-134 performing the function.
- the description may refer to the UE 100 performing a function as shorthand for one or more appropriate components of the UE 100 performing the function.
- the processor 110 may include a memory with stored instructions in addition to and/or instead of the memory 111. Functionality of the processor 110 is discussed more fully below.
- an example configuration of the UE includes one or more of the processors 130-134 of the processor 110, the memory 111, and the wireless transceiver 140.
- Other example configurations include one or more of the processors 130-134 of the processor 110, the memory 111, the wireless transceiver 140, and one or more of the sensors 113, the user interface 116, and/or the wired transceiver 150.
- the UE 100 may comprise the modem processor 132 that may be capable of performing baseband processing of signals received and down-converted by the transceiver 115 and/or the SPS receiver 181 (discussed below).
- the modem processor 132 may perform baseband processing of signals to be upconverted for transmission by the transceiver 115. Also or alternatively, baseband processing may be performed by the processor 130 and/or the DSP 131. Other configurations, however, may be used to perform baseband processing.
- the UE 100 includes the sensors 113 that may include one or more of various types of sensors, for example, an environmental sensor 160, a status sensor 170, and a position/motion/orientation (PMO) sensor 180.
- the PMO sensor 180 may include one or more sensors from which position and/or motion and/or orientation of the UE 100 may be determined. While each of the sensors 160, 170, 180 may be referred to in the singular, each of the sensors 160, 170, 180 may include more than one sensor, examples of some of which are discussed explicitly herein.
- the sensors 113 may generate analog and/or digital signals, indications of which may be stored in the memory 111 and processed by the processor 110 (e.g., the processor 130, the DSP 131, the video processor 133, and/or the sensor processor 134 as appropriate) in support of one or more applications such as, for example, applications directed to positioning, navigation, and/or resource management.
- the description herein may refer to the processor 110 generally as performing one or more functions that one or more of the processors 130-134 perform.
- the sensor(s) 113 may be used in resource management, relative location measurements, relative location determination, motion determination, etc. Information detected by the sensor(s) 113 may be used to determine how to allocate resources of the UE 100, e.g., transmission power, processing power for transmission and/or reception of communication signals, transmission and/or reception directionality, etc.
- the plural term “resources” is often used throughout the discussion herein, but this term includes the singular as well, i.e., a single resource, e.g., being allocated.
- information detected by the sensor(s) may be used for motion detection, relative displacement, dead reckoning, sensor-based location determination, and/or sensor-assisted location determination.
- the sensor(s) 113 may be useful to determine whether the UE 100 is fixed (stationary) or mobile and/or whether to report certain useful information to the server 243 (as shown in FIG. 2) regarding the mobility of the UE 100. For example, based on the information obtained/measured by the sensor(s) 113, the UE 100 may notify/report to the server 243 of FIG. 2 that the UE 100 has detected movements or that the UE 100 has moved, and report the relative displacement/distance (e.g., via dead reckoning, or sensor-based location determination, or sensor-assisted location determination enabled by the sensor(s) 113).
- the sensor(s) 113 (e.g., an Inertial Measurement Unit (IMU)) can be used to determine the angle, size (e.g., width and/or height), and/or orientation of another device with respect to the UE 100, etc.
- the position and/or motion of the UE 100 may be used in determining resource allocation for communication, e.g., between vehicles and/or drones.
- the UE 100 may, for example, be disposed in or integrated with a vehicle or a drone.
- the UE 100 may be the UE 214 of FIG. 2, which is a vehicle (e.g., a car) in the example shown in FIG. 2, or UE 100 may be the UAV 420 of FIG. 4 in the form of a drone.
- the UE 100 may be configured for various forms of communication, e.g., V2V (vehicle-to-vehicle), V2X (vehicle-to-everything), CV2X (cellular V2X), CV2V (cellular V2V), etc.
- the environmental sensor 160 may include one or more sensors for measuring one or more internal and/or external environmental conditions.
- the environmental sensor 160 includes a camera 161, a microphone 162, an air-flow sensor 163, a temperature sensor 164, a motion sensor 165, and a light-based sensor 166. While each of the sensors 161-166 may be referred to in the singular, each of the sensors 161-166 may include more than one sensor, examples of some of which are discussed explicitly herein.
- the camera 161 may include at least one camera configured (e.g., designed, made, disposed, and directed) to capture images external to the UE 100 and/or may include one or more cameras configured to capture images internal to the UE 100 (e.g., in a passenger compartment of a vehicle).
- the microphone 162, the temperature sensor 164, and/or the motion sensor 165 may include multiple microphones, multiple thermometers, and/or multiple motion detectors configured to detect sound, temperature, and/or motion (respectively) outside and/or inside of the UE 100, e.g., a vehicle or a drone.
- any of the sensors 161-165 may include multiple respective sensors outside the vehicle (or drone) and/or multiple respective sensors inside the vehicle (or drone) for making respective measurements at multiple locations about the vehicle (or drone) and/or in different directions relative to the vehicle (or drone). While this discussion assumes the UE 100 is a vehicle or a drone, the UE 100 may be a different device (i.e., other than a vehicle or a drone).
- the sensors 161-165 are examples and one or more of the sensors 161-165 may be omitted from the UE 100 and/or one or more other sensors may be included in the UE 100.
- the environmental sensor 160 may include one or more barometric pressure sensors and/or one or more ambient light sensors and/or one or more other sensors.
- the camera 161 may be configured for capturing still and/or moving imagery.
- each camera of the camera 161 may comprise, for example, one or more imaging sensors (e.g., a charge coupled device (CCD) or a CMOS imager), one or more lenses, analog-to-digital circuitry, frame buffers, etc. Additional processing, conditioning, encoding, and/or compression of signals representing captured images may be performed by the general-purpose processor 130 and/or the DSP 131. Also or alternatively, the video processor 133 may perform conditioning, encoding, compression, and/or manipulation of signals representing captured images. The video processor 133 may decode/decompress stored image data for presentation on a display device (not shown), e.g., of the user interface 116.
- the motion sensor 165 is configured to detect motion.
- the motion sensor 165 may send and receive sound waves (e.g., ultrasound signals) and analyze the received signals for Doppler effects indicative of motion.
- Use of multiple motion detectors may help identify the relative location (e.g., direction relative to the UE 100) of an object.
- the light-based sensor 166 is configured to determine range to an object, which may be used by the processor 110 to detect the presence of an object. Use of multiple light-based sensors may help identify the relative location (e.g., direction relative to the UE 100) of an object. In some cases, the light-based sensor 166 may be used for detecting relatively small objects, such as vehicles or other artificial (human-made) objects.
- the status sensor 170 is configured to provide one or more indications of one or more UE conditions of the UE 100 indicative of UE status.
- UE conditions where the UE 100 is a vehicle or a drone may include, for example a gear status of the vehicle (e.g., whether the vehicle is in park, drive, or neutral, or in which gear the vehicle is presently (e.g., reverse, first, second, third, fourth, etc.)).
- numerous other UE conditions may be sensed and indicated where the UE 100 is not a vehicle (e.g., when the UE is a drone) or is not associated with a vehicle.
- the PMO sensor 180 may include one or more sensors for providing one or more UE conditions such as, for example, vehicle or drone conditions.
- the PMO sensor 180 may include one or more sensors for measuring information from which position and/or motion and/or orientation of the UE 100 may be determined and possibly determining position and/or motion (e.g., speed and/or direction of motion) and/or orientation of the UE 100.
- the PMO sensor 180 includes a Satellite Positioning System (SPS) receiver 181, a position device (PD) 182, an Inertial Measurement Unit (IMU) 183, and a magnetometer 184.
- the components of the PMO sensor 180 shown are examples, and one or more of these components may be omitted and/or one or more other components included in the PMO sensor 180.
- each of the components 181-184 of the PMO sensor 180 may be referred to in the singular, each of the components 181-184 may include more than one such component, examples of some of which are discussed explicitly herein.
- the PD 182 may be part of the SPS receiver 181 and/or the IMU 183 and/or part of the processor 110, and may not be a sensor itself (e.g., may not take measurements), but may process information from one or more of the sensors 181, 183, 184 and/or one or more other sensors.
- the PMO 180 may be used to determine UE speed and/or direction of motion, e.g., by determining UE location over time (e.g., determined using SPS, one or more ranging sensors, etc.).
- the IMU 183 may comprise one or more inertial sensors, for example, an accelerometer 187 and/or a gyroscope 188.
- each of the sensors 187, 188 may be referred to in the singular, each of the sensors 187, 188 may include more than one sensor.
- the accelerometer may include one or more three-dimensional accelerometers and the gyroscope may include one or more three-dimensional gyroscopes.
- the IMU 183 may be configured to provide measurements about a direction of motion and/or a speed of motion of the UE 100, which may be used, for example, in relative location determination.
- the accelerometer 187 and the gyroscope 188 of the IMU 183 may detect, respectively, a linear acceleration and a speed of rotation of the UE 100.
- the linear acceleration and speed of rotation measurements of the UE 100 may be integrated over time (e.g., by the IMU 183 and/or the PD 182) to determine an instantaneous direction of motion as well as a displacement of the UE 100.
- the instantaneous direction of motion and the displacement may be integrated to track a location of the UE 100.
- a reference location of the UE 100 may be determined, e.g., using the SPS receiver 181 (and/or by some other means) for a moment in time and measurements from the accelerometer 187 and the gyroscope 188 taken after this moment in time may be used in dead reckoning to determine a present location of the UE 100 based on movement (direction and distance) of the UE 100 relative to the reference location.
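- As a simplified, two-dimensional illustration of the dead reckoning described above (not the UE's actual implementation), the following sketch integrates accelerometer and gyroscope samples from a reference location to track displacement.

```python
import math
from typing import Iterable, Tuple

# each sample: (dt_seconds, forward_acceleration_m_s2, yaw_rate_rad_s) -- assumed along-track model
ImuSample = Tuple[float, float, float]


def dead_reckon(reference_xy: Tuple[float, float], initial_heading_rad: float,
                samples: Iterable[ImuSample]) -> Tuple[float, float]:
    """Integrate IMU samples from a known reference location to estimate the present location."""
    x, y = reference_xy
    heading = initial_heading_rad
    speed = 0.0
    for dt, accel, yaw_rate in samples:
        heading += yaw_rate * dt               # integrate gyroscope rotation rate
        speed += accel * dt                    # integrate accelerometer along-track acceleration
        x += speed * math.cos(heading) * dt    # accumulate displacement along the current heading
        y += speed * math.sin(heading) * dt
    return x, y


# Example: constant 1 m/s^2 acceleration with a gentle turn, 10 samples of 0.1 s each
print(dead_reckon((0.0, 0.0), 0.0, [(0.1, 1.0, 0.05)] * 10))
```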
- the magnetometer 184 may determine magnetic field strengths in different directions which may be used to determine orientation of the UE 100, which may be used, for example, to provide a digital compass for the UE 100.
- the magnetometer 184 may include a two-dimensional magnetometer configured to detect and provide indications of magnetic field strength in two orthogonal dimensions. Also or alternatively, the magnetometer 184 may include a three-dimensional magnetometer configured to detect and provide indications of magnetic field strength in three orthogonal dimensions.
- the magnetometer 184 may provide means for sensing a magnetic field and providing indications of the magnetic field (e.g., to the processor 110).
- the magnetometer 184 may provide measurements to determine orientation (e.g., relative to magnetic north and/or true north) that may be used for any of a variety of purposes (e.g., to support one or more compass applications). While referred to in the singular, the magnetometer 184 may include multiple magnetometers.
- the SPS receiver 181 may be capable of receiving and acquiring SPS signals 185 via an SPS antenna 186.
- the antenna 186 is configured to transduce the wireless SPS signals 185 to wired signals (e.g., electrical or optical signals), and may be integrated with the antenna 146.
- the SPS receiver 181 may be configured to process, in whole or in part, the acquired SPS signals 185 for estimating a location of the UE 100. For example, the SPS receiver 181 may be configured to determine location of the UE 100 by trilateration using the SPS signals 185.
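- The following is a simplified two-dimensional trilateration sketch illustrating the idea of estimating a receiver location from ranges to transmitters with known positions; real SPS positioning additionally solves for receiver clock bias and works in three dimensions, so this is an illustration only.

```python
import numpy as np


def trilaterate_2d(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """anchors: (N, 2) known positions; ranges: (N,) measured distances; N >= 3."""
    x1, y1 = anchors[0]
    # Subtracting the first range equation from the others linearizes the problem:
    # 2*(xi - x1)*x + 2*(yi - y1)*y = r1^2 - ri^2 + (xi^2 + yi^2) - (x1^2 + y1^2)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - (x1 ** 2 + y1 ** 2))
    solution, *_ = np.linalg.lstsq(A, b, rcond=None)
    return solution


# Example: recover a known position from exact ranges to three anchors
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
truth = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - truth, axis=1)
print(trilaterate_2d(anchors, ranges))  # approximately [3., 4.]
```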
- the general-purpose processor 130, the memory 111, the DSP 131 and/or one or more specialized processors may be utilized to process acquired SPS signals, in whole or in part, and/or to calculate an estimated location of the UE 100, in conjunction with the SPS receiver 181.
- the memory 111 may store indications (e.g., measurements) of the SPS signals 185 and/or other signals (e.g., signals acquired from the wireless transceiver 140) for use in performing positioning operations.
- the general-purpose processor 130, the DSP 131, and/or one or more specialized processors, and/or the memory 111 may provide or support a location engine for use in processing measurements to estimate a location of the UE 100. Also or alternatively, some or all of the position determination signal processing may be performed by the PD 182.
- the position device (PD) 182 may be configured to determine a position of the UE 100 (including absolute and/or relative position of the UE 100), motion of the UE 100, and/or time.
- the PD 182 may communicate with, and/or include some or all of, the SPS receiver 181.
- the PD 182 may use measurements from the SPS receiver 181 and/or the IMU 183 and/or the magnetometer 184 to determine position and/or motion of the UE 100, e.g., using trilateration and/or dead reckoning.
- the PD 182 may work in conjunction with the processor 110 and the memory 111 as appropriate to perform at least a portion of one or more positioning methods (to determine location of the UE 100), although the description herein may refer only to the PD 182 being configured to perform, or performing, one or more operations in accordance with the positioning method(s).
- the PD 182 may also or alternatively be configured to determine location of the UE 100 using terrestrial-based signals (e.g., at least some of signals 148 discussed below) for trilateration, for assistance with obtaining and using the SPS signals 185, or both.
- the PD 182 may be configured to use one or more other techniques (e.g., relying on the UE’s self-reported location (e.g., part of the UE’s position beacon)) for determining the location of the UE 100, and may use a combination of techniques (e.g., SPS and terrestrial positioning signals) to determine the location of the UE 100.
- the PD 182 may be configured to provide indications of uncertainty and/or error in the determined position and/or motion.
- Functionality of the PD 182 may be provided in a variety of manners and/or configurations (e.g., by the general purpose/application processor 130, the transceiver 115, the SPS receiver 181, and/or another component of the UE 100, and may be provided by hardware, software, firmware, or various combinations thereof).
- the transceiver 115 may include a wireless transceiver 140 and/or a wired transceiver 150 configured to communicate with other devices through wireless connections and wired connections, respectively.
- the wireless transceiver 140 may include a wireless transmitter 142 and a wireless receiver 144 coupled to one or more antennas 146 for transmitting (e.g., on one or more uplink channels and/or one or more sidelink channels) and/or receiving (e.g., on one or more downlink channels and/or one or more sidelink channels) wireless signals 148 and transducing signals from the wireless signals 148 to wired (e.g., electrical and/or optical) signals and from wired signals to the wireless signals 148.
- the wireless transceiver 140 may be configured for wireless communication to send communications to, and receive communications from, a variety of entities such as other UEs, base stations, etc.
- the wireless transmitter 142 may include multiple transmitters that may be discrete components or combined/integrated components
- the wireless receiver 144 may include multiple receivers that may be discrete components or combined/integrated components.
- the wireless transceiver 140 may be configured to communicate signals (e.g., with Transmission/Reception Points (TRPs) and/or one or more other devices) according to a variety of radio access technologies (RATs) such as 5G New Radio (NR), GSM (Global System for Mobiles), UMTS (Universal Mobile Telecommunications System), AMPS (Advanced Mobile Phone System).
- the wired transceiver 150 may include a wired transmitter 152 and a wired receiver 154 configured for wired communication, e.g., a network interface that may communicate with the network 230 of FIG. 2, e.g., to send communications to, and receive communications from, a gNode B (gNB), for example.
- the wired transmitter 152 may include multiple transmitters that may be discrete components or combined/integrated components, and/or the wired receiver 154 may include multiple receivers that may be discrete components or combined/integrated components.
- the wired transceiver 150 may be configured, e.g., for optical communication and/or electrical communication.
- the transceiver 115 may be communicatively coupled to the transceiver interface 114, e.g., by optical and/or electrical connection.
- the transceiver interface 114 may be at least partially integrated with the transceiver 115.
- the wireless transceiver 140 may be configured for beam management to affect directionality of the wireless transceiver 140, e.g., of the antenna 146.
- the wireless transceiver 140 may be configured to implement beam forming for transmission and/or reception of the signals 148.
- the antenna 146 may include multiple antennas that are configured, e.g., designed, made, disposed, and directed to point in different directions relative to a body of the UE 100.
- One or more of such antennas may be capable of electronic beam steering (e.g., using appropriate phase shifts of elements of the antenna) and/or mechanical beam steering.
- the transceiver 140 may be configured to selectively (e.g., under direction/control of the processor 110) transmit from one or more antennas and/or to selectively process signals (e.g., to pass from the transceiver 115 to the processor 110 or to process by the processor 110) received from one or more antennas.
- the user interface 116 may comprise one or more of several devices such as, for example, a speaker, microphone, display device, vibration device, keyboard, touch screen, etc.
- the user interface 116 may include more than one of any of these devices.
- the user interface 116 may be configured to enable a user to interact with one or more applications hosted by the UE 100.
- the user interface 116 may store indications of analog and/or digital signals in the memory 111 to be processed by DSP 131 and/or the general-purpose processor 130 in response to action from a user.
- applications hosted on the UE 100 may store indications of analog and/or digital signals in the memory 111 to present an output signal to a user.
- the user interface 116 may include an audio input/output (I/O) device comprising, for example, a speaker, a microphone, digital-to-analog circuitry, analog-to-digital circuitry, an amplifier and/or gain control circuitry (including more than one of any of these devices). Other configurations of an audio I/O device may be used. Also or alternatively, the user interface 116 may comprise one or more touch sensors responsive to touching and/or pressure, e.g., on a keyboard and/or touch screen of the user interface 116.
- FIG. 2 illustrates an example wireless communications system 210, in accordance with aspects of the present disclosure.
- Wireless communications system 210 includes a user equipment (UE) 212, a UE 213, a UE 214, base transceiver stations (BTSs) 220, 221, 222, 223, a network 230, a core network 240, an external client 250, and a roadside unit (RSU) 260.
- the core network 240 (e.g., a 5G core network (5GC)) may include back-end devices including, among other things, an Access and Mobility Management Function (AMF) 241, a Session Management Function (SMF) 242, a server 243, and a Gateway Mobile Location Center (GMLC) 244.
- the AMF 241, the SMF 242, the server 243, and the GMLC 244 are communicatively coupled to each other.
- the server 243 may be, for example, a Location Management Function (LMF) that supports positioning of the UEs 212-214 (e.g., using techniques such as Assisted Global Navigation Satellite System (A-GNSS), OTDOA (Observed Time Difference of Arrival, e.g., Downlink (DL) OTDOA and/or Uplink (UL) OTDOA), Round Trip Time (RTT), Multi-Cell RTT, and/or Real Time Kinematic (RTK)).
- the RSU 260 may be configured for communication (e.g., bi-directional or uni-directional communication) with the UEs 212-214.
- the RSU 260 may be configured with similar communication capabilities to any of the BTSs 220-223, but perhaps with different functionality(ies) (e.g., different programming).
- the system 210 may include more than one RSU, or may not include any RSUs.
- the communication system 210 may include additional or alternative components.
- the communication system 210 may utilize information from a constellation 280 of satellite vehicles (SVs) 281, 282, 283.
- the constellation 280 may correspond to a respective Global Navigation Satellite System (GNSS) (i.e., Satellite Positioning System (SPS)) such as the Global Positioning System (GPS), the GLObal NAvigation Satellite System (GLONASS), Galileo, Beidou, or some other local or regional SPS such as the Indian Regional Navigational Satellite System (IRNSS), the European Geostationary Navigation Overlay Service (EGNOS), or the Wide Area Augmentation System (WAAS).
- Only three SVs are shown for the constellation 280, but constellations of GNSS SVs typically include many more SVs.
- An LMF may also be referred to as a Location Manager (LM), a Location Function (LF), a commercial LMF (CLMF), or a value-added LMF (VLMF).
- the server 243 (e.g., an LMF) may communicate with one or more other devices of the system 210 (e.g., one or more of the UEs 212-214).
- the server 243 may communicate directly with the BTS 221 (e.g., a gNB) and/or one or more other BTSs, and may be integrated with the BTS 221 and/or one or more other BTSs.
- the SMF 242 may serve as an initial contact point of a Service Control Function (SCF) (not shown) to create, control, and delete media sessions.
- the server 243 (e.g., an LMF) may be co-located or integrated with a gNB or a TRP (Transmission/Reception Point), or may be disposed remote from the gNB and/or TRP and configured to communicate directly or indirectly with the gNB and/or the TRP.
- the AMF 241 may serve as a control node that processes signaling between the UEs 212-214 and the core network 240, and provides QoS (Quality of Service) flow and session management.
- the AMF 241 may support mobility of the UEs 212-214 including cell change and handover and may participate in supporting signaling connection to the UEs 212-214.
- the system 210 is capable of wireless communication in that components of the system 210 can communicate with one another (at least some times using wireless connections) directly or indirectly, e.g., via the BTSs 220-223 and/or the network 230 (and/or one or more other devices not shown, such as one or more other base transceiver stations). While the BTSs 220-223 are shown separately from the network 230, the network 230 may include one or more of the BTSs 220-223 and may constitute a Radio Access Network (RAN), e.g., a New Radio (NR) RAN which may also be called a Fifth Generation (5G) Next Generation (NG) RAN (NG-RAN).
- the communications may be altered during transmission from one entity to another, e.g., to alter header information of data packets, to change format, etc.
- the UEs 212-214 may communicate with the BTSs 220-223 via Uu interfaces, e.g., in RRC-encapsulated LPP messages (Radio Resource Control encapsulated LTE Positioning Protocol messages) over Uu interfaces.
- the UEs 212-214 shown are a smartphone, a tablet computer, and a vehicle-based device, but these are examples only as the UEs 212-214 are not required to be any of these configurations, and other configurations of UEs may be used (e.g., one or more of the UEs may be in the form of a drone).
- such other devices may include internet of things (IoT) devices, UAVs, drones, medical devices, home entertainment and/or automation devices, etc.
- the core network 240 may communicate with the external client 250 (e.g., a computer system), e.g., to allow the external client 250 to request and/or receive location information regarding the UEs 212-214 (e.g., via the GMLC 244).
- V2X communications may be cellular (Cellular-V2X (C-V2X)) and/or WiFi (e.g., DSRC (Dedicated Short-Range Communications)).
- the system 210 may support operation on multiple carriers (waveform signals of different frequencies).
- Multi-carrier transmitters can transmit modulated signals simultaneously on the multiple carriers.
- Each modulated signal may be a Code Division Multiple Access (CDMA) signal, a Time Division Multiple Access (TDMA) signal, an Orthogonal Frequency Division Multiple Access (OFDMA) signal, a Single-Carrier Frequency Division Multiple Access (SC-FDMA) signal, etc.
- Each modulated signal may be sent on a different carrier and may carry pilot, overhead information, data, etc.
- the BTSs 220-223 may wirelessly communicate with the UEs 212-214 in the system 210.
- a BTS may also be referred to as a base station, an access point, a gNode B (gNB), an access node (AN), a Node B, an evolved Node B (eNB), etc.
- each of the BTSs 220, 221 may be a gNB or a transmission point gNB
- the BTS 222 may be a macro cell (e.g., a high-power cellular base station) and/or a small cell (e.g., a low- power cellular base station)
- the BTS 223 may be an access point (e.g., a short-range base station configured to communicate using a short-range technology such as WiFi).
- One or more of the BTSs 220-223 may be configured to communicate with the UEs 212-214 via multiple carriers.
- Each of the BTSs 220, 221 may provide communication coverage for a respective geographic region, e.g. a cell.
- Each cell may be partitioned into multiple sectors as a function of the base station antennas.
- a pico TRP may cover a relatively small geographic area (e.g., a pico cell) and may allow unrestricted access by terminals with service subscription.
- a femto or home TRP may cover a relatively small geographic area (e.g., a femto cell) and may allow restricted access by terminals having association with the femto cell (e.g., terminals for users in a home).
- the UEs 212-214 may be configured to connect indirectly to one or more communication networks via one or more device-to-device (D2D) peer-to-peer (P2P) links.
- the D2D P2P links may be supported with any appropriate D2D radio access technology (RAT), such as LTE Direct (LTE-D), WiFi Direct (WiFi-D), Bluetooth®, Ultrawideband (UWB), and so on.
- One or more of a group of the UEs 212-214 utilizing D2D communications may be within a geographic coverage area of a TRP such as one or more of the BTSs 220-223. Other UEs in such a group may be outside such geographic coverage areas, or be otherwise unable to receive transmissions from a base station.
- Groups of the UEs 212-214 communicating via D2D communications may utilize a one-to-many (1:M) system in which each UE may transmit to other UEs in the group.
- a TRP of the BTSs 220-223 may facilitate scheduling of resources for D2D communications.
- D2D communications may be carried out between UEs without the involvement of a TRP.
- FIG. 3 illustrates an example implementation of a system-on-a-chip (SOC) 300, which may include a central processing unit (CPU) 302 or a multi-core CPU, configured to perform one or more of the functions described herein.
- the SOC 300 may be implemented within a computing device (e.g., network device 620 of FIG. 6 in the form of a UE as a smart phone, or network device 920 of FIG. 9 in the form of a UE as a smart phone) and/or a drone (e.g., network device 610 of FIG. 6 in the form of a drone, or network devices 930a, 930b of FIG. 9 in the form of a drone).
- Parameters or variables (e.g., neural signals and synaptic weights), system parameters associated with a computational device (e.g., a neural network with weights), delays, frequency bin information, and task information, among other information, may be stored in a memory block associated with a neural processing unit (NPU), a graphics processing unit (GPU), a digital signal processor (DSP), the CPU 302, or the memory block 318.
- Instructions executed at the CPU 302 may be loaded from a program memory associated with the CPU 302 or may be loaded from a memory block 318.
- SOC 300 and/or components thereof may be configured to evaluate environmental conditions.
- the sensor processor 314 may receive and/or process information from one or more sensors 322.
- sensors 322 may include one or more Inertial Measurement Units (IMUs) (e.g., an accelerometer, a gyroscope, etc.), temperature sensors, light sensors, shock sensors, humidity sensors, acceleration sensors, speed sensors, tilt angle sensors, and/or other sensors of a device.
- the sensors 322 may be located on SOC 300.
- the sensor processor 314 may also be coupled to one or more sensors (not shown) that are external to the SOC 300 (e.g., located on a separate chip).
- the sensor processor 314 may also receive, as input, output of one or more processing blocks of the connectivity block 310.
- FIG. 4 is a diagram illustrating an unmanned aerial vehicle (UAV) 420 in the form of a drone, which may perform deliveries of items (e.g., parcels) for users (e.g., customers).
- the UAV 420 illustrated in FIG. 4 may be an example of a drone that follows the disclosed protocols for drone landings, such as for performing emergency landings or for delivering items.
- the UAV 420 includes a VL camera 410 adjacent to an IR camera 430 along a front portion of a body of the UAV 420.
- the UAV 420 includes multiple propellers 425 along the top of the UAV 420 (which can be a means for traveling along a path).
- the propellers 425 may be spaced apart from the body of the UAV 420 by one or more appendages to prevent the propellers 425 from snagging on circuitry on the body of the UAV 420 and/or to prevent the propellers 425 from occluding the view of the VL camera 410 and/or the IR camera 430.
- the propellers 425 may act as a conveyance of the UAV 420, and may be motorized using one or more motors. The motors, and thus the propellers 425, may be actuated to move the UAV 420 via a movement actuator housed within the body of the UAV 420.
- a drone may experience technical issues (e.g., mechanical and/or operational failures) and, as such, the drone can be required to perform an emergency landing on private or public property.
- the FAA has rules regarding where commercial and recreational drone pilots are able to fly and not fly their drones within the controlled airspace.
- the FAA has laws and regulations for recreational pilots flying their drones. These FAA laws and regulations include flying the drone while maintaining a visual line of sight to the drone, flying the drone for recreational purposes (not for earning money), not flying the drone at night if the drone does not have the recommended lights, properly registering the drone before flying it, ensuring the drone does not weigh more than fifty-five (55) pounds unless certified by a community-based organization, not flying the drone over people or moving vehicles, not flying the drone near emergency response activities, and not flying the drone near any crewed aircraft.
- the FAA also has some laws and regulations for commercial drone pilots flying their drones.
- FAA laws and regulations include requiring a pilot to obtain an FAA-issued remote pilot certificate in order to fly a drone commercially, flying the drone at a speed lower than 100 miles per hour (mph), flying the drone only in Class G designated airspace, always giving way to crewed aircraft, and not flying the drone off of a moving vehicle, except when in a low-populated area.
- a generalized protocol for landing drones, such as for performing emergency landings or for delivering items, on private property can be useful.
- systems and techniques are described herein for providing protocols for drone landing.
- the systems and techniques involve receiving a pre-authorization from a user (e.g., a land owner) for a drone to land in an area (e.g., land) associated with the user.
- a pre-authorization for a drone to land in an area can be previously provided by a user associated with the area (e.g., a land owner of a parcel of land).
- an administrator (e.g., a company associated with the drone) may provide an online sign-up option (e.g., a web application running on a computing device, such as a computer, associated with the user, or a mobile application running on a computing device, such as a smart phone, associated with the user) through which a user (e.g., a land owner) can provide consent (e.g., authorization) for the landing of a drone associated with the administrator (e.g., the company).
- the user can provide restrictions for the landing of the drone in the area.
- the restrictions may include, but are not limited to, a restriction indicating no operation of a camera on the drone while in the area, a restriction indicating no operation of a sensor(s) on the drone while in the area, a restriction indicating a reduced speed of the drone (e.g., drone blades) while in the area, and/or a restriction indicating no landing of the drone in the area when people are present within the area.
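- A minimal sketch, assuming a simple record of the user-chosen restrictions (the field names below are illustrative, not a defined schema), of how a drone controller might apply those restrictions when operating in the area:

```python
from dataclasses import dataclass

@dataclass
class LandingRestrictions:
    """User-chosen restrictions for landings in the user's area (illustrative fields)."""
    no_camera: bool = False
    no_sensors: bool = False
    max_speed_mps: float | None = None   # reduced-speed restriction, if any
    no_landing_when_people_present: bool = False

def apply_restrictions(r: LandingRestrictions, people_present: bool,
                       requested_speed_mps: float) -> dict:
    """Return the operating profile the drone should adopt inside the area."""
    if r.no_landing_when_people_present and people_present:
        return {"landing_allowed": False}
    speed = requested_speed_mps
    if r.max_speed_mps is not None:
        speed = min(speed, r.max_speed_mps)
    return {
        "landing_allowed": True,
        "camera_enabled": not r.no_camera,
        "sensors_enabled": not r.no_sensors,
        "speed_mps": speed,
    }

# Example: camera off and reduced speed while in the area
profile = apply_restrictions(
    LandingRestrictions(no_camera=True, max_speed_mps=3.0),
    people_present=False, requested_speed_mps=8.0)
print(profile)  # camera disabled, speed capped at 3.0 m/s
```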
- a computing device (e.g., a server) associated with the administrator (e.g., the company) can generate a map (e.g., an emergency landing map) based on the pre-authorizations received from users.
- when a drone needs to make an emergency landing in an area where an administrator (e.g., company) associated with the drone has received a pre-authorization to land from a user associated with the area (e.g., a land owner of a parcel of land), the drone may send (e.g., transmit) a landing notification message to a computing device associated with the user (e.g., a smart phone of the land owner) to notify the user of the emergency landing of the drone in the area.
- FIG. 6 is a diagram illustrating an example of a system 600 for providing preauthorization for unplanned drone landings.
- the system 600 is shown to include a network device 610 (e.g., in the form of a drone), a network device 620 (e.g., a computing device or UE in the form of a smart phone), a network device 640 (e.g., in the form of a base station, such as a gNB), and a network entity 630 (e.g., a computing device in the form of a server).
- the network device 610, shown to be in the form of a drone, may be associated with the network entity 630 and an administrator (e.g., a company, such as a drone company, that owns the drone and uses the drone to deliver items to customers).
- the network device 610 may be in the form of other types of remotely controlled devices, such as other types of robotic devices or vehicles.
- the network device 620 may be associated with a user (e.g., a customer who is purchasing one or more items from the company to be delivered to a location of the user's choice).
- the network device 620 may be in the form of other types of computing devices that may be associated with a user (e.g., a customer), such as a tablet, a wearable device, a smart watch, a laptop computer, and a desktop computer.
- the network device 620 may be located within an area 650 that is associated with the user (e.g., customer).
- the area 650 may be property (e.g., private property) owned by the user (e.g., customer).
- the user may choose to have the one or more items that the user has purchased from the company to be delivered to the area 650 (e.g., private property).
- the network entity 630, shown to be in the form of a server, may be associated with an administrator (e.g., a company that owns the drone and uses the drone to deliver items to customers).
- the system 600 may include more or fewer network devices 610, 620, 640 and/or network entities 630 than as shown in FIG. 6.
- the system 600 may include different types of network devices 610, 620, 640 and/or network entities 630 than as shown in FIG. 6.
- an administrator (e.g., a company that owns drones for delivery of items to customers, referred to as a drone company) may provide online sign-up options for a user (e.g., a property owner, which may be a customer of the company) to authorize (e.g., to provide a pre-authorization for) the landing of one or more drones in an area (e.g., private property owned by the user) for an unplanned landing of the drone(s).
- the unplanned landing may be any landing of a drone that is not scheduled, such as an emergency landing of a drone, a landing of a drone for refueling of the drone, a landing of a drone for providing maintenance to the drone, a landing of a drone due to inclement weather, and a landing of a drone due to congested airspace.
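- The unplanned-landing categories listed above could be represented, for example, as a simple enumeration; the enum below is an illustrative convenience, not part of the disclosure:

```python
from enum import Enum, auto

class UnplannedLandingReason(Enum):
    """Reasons a landing may be unscheduled, per the categories described above."""
    EMERGENCY = auto()
    REFUELING = auto()
    MAINTENANCE = auto()
    INCLEMENT_WEATHER = auto()
    CONGESTED_AIRSPACE = auto()

print(UnplannedLandingReason.EMERGENCY.name)  # EMERGENCY
```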
- the web browser and/or mobile application may have stored within its database information associated with the user (e.g., the property owner), such as the address (e.g., property address) of the user, the contact information (e.g., email address and/or phone number) of the user, the shopping history of the user, etc.
- FIG. 7 shows an example of a web browser that provides these online sign-up options to a user.
- FIG. 7 is a diagram illustrating an example of a web browser 700 for pre-authorization for unplanned drone landings.
- the web browser 700 is shown to include a search box 710 for searching keywords and terms, and a login and security dropdown menu 720.
- a number of options are presented to the user that can include, but are not limited to, a login option 730, a sign up for drone emergency (or unplanned) landings option 740, a switch accounts option 750, a your addresses option 760, a manage saved identifications (IDs) option 770, and a manage your profiles option 780.
- the administrator (e.g., the drone company) can allow (e.g., via a web browser, such as the web browser 700, or via a mobile application) the user (e.g., a property owner, which may be a customer of the company) to set restrictions for landing drones on the user's property (e.g., area 650).
- the restrictions may include, but are not limited to, a restriction indicating no operation of a camera on the drone while the drone is located within the property (e.g., in the area 650), a restriction indicating no operation of at least one sensor on the drone while the drone is located within the property (e.g., in the area 650), a restriction indicating a reduced speed for the drone while the drone is located within the property (e.g., in the area 650), and/or a restriction indicating no landing of the drone on the property (e.g., in the area 650) when people are present on the property (e.g., in the area 650).
- the administrator (e.g., the drone company) can offer (e.g., via a web browser, such as the web browser 700, or via a mobile application) compensation to the user for providing the pre-authorization. The compensation can include, but is not limited to, money, one or more free services, one or more free items, and/or one or more discounts on services and/or items.
- a user may use a computing device (e.g., network device 620) associated with the user to access a web browser (e.g., web browser 700 of FIG. 7) or application to sign up to provide pre-authorization for unplanned landings of drones on property (e.g., area 650) owned by the user (e.g., property owner).
- the computing device can send (e.g., via wireless signals 605, 615 via the network device 640, and wired signal 625 via the internet 660) an authorization message to the network entity 630 (e.g., a computing device associated with the drone).
- the authorization message can include the authorization by the user to land drones in the property (e.g., area 650) owned by the user (e.g., property owner) for unplanned landings of drones (and/or for scheduled landings for deliveries) and can, optionally, include one or more restrictions chosen by the user (e.g., property owner).
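- One way such an authorization message could be serialized for transmission from the network device 620 to the network entity 630 is sketched below; the JSON field names are assumptions for illustration rather than a defined wire format:

```python
import json

def build_authorization_message(user_id: str, area_id: str,
                                allow_unplanned: bool,
                                restrictions: dict | None = None) -> str:
    """Serialize a pre-authorization (and optional restrictions) as JSON."""
    message = {
        "type": "landing_authorization",
        "user_id": user_id,
        "area_id": area_id,
        "allow_unplanned_landings": allow_unplanned,
        "restrictions": restrictions or {},
    }
    return json.dumps(message)

# Example: pre-authorize unplanned landings in area 650 with a no-camera restriction
payload = build_authorization_message(
    user_id="property-owner-001", area_id="area-650",
    allow_unplanned=True, restrictions={"no_camera": True})
print(payload)
```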
- after the network entity 630 receives the authorization message (e.g., which includes the pre-authorization and, optionally, the restrictions) from the network device 620 (e.g., the computing device associated with the user, who is the property owner), the network entity 630 can determine route planning for drones travelling in the area 650 owned by the user (e.g., property owner) based on information within the authorization message.
- the network entity 630 can determine the route planning additionally based on pre-authorizations from other users (e.g., other property owners) for unplanned drone landings on their properties.
- the network entity 630 can determine the route planning additionally based on a planned delivery of one or more drones of one or more items to one or more locations, which may be different than the area 650.
- the network entity 630 (e.g., a computing device associated with the drone) can generate a map for unplanned drone landings.
- the map can include one or more paths (e.g., determined by the network entity 630) for one or more drones to travel to one or more areas (e.g., area 650), which have owners that have preauthorized drone landings for those one or more areas.
- One of the paths (e.g., determined by the network entity 630) can be a path for the network device 610 (e.g., drone) to travel to the area 650 (e.g., for an unplanned landing or for a delivery of one or more items).
- FIG. 8 shows an example of a map for unplanned drone landings.
- FIG. 8 is a diagram illustrating an example of a map 800 for unplanned drone landings (e.g., an emergency landing map).
- the map 800 is shown to include multiple areas 830, 840, 850, 860, 870, 880.
- Property owners (e.g., users) of the areas 840, 850, 860, 880 have provided pre-authorization for unplanned drone landings. These areas 840, 850, 860, 880 are referred to as “opt in” areas.
- Property owners (e.g., users) of the areas 830, 870 have not provided preauthorization for unplanned drone landings.
- These areas 830, 870 are referred to as “opt out” areas.
- a drone 810 may have a planned delivery (e.g., a scheduled delivery) to deliver one or more items to a destination 820 (e.g., a house) located within the area 880, which is designated as an "opt in" area.
- the network entity (e.g., the computing device associated with the drone) may initially determine an original path 805 for the drone 810 to travel to the destination 820. The original path 805 is shown to span over areas 860, 870, 880. Because area 870 is an "opt out" area, drones cannot perform any unplanned landings within area 870.
- the network entity (e.g., the computing device associated with the drone) can therefore determine a different path (e.g., an updated path 815) for the drone 810 to travel to the destination 820 such that the drone 810 will only travel over "opt in" areas.
- the updated path 815 is shown to span over areas 860, 840, 850, 880. Since all of the areas 860, 840, 850, 880 in the updated path 815 are "opt in" areas, drones may perform unplanned landings in any of the areas 860, 840, 850, 880, if needed.
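- A minimal sketch of the kind of path selection illustrated in FIG. 8, assuming the areas are modeled as nodes in a graph whose edges connect adjacent areas (the adjacency and opt-in sets below are invented for illustration): the search refuses to traverse any "opt out" area, so every area along the returned path can also serve as an unplanned-landing site:

```python
from collections import deque

# Hypothetical adjacency between the areas of FIG. 8 (830-880); edges are illustrative.
ADJACENCY = {
    830: [840, 860], 840: [830, 850, 860], 850: [840, 880],
    860: [830, 840, 870], 870: [860, 880], 880: [850, 870],
}
OPT_IN = {840, 850, 860, 880}   # areas whose owners pre-authorized unplanned landings

def plan_opt_in_path(start_area: int, destination_area: int) -> list[int] | None:
    """Breadth-first search restricted to 'opt in' areas; returns a list of area IDs."""
    if start_area not in OPT_IN or destination_area not in OPT_IN:
        return None
    queue = deque([[start_area]])
    visited = {start_area}
    while queue:
        path = queue.popleft()
        if path[-1] == destination_area:
            return path
        for neighbor in ADJACENCY[path[-1]]:
            if neighbor in OPT_IN and neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

print(plan_opt_in_path(860, 880))  # [860, 840, 850, 880], mirroring updated path 815
```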
- the network entity 630 can transmit a command (e.g., via signal 635 transmitted wirelessly) to the network device 610 (e.g., the drone) that includes the path (e.g., path information).
- the command can cause the network device 610 (e.g., the drone) to travel (fly), following the path, to the area 650 and to land in the area 650 (e.g., for an unplanned landing or for a scheduled delivery of one or more items).
- the systems and techniques involve receiving a dynamic on-the-spot (e.g., real-time) authorization from the user for the drone to land in the area associated with the user.
- the drone may broadcast (e.g., transmit) an authorization request message to the area requesting consent (e.g., authorization) for the drone to fly over and land in the area.
- a computing device (e.g., a server) associated with the administrator (e.g., the company) associated with the drone can command the drone to broadcast the authorization request message to the area.
- the drone can wirelessly broadcast the authorization request message to the area, such as via a sidelink communication and/or a wireless wide area network (WWAN) communication.
- the authorization request message can include additional information including, but not limited to, images of a portion of the area (e.g., land) for landing of the drone, a height of the ground of the area for the landing, parcel information related to the area for the landing, a speed of the drone, and/or nearby object detection information.
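- As a sketch of the information such a broadcast authorization request message might carry (the field names and JSON encoding are assumptions, not a defined protocol):

```python
import json
import time

def build_authorization_request(drone_id: str, area_id: str,
                                landing_site_images: list[str],
                                height_above_ground_m: float,
                                parcel_info: str, speed_mps: float,
                                nearby_objects: list[str]) -> bytes:
    """Assemble the broadcast authorization request as UTF-8 JSON."""
    request = {
        "type": "landing_authorization_request",
        "drone_id": drone_id,
        "area_id": area_id,
        "timestamp": time.time(),
        "landing_site_images": landing_site_images,   # e.g., references to captured images
        "height_above_ground_m": height_above_ground_m,
        "parcel_info": parcel_info,
        "speed_mps": speed_mps,
        "nearby_objects": nearby_objects,
    }
    return json.dumps(request).encode("utf-8")

msg = build_authorization_request(
    drone_id="drone-930a", area_id="area-650",
    landing_site_images=["img/backyard_01.jpg"],
    height_above_ground_m=35.0, parcel_info="1 parcel, 1.2 kg",
    speed_mps=6.5, nearby_objects=["shed", "fence"])
print(len(msg), "bytes")
```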
- the user may provide, via a computing device (e.g., a smart phone) associated with the user, authorization for the landing in the area to the drone itself and/or to a computing device (e.g., a server) associated with the administrator (e.g., the company) associated with the drone.
- the user may assign someone else as an agent with authority to provide consent (e.g., authorization) on behalf of the user (e.g., land owner) for the landing of the drone in the area.
- when the computing device (e.g., the server) associated with the administrator (e.g., the company) receives the authorization, the computing device may perform route planning for the drone to travel to the area and to land in the area.
- FIG. 9 is a diagram illustrating an example of a system 900 for dynamic real-time authorizing of drone landings (e.g., for unplanned landings or for scheduled deliveries of items).
- the system 900 is shown to include network devices 910a, 910b (e.g., in the form of base stations, such as gNBs), network device 920 (e.g., a computing device associated with a user in the form of a smart phone), network devices 930a, 930b (e.g., in the form of drones), a house 950 (e.g., owned by the user associated with the network device 920), and an in-vehicle station 940.
- the house 950 can include one or more home equipped devices (e.g., connected-to-everything devices) that have wireless communication capability.
- the system 900 may include more or fewer of the network devices 910a, 910b, the network device 920, the network devices 930a, 930b, the house 950, and/or the in-vehicle station 940.
- the network devices 910a, 910b, the network device 920, the network devices 930a, 930b, the house 950, and/or the in-vehicle station 940 may be implemented in different forms than as shown in FIG. 9.
- a network entity (e.g., a server associated with one or more drones, such as a server associated with a drone company) can command one or both of the network devices 930a, 930b (e.g., drones) to transmit (e.g., broadcast) one or more authorization request messages (e.g., emergency messages) to one or more areas in which the one or both of the network devices 930a, 930b (e.g., drones) would like to land.
- the one or more areas may include an area including the network device 920 and the house 950, and an area including the in-vehicle station 940.
- the one or both of the network devices 930a, 930b can send the authorization request messages (e.g., emergency messages) via sidelink signaling and/or via wireless wide area network (WWAN) signaling.
- the authorization request messages may include information related to the landing of the one or both of the network devices 930a, 930b (e.g., drones) in the one or more areas.
- the information related to the landing of the one or both of the network devices 930a, 930b may include, but is not limited to, photos of one or more desired locations within the one or more areas to land, a height that the one or both of the drones is located above the ground, information regarding one or more parcels carried by the one or both of the drones, information regarding the company sending the one or more parcels, a speed of travel of the one or both of the drones, and/or information regarding landmarks nearby the one or more desired locations within the one or more areas to land.
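- A rough sketch of how a drone might fan the request out over the two transports mentioned above; the availability flags and transport names are hypothetical stand-ins, since the disclosure does not prescribe a particular signaling implementation:

```python
def broadcast_request(message: bytes, sidelink_available: bool,
                      wwan_available: bool) -> list[str]:
    """Send the authorization request over whichever transports are usable.

    Returns the names of the transports actually used (illustrative only).
    """
    used = []
    if sidelink_available:
        # e.g., direct device-to-device broadcast toward nearby UEs and home devices
        used.append("sidelink")
    if wwan_available:
        # e.g., delivery via a serving base station (such as network device 910a or 910b)
        used.append("wwan")
    if not used:
        raise RuntimeError("no transport available to broadcast the request")
    return used

print(broadcast_request(b"...", sidelink_available=True, wwan_available=True))
# ['sidelink', 'wwan']
```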
- the network device 920 and/or the one or more home equipped devices within the house 950 can receive the authorization request messages.
- the user (e.g., the user associated with the network device 920, who owns the area that includes the house 950) may provide a dynamic real-time authorization (e.g., via a web browser and/or mobile application running on the network device 920 and/or running on one or more of the home equipped devices located within the house 950) to a network entity (e.g., a server associated with the drone, such as a server associated with the drone company).
- the network device 920 can send an authorization message to the network entity (e.g., the server associated with the drones).
- the authorization message may include the authorization by the user to land drones in the property (e.g., the area including the house 950) owned by the user (e.g., property owner) and can, optionally, include one or more restrictions chosen by the user (e.g., property owner).
- after the network entity (e.g., the server associated with the drones) receives the authorization message (e.g., which includes the dynamic real-time authorization and, optionally, the restrictions) from the network device 920 (e.g., the computing device associated with the user, who owns the area including the house 950), the network entity can determine route planning for the one or both of the network devices 930a, 930b (e.g., drones) travelling to the area owned by the user (e.g., property owner) based on information within the authorization message.
- after the network entity has determined a path for the one or both of the network devices 930a, 930b (e.g., drones) to travel to the area including the house 950, the network entity can transmit one or more commands to the one or both of the network devices 930a, 930b (e.g., drones) that include the path(s) (e.g., path information).
- the one or more commands can cause the one or both of the network devices 930a, 930b (e.g., drones) to travel (fly), following the path(s), to the area and to land in the area (e.g., for an unplanned landing).
- a camera located in the area of the user can capture images of the area including the house 950.
- the camera(s) can include one or more Internet Protocol (IP) cameras on the house 950, such as a surveillance camera, a doorbell camera, etc.
- the camera(s) can capture images of a drone (e.g., network device 930a or network device 930b of FIG. 9) as the drone flies within a field of view of the camera.
- the camera or a system associated with the camera can perform an image processing operation (e.g., object detection, motion detection, and/or other operation) on the images to detect the presence of the drone in the images.
- the camera or the system associated with the camera can transmit a notification to the network device 920 of the user to provide authorization for landing the drone in the area including the house 950.
- the notification can be displayed via a web application running on the network device 920, a mobile application running on the network device 920, etc.
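- The camera-triggered notification flow described above might look roughly like the following; the detector and notification sender are placeholders, and no specific object-detection library or push service is implied by the disclosure:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Detection:
    label: str
    confidence: float

def notify_user_on_drone_detection(frames, detect: Callable[[object], list[Detection]],
                                   send_notification: Callable[[str], None],
                                   threshold: float = 0.8) -> bool:
    """Scan camera frames; if a drone is detected, notify the user's device once."""
    for frame in frames:
        for det in detect(frame):
            if det.label == "drone" and det.confidence >= threshold:
                send_notification(
                    "A drone was detected near your property. "
                    "Open the app to authorize or decline a landing.")
                return True
    return False

# Example with stub detector and notifier
frames = ["frame0", "frame1"]
stub_detect = lambda frame: [Detection("drone", 0.93)] if frame == "frame1" else []
notified = notify_user_on_drone_detection(frames, stub_detect, print)
print("notified:", notified)
```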
- the user can then provide user input (e.g., via a web application or mobile application running on the network device 920) authorizing the drone to land in the area.
- the systems and techniques involve the drone landing in an area without any authorization (e.g., with no preauthorization) from a user (e.g., land owner) associated with the area (e.g., land), where an administrator (e.g., a company) associated with the drone can provide some compensation (e.g., money, free services, free items, discounts on services, and/or discounts on items) to the user associated with the area (e.g., a land owner of a parcel of land) for the landing of the drone.
- when a drone lands, such as for an emergency landing, in an area without any authorization from a user associated with the area (e.g., a land owner of a parcel of land), an administrator (e.g., a company) associated with the drone can provide some compensation (e.g., money and/or free services) to the user (e.g., land owner) associated with the area (e.g., land) for the landing of the drone.
- FIG. 10 is a flow chart illustrating an example of a process 1000 for wireless communications utilizing methods for protocols for drone landing.
- the process 1000 can be performed by a first computing device (e.g., network entity 630 of FIG. 6 in the form of a server) or by a component or system (e.g., a chipset) of the first computing device.
- the operations of the process 1000 may be implemented as software components that are executed and run on one or more processors (e.g., processor 1210 of FIG. 12 or other processor(s)).
- the transmission and reception of signals by the first computing device in the process 1000 may be enabled, for example, by one or more antennas and/or one or more transceivers (e.g., wireless transceiver(s)) associated with the first computing device.
- the first computing device can receive, from a second computing device (e.g., network device 620 of FIG. 6, the network device 920 of FIG. 9, etc.) associated with a user, an authorization message including an authorization by the user to land the drone in an area (e.g., area 650 of FIG. 6, the house 950 of FIG. 9, etc.) associated with the user for an unplanned landing of the drone (e.g., an emergency landing of the drone).
- the area associated with the user is a property associated with the user (e.g., a home of the user, a business location owned or leased by the user, etc.).
- the authorization message further includes one or more restrictions by the user for landing the drone in the area.
- the one or more restrictions include a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, a restriction indicating no landing of the drone in the area when people are present within the area, any combination thereof, and/or other restrictions.
- the authorization message is received based on a notification provided to the second computing device associated with the user. For instance, the notification may be based on detection of the drone by at least one camera located in the area.
- a camera or a system associated with the camera can perform an image processing operation (e.g., object detection, motion detection, and/or other operation) on the images to detect the presence of the drone in the images.
- the camera or the system associated with the camera can transmit a notification to the second computing device associated with the user to provide authorization for landing the drone in the area including the house.
- the first computing device can determine a path for the drone to travel to the area, based on receiving the authorization message. In some aspects, the first computing device (or component thereof) can determine the path for the drone to travel to the area further based on a planned delivery of one or more items to a location. In some aspects, the first computing device (or component thereof) can generate, based on receiving the authorization message, a landing map for unplanned landings, where the landing map includes the path.
- the first computing device (or component thereof) can cause the drone to follow the path to travel to the area and to land in the area.
- the location is different from the area associated with the user.
- the location can be a business to which an item is being delivered according to a planned delivery
- the area associated with the user can be a property (e.g., a home) of the user.
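- Putting the steps of process 1000 together, a server-side handler might look roughly like the sketch below; the helper functions and data structures (path planner, landing map, command payload) are assumptions standing in for the blocks described above:

```python
def plan_path(area_id: str, planned_deliveries: list[dict]) -> list[str]:
    """Placeholder route planner: the waypoints are illustrative only."""
    return ["depot"] + [d["location"] for d in planned_deliveries] + [area_id]

def handle_authorization_message(auth_msg: dict, planned_deliveries: list[dict]) -> dict:
    """Illustrative end-to-end handling of process 1000 on the first computing device.

    auth_msg is the decoded authorization message from the second computing device
    (the user's device); planned_deliveries lists pending deliveries to factor into
    route planning. All structures here are hypothetical.
    """
    area_id = auth_msg["area_id"]
    restrictions = auth_msg.get("restrictions", {})

    # Step 1: determine a path to the authorized area, factoring in planned deliveries.
    path = plan_path(area_id, planned_deliveries)

    # Step 2: record the path on a landing map for unplanned landings.
    landing_map = {"unplanned_landing_paths": {area_id: path}}

    # Step 3: command the drone to follow the path and land in the area.
    command = {"action": "follow_path_and_land", "path": path,
               "restrictions": restrictions}
    return {"landing_map": landing_map, "command": command}

result = handle_authorization_message(
    {"area_id": "area-650", "restrictions": {"no_camera": True}},
    planned_deliveries=[{"location": "destination-820"}])
print(result["command"]["path"])  # ['depot', 'destination-820', 'area-650']
```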
- the first computing device can cause the drone to broadcast an authorization request message.
- for example, as described above with respect to FIG. 9, the network devices 930a, 930b (e.g., drones) can broadcast the authorization request messages (e.g., emergency messages) via sidelink signaling and/or wireless wide area network (WWAN) signaling.
- the authorization request messages can include information related to the landing of the drone (e.g., one or both of the network devices 930a, 930b of FIG. 9) in the one or more areas.
- the information related to the landing may include, but is not limited to, photos of one or more desired locations within the one or more areas to land, a height that the one or both of the drones is located above the ground, information regarding one or more parcels carried by the one or both of the drones, information regarding the company sending the one or more parcels, a speed of travel of the one or both of the drones, information regarding landmarks nearby the one or more desired locations within the one or more areas to land, any combination thereof, and/or other information.
- the second computing device (e.g., the network device 920 of FIG. 9) may receive the authorization request message.
- the user associated with the second computing device may provide a dynamic realtime authorization (e.g., via a web browser and/or mobile application running on the network device 920 and/or running on one or more of the home equipped devices located within the house 950) to the first computing device to provide authorization for the drone to land in the area.
- the first computing device (or component thereof) can provide compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
- FIG. 11 is a flow chart illustrating an example of a process 1100 for wireless communications utilizing methods for protocols for drone landing.
- the process 1100 can be performed by a drone (e.g., network device 610 of FIG. 6 in the form of a drone, or network devices 930a, 930b of FIG. 9 in the form of a drone) or by a component or system (e.g., a chipset) of the drone.
- the operations of the process 1100 may be implemented as software components that are executed and run on one or more processors (e.g., processor 1210 of FIG. 12 or other processor(s)).
- the transmission and reception of signals by the drone in the process 1100 may be enabled, for example, by one or more antennas and/or one or more transceivers (e.g., wireless transceiver(s)) on the drone.
- the drone (or component thereof) can broadcast (e.g., via at least one transceiver), an authorization request message to an area (e.g., a property associated with the user) for landing the drone.
- the drone can receive (e.g., via at least one transceiver), from a first computing device associated with the drone, a path to follow to the area.
- the path is based on an authorization message received from a second computing device of a user associated with the area in response to the authorization request message.
- the authorization message may include an authorization by the user to land the drone in the area for an unplanned landing of the drone (e.g., an emergency landing of the drone).
- the authorization message includes one or more restrictions by the user for landing the drone in the area.
- the one or more restrictions can include a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, a restriction indicating no landing of the drone in the area when people are present within the area, any combination thereof, and/or other restrictions.
- the path is further based on a planned delivery of one or more items to a location. In some examples, the location is different from the area associated with the user.
- the first computing device (or component thereof) can receive (e.g., via at least one transceiver) the authorization message based on a notification provided to the computing device of the user.
- the notification may be based on detection of the drone by at least one camera located in the area.
- the drone (or component thereof) can cause the drone to travel along the path and land in the area.
- the drone (or component thereof) can receive (e.g., via at least one transceiver) a landing map for unplanned landings, wherein the landing map includes the path.
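- On the drone side, process 1100 reduces to a broadcast-then-wait loop; the sketch below uses placeholder transceiver callbacks, since the disclosure does not prescribe a specific radio API:

```python
import time

def run_landing_protocol(broadcast, receive_path, travel_and_land,
                         area_id: str, timeout_s: float = 30.0) -> bool:
    """Illustrative drone-side flow for process 1100.

    broadcast(area_id)      -> sends the authorization request message toward the area
    receive_path()          -> returns a path (list of waypoints) or None if none yet
    travel_and_land(path)   -> flies the path and lands in the area
    """
    broadcast(area_id)                       # request authorization from the area
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        path = receive_path()                # path derived from the user's authorization
        if path is not None:
            travel_and_land(path)            # follow the path and land in the area
            return True
        time.sleep(0.5)
    return False                             # no authorization received in time

# Example with stubs
sent = []
ok = run_landing_protocol(
    broadcast=lambda area: sent.append(area),
    receive_path=lambda: ["waypoint-1", "area-650"],
    travel_and_land=lambda path: print("landing via", path),
    area_id="area-650")
print("landed:", ok)
```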
- the computing device (e.g., network entity) configured to perform the process 1000 and/or the drone (e.g., network device) configured to perform the process 1100 may include various components, such as one or more input devices, one or more output devices, one or more processors, one or more microprocessors, one or more microcomputers, one or more cameras, one or more sensors, one or more receivers, transmitters, and/or transceivers, and/or other component(s) that are configured to carry out the steps of processes described herein.
- the computing device may include a display, a network interface configured to communicate and/or receive the data, any combination thereof, and/or other component(s).
- the network interface may be configured to communicate and/or receive Internet Protocol (IP) based data or other type of data.
- the components of the computing device configured to perform the process 1000 of FIG. 10 and/or the components of the drone (e.g., network device) configured to perform the process 1100 of FIG. 11 can be implemented in circuitry.
- the components can include and/or can be implemented using electronic circuits or other electronic hardware, which can include one or more programmable electronic circuits (e.g., microprocessors, graphics processing units (GPUs), digital signal processors (DSPs), central processing units (CPUs), and/or other suitable electronic circuits), and/or can include and/or be implemented using computer software, firmware, or any combination thereof, to perform the various operations described herein.
- the process 1000 and the process 1100 are illustrated as logical flow diagrams, the operation of which represents a sequence of operations that can be implemented in hardware, computer instructions, or a combination thereof.
- the operations represent computer-executable instructions stored on one or more computer-readable storage media that, when executed by one or more processors, perform the recited operations.
- computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular data types.
- the order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the processes.
- the process 1000, the process 1100, and/or any other process described herein may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) executing collectively on one or more processors, by hardware, or combinations thereof.
- the code may be stored on a computer-readable or machine-readable storage medium, for example, in the form of a computer program including a plurality of instructions executable by one or more processors.
- the computer-readable or machine-readable storage medium may be non-transitory.
- FIG. 12 is a block diagram illustrating an example of a computing system 1200, which may be employed by the disclosed systems and techniques for protocols for drone landing.
- FIG. 12 illustrates an example of computing system 1200, which can be, for example, any computing device making up an internal computing system, a remote computing system, a camera, or any component thereof in which the components of the system are in communication with each other using connection 1205.
- Connection 1205 can be a physical connection using a bus, or a direct connection into processor 1210, such as in a chipset architecture.
- Connection 1205 can also be a virtual connection, networked connection, or logical connection.
- computing system 1200 is a distributed system in which the functions described in this disclosure can be distributed within a datacenter, multiple data centers, a peer network, etc.
- one or more of the described system components represents many such components each performing some or all of the function for which the component is described.
- the components can be physical or virtual devices.
- Example system 1200 includes at least one processing unit (CPU or processor) 1210 and connection 1205 that communicatively couples various system components including system memory 1215, such as read-only memory (ROM) 1220 and random access memory (RAM) 1225 to processor 1210.
- Computing system 1200 can include a cache 1212 of high-speed memory connected directly with, in close proximity to, or integrated as part of processor 1210.
- Processor 1210 can include any general purpose processor and a hardware service or software service, such as services 1232, 1234, and 1236 stored in storage device 1230, configured to control processor 1210 as well as a special-purpose processor where software instructions are incorporated into the actual processor design.
- Processor 1210 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc.
- a multi-core processor may be symmetric or asymmetric.
- computing system 1200 includes an input device 1245, which can represent any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, speech, etc.
- Computing system 1200 can also include output device 1235, which can be one or more of a number of output mechanisms.
- multimodal systems can enable a user to provide multiple types of input/ output to communicate with computing system 1200.
- Computing system 1200 can include communications interface 1240, which can generally govern and manage the user input and system output.
- the communication interface may perform or facilitate receipt and/or transmission of wired or wireless communications using wired and/or wireless transceivers, including those making use of an audio jack/plug, a microphone jack/plug, a universal serial bus (USB) port/plug, an AppleTM LightningTM port/plug, an Ethernet port/plug, a fiber optic port/plug, a proprietary wired port/plug, 3G, 4G, 5G and/or other cellular data network wireless signal transfer, a BluetoothTM wireless signal transfer, a BluetoothTM low energy (BLE) wireless signal transfer, an IBEACONTM wireless signal transfer, a radio-frequency identification (RFID) wireless signal transfer, near-field communications (NFC) wireless signal transfer, dedicated short range communication (DSRC) wireless signal transfer, 802.11 Wi-Fi wireless signal transfer, wireless local area network (WLAN) signal transfer, Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Public Switched Telephone Network (PSTN) signal transfer, Integrated Services Digital Network (ISDN) signal transfer, ad-hoc network signal transfer, radio wave signal transfer, microwave signal transfer, infrared signal transfer, visible light signal transfer, ultraviolet light signal transfer, wireless signal transfer along the electromagnetic spectrum, or some combination thereof.
- the communications interface 1240 may also include one or more range sensors (e.g., light-based sensors, laser range finders, RF-based sensors, ultrasonic sensors, and infrared (IR) sensors) configured to collect data and provide measurements to processor 1210, whereby processor 1210 can be configured to perform determinations and calculations needed to obtain various measurements for the one or more range sensors.
- the measurements can include time of flight, wavelengths, azimuth angle, elevation angle, range, linear velocity and/or angular velocity, or any combination thereof.
- the communications interface 1240 may also include one or more Global Navigation Satellite System (GNSS) receivers or transceivers that are used to determine a location of the computing system 1200 based on receipt of one or more signals from one or more satellites associated with one or more GNSS systems.
- GNSS systems include, but are not limited to, the US-based GPS, the Russia-based Global Navigation Satellite System (GLONASS), the China-based BeiDou Navigation Satellite System (BDS), and the Europe-based Galileo GNSS.
- Storage device 1230 can be a non-volatile and/or non-transitory and/or computer-readable memory device and can be a hard disk or other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, solid state memory devices, digital versatile disks, cartridges, a floppy disk, a flexible disk, a hard disk, magnetic tape, a magnetic strip/stripe, any other magnetic storage medium, flash memory, memristor memory, any other solid-state memory, a compact disc read only memory (CD-ROM) optical disc, a rewritable compact disc (CD) optical disc, digital video disk (DVD) optical disc, a blu-ray disc (BDD) optical disc, a holographic optical disk, another optical medium, a secure digital (SD) card, a micro secure digital (microSD) card, a Memory Stick® card, a smartcard chip, a EMV chip, a subscriber identity module (SIM) card
- the storage device 1230 can include software services, servers, services, etc., that when the code that defines such software is executed by the processor 1210, it causes the system to perform a function.
- a hardware service that performs a particular function can include the software component stored in a computer-readable medium in connection with the necessary hardware components, such as processor 1210, connection 1205, output device 1235, etc., to carry out the function.
- the term "computer-readable medium" includes, but is not limited to, portable or non-portable storage devices, optical storage devices, and various other mediums capable of storing, containing, or carrying instruction(s) and/or data.
- a computer-readable medium may include a non-transitory medium in which data can be stored and that does not include carrier waves and/or transitory electronic signals propagating wirelessly or over wired connections.
- Examples of a non-transitory medium may include, but are not limited to, a magnetic disk or tape, optical storage media such as compact disk (CD) or digital versatile disk (DVD), flash memory, memory or memory devices.
- a computer-readable medium may have stored thereon code and/or machine-executable instructions that may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements.
- a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents.
- Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, or the like.
- the present technology may be presented as including individual functional blocks including devices, device components, steps or routines in a method embodied in software, or combinations of hardware and software. Additional components may be used other than those shown in the figures and/or described herein.
- circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the aspects in unnecessary detail.
- well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the aspects.
- Processes and methods according to the above-described examples can be implemented using computer-executable instructions that are stored or otherwise available from computer-readable media.
- Such instructions can include, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or a processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network.
- the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on.
- the computer-readable storage devices, mediums, and memories can include a cable or wireless signal containing a bitstream and the like.
- non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
- the program code or code segments to perform the necessary tasks may be stored in a computer-readable or machine-readable medium.
- a processor(s) may perform the necessary tasks. Examples of form factors include laptops, smart phones, mobile phones, tablet devices or other small form factor personal computers, personal digital assistants, rackmount devices, standalone devices, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example.
- the instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are example means for providing the functions described in the disclosure.
- the techniques described herein may also be implemented in electronic hardware, computer software, firmware, or any combination thereof. Such techniques may be implemented in any of a variety of devices such as general purpose computers, wireless communication device handsets, or integrated circuit devices having multiple uses including application in wireless communication device handsets and other devices. Any features described as modules or components may be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a computer-readable data storage medium including program code including instructions that, when executed, perform one or more of the methods, algorithms, and/or operations described above. The computer-readable data storage medium may form part of a computer program product, which may include packaging materials.
- the computer-readable medium may include memory or data storage media, such as random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic or optical data storage media, and the like.
- the techniques additionally, or alternatively, may be realized at least in part by a computer-readable communication medium that carries or communicates program code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer, such as propagated signals or waves.
- the program code may be executed by a processor, which may include one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
- Such a processor may be configured to perform any of the techniques described in this disclosure.
- a general-purpose processor may be a microprocessor; but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Accordingly, the term "processor," as used herein may refer to any of the foregoing structure, any combination of the foregoing structure, or any other structure or apparatus suitable for implementation of the techniques described herein.
- "Coupled to" or "communicatively coupled to" refers to any component that is physically connected to another component either directly or indirectly, and/or any component that is in communication with another component (e.g., connected to the other component over a wired or wireless connection, and/or other suitable communication interface) either directly or indirectly.
- Claim language or other language reciting "at least one of" a set and/or "one or more" of a set indicates that one member of the set or multiple members of the set (in any combination) satisfy the claim. For example, claim language reciting "at least one of A and B" or "at least one of A or B" means A, B, or A and B. In another example, claim language reciting "at least one of A, B, and C" or "at least one of A, B, or C" means A, B, C, or A and B, or A and C, or B and C, or A and B and C, or any duplicate information or data (e.g., A and A, B and B, C and C, A and A and B, and so on), or any other ordering, duplication, or combination of A, B, and C.
- the language "at least one of" a set and/or "one or more" of a set does not limit the set to the items listed in the set. For example, claim language reciting "at least one of A and B" or "at least one of A or B" may mean A, B, or A and B, and may additionally include items not listed in the set of A and B.
- the phrases “at least one” and “one or more” are used interchangeably herein.
- Claim language or other language reciting “at least one processor configured to,” “at least one processor being configured to,” “one or more processors configured to,” “one or more processors being configured to,” or the like indicates that one processor or multiple processors (in any combination) can perform the associated operation(s).
- claim language reciting "at least one processor configured to: X, Y, and Z" means a single processor can be used to perform operations X, Y, and Z; or that multiple processors are each tasked with a certain subset of operations X, Y, and Z such that together the multiple processors perform X, Y, and Z.
- claim language reciting "at least one processor configured to: X, Y, and Z" can mean that any single processor may only perform at least a subset of operations X, Y, and Z.
- one element may perform all functions, or more than one element may collectively perform the functions.
- each function need not be performed by each of those elements (e.g., different functions may be performed by different elements) and/or each function need not be performed in whole by only one element (e.g., different elements may perform different sub-functions of a function).
- one element may be configured to cause the other element to perform all functions, or more than one element may collectively be configured to cause the other element to perform the functions.
- an entity (e.g., any entity or device described herein) may be configured to cause one or more elements (individually or collectively) to perform the functions.
- the one or more components of the entity may include at least one memory, at least one processor, at least one communication interface, another component configured to perform one or more (or all) of the functions, and/or any combination thereof.
- the entity may be configured to cause one component to perform all functions, or to cause more than one component to collectively perform the functions.
- each function need not be performed by each of those components (e.g., different functions may be performed by different components) and/or each function need not be performed in whole by only one component (e.g., different components may perform different sub-functions of a function).
- Illustrative aspects of the disclosure include:
- Aspect 1 A first computing device associated with a drone, comprising: at least one memory; and at least one processor coupled to the at least one memory and configured to: receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determine a path for the drone to travel to the area, based on receiving the authorization message; and cause the drone to follow the path to travel to the area and to land in the area.
- Aspect 2 The first computing device of Aspect 1, wherein the at least one processor is configured to determine the path for the drone to travel to the area further based on a planned delivery of one or more items to a location.
- Aspect 3 The first computing device of Aspect 2, wherein the location is different from the area associated with the user.
- Aspect 4 The first computing device of any one of Aspects 1 to 3, wherein the area associated with the user is a property associated with the user.
- Aspect 5 The first computing device of any one of Aspects 1 to 4, wherein the authorization message further comprises one or more restrictions by the user for landing the drone in the area.
- Aspect 6 The first computing device of Aspect 5, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
- Aspect 7 The first computing device of any one of Aspects 1 to 6, wherein the at least one processor is further configured to cause the drone to broadcast an authorization request message.
- Aspect 8 The first computing device of any one of Aspects 1 to 7, wherein the unplanned landing of the drone is an emergency landing of the drone.
- Aspect 9 The first computing device of any one of Aspects 1 to 8, wherein the at least one processor is further configured to provide compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
- Aspect 10 The first computing device of any one of Aspects 1 to 9, wherein the at least one processor is further configured to generate, based on receiving the authorization message, a landing map for unplanned landings, wherein the landing map comprises the path.
- Aspect 11 The first computing device of any one of Aspects 1 to 10, wherein the authorization message is received based on a notification provided to the second computing device associated with the user, the notification being based on detection of the drone by at least one camera located in the area.
- Aspect 12 A method of landing a drone, comprising: receiving, by a first computing device associated with the drone from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determining, by the first computing device associated with the drone, a path for the drone to travel to the area, based on receiving the authorization message; and causing, by the first computing device associated with the drone, the drone to follow the path to travel to the area and to land in the area.
- Aspect 13 The method of Aspect 12, wherein determining the path for the drone to travel to the area is further based on a planned delivery of one or more items to a location.
- Aspect 14 The method of Aspect 13, wherein the location is different from the area associated with the user.
- Aspect 15 The method of any one of Aspects 12 to 14, wherein the area associated with the user is a property associated with the user.
- Aspect 16 The method of any one of Aspects 12 to 15, wherein the authorization message further comprises one or more restrictions by the user for landing the drone in the area.
- Aspect 17 The method of Aspect 16, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
- Aspect 18 The method of any one of Aspects 12 to 17, further comprising causing, by the first computing device associated with the drone, the drone to broadcast an authorization request message.
- Aspect 19 The method of any one of Aspects 12 to 18, wherein the unplanned landing of the drone is an emergency landing of the drone.
- Aspect 20 The method of any one of Aspects 12 to 19, further comprising providing compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
- Aspect 21 The method of any one of Aspects 12 to 20, further comprising generating, by the first computing device based on receiving the authorization message, a landing map for unplanned landings, wherein the landing map comprises the path.
- Aspect 22 The method of any one of Aspects 12 to 21, wherein the authorization message is received based on a notification provided to the second computing device associated with the user, the notification being based on detection of the drone by at least one camera located in the area.
- Aspect 23 A drone comprising: at least one transceiver; at least one memory; and at least one processor coupled to the at least one transceiver and the at least one memory, the at least one processor configured to: broadcast, via the at least one transceiver, an authorization request message to an area for landing the drone; receive, via the at least one transceiver from a first computing device associated with the drone, a path to follow to the area, wherein the path is based on an authorization message received from a second computing device of a user associated with the area in response to the authorization request message; and cause the drone to travel along the path and land in the area.
- Aspect 24 The drone of Aspect 23, wherein the path is further based on a planned delivery of one or more items to a location.
- Aspect 25 The drone of Aspect 24, wherein the location is different from the area associated with the user.
- Aspect 26 The drone of any one of Aspects 23 to 25, wherein the area associated with the user is a property associated with the user.
- Aspect 27 The drone of any one of Aspects 23 to 26, wherein the authorization message comprises an authorization by the user to land the drone in the area for an unplanned landing of the drone.
- Aspect 28 The drone of Aspect 27, wherein the unplanned landing of the drone is an emergency landing of the drone.
- Aspect 29 The drone of any one of Aspects 23 to 28, wherein the authorization message comprises one or more restrictions by the user for landing the drone in the area.
- Aspect 30 The drone of Aspect 29, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
- Aspect 31 The drone of any one of Aspects 23 to 30, wherein the at least one processor is further configured to receive, via the at least one transceiver, a landing map for unplanned landings, wherein the landing map comprises the path.
- Aspect 32 The drone of any one of Aspects 23 to 31, wherein the at least one processor is configured to receive, via the at least one transceiver, the authorization message based on a notification provided to the second computing device of the user, the notification being based on detection of the drone by at least one camera located in the area.
- Aspect 33 A method of landing a drone comprising: broadcasting, by the drone, an authorization request message to an area for landing the drone; receiving, by the drone, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and traveling, by the drone, along the path and landing in the area.
- Aspect 34 The method of Aspect 33, wherein the path is further based on a planned delivery of one or more items to a location.
- Aspect 35 The method of Aspect 34, wherein the location is different from the area associated with the user.
- Aspect 36 The method of any one of Aspects 33 to 35, wherein the area associated with the user is a property associated with the user.
- Aspect 37 The method of any one of Aspects 33 to 36, wherein the authorization message comprises an authorization by the user to land the drone in the area for an unplanned landing of the drone.
- Aspect 38 The method of Aspect 37, wherein the unplanned landing of the drone is an emergency landing of the drone.
- Aspect 39 The method of any one of Aspects 33 to 38, wherein the authorization message comprises one or more restrictions by the user for landing the drone in the area.
- Aspect 40 The method of Aspect 39, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
- Aspect 41 The method of any one of Aspects 33 to 40, further comprising receiving, by the drone, a landing map for unplanned landings, wherein the landing map comprises the path.
- Aspect 42 The method of any one of Aspects 33 to 41, wherein the authorization message is received based on a notification provided to the computing device of the user, the notification being based on detection of the drone by at least one camera located in the area.
- Aspect 43 A non-transitory computer-readable storage medium comprising instructions stored thereon which, when executed by at least one processor, cause the at least one processor to perform operations according to any one of Aspects 12 to 22.
- Aspect 44 An apparatus comprising one or more means for performing operations according to any one of Aspects 12 to 22.
- Aspect 45 A non-transitory computer-readable storage medium comprising instructions stored thereon which, when executed by at least one processor, cause the at least one processor to perform operations according to any one of Aspects 33 to 42.
- Aspect 46 An apparatus comprising one or more means for performing operations according to any one of Aspects 33 to 42.
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Disclosed are systems, apparatuses, processes, and computer-readable media for protocols for drone landing. For example, a first computing device associated with a drone can receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone. The first computing device can determine a path for the drone to travel to the area, based on receiving the authorization message. The first computing device can cause the drone to follow the path to travel to the area and to land in the area.
Description
PROTOCOLS FOR DRONE LANDING
FIELD
[0001] The present disclosure generally relates to drone landings. For example, aspects of the present disclosure are related to systems and techniques for providing protocols for drone landing.
BACKGROUND
[0002] Shipping and delivery of assets (e.g., items or goods) is an important activity for many users (e.g., customers), who may be affiliated with various business organizations. Different types of vehicles can be used to deliver assets, such as trucks and airplanes, among others. Drones are becoming common within the shipping industry, in addition to other industries, for delivery of items (e.g., parcels) to users (e.g., customers).
[0003] Certain paths where drones fly can be riskier for drone operations than other paths. For example, due to external environmental impacts (e.g., high-speed winds, sudden heavy rainfall, and storms), movement of a drone along a path may be impacted. In some cases, the drone may need to change its path to a new path, which may cause damage to the drone and/or cause the drone to lose connection with a ground controller (e.g., a controller operated by an operator of the drone). As such, with some paths (e.g., paths with inclement weather conditions), there can be a high probability that a drone may experience technical issues (e.g., mechanical and/or operational failures), in which case the drone may be required to perform an emergency landing on private or public property. Accordingly, a generalized protocol for landing drones, such as for performing emergency landings or for delivering items, can be beneficial.
SUMMARY
[0004] The following presents a simplified summary relating to one or more aspects disclosed herein. Thus, the following summary should not be considered an extensive overview relating to all contemplated aspects, nor should the following summary be considered to identify key or critical elements relating to all contemplated aspects or to delineate the scope associated with any particular aspect. Accordingly, the following summary has the sole purpose of presenting
certain concepts relating to one or more aspects relating to the mechanisms disclosed herein in a simplified form to precede the detailed description presented below.
[0005] Systems and techniques are described for protocols for drone landing. According to at least one example, a first computing device associated with a drone is provided. The first computing device includes at least one memory and at least one processor coupled to the at least one memory and configured to: receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determine a path for the drone to travel to the area, based on receiving the authorization message; and cause the drone to follow the path to travel to the area and to land in the area.
[0006] In another illustrative example, a method of landing a drone is provided. The method includes: receiving, by a first computing device associated with the drone from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determining, by the first computing device associated with the drone, a path for the drone to travel to the area, based on receiving the authorization message; and causing, by the first computing device associated with the drone, the drone to follow the path to travel to the area and to land in the area.
[0007] In another illustrative example, a non-transitory computer-readable storage medium of a first computing device is provided that comprises instructions stored thereon which, when executed by at least one processor, cause the at least one processor to: receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land a drone in an area associated with the user for an unplanned landing of the drone; determine a path for the drone to travel to the area, based on receiving the authorization message; and cause the drone to follow the path to travel to the area and to land in the area.
[0008] In another illustrative example, a first computing device associated with a drone is provided. The first computing device includes: means for receiving, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; means for determining a path for the drone to travel to the area, based on receiving the
authorization message; and means for causing the drone to follow the path to travel to the area and to land in the area.
[0009] In another illustrative example, a drone is provided that includes at least one transceiver, at least one memory, and at least one processor coupled to the at least one transceiver and the at least one memory. The at least one processor is configured to: broadcast, via the at least one transceiver, an authorization request message to an area for landing the drone; receive, via the at least one transceiver, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and cause the drone to travel along the path and land in the area.
[0010] In another illustrative example, a method of landing a drone is provided. The method includes: broadcasting, by the drone, an authorization request message to an area for landing the drone; receiving, by the drone, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and traveling, by the drone, along the path and landing in the area.
[0011] In another illustrative example, a non-transitory computer-readable storage medium of a drone is provided that comprises instructions stored thereon which, when executed by at least one processor, cause the at least one processor to: broadcast, via at least one transceiver, an authorization request message to an area for landing the drone; receive, via the at least one transceiver, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and cause the drone to travel along the path and land in the area.
[0012] In another illustrative example, a drone is provided. The drone includes: means for broadcasting an authorization request message to an area for landing the drone; means for receiving a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and means for causing the drone to travel along the path and land in the area.
[0013] Aspects generally include a method, apparatus, system, computer program product, non-transitory computer-readable medium, user device, user equipment, wireless communication device, and/or processing system as substantially described with reference to and as illustrated by the drawings and specification.
[0014] Some aspects include a device having a processor configured to perform one or more operations of any of the methods summarized above. Further aspects include processing devices for use in a device configured with processor-executable instructions to perform operations of any of the methods summarized above. Further aspects include a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a device to perform operations of any of the methods summarized above. Further aspects include a device having means for performing functions of any of the methods summarized above.
[0015] The foregoing has outlined rather broadly the features and technical advantages of examples according to the disclosure in order that the detailed description that follows may be better understood. Additional features and advantages will be described hereinafter. The conception and specific examples disclosed may be readily utilized as a basis for modifying or designing other structures for carrying out the same purposes of the present disclosure. Such equivalent constructions do not depart from the scope of the appended claims. Characteristics of the concepts disclosed herein, both their organization and method of operation, together with associated advantages will be better understood from the following description when considered in connection with the accompanying figures. Each of the figures is provided for the purposes of illustration and description, and not as a definition of the limits of the claims. The foregoing, together with other features and aspects, will become more apparent upon referring to the following specification, claims, and accompanying drawings.
[0016] This summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used in isolation to determine the scope of the claimed subject matter. The subject matter should be understood by reference to appropriate portions of the entire specification of this patent, any or all drawings, and each claim.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The accompanying drawings are presented to aid in the description of various aspects of the disclosure and are provided solely for illustration of the aspects and not limitation thereof. So that the above-recited features of the present disclosure can be understood in detail, a more particular description, briefly summarized above, may be had by reference to aspects, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only certain typical aspects of this disclosure and are therefore not to be considered limiting of its scope, for the description may admit to other equally effective aspects. The same reference numbers in different drawings may identify the same or similar elements.
[0018] FIG. 1 is a block diagram illustrating an example of user equipment (UE), in accordance with aspects of the present disclosure.
[0019] FIG. 2 is a diagram illustrating an example wireless communications system, in accordance with aspects of the present disclosure.
[0020] FIG. 3 is a diagram illustrating an example implementation of a system-on-a-chip (SOC), in accordance with aspects of the present disclosure.
[0021] FIG. 4 is a diagram illustrating an example of an unmanned aerial vehicle (UAV), in accordance with aspects of the present disclosure.
[0022] FIG. 5 is a diagram illustrating an example of airspace of an area, in accordance with aspects of the present disclosure.
[0023] FIG. 6 is a diagram illustrating an example of a system for providing pre-authorization for unplanned drone landings, in accordance with aspects of the present disclosure.
[0024] FIG. 7 is a diagram illustrating an example of a web browser for pre-authorization for unplanned drone landings, in accordance with aspects of the present disclosure.
[0025] FIG. 8 is a diagram illustrating an example of a map for unplanned drone landings, in accordance with aspects of the present disclosure.
[0026] FIG. 9 is a diagram illustrating an example of a system for dynamic real-time authorizing of drone landings, in accordance with aspects of the present disclosure.
[0027] FIG. 10 is a flow chart illustrating an example of a process for wireless communications at a computing device associated with a drone, in accordance with aspects of the present disclosure.
[0028] FIG. 11 is a flow chart illustrating an example of a process for wireless communications at a drone, in accordance with aspects of the present disclosure.
[0029] FIG. 12 is a block diagram illustrating an example of a computing system, which may be employed by the disclosed systems and techniques, in accordance with aspects of the present disclosure.
DETAILED DESCRIPTION
[0030] Certain aspects and embodiments of this disclosure are provided below. Some of these aspects and embodiments may be applied independently and some of them may be applied in combination as would be apparent to those of skill in the art. In the following description, for the purposes of explanation, specific details are set forth in order to provide a thorough understanding of embodiments of the application. However, it will be apparent that various embodiments may be practiced without these specific details. The figures and description are not intended to be restrictive.
[0031] The ensuing description provides example embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the example embodiments will provide those skilled in the art with an enabling description for implementing an example embodiment. It should be understood that various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the application as set forth in the appended claims.
[0032] As previously mentioned, drones are currently popular for use within the shipping industry, and within other industries, for delivery of items (e.g., parcels) to users (e.g., customers). Certain paths where drones fly may be riskier for drone operations than other paths. For example, due to external environmental impacts (e.g., high-speed winds, sudden heavy rainfall, and storms), a drone’s journey on a path can be impacted. In some cases, the drone may
need to change its path to a new or updated path, which can cause damage to the drone and/or cause the drone to lose connection with a ground controller (e.g., a controller operated by an operator of the drone). As such, with some paths (e.g., paths with inclement weather conditions), there can be a high probability that a drone may experience technical issues (e.g., mechanical and/or operational failures) and, as such, the drone may be required to perform an emergency landing on private or public property.
[0033] Flying a drone over private property can be legal in some places, where no permission is needed to fly drones over private property. However, some states and local municipalities (e.g., in the United States and/or other countries) have laws against flying drones over and landing drones on private property because of privacy concerns.
[0034] The Federal Aviation Administration (FAA) is responsible for civil aviation safety within the United States, and regulates the United States airspace. Since the FAA is a federal agency, the FAA rules can preempt the state and local municipality laws. According to the FAA, the airspace over the United States is divided into two categories, which include controlled airspace and uncontrolled airspace. Controlled airspace is defined to be located above 400 feet from the ground, and uncontrolled airspace is defined to be located within 400 feet from the ground. The FAA has rules regarding where commercial and recreational drone pilots are able to fly and not fly their drones within the controlled airspace. As such, in the industry today, a generalized protocol for landing drones, such as for performing emergency landings or for delivering items, on private property can be beneficial.
[0035] Systems, apparatuses, processes (also referred to as methods), and computer-readable media (collectively referred to as "systems and techniques") are described herein for providing protocols for drone landing. In one or more aspects, the systems and techniques involve receiving a pre-authorization from a user (e.g., land owner) for a drone to land in an area (e.g., land) associated with the user, or receiving a dynamic on-the-spot (e.g., real-time) authorization from the user for the drone to land in the area associated with the user. For example, the area associated with the user can include a property associated with the user, such as property (e.g., land and/or a house) owned by, leased by, or otherwise under control of the user.
[0036] In some aspects, the systems and techniques involve the drone landing, such as for an emergency landing, in an area without any authorization from a user (e.g., land owner) associated with the area (e.g., land), where an administrator (e.g., a company) associated with
the drone can provide some compensation (e.g., money and/or free services) to the user associated with the area (e.g., a land owner of a parcel of land) for the landing of the drone. In some cases, the computing device associated with the drone may provide compensation information (e.g., information related to a compensatory payment and/or services) to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
[0037] In one or more aspects, a pre-authorization for a drone to land in an area can be previously provided by a user associated with the area (e.g., a land owner of a parcel of land). In one or more examples, an administrator (e.g., a company) associated with the drone can provide an online sign-up option (e.g., via a web application running on a computing device, such as a computer, associated with the user, or a mobile application running on a computing device, such as a smart phone, associated with the user) for a user (e.g., land owner) to provide consent (e.g., authorization) for a drone associated with the administrator (e.g., company) to fly over and land in the area associated with the user (e.g., the land owner of the parcel of land). In some examples, during the online sign-up process, the user (e.g., land owner) may provide restrictions for the landing of the drone in the area. In one or more examples, the restrictions may include, but are not limited to, a restriction disallowing operation of a camera (e.g., indicating no operation of the camera) on the drone while in the area, a restriction disallowing operation of a sensor(s) (e.g., indicating no operation of the sensor(s)) on the drone while in the area, a speed restriction for the drone (e.g., indicating a reduced speed of the drone and/or drone blades of the drone) while in the area, and/or a restriction indicating no landing of the drone in the area when people are present within the area.
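For illustration only, the pre-authorization and restriction options described above could be captured in a simple record such as the following sketch. The field names, types, and units (e.g., allow_camera, max_speed_mps) are assumptions introduced here for clarity and are not part of the disclosure.

```python
# Hypothetical sketch of a pre-authorization record collected during sign-up.
# Field names and units are illustrative assumptions, not part of the disclosure.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class LandingRestrictions:
    allow_camera: bool = False          # restriction disallowing camera operation in the area
    allow_other_sensors: bool = False   # restriction disallowing operation of other sensors
    max_speed_mps: Optional[float] = 2.0  # speed restriction while in the area (meters/second)
    no_landing_when_people_present: bool = True

@dataclass
class PreAuthorization:
    user_id: str
    area_id: str                        # e.g., a parcel identifier for the user's property
    authorized: bool
    restrictions: LandingRestrictions = field(default_factory=LandingRestrictions)

# Example: a land owner authorizes emergency landings but disallows camera use.
auth = PreAuthorization(
    user_id="user-123",
    area_id="parcel-456",
    authorized=True,
    restrictions=LandingRestrictions(allow_camera=False, max_speed_mps=1.5),
)
```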
[0038] In one or more examples, after the administrator (e.g., company) associated with the drone receives a pre-authorization from the user (e.g., land owner) for landing of the drone in the area (e.g., land owned by the land owner), a computing device (e.g., server) associated with the administrator (e.g., company) can perform route planning for drones to travel through and/or to the area (e.g., including route planning for the drone to travel to the area). In some examples, the administrator (e.g., company) can generate a map (e.g., an emergency map) for drones to fly and to land only near and on properties where pre-authorizations to fly and land were received. In one or more examples, when a drone needs to make an emergency landing in an area where an administrator (e.g., company) associated with the drone has received a pre-authorization to land from a user associated with the area (e.g., a land owner of a parcel of
land), the drone can send (e.g., transmit) a landing notification message to a computing device associated with the user (e.g., a smart phone of the land owner) to notify the user of the emergency landing of the drone in the area.
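A minimal sketch of how such a landing map and notification step might be organized is shown below. It assumes pre-authorization records with user_id, area_id, and authorized attributes (as sketched earlier), and notify_user is a hypothetical stand-in for whatever notification transport is used.

```python
# Minimal sketch: build a landing map containing only pre-authorized areas, then
# notify the associated user when an emergency landing is planned in one of them.
def build_landing_map(pre_authorizations):
    """Return a dict mapping area_id -> pre-authorization record for authorized areas only."""
    return {p.area_id: p for p in pre_authorizations if p.authorized}

def plan_emergency_landing(landing_map, nearby_area_ids, notify_user):
    """Pick the first nearby pre-authorized area and notify its associated user."""
    for area_id in nearby_area_ids:
        pre_auth = landing_map.get(area_id)
        if pre_auth is not None:
            notify_user(pre_auth.user_id,
                        f"Drone performing an emergency landing in area {area_id}.")
            return area_id
    return None  # no pre-authorized area nearby; fall back to real-time authorization
```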
[0039] In one or more aspects, when a drone needs to make an emergency landing in an area, and an administrator (e.g., company) associated with the drone has not received a pre-authorization from a user (e.g., land owner) associated with the area, the drone can broadcast (e.g., transmit) an authorization request message to the area (e.g., for reception by any computing devices in the area) requesting consent (e.g., authorization) for the drone to fly over and land in the area. In one or more examples, a computing device (e.g., server) associated with the administrator (e.g., company) associated with the drone may command the drone to broadcast the authorization request message to the area so that any computing devices in the area can receive the authorization request message. In some examples, the drone may wirelessly broadcast the authorization request message to the area, such as via a sidelink communication and/or a wireless wide area network (WWAN) communication. In one or more examples, the authorization request message may include additional information including, but not limited to, images of a portion of the area (e.g., land) for landing of the drone, a height of the ground of the area for the landing, parcel information related to the area for the landing, a speed of the drone, and/or nearby object detection information.
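As a hedged illustration of the broadcast authorization request described above, the sketch below assembles the listed fields into a message payload. The schema, field names, and JSON encoding are assumptions for illustration only and do not reflect any particular standardized message format.

```python
# Illustrative sketch of a broadcast authorization request payload.
# The message schema and encoding are assumptions, not a defined protocol.
import json
import time

def make_authorization_request(drone_id, ground_images, ground_height_m,
                               parcel_info, speed_mps, nearby_objects):
    """Assemble an authorization request message for broadcast over sidelink/WWAN."""
    return {
        "type": "AUTHORIZATION_REQUEST",
        "drone_id": drone_id,
        "timestamp": time.time(),
        "landing_site_images": ground_images,   # e.g., references to captured images
        "ground_height_m": ground_height_m,     # height of the ground at the landing site
        "parcel_info": parcel_info,             # parcel information related to the area
        "drone_speed_mps": speed_mps,
        "nearby_objects": nearby_objects,       # nearby object detection information
    }

request = make_authorization_request(
    drone_id="drone-789",
    ground_images=["img_001.jpg"],
    ground_height_m=312.4,
    parcel_info={"parcel_id": "parcel-456"},
    speed_mps=4.2,
    nearby_objects=["tree", "fence"],
)
payload = json.dumps(request).encode("utf-8")  # bytes handed to the broadcast transceiver
```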
[0040] In one or more examples, after a computing device (e.g., smart phone) associated with the user associated with the area (e.g., a land owner of a parcel of land) receives the authorization request message, in response, the user (e.g., land owner) can provide, via the computing device (e.g., smart phone) associated with the user, authorization for the landing in the area to the drone itself and/or to a computing device (e.g., server) associated with the administrator (e.g., company) associated with the drone. In some examples, the user (e.g., land owner) may assign someone else as an agent with authority to provide consent (e.g., authorization) on behalf of the user (e.g., land owner) for the landing of the drone in the area. In one or more examples, after the drone and/or the computing device (e.g., server) of the administrator (e.g., company) associated with the drone receives an authorization from the user (e.g., land owner), or from the user’s agent, for landing of the drone in the area (e.g., land), the computing device (e.g., server) associated with the administrator (e.g., company) can perform route planning for the drone to travel to the area and to land in the area.
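One simple way the route-planning step could be approached, purely as an illustrative sketch, is to generate straight-line waypoints from the drone's current position toward the authorized area once an affirmative authorization message arrives. Real route planning would also account for obstacles, airspace rules, and the user's restrictions, and the message keys used here (authorized, area_lat, area_lon) are hypothetical.

```python
# Minimal route-planning sketch triggered by an authorization message.
# The waypoint scheme and message keys are assumptions for illustration only.
def plan_path(drone_pos, area_center, num_waypoints=5, descent_alt_m=20.0):
    """Return a list of (lat, lon, alt) waypoints ending above the authorized area."""
    lat0, lon0, alt0 = drone_pos
    lat1, lon1 = area_center
    waypoints = []
    for i in range(1, num_waypoints + 1):
        t = i / num_waypoints
        waypoints.append((
            lat0 + t * (lat1 - lat0),
            lon0 + t * (lon1 - lon0),
            alt0 + t * (descent_alt_m - alt0),  # descend toward the approach altitude
        ))
    return waypoints

def on_authorization_message(msg, drone_pos):
    """Trigger route planning only when the message actually grants authorization."""
    if msg.get("authorized"):
        return plan_path(drone_pos, (msg["area_lat"], msg["area_lon"]))
    return None
```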
[0041] In some cases, a camera located in the area of the user (e.g., an Internet Protocol (IP) camera on the user’s home, such as a surveillance camera, a doorbell camera, etc.) can capture images of the area. The camera can capture images of a drone as the drone flies in a field of view (FOV) of the camera. The images can be processed to detect the drone. For example, the camera or a system associated with the camera (e.g., a server system associated with the camera) can perform object detection, motion detection, and/or other operation(s) on the images to detect the presence of the drone in the images. Upon detecting the drone in the images, the camera or the system associated with the camera can send a notification to the computing device of the user (e.g., via a web application running on the computing device, a mobile application running on the computing device, etc.) to provide authorization for landing the drone in the area. The user can then provide user input to the computing device to provide consent (e.g., authorization) for the drone to fly over and land in the area associated with the user (e.g., the land owner of the parcel of land). Alternatively, upon detecting the drone in the images, the camera or the system associated with the camera can send a notification to the computing device of the user (e.g., via a web application running on the computing device, a mobile application running on the computing device, etc.) to automatically provide consent (e.g., authorization) for the drone to fly over and land in the area associated with the user (e.g., the land owner of the parcel of land).
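The camera-based detection and notification flow described above could be organized roughly as in the following sketch. Here detect_drone, notify_user, and send_consent are hypothetical stand-ins for a detection model and for notification/consent transports; none of them names an actual API from the disclosure.

```python
# Sketch of the camera-based flow: when a drone is detected in the camera's field of
# view, either ask the user for authorization or, if the user opted in, auto-consent.
def handle_camera_frame(frame, detect_drone, notify_user, send_consent,
                        auto_consent_enabled=False):
    """Run detection on one frame and route the result per the user's preference."""
    if not detect_drone(frame):          # e.g., an object-detection model run on the frame
        return "no_drone"
    if auto_consent_enabled:
        send_consent(authorized=True)    # automatically provide authorization
        return "auto_consented"
    notify_user("A drone is requesting to land on your property. Approve?")
    return "user_prompted"
```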
[0042] Additional aspects of the present disclosure are described in more detail below with respect to the figures.
[0043] FIG. 1 is a block diagram of a UE 100, in accordance with aspects of the present disclosure. UE 100 may be an example of any of the UEs 212, 213, 214 (as shown in FIG. 2), or network device 620 of FIG. 6 (e.g., a computing device or UE shown in the form of a smart phone), or network device 920 of FIG. 9 (e.g., a computing device or UE shown in the form of a smart phone). UE 100 may include a computing platform including a processor 110, memory 111 including software (SW) 112, one or more sensors 113, a transceiver interface 114 for a transceiver 115 (that includes a wireless transceiver 140 and a wired transceiver 150), and a user interface 116. The processor 110, the memory 111, the sensor(s) 113, the transceiver interface 114, and the user interface 116 may be communicatively coupled to each other by a bus 120 (which may be configured, e.g., for optical and/or electrical communication). One or more of the components shown (e.g., one or more of the sensors 113, etc.) may be omitted from the UE 100.
[0044] The processor 110 may include one or more intelligent hardware devices, e.g., a central processing unit (CPU), a microcontroller, an application specific integrated circuit (ASIC), etc. The processor 110 may comprise multiple processors including a general-purpose/application processor 130, a Digital Signal Processor (DSP) 131, a modem processor 132, a video processor 133, and/or a sensor processor 134. One or more of the processors 130-134 may comprise multiple devices (e.g., multiple processors). For example, the sensor processor 134 may comprise one or more processors for processing data from RF-based sensors, ultrasound-based sensors, and/or light-based sensors, etc. The modem processor 132 may support dual SIM/dual connectivity (or even more SIMs). For example, a SIM (Subscriber Identity Module or Subscriber Identification Module) may be used by an Original Equipment Manufacturer (OEM), and another SIM may be used by an end user of the UE 100 for connectivity. The memory 111 is a non-transitory storage medium that may include random access memory (RAM), flash memory, disc memory, and/or read-only memory (ROM), etc. The memory 111 stores the software 112 which may be processor-readable, processor-executable software code containing instructions that are configured to, when executed, cause the processor 110 to perform various functions described herein. Alternatively, the software 112 may not be directly executable by the processor 110 but may be configured to cause the processor 110, e.g., when compiled and executed, to perform the functions. The description may refer only to the processor 110 performing a function, but this includes other implementations such as where the processor 110 executes software and/or firmware. The description may refer to the processor 110 performing a function as shorthand for one or more of the processors 130-134 performing the function. The description may refer to the UE 100 performing a function as shorthand for one or more appropriate components of the UE 100 performing the function. The processor 110 may include a memory with stored instructions in addition to and/or instead of the memory 111. Functionality of the processor 110 is discussed more fully below.
[0045] The configuration of the UE 100 shown in FIG. 1 is an example and not limiting of the aspects and features of the disclosure, including the claims, and other configurations may be used. For example, an example configuration of the UE includes one or more of the processors 130-134 of the processor 110, the memory 111, and the wireless transceiver 140. Other example configurations include one or more of the processors 130-134 of the processor
110, the memory 111, the wireless transceiver 140, and one or more of the sensors 113, the user interface 116, and/or the wired transceiver 150.
[0046] The UE 100 may comprise the modem processor 132 that may be capable of performing baseband processing of signals received and down-converted by the transceiver 115 and/or the SPS receiver 181 (discussed below). The modem processor 132 may perform baseband processing of signals to be upconverted for transmission by the transceiver 115. Also or alternatively, baseband processing may be performed by the processor 130 and/or the DSP 131. Other configurations, however, may be used to perform baseband processing.
[0047] The UE 100 includes the sensors 113 that may include one or more of various types of sensors, for example, an environmental sensor 160, a status sensor 170, and a position/motion/orientation (PMO) sensor 180. The PMO sensor 180 may include one or more sensors from which position and/or motion and/or orientation of the UE 100 may be determined. While each of the sensors 160, 170, 180 may be referred to in the singular, each of the sensors 160, 170, 180 may include more than one sensor, examples of some of which are discussed explicitly herein. The sensors 113 may generate analog and/or digital signals, indications of which may be stored in the memory 111 and processed by the processor 110 (e.g., the processor 130, the DSP 131, the video processor 133, and/or the sensor processor 134 as appropriate) in support of one or more applications such as, for example, applications directed to positioning, navigation, and/or resource management. The description herein may refer to the processor 110 generally as performing one or more functions that one or more of the processors 130-134 perform.
[0048] The sensor(s) 113 may be used in resource management, relative location measurements, relative location determination, motion determination, etc. Information detected by the sensor(s) 113 may be used to determine how to allocate resources of the UE 100, e.g., transmission power, processing power for transmission and/or reception of communication signals, transmission and/or reception directionality, etc. The plural term “resources” is often used throughout the discussion herein, but this term includes the singular as well, i.e., a single resource, e.g., being allocated. Also or alternatively, information detected by the sensor(s) may be used for motion detection, relative displacement, dead reckoning, sensor-based location determination, and/or sensor-assisted location determination. The sensor(s) 113 may be useful to determine whether the UE 100 is fixed (stationary) or mobile
and/or whether to report certain useful information to the server 243 (as shown in FIG. 2) regarding the mobility of the UE 100. For example, based on the information obtained/measured by the sensor(s) 113, the UE 100 may notify/report to the server 243 of FIG. 2 that the UE 100 has detected movements or that the UE 100 has moved, and report the relative displacement/distance (e.g., via dead reckoning, or sensor-based location determination, or sensor-assisted location determination enabled by the sensor(s) 113). In another example, for relative positioning information, the sensors/Inertial Measurement Unit (IMU) can be used to determine the angle, size (e.g., width and/or height), and/or orientation of another device with respect to the UE 100, etc. The position and/or motion of the UE 100 may be used in determining resource allocation for communication, e.g., between vehicles and/or drones. The UE 100 may, for example, be disposed in or integrated with a vehicle or a drone. For example, the UE 100 may be the UE 214 of FIG. 2, which is a vehicle (e.g., a car) in the example shown in FIG. 2, or UE 100 may be the UAV 420 of FIG. 4 in the form of a drone. Other forms of UEs or vehicles may be used, such as trucks, air-based UEs (e.g., an alternative delivery vehicle, such as one in the form of a drone), etc. As such, the UE 100 may be configured for various forms of communication, e.g., V2V (vehicle-to-vehicle), V2X (vehicle-to-everything), CV2X (cellular V2X), CV2V (cellular V2V), etc.
[0049] The environmental sensor 160 may include one or more sensors for measuring one or more internal and/or external environmental conditions. In this example, the environmental sensor 160 includes a camera 161, a microphone 162, an air-flow sensor 163, a temperature sensor 164, a motion sensor 165, and a light-based sensor 166. While each of the sensors 161-166 may be referred to in the singular, each of the sensors 161-166 may include more than one sensor, examples of some of which are discussed explicitly herein. For example, the camera 161 may include at least one camera configured (e.g., designed, made, disposed, and directed) to capture images external to the UE 100 and/or may include one or more cameras configured to capture images internal to the UE 100 (e.g., in a passenger compartment of a vehicle). As other examples, the microphone 162, the temperature sensor 164, and/or the motion sensor 165 may include multiple microphones, multiple thermometers, and/or multiple motion detectors configured to detect sound, temperature, and/or motion (respectively) outside and/or inside of the UE 100, e.g., a vehicle or a drone. Indeed, any of the sensors 161-165 may include multiple respective sensors outside the vehicle (or drone) and/or multiple respective sensors inside the vehicle (or drone) for making respective measurements at multiple
locations about the vehicle (or drone) and/or in different directions relative to the vehicle (or drone). While this discussion assumes the UE 100 is a vehicle or a drone, the UE 100 may be a different device (i.e., other than a vehicle or a drone). The sensors 161-165 are examples and one or more of the sensors 161-165 may be omitted from the UE 100 and/or one or more other sensors may be included in the UE 100. For example, the environmental sensor 160 may include one or more barometric pressure sensors and/or one or more ambient light sensors and/or one or more other sensors.
[0050] The camera 161 may be configured for capturing still and/or moving imagery. For example, each camera of the camera 161 may comprise, for example, one or more imaging sensors (e.g., a charge coupled device (CCD) or a CMOS imager), one or more lenses, analog-to-digital circuitry, frame buffers, etc. Additional processing, conditioning, encoding, and/or compression of signals representing captured images may be performed by the general-purpose processor 130 and/or the DSP 131. Also or alternatively, the video processor 133 may perform conditioning, encoding, compression, and/or manipulation of signals representing captured images. The video processor 133 may decode/decompress stored image data for presentation on a display device (not shown), e.g., of the user interface 116.
[0051] The motion sensor 165 is configured to detect motion. For example, the motion sensor 165 may send and receive sound waves (e.g., ultrasound signals) and analyze the received signals for Doppler effects indicative of motion. Use of multiple motion detectors may help identify the relative location (e.g., direction relative to the UE 100) of an object.
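For context, the Doppler relationship underlying such ultrasound motion sensing can be illustrated as follows. The round-trip shift approximation delta_f ≈ 2·v·f0/c and the assumed speed of sound are simplifications for illustration only.

```python
# Simplified illustration of the Doppler relationship such a motion sensor relies on:
# for an ultrasound signal reflected off an object moving radially at speed v, the
# round-trip frequency shift is approximately delta_f = 2 * v * f0 / c.
SPEED_OF_SOUND_MPS = 343.0  # in air at roughly 20 degrees C; an assumed constant

def radial_speed_from_doppler(f_emitted_hz, f_received_hz):
    """Estimate radial speed (m/s) of a reflector from the measured Doppler shift."""
    delta_f = f_received_hz - f_emitted_hz
    return delta_f * SPEED_OF_SOUND_MPS / (2.0 * f_emitted_hz)

# Example: a 40 kHz burst returning at 40.02 kHz implies roughly 0.086 m/s of motion.
print(radial_speed_from_doppler(40_000.0, 40_020.0))
```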
[0052] The light-based sensor 166 is configured to determine range to an object, which may be used by the processor 110 to detect the presence of an object. Use of multiple light-based sensors may help identify the relative location (e.g., direction relative to the UE 100) of an object. In some cases, the light-based sensor 166 may be used for detecting relatively small objects, such as vehicles or other artificial (human-made) objects.
[0053] The status sensor 170 is configured to provide one or more indications of one or more UE conditions of the UE 100 indicative of UE status. For example, UE conditions where the UE 100 is a vehicle or a drone (with UE conditions thus being vehicle or drone conditions) may include a gear status of the vehicle (e.g., whether the vehicle is in park, drive, or neutral, or in which gear the vehicle is presently (e.g., reverse, first, second, third, fourth,
etc.)). Further, numerous other UE conditions may be sensed and indicated where the UE 100 is not a vehicle (e.g., when the UE is a drone) or is not associated with a vehicle.
[0054] The PMO sensor 180 may include one or more sensors for providing one or more UE conditions such as, for example, vehicle or drone conditions. For example, the PMO sensor 180 may include one or more sensors for measuring information from which position and/or motion and/or orientation of the UE 100 may be determined and possibly determining position and/or motion (e.g., speed and/or direction of motion) and/or orientation of the UE 100. In this example, the PMO sensor 180 includes a Satellite Positioning System (SPS) receiver 181, a position device (PD) 182, an Inertial Measurement Unit (IMU) 183, and a magnetometer 184. The components of the PMO sensor 180 shown are examples, and one or more of these components may be omitted and/or one or more other components included in the PMO sensor 180. Also, while each of the components 181-184 of the PMO sensor 180 may be referred to in the singular, each of the components 181-184 may include more than one such component, examples of some of which are discussed explicitly herein. Also, the PD 182 may be part of the SPS receiver 181 and/or the IMU 183 and/or part of the processor 110, and may not be a sensor itself (e.g., may not take measurements), but may process information from one or more of the sensors 181, 183, 184 and/or one or more other sensors. The PMO sensor 180 may be used to determine UE speed and/or direction of motion, e.g., by determining UE location over time (e.g., determined using SPS, one or more ranging sensors, etc.).
[0055] The IMU 183 may comprise one or more inertial sensors, for example, an accelerometer
187 (e.g., responding to acceleration of the UE 100 in three dimensions) and/or a gyroscope 188. While each of the sensors 187, 188 may be referred to in the singular, each of the sensors 187, 188 may include more than one sensor. The accelerometer may include one or more three-dimensional accelerometers and the gyroscope may include one or more three-dimensional gyroscopes. The IMU 183 may be configured to provide measurements about a direction of motion and/or a speed of motion of the UE 100, which may be used, for example, in relative location determination. For example, the accelerometer 187 and/or the gyroscope
188 of the IMU 183 may detect, respectively, a linear acceleration and a speed of rotation of the UE 100. The linear acceleration and speed of rotation measurements of the UE 100 may be integrated over time (e.g., by the IMU 183 and/or the PD 182) to determine an instantaneous direction of motion as well as a displacement of the UE 100. The instantaneous direction of motion and the displacement may be integrated to track a location of the UE 100. For
example, a reference location of the UE 100 may be determined, e.g., using the SPS receiver 181 (and/or by some other means) for a moment in time, and measurements from the accelerometer 187 and the gyroscope 188 taken after this moment in time may be used in dead reckoning to determine a present location of the UE 100 based on movement (direction and distance) of the UE 100 relative to the reference location.
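A highly simplified two-dimensional sketch of this dead-reckoning integration is shown below; real implementations operate in three dimensions and correct for sensor bias and drift, which this illustration omits.

```python
# Simplified 2-D dead reckoning: gyroscope rate is integrated into heading,
# accelerometer output into speed, and speed into displacement from a known
# reference location. Illustration only; bias and drift handling are omitted.
import math

def dead_reckon(reference_xy, heading_rad, samples, dt):
    """samples: iterable of (forward_accel_mps2, yaw_rate_radps) at interval dt seconds."""
    x, y = reference_xy
    speed = 0.0
    for accel, yaw_rate in samples:
        heading_rad += yaw_rate * dt          # integrate rotation rate into heading
        speed += accel * dt                   # integrate acceleration into speed
        x += speed * math.cos(heading_rad) * dt
        y += speed * math.sin(heading_rad) * dt
    return (x, y), heading_rad

# Example: 1 s of constant 0.5 m/s^2 forward acceleration with no turning.
position, heading = dead_reckon((0.0, 0.0), 0.0, [(0.5, 0.0)] * 100, dt=0.01)
```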
[0056] The magnetometer 184 may determine magnetic field strengths in different directions which may be used to determine orientation of the UE 100, which may be used, for example, to provide a digital compass for the UE 100. The magnetometer 184 may include a two-dimensional magnetometer configured to detect and provide indications of magnetic field strength in two orthogonal dimensions. Also or alternatively, the magnetometer 184 may include a three-dimensional magnetometer configured to detect and provide indications of magnetic field strength in three orthogonal dimensions. The magnetometer 184 may provide means for sensing a magnetic field and providing indications of the magnetic field (e.g., to the processor 110). The magnetometer 184 may provide measurements to determine orientation (e.g., relative to magnetic north and/or true north) that may be used for any of a variety of purposes (e.g., to support one or more compass applications). While referred to in the singular, the magnetometer 184 may include multiple magnetometers.
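As a small illustration of how a two-dimensional magnetometer reading can yield a compass heading, the sketch below assumes a level device and ignores tilt compensation and magnetic declination; the axis convention is also an assumption.

```python
# Minimal compass-heading sketch from a two-dimensional magnetometer reading,
# assuming x points toward magnetic north and y toward east, with a level device.
import math

def heading_degrees(mag_x, mag_y):
    """Return heading in degrees clockwise from magnetic north."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

print(heading_degrees(0.0, 25.0))  # -> 90.0 (pointing magnetic east in this convention)
```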
[0057] The SPS receiver 181 (e.g., a Global Positioning System (GPS) receiver or other Global Navigation Satellite System (GNSS) receiver) may be capable of receiving and acquiring SPS signals 185 via an SPS antenna 186. The antenna 186 is configured to transduce the wireless SPS signals 185 to wired signals (e.g., electrical or optical signals, and may be integrated with the antenna 146). The SPS receiver 181 may be configured to process, in whole or in part, the acquired SPS signals 185 for estimating a location of the UE 100. For example, the SPS receiver 181 may be configured to determine location of the UE 100 by trilateration using the SPS signals 185. The general-purpose processor 130, the memory 111, the DSP 131 and/or one or more specialized processors (not shown) may be utilized to process acquired SPS signals, in whole or in part, and/or to calculate an estimated location of the UE 100, in conjunction with the SPS receiver 181. The memory 111 may store indications (e.g., measurements) of the SPS signals 185 and/or other signals (e.g., signals acquired from the wireless transceiver 140) for use in performing positioning operations. The general-purpose processor 130, the DSP 131, and/or one or more specialized processors, and/or the memory 111 may provide or support a location engine for use in processing measurements to estimate
a location of the UE 100. Also or alternatively, some or all of the position determination signal processing may be performed by the PD 182.
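To illustrate the trilateration principle mentioned above in its simplest form, the following two-dimensional sketch solves for a position from three known anchor positions and measured ranges. Actual GNSS positioning solves a three-dimensional problem with a receiver clock-bias unknown over many satellites, so this is only a toy illustration.

```python
# Toy 2-D trilateration: the position is found by linearizing the range-circle
# equations (subtracting the first from the other two) and solving the 2x2 system.
def trilaterate_2d(anchors, ranges):
    """anchors: [(x1,y1),(x2,y2),(x3,y3)]; ranges: [r1,r2,r3]. Returns (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Example: anchors at (0,0), (10,0), (0,10) and a receiver at (3,4).
print(trilaterate_2d([(0, 0), (10, 0), (0, 10)], [5.0, 8.0623, 6.7082]))
```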
[0058] The position device (PD) 182 may be configured to determine a position of the UE 100 (including absolute and/or relative position of the UE 100), motion of the UE 100, and/or time. For example, the PD 182 may communicate with, and/or include some or all of, the SPS receiver 181. The PD 182 may use measurements from the SPS receiver 181 and/or the IMU 183 and/or the magnetometer 184 to determine position and/or motion of the UE 100, e.g., using trilateration and/or dead reckoning. The PD 182 may work in conjunction with the processor 110 and the memory 111 as appropriate to perform at least a portion of one or more positioning methods (to determine location of the UE 100), although the description herein may refer only to the PD 182 being configured to perform, or performing, one or more operations in accordance with the positioning method(s). The PD 182 may also or alternatively be configured to determine location of the UE 100 using terrestrial-based signals (e.g., at least some of signals 148 discussed below) for trilateration, for assistance with obtaining and using the SPS signals 185, or both. The PD 182 may be configured to use one or more other techniques (e.g., relying on the UE’s self-reported location (e.g., part of the UE’s position beacon)) for determining the location of the UE 100, and may use a combination of techniques (e.g., SPS and terrestrial positioning signals) to determine the location of the UE 100. The PD 182 may be configured to provide indications of uncertainty and/or error in the determined position and/or motion. Functionality of the PD 182 may be provided in a variety of manners and/or configurations (e.g., by the general purpose/application processor 130, the transceiver 115, the SPS receiver 181, and/or another component of the UE 100, and may be provided by hardware, software, firmware, or various combinations thereof).
[0059] The transceiver 115 may include a wireless transceiver 140 and/or a wired transceiver 150 configured to communicate with other devices through wireless connections and wired connections, respectively. For example, the wireless transceiver 140 may include a wireless transmitter 142 and a wireless receiver 144 coupled to one or more antennas 146 for transmitting (e.g., on one or more uplink channels and/or one or more sidelink channels) and/or receiving (e.g., on one or more downlink channels and/or one or more sidelink channels) wireless signals 148 and transducing signals from the wireless signals 148 to wired (e.g., electrical and/or optical) signals and from wired signals to the wireless signals 148. The wireless transceiver 140 may be configured for wireless communication to send
communications to, and receive communications from, a variety of entities such as other UEs, base stations, etc. Thus, the wireless transmitter 142 may include multiple transmitters that may be discrete components or combined/integrated components, and/or the wireless receiver 144 may include multiple receivers that may be discrete components or combined/integrated components. The wireless transceiver 140 may be configured to communicate signals (e.g., with Transmission/Reception Points (TRPs) and/or one or more other devices) according to a variety of radio access technologies (RATs) such as 5G New Radio (NR), GSM (Global System for Mobiles), UMTS (Universal Mobile Telecommunications System), AMPS (Advanced Mobile Phone System), CDMA (Code Division Multiple Access), WCDMA (Wideband CDMA), LTE (Long-Term Evolution), LTE Direct (LTE-D), 3GPP LTE-V2X (PC5), IEEE 802.11 (including IEEE 802.11p), WiFi, WiFi Direct (WiFi-D), Bluetooth®, Zigbee, etc. New Radio may use mm-wave frequencies and/or sub-6GHz frequencies. The wired transceiver 150 may include a wired transmitter 152 and a wired receiver 154 configured for wired communication, e.g., a network interface that may communicate with the network 230 of FIG. 2, e.g., to send communications to, and receive communications from, a gNode B (gNB), for example. The wired transmitter 152 may include multiple transmitters that may be discrete components or combined/integrated components, and/or the wired receiver 154 may include multiple receivers that may be discrete components or combined/integrated components. The wired transceiver 150 may be configured, e.g., for optical communication and/or electrical communication. The transceiver 115 may be communicatively coupled to the transceiver interface 114, e.g., by optical and/or electrical connection. The transceiver interface 114 may be at least partially integrated with the transceiver 115.
[0060] The wireless transceiver 140 may be configured for beam management to affect directionality of the wireless transceiver 140, e.g., of the antenna 146. For example, the wireless transceiver 140 may be configured to implement beam forming for transmission and/or reception of the signals 148. The antenna 146 may include multiple antennas that are configured, e.g., designed, made, disposed, and directed to point in different directions relative to a body of the UE 100. One or more of such antennas may be capable of electronic beam steering (e.g., using appropriate phase shifts of elements of the antenna) and/or mechanical beam steering. Also or alternatively, the transceiver 140 may be configured to selectively (e.g., under direction/control of the processor 110) transmit from one or more antennas and/or
to selectively process signals (e.g., to pass from the transceiver 115 to the processor 110 or to process by the processor 110) received from one or more antennas.
[0061] The user interface 116 may comprise one or more of several devices such as, for example, a speaker, microphone, display device, vibration device, keyboard, touch screen, etc. The user interface 116 may include more than one of any of these devices. The user interface 116 may be configured to enable a user to interact with one or more applications hosted by the UE 100. For example, the user interface 116 may store indications of analog and/or digital signals in the memory 111 to be processed by DSP 131 and/or the general-purpose processor 130 in response to action from a user. Similarly, applications hosted on the UE 100 may store indications of analog and/or digital signals in the memory 111 to present an output signal to a user. The user interface 116 may include an audio input/output (I/O) device comprising, for example, a speaker, a microphone, digital-to-analog circuitry, analog-to-digital circuitry, an amplifier and/or gain control circuitry (including more than one of any of these devices). Other configurations of an audio I/O device may be used. Also or alternatively, the user interface 116 may comprise one or more touch sensors responsive to touching and/or pressure, e.g., on a keyboard and/or touch screen of the user interface 116.
[0062] FIG. 2 illustrates an example wireless communications system 210, in accordance with aspects of the present disclosure. Wireless communications system 210 includes a user equipment (UE) 212, a UE 213, a UE 214, base transceiver stations (BTSs) 220, 221, 222, 223, a network 230, a core network 240, an external client 250, and a roadside unit (RSU) 260. The core network 240 (e.g., a 5G core network (5GC)) may include back-end devices including, among other things, an Access and Mobility Management Function (AMF) 241, a Session Management Function (SMF) 242, a server 243, and a Gateway Mobile Location Center (GMLC) 244. The AMF 241, the SMF 242, the server 243, and the GMLC 244 are communicatively coupled to each other. The server 243 may be, for example, a Location Management Function (LMF) that supports positioning of the UEs 212-214 (e.g., using techniques such as Assisted Global Navigation Satellite System (A-GNSS), OTDOA (Observed Time Difference of Arrival, e.g., Downlink (DL) OTDOA and/or Uplink (UL) OTDOA), Round Trip Time (RTT), Multi-Cell RTT, RTK (Real Time Kinematic), PPP (Precise Point Positioning), DGNSS (Differential GNSS), E-CID (Enhanced Cell ID), AoA (Angle of Arrival), AoD (Angle of Departure), etc.). The RSU 260 may be configured for communication (e.g., bi-directional or uni-directional communication) with the UEs 212-214.
For example, the RSU 260 may be configured with similar communication capabilities to any of the BTSs 220-223, but perhaps with different functionality(ies) (e.g., different programming). Also, while one RSU 260 is shown in FIG. 2, the system 210 may include more than one RSU, or may not include any RSUs. The communication system 210 may include additional or alternative components.
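As a simplified, hedged illustration of the RTT-based positioning technique listed above, the following Python sketch estimates a UE-to-BTS distance from a measured round-trip time after removing a reported turnaround delay; the function name and example values are hypothetical and do not reflect the server 243's actual implementation.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def rtt_distance_m(rtt_s: float, processing_delay_s: float) -> float:
    """Estimate one-way distance from a round-trip time measurement,
    subtracting the responder's reported turnaround (processing) delay."""
    return SPEED_OF_LIGHT_M_PER_S * (rtt_s - processing_delay_s) / 2.0

# Example: 10.2 microseconds measured RTT with a 3.0 microsecond turnaround delay.
print(f"{rtt_distance_m(10.2e-6, 3.0e-6):.1f} m")  # approximately 1079.3 m
```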
[0063] The communication system 210 may utilize information from a constellation 280 of satellite vehicles (SVs) 281, 282, 283. The constellation 280 may correspond to a respective Global Navigation Satellite System (GNSS) (i.e., Satellite Positioning System (SPS)) such as the Global Positioning System (GPS), the GLObal NAvigation Satellite System (GLONASS), Galileo, Beidou, or some other local or regional SPS such as the Indian Regional Navigational Satellite System (IRNSS), the European Geostationary Navigation Overlay Service (EGNOS), or the Wide Area Augmentation System (WAAS). Only three SVs are shown for the constellation 280, but constellations of GNSS SVs will include more than three SVs.
[0064] An LMF may also be referred to as a Location Manager (LM), a Location Function (LF), a commercial LMF (CLMF), or a value-added LMF (VLMF). The server 243 (e.g., an LMF) and/or one or more other devices of the system 210 (e.g., one or more of the UEs 212-214) may be configured to determine locations of the UEs 212-214. The server 243 may communicate directly with the BTS 221 (e.g., a gNB) and/or one or more other BTSs, and may be integrated with the BTS 221 and/or one or more other BTSs. The SMF 242 may serve as an initial contact point of a Service Control Function (SCF) (not shown) to create, control, and delete media sessions. The server 243 (e.g., an LMF) may be co-located or integrated with a gNB or a TRP (Transmission/Reception Point), or may be disposed remote from the gNB and/or TRP and configured to communicate directly or indirectly with the gNB and/or the TRP.
[0065] The AMF 241 may serve as a control node that processes signaling between the UEs 212-214 and the core network 240, and provides QoS (Quality of Service) flow and session management. The AMF 241 may support mobility of the UEs 212-214 including cell change and handover and may participate in supporting signaling connection to the UEs 212-214.
[0066] The system 210 is capable of wireless communication in that components of the system 210 can communicate with one another (at least some times using wireless connections) directly or indirectly, e.g., via the BTSs 220-223 and/or the network 230 (and/or one or more other devices not shown, such as one or more other base transceiver stations). While the BTSs
220-223 are shown separately from the network 230, the network 230 may include one or more of the BTSs 220-223 and may constitute a Radio Access Network (RAN), e.g., a New Radio (NR) RAN which may also be called a Fifth Generation (5G) Next Generation (NG) RAN (NG-RAN). For indirect communications, the communications may be altered during transmission from one entity to another, e.g., to alter header information of data packets, to change format, etc. The UEs 212-214 may communicate with the BTSs 220-223 via Uu interfaces, e.g., in RRC-encapsulated LPP messages (Radio Resource Control encapsulated LTE Positioning Protocol messages) over Uu interfaces. The UEs 212-214 shown are a smartphone, a tablet computer, and a vehicle-based device, but these are examples only as the UEs 212-214 are not required to be any of these configurations, and other configurations of UEs may be used (e.g., one or more of the UEs may be in the form of a drone). For example, such other devices may include Internet of Things (IoT) devices, UAVs, drones, medical devices, home entertainment and/or automation devices, etc. The core network 240 may communicate with the external client 250 (e.g., a computer system), e.g., to allow the external client 250 to request and/or receive location information regarding the UEs 212-214 (e.g., via the GMLC 244).
[0067] The UEs 212-214 or other devices may be configured to communicate in various networks and/or for various purposes and/or using various technologies (e.g., 5G, Wi-Fi communication, multiple frequencies of Wi-Fi communication, satellite positioning, one or more types of communications (e.g., GSM (Global System for Mobiles), CDMA (Code Division Multiple Access), LTE (Long-Term Evolution), V2X (e.g., V2P (Vehicle-to-Pedestrian), V2I (Vehicle-to-Infrastructure), V2V (Vehicle-to-Vehicle), etc.), IEEE 802.11p, etc.). V2X communications may be cellular (Cellular-V2X (C-V2X)) and/or WiFi (e.g., DSRC (Dedicated Short-Range Connection)). The system 210 may support operation on multiple carriers (waveform signals of different frequencies). Multi-carrier transmitters can transmit modulated signals simultaneously on the multiple carriers. Each modulated signal may be a Code Division Multiple Access (CDMA) signal, a Time Division Multiple Access (TDMA) signal, an Orthogonal Frequency Division Multiple Access (OFDMA) signal, a Single-Carrier Frequency Division Multiple Access (SC-FDMA) signal, etc. Each modulated signal may be sent on a different carrier and may carry pilot, overhead information, data, etc.
[0068] The BTSs 220-223 may wirelessly communicate with the UEs 212-214 in the system
210 via one or more antennas. A BTS may also be referred to as a base station, an access
point, a gNode B (gNB), an access node (AN), a Node B, an evolved Node B (eNB), etc. For example, each of the BTSs 220, 221 may be a gNB or a transmission point gNB, the BTS 222 may be a macro cell (e.g., a high-power cellular base station) and/or a small cell (e.g., a low-power cellular base station), and the BTS 223 may be an access point (e.g., a short-range base station configured to communicate with short-range technology such as WiFi, WiFi-Direct (WiFi-D), Bluetooth®, Bluetooth®-low energy (BLE), Zigbee, etc.). One or more of the BTSs 220-223 may be configured to communicate with the UEs 212-214 via multiple carriers. Each of the BTSs 220, 221 may provide communication coverage for a respective geographic region, e.g., a cell. Each cell may be partitioned into multiple sectors as a function of the base station antennas.
[0069] The BTSs 220-223 each comprise one or more Transmission/Reception Points (TRPs). For example, each sector within a cell of a BTS may comprise a TRP, although multiple TRPs may share one or more components (e.g., share a processor but have separate antennas). The system 210 may include only macro TRPs or the system 210 may have TRPs of different types, e.g., macro, pico, and/or femto TRPs, etc. A macro TRP may cover a relatively large geographic area (e.g., several kilometers in radius) and may allow unrestricted access by terminals with service subscription. A pico TRP may cover a relatively small geographic area (e.g., a pico cell) and may allow unrestricted access by terminals with service subscription. A femto or home TRP may cover a relatively small geographic area (e.g., a femto cell) and may allow restricted access by terminals having association with the femto cell (e.g., terminals for users in a home).
[0070] The UEs 212-214 may be configured to connect indirectly to one or more communication networks via one or more device-to-device (D2D) peer-to-peer (P2P) links. The D2D P2P links may be supported with any appropriate D2D radio access technology (RAT), such as LTE Direct (LTE-D), WiFi Direct (WiFi-D), Bluetooth®, Ultrawideband (UWB), and so on. One or more of a group of the UEs 212-214 utilizing D2D communications may be within a geographic coverage area of a TRP such as one or more of the BTSs 220-223. Other UEs in such a group may be outside such geographic coverage areas, or be otherwise unable to receive transmissions from a base station. Groups of the UEs 212-214 communicating via D2D communications may utilize a one-to-many (1:M) system in which each UE may transmit to other UEs in the group. A TRP of the BTSs 220-223 may
facilitate scheduling of resources for D2D communications. In other cases, D2D communications may be carried out between UEs without the involvement of a TRP.
[0071] FIG. 3 illustrates an example implementation of a system-on-a-chip (SOC) 300, which may include a central processing unit (CPU) 302 or a multi-core CPU, configured to perform one or more of the functions described herein. In one or more examples, the SOC 300 may be implemented within a computing device (e.g., network device 620 of FIG. 6 in the form of a UE as a smart phone, or network device 920 of FIG. 9 in the form of a UE as a smart phone) and/or a drone (e.g., network device 610 of FIG. 6 in the form of a drone, or network devices 930a, 930b of FIG. 9 in the form of a drone).
[0072] Parameters or variables (e.g., neural signals and synaptic weights), system parameters associated with a computational device (e.g., neural network with weights), delays, frequency bin information, task information, among other information may be stored in a memory block associated with a neural processing unit (NPU) 308, in a memory block associated with a CPU 302, in a memory block associated with a graphics processing unit (GPU) 304, in a memory block associated with a digital signal processor (DSP) 306, in a memory block 318, and/or may be distributed across multiple blocks. Instructions executed at the CPU 302 may be loaded from a program memory associated with the CPU 302 or may be loaded from a memory block 318.
[0073] The SOC 300 may also include additional processing blocks tailored to specific functions, such as a GPU 304, a DSP 306, and a connectivity block 310, which may include fifth generation (5G) connectivity, fourth generation long term evolution (4G LTE) connectivity, Wi-Fi connectivity, USB connectivity, Bluetooth connectivity, Ultrawideband (UWB) connectivity, and the like. In one implementation, the NPU is implemented in the CPU 302, DSP 306, and/or GPU 304. The SOC 300 may also include a sensor processor 314, image signal processors (ISPs) 316, and/or navigation module 320, which may include a global navigation satellite system (GNSS) and/or global positioning system (GPS).
[0074] SOC 300 and/or components thereof may be configured to evaluate environmental conditions. For example, the sensor processor 314 may receive and/or process information from one or more sensors 322. Examples of sensors 322 may include one or more Inertial Measurement Units (IMUs) (e.g., an accelerometer, a gyroscope, etc.), temperature sensors, light sensors, shock sensors, humidity sensors, acceleration sensors, speed sensors, tilt angle sensors, and/or other sensors of a device. In some cases, the sensors 322 may be located on SOC 300.
In other cases, the sensor processor 314 may also be coupled to one or more sensors (not shown) that are external to the SOC 300 (e.g., located on a separate chip). In some cases, the sensor processor 314 may also receive, as input, output of one or more processing blocks of the connectivity block 310.
[0075] FIG. 4 is a diagram illustrating an unmanned aerial vehicle (UAV) 420 in the form of a drone, which may perform deliveries of items (e.g., parcels) for users (e.g., customers). The UAV 420 illustrated in FIG. 4 may be an example of a drone that follows the disclosed protocols for drone landings, such as for performing emergency landings or for delivering items. The UAV 420 includes a VL camera 410 adjacent to an IR camera 430 along a front portion of a body of the UAV 420. The UAV 420 includes multiple propellers 425 along the top of the UAV 420 (which can be a means for traveling along a path). The propellers 425 may be spaced apart from the body of the UAV 420 by one or more appendages to prevent the propellers 425 from snagging on circuitry on the body of the UAV 420 and/or to prevent the propellers 425 from occluding the view of the VL camera 410 and/or the IR camera 430. The propellers 425 may act as a conveyance of the UAV 420, and may be motorized using one or more motors. The motors, and thus the propellers 425, may be actuated to move the UAV 420 via a movement actuator housed within the body of the UAV 420.
[0076] As previously mentioned, drones may be used for delivery of items (e.g., parcels) to users. Certain paths (or routes) where drones fly can be riskier for drone operations than other paths. For example, external environmental conditions (e.g., high speed winds, sudden heavy rainfall, and storms) may impact a drone as it travels along a path or route. In some cases, the drone may need to change its path to a new or updated path, such as based on the environmental conditions. Such a change in path may cause damage to the drone and/or cause the drone to lose connection with a ground controller (e.g., a controller operated by an operator of the drone). As such, with some paths (e.g., paths with inclement weather conditions), there can be a high probability that a drone may experience technical issues (e.g., mechanical and/or operational failures) and, as such, the drone can be required to perform an emergency landing on private or public property.
[0077] Flying a drone over private property may be legal in some geographic locations, such as places where there is no need for permission to fly drones over private property. However, some places (e.g., states and local municipalities, such as in the United States and/or other
countries) have laws that prohibit flying drones over and landing drones on private property, such as due to privacy concerns. For instance, the FAA is responsible for civil aviation safety within the United States, and regulates the United States airspace. Since the FAA is a federal agency, the FAA rules can preempt the state and local municipality laws. The FAA divides the airspace over the United States into two categories, which include controlled airspace and uncontrolled airspace.
[0078] FIG. 5 is a diagram illustrating an example of airspace 500 of an area (e.g., shown to include houses 560 and businesses 570), where the airspace 500 is shown to be divided into a controlled airspace 520 (e.g., shown to include airplanes 540a, 540b) and an uncontrolled airspace 530 (e.g., shown to include drones 550a, 550b, 550c). The controlled airspace 520 is defined to be located above 400 feet from the ground (e.g., as denoted by dashed line 510), and the uncontrolled airspace 530 is defined to be located within 400 feet from the ground.
[0079] The FAA has rules regarding where commercial and recreational drone pilots are able to fly and not fly their drones within the controlled airspace. For example, the FAA has laws and regulations for recreational pilots flying their drones. These FAA laws and regulations include requiring flying the drone while maintaining a visual line of sight to the drone, flying the drone for recreational purposes (not for earning money), not flying the drone at night if the drone does not have recommended lights, properly registering the drone before flying the drone, ensuring the drone does not weigh more than fifty-five (55) pounds unless certified by a community-based organization, not flying the drone over people or moving vehicles, not flying the drone near emergency response activities, and not flying the drone near any crewed aircraft. The FAA also has some laws and regulations for commercial drone pilots flying their drones. These FAA laws and regulations include requiring a pilot to obtain an FAA-issued remote pilot certificate in order to become a commercial drone pilot to fly a drone commercially, flying the drone at a speed lower than 100 miles per hour (mph), flying the drone only in a Class G designated airspace, always giving way to crewed aircraft, and not flying the drone off of a moving vehicle, except when in a sparsely populated area. As such, in the industry today, a generalized protocol for landing drones, such as for performing emergency landings or for delivering items, on private property can be useful.
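Purely as an illustrative sketch (and not a statement of applicable regulations), a few of the numeric constraints mentioned above, namely the 55-pound weight limit, the 100 mph commercial speed limit, and the 400-foot uncontrolled-airspace boundary of FIG. 5, could be encoded as a simple pre-flight check in Python; the function name, the grouping of these thresholds, and the return format are assumptions.

```python
def preflight_issues(weight_lb: float, planned_speed_mph: float, planned_altitude_ft: float) -> list[str]:
    """Return a list of human-readable issues for a planned drone flight,
    based on the example constraints discussed above (illustrative only)."""
    issues = []
    if weight_lb > 55:
        issues.append("drone weighs more than 55 lb")
    if planned_speed_mph >= 100:
        issues.append("planned speed is not below 100 mph")
    if planned_altitude_ft > 400:
        issues.append("planned altitude is above the 400 ft uncontrolled-airspace boundary")
    return issues

print(preflight_issues(weight_lb=12.0, planned_speed_mph=45.0, planned_altitude_ft=380.0))  # []
print(preflight_issues(weight_lb=60.0, planned_speed_mph=110.0, planned_altitude_ft=500.0))
```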
[0080] In one or more aspects, systems and techniques are described herein for providing protocols for drone landing. In one or more examples, the systems and techniques involve
receiving a pre-authorization from a user (e.g., a land owner) for a drone to land in an area (e.g., land) associated with the user. In one or more aspects, a pre-authorization for a drone to land in an area can be previously provided by a user associated with the area (e.g., a land owner of a parcel of land). In one or more examples, an administrator (e.g., a company) associated with the drone may provide an online sign-up option (e.g., via a web application running on a computing device, such as a computer, associated with the user, or a mobile application running on a computing device, such as a smart phone, associated with the user) for a user (e.g., land owner) to provide consent (e.g., authorization) for a drone associated with the administrator (e.g., company) to fly over and land in the area associated with the user (e.g., the land owner of the parcel of land). In some examples, during the online sign-up option, the user (e.g., land owner) can provide restrictions for the landing of the drone in the area. In one or more examples, the restrictions may include, but are not limited to, a restriction indicating no operation of a camera on the drone while in the area, a restriction indicating no operation of a sensor(s) on the drone while in the area, a restriction indicating a reduced speed of the drone (e.g., drone blades) while in the area, and/or a restriction indicating no landing of the drone in the area when people are present within the area.
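A minimal sketch, assuming a Python representation, of how such a pre-authorization and its restrictions might be captured as a data structure is shown below; the class names and fields are hypothetical and are not a required message or storage format.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class LandingRestrictions:
    """Restrictions a land owner may attach to a landing pre-authorization."""
    no_camera_operation: bool = False        # no camera use while the drone is in the area
    no_sensor_operation: bool = False        # no sensor use while the drone is in the area
    max_speed_m_per_s: Optional[float] = None  # reduced drone/blade speed while in the area
    no_landing_when_people_present: bool = False

@dataclass
class PreAuthorization:
    """A user's standing consent for unplanned drone landings in an area."""
    user_id: str
    area_id: str                             # e.g., an identifier for a parcel of land
    restrictions: LandingRestrictions = field(default_factory=LandingRestrictions)

# Example: a land owner opts in but disallows cameras and landings when people are present.
auth = PreAuthorization(
    user_id="user-123",
    area_id="area-650",
    restrictions=LandingRestrictions(no_camera_operation=True,
                                     no_landing_when_people_present=True),
)
print(auth)
```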
[0081] In one or more examples, after the administrator (e.g., company) associated with the drone receives a pre-authorization from the user (e.g., land owner) for landing of the drone in the area (e.g., land owned by the land owner), a computing device (e.g., server) associated with the administrator (e.g., company) may perform route planning for drones to travel through and/or to the area (e.g., including route planning for the drone to travel to the area). In some examples, the administrator (e.g., company) can generate a map (e.g., an emergency map) for drones to fly and to land only near and on properties where pre-authorizations to fly and land were received. In one or more examples, when a drone needs to make an emergency landing in an area where an administrator (e.g., company) associated with the drone has received a preauthorization to land from a user associated with the area (e.g., a land owner of a parcel of land), the drone may send (e.g., transmit) a landing notification message to a computing device associated with the user (e.g., a smart phone of the land owner) to notify the user of the emergency landing of the drone in the area.
[0082] FIG. 6 is a diagram illustrating an example of a system 600 for providing preauthorization for unplanned drone landings. In FIG. 6, the system 600 is shown to include a network device 610 (e.g., in the form of a drone), a network device 620 (e.g., a computing
device or UE in the form of a smart phone), a network device 640 (e.g., in the form of a base station, such as a gNB), and a network entity 630 (e.g., a computing device in the form of a server).
[0083] In FIG. 6, the network device 610, shown to be in the form of a drone, may be associated with the network entity 630 and an administrator (e.g., a company, such as a drone company, that owns the drone and uses the drone to deliver items to customers). In some examples, the network device 610 may be in the form of other types of remotely controlled devices, such as other types of robotic devices or vehicles.
[0084] In the system 600 of FIG. 6, the network device 620, shown to be in the form of a smart phone, may be associated with a user (e.g., a customer who is purchasing one or more items from the company to be delivered to a location of the user's choice). In some examples, the network device 620 may be in the form of other types of computing devices that may be associated with a user (e.g., a customer), such as a tablet, a wearable device, a smart watch, a laptop computer, and a desktop computer. The network device 620 may be located within an area 650 that is associated with the user (e.g., customer). For example, the area 650 may be property (e.g., private property) owned by the user (e.g., customer). In some examples, the user (e.g., customer) may choose to have the one or more items that the user has purchased from the company delivered to the area 650 (e.g., private property).
[0085] In FIG. 6, the network entity 630, shown to be in the form of a server, may be associated with an administrator (e.g., a company that owns the drone and uses the drone to deliver items to customers). In one or more examples, the system 600 may include a greater or lesser number of network devices 610, 620, 640 and/or network entities 630 than shown in FIG. 6. In some examples, the system 600 may include different types of network devices 610, 620, 640 and/or network entities 630 than as shown in FIG. 6.
[0086] In one or more examples, during operation of the system 600 for pre-authorization for drone landings, an administrator (e.g., a company that owns drones for delivery of items to customers, referred to as a drone company) may provide online sign-up options for a user (e.g., a property owner, which may be a customer of the company) to authorize (e.g., to provide a pre-authorization) the landing of one or more drones in an area (e.g., private property) owned by the user for an unplanned landing of the drone(s). In one or more examples, the unplanned landing may be any landing of a drone that is not scheduled, such as an emergency landing of
a drone, a landing of a drone for refueling of the drone, a landing of a drone for providing maintenance to the drone, a landing of a drone due to inclement weather, and a landing of a drone due to congested airspace.
[0087] In one or more examples, the administrator (e.g., a drone company) may provide the online sign-up options for pre-authorization for drone landings to a user (e.g., a property owner) via a web browser (e.g., running on a desktop computer or laptop computer of the user) and/or via a mobile application (e.g., running on a smart phone or wearable device of the user). The web browser and/or mobile application (e.g., hosted by the drone company) may have stored within its database information associated with the user (e.g., the property owner), such as the address (e.g., property address) of the user, the contact information (e.g., email address and/or phone number) of the user, the shopping history of the user, etc.
[0088] FIG. 7 shows an example of a web browser that provides these online sign-up options to a user. In particular, FIG. 7 is a diagram illustrating an example of a web browser 700 for pre-authorization for unplanned drone landings. In FIG. 7, the web browser 700 is shown to include a search box 710 for searching keywords and terms, and a login and security drop-down menu 720. When the login and security drop-down menu 720 is selected by a user, a number of options are presented to the user that can include, but are not limited to, a login option 730, a sign up for drone emergency (or unplanned) landings option 740, a switch accounts option 750, a your addresses option 760, a manage saved identifications (IDs) option 770, and a manage your profiles option 780. The user (e.g., a property owner, which may be a customer of the company) can select the drone emergency (or unplanned) landings option 740, and can specifically give consent (e.g., sign up to provide pre-authorization) to the administrator (e.g., drone company) to perform unplanned landing of drones on the property (e.g., private property) owned by the user (e.g., property owner).
[0089] In one or more examples, the administrator (e.g., drone company) can allow (e.g., via a web browser, such as the web browser 700, or via a mobile application) the user (e.g., property owner) to specify some restrictions for landing drones on the user's property (e.g., area 650). In one or more examples, the restrictions (e.g., which may be related to safety) that the user (e.g., property owner) may choose for the landing of drones on the user's property (e.g., the area 650) may include, but are not limited to, a restriction indicating no operation of a camera on the drone while the drone is located within the property (e.g., in the area 650), a
restriction indicating no operation of at least one sensor on the drone while the drone is located within the property (e.g., in the area 650), a restriction indicating a reduced speed for the drone while the drone is located within the property (e.g., in the area 650), and/or a restriction indicating no landing of the drone on the property (e.g., in the area 650) when people are present on the property (e.g., in the area 650).
[0090] In one or more examples, the administrator (e.g., drone company) can offer (e.g., via a web browser, such as the web browser 700, or via a mobile application) compensation to the user (e.g., property owner) to provide pre-authorization to perform unplanned landings of drones on property owned by the user. In one or more examples, the compensation can include, but is not limited to, money, one or more free services, one or more free items, and/or one or more discounts on services and/or items.
[0091] Referring again to FIG. 6, a user (e.g., a property owner, which may be a customer of the drone company) may use a computing device (e.g., network device 620) associated with the user to access a web browser (e.g., web browser 700 of FIG. 7) or application to sign up to provide pre-authorization for unplanned landings of drones on property (e.g., area 650) owned by the user (e.g., property owner). Once the user (e.g., property owner) signs up on the web browser or application to provide the authorization for unplanned landings of drones on the property (e.g., in the area 650), the computing device (e.g., network device 620) can send (e.g., via wireless signals 605, 615 via the network device 640, and wired signal 625 via the internet 660) an authorization message to the network entity 630 (e.g., a computing device associated with the drone). The authorization message can include the authorization by the user to land drones in the property (e.g., area 650) owned by the user (e.g., property owner) for unplanned landings of drones (and/or for scheduled landings for deliveries) and can, optionally, include one or more restrictions chosen by the user (e.g., property owner).
[0092] Once the network entity 630 (e.g., computing device associated with the drone) receives the authorization message (e.g., which includes the pre-authorization and, optionally, the restrictions) from the network device 620 (e.g., computing device associated with the user, who is the property owner), the network entity 630 can determine route planning for drones travelling in the area 650 owned by the user (e.g., property owner) based on information within the authorization message. In one or more examples, the network entity 630 can determine the route planning additionally based on pre-authorizations from other users (e.g., other property
owners) for unplanned drone landings on their properties. In some examples, the network entity 630 can determine the route planning additionally based on a planned delivery, by one or more drones, of one or more items to one or more locations, which may be different than the area 650. In one or more examples, for the route planning, the network entity 630 (e.g., computing device associated with the drone) can generate a map for unplanned drone landings. The map can include one or more paths (e.g., determined by the network entity 630) for one or more drones to travel to one or more areas (e.g., area 650), which have owners that have pre-authorized drone landings for those one or more areas. One of the paths (e.g., determined by the network entity 630) can be a path for the network device 610 (e.g., drone) to travel to the area 650 (e.g., for an unplanned landing or for a delivery of one or more items).
[0093] FIG. 8 shows an example of a map for unplanned drone landings. In particular, FIG. 8 is a diagram illustrating an example of a map 800 for unplanned drone landings (e.g., an emergency landing map). In FIG. 8, the map 800 is shown to include multiple areas 830, 840, 850, 860, 870, 880. Property owners (e.g., users) of the areas 840, 850, 860, 880 have provided pre-authorization for unplanned drone landings. These areas 840, 850, 860, 880 are referred to as “opt in” areas. Property owners (e.g., users) of the areas 830, 870 have not provided pre-authorization for unplanned drone landings. These areas 830, 870 are referred to as “opt out” areas.
[0094] In the example shown in FIG. 8, a drone 810 may have a planned delivery (e.g., a scheduled delivery) to deliver one or more items to a destination 820 (e.g., a house) located within the area 880, which is designated as an “opt in” area. The network entity (e.g., computing device associated with the drone) may initially determine a path (e.g., an original path 805) for the drone 810 to travel that is the shortest path for the drone 810 to travel to the destination 820. In FIG. 8, the original path 805 is shown to span over areas 860, 870, 880. However, since area 870 is an “opt out” area, drones cannot perform any unplanned landings within area 870.
[0095] For the possibility of any unplanned landings (as well as for planned landings, such as for planned deliveries), the network entity (e.g., computing device associated with the drone) can determine a different path (e.g., an updated path 815) for the drone 810 to travel to the destination 820 such that the drone 810 will only travel over “opt in” areas. In FIG. 8, the updated path 815 is shown to span over areas 860, 840, 850, 880. Since all of the areas 860,
840, 850, 880 in the updated path 815 are “opt in” areas, drones may perform unplanned landings in any of the areas 860, 840, 850, 880, if needed.
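The path selection illustrated in FIG. 8 can be treated as a shortest-path search in which travel is restricted to “opt in” areas. The following Python sketch uses a breadth-first search over a hypothetical adjacency of the areas of the map 800; the adjacency data and function names are illustrative assumptions rather than the network entity's actual planner.

```python
from collections import deque

def plan_path(adjacency: dict[str, list[str]], opt_in: set[str], start: str, destination: str):
    """Breadth-first search restricted to 'opt in' areas; returns a list of
    area identifiers from start to destination, or None if no such path exists."""
    if start not in opt_in or destination not in opt_in:
        return None
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for neighbor in adjacency.get(path[-1], []):
            if neighbor in opt_in and neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

# Hypothetical adjacency roughly mirroring FIG. 8: area 870 is adjacent to 860 and 880
# but is "opt out", so the planner routes 860 -> 840 -> 850 -> 880 instead.
adjacency = {"860": ["870", "840"], "840": ["860", "850"], "850": ["840", "880"],
             "870": ["860", "880"], "880": ["850", "870"]}
opt_in = {"840", "850", "860", "880"}
print(plan_path(adjacency, opt_in, start="860", destination="880"))  # ['860', '840', '850', '880']
```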
[0096] Referring back to FIG. 6, after the network entity 630 has determined a path for the drone 610 to travel to the area 650, the network entity 630 can transmit a command (e.g., via signal 635 transmitted wirelessly) to the drone 610 that includes the path (e.g., path information). After the drone 610 receives the command from the network entity 630, the command can cause the drone 610 to travel (fly), following the path, to the area 650 and to land in the area 650 (e.g., for an unplanned landing or for a scheduled delivery of one or more items).
[0097] In one or more aspects, the systems and techniques involve receiving a dynamic on-the-spot (e.g., real-time) authorization from the user for the drone to land in the area associated with the user. In one or more examples, when a drone needs to make an emergency landing in an area, and an administrator (e.g., company) associated with the drone has not received a pre-authorization from a user (e.g., land owner) associated with the area, the drone may broadcast (e.g., transmit) an authorization request message to the area requesting consent (e.g., authorization) for the drone to fly over and land in the area. In some examples, a computing device (e.g., server) associated with the administrator (e.g., company) associated with the drone can command the drone to broadcast the authorization request message to the area. In some examples, the drone can wirelessly broadcast the authorization request message to the area, such as via a sidelink communication and/or a wireless wide area network (WWAN) communication. In one or more examples, the authorization request message can include additional information including, but not limited to, images of a portion of the area (e.g., land) for landing of the drone, a height of the ground of the area for the landing, parcel information related to the area for the landing, a speed of the drone, and/or nearby object detection information.
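A non-limiting sketch of an authorization request message of the kind described above is given below in Python; the use of JSON and the specific field names are illustrative assumptions, not a prescribed over-the-air format.

```python
import json

def build_authorization_request(drone_id: str, image_refs: list[str], height_above_ground_m: float,
                                parcel_info: str, speed_m_per_s: float, nearby_objects: list[str]) -> str:
    """Serialize an authorization request that a drone could broadcast
    (e.g., over sidelink or WWAN signaling) to an area it would like to land in."""
    return json.dumps({
        "type": "landing_authorization_request",
        "drone_id": drone_id,
        "landing_site_images": image_refs,        # images of the portion of the area for landing
        "height_above_ground_m": height_above_ground_m,
        "parcel_info": parcel_info,               # information about the parcel being carried
        "speed_m_per_s": speed_m_per_s,
        "nearby_object_detections": nearby_objects,
    })

msg = build_authorization_request(
    drone_id="drone-930a",
    image_refs=["img/landing_site_01.jpg"],
    height_above_ground_m=35.0,
    parcel_info="1 parcel, 1.2 kg, sender: example delivery company",
    speed_m_per_s=4.0,
    nearby_objects=["tree", "mailbox"],
)
print(msg)
```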
[0098] In one or more examples, after a computing device (e.g., smart phone) associated with the user associated with the area (e.g., a landowner of a parcel of land) receives the authorization request message, in response, the user (e.g., land owner) may provide, via the computing device (e.g., smart phone) associated with the user, authorization for the landing in the area to the drone itself and/or to a computing device (e.g., server) associated with the administrator (e.g., company) associated with the drone. In some examples, the user (e.g., land
owner) may assign someone else as an agent with authority to provide consent (e.g., authorization) on behalf of the user (e.g., land owner) for the landing of the drone in the area. In one or more examples, after the drone and/or the computing device (e.g., server) of the administrator (e.g., company) associated with the drone receives an authorization from the user (e.g., land owner), or from the user's agent, for landing of the drone in the area (e.g., land), the computing device (e.g., server) associated with the administrator (e.g., company) may perform route planning for the drone to travel to the area and to land in the area.
[0099] FIG. 9 is a diagram illustrating an example of a system 900 for dynamic real-time authorizing of drone landings (e.g., for unplanned landings or for scheduled deliveries of items). In FIG. 9, the system 900 is shown to include network devices 910a, 910b (e.g., in the form of base stations, such as gNBs), network device 920 (e.g., computing device associated with a user in the form of a smart phone), network devices 930a, 930b (e.g., in the form of drones), a house 950 (e.g., owned by the user associated with the network device 920), and an in-vehicle station 940. In one or more examples, the house 950 can include one or more home equipped devices (e.g., connected to everything devices) that have wireless communication capability.
[0100] In one or more examples, the system 900 may include a greater or lesser number of the network devices 910a, 910b, the network device 920, the network devices 930a, 930b, the house 950, and/or the in-vehicle station 940. In some examples, the network devices 910a, 910b, the network device 920, the network devices 930a, 930b, the house 950, and/or the in-vehicle station 940 may be implemented in different forms than as shown in FIG. 9.
[0101] During operation of the system 900, a network entity (e.g., server associated with one or more drones, such as a server associated with a drone company) may send one or more commands to one or both of the network devices 930a, 930b (e.g., drones) to cause one or both of the network devices 930a, 930b (e.g., drones) to send (e.g., broadcast) one or more authorization request messages (e.g., emergency messages) to one or more areas in which one or both of the network devices 930a, 930b (e.g., drones) would like to land (e.g., immediately land).
[0102] After receiving the one or more commands, one or both of the network devices 930a, 930b (e.g., drones) can transmit (e.g., broadcast) one or more authorization request messages (e.g., emergency messages) to one or more areas in which the one or both of the network devices
930a, 930b (e.g., drones) would like to land. In some examples, the one or more areas may include an area including the network device 920 and the house 950, and an area including the in-vehicle station 940. In some examples, the one or both of the network devices 930a, 930b (e.g., drones) can send the authorization request messages (e.g., emergency messages) via sidelink signaling and/or via wireless wide area network (WWAN) signaling. In some examples, the authorization request messages (e.g., emergency messages) may include information related to the landing of the one or both of the network devices 930a, 930b (e.g., drones) in the one or more areas. The information related to the landing of the one or both of the network devices 930a, 930b (e.g., drones) may include, but is not limited to, photos of one or more desired locations within the one or more areas to land, a height that the one or both of the drones is located above the ground, information regarding one or more parcels carried by the one or both of the drones, information regarding the company sending the one or more parcels, a speed of travel of the one or both of the drones, and/or information regarding landmarks nearby the one or more desired locations within the one or more areas to land.
[0103] In one or more examples, the network device 920 and/or the one or more home equipped devices within the house 950 can receive the authorization request messages. In response, the user (e.g., associated with the network device 920 who owns the area that includes the house 950) may provide a dynamic real-time authorization (e.g., via a web browser and/or mobile application running on the network device 920 and/or running on one or more of the home equipped devices located within the house 950) to a network entity (e.g., server associated with the drone, such as a server associated with the drone company) to provide authorization for one or more drones to land (e.g., immediately land) in an area (e.g., including the house 950) owned by the user.
[0104] Once the user (e.g., property owner of the area including the house 950) provides the dynamic real-time authorization for landings of drones on the property (e.g., in the area including the house 950), the network device 920 can send an authorization message to the network entity (e.g., the server associated with the drones). The authorization message may include the authorization by the user to land drones in the property (e.g., area including the house 950) owned by the user (e.g., property owner) and can, optionally, include one or more restrictions chosen by the user (e.g., property owner).
[0105] Once the network entity (e.g., the server associated with the drones) receives the authorization message (e.g., which includes the dynamic real-time authorization and, optionally, the restrictions) from the network device 920 (e.g., computing device associated with the user, who owns the area including the house 950), the network entity can determine route planning for the one or both of the network devices 930a, 930b (e.g., drones) travelling to the area owned by the user (e.g., property owner), based on information within the authorization message.
[0106] After the network entity has determined a path for the one or both of the network devices 930a, 930b (e.g., drones) to travel to the area including the house 950, the network entity can transmit one or more commands to the one or both of the network devices 930a, 930b (e.g., drones) that include the path(s) (e.g., path information). After the one or both of the network devices 930a, 930b (e.g., drones) receives the one or more commands from the network entity, the one or more commands can cause the one or both of the network devices 930a, 930b (e.g., drones) to travel (fly), following the path(s), to the area and to land in the area (e.g., for an unplanned landing).
[0107] In some cases, a camera (or multiple cameras in some cases) located in the area of the user can capture images of the area including the house 950. For example, the camera(s) can include one or more Internet Protocol (IP) cameras on the house 950, such as a surveillance camera, a doorbell camera, etc. The camera(s) can capture images of a drone (e.g., network device 930a or network device 930b of FIG. 9) as the drone flies within a field of view of the camera. The camera or a system associated with the camera (e.g., a network entity, such as a server associated with the camera) can perform an image processing operation (e.g., object detection, motion detection, and/or other operation) on the images to detect the presence of the drone in the images. In response to detecting the drone in the images, the camera or the system associated with the camera can transmit a notification to the network device 920 of the user to provide authorization for landing the drone in the area including the house 950. For example, the notification can be displayed via a web application running on the network device 920, a mobile application running on the network device 920, etc. The user can then provide user input (e.g., touch input, voice input, gesture input, etc.) to the network device 920 to provide consent (e.g., authorization) for the drone to fly over and land in the area including the house 950.
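One simple way such a camera system could flag possible drone activity is frame differencing between consecutive images. The Python sketch below, which uses only NumPy and an arbitrary threshold, is an illustrative stand-in for the object/motion detection described above, not the camera's actual algorithm; the notification call is a placeholder.

```python
import numpy as np

def motion_detected(prev_frame: np.ndarray, curr_frame: np.ndarray,
                    pixel_threshold: int = 25, changed_fraction: float = 0.01) -> bool:
    """Return True if enough pixels changed between two grayscale frames
    to suggest a moving object (e.g., a drone) entered the field of view."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return np.mean(diff > pixel_threshold) > changed_fraction

def maybe_notify_user(prev_frame: np.ndarray, curr_frame: np.ndarray) -> None:
    """If motion is detected, prompt the user to authorize (or decline) a drone
    landing; the print is a placeholder for the actual notification transport."""
    if motion_detected(prev_frame, curr_frame):
        print("notification: possible drone detected; open app to authorize landing")

# Example with synthetic frames: a bright blob appears in the second frame.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 70:90] = 200
maybe_notify_user(prev, curr)
```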
[0108] In one or more aspects, the systems and techniques involve the drone landing in an area without any authorization (e.g., with no pre-authorization) from a user (e.g., land owner) associated with the area (e.g., land), where an administrator (e.g., a company) associated with the drone can provide some compensation (e.g., money, free services, free items, discounts on services, and/or discounts on items) to the user associated with the area (e.g., a land owner of a parcel of land) for the landing of the drone. In one or more examples, when a drone lands, such as for an emergency landing, in an area without any authorization from a user associated with the area (e.g., a land owner of a parcel of land), an administrator (e.g., a company) associated with the drone can provide some compensation (e.g., money and/or free services) to the user (e.g., land owner) associated with the area (e.g., land) for the landing of the drone.
[0109] FIG. 10 is a flow chart illustrating an example of a process 1000 for wireless communications utilizing methods for protocols for drone landing. The process 1000 can be performed by a first computing device (e.g., network entity 630 of FIG. 6 in the form of a server) or by a component or system (e.g., a chipset) of the first computing device. The operations of the process 1000 may be implemented as software components that are executed and run on one or more processors (e.g., processor 1210 of FIG. 12 or other processor(s)). Further, the transmission and reception of signals by the first computing device in the process 1000 may be enabled, for example, by one or more antennas and/or one or more transceivers (e.g., wireless transceiver(s)) associated with the first computing device.
[0110] At block 1010, the first computing device (or component thereof) can receive, from a second computing device (e.g., network device 620 of FIG. 6, the network device 920 of FIG. 9, etc.) associated with a user, an authorization message including an authorization by the user to land the drone in an area (e.g., area 650 of FIG. 6, the house 950 of FIG. 9, etc.) associated with the user for an unplanned landing of the drone (e.g., an emergency landing of the drone). In one illustrative example, the area associated with the user is a property associated with the user (e.g., a home of the user, a business location owned or leased by the user, etc.). In some cases, the authorization message further includes one or more restrictions by the user for landing the drone in the area. In some examples, the one or more restrictions include a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, a restriction indicating no landing of the drone in the area when people are present within the area, any combination thereof, and/or other restrictions.
[0111] In some cases, the authorization message is received based on a notification provided to the second computing device associated with the user. For instance, the notification may be based on detection of the drone by at least one camera located in the area. In one illustrative example, a camera or a system associated with the camera (e.g., a network entity, such as a server associated with the camera) can perform an image processing operation (e.g., object detection, motion detection, and/or other operation) on the images to detect the presence of the drone in the images. In response to detecting the drone in the images, the camera or the system associated with the camera can transmit a notification to the second computing device associated with the user to provide authorization for landing the drone in the area including the house.
[0112] At block 1020, the first computing device (or component thereof) can determine a path for the drone to travel to the area, based on receiving the authorization message. In some aspects, the first computing device (or component thereof) can determine the path for the drone to travel to the area further based on a planned delivery of one or more items to a location. In some aspects, the first computing device (or component thereof) can generate, based on receiving the authorization message, a landing map for unplanned landings, where the landing map includes the path.
[0113] At block 1030, the first computing device (or component thereof) can cause the drone to follow the path to travel to the area and to land in the area. In some cases, the location is different from the area associated with the user. For instance, the location can be a business to which an item is being delivered according to a planned delivery, and the area associated with the user can be a property (e.g., a home) of the user.
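Blocks 1010-1030 could be combined on the first computing device roughly as in the following Python sketch; the message structure, the placeholder planner, and the command-sending function are hypothetical stand-ins for whatever the server actually uses.

```python
from dataclasses import dataclass

@dataclass
class AuthorizationMessage:
    user_id: str
    area_id: str
    restrictions: dict          # e.g., {"no_camera_operation": True}

def determine_path(area_id: str) -> list[str]:
    """Placeholder route planner: return an ordered list of waypoint/area ids."""
    return ["area-860", "area-840", "area-850", area_id]

def send_command_to_drone(drone_id: str, path: list[str]) -> None:
    """Placeholder for transmitting a path-following command to the drone."""
    print(f"command to {drone_id}: follow {path} and land")

def handle_unplanned_landing(drone_id: str, msg: AuthorizationMessage) -> None:
    # Block 1010: receive the authorization message from the user's device.
    # Block 1020: determine a path to the authorized area.
    path = determine_path(msg.area_id)
    # Block 1030: cause the drone to follow the path and land in the area.
    send_command_to_drone(drone_id, path)

handle_unplanned_landing("drone-610",
                         AuthorizationMessage(user_id="user-123", area_id="area-650",
                                              restrictions={"no_camera_operation": True}))
```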
[0114] In some aspects, the first computing device (or component thereof) can cause the drone to broadcast an authorization request message. For instance, as described with respect to FIG. 9, network devices 930a, 930b (e.g., drones) can send authorization request messages (e.g., emergency messages) via sidelink signaling and/or via wireless wide area network (WWAN) signaling. In some cases, the authorization request messages can include information related to the landing of the drone (e.g., one or both of the network devices 930a, 930b of FIG. 9) in the one or more areas. The information related to the landing may include, but is not limited to, photos of one or more desired locations within the one or more areas to land, a height that the one or both of the drones is located above the ground, information regarding one or more
parcels carried by the one or both of the drones, information regarding the company sending the one or more parcels, a speed of travel of the one or both of the drones, information regarding landmarks nearby the one or more desired locations within the one or more areas to land, any combination thereof, and/or other information. In some examples, the second computing device (e.g., the network device 920 of FIG. 9) may receive the authorization request message. In response, the user associated with the second computing device may provide a dynamic real-time authorization (e.g., via a web browser and/or mobile application running on the network device 920 and/or running on one or more of the home equipped devices located within the house 950) to the first computing device to provide authorization for the drone to land in the area.
[0115] In some aspects, the first computing device (or component thereof) can provide compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
[0116] FIG. 11 is a flow chart illustrating an example of a process 1100 for wireless communications utilizing methods for protocols for drone landing. The process 1100 can be performed by a drone (e.g., network device 610 of FIG. 6 in the form of a drone, or network devices 930a, 930b of FIG. 9 in the form of a drone) or by a component or system (e.g., a chipset) of the drone. The operations of the process 1100 may be implemented as software components that are executed and run on one or more processors (e.g., processor 1210 of FIG. 12 or other processor(s)). Further, the transmission and reception of signals by the drone in the process 1100 may be enabled, for example, by one or more antennas and/or one or more transceivers (e.g., wireless transceiver(s)) on the drone.
[0117] At block 1110, the drone (or component thereof) can broadcast (e.g., via at least one transceiver) an authorization request message to an area (e.g., a property associated with the user) for landing the drone.
[0118] At block 1120, the drone (or component thereof) can receive (e.g., via at least one transceiver), from a first computing device associated with the drone, a path to follow to the area. The path is based on an authorization message received from a second computing device of a user associated with the area in response to the authorization request message. For instance, the authorization message may include an authorization by the user to land the drone in the area for an unplanned landing of the drone (e.g., an emergency landing of the drone). In some
cases, the authorization message includes one or more restrictions by the user for landing the drone in the area. For instance, the one or more restrictions can include a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, a restriction indicating no landing of the drone in the area when people are present within the area, any combination thereof, and/or other restrictions. In some cases, the path is further based on a planned delivery of one or more items to a location. In some examples, the location is different from the area associated with the user.
[0119] In some aspects, the first computing device (or component thereof) can receive (e.g., via at least one transceiver) the authorization message based on a notification provided to the computing device of the user. As described previously, the notification may be based on detection of the drone by at least one camera located in the area.
[0120] At block 1130, the drone (or component thereof) can cause the drone to travel along the path and land in the area. In some aspects, the first computing device (or component thereof) can receive (e.g., via at least one transceiver) a landing map for unplanned landings, wherein the landing map includes the path.
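For completeness, a drone-side counterpart of blocks 1110-1130 might look like the following sketch; the transport and flight-control functions are placeholders, and the message fields mirror the illustrative examples above.

```python
import json

def broadcast(message: str) -> None:
    """Placeholder for a sidelink/WWAN broadcast from the drone's transceiver."""
    print("broadcast:", message)

def receive_path_from_server() -> list[str]:
    """Placeholder for receiving a path (e.g., within a landing map) from the
    first computing device after the user's authorization is relayed to it."""
    return ["area-860", "area-840", "area-850", "area-650"]

def fly_and_land(path: list[str]) -> None:
    """Placeholder for the flight controller following the path and landing."""
    for waypoint in path:
        print("flying to", waypoint)
    print("landing in", path[-1])

# Block 1110: broadcast an authorization request to the area.
broadcast(json.dumps({"type": "landing_authorization_request", "drone_id": "drone-930a"}))
# Block 1120: receive the path determined by the first computing device.
path = receive_path_from_server()
# Block 1130: travel along the path and land in the area.
fly_and_land(path)
```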
[0121] The computing device (e.g., network entity) and/or the drone (e.g., network device) may include various components, such as one or more input devices, one or more output devices, one or more processors, one or more microprocessors, one or more microcomputers, one or more cameras, one or more sensors, one or more receivers, transmitters, and/or transceivers, and/or other component(s) that are configured to carry out the steps of processes described herein. In some examples, the computing device may include a display, a network interface configured to communicate and/or receive the data, any combination thereof, and/or other component(s). The network interface may be configured to communicate and/or receive Internet Protocol (IP) based data or other type of data.
[0122] The components of the computing device (e.g., network entity) configured to perform the process 1000 of FIG. 10 and/or the components of the drone (e.g., network device) configured to perform the process 1100 of FIG. 11 can be implemented in circuitry. For example, the components can include and/or can be implemented using electronic circuits or other electronic hardware, which can include one or more programmable electronic circuits (e.g., microprocessors, graphics processing units (GPUs), digital signal processors (DSPs),
central processing units (CPUs), and/or other suitable electronic circuits), and/or can include and/or be implemented using computer software, firmware, or any combination thereof, to perform the various operations described herein.
[0123] The process 1000 and the process 1100 are illustrated as logical flow diagrams, the operation of which represents a sequence of operations that can be implemented in hardware, computer instructions, or a combination thereof. In the context of computer instructions, the operations represent computer-executable instructions stored on one or more computer- readable storage media that, when executed by one or more processors, perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like that perform particular functions or implement particular data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement the processes.
[0124] Additionally, the process 1000, the process 1100, and/or any other process described herein may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) executing collectively on one or more processors, by hardware, or combinations thereof. As noted above, the code may be stored on a computer-readable or machine-readable storage medium, for example, in the form of a computer program including a plurality of instructions executable by one or more processors. The computer-readable or machine-readable storage medium may be non-transitory.
[0125] FIG. 12 is a block diagram illustrating an example of a computing system 1200, which may be employed by the disclosed systems and techniques for protocols for drone landing. In particular, FIG. 12 illustrates an example of computing system 1200, which can be, for example, any computing device making up internal computing system, a remote computing system, a camera, or any component thereof in which the components of the system are in communication with each other using connection 1205. Connection 1205 can be a physical connection using a bus, or a direct connection into processor 1210, such as in a chipset architecture. Connection 1205 can also be a virtual connection, networked connection, or logical connection.
[0126] In some aspects, computing system 1200 is a distributed system in which the functions described in this disclosure can be distributed within a datacenter, multiple data centers, a peer network, etc. In some aspects, one or more of the described system components represents many such components each performing some or all of the function for which the component is described. In some aspects, the components can be physical or virtual devices.
[0127] Example system 1200 includes at least one processing unit (CPU or processor) 1210 and connection 1205 that communicatively couples various system components including system memory 1215, such as read-only memory (ROM) 1220 and random access memory (RAM) 1225, to processor 1210. Computing system 1200 can include a cache 1212 of high-speed memory connected directly with, in close proximity to, or integrated as part of processor 1210.
[0128] Processor 1210 can include any general purpose processor and a hardware service or software service, such as services 1232, 1234, and 1236 stored in storage device 1230, configured to control processor 1210 as well as a special-purpose processor where software instructions are incorporated into the actual processor design. Processor 1210 may essentially be a completely self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc. A multi-core processor may be symmetric or asymmetric.
[0129] To enable user interaction, computing system 1200 includes an input device 1245, which can represent any number of input mechanisms, such as a microphone for speech, a touch-sensitive screen for gesture or graphical input, keyboard, mouse, motion input, speech, etc. Computing system 1200 can also include output device 1235, which can be one or more of a number of output mechanisms. In some instances, multimodal systems can enable a user to provide multiple types of input/output to communicate with computing system 1200.
[0130] Computing system 1200 can include communications interface 1240, which can generally govern and manage the user input and system output. The communication interface may perform or facilitate receipt and/or transmission of wired or wireless communications using wired and/or wireless transceivers, including those making use of an audio jack/plug, a microphone jack/plug, a universal serial bus (USB) port/plug, an Apple™ Lightning™ port/plug, an Ethernet port/plug, a fiber optic port/plug, a proprietary wired port/plug, 3G, 4G, 5G and/or other cellular data network wireless signal transfer, a Bluetooth™ wireless signal transfer, a Bluetooth™ low energy (BLE) wireless signal transfer, an IBEACON™ wireless
signal transfer, a radio-frequency identification (RFID) wireless signal transfer, near-field communications (NFC) wireless signal transfer, dedicated short range communication (DSRC) wireless signal transfer, 802.11 Wi-Fi wireless signal transfer, wireless local area network (WLAN) signal transfer, Visible Light Communication (VLC), Worldwide Interoperability for Microwave Access (WiMAX), Infrared (IR) communication wireless signal transfer, Public Switched Telephone Network (PSTN) signal transfer, Integrated Services Digital Network (ISDN) signal transfer, ad-hoc network signal transfer, radio wave signal transfer, microwave signal transfer, infrared signal transfer, visible light signal transfer, ultraviolet light signal transfer, wireless signal transfer along the electromagnetic spectrum, or some combination thereof.
[0131] The communications interface 1240 may also include one or more range sensors (e.g., light-based sensors, laser range finders, RF-based sensors, ultrasonic sensors, and infrared (IR) sensors) configured to collect data and provide measurements to processor 1210, whereby processor 1210 can be configured to perform determinations and calculations needed to obtain various measurements for the one or more range sensors. In some examples, the measurements can include time of flight, wavelengths, azimuth angle, elevation angle, range, linear velocity and/or angular velocity, or any combination thereof. The communications interface 1240 may also include one or more Global Navigation Satellite System (GNSS) receivers or transceivers that are used to determine a location of the computing system 1200 based on receipt of one or more signals from one or more satellites associated with one or more GNSS systems. GNSS systems include, but are not limited to, the US-based GPS, the Russia-based Global Navigation Satellite System (GLONASS), the China-based BeiDou Navigation Satellite System (BDS), and the Europe-based Galileo GNSS. There is no restriction on operating on any particular hardware arrangement, and therefore the basic features here may easily be substituted for improved hardware or firmware arrangements as they are developed.
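As a minimal, illustrative sketch of the kind of determination paragraph [0131] describes processor 1210 performing, the Python snippet below converts a round-trip time-of-flight reading from a light- or RF-based range sensor into a range estimate and approximates a radial linear velocity from two successive readings. The constant, the function names, and the simple two-sample velocity estimate are assumptions introduced here for illustration only; they do not correspond to any specific sensor interface in the disclosure.

```python
# Illustrative sketch only: converts time-of-flight readings from a light- or
# RF-based range sensor into range and radial-velocity estimates, as attributed
# to processor 1210 in paragraph [0131]. Names and values are assumptions.

SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed for light/RF-based sensors


def range_from_time_of_flight(tof_seconds: float) -> float:
    """Return the one-way range in meters for a round-trip time-of-flight reading."""
    return SPEED_OF_LIGHT_M_S * tof_seconds / 2.0


def radial_velocity(range_t0_m: float, range_t1_m: float, dt_seconds: float) -> float:
    """Approximate radial (linear) velocity in m/s from two consecutive range readings."""
    return (range_t1_m - range_t0_m) / dt_seconds


if __name__ == "__main__":
    r0 = range_from_time_of_flight(6.67e-7)   # roughly 100 m
    r1 = range_from_time_of_flight(6.60e-7)   # slightly closer, 0.1 s later
    print(f"range: {r0:.1f} m, radial velocity: {radial_velocity(r0, r1, 0.1):.2f} m/s")
```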
[0132] Storage device 1230 can be a non-volatile and/or non-transitory and/or computer-readable memory device and can be a hard disk or other types of computer readable media which can store data that are accessible by a computer, such as magnetic cassettes, flash memory cards, solid state memory devices, digital versatile disks, cartridges, a floppy disk, a flexible disk, a hard disk, magnetic tape, a magnetic strip/stripe, any other magnetic storage medium, flash memory, memristor memory, any other solid-state memory, a compact disc read only memory (CD-ROM) optical disc, a rewritable compact disc (CD) optical disc, digital
video disk (DVD) optical disc, a blu-ray disc (BDD) optical disc, a holographic optical disk, another optical medium, a secure digital (SD) card, a micro secure digital (microSD) card, a Memory Stick® card, a smartcard chip, a EMV chip, a subscriber identity module (SIM) card, a mini/micro/nano/pico SIM card, another integrated circuit (IC) chip/card, random access memory (RAM), static RAM (SRAM), dynamic RAM (DRAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash EPROM (FLASHEPROM), cache memory (e.g., Level 1 (L1) cache, Level 2 (L2) cache, Level 3 (L3) cache, Level 4 (L4) cache, Level 5 (L5) cache, or other (L#) cache), resistive random-access memory (RRAM/ReRAM), phase change memory (PCM), spin transfer torque RAM (STT-RAM), another memory chip or cartridge, and/or a combination thereof.
[0133] The storage device 1230 can include software services, servers, services, etc., that when the code that defines such software is executed by the processor 1210, it causes the system to perform a function. In some aspects, a hardware service that performs a particular function can include the software component stored in a computer-readable medium in connection with the necessary hardware components, such as processor 1210, connection 1205, output device 1235, etc., to carry out the function. The term “computer-readable medium” includes, but is not limited to, portable or non-portable storage devices, optical storage devices, and various other mediums capable of storing, containing, or carrying instruction(s) and/or data. A computer-readable medium may include a non-transitory medium in which data can be stored and that does not include carrier waves and/or transitory electronic signals propagating wirelessly or over wired connections. Examples of a non-transitory medium may include, but are not limited to, a magnetic disk or tape, optical storage media such as compact disk (CD) or digital versatile disk (DVD), flash memory, memory or memory devices. A computer-readable medium may have stored thereon code and/or machine-executable instructions that may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, or the like.
[0134] Specific details are provided in the description above to provide a thorough understanding of the aspects and examples provided herein, but those skilled in the art will recognize that the application is not limited thereto. Thus, while illustrative aspects of the application have been described in detail herein, it is to be understood that the inventive concepts may be otherwise variously embodied and employed, and that the appended claims are intended to be construed to include such variations, except as limited by the prior art. Various features and aspects of the above-described application may be used individually or jointly. Further, aspects can be utilized in any number of environments and applications beyond those described herein without departing from the broader scope of the specification. The specification and drawings are, accordingly, to be regarded as illustrative rather than restrictive. For the purposes of illustration, methods were described in a particular order. It should be appreciated that in alternate aspects, the methods may be performed in a different order than that described.
[0135] For clarity of explanation, in some instances the present technology may be presented as including individual functional blocks including devices, device components, steps or routines in a method embodied in software, or combinations of hardware and software. Additional components may be used other than those shown in the figures and/or described herein. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the aspects in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the aspects.
[0136] Further, those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the aspects disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
[0137] Individual aspects may be described above as a process or method which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
[0138] Processes and methods according to the above-described examples can be implemented using computer-executable instructions that are stored or otherwise available from computer-readable media. Such instructions can include, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or a processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on.
[0139] In some aspects the computer-readable storage devices, mediums, and memories can include a cable or wireless signal containing a bitstream and the like. However, when mentioned, non-transitory computer-readable storage media expressly exclude media such as energy, carrier signals, electromagnetic waves, and signals per se.
[0140] Those of skill in the art will appreciate that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof, in some cases depending in part on the particular application, in part on the desired design, in part on the corresponding technology, etc.
[0141] The various illustrative logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed using hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof, and can take any of a variety of form factors. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks (e.g., a computer-program product) may be stored in a computer-readable or machine-readable medium. A processor(s) may perform the necessary tasks. Examples of form factors include laptops, smart phones, mobile phones, tablet devices or other small form factor personal computers, personal digital assistants, rackmount devices, standalone devices, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example.
[0142] The instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are example means for providing the functions described in the disclosure.
[0143] The techniques described herein may also be implemented in electronic hardware, computer software, firmware, or any combination thereof. Such techniques may be implemented in any of a variety of devices such as general purpose computers, wireless communication device handsets, or integrated circuit devices having multiple uses including application in wireless communication device handsets and other devices. Any features described as modules or components may be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a computer-readable data storage medium including program code including instructions that, when executed, perform one or more of the methods, algorithms, and/or operations described above. The computer-readable data storage medium may form part of a computer program product, which may include packaging materials. The computer-readable medium may include memory or data storage media, such as random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, magnetic or optical data storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a computer-readable communication medium that carries or
communicates program code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer, such as propagated signals or waves.
[0144] The program code may be executed by a processor, which may include one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Such a processor may be configured to perform any of the techniques described in this disclosure. A general-purpose processor may be a microprocessor; but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Accordingly, the term “processor,” as used herein may refer to any of the foregoing structure, any combination of the foregoing structure, or any other structure or apparatus suitable for implementation of the techniques described herein.
[0145] One of ordinary skill will appreciate that the less than (“<”) and greater than (“>”) symbols or terminology used herein can be replaced with less than or equal to (“≤”) and greater than or equal to (“≥”) symbols, respectively, without departing from the scope of this description.
[0146] Where components are described as being “configured to” perform certain operations, such configuration can be accomplished, for example, by designing electronic circuits or other hardware to perform the operation, by programming programmable electronic circuits (e.g., microprocessors, or other suitable electronic circuits) to perform the operation, or any combination thereof.
[0147] The phrase “coupled to” or “communicatively coupled to” refers to any component that is physically connected to another component either directly or indirectly, and/or any component that is in communication with another component (e.g., connected to the other component over a wired or wireless connection, and/or other suitable communication interface) either directly or indirectly.
[0148] Claim language or other language reciting “at least one of” a set and/or “one or more” of a set indicates that one member of the set or multiple members of the set (in any combination) satisfy the claim. For example, claim language reciting “at least one of A and B” or “at least one of A or B” means A, B, or A and B. In another example, claim language reciting “at least one of A, B, and C” or “at least one of A, B, or C” means A, B, C, or A and B, or A and C, or B and C, or A and B and C, or any duplicate information or data (e.g., A and A, B and B, C and C, A and A and B, and so on), or any other ordering, duplication, or combination of A, B, and C. The language “at least one of” a set and/or “one or more” of a set does not limit the set to the items listed in the set. For example, claim language reciting “at least one of A and B” or “at least one of A or B” may mean A, B, or A and B, and may additionally include items not listed in the set of A and B. The phrases “at least one” and “one or more” are used interchangeably herein.
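Purely as an informal illustration of the combinations recited in the preceding paragraph, and not as a limitation on the claim language, the short Python snippet below enumerates the non-empty subsets of {A, B, C} that satisfy “at least one of A, B, and C”; duplicated members (e.g., A and A) also satisfy the language but are omitted from the enumeration.

```python
# Informal illustration of the combinations satisfying "at least one of A, B, and C";
# duplicates (e.g., A and A) also satisfy the language but are not enumerated here.
from itertools import combinations

members = ("A", "B", "C")

satisfying = [
    combo
    for size in range(1, len(members) + 1)
    for combo in combinations(members, size)
]
print(satisfying)
# [('A',), ('B',), ('C',), ('A', 'B'), ('A', 'C'), ('B', 'C'), ('A', 'B', 'C')]
```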
[0149] Claim language or other language reciting “at least one processor configured to,” “at least one processor being configured to,” “one or more processors configured to,” “one or more processors being configured to,” or the like indicates that one processor or multiple processors (in any combination) can perform the associated operation(s). For example, claim language reciting “at least one processor configured to: X, Y, and Z” means a single processor can be used to perform operations X, Y, and Z; or that multiple processors are each tasked with a certain subset of operations X, Y, and Z such that together the multiple processors perform X, Y, and Z; or that a group of multiple processors work together to perform operations X, Y, and Z. In another example, claim language reciting “at least one processor configured to: X, Y, and Z” can mean that any single processor may only perform at least a subset of operations X, Y, and Z.
[0150] Where reference is made to one or more elements performing functions (e.g., steps of a method), one element may perform all functions, or more than one element may collectively perform the functions. When more than one element collectively performs the functions, each function need not be performed by each of those elements (e.g., different functions may be performed by different elements) and/or each function need not be performed in whole by only one element (e.g., different elements may perform different sub-functions of a function). Similarly, where reference is made to one or more elements configured to cause another element (e.g., an apparatus) to perform functions, one element may be configured to cause the
other element to perform all functions, or more than one element may collectively be configured to cause the other element to perform the functions.
[0151] Where reference is made to an entity (e.g., any entity or device described herein) performing functions or being configured to perform functions (e.g., steps of a method), the entity may be configured to cause one or more elements (individually or collectively) to perform the functions. The one or more components of the entity may include at least one memory, at least one processor, at least one communication interface, another component configured to perform one or more (or all) of the functions, and/or any combination thereof. Where reference is made to the entity performing functions, the entity may be configured to cause one component to perform all functions, or to cause more than one component to collectively perform the functions. When the entity is configured to cause more than one component to collectively perform the functions, each function need not be performed by each of those components (e.g., different functions may be performed by different components) and/or each function need not be performed in whole by only one component (e.g., different components may perform different sub-functions of a function).
[0152] Illustrative aspects of the disclosure include the following (a non-limiting code sketch of the landing-authorization flow described by these aspects appears after Aspect 46):
[0153] Aspect 1. A first computing device associated with a drone, the first computing device comprising: at least one memory; and at least one processor coupled to the at least one memory and configured to: receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determine a path for the drone to travel to the area, based on receiving the authorization message; and cause the drone to follow the path to travel to the area and to land in the area.
[0154] Aspect 2. The first computing device of Aspect 1, wherein the at least one processor is configured to determine the path for the drone to travel to the area further based on a planned delivery of one or more items to a location.
[0155] Aspect 3. The first computing device of Aspect 2, wherein the location is different from the area associated with the user.
[0156] Aspect 4. The first computing device of any one of Aspects 1 to 3, wherein the area associated with the user is a property associated with the user.
[0157] Aspect 5. The first computing device of any one of Aspects 1 to 4, wherein the authorization message further comprises one or more restrictions by the user for landing the drone in the area.
[0158] Aspect 6. The first computing device of Aspect 5, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
[0159] Aspect 7. The first computing device of any one of Aspects 1 to 6, wherein the at least one processor is further configured to cause the drone to broadcast an authorization request message.
[0160] Aspect 8. The first computing device of any one of Aspects 1 to 7, wherein the unplanned landing of the drone is an emergency landing of the drone.
[0161] Aspect 9. The first computing device of any one of Aspects 1 to 8, wherein the at least one processor is further configured to provide compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
[0162] Aspect 10. The first computing device of any one of Aspects 1 to 9, wherein the at least one processor is further configured to generate, based on receiving the authorization message, a landing map for unplanned landings, wherein the landing map comprises the path.
[0163] Aspect 11. The first computing device of any one of Aspects 1 to 10, wherein the authorization message is received based on a notification provided to the second computing device associated with the user, the notification being based on detection of the drone by at least one camera located in the area.
[0164] Aspect 12. A method of landing a drone, the method comprising: receiving, by a first computing device associated with the drone from a second computing device associated with
a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determining, by the first computing device associated with the drone, a path for the drone to travel to the area, based on receiving the authorization message; and causing, by the first computing device associated with the drone, the drone to follow the path to travel to the area and to land in the area.
[0165] Aspect 13. The method of Aspect 12, wherein determining the path for the drone to travel to the area is further based on a planned delivery of one or more items to a location.
[0166] Aspect 14. The method of Aspect 13, wherein the location is different from the area associated with the user.
[0167] Aspect 15. The method of any one of Aspects 12 to 14, wherein the area associated with the user is a property associated with the user.
[0168] Aspect 16. The method of any one of Aspects 12 to 15, wherein the authorization message further comprises one or more restrictions by the user for landing the drone in the area.
[0169] Aspect 17. The method of Aspect 16, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
[0170] Aspect 18. The method of any one of Aspects 12 to 17, further comprising causing, by the first computing device associated with the drone, the drone to broadcast an authorization request message.
[0171] Aspect 19. The method of any one of Aspects 12 to 18, wherein the unplanned landing of the drone is an emergency landing of the drone.
[0172] Aspect 20. The method of any one of Aspects 12 to 19, further comprising providing compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
[0173] Aspect 21. The method of any one of Aspects 12 to 20, further comprising generating, by the first computing device based on receiving the authorization message, a landing map for unplanned landings, wherein the landing map comprises the path.
[0174] Aspect 22. The method of any one of Aspects 12 to 21, wherein the authorization message is received based on a notification provided to the second computing device associated with the user, the notification being based on detection of the drone by at least one camera located in the area.
[0175] Aspect 23. A drone comprising: at least one transceiver; at least one memory; and at least one processor coupled to the at least one transceiver and the at least one memory, the at least one processor configured to: broadcast, via the at least one transceiver, an authorization request message to an area for landing the drone; receive, via the at least one transceiver from a first computing device associated with the drone, a path to follow to the area, wherein the path is based on an authorization message received from a second computing device of a user associated with the area in response to the authorization request message; and cause the drone to travel along the path and land in the area.
[0176] Aspect 24. The drone of Aspect 23, wherein the path is further based on a planned delivery of one or more items to a location.
[0177] Aspect 25. The drone of Aspect 24, wherein the location is different from the area associated with the user.
[0178] Aspect 26. The drone of any one of Aspects 23 to 25, wherein the area associated with the user is a property associated with the user.
[0179] Aspect 27. The drone of any one of Aspects 23 to 26, wherein the authorization message comprises an authorization by the user to land the drone in the area for an unplanned landing of the drone.
[0180] Aspect 28. The drone of Aspect 27, wherein the unplanned landing of the drone is an emergency landing of the drone.
[0181] Aspect 29. The drone of any one of Aspects 23 to 28, wherein the authorization message comprises one or more restrictions by the user for landing the drone in the area.
[0182] Aspect 30. The drone of Aspect 29, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
[0183] Aspect 31. The drone of any one of Aspects 23 to 30, wherein the at least one processor is further configured to receive, via the at least one transceiver, a landing map for unplanned landings, wherein the landing map comprises the path.
[0184] Aspect 32. The drone of any one of Aspects 23 to 31, wherein the at least one processor is configured to receive, via the at least one transceiver, the authorization message based on a notification provided to the second computing device of the user, the notification being based on detection of the drone by at least one camera located in the area.
[0185] Aspect 33. A method of landing a drone, the method comprising: broadcasting, by the drone, an authorization request message to an area for landing the drone; receiving, by the drone, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and traveling, by the drone, along the path and landing in the area.
[0186] Aspect 34. The method of Aspect 33, wherein the path is further based on a planned delivery of one or more items to a location.
[0187] Aspect 35. The method of Aspect 34, wherein the location is different from the area associated with the user.
[0188] Aspect 36. The method of any one of Aspects 33 to 35, wherein the area associated with the user is a property associated with the user.
[0189] Aspect 37. The method of any one of Aspects 33 to 36, wherein the authorization message comprises an authorization by the user to land the drone in the area for an unplanned landing of the drone.
[0190] Aspect 38. The method of Aspect 37, wherein the unplanned landing of the drone is an emergency landing of the drone.
[0191] Aspect 39. The method of any one of Aspects 33 to 38, wherein the authorization message comprises one or more restrictions by the user for landing the drone in the area.
[0192] Aspect 40. The method of Aspect 39, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
[0193] Aspect 41. The method of any one of Aspects 33 to 40, further comprising receiving, by the drone, a landing map for unplanned landings, wherein the landing map comprises the path.
[0194] Aspect 42. The method of any one of Aspects 33 to 41, wherein the authorization message is received based on a notification provided to the computing device of the user, the notification being based on detection of the drone by at least one camera located in the area.
[0195] Aspect 43. A non-transitory computer-readable storage medium comprising instructions stored thereon which, when executed by at least one processor, cause the at least one processor to perform operations according to any one of Aspects 12 to 22.
[0196] Aspect 44. An apparatus comprising one or more means for performing operations according to any one of Aspects 12 to 22.
[0197] Aspect 45. A non-transitory computer-readable storage medium comprising instructions stored thereon which, when executed by at least one processor, cause the at least one processor to perform operations according to any one of Aspects 33 to 42.
[0198] Aspect 46. An apparatus comprising one or more means for performing operations according to any one of Aspects 33 to 42.
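The aspects above (in particular Aspects 1-11 on the side of the first computing device and Aspects 23-32 on the drone side) describe a flow in which the drone broadcasts an authorization request message, a computing device of a user associated with an area returns an authorization message that may carry restrictions, and the computing device associated with the drone then determines a path and causes the drone to land in the area. The Python sketch below is a minimal, non-authoritative rendering of that flow under stated assumptions; all class names, message fields, coordinates, and the simple two-waypoint path are introduced solely for illustration and are not part of the disclosure.

```python
# Illustrative sketch only: a minimal rendering of the landing-authorization flow
# described by the aspects above (broadcast of an authorization request, receipt of
# an authorization message with optional user restrictions, path determination, and
# landing). All names, fields, and the two-waypoint path logic are assumptions.
from dataclasses import dataclass, field


@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float


@dataclass
class AuthorizationRequest:          # broadcast by the drone (Aspects 7 and 23)
    drone_id: str
    reason: str = "unplanned_landing"


@dataclass
class AuthorizationMessage:          # returned by the user's computing device (Aspect 1)
    user_id: str
    area_id: str
    area_position: Waypoint
    authorized: bool
    restrictions: list[str] = field(default_factory=list)  # e.g., "no_camera" (Aspect 6)


class Drone:
    """Drone-side stub: broadcasts the request and executes the path (Aspects 23-32)."""

    def __init__(self, drone_id: str, position: Waypoint):
        self.drone_id = drone_id
        self.position = position

    def broadcast(self, request: AuthorizationRequest) -> None:
        print(f"[{self.drone_id}] broadcasting {request}")

    def disable_camera(self) -> None:
        print(f"[{self.drone_id}] camera disabled per user restriction")

    def follow_path_and_land(self, path: list[Waypoint]) -> None:
        self.position = path[-1]
        print(f"[{self.drone_id}] landed at {self.position}")


class DroneLandingController:
    """First computing device associated with the drone (Aspects 1-11)."""

    def __init__(self, drone: Drone):
        self.drone = drone

    def request_landing(self) -> None:
        self.drone.broadcast(AuthorizationRequest(drone_id=self.drone.drone_id))

    def on_authorization(self, msg: AuthorizationMessage) -> None:
        if not msg.authorized:
            return
        if "no_camera" in msg.restrictions:
            self.drone.disable_camera()
        # Determine a path based on receiving the authorization message; here a
        # simple two-waypoint path from the current position to the authorized area.
        path = [self.drone.position, msg.area_position]
        self.drone.follow_path_and_land(path)


if __name__ == "__main__":
    drone = Drone("drone-1", Waypoint(lat=37.4200, lon=-122.0800, alt_m=60.0))
    controller = DroneLandingController(drone)
    controller.request_landing()
    controller.on_authorization(AuthorizationMessage(
        user_id="user-42", area_id="area-7",
        area_position=Waypoint(lat=37.4220, lon=-122.0840, alt_m=0.0),
        authorized=True, restrictions=["no_camera"]))
```

In a real deployment the path determination would also account for the remaining restrictions, terrain, airspace constraints, and any planned delivery location referenced in Aspects 2 and 3; the sketch deliberately omits those details.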
[0199] The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.”
Claims
1. A first computing device associated with a drone, the first computing device comprising: at least one memory; and at least one processor coupled to the at least one memory and configured to: receive, from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determine a path for the drone to travel to the area, based on receiving the authorization message; and cause the drone to follow the path to travel to the area and to land in the area.
2. The first computing device of claim 1, wherein the at least one processor is configured to determine the path for the drone to travel to the area further based on a planned delivery of one or more items to a location.
3. The first computing device of claim 2, wherein the location is different from the area associated with the user.
4. The first computing device of claim 1, wherein the area associated with the user is a property associated with the user.
5. The first computing device of claim 1, wherein the authorization message further comprises one or more restrictions by the user for landing the drone in the area.
6. The first computing device of claim 5, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
7. The first computing device of claim 1, wherein the at least one processor is further configured to cause the drone to broadcast an authorization request message.
8. The first computing device of claim 1, wherein the unplanned landing of the drone is an emergency landing of the drone.
9. The first computing device of claim 1, wherein the at least one processor is further configured to provide compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
10. The first computing device of claim 1, wherein the at least one processor is further configured to generate, based on receiving the authorization message, a landing map for unplanned landings, wherein the landing map comprises the path.
11. The first computing device of claim 1, wherein the authorization message is received based on a notification provided to the second computing device associated with the user, the notification being based on detection of the drone by at least one camera located in the area.
12. A method of landing a drone, the method comprising: receiving, by a first computing device associated with the drone from a second computing device associated with a user, an authorization message comprising an authorization by the user to land the drone in an area associated with the user for an unplanned landing of the drone; determining, by the first computing device associated with the drone, a path for the drone to travel to the area, based on receiving the authorization message; and causing, by the first computing device associated with the drone, the drone to follow the path to travel to the area and to land in the area.
13. The method of claim 12, wherein determining the path for the drone to travel to the area is further based on a planned delivery of one or more items to a location.
14. The method of claim 13, wherein the location is different from the area associated with the user.
15. The method of claim 12, wherein the area associated with the user is a property associated with the user.
16. The method of claim 12, wherein the authorization message further comprises one or more restrictions by the user for landing the drone in the area.
17. The method of claim 16, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
18. The method of claim 12, further comprising causing, by the first computing device associated with the drone, the drone to broadcast an authorization request message.
19. The method of claim 12, wherein the unplanned landing of the drone is an emergency landing of the drone.
20. The method of claim 12, further comprising providing compensation information to the user for providing the authorization to land the drone in the area associated with the user for the unplanned landing of the drone.
21. The method of claim 12, further comprising generating, by the first computing device based on receiving the authorization message, a landing map for unplanned landings, wherein the landing map comprises the path.
22. The method of claim 12, wherein the authorization message is received based on a notification provided to the second computing device associated with the user, the notification being based on detection of the drone by at least one camera located in the area.
23. A drone comprising: at least one transceiver; at least one memory; and at least one processor coupled to the at least one transceiver and the at least one memory, the at least one processor configured to: broadcast, via the at least one transceiver, an authorization request message to an area for landing the drone; receive, via the at least one transceiver from a first computing device associated with the drone, a path to follow to the area, wherein the path is based on an authorization message received from a second computing device of a user associated with the area in response to the authorization request message; and cause the drone to travel along the path and land in the area.
24. The drone of claim 23, wherein the path is further based on a planned delivery of one or more items to a location.
25. The drone of claim 24, wherein the location is different from the area associated with the user.
26. The drone of claim 23, wherein the area associated with the user is a property associated with the user.
27. The drone of claim 23, wherein the authorization message comprises an authorization by the user to land the drone in the area for an unplanned landing of the drone.
28. The drone of claim 27, wherein the unplanned landing of the drone is an emergency landing of the drone.
29. The drone of claim 23, wherein the authorization message comprises one or more restrictions by the user for landing the drone in the area.
30. The drone of claim 29, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction
disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
31. The drone of claim 23, wherein the at least one processor is further configured to receive, via the at least one transceiver, a landing map for unplanned landings, wherein the landing map comprises the path.
32. The drone of claim 23, wherein the at least one processor is configured to receive, via the at least one transceiver, the authorization message based on a notification provided to the second computing device of the user, the notification being based on detection of the drone by at least one camera located in the area.
33. A method of landing a drone, the method comprising: broadcasting, by the drone, an authorization request message to an area for landing the drone; receiving, by the drone, a path to follow to the area, wherein the path is based on an authorization message received from a computing device of a user associated with the area in response to the authorization request message; and traveling, by the drone, along the path and landing in the area.
34. The method of claim 33, wherein the path is further based on a planned delivery of one or more items to a location.
35. The method of claim 34, wherein the location is different from the area associated with the user.
36. The method of claim 33, wherein the area associated with the user is a property associated with the user.
37. The method of claim 33, wherein the authorization message comprises an authorization by the user to land the drone in the area for an unplanned landing of the drone.
38. The method of claim 37, wherein the unplanned landing of the drone is an emergency landing of the drone.
39. The method of claim 33, wherein the authorization message comprises one or more restrictions by the user for landing the drone in the area.
40. The method of claim 39, wherein the one or more restrictions comprise at least one of a restriction disallowing operation of a camera on the drone in the area, a restriction disallowing operation of at least one sensor on the drone in the area, a speed restriction for the drone in the area, or a restriction indicating no landing of the drone in the area when people are present within the area.
41. The method of claim 33, further comprising receiving, by the drone, a landing map for unplanned landings, wherein the landing map comprises the path.
42. The method of claim 33, wherein the authorization message is received based on a notification provided to the computing device of the user, the notification being based on detection of the drone by at least one camera located in the area.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IN202341062517 | 2023-09-18 | ||
| IN202341062517 | 2023-09-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025064149A1 true WO2025064149A1 (en) | 2025-03-27 |
Family
ID=92712516
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/043493 (WO2025064149A1, pending) | Protocols for drone landing | 2023-09-18 | 2024-08-22 |
Country Status (2)
| Country | Link |
|---|---|
| TW (1) | TW202531164A (en) |
| WO (1) | WO2025064149A1 (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9959771B1 (en) * | 2015-12-18 | 2018-05-01 | Amazon Technologies, Inc. | Unmanned aerial vehicle routing using real-time weather data |
| US10553122B1 (en) * | 2016-03-22 | 2020-02-04 | Amazon Technologies, Inc. | Unmanned aerial vehicle data collection for routing |
| US20200365039A1 (en) * | 2017-11-15 | 2020-11-19 | Ntt Docomo, Inc. | Information processing apparatus |
Also Published As
| Publication number | Publication date |
|---|---|
| TW202531164A (en) | 2025-08-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7044804B2 (en) | Collision avoidance methods and systems between vehicles and pedestrians | |
| US20200349852A1 (en) | Smart drone rooftop and ground airport system | |
| EP3485585B1 (en) | Dynamic beam steering for unmanned aerial vehicles | |
| CN106603150B (en) | Methods for mmWave communications | |
| US11294398B2 (en) | Personal security robotic vehicle | |
| US20180038695A1 (en) | Generating Crowd-Sourced Navigation Data | |
| US12342316B2 (en) | Communication nodes and methods for relative positioning of wireless communication devices | |
| US20230413026A1 (en) | Vehicle nudge via c-v2x | |
| EP4050532A1 (en) | Method, apparatus, and system for shipment tracking | |
| US10755582B2 (en) | Drone physical and data interface for enhanced distance coverage | |
| EP4354372A2 (en) | Method and system for detecting duration and cause of border delays | |
| EP4591299A1 (en) | Virtual traffic light via c-v2x | |
| US12096166B2 (en) | Method and system for moving status detection for a sensor apparatus | |
| US12389211B2 (en) | Communication resource management | |
| WO2019169105A1 (en) | System and method for assisting unmanned vehicles in delivery transactions | |
| WO2025064149A1 (en) | Protocols for drone landing | |
| US11288624B2 (en) | Method and system for yard asset management | |
| WO2024186427A1 (en) | Risk management and route planning for internet of things (iot) devices | |
| US20250371484A1 (en) | Association-based intelligent asset tracking | |
| US12200058B2 (en) | Sentinel devices in a low power self-organizing tracking sensor network | |
| US20250291970A1 (en) | Extrapolation and interactions of digital twin-based models and applications | |
| US20250083681A1 (en) | Collision avoidance sensitivity | |
| CN119174201A (en) | Roadside system based on aerial platform | |
| JP2025072968A (en) | Information processing device, communication device, and information processing method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24768840; Country of ref document: EP; Kind code of ref document: A1 |