
US20200001779A1 - Method for communicating intent of an autonomous vehicle - Google Patents


Info

Publication number
US20200001779A1
US20200001779A1 (US 2020/0001779 A1); application US16/421,423 (US201916421423A)
Authority
US
United States
Prior art keywords
autonomous vehicle
intersection
intent
vehicle
ahead
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/421,423
Inventor
Chip J. Alexander
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Drive AI Inc
Original Assignee
Drive AI Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Drive AI Inc filed Critical Drive AI Inc
Priority to US16/421,423
Publication of US20200001779A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/507Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking specific to autonomous vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/04Traffic conditions
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125Traffic data processing
    • G08G1/0133Traffic data processing for classifying traffic situation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50Projected symbol or information, e.g. onto the road or car body
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2555/00Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/60Traffic rules, e.g. speed limits or right of way
    • G05D2201/0213

Definitions

  • This invention relates generally to the field of autonomous vehicles and more specifically to a new and useful method for communicating intent of an autonomous vehicle in the field of autonomous vehicles.
  • FIG. 1 is a flowchart representation of a method.
  • a method S100 for communicating intent includes, at an autonomous vehicle: autonomously approaching an intersection in Block S110; autonomously navigating to a stop proximal the intersection in Block S112; while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle in Block S120; calculating a confidence score for possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle in Block S130; projecting the intent icon onto pavement at distances ahead of the autonomous vehicle proportional to the confidence score and greater than the first distance in Block S122; and, in response to the confidence score exceeding a threshold score, autonomously entering the intersection in Block S140.
  • One variation of the method S100 includes, at the autonomous vehicle: autonomously approaching an intersection in Block S110; while slowing upon approach to the intersection, projecting an intent icon onto pavement at a distance ahead of the autonomous vehicle proportional to a speed of the autonomous vehicle in Block S120; while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle in Block S120; predicting possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle in Block S130; in preparation for navigating into the intersection, projecting the intent icon onto pavement at a second distance ahead of the autonomous vehicle greater than the first distance in Block S122; and autonomously entering the intersection in Block S140.
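The core relationship between Blocks S130 and S122 is a monotone mapping from a right-of-way confidence score to a projection distance. The sketch below is only illustrative: the distance bounds, threshold value, and the name `icon_distance` are assumptions for the example, not values recited in the patent.

```python
# Hypothetical sketch: map a right-of-way confidence score (Block S130)
# to an intent-icon projection distance ahead of the vehicle (Block S122).
MIN_DISTANCE_M = 0.5   # assumed: icon just ahead of the bumper while stopped
MAX_DISTANCE_M = 5.0   # assumed: icon projected well into the intersection
THRESHOLD = 0.9        # assumed: confidence required to enter (Block S140)

def icon_distance(confidence: float) -> float:
    """Projection distance proportional to confidence, clamped to [0, 1]."""
    confidence = max(0.0, min(1.0, confidence))
    return MIN_DISTANCE_M + confidence * (MAX_DISTANCE_M - MIN_DISTANCE_M)

def may_enter_intersection(confidence: float) -> bool:
    """Block S140: enter once the score exceeds the threshold score."""
    return confidence > THRESHOLD
```

With these assumed bounds, a vehicle that is certain of its right of way projects the icon at the far limit, while a fully uncertain vehicle keeps the icon at the bumper.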
  • the method S100 can be executed by an autonomous vehicle to visually indicate its intent—to other vehicles, vehicle operators, and pedestrians nearby—through a simple intent icon projected onto nearby pavement.
  • the autonomous vehicle can instead project a simple intent icon (e.g., a green circle or “dot” approximately 25 centimeters in diameter) onto nearby pavement at distances from the autonomous vehicle that correspond to the autonomous vehicle's intent to advance forward from its current location, such as into an intersection, through a turn lane, or out of a parking space.
  • the autonomous vehicle can enable other vehicle operators and pedestrians nearby to quickly intuit the autonomous vehicle's next action and the autonomous vehicle's perception of the field (i.e., whether the autonomous vehicle “sees” these other vehicles and pedestrians).
  • the autonomous vehicle can execute Blocks of the method S100 in order to simulate eye contact, body language, and gestures between human operators of road vehicles and between vehicle operators and pedestrians nearby via a simple intent icon projected onto pavement near the autonomous vehicle.
  • a human standing on a sidewalk or in an office may look down—such as at her phone or notebook or focusing downwardly on her current task—if she intends to remain in her current location rather than walk forward.
  • this human may naturally look up from her smartphone, notebook, or task when she eventually does intend to walk forward, including looking in the direction she intends to walk.
  • the autonomous vehicle can execute Blocks of the method S100 to mimic this behavior, including: projecting an intent icon onto the pavement just ahead of the autonomous vehicle when the autonomous vehicle intends to remain stopped in its current location (e.g., at a traffic light, a stop sign, a crosswalk, a turn lane, a parking space); and projecting the intent icon further ahead of the front of the autonomous vehicle when the autonomous vehicle intends to move forward.
  • Human operators and pedestrians nearby may: witness this intent icon; intuit the autonomous vehicle's intent to remain stopped when the projected intent icon remains close to the autonomous vehicle, since this intent icon position may simulate a user looking down at her phone, notebook, or other task; and intuit the autonomous vehicle's increasing intent to move forward (or backward, such as out of a parking space) as the autonomous vehicle projects the intent icon at greater distances from the autonomous vehicle, since this intent icon position may simulate a user looking up from her phone, notebook, or other task when preparing to move from her current location.
  • the autonomous vehicle can therefore execute Blocks of the method S100 in order to enable vehicle operators and pedestrians nearby to quickly comprehend the autonomous vehicle's intent (e.g., its next elected navigational action) without reading textual content, interpreting scenario-specific icons, or otherwise holding prior knowledge of such language or iconography projected or rendered by the autonomous vehicle.
  • the method S100 is described below as executed by an autonomous vehicle to project an intent icon onto pavement in the field near the autonomous vehicle in order to communicate the autonomous vehicle's perception of its field, its perception of its right of way, and its intent, such as in addition to or in combination with left and right turn signals integrated into the autonomous vehicle.
  • the autonomous vehicle can implement similar methods and techniques to render a similar intent icon on exterior displays integrated into the autonomous vehicle.
  • the autonomous vehicle can include a dedicated autonomous rideshare vehicle, an autonomous personal mobility vehicle, an autonomous fleet vehicle, an autonomous delivery or other commercial-type vehicle, an autonomous truck, etc.
  • the autonomous vehicle can include: a suite of sensors configured to collect information about the autonomous vehicle's environment; local memory storing a navigation map defining a route for execution by the autonomous vehicle and a localization map that the autonomous vehicle implements to determine its location in real space; and a controller.
  • the controller can: determine the location of the autonomous vehicle in real space based on sensor data collected from the suite of sensors and the localization map; determine the context of a scene around the autonomous vehicle based on these sensor data; elect a future navigational action (e.g., a navigational decision) based on the context of the scene around the autonomous vehicle, the real location of the autonomous vehicle, and the navigation map, such as by implementing a deep learning and/or artificial intelligence model; and control actuators within the vehicle (e.g., accelerator, brake, and steering actuators) according to elected navigation decisions.
  • the autonomous vehicle includes a set of 360° LIDAR sensors arranged on the autonomous vehicle, such as one LIDAR sensor arranged at the front of the autonomous vehicle and a second LIDAR sensor arranged at the rear of the autonomous vehicle or a cluster of LIDAR sensors arranged on the roof of the autonomous vehicle.
  • Each LIDAR sensor can output one three-dimensional distance map (or depth image)—such as in the form of a 3D point cloud representing distances between the LIDAR sensor and external surfaces within the field of view of the LIDAR sensor—per rotation of the LIDAR sensor (i.e., once per scan cycle).
  • the autonomous vehicle can additionally or alternatively include: a set of infrared emitters configured to project structured light into a field near the autonomous vehicle; a set of infrared detectors (e.g., infrared cameras); and a processor configured to transform images output by the infrared detector(s) into a depth map of the field.
  • the autonomous vehicle can also include one or more color cameras facing outwardly from the front, rear, and left lateral and right lateral sides of the autonomous vehicle.
  • each camera can output a video feed containing a sequence of digital photographic images (or “frames”), such as at a rate of 20 Hz.
  • the autonomous vehicle can also include a set of infrared proximity sensors arranged along the perimeter of the base of the autonomous vehicle and configured to output signals corresponding to proximity of objects and pedestrians within one meter of the autonomous vehicle.
  • the controller in the autonomous vehicle can thus fuse data streams from the LIDAR sensor(s), the color camera(s), and the proximity sensor(s), etc.
  • the autonomous vehicle can also collect data broadcast by other vehicles and/or static sensor systems nearby and can incorporate these data into an optical scan to determine a state and context of the scene around the vehicle and to elect subsequent actions.
  • the autonomous vehicle can compare features extracted from this optical scan to like features represented in the localization map—stored in local memory on the autonomous vehicle—in order to determine its geospatial location and orientation in real space and then elect a future navigational action or other navigational decision accordingly.
  • the autonomous vehicle can also: implement a perception model—such as including integrated or discrete vehicle, pedestrian, traffic sign, traffic signal, and lane marker detection models—to detect and identify mutable and immutable objects in its proximity; and implement a navigation or path planning model (e.g., in the form of a convolutional neural network) to elect acceleration, braking, and turning actions based on these mutable and immutable objects and the autonomous vehicle's target destination.
  • the autonomous vehicle can implement a localization map, a navigational model, and a path planning model to: autonomously approach an intersection in Block S110; detect a stop sign, yield sign, or traffic signal near this intersection; detect another vehicle or pedestrian near this intersection; perceive that the other vehicle or pedestrian has right of way to enter the intersection or that the autonomous vehicle is otherwise required to stop at the intersection; and then autonomously navigate to a stop proximal the intersection in Block S112.
  • the autonomous vehicle can execute subsequent Blocks of the method S100 to visually communicate its intent by projecting the intent icon onto pavement nearby, as described below.
  • the autonomous vehicle can include any other sensors and can implement any other scanning, signal processing, and autonomous navigation techniques or models to determine its geospatial position and orientation, to perceive objects in its vicinity, and to elect navigational actions based on sensor data collected through these sensors.
  • the autonomous vehicle also includes a front projector configured to project light onto pavement ahead of the autonomous vehicle.
  • the front projector can include a DLP or LCD projector arranged under or integrated into the front bumper of the autonomous vehicle and configured to project light over a distance ahead of the autonomous vehicle, such as including from 50 centimeters ahead of the autonomous vehicle to five meters ahead of the autonomous vehicle.
  • the front projector can be configured to output a beam of light (e.g., an approximately-collimated light beam) and can be pivotably mounted to the front of the autonomous vehicle in order to project the beam of light along a length of pavement ahead of the autonomous vehicle.
  • the front projector can be mounted to or integrated into the autonomous vehicle in any other way, can include a light source of any other type, and can project light into the field—at varying distances from the front of the autonomous vehicle—in any other way.
  • the autonomous vehicle can similarly include: a rear projector configured to project light onto pavement behind the autonomous vehicle; a right projector configured to project light onto pavement to the right of the autonomous vehicle; and/or a left projector configured to project light onto pavement to the left of the autonomous vehicle.
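For a pivotably mounted projector like the one described above, placing the beam's center at a target distance reduces to simple trigonometry from the projector's mounting height. This is a hypothetical sketch; the helper name and the example heights are assumptions, not specifications from the patent.

```python
import math

def beam_depression_angle(mount_height_m: float, target_distance_m: float) -> float:
    """Depression angle (degrees below horizontal) that lands the beam's
    center on the pavement at a given horizontal distance, assuming the
    projector pivots about its mount at mount_height_m above the road."""
    return math.degrees(math.atan2(mount_height_m, target_distance_m))
```

For an assumed bumper-mounted projector 0.4 m above the pavement, the icon lands 0.5 m ahead at roughly a 39° depression and 5 m ahead at under 5°, so animating the icon outward corresponds to sweeping the projector toward horizontal.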
  • Block S120 of the method S100 recites, while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle;
  • Block S130 of the method S100 recites calculating a confidence score for possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle;
  • Block S122 of the method S100 recites projecting the intent icon onto pavement at distances ahead of the autonomous vehicle proportional to the confidence score and greater than the first distance.
  • the autonomous vehicle can project the intent icon onto pavement near the autonomous vehicle at a distance from the autonomous vehicle (or at a position relative to the autonomous vehicle) that corresponds to the autonomous vehicle's intent to either remain stopped in its current location or to enter the intersection.
  • the autonomous vehicle can project a simple intent icon—from which a human nearby may quickly intuit the autonomous vehicle's intent to either remain stopped or to move from its current location—onto pavement in the field near the autonomous vehicle.
  • the autonomous vehicle projects a circular “dot”—such as approximately 25 centimeters in diameter and in a solid color (e.g., white, orange, green, yellow, violet, pink)—onto pavement near the autonomous vehicle in Blocks S120 and S122.
  • the autonomous vehicle can also animate the intent icon in order to garner attention from vehicle operators and pedestrians nearby. For example, the autonomous vehicle can pulsate the intent icon, including expanding and contracting the projected intent icon by ±30% of its nominal diameter, in order to increase the likelihood that a human onlooker will notice the intent icon. In this implementation, the autonomous vehicle can also pulsate the intent icon at a rate corresponding to its intended rate of acceleration or deceleration from its current location.
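The pulsation described above can be sketched as a sinusoidal modulation of the dot's diameter. The sketch below is illustrative only: the sinusoidal form, the function name, and the mapping of pulse rate to intended acceleration are assumptions; the patent specifies only the ~25 cm nominal diameter and the ±30% swing.

```python
import math

NOMINAL_DIAMETER_M = 0.25   # the ~25 cm dot described in the text
AMPLITUDE = 0.30            # expand/contract by ±30% of nominal diameter

def pulsing_diameter(t_s: float, rate_hz: float) -> float:
    """Projected dot diameter at time t_s, pulsing at rate_hz; a higher
    rate could (hypothetically) signal a higher intended acceleration."""
    scale = 1.0 + AMPLITUDE * math.sin(2.0 * math.pi * rate_hz * t_s)
    return NOMINAL_DIAMETER_M * scale
```

At a 1 Hz pulse rate the diameter sweeps between 17.5 cm and 32.5 cm once per second, returning to nominal at the start of each cycle.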
  • the autonomous vehicle projects the intent icon in the form of a “dot” into the field with a “tail” trailing the dot back to the autonomous vehicle in order to visually link the projected dot to the autonomous vehicle, such as if other autonomous vehicles nearby are implementing similar methods and techniques to project their own intent icons onto pavement nearby (e.g., when multiple autonomous vehicles converge on one intersection).
  • the autonomous vehicle can project an intent icon that defines a tapered line or arc extending from the autonomous vehicle and widening at greater distances from the autonomous vehicle.
  • the autonomous vehicle can also project the intent icon in a color matched to the autonomous vehicle's exterior color or matched to a graphic on the autonomous vehicle's exterior.
  • the autonomous vehicle can include green and yellow graphics arranged on its exterior; and can project a circular intent icon with concentric green and yellow rings onto pavement nearby in order to enable humans nearby to visually link this projected intent icon to this autonomous vehicle.
  • a second autonomous vehicle can include orange and white graphics arranged on its exterior; and can project a circular intent icon with concentric orange and white rings onto pavement nearby in order to enable humans nearby to visually link this projected intent icon to this second autonomous vehicle and to distinguish this intent icon projected by the second autonomous vehicle from intent icons projected by other autonomous vehicles nearby.
  • the autonomous vehicle can project an intent icon of any other size, geometry, and/or color, etc. and animated in any other way.
  • the autonomous vehicle can activate the front projector to project the intent icon into the field nearby in select scenarios—that is, when the autonomous vehicle is preparing to execute certain navigational actions.
  • the autonomous vehicle activates the front projector to project the intent icon into the field upon stopping at a stop sign and preparing to move into the intersection ahead.
  • the autonomous vehicle can implement autonomous navigation and perception techniques to: autonomously navigate along a segment of road; detect a stop sign ahead of the autonomous vehicle; slow upon approach to the stop sign; stop at or ahead of the intersection; detect and track other vehicles at the intersection; perceive right of way of these vehicles based on detected arrival times at the intersection; and remain stopped at the intersection while waiting for other vehicles—with right of way preceding that of the autonomous vehicle—to enter the intersection.
  • the autonomous vehicle can activate the front projector to project the intent icon just ahead (e.g., within 50 centimeters) of the front bumper of the autonomous vehicle. During this period, the autonomous vehicle can continue to: scan its surrounding field for other vehicles; track locations of these other vehicles within and near the intersection; and derive the autonomous vehicle's right of way to enter the intersection.
  • the autonomous vehicle can trigger the front projector to move the intent icon further ahead of the front of the autonomous vehicle in order to visually indicate the autonomous vehicle's intent to enter the intersection
  • the front projector can “animate” the intent icon moving from its projected position proximal the autonomous vehicle's front bumper to a position further ahead of the autonomous vehicle—such as in the intersection—in order to visually indicate the autonomous vehicle's intent to advance into the intersection soon thereafter.
  • the autonomous vehicle can also: calculate a confidence score that it possesses right of way to enter the intersection based on presence of other vehicles in or near the intersection in Block S130; project the intent icon into the field at a distance from the front of the autonomous vehicle as a function of this confidence score in Block S122; and enter the intersection and resume navigation once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance ahead of the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to enter the intersection.
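The stop-sign scenario above perceives right of way from detected arrival times at the intersection. One minimal way to sketch this arrival-order rule, with hypothetical function names and a simple assumed confidence formula (the patent does not specify one), is:

```python
# Hypothetical sketch: at a multi-way stop, right of way follows order of
# arrival, so the vehicle yields to every vehicle that stopped before it.
# Arrival timestamps (seconds) would come from the vehicle's object tracking.
def vehicles_to_yield_to(own_arrival_s, other_arrivals_s):
    """Indices of tracked vehicles that arrived before this vehicle."""
    return [i for i, t in enumerate(other_arrivals_s) if t < own_arrival_s]

def confidence_from_queue(own_arrival_s, other_arrivals_s):
    """Assumed confidence model: confidence in possessing right of way
    decreases with the number of earlier-arriving vehicles still waiting."""
    ahead = len(vehicles_to_yield_to(own_arrival_s, other_arrivals_s))
    return 1.0 / (1.0 + ahead)
```

As earlier-arriving vehicles clear the intersection and drop out of the tracked set, the confidence score rises and the projected icon would move correspondingly further ahead.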
  • the autonomous vehicle activates the front projector to project the intent icon into the field upon stopping ahead of a crosswalk and preparing to advance through the crosswalk.
  • the autonomous vehicle can execute methods and techniques similar to those described above to: autonomously navigate along a road segment; detect a crosswalk ahead of the autonomous vehicle; detect and track a pedestrian near the crosswalk; slow to a stop at or ahead of the crosswalk; and remain stopped at the crosswalk while waiting for the pedestrian to enter and then exit the crosswalk.
  • the autonomous vehicle can activate the front projector to project the intent icon just ahead of the front bumper of the autonomous vehicle in order to visually communicate—to the pedestrian—that the autonomous vehicle intends to remain stopped at the crosswalk for the pedestrian.
  • the autonomous vehicle can continue to track the pedestrian moving through the crosswalk. As the pedestrian approaches an adjacent sidewalk (or median) or once the pedestrian steps onto the sidewalk (or median), the autonomous vehicle can move the intent icon further ahead of the autonomous vehicle in order to visually indicate the autonomous vehicle's intent to advance past the crosswalk.
  • the autonomous vehicle can: estimate a remaining time for the pedestrian to reach the adjacent sidewalk based on a speed of the pedestrian and a remaining distance between the pedestrian and the sidewalk; animate the intent icon moving outwardly from the autonomous vehicle (e.g., into or past the crosswalk) once the autonomous vehicle estimates the pedestrian to be within a threshold time (e.g., four seconds) of the sidewalk; and continue to move the intent icon outwardly from the front of the autonomous vehicle as the pedestrian approaches the sidewalk in order to visually communicate a sense of “urgency” and to visually indicate the autonomous vehicle's intent to pass through the crosswalk once the pedestrian enters the sidewalk.
  • the autonomous vehicle can: calculate a confidence score that it possesses right of way to enter the crosswalk based on presence of a pedestrian in or near the crosswalk in Block S130; project the intent icon into the field at a distance from the front of the autonomous vehicle as a function of this confidence score in Block S122; and then accelerate through the crosswalk once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance ahead of the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to enter the crosswalk.
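The crosswalk timing estimate above divides the pedestrian's remaining distance to the sidewalk by the pedestrian's tracked speed, then begins animating the icon outward inside a threshold time. The sketch below is an assumed implementation; only the distance-over-speed estimate and the example four-second threshold come from the text.

```python
# Hypothetical sketch of the crosswalk timing logic described above.
ADVANCE_THRESHOLD_S = 4.0   # example threshold time from the text

def time_to_sidewalk_s(distance_to_sidewalk_m: float, speed_m_s: float) -> float:
    """Estimated remaining time for the pedestrian to reach the sidewalk."""
    if speed_m_s <= 0.0:
        return float("inf")   # pedestrian stopped: never within threshold
    return distance_to_sidewalk_m / speed_m_s

def should_animate_icon_outward(distance_to_sidewalk_m: float, speed_m_s: float) -> bool:
    """Begin moving the icon outward once the pedestrian is estimated to
    be within the threshold time of clearing the crosswalk."""
    return time_to_sidewalk_s(distance_to_sidewalk_m, speed_m_s) <= ADVANCE_THRESHOLD_S
```

A pedestrian walking at 2 m/s with 4 m remaining is two seconds from the sidewalk, so the icon would already be animating outward; a stopped pedestrian keeps the icon at the bumper indefinitely.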
  • the autonomous vehicle activates the front projector to project the intent icon into the field upon stopping at a right turn lane and preparing to turn right.
  • the autonomous vehicle can execute methods and techniques similar to those described above to: autonomously navigate along a road segment, up to an intersection, and into a right-turn lane; detect and track other vehicles in or approaching the intersection; identify another vehicle heading toward the road segment just ahead of and perpendicular to the autonomous vehicle and with right of way to pass through the intersection; and yield to this other vehicle accordingly.
  • the autonomous vehicle can activate the front projector to project the intent icon just ahead of the front bumper of the autonomous vehicle in order to visually communicate—to the other vehicle and/or its operator—that the autonomous vehicle intends to remain stopped in the right-turn lane for the other vehicle to pass.
  • the autonomous vehicle can continue to track the other vehicle and can move the intent icon further ahead of the autonomous vehicle—in order to visually indicate the autonomous vehicle's intent to turn right from the right-turn lane—once the other vehicle passes in front of the autonomous vehicle.
  • the autonomous vehicle can: estimate a remaining time for the other vehicle to pass the autonomous vehicle based on the position and speed of the other vehicle relative to the autonomous vehicle; animate the intent icon moving outwardly from the autonomous vehicle (e.g., into the road segment ahead of and perpendicular to the autonomous vehicle) once the autonomous vehicle estimates the other vehicle to be within a threshold time (e.g., two seconds) of passing the autonomous vehicle; and continue to move the intent icon outwardly from the front of the autonomous vehicle as the other vehicle passes and moves beyond the autonomous vehicle.
  • once the autonomous vehicle determines that the other vehicle has moved beyond the autonomous vehicle by at least a minimum distance and that other vehicles are not approaching the road segment just ahead of the autonomous vehicle, the autonomous vehicle can resume autonomous navigation and execute a right-turn action onto this road segment.
  • the autonomous vehicle can: calculate a confidence score that it possesses right of way to make a right turn at this intersection based on presence of other vehicles nearby in Block S130; project the intent icon into the field at a distance from the front of the autonomous vehicle as a function of this confidence score in Block S122; and then autonomously execute a right-turn action once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance ahead of the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to execute a right turn action.
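One way to picture this mapping from right-of-way confidence to projection distance is a linear interpolation gated by a threshold score. The function names, distance bounds, and 0.8 threshold below are hypothetical; the description specifies only that the projected distance grows with the confidence score and that the turn executes once the score exceeds a threshold:

```python
MIN_DISTANCE_M = 0.5        # icon just ahead of the front bumper
MAX_DISTANCE_M = 5.0        # icon well into the intersection
CONFIDENCE_THRESHOLD = 0.8  # hypothetical score above which the turn executes

def projection_distance(confidence: float) -> float:
    """Map a right-of-way confidence score in [0, 1] to a projection
    distance ahead of the vehicle, growing with the score (Block S122)."""
    confidence = max(0.0, min(1.0, confidence))
    return MIN_DISTANCE_M + confidence * (MAX_DISTANCE_M - MIN_DISTANCE_M)

def should_execute_turn(confidence: float) -> bool:
    """Trigger the right-turn action once the confidence score exceeds
    the threshold score (Block S140)."""
    return confidence > CONFIDENCE_THRESHOLD
```

With this mapping, an undecided vehicle (score 0.5) projects the icon at 2.75 meters, while full confidence pushes it to the five-meter maximum and triggers the turn.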
  • the autonomous vehicle activates the rear projector to project the intent icon into the field when stopped in a parking space and preparing to back out of the parking space.
  • the autonomous vehicle can remain parked in a parking space with its powertrain “OFF” when not in use (e.g., when “idle”).
  • the autonomous vehicle can: power up its powertrain; activate the rear projector to project the intent icon just behind the rear bumper of the autonomous vehicle in order to visually communicate—to the other vehicles and/or vehicle operators nearby—that the autonomous vehicle is active but intends to remain stopped in its current parking space; and scan the field behind and to the sides of the autonomous vehicle for an approaching vehicle, a pedestrian, and/or other obstacle.
  • the autonomous vehicle can remain stopped in the parking space and continue to project the intent icon just behind the rear bumper of the autonomous vehicle via the rear projector.
  • the rear projector can project the intent icon further from the rear of the autonomous vehicle in order to visually communicate its intent to back out of the parking space.
  • the autonomous vehicle can: calculate a confidence score that it possesses right of way to back out of the parking space based on presence of other vehicles and pedestrians nearby in Block S130; project the intent icon into the field at a distance behind the autonomous vehicle as a function of this confidence score in Block S122; and then autonomously navigate a planned path to back out of the parking space once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance behind the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to back out of the parking space.
  • the rear projector can shift the intent icon closer to the rear of the autonomous vehicle in order to communicate the autonomous vehicle's intent to slow and then change directions.
  • the rear projector can project the intent icon at a distance from the rear of the autonomous vehicle as an inverse function of the distance remaining along the planned path or as a function of the autonomous vehicle's intended speed.
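A minimal sketch of these two rear-distance schedules, assuming hypothetical maximum offsets and simple linear mappings (the description does not fix the exact functions, so both variants below are illustrative readings):

```python
MAX_REAR_DISTANCE_M = 3.0  # hypothetical maximum icon offset behind the vehicle

def rear_icon_distance_by_path(remaining_m: float, total_m: float) -> float:
    """Shrink the icon's offset behind the vehicle as the reverse path
    completes: the icon tracks the remaining planned-path distance,
    clamped to the [0, MAX_REAR_DISTANCE_M] range."""
    if total_m <= 0:
        return 0.0
    frac = max(0.0, min(1.0, remaining_m / total_m))
    return MAX_REAR_DISTANCE_M * frac

def rear_icon_distance_by_speed(intended_speed_mps: float,
                                max_speed_mps: float = 2.0) -> float:
    """Alternative schedule: scale the icon's offset with the vehicle's
    intended reverse speed rather than with remaining path length."""
    frac = max(0.0, min(1.0, abs(intended_speed_mps) / max_speed_mps))
    return MAX_REAR_DISTANCE_M * frac
```

Either schedule yields the behavior described: the icon sits far behind the vehicle while it is backing quickly with most of the path remaining, then draws toward the rear bumper as the maneuver completes.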
  • the autonomous vehicle can then deactivate the rear projector and trigger the front projector to project the intent icon proximal the front of the autonomous vehicle while scanning the field ahead of the autonomous vehicle for another vehicle or pedestrian.
  • the front projector can move the intent icon further ahead of the front of the autonomous vehicle in order to communicate the autonomous vehicle's intent to move forward, such as described above.
  • while stopped at a pickup location (e.g., at a curb or loading zone) and waiting for a rider to enter the autonomous vehicle, the autonomous vehicle (or a side projector specifically) can project the intent icon onto pavement adjacent a door of the autonomous vehicle in order to visually communicate to other vehicle operators and pedestrians nearby that the autonomous vehicle is waiting for a rider to enter the autonomous vehicle.
  • the autonomous vehicle can animate the intent icon moving toward the front of the autonomous vehicle and project the intent icon adjacent the front of the autonomous vehicle while the rider prepares for departure.
  • the autonomous vehicle can project the intent icon at greater distances from the front of the autonomous vehicle in order to visually communicate its intent to depart from the pickup location.
  • the autonomous vehicle can project the intent icon ahead of the autonomous vehicle and shift the intent icon closer to the front of the autonomous vehicle in order to visually communicate its intent to slow down.
  • the autonomous vehicle (or the side projector) can shift the intent icon to pavement adjacent a door of the autonomous vehicle (e.g., adjacent the rider's location inside the autonomous vehicle or adjacent a door of the autonomous vehicle opposite a nearby curb) in order to visually communicate to other vehicle operators and pedestrians nearby that the autonomous vehicle is waiting for a rider to exit the autonomous vehicle.
  • the autonomous vehicle can then transition the intent icon back to the front of the autonomous vehicle in preparation for subsequent departure, as described above.
  • the autonomous vehicle can implement similar methods and techniques to activate the front projector when approaching select scenarios or when preparing to execute a navigational change.
  • the autonomous vehicle can project the intent icon onto pavement far (e.g., five meters) ahead of the autonomous vehicle once the speed of the autonomous vehicle drops below a speed threshold (e.g., 25 miles per hour).
  • the autonomous vehicle can project the intent icon onto pavement at closer distances to the front of the autonomous vehicle as the autonomous vehicle slows further upon approach to the right turn lane.
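This speed-dependent placement might be sketched as follows, using the example values from the text (25-mile-per-hour activation threshold, roughly five meters of maximum throw). The linear mapping and the function name are assumptions; the description requires only that the icon draws closer as the vehicle slows:

```python
SPEED_THRESHOLD_MPH = 25.0  # activation threshold from the example
FAR_DISTANCE_M = 5.0        # icon thrown far ahead at the threshold speed
NEAR_DISTANCE_M = 0.5       # icon just fore of the bumper when stopped

def approach_icon_distance(speed_mph: float):
    """Return the projection distance ahead of the vehicle while it slows
    toward a turn lane, or None while above the activation threshold
    (projector inactive at higher speeds)."""
    if speed_mph >= SPEED_THRESHOLD_MPH:
        return None
    frac = max(0.0, speed_mph) / SPEED_THRESHOLD_MPH
    # icon starts far ahead just under the threshold speed and draws
    # closer to the bumper as the vehicle slows toward a stop
    return NEAR_DISTANCE_M + frac * (FAR_DISTANCE_M - NEAR_DISTANCE_M)
```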
  • the autonomous vehicle can implement methods and techniques described above to project the intent icon just fore of the front of the autonomous vehicle while stopped and to then shift the intent icon further ahead of the autonomous vehicle as the autonomous vehicle prepares to advance forward and execute a right turn maneuver.
  • If the autonomous vehicle determines that no traffic is oncoming and prepares to execute the right turn maneuver as the autonomous vehicle approaches the right turn lane without stopping, the autonomous vehicle can then transition to projecting the intent icon at a further distance from the front of the autonomous vehicle, even as the autonomous vehicle slows upon approach to the right turn lane, in order to visually communicate the autonomous vehicle's intent to execute the right turn maneuver without stopping.
  • the autonomous vehicle can deactivate projection of the intent icon into the field.
  • the autonomous vehicle can continue to project the intent icon until the autonomous vehicle has exited the intersection, exited the parking space, passed the crosswalk, merged into a new lane, or otherwise completed the current navigational action.
  • the autonomous vehicle can disable projection of the intent icon at the earlier of: reaching a threshold speed (e.g., 10 miles per hour, 25 miles per hour); and coming within a threshold distance (e.g., five meters) of another vehicle directly ahead of the autonomous vehicle.
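A sketch of this disable rule, with the example threshold values from the text; the function name and the convention of passing None when no lead vehicle is detected are hypothetical:

```python
THRESHOLD_SPEED_MPH = 10.0  # e.g., 10 or 25 miles per hour per the description
THRESHOLD_GAP_M = 5.0       # e.g., five meters to a vehicle directly ahead

def should_disable_projection(speed_mph: float, gap_to_lead_m) -> bool:
    """Disable the intent icon at the earlier of: reaching the threshold
    speed, or closing within the threshold distance of a vehicle directly
    ahead. gap_to_lead_m is None when no lead vehicle is detected."""
    if speed_mph >= THRESHOLD_SPEED_MPH:
        return True
    if gap_to_lead_m is not None and gap_to_lead_m <= THRESHOLD_GAP_M:
        return True
    return False
```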
  • the autonomous vehicle can also disable projection of the intent icon when the autonomous vehicle is behind another vehicle and approaching or stopped at an intersection; rather, the autonomous vehicle can limit projection of the intent icon into the field to when the autonomous vehicle reaches the front of the intersection (i.e., is the leading vehicle in its lane at the intersection).
  • the autonomous vehicle can selectively disable projection of the intent icon in response to any other event.
  • the autonomous vehicle can project the intent icon into the field along the longitudinal axis of the autonomous vehicle and at varying distances from the autonomous vehicle and leverage turn signals integrated into the autonomous vehicle to indicate the autonomous vehicle's intended direction of navigation.
  • the autonomous vehicle can shift the projected intent icon laterally (i.e., off of the longitudinal axis of the autonomous vehicle) in order to visually communicate the autonomous vehicle's intent to execute this turn.
  • the autonomous vehicle can move the intent icon closer to another vehicle or pedestrian nearer to the autonomous vehicle's intended path, which may improve perception and comprehension of the intent icon for vehicle operators and pedestrians.
  • the systems and methods described herein can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a user computer or mobile device, wristband, smartphone, or any suitable combination thereof.
  • Other systems and methods of the embodiment can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions.
  • the instructions can be executed by computer-executable components integrated with apparatuses and networks of the type described above.
  • the computer-readable instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device.
  • the computer-executable component can be a processor, but any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.


Abstract

One variation of a method for communicating intent includes, at an autonomous vehicle: autonomously approaching an intersection; autonomously navigating to a stop proximal the intersection; while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle; calculating a confidence score for possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle; projecting the intent icon onto pavement at distances ahead of the autonomous vehicle proportional to the confidence score and greater than the first distance; and in response to the confidence score exceeding a threshold score, autonomously entering the intersection.

Description

    TECHNICAL FIELD
  • This invention relates generally to the field of autonomous vehicles and more specifically to a new and useful method for communicating intent of an autonomous vehicle in the field of autonomous vehicles.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a flowchart representation of a method.
  • DESCRIPTION OF THE EMBODIMENTS
  • The following description of embodiments of the invention is not intended to limit the invention to these embodiments but rather to enable a person skilled in the art to make and use this invention. Variations, configurations, implementations, example implementations, and examples described herein are optional and are not exclusive to the variations, configurations, implementations, example implementations, and examples they describe. The invention described herein can include any and all permutations of these variations, configurations, implementations, example implementations, and examples.
  • 1. Method
  • As shown in FIG. 1, a method S100 for communicating intent includes, at an autonomous vehicle: autonomously approaching an intersection in Block S110; autonomously navigating to a stop proximal the intersection in Block S112; while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle in Block S120; calculating a confidence score for possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle in Block S130; projecting the intent icon onto pavement at distances ahead of the autonomous vehicle proportional to the confidence score and greater than the first distance in Block S122; and, in response to the confidence score exceeding a threshold score, autonomously entering the intersection in Block S140.
  • One variation of the method S100 includes, at the autonomous vehicle: autonomously approaching an intersection in Block S110; while slowing upon approach to the intersection, projecting an intent icon onto pavement at a distance ahead of the autonomous vehicle proportional to a speed of the autonomous vehicle in Block S120; while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle in Block S120; predicting possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle in Block S130; in preparation for navigating into the intersection, projecting the intent icon onto pavement at a second distance ahead of the autonomous vehicle greater than the first distance in Block S122; and autonomously entering the intersection in Block S140.
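The sequence of Blocks S110 through S140 can be illustrated as a simple control loop. The sketch below is purely illustrative: the StubVehicle class, the linear confidence model, and the 0.8 threshold are hypothetical stand-ins for the perception and actuation systems described later, not the patent's implementation.

```python
class StubVehicle:
    """Minimal stand-in: confidence rises as simulated cross traffic clears."""
    def __init__(self, clear_after_scans: int):
        self.scans = 0
        self.clear_after = clear_after_scans
        self.icon_distance_m = None
        self.entered = False

    def right_of_way_confidence(self) -> float:
        self.scans += 1
        return min(1.0, self.scans / self.clear_after)

    def project_icon(self, distance_m: float):
        self.icon_distance_m = distance_m

    def enter_intersection(self):
        self.entered = True

def run_intersection_protocol(vehicle, threshold: float = 0.8):
    vehicle.project_icon(0.5)                    # Block S120: stopped, icon near bumper
    while True:
        c = vehicle.right_of_way_confidence()    # Block S130: score right of way
        vehicle.project_icon(0.5 + 4.5 * c)      # Block S122: distance tracks confidence
        if c > threshold:
            break
    vehicle.enter_intersection()                 # Block S140: enter the intersection

v = StubVehicle(clear_after_scans=5)
run_intersection_protocol(v)
```

Each pass of the loop re-scans the field, re-scores right of way, and pushes the icon farther ahead, so onlookers see the vehicle's growing intent to enter before it moves.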
  • 2. Applications
  • Generally, the method S100 can be executed by an autonomous vehicle to visually indicate its intent—to other vehicles, vehicle operators, and pedestrians nearby—through a simple intent icon projected onto nearby pavement. In particular, rather than project text or scenario-specific symbols onto pavement or rather than render text or scenario-specific symbols on displays integrated into the autonomous vehicle, the autonomous vehicle can instead project a simple intent icon (e.g., a green circle or “dot” approximately 25 centimeters in diameter) onto nearby pavement at distances from the autonomous vehicle that correspond to the autonomous vehicle's intent to advance forward from its current location, such as into an intersection, through a turn lane, or out of a parking space. By projecting a simple intent icon into its surrounding field at positions (i.e., distances from the autonomous vehicle) linked to the autonomous vehicle's intent to remain stopped or advance forward, the autonomous vehicle can enable other vehicle operators and pedestrians nearby to quickly intuit the autonomous vehicle's next action and the autonomous vehicle's perception of the field (i.e., whether the autonomous vehicle “sees” these other vehicles and pedestrians).
  • The autonomous vehicle can execute Blocks of the method S100 in order to simulate eye contact, body language, and gestures between human operators of road vehicles and between vehicle operators and pedestrians nearby via a simple intent icon projected onto pavement near the autonomous vehicle. For example, a human standing on a sidewalk or in an office may look down—such as at her phone or notebook or focusing downwardly on her current task—if she intends to remain in her current location rather than walk forward. However, this human may naturally look up from her smartphone, notebook, or task when she eventually does intend to walk forward, including looking in the direction she intends to walk. The autonomous vehicle can execute Blocks of the method S100 to mimic this behavior, including: projecting an intent icon onto the pavement just ahead of the autonomous vehicle when the autonomous vehicle intends to remain stopped in its current location (e.g., at a traffic light, a stop sign, a crosswalk, a turn lane, a parking space); and projecting the intent icon further ahead of the front of the autonomous vehicle when the autonomous vehicle intends to move forward. Human operators and pedestrians nearby may: witness this intent icon; intuit the autonomous vehicle's intent to remain stopped when the projected intent icon remains close to the autonomous vehicle, since this intent icon position may simulate a user looking down at her phone, notebook, or other task; and intuit the autonomous vehicle's increasing intent to move forward (or backward, such as out of a parking space) as the autonomous vehicle projects the intent icon at greater distances from the autonomous vehicle, since this intent icon position may simulate a user looking up from her phone, notebook, or other task when preparing to move from her current location.
  • The autonomous vehicle can therefore execute Blocks of the method S100 in order to enable vehicle operators and pedestrians nearby to quickly comprehend the autonomous vehicle's intent (e.g., its next elected navigational action) without reading textual content, interpreting scenario-specific icons, or otherwise holding prior knowledge of such language or iconography projected or rendered by the autonomous vehicle.
  • The method S100 is described below as executed by an autonomous vehicle to project an intent icon onto pavement in the field near the autonomous vehicle in order to communicate the autonomous vehicle's perception of its field, its perception of its right of way, and its intent, such as in addition to or in combination with left and right turn signals integrated into the autonomous vehicle. Alternatively, the autonomous vehicle can implement similar methods and techniques to render a similar intent icon on exterior displays integrated into the autonomous vehicle. Furthermore, the autonomous vehicle can include a dedicated autonomous rideshare vehicle, an autonomous personal mobility vehicle, an autonomous fleet vehicle, an autonomous delivery or other commercial-type vehicle, an autonomous truck, etc.
  • 3. Autonomous Vehicle
  • The autonomous vehicle can include: a suite of sensors configured to collect information about the autonomous vehicle's environment; local memory storing a navigation map defining a route for execution by the autonomous vehicle and a localization map that the autonomous vehicle implements to determine its location in real space; and a controller. The controller can: determine the location of the autonomous vehicle in real space based on sensor data collected from the suite of sensors and the localization map; determine the context of a scene around the autonomous vehicle based on these sensor data; elect a future navigational action (e.g., a navigational decision) based on the context of the scene around the autonomous vehicle, the real location of the autonomous vehicle, and the navigation map, such as by implementing a deep learning and/or artificial intelligence model; and control actuators within the vehicle (e.g., accelerator, brake, and steering actuators) according to elected navigation decisions.
  • In one implementation, the autonomous vehicle includes a set of 360° LIDAR sensors arranged on the autonomous vehicle, such as one LIDAR sensor arranged at the front of the autonomous vehicle and a second LIDAR sensor arranged at the rear of the autonomous vehicle or a cluster of LIDAR sensors arranged on the roof of the autonomous vehicle. Each LIDAR sensor can output one three-dimensional distance map (or depth image)—such as in the form of a 3D point cloud representing distances between the LIDAR sensor and external surface within the field of view of the LIDAR sensor—per rotation of the LIDAR sensor (i.e., once per scan cycle). The autonomous vehicle can additionally or alternatively include: a set of infrared emitters configured to project structured light into a field near the autonomous vehicle; a set of infrared detectors (e.g., infrared cameras); and a processor configured to transform images output by the infrared detector(s) into a depth map of the field.
  • The autonomous vehicle can also include one or more color cameras facing outwardly from the front, rear, and left lateral and right lateral sides of the autonomous vehicle. For example, each camera can output a video feed containing a sequence of digital photographic images (or “frames”), such as at a rate of 20 Hz. The autonomous vehicle can also include a set of infrared proximity sensors arranged along the perimeter of the base of the autonomous vehicle and configured to output signals corresponding to proximity of objects and pedestrians within one meter of the autonomous vehicle. The controller in the autonomous vehicle can thus fuse data streams from the LIDAR sensor(s), the color camera(s), and the proximity sensor(s), etc. into one optical scan of the field around the autonomous vehicle—such as in the form of a 3D color map or 3D point cloud of roads, sidewalks, vehicles, pedestrians, etc. in the field around the autonomous vehicle—per scan cycle. The autonomous vehicle can also collect data broadcast by other vehicles and/or static sensor systems nearby and can incorporate these data into an optical scan to determine a state and context of the scene around the vehicle and to elect subsequent actions.
  • Furthermore, the autonomous vehicle can compare features extracted from this optical scan to like features represented in the localization map—stored in local memory on the autonomous vehicle—in order to determine its geospatial location and orientation in real space and then elect a future navigational action or other navigational decision accordingly. The autonomous vehicle can also: implement a perception model—such as including integrated or discrete vehicle, pedestrian, traffic sign, traffic signal, and lane marker detection models—to detect and identify mutable and immutable objects in its proximity; and implement a navigation or path planning model (e.g., in the form of a convolutional neural network) to elect acceleration, braking, and turning actions based on these mutable and immutable objects and the autonomous vehicle's target destination. For example, the autonomous vehicle can implement a localization map, a navigational model, and a path planning model to: autonomously approach an intersection in Block S110; detect a stop sign, yield sign, or traffic signal near this intersection; detect another vehicle or pedestrian near this intersection; perceive that the other vehicle or pedestrian has right of way to enter the intersection or that the autonomous vehicle is otherwise required to stop at the intersection; and then autonomously navigate to a stop proximal the intersection in Block S112. As the autonomous vehicle slows upon approach to the intersection, while the autonomous vehicle is stopped at the intersection, and/or as the autonomous vehicle then accelerates into the intersection, the autonomous vehicle can execute subsequent Blocks of the method S100 to visually communicate its intent by projecting the intent icon onto pavement nearby, as described below.
  • However, the autonomous vehicle can include any other sensors and can implement any other scanning, signal processing, and autonomous navigation techniques or models to determine its geospatial position and orientation, to perceive objects in its vicinity, and to elect navigational actions based on sensor data collected through these sensors.
  • 4. Projector
  • The autonomous vehicle also includes a front projector configured to project light onto pavement ahead of the autonomous vehicle. For example, the front projector can include a DLP or LCD projector arranged under or integrated into the front bumper of the autonomous vehicle and configured to project light over a distance ahead of the autonomous vehicle, such as from 50 centimeters ahead of the autonomous vehicle to five meters ahead of the autonomous vehicle. Alternatively, the front projector can be configured to output a beam of light (e.g., an approximately-collimated light beam) and can be pivotably mounted to the front of the autonomous vehicle in order to project the beam of light along a length of pavement ahead of the autonomous vehicle. However, the front projector can be mounted to or integrated into the autonomous vehicle in any other way, can include a light source of any other type, and can project light into the field—at varying distances from the front of the autonomous vehicle—in any other way.
  • The autonomous vehicle can similarly include: a rear projector configured to project light onto pavement behind the autonomous vehicle; a right projector configured to project light onto pavement to the right of the autonomous vehicle; and/or a left projector configured to project light onto pavement to the left of the autonomous vehicle.
  • 5. Intent Icon
  • Block S120 of the method S100 recites, while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle; Block S130 of the method S100 recites calculating a confidence score for possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle; and Block S122 of the method S100 recites projecting the intent icon onto pavement at distances ahead of the autonomous vehicle proportional to the confidence score and greater than the first distance. Generally, in Blocks S120, S122, and S130, the autonomous vehicle can project the intent icon onto pavement near the autonomous vehicle at a distance from the autonomous vehicle (or at a position relative to the autonomous vehicle) that corresponds to the autonomous vehicle's intent to either remain stopped in its current location or to enter the intersection. In particular, the autonomous vehicle can project a simple intent icon—from which a human nearby may quickly intuit the autonomous vehicle's intent to either remain stopped or to move from its current location—onto pavement in the field near the autonomous vehicle.
  • In one implementation shown in FIG. 1, the autonomous vehicle projects a circular “dot”—such as approximately 25 centimeters in diameter and in a solid color (e.g., white, orange, green, yellow, violet, pink)—onto pavement near the autonomous vehicle in Blocks S120 and S122.
  • The autonomous vehicle can also animate the intent icon in order to garner attention from vehicle operators and pedestrians nearby. For example, the autonomous vehicle can pulsate the intent icon, including expanding and contracting the projected intent icon by +/− 30% of its nominal diameter, in order to increase likelihood that a human onlooker will notice the intent icon. In this implementation, the autonomous vehicle can also pulsate the intent icon at a rate corresponding to its intended rate of acceleration or deceleration from its current location.
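The pulsation described above could be modeled as a sinusoidal modulation of the dot's diameter. The sine form, base rate, and gain below are hypothetical; the description fixes only the +/−30% amplitude and the link between pulse rate and the intended rate of acceleration or deceleration:

```python
import math

NOMINAL_DIAMETER_M = 0.25  # the ~25-centimeter dot from the description
PULSE_AMPLITUDE = 0.30     # expand/contract by +/-30% of nominal diameter

def pulsed_diameter(t_s: float, pulse_rate_hz: float) -> float:
    """Sinusoidally expand and contract the projected dot by +/-30% of
    its nominal diameter at the given pulse rate."""
    return NOMINAL_DIAMETER_M * (
        1.0 + PULSE_AMPLITUDE * math.sin(2.0 * math.pi * pulse_rate_hz * t_s))

def pulse_rate_for_accel(intended_accel_mps2: float,
                         base_hz: float = 0.5, gain: float = 0.5) -> float:
    """Hypothetical mapping: pulse faster for stronger intended
    acceleration or deceleration."""
    return base_hz + gain * abs(intended_accel_mps2)
```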
  • In another implementation shown in FIG. 1, the autonomous vehicle projects the intent icon in the form of a “dot” into the field with a “tail” trailing the dot back to the autonomous vehicle in order to visually link the projected dot to the autonomous vehicle, such as if other autonomous vehicles nearby are implementing similar methods and techniques to project their own intent icons onto pavement nearby (e.g., when multiple autonomous vehicles converge on one intersection). In a similar implementation, the autonomous vehicle can project an intent icon that defines a tapered line or arc extending from the autonomous vehicle and widening at greater distances from the autonomous vehicle.
  • The autonomous vehicle can also project the intent icon in a color matched to the autonomous vehicle's exterior color or matched to a graphic on the autonomous vehicle's exterior. For example, the autonomous vehicle: can include green and yellow graphics arranged on its exterior; and can project a circular intent icon with concentric green and yellow rings onto pavement nearby in order to enable humans nearby to visually link this projected intent icon to this autonomous vehicle. In this example, a second autonomous vehicle: can include orange and white graphics arranged on its exterior; and can project a circular intent icon with concentric orange and white rings onto pavement nearby in order to enable humans nearby to visually link this projected intent icon to this second autonomous vehicle and to distinguish this intent icon projected by the second autonomous vehicle from intent icons projected by other autonomous vehicles nearby.
  • However, the autonomous vehicle can project an intent icon of any other size, geometry, and/or color, etc. and animated in any other way.
  • 6. Intent Icon Activation Scenarios
  • Generally, the autonomous vehicle can activate the front projector to project the intent icon into the field nearby in select scenarios—that is, when the autonomous vehicle is preparing to execute certain navigational actions.
  • 6.1 Stop Sign
  • In one implementation, the autonomous vehicle activates the front projector to project the intent icon into the field upon stopping at a stop sign and preparing to move into the intersection ahead. In this implementation, the autonomous vehicle can implement autonomous navigation and perception techniques to: autonomously navigate along a segment of road; detect a stop sign ahead of the autonomous vehicle; slow upon approach to the stop sign; stop at or ahead of the intersection; detect and track other vehicles at the intersection; perceive right of way of these vehicles based on detected arrival times at the intersection; and remain stopped at the intersection while waiting for other vehicles—with right of way preceding that of the autonomous vehicle—to enter the intersection.
  • While stopped at the stop sign and waiting for other vehicles to enter the intersection before entering the intersection itself, the autonomous vehicle can activate the front projector to project the intent icon just ahead (e.g., within 50 centimeters) of the front bumper of the autonomous vehicle. During this period, the autonomous vehicle can continue to: scan its surrounding field for other vehicles; track locations of these other vehicles within and near the intersection; and derive the autonomous vehicle's right of way to enter the intersection. As the autonomous vehicle detects the last other vehicle with right of way preceding that of the autonomous vehicle entering the intersection, the autonomous vehicle can trigger the front projector to move the intent icon further ahead of the front of the autonomous vehicle in order to visually indicate the autonomous vehicle's intent to enter the intersection. For example, the front projector can “animate” the intent icon moving from its projected position proximal the autonomous vehicle's front bumper to a position further ahead of the autonomous vehicle—such as in the intersection—in order to visually indicate the autonomous vehicle's intent to advance into the intersection soon thereafter.
  • In this implementation, the autonomous vehicle can also: calculate a confidence score that it possesses right of way to enter the intersection based on presence of other vehicles in or near the intersection in Block S130; project the intent icon into the field at a distance from the front of the autonomous vehicle as a function of this confidence score in Block S122; and enter the intersection and resume navigation once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance ahead of the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to enter the intersection.
  • 6.2 Crosswalk
  • In another implementation, the autonomous vehicle activates the front projector to project the intent icon into the field upon stopping ahead of a crosswalk and preparing to advance through the crosswalk. In this implementation, the autonomous vehicle can execute methods and techniques similar to those described above to: autonomously navigate along a road segment; detect a crosswalk ahead of the autonomous vehicle; detect and track a pedestrian near the crosswalk; slow to a stop at or ahead of the crosswalk; and remain stopped at the crosswalk while waiting for the pedestrian to enter and then exit the crosswalk. While stopped at the crosswalk and waiting for the pedestrian to exit the crosswalk, the autonomous vehicle can activate the front projector to project the intent icon just ahead of the front bumper of the autonomous vehicle in order to visually communicate—to the pedestrian—that the autonomous vehicle intends to remain stopped at the crosswalk for the pedestrian.
  • The autonomous vehicle can continue to track the pedestrian moving through the crosswalk. As the pedestrian approaches an adjacent sidewalk (or median) or once the pedestrian steps onto the sidewalk (or median), the autonomous vehicle can move the intent icon further ahead of the autonomous vehicle in order to visually indicate the autonomous vehicle's intent to advance past the crosswalk. For example, the autonomous vehicle can: estimate a remaining time for the pedestrian to reach the adjacent sidewalk based on a speed of the pedestrian and a remaining distance between the pedestrian and the sidewalk; animate the intent icon moving outwardly from the autonomous vehicle (e.g., into or past the crosswalk) once the autonomous vehicle estimates the pedestrian to be within a threshold time (e.g., four seconds) of the sidewalk; and continue to move the intent icon outwardly from the front of the autonomous vehicle as the pedestrian approaches the sidewalk in order to visually communicate a sense of “urgency” and to visually indicate the autonomous vehicle's intent to pass through the crosswalk once the pedestrian enters the sidewalk.
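The pedestrian time-to-sidewalk estimate and the threshold-triggered icon advance described above could be sketched as follows; the four-second threshold comes from the example in the text, while the function names and the handling of a stationary pedestrian are assumptions.

```python
def time_to_sidewalk(pedestrian_speed_mps, remaining_distance_m):
    """Estimate seconds until the pedestrian reaches the adjacent sidewalk,
    from pedestrian speed and remaining distance to the sidewalk."""
    if pedestrian_speed_mps <= 0.0:
        return float('inf')  # stationary pedestrian: keep waiting
    return remaining_distance_m / pedestrian_speed_mps

def begin_icon_advance(pedestrian_speed_mps, remaining_distance_m,
                       threshold_s=4.0):
    """Start animating the intent icon outward (into or past the crosswalk)
    once the pedestrian is within the threshold time of the sidewalk."""
    return time_to_sidewalk(pedestrian_speed_mps, remaining_distance_m) <= threshold_s
```

For example, a pedestrian walking at 1.5 m/s with 3 m remaining is two seconds from the sidewalk, so the icon begins moving outward; a pedestrian 8 m away at 1 m/s does not yet trigger the advance.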
  • In this implementation, the autonomous vehicle can: calculate a confidence score that it possesses right of way to enter the crosswalk based on presence of a pedestrian in or near the crosswalk in Block S130; project the intent icon into the field at a distance from the front of the autonomous vehicle as a function of this confidence score in Block S122; and then accelerate through the crosswalk once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance ahead of the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to enter the crosswalk.
  • 6.3 Right Turn
  • In yet another implementation, the autonomous vehicle activates the front projector to project the intent icon into the field upon stopping at a right-turn lane and preparing to turn right. In this implementation, the autonomous vehicle can execute methods and techniques similar to those described above to: autonomously navigate along a road segment, up to an intersection, and into a right-turn lane; detect and track other vehicles in or approaching the intersection; identify another vehicle heading toward the road segment just ahead of and perpendicular to the autonomous vehicle and with right of way to pass through the intersection; and yield to this other vehicle accordingly. While stopped in this right-turn lane and waiting for this other vehicle to pass the autonomous vehicle, the autonomous vehicle can activate the front projector to project the intent icon just ahead of the front bumper of the autonomous vehicle in order to visually communicate—to the other vehicle and/or its operator—that the autonomous vehicle intends to remain stopped in the right-turn lane for the other vehicle to pass.
  • The autonomous vehicle can continue to track the other vehicle and can move the intent icon further ahead of the autonomous vehicle—in order to visually indicate the autonomous vehicle's intent to turn right from the right-turn lane—once the other vehicle passes in front of the autonomous vehicle. In a similar example, the autonomous vehicle can: estimate a remaining time for the other vehicle to pass the autonomous vehicle based on the position and speed of the other vehicle relative to the autonomous vehicle; animate the intent icon moving outwardly from the autonomous vehicle (e.g., into the road segment ahead of and perpendicular to the autonomous vehicle) once the autonomous vehicle estimates the other vehicle to be within a threshold time (e.g., two seconds) of passing the autonomous vehicle; and continue to move the intent icon outwardly from the front of the autonomous vehicle as the other vehicle passes and moves beyond the autonomous vehicle. Once the autonomous vehicle determines that the other vehicle has moved beyond the autonomous vehicle by at least a minimum distance and that other vehicles are not approaching the road segment just ahead of the autonomous vehicle, the autonomous vehicle can resume autonomous navigation and execute a right-turn action onto this road segment.
  • In this implementation, the autonomous vehicle can: calculate a confidence score that it possesses right of way to make a right turn at this intersection based on presence of other vehicles nearby in Block S130; project the intent icon into the field at a distance from the front of the autonomous vehicle as a function of this confidence score in Block S122; and then autonomously execute a right-turn action once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance ahead of the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to execute a right turn action.
  • 6.4 Parking Space
  • In another implementation, the autonomous vehicle activates the rear projector to project the intent icon into the field when stopped in a parking space and preparing to back out of the parking space. In this implementation, the autonomous vehicle can remain parked in a parking space with its powertrain “OFF” when not in use (e.g., when “idle”). When a user subsequently enters the autonomous vehicle in preparation for a ride to a dropoff location or when the autonomous vehicle receives a new ride request specifying a pickup location other than the parking space, the autonomous vehicle can: power up its powertrain; activate the rear projector to project the intent icon just behind the rear bumper of the autonomous vehicle in order to visually communicate—to the other vehicles and/or vehicle operators nearby—that the autonomous vehicle is active but intends to remain stopped in its current parking space; and scan the field behind and to the sides of the autonomous vehicle for an approaching vehicle, a pedestrian, and/or other obstacle. If the autonomous vehicle detects an approaching vehicle or a pedestrian within a threshold distance of a planned path out of the parking space, the autonomous vehicle can remain stopped in the parking space and continue to project the intent icon just behind the rear bumper of the autonomous vehicle via the rear projector. As this other vehicle passes the autonomous vehicle's parking space and/or as this pedestrian moves outside of the autonomous vehicle's planned path out of the parking space, the rear projector can project the intent icon further from the rear of the autonomous vehicle in order to visually communicate its intent to back out of the parking space. Once the autonomous vehicle confirms that other vehicles and pedestrians are sufficiently remote from all or a portion of the planned path out of the parking space, the autonomous vehicle can autonomously back out of the parking space along the planned path.
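The clearance check that gates backing out of the parking space could be sketched as a simple geometric test. The 2 m clearance radius and the representation of the planned path and detected objects as (x, y) points in a common frame are assumptions for illustration.

```python
import math

def path_is_clear(planned_path, obstacles, clearance_m=2.0):
    """Return True when every detected vehicle/pedestrian position is at
    least clearance_m away from every point of the planned reverse path
    out of the parking space (points in a shared ground frame, meters)."""
    for px, py in planned_path:
        for ox, oy in obstacles:
            if math.hypot(ox - px, oy - py) < clearance_m:
                return False
    return True
```

While this check fails, the vehicle remains stopped with the intent icon just behind the rear bumper; once it passes, the rear projector can move the icon outward and the vehicle can begin the reverse maneuver.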
  • In this implementation, the autonomous vehicle can: calculate a confidence score that it possesses right of way to back out of the parking space based on presence of other vehicles and pedestrians nearby in Block S130; project the intent icon into the field at a distance behind the autonomous vehicle as a function of this confidence score in Block S122; and then autonomously navigate a planned path to back out of the parking space once this confidence score exceeds a threshold score in Block S140. Therefore, in this implementation, the autonomous vehicle can project the intent icon at a distance behind the autonomous vehicle as a function of the autonomous vehicle's confidence in its right of way, which may correspond to its intent to back out of the parking space.
  • Furthermore, as the autonomous vehicle approaches the end of this planned path out of the parking space, the rear projector can shift the intent icon closer to the rear of the autonomous vehicle in order to communicate the autonomous vehicle's intent to slow and then change directions. For example, the rear projector can project the intent icon at a distance from the rear of the autonomous vehicle as a function of the distance remaining along the planned path or as a function of the autonomous vehicle's intended speed.
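The shrinking rear projection distance could be sketched as below; the 3 m cap and the 0.5 gain are assumed values, since the specification only requires the icon to draw nearer the rear bumper as the reverse path is consumed.

```python
def reversing_icon_distance(remaining_path_m, max_distance_m=3.0, gain=0.5):
    """Distance (meters) behind the rear bumper at which to project the
    intent icon while reversing: proportional to the remaining length of
    the planned path, capped at max_distance_m, shrinking toward zero as
    the vehicle nears the end of the maneuver."""
    return max(0.0, min(max_distance_m, gain * remaining_path_m))
```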
  • As (or once) the autonomous vehicle slows to a stop at the end of this planned path out of the parking space, the autonomous vehicle can deactivate the rear projector and trigger the front projector to project the intent icon proximal the front of the autonomous vehicle while scanning the field ahead of the autonomous vehicle for another vehicle or pedestrian. Once the autonomous vehicle confirms that it has right of way to move forward—such as in the absence of another vehicle or pedestrian in or near a planned route forward from the autonomous vehicle's current location—the front projector can move the intent icon further ahead of the front of the autonomous vehicle in order to communicate the autonomous vehicle's intent to move forward, such as described above.
  • 6.5 Rider Pickup and Dropoff
  • In another implementation, while stopped at a pickup location and waiting for a rider to enter the autonomous vehicle, such as at a curb or loading zone, the autonomous vehicle (or a side projector specifically) can project the intent icon onto pavement adjacent a door of the autonomous vehicle in order to visually communicate to other vehicle operators and pedestrians nearby that the autonomous vehicle is waiting for a rider to enter the autonomous vehicle. Once the rider has entered the autonomous vehicle, the autonomous vehicle can animate the intent icon moving toward the front of the autonomous vehicle and project the intent icon adjacent the front of the autonomous vehicle while the rider prepares for departure. Once the rider confirms that she is ready to depart and as the autonomous vehicle verifies that it possesses right of way, the autonomous vehicle can project the intent icon at greater distances from the front of the autonomous vehicle in order to visually communicate its intent to depart from the pickup location.
  • Similarly, as the autonomous vehicle approaches a drop-off location, the autonomous vehicle can project the intent icon ahead of the autonomous vehicle and shift the intent icon closer to the front of the autonomous vehicle in order to visually communicate its intent to slow down. After stopping at the dropoff location and while waiting for a rider to exit the autonomous vehicle, the autonomous vehicle (or the side projector) can shift the intent icon to pavement adjacent a door of the autonomous vehicle (e.g., adjacent the rider's location inside the autonomous vehicle or adjacent a door of the autonomous vehicle opposite a nearby curb) in order to visually communicate to other vehicle operators and pedestrians nearby that the autonomous vehicle is waiting for a rider to exit the autonomous vehicle. The autonomous vehicle can then transition the intent icon back to the front of the autonomous vehicle in preparation for subsequent departure, as described above.
  • 6.6 Scenario Approach
  • The autonomous vehicle can implement similar methods and techniques to activate the front projector when approaching select scenarios or when preparing to execute a navigational change. In one example, as the autonomous vehicle approaches a right turn lane and prepares to turn right, the autonomous vehicle can project the intent icon onto pavement far (e.g., five meters) ahead of the autonomous vehicle once the speed of the autonomous vehicle drops below a speed threshold (e.g., 25 miles per hour). The autonomous vehicle can project the intent icon onto pavement at closer distances to the front of the autonomous vehicle as the autonomous vehicle slows further upon approach to the right turn lane. If the autonomous vehicle then stops in the right turn lane to yield to oncoming traffic, the autonomous vehicle can implement methods and techniques described above to project the intent icon just ahead of the front of the autonomous vehicle while stopped and to then shift the intent icon further ahead of the autonomous vehicle as the autonomous vehicle prepares to advance forward and execute a right turn maneuver. However, if the autonomous vehicle determines that no traffic is oncoming and prepares to execute the right turn maneuver as the autonomous vehicle approaches the right turn lane without stopping, the autonomous vehicle can then transition to projecting the intent icon at a further distance from the front of the autonomous vehicle—even as the autonomous vehicle slows upon approach to the right turn lane—in order to visually communicate the autonomous vehicle's intent to execute the right turn maneuver without stopping.
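The approach behavior in this example could be sketched as below. The 25 mph activation threshold and five-meter "far" distance come from the example; the linear interpolation down to 0.5 m at standstill, and the convention of returning None when the projector is inactive, are assumptions.

```python
def approach_icon_distance(speed_mph, will_stop, activation_mph=25.0,
                           far_m=5.0, near_m=0.5):
    """Distance (meters) at which to project the intent icon while
    approaching a right turn lane, or None while the projector is
    inactive above the activation speed.

    A vehicle that will stop to yield draws the icon inward as it slows;
    a vehicle executing a rolling right turn keeps the icon far ahead to
    signal its intent to turn without stopping.
    """
    if speed_mph >= activation_mph:
        return None  # projector not yet activated for this scenario
    if not will_stop:
        return far_m  # rolling turn: hold the icon well ahead
    # interpolate: far_m at the activation speed, near_m at standstill
    return near_m + (far_m - near_m) * (speed_mph / activation_mph)
```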
  • 7. Intent Icon Deactivation
  • Once the autonomous vehicle begins to navigate forward into an intersection or out of a parking space, etc., the autonomous vehicle can deactivate projection of the intent icon into the field. Alternatively, the autonomous vehicle can continue to project the intent icon until the autonomous vehicle has exited the intersection, exited the parking space, passed the crosswalk, merged into a new lane, or otherwise completed the current navigational action. Yet alternatively, the autonomous vehicle can disable projection of the intent icon at the earlier of: reaching a threshold speed (e.g., 10 miles per hour, 25 miles per hour); and coming within a threshold distance (e.g., five meters) of another vehicle directly ahead of the autonomous vehicle.
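The third deactivation variant (the earlier of a threshold speed or a threshold gap) could be expressed as a simple predicate. The 10 mph and five-meter defaults come from the examples above; the convention of passing None when no vehicle is detected directly ahead is an assumption.

```python
def disable_intent_icon(speed_mph, gap_to_lead_m=None,
                        speed_threshold_mph=10.0, gap_threshold_m=5.0):
    """Disable projection at the earlier of: reaching a threshold speed;
    or coming within a threshold distance of another vehicle directly
    ahead (gap_to_lead_m is None when no such vehicle is detected)."""
    if speed_mph >= speed_threshold_mph:
        return True
    return gap_to_lead_m is not None and gap_to_lead_m <= gap_threshold_m
```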
  • The autonomous vehicle can also disable projection of the intent icon when the autonomous vehicle is behind another vehicle and approaching or stopped at an intersection; rather, the autonomous vehicle can limit projection of the intent icon into the field to when the autonomous vehicle reaches the front of the intersection (i.e., is the leading vehicle in its lane at the intersection).
  • However, the autonomous vehicle can selectively disable projection of the intent icon in response to any other event.
  • 8. Direction
  • In the foregoing implementations, the autonomous vehicle can project the intent icon into the field along the longitudinal axis of the autonomous vehicle and at varying distances from the autonomous vehicle and leverage turn signals integrated into the autonomous vehicle to indicate the autonomous vehicle's intended direction of navigation. Alternatively, when preparing to execute a turn—such as when stopped at a stoplight, traffic signal, or turn lane—the autonomous vehicle can shift the projected intent icon laterally (i.e., off of the longitudinal axis of the autonomous vehicle) in order to visually communicate the autonomous vehicle's intent to execute this turn. In particular, by projecting the intent icon in the direction that the autonomous vehicle intends to navigate while also projecting the intent icon at a distance from the autonomous vehicle corresponding to the autonomous vehicle's intent to execute this navigational action, the autonomous vehicle can move the intent icon closer to another vehicle or pedestrian nearer to the autonomous vehicle's intended path, which may improve perception and comprehension of the intent icon for vehicle operators and pedestrians.
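The lateral shift could be sketched as a rotation of the projection point in the vehicle frame; the frame convention (x forward, y left) and the single turn-angle parameter are assumptions for illustration.

```python
import math

def icon_position(distance_m, turn_angle_deg=0.0):
    """Intent icon position in the vehicle frame (x forward, y left):
    projected straight ahead along the longitudinal axis by default, or
    rotated toward the intended turn so the icon lands nearer the
    vehicle's planned path and the road users along it."""
    theta = math.radians(turn_angle_deg)
    return (distance_m * math.cos(theta), distance_m * math.sin(theta))
```

For example, a planned right turn might rotate the icon toward the cross street (negative angle in this convention) while the projection distance continues to encode the vehicle's confidence in its right of way.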
  • The systems and methods described herein can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, hardware/firmware/software elements of a user computer or mobile device, wristband, smartphone, or any suitable combination thereof. Other systems and methods of the embodiment can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with apparatuses and networks of the type described above. The computer-readable medium can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component can be a processor, but any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
  • As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the embodiments of the invention without departing from the scope of this invention as defined in the following claims.

Claims (2)

I claim:
1. A method for communicating intent comprising, at an autonomous vehicle:
autonomously approaching an intersection;
autonomously navigating to a stop proximal the intersection;
while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle;
calculating a confidence score for possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle;
projecting the intent icon onto pavement at distances ahead of the autonomous vehicle proportional to the confidence score and greater than the first distance; and
in response to the confidence score exceeding a threshold score, autonomously entering the intersection.
2. A method for communicating intent comprising, at an autonomous vehicle:
autonomously approaching an intersection;
while slowing upon approach to the intersection, projecting an intent icon onto pavement at a distance ahead of the autonomous vehicle decreasing with decreasing speed of the autonomous vehicle;
while stopped proximal the intersection, projecting an intent icon onto pavement at a first distance ahead of the autonomous vehicle;
predicting possession of right of way of the autonomous vehicle to enter the intersection based on objects detected in a field around the autonomous vehicle;
in preparation for navigating into the intersection, projecting the intent icon onto pavement at a second distance ahead of the autonomous vehicle greater than the first distance; and
autonomously entering the intersection.
US16/421,423 2018-06-27 2019-05-23 Method for communicating intent of an autonomous vehicle Abandoned US20200001779A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/421,423 US20200001779A1 (en) 2018-06-27 2019-05-23 Method for communicating intent of an autonomous vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862690873P 2018-06-27 2018-06-27
US16/421,423 US20200001779A1 (en) 2018-06-27 2019-05-23 Method for communicating intent of an autonomous vehicle

Publications (1)

Publication Number Publication Date
US20200001779A1 true US20200001779A1 (en) 2020-01-02

Family

ID=69054592

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/421,423 Abandoned US20200001779A1 (en) 2018-06-27 2019-05-23 Method for communicating intent of an autonomous vehicle
US16/455,524 Active US11001196B1 (en) 2018-06-27 2019-06-27 Systems and methods for communicating a machine intent

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/455,524 Active US11001196B1 (en) 2018-06-27 2019-06-27 Systems and methods for communicating a machine intent

Country Status (1)

Country Link
US (2) US20200001779A1 (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11001196B1 (en) * 2018-06-27 2021-05-11 Direct Current Capital LLC Systems and methods for communicating a machine intent
US20210155241A1 (en) * 2019-11-22 2021-05-27 Magna Electronics Inc. Vehicular control system with controlled vehicle stopping and starting at intersection
US11055997B1 (en) * 2020-02-07 2021-07-06 Honda Motor Co., Ltd. System and method for resolving ambiguous right of way
US11079765B2 (en) 2016-12-19 2021-08-03 Direct Current Capital LLC Methods for communicating state, intent, and context of an autonomous vehicle
CN113325976A (en) * 2021-05-27 2021-08-31 北京沃东天骏信息技术有限公司 Application program testing method, device, equipment and storage medium
US11136023B2 (en) * 2019-05-07 2021-10-05 Baidu Usa Llc Method for determining exiting intersection of moving objects for autonomous driving vehicles
CN113865600A (en) * 2021-09-28 2021-12-31 北京三快在线科技有限公司 High-precision map construction method and device
US11244562B2 (en) * 2019-08-26 2022-02-08 Panasonic Intellectual Property Management Co., Ltd. Information processing apparatus, information processing method and recording medium
US20220063604A1 (en) * 2020-08-25 2022-03-03 Subaru Corporation Vehicle travel control device
US11312298B2 (en) * 2020-01-30 2022-04-26 International Business Machines Corporation Modulating attention of responsible parties to predicted dangers of self-driving cars
US20220172620A1 (en) * 2019-06-13 2022-06-02 Zoox, Inc. Junction queueing
US11364912B2 (en) * 2019-03-08 2022-06-21 Honda Motor Co., Ltd. Vehicle control device
US11403850B2 (en) * 2019-07-24 2022-08-02 Honda Motor Co., Ltd. System and method for providing unsupervised domain adaptation for spatio-temporal action localization
US11454986B2 (en) * 2019-05-08 2022-09-27 Toyota Jidosha Kabushiki Kaisha Vehicle, operation management device, operation management method, and computer-readable recording medium
US20220306099A1 (en) * 2019-06-13 2022-09-29 Audi Ag Method and track guidance system for controlling an autonomous motor vehicle in an urban area
US20220412737A1 (en) * 2021-06-23 2022-12-29 Palantir Technologies Inc. Approaches of obtaining geospatial coordinates of sensor data
US11614739B2 (en) 2019-09-24 2023-03-28 Apple Inc. Systems and methods for hedging for different gaps in an interaction zone

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019156180A (en) * 2018-03-13 2019-09-19 本田技研工業株式会社 Vehicle controller, vehicle control method and program
US10743141B2 (en) 2018-06-05 2020-08-11 Kenmar Corporation Systems and methods for determining a location of an electronic device using bilateration
US11345276B2 (en) * 2019-06-17 2022-05-31 Ahmed Kareem Vehicle status indicator
US11410027B2 (en) * 2019-09-16 2022-08-09 SambaNova Systems, Inc. Performance estimation-based resource allocation for reconfigurable architectures
US20220111849A1 (en) * 2020-08-17 2022-04-14 Autobrains Technologies Ltd Determining driving features using timed narrow ai agent allocation

Family Cites Families (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2190123A (en) 1938-07-02 1940-02-13 Stephen C Pace Lighting signal
US7979173B2 (en) 1997-10-22 2011-07-12 Intelligent Technologies International, Inc. Autonomous vehicle travel control systems and methods
US7647180B2 (en) 1997-10-22 2010-01-12 Intelligent Technologies International, Inc. Vehicular intersection management techniques
US6741928B2 (en) 2000-03-07 2004-05-25 Magellan Dis, Inc. Navigation system with figure of merit determination
US7287884B2 (en) 2002-02-07 2007-10-30 Toyota Jidosha Kabushiki Kaisha Vehicle operation supporting device and vehicle operation supporting system
US9428186B2 (en) 2002-04-09 2016-08-30 Intelligent Technologies International, Inc. Exterior monitoring for vehicles
JP4578795B2 (en) 2003-03-26 2010-11-10 富士通テン株式会社 Vehicle control device, vehicle control method, and vehicle control program
KR100811232B1 (en) 2003-07-18 2008-03-07 엘지전자 주식회사 Turn-by-turn navigation system and how to guide
US20050117364A1 (en) 2003-10-27 2005-06-02 Mark Rennick Method and apparatus for projecting a turn signal indication
US7095318B1 (en) 2004-09-28 2006-08-22 Solomon Bekhor Enhanced vehicle advisory system to advise drivers of other vehicles and passengers in the vehicle of actions taken by the driver
TWI443547B (en) 2005-12-07 2014-07-01 Telecomm Systems Inc Method and system for a user input solution for a limited telecommunication device
EP1912157A1 (en) 2006-10-09 2008-04-16 MAGNETI MARELLI SISTEMI ELETTRONICI S.p.A. Digital image processing system for automatically representing surrounding scenes to the driver of a vehicle for driving assistance, and corresponding operating method
US20160040997A1 (en) 2007-03-22 2016-02-11 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Vehicle navigation playback method
US20090069977A1 (en) 2007-09-12 2009-03-12 Gm Global Technology Operations, Inc. Apparatus and methods for automatically activating a motor vehicle turn signal lamp
JP4930446B2 (en) 2008-04-14 2012-05-16 トヨタ自動車株式会社 Vehicle travel control device
EP2291836B1 (en) 2008-05-21 2018-09-12 ADC Automotive Distance Control Systems GmbH Driver assistance system for preventing a vehicle colliding with pedestrians
JP5167051B2 (en) 2008-09-30 2013-03-21 富士重工業株式会社 Vehicle driving support device
US7924146B2 (en) 2009-04-02 2011-04-12 GM Global Technology Operations LLC Daytime pedestrian detection on full-windscreen head-up display
US8269652B2 (en) 2009-04-02 2012-09-18 GM Global Technology Operations LLC Vehicle-to-vehicle communicator on full-windshield head-up display
US8174375B2 (en) 2009-06-30 2012-05-08 The Hong Kong Polytechnic University Detection system for assisting a driver when driving a vehicle using a plurality of image capturing devices
US8253589B2 (en) 2009-10-20 2012-08-28 GM Global Technology Operations LLC Vehicle to entity communication
US8537030B2 (en) 2010-02-15 2013-09-17 Ford Global Technologies, Llc Pedestrian alert system and method
JP5218458B2 (en) 2010-03-23 2013-06-26 株式会社デンソー Vehicle approach notification system
US8509982B2 (en) 2010-10-05 2013-08-13 Google Inc. Zone driving
US9043045B2 (en) 2011-02-21 2015-05-26 Toyota Jidosha Kabushiki Kaisha Travel assist apparatus and travel assist method
US20120242479A1 (en) 2011-03-23 2012-09-27 Ghazarian Ohanes D Vehicle traffic lane change signaling device
US20120310465A1 (en) 2011-06-02 2012-12-06 Harman International Industries, Incorporated Vehicle nagivation system
US20130265791A1 (en) 2012-04-10 2013-10-10 Ford Global Technologies, Llc Vehicle light assembly with photon recycling
GB201213604D0 (en) 2012-07-31 2012-09-12 Bae Systems Plc Detectig moving vehicles
US9196164B1 (en) 2012-09-27 2015-11-24 Google Inc. Pedestrian notifications
US20160054563A9 (en) 2013-03-14 2016-02-25 Honda Motor Co., Ltd. 3-dimensional (3-d) navigation
US9891068B2 (en) 2013-06-08 2018-02-13 Apple Inc. Mapping application search function
US8996224B1 (en) 2013-03-15 2015-03-31 Google Inc. Detecting that an autonomous vehicle is in a stuck condition
US10215583B2 (en) 2013-03-15 2019-02-26 Honda Motor Co., Ltd. Multi-level navigation monitoring and control
US8849494B1 (en) 2013-03-15 2014-09-30 Google Inc. Data selection by an autonomous vehicle for trajectory modification
KR102011457B1 (en) 2013-04-17 2019-08-19 엘지전자 주식회사 Mobile terminal and control method thereof
US9654738B1 (en) 2013-08-07 2017-05-16 Waymo Llc Using multiple exposures to improve image processing for autonomous vehicles
WO2015032795A2 (en) 2013-09-03 2015-03-12 Jaguar Land Rover Limited System for imaging
US20150066284A1 (en) 2013-09-05 2015-03-05 Ford Global Technologies, Llc Autonomous vehicle control for impaired driver
CN105683015B (en) 2013-09-05 2018-06-08 罗伯特·博世有限公司 Enhancing lane departur warning based on the data from rear radar sensor
US9336436B1 (en) 2013-09-30 2016-05-10 Google Inc. Methods and systems for pedestrian avoidance
US9145116B2 (en) 2013-12-04 2015-09-29 Mobileye Vision Technologies Ltd. Systems and methods for implementing a multi-segment braking profile for a vehicle
CN106415896B (en) 2014-01-24 2019-08-09 日产自动车株式会社 electrical device
WO2015177648A1 (en) 2014-05-14 2015-11-26 Ofer Springer Systems and methods for curb detection and pedestrian hazard assessment
US9475422B2 (en) * 2014-05-22 2016-10-25 Applied Invention, Llc Communication between autonomous vehicle and external observers
EP2993083B1 (en) 2014-09-04 2023-11-22 Nokia Technologies Oy Apparatus, method and computer program for controlling road user acknowledgement
US9487129B2 (en) 2014-09-05 2016-11-08 Lenovo (Singapore) Pte. Ltd. Method and system to control vehicle turn indicators
US20160231746A1 (en) 2015-02-06 2016-08-11 Delphi Technologies, Inc. System And Method To Operate An Automated Vehicle
US9718405B1 (en) 2015-03-23 2017-08-01 Rosco, Inc. Collision avoidance and/or pedestrian detection system
US9884631B2 (en) 2015-06-04 2018-02-06 Toyota Motor Engineering & Manufacturing North America, Inc. Transitioning between operational modes of an autonomous vehicle
US9637120B2 (en) 2015-06-24 2017-05-02 Delphi Technologies, Inc. Cognitive driver assist with variable assistance for automated vehicles
US10053001B1 (en) 2015-09-24 2018-08-21 Apple Inc. System and method for visual communication of an operational status
US9849784B1 (en) 2015-09-30 2017-12-26 Waymo Llc Occupant facing vehicle display
US10479373B2 (en) * 2016-01-06 2019-11-19 GM Global Technology Operations LLC Determining driver intention at traffic intersections for automotive crash avoidance
US9975487B2 (en) 2016-02-03 2018-05-22 GM Global Technology Operations LLC Rear vision system for a vehicle and method of using the same
US9969326B2 (en) * 2016-02-22 2018-05-15 Uber Technologies, Inc. Intention signaling for an autonomous vehicle
US9902311B2 (en) * 2016-02-22 2018-02-27 Uber Technologies, Inc. Lighting device for a vehicle
US10134280B1 (en) * 2016-02-23 2018-11-20 Taehyun You Vehicular notifications
US10055652B2 (en) 2016-03-21 2018-08-21 Ford Global Technologies, Llc Pedestrian detection and motion prediction with rear-facing camera
US9857795B2 (en) 2016-03-24 2018-01-02 Honda Motor Co., Ltd. System and method for trajectory planning for unexpected pedestrians
US9535423B1 (en) 2016-03-29 2017-01-03 Adasworks Kft. Autonomous vehicle with improved visual detection ability
US9829889B1 (en) 2016-05-10 2017-11-28 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle advanced notification system and method of use
US9870703B2 (en) 2016-06-14 2018-01-16 Ford Global Technologies, Llc Pedestrian warning system providing adjustable acoustic indications
US9881503B1 (en) 2016-09-08 2018-01-30 GM Global Technology Operations LLC Vehicle-to-pedestrian-communication systems and methods for using the same
US9884585B1 (en) 2016-09-19 2018-02-06 Faraday & Future Inc. Exterior vehicle alerting system
EP3545376A4 (en) * 2016-11-22 2020-07-01 Amazon Technologies Inc. Method for autonomous navigation through uncontrolled and controlled intersections
US10196058B2 (en) 2016-11-28 2019-02-05 drive.ai Inc. Method for influencing entities at a roadway intersection
KR20180069147A (en) 2016-12-14 2018-06-25 Mando Hella Electronics Corp. Apparatus for warning pedestrian in vehicle
JP6895634B2 (en) 2016-12-16 2021-06-30 Panasonic Intellectual Property Management Co., Ltd. Information processing systems, information processing methods, and programs
US10261513B2 (en) 2016-12-19 2019-04-16 drive.ai Inc. Methods for communicating state, intent, and context of an autonomous vehicle
US20180276986A1 (en) 2017-03-22 2018-09-27 Toyota Research Institute, Inc. Vehicle-to-human communication in an autonomous vehicle operation
US20180286232A1 (en) 2017-03-31 2018-10-04 David Shau Traffic control using sound signals
US10317907B2 (en) 2017-04-28 2019-06-11 GM Global Technology Operations LLC Systems and methods for obstacle avoidance and path planning in autonomous vehicles
TW201900459A (en) * 2017-05-17 2019-01-01 大陸商上海蔚蘭動力科技有限公司 Driving intention indicating device and method
US10118548B1 (en) 2017-06-15 2018-11-06 State Farm Mutual Automobile Insurance Company Autonomous vehicle signaling of third-party detection
JP6463549B1 (en) * 2017-07-21 2019-02-06 三菱電機株式会社 Irradiation system and irradiation method
US10262528B2 (en) 2017-07-21 2019-04-16 State Farm Mutual Automobile Insurance Company Autonomous vehicle mode alert system for bystanders
WO2019165451A1 (en) * 2018-02-26 2019-08-29 Nvidia Corporation Systems and methods for computer-assisted shuttles, buses, robo-taxis, ride-sharing and on-demand vehicles with situational awareness
US20200001779A1 (en) * 2018-06-27 2020-01-02 drive.ai Inc. Method for communicating intent of an autonomous vehicle
US11120688B2 (en) * 2018-06-29 2021-09-14 Nissan North America, Inc. Orientation-adjust actions for autonomous vehicle operational management
US11345277B2 (en) * 2018-10-16 2022-05-31 GM Global Technology Operations LLC Autonomous vehicle intent signaling
US20200017106A1 (en) * 2019-06-13 2020-01-16 Lg Electronics Inc. Autonomous vehicle control method

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12351105B1 (en) 2016-12-19 2025-07-08 Direct Current Capital LLC Methods for communicating state, intent, and context of an autonomous vehicle
US11914381B1 (en) 2016-12-19 2024-02-27 Direct Current Capital LLC Methods for communicating state, intent, and context of an autonomous vehicle
US11079765B2 (en) 2016-12-19 2021-08-03 Direct Current Capital LLC Methods for communicating state, intent, and context of an autonomous vehicle
US11001196B1 (en) * 2018-06-27 2021-05-11 Direct Current Capital LLC Systems and methods for communicating a machine intent
US11364912B2 (en) * 2019-03-08 2022-06-21 Honda Motor Co., Ltd. Vehicle control device
US11136023B2 (en) * 2019-05-07 2021-10-05 Baidu Usa Llc Method for determining exiting intersection of moving objects for autonomous driving vehicles
US11454986B2 (en) * 2019-05-08 2022-09-27 Toyota Jidosha Kabushiki Kaisha Vehicle, operation management device, operation management method, and computer-readable recording medium
US20220172620A1 (en) * 2019-06-13 2022-06-02 Zoox, Inc. Junction queueing
US11776400B2 (en) * 2019-06-13 2023-10-03 Zoox, Inc. Junction queueing
US20220306099A1 (en) * 2019-06-13 2022-09-29 Audi Ag Method and track guidance system for controlling an autonomous motor vehicle in an urban area
US11403850B2 (en) * 2019-07-24 2022-08-02 Honda Motor Co., Ltd. System and method for providing unsupervised domain adaptation for spatio-temporal action localization
US11580743B2 (en) 2019-07-24 2023-02-14 Honda Motor Co., Ltd. System and method for providing unsupervised domain adaptation for spatio-temporal action localization
US11244562B2 (en) * 2019-08-26 2022-02-08 Panasonic Intellectual Property Management Co., Ltd. Information processing apparatus, information processing method and recording medium
US11614739B2 (en) 2019-09-24 2023-03-28 Apple Inc. Systems and methods for hedging for different gaps in an interaction zone
US12036990B2 (en) * 2019-11-22 2024-07-16 Magna Electronics Inc. Vehicular control system with controlled vehicle stopping and starting at intersection
US20240367652A1 (en) * 2019-11-22 2024-11-07 Magna Electronics Inc. Vehicular vision system with determination of right-of-way at intersection
US20210155241A1 (en) * 2019-11-22 2021-05-27 Magna Electronics Inc. Vehicular control system with controlled vehicle stopping and starting at intersection
US11312298B2 (en) * 2020-01-30 2022-04-26 International Business Machines Corporation Modulating attention of responsible parties to predicted dangers of self-driving cars
US11055997B1 (en) * 2020-02-07 2021-07-06 Honda Motor Co., Ltd. System and method for resolving ambiguous right of way
US20220063604A1 (en) * 2020-08-25 2022-03-03 Subaru Corporation Vehicle travel control device
US12122366B2 (en) * 2020-08-25 2024-10-22 Subaru Corporation Vehicle travel control device having emergency braking control
CN113325976A (en) * 2021-05-27 2021-08-31 北京沃东天骏信息技术有限公司 Application program testing method, device, equipment and storage medium
US20220412737A1 (en) * 2021-06-23 2022-12-29 Palantir Technologies Inc. Approaches of obtaining geospatial coordinates of sensor data
US12298134B2 (en) * 2021-06-23 2025-05-13 Palantir Technologies Inc. Approaches of obtaining geospatial coordinates of sensor data
CN113865600A (en) * 2021-09-28 2021-12-31 北京三快在线科技有限公司 High-precision map construction method and device

Also Published As

Publication number Publication date
US11001196B1 (en) 2021-05-11

Similar Documents

Publication Publication Date Title
US20200001779A1 (en) Method for communicating intent of an autonomous vehicle
US12351105B1 (en) Methods for communicating state, intent, and context of an autonomous vehicle
US11462022B2 (en) Traffic signal analysis system
US11900812B2 (en) Vehicle control device
US11073832B1 (en) Collision mitigation static occupancy grid
CN109863513B (en) Neural network system for autonomous vehicle control
CN106873580B (en) Autonomous driving at intersections based on perception data
US11679780B2 (en) Methods and systems for monitoring vehicle motion with driver safety alerts
JP2024069391A (en) System and method for intersection management by autonomous vehicles
CN108604413B (en) Display device control method and display device
JPWO2019069425A1 (en) Vehicle control device, vehicle control method, and program
US20200231144A1 (en) Vehicle control device, vehicle control method, and program
JP7583649B2 (en) Map generation device and vehicle position recognition device
US20240286609A1 (en) Animal collision aware planning systems and methods for autonomous vehicles
JP2022123239A (en) Line recognition device
JP2022156078A (en) transportation system
US12447951B2 (en) Traffic object intent estimation
JP2022129718A (en) Map generator
JP7301897B2 (en) Map generator
JP2022151012A (en) Map generator
JP2022123988A (en) Line recognition device
US12441323B2 (en) Systems and methods for indicating a vehicle virtual connection based on user confusion
US20240317222A1 (en) Systems and methods for high precision lane-keeping by autonomous vehicles
JP7141479B2 (en) Map generator
JP7604267B2 (en) Map Generator

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION