
WO2025184269A1 - Systems and methods for an autonomous mobile robot providing haptic feedback - Google Patents

Systems and methods for an autonomous mobile robot providing haptic feedback

Info

Publication number
WO2025184269A1
Authority
WO
WIPO (PCT)
Prior art keywords
force
robot
vector
sensor
drive unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/017483
Other languages
English (en)
Inventor
Benjie Holson
Jamie Luong
Justine REMBISZ
Heather Klaubert
Leila Takayama
Anthony Jules
Rodney Brooks
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robust AI Inc
Original Assignee
Robust AI Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US18/655,609 external-priority patent/US20250304135A1/en
Priority claimed from US18/795,644 external-priority patent/US20250278094A1/en
Application filed by Robust AI Inc filed Critical Robust AI Inc
Publication of WO2025184269A1
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00Manipulators mounted on wheels or on carriages
    • B25J5/007Manipulators mounted on wheels or on carriages mounted on wheels
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/02Hand grip control means
    • B25J13/025Hand grip control means comprising haptic means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/085Force or torque sensors
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06Safety devices
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/22Command input arrangements
    • G05D1/221Remote-control arrangements
    • G05D1/222Remote-control arrangements operated by humans
    • G05D1/224Output arrangements on the remote controller, e.g. displays, haptics or speakers
    • G05D1/2242Haptics
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20Control system inputs
    • G05D1/22Command input arrangements
    • G05D1/228Command input arrangements located on-board unmanned vehicles
    • G05D1/2287Command input arrangements located on-board unmanned vehicles using an external force applied to the vehicle
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00Specific applications of the controlled vehicles
    • G05D2105/20Specific applications of the controlled vehicles for transportation
    • G05D2105/28Specific applications of the controlled vehicles for transportation of freight
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2107/00Specific environments of the controlled vehicles
    • G05D2107/70Industrial sites, e.g. warehouses or factories
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00Types of controlled vehicles
    • G05D2109/10Land vehicles

Definitions

  • This patent application relates generally to robotics, and more specifically to autonomous mobile robots.
  • Autonomous and semi-autonomous robots can be operated without user input, but in some situations user operation is desirable.
  • User operation of autonomous and semi-autonomous robots may be achieved through a control interface, such as a graphical user interface to allow a user to control the autonomous or semi-autonomous robot from the point of view of the robot.
  • Autonomous and semi-autonomous robots may operate in environments with other human workers. Such workers may not be specifically tasked with operating the autonomous and semi-autonomous robots and may not be equipped with specific devices to do so, but may still find themselves in situations where they need to operate such robots. Accordingly, improved mechanisms and techniques for autonomous robots capable of being controlled by humans are desired.
  • Techniques and mechanisms described herein provide systems, methods, and non-transitory computer readable media having instructions stored thereon for an autonomous mobile robot.
  • the techniques described herein relate to a robot including: a force sensor configured to transmit an input message characterizing a physical force exerted on the force sensor in a first direction; a processor configured to: determine a physical force input vector based on the input message, the physical force input vector quantifying the physical force in two or more dimensions, and determine a force output vector aggregating the physical force input vector and a second force input vector, the force output vector quantifying a force to apply to move the robot in a second direction, wherein determining the force output vector includes applying a force multiplier multiplying the physical force input vector; and an omnidirectional mechanical drive unit configured to receive an indication of the force output vector and move the robot in the second direction based on the force output vector.
  • the techniques described herein relate to a robot, wherein the second force input vector includes a friction force input vector characterizing a virtual frictional force exerted in a third dimension opposing the first direction.
  • the techniques described herein relate to a robot, wherein the second force input vector includes a functional force input vector characterizing a virtual functional force to be exerted on the robot in a third direction based on one or more instructions.
  • the techniques described herein relate to a robot, wherein the second force input vector includes a virtual force exerted in a third direction opposing an obstacle located in a physical environment in which the robot is situated.
  • the techniques described herein relate to a robot, wherein the second force input vector includes a virtual force exerted in a third direction along a path determined based on an instruction received from a remote computing system.
  • the techniques described herein relate to a robot, wherein the second force input vector includes a virtual force exerted in a third direction toward a location in a physical environment in which the robot is situated.
  • the techniques described herein relate to a robot, wherein determining the force output vector includes summing the second force input vector and a dot product of the physical force input vector and the force multiplier.
  • the techniques described herein relate to a robot, wherein transmitting the indication of the force output vector to the omnidirectional mechanical drive unit includes determining a velocity output vector based on an existing velocity vector for the robot, the velocity output vector identifying a target velocity to be achieved by the omnidirectional mechanical drive unit.
  • the techniques described herein relate to a robot, wherein determining the force output vector includes dividing a sum of the second force input vector and the physical force input vector by the force multiplier, wherein the force multiplier indicates a virtual mass for the robot.
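  • As an illustrative sketch only (not taken from the application itself), the following Python fragment shows one plausible way to combine a physical force input vector, a second force input vector, and a force multiplier into a force output vector and a target velocity; the per-axis multiplier, the virtual mass value, and the fixed timestep are assumptions made for illustration.

        import numpy as np

        def force_output_vector(physical_force, second_force, force_multiplier):
            # Scale the operator's physical input by a per-axis multiplier (amplification),
            # then aggregate it with the second force input vector (e.g. friction or haptics).
            return (np.asarray(second_force, dtype=float)
                    + np.asarray(force_multiplier, dtype=float) * np.asarray(physical_force, dtype=float))

        def velocity_output_vector(force_output, current_velocity, virtual_mass=25.0, dt=0.02):
            # Treat the aggregated force as acting on a virtual mass to obtain a target
            # velocity for the omnidirectional drive unit (simple explicit integration).
            acceleration = np.asarray(force_output, dtype=float) / virtual_mass
            return np.asarray(current_velocity, dtype=float) + acceleration * dt

        # Example: a 10 N push in +x, a 2 N virtual friction force opposing it, and a
        # 3x amplification of the operator's input in each planar dimension.
        f_out = force_output_vector([10.0, 0.0], [-2.0, 0.0], [3.0, 3.0])
        v_out = velocity_output_vector(f_out, [0.0, 0.0])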
  • the techniques described herein relate to a robot, wherein the force sensor is communicably coupled with a handlebar attached to the robot and oriented in a vertical direction, wherein the force sensor includes a Hall effect sensor configured to detect a change in a magnetic field.
  • the techniques described herein relate to a robot, wherein the two or more dimensions include two translational dimensions along a planar surface orthogonal to gravity.
  • the techniques described herein relate to a robot, wherein determining the force output vector involves: determining a velocity vector including one or more velocity values each characterizing a current velocity of the robot in a respective dimension, based on the velocity vector, identifying a designated dimension in which the current velocity of the robot is directionally opposed to the physical force input vector, and increasing the force multiplier in the designated dimension.
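  • A minimal sketch of the adjustment just described, using hypothetical base and boosted gain values: in any dimension where the robot's current velocity is directionally opposed to the operator's input, the force multiplier is increased so the drive unit assists the deceleration.

        import numpy as np

        def adjust_force_multiplier(velocity, physical_force, base_gain=2.0, boost_gain=4.0):
            # A dimension is "opposed" when the current velocity and the operator's input
            # have different signs there, i.e. the operator is slowing or reversing the robot.
            velocity = np.asarray(velocity, dtype=float)
            physical_force = np.asarray(physical_force, dtype=float)
            opposed = (velocity * physical_force) < 0.0
            return np.where(opposed, boost_gain, base_gain)

        # Example: the robot moves forward in +x while the operator pulls back in -x.
        gains = adjust_force_multiplier([0.8, 0.0], [-5.0, 0.0])   # -> [4.0, 2.0]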
  • the techniques described herein relate to a robot, wherein the omnidirectional mechanical drive unit is backdrivable.
  • the techniques described herein relate to a robot, wherein the second force input vector is a haptic force input vector to provide haptic navigational feedback via the omnidirectional mechanical drive unit.
  • the techniques described herein relate to a robot, wherein the second force input vector is determined upon detecting a triggering condition.
  • the techniques described herein relate to a robot, wherein the triggering condition includes proximity to a virtual rail, and wherein the haptic force input vector causes the robot to align with the virtual rail.
  • the techniques described herein relate to a robot, wherein the triggering condition includes proximity to a virtual rail, and wherein the haptic force input vector causes the robot to suddenly break free from the virtual rail.
  • the techniques described herein relate to a method for controlling a robot including an omnidirectional mechanical drive unit, the method including: receiving from a force sensor an input message characterizing a physical force exerted on the force sensor in a first direction; determining via a processor a physical force input vector based on the input message and quantifying the physical force in two or more dimensions; determining via the processor a force output vector aggregating the physical force input vector and a second force input vector, the force output vector quantifying a force to apply to move the robot in a second direction, wherein determining the force output vector includes applying a force multiplier multiplying the physical force input vector; transmitting an indication of the force output vector to the omnidirectional mechanical drive unit via a communication interface; and moving the robot via the omnidirectional mechanical drive unit in the second direction based on the force output vector.
  • the techniques described herein relate to a robot including an omnidirectional mechanical drive unit, the robot including: means for receiving from a force sensor an input message characterizing a physical force exerted on the force sensor in a first direction; means for determining via a processor a physical force input vector based on the input message and quantifying the physical force in two or more dimensions; means for determining via the processor a force output vector aggregating the physical force input vector and a second force input vector, the force output vector quantifying a force to apply to move the robot in a second direction, wherein determining the force output vector includes applying a force multiplier multiplying the physical force input vector; means for transmitting an indication of the force output vector to the omnidirectional mechanical drive unit via a communication interface; and means for moving the robot via the omnidirectional mechanical drive unit in the second direction based on the force output vector.
  • an input message characterizing a physical force exerted on a force sensor in a first direction may be received from the force sensor.
  • a physical force input vector may be determined based on the input message.
  • the physical force input vector may quantify the physical force in two or more dimensions.
  • a haptic force input vector may be determined to provide haptic navigational feedback via the omnidirectional mechanical drive unit.
  • a force output vector aggregating the physical force input vector and the haptic force input vector may be determined.
  • the force output vector may quantify a force to apply to move the robot in a second direction.
  • An indication of the force output vector may be transmitted to the omnidirectional mechanical drive unit via a communication interface.
  • the robot may be moved within a physical environment via the omnidirectional mechanical drive unit in the second direction based on the force output vector.
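  • The sequence outlined in the preceding items can be pictured as a single control-loop iteration. The Python sketch below is illustrative only; the message field names, the multiplier, the virtual mass, and the timestep are assumptions rather than details from the application.

        import numpy as np

        def control_step(input_message, haptic_force, state, multiplier=3.0, dt=0.02):
            # Turn the force sensor's input message into a physical force input vector,
            # aggregate it with the haptic force input vector, and derive the target
            # velocity indicated to the omnidirectional mechanical drive unit.
            physical = np.array([input_message["fx"], input_message["fy"]], dtype=float)
            force_out = np.asarray(haptic_force, dtype=float) + multiplier * physical
            state["velocity"] = state["velocity"] + (force_out / state["virtual_mass"]) * dt
            return {"target_velocity": state["velocity"].tolist()}

        # Example use with assumed message fields and state contents.
        state = {"velocity": np.zeros(2), "virtual_mass": 25.0}
        command = control_step({"fx": 6.0, "fy": 0.0}, haptic_force=[0.0, -1.5], state=state)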
  • the triggering condition may include proximity to a virtual rail.
  • the haptic force input vector may cause the robot to align with the virtual rail.
  • the triggering condition includes proximity to a virtual rail.
  • the haptic force input vector causes the robot to suddenly break free from the virtual rail.
  • a plurality of haptic force input vectors including the haptic force input vector may be determined.
  • the plurality of haptic force input vectors collectively generate vibration via the omnidirectional mechanical drive unit. The vibration may occur in a direction parallel or orthogonal to the physical force input vector.
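  • One way to picture such a plurality of haptic force input vectors is as a rapidly alternating force summed into the output on each control cycle; the amplitude and frequency below are illustrative assumptions, not values from the application.

        import numpy as np

        def vibration_haptic_vector(physical_force, t, amplitude=3.0, freq_hz=8.0, orthogonal=False):
            # A square-wave force aligned with (or rotated 90 degrees from) the operator's
            # input; fed to the drive unit over successive cycles it is felt as vibration.
            direction = np.asarray(physical_force, dtype=float)
            norm = np.linalg.norm(direction)
            if norm == 0.0:
                return np.zeros_like(direction)
            direction = direction / norm
            if orthogonal:
                direction = np.array([-direction[1], direction[0]])
            return amplitude * np.sign(np.sin(2.0 * np.pi * freq_hz * t)) * direction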
  • the triggering condition includes proximity to a virtual corridor wall associated with a virtual corridor.
  • the haptic force input vector may be directed away from the virtual corridor wall.
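  • A simple sketch of such a corridor-wall force, assuming a planar environment, a unit normal pointing from the wall into the corridor, and illustrative trigger-distance and force values:

        import numpy as np

        def corridor_wall_haptic(position, wall_point, wall_normal, trigger_distance=0.5, max_force=15.0):
            # Signed distance from the virtual corridor wall along its inward unit normal.
            distance = float(np.dot(np.asarray(position, dtype=float) - np.asarray(wall_point, dtype=float),
                                    np.asarray(wall_normal, dtype=float)))
            if distance >= trigger_distance:
                return np.zeros(2)                 # triggering condition not met
            # Haptic force directed away from the wall, growing as the robot gets closer.
            magnitude = max_force * (1.0 - max(distance, 0.0) / trigger_distance)
            return magnitude * np.asarray(wall_normal, dtype=float)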
  • light may be projected from the robot onto a surface to indicate a virtual navigational affordance associated with the triggering condition.
  • a virtual navigational affordance associated with the triggering condition may be detected based on sensor data determined by scanning the physical environment.
  • the physical environment may be a warehouse, and detecting the virtual navigational affordance may involve identifying one or more insignia located on a floor of the warehouse.
  • the triggering condition may be associated with a task included in a workflow being performed by the robot based on an instruction received from a fleet controller.
  • the force output vector may be determined based at least in part on a friction force input vector characterizing a virtual frictional force exerted in a third dimension opposing the first direction.
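  • A hedged sketch of one possible virtual friction model consistent with this description: the virtual friction opposes the direction of the operator's input and is capped at the input magnitude, so light incidental contact is cancelled rather than amplified. The friction value is an assumed constant.

        import numpy as np

        def virtual_friction_vector(physical_force, friction_force=2.0):
            # Oppose the operator's input; never exceed it, so the friction cannot push
            # the robot backward against the person.
            f = np.asarray(physical_force, dtype=float)
            magnitude = float(np.linalg.norm(f))
            if magnitude == 0.0:
                return np.zeros_like(f)
            return -min(friction_force, magnitude) * (f / magnitude)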
  • the force output vector may be determined based at least in part on a functional force input vector characterizing a virtual functional force to be exerted on the robot in a third direction based on one or more instructions.
  • the force output vector may be determined based at least in part on an obstacle avoidance force input vector exerted in a third direction opposing an obstacle located in the physical environment.
  • the obstacle may be detected based on sensor data received from one or more sensors located at the robot.
  • the omnidirectional mechanical drive unit is backdrivable.
  • determining the force output vector may involve multiplying the physical force input vector by a force multiplier.
  • Figure 1 illustrates a perspective view of an autonomous robot, configured in accordance with one or more embodiments.
  • Figure 2 illustrates a perspective view of an autonomous robot with a horizontal handlebar, configured in accordance with one or more embodiments.
  • Figure 3 illustrates an additional configuration of an autonomous mobile robot, arranged in accordance with one or more embodiments.
  • Figure 4 illustrates a disassembled view of an autonomous mobile robot, configured in accordance with one or more embodiments.
  • Figure 5 illustrates a communication architecture diagram of the autonomous mobile robot, configured in accordance with one or more embodiments.
  • Figure 6 is a block diagram of a computing device, configured in accordance with one or more embodiments.
  • Figure 7 illustrates a diagram of a drive unit, configured in accordance with one or more embodiments.
  • Figure 8 illustrates a view of an autonomous mobile robot that includes a drive assembly containing two drive units, configured in accordance with one or more embodiments.
  • Figure 9A, Figure 9B, and Figure 9C illustrate views of an autonomous mobile robot that includes a drive assembly containing various numbers of drive units, configured in accordance with one or more embodiments.
  • Figure 10A, Figure 10B, and Figure 10C illustrate perspective views of portions of an autonomous mobile robot, configured in accordance with one or more embodiments.
  • Figure 11A and Figure 11B illustrate handlebars for guiding an autonomous mobile robot, configured in accordance with one or more embodiments.
  • Figure 12 illustrates a force sensor, configured in accordance with one or more embodiments.
  • Figure 13 illustrates a perspective view of a force sensing handlebar, configured in accordance with one or more embodiments.
  • Figure 14 illustrates a perspective view of portions of a force sensing handlebar, configured in accordance with one or more embodiments.
  • Figure 15 illustrates a top view of a force sensing handlebar for an autonomous robot in a first manipulated position, configured in accordance with one or more embodiments.
  • Figure 16 illustrates a top view of a force sensing handlebar for an autonomous robot in a second manipulated position, configured in accordance with one or more embodiments.
  • Figure 17 illustrates a perspective view of an autonomous robot with a force sensing base, configured in accordance with one or more embodiments.
  • Figure 18 illustrates a perspective view of portions of a force sensing base, configured in accordance with one or more embodiments.
  • Figure 19 illustrates a method of controlling an autonomous mobile robot, performed in accordance with one or more embodiments.
  • Figure 20 illustrates a method for executing a robot control loop, performed in accordance with one or more embodiments.
  • Figure 21 illustrates an architecture diagram for a control portion of a mobile robot, configured in accordance with one or more embodiments.
  • Figure 22 illustrates a method for determining a robot control output instruction, performed in accordance with one or more embodiments.
  • Figure 23 illustrates a method for autonomous motion control of an autonomous mobile robot, performed in accordance with one or more embodiments.
  • Figure 24 illustrates a diagram showing sensor placement on an autonomous mobile robot configured in accordance with one or more embodiments.
  • Figure 25A illustrates a view of an attachment point between a lighting element and a communication channel in an embodiment including an angled shelf configuration.
  • Figure 25B illustrates a view of an attachment point between a lighting element and a communication channel in an embodiment including a flat shelf configuration.
  • Figure 26 illustrates a method for controlling one or more lights associated with an autonomous mobile robot, performed in accordance with one or more embodiments.
  • Figure 27 illustrates a block diagram representation of a shelf that may form part of an autonomous mobile robot, configured in accordance with one or more embodiments.
  • Figure 28A, Figure 28B, and Figure 28C illustrate diagrams of situations in which haptic feedback may be provided via a mechanical drive unit, generated in accordance with one or more embodiments.
  • Figure 29A, Figure 29B, and Figure 29C illustrate diagrams of situations in which haptic feedback may be provided via a mechanical drive unit, generated in accordance with one or more embodiments.
  • Figure 30A and Figure 30B illustrate diagrams of situations in which haptic feedback may be provided via a mechanical drive unit, generated in accordance with one or more embodiments.
  • Figure 31 illustrates a method of determining a virtual friction vector, performed in accordance with one or more embodiments.
  • Figure 32 illustrates a method of calibrating a physical friction vector for an autonomous mobile robot, performed in accordance with one or more embodiments.
  • the mechanical drive unit may operate based on user input, which may be provided by a human operator via a force sensor included in a handlebar unit.
  • the force sensor may be used to detect a translational and rotational force provided as input, and then determine a direction of force to apply based on the input.
  • haptic feedback is provided via a dedicated device such as a vibration mechanism.
  • techniques and mechanisms described herein facilitate the integration of haptic feedback into the mechanical drive unit itself, which may be one or more of holonomic, omnidirectional, and backdriveable.
  • the autonomous mobile robot may include a haptic force input vector along with other input vectors when determining an output force vector for the mechanical drive unit. Such an approach may provide for a more natural and intuitive operation of the autonomous mobile robot by integrating the haptic feedback directly into the interaction between the control mechanism and the response mechanism.
  • the haptic force input vector may be implemented as a constant value that operates to encourage a human operator to move toward or away from a particular direction, for instance to follow a path or to avoid an obstacle.
  • the haptic force input vector may be implemented as a sharply changing back-and-forth vector that provides vibrational feedback via the mechanical drive unit.
  • the haptic force input vector may change based on user input. For example, a haptic force input vector implemented to assist a user in evading an obstacle may increase in magnitude as the autonomous mobile robot approaches the obstacle. As another example, a haptic force input vector implemented to assist a user in adhering to a path may be eliminated completely when user input is detected indicating that the user has suddenly jerked the autonomous mobile robot off of the path, reflecting the user's desire to navigate in a different direction.
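  • The path-adherence example can be sketched as follows; the gain, the release threshold, and the use of the lateral offset from the path are assumptions made only to illustrate the behavior.

        import numpy as np

        def path_adherence_haptic(offset_from_path, physical_force, gain=10.0, release_force=20.0):
            # Pull the robot back toward the path in proportion to its lateral offset, but
            # drop the haptic force entirely when the operator pushes hard away from the
            # path, treating that as a deliberate intent to leave it.
            offset = np.asarray(offset_from_path, dtype=float)
            distance = float(np.linalg.norm(offset))
            if distance == 0.0:
                return np.zeros_like(offset)
            away = offset / distance
            if float(np.dot(np.asarray(physical_force, dtype=float), away)) > release_force:
                return np.zeros_like(offset)       # the operator has "broken free" of the path
            return -gain * offset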
  • the autonomous mobile robot may be configured as a cart capable of transporting one or more objects.
  • the robot may operate in one of various modes.
  • the robot may operate without physical human intervention, for instance autonomously moving from one location to another and/or performing various types of tasks.
  • in a robot-guided mode, the robot may direct a human to perform a task, such as guiding a human from one location to another.
  • in a person-guided mode, the robot may operate in a manner responsive to human guidance.
  • the robot may be configured to seamlessly switch between such modes, for instance with the aid of computer vision, user interaction, and/or artificial intelligence.
  • an autonomous mobile robot may be configured for operation in a warehouse environment.
  • the robot may be equipped and configured to perform and support warehouse operations such as item picking, item transport, and item replenishment workflows.
  • the robot may be equipped to perform automated item pickup and/or dropoff, for instance via one or more arms or conveyer belts.
  • the robot may be equipped to perform automated charging and/or battery swapping.
  • the robot may be equipped to autonomously navigate to a particular location, follow a user, respond to user instructions, amplify a force exerted on the robot by a user, and/or perform other types of operations.
  • the robot may be adapted to site-specific environmental conditions and/or processes.
  • the robot may include a drive assembly to provide motive power.
  • the drive assembly may include one or more drive units, with each drive unit orientable in an independent manner to that of the other drive units.
  • Each drive unit may include a plurality of driven wheels that may be independently driven. Independent drive of each of the drive wheels of the drive assembly allows for the drive assembly to move the robot in a holonomic manner. That is, the robot may be driven without constraints in direction of motion.
  • the robot may include a force sensing assembly to allow for a user to manipulate the robot.
  • the force sensing assembly may include, for example, a handlebar.
  • the handlebar may be mounted in any orientation, such as in a horizontally or vertically mounted orientation.
  • the force sensing assembly may be a force sensing base or another mechanism configured to receive physical input from a user (e.g., a hand, arm, foot, or leg of a user).
  • a user may manipulate the force sensing assembly by, for example, providing force to a handlebar to move the handlebar from a neutral position.
  • the manipulation of the force sensing assembly may provide instructions to the robot and cause the drive assembly to move the robot in accordance with the instructions provided via the force sensing assembly.
  • Such commands may, for example, override autonomous or semi-autonomous operation of the robot.
  • Techniques and mechanisms described herein also provide for control of a robot, which may be one or more of omnidirectional, holonomic, backdrivable, and autonomous.
  • Input may be received from a force sensor identifying a force exerted on the force sensor. Based on this input, a physical input force vector quantifying a force exerted on the force sensor in two or more dimensions may be determined.
  • a force output vector may be determined by combining the physical input force vector with a second force input vector.
  • the force output vector may quantify a force to apply to move the robot in another direction.
  • the second force input vector may include, for instance, a frictional force and/or a functional force determined based on one or more operational objectives.
  • the force output vector may include a force multiplier multiplying the physical force exerted on the force sensor.
  • An indication of the force output vector may be sent to a mechanical drive unit at the robot and then used to direct the movement of the robot via the mechanical drive unit.
  • an autonomous mobile robot may support omnidirectional movement. That is, the autonomous mobile robot may be capable of movement in any direction.
  • an autonomous mobile robot may support holonomic movement. That is, the autonomous mobile robot may be capable of powered movement in any direction corresponding with a degree of freedom associated with the robot.
  • a conventional automobile is not holonomic because it has three motion degrees of freedom (i.e., x, y, and orientation) but only two controllable degrees of freedom (i.e., speed and steer angle).
  • a conventional train is holonomic because it has one controllable degree of freedom (i.e., speed) and one motion degree of freedom (i.e., position along the track).
  • an autonomous mobile robot may support omnidirectional and holonomic movement. That is, the autonomous mobile robot may be capable of powered movement and rotation in any direction from any position.
  • an autonomous mobile robot may be backdriveable. That is, the drive unit may operate so as to maintain a desired force of interaction at a level close to zero. For instance, when pressure is exerted on the robot, even in an area other than the handlebar, the drive unit may operate to move the robot in a direction consistent with the force so as to reduce the force of interaction. For example, if a person were to exert 10 Newtons of force on a fixed object, such as a wall, the wall would exert an equal and opposite force on the person due to the wall's immobility, causing the person to experience 10 Newtons of force in the opposite direction of the force.
  • the drive unit may cause the robot to move in the direction of the force at a speed such that the person would experience approximately 0 Newtons of force.
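  • A minimal sketch of this backdriving behavior, assuming the external force has already been estimated and using an assumed virtual mass and timestep:

        import numpy as np

        def backdrive_velocity(external_force, current_velocity, virtual_mass=25.0, dt=0.02):
            # Accelerate in the direction of the externally applied force so that the force
            # the person feels stays near zero: the robot yields rather than resists.
            v = np.asarray(current_velocity, dtype=float)
            f = np.asarray(external_force, dtype=float)
            return v + (f / virtual_mass) * dt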
  • the robot may be configured to control the drive unit to keep the level of force experienced by the operator below a designated threshold under normal operating conditions.
  • an autonomous mobile robot may synthesize various types of instructions and input to provide a seamless experience.
  • an autonomous mobile robot may support one or more of the following mobility input mechanisms.
  • First, an autonomous mobile robot may be backdriveable in the sense that the drive unit may operate to move in a direction to effectively cancel out force exerted on the robot from any direction.
  • Second, an autonomous mobile robot may be responsive to force exerted on a force sensor, for instance by instructing the drive unit so as to multiply the exerted force in the direction of movement.
  • an autonomous mobile robot may respond to the presence of a human, for instance based on touch sensor and/or optical sensor data.
  • a robot may cease autonomous movement and wait for more instructions when grasped or approached by a human.
  • an autonomous mobile robot may autonomously navigate along a nominal trajectory based on information determined at the autonomous mobile robot and/or information received from a remote computing device, such as a fleet controller.
  • an autonomous mobile robot may autonomously act in support of operational guidelines and objectives, for instance to avoid both static obstacles (such as walls) and dynamic obstacles (such as humans).
  • an autonomous mobile robot can be onboarded without first bringing it on-site for an initial survey. Such rapid deployment can significantly increase adoption speed.
  • autonomous mobile robots When using conventional techniques and mechanisms, industrial autonomous mobile robots are typically configured with expensive hardware that is customized to particular environments. In contrast, various embodiments described herein provide for autonomous mobile robots may be configured with standardized hardware and software that is easily and cheaply applicable and adaptable to a range of environments.
  • an autonomous mobile robot may thus perform and/or facilitate human-centric operations such as zone picking, human following, wave picking, a virtual conveyer belt, and user training. Such operations can increase human engagement and reduce the autonomous mobile robot's impact on foot traffic, for instance when its work is unrelated to people nearby.
  • When using conventional techniques, autonomous mobile robots and automated guided vehicles treat people and dynamic objects (e.g., forklifts) as static obstacles to be avoided.
  • various embodiments described herein provide for autonomous mobile robots that differentiate between persistent, temporary, and in-motion objects, interacting with them fluidly and efficiently.
  • When using conventional techniques, an autonomous mobile robot typically cannot visually distinguish between different individuals.
  • various embodiments described herein provide for autonomous mobile robots that can respond to requests from particular individuals and navigate around an environment in more fluid, less disruptive ways.
  • an autonomous mobile robot may be configured to follow a particular person around a warehouse environment upon request.
  • Figure 1 illustrates a perspective view of an autonomous robot 100, configured in accordance with one or more embodiments.
  • the autonomous robot 100 includes a drive assembly 102, a payload 108, and a force sensing assembly 110.
  • the drive assembly 102 may include one or more drive units 104 and one or more payload support elements 106.
  • Each of the one or more drive units 104 may include one or more powered wheels.
  • Each of the one or more drive units 104 may be configured to be operated, jointly or independently, to power autonomous robot 100 and provide movement to autonomous robot 100 in a backdrivable and holonomic manner.
  • the payload support element 106 may be one or more support features (e.g., castor wheels, sliding pads, and/or other structures that may provide stability while accommodating movement).
  • the payload support element 106 may be disposed within portions of drive assembly 102 and/or coupled to portions of the payload 108 to provide stability for autonomous robot 100.
  • the payload support element 106 may be disposed or coupled to any portion of the drive assembly 102 and/or the payload 108 to provide stability.
  • “coupled” may refer to direct or indirect (e.g., with intermediate elements) relationships between elements while “connected” may refer to direct (e.g., with no intermediate elements) relationships between elements.
  • the payload support element 106 may provide sufficient support for the payload 108 to allow for the one or more drive units 104 to be positioned in a manner to provide for predictable backdrivable and holonomic movement.
  • the payload support element 106 may provide for stability while the payload 108 (which may include, for example, a shelf) is loaded or unloaded and/or while the autonomous robot 100 is in motion.
  • the drive assembly 102 may be configured to couple to the payload 108 to move the payload 108.
  • the drive assembly 102 may couple to the payload 108 via any technique. For example, one or more openings on a body, such as one or more portions of payload 108, may be inserted into one or more openings disposed within the body of drive assembly 102.
  • one or more mechanical fasteners such as bolts, screws, and/or rivets may be employed.
  • permanent or semi-permanent techniques such as welding or adhesives may be used.
  • drive assembly 102 may be a module that may, in some embodiments, be coupled to any number of different versions of the payload 108.
  • the payload 108 may be a commercially available (e.g., off-the-shelf) utility body, such as a shelf. Alternatively, the payload 108 may be customized for use with the drive assembly 102.
  • the payload 108 may include any tool or assembly that may assist in operations.
  • the payload 108 may include one or more elements of a cart (which may include a mounted shelf), a mounted robot, a container box, and/or other such item. While description may be provided in the manner of autonomous carts and shelves, other embodiments of payload 108, such as assembly robots, are within the scope of the disclosure.
  • the force sensing assembly 110 may include, for example, a vertically oriented handle (e.g., a handle with a major axis that is within 10 degrees of vertical) coupled to the autonomous robot 100 and communicatively coupled to the drive assembly 102.
  • Other embodiments of the force sensing assembly 110 may include a handlebar oriented in another orientation (e.g., a horizontally oriented handle within 10 degrees of horizontal), a force sensing base (e.g., a base, such as the base of drive assembly 102, configured to receive input from a foot of a user) of autonomous robot 100, and/or other such mechanism or technique configured to receive directional input from a user.
  • Such input may, for example, allow for the distinguishing of different types of inputs, such as inputs that are intended to cause the autonomous robot 100 to translate in a certain direction as well as inputs that are intended to cause the autonomous robot 100 to rotate in a certain direction.
  • the robot may include one or more sensors for supporting whole body force sensing. Using a whole body force sensing approach, force exerted anywhere on the robot can be detected, even if not exerted on a handlebar connected with a force sensor.
  • the robot's drive unit may detect a force exerted on the robot by comparing the direction and magnitude of the robot's motion to the instruction sent to the drive assembly, thereby estimating a force exerted on the robot outside of the handlebar.
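  • As a rough illustration of this estimation (not a description of the actual implementation), the discrepancy between commanded and measured motion over a control cycle can be attributed to an external force, here using an assumed virtual mass and timestep:

        import numpy as np

        def estimate_external_force(commanded_velocity, measured_velocity, virtual_mass=25.0, dt=0.02):
            # Any difference between how the robot was commanded to move and how it actually
            # moved is attributed to a force applied somewhere on the robot's body.
            dv = np.asarray(measured_velocity, dtype=float) - np.asarray(commanded_velocity, dtype=float)
            return virtual_mass * dv / dt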
  • the robot may be configured to exert a force to support backdriveability in which the robot moves in a direction of a force exerted on the robot so as to negate the force felt by the robot.
  • the force sensing assembly 110 may be configured to provide operating instructions to the drive assembly 102. That is, a user may manipulate the force sensing assembly 110 and appropriate operating instructions may be determined (e.g., by a controller disposed within the force sensing assembly 110 and/or coupled to the force sensing assembly 110 and configured to receive signals from the force sensing assembly 110) for drive assembly 102. Such operating instructions may be communicated to the drive assembly 102.
  • the force sensing assembly 110 may be a force sensing handlebar assembly that is positioned between a human operator and the payload 108 to significantly reduce the effort involved in moving the payload 108 by operating drive assembly 102 via commands determined by manipulation of the force sensing assembly 110.
  • the force sensing assembly 110 may, thus, operate the drive assembly 102 to push, pull, and/or rotate the autonomous robot 100 and, thus, payload 108.
  • the force sensing assembly 110 may be positioned on various areas of the autonomous robot 100.
  • the force sensing assembly 110 may be positioned along the top of autonomous robot 100, along the base of the autonomous robot 100, or in a different location. Signals from manipulation of the force sensing assembly 110 may be communicated to the drive assembly 102 in a wired or wireless fashion.
  • vertical orientation of a force sensing handlebar may allow for ergonomic improvements for user interactions with the autonomous robot 100. For example, a human operator may instinctively grab and manipulate items with a vertically oriented hand (e.g., with the thumb of the hand located at the top). Additionally, vertical orientation allows for intuitive rotational control of the autonomous robot 100 as the rotational controls may mimic the wrist rotation of the user.
  • Figure 2 illustrates a perspective view of an autonomous robot 200 with a horizontal handlebar, configured in accordance with one or more embodiments.
  • the autonomous robot 200 that includes drive assembly 202 with one or more drive units 204, a payload support element 206, a payload 208, and a force sensing assembly 210.
  • the autonomous robot 200 is substantially similar to the autonomous robot 100 shown in Figure 1.
  • force sensing assembly 210 may include a horizontal handlebar.
  • the horizontal handlebar of Figure 2 may include sensors, as described herein, disposed at one or both ends (e.g., the horizontal ends) and/or other portions of the handlebar.
  • translational pushes on the horizontal handlebar may cause autonomous robot 200 to translate, but twisting of the horizontal handlebar (e.g., around a vertical axis such that, for example, one end of the handlebar may be moved "forward" while the other end may be moved “backward”) may cause rotation of autonomous robot 200.
  • FIG. 3 illustrates an additional configuration of an autonomous mobile robot 300, configured in accordance with one or more embodiments.
  • the autonomous mobile robot 300 includes a base unit and drive assembly 302, a chassis 304, a sensor unit 306, a user interaction unit 308, and a communication channel 310.
  • the base unit 302 may be configured with an omnidirectional and/or holonomic drive assembly.
  • the drive assembly may include elements such as one or more wheels, treads, motors, controllers, batteries, and/or other components.
  • the base unit and drive assembly 302 may include a force sensor such as a whole-robot force sensor. Alternatively, or additionally, such a force sensor may be included in a different portion of the autonomous mobile robot 300, such as within the user interaction unit 308.
  • the base unit 302 may also include a bump sensor configured to detect an impact.
  • the chassis 304 may include one or more rigid members providing physical support and connection between and among other components of the robots.
  • the chassis 304 may be composed of one or more rods, shelves, bins or other elements.
  • some or all of the chassis 304 may be composed of components from standardized shelving units or carts.
  • chassis may be composed in part of a commodity shelving unit.
  • the commodity shelving unit may be 48 inches long, 38 inches wide, and 73 inches tall.
  • the sensor unit 306 may include one or more sensors configured to sense the physical environment in which the autonomous mobile robot 300 is situated.
  • the sensor unit 306 may include four visible light cameras arranged with one on each of four sides of the robot providing 360-degree or near 360-degree visual coverage.
  • various numbers and types of sensors may be employed. Examples of such sensors may include, but are not limited to: visible light cameras, infrared cameras, time-of-flight depth sensors, structured light depth sensors, RADAR sensors, LIDAR sensors, microphones, and chemical sensors.
  • the sensor unit 306 may include other elements such as one or more autonomous mobile robot controllers or computing units, one or more communication interfaces for communicating with other computing devices, and/or one or more digital display screens for displaying information.
  • the user interaction unit 308 may include one or more elements for facilitating user interaction.
  • the user interaction unit 308 may include a display (e.g., a touch-screen display) for presenting information and/or receiving user input.
  • the user interaction unit 308 may include a force-sensitive handlebar configured to detect force exerted on the handlebar. The force detected may include degree, direction, and/or rotational elements.
  • the user interaction unit 308 may include a barcode scanner or other sensor.
  • the communication channel 310 may include one or more cables and/or busses for transmitting power, sensor data, instructions, and/or other electronic signals between different components of the robot.
  • the communication channel 310 may include routing for accessory cables, lighting, put-to-light taps, pick-from-light taps, and/or other components.
  • Figure 3 illustrates only one example of a configuration of an autonomous mobile robot.
  • an autonomous mobile robot may be configured in a different manner in accordance with one or more embodiments.
  • an autonomous mobile robot may include more than one handlebar.
  • a handlebar may be configured in a different orientation, such as a vertical orientation.
  • a handlebar may be integrated into the chassis.
  • one or more elements of the sensor unit may be distributed elsewhere on the chassis 304 or base unit 302.
  • the chassis may be arranged with different configurations of shelving or other components.
  • Figure 4 illustrates a disassembled view of an autonomous mobile robot, configured in accordance with one or more embodiments.
  • the autonomous mobile robot may be separated into components for easy assembly, including a charging dock 402, a base unit and drive assembly 404, a payload 406, a head unit 408, a light bar 410, and a communication channel 412.
  • the components may be shipped in a disassembled state and assembled on site.
  • a mobile robot may dock with the charging dock 402 to charge a battery included within the base assembly 404.
  • FIG. 5 illustrates a communication architecture diagram 500 of the autonomous mobile robot, configured in accordance with one or more embodiments.
  • the communication architecture diagram 500 is conceptually divided into regions.
  • a sensor unit region 502 corresponds with the sensor unit 306.
  • a user interaction region 504 corresponds with the user interaction unit 308.
  • a base unit region 506 corresponds with the base unit and drive assembly 302.
  • a drive unit region 508 corresponds with one or more drive units within the base unit and drive assembly 302.
  • the sensor unit region 502 may include a main processing unit 510, a communication interface 512, one or more sensors 514, and/or one or more marquees 516.
  • the main processing unit 510 may be a computing device such as an AGX Orin provided by Nvidia.
  • the communication interface 512 may include a hardware radio or other device facilitating communication using a communication protocol such as WiFi, Bluetooth, and/or cellular.
  • the sensors 514 may transmit sensor data to the main processing unit.
  • the main processing unit 510 may be configured to process and/or instruct the sensors 514. For instance, the main processing unit 510 may be configured to determine a model of a physical environment based on camera data from one or more visible light cameras included in the sensors 514.
  • the base unit region 506 includes a main board 524, which includes one or more processors and/or other components.
  • the main board 524 facilitates communication between and among other components.
  • the base unit region 506 also includes one or more force sensors 526 and a power distribution board 528.
  • the power distribution board 528 communicates with one or more battery systems 530, a power dock interface 532, an on button 534, and an electronic stop interface 536.
  • the user interaction region 504 may include one or more end points for interacting with user interface devices such as one or more touch sensors 518, lighting elements 516, touch screens 520, barcode scanners 522, and/or other such devices. Such devices may communicate with the main processing unit 510 and/or the main board 524.
  • user interface devices such as one or more touch sensors 518, lighting elements 516, touch screens 520, barcode scanners 522, and/or other such devices.
  • Such devices may communicate with the main processing unit 510 and/or the main board 524.
  • the drive unit region 508 may communicate with motor driver boards 538 and 540 corresponding to different drive units within the autonomous mobile robot, which may have one, two, or more drive units. Each drive unit may correspond to one or more wheels, treads, or other mechanisms for locomotion.
  • the motor driver boards may communicate with the encoders 542 and 544 and one or more motors 546 and 548.
  • the encoders 542 and 544 may be absolute encoders.
  • the motors 546 and 548 may be brushless DC motors.
  • the communication architecture diagram 500 is one example of a possible configuration of components within the autonomous mobile robot 100, provided for the purpose of illustration. According to various embodiments, various arrangements and combinations of components may be employed in a manner consistent with techniques and mechanisms described herein.
  • FIG. 6 is a block diagram of a computing device, configured in accordance with one or more embodiments.
  • a system 600 suitable for implementing embodiments described herein includes a processor 601, a memory module 603, a storage device 605, an interface 611, and a bus 615 (e.g., a PCI bus or other interconnection fabric).
  • System 600 may operate as a variety of devices, such as an autonomous mobile robot, a remote server configured as a fleet manager, or any other device or service described herein. Although a particular configuration is described, a variety of alternative configurations are possible.
  • the processor 601 may perform operations such as those described herein with respect to the various devices and methods.
  • Instructions for performing such operations may be embodied in the memory 603, on one or more non-transitory computer readable media, or on some other storage device.
  • Various specially configured devices can also be used in place of or in addition to the processor 601.
  • the interface 611 may be configured to send and receive data packets over a network.
  • a computer system or computing device may include or communicate with a monitor, printer, or other suitable display for providing any of the results mentioned herein to a user.
  • Any of the disclosed implementations may be embodied in various types of hardware, software, firmware, computer readable media, and combinations thereof.
  • some techniques disclosed herein may be implemented, at least in part, by non-transitory computer-readable media that include program instructions, state information, etc., for configuring a computing system to perform various services and operations described herein.
  • Examples of program instructions include both machine code, such as produced by a compiler, and higher-level code that may be executed via an interpreter. Instructions may be embodied in any suitable language such as, for example, Java, Python, C++, C, HTML, any other markup language, JavaScript, ActiveX, VBScript, or Perl.
  • non-transitory computer-readable media include, but are not limited to: magnetic media such as hard disks and magnetic tape; optical media such as compact disks (CD) and digital versatile disks (DVD); magneto-optical media; and other hardware devices such as read-only memory ("ROM") devices, random-access memory ("RAM") devices, and flash memory devices.
  • FIG. 7 illustrates a diagram of a drive unit 700, configured in accordance with one or more embodiments.
  • Figure 8 illustrates a different view of the autonomous mobile robot that includes two drive units 700.
  • the drive unit 700 includes a turntable unit 702 that turns about an axis 704.
  • the drive unit 700 also includes one or more wheels 706 and one or more motors.
  • the wheels 706 are offset from the axis 704 and independently controllable via the motor.
  • the motor may power one or more of the wheels using power supplied by a battery, which may be located inside of the drive unit or outside of the drive unit and connected with the motor, for instance via a slip ring.
  • the drive unit 700 may include one or more unpowered supports, such as the unpowered freely mobile caster wheel 710, for instance to provide additional stability.
  • each drive unit rotates freely around an axis 704. That is, the rotation around the axis 704 is unpowered.
  • the autonomous mobile robot can be caused to move in any direction or rotated by applying power from the motors to the wheels 706.
  • an autonomous mobile robot is further supported by one or more support elements 802.
  • a support element may be a freely spinning and unpowered caster wheel, a slider, or some other component.
  • the support elements may provide additional stability to the autonomous mobile robot.
  • an autonomous mobile robot may be equipped with multiple support elements 802.
  • an autonomous mobile robot may be equipped with four freely spinning and unpowered caster wheels located at approximately the four corners of the autonomous mobile robot.
  • an autonomous mobile robot may be equipped with one or more drive units.
  • Figure 9A illustrates a configuration that includes a single drive unit
  • Figure 9B illustrates a configuration that includes a double drive unit
  • Figure 9C illustrates a configuration that includes a triple drive unit.
  • including more than one drive unit may improve performance on one or more dimensions. For instance, moving from a single drive unit to a double drive unit may increase backdrivability, decrease torque requirements, decrease the number of unique parts, and decrease the diameter needed for each of the drive units.
  • an autonomous mobile robot may be equipped with one or more drive units of a different type from that shown, such as one or more drive units that employ one or more Mecanum wheels.
  • Figure 10A, Figure 10B, and Figure 10C illustrate perspective views of portions of an autonomous mobile robot.
  • a base unit 102 that includes a drive assembly may be removable from the autonomous mobile robot.
  • Figure 11A and Figure 11B illustrate handlebars for guiding the autonomous mobile robot 100, configured in accordance with one or more embodiments. Depending on the configuration, handlebars may be arranged in any of various ways.
  • a handlebar 1102 may be integrated with the chassis, as shown in Figure 11A.
  • a handlebar 1104 may be set off from the chassis, as shown in Figure 11B.
  • an autonomous mobile robot may be equipped with a single handlebar.
  • an autonomous mobile robot may be equipped with more than one handlebar, as shown in Figure 11A and Figure 11B.
  • the handlebars may be used to detect and amplify force applied to the robot. For example, torque as the handlebar is twisted may be detected and used to instruct the drive unit to rotate the robot around the axis of the handlebar. As another example, translational force as the handlebar is pushed may be detected and used to instruct the drive unit to move the robot in the direction of the force, effectively amplifying the force.
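  • A toy sketch of this amplification, with assumed gain values; the mapping from handlebar readings to a drive command is shown only to make the idea concrete.

        import numpy as np

        def handlebar_to_drive_command(push_force_xy, twist_torque, force_gain=3.0, torque_gain=2.0):
            # Translational pushes on the handlebar are amplified into a translational drive
            # force; twisting is amplified into a rotational drive torque about the
            # handlebar's axis.
            translation = force_gain * np.asarray(push_force_xy, dtype=float)
            rotation = torque_gain * float(twist_torque)
            return translation, rotation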
  • Figure 12 illustrates a force sensor 1200, configured in accordance with one or more embodiments.
  • the force sensor 1200 may be used to detect force exerted on one or more handlebars, for instance a handlebar configured as shown in Figure 11A or Figure 11B.
  • the force sensor 1200 includes a hall effect sensor 1202, a spring gasket 1204, and one or more magnets 1206.
  • a bar passes through the spring gasket 1204.
  • the spring gasket 1204 causes the bar to return to its original central position.
  • the one or more magnets 1206 may be arranged so as to generate a magnetic field detected by the hall effect sensor 1202, which may detect disruptions to the magnetic field corresponding with force exerted in one, two, three, or four dimensions.
  • the hall effect sensor 1202 may detect disruptions to the magnetic field corresponding with force exerted in the x-axis, the y-axis, the z-axis, and/or a rotational force.
  • the hall effect sensor 1202 may translate the detected disruptions into force sensor data.
  • the force sensor data may identify a direction of the force in one, two, or three translational dimensions and/or a fourth rotational dimension.
  • the force sensor data may identify a magnitude corresponding with a translational and/or rotational force.
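  • A small, illustrative decoder for such force sensor data; the message field names and units are assumptions, not details taken from the application.

        import math

        def decode_force_message(raw):
            # Interpret per-axis readings (for example, values derived from magnetic-field
            # disruptions measured by a Hall effect sensor) as a planar direction and
            # magnitude plus an optional rotational component.
            fx, fy = float(raw["fx"]), float(raw["fy"])
            torque = float(raw.get("torque", 0.0))
            magnitude = math.hypot(fx, fy)
            direction = math.atan2(fy, fx) if magnitude > 0.0 else 0.0
            return {"magnitude": magnitude, "direction_rad": direction, "torque": torque}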
  • an autonomous mobile robot may be equipped with any of various kinds of force sensors.
  • force sensors may include, but are not limited to: Hall effect sensors, optical sensors, capacitive touch sensors, button switch sensors, break beam sensors, force sensitive resistors, and force sensitive switches.
  • an autonomous mobile robot may be equipped with various numbers and types of force sensors.
  • an autonomous mobile robot 100 may be equipped with force sensors located at some or all of a set of vertical poles included in the chassis and/or at some or all of handlebars coupled with the autonomous mobile robot.
  • a force sensor may be located in any of various locations on an autonomous mobile robot.
  • a force sensor may be located at the handlebar itself.
  • a force sensor may be located at the robot chassis, for instance along a vertical pole included in the chassis.
  • Figure 13 illustrates a perspective view of a force sensing handle assembly 1310, configured in accordance with one or more embodiments.
  • the handle assembly 1310 includes a handlebar 1312 and a housing 1328.
  • the handlebar 1312 may be a vertically or horizontally oriented handlebar configured to allow a user to grasp the handlebar to provide operating instructions to the autonomous robot.
  • the housing 1328 may be configured to interface (e.g., mount) to one or more other portions of the autonomous robot, such as to the payload 108. In certain such embodiments, the housing 1328 may be configured to couple to various different portions and/or versions of the autonomous robot.
  • the handle assembly 1410 includes a handlebar 1412.
  • the handlebar 1412 includes a first end 1432A and a second end 1432B on opposite distal portions.
  • the ends of the handlebar 1412 may be coupled to various fixtures.
  • the first end 1432A of the handlebar 1412 may be coupled to the fixture 1414A while the second end 1432B of the handlebar 1412 may be coupled to the fixture 1414B.
  • the handlebar 1412 may be coupled to the fixtures 1414A and/or 1414B via compliant material 1430A and/or 1430B.
  • the compliant material 1430B is not entirely visible in Figure 14, but an ordinal indicator for 1430B indicates where the compliant material 1430B is located.
  • the compliant material 1430B is coupled to the fixture 1414B in a similar manner to the manner that the compliant material 1430A is coupled to the fixture 1414A.
  • the compliant material 1430A and/or 1430B may be any form of compliant structure or material, such as a spring or bushing made from an elastomer, rubber, metal, and/or other material, that allows the position of the handlebar 1412 to change relative to the fixtures 1414A and/or 1414B in response to force applied to the handlebar 1412 by a human operator.
  • the compliant materials 1430A and/or 1430B may be coupled via casting, friction fit, fasteners, adhesives, and/or another such technique that may allow for the joining of two items (e.g., two items of different materials).
  • the fixtures 1414A and/or 1414B may be directly connected to the autonomous robot (e.g., via adhesives, welding, or other permanent techniques, and/or via fasteners or other removable techniques).
  • the compliant material 1430A and/or 1430B may, thus, allow for the handlebar 1412 to translate and/or rotate relative to the fixtures 1414A and/or 1414B in response to force applied by the user.
  • the fixtures 1414A and/or 1414B in combination with the compliant material 1430A and/or 1430B may be configured to hold the handlebar 1412 in a fixed position (e.g., neutral position) when no force is applied to handlebar 1412.
  • the handle assembly 1410 includes a first sensor 1446A and a second sensor 1446B.
  • the first sensor 1446A includes a first sensor first portion 1416A and a first sensor second portion 1418A.
  • the second sensor 1446B includes a second sensor first portion 1416B and a second sensor second portion 1418B.
  • the first sensor first portion 1416A and the second sensor first portion 1416B may be coupled to handlebar 1412.
  • the first sensor first portion 1416A and the second sensor first portion 1416B may be coupled proximate to the opposite distal ends (e.g., the first end 1432A and the second end 1432B) of the handlebar 1412.
  • first sensor first portion 1416A and the second sensor first portion 1416B may be coupled to another portion of handlebar 1412.
  • first sensor second portion 1418A and the second sensor second portion 1418B may be coupled to portions of handle assembly 1410 and/or autonomous robot that handlebar 1412 may be configured to move relative to. That is, the first sensor second portion 1418A and the second sensor second portion 1418B may be coupled to, for example, the housing 1328, the fixtures 1414A and 1414B, respectively, and/or another portion of autonomous robot.
  • first sensor second portion 1418A and the second sensor second portion 1418B may be held in a "fixed" position (e.g., fixed relative to another portion of autonomous robot 100 such as payload 108) so that movement of the handlebar 1412 may cause the first sensor first portion 1416A and the second sensor first portion 1416B to move relative to the first sensor second portion 1418A and the second sensor second portion 1418B.
  • relative movement of the first sensor first portion 1416A to the first sensor second portion 1418A and the second sensor first portion 1416B to the second sensor second portion 1418B may allow for a determination as to whether a human operator is pushing on or rotating the handlebar 1412. Based on the human operator's interaction with the handlebar 1412 (e.g., whether the user is pushing on or rotating handlebar 1412), the autonomous robot may be driven in different manners. Operation of the autonomous robot is discussed in additional detail throughout the application, for instance with respect to Figure 19 through Figure 26.
  • first sensor first portion 1416A and the first sensor second portion 1418A and the second sensor first portion 1416B and the second sensor second portion 1418B may be offset in different positions (e.g., different positions along a vertical axis for vertically oriented handlebars or different positions along a horizontal axis for horizontally oriented handlebars) to allow for distinguishing of translational and rotational operating instructions.
  • first sensor first portion 1416A may be configured to interact with the first sensor second portion 1418A
  • second sensor first portion 1416B may be configured to interact with the second sensor second portion 1418B
  • the first sensor second portion 1418A may be configured to sense movement of the first sensor first portion 1416A relative to the first sensor second portion 1418A.
  • the second sensor second portion 1418B may be configured to sense movement of the second sensor first portion 1416B relative to the second sensor second portion 1418B.
  • Such relative movement may be, for example, due to deflection of the compliant materials 1430A and/or 1430B from forces applied to the handlebar 1412.
  • the handlebar assembly may include a break beam sensor.
  • the break beam sensor may transmit a signal when a beam of light is broken and/or reestablished. Such a sensor may be used to detect when a handlebar is grasped and/or released.
  • Figure 15 illustrates a top view of a force sensing handlebar for an autonomous robot in a first manipulated position, configured in accordance with one or more embodiments.
  • Figure 15 illustrates a linear force 1522 applied to the handlebar 1512 of the handle assembly 1510.
  • the linear force 1522 applied to handlebar 1512 may result in drive assembly 102 linearly moving (e.g., translating) the autonomous robot, according to the techniques and mechanisms described herein.
  • the speed of the linear movement may be dependent on the magnitude of the detected force applied to the handlebar 1512.
  • Movement of the first sensor first portion 1516A may be determined relative to the sensor axis 1520A.
  • the first sensor first portion 1516A may be determined to be disposed at the center, or proximate the center (e.g., within a set degree of tolerance, such as within a few millimeters), of the sensor axis 1520A and may be determined to be disposed in a neutral position.
  • the first sensor second portion 1518A may be calibrated to determine the position of the first sensor first portion 1516A.
  • the first sensor second portion 1518A may be calibrated such that when the first sensor first portion 1516A is disposed in the neutral position (e.g., at the center or proximate the center of sensor axis 1520A), the first sensor second portion 1518A may determine that there is no relative movement of the first sensor first portion 1516A.
  • the first sensor second portion 1518A may be configured to detect movement of the first sensor first portion 1516A along two axes (e.g., XA and YA).
  • the first sensor first portion 1516A may move along the XA and/or YA axes (e.g., in the positive or negative XA and/or YA directions relative to the sensor axis 1520A) and such movement may be detected by the first sensor second portion 1518A.
  • movement of the second sensor first portion 1516B may be determined by the second sensor second portion 1518B, relative to the sensor axis 1520B, the center of which may be a neutral position that the second sensor second portion 1518B is calibrated towards.
  • the linear force 1522 may be applied to the handlebar 1512. Due to the application of the linear force 1522, the first sensor first portion 1516A may move relative to the first sensor second portion 1518A. Such movement may be determined as positive or negative according to defined axes. Thus, in certain embodiments, movement in the positive XA direction and the positive YA direction may be classified as positive magnitude, while movement in the opposite direction may be classified as negative magnitude. Similarly, the second sensor second portion 1518B may be configured to determine movement of the second sensor first portion 1516B in the positive and negative XB and YB directions, as shown. In certain embodiments, the positive directions of XA and XB may be in the same direction while the positive directions of YA and YB may be in opposite directions. The positive and negative directions may allow for the determination of whether the handlebar 1512 is translating or rotating. Other orientations of the axes may be possible in other embodiments.
  • the linear force 1522 applied to the handlebar 1512 may cause the reaction 1524A for the first sensor first portion 1516A (e.g., the reaction 1524A may include movement of the first sensor first portion 1516A in the positive XA direction) and may cause the reaction 1524B for the second sensor first portion 1516B (e.g., the reaction 1524B may be movement of the second sensor first portion 1516B in the positive XB direction).
  • the reactions of the first sensor first portion 1516A and the second sensor first portion 1516B may be detected by the first sensor second portion 1518A and the second sensor second portion 1518B, respectively.
  • a determination may be made (e.g., by a controller as described herein) that the force 1522 is causing the handlebar 1512 to translate, as the first sensor first portion 1516A and the second sensor first portion 1516B may both be determined to be moving in the same direction, along the same vector, and/or with the same magnitude of movement.
  • Figure 16 illustrates a top view of a force sensing handlebar for an autonomous robot in a second manipulated position, configured in accordance with one or more embodiments.
  • Figure 16 illustrates torque 1626 applied to the handlebar 1612 of the handle assembly 1610.
  • the torque 1626 may be, for example, a twisting motion applied by a user.
  • Application of the torque 1626 may cause the orientation of the handlebar 1612 to accordingly twist, resulting in the reaction 1624A for the first sensor first portion 1616A, which may be movement of the first sensor first portion 1616A in at least the positive XA direction, and the reaction 1624B for the second sensor first portion 1616B, which may be movement of the second sensor first portion 1616B in at least the negative XB direction.
  • the first sensor first portion 1616A may additionally or alternatively move in the negative YA direction
  • the second sensor first portion 1616B may additionally or alternatively move in the negative YB direction.
  • a determination may be made that the torque 1626 is causing the handlebar 1612 to rotate.
  • the drive assembly 102 may then be operated to cause the autonomous robot 100 to rotate its orientation.
  • Other embodiments may determine other types of rotation (e.g., with the first sensor first portion 1616A moving in the negative XA direction and the second sensor first portion 1616B moving in the positive XB direction).
  • the magnitude of the force and/or torque applied may also be determined. Such magnitude may be determined based on the stiffness factor of the fixtures 1414A and 1414B and/or the compliant materials 1430A and 1430B.
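  • purely as an illustrative, non-limiting sketch of this determination, the following Python function classifies displacement readings from the two sensors as translation or rotation and estimates a magnitude; the threshold value and names are hypothetical and would depend on the stiffness of the compliant material.

```python
def classify_handlebar_input(dx_a, dy_a, dx_b, dy_b, dead_band=0.2):
    """Classify displacement readings from two sensors on one handlebar.

    dx_a, dy_a: displacement of the first sensor first portion along XA, YA.
    dx_b, dy_b: displacement of the second sensor first portion along XB, YB.
    The XA and XB axes are assumed to point in the same direction, so
    matching signs along X indicate translation (Figure 15) and opposing
    signs indicate a twist about the handlebar (Figure 16).
    """
    mag_a = (dx_a ** 2 + dy_a ** 2) ** 0.5
    mag_b = (dx_b ** 2 + dy_b ** 2) ** 0.5

    # Ignore very small deflections so the robot does not drift.
    if mag_a < dead_band and mag_b < dead_band:
        return "neutral", 0.0

    if dx_a * dx_b > 0:
        # Both ends deflect the same way along X: a linear push or pull.
        return "translate", (mag_a + mag_b) / 2.0
    if dx_a * dx_b < 0:
        # Ends deflect in opposite directions along X: a twist about the bar.
        return "rotate", (mag_a + mag_b) / 2.0
    return "neutral", 0.0
```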
  • disposing the first sensor portions as close as possible to their respective fixtures may provide a simpler technique for determination of movement of the handlebar at the position of the respective fixtures.
  • the lumped parameter k/m may be an empirically determined factor relating force and magnetic flux.
  • the differences in magnetic flux based on the orientation of the magnet may be detected and, accordingly, whether the magnet is moving in the X or Y direction may be determined.
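  • purely as an illustrative reading of this relationship, and under assumptions not stated above: if the compliant member deflects approximately linearly, so that an applied force $F$ produces a displacement $x$ with $F = kx$, and the sensed flux change is approximately proportional to displacement over the small working range, so that $\Delta\Phi = m\,x$, then

$$F = \frac{k}{m}\,\Delta\Phi,$$

so the single lumped parameter $k/m$ may be calibrated empirically against known applied forces rather than by measuring $k$ and $m$ separately.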
  • Figure 17 illustrates a perspective view of an autonomous robot with a force sensing base, configured in accordance with one or more embodiments.
  • Figure 17 illustrates an autonomous robot 1700 that includes a drive assembly 1702 with a drive unit 1704, a payload support element 1706, a payload 1708, and the force sensing assembly 1710.
  • a human operator may interact with the force sensing base by, for example, pushing or pulling on the force sensing base with a variety of different force input directions to cause the autonomous robot 1700 to translate and/or rotate (e.g., based on pushes that are orthogonal to the force sensing assembly 1710 to cause translational movement or pushes that are on the bias, such as within +/- 15 degrees of 45 degrees, to the force sensing assembly 1710 to cause rotational movement).
  • a human operator may also push or pull on the payload 1708 to translate and/or rotate the autonomous robot 1700 in a similar manner to that of interacting with the base.
  • Figure 18 illustrates a perspective view of portions of a force sensing base, configured in accordance with one or more embodiments.
  • Figure 18 illustrates an autonomous robot 1800 that includes force sensing assembly 1810 that may be a force sensing base.
  • force sensing base 1810 may include a plurality of sensors including a first sensor 1846A and a second sensor 1846B, as well as additional sensors.
  • the first sensor 1846A may include a first sensor first portion 1816A coupled to a portion of a payload 1808 and a first sensor second portion 1818A coupled to a portion of a force sensing assembly 1810.
  • the fixture 1814A, which may include compliant material 1830A configured in the same manner as that described herein, may allow for movement of the payload 1808 relative to the force sensing base 1810 (e.g., in response to user inputs). Such movement may result in the first sensor first portion 1816A moving relative to the first sensor second portion 1818A, due to the compliance of the compliant material.
  • the second sensor 1846B may include the second sensor first portion 1816B, coupled to the payload 1808, and the second sensor second portion 1818B, coupled to the force sensing assembly 1810. Similarity or differences in the detected movement of the first sensor first portion 1816A and the second sensor first portion 1816B, as described herein, may result in a determination of whether the autonomous robot 1800 is moved in a translational, a rotational manner, or both.
  • Figure 19 illustrates a method 1900 of controlling an autonomous mobile robot, configured in accordance with one or more embodiments.
  • the method 1900 may be performed at a robot configured as described herein.
  • the method 1900 may be used to transition between various operating modes. Examples of various operating modes are described herein. However, robots may be configured with various types of operating modes depending on the operating context.
  • a robot may operate in an autonomous mode in which the robot operates autonomously to perform a task based on instructions received from a remote computing device, such as a fleet controller, or based on internal programming logic.
  • a robot operating in autonomous mode may perform operations such as moving from one location to another location while avoiding obstacles, taking one or more actions to avoid damage or injury to objects or humans, and/or operating one or more lights, mechanical arms, conveyer belts, and/or other components.
  • a robot may operate in a manual mode in which the robot acts in a manner responsive to user input. For instance, a robot may remain stationary until force is detected at a force sensor, at which time the robot may move in a direction determined based on the detected force.
  • an operating mode may include both manual and autonomous elements.
  • a robot may nevertheless apply a force so as to avoid colliding with a person or object, effectively overriding the user input.
  • the robot may apply a force that repels the robot from objects and that increases in magnitude with proximity to an object.
  • a robot may nevertheless adjust its movement based on user input, such as force received via a force sensor.
  • a robot may operate in a hybrid mode in which it guides a user along a path but nevertheless allows itself to be redirected based on user input.
  • a robot may operate in a following mode in which it autonomously follows a human, for instance to aid in the human's efforts to perform a task.
  • a robot may shift between operating modes based on user input. For example, a robot that detects the presence of a human may stop what it is doing and face the human. Then, if the human steps away from the robot, the robot may resume autonomous activity. As another example, a robot may operate autonomously until a human grasps the robot's handlebar, at which point the robot may enter into a manual operating mode. Then, if the human releases the handlebar, the robot may resume autonomous operation, for instance after the passage of a designated period of time.
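  • the following Python sketch illustrates, in a simplified and purely hypothetical form, one way such a transition between autonomous and manual modes might be expressed; the class, parameter values, and grasp signal are assumptions rather than part of the specification.

```python
import time

AUTONOMOUS = "autonomous"
MANUAL = "manual"


class ModeManager:
    """Switch between autonomous and manual modes based on handlebar contact."""

    def __init__(self, resume_delay_s=3.0):
        self.mode = AUTONOMOUS
        self.resume_delay_s = resume_delay_s  # wait before resuming autonomy
        self._released_at = None

    def update(self, handlebar_grasped, now=None):
        """Update the operating mode given whether the handlebar is grasped."""
        now = time.monotonic() if now is None else now
        if handlebar_grasped:
            # A grasp immediately places the robot under manual control.
            self.mode = MANUAL
            self._released_at = None
        elif self.mode == MANUAL:
            # After release, resume autonomy once a designated period passes.
            if self._released_at is None:
                self._released_at = now
            elif now - self._released_at >= self.resume_delay_s:
                self.mode = AUTONOMOUS
                self._released_at = None
        return self.mode
```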
  • a request to control an autonomous mobile robot is received.
  • the request may be generated automatically at a main processing unit or other controller of the robot, for instance during an initialization process.
  • an autonomous mobile robot may be configured for operation in various modes, such as force sensing, autonomous movement, unforced sensing, and/or other modes.
  • the mode may be determined based on user input. For example, a user may touch or approach the robot to remove it from autonomous movement mode. As another example, a user may activate a button or touch screen to place the robot into force sensing mode.
  • the mode may be determined based on instructions received via a communication interface. For instance, the robot may receive an instruction from a fleet controller to enter or leave autonomous mode. As yet another example, the robot may detect when the operator has let go, for instance via one or more capacitive, tactile, and/or force sensing sensors.
  • Input for moving the robot is determined at 1906.
  • the input may be received via a user interface at the robot, such as a force sensing handlebar.
  • the input may be received via a communication interface, for instance from a fleet controller.
  • Instructions for a drive mechanism for the robot are determined at 1908. The instructions are transmitted to the drive mechanism at 1910. The instructions may cause the robot to move in a particular direction. As discussed herein, various types of instructions are possible based on various types of input.
  • an autonomous mobile robot in force sensing mode may move in a direction of force exerted by an operator on a force sensing handlebar.
  • the robot may detect a translational direction, rotational direction, and/or magnitude of force, and then direct the drive unit to move the robot in the direction.
  • the robot may be configured such that the operator need only apply a small amount of force despite the robot carrying a heavy load, with the robot effectively magnifying that force to move in the requested direction.
  • various types of modifications and constraints may be applied to such a force sensing motion configuration.
  • force may be applied asymmetrically, for instance braking much more easily than accelerating.
  • the robot may sense its surroundings and adapt its instructions for safety and/or damage avoidance. For example, the robot may slow to a stop before striking a person or object, moving down a ramp, or entering a hole. As another example, the robot may slow to a stop when it detects that the operator is no longer touching the robot.
  • Figure 20 illustrates a method 2000 for executing a robot control loop, performed in accordance with one or more embodiments.
  • the method 2000 may be used to determine instructions for providing to a robot drive unit for the purpose of causing the robot to move through space.
  • the method 2000 illustrates a more detailed view of operations discussed with respect to the method 1900 shown in Figure 19.
  • the method 2000 is described with reference to Figure 21, which illustrates an architecture diagram for a control portion of a mobile robot configured in accordance with one or more embodiments.
  • the architecture diagram includes drive control input sources 2102 through 2104, which may provide input to a robot drive controller 2106.
  • the robot drive controller 2106 may determine output instructions for a robot drive unit 2110 and may provide the output instructions to the robot drive unit 2110 via a robot drive unit abstraction layer 2108.
  • the method 2000 may be performed at the robot drive controller 2106, which in some embodiments may be implemented at the main board 524 shown in Figure 5.
  • the method 2000 may be used with any of various types of robots and/or drive systems. Some examples of such robots and drive systems are discussed with respect to Figures 1-18. However, the method 2000 may be used in conjunction with various types of robots and/or drive systems. For example, suitable robots may or may not be autonomous, omnidirectional, and/or backdrivable.
  • a request to control a drive unit of a robot is received at 2002.
  • the request may be generated when the robot is activated and enters a mode in which it is controllable.
  • the robot may be controlled based on one or more of user input, autonomous decision-making, and/or remote instructions received via a network interface from a remote system such as a fleet controller.
  • Movement control input information is determined at 2004.
  • the movement control input information may include various types of input received from any of various sources, such as the sources 2102 through 2104 shown in Figure 21.
  • the movement control information may include user input received from a user input device such as a force sensor, joystick, or other control device.
  • the force sensor may be attached to a force sensing handlebar as discussed throughout the application.
  • the movement control information may include one or more configuration parameters such as a force multiplier, a virtual friction coefficient, an indication of an operating mode, or the like.
  • the movement control information may include one or more functional control parameters.
  • the robot may be operating in a mode such that it behaves as if it is moving between virtual rails.
  • the drive unit of the robot may be controlled at least in part by transmitting an instruction determined based on user input received at a force sensor at the robot. For instance, when force is detected at a force sensing handlebar, the robot may be moved in the direction of the force. In general, a relatively larger force detected at the force sensor may correspond with a relatively higher velocity or force applied to the robot drive unit.
  • the term "force multiplier" as used herein refers to any alteration applied to the user input to strengthen or weaken the relationship between the input force received at the force sensor and the force instruction sent to the drive unit. For a given input force, for example, a relatively larger force multiplier would yield a relatively larger increase in velocity.
  • the force multiplier may be a fixed or configurable scalar, vector, or force output function that receives as inputs one or more parameters including data from the force sensor.
  • One or more operating conditions for the robot are determined at 2006.
  • the one or more operating conditions may include any conditions that may affect the robot's handling and control. Examples of such operating conditions may include, but are not limited to: the identity of an operator using the robot, a location of the robot, a direction in which the robot is traveling, an amount of traffic in the vicinity of the robot, a condition associated with the physical environment in which the robot is situated, and the like. For example, the detection of a potentially unsafe condition such as a wet surface may cause all robots to be placed in a safety mode in which a lower force multiplier is applied.
  • information about operating conditions may be determined by the robot itself. For instance, the robot may detect the presence of a wet surface based on a loss of traction in the drive unit. Alternatively, or additionally, such information may be received via a communication interface, for instance from a fleet controller. As still another possibility, a user may provide input indicating one or more operating conditions.
  • a physical input force vector is determined at 2008.
  • the physical input force vector may be determined based on user input.
  • the force sensor force vector may identify values for force exerted in one or more dimensions at one or more force sensors. For instance, a user may provide input by exerting force on a handlebar connected with the robot.
  • the force sensor force vector may identify values for translational (e.g., an x-dimension and a y-dimension) and rotational forces applied to a force sensor attached to a handlebar at the robot.
  • forces may be quantified in a coordinate system.
  • the coordinate system may be parameterized relative to the robot. For instance, the x-direction may be treated as "forward" while the y-direction is treated as "sideways".
  • the coordinate system may be parameterized relative to the physical environment. For instance, the x-direction may be treated as "north" or as a particular direction within a building. Additional details regarding an example of such a coordinate system are discussed herein with respect to the force sensing handlebar assembly.
  • determining the physical input force vector may involve imposing one or more types of smoothing.
  • the system may impose a "dead band" around zero force.
  • the robot may ignore small amounts of force applied to the force sensor. In this way, the robot may be prevented from drifting when very little force is being applied to the force sensor. Thus, small amounts of force may effectively be treated as zero force.
  • the system may smooth force over time, for instance to avoid jitter.
  • one or more smoothing operations may be applied in a dimension-specific manner.
  • for instance, if a non-trivial forward movement force is accompanied by only a trivial rotational force, the forward movement force may be accounted for in the physical input force vector while the trivial rotational force is ignored.
  • one or more smoothing operations may be applied in a dimensionless manner. For instance, if the sum of the magnitudes of the dimension-specific forces is nontrivial, then all forces may be accounted for in the physical input force vector.
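  • as a non-limiting illustration of such a dead band and smoothing, the following Python sketch filters one force component and then applies the filter dimension by dimension; the threshold and smoothing values are hypothetical.

```python
def smooth_force(raw, previous, dead_band=2.0, alpha=0.2):
    """Apply a dead band and exponential smoothing to one force component.

    raw: latest reading for this component (sign gives direction).
    previous: previously smoothed value for this component.
    dead_band: readings with magnitude below this are treated as zero force.
    alpha: smoothing factor; smaller values smooth more aggressively.
    """
    if abs(raw) < dead_band:
        raw = 0.0  # prevent drift from tiny applied forces
    # Exponential smoothing over time to avoid jitter in the commanded motion.
    return (1.0 - alpha) * previous + alpha * raw


def smooth_force_vector(raw_vec, prev_vec, dead_band=2.0, alpha=0.2):
    """Dimension-specific smoothing of an (x, y, rotational) force vector."""
    return tuple(
        smooth_force(r, p, dead_band, alpha) for r, p in zip(raw_vec, prev_vec)
    )
```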
  • a frictional input force vector is determined at 2010.
  • the friction forces may be applied in the direction opposite to velocity in each dimension. For instance, if the robot is rotating clockwise and moving in a forward direction, then the frictional input force vector may be a vector with values applying force in an anticlockwise and backward direction. The current direction of movement may be determined based on the operating conditions determined at 2006.
  • a frictional input force vector may be composed of various components.
  • the frictional input force vector may be determined by aggregating these components, for instance by summing them.
  • Examples of components that may be included in a frictional input force vector include, but are not limited to: Coulomb friction, damping friction, static friction, dynamic friction, other types of friction, and/or combinations thereof.
  • a Coulomb friction component may be applied.
  • the Coulomb friction force vector may be constant in the direction against motion.
  • the Coulomb friction may, for instance, cause the robot to slow down over time absent user input.
  • the particular values used for Coulomb friction may depend on a variety of factors, such as the weight of the robot. For instance, a Coulomb friction coefficient between 0.01 and 0.25 may be applied by multiplying the coefficient by the weight of the robot.
  • a damping friction component may be applied.
  • the damping friction vector may be proportional to velocity in the direction against motion. The damping friction vector may, for example, limit the top speed of the robot in any direction.
  • the damping friction vector may also help to reduce stability concerns, for instance in the event that a robot impacts an obstacle and then rebounds sharply in the opposite direction.
  • a damping friction coefficient between 0.5 and 2.0 may be applied by multiplying the coefficient by the weight of the robot.
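  • as a hypothetical sketch only, the following Python function combines a Coulomb component and a damping component into a frictional input force vector; the coefficient values mirror the ranges mentioned above, and the scaling choices are assumptions.

```python
def frictional_input_force(velocity, robot_weight,
                           coulomb_coeff=0.1, damping_coeff=1.0):
    """Compute a virtual frictional force vector opposing the current motion.

    velocity: (vx, vy, vr) velocity of the robot in each controlled dimension.
    robot_weight: weight of the robot, used to scale the Coulomb component.
    coulomb_coeff: constant-friction coefficient (e.g., 0.01 to 0.25).
    damping_coeff: velocity-proportional coefficient (e.g., 0.5 to 2.0); it
        may itself be scaled by the robot's weight, as described above.
    """
    friction = []
    for v in velocity:
        if v == 0.0:
            friction.append(0.0)
            continue
        direction = -1.0 if v > 0 else 1.0
        # Coulomb component: constant magnitude, always opposing motion.
        coulomb = direction * coulomb_coeff * robot_weight
        # Damping component: proportional to speed, always opposing motion.
        damping = -damping_coeff * v
        friction.append(coulomb + damping)
    return tuple(friction)
```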
  • a functional input force vector is determined at 2012. According to various embodiments, the functional input force vector may be used to supply force based on functional considerations such as safety, obstacle avoidance, and/or other operational goals.
  • the functional input force vector may be used to guide the robot to a particular location or along a particular route within an environment.
  • the functional input force vector may include a virtual magnet force that pulls the robot along a route or to a designated location provided as input.
  • the functional input force vector may be used to move the robot along a route through an environment, effectively guiding the operator along.
  • the operator may be able to override the functional input force by, for instance, exerting a sufficiently strong physical force in a different direction.
  • the functional input force vector may be used to guide the robot safely within an environment.
  • the functional input force vector may include a virtual repellant force that causes walls, people, and/or other obstacles to effectively push back against the robot.
  • the functional input force vector may include virtual rumble strips that vibrate the robot under one or more operating conditions.
  • the functional input force vector may be used to enforce a speed limit.
  • the functional input force vector may include a component that pushes back in a direction against velocity to prevent the robot from attaining a speed greater than a designated maximum.
  • the designated maximum speed limit may change depending on one or more considerations, such as the robot's location within an environment.
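  • purely as a non-limiting illustration, the following Python sketch computes such a speed-limit component as a push-back along the direction opposed to velocity; the gain and the linear form of the push-back are assumptions.

```python
def speed_limit_force(velocity, max_speed, gain=50.0):
    """Functional force component that resists motion above a speed limit.

    velocity: (vx, vy) translational velocity of the robot.
    max_speed: designated maximum speed for the robot's current location.
    gain: strength of the push-back per unit of excess speed.
    """
    speed = (velocity[0] ** 2 + velocity[1] ** 2) ** 0.5
    if speed <= max_speed or speed == 0.0:
        return (0.0, 0.0)
    overspeed = speed - max_speed
    # Push back along the direction opposed to velocity, proportional to
    # how far the robot is over the designated maximum.
    scale = -gain * overspeed / speed
    return (scale * velocity[0], scale * velocity[1])
```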
  • the functional input force vector may be used to provide virtual haptic rails or virtual train rails.
  • Haptic rails may be modeled as a virtual track along which the robot tries to maintain alignment. Moving the robot off of the virtual corridor may require user input such as sharp torque to the handlebar to "pop" the robot off of the rail. At the moment of leaving the rail, the robot may apply a sharp impulse such as a pop or a step in velocity to simulate leaving a track.
  • haptic rails or virtual train rails may be defined in any of various ways.
  • the robot may project rails onto the ground via a projector.
  • haptic rails, virtual train rails, and/or areas of particular speed zones may be detected by a robot based on, for instance, tape or paint applied to a region of the ground.
  • the functional input force vector may be used to simulate dynamic rails that lead the operator in a particular direction. For instance, the operator may be guided to move the robot along a virtual track, with movement along the track requiring much less force to the handlebars than movement in a different direction.
  • the rails may be sharp or soft, and may be narrow or wide, depending on the application.
  • the location of virtual rails or obstacles or the initialization of a strafing mode may be determined in various ways.
  • environment detection may involve input from a visual SLAM, inertial measurement unit, or other such data source.
  • a mode may be detected based on user input or one or more configuration parameters.
  • the robot may automatically enter a strafing mode when it enters an aisle.
  • virtual rails may be created based on lines painted or projected on the floor. Alternatively, the robot may project rails onto the floor that match the virtual rails being used by the robot.
  • the functional input force vector may be used to assist in smooth obstacle avoidance.
  • the robot may use the functional input force vector to simulate a repelling and/or dampening force when approaching an obstacle, finally slowing to a stop without hitting the obstacle and despite user input on the handlebars pushing the robot in the direction of the obstacle.
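  • as a hypothetical sketch of such a repelling force, the following Python function produces a virtual push away from an obstacle that grows as the robot approaches it; the influence radius, maximum force, and linear profile are assumptions.

```python
def obstacle_repulsion_force(robot_xy, obstacle_xy,
                             influence_radius=1.5, max_force=80.0):
    """Virtual repellant force that increases with proximity to an obstacle.

    robot_xy, obstacle_xy: (x, y) positions in the same coordinate frame.
    influence_radius: distance beyond which no force is applied.
    max_force: force magnitude applied as the distance approaches zero.
    """
    dx = robot_xy[0] - obstacle_xy[0]
    dy = robot_xy[1] - obstacle_xy[1]
    distance = (dx ** 2 + dy ** 2) ** 0.5
    if distance >= influence_radius or distance == 0.0:
        return (0.0, 0.0)
    # Force rises linearly from zero at the influence radius to max_force
    # at the obstacle, pointing from the obstacle toward the robot.
    magnitude = max_force * (1.0 - distance / influence_radius)
    return (magnitude * dx / distance, magnitude * dy / distance)
```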
  • the functional input force vector may be used to introduce jitter simulating a rumble strip when moving toward an obstacle or other area where the robot determines that it should not travel.
  • the functional input force vector may be used to provide haptic feedback via one or more motors in a drive unit on the robot.
  • the strength of the haptic feedback and/or the force needed to operate the robot may be adapted for the individual.
  • the identity of the individual may be determined based on an identifier such as a badge, a bar code, an RFID tag, or other such indicator.
  • the adaptation may rely on an estimation of strength as a function of the user's size or force exerted on the robot.
  • the functional input force vector may be used to model the inertia of the device and then change that inertia as experienced by the user. Such a model may be used, for instance, to make the robot seem lighter in rotation than an unpowered robot would be given its mass, or to move the effective center of rotational mass backward toward the operator to simulate a "shopping cart" feeling.
  • the functional input force vector may be used to lock the orientation of the robot into a "strafing mode". In the strafing mode, the robot may align its orientation with an aisle, grid, user, or other reference point. A strafing mode may be used to simulate inertia based on a direction.
  • the robot may snap to a virtual grid but effectively provide a preference for a longitudinal axis.
  • the strafing mode may help to avoid drift down a long straightaway.
  • the robot may seem to be lighter in the preferred direction, with motion in a different direction requiring higher activation force.
  • the functional input force vector may include a component damping motion in a direction lateral to the front of the robot.
  • a component may be used, for instance, to facilitate smooth motion in the direction intended by the operator.
  • such a force may facilitate smooth turning by increasing the force in the direction in which the robot is pointing.
  • the functional input force vector may include multiple components.
  • the functional input force vector may include a combination of functional input force vector components corresponding with (1) haptic rails, (2) a speed limit, (3) a navigation objective, and/or any other elements.
  • one or more components of a functional input force vector may be determined subject to one or more constraints. For example, an obstacle avoidance force component, a speed limit force, and/or other such forces may be limited to operating in a direction opposed to velocity.
  • an unforced sensing mode may allow the robot to be moved without force detection.
  • moving a heavy omnidirectional object without activating a drive mechanism can be difficult due to the challenge in changing inertia, such as when turning a corner.
  • the functional input force vector may be used to simulate virtual fixed wheels at a point in the robot, making the robot more easily turnable in the unforced sensing mode.
  • the location of the virtual fixed wheels may be configurable or adjusted automatically, for instance being moved based on physical sliding being detected.
  • the functional input force vector may be applied in an unforced sensing mode for safety and/or convenience purposes. For instance, forward force may be applied to compensate for friction. As another example, stopping force may be applied whenever acceleration over a given threshold is detected.
  • a mobile robot may lock to an orientation, for instance in strafing mode or when following an operator.
  • when locked to an orientation, the robot may resist attempts to rotate it. For instance, applying force to one corner of the robot in a way that would normally cause the robot to rotate may lead to the robot applying a virtual force at another corner to maintain the orientation.
  • Such locking may occur even in an unforced sensing mode, for instance by measuring displacement, wheel movement, inertia, and/or other such sensor values.
  • the functional input force vector may depend at least in part on configuration settings that may be adapted to the user. For example, a user may "level up" with experience to allow the use of additional features. As another example, some features may be disabled, depending on the application or area in which the robot is operating. As still another example, an operator or fleet manager may activate or disable one or more features.
  • the functional input force vector may include one or more elements received from a remote computing device, such as a fleet controller.
  • the one or more elements may include force vector components to be included in the functional input force vector, a function to calculate such components, a goal to be used in calculating such components, and/or other suitable information.
  • One or more robot drive unit control output instructions are determined at 2014 based on the movement control input information.
  • a movement control output instruction may be provided to the robot drive unit to cause the robot drive unit to apply force to the robot.
  • the movement control output instruction may be determined by combining different types of movement control input information into an instruction capable of being acted upon by the robot drive unit.
  • a movement control output instruction may be specified as a vector in one or more dimensions.
  • a vector may include different directional components corresponding to movement in the x-direction, the y-direction, and the rotational direction with the robot being positioned on a virtual x-y plane corresponding with the physical environment in which the robot is situated.
  • a directional component may indicate a magnitude associated with movement in the indicated dimension. The magnitude may be indicated as, for instance, a value corresponding with a velocity or a force.
  • the one or more robot drive unit control output instructions are provided to a drive unit for the robot at 2016.
  • the robot drive unit control output instructions may be provided to an abstraction layer for the robot, such as the abstraction layer 2108 shown in Figure 21.
  • the abstraction layer may provide separation between: (1) the determination of how the robot should move; and (2) the control of the hardware components of the drive unit to achieve that movement. In this way, the same controller logic may be applied to different physical configurations of drive units.
  • the robot drive unit abstraction layer 2108 includes a control interface 2112, an instruction translator 2114, and a drive interface 2116.
  • the control interface 2112 receives the control instructions from the robot controller 2106.
  • the instruction translator 2114 translates those control instructions into hardware-level instructions executable by the robot drive unit 2110.
  • the drive interface 2116 then provides those control instructions to the robot drive unit 2110.
  • the robot drive controller 2106 may issue a control output instruction to the robot drive unit abstraction layer 2108 through the control interface 2112 effectively instructing the drive unit to move with force magnitude (f1, f2, f3) in the (x, y, r) direction.
  • the instruction translator 2114 may then translate that control output instruction into individual motor-level instructions to one or more motors included in the drive unit 2110, such as motors corresponding to different wheels.
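  • purely as a hypothetical sketch of one such instruction translator, the following Python function maps an (x, y, rotation) command onto the four wheel speeds of a Mecanum drive (one drive type mentioned above); the geometry parameters and the assumption of a Mecanum layout are illustrative only.

```python
def mecanum_wheel_speeds(vx, vy, wz, wheel_radius, half_length, half_width):
    """Translate an (x, y, rotation) command into four Mecanum wheel speeds.

    vx, vy: commanded translational velocities (forward and leftward).
    wz: commanded rotational velocity about the vertical axis.
    wheel_radius: radius of each wheel.
    half_length, half_width: half the wheelbase and half the track width.
    Returns angular speeds for (front_left, front_right, rear_left, rear_right).
    """
    k = half_length + half_width
    front_left = (vx - vy - k * wz) / wheel_radius
    front_right = (vx + vy + k * wz) / wheel_radius
    rear_left = (vx + vy - k * wz) / wheel_radius
    rear_right = (vx - vy + k * wz) / wheel_radius
    return front_left, front_right, rear_left, rear_right
```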
  • the robot drive unit may continue to be controlled until a terminating condition is met. For example, control may be terminated when the robot enters a deactivated state, an error condition, a charging mode, or another situation in which movement is not indicated.
  • the operations 2004 through 2018 may be performed with any suitable frequency.
  • this control loop may operate at a rate in the range of 25 hertz to 2 kilohertz.
  • Figure 22 illustrates a method 2200 for determining a robot control output instruction, performed in accordance with one or more embodiments.
  • the method 2200 may be performed at the robot drive controller 2106 shown in Figure 21.
  • the robot control output instruction may be specified in accordance with the control interface 2112 of the robot drive unit abstraction layer 2108.
  • the robot control output instruction may be specified as a vector identifying values corresponding to magnitude of either force or velocity in one or more dimensions.
  • a request to determine one or more robot control output instructions is received at 2202.
  • the request may be generated in the course of executing a robot control loop.
  • the request may be generated as discussed with respect to operation 2014 shown in Figure 20.
  • the robot may be controlled by directing the drive unit to move the robot in accordance with a vector that specifies velocity values along one or more dimensions.
  • the robot may be controlled by directing the drive unit to move the robot in accordance with a vector that specifies force values along one or more dimensions.
  • the robot may operate entirely in one mode or another.
  • the robot may be configured to operate entirely in a force-based control mode or a velocity-based control mode.
  • the mode may be dynamically determined based on one or more considerations such as user input, operating conditions, payload weight, location, and/or communication with a remote computing system such as a fleet controller.
  • the determination made at 2204 may reflect one or more tradeoffs. For instance, employing velocity-based controls for a robot may allow the robot to operate with a constant apparent mass to the user, regardless of the actual mass of the robot and any load carried by the robot.
  • a velocity-based control approach may create a counterintuitive situation in which forces exerted on the robot that are not detected by the force sensor are effectively counteracted, since the robot drive unit is controlled so as to match the target velocity.
  • Employing instead a force-based control through the use of a force multiplier may allow the robot to take into account forces exerted on the robot that do not act through the force sensor.
  • under force-based control, loading the robot with an increased cargo mass will result in the operator feeling the additional mass in the absence of dynamic adjustment to the force multiplier, discussed in greater detail below.
  • a force multiplier is determined at 2206.
  • the force multiplier may include one or more values applied as a multiplier to the force sensor force vector to determine control instructions for the robot. In this way, a user may move the robot in a particular direction by applying a force potentially much smaller than what would be required to move the robot in the direction were no force assistance provided.
  • the force multiplier may be implemented as a scalar. In such a configuration, the same force multiplier may be applied to all dimensions. Alternatively, the force multiplier may be implemented as a vector. In such a configuration, different dimensions may receive different force multiplication. For example, the rotational dimension may be associated with a larger force multiplier due to the difficulty (relative to translational force) of applying rotational force to an input device such as a handlebar.
  • the force multiplier may be fixed. For instance, a force multiplier of 1.5x, 2x, 3x, or another suitable value may be used. Alternatively, the force multiplier may be dynamically determined. For instance, the force multiplier may be increased or decreased based on the operating conditions optionally determined at 2006. In some configurations, the force multiplier may be determined based at least in part on configuration information provided by a fleet administrator and/or robot user.
  • a force multiplier may be decreased relative to a previously used and/or default value.
  • a lower force multiplier may be used: (1) when the robot is located in an area designated for slower speed, (2) when a potentially unsafe condition is detected, (3) when a larger or stronger operator is using the robot, (4) when the robot has been manually configured to have a lower force multiplier, (5) when the robot has been detected as carrying less mass, and/or (6) when any other type of operating condition designated for use with a lower force multiplier is detected.
  • a force multiplier may be increased relative to a previously used and/or default value.
  • a higher force multiplier may be used (1) when the robot is located in an area designated for higher speed, (2) when it is determined that the robot is operating in an area of low traffic, (3) when the robot has been manually configured to have a higher force multiplier, (4) when the robot has been detected as carrying more mass, (5) when a smaller or weaker operator is using the robot, and/or (6) when any other type of operating condition designated for use with a higher force multiplier is detected.
  • a force multiplier may depend on the direction of the physical force relative to a current velocity of the robot. For example, a higher force multiplier may be used when the physical force is in the opposite direction as the velocity. In this way, an operator may be able to slow or stop the robot more easily than the operator can increase the robot's speed. In some configurations, such determinations may be made on a dimension-by-dimension basis.
  • a physical force applied clockwise and in the x-direction may receive a higher force multiplier in the rotational direction than in the translational direction, since the rotational physical force is against the direction of velocity while the translational force is in the same direction as the velocity.
  • a force multiplier may depend on the location of the robot relative to the operator. For example, a larger force multiplier may be used when moving the robot in a direction away from the operator than when moving the robot in a direction toward the operator. In this way, the robot may be safely accelerated away from the operator while at the same time preventing the operator from inadvertently causing the robot to unsafely accelerate toward the operator.
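  • as a non-limiting sketch of a direction-dependent force multiplier, the following Python functions apply a larger multiplier when the applied force opposes the current velocity, so that braking is easier than accelerating; the gain values are hypothetical.

```python
def force_multiplier(force_component, velocity_component,
                     accelerate_gain=2.0, brake_gain=4.0):
    """Select a force multiplier for one dimension of the physical input force.

    A larger multiplier is used when the applied force opposes the current
    velocity, so the operator can slow or stop the robot more easily than
    the operator can speed it up.
    """
    opposing = force_component * velocity_component < 0
    return brake_gain if opposing else accelerate_gain


def amplified_force(force_vec, velocity_vec):
    """Apply a dimension-by-dimension force multiplier to the input force."""
    return tuple(
        f * force_multiplier(f, v) for f, v in zip(force_vec, velocity_vec)
    )
```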
  • An updated physical input force vector is determined at 2208.
  • the physical input force vector may be determined by multiplying the physical input force vector determined at 2008 by the force multiplier determined at 2206.
  • An output force vector is determined at 2210 based on the input force vectors.
  • a virtual mass value for the robot is identified at 2212.
  • the virtual mass value may be used to cause the robot to feel as if it weighs a particular amount.
  • the virtual mass may be specified as a configuration value.
  • the virtual mass may be fixed. For instance, a robot may be assigned a virtual mass of 60 pounds.
  • the virtual mass may be dynamically determined. For example, the virtual mass value may be increased for users identified as being relatively larger and/or stronger. As another example, the virtual mass value may be dynamically determined based on observations about user input over time.
  • An acceleration vector for the robot is determined at 2214 based on the force vector and the virtual mass.
  • the acceleration vector may be determined by dividing the force vector by the virtual mass.
  • the acceleration of an object is equal to the force applied to the object divided by the object's mass.
  • that is, a = F / m, where F is the output force vector, m is the virtual mass, and a is the acceleration vector.
  • a velocity output vector for the drive unit is determined at 2216.
  • the velocity output vector may be obtained by integrating the acceleration vector over a suitable period of time.
  • the current velocity vector may be identified as a vector in dimensions corresponding to those of the acceleration vector determined at 2214.
  • the current velocity vector may be determined as part of the operating conditions determined at 2006.
  • one or more sensors associated with the drive unit may provide sensor data indicating the velocity at which the robot is currently traveling.
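  • purely as an illustrative sketch of operations 2210 through 2216, the following Python function converts the combined force vector into a velocity output over one control-loop period using the virtual mass; the names and the simple Euler integration are assumptions.

```python
def velocity_output(current_velocity, total_force, virtual_mass, dt):
    """One control-loop step of the virtual-mass model.

    current_velocity: (vx, vy, vr) velocity reported by drive unit sensors.
    total_force: (fx, fy, fr) sum of the physical, frictional, and functional
        input force vectors, after any force multiplier has been applied.
    virtual_mass: configured mass the robot should appear to have.
    dt: control-loop period in seconds (e.g., 1/25 s to 1/2000 s).
    """
    # a = F / m, applied per dimension.
    acceleration = tuple(f / virtual_mass for f in total_force)
    # Integrate acceleration over one loop period to obtain the commanded velocity.
    return tuple(v + a * dt for v, a in zip(current_velocity, acceleration))
```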
  • Figure 23 illustrates a method 2300 for autonomous motion control of an autonomous mobile robot, performed in accordance with one or more embodiments.
  • the method 2300 may be used to direct an autonomous mobile robot to perform a task, for instance by aiding an operator.
  • a request to autonomously control an autonomous mobile robot in a physical environment is received at 2302.
  • the request may be received when the robot enters an autonomous mode, as discussed with respect to operation 1904.
  • a scene graph of the physical environment is determined at 2304.
  • the scene graph may provide a virtual representation of a physical environment.
  • a scene graph of a warehouse may provide a virtual representation of aisles in the warehouse, along with locations of items, zones for dropping off items, and the like.
  • the scene graph may be connected to one or more data sources, for instance allowing the robot to determine a correspondence between an item to be picked up or dropped off and a location in the physical environment.
  • a database may indicate that Item A134 is located in Bin 23 on Shelf 10 of Aisle 9, and the scene graph may provide a virtual representation location of Bin 23, Shelf 10, Aisle 9 in a way that allows the robot to navigate to that location.
  • the scene graph may be received from a remote computing device.
  • the scene graph may be received from a fleet controller configured to control multiple robots.
  • elements of the scene graph may be determined by the robot itself.
  • the robot may analyze sensor data to determine or supplement a scene graph.
  • a scene graph may be received from a remote computing device and then updated by the robot.
  • a current location for the robot on the scene graph is determined at 2306.
  • the location of the robot may be determined in any of various ways. For example, image data from one or more cameras at the robot may be analyzed using a visual SLAM and/or other techniques to determine a location of the robot relative to one or more reference points. A correspondence between a reference point and the scene graph may then allow the robot to determine its location on the scene graph.
  • a task to perform is determined at 2308.
  • a destination location on the scene graph is determined at 2308 based on the task.
  • a route from the current location to the destination location is determined at 2310 based on the scene graph.
  • the robot is instructed to move at 2310 based on the movement instruction.
  • the particular task, location, and movement instruction may depend in significant part on the application.
  • the robotic cart may be configured to aid in the completion of a task in a warehouse.
  • the robot may be configured to aid in a task such as item picking in which it retrieves one or more items from storage in the warehouse for transport to another location.
  • the robot may be configured to aid in a task such as item replenishment in which it delivers one or more items to one or more locations within the warehouse for future picking.
  • the robot may be configured to aid in another task, such as transporting a person, transporting a production input, transporting a production output, providing a mobile light source, monitoring a region, monitoring a person, removing trash, or the like.
  • item picking may be performed using any of a variety of protocols.
  • in zone picking, a person may operate in an area of the warehouse to pick items while one or more robots travel to the person to collect those items.
  • the robot may be equipped with one or more boxes, some or all of which may include items corresponding with multiple orders.
  • an autonomous mobile robot may be configured to follow a person as the person moves around a warehouse environment and places items into the robotic cart.
  • an autonomous mobile robot may be configured to facilitate order consolidation, in which it moves to a location close to another robot and supports the movement of items between the two carts.
  • the robot may use lights or other affordances to interact with humans.
  • a light strip may include lights that may be adapted to a width of a region on a cart. The lights may then be activated to indicate to a human where to remove an item from or where to place an item on the cart.
  • a task for the robot to perform may be determined by a coordinator such as a fleet management system providing command and control instructions for a fleet of robots.
  • the fleet management system may perform route determination and/or optimization. For example, the fleet management system may spread the load to avoid traffic congestion, determine estimates for task completion time, and/or generally determine tasks in a manner that efficiently utilizes multiple robots.
  • the route may be parameterized as a nominal trajectory that includes one or more individual waypoints.
  • Each individual waypoint may be a location within the scene graph.
  • the robot need not necessarily follow the exact path specified by the nominal trajectory. Instead, the path may be treated as, for instance, a force vector that serves as an input along with other force vectors, such as those associated with obstacle avoidance, that are combined together to determine a direction and magnitude of movement for the robot at any given time.
  • the robot may continue to be controlled as long as it remains in autonomous motion mode.
  • an autonomous mobile robot may be equipped with one or more sensors such as visible light cameras.
  • Figure 24 illustrates a diagram 2400 showing sensor placement on an autonomous mobile robot.
  • sensors may be located in various places. Locating a sensor lower on the autonomous mobile robot, for instance as shown at the camera 2404 in Figure 24, provides improved obstacle detection for small obstacles located on the floor due to reduced depth noise. In contrast, locating a sensor higher on the autonomous mobile robot, for instance as shown at the camera 2402 in Figure 24, provides for improved visibility of people close to the autonomous mobile robot and can avoid creating a blind spot around the base. However, such a configuration can reduce the field of view around the robot and render small objects located on the floor more difficult to detect.
  • Figure 25A illustrates a view of an attachment point 2502 between a lighting element 2506 and a communication channel 2504 in an angled shelf configuration.
  • Figure 25B illustrates a view of an attachment point 2512 between a lighting element 2516 and a communication channel 2514 in a flat shelf configuration.
  • the lighting elements 2506 and/or 2516 may be coupled with the communication channels 2504 or 2514 and may provide lighting in a way that assists in the performance of one or more operations using the autonomous mobile robot.
  • the lighting elements 2506 and/or 2516 may include one or more pick-to-light components that highlight one or more external locations such as a location on a shelf in the physical environment in which the autonomous mobile robot is situated.
  • the lighting elements 2506 and/or 2516 may include one or more put-to-light components that highlight one or more internal locations on the autonomous mobile robot, such as a tote or shelf region.
  • Such lighting elements may facilitate workflows such as picking items from an external shelf and placing them onto the autonomous mobile robot, placing items from the autonomous mobile robot onto an external location such as a shelf or other robot, or the like.
  • lights may be used to support a virtual and/or regular pick wall and/or put wall.
  • a regular pick wall or put wall may be a wall of cubbies with lights on the robot and/or the cubbies indicating where to pick and/or place an item.
  • a robot may be configured to operate as a virtual pick wall or put wall at any location by illuminating lights associated with locations on the robot's chassis (e.g., bins on shelves).
  • multiple robots may coordinate to support item transfer between and among the robots and/or another location such as a regular pick and/or put wall.
  • Figure 26 illustrates a method 2600 for controlling one or more lights associated with an autonomous mobile robot, performed in accordance with one or more embodiments.
  • a request to control a light associated with an autonomous mobile robot is received at 2602.
  • a task being performed by the robot or an operator is identified at 2604.
  • a light configuration in support of the task is determined at 2606.
  • One or more lights are activated based on the light configuration at 2608.
  • a determination is made at 2610 as to whether to continue to control a light associated with the robot.
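  • The following sketch illustrates one possible shape of the control loop described by operations 2602 through 2610; every method name on the hypothetical robot and light_controller objects is an assumption introduced for illustration and does not correspond to an interface defined above.

```python
def control_lights(robot, light_controller):
    """Minimal loop mirroring operations 2602-2610 of method 2600 (assumed interfaces)."""
    while True:
        request = robot.wait_for_light_request()             # 2602: receive request
        if request is None:
            break
        task = robot.identify_current_task(request)          # 2604: identify task
        config = light_controller.configuration_for(task)    # 2606: determine configuration
        light_controller.activate(config)                    # 2608: activate lights
        if not robot.should_continue_light_control():        # 2610: continue?
            break
```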
  • the method 2600 may be used to configure various types of lights.
  • the robot may be equipped with projectors that can project light onto a surface inside the robot, such as onto a shelf location.
  • the robot may be equipped with projectors that can project light onto a surface off of the robot, such as onto a bin located on a shelf in a warehouse aisle.
  • the robot may be equipped with a light strip that can highlight an area of an autonomous mobile robot, such as a region of a shelf.
  • a light may be configured as a laser beam on a gimbal, a laser line, an addressable LED strip, a fixed light whereby the robot moves the light by aligning the robot with the shelf, and/or any other suitable lighting configuration.
  • the method 2600 may be used to configure various types of lighting operations.
  • the robot may be configured to light up an area or container corresponding with where an item is to be picked up from or moved to.
  • the robot may project a light onto an external bin corresponding to an item to be picked up and then instruct an addressable LED strip to light up an area on the robotic cart where the item is to be placed.
  • Figure 27 illustrates a navigational feedback method 2700, performed in accordance with one or more embodiments.
  • the navigational feedback method 2700 may be performed at an autonomous mobile robot configured in accordance with techniques and mechanisms described herein.
  • navigational feedback may include a haptic feedback force input vector, which may be a type of functional input force vector that may be provided to the mechanical drive unit.
  • navigational feedback may include other types of feedback such as visual feedback or auditory feedback.
  • User input for the autonomous mobile robot is identified at 2704.
  • the user input may include force detected at one or more force sensors associated with a handlebar unit.
  • the user input may include one or both of a translational force and a rotational force exerted on the handlebar.
  • operating conditions for a robot may include any of a variety of conditions. Examples of such operating conditions may include, but are not limited to: the identity of an operator using the robot, a location of the robot, a direction in which the robot is traveling, an amount of traffic in the vicinity of the robot, a condition associated with the physical environment in which the robot is situated, and the like.
  • the one or more operating conditions may include one or more situational embodiments of the robot relative to its environment. Examples of such operating conditions may include, but are not limited to: a determination that the robot is moving along an aisle, a determination that the robot is approaching a real or virtual obstacle or barrier, a determination that the robot is approaching a zone associated with a different mode of operation, a determination that the robot is approaching the end of an aisle, a proximity of the robot to an edge of a virtual corridor, and/or other such considerations.
  • one or more operating conditions may include one or more functional embodiments related to a task being performed by the robot, for instance in the context of a workflow. Examples of such operating conditions may include, but are not limited to: determining that a robot is approaching an area corresponding to an item to be moved, determining that a robot is moving away from an area corresponding to an item to be moved, determining that a human operator of the robot is to be notified of information, and/or other operational information.
  • the determination may involve evaluating whether the robot's position coupled with the user input suggests that the robot, absent a correction, is projected to cross a threshold, violate a rule, create a dangerous situation, and/or perform an action that is deemed inefficient.
  • the robot's position, direction, and user input may be collectively analyzed to project that the robot is heading toward a virtual barrier.
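  • As a simplified sketch, such a projection may be made by forward-simulating the robot's motion under the current user input over a short horizon; the one-dimensional barrier, the 40 kg mass, and the constant-force assumption below are all illustrative assumptions.

```python
def projects_past_barrier(position, velocity, user_force, barrier_x,
                          mass=40.0, horizon=1.5, dt=0.05):
    """Return True if the robot is projected to cross x = barrier_x within
    the look-ahead horizon (seconds) under the current user force."""
    x, vx = position[0], velocity[0]
    ax = user_force[0] / mass          # constant acceleration from the user force
    t = 0.0
    while t < horizon:
        vx += ax * dt
        x += vx * dt
        if x >= barrier_x:
            return True
        t += dt
    return False
```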
  • the determination made at 2708 may involve evaluating whether the robot's position coupled with the user input suggests that the robot is approaching or departing a virtual navigational affordance.
  • a warehouse environment may be configured with one or more virtual high-speed navigational corridors, virtual low-speed handling areas, virtual navigational rails, virtual parking areas, or the like.
  • a robot approaching or departing from such a virtual navigational affordance may generate a haptic feedback pattern appropriate for the affordance, such as snapping into or out of a parking spot, navigational rail, or corridor.
  • the determination made at 2708 may involve evaluating whether the robot's position coupled with the user input suggests that the robot is approaching or departing a configuration associated with a workflow.
  • a consolidation workflow may involve an arrangement of two or more robotic carts in which the carts are positioned near each other to facilitate the transfer of items from one cart to another.
  • Such a consolidation workflow may be associated with a predetermined relative spatial orientation of robotic carts to facilitate efficiency on the part of the human operator.
  • a virtual train workflow may involve a sequence of carts configured to closely follow one another in a row, with a human user capable of creating such a sequence by moving the carts in proximity to one another.
  • a robot approaching or departing from a workflow configuration may generate a haptic feedback pattern appropriate for the affordance, such as snapping into or out of position.
  • the determination made at 2708 may involve evaluating whether the robot's position coupled with the user input suggests that the robot is entering or leaving a particular operating mode. For instance, in a guided navigation mode, the robot may autonomously navigate along a path despite the user's hand on the handlebar. In contrast, in a fully manual mode, the robot may cease autonomous navigation apart from obstacle avoidance and instead be fully responsive to user input. The robot may generate haptic or non-haptic feedback to indicate that such a transition has occurred.
  • the determination made at 2708 may be made based at least in part on sensor data identifying one or more indicators of a virtual barrier, corridor, rail, zone, or other such navigational affordance.
  • For example, paint or tape on the floor of a warehouse may be used to indicate the presence of a navigational affordance.
  • a caution sign or other such warning may indicate the presence of a zone associated with reduced speed.
  • a virtual navigational rail or corridor may be automatically created along a linear surface such as a shelf in a warehouse.
  • non-haptic navigational feedback is determined and provided at 2710.
  • non-haptic navigational feedback may include visual feedback provided via one or more light projectors.
  • the autonomous mobile robot may project onto the ground an indication of one or more virtual rails, barriers, or corridors.
  • the autonomous mobile robot may project onto the ground or another surface an indication of a recommended direction or route of travel.
  • the autonomous mobile robot may project onto the ground or another surface one or more messages or indicators.
  • non-haptic navigational feedback may include visual feedback provided via a display screen or light strip.
  • the autonomous mobile robot may update the display screen to display a map with a route, one or more turn-by-turn directions, or a message.
  • non-haptic navigational feedback may include one or more sounds.
  • the autonomous mobile robot may play a sound to indicate that the user is guiding the autonomous mobile robot in a direction that is not recommended.
  • the haptic feedback may be experienced by a human operator as a force sensation in the operator's hand while holding the force-sensitive handlebar.
  • the drive unit may cause the entire autonomous mobile robot to jerk in a particular direction, vibrate, create a feeling of push or pull, or otherwise generate the feeling of force in the handlebar.
  • haptic feedback may include generating a force vector that provides the feel of a bump in one direction.
  • a pattern of haptic feedback may involve generating a short but sharp increase in force in one direction.
  • haptic feedback may include generating a force vector that provides a steady force in one direction or a force that steadily increases or decreases in intensity.
  • haptic feedback may include a force vector generated so as to create via the drive unit a feeling of vibration, for instance by generating a temporal sequence of oscillating haptic feedback input force vectors pointed in opposite directions that collectively cancel out in terms of their ultimate effect on the directional motion of the autonomous mobile robot.
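  • A sketch of how such a zero-net oscillating sequence might be constructed is shown below, assuming forces are two-dimensional vectors along a unit axis; the amplitude and cycle count are illustrative assumptions.

```python
def vibration_sequence(axis, amplitude=8.0, cycles=4):
    """Build an oscillating series of haptic force vectors along one axis.

    Alternating +/- vectors of equal magnitude sum to zero, so the sequence
    is felt as a vibration without changing the robot's net direction of travel.
    """
    ux, uy = axis
    sequence = []
    for i in range(2 * cycles):
        sign = 1.0 if i % 2 == 0 else -1.0
        sequence.append((sign * amplitude * ux, sign * amplitude * uy))
    # The paired vectors cancel, so the net force contribution is zero.
    assert abs(sum(f[0] for f in sequence)) < 1e-9
    assert abs(sum(f[1] for f in sequence)) < 1e-9
    return sequence
```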
  • haptic feedback may include one or more complex patterns. For example, when it is determined that the autonomous mobile robot has nearly aligned with a virtual rail, a profile of haptic feedback vectors may be determined to provide the feeling that the robot has snapped into a physical rail. As another example, when it is determined that the autonomous mobile robot is moving along a virtual rail, the force vectors may initially act to keep the autonomous mobile robot aligned with the rail, effectively resisting or ignoring user input forces in the direction orthogonal to the rail or in the rotational direction. However, if a sufficiently strong lateral or rotational user input force is detected, a profile of haptic feedback vectors may be determined to provide the feeling that the robot has snapped out of a physical rail.
  • the presence of haptic feedback may cause the autonomous mobile robot to temporarily nullify or ignore input provided via the handlebar.
  • the haptic feedback force input vector may cause the autonomous mobile robot to suddenly snap into or out of a virtual rail, temporarily disregarding user input to the contrary.
  • the particular type of haptic feedback that is provided may be strategically determined based on factors such as the characteristics of the autonomous mobile robot, the physical environment in which it operates, the tasks for which it is employed, and the human users operating the autonomous mobile robot.
  • additional examples of the types of haptic feedback that may be provided and the triggering conditions for generating the haptic feedback are discussed with respect to Figure 28A through Figure 30.
  • non-haptic navigational feedback may be provided in the absence of haptic navigational feedback, or vice versa. Further, non-haptic navigational feedback may be provided as a precursor to providing haptic navigational feedback. For instance, before causing the drive unit to vibrate the robot, the robot may first display or project blinking lights, emit a sound, or provide other such non-haptic feedback. Alternatively, non-haptic feedback may be provided during or after haptic feedback is provided. For instance, a display screen may be updated to display a recommended path of action after or during a time in which the robot provides haptic feedback via the mechanical drive unit.
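  • As a sketch of such sequencing, non-haptic cues might be issued at lower severity levels before the drive unit is asked to produce haptic feedback; every attribute and method on the hypothetical robot object below is an assumption for illustration only.

```python
def provide_navigational_feedback(robot, severity):
    """Escalate from non-haptic cues to haptic feedback as severity grows (assumed interfaces)."""
    if severity >= 1:
        robot.lights.blink()                                # visual precursor
        robot.speaker.play("caution")                       # auditory precursor
    if severity >= 2:
        robot.drive_unit.apply_haptic_pattern("vibration")  # haptic feedback via the drive unit
    if severity >= 3:
        robot.display.show_route(robot.planner.recommended_route())  # follow-up guidance
```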
  • the triggering conditions for providing navigational feedback may be configurable by an environment administrator, system administrator, and/or end user.
  • the examples of triggering conditions and navigational feedback described herein are provided as illustrative examples and are not an exhaustive list of the types of triggering conditions and feedback that may be used. Rather, the triggering conditions and navigational feedback may be strategically determined based on factors such as the characteristics of the autonomous mobile robot, the physical environment in which it operates, the tasks for which it is employed, and the human users operating the autonomous mobile robot.
  • Figure 28A, Figure 28B, and Figure 28C illustrate diagrams of situations in which haptic feedback may be provided via a mechanical drive unit, generated in accordance with one or more embodiments.
  • an autonomous mobile robot 2800 is traveling obliquely toward a virtual rail 2802.
  • the autonomous mobile robot 2800 receives as input an input force vector 2804.
  • the input force vector 2804 includes a consistent force component 2806 in the direction of the virtual rail 2802 and an orthogonal force component 2808 that is orthogonal to the direction of travel. Because the autonomous mobile robot 2800 has not yet arrived at the virtual rail 2802, the direction of travel 2810 mirrors the user input force vector 2804. That is, haptic feedback is not provided.
  • the autonomous mobile robot 2800 has reached the virtual rail 2802.
  • the autonomous mobile robot 2800 receives as input an input force vector 2834.
  • the input force vector 2834 includes a consistent force component 2836 in the direction of the virtual rail 2802 and an orthogonal force component 2838 that is orthogonal to the direction of travel.
  • the consistent force component 2836 is relatively large, while the orthogonal force component 2838 is relatively small.
  • an oppositional haptic feedback force input vector 2840 may be generated. Because the oppositional haptic feedback force input vector 2840 is directionally opposed to the orthogonal force component 2838 and of equal magnitude, the two force vectors cancel each other out. The resulting direction of travel 2842 is in the same direction as the virtual rail 2802.
  • the configuration shown in Figure 28B initially provides the sensation of the autonomous mobile robot 2800 being snapped into the virtual rail 2802 when the autonomous mobile robot 2800 reaches the virtual rail 2802. Such a situation may continue even if the human-applied orthogonal force component 2838 is maintained.
  • the autonomous mobile robot 2800 receives as input an input force vector 2854.
  • the input force vector 2854 includes a consistent force component 2856 in the direction of the virtual rail 2802 and an orthogonal force component 2858 that is orthogonal to the virtual rail 2802.
  • the orthogonal force component 2858 is large.
  • the autonomous mobile robot 2800 does not generate an oppositional haptic feedback force input vector, and instead may optionally generate an orthogonal haptic feedback force input vector 2860 in the direction of the orthogonal force component 2858.
  • the combination of these vectors leads to a direction of travel 2862 away from the virtual rail 2802 in the direction of the user input force vector 2854.
  • Figure 28C provides the sensation of the autonomous mobile robot 2800 being snapped off of the virtual rail 2802 based on a large orthogonal force applied by the human user. This approach allows the human user to leave the virtual rail 2802 intentionally.
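  • One way to realize the behavior illustrated in Figure 28A through Figure 28C is to decompose the handlebar force into rail-parallel and rail-orthogonal components and either cancel or yield to the orthogonal part depending on its magnitude. The sketch below is an assumption-laden illustration: the 15 N snap-out threshold, the 0.5 assist factor, and the function interface are not specified by the embodiments above.

```python
def rail_feedback(user_force, rail_direction, on_rail, snap_out_threshold=15.0):
    """Return (haptic feedback vector, still_on_rail) for a force-sensing handlebar input."""
    ux, uy = rail_direction                      # unit vector along the virtual rail
    fx, fy = user_force
    parallel = fx * ux + fy * uy                 # component along the rail
    ortho_x = fx - parallel * ux                 # orthogonal remainder
    ortho_y = fy - parallel * uy
    ortho_mag = (ortho_x ** 2 + ortho_y ** 2) ** 0.5
    if not on_rail:
        return (0.0, 0.0), False                 # no feedback before reaching the rail
    if ortho_mag < snap_out_threshold:
        # Cancel the sideways push so travel stays along the rail (snap-in feel).
        return (-ortho_x, -ortho_y), True
    # Strong sideways push: optionally assist the departure rather than oppose it (snap-out feel).
    return (0.5 * ortho_x, 0.5 * ortho_y), False
```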
  • Figure 28A, Figure 28B, and Figure 28C involve static force vectors.
  • changes in force vectors may also be considered when determining or applying haptic feedback.
  • snapping the autonomous mobile robot 2800 off of the virtual rail 2802 may require a sharp increase in the orthogonal force component 2858 rather than steady pressure. That is, a sharp jerk of the autonomous mobile robot 2800 in the orthogonal direction may cause the autonomous mobile robot 2800 to suddenly leave the virtual rail 2802, while a continuous force in the same direction may not have such an effect.
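  • A sketch of detecting such a sharp increase is shown below, assuming the orthogonal force component is sampled over time; the 60 N/s rate threshold and the class interface are illustrative assumptions.

```python
class SnapOffDetector:
    """Trigger rail snap-off on a sharp rise in orthogonal force, not on steady pressure."""

    def __init__(self, rate_threshold=60.0):     # newtons per second
        self.rate_threshold = rate_threshold
        self._last_force = 0.0
        self._last_time = None

    def update(self, ortho_force, timestamp):
        """Return True when the orthogonal force rises faster than the threshold."""
        if self._last_time is None:
            self._last_force, self._last_time = ortho_force, timestamp
            return False
        dt = max(timestamp - self._last_time, 1e-3)
        rate = (ortho_force - self._last_force) / dt
        self._last_force, self._last_time = ortho_force, timestamp
        return rate > self.rate_threshold
```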
  • Figure 29A, Figure 29B, and Figure 29C illustrate diagrams of situations in which haptic feedback may be provided via a mechanical drive unit, generated in accordance with one or more embodiments.
  • the autonomous mobile robot is traveling in a virtual corridor between the virtual corridor wall 2902a and the virtual corridor wall 2902b.
  • an oppositional haptic feedback force input vector 2906 is applied to gently push the autonomous mobile robot 2800 back toward the center of the virtual corridor.
  • the human operating the autonomous mobile robot 2800 is in control of the autonomous mobile robot 2800. However, the human operator receives guidance for keeping the autonomous mobile robot 2800 within the virtual corridor.
  • the human operating the autonomous mobile robot 2800 again remains in control of the lateral steering of the autonomous mobile robot 2800.
  • the human operator receives guidance to remove the autonomous mobile robot 2800 entirely from the virtual corridor rather than operating it partially within the virtual corridor. Such guidance may help to keep the virtual corridor clear for other robots or humans.
  • a virtual corridor may be dynamically determined based on the routes planned for multiple robots. For instance, when multiple robots are moving in the same direction, they may coordinate to adhere to a dynamically determined virtual corridor in that direction. Then, the robots may generate navigational guidance through haptic feedback to stay within the virtual corridor. If a robot departs from the virtual corridor it may also generate navigational guidance through haptic feedback to fully depart the virtual corridor. In this way, the other robots traveling in the virtual corridor need not adjust their courses.
  • Figure 30A and Figure 30B illustrate diagrams of situations in which haptic feedback may be provided via a mechanical drive unit, generated in accordance with one or more embodiments.
  • the autonomous mobile robot 2800 is approaching a virtual barrier 3002 based on a user input force vector 3004.
  • the proximity 3006 is decreased. This decrease in the proximity 3006 triggers the generation of a haptic feedback force input vector sequence 3008.
  • the haptic feedback force input vector sequence 3008 may be implemented as a sequence of individual haptic feedback force input vectors over time 3010.
  • the haptic feedback force input vector sequence 3008 may include a sequence of individual force vectors pointed in opposite directions.
  • the collective effect of the haptic feedback force input vector sequence 3008 may be experienced by the human operator as a back-and-forth vibration in the autonomous mobile robot 2800.
  • the vibration may be felt in the handlebar attached to the force sensor, or any other portion of the autonomous mobile robot 2800 in contact with the human operator.
  • the autonomous mobile robot 2800 is participating in a task that involves navigating to a target 3054. However, the autonomous mobile robot 2800 has passed the target 3054, responsive to the user input force vector. As the proximity 3056 to the target 3054 decreases, the autonomous mobile robot 2800 generates a vibrational haptic feedback force input vector sequence 3058.
  • the vibrational haptic feedback force input vector sequence 3058 is substantially similar to the haptic feedback force input vector sequence 3008.
  • However, the vibrational haptic feedback force input vector sequence 3058 includes oppositional vectors arranged in an orthogonal, side-to-side direction rather than a parallel, forward-and-back direction.
  • the presence of haptic feedback in combination with virtual navigational affordances may allow system operators to reduce the distance between robots without generating obstacle avoidance procedures. For example, in the absence of a virtual navigational affordance, two robots may be prevented from passing within one foot of each other by way of obstacle avoidance force vectors. However, such a situation may be permitted if each of the two robots is locked to a respective navigational rail.
  • haptic feedback can be provided using force vectors in a rotational direction, and/or can be triggered based on one or more conditions associated with rotational movement.
  • the illustrative examples presented herein have employed translational force vectors and omitted rotational components.
  • Figure 31 illustrates a method 3100 of determining a virtual friction vector, performed in accordance with one or more embodiments.
  • a request to determine a virtual friction vector for an autonomous mobile robot is received.
  • the request may be received as discussed with respect to the operation 2010 shown in Figure 20.
  • One or more operating conditions for the autonomous mobile robot are identified at 3104.
  • the one or more operating conditions may be identified as discussed with respect to the operation 2006 shown in Figure 20.
  • a physical friction vector for the autonomous mobile robot is determined at 3106.
  • the physical friction vector may represent friction imposed by the components of the robot in one or more dimensions. Such friction may be due to components such as wheels, tires, motors, bearings, and the like. For instance, a robot may be anticipated to have a friction of between 1N and 100N based on such components.
  • the physical friction vector may be identified based on a predetermined configuration parameter.
  • the physical friction vector may be dynamically determined based on a calibration method. An example of such a method is discussed with respect to the method 3200 shown in Figure 32.
  • a desired friction vector is determined at 3108 based on the one or more operating conditions. According to various embodiments, various considerations may influence the configuration of the desired friction vector. The following examples are non-exhaustive.
  • lateral friction may be applied.
  • Friction may be applied in the lateral direction to oppose force applied to the robot to move the robot off of the line or the curve. In this way, the friction may help keep the robot on track and reduce the effort needed to control the robot. Such friction may potentially be increased as the robot's speed increases, for instance to provide additional stability.
  • forward friction force may be applied.
  • a forward friction compensation vector of, for instance, 10N may be applied to correct for the robot's physical friction.
  • a desired friction vector of 2N in the backward direction may be specified. A small amount of desired friction may be maintained to avoid a situation in which the robot is felt to accelerate away from the operator.
  • a friction compensation force in the forward direction may be applied only when motion exceeds a designated velocity. That is, a dead band around zero velocity may be maintained. Alternatively, the friction compensation force may be gradually increased as velocity rises above zero according to a specified function.
  • a virtual friction vector is determined at 3110 based on the desired friction vector and the physical friction vector.
  • the virtual friction vector may be determined by subtracting the physical friction vector from the desired friction vector.
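  • A sketch of operations 3108 and 3110 is shown below, with forces expressed per axis in newtons; the particular desired values (2 N opposing forward motion, extra lateral friction while tracking a line) echo the illustrative magnitudes above, and the 10 N physical friction default and dictionary-based operating conditions are assumptions.

```python
def virtual_friction_vector(operating_conditions, physical_friction=(10.0, 10.0)):
    """Compute the virtual friction vector as desired friction minus physical friction.

    The first axis is forward/backward and the second is lateral.
    """
    desired_forward = 2.0                      # keep a small drag so the cart never runs away
    desired_lateral = 20.0 if operating_conditions.get("tracking_line") else 5.0
    return (desired_forward - physical_friction[0],
            desired_lateral - physical_friction[1])

# Example: while tracking a line, extra lateral friction helps hold the robot on course.
correction = virtual_friction_vector({"tracking_line": True})
```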
  • Figure 32 illustrates a method 3200 of calibrating a physical friction vector for an autonomous mobile robot, performed in accordance with one or more embodiments.
  • the method 3200 may be performed at the autonomous mobile robot.
  • the method 3200 may be performed at a different location, such as a fleet controller.
  • the method 3200 may be used to determine a physical friction vector employed as discussed with respect to the method 3100 shown in Figure 31.
  • a request to determine a physical friction vector for an autonomous mobile robot is received at 3202.
  • the request may be generated periodically or upon detection of a triggering condition. For instance, the request may be generated when a designated number of operating hours elapses.
  • Input information for input vectors applied to the drive assembly is determined at 3204.
  • the input information may include recent or current data that includes a set of observations for input vectors applied to the drive assembly over time. Such information may be stored at the robot itself and/or at a remote computing device.
  • output information indicating motion of the autonomous mobile robot is determined.
  • the output information may include values for velocity realized by the autonomous mobile robot in response to the input vectors identified at 3204.
  • a model of the autonomous mobile robot is determined at 3208 based on the input information, the output information, and a model specification.
  • a model specification may be used.
  • a polynomial model may represent values such as force, acceleration, velocity, mass, Coulomb friction, and the like.
  • the model specification may then be applied to the input and output information and solved to determine an estimated friction vector.
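  • One concrete form of such a model specification is the linear relation F = m*a + b*v + c*sign(v), in which c approximates Coulomb friction. The sketch below fits that model by least squares along a single axis and assumes uniformly sampled force and velocity observations; the model form and sampling assumptions are illustrative.

```python
import numpy as np

def estimate_coulomb_friction(forces, velocities, dt):
    """Fit F = m*a + b*v + c*sign(v) and return the Coulomb friction estimate c.

    forces and velocities are one-dimensional arrays sampled at interval dt.
    """
    forces = np.asarray(forces, dtype=float)
    velocities = np.asarray(velocities, dtype=float)
    accelerations = np.gradient(velocities, dt)          # numerical derivative of velocity
    design = np.column_stack([accelerations, velocities, np.sign(velocities)])
    coeffs, *_ = np.linalg.lstsq(design, forces, rcond=None)
    mass, viscous, coulomb = coeffs
    return coulomb
```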

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)

Abstract

A mechanical drive unit for a robot may be controlled by receiving, from a force sensor, an input message characterizing a physical force exerted on the force sensor in a first direction. A physical force input vector quantifying the physical force in two or more dimensions may be determined based on the input message. A force output vector aggregating the physical force input vector and a second force input vector and quantifying a force to be applied to move the robot in a second direction may be determined at least in part by applying a force multiplier to the physical force input vector. An indication of the force output vector may be transmitted to the omnidirectional mechanical drive unit via a communication interface. The robot may be moved via the mechanical drive unit in the second direction based on the force output vector.
PCT/US2025/017483 2024-03-01 2025-02-27 Systems and methods for an autonomous mobile robot providing haptic feedback Pending WO2025184269A1 (fr)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US202463560381P 2024-03-01 2024-03-01
US63/560,381 2024-03-01
US202463571352P 2024-03-28 2024-03-28
US63/571,352 2024-03-28
US18/655,609 US20250304135A1 (en) 2024-03-28 2024-05-06 Autonomous Robot with Force Sensing User Handlebar
US18/655,609 2024-05-06
US18/795,644 US20250278094A1 (en) 2024-03-01 2024-08-06 Systems and Methods for an Autonomous Mobile Robot Haptic Feedback
US18/795,644 2024-08-06

Publications (1)

Publication Number Publication Date
WO2025184269A1 true WO2025184269A1 (fr) 2025-09-04

Family

ID=95364671

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/017483 Pending WO2025184269A1 (fr) 2024-03-01 2025-02-27 Systèmes et procédés pour un robot mobile autonome fournissant une rétroaction haptique

Country Status (1)

Country Link
WO (1) WO2025184269A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018039337A1 (fr) * 2016-08-23 2018-03-01 Canvas Technology, Inc. Autonomous cart for manufacturing and warehouse applications
US20200262460A1 (en) * 2019-02-19 2020-08-20 Lg Electronics Inc. Robot operating in power-assist mode and method for moving the same
US20210011484A1 (en) * 2019-07-10 2021-01-14 Lg Electronics Inc. Method of moving in power assist mode reflecting physical characteristics of user and robot of implementing thereof
EP4067205A1 (fr) * 2021-03-30 2022-10-05 Hublex Haptic control handle for a cart or the like, kit for assisting the movement of a cart comprising such a control handle, and cart equipped with such an assistance kit
US20230168679A1 (en) * 2021-11-30 2023-06-01 Robust AI, Inc. Robotic Cart

Similar Documents

Publication Publication Date Title
US12030757B2 (en) Systems and methods for operating autonomous tug robots
US12135553B2 (en) Robotic cart
CN114502335A (zh) 用于具有几何约束的非线性机器人系统的轨迹优化的方法和系统
Wang et al. An intelligent robotic hospital bed for safe transportation of critical neurosurgery patients along crowded hospital corridors
KR20130045290A (ko) 자율주행 이동로봇
JPWO2019059307A1 (ja) 移動体および移動体システム
US20220043452A1 (en) Agv having dynamic safety zone
Levratti et al. TIREBOT: A novel tire workshop assistant robot
KR20130045289A (ko) 자율주행 이동로봇의 주행제어방법 및 출입통제시스템
US12416930B1 (en) Systems and methods for an autonomous mobile robot
US20250278094A1 (en) Systems and Methods for an Autonomous Mobile Robot Haptic Feedback
US20250278097A1 (en) Autonomous Robot Double Drive Assembly
WO2025184269A1 (fr) Systèmes et procédés pour un robot mobile autonome fournissant une rétroaction haptique
WO2025184263A1 (fr) Systèmes et procédés pour un robot mobile autonome
US20250278087A1 (en) Force Multiplying Mobile Robot
US12436546B2 (en) Systems and methods for an autonomous mobile robot fleet coordination
WO2020220093A1 (fr) Véhicule d'inspection
KR20130045291A (ko) 자율주행 이동로봇 및 주행 제어 방법
US20250304135A1 (en) Autonomous Robot with Force Sensing User Handlebar
CN119768749A (zh) 自主实用小车及机器人小车平台
WO2025207261A1 (fr) Robot autonome avec guidon d'utilisateur à détection de force
KR101041929B1 (ko) 자율 이동 차량의 속도 결정 장치, 이를 구비하는 자율 이동 차량 및 자율 이동 차량의 속도 결정 방법
JP2023051013A (ja) 搬送カートおよび搬送カートの制御方法
Bolanakis et al. Automating loading and locking of new generation air-cargo containers
Sato et al. Operability evaluation of manual operation control for force-sensorless power-assist transport cart

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25718094

Country of ref document: EP

Kind code of ref document: A1