
WO2025071419A1 - Methods and systems for robotic item picking ports for automated warehouses - Google Patents


Info

Publication number
WO2025071419A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
picking
container
dropping
queue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/PL2023/050080
Other languages
French (fr)
Inventor
Holger Cremer
Konrad BANACHOWICZ
Adam WILKOSZ
Mateusz MADEJ
Pawel KAMINSKI
Marek CYGAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nomagic Spolka Z Ograniczona Odpowiedzialnoscia
Original Assignee
Nomagic Spolka Z Ograniczona Odpowiedzialnoscia
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nomagic Spolka Z Ograniczona Odpowiedzialnoscia
Priority to PCT/PL2023/050080
Publication of WO2025071419A1
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02Storage devices
    • B65G1/04Storage devices mechanical
    • B65G1/137Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
    • B65G1/1373Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses
    • B65G1/1378Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed for fulfilling orders in warehouses the orders being assembled on fixed commissioning areas remote from the storage areas
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0093Programme-controlled manipulators co-operating with conveyor means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/74Feeding, transfer, or discharging devices of particular kinds or types
    • B65G47/90Devices for picking-up and depositing articles or materials

Definitions

  • Order picking may be a process of selecting and gathering items from a warehouse or storage location. Items may be selected and gathered for various reasons. For example, items may be selected and gathered for fulfilling customer orders. In such examples, the items may be moved into a shipping container. In another example, items may be selected and gathered for storing. In such examples, the items may be moved into a storage container (e.g., for consolidation).
  • Selecting and gathering items may include retrieving the items from respective locations based on the order details, item type, quantity, item attributes, specific customer requirements, etc. While item picking has historically relied on manual labor, where workers navigated a warehouse to locate and retrieve items, certain automations have become popular in more recent times. For example, warehouse automation technologies have revolutionized the order picking process through technologies such as automated guided vehicles, robotic arms, conveyor belts, software algorithms, etc.
  • a system for robotic item picking comprises: (A) a picking port, comprising: (i) a first input picking queue configured to hold and move a first plurality of item picking containers, (ii) a second input picking queue configured to hold and move a second plurality of item picking containers, (iii) a picking location configured to receive an item picking container of (1) said first plurality of item picking containers from said first input picking queue and (2) said second plurality of item picking containers from said second input picking queue, and (iv) a dropping location configured to receive an item dropping container; (B) a robotic arm configured to move one or more items of said item picking container; and (C) a controller configured to: (a) cause said picking location to receive said item picking container from (1) said first input picking queue or (2) said second input picking queue, and (b) cause said robotic arm to (1) pick up said one or more items from said item picking container at said picking location and (2) place said one or more items in said item dropping container at said dropping location.
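The controller's pick-and-place cycle described above — receive a container from one of two input queues, then pick and place — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the queue representation, function name, and the list standing in for the item dropping container are all assumptions.

```python
from collections import deque

def run_pick_cycle(first_queue, second_queue, items_to_pick):
    """Sketch of controller operations (a)-(b): receive an item picking
    container from whichever input picking queue has one available, then
    move the requested items into the dropping container.

    first_queue / second_queue: deques of containers (lists of item names).
    Returns the items placed into the (simulated) item dropping container.
    All names here are illustrative, not from the patent text."""
    placed = []
    # (a) receive an item picking container from the first or second queue
    if first_queue:
        container = first_queue.popleft()
    elif second_queue:
        container = second_queue.popleft()
    else:
        return placed  # no container available at either queue
    # (b) pick up the requested items and place them in the dropping container
    for item in items_to_pick:
        if item in container:
            container.remove(item)
            placed.append(item)  # stands in for the item dropping container
    return placed
```

A caller would enqueue containers arriving from storage, then invoke the cycle once per order line; the first-queue-first preference shown here is arbitrary, since the claim allows the controller to select either queue.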
  • said picking port further comprises (v) an output picking queue configured to hold and move said item picking container.
  • said controller is further configured to, after causing said robotic arm to pick up said one or more items from said item picking container at said picking location in (b): (c) cause said item picking container to be moved from said picking location to said output picking queue.
  • said controller is further configured to, after causing said item picking container to be moved from said picking location to an output picking queue in (c): (d) cause said picking location to receive an additional item picking container from (1) said first input picking queue or (2) said second input picking queue, and (e) cause said robotic arm to pick up one or more additional items from said additional item picking container at said picking location.
  • said picking port further comprises (vi) an output dropping queue configured to hold and move said item dropping container, said picking port further comprises (vii) an input dropping queue configured to hold and move said item dropping container, and said dropping location is configured to receive said item dropping container from said input dropping queue.
  • said input dropping queue comprises a first input dropping queue and a second input dropping queue.
  • said controller is further configured to, after causing said robotic arm to place said one or more items from said item picking container in said item dropping container at said dropping location in (b): (f) cause said item dropping container to be moved from said dropping location to an output dropping queue, (g) cause said dropping location to receive an additional item dropping container from said input dropping queue, and (h) cause said robotic arm to place one or more additional items in said additional item dropping container at said dropping location.
  • said controller is further configured to, after causing said robotic arm to place said one or more items from said item picking container in said item dropping container at said dropping location in (b): (i) cause said robotic arm to place said one or more additional items in said item dropping container at said dropping location, thereby causing said item dropping container to include both said one or more items and said one or more additional items.
  • at least one of said one or more additional items is substantially identical to at least one of said one or more items.
  • each of said one or more additional items is substantially identical to each of said one or more items.
  • at least one of said one or more additional items is substantially different from at least one of said one or more items.
  • each of said one or more additional items is substantially different from each of said one or more items.
  • the system further comprises: (D) one or more sensors configured to collect sensor data corresponding to said picking port.
  • said controller is further configured to perform at least one of operations (a)-(h) based at least in part on said sensor data.
  • said one or more sensors comprise one or more cameras configured to capture image data corresponding to said picking port.
  • said controller is further configured to generate an alert at least in part in response to said image data satisfying an alert condition.
  • said one or more sensors comprise one or more laser curtains.
  • said one or more laser curtains are configured to detect if an item is sticking out outside of a threshold in one or more of (1) said first plurality of item picking containers, (2) said second plurality of item picking containers, (3) said item picking container, or (4) said item dropping container, and said controller is further configured to generate an alert at least in part in response to detecting that said item is sticking out outside of said threshold.
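A laser-curtain protrusion check of this kind can be sketched as a scan over beam states: any interrupted beam above the allowed height indicates an item sticking out past the threshold. The beam indexing, function name, and alert string are assumptions for illustration only.

```python
def check_laser_curtain(beam_states, threshold_index):
    """Return an alert string if any beam above the allowed height is broken,
    else None.

    beam_states: list of booleans, True meaning the beam is interrupted;
    beams are indexed from the container rim upward. threshold_index is the
    highest beam an item may legally interrupt. Names are illustrative."""
    for i, broken in enumerate(beam_states):
        if broken and i > threshold_index:
            # an item protrudes beyond the permitted envelope: raise an alert
            return f"alert: item protrudes past beam {i}"
    return None
```

In practice the controller would poll such a check each time a container advances, pausing the queue when an alert is returned.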
  • said one or more sensors comprise one or more weight sensors.
  • said one or more weight sensors are configured to determine (1) a first weight of a first item in one of said first plurality of item picking containers and (2) a second weight of a second item in one of said second plurality of item picking containers, and said controller is configured to determine, based at least in part on a comparison of said first weight and said second weight, to provide, to said picking location, (1) said one of said first plurality of item picking containers or (2) said one of said second plurality of item picking containers.
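The weight-comparison selection between the two buffered containers can be sketched as below. The heavier-first policy is an assumption (e.g., so heavier items land at the bottom of the dropping container); the claim only requires that the choice be based on a comparison of the two weights.

```python
def select_source_queue(first_weight, second_weight):
    """Decide which buffered item picking container to advance to the
    picking location, based on the weights measured at the two buffer
    locations. Returns "first" or "second". The heavier-first rule and
    the return labels are illustrative, not from the patent text."""
    # Prefer the heavier item so it is placed before lighter items.
    return "first" if first_weight >= second_weight else "second"
```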
  • (1) a position in said first input picking queue proximal to said picking location is a first buffer location and (2) a position in said second input picking queue proximal to said picking location is a second buffer location.
  • (1) said one of said first plurality of item picking containers is at said first buffer location and (2) said one of said second plurality of item picking containers is at said second buffer location.
  • said one or more weight sensors are configured to determine (1) a first weight of said item picking container at said picking location prior to said robotic arm picking up said one or more items from said item picking container and (2) a second weight of said item picking container at said picking location after said robotic arm picking up said one or more items from said item picking container, and said controller is further configured to determine, based at least in part on a comparison of said first weight and said second weight, a number of items included in said one or more items.
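Inferring the number of picked items from the before/after weights amounts to dividing the measured weight drop by a known per-item weight. The per-item weight, tolerance parameter, and error handling below are assumptions; the claim only requires a comparison of the two weights.

```python
def count_picked_items(weight_before, weight_after, unit_item_weight,
                       tolerance=0.05):
    """Estimate how many items the robotic arm removed from the picking
    container by dividing the measured weight drop by the known per-item
    weight. unit_item_weight and tolerance are assumed inputs, not from
    the patent text."""
    delta = weight_before - weight_after
    count = round(delta / unit_item_weight)
    # Sanity-check the reading against the expected weight of `count` items;
    # a mismatch may indicate a dropped item or a bad scale reading.
    if abs(delta - count * unit_item_weight) > tolerance * unit_item_weight:
        raise ValueError("weight delta inconsistent with unit item weight")
    return count
```

The tolerance check is where an alert (as in the embodiments above) would naturally be generated.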
  • each of said first plurality of item picking containers and each of said second plurality of item picking containers is a tote.
  • said tote for each of said first plurality of item picking containers and each of said second plurality of item picking containers comprises an identifier.
  • said identifier is an item identifier corresponding to an item included in said tote for each of said first plurality of item picking containers and each of said second plurality of item picking containers.
  • said item dropping container is a shipping container.
  • said first input picking queue is substantially parallel to said second input picking queue. In some embodiments, said first input picking queue is substantially identical to said second input picking queue.
  • a method for robotic item picking comprises: (a) receiving, at a picking location from a first input picking queue, a first item picking container; (b) picking up, using a robotic arm, a first item from said first item picking container; (c) placing, using said robotic arm, said first item into a first item dropping container at a dropping location; (d) moving said first item picking container from said picking location to an output picking queue; (e) moving said first item dropping container from said dropping location to an output dropping queue; (f) receiving, at said picking location from a second input picking queue, a second item picking container; (g) receiving, at said dropping location from an input dropping queue, a second item dropping container; (h) picking up, using a robotic arm, a second item from said second item picking container; and (i) placing, using said robotic arm, said second item into said second item dropping container.
  • said input dropping queue comprises a first input dropping queue and a second input dropping queue.
  • said first item is substantially identical to said second item.
  • said first item is substantially different from said second item.
  • one or more of operations (a)-(i) are performed based at least in part on sensor data collected by one or more sensors.
  • said one or more sensors comprise one or more cameras, the method further comprising: capturing image data, via said one or more cameras.
  • the method further comprises: generating an alert at least in part in response to said image data satisfying an alert condition.
  • said one or more sensors comprise one or more laser curtains.
  • the method further comprises: detecting, via said one or more laser curtains, that (1) said first item is sticking out outside of a threshold in said first item picking container or said first item dropping container or (2) said second item is sticking out outside of a threshold in said second item picking container or said second item dropping container, and generating an alert at least in part in response to detecting that (1) said first item is sticking out outside of said threshold in said first item picking container or said first item dropping container or (2) said second item is sticking out outside of said threshold in said second item picking container or said second item dropping container.
  • said one or more sensors comprise one or more weight sensors.
  • the method further comprises: determining, via said one or more weight sensors, a weight of said first item and a weight of said second item; and causing said first item to be placed, using said robotic arm, into said first item dropping container prior to said second item being placed, using said robotic arm, into said second item dropping container.
  • the method further comprises: determining (1) a first weight of said first item picking container prior to said robotic arm picking up said first item from said first item picking container and (2) a second weight of said first item picking container after said robotic arm picking up said first item from said first item picking container.
  • the method further comprises: generating an alert at least in part in response to a comparison of said first weight and said second weight.
  • each of said first item picking container and said second item picking container is a tote.
  • said tote for each of said first item picking container and said second item picking container comprises an identifier.
  • said identifier for said first item picking container corresponds to said first item and said identifier for said second item picking container corresponds to said second item.
  • each of said first item dropping container and said second item dropping container is a shipping container.
  • said first input picking queue is substantially parallel to said second input picking queue. In some embodiments, said first input picking queue is substantially identical to said second input picking queue.
  • a method for robotic item picking comprises: (a) receiving, at a picking location from a first input picking queue, a first item picking container; (b) picking up, using a robotic arm, a first item from said first item picking container; (c) placing, using said robotic arm, said first item into an item dropping container at a dropping location; (d) moving said first item picking container from said picking location to an output picking queue; (e) receiving, at said picking location from a second input picking queue, a second item picking container; (f) picking up, using a robotic arm, a second item from said second item picking container; and (g) placing, using said robotic arm, said second item into said item dropping container, thereby causing said item dropping container to include both said first item and said second item.
  • the method further comprises: receiving, from an input dropping queue, said item dropping container.
  • said input dropping queue comprises a first input dropping queue and a second input dropping queue.
  • said first item is substantially identical to said second item.
  • said first item is substantially different from said second item.
  • one or more of operations (a)-(g) are performed based at least in part on sensor data collected by one or more sensors.
  • said one or more sensors comprise one or more cameras, the method further comprising: capturing image data, via said one or more cameras.
  • the method further comprises: generating an alert at least in part in response to said image data satisfying an alert condition.
  • said one or more sensors comprise one or more laser curtains.
  • the method further comprises: detecting, via said one or more laser curtains, that (1) said first item is sticking out outside of a threshold in said first item picking container or said item dropping container or (2) said second item is sticking out outside of a threshold in said second item picking container or said item dropping container, and generating an alert at least in part in response to detecting that (1) said first item is sticking out outside of said threshold in said first item picking container or said item dropping container or (2) said second item is sticking out outside of said threshold in said second item picking container or said item dropping container.
  • said one or more sensors comprise one or more weight sensors.
  • the method further comprises: determining, via said one or more weight sensors, a weight of said first item and a weight of said second item; and causing said first item to be placed, using said robotic arm, into said item dropping container prior to said second item being placed, using said robotic arm, into said item dropping container.
  • the method further comprises: determining (1) a first weight of said first item picking container prior to said robotic arm picking up said first item from said first item picking container and (2) a second weight of said first item picking container after said robotic arm picking up said first item from said first item picking container.
  • the method further comprises: generating an alert at least in part in response to a comparison of said first weight and said second weight.
  • each of said first item picking container and said second item picking container is a tote.
  • said tote for each of said first item picking container and said second item picking container comprises an identifier.
  • said identifier for said first item picking container corresponds to said first item and said identifier for said second item picking container corresponds to said second item.
  • said item dropping container is a shipping container.
  • said first input picking queue is substantially parallel to said second input picking queue. In some embodiments, said first input picking queue is substantially identical to said second input picking queue.
  • a method for robotic item picking comprises: (a) obtaining item information for a plurality of items, wherein said item information comprises one or more of: item weight, item size, item fragility, or item deformability; (b) obtaining an order comprising a subset of said plurality of items; (c) determining a filling order for said subset of said plurality of items based at least in part on said item information corresponding to said subset of said plurality of items; (d) at least in part in response to said filling order, causing a first input picking queue to move a first item container comprising a first item of said subset of said plurality of items to a picking location; and (e) at least in part in response to said filling order, causing a second input picking queue to move a second item container comprising a second item of said subset of said plurality of items to said picking location.
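Determining a filling order from item information can be sketched as a sort over the ordered items. The specific policy below (fragile items last, and among equally fragile items heavier first) is an assumption; the method only requires that the order depend on attributes such as weight, size, fragility, or deformability.

```python
def determine_filling_order(items):
    """Order items for placement into a dropping container so that sturdy,
    heavy items go in before light or fragile ones.

    items: list of dicts with 'name', 'weight' (kg), and 'fragility'
    (0.0 robust .. 1.0 very fragile). The dict schema and scoring policy
    are illustrative, not from the patent text."""
    # Sort ascending by fragility, then descending by weight.
    return sorted(items, key=lambda it: (it["fragility"], -it["weight"]))
```

Given the resulting order, steps (d) and (e) would then advance whichever input picking queue holds the container for the next item in the sequence.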
  • the method further comprises: causing a robotic arm to pick up said first item from said first item container; and causing an output picking queue to move said first item container from said picking location.
  • said item information further comprises one or more of: item quantity, item name, item price, or item materials.
  • said order comprises a customer order.
  • a system for robotic item picking comprises: (A) a first conveyor system configured to hold and move a first plurality of item picking containers; (B) a second conveyor system configured to hold and move a second plurality of item picking containers; (C) a plurality of sensors configured to collect sensor data corresponding to at least a portion of said first plurality of item picking containers and at least a portion of said second plurality of item picking containers, wherein said plurality of sensors comprises one or more of: a weight sensor, a camera sensor, an identification reader, or a laser curtain; and (D) a controller configured to: (a) obtain an order comprising a plurality of items, (b) determine a filling order for said plurality of items based at least in part on said sensor data, (d) at least in part in response to said filling order, cause said first conveyor system to move a first item picking container of said first plurality of item picking containers to a picking location, wherein said first item picking container comprises a first item of said plurality of items, and
  • FIG. 1 shows an example of a picking port and an example of a robotic arm;
  • FIG. 2A shows a perspective view of an example of a picking port;
  • FIG. 2B shows a top-down view of rollers of the example of the picking port of FIG. 2A;
  • FIG. 2C shows a rear orthographic view of the example of the picking port of FIG. 2A;
  • FIG. 2D shows a right orthographic view of the example of the picking port of FIG. 2A;
  • FIG. 3 shows an example of a picking port, an example of a robotic arm, and an example of a tool changer;
  • FIG. 4A shows a top-down view of an example single-input, single-output picking port;
  • FIG. 4B shows an example implementation of various sensors in the top-down view of the example single-input, single-output picking port of FIG. 4A;
  • FIG. 5 shows a top-down view of an example dual-input, single-output picking port;
  • FIG. 6 shows an example of an operation flowchart for implementing robotic item picking; and
  • FIG. 7 shows a computer system that is programmed or otherwise configured to implement methods provided herein.
  • picking ports are often not robot-friendly due to openings close to walls, thereby reducing robot access. Further, many picking ports are designed around human ergonomics (e.g., having tilts, shapes, structures, etc. to complement the anatomy or shape of a human body or human function).
  • systems, methods, computer-readable media, and techniques disclosed herein for item picking are designed to improve time efficiency, order accuracy, inventory management, scalability, flexibility, cost efficiency, etc. when used with robotics.
  • FIG. 1 shows an example system 100 comprising an example of a picking port 110 and an example of a robotic arm 120.
  • the picking port 110 may be positioned nearby the robotic arm 120 such that the robotic arm 120 may reach one or more locations in the picking port 110.
  • FIG. 1 illustrates an upper-right perspective view of the system 100.
  • the front end of the picking port 110 may be defined as the side at which the robotic arm 120 picks up items from item containers passing through the picking port 110. The items are picked at a picking location at the front end of the picking port 110 and deposited at a dropping location, also at the front end of the picking port 110.
  • the front side, top side, left side, and right side of the picking port 110 may have panels (e.g., plastic, metal, glass, wood, etc.). These panels may help to cover internal components (e.g., machinery) of the picking port 110. In some cases, these panels may provide mounting surfaces for internal components (e.g., cables) of the picking port 110. These panels may help to prevent injury by a human by providing a separation between the human and the internal components of the picking port 110.
  • the rear side of the picking port 110 may be at least partially without panels to allow the entry and exit of item containers to the picking port 110 via the rear side.
  • the rear side of the picking port 110 may comprise one or more entry positions corresponding to one or more input queues and one or more exit positions corresponding to one or more output queues.
  • the one or more input queues and the one or more output queues may be implemented via one or more of: a conveyor system, a chute system, a pusher system, etc. While the entry positions and the exit positions are illustrated as on the rear side, in some cases, the entry and exit positions may be arranged at other locations in the picking port 110, such as to the left and right sides of the picking port 110.
  • the entry position may be configured for connection to another conveyor, such as a storage system conveyor that transports item containers to the entry position of the picking port 110.
  • the storage system conveyor may bring item containers from a storage facility into the picking port 110.
  • the exit position may be configured for connection to another conveyor, for example, a storage or shipping system conveyor that transports item containers from the exit position of the picking port 110.
  • the storage or shipping system conveyor may bring item containers from the picking port 110 to a storage or shipping facility.
  • the picking port 110 may be configured to transport item containers internally within the picking port 110.
  • the item container positioned in the rear left of the picking port 110 may be in a queue to be transported towards the front left of the picking port 110.
  • item containers are transported forward from an entry position in the rear end towards the front end of the picking port 110.
  • the direction of movement changes to transport the item containers backward from the front end of the picking port 110 towards the back end of the picking port.
  • the picking port 110 may be configurable.
  • the picking port 110 may be communicatively coupled to a controller.
  • the controller may be configured to set specifications for controlling picking operations such as the speed of transport for item containers through the picking port 110, reversal of the transport direction through the picking port 110, stop and start functions of the picking port 110, etc.
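The controller specifications listed above (transport speed, direction reversal, stop/start) can be sketched as a small settings object. The class name, field names, and units are illustrative assumptions, not from the patent text.

```python
from dataclasses import dataclass

@dataclass
class PortSpec:
    """Illustrative controller specifications for a picking port:
    transport speed, transport direction, and run state. All names and
    defaults are assumptions for this sketch."""
    transport_speed_mps: float = 0.3   # conveyor speed, meters per second
    direction_forward: bool = True     # True: entry -> picking location
    running: bool = False

    def reverse(self):
        # Reverse the transport direction through the picking port.
        self.direction_forward = not self.direction_forward

    def start(self):
        self.running = True

    def stop(self):
        self.running = False
```

A remote or co-located controller would expose these fields through its control panel, with the display mirroring the current values.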
  • the controller may be remote or co-located with the picking port 110.
  • the controller may correspond to a control panel that may also have a user interface, such as a screen or display configured to display the specifications of the picking port 110 or information about one or more items in the picking port 110.
  • information about the one or more items in the picking port may include the weight of an item or an item container, a size of an item or an item container, a number of items in an item container, an identifier (e.g., barcode, identification number, quick response code, etc.) of an item or an item container, a fragility of an item, a deformability of an item, etc.
  • the controller may be further communicatively coupled to the robotic arm 120. By coupling the controller to the robotic arm 120, operations between the picking port 110 and the robotic arm may be coordinated (e.g., synchronized).
  • the picking port 110 may be configured to hold one or more item containers.
  • Each item container of the one or more item containers may be configured to hold one or more items.
  • while the picking port 110 may be described herein as holding or moving one or more item containers, it should be understood that, in some cases, the picking port 110 may be configured to hold one or more items directly (without the use of item containers).
  • the item containers may be used for storage of items.
  • the item containers may include one or more of: a storage bin (e.g., a plastic storage bin, a metal storage bin, a fabric storage bin, a wooden storage bin, a cardboard storage bin, etc.), a storage box (e.g., a plastic storage box, a metal storage box, a fabric storage box, a wooden storage box, a cardboard storage box, etc.), a case (e.g., a plastic storage case, a metal storage case, a fabric storage case, a wooden storage case, a cardboard storage case, etc.), a storage tote (e.g., a plastic storage tote, a metal storage tote, a fabric storage tote, a wooden storage tote, a cardboard storage tote, etc.), a storage pallet (e.g., a plastic storage pallet, a metal storage pallet, a wooden storage pallet, etc.), or any other container suitable for storage of items.
  • the item containers may be used for shipping of items.
  • the item containers may include one or more of: a shipping bin (e.g., a plastic shipping bin, a metal shipping bin, a fabric shipping bin, a wooden shipping bin, a cardboard shipping bin, etc.), a shipping box (e.g., a plastic shipping box, a metal shipping box, a fabric shipping box, a wooden shipping box, a cardboard shipping box, etc.), a case (e.g., a plastic shipping case, a metal shipping case, a fabric shipping case, a wooden shipping case, a cardboard shipping case, etc.), a shipping tote (e.g., a plastic shipping tote, a metal shipping tote, a fabric shipping tote, a wooden shipping tote, a cardboard shipping tote, etc.), a shipping pallet (e.g., a plastic shipping pallet, a metal shipping pallet, a wooden shipping pallet, etc.), or any other container suitable for shipping of items.
  • the picking port 110 includes one or more locations that are configured to receive an item picking container.
  • the one or more locations may include a picking location and a dropping location.
  • the picking location may be the location at which an item is picked up by the robotic arm 120 from the item’s corresponding item picking container.
  • the dropping location may be the location at which an item is dropped (e.g., placed) by the robotic arm 120 into the item’s corresponding item dropping container.
  • the one or more locations may be discrete with respect to one another.
  • the one or more locations may be continuous with respect to one another (e.g., positions on a conveyor).
  • the picking port 110 may include one or more queues.
  • the one or more queues may comprise one or more locations of the picking port 110.
  • the picking port 110 has multiple queues, each with at least one location.
  • the queues may transport the items or the item containers to or from a location of the picking port 110.
  • the queues may comprise one or more of a conveyor, a chute, a pusher, or any other device used to move (e.g., via gravitational force, mechanical force, electrical force, etc.) objects from one location to another.
  • the robotic arm 120 may be configured to pick up an item from a picking location of the picking port 110 and place the item in a dropping location of the picking port 110. While illustrated as being separate from the picking port 110, in some cases, the robotic arm 120 may be integrated into the picking port 110. For example, the robotic arm 120 may be built directly into the picking port 110. In some cases, a device other than a robotic arm may be used to move items between locations (e.g., a picking location and a dropping location) in the picking port 110.
  • FIG. 2A shows a perspective view of an example of a picking port. The picking port of FIG. 2A may be the same as or similar to the picking port 110 of FIG. 1.
  • FIG. 2B shows a top-down view of rollers of the example of the picking port of FIG. 2A.
  • the rollers of FIG. 2B may be included in conveyors.
  • the picking port of FIG. 2A comprises four queues, each with three positions, for a total of twelve positions.
  • FIGs. 2C and 2D provide additional views of the picking port of FIG. 2A.
  • the picking port of FIGs. 2A-2D may be a single-input, single-output picking port, meaning the picking port has one input picking queue, one output picking queue, one input dropping queue, and one output dropping queue.
  • FIG. 3 shows an example system 300 comprising an example of a picking port 310, an example of a robotic arm 320, and an example of a tool changer 330.
  • the picking port 310 and the robotic arm 320 of FIG. 3 may be the same as or similar to the picking port 110 and the robotic arm 120, respectively.
  • the picking port 310 further comprises the tool changer 330, as illustrated. Although illustrated as mounted on the picking port 310, the tool changer may be located in any number of locations (e.g., on the robotic arm 320, separate from the robotic arm 320 and separate from the picking port 310, etc.).
  • the tool changer 330 may be reachable by the robotic arm 320.
  • the tool changer 330 may be configured to provide the robotic arm 320 with various tools (e.g., grippers) for handling items in the picking port 310.
  • the robotic arm 320 may change tools using the tool changer 330.
  • the tool changer 330 may provide the robotic arm 320 with different tools depending on the weight of an item, the size of an item, the material of an item, the fragility of an item, the deformability of an item, etc.
  • the picking port 310 or the robotic arm 320 may include a controller.
  • the controller may be configured to control the picking port 310 or the robotic arm 320.
  • the controller may cause the robotic arm 320 to pick up and move items in the picking port 310 according to an order that reduces the number of tool changes using the tool changer 330.
  • the picking port 310 may sequentially move a first subset of items in the picking port 310 to a picking location (e.g., via using a buffer in the picking port 310) such that the first subset of items may be moved (e.g., sequentially) by the robotic arm 320 without the robotic arm changing tools using the tool changer 330.
  • the robotic arm 320 may reach into the picking port 310 to pick up (e.g., sequentially) a first subset of items in the picking port 310 such that the first subset of items may be moved by the robotic arm 320 without the robotic arm changing tools using the tool changer 330. Then, once the first subset of items in the picking port 310 are moved by the robotic arm 320, the robotic arm may perform a tool change using the tool changer to prepare to move a second subset of items in the picking port 310. Accordingly, the systems, the methods, the computer-readable media, and the techniques disclosed herein may improve efficiency and speed of item picking via reducing a number of tool changes performed while picking items.
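The tool-change reduction described above can be sketched as a simple scheduling routine. This is a minimal illustration, not from the source: the item dictionaries, the `tool` field, and the greedy group-by-tool strategy are all assumptions.

```python
def order_picks_to_reduce_tool_changes(items):
    """Greedy sketch: sort queued picks so items needing the same
    tool (e.g., the same gripper) become adjacent in the sequence."""
    return sorted(items, key=lambda item: item["tool"])

def count_tool_changes(sequence):
    """Count how many times the arm must visit the tool changer."""
    tools = [item["tool"] for item in sequence]
    return sum(1 for i in range(1, len(tools)) if tools[i] != tools[i - 1])

queue = [
    {"sku": "A", "tool": "suction"},
    {"sku": "B", "tool": "claw"},
    {"sku": "C", "tool": "suction"},
    {"sku": "D", "tool": "claw"},
]
print(count_tool_changes(queue))  # 3 tool changes as queued
print(count_tool_changes(order_picks_to_reduce_tool_changes(queue)))  # 1 after grouping
```

In practice, the buffer depth of the input picking queues limits how far picks can be reordered; this sketch ignores that constraint.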
  • FIG. 4A shows an example top-down view 400A of an example of a single-input, single-output picking port with an example of a robotic arm. Similar to FIG. 2B, the picking port of view 400A has four queues, each with three locations, for a total of twelve locations. Locations 401, 402, and 403 are included in an input picking queue. Locations 404, 405, and 406 are included in an output picking queue. Locations 411, 412, and 413 are included in an input dropping queue. Locations 414, 415, and 416 are included in an output dropping queue.
  • location 404 is a picking location and location 414 is a dropping location.
  • the robotic arm 420 may be configured to pick up one or more items included in an item picking container 451 that is positioned at the picking location 404 and move the one or more items to an item dropping container 460 positioned at the dropping location 414.
  • the item picking container 451 moves to the location 405 and the item picking container 450 moves from the location 403 to the picking location 404.
  • the robotic arm may now pick up one or more items from the item picking container 450.
  • the locations 403 and 413 may be buffers for the picking location 404 and the dropping location 414, respectively.
  • an item picking container may move through the picking port from the input picking queue to the output picking queue.
  • an item picking container may: (A) enter (e.g., from another conveyor system) the input picking queue (and the picking port) at the location 401; (B) move from the location 401 to the location 402; (C) move from the location 402 to the location 403; (D) move from the location 403 to the location 404 (and the output picking queue), where the robotic arm 420 may pick an item out of the item picking container; (E) move from the location 404 to the location 405; and move from the location 405 to the location 406, thereby exiting (e.g., to another conveyor system) the output picking queue (and the picking port).
  • an item picking container may include one or more items when the item picking container is in the input picking queue (the locations 401-403) and may be empty when the item picking container is in the output picking queue (the locations 404-406).
  • any number of operations of the one or more operations disclosed above with respect to one or more of operations (A)-(E) may be added or removed. Further, the one or more operations (A)-(E) may be performed in any order. Further, at least one of the one or more operations (A)-(E) may be repeated, e.g., iteratively.
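The container flow of steps (A)-(E) above can be sketched as a short simulation. The location numbers follow FIG. 4A; the dictionary representation and the `advance` helper are illustrative assumptions, not from the source.

```python
# Path of an item picking container: input picking queue (401-403),
# picking location (404), output picking queue (405-406).
PATH = [401, 402, 403, 404, 405, 406]
PICKING_LOCATION = 404

def advance(container, picked):
    """Move a container one location forward; the arm picks at 404.
    Returns the container, or None once it exits the picking port."""
    idx = PATH.index(container["location"])
    if idx + 1 == len(PATH):
        return None  # (E) exit, e.g., to another conveyor system
    container["location"] = PATH[idx + 1]
    if container["location"] == PICKING_LOCATION and container["items"]:
        picked.append(container["items"].pop())  # (D) robotic arm picks an item
    return container

tote = {"location": 401, "items": ["toothpaste"]}
picked = []
while tote is not None:
    tote = advance(tote, picked)
print(picked)  # ['toothpaste']
```

The dropping-queue flow of the next bullet is symmetric, with the arm placing rather than removing an item at the dropping location.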
  • an item dropping container may move through the picking port from the input dropping queue to the output dropping queue.
  • an item dropping container may: (A) enter (e.g., from another conveyor system) the input dropping queue (and the picking port) at the location 411; (B) move from the location 411 to the location 412; (C) move from the location 412 to the location 413; (D) move from the location 413 to the location 414 (and the output dropping queue), where the robotic arm 420 may drop an item into the item dropping container; and (E) move from the location 414 to the location 415; and move from the location 415 to the location 416, thereby exiting (e.g., to another conveyor system) the output dropping queue (and the picking port).
  • an item dropping container may be empty when the item dropping container is in the input dropping queue (the locations 411-413) and may include one or more items when the item dropping container is in the output dropping queue (the locations 414-416).
  • any number of operations of the one or more operations disclosed above with respect to one or more of operations (A)-(E) may be added or removed. Further, the one or more operations (A)-(E) may be performed in any order. Further, at least one of the one or more operations (A)-(E) may be repeated, e.g., iteratively.
  • the picking port may include one or more sensors. As illustrated in FIG. 4A, the picking location 404 and the dropping location 414 have a weight sensor 440 and a weight sensor 441, respectively. In some cases, the weight sensors 440 and 441 may be scales. In some cases, the weight sensors 440 and 441 may be configured to determine a weight of one or more items (and, possibly, the item container) in an item container positioned on the weight sensors 440 and 441.
  • the weight sensor 440 may determine a first weight that includes the weight of the item picking container 451 and the one or more items included therein. After the robotic arm 420 picks the one or more items out from the item picking container 451, the weight sensor 440 may determine a second weight that is the weight of the item picking container 451 without the one or more items included therein. Accordingly, based at least in part on the difference between the first weight and the second weight, the weight of the one or more items may be determined.
  • the weight of the one or more items may be compared against known weights of the one or more items (e.g., from a database). If, for example, the weight of the one or more items differs from the known weights (e.g., by more than a certain tolerance), an alert may be generated. For example, a discrepancy between the weight of the one or more items and the known weights may imply that the one or more items picked up by the robotic arm 420 are not the items corresponding to a label on the item picking container (e.g., the wrong item was picked up). In another example, such a discrepancy may imply that at least one item or at least one component of an item was not picked up by the robotic arm 420.
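The weight check described above can be sketched as follows. The function name and the tolerance value are assumptions for illustration; the units only need to be consistent across the three weights.

```python
def verify_pick_weight(weight_before, weight_after, expected_item_weight,
                       tolerance=0.05):
    """Compare the weight removed from an item picking container (first
    weighing minus second weighing) against the known item weight from
    a database. Returns an alert string if the difference exceeds the
    tolerance, else None."""
    measured = weight_before - weight_after
    if abs(measured - expected_item_weight) > tolerance:
        return (f"alert: measured {measured:.3f} vs expected "
                f"{expected_item_weight:.3f} (wrong or incomplete pick?)")
    return None

# Container weighed 1.50 kg with the item and 0.50 kg after the pick;
# the database says the item weighs 1.00 kg, so no alert is raised.
print(verify_pick_weight(1.50, 0.50, 1.00))  # None
# Only 0.80 kg was removed: a component may have been left behind.
print(verify_pick_weight(1.30, 0.50, 1.00))
```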
  • FIG. 4B shows an example implementation of various sensors in an example top-down view 400B of the example single-input, single-output picking port of FIG. 4A. Accordingly, FIG. 4B may be similar to FIG. 4A, but FIG. 4B also depicts various sensors positioned at locations in the picking port. These sensors may be used to position and track item containers in the picking port. For example, these sensors may include identifier readers, such as barcode readers. Barcode readers may read barcodes on item picking containers to identify information about one or more items in the item picking containers, such as item weight, item size, item quantity, item name, item price, item materials, item fragility, item deformability, etc. As also disclosed herein, these sensors may include weight sensors.
  • Laser curtains may be used, for example, to detect when an item (or an item container) is lying outside a certain boundary. For example, if an item (or an item container) does not fit entirely within a single location in the picking port, an alert may be generated. In another example, if an item is sticking (e.g., partially) outside its item container, an alert may be generated.
  • Cameras (e.g., an array of cameras) may be used to determine if certain alert conditions are satisfied, for example, if an item container has tipped over, an item has fallen out of an item container, an item container is overflowing, an item or item container is stuck, etc.
  • the picking port of the systems, the methods, the computer-readable media, and the techniques disclosed herein may use one or more of cameras, laser curtains, weight sensors, barcode readers, or the like to detect if an alarm condition is satisfied. In some cases, all of cameras, laser curtains, weight sensors, and barcode readers are used to detect if an alarm condition is satisfied.
  • FIG. 5 shows an example top-down view 500 of an example of a dual-input, single-output picking port with an example of a robotic arm 520.
  • the picking port of FIG. 5 may be the same as or similar to the picking port of FIGs. 4A and 4B, at least in certain respects.
  • the picking port of FIG. 5 has two input picking queues.
  • the first input picking queue comprises locations 501, 502, and 503.
  • the second input picking queue comprises locations 504, 505, and 506.
  • the output picking queue comprises locations 507, 508, and 509.
  • the picking port of FIG. 5 has a single input dropping queue and a single output dropping queue.
  • the input dropping queue comprises locations 511, 512, and 513.
  • the output dropping queue comprises locations 514 and 515.
  • the location 507 is a picking location and the location 513 is a dropping location.
  • the robotic arm 520 may be configured to pick up items from an item picking container positioned at the location 507 and drop the items in an item dropping container positioned at the location 513.
  • item picking container 554 is in the picking location 507 and the item dropping container 561 is in the dropping location 513.
  • the item picking container 554 may be moved to the location 508. Then, either item picking container 552 or item picking container 553 may be moved to the picking location 507.
  • the locations 503, 505, and 506 may be buffers for the picking location 507.
  • the locations 512 and 515 may be buffers for the dropping location 513.
  • the picking port of FIG. 5 also includes weight sensors 540 and 541.
  • the weight sensors 540 and 541 can be used to determine the weight of objects (e.g., items or item containers) at their corresponding location.
  • weight sensors may be placed in additional locations, such as the location 505, the location 503, the location 506, etc. to determine the weight of their corresponding objects. This may be useful in determining which item container to move to the picking location 507 next. For example, it may be desirable to load items from multiple different item picking containers into a single item dropping container. In this example, it may be desirable to load the items into the item dropping container from heaviest to lightest, e.g., to avoid crushing lighter items.
  • one example process may include: (A) moving a first item from the item picking container 554 into the item dropping container 561, where the first item is 6 kilograms; (B) moving the item picking container 554 to the location 508; (C) moving item picking container 553 to the picking location 507; (D) moving a second item from the item picking container 553 into the item dropping container 561, where the second item is 4 kilograms; (E) moving the item picking container 554 to the location 509 and moving the item picking container 553 to the location 508; (F) moving item picking container 552 to the picking location 507; and (G) moving a third item and then a fourth item from the item picking container 552 into the item dropping container 561, wherein the third item is 2 kilograms and the fourth item is 1 kilogram.
  • the first item, the second item, the third item, and the fourth item were moved by the robotic arm 520 into the item dropping container 561 in descending order of weight.
  • a plurality of weight sensors e.g., the weight sensors 540 and 541 at various locations in the picking port may be used.
  • weights of items may be accessed from a database comprising item information.
  • achieving this descending order of the above illustrative example may include taking advantage of the additional flexibility offered by the dual-input, single-output picking port that has the first input picking queue and the second input picking queue.
  • any number of operations of the one or more operations disclosed above with respect to one or more of operations (A)-(G) may be added or removed. Further, the one or more operations (A)-(G) may be performed in any order. Further, at least one of the one or more operations (A)-(G) may be repeated, e.g., iteratively.
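The heaviest-first ordering enabled by two input picking queues can be sketched with a greedy selection rule: at each step, advance whichever queue's head container holds the heavier item. This is a sketch under the assumption that each queue is represented as a head-first list of item weights (e.g., in kilograms) known from weight sensors or a database; it only guarantees a fully descending output when each queue is itself heaviest-first, as in operations (A)-(G).

```python
def heaviest_first_order(queue_a, queue_b):
    """Merge two input picking queues (head-first lists of item
    weights) by always taking the heavier head, so items reach the
    item dropping container in descending weight where possible."""
    a, b = list(queue_a), list(queue_b)
    order = []
    while a or b:
        if not b or (a and a[0] >= b[0]):
            order.append(a.pop(0))
        else:
            order.append(b.pop(0))
    return order

# First queue holds 6 kg and 2 kg items; second holds 4 kg and 1 kg.
print(heaviest_first_order([6, 2], [4, 1]))  # [6, 4, 2, 1]
```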
  • a picking port may have three input picking queues, four input picking queues, five input picking queues, six input picking queues, seven input picking queues, eight input picking queues, nine input picking queues, about ten input picking queues, about twenty input picking queues, about thirty input picking queues, about forty input picking queues, about fifty input picking queues, etc.
  • two or more of the input picking queues may run parallel to each other (as illustrated with the two parallel input picking queues of FIG. 5).
  • two or more of the input picking queues may run orthogonal to each other (e.g., a first input picking queue feeding to a picking location from the left (e.g., via a conveyor), a second input picking queue feeding to the picking location from the front, a third input picking queue feeding to the picking location from the back, a fourth input picking queue feeding to the picking location from above (e.g., via a chute), etc.).
  • certain locations can be used as buffers and input picking containers can be advanced through either the first input picking queue or the second input picking queue - thereby enabling control over ordering of items moved to an item dropping container.
  • items may be moved to an item dropping container based on their order in the single input picking queue.
  • this control over ordering of items may increase speed and efficiency of the picking port.
  • item picking containers may be provided to the picking location in an order that reduces (e.g., minimizes) tool changes (e.g., gripper changes) for the robotic arm 520.
  • this control over ordering of items may enable items to be packed in an order based on size to improve (e.g., maximize) packing density of items in the item dropping container.
  • this control over ordering of items may enable items to be packed in a manner that sorts them based on item type (e.g., a first type of item packed in a first item dropping container and a second type of item packed in a second item dropping container). This improved control over ordering of items may further advantageously result in having fewer broken or crushed items due to more effective weight-based item packing.
  • a picking port may have more than one output picking queue, input dropping queue, or output dropping queue. Additional output picking queues, input dropping queues, or output dropping queues may be implemented in the same or similar manner as disclosed with respect to having additional input picking queues.
  • a picking port may have more than one picking location. For example, a picking port may have two, three, four, five, six, seven, eight, nine, about ten, about twenty, about thirty, about forty, about fifty, etc. picking locations. Similar to advantages with having multiple queues (e.g., multiple input picking queues), having multiple picking locations may advantageously enable picking control over ordering of items moved to an item dropping container.
  • a picking port may have multiple dropping locations.
  • a picking port may have more than one robotic arm.
  • a picking port may have two, three, four, five, six, seven, eight, nine, about ten, about twenty, about thirty, about forty, about fifty, etc. robotic arms. Similar to advantages with having multiple queues (e.g., multiple input picking queues), having multiple robotic arms may advantageously enable picking control over ordering of items moved to an item dropping container.
  • While illustrated as robotic arms, it should be understood that other devices (e.g., a pusher, a conveyor, etc.) may be used in addition to or as an alternative to robotic arms (e.g., having a scooper, claw, fingers, magnet, etc.) to move items from a picking location to a dropping location. In some cases, a human may move items from a picking location to a dropping location.
  • FIG. 6 shows an example of a controller diagram 600 for implementing robotic item picking.
  • the diagram 600 includes a warehouse management system (WMS) 605, a conveyor controller 610, a port controller 615, and a programmable logic controller (PLC) 620.
  • the WMS 605 operates at a high level, controlling orders and order lines (an order line is a single position in an order, e.g., two tubes of toothpaste).
  • the WMS 605 monitors which orders are filled or packed and which orders have yet to be filled or packed.
  • the conveyor controller 610 controls conveyor systems that move item containers (or items) in and out of a picking port. For example, the conveyor controller 610 controls the feeding of item containers into an input picking queue, out of an output picking queue, into an input dropping queue, and out of an output dropping queue. In cases with multiple input picking queues (e.g., as illustrated with respect to FIG. 5), the conveyor controller 610 controls moving item containers into each of the input picking queues. The conveyor controller 610 may schedule sending and receiving item containers to and from the picking port.
  • the port controller 615 controls a picking port that receives and sends item containers (or items) to and from conveyors (or other devices, such as chutes) controlled by the conveyor controller 610. More specifically, the port controller 615 controls the movement of item containers (or items) through the picking port. For example, the port controller 615 may manage item container flow into and out of the picking port as well as flow within the picking port (e.g., within a queue, between a picking location and a dropping location, etc.). Further, in some cases, the port controller 615 acknowledges orders and order lines to the WMS 605.
  • the port controller 615 cooperates with the PLC 620.
  • the PLC 620 may manage hardware equipment. Accordingly, the PLC 620 may manage physical item container (or item) movements based at least in part on requests from the port controller 615.
  • One example process depicting communication between the controllers of the controller diagram 600 may include: (A) the WMS 605 indicates to the port controller 615 that an order is available; (B) in response, the port controller 615 requests an item dropping container for an input dropping queue of a picking port; (C) the conveyor controller 610 causes the item dropping container to be delivered to the input dropping queue of the picking port and confirms delivery; (D) the port controller 615 causes the item dropping container to move to a buffer location within the picking port; (E) the port controller 615 performs a first error check by checking an identifier (e.g., a barcode) of the item dropping container against an expected identifier from the WMS 605; (F) the port controller 615 causes the item dropping container to move to a picking position; (G) the port controller 615 receives a first weight of the item dropping container; (H) the port controller 615 causes an item to be placed into the item dropping container; (I) the port controller 615 receives a second weight of an item picking container
  • any number of operations of the one or more operations disclosed above with respect to one or more of operations (A)-(Q) may be added or removed. Further, the one or more operations (A)-(Q) may be performed in any order. Further, at least one of the one or more operations (A)-(Q) may be repeated, e.g., iteratively. Further, in some cases, one or more operations performed by the port controller 615 may be performed at least in part (e.g., collaboratively) with the PLC 620.
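A few of the hand-offs in the process above can be sketched as follows. The class and method names are hypothetical, not from the source; the sketch covers only the container request, the delivery confirmation, and the first identifier check (roughly operations (B), (C), and (E)).

```python
class ConveyorController:
    """Stands in for the conveyor controller 610."""
    def deliver_dropping_container(self, barcode):
        # (C) deliver the item dropping container and confirm delivery
        return {"barcode": barcode, "delivered": True}

class PortController:
    """Stands in for the port controller 615."""
    def __init__(self, conveyor):
        self.conveyor = conveyor
        self.log = []

    def handle_order(self, expected_barcode):
        # (B) request an item dropping container for the input dropping queue
        container = self.conveyor.deliver_dropping_container(expected_barcode)
        self.log.append("delivered")
        # (E) first error check: scanned identifier vs. expected identifier
        if container["barcode"] != expected_barcode:
            self.log.append("identifier mismatch")
            return False
        self.log.append("identifier verified")
        return True

port = PortController(ConveyorController())
print(port.handle_order("TOTE-0001"))  # True
```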
  • items may be moved from item picking containers to item dropping containers.
  • A) one or more first items of a first item picking container are transferred to a first item dropping container;
  • B) one or more second items of a second item picking container are transferred to a second item dropping container;
  • C) one or more third items of a third item picking container are transferred to a third item dropping container (and so on, for example).
  • the one or more first items, the one or more second items, the one or more third items, etc. may be transferred in order to change the type of containers they are placed in. For example, the first item picking container, the second item picking container, the third item picking container, etc. may be containers used within a warehouse (e.g., storage containers), whereas the first item dropping container, the second item dropping container, the third item dropping container, etc. may be containers used outside a warehouse (e.g., shipping containers).
  • the one or more first items may be the same items as the one or more second items. In such case, this may be done to consolidate items.
  • the first item picking container and the second item picking container may each be only partially filled, and, in performing the operations of the second example application, the one or more first items and the one or more second items may be consolidated into the first item dropping container.
  • the first item picking container, the second item picking container, and the first item dropping container may all be containers of the same type (e.g., warehouse storage containers).
  • Contemplate, for example, that the first item picking container may be 30% filled with basketballs and the second item picking container may be 50% filled with basketballs.
  • This second example application may include transferring the basketballs from both the first item picking container and the second item picking container into the first item dropping container, thereby filling the first item dropping container to 80% with basketballs.
  • This first item dropping container can then be reshelved at the warehouse, taking up less space than the combination of the first item picking container and the second item picking container, thereby increasing storage density and efficiency at the warehouse.
  • this first item dropping container may be a shipping container shipped to a customer who ordered an especially large number of basketballs.
  • This second example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5.
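The consolidation arithmetic of this second example can be sketched as a first-fit pass over container fill levels (expressed as fractions of container capacity). The greedy strategy and function name are illustrative assumptions, not from the source.

```python
def consolidate(fill_levels, capacity=1.0):
    """First-fit sketch: combine partially filled item picking
    containers into as few item dropping containers as possible.
    Returns the resulting fill level of each dropping container."""
    bins = []
    for fill in sorted(fill_levels, reverse=True):
        for b in bins:
            if b["fill"] + fill <= capacity:
                b["fill"] += fill
                break
        else:
            bins.append({"fill": fill})
    return [round(b["fill"], 2) for b in bins]

# A 30%-full and a 50%-full container of basketballs consolidate into
# a single 80%-full container, freeing a shelf slot in the warehouse.
print(consolidate([0.3, 0.5]))  # [0.8]
```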
  • In a third example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; (B) one or more second items of a second item picking container are transferred to the first item dropping container (and so on, for example).
  • the one or more first items may be different than the one or more second items. In such case, this may be done to prepare items for shipment.
  • the first item picking container and the second item picking container may each be warehouse storage containers and the first item dropping container may be a shipping container.
  • Contemplate, for example, that a customer orders a basketball and two pairs of socks.
  • the first item picking container may include a large number of basketballs and the second item picking container may include a large number of socks.
  • This third example application may include transferring one basketball from the first item picking container and two pairs of socks from the second item picking container into the first item dropping container for shipping to the customer.
  • This third example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5.
  • In a fourth example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; (B) one or more second items of the first item picking container are transferred to a second item dropping container (and so on, for example).
  • the one or more first items may be the same as the one or more second items. In such case, this may be done to prepare items for shipment.
  • the first item picking container may be a warehouse storage container and each of the first item dropping container and the second item dropping container may be a shipping container.
  • Contemplate, for example, two customers who each order a basketball.
  • the first item picking container may include a large number of basketballs.
  • This fourth example application may include transferring one basketball from the first item picking container into the first item dropping container for shipping to the first customer and transferring one basketball from the first item picking container into the second item dropping container for shipping to the second customer.
  • This fourth example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5.
  • this fourth example application may be enabled by taking advantage of dropping control over ordering of items that is achieved using a dual output picking port (e.g., a picking port having at least two input dropping queues; not illustrated).
  • In a fifth example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; (B) one or more second items of the first item picking container are transferred to a second item dropping container (and so on, for example).
  • the one or more first items may be different than the one or more second items. In such case, this may be done to sort items (e.g., by type, weight, size, material, price, quality, etc.).
  • the first item picking container may be a warehouse storage container with various different types of items mixed inside. Contemplate, for example, that the first item picking container has a mixture of basketballs and socks.
  • This fifth example application may include transferring the basketballs from the first item picking container into the first item dropping container and transferring the socks from the first item picking container into the second item dropping container. Determining which items are basketballs and which are socks may use, for example, a camera system, identifier readers, or weight sensors (each, e.g., as disclosed herein).
  • This fifth example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5. Furthermore, in some cases, this fifth example application may be enabled by taking advantage of dropping control over ordering of items that is achieved using a dual output picking port (e.g., a picking port having at least two input dropping queues; not illustrated).
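The type-based sorting of this fifth example can be sketched as routing each item of a mixed container to a per-type dropping container. The weight-threshold classifier here is a stand-in for the camera system, identifier readers, or weight sensors mentioned above; the threshold and the item representation are assumptions for illustration.

```python
def sort_mixed_container(items, type_of):
    """Route each item from one mixed item picking container into a
    per-type item dropping container, keyed by the classifier result."""
    dropping = {}
    for item in items:
        dropping.setdefault(type_of(item), []).append(item)
    return dropping

# Classify by weight: anything over 0.5 kg is treated as a basketball
# (an illustrative threshold, not from the source).
mixed = [{"w": 0.62}, {"w": 0.10}, {"w": 0.60}, {"w": 0.08}]
by_type = sort_mixed_container(
    mixed, lambda i: "basketball" if i["w"] > 0.5 else "socks")
print({k: len(v) for k, v in by_type.items()})  # {'basketball': 2, 'socks': 2}
```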
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein for automated picking may cooperate with one or more other technologies in a warehouse or storage facility.
  • these one or more other technologies may include robotic technologies such as robotic arms, conveyors, chutes, pushers, item tilters, etc.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may sort, handle, pick, place, or otherwise manipulate one or more objects of a plurality of objects.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may replace tasks which may be performed manually or only in a semi-automated fashion.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may be integrated with machine learning software, such that human involvement may be completely removed over time.
  • the systems, the methods, the computer- readable media, and the techniques disclosed herein are used in analyzing and packing one or more items in a container or package.
  • the container or package is a box.
  • a surveillance system determines if human intervention is needed for one or more tasks.
  • Robotics such as a robotic arm or other automated manipulators, may be used for applications involving picking up or moving objects.
  • Picking up and moving objects may involve picking an object from a picking source location and placing it at a dropping location.
  • a robotic device may be used to fill a container with objects, create a stack of objects, unload objects from a truck bed, move objects to various locations in a warehouse, and transport objects to one or more target locations.
  • the objects may be of the same type.
  • the objects may comprise a mix of different types of objects, varying in size, mass, material, etc.
  • Robotics may direct a robotic arm to pick up objects based on predetermined knowledge of where objects are in the environment.
  • an item manipulation may include using a device or apparatus for reorientation of items or objects.
  • the device for re-orienting an object or item may be referred to herein as an item tilter.
  • an item tilter is used in conjunction with one or more robotic arms.
  • the item tilter may reorient an object, such that it can be properly handled by a robotic arm.
  • an item tilter is provided to properly orient an object prior to placement within a container or box.
  • the item tilter may facilitate proper packing of the container or box to maximize the number of items the container may hold or minimize the additional packing/stuffing materials required for shipping of the items within the container.
  • a database is provided containing information related to products being handled by automated systems of a facility.
  • a database comprises information of how each product or object in an inventory should be handled or manipulated by the item tilter and/or robotic arms.
  • a machine learning process dictates and improves upon the handling of a specific product or object.
  • the machine learning is trained by observation and repetition of a specific product or object being handled by a robot or automated handling system.
  • the machine learning is trained by observation of a human interaction with a specific object or product.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may cooperate with a robotic arm (e.g., as illustrated in FIG. 1 as the robotic arm 120, as illustrated in FIG. 3 as the robotic arm 320, as illustrated in FIG. 4A as 420, or as illustrated in FIG. 5 as 520).
  • a robotic arm is a type of mechanical arm that may be used in various applications including, for example, automotive, agriculture, scientific, manufacturing, construction, etc.
  • Robotic arms may be programmable and may be able to perform functions similar to those of a human arm. While robotic arms may be reliable and accurate, oftentimes they may be taught to only perform narrowly defined tasks, such as picking a specific type of object from a specific location with a specific orientation.
  • Robotic arms are oftentimes programmed to automate execution of repetitive tasks, such as applying paint to equipment, moving goods in warehouses, harvesting crops in a farm field, etc.
  • Robotic arms may comprise manipulator links that are connected by joints enabling either rotational motion (such as in an articulated robot) or translational (linear) displacement.
  • one or more robotic manipulators of the systems, the methods, the computer- readable media, and the techniques comprise robotic arms.
  • a robotic arm comprises one or more of robot joints connecting a robot base and an end effector receiver or end effector.
  • a base joint may be configured to rotate the robot arm around a base axis.
  • a shoulder joint may be configured to rotate the robot arm around a shoulder axis.
  • An elbow joint may be configured to rotate the robot arm about an elbow axis.
  • a wrist joint may be configured to rotate the robot arm around a wrist axis.
  • a robot arm may be a six-axis robot arm with six degrees of freedom.
  • a robot arm may comprise fewer or more robot joints and may have fewer than six degrees of freedom.
  • a robot arm may be operatively connected to a controller.
  • the controller may comprise an interface device enabling connection and programming of the robot arm.
  • the controller may comprise a computing device comprising a processor and software or a computer program installed thereon.
  • the computing device may be provided as an external device.
  • the computing device may be integrated into the robot arm.
  • the robotic arm can implement a wiggle movement.
  • the robotic arm may wiggle an object to help segment the box from its surroundings.
  • the robotic arm may employ a wiggle motion in order to create a firm seal against the object.
  • a wiggle motion may be utilized if the system detects that more than one object has been unintentionally handled by the robotic arm.
  • the robotic arm may release and re-grasp an object at another location if the system detects that more than one object has been unintentionally handled by the robotic arm.
  • various end effectors of a robotic arm may comprise grippers, vacuum grippers, magnetic grippers, etc.
  • the robotic arm may be equipped with an end effector, such as a suction gripper.
  • the gripper includes one or more suction valves that can be turned on or off either by remote sensing, single point distance measurement, or by detecting whether suction is achieved.
  • an end effector may include an articulated extension.
  • the suction grippers of a robotic arm are configured to monitor a vacuum pressure to determine if a complete seal against a surface of an object is achieved. Upon determination of a complete seal, the vacuum mechanism may be automatically shut off as the robotic manipulator continues to handle the object.
  • sections of suction end effectors may comprise a plurality of folds along a flexible portion of the end effector (i.e., bellows or accordion-style folds) such that sections of the vacuum end effector can fold down to conform to the surface being gripped.
  • suction grippers comprise a soft or flexible pad to place against a surface of an object, such that the pad conforms to said surface.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein comprises a plurality of end effectors to be received by the robotic arm.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein comprise one or more end effector stages to provide a plurality of end effectors.
  • Robotic arms may comprise one or more end effector receivers to allow the end effectors to removably attach to the robotic arm.
  • End effectors may include single suction grippers, multiple suction grippers, area grippers, finger grippers, and other end effectors.
  • an end effector is selected to handle an object based on analysis of one or more images captured by one or more image sensors, as disclosed herein.
  • the one or more image sensors are cameras.
  • an image sensor is placed before a robotic handler or arm.
  • the image sensor is in operative communication with a robotic handling system, which resides downstream from the image sensor.
  • the image sensor determines which product type is on the way or will arrive at the robotic handling system next. Based on the determination of the product, the robotic handling system may select and attach the appropriate end effector to handle the specific product type. Determination of a product type prior to the product reaching the handling station may improve efficiency of the system.
  • an end effector is selected to handle an object based on information received by optical sensors scanning a machine-readable code located on the object. In some cases, an end effector is selected to handle an object based on information received from a product database, as disclosed herein.
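The end-effector selection step described above can be sketched as a lookup over product attributes. The attribute names and thresholds below are illustrative assumptions; a real system would draw them from the product database or image analysis disclosed herein.

```python
def select_end_effector(product):
    """Choose an end effector from simple product attributes.

    `product` is a dict with hypothetical keys; the thresholds
    are illustrative, not taken from the disclosure.
    """
    if product.get("ferromagnetic"):
        return "magnetic_gripper"
    # Smooth, light objects are good candidates for suction.
    if product["surface"] == "smooth" and product["weight_kg"] <= 3.0:
        return "suction_gripper"
    # Fall back to a mechanical finger gripper otherwise.
    return "finger_gripper"
```

Determining the product type upstream of the handling station, as described above, lets the robot swap to the selected end effector before the product arrives.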
  • a conveyor is a common piece of mechanical handling equipment that may move materials from one location to another.
  • Many kinds of conveying systems are available and are used according to the various needs of different industries.
  • chain conveyors may include enclosed tracks, I-Beam, towline, power & free, and hand pushed trolleys.
  • Conveyors may offer several advantages, including increased efficiency, versatility, and cost-effectiveness. While conveyors are widely used and may offer numerous advantages, they also have certain limitations and shortcomings. For example, conveyors operate along a fixed path, which means they may not be suitable for applications that require flexible routing or changes in the material flow direction. Adding flexibility to the system may use additional complex mechanisms or multiple conveyor lines.

Examples of Chutes
  • a chute is a vertical or inclined plane, channel, or passage through which objects are moved by means of gravity.
  • Chutes are commonly used in various industries for bulk material handling, allowing the controlled transfer of granular or bulky materials from higher to lower levels or between different processing stages. The design of chutes depends on the specific application and the characteristics of the materials being handled.
  • the entry section of the chute is where the material is introduced into the chute from a higher elevation or conveyor. This section is designed to accommodate the flow of material smoothly and prevent any spillage or blockages.
  • Chutes may include features like baffles or flow control gates to regulate the speed and flow of materials through the chute. These features can help prevent material surges and ensure a steady flow.
  • the exit section of the chute is where the material discharges onto the lower level or conveyor.
  • Chutes may be suited for free-flowing, granular, or bulk materials. Chutes may be less suited for handling cohesive materials, sticky substances, or materials with irregular shapes, as this can lead to blockages and flow issues.
  • the material flow in chutes can result in impact forces, potentially leading to material degradation or fines generation.
  • in steeply inclined chutes, there may be limitations in controlling the material flow, leading to faster material acceleration and potentially causing material surges or damage to the chute.
  • a pusher, in the context of material handling and logistics, refers to a mechanical device or component used to move items or products along a conveyor system or through a production line.
  • the primary function of a pusher is to apply force to push or divert items from one conveyor lane or processing stage to another.
  • Pushers may be used in conveyor systems and automated manufacturing processes to perform tasks. Pushers may be used to divert products from the main conveyor line to specific side lanes or different processing stages. This enables the sorting and distribution of items based on certain criteria, such as destination, size, or product type. Pushers may be employed in sorting systems to direct items to different designated destinations or shipping lanes based on predetermined criteria.
  • Pushers may be used to stage items or products for further processing or packaging. Pushers can transfer products between conveyors or equipment in a production line, facilitating the smooth flow of materials. At very high speeds, pushers may not have enough time to properly engage with and push items, leading to sorting or diverting errors.
  • An item tilter may be a mechanical device used to tilt or rotate items, loads, or pallets to a specific angle.
  • One use of an item tilter is to reorient materials or products to facilitate easier handling, improve ergonomics, or aid in certain manufacturing or processing operations.
  • the design of an item tilter may include a platform or surface on which the load or item is placed. The platform may be attached to a tilting mechanism that allows controlled tilting or rotation of a load. The tilting action can be achieved through hydraulic, pneumatic, or electric means, depending on the item tilter's design and intended application. Item tilters offer advantages in terms of improving efficiency, reducing manual handling strain, and enhancing the overall material handling process.
  • automated systems which may include robotic arms, handle unpacking items from warehouse bins to cardboard boxes preparing them to be shipped to the final customer.
  • the order and position of incoming goods are random. Therefore, items may be initially provided in positions which make it very difficult to place the item in the destination box in the position that optimizes the volume occupied inside the target box.
  • an item tilter is positioned at a good-to-robot station where items are provided to a robot for picking and manipulation. The item tilter may facilitate proper packing in cases where the robot is unable to place items on their flat side. In some cases, the item tilter will reorient items in preparation for packing into a final container or box.
  • the item tilter does not grasp, clamp, or grip the object being manipulated.
  • an item tilter which does not perform grasping, gripping, clamping, or similar actions prevents damage to the objects handled by the tilter.
  • the item tilter comprises a substantially planar surface which the object is placed on. The surface may rotate in a specified direction to properly reorient the object in preparation for packing or manipulation by a robot.
  • the item tilter comprises two substantially planar surfaces, orthogonal to one another.
  • the object is placed against both surfaces prior to rotation by the item tilter.
  • the object is placed only against one surface and gravity assists with abutting the object against the second surface.
  • the item tilter comprises two or more substantially planar surfaces, wherein connecting surfaces are orthogonal to one another.
  • the object is placed against at least one surface prior to rotation by the item tilter.
  • the item tilter is coupled to a product database, as disclosed herein.
  • the product database may relay an appropriate speed of rotation to the item tilter based on characteristics of the object being handled, so as to prevent damage, mishandling, or misplacement of the object.
  • the item tilter comprises one or more surfaces which grasp, clamp, or otherwise hold the object during rotation.
  • the item tilter is capable of applying different pressures to hold the object.
  • the item tilter is coupled to an item database, as disclosed herein.
  • the item database may relay an appropriate pressure based on characteristics of the object being handled, so as to prevent damage, mishandling, or misplacement of the object.
  • a rotating surface of the item tilter comprises a suction effector to retain an object during rotation.
  • the item tilter and robots of the system are operatively connected to a product database, programmable logic controller, computer system, or a combination thereof.
  • the item tilter provides a ready-for-operation status.
  • the ready-for-operation status comprises a digital output of ON and signifies that a new item can be placed in the device to be tilted.
  • the item tilter provides a final position digital output when an item is ready to be picked by an adjacent robot after being rotated into a desired orientation by the item tilter.
  • the item tilter receives a cycle start indication when rotation of the item is to begin.
  • automated systems disclosed herein may be able to make decisions not to put the item on the tilter based on the product database.
  • an item tilter is provided at a product loading station.
  • the product loading station may be a cell that includes a robot (e.g., possibly with a picking port), a tilter, a scanner, etc.
  • the product loading station may comprise two or more item tilters.
  • the product loading station is provided in proximity to or adjacent to a loading apparatus or output thereof.
  • the loading apparatus provides the object to the item tilter.
  • the loading apparatus may comprise a conveyor, a robotic handler, a chute, a pusher, or a combination thereof.
  • the loading apparatus provides two or more objects to the item tilter for simultaneous rotation of said two or more objects.
  • the product loading station comprises two or more item tilters.
  • an unloading apparatus is provided at the product loading station to move the object into proximity of or place the object inside the destination container.
  • An unloading apparatus may include the item tilter, a conveyor, a robotic handler, a chute, a pusher, or a combination thereof.
  • one or more sensors are provided at the product loading station.
  • a vision system comprising at least one optical sensor is provided at the product loading station.
  • the vision system identifies one or more characteristics of items or objects provided at the product loading station.
  • the vision system is in operative communication with the software module and a computer processor.
  • the software module instructs the unloading apparatus to move the object in proximity to or within a destination container.
  • the software module is operatively connected to a product database to determine one or more characteristics of an object, as disclosed herein.
  • the product database provides a desired orientation for an object provided at the product loading station.
  • automated systems may provide a decision based on the product database whether or not to put an object on the tilter. For example, an item may be deformable, and tilting will not help.
  • Operation of an item tilter may be understood as a cyclic process, wherein a cycle starts when an object is placed into the item tilter by a robot and is complete when the object is provided in a final position and the item tilter is ready to receive a subsequent object.
  • a signal is sent from a programmable logic controller (PLC) of the system to start the cycle.
  • item tilter rotates the object, as disclosed herein.
  • the robot will pick a second object to be placed in the item tilter.
  • the item is positioned in the final position.
  • the cycle ends when the device is ready for placing the next item.
  • the item tilter is ready for placing the next item even if the first one was not removed from the final position.
  • the robot picks the first object from the final position and places it in the destination container or box, as the second object is being rotated.
  • an additional robot is utilized, wherein one robot places objects into the item tilter and an additional robot places them into a destination container or box.
  • the item tilter is ready to receive the next object for the operation while the previous object is positioned in the final position. In some cases, the item tilter is capable of tilting the next item even if the previous item is in the final position, as mentioned above. In some cases, the total cycle time as described above should take no more than 0.5, 1, 2, 3, 4, 5, 10, or 15 seconds.
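The cyclic handshake described above (ready-for-operation output, cycle-start input, final-position output) can be sketched as a small state machine. The class and signal names are assumptions for illustration; an actual tilter would drive physical digital I/O through the PLC, and the rotation itself would take real time.

```python
class ItemTilter:
    """Minimal state-machine sketch of the tilter cycle described above.

    Signal names mirror the digital outputs mentioned in the text:
    `ready` is the ready-for-operation output, `final_position` is
    the final-position output.
    """

    def __init__(self):
        self.ready = True            # OK to place a new item
        self.final_position = False  # an item awaits pick-off

    def cycle_start(self):
        """PLC asserts cycle start: rotate the placed item."""
        if not self.ready:
            raise RuntimeError("tilter not ready for a new item")
        self.ready = False
        # ... rotation to the desired orientation happens here ...
        self.final_position = True
        # Ready for the next item even before the previous one
        # is removed from the final position, as described above.
        self.ready = True

    def item_picked(self):
        """Robot removed the item from the final position."""
        self.final_position = False
```

This overlap between the final-position state and readiness for the next item is what allows the robot to pick the first object while the second is being rotated.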
  • an item tilter is designed to handle objects having a height (Z-axis) of up to 200 millimeters (mm). In some cases, an item tilter is designed to handle objects having a width (Y-axis) of up to 300 mm.
  • an item tilter is designed to handle objects having a length (X-axis), of up to 200 mm.
  • the item is provided such that the length (X-axis) of the object corresponds to the smallest dimension of the object.
  • the item tilter handles objects weighing up to 3 kilograms (kg).
  • the processes described above are carried out without prior determination of the shape of the object being handled.
  • the shape of the object being handled is provided by a product database or information gathered by sensors, as described herein.
  • the destination container comprises dimensions of about 310 mm length, 220 mm width, and 140 mm height. In some cases the destination container comprises dimensions of about 410 mm length, 305 mm width, and 195 mm height. In some cases the destination container comprises dimensions of about 595 mm length, 395 mm width, and 250 mm height.
  • an item tilter is provided as a component of an automated warehouse. In some cases, an item tilter is adjacent to one or more conveyor belts. In some cases, an item tilter is adjacent to one or more components for automated movement of items.
  • the automated components may include a robotic arm, a conveyor belt or system, a chute system, a pushing apparatus, or a combination thereof.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein use one or more optical sensors.
  • the optical sensors may be operatively coupled to at least one processor.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may use data storage comprising instructions executable by the at least one processor to cause performance of various functions.
  • the functions may include causing the robotic manipulator to move at least one physical object through a designated area in space of a physical environment.
  • the functions may further include causing one or more optical sensors to determine a location of a machine-readable code on the at least one physical object as the at least one physical object is moved through a target location. Based on the determined location, at least one optical sensor may scan the machine-readable code as the object is moved so as to determine information associated with the object encoded in the machine-readable code.
  • information obtained from a machine-readable code is referenced to a product database.
  • the product database may provide information corresponding to an object being handled by a robotic manipulator, as disclosed herein.
  • the product database may provide information corresponding to a target location or position of the object and verify that the object is in a proper location.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may determine a respective location (e.g., a dropping location) at which to cause a robotic manipulator to place an object.
  • the systems, the methods, the computer-readable media, and the techniques may place an object at the target location.
  • the information comprises proper orientation of an object.
  • proper orientation is referenced to the surface on which a machine-readable code is provided.
  • Information comprising proper orientation of an object may determine the orientation at which the object is to be placed at the dropping location.
  • Information comprising proper orientation of an object may be used to determine a grasping or handling point at which a robotic manipulator grasps, grips, or otherwise handles the object.
  • information associated with an object obtained from the machine-readable code may be used to determine one or more anomaly events.
  • Anomaly events may include misplacement of the object within a warehouse or within the system, damage to the object, unintentional connection of more than one object, combinations thereof, or other anomalies which would result in an error in placing an object in an appropriate position or otherwise causing an error in further processing to take place.
  • a warning, alert, or other indication will be provided (e.g., to a human operator).
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may determine that the object is at an improper location from the information associated with the object obtained from the machine-readable code.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may generate an alert that the object is located at an improper location, as disclosed herein.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may place the object at an error or exception location.
  • the exception location may be located within a container. In some cases, the exception location is designated for objects which have been determined to be at an improper location within the system or within a warehouse.
  • information associated with an object obtained from the machine-readable code may be used to determine one or more properties of the object.
  • the information may include expected dimensions, shapes, or images to be captured.
  • Properties of an object may include an object's size, an object's weight, flexibility of an object, and one or more expected forces to be generated as the object is handled by a robotic manipulator.
  • a robotic manipulator comprises the one or more optical sensors.
  • the one or more optical sensors may be physically coupled to a robotic manipulator.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may use multiple cameras oriented at various positions such that when one or more optical sensors are moved over an object, the optical sensors can view multiple surfaces of the object at various angles.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may use multiple mirrors, so that one or more optical sensors can view multiple surfaces of an object.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein use one or more optical sensors located underneath a platform on which an object is placed or moved over during a scanning procedure.
  • the platform may be transparent or semitransparent so that the optical sensors located underneath it can scan a bottom surface of the object.
  • the robotic arm may bring a box through a reading station after or while orienting the box in a certain manner, such as in a manner in order to place the machine-readable code in a position in space where it can be easily viewed and scanned by one or more optical sensors.
  • the one or more optical sensors comprise one or more image sensors.
  • the one or more image sensors may capture one or more images of an object to be handled by a robotic manipulator or an object being handled by the robotic manipulator.
  • the one or more image sensors comprise one or more cameras.
  • an image sensor is coupled to a robotic manipulator.
  • an image sensor is placed near a workstation of a robotic manipulator to capture images of one or more object to be handled by the manipulator.
  • the image sensor captures images of an object being handled by a robotic manipulator.
  • one or more image sensors comprise a depth camera.
  • the depth camera may be a stereo camera, an RGBD (RGB Depth) camera, or the like.
  • the camera may be a color or monochrome camera.
  • one or more image sensors comprise a RGBaD (RGB+active depth, e.g., an Intel RealSense D415 depth camera) color or monochrome camera registered to a depth sensing device that uses active vision techniques such as projecting a pattern into a scene to enable depth triangulation between the camera or cameras and the known offset pattern projector.
  • the camera is a passive depth camera.
  • cues such as barcodes, texture coherence, color, 3D surface properties, or printed text on the surface may also be used to identify an object or find its pose in order to know where or how to place the object.
  • shadow or texture differences may be employed to segment objects as well.
  • an image sensor comprises a vision processor.
  • an image sensor comprises an infrared stereo sensor system.
  • an image sensor comprises a stereo camera system.
  • a virtual environment including a model of the objects in 2D or 3D may be determined and used to develop a plan or strategy for picking up the objects and verifying their properties are an approximate match to the expected properties.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein use one or more sensors to scan an environment containing objects.
  • a sensor coupled to the arm captures sensor data about a plurality of objects in order to determine shapes or positions of individual objects.
  • a larger picture of a 3D environment may be stitched together by integrating information from individual (e.g., 3D) scans.
  • the image sensors are placed in fixed positions, on a robotic arm, or in other locations.
  • scans may be constructed and used in accordance with any or all of a number of different techniques.
  • scans are conducted by moving a robotic arm upon which one or more image sensors are mounted.
  • Data comprising a position of the robotic arm may be correlated to determine the position at which a mounted sensor is located.
  • Positional data may also be acquired by tracking key points in the environment.
  • scans may be from fixed-mount cameras that have fields of view (FOVs) covering a given area.
  • a virtual environment built using a 3D volumetric or surface model to integrate or stitch information from more than one sensor may allow the systems, the methods, the computer-readable media, and the techniques disclosed herein to operate within a larger environment, where one sensor may be insufficient to cover a large environment. Integrating information from multiple sensors may yield finer detail than from a single scan alone. Integration of data from multiple sensors may reduce noise levels. This may yield better results for object detection, surface picking, or other applications.
  • Information obtained from the image sensors may be used to select one or more grasping points of an object. In some cases, information obtained from the image sensors may be used to select an end effector for handling an object.
  • In some cases, an image sensor is attached to a robotic arm. In some cases, the image sensor is attached to the robotic arm at or adjacent to a wrist joint. In some cases, an image sensor attached to a robotic arm is directed to obtain images of an object. In some cases, the image sensor scans a machine-readable code placed on a surface of an object.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may integrate edge detection software.
  • One or more captured images may be analyzed to detect or locate the edges of an object.
  • the object may be at an initial position prior to being handled by a robotic manipulator or may be in the process of being handled by a robotic manipulator when the images are captured.
  • Edge detection processing may comprise processing one or more two-dimensional images captured by one or more image sensors.
  • Edge detection algorithms utilized may include Canny method detection, first-order differential detection methods, second-order differential detection methods, thresholding, linking, edge thinning, phase congruency methods, phase stretch transformation (PST) methods, subpixel methods (including curve-fitting, moment-based, reconstructive, and partial area effect methods), and combinations thereof.
  • Edge detection methods may utilize sharp contrasts in brightness to locate and detect edges of the captured images.
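As an illustration of the first-order differential methods mentioned above, the sketch below marks edge pixels where the brightness gradient magnitude exceeds a threshold. The function name and threshold are assumptions; a production system would more likely use a tuned detector such as the Canny method.

```python
import numpy as np

def simple_edges(img, thresh=0.25):
    """First-order differential edge detection sketch: flag pixels
    whose brightness gradient magnitude exceeds a threshold."""
    gy, gx = np.gradient(img.astype(float))  # per-axis finite differences
    mag = np.hypot(gx, gy)                   # gradient magnitude
    return mag > thresh

# Small synthetic image: dark left half, bright right half, so the
# only sharp brightness contrast is the vertical boundary.
img = np.zeros((4, 8))
img[:, 4:] = 1.0
edges = simple_edges(img)
```

On this synthetic image the detected edge band straddles the dark-to-bright boundary, mirroring how sharp brightness contrasts locate object edges in captured images.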
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may record measured dimensional values of an object.
  • the measured dimensional values may be compared to expected dimensional values of an object to determine if an anomaly event has occurred.
  • Anomaly events based on dimensional comparison may indicate a misplaced object, unintentionally connected objects, damage to an object, or combinations thereof. Determination of an anomaly occurrence may trigger an anomaly event, as discussed herein.
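The dimensional comparison described above can be sketched as a tolerance check between measured and expected values. The relative tolerance and function name are illustrative assumptions.

```python
def dimension_anomaly(measured, expected, rel_tol=0.05):
    """Return True if any measured dimension deviates from its
    expected value by more than a relative tolerance.

    A True result might indicate a misplaced object, unintentionally
    connected objects, or damage, and would trigger an anomaly event.
    """
    for m, e in zip(measured, expected):
        if abs(m - e) > rel_tol * e:
            return True
    return False
```

For example, an object measured at roughly twice its expected length would exceed any reasonable tolerance and trigger the anomaly handling described herein.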
• one or more images captured of an object may be compared to one or more reference images.
  • a comparison may be conducted by an integrated computing device of the systems, the methods, the computer-readable media, and the techniques disclosed herein.
  • the one or more reference images are provided by a product database. Appropriate reference images may be correlated to an object by correspondence to a machine-readable code provided on the object.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may compensate for variations in angles and distance at which the images are captured during the analysis.
  • an anomaly alert is generated if the difference between one or more captured images of an object and one or more reference images of the object exceeds a predetermined threshold.
• a difference between one or more captured images and one or more reference images may be taken across one or more dimensions or may be a sum difference between the one or more images.
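The sum-difference rule above can be sketched as a per-pixel comparison. The toy grayscale grids and threshold value are illustrative assumptions.

```python
# Sketch of the image-comparison rule: compute the sum of absolute per-pixel
# differences between a captured image and a reference image, and generate an
# anomaly alert when the total exceeds a predetermined threshold.

def image_difference(captured, reference):
    """Sum of absolute per-pixel differences across all dimensions."""
    return sum(
        abs(c - r)
        for row_c, row_r in zip(captured, reference)
        for c, r in zip(row_c, row_r)
    )

def anomaly_alert(captured, reference, threshold=100):
    return image_difference(captured, reference) > threshold

reference    = [[10, 10], [10, 10]]
captured_ok  = [[12,  9], [10, 11]]  # small sensor noise, below threshold
captured_bad = [[200, 10], [10, 10]] # large mismatch, e.g. wrong or damaged item
```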
  • reference images are sent to an operator during a verification process.
• the operator may view the one or more reference images in relation to the one or more captured images to determine if generation of an anomaly event or alert was correct.
  • the operator may view the reference images in a comparison module.
  • the comparison module may present the reference images side-by-side with the captured images.
• the systems, the methods, the computer-readable media, and the techniques disclosed herein may be configured to detect anomalies which occur during the handling or processing of one or more objects.
• the systems, the methods, the computer-readable media, and the techniques disclosed herein include obtaining one or more properties of an object prior to being handled by a robotic manipulator and analyzing the obtained properties against one or more expected properties of the object.
• the systems, the methods, the computer-readable media, and the techniques disclosed herein obtain one or more properties of an object while being handled by a robotic manipulator and analyze the obtained properties against one or more expected properties of the object.
• the systems, the methods, the computer-readable media, and the techniques disclosed herein obtain one or more properties of an object after being handled by a robotic manipulator and analyze the obtained properties against one or more expected properties of the object. In some cases, if an anomaly is detected, the systems, the methods, the computer-readable media, and the techniques disclosed herein do not proceed to place the object at a target position. The systems, the methods, the computer-readable media, and the techniques disclosed herein may instead instruct a robotic manipulator to place the object at an exception position, as described herein. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may verify a registered anomaly with an operator prior to placing an object at a given position.
• one or more optical sensors scan a machine-readable code provided on an object. Information obtained from the machine-readable code may be used to verify that an object is in a proper location. If it is determined that an object is misplaced, the systems, the methods, the computer-readable media, and the techniques disclosed herein may register an anomaly event corresponding to a misplacement of said object. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein generate an alert if an anomaly event is registered.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein communicate with an operator or other user.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may communicate with an operator using a computing device.
  • the computing device may be an operator device.
  • the computing device may be configured to receive input from an operator or user with a user interface.
  • the operator device may be provided at a location remote from operations of the facility.
  • an operator utilizes an operator device to verify one or more anomaly events or alerts.
  • the operator device receives captured images from one or more image sensors to verify that an anomaly has occurred in an object.
  • An operator may provide verification that an object has been misplaced or that an object has been damaged based on the one or more images.
  • captured images are provided in a module to be displayed on a screen of an operator device.
  • the module displays the one or more captured images adjacent to one or more reference images corresponding to said object.
  • one or more captured images are displayed on a page adjacent to a page displaying one or more reference images.
• an operator uses an interface of the operator device to verify that an anomaly event or alert was correctly generated. Verification provided by the operator may be used to train a machine learning algorithm, as disclosed herein.
  • verification that an alert was correctly generated adjusts a predetermined threshold which is used to generate an alert if a difference between one or more measured properties and one or more corresponding expected properties of an object exceeds said predetermined threshold.
  • verification that an alert was incorrectly generated adjusts a predetermined threshold which is used to generate an alert if a difference between one or more measured properties and one or more corresponding expected properties of an object exceeds said predetermined threshold.
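The two verification outcomes above can be sketched as a threshold-adaptation rule: a correctly generated alert tightens the threshold, a false alert loosens it. The adjustment factors are illustrative assumptions, not values from the disclosure.

```python
# Sketch of threshold adaptation from operator feedback: verification that an
# alert was correct tightens the predetermined threshold slightly, while
# verification that it was incorrect (a false positive) loosens it.

def adjust_threshold(threshold, alert_was_correct,
                     tighten=0.95, loosen=1.10):
    """Return an updated alert threshold based on operator verification."""
    return threshold * (tighten if alert_was_correct else loosen)

threshold = 100.0
# Operator marks the last alert as incorrectly generated -> loosen.
threshold = adjust_threshold(threshold, alert_was_correct=False)
```

Over many verifications the threshold drifts toward a level where the difference rule rarely fires on normal objects but still fires on true anomalies.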
  • verification of an alert instructs a robotic manipulator to handle an object in a particular manner. For example, if an anomaly alert corresponding to an object is verified as being correctly generated, the robotic manipulator may place the object at an exception location. In some cases, if an anomaly alert corresponding to an object is verified as being incorrectly generated, the robotic manipulator may place the object at a target location. In some cases, if an alert is generated and an operator verifies that two or more objects are unintentionally being handled simultaneously, then the robotic manipulator performs a wiggling motion in an attempt to separate the two or more objects.
• one or more images of a target container or target location at which one or more objects are provided are transmitted to an operator or user device.
• An operator or user may then verify that the one or more objects are correctly placed at the target location or within a target container.
  • a user or operator may also provide feedback using an operator or user device to communicate errors if the one or more objects have been incorrectly placed at the target location or within the target container.
• a database may provide information as to which products require human intervention or handling.
  • a warehouse surveillance or monitoring system alerts human handlers to incoming products which require human intervention.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein route said product or a container holding said product to a station designated for human intervention.
  • the station may be separated from automated handling systems or robotic arms. Separation may be necessary for safety reasons or to provide an accessible area for a human to handle the products.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may operate using a surveillance system for monitoring operations or product flow in a facility.
  • the surveillance system may operate with a picking port.
  • the surveillance system may monitor items as the items move through a picking port (e.g., from a picking location to a dropping location).
  • the surveillance system is integrated into an existing warehouse with automated handling systems.
  • the surveillance system comprises a database of information for each product to be handled in the warehouse.
  • the database is updated, as disclosed herein.
  • the surveillance system comprises at least one image sensor.
  • the surveillance system allows for identification of a product type.
  • identification of a product type at one or more points through a product flow in a facility allows for monitoring to determine if the facility is running efficiently or if an anomaly has occurred.
  • the surveillance system allows for determination of an appropriate package size for the one or more products to be placed and packaged within.
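The package-size determination above can be sketched as choosing the smallest available box that fits the measured item. The box catalog and the no-rotation simplification are illustrative assumptions.

```python
# Sketch of package-size selection: pick the smallest-volume box whose inner
# dimensions fit the item in its given orientation (rotations ignored for
# simplicity).

def select_package(item_dims, boxes):
    """Return the name of the smallest-volume box that fits `item_dims`
    (length, width, height), or None if no box fits."""
    fitting = [
        (l * w * h, name)
        for name, (l, w, h) in boxes.items()
        if all(i <= b for i, b in zip(item_dims, (l, w, h)))
    ]
    return min(fitting)[1] if fitting else None

boxes = {
    "small":  (10, 10, 10),
    "medium": (20, 20, 20),
    "large":  (40, 40, 40),
}
choice = select_package((12, 8, 8), boxes)
```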
  • the surveillance system allows for automated quality control of products and packaging within a facility.
  • an image sensor is provided prior to or upstream from an automated handling station.
  • An image sensor provided prior to an automated handling system may allow for proper preparation by the handling system prior to arrival of a specific product type.
  • an image sensor provided prior to an automated handling system captures one or more images of a product or object to facilitate determination of an appropriate handler the product should be sent to.
  • an image sensor provided prior to an automated handling system identifies if a product has been misplaced or will not be able to be handled by an automated system downstream from the image sensor.
  • a surveillance system comprises one or more image sensors located after or downstream from an automated handling robot or system.
  • an image sensor provided downstream from a handling station captures one or more images of a product after being handled or placed to verify correct placement or handling. Verification may be done on products handled on an automated system or by a human handler.
  • the surveillance system includes further sensors, such as weight sensors, motion sensors, laser scanners, or other sensors useful for gathering information related to a product or container.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may be implemented in existing warehouses to automate one or more processes within a warehouse.
• software and robotic manipulators of the system are integrated with the existing warehouse systems to provide a smooth transition as manual operations are automated.
  • a product database is provided in communication with the systems, the methods, the computer-readable media, and the techniques disclosed herein.
• the product database may comprise a library of objects to be handled, e.g., in a picking port.
• the product database may include properties of each object to be handled.
• the properties of the objects provided by the product database are expected properties of the objects.
  • the expected properties of the objects may be compared to measured properties of the objects in order to determine if an anomaly has occurred.
  • Expected properties may include expected dimensions, expected forces, expected weights, and expected machine-readable codes, as disclosed herein.
  • Product databases may be updated according to the objects to be handled.
• Product databases may be generated by input of information regarding the objects to be handled.
  • objects may be processed by the systems, the methods, the computer-readable media, and the techniques disclosed herein to generate a product database.
  • an undamaged object may be handled by one or more robotic manipulators to determine expected properties of the object.
  • Expected properties of the object may include expected dimensions, expected forces, expected weights, and expected machine-readable codes, as disclosed herein. The expected properties may then be input into the product database.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may process a plurality of objects of the same type to determine a standard deviation occurring within objects of that type.
  • the determined standard deviations may be used to set a predetermined threshold, wherein a difference between expected properties and measured properties of an object may trigger an anomaly alert.
• the predetermined threshold includes a standard deviation of differences among one or more objects of the same type.
  • the standard deviation is multiplied by a constant factor to set a predetermined threshold.
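The threshold derivation above can be sketched as: measure a property across many objects of the same type, take the standard deviation, and multiply by a constant factor. The weight samples and the factor of 3 (a "three sigma" rule) are illustrative assumptions.

```python
# Sketch of deriving the predetermined anomaly threshold from a standard
# deviation measured over a plurality of objects of the same type.
import statistics

def derive_threshold(samples, factor=3.0):
    """Threshold = factor * (population) standard deviation of the samples."""
    return factor * statistics.pstdev(samples)

weights = [500.0, 502.0, 498.0, 501.0, 499.0]  # grams, same product type
threshold = derive_threshold(weights)

def is_anomalous(measured, expected, threshold):
    """Trigger an anomaly alert when the deviation exceeds the threshold."""
    return abs(measured - expected) > threshold
```

With these samples the standard deviation is about 1.41 g, so the threshold is about 4.24 g: normal unit-to-unit variation passes, while a 10 g deviation triggers an alert.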
• the product database comprises a set of filtering criteria.
• the filtering criteria may be used for routing objects to a proper handling station.
• Filtering criteria may be used for routing objects to a robotic handling station or a human handling station.
• Filtering criteria may be utilized for routing objects to an appropriate robotic handling station with an automated handler suited for handling a particular object or product type.
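The routing described above can be sketched as an ordered list of criteria, each pairing a predicate over object properties with a destination station. The property names and station labels are illustrative assumptions.

```python
# Sketch of routing via filtering criteria: the first matching criterion
# determines the handling station; a catch-all default routes everything
# else to a standard robotic station.

FILTERING_CRITERIA = [
    # (predicate over object properties, destination station)
    (lambda o: o["fragile"],          "human_station"),          # needs a person
    (lambda o: o["weight_kg"] > 10.0, "heavy_robot_station"),    # heavy-duty arm
    (lambda o: True,                  "standard_robot_station"), # default
]

def route(obj):
    """Return the first station whose criterion matches the object."""
    for predicate, station in FILTERING_CRITERIA:
        if predicate(obj):
            return station
```

Updating the criteria (e.g., when a new handling system is integrated) amounts to editing this list, which is why continual updates are straightforward.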
• the database is continually updated.
• the filtering criteria are continually updated.
• the filtering criteria may be updated as new handling systems are integrated within a facility.
• the filtering criteria are updated as new product types are handled within a facility.
• the filtering criteria are updated as new manipulation techniques or handling patterns are realized.
• a machine learning program is utilized to update the database or filtering criteria.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein track objects as they are handled.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein integrate with existing tracking software of a warehouse which the system is implemented within.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may connect with existing software such that information which is normally received by manual input is now communicated electronically.
• Object tracking may include confirming an object has been received at a source location or station. Object tracking may include confirming an object has been placed at a target position.
  • Object tracking may include input that an anomaly has been detected. Object tracking may include input that an object has been placed at an exception location. Object tracking may include input that an object or target container has left a handling station or target position to be further processed at another location within a warehouse.
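The tracking steps above can be sketched as a per-object event history, replacing information that would otherwise be entered manually. The class shape and event names are illustrative assumptions.

```python
# Sketch of electronic object tracking: each handling step appends an event
# to the object's history (received at source, anomaly detected, placed at
# a target or exception location, container departed, etc.).

class ObjectTracker:
    def __init__(self):
        self.history = {}

    def record(self, object_id, event):
        """Append a tracking event for the given object."""
        self.history.setdefault(object_id, []).append(event)

    def events(self, object_id):
        """Return the recorded events for the object (empty if unseen)."""
        return self.history.get(object_id, [])

tracker = ObjectTracker()
tracker.record("SKU-123", "received_at_source")
tracker.record("SKU-123", "anomaly_detected")
tracker.record("SKU-123", "placed_at_exception_location")
```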
  • a control system may include at least one processor that executes instructions stored in a non- transitory computer readable medium, such as a memory.
  • the control system may also comprise a plurality of computing devices that may serve to control individual components or subsystems of the robotic device.
• a memory comprises instructions (e.g., program logic) executable by the processor to execute various functions of the robotic device disclosed herein.
  • a memory may comprise additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of a mechanical system, a sensor system, a product database, an operator system, or the control system.
  • machine learning algorithms are implemented such that the systems, the methods, the computer-readable media, and the techniques disclosed herein become completely automated.
• verification operations completed by a human operator are removed after training of the machine learning algorithms is complete.
• the machine learning programs utilized incorporate a supervised learning approach. In some cases, the machine learning programs utilized incorporate a reinforcement learning approach. Information such as verification of alerts/anomaly events, measured properties of objects being handled, and expected properties of objects being handled may be received by a machine learning algorithm for training.
  • Supervised learning may include active learning algorithms, classification algorithms, similarity learning algorithms, regressive learning algorithms, and combinations thereof.
• machine learning may generally involve identifying and recognizing patterns in existing data in order to facilitate making predictions for subsequent data.
  • machine learning may include a machine learning model (which may include, for example, a machine learning algorithm).
  • Machine learning whether analytical or statistical in nature, may provide deductive or abductive inference based on real or simulated data.
• the machine learning model may be a trained model. Machine learning techniques may comprise one or more supervised, semi-supervised, self-supervised, or unsupervised machine learning techniques.
• a machine learning model may be a trained model that is trained through supervised learning (e.g., various parameters are determined as weights or scaling factors). Machine learning may comprise one or more of regression analysis, regularization, classification, dimensionality reduction, ensemble learning, meta learning, association rule learning, cluster analysis, anomaly detection, deep learning, or ultra-deep learning. Machine learning may comprise: k-means, k-means clustering, k-nearest neighbors, learning vector quantization, linear regression, non-linear regression, least squares regression, partial least squares regression, logistic regression, stepwise regression, multivariate adaptive regression splines, ridge regression, principal component regression, least absolute shrinkage and selection operation (LASSO), least angle regression, canonical correlation analysis, factor analysis, independent component analysis, linear discriminant analysis, multidimensional scaling, non-negative matrix factorization, principal components analysis, principal coordinates analysis, projection pursuit, Sammon mapping, t-distributed stochastic neighbor embedding, AdaBoosting, boosting, gradient boosting, bootstrap aggregation.
  • Machine learning algorithms may be applied to anomaly detection, as disclosed herein.
• machine learning algorithms are applied to programmed movement of one or more robotic manipulators.
• Machine learning algorithms applied to programmed movement of robotic manipulators may be used to optimize actions such as scanning a machine-readable code provided on an object.
• Machine learning algorithms applied to programmed movement of robotic manipulators may be used to optimize actions such as performing a wiggling motion to separate unintentionally combined objects.
• Machine learning algorithms applied to programmed movement of robotic manipulators may be applied to any actions of a robotic manipulator for handling one or more objects, as disclosed herein.
  • machine learning algorithms are applied to make decisions whether or not to put an item on the tilter.
  • trajectories of items handled by robotic manipulators are automatically optimized by the systems, the methods, the computer-readable media, and the techniques disclosed herein.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein automatically adjust the movements of the robotic manipulators to achieve a minimum transportation time while preserving constraints on forces exerted on the item or package being transported.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein monitor forces exerted on the object as they are transported from a source position to a target position, as disclosed herein.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein may monitor acceleration, rate of acceleration (jerk), etc. of an object being transported by a robotic manipulator.
  • the force experienced by the object as it is manipulated may be calculated using the known movement of the robotic manipulator (e.g., position, velocity, and acceleration values of the robotic manipulator as it transports the object) and force values obtained by the weight/torsion and force sensors provided on the robotic manipulator.
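The force and jerk monitoring above can be sketched from the known motion profile: the inertial force follows F = m · a, and jerk is the change in acceleration between samples. Mass, sampling interval, limits, and sample profiles below are illustrative assumptions.

```python
# Sketch of constraint monitoring during transport: estimate the inertial
# force on the item (F = m * a) at each acceleration sample and the jerk as
# a finite difference between samples; report whether the transport respects
# both limits.

def check_transport(mass_kg, accelerations, dt,
                    max_force_n=20.0, max_jerk=50.0):
    """Return True if all samples respect the force and jerk constraints."""
    for i, a in enumerate(accelerations):
        if abs(mass_kg * a) > max_force_n:          # inertial force limit
            return False
        if i > 0:
            jerk = (a - accelerations[i - 1]) / dt  # finite-difference jerk
            if abs(jerk) > max_jerk:
                return False
    return True

gentle = [0.0, 1.0, 2.0, 1.0, 0.0]  # m/s^2, sampled every 0.1 s
harsh  = [0.0, 8.0, 0.0]            # abrupt change -> excessive jerk
```

A trajectory optimizer can then shorten transport time subject to this check returning True, which is the trade-off described above.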
  • optical sensors of the systems, the methods, the computer-readable media, and the techniques disclosed herein monitor the movement of objects being transported by the robotic manipulator.
  • the trajectory of objects is optimized to minimize transportation time including scanning of a digital code on the object.
  • the optical sensors recognize defects in the objects or packaging of objects as a result of mishandling (e.g., defects caused by forces applied to the object by the robotic manipulator).
• the optical sensors monitor the flight or trajectory of objects being manipulated for cases in which the objects are dropped. In some cases, detection of mishandling or drops will result in adjustments of the robotic manipulator (e.g., adjustment of trajectory or forces applied at the end effector).
  • the constraints and optimized trajectory information will be stored in the product database, as disclosed herein.
  • the constraints are derived from a history of attempts for the specific object or plurality of similar objects being transported.
  • the systems, the methods, the computer-readable media, and the techniques disclosed herein are trained by increasing the speed at which an object is manipulated over a plurality of attempts until a drop or defect occurs due to mishandling by the robotic manipulator.
  • a technician verifies that a defect or drop has occurred due to mishandling. Verification may include viewing a video recording of the object being handled and confirming that a drop or defect was likely due to mishandling by the robotic manipulator.
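The training procedure above can be sketched as a speed search: raise the transport speed in steps until a verified drop or defect occurs, then keep the last speed that succeeded. Integer speeds (cm/s), the step size, and the stand-in failure model are illustrative assumptions.

```python
# Sketch of the speed-increase training loop: increase speed until an attempt
# fails (a drop or defect verified as mishandling), then record the highest
# speed that still succeeded.

def find_max_safe_speed(attempt, start=20, step=10, limit=300):
    """Speeds in cm/s. Return the highest speed for which `attempt(speed)`
    succeeded, or None if even the starting speed fails."""
    safe = None
    speed = start
    while speed <= limit and attempt(speed):
        safe = speed          # this attempt succeeded; remember it
        speed += step         # try a faster transport next
    return safe

# Illustrative stand-in for real trial outcomes: drops occur above 100 cm/s.
result = find_max_safe_speed(lambda v: v <= 100)
```

In practice each `attempt` would be a physical trial whose failure is verified by a technician, e.g. from a video recording, before the constraint is stored in the product database.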
  • FIG. 7 shows a computer system 701 that is programmed or otherwise configured to implement the methods, the computer-readable media, and the techniques disclosed herein, such as to control the systems or devices disclosed herein (e.g., a picking port, a robotic arm, a controller, etc.).
  • the computer system 701 can be an electronic device of a user or a computer system that is remotely located with respect to the electronic device.
  • the electronic device can be a mobile electronic device.
  • the computer system 701 includes a central processing unit (CPU, also “processor” and “computer processor” herein) 705, which can be a single core or multi core processor, or a plurality of processors for parallel processing.
  • the computer system 701 also includes memory or memory location 710 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 715 (e.g., hard disk), communication interface 720 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 725, such as cache, other memory, data storage or electronic display adapters.
  • the memory 710, storage unit 715, interface 720 and peripheral devices 725 are in communication with the CPU 705 through a communication bus (solid lines), such as a motherboard.
  • the storage unit 715 can be a data storage unit (or data repository) for storing data.
  • the computer system 701 can be operatively coupled to a computer network (“network”) 730 with the aid of the communication interface 720.
  • the network 730 can be the Internet, an isolated or substantially isolated internet or extranet, or an intranet or extranet that is in communication with the Internet.
  • the network 730 in some cases is a telecommunication or data network.
  • the network 730 can include one or more computer servers, which can enable distributed computing, such as cloud computing.
  • the network 730, in some cases with the aid of the computer system 701, can implement a peer-to-peer network, which may enable devices coupled to the computer system 701 to behave as a client or a server.
  • the CPU 705 can execute a sequence of machine-readable instructions, which can be embodied in a program or software.
  • the instructions may be stored in a memory location, such as the memory 710.
• the instructions can be directed to the CPU 705, which can subsequently program or otherwise configure the CPU 705 to implement methods of the present disclosure. Examples of operations performed by the CPU 705 can include fetch, decode, execute, and writeback.
  • the CPU 705 can be part of a circuit, such as an integrated circuit. One or more other components of the system 701 can be included in the circuit. In some cases, the circuit is an application specific integrated circuit (ASIC).
  • the storage unit 715 can store files, such as drivers, libraries and saved programs.
  • the storage unit 715 can store user data, e.g., user preferences and user programs.
  • the computer system 701 in some cases can include one or more additional data storage units that are external to the computer system 701, such as located on a remote server that is in communication with the computer system 701 through an intranet or the Internet.
  • the computer system 701 can communicate with one or more remote computer systems through the network 730.
  • the computer system 701 can communicate with a remote computer system of a user.
  • remote computer systems include personal computers (e.g., portable PC), slate or tablet PC’s (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android-enabled device, Blackberry®), or personal digital assistants.
  • the user can access the computer system 701 via the network 730.
  • Methods as disclosed herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system 701, such as, for example, on the memory 710 or electronic storage unit 715.
  • the machine executable or machine-readable code can be provided in the form of software.
  • the code can be executed by the processor 705.
  • the code can be retrieved from the storage unit 715 and stored on the memory 710 for ready access by the processor 705.
  • the electronic storage unit 715 can be precluded, and machine-executable instructions are stored on memory 710.
  • the code can be pre-compiled and configured for use with a machine having a processor adapted to execute the code or can be compiled during runtime.
  • the code can be supplied in a programming language that can be selected to enable the code to execute in a pre-compiled or as-compiled fashion.
  • aspects of the systems and methods provided herein can be embodied in programming.
  • Various aspects of the technology may be thought of as “products” or “articles of manufacture” typically in the form of machine (or processor) executable code or associated data that is carried on or embodied in a type of machine readable medium.
• Machine-executable code can be stored on an electronic storage unit, such as memory (e.g., read-only memory, random-access memory, flash memory) or a hard disk.
  • “Storage” type media can include any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming.
  • All or portions of the software may at times be communicated through the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the software from one computer or processor into another, for example, from a management server or host computer into the computer platform of an application server.
  • another type of media that may bear the software elements includes optical, electrical, and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links.
  • the physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software.
  • terms such as computer or machine “readable medium” refer to any medium that participates in providing instructions to a processor for execution.
• a machine-readable medium, such as computer-executable code, may take many forms, including but not limited to a tangible storage medium, a carrier-wave medium, or a physical transmission medium.
  • Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, such as may be used to implement the databases, etc. shown in the drawings.
  • Volatile storage media include dynamic memory, such as main memory of such a computer platform.
  • Tangible transmission media include coaxial cables; copper wire and fiber optics, including the wires that comprise a bus within a computer system.
  • Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications.
• Common forms of computer-readable media therefore include for example: a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a ROM, a PROM and EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer may read programming code or data.
  • Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
  • the computer system 701 can include or be in communication with an electronic display 735 that comprises a user interface (UI) 740 for providing, for example, images of the picking port, containers, or items, along with the analysis of the images.
  • Examples of UIs include, without limitation, a graphical user interface (GUI) and a web-based user interface.
  • Methods and systems of the present disclosure can be implemented by way of one or more algorithms.
  • An algorithm can be implemented by way of software upon execution by the central processing unit 705.
  • the algorithm can, for example, determine item positions, container contents, or other parameters of an image (e.g., an image of an item picking container at the picking location).
  • the indefinite articles “a” or “an,” and the corresponding associated definite articles “the” or “said,” are each intended to mean one or more unless otherwise stated, implied, or physically impossible. Yet further, it should be understood that the expressions “at least one of A and B, etc.,” “at least one of A or B, etc.,” “selected from A and B, etc.” and “selected from A or B, etc.” are each intended to mean either any recited element individually or any combination of two or more elements, for example, any of the elements from the group consisting of “A,” “B,” and “A AND B together,” etc.
  • “about” or “approximately” may mean within an acceptable error range for the value, which will depend in part on how the value is measured or determined, e.g., the limitations of the measurement system. For example, “about” may mean within 1 or more than 1 standard deviation, per the practice in the art. Alternatively, “about” may mean a range of up to 20%, up to 10%, up to 5%, or up to 1% of a given value. Where values are described in the application and claims, unless otherwise stated the term “about” meaning within an acceptable error range for the particular value may be assumed.
  • routines, subroutines, applications, or instructions may constitute either software (e.g., code embodied on a machine-readable medium) or hardware.
  • routines, etc. are tangible units capable of performing certain operations and may be configured or arranged in a certain manner.
  • one or more computer systems (e.g., a standalone, client, or server computer system)
  • one or more hardware modules of a computer system (e.g., a processor or a group of processors)
  • software (e.g., an application or application portion)
  • a hardware module may be implemented mechanically or electronically.
  • a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
  • a hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • hardware modules may encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations disclosed herein.
  • Where hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time.
  • Where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times.
  • Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
  • Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output.
  • Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information).
  • Elements that are described as being coupled and/or connected may refer to two or more elements that may be in direct contact with each other (e.g., direct physical contact) or may not be in direct contact (e.g., electrically connected, communicatively coupled, etc.), yet still cooperate or interact with each other.
  • processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
  • the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
  • the methods or routines disclosed herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
  • the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)

Abstract

A system (100) for robotic item picking includes: (A) a picking port (110), including: (i) a first input picking queue configured to move a first plurality of item picking containers, (ii) a second input picking queue configured to move a second plurality of item picking containers, (iii) a picking location configured to receive an item picking container of said first plurality of item picking containers and said second plurality of item picking containers, and (iv) a dropping location configured to receive an item dropping container; (B) a robotic arm (120) configured to move items of said item picking container; and (C) a controller configured to: (a) cause said picking location to receive said item picking container from said first input picking queue or said second input picking queue, and (b) cause said robotic arm to pick up said items from said item picking container and place said items in said item dropping container.
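The control flow summarized in the abstract can be sketched in code. This is an illustrative sketch only: every name (Container, PickingPort, Controller, receive_container, pick_and_place) is hypothetical and not taken from the disclosure.

```python
# Illustrative sketch of the abstract: two input picking queues (i)-(ii)
# feed one picking location (iii) beside a dropping location (iv), and a
# controller performs operations (a) and (b). All names are hypothetical.
from collections import deque

class Container:
    def __init__(self, items):
        self.items = list(items)

class PickingPort:
    """Dual-input picking port with a single picking location."""
    def __init__(self, first_queue, second_queue):
        self.input_queues = {1: deque(first_queue), 2: deque(second_queue)}
        self.picking_location = None            # holds one item picking container
        self.dropping_location = Container([])  # the item dropping container

class Controller:
    def __init__(self, port):
        self.port = port

    def receive_container(self, queue_id):
        # (a) move the next container from the chosen input queue to the picking location
        self.port.picking_location = self.port.input_queues[queue_id].popleft()

    def pick_and_place(self, n_items=1):
        # (b) pick items from the picking location and place them in the dropping container
        for _ in range(n_items):
            item = self.port.picking_location.items.pop()
            self.port.dropping_location.items.append(item)

port = PickingPort([Container(["soap", "soap"])], [Container(["mug"])])
ctrl = Controller(port)
ctrl.receive_container(1)
ctrl.pick_and_place(2)
print(port.dropping_location.items)  # → ['soap', 'soap']
```

The second input queue lets the controller stage a different item picking container (here the one holding "mug") while the first is being worked.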

Description

METHODS AND SYSTEMS FOR ROBOTIC ITEM PICKING PORTS FOR AUTOMATED
WAREHOUSES
BACKGROUND
[0001] Order picking may be a process of selecting and gathering items from a warehouse or storage location. Items may be selected and gathered for various reasons. For example, items may be selected and gathered for fulfilling customer orders. In such examples, the items may be moved into a shipping container. In another example, items may be selected and gathered for storing. In such examples, the items may be moved into a storage container (e.g., for consolidation).
[0002] Selecting and gathering items may include retrieving the items from respective locations based on the order details, item type, quantity, item attributes, specific customer requirements, etc. While item picking has historically relied on manual labor, where workers navigated a warehouse to locate and retrieve items, certain automations have become popular in more recent times. For example, warehouse automation technologies have revolutionized the order picking process through technologies such as automated guided vehicles, robotic arms, conveyor belts, software algorithms, etc.
SUMMARY
[0003] In one aspect, a system for robotic item picking, comprises: (A) a picking port, comprising: (i) a first input picking queue configured to hold and move a first plurality of item picking containers, (ii) a second input picking queue configured to hold and move a second plurality of item picking containers, (iii) a picking location configured to receive an item picking container of (1) said first plurality of item picking containers from said first input picking queue and (2) said second plurality of item picking containers from said second input picking queue, and (iv) a dropping location configured to receive an item dropping container; (B) a robotic arm configured to move one or more items of said item picking container; and (C) a controller configured to: (a) cause said picking location to receive said item picking container from (1) said first input picking queue or (2) said second input picking queue, and (b) cause said robotic arm to (1) pick up said one or more items from said item picking container at said picking location and (2) place said one or more items in said item dropping container at said dropping location. In some embodiments, said picking port further comprises (v) an output picking queue configured to hold and move said item picking container. In some embodiments, said controller is further configured to, after causing said robotic arm to pick up said one or more items from said item picking container at said picking location in (b): (c) cause said item picking container to be moved from said picking location to said output picking queue. 
In some embodiments, said controller is further configured to, after causing said item picking container to be moved from said picking location to an output picking queue in (c): (d) cause said picking location to receive an additional item picking container from (1) said first input picking queue or (2) said second input picking queue, and (e) cause said robotic arm to pick up one or more additional items from said additional item picking container at said picking location. In some embodiments, said picking port further comprises (vi) an output dropping queue configured to hold and move said item dropping container, said picking port further comprises (vii) an input dropping queue configured to hold and move said item dropping container, and said dropping location is configured to receive said item dropping container from said input dropping queue. In some embodiments, said input dropping queue comprises a first input dropping queue and a second input dropping queue. In some embodiments, said controller is further configured to, after causing said robotic arm to place said one or more items from said item picking container in said item dropping container at said dropping location in (b): (f) cause said item dropping container to be moved from said dropping location to an output dropping queue, (g) cause said dropping location to receive an additional item dropping container from said input dropping queue, and (h) cause said robotic arm to place said one or more additional items in said additional item dropping container at said dropping location. 
In some embodiments, said controller is further configured to, after causing said robotic arm to place said one or more items from said item picking container in said item dropping container at said dropping location in (b): (i) cause said robotic arm to place said one or more additional items in said item dropping container at said dropping location, thereby causing said item dropping container to include both said one or more items and said one or more additional items. In some embodiments, at least one of said one or more additional items is substantially identical to at least one of said one or more items. In some embodiments, each of said one or more additional items is substantially identical to each of said one or more items. In some embodiments, at least one of said one or more additional items is substantially different from at least one of said one or more items. In some embodiments, each of said one or more additional items is substantially different from each of said one or more items. In some embodiments, the system further comprises: (D) one or more sensors configured to collect sensor data corresponding to said picking port. In some embodiments, said controller is further configured to perform at least one of operations (a)-(h) based at least in part on said sensor data. In some embodiments, said one or more sensors comprise one or more cameras configured to capture image data corresponding to said picking port. In some embodiments, said controller is further configured to generate an alert at least in part in response to said image data satisfying an alert condition. In some embodiments, said one or more sensors comprise one or more laser curtains. 
In some embodiments, said one or more laser curtains are configured to detect if an item is sticking out outside of a threshold in one or more of (1) said first plurality of item picking containers, (2) said second plurality of item picking containers, (3) said item picking container, or (4) said item dropping container, and said controller is further configured to generate an alert at least in part in response to detecting that said item is sticking out outside of said threshold. In some embodiments, said one or more sensors comprise one or more weight sensors. In some embodiments, said one or more weight sensors are configured to determine (1) a first weight of a first item in one of said first plurality of item picking containers and (2) a second weight of a second item in one of said second plurality of item picking containers, and said controller is configured to determine, based at least in part on a comparison of said first weight and said second weight, to provide, to said picking location, (1) said one of said first plurality of item picking containers or (2) said one of said second plurality of item picking containers. In some embodiments, (1) a position in said first input picking queue proximal to said picking location is a first buffer location and (2) a position in said second input picking queue proximal to said picking location is a second buffer location. In some embodiments, (1) said one of said first plurality of item picking containers is at said first buffer location and (2) said one of said second plurality of item picking containers is at said second buffer location. 
In some embodiments, said one or more weight sensors are configured to determine (1) a first weight of said item picking container at said picking location prior to said robotic arm picking up said one or more items from said item picking container and (2) a second weight of said item picking container at said picking location after said robotic arm picking up said one or more items from said item picking container, and said controller is further configured to determine, based at least in part on a comparison of said first weight and said second weight, a number of items included in said one or more items. In some embodiments, said controller is further configured to generate an alert at least in part in response to said number of items satisfying an alert condition. In some embodiments, each of said first plurality of item picking containers and each of said second plurality of item picking containers is a tote. In some embodiments, said tote for each of said first plurality of item picking containers and each of said second plurality of item picking containers comprises an identifier. In some embodiments, said identifier is an item identifier corresponding to an item included in said tote for each of said first plurality of item picking containers and each of said second plurality of item picking containers. In some embodiments, said item dropping container is a shipping container. In some embodiments, said first input picking queue is substantially parallel to said second input picking queue. In some embodiments, said first input picking queue is substantially identical to said second input picking queue.
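The two weight-sensor embodiments described above — selecting which buffered container to advance to the picking location, and inferring the number of items picked from a before/after weight delta — can be sketched as follows. The function names, the heavier-item-first selection rule, and the assumption of a known per-item weight are invented for illustration and are not claimed specifics.

```python
# Hedged sketch of the weight-sensor embodiments; names and rules are
# illustrative assumptions only.

def choose_queue(first_item_weight, second_item_weight):
    """Select which buffered container to advance to the picking location,
    e.g., heavier items first so lighter items end up on top."""
    return 1 if first_item_weight >= second_item_weight else 2

def count_items_picked(weight_before, weight_after, unit_item_weight):
    """Infer the number of items removed from the picking container from
    the measured weight delta; alert logic can compare this to the order."""
    return round((weight_before - weight_after) / unit_item_weight)

print(choose_queue(1.2, 0.4))              # → 1
print(count_items_picked(10.0, 8.5, 0.5))  # → 3
```

An alert condition could then be as simple as `count_items_picked(...) != expected_count`.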
[0004] In another aspect, a method for robotic item picking, comprises: (a) receiving, at a picking location from a first input picking queue, a first item picking container; (b) picking up, using a robotic arm, a first item from said first item picking container; (c) placing, using said robotic arm, said first item into a first item dropping container at a dropping location; (d) moving said first item picking container from said picking location to an output picking queue; (e) moving said first item dropping container from said dropping location to an output dropping queue; (f) receiving, at said picking location from a second input picking queue, a second item picking container; (g) receiving, at said dropping location from an input dropping queue, a second item dropping container; (h) picking up, using said robotic arm, a second item from said second item picking container; and (i) placing, using said robotic arm, said second item into said second item dropping container. In some embodiments, said input dropping queue comprises a first input dropping queue and a second input dropping queue. In some embodiments, said first item is substantially identical to said second item. In some embodiments, said first item is substantially different from said second item. In some embodiments, one or more of operations (a)-(i) are performed based at least in part on sensor data collected by one or more sensors. In some embodiments, said one or more sensors comprise one or more cameras, the method further comprising: capturing image data, via said one or more cameras. In some embodiments, the method further comprises: generating an alert at least in part in response to said image data satisfying an alert condition. In some embodiments, said one or more sensors comprise one or more laser curtains. 
In some embodiments, the method further comprises: detecting, via said one or more laser curtains, that (1) said first item is sticking out outside of a threshold in said first item picking container or said first item dropping container or (2) said second item is sticking out outside of a threshold in said second item picking container or said second item dropping container, and generating an alert at least in part in response to detecting that (1) said first item is sticking out outside of said threshold in said first item picking container or said first item dropping container or (2) said second item is sticking out outside of said threshold in said second item picking container or said second item dropping container. In some embodiments, said one or more sensors comprise one or more weight sensors. In some embodiments, the method further comprises: determining, via said one or more weight sensors, a weight of said first item and a weight of said second item; and causing said first item to be placed, using said robotic arm, into said first item dropping container prior to said second item being placed, using said robotic arm, into said second item dropping container. In some embodiments, the method further comprises: determining (1) a first weight of said first item picking container prior to said robotic arm picking up said first item from said first item picking container and (2) a second weight of said first item picking container after said robotic arm picking up said first item from said first item picking container. In some embodiments, the method further comprises: generating an alert at least in part in response to a comparison of said first weight and said second weight. In some embodiments, each of said first item picking container and said second item picking container is a tote. In some embodiments, said tote for each of said first item picking container and said second item picking container comprises an identifier. 
In some embodiments, said identifier for said first item picking container corresponds to said first item and said identifier for said second item picking container corresponds to said second item. In some embodiments, each of said first item dropping container and said second item dropping container is a shipping container. In some embodiments, said first input picking queue is substantially parallel to said second input picking queue. In some embodiments, said first input picking queue is substantially identical to said second input picking queue.
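Operations (a)-(i) of the method above can be sketched as a single pass over queue objects. The function, the pop/append semantics, and the toy "arm" are hypothetical stand-ins for illustration, not the disclosed implementation.

```python
# Sketch of method operations (a)-(i); all names are hypothetical.
from collections import deque

def run_picking_cycle(in_pick_q1, in_pick_q2, in_drop_q,
                      out_pick_q, out_drop_q, arm_pick):
    # (a) receive a first item picking container from the first input picking queue
    first_pick = in_pick_q1.popleft()
    # (b)-(c) pick a first item and place it into a first item dropping container
    first_drop = in_drop_q.popleft()
    first_drop.append(arm_pick(first_pick))
    # (d)-(e) move both containers to their output queues
    out_pick_q.append(first_pick)
    out_drop_q.append(first_drop)
    # (f)-(g) receive a second item picking container and a second item dropping container
    second_pick = in_pick_q2.popleft()
    second_drop = in_drop_q.popleft()
    # (h)-(i) pick a second item and place it into the second item dropping container
    second_drop.append(arm_pick(second_pick))
    out_pick_q.append(second_pick)
    out_drop_q.append(second_drop)
    return out_drop_q

# Containers are modeled as lists of items; the "arm" just pops one item.
out = run_picking_cycle(deque([["bolt"]]), deque([["nut"]]),
                        deque([[], []]), deque(), deque(),
                        lambda container: container.pop())
print(list(out))  # → [['bolt'], ['nut']]
```

Alternating between the two input picking queues is what lets the next container be staged while the current one is being picked.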
[0005] In another aspect, a method for robotic item picking, comprises: (a) receiving, at a picking location from a first input picking queue, a first item picking container; (b) picking up, using a robotic arm, a first item from said first item picking container; (c) placing, using said robotic arm, said first item into an item dropping container at a dropping location; (d) moving said first item picking container from said picking location to an output picking queue; (e) receiving, at said picking location from a second input picking queue, a second item picking container; (f) picking up, using said robotic arm, a second item from said second item picking container; and (g) placing, using said robotic arm, said second item into said item dropping container, thereby causing said item dropping container to include both said first item and said second item. In some embodiments, the method further comprises: receiving, from an input dropping queue, said item dropping container. In some embodiments, said input dropping queue comprises a first input dropping queue and a second input dropping queue. In some embodiments, said first item is substantially identical to said second item. In some embodiments, said first item is substantially different from said second item. In some embodiments, one or more of operations (a)-(g) are performed based at least in part on sensor data collected by one or more sensors. In some embodiments, said one or more sensors comprise one or more cameras, the method further comprising: capturing image data, via said one or more cameras. In some embodiments, the method further comprises: generating an alert at least in part in response to said image data satisfying an alert condition. In some embodiments, said one or more sensors comprise one or more laser curtains. 
In some embodiments, the method further comprises: detecting, via said one or more laser curtains, that (1) said first item is sticking out outside of a threshold in said first item picking container or said item dropping container or (2) said second item is sticking out outside of a threshold in said second item picking container or said item dropping container, and generating an alert at least in part in response to detecting that (1) said first item is sticking out outside of said threshold in said first item picking container or said item dropping container or (2) said second item is sticking out outside of said threshold in said second item picking container or said item dropping container. In some embodiments, said one or more sensors comprise one or more weight sensors. In some embodiments, the method further comprises: determining, via said one or more weight sensors, a weight of said first item and a weight of said second item; and causing said first item to be placed, using said robotic arm, into said item dropping container prior to said second item being placed, using said robotic arm, into said item dropping container. In some embodiments, the method further comprises: determining (1) a first weight of said first item picking container prior to said robotic arm picking up said first item from said first item picking container and (2) a second weight of said first item picking container after said robotic arm picking up said first item from said first item picking container. In some embodiments, the method further comprises: generating an alert at least in part in response to a comparison of said first weight and said second weight. In some embodiments, each of said first item picking container and said second item picking container is a tote. In some embodiments, said tote for each of said first item picking container and said second item picking container comprises an identifier. 
In some embodiments, said identifier for said first item picking container corresponds to said first item and said identifier for said second item picking container corresponds to said second item. In some embodiments, said item dropping container is a shipping container. In some embodiments, said first input picking queue is substantially parallel to said second input picking queue. In some embodiments, said first input picking queue is substantially identical to said second input picking queue.
[0006] In another aspect, a method for robotic item picking, comprises: (a) obtaining item information for a plurality of items, wherein said item information comprises one or more of: item weight, item size, item fragility, or item deformability; (b) obtaining an order comprising a subset of said plurality of items; (c) determining a filling order for said subset of said plurality of items based at least in part on said item information corresponding to said subset of said plurality of items; (d) at least in response to said filling order, causing a first input picking queue to move a first item container comprising a first item of said subset of said plurality of items to a picking location; and (e) at least in response to said filling order, causing a second input picking queue to move a second item container comprising a second item of said subset of said plurality of items to said picking location. In some embodiments, the method further comprises: causing a robotic arm to pick up said first item from said first item container; and causing an output picking queue to move said first item container from said picking location. In some embodiments, said item information further comprises one or more of: item quantity, item name, item price, or item materials. In some embodiments, said order comprises a customer order.
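Operation (c) above, determining a filling order from item information, could for example sort sturdier and heavier items ahead of fragile or deformable ones so they end up at the bottom of the container. The scoring rule and attribute encoding below are an invented illustration, not the claimed method.

```python
# Illustrative sketch of determining a filling order from item information
# (weight, fragility, deformability); the scoring weights are assumptions.

def filling_order(order_items, item_info):
    """Return the order's items sorted so the sturdiest, heaviest items
    are filled first and fragile or deformable items are filled last."""
    def score(item):
        info = item_info[item]
        # lower tuples sort first: non-fragile, non-deformable, then heavier
        return (info.get("fragility", 0), info.get("deformability", 0),
                -info.get("weight", 0.0))
    return sorted(order_items, key=score)

info = {
    "kettle": {"weight": 1.8, "fragility": 0, "deformability": 0},
    "eggs":   {"weight": 0.6, "fragility": 1, "deformability": 0},
    "towel":  {"weight": 0.4, "fragility": 0, "deformability": 1},
}
print(filling_order(["eggs", "towel", "kettle"], info))
# → ['kettle', 'towel', 'eggs']
```

Given such an ordering, the controller can dispatch each item's container from whichever input picking queue holds it, in sequence.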
[0007] In another aspect, a system for robotic item picking, comprises: (A) a first conveyor system configured to hold and move a first plurality of item picking containers; (B) a second conveyor system configured to hold and move a second plurality of item picking containers; (C) a plurality of sensors configured to collect sensor data corresponding to at least a portion of said first plurality of item picking containers and at least a portion of said second plurality of item picking containers, wherein said plurality of sensors comprises one or more of: a weight sensor, a camera sensor, an identification reader, or a laser curtain; and (D) a controller configured to: (a) obtain an order comprising a plurality of items, (b) determine a filling order for said plurality of items based at least in part on said sensor data, (c) at least in response to said filling order, cause said first conveyor system to move a first item picking container of said first plurality of item picking containers to a picking location, wherein said first item picking container comprises a first item of said plurality of items, and (d) at least in response to said filling order, cause said second conveyor system to move a second item picking container of said second plurality of item picking containers to said picking location, wherein said second item picking container comprises a second item of said plurality of items. Another aspect of the present disclosure provides a system comprising one or more computer processors and computer memory coupled thereto. The computer memory comprises machine executable code that, upon execution by the one or more computer processors, implements any of the methods above or elsewhere herein.
[0008] Additional aspects and advantages of the present disclosure will become readily apparent to those skilled in this art from the following detailed description, wherein only illustrative embodiments of the present disclosure are shown and described. As will be realized, the present disclosure is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the disclosure. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
INCORPORATION BY REFERENCE
[0009] All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference. To the extent publications and patents or patent applications incorporated by reference contradict the disclosure contained in the specification, the specification is intended to supersede or take precedence over any such contradictory material.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings (also “Figure” and “FIG.” herein), of which:
[0011] FIG. 1 shows an example of a picking port and an example of a robotic arm;
[0012] FIG. 2A shows a perspective view of an example of a picking port;
[0013] FIG. 2B shows a top-down view of rollers of the example of the picking port of FIG. 2A;
[0014] FIG. 2C shows a rear orthographic view of the example of the picking port of FIG. 2A;
[0015] FIG. 2D shows a right orthographic view of the example of the picking port of FIG. 2A;
[0016] FIG. 3 shows an example of a picking port, an example of a robotic arm, and an example of a tool changer;
[0017] FIG. 4A shows a top-down view of an example single-input, single-output picking port;
[0018] FIG. 4B shows an example implementation of various sensors in the top-down view of the example single-input, single-output picking port of FIG. 4A;
[0019] FIG. 5 shows a top-down view of an example dual-input, single-output picking port;
[0020] FIG. 6 shows an example of an operation flowchart for implementing robotic item picking; and
[0021] FIG. 7 shows a computer system that is programmed or otherwise configured to implement methods provided herein.
DETAILED DESCRIPTION
[0022] While various embodiments of the invention have been shown and disclosed herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions may occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention disclosed herein may be employed.
[0023] As discussed in the Background Section, warehouse automation is a growing area of interest. However, current automated picking technologies suffer from various drawbacks. For example, many picking ports are designed to be operated exclusively by a human. Other picking ports are designed to be operated cooperatively by a human and a robotic system. In either instance, designing a picking port to be operable at least by a human disadvantageously introduces inefficiencies when a robotic system is tasked with operating that picking port. For example, many picking ports require a robot to travel large distances between a picking location and a dropping location. Further, many picking ports are slow to open, which, while a safety feature for humans, is unnecessary when operated exclusively by a robotic system. Further, many picking ports are not robot-friendly due to opening close to walls, thereby reducing robot access. Further, many picking ports are designed around human ergonomics (e.g., having tilts, shapes, structures, etc. to complement the anatomy or shape of a human body or human function).
[0024] Advantageously, systems, methods, computer-readable media, and techniques disclosed herein for item picking are designed to improve time efficiency, order accuracy, inventory management, scalability, flexibility, cost efficiency, etc. when used with robotics.
Example Robotic Item Picking Systems
[0025] FIG. 1 shows an example system 100 comprising an example of a picking port 110 and an example of a robotic arm 120. As illustrated in FIG. 1, the picking port 110 may be positioned nearby the robotic arm 120 such that the robotic arm 120 may reach one or more locations in the picking port 110.
[0026] FIG. 1 illustrates an upper-right perspective view of the system 100. As illustrated, the front end of the picking port 110 may be defined as where the robotic arm 120 picks up items from item containers passing through the picking port 110. The items are picked at a picking location at the front end of the picking port 110 and deposited at a dropping location, also at the front end of the picking port 110.
[0027] In some cases, the front side, top side, left side, and right side of the picking port 110 may have panels (e.g., plastic, metal, glass, wood, etc.). These panels may help to cover internal components (e.g., machinery) of the picking port 110. In some cases, these panels may provide mounting surfaces for internal components (e.g., cables) of the picking port 110. These panels may help to prevent injury to a human by providing a separation between the human and the internal components of the picking port 110.
[0028] In some cases, the rear side of the picking port 110 may be at least partially without panels to allow the entry and exit of item containers to the picking port 110 via the rear side. The rear side of the picking port 110 may comprise one or more entry positions corresponding to one or more input queues and one or more exit positions corresponding to one or more output queues. The one or more input queues and the one or more output queues may be implemented via one or more of: a conveyor system, a chute system, a pusher system, etc. While the entry positions and the exit positions are illustrated as on the rear side, in some cases, the entry and exit positions may be arranged at other locations in the picking port 110, such as to the left and right sides of the picking port 110. The entry position may be configured for connection to another conveyor, such as a storage system conveyor that transports item containers to the entry position of the picking port 110. In some cases, the storage system conveyor may bring item containers from a storage facility into the picking port 110. Similarly, the exit position may be configured for connection to another conveyor, for example, a storage or shipping system conveyor that transports item containers from the exit position of the picking port 110. In some cases, the storage or shipping system conveyor may bring item containers from the picking port 110 to a storage or shipping facility.
[0029] In some cases, the picking port 110 may be configured to transport item containers internally within the picking port 110. For example, as illustrated, the item container positioned in the rear left of the picking port 110 may be in a queue to be transported towards the front left of the picking port 110. Thus, item containers are transported forward from an entry position in the rear end towards the front end of the picking port 110. In some cases, at the front end of the picking port 110, the direction of movement changes to transport the item containers backward from the front end of the picking port 110 towards the back end of the picking port.
[0030] The picking port 110 may be configurable. For example, the picking port 110 may be communicatively coupled to a controller. The controller may be configured to set specifications for controlling picking operations such as the speed of transport for item containers through the picking port 110, reversal of the transport direction through the picking port 110, stop and start functions of the picking port 110, etc. The controller may be remote or co-located with the picking port 110. In some cases, the controller may correspond to a control panel that may also have a user interface, such as a screen or display configured to display the specifications of the picking port 110 or information about one or more items in the picking port 110. For example, information about the one or more items in the picking port may include the weight of an item or an item container, a size of an item or an item container, a number of items in an item container, an identifier (e.g., barcode, identification number, quick response code, etc.) of an item or an item container, a fragility of an item, a deformability of an item, etc. In some cases, the controller may be further communicatively coupled to the robotic arm 120. By coupling the controller to the robotic arm 120, operations between the picking port 110 and the robotic arm 120 may be coordinated (e.g., synchronized).
[0031] In some cases, the picking port 110 may be configured to hold one or more item containers. Each item container of the one or more item containers may be configured to hold one or more items. Although the picking port 110 may be described herein as holding or moving one or more item containers, it should be understood that, in some cases, the picking port 110 may be configured to hold one or more items directly (without the use of item containers).
[0032] In some cases, the item containers may be used for storage of items. For example, the item containers may include one or more of: a storage bin (e.g., a plastic storage bin, a metal storage bin, a fabric storage bin, a wooden storage bin, a cardboard storage bin, etc.), a storage box (e.g., a plastic storage box, a metal storage box, a fabric storage box, a wooden storage box, a cardboard storage box, etc.), a case (e.g., a plastic storage case, a metal storage case, a fabric storage case, a wooden storage case, a cardboard storage case, etc.), a storage tote (e.g., a plastic storage tote, a metal storage tote, a fabric storage tote, a wooden storage tote, a cardboard storage tote, etc.), a storage pallet (e.g., a plastic storage pallet, a metal storage pallet, a wooden storage pallet, etc.), or any other container suitable for storing items.
[0033] In some cases, the item containers may be used for shipping of items. For example, the item containers may include one or more of: a shipping bin (e.g., a plastic shipping bin, a metal shipping bin, a fabric shipping bin, a wooden shipping bin, a cardboard shipping bin, etc.), a shipping box (e.g., a plastic shipping box, a metal shipping box, a fabric shipping box, a wooden shipping box, a cardboard shipping box, etc.), a case (e.g., a plastic shipping case, a metal shipping case, a fabric shipping case, a wooden shipping case, a cardboard shipping case, etc.), a shipping tote (e.g., a plastic shipping tote, a metal shipping tote, a fabric shipping tote, a wooden shipping tote, a cardboard shipping tote, etc.), a shipping pallet (e.g., a plastic shipping pallet, a metal shipping pallet, a wooden shipping pallet, etc.), or any other container suitable for shipping items.
[0034] In some cases, the picking port 110 includes one or more locations that are configured to receive an item picking container. For example, the one or more locations may include a picking location and a dropping location. The picking location may be the location at which an item is picked up by the robotic arm 120 from the item’s corresponding item picking container. The dropping location may be the location at which an item is dropped (e.g., placed) by the robotic arm 120 into the item’s corresponding item dropping container. In some cases, the one or more locations may be discrete with respect to one another. In some cases, the one or more locations may be continuous with respect to one another (e.g., positions on a conveyor).
[0035] In some cases, the picking port 110 may include one or more queues. The one or more queues may comprise one or more locations of the picking port 110. For example, as illustrated in FIG. 1, the picking port 110 has multiple queues, each with at least one location. In some cases, the queues may transport the items or the item containers to or from a location of the picking port 110. The queues may comprise one or more of a conveyor, a chute, a pusher, or any other device used to move (e.g., via gravitational force, mechanical force, electrical force, etc.) objects from one location to another.
[0036] In some cases, the robotic arm 120 may be configured to pick up an item from a picking location of the picking port 110 and place the item in a dropping location of the picking port 110. While illustrated as being separate from the picking port 110, in some cases, the robotic arm 120 may be integrated into the picking port 110. For example, the robotic arm 120 may be built directly into the picking port 110. In some cases, a device other than a robotic arm may be used to move items between locations (e.g., a picking location and a dropping location) in the picking port 110.
[0037] FIG. 2A shows a perspective view of an example of a picking port. The picking port of FIG. 2A may be the same as or similar to the picking port 110 of FIG. 1. FIG. 2B shows a top-down view of rollers of the example of the picking port of FIG. 2A. The rollers of FIG. 2B may be included in conveyors. As illustrated, the picking port of FIG. 2A comprises four queues, each with three positions, for a total of twelve positions. FIGs. 2C and 2D provide additional views of the picking port of FIG. 2A. The picking port of FIGs. 2A-2D may be a single-input, single-output picking port, meaning that the picking port has one input picking queue, one output picking queue, one input dropping queue, and one output dropping queue.
[0038] FIG. 3 shows an example system 300 comprising an example of a picking port 310, an example of a robotic arm 320, and an example of a tool changer 330. The picking port 310 and the robotic arm 320 of FIG. 3 may be the same as or similar to the picking port 110 or the robotic arm 120, respectively. The picking port 310 further comprises the tool changer 330, as illustrated. Although illustrated as mounted on the picking port 310, the tool changer 330 may be located in any number of locations (e.g., on the robotic arm 320, separate from the robotic arm 320 and separate from the picking port 310, etc.). The tool changer 330 may be reachable by the robotic arm 320.
The tool changer 330 may be configured to provide the robotic arm 320 with various tools (e.g., grippers) for handling items in the picking port 310.
[0039] Based at least in part on the item the robotic arm 320 is handling, the robotic arm 320 may change tools using the tool changer 330. For example, the tool changer 330 may provide the robotic arm 320 with different tools depending on the weight of an item, the size of an item, the material of an item, the fragility of an item, the deformability of an item, etc.
[0040] In some cases, the picking port 310 or the robotic arm 320 may include a controller. The controller may be configured to control the picking port 310 or the robotic arm 320. The controller may cause the robotic arm 320 to pick up and move items in the picking port 310 according to an order that reduces the number of tool changes using the tool changer 330. For example, the picking port 310 may sequentially move a first subset of items in the picking port 310 to a picking location (e.g., using a buffer in the picking port 310) such that the first subset of items may be moved (e.g., sequentially) by the robotic arm 320 without the robotic arm changing tools using the tool changer 330. In another example, the robotic arm 320 may reach into the picking port 310 to pick up (e.g., sequentially) a first subset of items in the picking port 310 such that the first subset of items may be moved by the robotic arm 320 without the robotic arm changing tools using the tool changer 330. Then, once the first subset of items in the picking port 310 are moved by the robotic arm 320, the robotic arm may perform a tool change using the tool changer 330 to prepare to move a second subset of items in the picking port 310. Accordingly, the systems, the methods, the computer-readable media, and the techniques disclosed herein may improve efficiency and speed of item picking by reducing a number of tool changes performed while picking items.
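The tool-change-minimizing ordering described above can be sketched as follows. This is a minimal illustrative sketch only, assuming each item maps to a single required gripper; the item names, tool names, and function names are hypothetical and not part of the disclosure.

```python
from itertools import groupby

def order_picks_by_tool(items):
    """Reorder picks so items needing the same gripper are handled
    consecutively (items are hypothetical (item_id, required_tool) pairs)."""
    # A stable sort by required tool preserves within-tool pick order.
    return sorted(items, key=lambda item: item[1])

def count_tool_changes(ordered_items):
    """Count visits to the tool changer after the first gripper is mounted."""
    runs = [tool for tool, _ in groupby(ordered_items, key=lambda item: item[1])]
    return max(len(runs) - 1, 0)

items = [("soap", "suction"), ("mug", "claw"),
         ("towel", "suction"), ("plate", "claw")]
naive_changes = count_tool_changes(items)                         # 3 changes
grouped_changes = count_tool_changes(order_picks_by_tool(items))  # 1 change
```

In this sketch, picking in arrival order alternates grippers and forces three tool changes, while grouping the same four picks by required tool needs only one.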
Example Robotic Item Picking Process
[0041] FIG. 4A shows an example top-down view 400A of an example of a single-input, single-output picking port with an example of a robotic arm. Similar to FIG. 2B, the picking port of view 400A has four queues, each with three locations, for a total of twelve locations. Locations 401, 402, and 403 are included in an input picking queue. Locations 404, 405, and 406 are included in an output picking queue. Locations 411, 412, and 413 are included in an input dropping queue. Locations 414, 415, and 416 are included in an output dropping queue.
[0042] More specifically, location 404 is a picking location and location 414 is a dropping location. Accordingly, the robotic arm 420 may be configured to pick up one or more items included in an item picking container 451 that is positioned at the picking location 404 and move the one or more items to an item dropping container 460 positioned at the dropping location 414. In some cases, after the one or more items are picked up out of the item picking container 451, the item picking container 451 moves to the location 405 and the item picking container 450 moves from the location 403 to the picking location 404. With the item picking container 450 in the picking location 404, the robotic arm may now pick up one or more items from the item picking container 450. The locations 403 and 413 may be buffers for the picking location 404 and the dropping location 414, respectively.
[0043] As illustrated in FIG. 4A using arrows, item picking containers (or in some cases, items themselves) may move through the picking port from the input picking queue to the output picking queue. For example, as illustrated, an item picking container may: (A) enter (e.g., from another conveyor system) the input picking queue (and the picking port) at the location 401; (B) move from the location 401 to the location 402; (C) move from the location 402 to the location 403; (D) move from the location 403 to the location 404 (and the output picking queue), where the robotic arm 420 may pick an item out of the item picking container; (E) move from the location 404 to the location 405; and move from the location 405 to the location 406, thereby exiting (e.g., to another conveyor system) the output picking queue (and the picking port). Accordingly, in some cases, an item picking container may include one or more items when the item picking container is in the input picking queue (the locations 401-403) and may be empty when the item picking container is in the output picking queue (the locations 404-406).
[0044] In some cases, any number of operations of the one or more operations disclosed above with respect to one or more of operations (A)-(E) may be added or removed. Further, the one or more operations (A)-(E) may be performed in any order. Further, at least one of the one or more operations (A)-(E) may be repeated, e.g., iteratively.
[0045] As illustrated in FIG. 4A using arrows, item dropping containers (or in some cases, items themselves) may move through the picking port from the input dropping queue to the output dropping queue. For example, as illustrated, an item dropping container may: (A) enter (e.g., from another conveyor system) the input dropping queue (and the picking port) at the location 411; (B) move from the location 411 to the location 412; (C) move from the location 412 to the location 413; (D) move from the location 413 to the location 414 (and the output dropping queue), where the robotic arm 420 may drop an item into the item dropping container; and (E) move from the location 414 to the location 415; and move from the location 415 to the location 416, thereby exiting (e.g., to another conveyor system) the output dropping queue (and the picking port). Accordingly, in some cases, an item dropping container may be empty when the item dropping container is in the input dropping queue (the locations 411-413) and may include one or more items when the item dropping container is in the output dropping queue (the locations 414-416).
[0046] In some cases, any number of operations of the one or more operations disclosed above with respect to one or more of operations (A)-(E) may be added or removed. Further, the one or more operations (A)-(E) may be performed in any order. Further, at least one of the one or more operations (A)-(E) may be repeated, e.g., iteratively.
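The container flow through the input and output queues described above can be sketched as a simple shift-register model. This is an illustrative sketch only; the location numbers follow FIG. 4A, while the container names and class design are hypothetical assumptions, not part of the disclosure.

```python
class PickingPortQueue:
    """Minimal sketch of a queue in FIG. 4A: a fixed set of locations
    through which containers advance one step at a time."""

    def __init__(self, locations):
        self.locations = locations                  # e.g. [401, 402, 403]
        self.slots = {loc: None for loc in locations}

    def advance(self, entering=None):
        """Shift every container one location forward; the container in the
        last location exits and is returned (or None if that slot was empty)."""
        exiting = self.slots[self.locations[-1]]
        for i in range(len(self.locations) - 1, 0, -1):
            self.slots[self.locations[i]] = self.slots[self.locations[i - 1]]
        self.slots[self.locations[0]] = entering
        return exiting

# Containers enter at location 401, advance toward 403, then continue
# through the output picking queue (locations 404-406).
input_q = PickingPortQueue([401, 402, 403])
output_q = PickingPortQueue([404, 405, 406])

for container in ["bin-A", "bin-B", "bin-C", None, None, None]:
    handoff = input_q.advance(container)  # container leaving location 403...
    output_q.advance(handoff)             # ...enters the picking location 404
```

After six steps, the first container to enter ("bin-A") has reached the exit location 406, illustrating first-in, first-out flow through the port.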
[0047] In some cases, the picking port may include one or more sensors. As illustrated in FIG. 4A, the picking location 404 and the dropping location 414 have a weight sensor 440 and a weight sensor 441, respectively. In some cases, the weight sensors 440 and 441 may be scales. In some cases, the weight sensors 440 and 441 may be configured to determine a weight of one or more items (and, possibly, the item container) in an item container positioned on the weight sensors 440 and 441.
[0048] For example, as illustrated, the weight sensor 440 may determine a first weight that includes the weight of the item picking container 451 and the one or more items included therein. After the robotic arm 420 picks the one or more items out from the item picking container 451, the weight sensor 440 may determine a second weight that is the weight of the item picking container 451 without the one or more items included therein. Accordingly, based at least in part on the difference between the first weight and the second weight, the weight of the one or more items may be determined.
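The weight-difference check described above can be sketched as follows. This is an illustrative sketch only; the function name, the 5% tolerance, and the example weights are hypothetical assumptions, and a production system would compare against known item weights from a database.

```python
def verify_pick_weight(weight_before, weight_after, expected_item_weight,
                       tolerance=0.05):
    """Compare the weight removed from an item picking container against the
    item's known weight; the 5% relative tolerance is an assumed default.
    Returns (within_tolerance, measured_weight)."""
    measured = weight_before - weight_after
    within = abs(measured - expected_item_weight) <= tolerance * expected_item_weight
    return within, measured

# The container weighed 2.40 kg with the item and 1.90 kg after the pick;
# a (hypothetical) database lists the item at 0.50 kg.
ok, measured = verify_pick_weight(2.40, 1.90, 0.50)
# If the check fails, an alert may be generated (e.g., wrong item picked,
# or a component of the item left behind).
```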
[0049] The weight of the one or more items may be compared against known weights of the one or more items (e.g., from a database). If, for example, the weight of the one or more items differs from the known weights (e.g., by more than a certain tolerance), an alert may be generated. For example, a difference between the weight of the one or more items and the known weights may imply that the one or more items picked up by the robotic arm 420 are not the items corresponding to a label on the item picking container (e.g., the wrong item was picked up). In another example, such a difference may imply that at least one item or at least one component of an item was not picked up by the robotic arm 420.
[0050] FIG. 4B shows an example implementation of various sensors in an example top-down view 400B of the example single-input, single-output picking port of FIG. 4A. Accordingly, FIG. 4B may be similar to FIG. 4A, but FIG. 4B also depicts various sensors positioned at locations in the picking port. These sensors may be used to position and track item containers in the picking port. For example, these sensors may include identifier readers, such as barcode readers. Barcode readers may read barcodes on item picking containers to identify information about one or more items in the item picking containers, such as item weight, item size, item quantity, item name, item price, item materials, item fragility, item deformability, etc. As also disclosed herein, these sensors may include weight sensors.
[0051] Other types of sensors that may be used in addition to, or as an alternative to, barcode readers and weight sensors include laser curtains and cameras. Laser curtains may be used, for example, to detect when an item (or an item container) is lying outside a certain boundary. For example, if an item (or an item container) does not fit entirely within a single location in the picking port, an alert may be generated. In another example, if an item is sticking (e.g., partially) outside its item container, an alert may be generated. Cameras (e.g., an array of cameras) may be used to determine if certain alert conditions are satisfied. For example, if an item container has tipped over, an item has fallen out of an item container, an item container is overflowing, or an item or item container is stuck, an alert may be generated. In some cases, the picking port of the systems, the methods, the computer-readable media, and the techniques disclosed herein may use one or more of cameras, laser curtains, weight sensors, barcode readers, or the like to detect if an alarm condition is satisfied. In some cases, all of cameras, laser curtains, weight sensors, and barcode readers are used to detect if an alarm condition is satisfied.
[0052] FIG. 5 shows an example top-down view 500 of an example of a dual-input, single-output picking port with an example of a robotic arm 520. The picking port of FIG. 5 may be the same as or similar to the picking port of FIGs. 4A and 4B, at least in certain respects.
[0053] However, unlike the picking port of FIGs. 4A and 4B, the picking port of FIG. 5 has two input picking queues. The first input picking queue comprises locations 501, 502, and 503. The second input picking queue comprises locations 504, 505, and 506. The output picking queue comprises locations 507, 508, and 509. Similar to the picking port of FIGs. 4A and 4B, the picking port of FIG. 5 has a single input dropping queue and a single output dropping queue. The input dropping queue comprises locations 511, 512, and 513. The output dropping queue comprises locations 514 and 515.
[0054] As illustrated, the location 507 is a picking location and the location 513 is a dropping location. Accordingly, the robotic arm 520 may be configured to pick up items from an item picking container positioned at the location 507 and drop the items in an item dropping container positioned at the location 513. As illustrated, item picking container 554 is in the picking location 507 and the item dropping container 561 is in the dropping location 513. Once the item picking container 554 is emptied, the item picking container 554 may be moved to the location 508. Then, either item picking container 552 or item picking container 553 may be moved to the picking location 507. The locations 503, 505, and 506 may be buffers for the picking location 507. The locations 512 and 515 may be buffers for the dropping location 513.
[0055] The picking port of FIG. 5 also includes weight sensors 540 and 541. The weight sensors 540 and 541, similar to those described with respect to FIG. 4A, can be used to determine the weight of objects (e.g., items or item containers) at their corresponding location. In some cases, weight sensors may be placed in additional locations, such as the location 505, the location 503, the location 506, etc. to determine the weight of their corresponding objects. This may be useful in determining which item container to move to the picking location 507 next. For example, it may be desirable to load items from multiple different item picking containers into a single item dropping container. In this example, it may be desirable to load the items into the item dropping container from heaviest to lightest, e.g., to avoid crushing lighter items.
[0056] For illustrative purposes only, one example process may include: (A) moving a first item from the item picking container 554 into the item dropping container 561, where the first item is 6 kilograms; (B) moving the item picking container 554 to the location 508; (C) moving item picking container 553 to the picking location 507; (D) moving a second item from the item picking container 553 into the item dropping container 561, where the second item is 4 kilograms; (E) moving the item picking container 554 to the location 509 and moving the item picking container 553 to the location 508; (F) moving item picking container 552 to the picking location 507; and (G) moving a third item and then a fourth item from the item picking container 552 into the item dropping container 561, wherein the third item is 2 kilograms and the fourth item is 1 kilogram. In this illustrative example, the first item, the second item, the third item, and the fourth item were moved by the robotic arm 520 into the item dropping container 561 in descending order of weight. To achieve this descending order, a plurality of weight sensors (e.g., the weight sensors 540 and 541) at various locations in the picking port may be used. In addition, or in the alternative, weights of items may be accessed from a database comprising item information. Furthermore, achieving the descending order of the above illustrative example may involve taking advantage of the additional flexibility offered by the dual-input, single-output picking port, which has the first input picking queue and the second input picking queue.
[0057] In some cases, any number of operations of the one or more operations disclosed above with respect to one or more of operations (A)-(G) may be added or removed. Further, the one or more operations (A)-(G) may be performed in any order. Further, at least one of the one or more operations (A)-(G) may be repeated, e.g., iteratively.
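The heaviest-first scheduling illustrated by operations (A)-(G) can be sketched as a simple sort over the buffered containers. This is an illustrative sketch only; the container numbers follow FIG. 5, while the weights, the data layout, and the function name are hypothetical assumptions.

```python
def schedule_heaviest_first(buffered_containers):
    """Given containers waiting in the input picking queues, return the order
    in which to advance them to the picking location so that items land in
    the item dropping container heaviest-first (weights in kilograms)."""
    return sorted(buffered_containers,
                  key=lambda c: c["item_weight_kg"],
                  reverse=True)

# Hypothetical weights for the containers buffered in the two input
# picking queues of FIG. 5 (e.g., measured by weight sensors or read
# from an item database).
buffers = [
    {"container": 552, "item_weight_kg": 2.0},
    {"container": 554, "item_weight_kg": 6.0},
    {"container": 553, "item_weight_kg": 4.0},
]
plan = schedule_heaviest_first(buffers)
# plan advances container 554 (6 kg), then 553 (4 kg), then 552 (2 kg),
# matching the descending-weight order of the (A)-(G) example above.
```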
[0058] While illustrated as having two input picking queues, in some cases, more than two input picking queues may be implemented in a single picking port. For example, a picking port may have three input picking queues, four input picking queues, five input picking queues, six input picking queues, seven input picking queues, eight input picking queues, nine input picking queues, about ten input picking queues, about twenty input picking queues, about thirty input picking queues, about forty input picking queues, about fifty input picking queues, etc. In cases with multiple input picking queues, two or more of the input picking queues may run parallel to each other (as illustrated with the two parallel input picking queues of FIG. 5). In other cases with multiple input picking queues, two or more of the input picking queues may run orthogonal to each other (e.g., a first input picking queue feeding to a picking location from the left (e.g., via a conveyor), a second input picking queue feeding to the picking location from the front, a third input picking queue feeding to the picking location from the back, a fourth input picking queue feeding to the picking location from above (e.g., via a chute), etc.).
[0059] By having two (or, in some cases, more) input picking queues, certain locations can be used as buffers and input picking containers can be advanced through either the first input picking queue or the second input picking queue - thereby enabling control over ordering of items moved to an item dropping container. On the other hand, in a single input picking queue system (e.g., as shown in FIGs. 4A and 4B), items may be moved to an item dropping container based on their order in the single input picking queue. Advantageously, this control over ordering of items may increase speed and efficiency of the picking port. For example, item picking containers may be provided to the picking location in an order that reduces (e.g., minimizes) tool changes (e.g., gripper changes) for the robotic arm 520. In another example, this control over ordering of items may enable items to be packed in an order based on size to improve (e.g., maximize) packing density of items in the item dropping container. In another example, this control over ordering of items may enable items to be packed in a manner that sorts them based on item type (e.g., a first type of item packed in a first item dropping container and a second type of item packed in a second item dropping container). This improved control over ordering of items may further advantageously result in having fewer broken or crushed items due to more effective weight-based item packing.
[0060] In some cases, similar to how a picking port may have more than one input picking queue, a picking port may have more than one output picking queue, input dropping queue, or output dropping queue. Additional output picking queues, input dropping queues, or output dropping queues may be implemented in the same or similar manner as disclosed with respect to having additional input picking queues. Furthermore, in some cases, a picking port may have more than one picking location. For example, a picking port may have two, three, four, five, six, seven, eight, nine, about ten, about twenty, about thirty, about forty, about fifty, etc. picking locations. Similar to advantages with having multiple queues (e.g., multiple input picking queues), having multiple picking locations may advantageously enable control over ordering of items moved to an item dropping container. Additionally, in some cases, similar to how a picking port may have multiple picking locations, a picking port may have multiple dropping locations. Furthermore, in some cases, a picking port may have more than one robotic arm. For example, a picking port may have two, three, four, five, six, seven, eight, nine, about ten, about twenty, about thirty, about forty, about fifty, etc. robotic arms. Similar to advantages with having multiple queues (e.g., multiple input picking queues), having multiple robotic arms may advantageously enable control over ordering of items moved to an item dropping container. While illustrated as robotic arms, it should be understood that other devices (e.g., a pusher, a conveyor, etc.) may be used in addition to, or as an alternative to, robotic arms (e.g., having a scooper, claw, fingers, magnet, etc.) to move items from a picking location to a dropping location. In some cases, a human may move items from a picking location to a dropping location.
Example Controllers for Robotic Picking Systems
[0061] FIG. 6 shows an example of a controller diagram 600 for implementing robotic item picking. The diagram 600 includes a warehouse management system (WMS) 605, a conveyor controller 610, a port controller 615, and a programmable logic controller (PLC) 620.
[0062] The WMS 605 operates at a high level, controlling orders and order lines (an order line is a single position in an order, e.g., two tubes of toothpaste). The WMS 605 monitors which orders are filled or packed and which orders have yet to be filled or packed.
[0063] The conveyor controller 610 controls conveyor systems that move item containers (or items) into and out of a picking port. For example, the conveyor controller 610 controls the feeding of item containers into an input picking queue, out of an output picking queue, into an input dropping queue, and out of an output dropping queue. In cases with multiple input picking queues (e.g., as illustrated with respect to FIG. 5), the conveyor controller 610 controls moving item containers into each of the input picking queues. The conveyor controller 610 may schedule sending and receiving item containers to and from the picking port.
[0064] The port controller 615 controls a picking port that receives and sends item containers (or items) to and from conveyors (or other devices, such as chutes) controlled by the conveyor controller 610. More specifically, the port controller 615 controls the movement of item containers (or items) through the picking port. For example, the port controller 615 may manage item container flow into and out of the picking port as well as flow within the picking port (e.g., within a queue, between a picking location and a dropping location, etc.). Further, in some cases, the port controller 615 acknowledges orders and order lines to the WMS 605.
[0065] In some cases, the port controller 615 cooperates with the PLC 620. In cooperating, the PLC 620 may manage hardware equipment. Accordingly, the PLC 620 may manage physical item container (or item) movements based at least in part on requests from the port controller 615.
[0066] One example process depicting communication between the controllers of the controller diagram 600 may include: (A) the WMS 605 indicates to the port controller 615 that an order is available; (B) in response, the port controller 615 requests an item dropping container for an input dropping queue of a picking port; (C) the conveyor controller 610 causes the item dropping container to be delivered to the input dropping queue of the picking port and confirms delivery; (D) the port controller 615 causes the item dropping container to move to a buffer location within the picking port; (E) the port controller 615 performs a first error check by checking an identifier (e.g., a barcode) of the item dropping container against an expected identifier from the WMS 605; (F) the port controller 615 causes the item dropping container to move to a picking position; (G) the port controller 615 receives a first weight of the item dropping container; (H) the port controller 615 causes an item to be placed into the item dropping container; (I) the port controller 615 receives a second weight of an item picking container in a picking location and a third weight of the item dropping container with the item in the item dropping container; (J) based at least in part on one or more of the first weight, the second weight, or the third weight, the port controller 615 performs a second error check; (K) the port controller 615 performs a third error check by checking, via a light curtain (e.g., a laser curtain), that the item is not sticking out of the item dropping container; (L) if the WMS 605 instructs the port controller 615 that additional items are to be placed in the item dropping container, then operations (H)-(K) may be repeated (e.g., iteratively); (M) once the order is complete, the port controller 615 causes the item dropping container to move to an output dropping queue; (N) the port controller 615 performs the first error check again, checking the identifier on the item dropping container against the known identifier in the WMS 605; (O) the port controller 615 sends a request to the conveyor controller 610 to receive the item dropping container; and (P) the conveyor controller 610 causes the item dropping container to leave the picking port (e.g., to a conveyor system) and confirms with the port controller 615 that the item dropping container has left the picking port.
[0067] In some cases, any of the operations (A)-(P) disclosed above may be added or removed. Further, the operations (A)-(P) may be performed in any order. Further, at least one of the operations (A)-(P) may be repeated, e.g., iteratively. Further, in some cases, one or more operations performed by the port controller 615 may be performed at least in part (e.g., collaboratively) with the PLC 620.
[0068] While the operations (A)-(P) disclosed above are described with respect to the “dropping side” of the picking port (the side moving item dropping containers and receiving items), the same or similar (e.g., mirrored) operations may be performed by the controllers of the controller diagram 600 with respect to the “picking side” of the picking port (the side moving item picking containers and having items picked).
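For illustration only, the weight-based (second) error check in the example process above may be sketched as follows. The dropping container should have gained roughly the expected item weight between the before-placement and after-placement readings; the function name and the tolerance value are hypothetical assumptions, not values from this disclosure:

```python
def weight_check(first_weight, third_weight, expected_item_weight,
                 tolerance=0.005):
    """Sketch of the second error check: compare the weight gained by
    the item dropping container (third reading minus first reading)
    against the expected item weight. Weights in kilograms; `tolerance`
    is an illustrative allowance for scale noise."""
    gained = third_weight - first_weight
    return abs(gained - expected_item_weight) <= tolerance
```

A mirrored check could compare the weight lost by the item picking container (using the second weight) against the same expected item weight.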
Example Applications
[0069] Using a picking port, e.g., the same as or similar to those described with respect to FIGs. 1-5, items may be moved from item picking containers to item dropping containers.
[0070] In a first example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; (B) one or more second items of a second item picking container are transferred to a second item dropping container; and (C) one or more third items of a third item picking container are transferred to a third item dropping container (and so on, for example). In this first example application, the one or more first items, the one or more second items, the one or more third items, etc. may be transferred in order to change the type of containers they are placed in. For example, the first item picking container, the second item picking container, the third item picking container, etc. may be containers used within a warehouse (e.g., storage containers), whereas the first item dropping container, the second item dropping container, the third item dropping container, etc. may be containers used outside a warehouse (e.g., shipping containers).
[0071] In a second example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; and (B) one or more second items of a second item picking container are transferred to the first item dropping container (and so on, for example). In this second example application, the one or more first items may be the same items as the one or more second items. In such a case, this may be done to consolidate items. For example, the first item picking container and the second item picking container may each be only partially filled, and, in performing the operations of the second example application, the one or more first items and the one or more second items may be consolidated into the first item dropping container. In such a case, the first item picking container, the second item picking container, and the first item dropping container may all be containers of the same type (e.g., warehouse storage containers).
Contemplate, for example, that the first item picking container may be 30% filled with basketballs and the second item picking container may be 50% filled with basketballs. This second example application may include transferring the basketballs from both the first item picking container and the second item picking container into the first item dropping container, thereby filling the first item dropping container to 80% with basketballs. This first item dropping container can then be reshelved at the warehouse, taking up less space than the combination of the first item picking container and the second item picking container, thereby increasing storage density and efficiency at the warehouse. Alternatively, this first item dropping container may be a shipping container shipped to a customer who ordered an especially large number of basketballs. This second example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5.
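For illustration only, the consolidation decision in the second example application may be sketched as follows. The function name and the capacity model (fill levels expressed as fractions of one container) are hypothetical:

```python
def can_consolidate(fill_levels, capacity=1.0):
    """Sketch of the consolidation decision: partially filled picking
    containers can be merged into one dropping container when their
    combined fill level fits the dropping container's capacity."""
    return sum(fill_levels) <= capacity
```

In the basketball example above, fill levels of 0.30 and 0.50 sum to 0.80, so both picking containers can be consolidated into a single dropping container.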
[0072] In a third example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; and (B) one or more second items of a second item picking container are transferred to the first item dropping container (and so on, for example). In this third example application, the one or more first items may be different than the one or more second items. In such a case, this may be done to prepare items for shipment. For example, the first item picking container and the second item picking container may each be warehouse storage containers and the first item dropping container may be a shipping container. Contemplate, for example, that a customer orders a basketball and two pairs of socks. The first item picking container may include a large number of basketballs and the second item picking container may include a large number of socks. This third example application may include transferring one basketball from the first item picking container and two pairs of socks from the second item picking container into the first item dropping container for shipping to the customer. This third example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5.
[0073] In a fourth example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; and (B) one or more second items of the first item picking container are transferred to a second item dropping container (and so on, for example). In this fourth example application, the one or more first items may be the same as the one or more second items. In such a case, this may be done to prepare items for shipment. For example, the first item picking container may be a warehouse storage container and each of the first item dropping container and the second item dropping container may be a shipping container. Contemplate, for example, two customers who each order a basketball. The first item picking container may include a large number of basketballs. This fourth example application may include transferring one basketball from the first item picking container into the first item dropping container for shipping to the first customer and transferring one basketball from the first item picking container into the second item dropping container for shipping to the second customer. This fourth example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5. Furthermore, in some cases, this fourth example application may be enabled by taking advantage of dropping control over ordering of items that is achieved using a dual output picking port (e.g., a picking port having at least two input dropping queues; not illustrated).
[0074] In a fifth example application, (A) one or more first items of a first item picking container are transferred to a first item dropping container; and (B) one or more second items of the first item picking container are transferred to a second item dropping container (and so on, for example). In this fifth example application, the one or more first items may be different than the one or more second items. In such a case, this may be done to sort items (e.g., by type, weight, size, material, price, quality, etc.). For example, the first item picking container may be a warehouse storage container with various different types of items mixed inside. Contemplate, for example, that the first item picking container has a mixture of basketballs and socks. This fifth example application may include transferring the basketballs from the first item picking container into the first item dropping container and transferring the socks from the first item picking container into the second item dropping container. Determining which items are basketballs and which are socks may use, for example, a camera system, identifier readers, or weight sensors (each of which, e.g., as disclosed herein). This fifth example application may be enabled by taking advantage of the picking control over ordering of items that is achieved using the dual input, single output picking port of FIG. 5. Furthermore, in some cases, this fifth example application may be enabled by taking advantage of dropping control over ordering of items that is achieved using a dual output picking port (e.g., a picking port having at least two input dropping queues; not illustrated).
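For illustration only, the sorting performed in the fifth example application may be sketched as follows. The function name and the destination-container labels are hypothetical; in practice the item types would come from a camera system, identifier readers, or weight sensors as disclosed herein:

```python
def sort_items(mixed_items, destinations):
    """Sketch of the fifth example application: route each item from a
    mixed item picking container to the item dropping container
    designated for its type. `mixed_items` is a list of detected item
    types; `destinations` maps an item type to a dropping-container
    name."""
    routed = {name: [] for name in destinations.values()}
    for item in mixed_items:
        routed[destinations[item]].append(item)
    return routed
```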
Example Item Manipulation
[0075] The systems, the methods, the computer-readable media, and the techniques disclosed herein for automated picking may cooperate with one or more other technologies in a warehouse or storage facility. As disclosed herein, these one or more other technologies may include robotic technologies such as robotic arms, conveyors, chutes, pushers, item tilters, etc.
[0076] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may sort, handle, pick, place, or otherwise manipulate one or more objects of a plurality of objects. The systems, the methods, the computer-readable media, and the techniques disclosed herein may replace tasks which may be performed manually or only in a semi-automated fashion. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may be integrated with machine learning software, such that human involvement may be completely removed over time. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein are used in analyzing and packing one or more items in a container or package. In some cases, the container or package is a box. In some cases, a surveillance system determines if human intervention is needed for one or more tasks.
[0077] Robotics, such as a robotic arm or other automated manipulators, may be used for applications involving picking up or moving objects. Picking up and moving objects may involve picking an object from a picking source location and placing it at a dropping location. A robotic device may be used to fill a container with objects, create a stack of objects, unload objects from a truck bed, move objects to various locations in a warehouse, and transport objects to one or more target locations. The objects may be of the same type. The objects may comprise a mix of different types of objects, varying in size, mass, material, etc. Robotics may direct a robotic arm to pick up objects based on predetermined knowledge of where objects are in the environment. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may use a plurality of robotic arms, wherein each robotic arm transports objects to one or more dropping locations. [0078] In some cases, an item manipulation may include using a device or apparatus for reorientation of items or objects. The device for re-orienting an object or item may be referred to herein as an item tilter. In some cases, an item tilter is used in conjunction with one or more robotic arms. The item tilter may reorient an object, such that it can be properly handled by a robotic arm. In some cases, an item tilter is provided to properly orient an object prior to placement within a container or box. The item tilter may facilitate proper packing of the container or box to maximize the number of items the container may hold or minimize the additional packing/stuffing materials required for shipping of the items within the container.
[0079] In some cases, a database is provided containing information related to products being handled by automated systems of a facility. In some cases, a database comprises information on how each product or object in an inventory should be handled or manipulated by the item tilter and/or robotic arms. In some cases, a machine learning process dictates and improves upon the handling of a specific product or object. In some cases, the machine learning is trained by observation and repetition of a specific product or object being handled by a robot or automated handling system. In some cases, the machine learning is trained by observation of a human interaction with a specific object or product.
[0080] Additional details regarding example operations of item manipulation may be found in, for example, PCT/IB2021/000588, PCT/IB2023/000165, and PCT/US23/72313, all of which are incorporated by reference herein in their entireties.
Examples of Robotic Arms
[0081] The systems, the methods, the computer-readable media, and the techniques disclosed herein may cooperate with a robotic arm (e.g., as illustrated in FIG. 1 as the robotic arm 120, as illustrated in FIG. 3 as the robotic arm 320, as illustrated in FIG. 4A as 420, or as illustrated in FIG. 5 as 520). A robotic arm is a type of mechanical arm that may be used in various applications including, for example, automotive, agriculture, scientific, manufacturing, construction, etc. Robotic arms may be programmable and may be able to perform similar functions to a human arm. While robotic arms may be reliable and accurate, oftentimes they may be taught to only perform narrowly defined tasks such as picking a specific type of object from a specific location with a specific orientation.
Accordingly, robotic arms are oftentimes programmed to automate execution of repetitive tasks, such as applying paint to equipment, moving goods in warehouses, harvesting crops in a farm field, etc. Robotic arms may comprise manipulator links that are connected by joints enabling either rotational motion (such as in an articulated robot) or translational (linear) displacement.
[0082] In some cases, one or more robotic manipulators of the systems, the methods, the computer-readable media, and the techniques comprise robotic arms. In some cases, a robotic arm comprises one or more robot joints connecting a robot base and an end effector receiver or end effector. A base joint may be configured to rotate the robot arm around a base axis. A shoulder joint may be configured to rotate the robot arm around a shoulder axis. An elbow joint may be configured to rotate the robot arm about an elbow axis. A wrist joint may be configured to rotate the robot arm around a wrist axis. A robot arm may be a six-axis robot arm with six degrees of freedom. A robot arm may comprise fewer or more robot joints and may comprise fewer than six degrees of freedom.
[0083] A robot arm may be operatively connected to a controller. The controller may comprise an interface device enabling connection and programming of the robot arm. The controller may comprise a computing device comprising a processor and software or a computer program installed thereon. The computing device may be provided as an external device. The computing device may be integrated into the robot arm.
[0084] In some cases, the robotic arm can implement a wiggle movement. The robotic arm may wiggle an object to help segment the object from its surroundings. In some cases, wherein a vacuum end effector is employed, the robotic arm may employ a wiggle motion in order to create a firm seal against the object. In some cases, a wiggle motion may be utilized if the system detects that more than one object has been unintentionally handled by the robotic arm. In some cases, the robotic arm may release and re-grasp an object at another location if the system detects that more than one object has been unintentionally handled by the robotic arm.
[0085] In some cases, various end effectors of a robotic arm may comprise grippers, vacuum grippers, magnetic grippers, etc. In some cases, the robotic arm may be equipped with an end effector, such as a suction gripper. In some cases, the gripper includes one or more suction valves that can be turned on or off either by remote sensing, single point distance measurement, or by detecting whether suction is achieved. In some cases, an end effector may include an articulated extension.
[0086] In some cases, the suction grippers of a robotic arm are configured to monitor a vacuum pressure to determine if a complete seal against a surface of an object is achieved. Upon determination of a complete seal, the vacuum mechanism may be automatically shut off as the robotic manipulator continues to handle the object. In some cases, sections of suction end effectors may comprise a plurality of folds along a flexible portion of the end effector (i.e., bellows or accordion-style folds) such that sections of the vacuum end effector can fold down to conform to the surface being gripped. In some cases, suction grippers comprise a soft or flexible pad to place against a surface of an object, such that the pad conforms to said surface.
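For illustration only, the suction-seal monitoring described above may be sketched as follows. The function names, the ambient reference, and the 60 kPa threshold are illustrative assumptions, not values from this disclosure:

```python
def seal_achieved(pressure_kpa, threshold_kpa=60.0):
    """Sketch of suction-seal monitoring: a complete seal against an
    object shows up as a measured line pressure well below ambient
    (roughly 101.3 kPa at sea level). The threshold is illustrative."""
    return pressure_kpa <= threshold_kpa

def keep_pump_running(pressure_kpa):
    """Sketch of the shut-off behavior: once a complete seal is
    detected, the vacuum mechanism may be shut off while the gripper
    continues to handle the object."""
    return not seal_achieved(pressure_kpa)
```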
[0087] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein comprise a plurality of end effectors to be received by the robotic arm. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein comprise one or more end effector stages to provide a plurality of end effectors. Robotic arms may comprise one or more end effector receivers to allow the end effectors to removably attach to the robotic arm. End effectors may include single suction grippers, multiple suction grippers, area grippers, finger grippers, and other end effectors.
[0088] In some cases, an end effector is selected to handle an object based on analysis of one or more images captured by one or more image sensors, as disclosed herein. In some cases, the one or more image sensors are cameras. In some cases, an image sensor is placed before a robotic handler or arm. In some cases, the image sensor is in operative communication with a robotic handling system, which resides downstream from the image sensor. In some cases, the image sensor determines which product type is on the way or will arrive at the robotic handling system next. Based on the determination of the product, the robotic handling system may select and attach the appropriate end effector to handle the specific product type. Determination of a product type prior to the product reaching the handling station may improve efficiency of the system.
[0089] In some cases, an end effector is selected to handle an object based on information received by optical sensors scanning a machine-readable code located on the object. In some cases, an end effector is selected to handle an object based on information received from a product database, as disclosed herein.
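For illustration only, the end effector selection described above may be sketched as a lookup keyed by product type. The mapping, product names, and effector names are hypothetical; a real deployment would populate such a table from the product database disclosed herein:

```python
# Hypothetical mapping from product type to a suitable end effector.
EFFECTOR_BY_PRODUCT = {
    "bottle": "single_suction",
    "flat_pack": "area_gripper",
    "bag": "finger_gripper",
}

def select_end_effector(product_type, default="single_suction"):
    """Sketch of end effector selection: the product type (determined
    upstream by a camera or by scanning a machine-readable code on the
    object) indexes a table of suitable end effectors; unknown products
    fall back to a default effector."""
    return EFFECTOR_BY_PRODUCT.get(product_type, default)
```

Because the product type is known before the object reaches the handling station, the robotic handling system can attach the selected effector in advance, consistent with the efficiency point made above.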
Examples of Conveyors
[0090] The systems, the methods, the computer-readable media, and the techniques disclosed herein may cooperate with a conveyor. A conveyor is a common piece of mechanical handling equipment that may move materials from one location to another. Many kinds of conveying systems are available and are used according to the various needs of different industries. For example, chain conveyors (floor and overhead) may be types of conveying systems. Chain conveyors may include enclosed tracks, I-beam, towline, power & free, and hand-pushed trolleys. Conveyors may offer several advantages, including increased efficiency, versatility, and cost-effectiveness. While conveyors are widely used and may offer numerous advantages, they also have certain limitations and shortcomings. For example, conveyors operate along a fixed path, which means they may not be suitable for applications that require flexible routing or changes in the material flow direction. Adding flexibility to the system may require additional complex mechanisms or multiple conveyor lines.
Examples of Chutes
[0091] The systems, the methods, the computer-readable media, and the techniques disclosed herein may cooperate with a chute. A chute is a vertical or inclined plane, channel, or passage through which objects are moved by means of gravity. Chutes are commonly used in various industries for bulk material handling, allowing the controlled transfer of granular or bulky materials from higher to lower levels or between different processing stages. The design of chutes depends on the specific application and the characteristics of the materials being handled. The entry section of the chute is where the material is introduced into the chute from a higher elevation or conveyor. This section is designed to accommodate the flow of material smoothly and prevent any spillage or blockages. Chutes may include features like baffles or flow control gates to regulate the speed and flow of materials through the chute. These features can help prevent material surges and ensure a steady flow. The exit section of the chute is where the material discharges onto the lower level or conveyor. Chutes may be suited for free-flowing, granular, or bulk materials. Chutes may be less suited for handling cohesive materials, sticky substances, or materials with irregular shapes, as this can lead to blockages and flow issues. Depending on the drop height and material characteristics, the material flow in chutes can result in impact forces, potentially leading to material degradation or fines generation. For steeply inclined chutes, there may be limitations in controlling the material flow, leading to faster material acceleration and potentially causing material surges or damage to the chute.
Examples of Pushers
[0092] The systems, the methods, the computer-readable media, and the techniques disclosed herein may cooperate with a pusher. A pusher, in the context of material handling and logistics, refers to a mechanical device or component used to move items or products along a conveyor system or through a production line. The primary function of a pusher is to apply force to push or divert items from one conveyor lane or processing stage to another. Pushers may be used in conveyor systems and automated manufacturing processes to perform a variety of tasks. Pushers may be used to divert products from the main conveyor line to specific side lanes or different processing stages. This enables the sorting and distribution of items based on certain criteria, such as destination, size, or product type. Pushers may be employed in sorting systems to direct items to different designated destinations or shipping lanes based on predetermined criteria. Pushers may be used to stage items or products for further processing or packaging. Pushers can transfer products between conveyors or equipment in a production line, facilitating the smooth flow of materials. At very high speeds, pushers may not have enough time to properly engage with and push items, leading to sorting or diverting errors.
Achieving precise positioning and alignment of products for proper pushing can be challenging, especially with varying sizes or misaligned items. For applications involving complex sorting patterns or multiple destination lanes, the design and synchronization of multiple pushers can become intricate.
Examples of Item Tilters
[0093] The systems, the methods, the computer-readable media, and the techniques disclosed herein may cooperate with an item tilter. An item tilter may be a mechanical device used to tilt or rotate items, loads, or pallets to a specific angle. One use of an item tilter is to reorient materials or products to facilitate easier handling, improve ergonomics, or aid in certain manufacturing or processing operations. The design of an item tilter may include a platform or surface on which the load or item is placed. The platform may be attached to a tilting mechanism that allows controlled tilting or rotation of a load. The tilting action can be achieved through hydraulic, pneumatic, or electric means, depending on the item tilter's design and intended application. Item tilters offer advantages in terms of improving efficiency, reducing manual handling strain, and enhancing the overall material handling process.
[0094] In some cases, automated systems, which may include robotic arms, handle unpacking items from warehouse bins into cardboard boxes, preparing them to be shipped to the final customer. In some cases, the order and position of incoming goods are random. Therefore, items may be initially provided in positions which make it very difficult to place the item in the destination box in the position that optimizes the volume occupied inside the target box. In some cases, an item tilter is positioned at a goods-to-robot station where items are provided to a robot for picking and manipulation. The item tilter may facilitate proper packing in cases where the robot is unable to place items on their flat side. In some cases, the item tilter will reorient items in preparation for packing into a final container or box.
[0095] In some cases, the item tilter does not grasp, clamp, or grip the object being manipulated. In some cases, an item tilter which does not perform grasping, gripping, clamping, or similar actions prevents damage to the objects handled by the tilter. In some cases, the item tilter comprises a substantially planar surface which the object is placed on. The surface may rotate in a specified direction to properly reorient the object in preparation for packing or manipulation by a robot. In some cases, the item tilter comprises two substantially planar surfaces, orthogonal to one another. In some cases, the object is placed against both surfaces prior to rotation by the item tilter. In some cases, the object is placed only against one surface and gravity assists with abutting the object against the second surface. In some cases, the item tilter comprises two or more substantially planar surfaces, wherein connecting surfaces are orthogonal to one another. In some cases, the object is placed against at least one surface prior to rotation by the item tilter. In some cases, the item tilter is coupled to a product database, as disclosed herein. The product database may relay an appropriate speed of rotation to the item tilter based on characteristics of the object being handled, so as to prevent damage, mishandling, or misplacement of the object.
[0096] In some cases, the item tilter comprises one or more surfaces which grasp, clamp, or otherwise hold the object during rotation. In some cases, the item tilter is capable of applying different pressures to hold the object. In some cases, the item tilter is coupled to an item database, as disclosed herein. The item database may relay an appropriate pressure based on characteristics of the object being handled, so as to prevent damage, mishandling, or misplacement of the object. In some cases, a rotating surface of the item tilter comprises a suction effector to retain an object during rotation.
[0097] In some cases, the item tilter and robots of the system are operatively connected to a product database, programmable logic controller, computer system, or a combination thereof. In some cases, the item tilter provides a ready-for-operation status. In some cases, the ready-for-operation status comprises a digital output of ON and signifies when a new item can be placed in the device to be tilted. In some cases, the item tilter provides a final-position digital output when an item is ready to be picked by an adjacent robot after being rotated into a desired orientation by the item tilter. In some cases, the item tilter receives a cycle-start indication when rotation of the item is to begin. In some cases, automated systems disclosed herein may be able to make decisions not to put the item on the tilter based on the product database.
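For illustration only, the digital I/O cycle described above (ready-for-operation output, cycle-start input, final-position output) may be sketched as a small state machine. The class and signal names are illustrative, and the rotation is modeled as instantaneous:

```python
class ItemTilter:
    """Minimal sketch of the item tilter's digital I/O cycle: the
    ready-for-operation output is ON when a new item may be placed, a
    cycle-start input begins rotation, and the final-position output
    signals that the rotated item may be picked by an adjacent robot."""

    def __init__(self):
        self.ready = True           # ready-for-operation output
        self.final_position = False  # final-position output

    def cycle_start(self):
        """PLC indicates an item was placed and rotation should begin."""
        if not self.ready:
            raise RuntimeError("tilter not ready for a new item")
        self.ready = False
        self._rotate()

    def _rotate(self):
        # Rotation complete: the item is in the final position and, per
        # the disclosure, the tilter may already accept the next item.
        self.final_position = True
        self.ready = True

    def item_picked(self):
        """Adjacent robot removed the item from the final position."""
        self.final_position = False
```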
[0098] In some cases, an item tilter is provided at a product loading station. The product loading station may be a cell that includes a robot (e.g., possibly with a picking port), a tilter, a scanner, etc. The product loading station may comprise two or more item tilters. In some cases, the product loading station is provided in proximity to or adjacent to a loading apparatus or output thereof. In some cases, the loading apparatus provides the object to the item tilter. The loading apparatus may comprise a conveyor, a robotic handler, a chute, a pusher, or a combination thereof. In some cases, the loading apparatus provides two or more objects to the item tilter for simultaneous rotation of said two or more objects. In some cases, an unloading apparatus is provided at the product loading station to move the object into proximity of or place the object inside the destination container. An unloading apparatus may include the item tilter, a conveyor, a robotic handler, a chute, a pusher, or a combination thereof.
[0099] In some cases, one or more sensors are provided at the product loading station. In some cases, a vision system comprising at least one optical sensor is provided at the product loading station. In some cases, the vision system identifies one or more characteristics of items or objects provided at the product loading station. In some cases, the vision system is in operative communication with the software module and a computer processor. In some cases, the software module instructs the unloading apparatus to move the object in proximity to or within a destination container. In some cases, the software module is operatively connected to a product database to determine one or more characteristics of an object, as disclosed herein. In some cases, the product database provides a desired orientation for an object provided at the product loading station.
In some cases, automated systems may provide a decision, based on the product database, whether or not to put an object on the tilter. For example, an item may be deformable, such that tilting will not help. [0100] Operation of an item tilter may be understood as a cyclic process, wherein a cycle starts when an object is placed into the item tilter by a robot and is complete when the object is provided in a final position and the item tilter is ready to receive a subsequent object. In some cases, when an object is placed into the item tilter by a robot, a signal is sent from a programmable logic controller (PLC) of the system to start the cycle. During the cycle, the item tilter rotates the object, as disclosed herein. In some cases, as the item tilter is rotating the object, the robot will pick a second object to be placed in the item tilter. In some cases, at the end of the cycle, the item is positioned in the final position. In some cases, the cycle ends when the device is ready for placing the next item. In some cases, the item tilter is ready for placing the next item even if the first one was not removed from the final position. In some cases, the robot picks the first object from the final position and places it in the destination container or box as the second object is being rotated. In some cases, an additional robot is utilized, wherein one robot places objects into the item tilter and an additional robot places them into a destination container or box.
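The cyclic operation described above can be sketched as a small state model. The class and signal names below (`ItemTilter`, `READY`, `ROTATING`, `FINAL_POSITION`) are illustrative assumptions standing in for the PLC digital inputs and outputs, not names from this disclosure:

```python
from enum import Enum, auto

class TilterState(Enum):
    READY = auto()           # ready-for-operation: a new item may be placed
    ROTATING = auto()        # cycle started; an item is being tilted

class ItemTilter:
    """Minimal model of the tilter cycle driven by PLC-style signals."""

    def __init__(self):
        self.state = TilterState.READY
        self.item_in_final = False  # final-position output

    def cycle_start(self):
        # Signal sent when the robot has placed an item into the tilter.
        assert self.state == TilterState.READY
        self.state = TilterState.ROTATING

    def rotation_done(self):
        # Item reaches the final position; the tilter reports the
        # final-position output and is immediately ready for the next
        # item, even if this one has not yet been removed.
        self.item_in_final = True
        self.state = TilterState.READY

    def item_picked(self):
        # Adjacent robot removes the item from the final position.
        self.item_in_final = False
```

Returning to `READY` in the same step that sets the final-position output models the overlap described above, where the tilter accepts a second item while the first still waits to be picked.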
[0101] In some cases, the item tilter is ready to receive the next object for the operation while the previous object is positioned in the final position. In some cases, the item tilter is capable of tilting the next item even if the previous item is in the final position, as mentioned above. In some cases, the total cycle time as described above should take no more than 0.5, 1, 2, 3, 4, 5, 10, or 15 seconds. [0102] In some cases, an item tilter is designed to handle objects having a height (Z-axis), of up to 200 millimeters (mm). In some cases, an item tilter is designed to handle objects having a width (Y- axis), of up to 300 mm. In some cases, an item tilter is designed to handle objects having a length (X-axis), of up to 200 mm. In some cases, the item is provided such that the length (X-axis) of the object corresponds to the smallest dimension of the object. In some cases, the item tilter handles objects weighing up to 3 kilograms (kg). In some cases, the processes described above are carried out without prior determination of the shape of the object being handled. In some cases, the shape of the object being handled is provided by a product database or information gathered by sensors, as described herein.
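The handling limits above can be expressed as a simple envelope check. The function below is a minimal sketch; it assumes, as stated above, that the object is presented with its smallest dimension along the X-axis (length), and additionally assumes the largest dimension is presented along the Y-axis (width), which has the largest allowance:

```python
MAX_LENGTH_MM = 200   # X-axis (smallest dimension of the object)
MAX_WIDTH_MM = 300    # Y-axis
MAX_HEIGHT_MM = 200   # Z-axis
MAX_WEIGHT_KG = 3.0

def fits_tilter(dims_mm, weight_kg):
    """Return True if an object fits the example tilter envelope.

    dims_mm is any (a, b, c) triple of side lengths in millimeters;
    the orientation mapping below is an assumption for illustration.
    """
    small, mid, large = sorted(dims_mm)
    return (small <= MAX_LENGTH_MM       # length, X-axis
            and large <= MAX_WIDTH_MM    # width, Y-axis
            and mid <= MAX_HEIGHT_MM     # height, Z-axis
            and weight_kg <= MAX_WEIGHT_KG)
```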
[0103] In some cases, the destination container comprises dimensions of about 310 mm length, 220 mm width, and 140 mm height. In some cases, the destination container comprises dimensions of about 410 mm length, 305 mm width, and 195 mm height. In some cases, the destination container comprises dimensions of about 595 mm length, 395 mm width, and 250 mm height.
[0104] In some cases, an item tilter is provided as a component of an automated warehouse. In some cases, an item tilter is adjacent to one or more conveyor belts. In some cases, an item tilter is adjacent to one or more components for automated movement of items. The automated components may include a robotic arm, a conveyor belt or system, a chute system, a pushing apparatus, or a combination thereof.
Examples of Optical Sensors
[0105] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein use one or more optical sensors. The optical sensors may be operatively coupled to at least one processor. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may use data storage comprising instructions executable by the at least one processor to cause performance of various functions. For example, the functions may include causing the robotic manipulator to move at least one physical object through a designated area in space of a physical environment. The functions may further include causing one or more optical sensors to determine a location of a machine-readable code on the at least one physical object as the at least one physical object is moved through a target location. Based on the determined location, at least one optical sensor may scan the machine-readable code as the object is moved so as to determine information associated with the object encoded in the machine-readable code.
[0106] In some cases, information obtained by a machine readable code is referenced to a product database. The product database may provide information corresponding to an object being handled by a robotic manipulator, as disclosed herein. The product database may provide information corresponding to a target location or position of the object and verify that the object is in a proper location.
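A minimal sketch of such a database lookup, assuming a flat key-value store keyed by the identifier decoded from the machine-readable code (the SKU codes and bin names are invented for illustration):

```python
# Hypothetical product database keyed by the decoded identifier.
PRODUCT_DB = {
    "SKU-1001": {"expected_location": "BIN-A3"},
    "SKU-1002": {"expected_location": "BIN-B7"},
}

def verify_location(decoded_sku, observed_location):
    """Return True if the scanned object is at its expected location."""
    record = PRODUCT_DB.get(decoded_sku)
    if record is None:
        return False  # unknown product: treat as misplaced
    return record["expected_location"] == observed_location
```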
[0107] In some cases, based on the information associated with the object obtained from the machine-readable code, the systems, the methods, the computer-readable media, and the techniques disclosed herein may determine a respective location (e.g., a dropping location) at which to cause a robotic manipulator to place an object. In some cases, based on the information associated with the object obtained from the machine-readable code, the systems, the methods, the computer-readable media, and the techniques may place an object at the target location.
[0108] In some cases, the information comprises proper orientation of an object. In some cases, proper orientation is referenced to the surface on which a machine-readable code is provided. Information comprising proper orientation of an object may determine the orientation at which the object is to be placed at the dropping location. Information comprising proper orientation of an object may be used to determine a grasping or handling point at which a robotic manipulator grasps, grips, or otherwise handles the object.
[0109] In some cases, information associated with an object obtained from the machine-readable code may be used to determine one or more anomaly events. Anomaly events may include misplacement of the object within a warehouse or within the system, damage to the object, unintentional connection of more than one object, combinations thereof, or other anomalies which would result in an error in placing an object in an appropriate position or otherwise cause an error in further processing. In some cases, when an anomaly is detected, a warning, alert, or other indication will be provided (e.g., to a human operator).
[0110] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may determine that the object is at an improper location from the information associated with the object obtained from the machine-readable code. The systems, the methods, the computer-readable media, and the techniques disclosed herein may generate an alert that the object is located at an improper location, as disclosed herein. The systems, the methods, the computer-readable media, and the techniques disclosed herein may place the object at an error or exception location. The exception location may be located within a container. In some cases, the exception location is designated for objects which have been determined to be at an improper location within the system or within a warehouse. [0111] In some cases, information associated with an object obtained from the machine-readable code may be used to determine one or more properties of the object. The information may include expected dimensions, shapes, or images to be captured. Properties of an object may include an object's size, an object's weight, flexibility of an object, and one or more expected forces to be generated as the object is handled by a robotic manipulator.
[0112] In some cases, a robotic manipulator comprises the one or more optical sensors. The one or more optical sensors may be physically coupled to a robotic manipulator. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may use multiple cameras oriented at various positions such that when one or more optical sensors are moved over an object, the optical sensors can view multiple surfaces of the object at various angles. Alternatively, the systems, the methods, the computer-readable media, and the techniques disclosed herein may use multiple mirrors so that one or more optical sensors can view multiple surfaces of an object. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein use one or more optical sensors located underneath a platform on which an object is placed or moved over during a scanning procedure. The platform may be transparent or semitransparent so that the optical sensors located underneath it can scan a bottom surface of the object. [0113] In another example configuration, the robotic arm may bring a box through a reading station after or while orienting the box in a certain manner, such as in a manner to place the machine-readable code in a position in space where it can be easily viewed and scanned by one or more optical sensors.
[0114] In some cases, the one or more optical sensors comprise one or more image sensors. The one or more image sensors may capture one or more images of an object to be handled by a robotic manipulator or an object being handled by the robotic manipulator. In some cases, the one or more image sensors comprise one or more cameras. In some cases, an image sensor is coupled to a robotic manipulator. In some cases, an image sensor is placed near a workstation of a robotic manipulator to capture images of one or more objects to be handled by the manipulator. In some cases, the image sensor captures images of an object being handled by a robotic manipulator. [0115] In some cases, one or more image sensors comprise a depth camera. The depth camera may be a stereo camera, an RGBD (RGB Depth) camera, or the like. The camera may be a color or monochrome camera. In some cases, one or more image sensors comprise an RGBaD (RGB+active depth, e.g., an Intel RealSense D415 depth camera) color or monochrome camera registered to a depth sensing device that uses active vision techniques, such as projecting a pattern into a scene to enable depth triangulation between the camera or cameras and the known offset pattern projector. In some cases, the camera is a passive depth camera. In some cases, cues such as barcodes, texture coherence, color, 3D surface properties, or printed text on the surface may also be used to identify an object or find its pose in order to know where or how to place the object. In some cases, shadow or texture differences may be employed to segment objects as well. In some cases, an image sensor comprises a vision processor. In some cases, an image sensor comprises an infrared stereo sensor system. In some cases, an image sensor comprises a stereo camera system.
[0116] In some cases, a virtual environment including a model of the objects in 2D or 3D may be determined and used to develop a plan or strategy for picking up the objects and verifying their properties are an approximate match to the expected properties. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein use one or more sensors to scan an environment containing objects. In an embodiment, as a robotic arm moves, a sensor coupled to the arm captures sensor data about a plurality of objects in order to determine shapes or positions of individual objects. A larger picture of a 3D environment may be stitched together by integrating information from individual (e.g., 3D) scans. In some cases, the image sensors are placed in fixed positions, on a robotic arm, or in other locations. In some cases, scans may be constructed and used in accordance with any or all of a number of different techniques.
[0117] In some cases, scans are conducted by moving a robotic arm upon which one or more image sensors are mounted. Data comprising the position of the robotic arm may be correlated to determine the position at which a mounted sensor is located. Positional data may also be acquired by tracking key points in the environment. In some cases, scans may be from fixed-mount cameras that have fields of view (FOVs) covering a given area.
[0118] In some cases, a virtual environment is built using a 3D volumetric or surface model to integrate or stitch information from more than one sensor. This may allow the systems, the methods, the computer-readable media, and the techniques disclosed herein to operate within a larger environment, where one sensor may be insufficient to cover a large environment. Integrating information from multiple sensors may yield finer detail than a single scan alone. Integration of data from multiple sensors may reduce noise levels. This may yield better results for object detection, surface picking, or other applications.
[0119] Information obtained from the image sensors may be used to select one or more grasping points of an object. In some cases, information obtained from the image sensors may be used to select an end effector for handling an object. [0120] In some cases, an image sensor is attached to a robotic arm. In some cases, the image sensor is attached to the robotic arm at or adjacent to a wrist joint. In some cases, an image sensor attached to a robotic arm is directed to obtain images of an object. In some cases, the image sensor scans a machine-readable code placed on a surface of an object.
[0121] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may integrate edge detection software. One or more captured images may be analyzed to detect or locate the edges of an object. The object may be at an initial position prior to being handled by a robotic manipulator or may be in the process of being handled by a robotic manipulator when the images are captured. Edge detection processing may comprise processing one or more two-dimensional images captured by one or more image sensors. Edge detection algorithms utilized may include Canny method detection, first-order differential detection methods, second- order differential detection methods, thresholding, linking, edge thinning, phase congruency methods, phase stretch transformation (PST) methods, subpixel methods (including curve-fitting, moment-based, reconstructive, and partial area effect methods), and combinations thereof. Edge detection methods may utilize sharp contrasts in brightness to locate and detect edges of the captured images.
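As one illustration of the simplest family listed above, a first-order differential detector with thresholding can be sketched in a few lines of pure Python. The threshold value is an assumed constant, and a production system would more likely use a library implementation such as the Canny method:

```python
def detect_edges(image, threshold=50):
    """First-order differential edge detection on a 2D grayscale image.

    Marks a pixel as an edge when the brightness contrast with its
    right or lower neighbour exceeds `threshold` (an assumed value).
    `image` is a list of rows of integer intensities (0-255).
    """
    rows, cols = len(image), len(image[0])
    edges = [[False] * cols for _ in range(rows)]
    for r in range(rows - 1):
        for c in range(cols - 1):
            dx = abs(image[r][c + 1] - image[r][c])  # horizontal gradient
            dy = abs(image[r + 1][c] - image[r][c])  # vertical gradient
            if max(dx, dy) > threshold:
                edges[r][c] = True
    return edges
```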
[0122] From the edge detection, the systems, the methods, the computer-readable media, and the techniques disclosed herein may record measured dimensional values of an object. The measured dimensional values may be compared to expected dimensional values of an object to determine if an anomaly event has occurred. Anomaly events based on dimensional comparison may indicate a misplaced object, unintentionally connected objects, damage to an object, or combinations thereof. Determination of an anomaly occurrence may trigger an anomaly event, as discussed herein.
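A dimensional comparison of this kind might look like the following sketch, where the tolerance value and the anomaly labels are illustrative assumptions rather than values from this disclosure:

```python
def dimensional_anomaly(measured_mm, expected_mm, tolerance_mm=5.0):
    """Compare measured vs. expected (length, width, height) in mm.

    Returns an anomaly label, or None when all dimensions fall within
    the assumed tolerance.
    """
    deltas = [m - e for m, e in zip(measured_mm, expected_mm)]
    if all(abs(d) <= tolerance_mm for d in deltas):
        return None
    if any(d > tolerance_mm for d in deltas):
        # Larger than expected: possibly unintentionally connected objects.
        return "oversize"
    # Smaller than expected: possibly a damaged or wrong object.
    return "undersize"
```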
[0123] In some cases, one or more images captured of an object may be compared to one or more references images. A comparison may be conducted by an integrated computing device of the systems, the methods, the computer-readable media, and the techniques disclosed herein. In some cases, the one or more reference images are provided by a product database. Appropriate reference images may be correlated to an object by correspondence to a machine-readable code provided on the object.
[0124] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may compensate for variations in the angles and distances at which the images are captured during the analysis. In some cases, an anomaly alert is generated if the difference between one or more captured images of an object and one or more reference images of the object exceeds a predetermined threshold. A difference between one or more captured images and one or more reference images may be taken across one or more dimensions or may be a sum difference between the one or more images.
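The sum-difference variant can be sketched as follows for two equally sized grayscale images represented as nested lists; a real system would first align and normalize the images to compensate for angle and distance, as noted above:

```python
def image_anomaly(captured, reference, threshold):
    """Flag an anomaly when the sum of absolute per-pixel differences
    between two equally sized grayscale images exceeds `threshold`."""
    total = sum(
        abs(c - r)
        for crow, rrow in zip(captured, reference)
        for c, r in zip(crow, rrow)
    )
    return total > threshold
```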
[0125] In some cases, reference images are sent to an operator during a verification process. The operator may view the one or more reference images in relation to the one or more captured images to determine if generation of an anomaly event or alert was correct. The operator may view the reference images in a comparison module. The comparison module may present the reference images side-by-side with the captured images.
Examples of Anomaly Detection
[0126] The systems, the methods, the computer-readable media, and the techniques disclosed herein may be configured to detect anomalies which occur during the handling or processing of one or more objects. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein obtain one or more properties of an object prior to the object being handled by a robotic manipulator and analyze the obtained properties against one or more expected properties of the object. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein obtain one or more properties of an object while the object is being handled by a robotic manipulator and analyze the obtained properties against one or more expected properties of the object. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein obtain one or more properties of an object after the object has been handled by a robotic manipulator and analyze the obtained properties against one or more expected properties of the object. In some cases, if an anomaly is detected, the systems, the methods, the computer-readable media, and the techniques disclosed herein do not proceed to place the object at a target position. The systems, the methods, the computer-readable media, and the techniques disclosed herein may instead instruct a robotic manipulator to place the object at an exception position, as described herein. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may verify a registered anomaly with an operator prior to placing an object at a given position.
[0127] In some cases, one or more optical sensors scan a machine-readable code provided on an object. Information obtained from the machine-readable code may be used to verify that an object is in a proper location. If it is determined that an object is misplaced, the systems, the methods, the computer-readable media, and the techniques disclosed herein may register an anomaly event corresponding to the misplacement of said object. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein generate an alert if an anomaly event is registered.
Examples of Humans in the Loop
[0128] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein communicate with an operator or other user. The systems, the methods, the computer-readable media, and the techniques disclosed herein may communicate with an operator using a computing device. The computing device may be an operator device. The computing device may be configured to receive input from an operator or user with a user interface. The operator device may be provided at a location remote from operations of the facility.
[0129] In some cases, an operator utilizes an operator device to verify one or more anomaly events or alerts. In some cases, the operator device receives captured images from one or more image sensors to verify that an anomaly has occurred in an object. An operator may provide verification that an object has been misplaced or that an object has been damaged based on the one or more images.
[0130] In some cases, captured images are provided in a module to be displayed on a screen of an operator device. In some cases, the module displays the one or more captured images adjacent to one or more reference images corresponding to said object. In some cases, one or more captured images are displayed on a page adjacent to a page displaying one or more reference images.
[0131] In some cases, an operator uses an interface of the operating device to verify that an anomaly event or alert was correctly generated. Verification provided by the operator may be used to train a machine learning algorithm, as disclosed herein. In some cases, verification that an alert was correctly generated adjusts a predetermined threshold which is used to generate an alert if a difference between one or more measured properties and one or more corresponding expected properties of an object exceeds said predetermined threshold. In some cases, verification that an alert was incorrectly generated adjusts a predetermined threshold which is used to generate an alert if a difference between one or more measured properties and one or more corresponding expected properties of an object exceeds said predetermined threshold.
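One hedged sketch of such a feedback-driven threshold adjustment, using a multiplicative step (the 5% step size is an assumed tuning constant, not a value from this disclosure):

```python
def adjust_threshold(threshold, alert_was_correct, step=0.05):
    """Nudge an anomaly-alert threshold based on operator verification.

    A confirmed (true-positive) alert lowers the threshold slightly so
    similar anomalies keep triggering; a rejected (false-positive)
    alert raises it so borderline differences stop alerting.
    """
    if alert_was_correct:
        return threshold * (1.0 - step)
    return threshold * (1.0 + step)
```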
[0132] In some cases, verification of an alert instructs a robotic manipulator to handle an object in a particular manner. For example, if an anomaly alert corresponding to an object is verified as being correctly generated, the robotic manipulator may place the object at an exception location. In some cases, if an anomaly alert corresponding to an object is verified as being incorrectly generated, the robotic manipulator may place the object at a target location. In some cases, if an alert is generated and an operator verifies that two or more objects are unintentionally being handled simultaneously, then the robotic manipulator performs a wiggling motion in an attempt to separate the two or more objects.
[0133] In some cases, one or more images of a target container or target location at which one or more objects are provided are transmitted to an operator or user device. An operator or user may then verify that the one or more objects are correctly placed at the target location or within a target container. A user or operator may also provide feedback using an operator or user device to communicate errors if the one or more objects have been incorrectly placed at the target location or within the target container.
[0134] In some cases, it may be determined that human intervention is required for proper handling of an object type. In some cases, a specific product may require manual handling or packaging by human operators. As disclosed herein, a database may provide information as to which products require human intervention or handling. In some cases, a warehouse surveillance or monitoring system alerts human handlers to incoming products which require human intervention. In some cases, upon detection of a product requiring human intervention, the systems, the methods, the computer-readable media, and the techniques disclosed herein route said product or a container holding said product to a station designated for human intervention. The station may be separated from automated handling systems or robotic arms. Separation may be necessary for safety reasons or to provide an accessible area for a human to handle the products.
Example Surveillance System
[0135] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may operate using a surveillance system for monitoring operations or product flow in a facility. The surveillance system may operate with a picking port. For example, the surveillance system may monitor items as the items move through a picking port (e.g., from a picking location to a dropping location). In some cases, the surveillance system is integrated into an existing warehouse with automated handling systems. In some cases, the surveillance system comprises a database of information for each product to be handled in the warehouse. In some cases, the database is updated, as disclosed herein.
[0136] In some cases, the surveillance system comprises at least one image sensor. In some cases, the surveillance system allows for identification of a product type. In some cases, identification of a product type at one or more points through a product flow in a facility allows for monitoring to determine if the facility is running efficiently or if an anomaly has occurred. In some cases, the surveillance system allows for determination of an appropriate package size for the one or more products to be placed and packaged within. In some cases, the surveillance system allows for automated quality control of products and packaging within a facility.
[0137] In some cases, an image sensor is provided prior to or upstream from an automated handling station. An image sensor provided prior to an automated handling system may allow for proper preparation by the handling system prior to arrival of a specific product type. In some cases, an image sensor provided prior to an automated handling system captures one or more images of a product or object to facilitate determination of an appropriate handler to which the product should be sent. [0138] In some cases, an image sensor provided prior to an automated handling system identifies if a product has been misplaced or will not be able to be handled by an automated system downstream from the image sensor.
[0139] In some cases, a surveillance system comprises one or more image sensors located after or downstream from an automated handling robot or system. In some cases, an image sensor provided downstream from a handling station captures one or more images of a product after being handled or placed to verify correct placement or handling. Verification may be done on products handled on an automated system or by a human handler.
[0140] In some cases, the surveillance system includes further sensors, such as weight sensors, motion sensors, laser scanners, or other sensors useful for gathering information related to a product or container.
Examples of Warehouse Automation
[0141] The systems, the methods, the computer-readable media, and the techniques disclosed herein may be implemented in existing warehouses to automate one or more processes within a warehouse. In some cases, software and robotic manipulators of the system are integrated with the existing warehouse systems to provide a smooth transition of manual operations being automated.
[0142] In some cases, a product database is provided in communication with the systems, the methods, the computer-readable media, and the techniques disclosed herein. The product database may comprise a library of objects to be handled, e.g., in a picking port. The product database may include properties of each object to be handled. In some cases, the properties of the objects provided by the product database are expected properties of the objects. The expected properties of the objects may be compared to measured properties of the objects in order to determine if an anomaly has occurred. [0143] Expected properties may include expected dimensions, expected forces, expected weights, and expected machine-readable codes, as disclosed herein. Product databases may be updated according to the objects to be handled. Product databases may be generated from input of information on the objects to be handled.
[0144] In some cases, objects may be processed by the systems, the methods, the computer-readable media, and the techniques disclosed herein to generate a product database. For example, an undamaged object may be handled by one or more robotic manipulators to determine expected properties of the object. Expected properties of the object may include expected dimensions, expected forces, expected weights, and expected machine-readable codes, as disclosed herein. The expected properties may then be input into the product database.
[0145] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein may process a plurality of objects of the same type to determine a standard deviation occurring within objects of that type. The determined standard deviations may be used to set a predetermined threshold, wherein a difference between expected properties and measured properties of an object exceeding the threshold may trigger an anomaly alert. In some cases, the predetermined threshold is based on a standard deviation of differences among objects of the same type. In some cases, the standard deviation is multiplied by a constant factor to set the predetermined threshold.
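A minimal sketch of this thresholding scheme, using Python's sample standard deviation and an assumed multiplier of 3:

```python
import statistics

def anomaly_threshold(samples, factor=3.0):
    """Predetermined threshold from measurements of undamaged objects
    of one type: the sample standard deviation multiplied by a constant
    factor (3.0 is an assumed choice, not a value from this disclosure).
    """
    return statistics.stdev(samples) * factor

def is_anomalous(measured, expected, threshold):
    """Flag a measurement whose deviation from the expected value
    exceeds the predetermined threshold."""
    return abs(measured - expected) > threshold
```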
[0146] In some cases, the product database comprises a set of filtering criteria. The filtering criteria may be used for routing objects to a proper handling station. Filtering criteria may be used for routing objects to a robotic handling station or a human handling station. Filtering criteria may be utilized for routing objects to an appropriate robotic handling station with an automated handler suited for handling a particular object or product type.
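Routing by filtering criteria can be sketched as a chain of station predicates with a human-handling fallback; the station names and product fields below are invented for illustration:

```python
def route_object(product, stations):
    """Route a product record to the first station whose filtering
    criteria it satisfies, falling back to a human handling station."""
    for name, accepts in stations.items():
        if accepts(product):
            return name
    return "human-station"

# Hypothetical stations with illustrative filtering criteria.
STATIONS = {
    "suction-robot": lambda p: p["weight_kg"] <= 3 and not p["deformable"],
    "gripper-robot": lambda p: p["weight_kg"] <= 10,
}
```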
[0147] In some cases, the database is continually updated. In some cases, the filtering criteria are continually updated. In some cases, the filtering criteria may be updated as new handling systems are integrated within a facility. In some cases, the filtering criteria are updated as new product types are handled within a facility. In some cases, the filtering criteria are updated as new manipulation techniques or handling patterns are realized. In some cases, a machine learning program is utilized to update the database or filtering criteria.
[0148] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein track objects as they are handled. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein integrate with existing tracking software of a warehouse which the system is implemented within. The systems, the methods, the computer-readable media, and the techniques disclosed herein may connect with existing software such that information which is normally received by manual input is now communicated electronically.
[0149] Object tracking may include confirming an object has been received at a source location or station. Object tracking may include confirming an object has been placed at a target position.
Object tracking may include input that an anomaly has been detected. Object tracking may include input that an object has been placed at an exception location. Object tracking may include input that an object or target container has left a handling station or target position to be further processed at another location within a warehouse.
Examples of Integrated Software
[0150] Many or all of the functions of a robotic device may be controlled by a control system. A control system may include at least one processor that executes instructions stored in a non-transitory computer-readable medium, such as a memory. The control system may also comprise a plurality of computing devices that may serve to control individual components or subsystems of the robotic device.
[0151] In some cases, a memory comprises instructions (e.g., program logic) executable by the processor to execute various functions of a robotic device disclosed herein. A memory may comprise additional instructions as well, including instructions to transmit data to, receive data from, interact with, or control one or more of a mechanical system, a sensor system, a product database, an operator system, or the control system.
[0152] In some cases, machine learning algorithms are implemented such that the systems, the methods, the computer-readable media, and the techniques disclosed herein become completely automated. In some cases, verification operations completed by a human operator are removed after training of machine learning algorithms are complete.
[0153] In some cases, the machine learning programs utilized incorporate a supervised learning approach. In some cases, the machine learning programs utilized incorporate a reinforcement learning approach. Information such as verification of alert/anomaly events, measured properties of objects being handled, and expected properties of objects being handled may be received by a machine learning algorithm for training.
[0154] Other machine learning approaches such as unsupervised learning, feature learning, topic modeling, dimensionality reduction, and meta learning may be utilized by the system. Supervised learning may include active learning algorithms, classification algorithms, similarity learning algorithms, regressive learning algorithms, and combinations thereof.
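As one illustration of the supervised approach described above, a classifier can be trained on verified alert data: a measured property deviation (here, weight deviation) paired with a human-verified anomaly label. The sketch below uses a minimal logistic regression trained by gradient descent on synthetic data; the data, threshold, and function names are illustrative assumptions, not the disclosed training pipeline:

```python
import math
import random

# Training data: (measured-vs-expected property deviation, verified anomaly label).
# In practice, labels would come from human verification of alert events;
# here the data is synthetic for illustration only.
random.seed(0)
data = ([(random.gauss(0.0, 0.05), 0) for _ in range(200)] +   # normal picks
        [(random.gauss(0.5, 0.10), 1) for _ in range(200)])    # verified anomalies

# One-feature logistic regression trained with full-batch gradient descent.
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))   # sigmoid prediction
        gw += (p - y) * x
        gb += (p - y)
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

def predict_anomaly(deviation):
    """True if the deviation is classified as an anomaly event."""
    return 1.0 / (1.0 + math.exp(-(w * deviation + b))) > 0.5
```

Once trained on enough verified events, such a model could replace the human verification step, consistent with the automation described in paragraph [0152].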
[0155] Machine learning may generally involve identifying and recognizing patterns in existing data in order to facilitate making predictions for subsequent data. Machine learning may include a machine learning model (which may include, for example, a machine learning algorithm). Machine learning, whether analytical or statistical in nature, may provide deductive or abductive inference based on real or simulated data. The machine learning model may be a trained model. Machine learning techniques may comprise one or more supervised, semi-supervised, self-supervised, or unsupervised machine learning techniques. For example, a machine learning model may be a trained model that is trained through supervised learning (e.g., various parameters are determined as weights or scaling factors). Machine learning may comprise one or more of regression analysis, regularization, classification, dimensionality reduction, ensemble learning, meta learning, association rule learning, cluster analysis, anomaly detection, deep learning, or ultra-deep learning. Machine learning may comprise: k-means, k-means clustering, k-nearest neighbors, learning vector quantization, linear regression, non-linear regression, least squares regression, partial least squares regression, logistic regression, stepwise regression, multivariate adaptive regression splines, ridge regression, principal component regression, least absolute shrinkage and selection operator (LASSO), least angle regression, canonical correlation analysis, factor analysis, independent component analysis, linear discriminant analysis, multidimensional scaling, non-negative matrix factorization, principal components analysis, principal coordinates analysis, projection pursuit, Sammon mapping, t-distributed stochastic neighbor embedding, AdaBoosting, boosting, gradient boosting, bootstrap aggregation, ensemble averaging, decision trees, conditional decision trees, boosted decision trees, gradient boosted decision trees, random forests, stacked generalization, Bayesian networks, Bayesian belief networks, naive Bayes, Gaussian naive Bayes, multinomial naive Bayes, hidden Markov models, hierarchical hidden Markov models, support vector machines, encoders, decoders, auto-encoders, stacked auto-encoders, perceptrons, multi-layer perceptrons, artificial neural networks, feedforward neural networks, convolutional neural networks, recurrent neural networks, long short-term memory, deep belief networks, deep Boltzmann machines, deep convolutional neural networks, deep recurrent neural networks, large language models, vision transformers, or generative adversarial networks.
[0156] Machine learning algorithms may be applied to anomaly detection, as disclosed herein. In some cases, machine learning algorithms are applied to programmed movement of one or more robotic manipulators. Machine learning algorithms applied to programmed movement of robotic manipulators may be used to optimize actions such as scanning a machine-readable code provided on an object. Machine learning algorithms applied to programmed movement of robotic manipulators may be used to optimize actions such as performing a wiggling motion to separate unintentionally combined objects. Machine learning algorithms applied to programmed movement of robotic manipulators may be applied to any actions of a robotic manipulator for handling one or more objects, as disclosed herein. In some cases, machine learning algorithms are applied to make decisions whether or not to put an item on the tilter.
[0157] In some cases, trajectories of items handled by robotic manipulators are automatically optimized by the systems, the methods, the computer-readable media, and the techniques disclosed herein. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein automatically adjust the movements of the robotic manipulators to achieve a minimum transportation time while preserving constraints on forces exerted on the item or package being transported.
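One way to frame the time minimization described above is as a point-to-point move under a velocity cap and an acceleration cap, where the acceleration cap is derived from the maximum force the item can tolerate. The following sketch is illustrative only (the limits, the trapezoidal profile, and the worst-case vertical-lift assumption are assumptions, not the disclosed controller):

```python
import math

def min_move_time(distance, v_max, a_max):
    """Minimum time for a point-to-point move with acceleration and
    deceleration limited to a_max and speed limited to v_max
    (symmetric trapezoidal velocity profile)."""
    d_ramps = v_max ** 2 / a_max          # distance covered by accel + decel ramps
    if distance <= d_ramps:               # triangular profile: v_max never reached
        return 2.0 * math.sqrt(distance / a_max)
    t_ramps = 2.0 * v_max / a_max         # accel + decel time
    t_cruise = (distance - d_ramps) / v_max
    return t_ramps + t_cruise

def force_limited_accel(mass_kg, max_force_n, g=9.81):
    """Acceleration cap so that the inertial plus gravity load on the
    item stays below max_force_n (worst case: vertical lift)."""
    return max(max_force_n / mass_kg - g, 0.0)

# Hypothetical item: 0.5 kg, tolerating at most 10 N at the gripper.
a_cap = force_limited_accel(mass_kg=0.5, max_force_n=10.0)
t = min_move_time(distance=1.2, v_max=1.5, a_max=a_cap)
```

This captures the trade-off in paragraph [0157]: a tighter force constraint lowers `a_cap`, which lengthens the minimum transport time.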
[0158] In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein monitor forces exerted on an object as it is transported from a source position to a target position, as disclosed herein. The systems, the methods, the computer-readable media, and the techniques disclosed herein may monitor acceleration, rate of acceleration (jerk), etc. of an object being transported by a robotic manipulator. The force experienced by the object as it is manipulated may be calculated using the known movement of the robotic manipulator (e.g., position, velocity, and acceleration values of the robotic manipulator as it transports the object) and force values obtained by the weight/torsion and force sensors provided on the robotic manipulator.
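The force estimate from known movement described above can be sketched from basic mechanics: for an item of mass m carried with acceleration vector a, the gripper must supply roughly F = m·(a − g). This is a simplified kinematic estimate under assumed values (it ignores rotation and would, per the text, be combined with the sensor readings):

```python
import math

G = (0.0, 0.0, -9.81)  # gravity vector, m/s^2

def estimated_grip_force(mass_kg, accel):
    """Magnitude of the force the gripper must apply to carry an item of
    mass_kg while the end effector undergoes acceleration `accel` (m/s^2).
    Computes |m * (a - g)|; a pure estimate from the commanded motion,
    which in practice would be fused with force/torque sensor values."""
    fx = mass_kg * (accel[0] - G[0])
    fy = mass_kg * (accel[1] - G[1])
    fz = mass_kg * (accel[2] - G[2])
    return math.sqrt(fx * fx + fy * fy + fz * fz)

def exceeds_limit(mass_kg, accel, max_force_n):
    """Check the estimate against a per-item force constraint."""
    return estimated_grip_force(mass_kg, accel) > max_force_n

# At rest the gripper carries only the item's weight: 0.5 kg * 9.81 m/s^2.
f_static = estimated_grip_force(0.5, (0.0, 0.0, 0.0))
```

A monitor built this way can flag trajectory segments whose commanded acceleration would violate the item's force constraint before the move is executed.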
[0159] In some cases, optical sensors of the systems, the methods, the computer-readable media, and the techniques disclosed herein monitor the movement of objects being transported by the robotic manipulator. In some cases, the trajectory of objects is optimized to minimize transportation time including scanning of a digital code on the object. In some cases, the optical sensors recognize defects in the objects or packaging of objects as a result of mishandling (e.g., defects caused by forces applied to the object by the robotic manipulator). In some cases, the optical sensors monitor the flight or trajectory of objects being manipulated for cases in which the objects are dropped. In some cases, detection of mishandling or drops will result in adjustments of the robotic manipulator (e.g., adjustment of trajectory or forces applied at the end effector). In some cases, the constraints and optimized trajectory information will be stored in the product database, as disclosed herein. In some cases, the constraints are derived from a history of attempts for the specific object or plurality of similar objects being transported. In some cases, the systems, the methods, the computer-readable media, and the techniques disclosed herein are trained by increasing the speed at which an object is manipulated over a plurality of attempts until a drop or defect occurs due to mishandling by the robotic manipulator.
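The training procedure described last, increasing manipulation speed over attempts until a drop or defect occurs, can be sketched as a simple adaptive search. The step sizes, back-off factor, and the failure model are hypothetical illustrations:

```python
def find_safe_speed(attempt, v_start=0.2, v_step=0.1, backoff=0.8, max_attempts=50):
    """Increase speed each successful attempt; on the first observed
    drop/defect, back off and return a reduced known-good speed.
    `attempt(speed)` returns True on success, False on a drop/defect."""
    speed = v_start
    last_good = None
    for _ in range(max_attempts):
        if attempt(speed):
            last_good = speed           # record last speed with no mishandling
            speed += v_step             # try faster on the next attempt
        else:
            # Failure observed: back off from the last known-good speed.
            return (last_good if last_good is not None else v_start) * backoff
    return last_good                    # never failed within the attempt budget

# Hypothetical failure model: this object is dropped above 0.75 m/s.
result = find_safe_speed(lambda v: v <= 0.75)
```

The returned value would then be stored in the product database as the speed constraint for that object or class of similar objects.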
[0160] In some cases, a technician verifies that a defect or drop has occurred due to mishandling. Verification may include viewing a video recording of the object being handled and confirming that a drop or defect was likely due to mishandling by the robotic manipulator.
Example Computer System
[0161] The present disclosure provides computer control systems that are programmed to implement the methods, the computer-readable media, and the techniques of the disclosure. FIG. 7 shows a computer system 701 that is programmed or otherwise configured to implement the methods, the computer-readable media, and the techniques disclosed herein, such as to control the systems or devices disclosed herein (e.g., a picking port, a robotic arm, a controller, etc.). The computer system 701 can be an electronic device of a user or a computer system that is remotely located with respect to the electronic device. The electronic device can be a mobile electronic device.
[0162] The computer system 701 includes a central processing unit (CPU, also “processor” and “computer processor” herein) 705, which can be a single-core or multi-core processor, or a plurality of processors for parallel processing. The computer system 701 also includes memory or memory location 710 (e.g., random-access memory, read-only memory, flash memory), electronic storage unit 715 (e.g., hard disk), communication interface 720 (e.g., network adapter) for communicating with one or more other systems, and peripheral devices 725, such as cache, other memory, data storage or electronic display adapters. The memory 710, storage unit 715, interface 720 and peripheral devices 725 are in communication with the CPU 705 through a communication bus (solid lines), such as a motherboard. The storage unit 715 can be a data storage unit (or data repository) for storing data. The computer system 701 can be operatively coupled to a computer network (“network”) 730 with the aid of the communication interface 720. The network 730 can be the Internet, an isolated or substantially isolated internet or extranet, or an intranet or extranet that is in communication with the Internet. The network 730 in some cases is a telecommunication or data network. The network 730 can include one or more computer servers, which can enable distributed computing, such as cloud computing. The network 730, in some cases with the aid of the computer system 701, can implement a peer-to-peer network, which may enable devices coupled to the computer system 701 to behave as a client or a server. The CPU 705 can execute a sequence of machine-readable instructions, which can be embodied in a program or software. The instructions may be stored in a memory location, such as the memory 710. The instructions can be directed to the CPU 705, which can subsequently program or otherwise configure the CPU 705 to implement methods of the present disclosure.
Examples of operations performed by the CPU 705 can include fetch, decode, execute, and writeback. The CPU 705 can be part of a circuit, such as an integrated circuit. One or more other components of the system 701 can be included in the circuit. In some cases, the circuit is an application specific integrated circuit (ASIC).
[0163] The storage unit 715 can store files, such as drivers, libraries and saved programs. The storage unit 715 can store user data, e.g., user preferences and user programs. The computer system 701 in some cases can include one or more additional data storage units that are external to the computer system 701, such as located on a remote server that is in communication with the computer system 701 through an intranet or the Internet.
[0164] The computer system 701 can communicate with one or more remote computer systems through the network 730. For instance, the computer system 701 can communicate with a remote computer system of a user. Examples of remote computer systems include personal computers (e.g., portable PC), slate or tablet PC’s (e.g., Apple® iPad, Samsung® Galaxy Tab), telephones, Smart phones (e.g., Apple® iPhone, Android-enabled device, Blackberry®), or personal digital assistants. The user can access the computer system 701 via the network 730.
[0165] Methods as disclosed herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the computer system 701, such as, for example, on the memory 710 or electronic storage unit 715. The machine executable or machine-readable code can be provided in the form of software. During use, the code can be executed by the processor 705. In some cases, the code can be retrieved from the storage unit 715 and stored on the memory 710 for ready access by the processor 705. In some situations, the electronic storage unit 715 can be precluded, and machine-executable instructions are stored on memory 710. The code can be pre-compiled and configured for use with a machine having a processor adapted to execute the code or can be compiled during runtime. The code can be supplied in a programming language that can be selected to enable the code to execute in a pre-compiled or as-compiled fashion.
[0166] Aspects of the systems and methods provided herein, such as the computer system 701, can be embodied in programming. Various aspects of the technology may be thought of as “products” or “articles of manufacture” typically in the form of machine (or processor) executable code or associated data that is carried on or embodied in a type of machine readable medium. Machine-executable code can be stored on an electronic storage unit, such as memory (e.g., read-only memory, random-access memory, flash memory) or a hard disk. “Storage” type media can include any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming. All or portions of the software may at times be communicated through the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the software from one computer or processor into another, for example, from a management server or host computer into the computer platform of an application server. Thus, another type of media that may bear the software elements includes optical, electrical, and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links. The physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software. As used herein, unless restricted to non-transitory, tangible “storage” media, terms such as computer or machine “readable medium” refer to any medium that participates in providing instructions to a processor for execution.
[0167] Hence, a machine readable medium, such as computer-executable code, may take many forms, including but not limited to, a tangible storage medium, a carrier wave medium or physical transmission medium. Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, such as may be used to implement the databases, etc. shown in the drawings. Volatile storage media include dynamic memory, such as main memory of such a computer platform. Tangible transmission media include coaxial cables; copper wire and fiber optics, including the wires that comprise a bus within a computer system. Carrier-wave transmission media may take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a ROM, a PROM and EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer may read programming code or data. Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.
[0170] The computer system 701 can include or be in communication with an electronic display 735 that comprises a user interface (UI) 740 for providing, for example, images of the objects or containers being handled, along with status information of the systems disclosed herein (e.g., queue status, anomaly alerts, tracking records, etc.). Examples of UI’s include, without limitation, a graphical user interface (GUI) and web-based user interface.
[0171] Methods and systems of the present disclosure can be implemented by way of one or more algorithms. An algorithm can be implemented by way of software upon execution by the central processing unit 705. The algorithm can, for example, control the picking and placing of items, detect anomalies, or optimize the trajectories of the robotic arm, as disclosed herein.
Certain Definitions and Additional Considerations
[0172] Unless otherwise defined, all technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the present subject matter belongs.
[0173] As used in this specification and the appended claims, “some embodiments,” “further embodiments,” or “a particular embodiment,” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrase “in some embodiments,” or “in further embodiments,” or “in a particular embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0174] As used in this specification and the appended claims, when the term “at least,” “greater than,” or “greater than or equal to” precedes the first numerical value in a series of two or more numerical values, the term “at least,” “greater than” or “greater than or equal to” applies to each of the numerical values in that series of numerical values. For example, greater than or equal to 1, 2, or 3 is equivalent to greater than or equal to 1, greater than or equal to 2, or greater than or equal to 3.
[0175] As used in this specification and the appended claims, when the term “no more than,” “less than,” or “less than or equal to” precedes the first numerical value in a series of two or more numerical values, the term “no more than,” “less than,” or “less than or equal to” applies to each of the numerical values in that series of numerical values. For example, less than or equal to 3, 2, or 1 is equivalent to less than or equal to 3, less than or equal to 2, or less than or equal to 1.
[0176] As used in this specification, “or” is intended to mean an “inclusive or” or what is also known as a “logical OR,” wherein when used as a logic statement, the expression “A or B” is true if either A or B is true, or if both A and B are true, and when used as a list of elements, the expression “A, B or C” is intended to include all combinations of the elements recited in the expression, for example, any of the elements selected from the group consisting of A, B, C, (A, B), (A, C), (B, C), and (A, B, C); and so on if additional elements are listed. As such, any reference to “or” herein is intended to encompass “and/or” unless otherwise stated.
[0177] As used in this specification and the appended claims, the indefinite articles “a” or “an,” and the corresponding associated definite articles “the” or “said,” are each intended to mean one or more unless otherwise stated, implied, or physically impossible. Yet further, it should be understood that the expressions “at least one of A and B, etc.,” “at least one of A or B, etc.,” “selected from A and B, etc.” and “selected from A or B, etc.” are each intended to mean either any recited element individually or any combination of two or more elements, for example, any of the elements from the group consisting of “A,” “B,” and “A AND B together,” etc.
[0178] As used in this specification and the appended claims “about” or “approximately” may mean within an acceptable error range for the value, which will depend in part on how the value is measured or determined, e.g., the limitations of the measurement system. For example, “about” may mean within 1 or more than 1 standard deviation, per the practice in the art. Alternatively, “about” may mean a range of up to 20%, up to 10%, up to 5%, or up to 1% of a given value. Where values are described in the application and claims, unless otherwise stated the term “about” meaning within an acceptable error range for the particular value may be assumed.
[0179] While preferred embodiments of the present invention have been shown and disclosed herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. It is not intended that the invention be limited by the specific examples provided within the specification. While the invention has been described with reference to the aforementioned specification, the descriptions and illustrations of the embodiments herein are not meant to be construed in a limiting sense. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. Furthermore, it shall be understood that all aspects of the invention are not limited to the specific depictions, configurations or relative proportions set forth herein which depend upon a variety of conditions and variables. It should be understood that various alternatives to the embodiments of the invention disclosed herein may be employed in practicing the invention. It is therefore contemplated that the invention shall also cover any such alternatives, modifications, variations, or equivalents. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.
[0180] It should be noted that various illustrative or suggested ranges set forth herein are specific to their example embodiments and are not intended to limit the scope or range of disclosed technologies, but, again, merely provide example ranges for frequency, amplitudes, etc. associated with their respective embodiments or use cases. Where values are described as ranges, it will be understood that such disclosure includes the disclosure of all possible sub-ranges within such ranges, as well as specific numerical values that fall within such ranges irrespective of whether a specific numerical value or specific sub-range is expressly stated.
[0181] It should be understood that, unless a term is expressly defined in this patent using the sentence “As used herein, the term ‘ ’ is hereby defined to mean... ” or a similar sentence, there is no intent to limit the meaning of that term, either expressly or by implication, beyond its plain or ordinary meaning, and such term should not be interpreted to be limited in scope based at least in part on any statement made in any section of this patent (other than the language of the claims). To the extent that any term recited in the claims at the end of this patent is referred to in this patent in a manner consistent with a single meaning, that is done for sake of clarity only so as to not confuse the reader, and it is not intended that such claim term be limited, by implication or otherwise, to that single meaning.
[0182] Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
[0183] Additionally, certain embodiments are disclosed herein as including logic or a number of routines, subroutines, applications, or instructions. These may constitute either software (e.g., code embodied on a machine-readable medium) or hardware. In hardware, the routines, etc., are tangible units capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as disclosed herein.
[0184] In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
[0185] Accordingly, hardware modules may encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations disclosed herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
[0186] Hardware modules may provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and may operate on a resource (e.g., a collection of information). Elements that are described as being coupled and/or connected may refer to two or more elements that may be (e.g., direct physical contact) or may not be (e.g., electrically connected, communicatively coupled, etc.) in direct contact with each other, but yet still cooperate or interact with each other.
[0187] The various operations of example methods disclosed herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
[0188] Similarly, the methods or routines disclosed herein may be at least partially processor- implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
[0189] The performance of certain operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
[0190] It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element may be termed a second element, and, similarly, a second element may be termed a first element, without departing from the scope of the present disclosure.

Claims

WHAT IS CLAIMED IS:
1. A system for robotic item picking, comprising:
(A) a picking port, comprising:
(i) a first input picking queue configured to hold and move a first plurality of item picking containers,
(ii) a second input picking queue configured to hold and move a second plurality of item picking containers,
(iii) a picking location configured to receive an item picking container of (1) said first plurality of item picking containers from said first input picking queue and (2) said second plurality of item picking containers from said second input picking queue, and
(iv) a dropping location configured to receive an item dropping container;
(B) a robotic arm configured to move one or more items of said item picking container; and
(C) a controller configured to:
(a) cause said picking location to receive said item picking container from (1) said first input picking queue or (2) said second input picking queue, and
(b) cause said robotic arm to (1) pick up said one or more items from said item picking container at said picking location and (2) place said one or more items in said item dropping container at said dropping location.
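For illustration only (not part of the claims), the controller logic recited in (C) can be sketched in Python; the class and method names (PickingPortController, receive_container, pick_and_place) and the list-based container model are hypothetical names invented here, not interfaces from the disclosure:

```python
from collections import deque


class PickingPortController:
    """Illustrative sketch: select an item picking container from one of two
    input picking queues, then direct the arm to move items to a dropping container."""

    def __init__(self, first_queue, second_queue):
        # Each input picking queue holds item picking containers, modeled as lists of items.
        self.queues = {"first": deque(first_queue), "second": deque(second_queue)}
        self.log = []

    def receive_container(self, queue_name):
        # Step (a): cause the picking location to receive a container from either queue.
        container = self.queues[queue_name].popleft()
        self.log.append(f"received from {queue_name}")
        return container

    def pick_and_place(self, container, items, dropping_container):
        # Step (b): pick the requested items and place them in the dropping container.
        for item in items:
            container.remove(item)
            dropping_container.append(item)
        return dropping_container


controller = PickingPortController([["soap", "brush"]], [["mug"]])
picked = controller.receive_container("first")
box = controller.pick_and_place(picked, ["soap"], [])
```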
2. The system of claim 1, wherein said picking port further comprises (v) an output picking queue configured to hold and move said item picking container.
3. The system of claim 2, wherein said controller is further configured to, after causing said robotic arm to pick up said one or more items from said item picking container at said picking location in (b):
(c) cause said item picking container to be moved from said picking location to said output picking queue.
4. The system of claim 3, wherein said controller is further configured to, after causing said item picking container to be moved from said picking location to an output picking queue in (c):
(d) cause said picking location to receive an additional item picking container from (1) said first input picking queue or (2) said second input picking queue, and
(e) cause said robotic arm to pick up one or more additional items from said additional item picking container at said picking location.
5. The system of claim 4, wherein: said picking port further comprises (vi) an output dropping queue configured to hold and move said item dropping container, said picking port further comprises (vii) an input dropping queue configured to hold and move said item dropping container, and said dropping location is configured to receive said item dropping container from said input dropping queue.
6. The system of claim 5, wherein said input dropping queue comprises a first input dropping queue and a second input dropping queue.
7. The system of either claim 5 or 6, wherein said controller is further configured to, after causing said robotic arm to place said one or more items from said item picking container in said item dropping container at said dropping location in (b):
(f) cause said item dropping container to be moved from said dropping location to an output dropping queue,
(g) cause said dropping location to receive an additional item dropping container from said input dropping queue, and
(h) cause said robotic arm to place said one or more additional items in said additional item dropping container at said dropping location.
8. The system of any one of claims 4-6, wherein said controller is further configured to, after causing said robotic arm to place said one or more items from said item picking container in said item dropping container at said dropping location in (b): (i) cause said robotic arm to place said one or more additional items in said item dropping container at said dropping location, thereby causing said item dropping container to include both said one or more items and said one or more additional items.
9. The system of either claim 7 or 8, wherein at least one of said one or more additional items is substantially identical to at least one of said one or more items.
10. The system of claim 8, wherein each of said one or more additional items is substantially identical to each of said one or more items.
11. The system of either claim 7 or 8, wherein at least one of said one or more additional items is substantially different from at least one of said one or more items.
12. The system of claim 11, wherein each of said one or more additional items is substantially different from each of said one or more items.
13. The system of any one of the preceding claims, further comprising:
(D) one or more sensors configured to collect sensor data corresponding to said picking port.
14. The system of claim 13, wherein said controller is further configured to perform at least one of operations (a)-(h) based at least in part on said sensor data.
15. The system of claim 14, wherein said one or more sensors comprise one or more cameras configured to capture image data corresponding to said picking port.
16. The system of claim 15, wherein said controller is further configured to generate an alert at least in part in response to said image data satisfying an alert condition.
17. The system of claim 14, wherein said one or more sensors comprise one or more laser curtains.
18. The system of claim 17, wherein: said one or more laser curtains are configured to detect if an item is sticking out outside of a threshold in one or more of (1) said first plurality of item picking containers, (2) said second plurality of item picking containers, (3) said item picking container, or (4) said item dropping container, and said controller is further configured to generate an alert at least in part in response to detecting that said item is sticking out outside of said threshold.
19. The system of claim 14, wherein said one or more sensors comprise one or more weight sensors.
20. The system of claim 19, wherein: said one or more weight sensors are configured to determine (1) a first weight of a first item in one of said first plurality of item picking containers and (2) a second weight of a second item in one of said second plurality of item picking containers, and said controller is configured to determine, based at least in part on a comparison of said first weight and said second weight, to provide, to said picking location, (1) said one of said first plurality of item picking containers or (2) said one of said second plurality of item picking containers.
21. The system of claim 20, wherein (1) a position in said first input picking queue proximal to said picking location is a first buffer location and (2) a position in said second input picking queue proximal to said picking location is a second buffer location.
22. The system of claim 21, wherein (1) said one of said first plurality of item picking containers is at said first buffer location and (2) said one of said second plurality of item picking containers is at said second buffer location.
23. The system of claim 20, wherein: said one or more weight sensors are configured to determine (1) a first weight of said item picking container at said picking location prior to said robotic arm picking up said one or more items from said item picking container and (2) a second weight of said item picking container at said picking location after said robotic arm picking up said one or more items from said item picking container, and said controller is further configured to determine, based at least in part on a comparison of said first weight and said second weight, a number of items included in said one or more items.
24. The system of claim 23, wherein said controller is further configured to generate an alert at least in part in response to said number of items satisfying an alert condition.
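Claims 23-24 describe inferring the number of picked items from the weight change of the picking container and alerting on a mismatch. A minimal illustrative sketch follows; the function name, the known per-item unit weight, and the tolerance band are assumptions introduced here, not features recited in the claims:

```python
def items_removed(weight_before, weight_after, unit_weight, tolerance=0.05):
    """Estimate how many items the arm removed from the container's weight change.

    unit_weight is an assumed known per-item weight; tolerance (as a fraction
    of unit_weight) absorbs sensor noise before an alert is raised.
    """
    delta = weight_before - weight_after
    count = round(delta / unit_weight)
    # Flag an alert when the residual error exceeds the tolerance band,
    # or when the weight apparently increased (negative count).
    residual = abs(delta - count * unit_weight)
    alert = residual > tolerance * unit_weight or count < 0
    return count, alert


# Example: container weighed 2.40 kg before and 1.90 kg after picking 0.25 kg items.
count, alert = items_removed(weight_before=2.40, weight_after=1.90, unit_weight=0.25)
```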
25. The system of any one of the preceding claims, wherein each of said first plurality of item picking containers and each of said second plurality of item picking containers is a tote.
26. The system of claim 25, wherein said tote for each of said first plurality of item picking containers and each of said second plurality of item picking containers comprises an identifier.
27. The system of claim 26, wherein said identifier is an item identifier corresponding to an item included in said tote for each of said first plurality of item picking containers and each of said second plurality of item picking containers.
28. The system of any one of the preceding claims, wherein said item dropping container is a shipping container.
29. The system of any one of the preceding claims, wherein said first input picking queue is substantially parallel to said second input picking queue.
30. The system of any one of the preceding claims, wherein said first input picking queue is substantially identical to said second input picking queue.
31. A method for robotic item picking, comprising:
(a) receiving, at a picking location from a first input picking queue, a first item picking container;
(b) picking up, using a robotic arm, a first item from said first item picking container;
(c) placing, using said robotic arm, said first item into a first item dropping container at a dropping location;
(d) moving said first item picking container from said picking location to an output picking queue;
(e) moving said first item dropping container from said dropping location to an output dropping queue;
(f) receiving, at said picking location from a second input picking queue, a second item picking container;
(g) receiving, at said dropping location from an input dropping queue, a second item dropping container;
(h) picking up, using said robotic arm, a second item from said second item picking container; and
(i) placing, using said robotic arm, said second item into said second item dropping container.
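The cycle of steps (a)-(i) in claim 31 can be mirrored in a short illustrative sketch. This is not the claimed implementation: plain Python lists stand in for the physical conveyors and containers, and every name below is hypothetical:

```python
def run_picking_cycle(first_queue, second_queue, dropping_queue):
    """Illustrative port cycle mirroring steps (a)-(i): serve one container
    from each input picking queue, placing one item into its own dropping container."""
    output_picking, output_dropping = [], []

    # Steps (a)-(e): first picking container and first dropping container.
    first_container = first_queue.pop(0)      # (a) receive at picking location
    first_item = first_container.pop(0)       # (b) arm picks up first item
    first_drop = dropping_queue.pop(0)        # dropping container at dropping location
    first_drop.append(first_item)             # (c) arm places first item
    output_picking.append(first_container)    # (d) container to output picking queue
    output_dropping.append(first_drop)        # (e) container to output dropping queue

    # Steps (f)-(i): second container from the other queue, fresh dropping container.
    second_container = second_queue.pop(0)    # (f) receive at picking location
    second_drop = dropping_queue.pop(0)       # (g) receive second dropping container
    second_item = second_container.pop(0)     # (h) arm picks up second item
    second_drop.append(second_item)           # (i) arm places second item
    output_dropping.append(second_drop)

    return output_dropping


done = run_picking_cycle([["soap"]], [["mug"]], [[], []])
```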
32. The method of claim 31, wherein said input dropping queue comprises a first input dropping queue and a second input dropping queue.
33. The method of either claim 31 or 32, wherein said first item is substantially identical to said second item.
34. The method of either claim 31 or 32, wherein said first item is substantially different from said second item.
35. The method of any one of claims 31-34, wherein one or more of operations (a)-(i) are performed based at least in part on sensor data collected by one or more sensors.
36. The method of claim 35, wherein said one or more sensors comprise one or more cameras, the method further comprising: capturing image data, via said one or more cameras.
37. The method of claim 36, further comprising: generating an alert at least in part in response to said image data satisfying an alert condition.
38. The method of claim 35, wherein said one or more sensors comprise one or more laser curtains.
39. The method of claim 38, further comprising: detecting, via said one or more laser curtains, that (1) said first item is sticking out outside of a threshold in said first item picking container or said first item dropping container or (2) said second item is sticking out outside of a threshold in said second item picking container or said second item dropping container, and generating an alert at least in part in response to detecting that (1) said first item is sticking out outside of said threshold in said first item picking container or said first item dropping container or (2) said second item is sticking out outside of said threshold in said second item picking container or said second item dropping container.
40. The method of claim 35, wherein said one or more sensors comprise one or more weight sensors.
41. The method of claim 40, further comprising: determining, via said one or more weight sensors, a weight of said first item and a weight of said second item; and causing said first item to be placed, using said robotic arm, into said first item dropping container prior to said second item being placed, using said robotic arm, into said second item dropping container.
42. The method of claim 40, further comprising: determining (1) a first weight of said first item picking container prior to said robotic arm picking up said first item from said first item picking container and (2) a second weight of said first item picking container after said robotic arm picking up said first item from said first item picking container.
43. The method of claim 42, further comprising: generating an alert at least in part in response to a comparison of said first weight and said second weight.
44. The method of any one of claims 31-43, wherein each of said first item picking container and said second item picking container is a tote.
45. The method of claim 44, wherein said tote for each of said first item picking container and said second item picking container comprises an identifier.
46. The method of claim 45, wherein said identifier for said first item picking container corresponds to said first item and said identifier for said second item picking container corresponds to said second item.
47. The method of any one of claims 31-46, wherein each of said first item dropping container and said second item dropping container is a shipping container.
48. The method of any one of claims 31-47, wherein said first input picking queue is substantially parallel to said second input picking queue.
49. The method of any one of claims 31-48, wherein said first input picking queue is substantially identical to said second input picking queue.
50. A method for robotic item picking, comprising:
(a) receiving, at a picking location from a first input picking queue, a first item picking container;
(b) picking up, using a robotic arm, a first item from said first item picking container;
(c) placing, using said robotic arm, said first item into an item dropping container at a dropping location;
(d) moving said first item picking container from said picking location to an output picking queue;
(e) receiving, at said picking location from a second input picking queue, a second item picking container;
(f) picking up, using said robotic arm, a second item from said second item picking container; and
(g) placing, using said robotic arm, said second item into said item dropping container, thereby causing said item dropping container to include both said first item and said second item.
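Steps (a)-(g) of claim 50 consolidate one item from a container on each input picking queue into a single shared dropping container. A minimal illustrative sketch, with hypothetical names and lists standing in for the conveyors:

```python
def consolidate(first_queue, second_queue):
    """Illustrative sketch of claim 50: pick one item from a container on each
    input picking queue and place both into a single item dropping container."""
    dropping_container = []
    for queue in (first_queue, second_queue):
        container = queue.pop(0)                      # receive at picking location
        dropping_container.append(container.pop(0))   # arm picks and places one item
        # The emptied-of-this-item container would then move to the output picking queue.
    return dropping_container


box = consolidate([["soap"]], [["mug"]])
```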
51. The method of claim 50, further comprising: receiving, from an input dropping queue, said item dropping container.
52. The method of claim 51, wherein said input dropping queue comprises a first input dropping queue and a second input dropping queue.
53. The method of any one of claims 50-52, wherein said first item is substantially identical to said second item.
54. The method of any one of claims 50-52, wherein said first item is substantially different from said second item.
55. The method of any one of claims 50-54, wherein one or more of operations (a)-(g) are performed based at least in part on sensor data collected by one or more sensors.
56. The method of claim 55, wherein said one or more sensors comprise one or more cameras, the method further comprising: capturing image data, via said one or more cameras.
57. The method of claim 56, further comprising: generating an alert at least in part in response to said image data satisfying an alert condition.
58. The method of claim 55, wherein said one or more sensors comprise one or more laser curtains.
59. The method of claim 58, further comprising: detecting, via said one or more laser curtains, that (1) said first item is sticking out outside of a threshold in said first item picking container or said item dropping container or (2) said second item is sticking out outside of a threshold in said second item picking container or said item dropping container, and generating an alert at least in part in response to detecting that (1) said first item is sticking out outside of said threshold in said first item picking container or said item dropping container or (2) said second item is sticking out outside of said threshold in said second item picking container or said item dropping container.
60. The method of claim 55, wherein said one or more sensors comprise one or more weight sensors.
61. The method of claim 60, further comprising: determining, via said one or more weight sensors, a weight of said first item and a weight of said second item; and causing said first item to be placed, using said robotic arm, into said item dropping container prior to said second item being placed, using said robotic arm, into said item dropping container.
62. The method of claim 60, further comprising: determining (1) a first weight of said first item picking container prior to said robotic arm picking up said first item from said first item picking container and (2) a second weight of said first item picking container after said robotic arm picking up said first item from said first item picking container.
63. The method of claim 62, further comprising: generating an alert at least in part in response to a comparison of said first weight and said second weight.
64. The method of any one of claims 50-63, wherein each of said first item picking container and said second item picking container is a tote.
65. The method of claim 64, wherein said tote for each of said first item picking container and said second item picking container comprises an identifier.
66. The method of claim 65, wherein said identifier for said first item picking container corresponds to said first item and said identifier for said second item picking container corresponds to said second item.
67. The method of any one of claims 50-66, wherein said item dropping container is a shipping container.
68. The method of any one of claims 50-67, wherein said first input picking queue is substantially parallel to said second input picking queue.
69. The method of any one of claims 50-68, wherein said first input picking queue is substantially identical to said second input picking queue.
70. A method for robotic item picking, comprising:
(a) obtaining item information for a plurality of items, wherein said item information comprises one or more of: item weight, item size, item fragility, or item deformability;
(b) obtaining an order comprising a subset of said plurality of items;
(c) determining a filling order for said subset of said plurality of items based at least in part on said item information corresponding to said subset of said plurality of items;
(d) at least in part in response to said filling order, causing a first input picking queue to move a first item container comprising a first item of said subset of said plurality of items to a picking location; and
(e) at least in part in response to said filling order, causing a second input picking queue to move a second item container comprising a second item of said subset of said plurality of items to said picking location.
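One plausible filling-order heuristic for step (c) of claim 70 is to place heavier, less fragile items first so that they sit below delicate ones. The claim does not prescribe any particular ordering, so the sort key below is an assumption introduced for illustration:

```python
def filling_order(order_items):
    """Illustrative heuristic: sort order lines so less fragile items come first,
    and among equally fragile items, heavier ones come first."""
    return sorted(
        order_items,
        key=lambda item: (item["fragility"], -item["weight"]),
    )


# Hypothetical order lines with item information (weight in kg, fragility rank).
order = [
    {"name": "glass", "weight": 0.3, "fragility": 2},
    {"name": "detergent", "weight": 1.8, "fragility": 0},
    {"name": "book", "weight": 0.9, "fragility": 0},
]
sequence = [item["name"] for item in filling_order(order)]
```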
71. The method of claim 70, further comprising: causing a robotic arm to pick up said first item from said first item container; and causing an output picking queue to move said first item container from said picking location.
72. The method of either claim 70 or 71, wherein said item information further comprises one or more of: item quantity, item name, item price, or item materials.
73. The method of any one of claims 70-72, wherein said order comprises a customer order.
74. One or more non-transitory computer-readable media comprising computer-executable instructions that, when executed by at least one processor, cause the at least one processor to cause the performance of any one of the methods of any one of claims 50-73.
75. A system for robotic item picking, comprising:
(A) a first conveyor system configured to hold and move a first plurality of item picking containers;
(B) a second conveyor system configured to hold and move a second plurality of item picking containers;
(C) a plurality of sensors configured to collect sensor data corresponding to at least a portion of said first plurality of item picking containers and at least a portion of said second plurality of item picking containers, wherein said plurality of sensors comprises one or more of: a weight sensor, a camera sensor, an identification reader, or a laser curtain; and
(D) a controller configured to:
(a) obtain an order comprising a plurality of items,
(b) determine a filling order for said plurality of items based at least in part on said sensor data,
(c) at least in part in response to said filling order, cause said first conveyor system to move a first item picking container of said first plurality of item picking containers to a picking location, wherein said first item picking container comprises a first item of said plurality of items, and
(d) at least in part in response to said filling order, cause said second conveyor system to move a second item picking container of said second plurality of item picking containers to said picking location, wherein said second item picking container comprises a second item of said plurality of items.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/PL2023/050080 WO2025071419A1 (en) 2023-09-26 2023-09-26 Methods and systems for robotic item picking ports for automated warehouses


Publications (1)

Publication Number Publication Date
WO2025071419A1 (en) 2025-04-03

Family

ID=88778260


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160244262A1 (en) * 2015-02-25 2016-08-25 Dematic Corp. Automated order fulfillment system and method
US10769581B1 (en) * 2016-09-28 2020-09-08 Amazon Technologies, Inc. Overhanging item background subtraction
US20220135329A1 (en) * 2020-10-29 2022-05-05 Berkshire Grey, Inc. Systems and methods for automated packaging and processing for shipping with object pose analysis



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23805203

Country of ref document: EP

Kind code of ref document: A1