
WO2018166652A1 - Method for picking items - Google Patents


Info

Publication number
WO2018166652A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
user
picking
pick
order
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/000095
Other languages
French (fr)
Inventor
Pedro OLIVEIRA
Eduard SCHIMPF
Current Assignee
Swisslog AG
Original Assignee
Swisslog AG
Priority date
Filing date
Publication date
Application filed by Swisslog AG filed Critical Swisslog AG
Publication of WO2018166652A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management

Definitions

  • consolidation / staging / loading to determine where the order container can be dropped off (consolidation location, staging location, dock location).
  • the order container criteria could include the checks related to order routing. If the order needs to be consolidated, the processing device can look up the current consolidation location of the order or, if not specified, the default consolidation location of the system. The system can further create a TU- level consolidation task from the equipment location to the destination determined above.


Abstract

The invention relates to a computer-aided method for picking, comprising: displaying, to a user by means of a display device of an augmented reality system, information associated with at least one item to be picked; picking the at least one item; and displaying a confirmation whether the correct item and/or the correct number of items is picked. The invention further relates to a work station for performing the method.

Description

Method for picking items
The invention relates to a computer-aided method for picking items from a shelf and to a corresponding work station comprising an augmented reality system.
Methods for picking items from a shelf in order to place the items into another shelf or to pack the items into a container for transport are known. Human users performing the picking method have to stay focused on the picking task in order to avoid picking the wrong item and/or a wrong number of items and/or placing the picked item(s) in the wrong position. Thus, there is a need for supporting human users in carrying out the method.
This object is obtained by the computer-aided method having the features of claim 1 and the work station having the features of claim 7. Preferred embodiments are defined in the dependent claims.
One aspect of the invention includes a computer-aided method for picking comprising:
- Displaying information associated with at least one item to be picked by a user, by means of a display device of an augmented reality system;
- Picking the at least one item;
- Displaying a confirmation whether the correct item and/or the correct number of items is picked.
The picking step might include different types of picking. In single order picking, the user executes the pick run sequence for a single order. In zone picking, the user enters the current work zone and is fed with orders that need to be picked in the selected zone. Orders are selected by the same logic as used for auto-feeding during single order picking. The pick tasks are restricted to the selected work zone and given in optimal path sequence within the zone. In order group picking, multiple orders have been grouped and released as an order group.
Depending on the user profile, the user may be automatically assigned or prompted to select the order group. The user is guided along the optimal path to fulfil the picking. By default, the user is guided using a work path. For full transport units (TU), the entire transport unit is logically moved to the equipment. A transport unit denotes an arbitrary load carrier that allows the physical transport and storage of load units or (nested) TUs. A transport unit carries zero to many load units. A transport unit has a type, e.g. pallets of different heights, bins, roller cages or mobile racks. Transport units are put on warehouse locations or parent transport units. A load unit (LU) represents a specific instance of a product with a unique quantity and with its set of attributes, such as expiration date and lot number. Load units can be transported with or without a transport unit and can be located on transport units or directly on locations. For load units, destination identification is required. This could be a dummy transport unit or a fixed one in the case of trolleys. By performing batch picking, multiple orders are picked together in a batch and picked into the same order transport unit, i.e. dispatching will take place after picking.
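The transport-unit / load-unit model described above can be sketched as a small data structure. This is an illustrative reading of the description, not code from the patent; all class and field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class LoadUnit:
    """A specific instance of a product with a quantity and attributes."""
    product: str
    quantity: int
    expiration_date: Optional[str] = None
    lot_number: Optional[str] = None

@dataclass
class TransportUnit:
    """An arbitrary load carrier; carries zero to many LUs and may nest TUs."""
    tu_type: str  # e.g. "pallet", "bin", "roller cage", "mobile rack"
    load_units: List[LoadUnit] = field(default_factory=list)
    children: List["TransportUnit"] = field(default_factory=list)

    def total_quantity(self, product: str) -> int:
        """Sum the quantity of one product over this TU and all nested TUs."""
        qty = sum(lu.quantity for lu in self.load_units if lu.product == product)
        qty += sum(child.total_quantity(product) for child in self.children)
        return qty

# A pallet carrying a load unit directly plus a nested bin.
pallet = TransportUnit("pallet")
pallet.load_units.append(LoadUnit("SKU-42", 10, lot_number="L7"))
bin_tu = TransportUnit("bin", load_units=[LoadUnit("SKU-42", 5)])
pallet.children.append(bin_tu)
print(pallet.total_quantity("SKU-42"))  # 15
```

The recursion over `children` reflects the "(nested) TUs" in the text: a quantity query walks the whole carrier hierarchy.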
Preferably, the displayed information associated with the at least one item and one of the at least one item are visible to the user at the same time.
Preferably, the displayed information associated with the at least one item includes at least one of: the items the user needs to pick, the product name of the item, at least one attribute of the item, a photo of the item, order information, task information, customer information, the quantity to pick, and a marker positioned in the vicinity of the item to be picked. The marker can be an arrow directed towards the item, a pointer, or a highlighted area around and/or near the item.
Preferably, the displayed information associated with the at least one item includes information about the location or position from where the user needs to pick and where the user needs to put away to. Accordingly, the information associated with the at least one item can alternatively or additionally include information about the position to which the user has to put the item away.
Preferably, the displayed information associated with the at least one item includes highlighting an area of any transport unit or of any shelf, compartment, or boxed area. Transport units might be a rack, a tote, a bin, a box, and/or a pallet. Thus, visual guidance can be provided informing the user from where items need to be picked and/or put away by using visual clues (e.g. arrows, pointers, highlighted areas), wherein the user can be guided by the augmented reality system through the complete process from pick to put-away.
Preferably, the displayed information associated with the at least one item includes information on whether the picking process is completed. This includes order information, pick information, and task information that allows the user to understand whether an order for a customer has been completed, whether a task that is part of an order pick has been completed, and/or information about the current pick task at hand. The augmented reality system used in the method could comprise a processing device which is capable of receiving the required input data, such as sensing data of the work zone (e.g. from one or more sensing devices such as cameras, laser scanners, etc.), processing the data (e.g. by means of a processor or computer, preferably using at least one of a memory, a central processing unit (CPU), a communication connection to an order fulfillment system, a database, and an interface to a display device), and generating display data presentable to the user via a display device such as a head-up display.
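The sensing-process-display pipeline of the processing device described above can be sketched as follows. All function names, the catalog lookup and the overlay fields are illustrative assumptions; the patent does not prescribe a concrete data format.

```python
# Hypothetical sketch: receive sensing data, identify the item,
# and produce display data for the head-up display.

def identify_item(frame, catalog):
    """Match a sensed barcode in the frame against a product catalog."""
    barcode = frame.get("barcode")
    return catalog.get(barcode)

def render_overlay(item, task):
    """Build the display data shown to the user on the display device."""
    if item is None:
        return {"message": "item not recognised"}
    correct = item["sku"] == task["sku"]
    return {
        "name": item["name"],
        "quantity_to_pick": task["quantity"],
        "highlight": "green" if correct else "red",
    }

catalog = {"4006381333931": {"sku": "SKU-42", "name": "Widget"}}
task = {"sku": "SKU-42", "quantity": 3}
frame = {"barcode": "4006381333931"}  # would come from a camera or scanner
overlay = render_overlay(identify_item(frame, catalog), task)
print(overlay["highlight"])  # green
```

In a real system the frame would carry raw camera data and the identification step would involve shape recognition or marker decoding; the sketch reduces that to a barcode lookup.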
One aspect of the invention includes a work station comprising a picking source, a user work zone, and an augmented reality system including a computer, wherein the work station is configured to enable a user in the user work zone to carry out the method according to the invention. The picking source might be a shelf or other container which is filled with at least one item for picking. The picking source might be mobile, which means that it might be transported to the work zone for picking and removed afterwards. The work zone might be of a size such that a user can reach the picking source as well as a placement device, which might be part of the work station. The placement device might comprise another shelf or another container for dropping the picked item. This shelf or container might be transported to the work zone in an empty condition and removed from the work zone when at least partly filled with the at least one item. The work station comprises an augmented reality system including a computer. The computer controls the display device of the augmented reality system. The computer might be placed distant from the work zone and might be connected to the display device via a wired or wireless connection. The computer might also be part of a computer system controlling the movements of the picking source and/or the placement device to and/or from the work zone. The display device might be a head-up display device, which is worn by the user like glasses, and which preferably allows the user to see the reality/the environment as well as additional information. Preferably, the reality is seen through the glasses and additional information is projected on the glasses. As an alternative, the display device might comprise two displays, one for each eye, and the displays show the reality as captured by one or two cameras, enriched with additional information generated by the computer.
The display device might also comprise screen displays, projecting means and other means for displaying computer generated augmented views.
The user might use the work station carrying out at least one of the steps as follows:
- The user puts on the display device of the augmented reality system, i.e. the head-up display;
- The user turns on the augmented reality system;
- The picking application is started automatically or manually by the user;
- The user logs in, preferably using a badge with a barcode, RFID tag, NFC tag, Bluetooth, or by providing username/password credentials;
- The user selects a workstation, a work zone, or an equipment (such as a forklift truck) from a list that is displayed, or it is automatically preselected based on configuration, either on the display device or on a server that is used to provide configuration and picking process information to be displayed on the display device;
- The user is instructed to start picking by the augmented reality system. While picking, the picking sources and/or placement devices are presented automatically at the work zone of the work station. The presentation can be controlled by means of the computer.
- When all orders assigned are completed or there are no more orders to pick, the user might logout of the augmented reality system.
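The session steps above can be sketched as a simple state sequence. The step names and the transition rule are invented for illustration; the patent only lists the steps in prose.

```python
# Hypothetical sketch of the workstation session flow: the user advances
# through the steps in order, and stays in "pick" while orders remain.

SESSION_STEPS = [
    "put_on_display", "power_on", "start_app",
    "login", "select_workstation", "pick", "logout",
]

def next_step(current: str, orders_remaining: int) -> str:
    """Return the next session step; 'pick' repeats until no orders remain."""
    i = SESSION_STEPS.index(current)
    if current == "pick" and orders_remaining > 0:
        return "pick"
    return SESSION_STEPS[min(i + 1, len(SESSION_STEPS) - 1)]

print(next_step("pick", 2))  # pick
print(next_step("pick", 0))  # logout
```

The final transition mirrors the last bullet: when all assigned orders are completed, the user may log out of the augmented reality system.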
One aspect of the invention includes a computer program product, such as a computer-readable storage medium, comprising coding segments that, when loaded and executed on a suitable system, can execute a method for picking items. The storage medium can be a volatile or permanent storage medium, such as a DVD, a CD, an EEPROM, a memory card, a data stream and so on. The suitable system, e.g. the processing device, may comprise a general purpose computer like a PC, a workstation, a microcontroller, an embedded PC, a smartphone and the like.
Additional objects, advantages and features of the invention will now be described in greater detail, by way of example, with reference to preferred embodiments depicted in the figures in which:
Fig. 1 shows
Fig. 2 shows
Fig. 3 shows
Fig. 4 shows
Fig. 5 shows
Fig. 6 shows
Fig. 7 shows
Fig. 8 shows
Fig. 9 shows
Fig. 10 shows
Fig. 11 shows
Fig. 12 shows
The general augmented reality system shown in Figure 1 allows a user 102 to see a reality 100 and enhance it with additional information. For example, objects 109, 110, 112 or zones/areas 106 might be identified by the augmented reality system and associated information might be displayed via a display device, which is preferably a head-up display 104 worn by the user 102 like glasses. The additional information might include information about an item and the shelf from which to pick the item.
Figure 2 shows an augmented view 10 displayed to the user by means of the display device 104 when working at the work station. Within the work zone the user finds the picking source 12 and the placement device 14. The picking source 12 can be a container 12, preferably located in a shelf or rack, or the picking source 12 can be a defined place on a rack 12 or a shelf 12. Analogously, the placement device 14 can be a target container 14, preferably located in a shelf or rack, or the placement device 14 can be a defined place on a rack 14 or a shelf 14. In the embodiment shown in Fig. 2, picking source 12 and target device 14 are two shelves which are presented at the work zone within reach of the user. But it has to be understood that the picking source 12 and the target device 14 may be on a rack, a shelf, a crane, a conveyor, a lift, an autonomous guided vehicle, a fork-lift truck, a pallet jack, or any device that may be used to transport them.
The augmented reality system is configured to identify any rack or shelf 12 and its configuration 16. The identification process can be initiated automatically when the rack 12 is visible to the augmented reality system. In the same way, the configuration 18 of rack 14 could be identified. The augmented reality system is able to enhance the reality by creating virtual shelves or containers 20 that represent the defined areas of a physical rack, here rack 14. When a user needs to pick or put away an item 22 from or to these areas, they are marked by highlighting means 24, for example by using a different colour or shape. Further, an area 26 of view 10 might be used to present additional information to the user, such as order, process, and/or product related information. This additional information can include at least one of the order number, the task number, the product name 30, the product quantity 32, product attributes, product images 28, and information about the current task and/or next tasks and orders. The augmented reality system is able to display the physical items 22 and the augmented reality shelves 20 or containers. The system can support the picking process by identifying objects or areas and selecting shelves and/or objects to be picked from and picked to, which product is to be picked and how, relevant information for the pick, pick or packaging instructions, and process or order information. An arrow 34 can be displayed to indicate to a user the direction to look in order to perform the next action. Once he looks at the area where the next action should happen, the arrow may be dimmed or disappear and the area 20 will be highlighted. An additional information display area 36 will be displayed with actions that a user can perform, for example for error handling, quality control actions or others.
Figure 3 shows a detail of the augmented image as shown in Figure 2, presented to the user to allow him to perform the pick order 40 from the picking source 12 to the target device 14. A processing device of the augmented reality system will bear responsibility for guiding the operator to perform each pick from the picking source 12, e.g. a source container, shelf or area, to the target device 14, e.g. a target container, shelf or area. The identification of the items 22 will be done using the item's shape and/or an object identifier (e.g. barcode, RFID tag, etc.) coupled to or attached to the corresponding item 22. The data captured from the work zone by at least one sensing device (e.g. one or more cameras, particularly stereoscopic cameras) may be transmitted to the processing device, which is capable of performing at least one of the steps: identifying item 22, determining the location and/or orientation of item 22. The processing device can generate a decision on the identification of an orientation, location or item 22, and an augmented representation of the work zone could be generated and presented to the user 102 via display device 104, wherein the augmented view 10 presented to the user 102 could include highlighted identified items 22. The augmented reality system will provide sensory data to the processing device to allow the identification of any source and target rack, container and shelf and its configuration 16, 18 automatically when it is visible to the augmented reality system. The processing device will generate an output that would allow augmented images to be overlaid on physical racks, containers, shelves and/or areas, which would be highlighted by using a different colour or shape, as disclosed in more detail with regard to Figures 4 to 10 below.
The pick order 40 shown in Figure 3 includes all information for the user 102 in order to carry out the order completely and correctly. The information area of the pick order 40 can comprise a current pick/order headline 42 identifying the area 44 providing information regarding the order being picked and/or the product to pick. For the user's convenience, a control element 46 can be provided allowing the user 102 to move the augmented information 44 in the augmented view 10 displayed to him. The next item 22 required to fulfil the order 40 can additionally be represented by means of an item image 48 or product image in order to support the user 102 in deciding which item to pick, or in checking whether the correct item is presented at the picking source 12 or whether he has to look to a different picking source 12. Area 44 can further include item information such as the required quantity, the product identification number, the product name, the order number, and/or the order line number, and other useful information. In case the incorrect item 22 is picked from the picking source 12 or put away to the wrong target device 14, the augmented reality system, preferably the processing unit, may block the user 102 from continuing with any additional tasks until the planned task is performed correctly. The user 102 can be further supported in performing the next pick/order, wherein the information area 50 of the next pick order can be identified via a next pick order headline 52. This area 50 can provide information corresponding to the area 44.
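The blocking rule described above, where an incorrect pick or put-away prevents further tasks, can be sketched as a validation function. Field names and messages are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: compare the performed pick against the planned task;
# a False result blocks the user from continuing with additional tasks.

def validate_pick(planned: dict, performed: dict):
    """Return (ok, message); ok=False means the task must be repeated."""
    if performed["item"] != planned["item"]:
        return False, "wrong item picked - repeat the task"
    if performed["target"] != planned["target"]:
        return False, "wrong target device - repeat the task"
    if performed["quantity"] != planned["quantity"]:
        return False, "wrong quantity - repeat the task"
    return True, "pick confirmed"

planned = {"item": "SKU-42", "target": 14, "quantity": 2}
ok, msg = validate_pick(planned, {"item": "SKU-42", "target": 14, "quantity": 2})
print(ok)  # True
ok, msg = validate_pick(planned, {"item": "SKU-42", "target": 12, "quantity": 2})
print(ok)  # False
```

In the described system the `performed` values would come from the sensing devices (shape recognition, barcode or RFID scans) rather than being supplied directly.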
Additionally, the augmented view can comprise one of the arrows 54 displayed to indicate to the user 102 the direction the user 102 should look in to perform the next action. Optionally, the next picking source 12 and target device 14 could also be highlighted once they have been identified.
An instruction area 56 could be displayed with actions 58 that the user 102 can perform, e.g. error handling, quality control actions or others. The instruction area 56 could be indicated by instruction area headline 60 and contain pick related instructions 62 for the current order/pick task. The actions area 58 can include individual actions 64 that can be performed by the user 102. Possible actions 64 may include pick-up related actions like screen exceptions, wrong TU, missing TU, wrong product, wrong product attributes, quality issue, missing quantity, replenish alert, etc. or drop-off related actions like screen exceptions, scan alternative location or parent TU, quality issue, etc.
Figure 4 shows the initial augmented view 10 presented to the user 102 to log in. User 102 would provide authentication information such as username 6, password 68, and optionally a workstation ID, by means of an input device. On confirmation via confirmation button 72, the data captured by the at least one input device may be transmitted to the processing device, where the user 102 can be identified by the processing device. The processing device can generate a decision on the identification of user 102, and based on that decision the augmented reality system can display information about products and orders to pick for user 102. Cancel button 74 might cancel or restart the login procedure.
Figures 5a and 5b show a closed container 76 containing physical or virtual markers 78 that allow detecting the presence and/or the position of the container 76 in a working space. Additionally, an object identifier 80 will be present to uniquely identify the container 76 and its goods. The data captured by at least one sensing device may be transmitted to the processing device, where the marker 78, location or position can be identified by the processing device. The processing device can generate a decision on the identification of a position, location or product so an augmented view could be generated and displayed to the user highlighting the identified object 76 by means of a highlight marker 82.
Figures 6a to 6c show a container 76 with one or more opened sides, containing physical or virtual markers 78 that allow to detect the presence and/or position of a container 76 in a working space. Additionally, physical or virtual markers 78 can be present or identified to allow the identification of compartments, shelves within the container 76. Such compartments etc. can be defined by container dividers 84, or shelf dividers respectively, which can be made of more or less rigid material or being virtual. An object identifier 80 will be present to uniquely identify the container and/or its goods or the items 22 in the container 76 or shelf.
Figure 7 shows a perspective view of a rack system 86 mounted on a floor. Figure 8 shows a detail of the rack system of Figure 7. It has to be understood that the rack system could also be mounted on an autonomous guided vehicle, a manual vehicle, or a conveyor system. The rack system 86 has one or more opened sides and contains physical or virtual markers 78 that allow detecting the presence and/or position of the racks and their contents or containers in a working zone 106. Physical or virtual markers 78 might be present or identified to allow the identification of compartments or shelves within the racks or containers. Object identifiers 80 could be present to uniquely identify the container and its goods. The identification of items will be done using their shape and/or an object identifier (e.g. barcode, RFID tag, etc.). The data captured by at least one sensing device may be transmitted to the processing device, where the marker, location, position or object can be identified by the processing device. The processing device can generate a decision on the identification of a position, location or product and send the output to the augmented reality device so it can generate an augmented display image highlighting the identified object. In manual aisles, operators will be guided by displaying augmented images with routing paths to areas and/or locations and highlighting the container to pick from and to pick to within the working space. In automated workstations, where robots bring the racks, shelves or containers to pick from to the workstation, operators will be guided by displaying augmented images of the areas, locations, shelves, and containers the users need to pick from and pick to within the working space of the workstation.
Figure 9 shows a rack system that may be on the floor, on an autonomous guided vehicle, a manual vehicle, or a conveyor system, with one or more opened sides, containing physical or virtual markers that allow detecting the presence and/or position of the racks and their contents/containers in a working space. Additionally, physical or virtual markers can be present or identified to allow the identification of compartments 88 or shelves within the racks or containers. An object identifier will be present to uniquely identify the container and its goods. The identification of goods will be done using their shape and/or an object identifier (e.g. barcode, RFID tag, etc.). The data captured by at least one sensing device may be transmitted to the processing device, where the marker, location, position or object can be identified by the processing device. The processing device can generate a decision on the identification of a position, location or product and send the output to the augmented reality device so it can generate an augmented display image highlighting the identified object. In manual aisles, operators will be guided by displaying augmented images with routing paths to areas and/or locations and highlighting the container to pick from and to pick to within the working space. In automated workstations, where robots bring the racks, shelves or containers to pick from to the workstation, operators will be guided by displaying augmented images of the areas, locations, shelves, and containers the users need to pick from 12 and pick to 14 within the working zone 106 of the workstation.
Figure 10 shows an automated system with a workstation that receives containers 12 containing physical or virtual markers that allow detecting the presence and/or position of the containers and their contents in a working zone 106. Additionally, physical or virtual markers 78 can be present or identified to allow the identification of compartments or shelves within the containers 12, 14. An object identifier will be present to uniquely identify the container and its goods. The identification of goods will be done using their shape and/or an object identifier (e.g. barcode, RFID tag, etc.). The data captured by at least one sensing device may be transmitted to the processing device, where the marker, location, position or object can be identified by the processing device. The processing device can generate a decision on the identification of a position, location or product. The output could be used to generate an augmented view on the augmented display device 104, wherein the identified object could be highlighted. Users 102 will be guided by displaying augmented views of the locations and containers the users need to pick from 12 and pick to 14 within the working zone 106.
Figures 11 and 12 show the identification of manual or automated vehicles/equipment and the goods (products, containers, etc.) containing physical or virtual markers 78 that allow detecting the presence and/or position of the vehicles and their contents in a working space. Additionally, physical or virtual markers 78 can be present or identified to allow the identification of compartments or shelves within the vehicles. An object identifier 80 will be present to uniquely identify each of the containers and its goods. The identification of goods will be done using their shape and/or an object identifier (e.g. barcode, RFID tag, etc.). The data captured by at least one sensing device may be transmitted to the processing device, where the marker, location, position or object can be identified by the processing device. The processing device can generate a decision on the identification of a position, location or product and send the output to the augmented reality device so it can generate an augmented display image highlighting the identified object. Operators will be guided by displaying augmented images of the areas, locations, shelves, and containers the users need to pick from and pick to within the working space close to the vehicle.
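The marker pipeline that recurs throughout Figures 5 to 12, where a detected marker is resolved to a container or rack and a highlight is drawn around it, can be sketched as follows. The marker IDs, the registry, and the output format are invented for illustration.

```python
# Hypothetical sketch: resolve a detected marker (78) to the object it is
# attached to (e.g. container 76, rack 86) and build the highlight (82).

MARKER_REGISTRY = {
    "M-78": {"object_id": "CONT-76", "kind": "container"},
    "M-79": {"object_id": "RACK-86", "kind": "rack"},
}

def identify_marker(marker_id: str):
    """Resolve a detected marker to the registered object, if any."""
    return MARKER_REGISTRY.get(marker_id)

def highlight_instruction(detection: dict):
    """Build the highlight drawn around the identified object on the display."""
    obj = identify_marker(detection["marker_id"])
    if obj is None:
        return None  # unknown marker: nothing to highlight
    return {"object_id": obj["object_id"], "highlight_at": detection["position"]}

instr = highlight_instruction({"marker_id": "M-78", "position": (120, 80)})
print(instr["object_id"])  # CONT-76
```

The same lookup-then-highlight pattern applies whether the marker sits on a closed container, an open rack compartment, or a vehicle, which is why the description repeats it for each figure.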
According to a preferred embodiment, the method for picking items supported by means of an augmented reality system might have one or more of the method steps described below:
1. User puts on the Augmented Reality device and turns it on (cf. Figure 1).
2. The picking application is automatically started or the user manually starts it.
3. User logs in using a badge with a barcode, a near field communication device, Bluetooth, or by providing username/password credentials (cf. Figure 4).
4. User identifies the equipment by using at least one sensing device. The data captured by the sensing device may be transmitted to the processing device, wherein the marker, identifier or object can be identified, preferably by the processing device.
5. The user enters Picking from a main menu.
6. User selects an order from a list of available orders or enters the order number if it is already known.
7. The order list could be sorted according to an order sorting sequence. All orders with released pick tasks of the respective pick method can be shown (optionally also those outside the current zone). The columns displayed could comprise order, dispatch date, customer, shipment, etc.
8. By default, the system could suggest the most important order according to the order sorting sequence.
9. An equipment capacity check could be executed to make sure the newly selected order can be picked onto the equipment. If not, the user is sent to drop off the order TU(s) already on the equipment.
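The capacity check in step 9 can be sketched as a simple predicate. Counting capacity in whole transport units is an assumption; a real system might use volume or weight.

```python
# Hypothetical sketch of the equipment capacity check: before assigning a new
# order, verify its transport units fit on the equipment; otherwise the user
# is sent to drop off the order TUs already loaded.

def can_accept_order(equipment_capacity: int, loaded_tus: int, order_tus: int) -> bool:
    """True if the order's transport units fit on the equipment."""
    return loaded_tus + order_tus <= equipment_capacity

if can_accept_order(equipment_capacity=4, loaded_tus=3, order_tus=2):
    action = "start picking"
else:
    action = "drop off loaded order TUs first"
print(action)  # drop off loaded order TUs first
```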
10. Once the order is selected, the system displays any order pre-pick instructions (cf. Figure 3).
11. The augmented reality system could guide the operator to the corresponding work zone (cf. Figure 7) or a location (cf. Figure 8) nearby, to indicate in which zone the user should preferably start picking the selected order.
12. The augmented reality system assigns the user to the first available pick task from the group, preferably following a pick task assignment strategy.
13. The augmented reality system directs the user to go to the picking source location.
14. Information regarding what to pick next is already displayed (cf. Figure 3): product, attributes, quantity, etc.
15. The augmented reality system guides the user through augmented views to allow the user to identify the picking source, e.g. a container, a shelf, etc. Once it is identified, preferably using a sensing device, the user could be requested to scan each one of the products displayed and pick the requested quantities.
16. Perform the picking. The picking step could comprise one or more of the picking steps below:
a) If there are multiple load units on the location or container, or if the product requires validation, the user might identify each product using one or more sensing devices. Additionally, physical or virtual markers 78 can be present or identified to allow the identification of objects, compartments and shelves within the racks or containers. An object identifier 80 could be present to uniquely identify the container and its products. The identification of products could be done using their shape and/or an object identifier (e.g. barcode, RFID tag, etc.).
b) The system could prompt for attribute validation and/or capturing, according to a product attribute configuration.
c) The system could prompt to scan attributes in order to identify the respective load unit the user attempts to pick from: the system compares the product and attributes against potentially different load units (LU) on the location and tries to find a matching load unit against the one referenced by the current pick task. The system could continue prompting for attributes until the load unit can be clearly identified.
Order line picking instructions and product instructions could be displayed (cf. Figure 3).
The user can confirm the full quantity or state a lower quantity.
The user could be guided to scan the destination container and confirm that the picked products are in the destination container. The container may have physical or virtual markers that allow detecting the presence and/or position of a container in a working space. Additionally, physical or virtual markers can be present or identified to allow the identification of compartments and shelves within the container. An object identifier may be present to uniquely identify the container and its goods. The identification of goods will be done using their shape and/or an object identifier (e.g. barcode, RFID tag, etc.). The data captured by at least one sensing device may be transmitted to the processing device, where the marker, location, position or object can be identified by the processing device. The processing device can generate a decision on the identification of a position, location or product and send the output to the augmented reality device so that it can generate an augmented display image highlighting the identified object and confirm the pick.
Next, the system could check the order container criteria for consolidation / staging / loading to determine where the order container can be dropped off (consolidation location, staging location, dock location). The order container criteria could include the checks related to order routing. If the order needs to be consolidated, the processing device can look up the current consolidation location of the order or, if not specified, the default consolidation location of the system. The system can further create a TU-level consolidation task from the equipment location to the destination determined above.
If the order is allowed to be directly staged, the system can look up the current staging location of the respective shipment and can create a TU-level staging task from the equipment location to the destination determined above. If the order is allowed to be directly loaded, the system can look up the current dock location of the respective shipment and can create a TU-level loading task from the equipment location to the destination determined above.
k) If no more pick tasks are left for the order, the user can select, or is assigned by the system, a new order and the process starts again from step 6; otherwise the user continues with step 17.
17. Perform the drop-off. The drop-off step could comprise one or more of the drop-off steps below:
a) If there is more than one container on the equipment (cf. Figures 11 and 12), the system can prompt to state which TU shall be dropped off next. Otherwise this step can be skipped.
b) The user can use one or more sensing devices to identify the container.
c) Based on the container identified in the previous step, the system can assign the corresponding consolidation, staging or loading task to the user.
d) The user can be guided to the consolidation location.
e) Once the user arrives at the consolidation location, the system can request the use of one of the sensing devices to confirm the destination location and the drop-off into the container.
f) Return to step 6.
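The drop-off routing decision described in step 16 (consolidation first, otherwise direct staging, otherwise direct loading) can be sketched as follows; the Order structure, function and all location names are illustrative assumptions, not part of this application:

```python
# Illustrative sketch of the order-container routing check: decide whether the
# order transport unit (TU) goes to a consolidation location, a staging
# location, or directly to a dock for loading. All names are assumptions.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Order:
    needs_consolidation: bool
    direct_staging_allowed: bool
    direct_loading_allowed: bool
    consolidation_location: Optional[str] = None  # None -> use system default


DEFAULT_CONSOLIDATION_LOCATION = "CONS-DEFAULT"


def route_order_tu(order: Order, staging_of_shipment: str, dock_of_shipment: str) -> tuple:
    """Return (task_type, destination) for a TU-level task from the equipment location."""
    if order.needs_consolidation:
        # Use the order's current consolidation location or the system default.
        dest = order.consolidation_location or DEFAULT_CONSOLIDATION_LOCATION
        return ("consolidation", dest)
    if order.direct_staging_allowed:
        return ("staging", staging_of_shipment)
    if order.direct_loading_allowed:
        return ("loading", dock_of_shipment)
    return ("hold", "equipment")  # no route determined yet


# Example: an order that must be consolidated, with no specific location set.
print(route_order_tu(Order(True, False, False), "STAGE-1", "DOCK-3"))
# ('consolidation', 'CONS-DEFAULT')
```

The precedence of the three branches follows the order in which the description introduces them; a real system would derive the flags from the order routing checks mentioned above.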
The following exceptions may be supported by the augmented reality manual picking process:
Pick-up - Wrong TU
Pick-up - Missing TU
Pick-up - Missing Quantity
Pick-up - Quality Issue
Pick-up - Wrong Attributes
Pick-up - Wrong Product
Pick-up - Replenish Alert (only for Locations with Product Assignment)
Drop-off - Scan Alternative Location or Parent TU
Drop-off - Quality Issue on Equipment
Drop-off - Palletize
List of reference signs
10 augmented view
12 picking source
14 placement device
16 configuration
18 configuration
20 virtual shelves or containers
22 item
24 highlighting means
26 area presenting additional information
28 product image
30 product name
32 product quantity
34 arrow
36 additional information display area
40 pick and order information area
42 current Pick/Order headline
44 area providing order/pick information
46 control element
48 Product Image
50 area providing information regarding the next order
52 Next Pick/Order headline
54 arrow
56 instructions area
58 actions area
60 instructions area headline
62 instructions
64 individual actions
66 login username
68 login password
70 optional workstation input
72 confirm login
74 cancel login
76 container
78 marker
80 object identifier
highlight marker
container divider (physical or virtual)
rack system
88 compartment
augmented reality system
102 user
104 display device
106 work zone / area
object
object
object

Claims

1. Computer-aided method for picking comprising:
- Displaying information associated to at least one item (22) to be picked by a user (102) by means of a display device (104) of an augmented reality system to the user (102);
- Picking the at least one item (22) from a picking source (12);
- Displaying a confirmation whether the correct item (22) and/or the correct number of items (22) is picked.
2. The method according to claim 1, wherein the displayed information associated to the at least one item (22) and one of the at least one item (22) are visible to the user (102) at the same time.
3. The method according to claim 1 or 2, wherein the displayed information associated to the at least one item (22) includes at least one of: items the user needs to pick, the product name of the item, at least one attribute of the item, a photo (48) of the item, order information, task information, customer information, the quantity to pick, and a marker positioned in the vicinity of the item (22) to be picked.
4. The method according to one of the preceding claims, wherein the displayed information associated to the at least one item (22) includes information about the location or position from where the user (102) needs to pick from and where the user needs to put away to.
5. The method according to claim 4, wherein the displayed information associated to the at least one item (22) includes highlighting an area of any transport unit (76) or of any shelf (86), compartment (88), and boxed area.
6. The method according to one of the preceding claims, wherein the displayed information associated to the at least one item (22) includes information whether the picking process is completed.
7. Work station comprising a picking source (12), a user work zone (106), and an augmented reality system including a computer, wherein the work station is configured to enable a user (102) in the user work zone (106) to carry out the method according to one of the preceding claims.
8. A computer-readable storage medium for a computer-aided picking of items (22), wherein the storage medium comprises coding segments that, when loaded and executed on a suitable system, can execute a method according to any one of claims 1 to 6.
PCT/EP2018/000095 2017-03-13 2018-03-12 Method for picking items Ceased WO2018166652A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP17000401 2017-03-13
EP17000401.4 2017-03-13

Publications (1)

Publication Number Publication Date
WO2018166652A1 true WO2018166652A1 (en) 2018-09-20

Family

ID=58360790

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/000095 Ceased WO2018166652A1 (en) 2017-03-13 2018-03-12 Method for picking items

Country Status (1)

Country Link
WO (1) WO2018166652A1 (en)



Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017007866A (en) * 2016-09-13 2017-01-12 オークラ輸送機株式会社 Picking system

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200272969A1 (en) * 2017-09-08 2020-08-27 Ns Solutions Corporation Information processing system, information processing device, information processing method, program, and recording medium
US12456094B2 (en) 2019-02-25 2025-10-28 Rehrig Pacific Company Delivery system
CN112279096A (en) * 2020-09-18 2021-01-29 天津海运职业学院 Container turning prompting method, device, equipment and storage medium based on augmented reality
CN112279096B (en) * 2020-09-18 2023-01-10 天津海运职业学院 Container turning prompting method, device, equipment and storage medium based on augmented reality
US20220122029A1 (en) * 2020-09-22 2022-04-21 Rehrig Pacific Company Pick assist system
EP4470683A1 (en) * 2023-06-02 2024-12-04 BEUMER Group GmbH & Co. KG A method and a system for sorting and storing objects
WO2024246258A1 (en) * 2023-06-02 2024-12-05 BEUMER Group GmbH & Co. KG A method and a system for sorting and storing objects

Similar Documents

Publication Publication Date Title
US20240043213A1 (en) Hybrid Modular Storage Fetching System
WO2018166652A1 (en) Method for picking items
US11409301B2 (en) Item transport system and method combining storage with picking
AU2017422425B9 (en) Relay cargo picking system and picking method
US10647509B2 (en) Warehouse management system
KR102452858B1 (en) Warehouse automation systems and methods using motorized carts
US10163149B1 (en) Providing item pick and place information to a user
JP6220406B2 (en) Picking system
JP7057793B2 (en) Packing by destination for automatic fulfillment products
US20210319391A1 (en) Perpetual batch order fulfillment
US11687881B2 (en) Virtual put wall
JP7186160B2 (en) Relay type article sorting system and sorting method
KR102216641B1 (en) Operation Method For Location Recommendation And Apparatus Therefor
KR20190107095A (en) Display for improved efficiency in robot-assisted order fulfillment
CN109747897A (en) Article packing method, device and control system based on user's order
CA3049395A1 (en) Hybrid modular storage fetching system
JP2019215879A (en) Perpetual batch order fulfillment
KR102844187B1 (en) Dynamic item loading management using mobile robots
US12387253B1 (en) Item based path development
WO2024111041A1 (en) Display system, display method, and program
JP2023064210A (en) Picking control system, picking control device, and picking control method
JP2023031618A (en) Apparatus, method and system for managing article, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18711476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18711476

Country of ref document: EP

Kind code of ref document: A1