WO2025072750A1 - Workpiece transport apparatus with integrated vision end effector detection - Google Patents

Workpiece transport apparatus with integrated vision end effector detection

Info

Publication number
WO2025072750A1
WO2025072750A1 (PCT/US2024/048963)
Authority
WO
WIPO (PCT)
Prior art keywords
end effector
controller
robot
robot arm
configuration characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/048963
Other languages
English (en)
Inventor
Amirali OMIDFAR
Justo GRACIANO
Harshit Jain
Caspar Hansen
Robert Carlson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Brooks Automation US LLC
Original Assignee
Brooks Automation US LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Brooks Automation US LLC filed Critical Brooks Automation US LLC
Publication of WO2025072750A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1692Calibration of manipulator
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00Gripping heads and other end effectors
    • B25J15/04Gripping heads and other end effectors with provision for the remote detachment or exchange of the head or parts thereof
    • B25J15/0483Gripping heads and other end effectors with provision for the remote detachment or exchange of the head or parts thereof with head identification means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/02Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J9/04Programme-controlled manipulators characterised by movement of the arms, e.g. cartesian coordinate type by rotating at least one arm, excluding the head movement itself, e.g. cylindrical coordinate type or polar coordinate type
    • B25J9/041Cylindrical coordinate type
    • B25J9/042Cylindrical coordinate type comprising an articulated arm
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
    • H01L21/67005Apparatus not specifically provided for elsewhere
    • H01L21/67242Apparatus for monitoring, sorting or marking
    • H01L21/67294Apparatus for monitoring, sorting or marking using identification means, e.g. labels on substrates or labels on containers
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
    • H01L21/677Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for conveying, e.g. between different workstations
    • H01L21/67763Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for conveying, e.g. between different workstations the wafers being stored in a carrier, involving loading and unloading
    • H01L21/67766Mechanical parts of transfer devices
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L21/00Processes or apparatus adapted for the manufacture or treatment of semiconductor or solid state devices or of parts thereof
    • H01L21/67Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere
    • H01L21/683Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for supporting or gripping
    • H01L21/687Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for supporting or gripping using mechanical means, e.g. chucks, clamps or pinches
    • H01L21/68707Apparatus specially adapted for handling semiconductor or electric solid state devices during manufacture or treatment thereof; Apparatus specially adapted for handling wafers during manufacture or treatment of semiconductor or electric solid state devices or components ; Apparatus not specifically provided for elsewhere for supporting or gripping using mechanical means, e.g. chucks, clamps or pinches the wafers being placed on a robot blade, or gripped by a gripper for conveyance

Definitions

  • the present disclosure generally relates to workpiece handling, and more particularly, to calibration and monitoring of workpiece handling robots.
  • automatic robot teaching technologies for teaching a position of a workpiece handling robot within an automated processing environment employ machine vision (e.g., cameras) mounted to a fixture.
  • a robot arm is moved so as to place the end effector thereof within the field(s) of view of the cameras for detecting a position of the end effector.
  • the motion of the end effector is limited, which in turn limits the imaging of the end effector with the machine vision such that an absolute position of the end effector is not determined.
  • the automatic robot teaching technologies employ a teaching target that is placed on the end effector, such as by a human operator. Placement of this teaching target on the end effector may cause variations in teaching accuracy.
  • the workpiece handling robots generally do not include feedback for physical positions of the end effectors relative to the workpiece handling robot itself. With operation of the workpiece handling robot, it is possible for the end effectors to shift or move relative to other portions of the workpiece handling robot, creating pick/place errors of the workpiece handling robot.
  • the workpiece handling robot is typically manually configured for a type of end effector installed on the workpiece handling robot.
  • a human operator configures the workpiece handling robot by entering (such as into a controller of the workpiece handling robot) the type of (e.g., operating characteristics of) end effector installed.
  • the manual configuration introduces possible human error in the workpiece handling robot configuration process.
  • FIG. 1 is an exemplary schematic illustration of a processing apparatus incorporating the present disclosure
  • FIG. 2A is an exemplary schematic perspective illustration of a portion of the processing apparatus of Fig. 1 in accordance with the present disclosure
  • FIGs. 2B-2G are exemplary illustrations of a portion of a workpiece transport of the processing apparatus in accordance with the present disclosure
  • FIG. 3A is an exemplary top view illustration of a portion of a workpiece transport of the processing apparatus in accordance with the present disclosure
  • Fig. 3B is an exemplary side view illustration of a portion of the workpiece transport of Fig. 3A in accordance with the present disclosure
  • Fig. 3C is an exemplary perspective view illustration of a portion of the workpiece transport of Fig. 3A in accordance with the present disclosure
  • Fig. 4A is an exemplary top view illustration of a portion of the workpiece transport of Fig. 3A in accordance with the present disclosure
  • Fig. 4B is an exemplary side view illustration of a portion of the workpiece transport of Fig. 3A in accordance with the present disclosure
  • FIG. 5 is a schematic perspective view illustration of an exemplary end effector of the workpiece transport of Fig. 3A in accordance with the present disclosure
  • FIG. 5A is a schematic illustration of a portion of the processing apparatus of Fig. 1 in accordance with the present disclosure
  • FIG. 5B is a schematic illustration of a portion of the processing apparatus of Fig. 1 in accordance with the present disclosure
  • FIGs. 6 and 7 are flow diagrams of exemplary methods in accordance with the present disclosure.
  • Fig. 8 is an exemplary top view illustration of a portion of the workpiece transport of Fig. 3A in accordance with the present disclosure
  • Fig. 8A is an exemplary side view illustration of a portion of the workpiece transport of Fig. 3A in accordance with the present disclosure
  • Fig. 9 is a flow diagram of an exemplary method in accordance with the present disclosure
  • FIG. 10 is a schematic illustration of a portion of the processing apparatus of Fig. 1 in accordance with the present disclosure
  • FIGS. 11A-11C are schematic illustrations of a portion of the workpiece transport of Fig. 10 in various orientations in accordance with the present disclosure
  • FIG. 12 is a flow diagram of an exemplary method in accordance with the present disclosure.
  • Fig. 13 is a flow diagram of an exemplary method in accordance with the present disclosure.
  • FIG. 14 is a flow diagram of an exemplary method in accordance with the present disclosure.
  • Fig. 15 is a flow diagram of an exemplary method in accordance with the present disclosure.
  • Fig. 16 is a flow diagram of an exemplary method in accordance with the present disclosure.
  • the term "each" refers to a single object (i.e., the object) in the case of a single object, or to each object in the case of multiple objects.
  • the words “a,” “an,” and “the” as used herein are inclusive of “at least one” and “one or more” so as not to limit the object being referred to as being in its “singular” form.
  • Fig. 1 illustrates an exemplary processing apparatus or tool 100 in accordance with the present disclosure.
  • while the present disclosure will be described with reference to the drawings, it should be understood that the present disclosure could be embodied in many forms. In addition, any suitable size, shape, or type of elements or materials could be used. It is also noted that, while the disclosure makes reference to the X, Y, Z, R, EX, EY, EZ axes, the rotations θ, ERx (pitch), ERy (roll), ERz (yaw), and directional language (e.g., vertical, horizontal, etc.), such nomenclature is exemplary only, and the different axes and directions referred to herein may be referred to with any suitable nomenclature.
  • the present disclosure provides for an improved (compared to the conventional techniques described above) automatic teaching process for teaching the position of a workpiece handling robot (also referred to herein as a configurable robot, robot, or workpiece transport) 180 within the processing apparatus 100.
  • the automatic teaching procedure provides for a generalized and unified teaching algorithm that encompasses any number of distinct types (e.g., physical structure and operational characteristics as described herein) of end effectors 200.
  • the automatic teaching algorithm provides for the detection/measurement of the end effector 200 pose in six degrees of freedom (i.e., X, Y, Z, pitch, roll, and yaw in the transport reference frame and/or EX, EY, EZ, pitch, roll, and yaw in the end effector reference frame) with as few as three cameras (e.g., at least one two-dimensional camera and at least one of a stereo camera pair and a depth-determining camera) or with at least one photo-array sensor (an illustrative pose-estimation sketch is provided after this list).
  • the automatic teaching procedure employs a neural network model NNM, or any other suitable algorithm, that is trained to detect the different types of end effectors so as to substantially eliminate, or at least reduce, the possibility of human error in the robot configuration process (an illustrative classifier sketch is provided after this list).
  • the stereo camera pair and/or depth determining camera provides depth calculations to more precisely measure the end effector pose.
  • the at least one photo-array sensor also provides for depth calculations to more precisely measure the end effector pose.
  • the present disclosure also provides for automatic configuration of the workpiece transport 180 by automatically detecting the type of end effector 200 installed on the workpiece transport 180 so that the workpiece transport 180 is automatically configured with the operational parameters (also referred to herein as predetermined configuration characteristic(s) PCC, PCCA-PCCn; see Fig. 1) corresponding to the installed end effector 200.
  • each end effector 200 is provided with an integral self-identification feature (also referred to herein as an identifier or identification device) 500 that is detectable by the workpiece transport 180 (e.g., by a detector 370 thereof or communicably coupled thereto).
  • the self-identification feature 500 corresponds (e.g., via a lookup table 500T stored in any suitable memory of the controller 199) with predetermined operational parameters (e.g., predetermined operating robot functions) that are automatically loaded into the controller 199 for operation of the workpiece transport 180 (an illustrative configuration sketch is provided after this list).
  • This automatic detection and configuration is effected at initial setup of the workpiece transport 180 and at any reconfiguration (e.g., change of end effector of the same type or a different type) of the workpiece transport 180 at the robot manufacturer facility or in the field (e.g., a customer facility where the workpiece transport 180 operates in production of product).
  • the identifier 500 may be configured to effect end effector pose determination through detection of the identifier 500 with the at least one two-dimensional camera, the at least one of a stereo camera pair and a depth determining camera, and/or the at least one photo-array sensor.
  • the present disclosure further provides for automatic reconfiguration of the workpiece transport 180 through an automatic exchanging of end effectors with the workpiece transport 180 in a production environment (e.g., within a processing tool), avoiding the downtime of the tool normally associated with workpiece transport reconfiguration.
  • an end effector exchange station 169 is provided in or coupled to the processing tool 100.
  • each end effector 200 may include the self-identification feature (e.g., identifier 500) that is detectable by the workpiece transport 180 for effecting the reconfiguration of the workpiece transport 180 with an exchange of end effectors 200 at the end effector exchange station 169.
  • the substrate holding location of the end effector 200 (which is at a predetermined known location on the end effector 200, which predetermined known location is an operational characteristic of the end effector 200 obtained from the lookup table 500T) is known to the controller 199 such that reteaching of the workpiece transport 180 position may not be necessary.
  • EFEM 130 may have a shell or casing (also referred to as the EFEM frame) defining a protected environment or mini-environment where workpieces W may be accessed and handled with minimized potential for contamination between the transport containers 110.
  • the protected environment is employed to transport the workpieces W to and from the processing apparatus 100 and the workpiece process module 140.
  • the EFEM 130 also includes a workpiece transport 180 disposed within the EFEM 130 frame and configured with at least one end effector 200 for transporting workpieces W between the transport containers 110 and the workpiece process module 140.
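
The six-degree-of-freedom pose determination noted in the list above can be pictured with a short sketch. The following Python fragment is a minimal illustration only, not the disclosure's algorithm: it assumes that known fiducial points on the end effector 200 have already been triangulated from the camera images (e.g., from the stereo pair or depth-determining camera), and it recovers the rotation and translation of the end effector with the standard Kabsch/SVD rigid alignment. All names and values (stereo_depth, estimate_pose, the fiducial coordinates) are illustrative assumptions.

```python
# Minimal sketch (not the disclosure's algorithm): recovering an end effector pose
# in six degrees of freedom from stereo observations of known fiducial points.
import numpy as np

def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    # Rectified stereo pair: depth Z = f * B / d.
    return focal_px * baseline_m / disparity_px

def estimate_pose(model_pts: np.ndarray, observed_pts: np.ndarray):
    # Kabsch/SVD rigid alignment of known end effector fiducial points (model frame)
    # to their triangulated camera-frame positions; returns rotation R and translation t.
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

if __name__ == "__main__":
    # Three non-collinear fiducials on the end effector (model frame, millimetres).
    model = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 80.0, 0.0]])
    # Their positions as triangulated from the cameras (here: shifted 5 mm in X).
    observed = model + np.array([5.0, 0.0, 0.0])
    R, t = estimate_pose(model, observed)
    print(np.round(R, 3), np.round(t, 3))
```

The rotation matrix R carries pitch, roll, and yaw, and t carries X, Y, Z, giving the full six-degree-of-freedom pose in the camera (and hence transport) reference frame.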
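
The neural network model NNM referenced in the list is not specified in detail here; the sketch below only illustrates the general idea of classifying the installed end effector type from a camera image. It assumes PyTorch is available and that the network has been trained offline on labeled end effector images; the class names and network shape are hypothetical.

```python
# Minimal sketch, assuming PyTorch and an offline-trained model; class names and
# network shape are illustrative, not taken from the disclosure.
import torch
import torch.nn as nn

END_EFFECTOR_CLASSES = ["edge_grip", "vacuum", "passive_paddle"]  # assumed labels

class EndEffectorClassifier(nn.Module):
    def __init__(self, num_classes: int = len(END_EFFECTOR_CLASSES)):
        super().__init__()
        # Small convolutional feature extractor followed by a linear head.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

def classify(model: nn.Module, image: torch.Tensor) -> str:
    """image: a (3, H, W) tensor captured by one of the robot-mounted cameras."""
    model.eval()
    with torch.no_grad():
        logits = model(image.unsqueeze(0))        # add batch dimension
    return END_EFFECTOR_CLASSES[int(logits.argmax(dim=1))]
```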
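
Similarly, the automatic configuration via the identifier 500 and lookup table 500T can be sketched as a simple mapping from a detected identifier to predetermined configuration characteristics that the controller 199 loads without operator data entry. The identifiers, field names, and values below are hypothetical stand-ins, not values from the disclosure.

```python
# Minimal sketch (hypothetical identifiers, fields, and values): mapping a detected
# end effector identifier to predetermined configuration characteristics via a
# lookup table, then loading them into the controller automatically.
from dataclasses import dataclass

@dataclass(frozen=True)
class ConfigCharacteristic:
    payload_type: str            # e.g., wafer size / grip style handled
    holding_offset_mm: tuple     # substrate holding location on the end effector
    max_speed_mm_s: float        # motion limit appropriate to this end effector

# Stand-in for lookup table 500T, keyed by the registered identifier 500.
LOOKUP_500T = {
    "EE-EDGE-300": ConfigCharacteristic("300mm-edge-grip", (0.0, 152.4, 0.0), 900.0),
    "EE-VAC-200":  ConfigCharacteristic("200mm-vacuum",    (0.0, 101.6, 0.0), 700.0),
}

class Controller:
    """Stand-in for controller 199: holds the active configuration characteristic."""
    def __init__(self):
        self.active_config = None

    def configure_for(self, detected_id: str) -> ConfigCharacteristic:
        try:
            self.active_config = LOOKUP_500T[detected_id]
        except KeyError:
            raise ValueError(f"unknown end effector identifier: {detected_id}")
        return self.active_config

# Usage: the detector reports the identifier it registered on the installed end
# effector; the controller then configures itself without operator data entry.
controller = Controller()
print(controller.configure_for("EE-EDGE-300"))
```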

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Power Engineering (AREA)
  • Container, Conveyance, Adherence, Positioning, Of Wafer (AREA)

Abstract

A configurable robot has a base and an articulated robot arm with an end effector thereon. A controller is connected to and configured to articulate the articulated robot arm. A detector detects the end effector. Each end effector has identification characteristics, and the detector registers the identification characteristics and generates data embodying identification of a predetermined configuration characteristic of the detected end effector. The controller is programmed to automatically identify and determine a predetermined configuration characteristic of the detected end effector from the data.
PCT/US2024/048963 2023-09-29 2024-09-27 Workpiece transport apparatus with integrated vision end effector detection Pending WO2025072750A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363586853P 2023-09-29 2023-09-29
US63/586,853 2023-09-29

Publications (1)

Publication Number Publication Date
WO2025072750A1 (fr) 2025-04-03

Family

ID=95202264

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/048963 Pending WO2025072750A1 (fr) 2023-09-29 2024-09-27 Workpiece transport apparatus with integrated vision end effector detection

Country Status (1)

Country Link
WO (1) WO2025072750A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180215046A1 (en) * 2015-03-31 2018-08-02 Autonetworks Technologies, Ltd. Image acquisition system for wire group processing
WO2020009656A1 (fr) * 2018-07-06 2020-01-09 Psa International Pte Ltd Apparatus and method for placing cones on and/or removing cones from a container
US20220266454A1 (en) * 2019-04-12 2022-08-25 Nikon Corporation Robot system, end effector system, end effector unit, and adapter
WO2020243631A1 (fr) * 2019-05-30 2020-12-03 Icahn School Of Medicine At Mount Sinai Robot mounted camera registration and tracking system for orthopedic and neurological surgery
WO2023068064A1 (fr) * 2021-10-21 2023-04-27 倉敷紡績株式会社 Method for connecting a board-to-board connector

Similar Documents

Publication Publication Date Title
US11764093B2 (en) Automatic wafer centering method and apparatus
US11908721B2 (en) Tool auto-teach method and apparatus
US11776834B2 (en) On the fly automatic wafer centering method and apparatus
US20240153794A1 (en) Robot embedded vision apparatus
US8892248B2 (en) Manipulator auto-teach and position correction system
US20230343626A1 (en) Automated Teach Apparatus For Robotic Systems And Method Therefor
US20150166273A1 (en) Workpiece holder for workpiece transport apparatus
WO2007008939A2 (fr) Apparatus with on-the-fly workpiece centering
CN114758975A (zh) On-the-fly automatic wafer centering method and apparatus
WO2025072750A1 (fr) Workpiece transport apparatus with integrated vision end effector detection
TWI846916B (zh) Substrate transport apparatus and method of operating a substrate transport apparatus
TW202219460A (zh) Automated teach apparatus for robotic systems and method therefor

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24873721

Country of ref document: EP

Kind code of ref document: A1