WO2019096479A1 - Method and means for operating a robot arrangement - Google Patents
Method and means for operating a robot arrangement
- Publication number
- WO2019096479A1 (PCT/EP2018/076216)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- contour
- environmental
- detected
- robot arm
- Prior art date
- Legal status (assumed, not a legal conclusion)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- B25J9/1676—Avoiding collision or forbidden zones
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40548—Compare measured distances to obstacle with model of environment
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40609—Camera to monitor end effector as well as object to be handled
Definitions
- The present invention relates to a method and a means for operating a robot arrangement having at least one robot arm, to a robot arrangement having such a means, and to a computer program product for carrying out the method.
- The object of the present invention is to improve the operation of a robot arrangement. This object is achieved by a method having the features of claim 1.
- Claims 9 to 11 provide a means for operating a robot arrangement according to a method described herein, a robot arrangement having a means described herein, and a computer program product for performing a method described herein.
- The subclaims relate to advantageous developments.
- A robot arrangement according to the present invention has one or more robot arms, which in one embodiment each have one or more joints or (motion) axes, in particular at least three, in one embodiment at least six, in particular at least seven, in particular rotary joints or axes, and/or drives for moving the robot arm, in particular for adjusting its joints.
- In one embodiment, the robot arrangement, in particular the robot arm, can thereby advantageously be operated on the basis of the environment model, in particular be moved autonomously in a collision-risk-reducing manner.
- One or more of the robot arm(s) may be mobile in one embodiment (relative to the environment of the robot arrangement), in particular arranged (in each case or together) on a mobile, in particular drivable, platform.
- The real environment of the robot arrangement may in particular be an environment which this mobile robot arrangement or platform occupies or approaches. The mobile robot arrangement or platform may in the meantime also move away from this position, the method being carried out on the basis of or at one or more positions of the mobile robot arrangement or platform.
- A free space for a robot arm in the sense of the present invention may in one embodiment be a one-sided or multi-sided, in particular all-sided, limited Cartesian or joint coordinate space within which the robot arm, in particular when the platform is stationary, can or may move.
- An environment model in the sense of the present invention describes, in particular approximates, in one embodiment the real environment of the robot arrangement in mathematical, in particular numerical, form, in particular in the form of surfaces and/or three-dimensional object models or corresponding data.
- The environmental contour is captured contactlessly, in particular optically, and/or with the aid of at least one sensor, in particular a 2D or 3D sensor, in particular a 2D or 3D camera, in one embodiment using exactly or only one such sensor, in particular three-dimensionally.
- A 2D sensor is moved in one embodiment for, or while, detecting the environmental contour.
- In one embodiment, the environmental contour is detected by means of at least one, in particular moved, 2D sensor by calculating the three-dimensional environmental contour from the environment captured with the aid of the sensor, in particular using a structure-from-motion method known per se.
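The patent does not disclose an algorithm for this step. As an illustrative sketch only, a structure-from-motion pipeline ultimately triangulates 3D points from a 2D camera seen at several poses; the following minimal numpy example (the pinhole geometry, known camera motion, and all names are assumptions, not part of the disclosure) shows linear (DLT) triangulation of one point from two views:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two 2D views
    taken by a moving camera with known projection matrices P1, P2."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous solution: right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Pinhole projection of a 3D point to image coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Camera 1 at the origin; camera 2 translated 1 m along x (known motion).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
X_rec = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

In a real structure-from-motion system the camera poses themselves would first be estimated from image correspondences; here they are given to keep the sketch short.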
- By non-contact, in particular optical, detection, this can be carried out in one embodiment quickly and/or without obstruction of the robot arrangement; by a 3D sensor, in particular a 3D camera, in a particularly fast and/or compact manner; by a 2D sensor, in particular a 2D camera, in a particularly simple and/or inexpensive manner in one embodiment.
- The 3D sensor may image a predetermined pattern from a light source onto the real environment and detect its distortion, detect distances to the real environment via time-of-flight measurements of light (in one embodiment a so-called ToF camera, in particular a PMD sensor or the like), detect interferences between measuring and object beams, and/or, with the aid of (micro)lens arrays or the like, capture, in addition to a brightness of pixels, also a light direction of rays leading to a pixel, or be set up or used for this purpose.
- Thereby, the detection can in one embodiment be carried out particularly quickly and/or compactly.
- In one embodiment, the sensor, which is freely movable in space for this purpose, is guided manually (by a user) for or during detection, in particular pivoted; it communicates wirelessly or via at least one line with a processing means which determines the environment model and/or the free space on the basis of data detected or transmitted by the sensor.
- In one embodiment, the sensor is arranged on a mobile phone, in particular a smartphone, a tablet computer, or another handheld device such as a personal digital assistant ("PDA") or the like, in particular integrated into it or detachably attached to it.
- Smartphones or similar devices that are in use anyway can advantageously also be used for detecting the environmental contour. Communication via a line can reduce the risk of interference in one embodiment.
- In one embodiment, one or more robot-arm-fixed elements, in particular a robot-guided tool and/or the (entire) robot arm, are identified or localized in the detected environmental contour, in one embodiment using image processing, in particular image recognition; in particular, a pose of the element(s) in the detected environmental contour and/or relative to each other is determined.
- In one embodiment, a robot-guided sensor for detecting a or the three-dimensional fine contour of the real environment of the robot arrangement is likewise identified or localized in the detected environmental contour, in one embodiment using image processing, in particular image recognition.
- In one embodiment, a position of the robot arm, in particular a position of one or more, in one embodiment all, joints or axes of the robot arm, in which the environmental contour is or was detected, is detected, in a development using sensors at the joints and/or drives.
- In one embodiment, the detected position of the robot arm is transmitted wirelessly to a or the processing means for determining the environment model and/or free space.
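As a hedged illustration of how a pose can be computed from the joint angles reported by such sensors (the planar two-link arm and all names below are simplifying assumptions, not the patent's kinematics):

```python
import math

def planar_fk(link_lengths, joint_angles):
    """Flange pose (x, y, heading) of a planar serial arm, accumulated
    link by link from the joint angles measured at the axes."""
    x = y = theta = 0.0
    for length, q in zip(link_lengths, joint_angles):
        theta += q
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y, theta

# Two 1 m links, both joints at +90 degrees: the flange ends up at (-1, 1).
pose = planar_fk([1.0, 1.0], [math.pi / 2, math.pi / 2])
```

A real six- or seven-axis arm would use the same accumulation principle with full 3D homogeneous transforms per joint.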
- In one embodiment, the environment model and/or the free space is determined on the basis of or as a function of this identification or localization of the robot-arm-fixed element(s); in particular, the identified robot arm is eliminated from the detected environmental contour, in particular for determining the environment model.
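As an illustrative sketch of such an elimination (the distance-threshold approach and all names are assumptions, not the patent's method), contour points close to sample points of the localized robot arm can simply be discarded:

```python
import numpy as np

def remove_arm_points(contour, arm_points, radius):
    """Drop contour points lying within `radius` of any sample point of
    the identified robot arm, so the arm itself does not end up in the
    environment model."""
    d2 = ((contour[:, None, :] - arm_points[None, :, :]) ** 2).sum(axis=2)
    keep = d2.min(axis=1) > radius ** 2
    return contour[keep]

contour = np.array([[0.0, 0.0, 0.0],   # wall point, kept
                    [1.0, 0.0, 0.0],   # lies on the arm, removed
                    [5.0, 5.0, 5.0]])  # far obstacle, kept
arm = np.array([[0.9, 0.0, 0.0]])      # one sample of the localized arm
cleaned = remove_arm_points(contour, arm, radius=0.2)
```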
- In one embodiment, the detected environmental contour, the environment model and/or the free space are aligned, i.e. positioned and/or oriented, relative to a robot-arm-fixed, in particular robot-arm-base- or flange-fixed, reference, in particular a reference (coordinate) system, such that a deviation between the robot-arm-fixed element(s) identified in the detected environmental contour and their pose, which results from the detected position of the robot arm, is minimal.
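Minimizing such a deviation is, in essence, a least-squares rigid registration. A minimal sketch using the Kabsch algorithm (an assumed, illustrative choice; the patent does not prescribe a particular method) aligning two corresponding point sets:

```python
import numpy as np

def align_rigid(source, target):
    """Least-squares rigid transform (R, t) mapping `source` onto
    `target` (Kabsch algorithm), minimising ||R @ s_i + t - t_i||."""
    cs, ct = source.mean(axis=0), target.mean(axis=0)
    H = (source - cs).T @ (target - ct)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cs
    return R, t

# Synthetic check: rotate and shift a point cloud, then recover the pose.
rng = np.random.default_rng(0)
pts = rng.normal(size=(20, 3))
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.5])
moved = pts @ R_true.T + t_true
R_est, t_est = align_rigid(pts, moved)
```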
- In one embodiment, the method comprises the step of detecting a three-dimensional fine contour of the real environment of the robot arrangement after detecting the environmental contour, in one embodiment on the basis of the determined environment model and/or free space.
- In one embodiment, the robot arrangement is operated on the basis of the determined fine contour; in particular, an environmental model and/or, in one embodiment on the basis of this environmental model, a collision space for the robot arm is determined.
- In one embodiment, the environmental contour, in particular with the aid of the manually guided sensor, is thus first roughly detected and the environment model and/or free space determined on the basis of this (coarse) environmental contour; subsequently, the fine contour is detected with greater accuracy using the robot-guided sensor, the robot arrangement being operated for this purpose on the basis of the roughly determined environment model and/or free space.
- In one embodiment, the roughly determined environment model is modified, in particular refined, into the environmental model, and/or the roughly determined free space into the collision space.
- A collision space for a robot arm in the sense of the present invention may likewise, in one embodiment, be a one-sided or multi-sided, in particular all-sided, limited Cartesian or joint coordinate space within which the robot arm can or may move, in particular when the mobile platform is stationary, and which in one embodiment has a higher accuracy than the (roughly determined) free space. Correspondingly, an environmental model in the sense of the present invention describes, in particular approximates, the real environment of the robot arrangement in mathematical, in particular numerical, form, in particular in the form of surfaces and/or three-dimensional object models or corresponding data, and in one embodiment has a higher accuracy than the (roughly determined) environment model.
- This results in an advantageous, at least two-stage detection of the real environment of the robot arrangement.
- In one embodiment, the environment model and/or the free space, additionally or alternatively (also) the environmental model and/or the collision space, is determined (in each case) on the basis of geometric objects. In one embodiment, these geometric objects are given or predefined independently of a priori known objects in the vicinity of the robot arm and/or are identified in the detected environmental contour, in particular by means of pattern recognition.
- Thereby the detected environmental contour can advantageously be supplemented and/or corrected in one embodiment, and dispensing with the use of a priori known objects, in particular CAD models of such objects, can simplify and/or accelerate the method.
- In one embodiment, the free space is at least partially limited by the working space of the robot arm.
- In this way, areas of the environmental contour which the robot arm, in particular when the platform is stationary, cannot reach due to kinematic or structural restrictions can be excluded in the determination of the free space. Nevertheless, such areas can in one embodiment be visualized in the detected environmental contour or be contained in the determined environment model.
- In one embodiment, the robot arm is identified and its working space determined on the basis of this identification, in particular a specific working space associated with the robot arm is determined from a database, and this is used in the determination of the environment model, free space, environmental model and/or collision space, in particular the free space and/or collision space being limited by this specific working space of the robot arm, which may in particular be its working space with an immovable robot arm base or platform.
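As an illustrative sketch of limiting a (voxelized) free space by a robot-specific working space (the spherical workspace model and all names are simplifying assumptions, not the patent's method):

```python
import numpy as np

def clip_to_workspace(free_space, center, reach, origin, voxel_size):
    """Intersect a boolean free-space grid with the arm's reachable
    workspace, modelled here simply as a sphere of radius `reach`
    around the arm base (a stand-in for the real kinematic workspace)."""
    idx = np.indices(free_space.shape).reshape(3, -1).T
    centers = origin + (idx + 0.5) * voxel_size      # voxel center points
    reachable = ((centers - center) ** 2).sum(axis=1) <= reach ** 2
    return free_space & reachable.reshape(free_space.shape)

free = np.ones((4, 4, 4), dtype=bool)                # everything free so far
clipped = clip_to_workspace(free, center=np.zeros(3), reach=0.25,
                            origin=np.zeros(3), voxel_size=0.1)
```

A database-derived workspace, as described above, would replace the sphere with the arm-specific reachable volume.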
- Thereby, the environmental contour can be detected more coarsely in one embodiment; in particular, the number of detected points in the environment can be reduced.
- In one embodiment, the environment model and/or the free space, additionally or alternatively (also) the environmental model and/or the collision space, is determined (in each case) on the basis of a 3D occupancy grid, in particular on a voxel basis, which represents a particularly advantageous approximation.
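A 3D occupancy grid of this kind can be sketched in a few lines; the resolution, bounds, and names below are illustrative assumptions:

```python
import numpy as np

def occupancy_grid(points, origin, voxel_size, shape):
    """Boolean 3D occupancy grid: a voxel counts as occupied if at
    least one detected contour point falls into it; all other voxels
    remain free."""
    grid = np.zeros(shape, dtype=bool)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    grid[tuple(idx[inside].T)] = True
    return grid

points = np.array([[0.05, 0.05, 0.05],   # falls into voxel (0, 0, 0)
                   [0.95, 0.15, 0.05]])  # falls into voxel (9, 1, 0)
grid = occupancy_grid(points, origin=np.zeros(3),
                      voxel_size=0.1, shape=(10, 10, 10))
```

The free space would then be (a subset of) the unoccupied voxels, and the collision space its complement at finer resolution.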
- The real environment of the robot arrangement may comprise, in particular be, a workplace, in particular a robot cell; in one embodiment, the robot arm is arranged at the workplace when the environmental contour is detected, while in another embodiment the environmental contour is detected without the robot arm and the robot arm is then arranged at the workplace. Accordingly, a future environment (still to be taken up at the time of detection) is also referred to as a real environment of the robot arrangement in the sense of the present invention.
- In the first case, determining a pose of the robot-arm-fixed reference and the detected environmental contour relative to each other can be simplified and/or its precision and/or reliability improved.
- In the second case, detecting the environmental contour without the robot arm can simplify the detection.
- In one embodiment, an (operating) means for operating the robot arrangement is set up, in particular in hardware and/or software, in particular program technology, for carrying out a method described here and/or has:
- detection means for, in particular three-dimensional, detection of a three-dimensional environmental contour of a real environment of the robot arrangement; and
- processing means for determining an environment model on the basis of or as a function of this detected environmental contour and/or for determining a free space for the robot arm or one or more of the robot arms on the basis of or as a function of this detected environmental contour, in one embodiment on the basis of or as a function of this environment model.
- In one embodiment, the (operating) means and/or its detection means comprises means for non-contact, in particular optical, detection of the environmental contour. Additionally or alternatively, in one embodiment the detection means has at least one sensor, in particular a 2D or 3D sensor, in particular a 2D or 3D camera.
- In one embodiment, the sensor is set up for manual guidance, in particular pivoting, and/or for communication via at least one line or for wireless communication with the processing means, and/or is arranged on a handheld device, in particular a smartphone.
- In one embodiment, the (operating) means and/or its means comprises means for operating the robot arrangement, in particular for carrying out a working movement for performing a working process, a retraction movement from a fault pose and/or a search movement for detecting the fine contour using the robot-guided sensor, on the basis of the determined environment model and/or free space and/or the determined fine contour.
- A means in the sense of the present invention may be designed in hardware and/or software and may in particular comprise a, preferably digital, processing unit, in particular a microprocessor unit (CPU), data- or signal-connected to a memory and/or bus system, and/or one or more programs or program modules.
- The CPU may be configured to execute instructions implemented as a program stored in a memory system, to capture input signals from a data bus, and/or to deliver output signals to a data bus.
- a storage system may comprise one or more, in particular different, storage media, in particular optical, magnetic, solid state and / or other non-volatile media.
- The program may be designed such that it embodies or is capable of executing the methods described herein, so that the CPU can carry out the steps of such methods and can thus, in particular, operate the robot arrangement.
- In one embodiment, a computer program product may comprise, in particular be, a, in particular non-volatile, storage medium for storing a program or with a program stored thereon, execution of this program causing a system or a controller, in particular a computer, to carry out a method described herein or one or more of its steps.
- In one embodiment, the detection of a three-dimensional environmental contour comprises the detection of a plurality of mutually offset three-dimensional partial environmental contours of the real environment and their joining into the detected (overall) environmental contour.
- In one embodiment, the robot arrangement or a part of the robot arrangement is localized, in particular identified as described here, in the detected partial environmental contours, and the partial environmental contours are then assembled into the overall environmental contour on the basis of these localizations, in particular (in each case) transformed into a or the common reference (coordinate) system, the (respective) transformation resulting from the localization of the robot arrangement in the (respective) partial environmental contour.
- In other words, the partial environmental contours are put together such that the robot arrangements localized in them coincide.
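A minimal sketch of such an assembly (the rigid-transform representation and all names are illustrative assumptions): each partial contour is transformed into the common base frame using the pose obtained from localizing the robot arrangement in that scan, and the transformed clouds are concatenated:

```python
import numpy as np

def merge_partial_contours(scans):
    """Join partial contours: each scan comes with the rigid transform
    (R, t) obtained from localising the robot arm in that scan, mapping
    scan coordinates into the common robot-base frame."""
    merged = [pts @ R.T + t for pts, (R, t) in scans]
    return np.vstack(merged)

# Two synthetic scans of the same wall segment from offset sensor poses.
wall = np.array([[2.0, 0.0, 0.0], [2.0, 1.0, 0.0]])
R1, t1 = np.eye(3), np.zeros(3)                # scan 1 already in base frame
R2, t2 = np.eye(3), np.array([0.0, 0.0, 0.5])  # scan 2 offset per localisation
scan2 = wall - t2                              # what the offset sensor sees
merged = merge_partial_contours([(wall, (R1, t1)), (scan2, (R2, t2))])
```

After merging, both copies of the wall land on the same coordinates, which is exactly the "robot arrangements coincide" criterion above.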
- Likewise, the detection of a three-dimensional fine contour can comprise the detection of a plurality of three-dimensional partial fine contours of the real environment and their joining into the (overall) fine contour.
- In one embodiment, the robot arrangement or a part of the robot arrangement is localized, in particular identified as described here, in the detected partial fine contours, and the partial fine contours are then combined into the overall fine contour on the basis of these localizations, in particular (in each case) transformed into a or the common reference (coordinate) system.
- Correspondingly, the determination of an environment or environmental model can comprise the determination of several partial environment or environmental models and their joining into the (overall) environment or environmental model.
- In one embodiment, the robot arrangement or a part of the robot arrangement is localized, in particular identified as described here, in the partial environment or environmental models, and the partial models are then joined on the basis of these localizations.
- FIG. 2 shows a method for operating the robot arrangement according to FIG. 1.
- FIG. 1 shows a robot arrangement with a six-axis robot arm 10, which is arranged in a robot cell 100 that is delimited by cell walls, for example fences 21.
- In the robot cell, two objects 22, for example tables, shelves, machine tools, conveyors or the like, are arranged by way of example.
- A robot controller 11 carries out a method of operating the robot arrangement according to an embodiment of the present invention, which will be explained below with reference to FIG. 2.
- In a step S10, a three-dimensional environmental contour 110 of the real environment in the form of the robot cell 100 is detected by means of a 3D camera of a smartphone 30 guided manually by a user, who pivots it for this purpose, the smartphone transmitting the corresponding data wirelessly to the robot controller 11.
- Alternatively, a 2D camera can also be used and the three-dimensional environmental contour calculated, for example, by means of a so-called "structure from motion" method.
- The detected environmental contour 110 is indicated in dash-dotted lines in Fig. 1, a deviation of this roughly detected environmental contour 110 from the real environment 100 being indicated schematically and in exaggerated form in that the detected contour 110 is shown partially inside and partially outside the real environment 100.
- In a step S20, the robot controller 11 identifies the robot arm 10 in the detected environmental contour 110, which is indicated cross-hatched in FIG. 1. In addition, in step S20, it detects the position of the robot arm on the basis of its detected joint or axis angles.
- In a step S30, the robot controller determines an environment model on the basis of the detected environmental contour 110, the robot arm 10 identified therein and its detected position. In this environment model, a cuboid 120 is indicated hatched by way of example in Fig. 1.
- In a step S40, the robot controller 11 aligns the environment model or the detected environmental contour 110 relative to a robot-arm-base-fixed coordinate system, with respect to which the position of the robot arm 10 has been detected on the basis of its detected joint or axis angles, such that the pose of the robot arm 10 identified in the detected environmental contour 110 deviates as little as possible from the pose detected on the basis of its joint or axis angles.
- For example, the robot controller 11 can rotate the environment model or the detected environmental contour 110 about an axis perpendicular to the drawing plane of FIG. 1 and/or shift it in the drawing plane until the orientation or position of the rocker and the hand of the identified robot arm matches as well as possible the position of rocker and hand according to the detected joint or axis angles.
- The robot controller 11 then roughly determines, on the basis of this roughly determined environment model, which in turn has been determined on the basis of the coarsely detected environmental contour 110, a free space 200 whose limit 210 is indicated by dashed lines in Fig. 1. As can be seen, this free space 200 is also limited by the constructively maximum possible working space 300 of the robot arm, which is indicated in dotted lines in FIG. 1.
- In a step S50, the robot controller 11 then moves the robot arm 10 within this roughly determined free space 200 in order to precisely detect a fine contour 130 of the environment 100 with a robot-guided 3D camera 13; within the drawing accuracy of FIG. 1, this fine contour coincides with the environment 100 and is therefore not separately recognizable in Fig. 1.
- In a step S60, the robot controller 11 can now, analogously to steps S30, S40, determine a (more accurate) environmental model and/or a (more accurate) collision space for the robot arm 10 on the basis of the detected fine contour 130.
- Step S60 can also be omitted, and the robot controller 11 can in step S50, for example, carry out a retraction movement of the robot arm 10 from a fault pose on the basis of the roughly determined environment model and/or free space.
- A partial environmental contour can in each case be detected several times, the robot arm being localized in each of them, these mutually offset partial environmental contours then being joined into the or an overall environmental contour in accordance with the robot arm respectively localized therein.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The invention relates to a method for operating a robot arrangement having at least one robot arm (10), comprising the steps of: detecting (S10) a three-dimensional contour (110) of a real environment (100) of the robot arrangement; determining (S30) an environment model and/or a free space (200) for the robot arm on the basis of the detected environmental contour; and operating (S50) the robot arrangement on the basis of the determined environment model and/or free space.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102017010718.1A DE102017010718A1 (de) | 2017-11-17 | 2017-11-17 | Verfahren und Mittel zum Betreiben einer Roboteranordnung |
| DE102017010718.1 | 2017-11-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019096479A1 (fr) | 2019-05-23 |
Family
ID=63720677
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2018/076216 Ceased WO2019096479A1 (fr) | 2017-11-17 | 2018-09-27 | Procédé et moyen pour faire fonctionner un système robotique |
Country Status (2)
| Country | Link |
|---|---|
| DE (1) | DE102017010718A1 (fr) |
| WO (1) | WO2019096479A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112171671A (zh) * | 2020-09-23 | 2021-01-05 | 中国铁建重工集团股份有限公司 | 工程车辆、机械臂与柔性管路的干涉检测方法及系统 |
| CN114390963A (zh) * | 2019-09-06 | 2022-04-22 | 罗伯特·博世有限公司 | 用于工业机器人的标定方法及装置、三维环境建模方法及设备、计算机存储介质以及工业机器人操作平台 |
| CN116745083A (zh) * | 2020-12-21 | 2023-09-12 | 发那科株式会社 | 安全视觉装置以及安全视觉系统 |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE202019003026U1 (de) * | 2019-07-19 | 2019-08-12 | lndustrie-Partner GmbH Radebeul-Coswig | Roboterzelle zum Einsatz an Werkzeugmaschinen und/oder Montageautomaten |
| EP3865257A1 (fr) | 2020-02-11 | 2021-08-18 | Ingenieurbüro Hannweber GmbH | Équipement et procédés de surveillance et de commande d'un système de travail technique |
| WO2023174876A1 (fr) * | 2022-03-15 | 2023-09-21 | Kuka Deutschland Gmbh | Vérification d'un trajet prédéfini d'un robot |
| DE102023200928A1 (de) * | 2023-02-06 | 2024-08-08 | Kuka Deutschland Gmbh | Ermittlung eines von einem Roboter nutzbaren oder für einen Roboter gesperrten Raums |
| DE102023118992B3 (de) | 2023-07-18 | 2024-08-29 | Hochschule Bielefeld, Körperschaft des Öffentlichen Rechts | System und Verfahren zur Eingabe von virtuellen Objekten in drei Dimensionen mittels Augmented Reality zum Ermitteln des Bewegungsraums für Roboter |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2006084385A1 (fr) * | 2005-02-11 | 2006-08-17 | Macdonald Dettwiler & Associates Inc. | Systeme d'imagerie en 3d |
| DE102006005958A1 (de) * | 2006-02-08 | 2007-08-16 | Kuka Roboter Gmbh | Verfahren zum Erzeugen eines Umgebungsbildes |
| DE102007007576A1 (de) * | 2007-02-15 | 2008-08-21 | Kuka Roboter Gmbh | Verfahren und Vorrichtung zum Sichern eines Arbeitsraums |
| WO2017121642A1 (fr) * | 2016-01-15 | 2017-07-20 | Kuka Roboter Gmbh | Système robotique doté d'un ordinateur de poche |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE502006009264D1 (de) * | 2006-09-30 | 2011-05-19 | Abb Technology Ag | Verfahren und System zur Auslegung und Überprüfung von Sicherheitsbereichen eines Roboters |
| EP2048557B1 (fr) * | 2007-10-11 | 2013-03-27 | Sick Ag | Capteur optoélectronique et dispositif mobile ainsi que son procédé de configuration |
| JP6123307B2 (ja) * | 2013-01-23 | 2017-05-10 | 株式会社デンソーウェーブ | ロボット周辺への物体の侵入を監視する監視システムおよび監視方法 |
| DE202013104264U1 (de) * | 2013-09-18 | 2015-01-09 | Daimler Ag | Arbeitsstation |
| DE102013110901B4 (de) * | 2013-10-01 | 2022-11-10 | Mercedes-Benz Group AG | MRK Planungstechnologie |
| DE202014100411U1 (de) * | 2014-01-30 | 2015-05-05 | Kuka Systems Gmbh | Sicherheitseinrichtung |
- 2017-11-17: DE application DE102017010718.1A filed (DE102017010718A1, ceased)
- 2018-09-27: PCT application PCT/EP2018/076216 filed (WO2019096479A1, ceased)
Non-Patent Citations (1)
| Title |
|---|
| OLESYA OGORODNIKOVA: "Human Robot Interaction: The Safety Challenge (An integrated frame work for human safety)", 10 January 2010 (2010-01-10), pages 1 - 114, XP055177445, Retrieved from the Internet <URL:http://www.omikk.bme.hu/collections/phd/Gepeszmernoki_Kar/2010/Ogorodnikova_Olesya/ertekezes.pdf> [retrieved on 20150318] * |
Also Published As
| Publication number | Publication date |
|---|---|
| DE102017010718A1 (de) | 2019-05-23 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18780079; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18780079; Country of ref document: EP; Kind code of ref document: A1 |