WO1998051452A1 - Enabling process control technology for automated dry media depaint system
- Publication number: WO1998051452A1 (application PCT/CA1998/000464)
- Authority: WIPO (PCT)
- Prior art keywords
- effector
- video
- quality
- information
- stripping
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B24—GRINDING; POLISHING
- B24C—ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
- B24C1/00—Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods
- B24C1/08—Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods for polishing surfaces, e.g. smoothing a surface by making use of liquid-borne abrasives
- B24C1/086—Descaling; Removing coating films
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B24—GRINDING; POLISHING
- B24C—ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
- B24C3/00—Abrasive blasting machines or devices; Plants
- B24C3/02—Abrasive blasting machines or devices; Plants characterised by the arrangement of the component assemblies with respect to each other
- B24C3/06—Abrasive blasting machines or devices; Plants characterised by the arrangement of the component assemblies with respect to each other movable; portable
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37572—Camera, tv, vision
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40613—Camera, laser scanner on end effector, hand eye manipulator, local
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45071—Aircraft, airplane, ship cleaning manipulator, paint stripping
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/50—Machine tool, machine tool null till machine tool work handling
- G05B2219/50353—Tool, probe inclination, orientation to surface, posture, attitude
Definitions
- This invention relates to automated dry-media blasting coating removal systems, more particularly to the automated starch media dry-stripping (SMDS) or the plastic media blasting (PMB) processes, and to real-time computer-vision controllers for guiding and controlling the depainting nozzle(s) for optimal coating removal or stripping performance.
- SMDS automated starch media dry-stripping
- PMB plastic media blasting
- the blasting nozzle used for projecting the high-speed dry medium (e.g. the wheat starch medium) on the surface to be depainted may have a flat rectangular section rather than a circular one, in order to ensure efficient and uniform coating removal over the entire surface to be treated. Since such a nozzle must be manipulated at a regulated and controlled speed and at a particular distance and incidence angle with respect to the surface to be depainted, automatic controllers for holding, moving and guiding the blasting nozzle are required.
- a computer-vision controller for guiding the nozzle in real time on the surface to be depainted, using image information of that surface collected through at least one video camera installed on the nozzle carrier, hereinafter called the end-effector.
- the computer-vision system uses an edge-tracking method for determining the linear edge between the painted area and the depainted area in order to avoid depainting the same surface twice. Image color acquired by the cameras is used for assessing the paint stripping level for optimizing the traverse speed for the end-effector to obtain the desired depaint results.
- Another object of the computer-vision controller according to the present invention is to provide depainting quality information via the same camera.
- the control system is a closed-loop system so that the nozzle direction and speed may be adjusted in real time depending on the information provided by the cameras.
- the main components of this automated system are the carrier, the Process Equipment Trailer (PET), the operator control station, and a robotic manipulator.
- the robot's arm comprises a travel beam, a 4-degree-of-freedom serial link manipulator, and an end effector.
- the 4-degree-of-freedom manipulator mounted to the travel beam provides the vertical movement and the compliance of the end effector to the aircraft surface.
- the travel beam mounted on the carrier provides lateral movement for the robot's arm/end effector.
- the end effector contains the blasting nozzle and the vacuum hose. The blasting nozzle delivers the stripping media and the vacuum hose recovers the used media and paint. Sensors mounted on the end effector are used to guide the robot in real-time through its desired stripping path.
- the camera defines the acquisition system of the vision control system developed by the inventor for the automated depaint system.
- This vision system is a crucial part of the automation process as it provides the capability of edge tracking and quality control during the coating removal process.
- the edge tracking is used for strip-trace overlap control for step-down control at the end of a trace.
- Quality control is mainly used to control the degree of depainting.
- One of the main requirements which has been addressed in the vision system is the capability to remove coatings selectively, i.e. the removal of the top coat while leaving the primer.
- an automated coating removal system for treating and stripping painted surfaces, said system comprising: a coat removal end-effector for treating and stripping painted surfaces; sensor means attached to said end-effector for acquiring surface information to be used for controlling and driving said end-effector along a particular path; blasting nozzle means for blasting media particles onto said painted surface, said blasting nozzle means being attached to said coating removal end-effector; vision controller means for analyzing said surface information and providing a control signal for controlling and driving said end-effector along said path; and robot means for driving said end-effector along said path according to said control signal provided by said vision controller means.
- Yet another object of the invention is to provide an apparatus for processing a series of video image signals from a video camera to generate a trajectory control signal for an at least partially automated paint stripping blast end effector robotic system, said apparatus comprising: means for mounting said camera forward of at least one blast nozzle of said end effector; means for analyzing said image signals to obtain a position signal of a substantially linear boundary between a stripped and an unstripped portion of a painted surface; and means for generating said trajectory signal from said position signal, whereby said robotic system tracks said boundary.
- An apparatus for processing a series of video image signals from a video camera and for determining a quality of a stripped portion of a painted surface, said video image signals comprising an intensity component and a color or hue component, said apparatus comprising: means for comparing a contrast quality of said intensity component and said color component of at least one of said image signals to determine whether said quality is best determined by said intensity component or by said color component and to output an image type control signal; and means for selectively analyzing one of said intensity component and said color component of said video signals in response to said image type control signal to obtain a quality value of said depainted portion.
- FIG. 1 represents a detailed view of the end- effector
- FIG. 2 shows the hardware block diagram of the vision system
- FIG. 3 represents the software block diagram of the vision system
- FIG. 4 shows the strip-trace overlap definition as used in the present specification
- FIG. 5.A shows the intensity contrast between an aluminum panel and twelve samples of the most common colors used in the aircraft industry
- FIG. 5.B shows the color contrast between an aluminum panel and twelve samples of the most common colors used in the aircraft industry
- FIG. 6 shows the first step in the edge detection, which is the three-dimensional display of the acquired image color intensity level
- FIG. 7 shows the gradient and filtering step in the edge detection
- FIG. 8 shows the gradient selection during the edge detection process
- FIG. 9 shows the final edge representation in the XY plane
- FIG. 10 represents the color and intensity contrast levels for twelve overexposed top coats
- FIG. 11 represents the color and intensity contrast levels for twelve underexposed top coats
- FIG. 12 shows the best situations for detecting the edge for twelve samples of surface
- FIG. 13 represents the processing time as a function of the processed area
- FIG. 14 shows the block diagram related to the color detection and quality control feature according to the preferred embodiment of the invention.
- FIG. 15 shows the block diagram related to the edge tracking according to the preferred embodiment of the invention.
- This invention discloses an automated SMDS vision controller, also called hereinafter an automated coating removal system, capable of satisfying both of the above requirements even when used in the highly constrained environment specific to that field of activity.
- the automated coating removal system disclosed by the present invention may comprise a carrier for carrying a Process Equipment Trailer (PET) on which a robot arm is mounted.
- the robot arm comprises an end-effector at its free extremity which is responsible for the effective paint stripping activity.
- the present invention relates specifically to the computerized vision controller which is used for driving the end-effector.
- the vision controller ensures that the end-effector is driven through the right path so that overlap between consecutive traces is avoided. It also performs depaint quality control by analyzing the color of the stripped trace so that the speed of the end-effector may be adjusted to obtain the desired coat removal and stripping quality.
- Figure 1 illustrates the end-effector 10 during a typical stripping operation.
- the end effector 10 which is mounted on the robot arm (not shown) is moved horizontally across the surface to be stripped 22. During the depaint process, the end effector 10 moves from left to right, then steps down at the end of the trace, and goes back from right to left.
- the dark zone in Figure 1 is the stripped zone and the light zone is the painted one.
- the blasting nozzle means may comprise two nozzles 16 and 18, located on the right side of the end effector 10, which project wheat starch. The mixture's residues are collected by the vacuum hose 20 on the left side of the end effector 10.
- To be able to strip a second trace located under the first one, the vision controller must determine the real position of the lateral edge. This data is used by the robot's controller to ensure minimum overlap between traces.
- the same sensors and the same algorithms are used for vertical edge detection, lateral edge detection and starting position identification. Hence, this disclosure focuses on lateral edge detection.
- the image acquisition part of the system, which is used to detect the color edge, may be composed of sensor means, such as two micro cameras 12 and 14 located at each end of the end-effector 10. Automatic switching between cameras 12 and 14 is performed based on the travel direction, but a single camera may be used as well.
- the second problem addressed by the vision controller is quality control.
- the stripping quality is a non-linear function of the robot's speed.
- the vision system processes images acquired from the aircraft surface in order to define the speed required to maintain the quality of stripping.
- Typically, for most aircraft, there are three different layers of paint: the topcoat; the primer, which is used to increase the adhesion of the paint to the surface; and a chemical protection layer used against corrosion.
- complete stripping refers to the removal of the top coat and the primer.
- Selective stripping refers to the removal of the top coat only.
- quality control means the quantification of the level of primer and substrate seen by the camera. From these levels, an appropriate analysis is performed to determine if the surface is perfectly stripped. For example, in the case of complete stripping, quality control defines whether both the top coat and the primer are removed. The quality information generated is used to determine the stripping speed that leads to the desired performance.
- Figure 2 illustrates the vision system's configuration. Shown in this figure is the image acquisition system composed of the two cameras 12 and 14, the vision processing unit 24 composed of a vision station 26, a server 28 and a robot controller 30, and the link between the vision station 26 and the robot motion controller 30. Positioning and quality data generated by the vision station 26 is sent to the controller 30, where it is used to guide the end-effector 10 for edge tracking, and for stripping quality control.
- the acquisition system shown in Figure 2 is composed of two cameras 12 and 14 used for edge tracking and quality control, and the lighting system for each camera.
- a spectral study is performed on different aircraft surfaces and on different colors representing topcoat and primer.
- Cameras and the light source are housed in a box which is optimally designed to take into account the viewing distance, the position, as well as the size and weight constraints.
- polarized filters may be used.
- Each picture acquired by the camera shows a part of the stripped zone, and a part of the unstripped zone. The picture is then sent to the processing unit 24 for processing.
- the processing unit 24 is a PC-based platform. As processing time is a major constraint, some dedicated image processing boards are used to perform pipeline processing. Only a part of the image processing is done on these boards. The analysis and other specific tasks performed with the image are done at the host level. The resulting data from the analysis may be sent to the robot's arm motion controller 30 through an Ethernet link.
- the first step of the process is related to the digitization of the analog signal for both the quality and the lateral edge tracking tasks.
- These tasks may be performed by two 24-bit RGB frame grabbers 32, each one receiving one of the two interlaced signals 34 or 36 from one of the two cameras 12 or 14.
- digital conversion is performed on the data by the A/D converters 38 and 40.
- Different image processing algorithms are applied on the data which is then sent from the dedicated hardware module 42 to the host 44.
- the edge coordinates and the quality information are deduced from this analysis.
- the host 44 is also used for operator data display and for validation of the data sent to the motion controller 30.
- Edge tracking is used to generate the necessary data in real-time in order to guide the robot during the coat removal and stripping process, and thus minimize overlaps between consecutive traces.
- Edge detection is also used between traces when the robot steps down, and at the beginning of the stripping for each zone. Edge detection during the step-down phase ensures that vertical edges between traces are aligned. Edge detection during the starting phase ensures that the stripping starts at the desired position and hence overlaps between zones are controlled.
- Edge detection is useful to avoid a positive or negative overlap.
- Figure 4 shows the difference between these two overlaps.
- a positive overlap is defined by an unstripped zone located between two consecutive traces.
- a negative overlap is defined by a zone stripped twice at the junction of the two traces.
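The two overlap cases above can be expressed with a simple sign convention. The following is a hypothetical sketch; the coordinate names and the downward y-axis are illustrative assumptions, not taken from the patent:

```python
def trace_overlap(prev_trace_bottom_y, new_trace_top_y):
    """Signed overlap between two consecutive traces (y grows downward).

    A positive value means an unstripped band was left between the traces
    (positive overlap); a negative value means the new trace overran the
    previous one, stripping a band twice (negative overlap); zero is ideal.
    """
    return new_trace_top_y - prev_trace_bottom_y
```

For example, a previous trace ending at y = 100 and a new trace starting at y = 104 leaves a 4-pixel unstripped band, which the edge-tracking correction would drive toward zero.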
- the edge being detected comes from a difference in contrast between two regions of the acquired image.
- the easiest way to detect this contrast is to use the intensity contrast level reflected by the aircraft surface.
- the contrast in the color scale may be used as an alternate parameter for detecting the edge.
- One of the main features of the present stripping controller is its capability of choosing between the color information and the intensity information and using the chosen information for edge detection (the same choice is performed for quality control).
- the two colors of the stripped and the unstripped portions respectively are compared using comparing means of the coat removal system.
- the same procedure is applied to the intensity of the two portions, and the best contrast for this particular surface is chosen to be employed for further coat removal.
- the system continues controlling the blasting end-effector using the chosen information, in order to find the position value of the linear boundary between the two portions.
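A minimal sketch of this intensity-versus-color selection, assuming the two portions are sampled as small RGB patches; the feature definitions (mean brightness, chromaticity distance) are illustrative assumptions, not the patent's own formulas:

```python
import numpy as np

def select_edge_channel(stripped, unstripped):
    """Return ("intensity", contrast) or ("color", contrast), whichever
    channel separates the stripped and unstripped patches better.

    stripped / unstripped: (H, W, 3) arrays of RGB values.
    """
    s = stripped.reshape(-1, 3).mean(axis=0)
    u = unstripped.reshape(-1, 3).mean(axis=0)
    # intensity contrast: difference of mean brightness
    c_int = abs(s.mean() - u.mean())
    # color contrast: distance between chromaticities (brightness removed),
    # rescaled to the same 0-255 range as the intensity contrast
    c_col = np.linalg.norm(s / (s.sum() + 1e-9) - u / (u.sum() + 1e-9)) * 255.0
    if c_int >= c_col:
        return "intensity", float(c_int)
    return "color", float(c_col)
```

This mirrors the Figure 5 situation: a non-reflective light blue topcoat against bare aluminum has near-zero intensity contrast but a clear color contrast, so the color channel would be selected.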
- Figures 5.A and 5.B show the average difference in intensity and in color between a stripped aluminum panel and panels covered with the most common colors used by the aircraft industry.
- Figure 5.A shows the intensity contrast between an aluminum panel (light column) and different classical topcoats (dark column). As illustrated in this figure, detection becomes possible when the difference between two columns reaches fifty or more. For example, sample number six illustrates a non-reflective light blue sample compared to an aluminum sample stripped of its paint. As shown in Figure 5.A, there is no contrast in intensity. However, Figure 5.B shows that the color information is better suited for detecting the edge.
- the detection of a line between two consecutive traces of the end-effector 10 is referred to as edge detection or detection of a substantially linear boundary.
- the processing apparatus processes a series of video images acquired by the video cameras 12 and 14 in order to determine that boundary between the stripped and the unstripped portions.
- Figures 6 to 9 show the same picture acquired by one of the cameras 12 or 14 at different steps of the processing phase. These figures show the three dimensional plot of intensity level versus location in the image (x,y) plane.
- Figure 6 shows a separation between a substrate and classical paint. The goal is to isolate the junction between the two regions.
- Figure 7 represents the gradient taken on the previous image after filtering.
- Figure 8 shows highest gradient selection and, finally, Figure 9 shows the final edge representation in the XY plane. That isolated line can be analyzed to determine if it is really an edge.
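The four steps of Figures 6 to 9 (display, filtering and gradient, highest-gradient selection, edge in the XY plane) can be sketched as follows. The kernel size and the final median step are assumptions for illustration, not details from the patent:

```python
import numpy as np

def detect_lateral_edge(img):
    """Locate a roughly vertical stripped/unstripped boundary in a 2-D
    intensity image; returns per-row edge columns and one edge position.
    """
    # Step 1 (Fig. 7, filtering): low-pass each row with a moving average;
    # edge-replicated padding avoids spurious gradients at the image borders
    pad = 2
    padded = np.pad(img, ((0, 0), (pad, pad)), mode="edge")
    k = np.ones(2 * pad + 1) / (2 * pad + 1)
    smooth = np.apply_along_axis(
        lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    # Step 2 (Fig. 7, gradient): horizontal gradient magnitude,
    # since the lateral edge is roughly vertical
    grad = np.abs(np.diff(smooth, axis=1))
    # Step 3 (Fig. 8): keep the highest-gradient column in each row
    cols = grad.argmax(axis=1)
    # Step 4 (Fig. 9): the (row, col) points trace the edge in the XY plane;
    # the median rejects outlier rows and gives a single edge position
    return cols, int(np.median(cols))
```

On a synthetic image with a sharp step between a painted and a stripped band, the returned position falls on the step, which is the datum the robot controller needs for trace alignment.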
- Figure 15 represents the high level software diagram for the edge detection.
- the light source level used has a large impact on the robustness of the system. Reflective or non-
- Figure 11 shows a light underexposure situation. It may be observed that for this underexposed situation, samples two to eight lose their intensity contrast. But the color contrast is still useful.
- Figure 12 presents the same information differently.
- Each axis shows a sample with the color or intensity contrast level for one optimal light level.
- the goal from an algorithm point of view, is to stay on the perimeter of the total surface drawn by the two curves in order to avoid any situation where no edge detection would be possible.
- the little circle shown in the center illustrates the dangerous zone that must be avoided.
- Quality control is based on color detection.
- In order to detect a quality variation, the vision system 24 must learn different color mixtures. The learning process is done during the calibration period.
- In order to perform a quick and safe calibration, the end-effector 10 is commanded to strip twenty inches of surface using a constant acceleration. Hence, the trace shows a constant variation of quality. For example, for an aluminum panel, we will see a progression from completely stripped aluminum to the topcoat. Then, this sample will be used to teach the vision system a minimum of ten variations along the trace. Using these ten variations, a mathematical model is built. During the real stripping process, the real picture is compared with the model in order to determine the level of primer or aluminum seen on the surface.
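The calibration step can be sketched as a nearest-neighbour model; the mean-RGB feature and the matching rule are hypothetical stand-ins for the patent's unspecified mathematical model:

```python
import numpy as np

class StrippingQualityModel:
    """Patches sampled along the constant-acceleration calibration trace are
    stored with their known stripping level (0.0 = intact topcoat,
    1.0 = fully stripped aluminum); live patches match the closest one.
    """
    def __init__(self):
        self._levels = []
        self._features = []

    def learn(self, patch, level):
        # feature: mean RGB of the calibration patch
        self._features.append(patch.reshape(-1, 3).mean(axis=0))
        self._levels.append(level)

    def estimate(self, patch):
        # compare the live picture with the model: closest learned patch wins
        f = patch.reshape(-1, 3).mean(axis=0)
        dists = [np.linalg.norm(f - g) for g in self._features]
        return self._levels[int(np.argmin(dists))]
```

In use, at least ten variations along the calibration trace would be fed to `learn`, after which `estimate` reports the stripping level seen by the camera during the real process.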
- the system uses the same selection between color information and intensity information of the two portions, respectively from the stripped portion and the unstripped portion of the surface to be depainted. This procedure has been described in greater detail in the previous section. It allows the selection of the best information, either the color or the intensity, provided by the surface to the vision controller, which uses it to compute and control the coat removal quality of the blasting nozzles by adjusting the speed of the end-effector.
- the simplified quality control software diagram is shown in Figure 14.
- an image of the stripped surface is acquired by one of the cameras 12 or 14, depending on the direction of stripping.
- the image is sent through cables to the vision station 26, where it is converted from analog to digital by the A/D converter 40.
- the intensity contrast or the color contrast is digitized and is compared to a series of samples previously recorded by the system.
- the best match is found and is used to adjust the speed of the end-effector, controlling at the same time the quality of the stripping process performed on the aircraft surface.
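The resulting match can drive the end-effector speed with a simple closed-loop update; the proportional gain and the speed limits below are illustrative assumptions, since the patent does not specify the control law:

```python
def adjust_speed(current_speed, measured_level, target_level,
                 gain=0.5, v_min=0.2, v_max=2.0):
    """One closed-loop update of the end-effector traverse speed.

    measured_level / target_level: stripping levels in [0, 1].
    Over-stripping (measured > target) allows a faster pass; under-stripping
    slows the end-effector so the nozzles dwell longer per unit area.
    """
    error = measured_level - target_level
    new_speed = current_speed * (1.0 + gain * error)
    # clamp to the mechanical limits of the robot arm (arbitrary units)
    return max(v_min, min(v_max, new_speed))
```

Because the stripping quality is a non-linear function of the robot's speed, a real controller would likely use a calibrated gain schedule rather than a single constant gain.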
- Figure 13 shows the processing time as a function of the size of the image processed, for both quality and edge detection. Obviously, the larger the picture, the longer the processing time required.
- the vision controller disclosed by the present invention is integrated within the overall coating removal system. As previously presented, the first task of this system is to generate data required to modify the real-time trajectory of the robotics arm so that minimum overlap is obtained between consecutive traces. The second task of the vision controller is to provide quality information used to define the stripping speed so that the desired quality of the stripping is obtained.
Abstract
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU74204/98A (AU7420498A) | 1997-05-13 | 1998-05-13 | Enabling process control technology for automated dry media depaint system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US4632897P | 1997-05-13 | 1997-05-13 | |
| US60/046,328 | 1997-05-13 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO1998051452A1 | 1998-11-19 |
Family
ID=21942871
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CA1998/000464 (WO1998051452A1, ceased) | Enabling process control technology for automated dry media depaint system | 1997-05-13 | 1998-05-13 |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU7420498A (fr) |
| WO (1) | WO1998051452A1 (fr) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0165911A2 (fr) * | 1984-06-22 | 1985-12-27 | VIANOVA S.p.A. | Method and robot platform for washing, sandblasting and painting in a dry dock |
| WO1991014539A1 (fr) * | 1990-03-27 | 1991-10-03 | Southwest Research Institute | Robotic system for removing paint |
| US5067085A (en) * | 1989-05-15 | 1991-11-19 | Southwest Research Institute | Optical robotic canopy polishing system |
| US5077941A (en) * | 1990-05-15 | 1992-01-07 | Space Time Analyses, Ltd. | Automatic grinding method and system |
| DE4428069A1 (de) * | 1993-08-31 | 1995-03-02 | Putzmeister Maschf | Arrangement for surface treatment, in particular surface cleaning, of large objects |
| US5394654A (en) * | 1990-12-28 | 1995-03-07 | Mazda Motor Corporation | Method of wet-sanding defective parts of coating on vehicle body and system for carrying out the method |
| US5477268A (en) * | 1991-08-08 | 1995-12-19 | Mazda Motor Corporation | Method of and apparatus for finishing a surface of workpiece |
1998
- 1998-05-13 AU AU74204/98A patent/AU7420498A/en not_active Abandoned
- 1998-05-13 WO PCT/CA1998/000464 patent/WO1998051452A1/fr not_active Ceased
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2006097133A1 (fr) * | 2005-03-14 | 2006-09-21 | Workinter Limited | Nozzle stripping device and method using projection of a fluid charged with solid particles forming an optimized stripping front |
| WO2006097134A1 (fr) * | 2005-03-14 | 2006-09-21 | Workinter Limited | Shoe and machine for stripping surfaces, including curved ones, by directed projection and evacuation of a stream of particles |
| FR2891175A1 (fr) * | 2005-09-27 | 2007-03-30 | Applic Lorraine Des Tech Nouve | Stripping device for metal surfaces |
| KR102211010B1 (ko) | 2013-03-15 | 2021-02-02 | Carnegie Mellon University | Supervised autonomous robotic system for complex surface inspection and processing |
| EP2973074A4 (fr) * | 2013-03-15 | 2016-11-16 | Univ Carnegie Mellon | Supervised autonomous robotic system for complex surface inspection and processing |
| US9796089B2 (en) | 2013-03-15 | 2017-10-24 | Carnegie Mellon University | Supervised autonomous robotic system for complex surface inspection and processing |
| KR20160014585A (ko) * | 2013-03-15 | 2016-02-11 | Carnegie Mellon University | Supervised autonomous robotic system for complex surface inspection and processing |
| ES2695627A1 (es) * | 2017-05-31 | 2019-01-09 | Vilarino David Roca | Robot - automatic structure painting machine (R-MAPE) |
| ES2727675A1 (es) * | 2018-04-16 | 2019-10-17 | Eseki S A L | Automatic shot-blasting system for parts |
| WO2019202192A1 (fr) * | 2018-04-16 | 2019-10-24 | Eseki, S.A.L. | Automated system for shot-blasting parts |
| FR3101562A1 (fr) * | 2019-10-08 | 2021-04-09 | Safran Aircraft Engines | Method for stripping at least one area of a turbomachine blade |
| CN114227690A (zh) * | 2021-12-30 | 2022-03-25 | 无锡荣恩科技有限公司 | Paint removal method for aviation components |
| CN114227690B (zh) * | 2021-12-30 | 2023-11-03 | 无锡荣恩科技有限公司 | Paint removal method for aviation components |
| EP4549091A3 (fr) * | 2023-09-12 | 2025-07-30 | Renfert GmbH | Blasting unit for a blasting device, in particular a dental blasting device |
Also Published As
| Publication number | Publication date |
|---|---|
| AU7420498A (en) | 1998-12-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8923602B2 (en) | Automated guidance and recognition system and method of the same | |
| US8098928B2 (en) | Apparatus for picking up objects | |
| CN111905983B (zh) | Dispensing trajectory correction method, device, system and medium based on visual following | |
| US7283661B2 (en) | Image processing apparatus | |
| US20210255117A1 (en) | Methods and plants for locating points on complex surfaces | |
| Miller | Industrial robot handbook | |
| WO1998051452A1 (fr) | Enabling process control technology for automated dry media depaint system | |
| CN111923053A (zh) | Deep-vision-based object grasping teaching system and method for industrial robots | |
| JP2023090683A (ja) | Finishing automation system and method | |
| CN115890639A (zh) | Robot vision-guided positioning and grasping control system | |
| CN110545920A (zh) | Method for painting workpieces with a sprayer, and painting system | |
| CN114833040B (zh) | Gluing method and gluing equipment for new-energy electric-drive end covers | |
| WO2023118470A1 (fr) | Method and apparatus for cutting and removing parts | |
| JP3543329B2 (ja) | Robot teaching device | |
| Prabhu et al. | Dynamic alignment control using depth imagery for automated wheel assembly | |
| Abicht et al. | New automation solution for brownfield production–Cognitive robots for the emulation of operator capabilities | |
| KR100944425B1 (ko) | Apparatus for detecting defect marks on steel plate surfaces | |
| JP3206849B2 (ja) | Robot painting device | |
| CN108435456A (zh) | Industrial robot spray painting control system | |
| Ersü et al. | Vision system for robot guidance and quality measurement systemsin automotive industry | |
| CN120595679A (zh) | Control method and system for automatically switching soldering irons for multi-size solder joints | |
| CN119839856A (zh) | Device, system and method for vision guidance of a plate-plugging robot | |
| Saeed et al. | Development of smart painting machine using image processing | |
| Macaire et al. | Automated visual inspection of galvanized and painted metallic strips | |
| JPH0247695B2 (fr) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM GW HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG |
|
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| NENP | Non-entry into the national phase |
Ref country code: CA |
|
| REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
| 122 | Ep: pct application non-entry in european phase |