WO2023069462A2 - Systems and methods for controlled cleaning of vehicles - Google Patents
- Publication number
- WO2023069462A2 (PCT/US2022/047054)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- cleaning
- interior
- configuration
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60S—SERVICING, CLEANING, REPAIRING, SUPPORTING, LIFTING, OR MANOEUVRING OF VEHICLES, NOT OTHERWISE PROVIDED FOR
- B60S3/00—Vehicle cleaning apparatus not integral with vehicles
- B60S3/008—Vehicle cleaning apparatus not integral with vehicles for interiors of land vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- B25J11/0085—Cleaning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/02—Manipulators mounted on wheels or on carriages travelling along a guideway
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0093—Programme-controlled manipulators co-operating with conveyor means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
- B25J9/1676—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/41815—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/41815—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell
- G05B19/4182—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the cooperation between machine tools, manipulators and conveyor or other workpiece supply system, workcell manipulators and conveyor only
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39082—Collision, real time collision avoidance
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40039—Robot mounted or sliding inside vehicle, on assembly line or for test, service
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40252—Robot on track, rail moves only back and forth
Definitions
- the present disclosure relates to systems and methods for controlled cleaning of vehicles, and in particular, for controlled cleaning of the interior of a vehicle.
- Embodiments of the present disclosure provide a system for cleaning an interior of a vehicle.
- the system may include a robotic arm positioned outside of the vehicle.
- the robotic arm may include an end effector configured as a cleaning implement for cleaning a surface in the interior of the vehicle.
- the system may include a first camera configured to determine a position of the vehicle with respect to a reference point.
- the system may include a second camera configured to scan the interior of the vehicle.
- the system may include a first controller configured to create and/or modify a tool path to execute a cleaning operation, based on the scan, and to send instructions to the robotic arm to execute the cleaning operation in accordance with the created or modified tool path.
- Embodiments of the present disclosure provide an automated method for cleaning an interior of a vehicle.
- the method may include determining a position of the vehicle with respect to one or more robotic arms positioned exterior to the vehicle.
- the method may include scanning a configuration of the vehicle to yield a configuration scan.
- the method may include identifying surfaces to be cleaned.
- the method may include creating and/or modifying a plurality of tool paths for the one or more robotic arms to clean the identified surfaces.
- the method may include controlling the one or more robotic arms to move along the created and/or modified plurality of tool paths and execute a cleaning operation in the interior of the vehicle.
- Embodiments of the present disclosure provide an automated method for cleaning an interior of a vehicle.
- the method may include storing a plurality of images for a plurality of vehicles in a database to create a master data package.
- the method may include acquiring vehicle specific data from the master data package.
- the method may include scanning a configuration of the vehicle.
- the method may include aligning the acquired vehicle specific data with the scanned configuration.
- the method may include creating a process plan for execution of a plurality of cleaning operations.
- the method may include sending instructions to a robot to execute the plurality of cleaning operations based on the process plan.
- the method may include executing the plurality of cleaning operations in accordance with the instructions.
- FIGs. 1A-1C illustrate a diagram of a system according to an exemplary embodiment.
- FIG. 2 depicts an image of marker identification according to an exemplary embodiment.
- FIG. 3 depicts a method for model generation according to an exemplary embodiment.
- FIGs. 4A-4C illustrate segmented out geometrical regions with one or more tool paths according to an exemplary embodiment.
- FIG. 5 illustrates a method of obstacle detection according to an exemplary embodiment.
- FIG. 6 illustrates an image of initial registration via implementation of an algorithm according to an exemplary embodiment.
- FIG. 7 illustrates an image of seat alignment according to an exemplary embodiment.
- FIG. 8 illustrates an image of collision object detection according to an exemplary embodiment.
- FIGs. 9A-9B illustrate an image of collision object detection according to another exemplary embodiment.
- FIGs. 10A-10C illustrate feature acquisition according to an exemplary embodiment.
- FIG. 11 illustrates an image of motion planning according to an exemplary embodiment.
- FIG. 12 illustrates images for registration and obstacle detection in a vehicle according to an exemplary embodiment.
- FIG. 13 illustrates a decision matrix for a configuration scanner according to an exemplary embodiment.
- FIG. 14 illustrates example master scan data collected according to an exemplary embodiment.
- FIG. 15 illustrates a method for seat registration according to an exemplary embodiment.
- FIGs. 16A-16H illustrate images for pre-registered and registered interior vehicle portions according to an exemplary embodiment.
- FIG. 17 illustrates an image of a tooling attachment according to an exemplary embodiment.
- FIG. 18 illustrates an image of a component against a portion of an interior of a vehicle according to an exemplary embodiment.
- FIG. 19 illustrates an image of a robot in motion according to an exemplary embodiment.
- FIG. 20 illustrates an automated method for cleaning an interior of a vehicle according to an exemplary embodiment.
- FIG. 21 illustrates an automated method for cleaning an interior of a vehicle according to another exemplary embodiment.
- FIG. 22 illustrates a diagram of vehicle zones for cleaning an interior of a vehicle according to an exemplary embodiment.
- FIG. 23A illustrates a diagram of robots for cleaning an interior of a vehicle according to an exemplary embodiment.
- FIG. 23B illustrates a plan view of a diagram of robots for cleaning an interior of a vehicle according to an exemplary embodiment.
- FIG. 24 illustrates a schematic of a vehicle transiting through stages of interior cleaning according to another exemplary embodiment.
- FIG. 25 illustrates a schematic of a vehicle transiting through stages of interior cleaning according to another exemplary embodiment.
- the systems and methods disclosed herein are configured to provide controlled cleaning of vehicles, and in particular, the interior of a vehicle.
- the implementation of the systems and methods disclosed herein provides advances in robotics technology, including surface reconstruction, feature/surface identification, localization and registration of items to be processed, autonomous tool path and process planning based on presented information, and motion planning that coordinates external axes with high-fidelity collision avoidance. These represent improvements over implementations that, in addition to the above-identified deficiencies, are typically low margin and have historically relied on low-cost labor to meet the requirements of internal cleaning operations.
- the solution described herein provides a guided autonomous solution, where a human may or may not be guiding the process, either quite minimally or more so depending on complexity and other considerations, to reduce the manual touch labor that currently occurs on vehicles that enter a car wash bay, thereby reducing operational costs and improving automobile throughput at high levels of quality.
- the systems and methods described herein are configured to meet rigorous throughput and cost requirements of the vehicle cleaning industry by employing a safe, effective, and efficient cleaning process within the footprint of standard vehicle wash facilities.
- one or more robots may be used to perform one or more commands, such as cleaning operations of the interior of a vehicle.
- the one or more robots may be positioned overhead and/or can be floor mounted.
- a robot can be mounted overhead and positioned to clear the open doors of a vehicle, and still be able to reach a floor of the vehicle.
- a vehicle may refer to a car or a truck.
- Conventional robots, such as a Yaskawa HC20 robot with a 1,700 mm reach, may not be suitable.
- a 2-axis gantry with a collaborative robot can be utilized.
- a 2-axis gantry system that is configured to both track and extend, vertically for a ceiling mounted robot, and horizontally for a floor mounted robot can be utilized. In some examples, this would extend far enough into the ceiling or wall so as to be out of the way of the vehicle, and out far enough to position the robot closer to the vehicle.
- a larger non-collaborative robot, such as a Yaskawa MH50 II-20, would allow the robot to reach into the vehicle, and for the vehicle to safely pass under it.
- the robot may be configured to reach into the footwell at a sufficient angle to avoid collision with the door of the vehicle. It is to be appreciated that other suitable configurations, such as a SCARA-type rotating linkage, may also be employed to extend reach into a vehicle while keeping the robot out of the way.
- autonomous approaches may be configured to scan the interior of the vehicle and plan the trajectory of the tool path(s) in real-time, or in which autonomous robot manipulators may be configured to map out the interior of the vehicle to perform the cleaning of the interior of the vehicle.
- simultaneous localization and mapping (SLAM) techniques may be employed in such autonomous approaches.
- FIG. 1A illustrates a diagram of a system 100 according to an exemplary embodiment.
- the system 100 may be configured for cleaning an interior of a vehicle.
- the system 100 may include any number of the following components: master scanning 101, database 102, cycle management 103, coarse localization 104, configuration alignment 105, process planning 106, motion execution 107, and quality assurance feedback 108.
- the process planning 106 may be for desired operations, including spraying of cleaning solution onto vehicle interior surfaces, wiping of interior window surfaces, and/or vacuuming of seats and foot wells.
- the cycle management 103 may be configured to serve as the coordinator for subprocesses and tasking of specific automated operations.
- FIG. 1B illustrates a diagram of a system 100 according to another exemplary embodiment.
- the system 100 may be configured for cleaning an interior of a vehicle.
- FIG. 1B may incorporate and reference any of the components as explained above with respect to FIG. 1A.
- FIGs. 1A and 1B illustrate a sequence of processing steps as a vehicle moves through the system 100.
- the system 100 may include any number of the following components: master scanning 101, database 102, coarse localization 104, configuration alignment 105, process planning 106, motion execution 107, quality assurance feedback 108, and scan, locate and query 109.
- the process planning 106 may be for desired operations, including spraying of cleaning solution onto vehicle interior surfaces, wiping of interior window surfaces, and/or vacuuming of seats and foot wells.
- a “Master Scan” process can be conducted prior to the vehicle arriving at a facility, where a technician captures a high-resolution 3D scan of the vehicle’s interior; the scan may then be processed to produce a model-specific Master Scan Data Package (“MSDP”).
- this data package may include a file or plurality of files including 3D mesh models, process tool paths, and various other metadata.
- the MSDP is transmitted to a database 102, such as a local database, at the facility. This information can be transmitted to local databases at a plurality of facilities within a network, or can be uploaded to a cloud or cloud-based server in a computing environment and deployed via one or more containers so that each facility can pull information when needed.
- the vehicle is identified, such as through a license plate or VIN scanner, to determine a specific model of the vehicle as stored in the database 102.
- the MSDP creation process can occur at the facility just ahead of the cleaning operation. This process can be used for a vehicle that has not been seen before based on a set of rules relative to the generalizability of the data in the one or more databases relative to available master scan data packages.
- the MSDP may include segmented features, such as consoles, door scans, and seats.
- the MSDP creation process may leverage a 3D scanner.
- the scanner collects data about various surfaces within the vehicle in order to recreate such surfaces of the vehicle to be processed during cleaning. In some examples, regarding car windows, the 3D scanner may be tilted to get into edges where the glass touches the door. At a minimum, the scanner must be configured to obtain the edges or corners of the windows of the vehicle.
- the master scan may be configured to generate new or augment existing measurements for use in the creation of 3D maps of vehicle interiors, including obstructions which are not part of the originally produced vehicle.
- A preferred scanning method uses 3D time-of-flight cameras for high resolution and speed, but other scanning methods, such as LIDAR, or other available techniques can be used.
- the scanned data may be transferred to and stored in a database 102 for retrieval by a robotic arm 110.
- a VIN reader or scanning system may be configured to, as part of the vehicle identification, read VIN, or measure length, width, or wheelbase, or identify other distinguishing features of a vehicle for purpose of attributing the identifying characteristics to specific vehicle makes, years, and models.
- This data may be transmitted and stored in a database 102, which may contain interior and exterior information attributed to specific vehicle makes, years, and models and the database 102 may be configured to utilize this information to alert the robotic arms 110 to locate an interior 3D map for the relevant vehicle.
- the scanning system can be a handheld VIN reading system or linked to other camera-based systems, including but not limited to first camera 111 and/or second camera 112.
- As shown in FIGs. 1A and 1B, a plurality of stages exists for the system 100, sequentially depicted with respect to the vehicle. These include Stage U, Stage A, Stage B, Stage C, and Stage D.
- a camera 111 may be configured to inspect the vehicle to determine its approximate position on a conveyor belt 117 (Coarse Localization 104).
- This camera 111 can be a 2D camera or 3D camera, positioned overhead relative to the vehicle, and can capture one or more images. This measurement allows the system 100 to adapt a “Configuration Scan” robot path from the database 102 to match the vehicle’s position.
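- As an illustrative sketch of this adaptation, the measured planar offset of the vehicle may be applied to the stored Configuration Scan waypoints; the (dx, dy, yaw) parameterization and the function below are assumptions for illustration, not the system's actual implementation.

```python
# Hypothetical sketch: shift a stored "Configuration Scan" robot path by the
# vehicle pose offset measured at Stage A (Coarse Localization 104).
import numpy as np

def adapt_scan_path(waypoints: np.ndarray, dx: float, dy: float, yaw: float) -> np.ndarray:
    """waypoints: (N, 3) positions in the facility frame; yaw in radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])  # rotation about the vertical axis
    return waypoints @ rot.T + np.array([dx, dy, 0.0])
```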
- Stage B can include two robot-mounted cameras 112, such as 3D or depth cameras, to acquire one or more additional images. These cameras 112 may be disposed on opposite sides of the vehicle, such as left and right sides of the vehicle.
- the arrangement of robotic arms 110 or manipulators may be symmetric, whereas in other examples, the arrangement of the robotic arms 110 or manipulators may be asymmetric.
- the robotic arm 110 may be a part of a robot.
- the robotic arms 110 at Stage B execute the Configuration Scan robot paths to position the cameras, which capture Configuration Scan data of the vehicle interior from several viewpoints in both the front and rear compartments (Scan, Locate, Query 109). This vehicle-specific scan is processed and compared against the MSDP. Then, the coarse vehicle alignment from Stage A is fine-tuned and tool paths are shifted, such as to account for front seat adjustments (Configuration Alignment 105), as well as any obstacles that may be observed in the Configuration Scan data.
- Tool paths from the MSDP are aligned to the current vehicle position, adjusted to avoid detected obstacles, and converted to Robot Motion Paths (Process Planning 106). These computed paths are transferred to robot controllers to execute robot motion and to control process tooling (Motion Execution 107) at Stage C.
- four robotic arms 110 are utilized for Motion Execution; however, any suitable number of robotic arms 110 can be employed.
- Stage D one or more operators can enter the process to complete a final manual touch-up and inspection.
- the operator(s) can enter feedback into the system 100 to identify any areas for improvement by later automated and/or manual processes (“QA Feedback”).
- Stage A can be a separate station in which the Coarse Localization 104 is performed.
- the position of the vehicle at Stage A can be determined in about 10 seconds. However, in the present example, this step is not critical to the overall cleaning process time for the system 100 since it is conducted separately.
- the Configuration Scan process includes robot motions to several viewpoints in the front and rear compartments and could take 50% of the cleaning process time at Stage B. The remaining 50% of time at Stage B is available for cleaning processes, such as windows and door panels.
- the Configuration Scan may be managed by pulling in various image captures and merging the discrete point clouds into a single point cloud. This may then be registered to the master scan data via implementation of an iterative closest point algorithm that is specific to this application.
- the ICP algorithm may be configured to highlight differences between the MSDP scan data and the configuration point cloud data as obstacles, which may then be utilized for modifying the tool paths, such as by either clipping/truncating tool paths (for example, in the case of a car seat), or by treating the difference as a collision object that is to be avoided in motion planning (for example, a steering wheel, or a device holder, such as a phone holder, protruding from the dash of the interior of the vehicle). Any object that is identified as not being part of the data matching the scan information in the master data package is treated as a collision object to be avoided, regardless of being a large object or a small object.
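- The registration and obstacle-flagging described above can be sketched with an off-the-shelf ICP implementation; the snippet below uses Open3D as a stand-in for the application-specific algorithm, and the file names and thresholds are assumptions.

```python
# Hedged sketch: merge discrete captures, register to the MSDP master scan
# with ICP, and flag residual geometry as obstacles.
import numpy as np
import open3d as o3d

captures = [o3d.io.read_point_cloud(f"capture_{i}.ply") for i in range(4)]
config = captures[0]
for pcd in captures[1:]:
    config += pcd  # merge discrete point clouds into a single point cloud

master = o3d.io.read_point_cloud("msdp_interior.ply")
result = o3d.pipelines.registration.registration_icp(
    config, master, max_correspondence_distance=0.02,  # 20 mm gate
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
config.transform(result.transformation)

# Configuration-scan points far from any master-scan point are obstacles
# (e.g., a phone holder on the dash) or drive tool path clipping (a seat).
dists = np.asarray(config.compute_point_cloud_distance(master))
obstacles = config.select_by_index(np.where(dists > 0.01)[0])
```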
- the Configuration Alignment 105 and Process Planning 106 may begin and be completed prior to the vehicle arriving at Stage C.
- the Motion Execution 107 of cleaning processes, including but not limited to the dash, console, seats, floor pans, and windshield, may take 100% of the cycle time at Stage C. In some examples, it is understood that any portion of the Motion Execution 107 may occur at Stage B.
- FIG. 1C illustrates a diagram of a system 100 according to another exemplary embodiment.
- the system 100 may incorporate and reference any of the components as explained above with respect to FIG. 1A and FIG. 1B.
- the system 100 may include a robotic arm 110, a first camera 111, a second camera 112, and a first controller 113.
- the system 100 may also include a communication system 114, a second controller 115, a rail 116, a conveyor 117, a state machine 118, and/or any combination thereof. While single instances of the components are illustrated in FIG. 1C, it is understood that system 100 of FIG. 1C may include any number of components.
- a robotic arm 110 may be part of an interconnected robot system.
- the robotic arm 110 may be positioned outside of a vehicle.
- the robotic arm 110 may include an end effector.
- the end effector may be configured as a cleaning implement for cleaning a surface within the vehicle. Without limitation, the surface within the vehicle may refer to an interior surface of the vehicle.
- at least one of the robotic arm 110 and the end effector may include a sensor that is configured to detect objects that are present inside the vehicle.
- a plurality of end effectors of the robotic arm 110 may be configured specifically for vehicle interior cleaning operations.
- a plurality of sensors incorporated into the end effectors and/or robotic arms 110 may be configured to detect the position of objects and obstacles in the vehicle interior in a manner that is robust and timely enough to allow for in-situ collision avoidance via robot control software algorithms.
- this may be configured such that a plurality, including but not limited to up to four sets of two robotic arms 110, may be positioned outside the openings of various vehicles, and the vehicles may be conveyed to a point where the interconnected robot systems are stationed.
- Two robotic arms 110 may be joined and mounted on a custom-designed cantilevered, pivotable track system allowing for each robotic arm 110 pair to move up and down and reach horizontally to enter the vehicle and address roughly one quarter of the interior vehicle space to be cleaned.
- Multi-axis robotic arms 110 may be selected for their compactness and ability to easily enter and exit the vehicle through the door opening, as well as for their payload, reach, and dexterity once inside the vehicle.
- the interconnected robot systems can include any number of robotic arms 110, which may be configured to move at higher speeds, or comparably sized collaborative robots, which may be selected for their built-in safety features.
- the interconnected robot systems may be positionally fixed at a location such that vehicles move to the location of the interconnected robot systems in order to be cleaned and thereafter move away from the system 100.
- the interconnected robot system may be configured to move along with a vehicle as the vehicle moves along a predetermined travel pathway.
- the robotic arms 110 of the interconnected robot system can be positioned such that they are affixed on either side of a vehicle to be cleaned, or the robotic arms 110 can be positioned overhead of the vehicle.
- Each of the interconnected robot systems may include a robot programmable logic controller and graphical interface which links robot control software of the robotic arm 110 to the vehicle wash facility’s conveyor control system, allowing for sensor-driven or robot control software-driven control of interior cleaning line conveyor speed, acceleration, and vehicle placement.
- the graphic interface may be configured to monitor the performance of the system 100.
- the graphic interface may indicate a score assessing the quality of the interior cleaning performed by the systems and methods disclosed herein.
- the first camera 111 may be configured to determine a position of the vehicle with respect to a reference point.
- the first camera may comprise a 2D camera.
- the first camera 111 may be located above the vehicle.
- the first camera 111 may be located at a position above a roof of the vehicle.
- the second camera 112 may be configured to scan the interior of the vehicle. In some examples, the second camera 112 may be configured to detect a seat position, a steering wheel position, an object present in the vehicle, and/or any combination thereof. The second camera 112 may be positioned outside the vehicle. The second camera 112 may be coupled to the robotic arm 110. In some examples, the second camera 112 may comprise a 3D camera.
- the first and second controllers 113, 115 may be configured to operate as a single controller. In other examples, the first and second controllers 113, 115 may operate as separate controllers. The first and second controllers 113, 115 may be configured to control the execution of tool paths and coordinate the dissemination of tools and motion paths to each of the robotic manipulator controllers and peripheral hardware. The first controller 113 may also be configured to transmit the instructions to the robotic arm 110 to execute, based on the scanned interior of the vehicle by the second camera 112, the cleaning operation in accordance with the created tool path.
- the tool path represents not only the location on the surface to which a path is applied for motion execution, but also contains metadata that includes process information, such as stand-off, tool angle or angle range, and tool. This information may be represented in an MSDP tool file schema.
- the schema includes a format for the tool paths and requirements for the specific tool, which may be assigned to any candidate surfaces within the cleaning domain that is the interior of the vehicle.
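- A minimal sketch of such a tool file entry follows; the field names are illustrative assumptions rather than the actual MSDP schema.

```python
# Hypothetical tool path record carrying both geometry and process metadata.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ToolPath:
    surface_id: str                      # segmented region, e.g. "front_seat_bottom"
    tool: str                            # e.g. "vacuum_nozzle" or "spray_head"
    standoff_m: float                    # stand-off from the target surface
    tool_angle_deg: Tuple[float, float]  # allowable tool angle range
    waypoints: List[Tuple[float, float, float]] = field(default_factory=list)
```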
- the second controller 115 may be configured to receive data from one or more sensors to mitigate collision.
- the communication system 114 may be coupled to any number of the controllers 113, 115.
- the communication system 114 may include a quality assurance feedback loop, as previously described above with respect to quality assurance feedback 108.
- the database 102 may include a plurality of stored vehicle configurations.
- the database 102 may contain relevant interior and exterior vehicle data and characteristics which can be constructed into a 3D map of a vehicle’s interior.
- Data in the database 102 may be organized in a way which allows the data to be attributed to and retrieved by a vehicle’s make, year, and model number. Maps can also reflect the 3D map of the interior as designed, as well as observed variations caused by objects, added features or items, or from other modifications to the vehicle.
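- Such retrieval might be realized as sketched below with an assumed SQLite table; the table and column names are illustrative, not the patent's schema.

```python
# Hypothetical MSDP lookup keyed by make, model, and year.
import sqlite3

conn = sqlite3.connect("msdp.db")
conn.execute("""CREATE TABLE IF NOT EXISTS msdp (
                    make TEXT, model TEXT, year INTEGER, package BLOB,
                    PRIMARY KEY (make, model, year))""")

def fetch_msdp(make: str, model: str, year: int):
    row = conn.execute(
        "SELECT package FROM msdp WHERE make=? AND model=? AND year=?",
        (make, model, year)).fetchone()
    # On a miss, the facility would fall back to on-site MSDP creation.
    return row[0] if row else None
```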
- the system 100 may be configured to generate or retrieve existing 3D maps of vehicle interiors for instructing and controlling the robotic arm 110 of the robot system.
- the controllers 113, 115 may be configured to compare the map to real-time sensor-collected data and, in the event of differences, make decisions about the appropriate path to take based on a set of rules and priorities.
- the controllers 113, 115 may be configured to document and update 3D maps and instructions in the robotic arm 110 of the robot system and the database 102 with the new data.
- the system 100 has two primary episodes of robot control/guidance: First, when the vehicle is presented on the conveyor 117 to the robotic arm 110 of the robot system - for guiding the robotic arm-pair 110 inside the vehicle through the door opening. Second, once inside the vehicle, the controllers 113, 115 may be configured to manage the motion, movements, and workflow of the robotic arm 110 and end effectors to complete the cleaning cycle in a timely manner.
- the system 100 may be configured to continuously update the database 102 and 3D vehicle maps based on measured efficacy of motions of the robotic arms 110, using algorithms to “learn” and optimize/minimize the motion path and time required to enter the vehicle and conduct the work.
- the learning and optimization can be attributed to, without limitation, specific vehicle makes, years, and/or models; types of vehicles, such as four-door sedans, trucks, or the like; and/or to commonly observed alterations/obstructions within a vehicle interior (e.g., how to best navigate around a baby seat).
- a computer system may be configured to maintain the database 102, and physically or wirelessly connect to various input, output, and backup components of the system 100.
- the system 100 may be configured to provide data processing or computational support as needed.
- the system 100 can be connected to a network, such as the network described herein, such that information stored therein can be accessed via the network at multiple locations, including locations along the same vehicle wash line, such as downstream of the scanning system, and remote locations, such as vehicle wash facilities in different locations, including a different city and/or state.
- the rail 116, such as a linear rail, may be located above and alongside the vehicle.
- the robotic arm 110 may be configured to move along the linear rail 116.
- the vehicle may be carried on a conveyor 117.
- the robotic arm 110 may be configured to move along the linear rail 116 in coordination with a motion of the conveyor 117.
- the state machine 118 may be configured to manage timing and coordination of a plurality of cleaning implements and system peripherals.
- the cleaning method and system disclosed herein includes two workflows: actions at an individual robot station, and steps required to fully process a vehicle as it transits through the various cleaning stages. In some examples, multiples of these workflows may be happening in parallel: for example, six robotic arms 110 and three vehicles at a time.
- the state machine 118 may be configured to track and direct sequencing of these parallel tasks.
- Each vehicle in the system 100 may be associated with a “vehicle task list” that is configured to define which tasks are required to fully process a vehicle. Without limitation, the tasks may include computation (motion planning), sensing (overhead location scan), and robot motion (execute front driver door cleaning paths).
- the tasks may be executed in parallel (motion planning of various sub-paths) or in a sequence (motion execution requires motion planning to be completed).
- the state machine 118 may be further configured to process active vehicle task lists (for each in-process vehicle) to run each task on asynchronous background threads when the appropriate preceding tasks have been completed.
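- A hedged sketch of this sequencing logic follows; the task names and dependency encoding are assumptions for illustration.

```python
# Run a "vehicle task list": each task starts once its prerequisites are done,
# and independent tasks run concurrently, mirroring the state machine 118.
import asyncio

async def run_task_list(tasks: dict, deps: dict) -> None:
    """tasks: name -> async callable; deps: name -> set of prerequisite names."""
    done: set = set()
    while len(done) < len(tasks):
        ready = [n for n in tasks if n not in done and deps.get(n, set()) <= done]
        if not ready:
            raise RuntimeError("cyclic or unsatisfiable task dependencies")
        await asyncio.gather(*(tasks[n]() for n in ready))  # parallel batch
        done.update(ready)

# e.g. deps = {"execute_front_door": {"plan_front_door", "overhead_scan"}}
```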
- the system 100 may include a safety system.
- the safety system may be configured to utilize vision methodologies to sense when humans or other undesired obstacles are close to the interconnected robot system or end effectors of the robotic arm 110, resulting in a reduction in robot system force or halting robot activity.
- the system 100 may be designed to effectively accommodate feedback, measurement, and control specific to robotic arms 110.
- robotic arms 110 may be used that are designed with greater mass and payload, and that operate at higher speeds than collaborative robots, for a lower investment and improved operational efficiency.
- barriers, such as hard barriers, may be placed to ensure non-workers do not approach or get near working interconnected robot systems. Barriers can be transparent curtain-wall-like structures made of glass, acrylic, or polycarbonate, or fence-like structures, or walls.
- any of the components of the system 100 may be implemented as hardware and/or software. Without limitation, any of the components of the system 100 may include a processor and a memory that communicate with each other through one or more networks that may also be part of the system 100 of FIGs. 1A-1C. While single instances of the components are illustrated in FIGs. 1A-1C, it is understood that the system 100 may include any number of components.
- a single processor may be configured to carry out any number of functions in accordance with the systems and methods described herein.
- a plurality of processors may be configured to carry out any number of functions in accordance with the systems and methods described herein.
- the processor may comprise an application specific integrated circuit and may be configured to execute one or more instructions.
- the processor may be part of a device, such as a network-enabled computer.
- a network-enabled computer may include, but is not limited to a computer device, or communications device including, e.g., a server, a network appliance, a personal computer, a workstation, a phone, a handheld PC, a personal digital assistant, a thin client, a fat client, an Internet browser, or other device.
- the device also may be a mobile device; for example, a mobile device may include an iPhone, iPod, iPad from Apple or any other mobile device running Apple's iOS operating system, any device running Microsoft's Windows Mobile operating system, any device running Google's Android operating system, and/or any other smartphone, tablet, or like wearable mobile device.
- the server may be configured as a central system, server or platform to control and call various data at different times to execute a plurality of workflow actions to perform one or more functions described herein.
- the one or more servers may contain, or be in data communication with, one or more databases.
- the device can include a processor and a memory, and it is understood that the processing circuitry may contain additional components, including processors, memories, error and parity/CRC checkers, data encoders, anti-collision algorithms, controllers, command decoders, security primitives and tamper-proofing hardware, as necessary or desired to perform the functions described herein.
- the device may further include a display and input devices.
- the display may be any type of device for presenting visual information such as a computer monitor, a flat panel display, and a mobile device screen, including liquid crystal displays, light-emitting diode displays, plasma panels, and cathode ray tube displays.
- the input devices may include any device for entering information into the user's device that is available and supported by the user's device, such as a touchscreen, keyboard, mouse, cursor-control device, microphone, digital camera, video recorder or camcorder. These devices may be used to enter information and interact with the software and other devices described herein.
- the memory may be a read-only memory, write-once read-multiple memory or read/write memory, e.g., RAM, ROM, and EEPROM, and the system of FIG. 1A may include one or more of these memories.
- a read-only memory may be factory programmable as read-only or one-time programmable. One-time programmability provides the opportunity to write once then read many times.
- a write once/read-multiple memory may be programmed at a point in time after the memory chip has left the factory. Once the memory is programmed, it may not be rewritten, but it may be read many times.
- a read/write memory may be programmed and re-programmed many times after leaving the factory. It may also be read many times.
- Exemplary memory types that may be used as memory include but are not limited to semiconductor firmware memory, programmable memory, non-volatile memory, read only memory, electrically programmable memory, random access memory, flash memory (which may include, for example NAND or NOR type memory structures), magnetic disk memory, optical disk memory, combinations thereof, and the like. Additionally, or alternatively, memory may include other and/or later-developed types of computer-readable memory.
- the network may be one or more of a wireless network, a wired network, or any combination of wireless network and wired network, and may be configured to connect to any components of system.
- the network may include one or more of a fiber optics network, a passive optical network, a cable network, an Internet network, a satellite network, a wireless local area network (LAN), a Global System for Mobile Communication, a Personal Communication Service, a Personal Area Network, Wireless Application Protocol, Multimedia Messaging Service, Enhanced Messaging Service, Short Message Service, Time Division Multiplexing based systems, Code Division Multiple Access based systems, D-AMPS, Wi-Fi, Fixed Wireless Data, IEEE 802.11b, 802.15.1, 802.11n, and 802.11g, Bluetooth, NFC, Radio Frequency Identification (RFID), and/or the like.
- the network may include, without limitation, telephone lines, fiber optics, IEEE Ethernet 802.3, a wide area network, a wireless personal area network, a LAN, or a global network such as the Internet.
- the network may support an Internet network, a wireless communication network, a cellular network, or the like, or any combination thereof.
- the network may further include one network, or any number of the exemplary types of networks mentioned above, operating as a stand-alone network or in cooperation with each other.
- the network may utilize one or more protocols of one or more network elements to which they are communicatively coupled.
- the network may translate to or from other protocols to one or more protocols of network devices.
- the network may comprise a plurality of interconnected networks, such as, for example without limitation, the Internet, a service provider's network, a cable television network, corporate networks, and home networks.
- FIG. 2 depicts the use of reference point markers according to an exemplary embodiment.
- FIG. 2 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- a plurality of markers 210 are positioned across various surfaces of the vehicle prior to scanning to facilitate high accuracy during 3D scanning.
- the markers 210 may include one or more stickers and/or magnets, which typically include an image of a black or white dot printed thereon.
- the markers 210 may comprise 3D printed markers, including but not limited to 3D prism printed markers.
- a 3D scanner uses these high-contrast or highly reflective images to determine its location relative to previously understood artifacts or imagery.
- markers 210 provide the scanner with a consistent feature to track, and thus, increase accuracy and repeatability of 3D data as opposed to scanning features without markers.
- Narrow features, such as the A-pillar and B-pillar of the vehicle, should use denser sticker/marker 210 coverage.
- a “markers only” scan may be conducted first, then followed up by scanning the interior of the vehicle. Markers 210 may not be added to one or more exterior surfaces of the vehicle, including but not limited to one or more doors, until the one or more doors are ready to scan for collision detection.
- the markers 210 are generally placed manually by a technician across to-be-scanned surfaces prior to the master scan data collection.
- the scanning spray may include AESUB 3D™ scanning spray. It is understood that other scanning sprays may be used and that the scanning spray is not limited to such a scanning spray.
- the scanning spray may be applied, for example automatically or manually, to an exterior of the window while scanning the interior of the window to obtain accurate curvature so as to avoid unevenness of the applied spray.
- a light coating may be sufficient, and the scanning spray may evaporate on its own without requiring any cleaning or removal.
- blue painter's tape may be applied on interior surfaces of the vehicle that are difficult to reach via the scanning spray.
- the method includes scanning from the outside, which prevents surfaces from sticking through each other if any door or portion thereof is moved slightly; e.g., if the door interior is scanned first and the door is bumped, the interior scan may protrude through the scan of the exterior due to the swing and movement of the door.
- the one or more doors of the vehicle may be scanned separately from the frame and interior. This is to manage the situation where doors are bumped during scanning or move slightly. Overlapping geometry may be deleted after scanning; this avoids undesired overlaps when merging various scans.
- Additional exterior data may be gathered for collision avoidance geometry. For instance, for regions of the vehicle that are not to be subjected to robot tool path planning, obstacle identification, or other discrete elements that influence robot motion planning within the vehicle, the general shape of the vehicle is captured for collision avoidance. This includes the exterior of the vehicle from the A-pillar to the front, and from the C-pillar to the rear. For this process, a patterned blanket may be clipped to the vehicle; the operator's scanning traverse speed was observed to be faster because already-seen surfaces did not need to be revisited to perform detail image filling/completion. The software then takes the point cloud data, which matches/aligns images due to the texture tracking of the blanket, to create a uniform mesh surface that is sufficiently accurate to enable motion planning software to avoid those surfaces and prevent any sort of collision during motion execution.
- Models may be generated and merged from the various master scans of the various regions to form a single model, and this model may be decimated to reduce total size. Clean-up processes, which may be manual in some examples and automated in others, may be configured to eliminate holes, fill in regions with no data in which it is undesirable for any robot to enter (for example, under one or more seats of the vehicle), and fix any overlapping as noted above.
- adding textured surfaces over portions of the vehicle that are only needed for collision avoidance may help accelerate scanning and processing and improve overall process flow.
- partial car covers having markers printed, secured, or otherwise coupled thereon can be used.
- segmentation may be initiated. This is the process of segmenting out portions of the scanned data that identify one or more features of interest. These include but are not limited to one or more seat bottoms, floor wells, and dash/console areas.
- the segmentation may be configured to drive a specific process type, such as the spray of a cleaning agent, or tool paths for a vacuum nozzle.
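- For instance, the mapping from segmented feature class to process recipe could be expressed as simple configuration data; the class names and parameter values below are illustrative assumptions.

```python
# Hypothetical segment-class-to-process mapping driving recipe selection.
PROCESS_BY_SEGMENT = {
    "seat_bottom":  {"tool": "vacuum_nozzle", "standoff_m": 0.01, "raster_m": 0.04},
    "floor_well":   {"tool": "vacuum_nozzle", "standoff_m": 0.02, "raster_m": 0.05},
    "dash_console": {"tool": "spray_head",    "standoff_m": 0.10, "raster_m": 0.08},
}
```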
- a trained classifier may or may not be included. While in some examples a priori knowledge may be present, the MSDP has no a priori knowledge (for example, the MSDP may yield the presence of a seat bottom, a dashboard, or the like).
- FIG. 3 depicts a method 300 for model generation according to an exemplary embodiment.
- the method 300 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the method 300 may include applying targets, or markers, on door frames and flat interior locations of a vehicle.
- the method 300 may include applying a textured material, such as a blanket, to the hood and rear of the vehicle. Using the textured material provides tracking for the 3D scanner and significantly reduces the number of the markers applied to the vehicle.
- the method 300 includes applying a scanning spray for black and/or shiny surfaces and windows. In some examples, shiny and/or black vehicles may require the spray to collect data.
- the markers may show through a light coating of the spray.
- one or more scans are collected, such as door frames, doors, hood and rear, and interior of the vehicle.
- the method 300 may include aligning and/or merging the one or more scans.
- the method 300 may include exporting the aligned and/or merged scans for processing.
- a model can be created by incorporating the individual scans and aligning the scans to create a unified single model.
- the method 300 may include processing a mesh model, such as segmenting the mesh model, cleaning the mesh, and filling holes.
- the method 300 may include decimating and further processing the mesh.
- the method 300 may include defeaturing the mesh for use in tool path generation and/or saving it for configuration scan registration.
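- A minimal sketch of the merge-and-decimate steps of method 300 follows, using Open3D as an assumed toolchain; the file names and triangle budget are placeholders.

```python
# Merge pre-aligned region scans into one model, clean it, and decimate it.
import open3d as o3d

parts = [o3d.io.read_triangle_mesh(name) for name in
         ("door_frames.ply", "doors.ply", "hood_rear.ply", "interior.ply")]
model = parts[0]
for mesh in parts[1:]:
    model += mesh  # scans assumed already aligned into a common frame

model.remove_duplicated_vertices()
model.remove_degenerate_triangles()
model = model.simplify_quadric_decimation(target_number_of_triangles=200_000)
o3d.io.write_triangle_mesh("master_model.ply", model)
```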
- the system includes custom software utilities to perform application-specific, human-guided or autonomous feature segmentation for path planning and the subsequent tool path planning, including the appendage of approaches and departures for each tool path and their specific requirements.
- Graphical user interfaces accompanying these software modules may facilitate either human-guided processes for the internal cleaning operations, or fully autonomous processes based on rules-based and/or AI-based intelligence, to realize broader generalizability of master scan data sets.
- scanning software and post-processing software may be configured to carry out the above features with respect to FIG. 3.
- FIGs. 4A-4C illustrate segmented out geometrical regions with one or more tool paths according to an exemplary embodiment.
- FIGs. 4A-4C may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- FIG. 4A illustrates a front seat bottom segmentation 410.
- the front seat bottom 410 may belong to a driver seat of the vehicle.
- the front seat bottom 410 may belong to a passenger seat of the vehicle.
- FIG. 4B illustrates a rear floor segmentation 420, where the rear floor may be disposed behind the driver or passenger seat.
- FIG. 4C illustrates a rear seat segmentation 430, where the rear seat may be disposed behind the driver or passenger seat of the vehicle.
- Segmentation may be automated with one or more rules-based algorithms, one or more AI-based feature recognition algorithms, or any combination thereof.
- segmentation may be human assisted.
- a human-guided segmentation process may be utilized from the tools provided by the master scanning hardware's complementing software.
- segmentation may be automated.
- autonomous segmentation tools may automate the segmentation process.
- an application that includes instructions for execution on a client device may be configured to enable, via a user interface, a guided experience for an operator to perform segmentation. This interface may enable simple click and classify with automated trimming to highlight only a portion, such as the seat bottom, for instance.
- an automated workflow may be generated which has learned from prior human-guided segmentation and the data created, to provide a workflow that is only supervised by an operator or a controller, with a level of automated quality assurance to flag one or more issues identified in the segmentation workflow. For example, via a touchscreen, an operator on the line can indicate if a region was not cleaned properly. This could be, for instance, a floor pan that was only 50% covered, leaving visible debris on half. The operator at the end of the robot region, who places objects back in the car and does other final prep, indicates the region that was not cleaned appropriately. This flags the MSDP, as well as the associated configuration data, and sends a notification for review of root cause.
- tool path process steps may be configured to plan desired tool paths on as-presented master scans based on application of one or more rules entered by the user for the specific process.
- the one or more rules may include, but are not limited to, tool offset from the target surface, work and travel angle, as well as raster spacing.
- while a mesh model may be of sufficient quality, a primitive can be inserted over the surface, such as at a predetermined z-offset distance, to provide a more consistent motion path for a subsequent motion plan. For example, if the mesh model includes highly tufted seats, a tool path motion using the mesh data could be overly complex, which adds unnecessary time to the process and may not lead to any improvements in cleaning performance.
- auto-fit of a primitive just above the target surface may be human-guided.
- the auto-fit of a primitive just above the target surface may be automated.
- a region is initially segmented; once surfaces are classified/identified, the primitive is fit (e.g., a plane that may be a 2D polygon), and subsequent raster paths are planned on the plane as opposed to directly onto the mesh.
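- The fit-then-raster sequence might be sketched as follows, assuming a roughly horizontal seat surface; the RANSAC plane fit, stand-off, and spacing values are illustrative rather than the system's actual parameters.

```python
# Fit a plane primitive to a segmented region and plan serpentine raster
# waypoints on it at a fixed stand-off, instead of following the raw mesh.
import numpy as np
import open3d as o3d

region = o3d.io.read_point_cloud("front_seat_bottom.ply")   # segmented region
(a, b, c, d), _ = region.segment_plane(distance_threshold=0.01,
                                       ransac_n=3, num_iterations=1000)
normal = np.array([a, b, c]) / np.linalg.norm([a, b, c])
if normal[2] < 0:            # make the normal point up, away from the seat
    normal, (a, b, c, d) = -normal, (-a, -b, -c, -d)

pts = np.asarray(region.points)
(xmin, ymin, _), (xmax, ymax, _) = pts.min(axis=0), pts.max(axis=0)

standoff, spacing = 0.02, 0.05           # 20 mm stand-off, 50 mm raster rows
waypoints = []
for i, y in enumerate(np.arange(ymin, ymax, spacing)):
    for x in ((xmax, xmin) if i % 2 else (xmin, xmax)):  # serpentine sweep
        z = -(a * x + b * y + d) / c     # height of the fitted plane at (x, y)
        waypoints.append(np.array([x, y, z]) + standoff * normal)
```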
- the output mesh from the scan, the identified regions that are segmented, collision geometry, and the tool paths for the target processes may constitute the MSDP.
- tool paths may be applied directly to filtered mesh surfaces, and additional filtering via a Cartesian motion planner may be leveraged to provide refined tool paths based on the actual mesh surface, reducing reliance on primitive fitting and allowing for improved tool following in execution.
- a broader range of vehicles, such as full-sized four-door sedans, may be handled in a manner that takes into account variations such as seat width, similar to how obstructions would be managed, by simply clipping planned tool paths.
- the aforementioned generalizability may take into account notations on executed tool paths on the line to help inform whether any number of variations of an MSDP for a future vehicle in the same line (continuing with the above example, a 2025 Camry) is needed.
- the MSDP may include detailed surface models, segmented out regions of the interior, collision modeling of surfaces, and specific tool paths on the segmented surfaces that are relevant to the desired process for execution.
- segmentation may be human assisted, e.g., a human may input information into the client device, such as highlighting an area (such as surfaces, dashboards, or seat bottoms) via the application comprising instructions for execution on the client device, which may identify it as a given region, such as a dashboard.
- these regions may be autonomously identified either via a rules-based algorithm, an AI-assisted semantic segmentation algorithm, or any combination thereof that is configured and optimized for automotive interior surfaces.
- the MSDP database may be a cloud service to enable MSDP information to be leveraged by any site that has access to the data. Having data in the cloud, and collecting information from sites on how each MSDP both generalizes and performs, enables richer capability for optimization via learning, which may further leverage the cloud infrastructure, reducing the amount of master scanning that has to take place over time, thereby mitigating the need for storage and improving overall system efficiency.
- Operations may be configured to run “on the line” even at first deployment.
- the cycle management coordinator may include a state machine 118 that is configured to handle the timing and coordination of developed sub-modules and interaction with components of the system 100. This may include the launch of configuration scan and alignment, receiving the updated information, and providing the updated tool path offsets to the motion planner. Specific motion plans are assigned to specific assets and then, in a coordinated fashion, dispatched to the specific robot on the line, utilizing the specific hardware with the specific process recipes.
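As a rough illustration of how such a coordinator might sequence the scan, alignment, planning, and dispatch steps, the following Python sketch uses an enum-based state machine; all class, state, and method names here are hypothetical stand-ins, not the actual state machine 118:

```python
from enum import Enum, auto

class CycleState(Enum):
    IDLE = auto()
    CONFIG_SCAN = auto()
    ALIGNMENT = auto()
    MOTION_PLANNING = auto()
    DISPATCH = auto()
    EXECUTING = auto()
    DONE = auto()

class CycleCoordinator:
    """Steps one vehicle through scan -> align -> plan -> dispatch -> done."""

    def __init__(self, scanner, aligner, planner, robots):
        self.state = CycleState.IDLE
        self.scanner, self.aligner = scanner, aligner
        self.planner, self.robots = planner, robots

    def step(self, vehicle):
        if self.state is CycleState.IDLE:
            self.state = CycleState.CONFIG_SCAN
        elif self.state is CycleState.CONFIG_SCAN:
            self.scan = self.scanner.capture(vehicle)          # configuration scan
            self.state = CycleState.ALIGNMENT
        elif self.state is CycleState.ALIGNMENT:
            self.offsets = self.aligner.align(self.scan, vehicle.msdp)
            self.state = CycleState.MOTION_PLANNING
        elif self.state is CycleState.MOTION_PLANNING:
            self.plans = self.planner.plan(vehicle.msdp, self.offsets)
            self.state = CycleState.DISPATCH
        elif self.state is CycleState.DISPATCH:
            for robot, plan in zip(self.robots, self.plans):   # plan per asset
                robot.execute(plan)
            self.state = CycleState.EXECUTING
        elif self.state is CycleState.EXECUTING:
            if all(robot.done() for robot in self.robots):
                self.state = CycleState.DONE
```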
- the configuration scan and alignment is the process by which the vehicle to be processed is located on the line and is matched to the target or selected MSDP in the database. Information may be acquired about the vehicle that is entering a bay of a car wash, and this acquisition may be obtained through existing car wash infrastructure. The appropriate MSDP is then acquired, and the configuration scan and alignment process may be initiated.
- An overhead camera, such as the overhead 2D camera of FIG. 1, determines the coarse position of the vehicle on the belt. From here, with the doors in the open position and the robots in a backed-away position, clear of the doors, an initial image is captured, focusing on the doorjamb.
- doorjambs have been the most consistent feature to locate on the vehicle for launching the process; however, other vehicle features can be used in addition to or in place of the doorjamb.
- the pillars of the vehicle frame are typically not changed through customization and thus, are candidates for localization.
- FIG. 5 illustrates a method 500 of obstacle detection according to an exemplary embodiment.
- the method 500 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the method 500 may include providing an initial alignment. In some examples, this may include aligning a car body master scan to a front doorjamb configuration scan, in which the consistency of the shape of the doorjamb is leveraged. The car location transform may be outputted.
- the method 500 may include applying an alignment transform.
- the alignment transform from step 510 may be applied to one or more seat master scans. As a consequence, the one or more seat master scans are placed proximate the actual seat position in the configuration scan.
- the method 500 may include refining seat alignment. In some examples, a refined alignment of the one or more seat master scans is performed to the interior front configuration scan. This results in final seat positions that are much closer to the seat in the configuration scan. Additionally, this alignment step constrains the movement of the seat to match the actual seat’s degrees of freedom. The location of the seat may be outputted.
- the method 500 may include detecting one or more obstacles. In some examples, a point cloud of detected obstacles can be generated from the discrepancies between the configuration and aligned master scans.
- a method using an iterative closest point (ICP) algorithm is implemented and configured in order to align the car body master scan to a configuration scan that includes the geometry of the front doorjamb.
- the doorjamb area was selected because it provides consistent geometric features that are helpful for registration.
- This registration result provides the location of the vehicle body and seat relative to the configuration scan.
- a subsequent registration is performed to locate the seat position more precisely inside the vehicle.
- the seat master scan may include two separate mesh models, such as the bottom and backrest, aligned independently of each other.
- the algorithm may be configured to remove the geometry that is common between the master and configuration scans and label what remains as an obstacle.
- an ICP algorithm implementation, such as in MeshLab, may be configured to select any number of common points between the master and configuration scans, after which the ICP algorithm may make any adjustments for final alignment. It is understood that other algorithms and tools may be configured in lieu of MeshLab.
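The registration-then-differencing flow described above can be sketched with the open-source Open3D library standing in for MeshLab's ICP (an assumption; any ICP implementation may be substituted), with threshold values chosen for illustration only:

```python
import numpy as np
import open3d as o3d

def register_and_detect(master_pcd, config_pcd, init=np.eye(4),
                        icp_dist=0.02, obstacle_dist=0.015):
    """Align a master scan to a configuration scan via ICP, then label
    configuration geometry with no master counterpart as obstacles."""
    # Point-to-plane ICP assumes normals have been estimated on the target.
    result = o3d.pipelines.registration.registration_icp(
        master_pcd, config_pcd, icp_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    master_pcd.transform(result.transformation)        # apply final alignment
    # Geometry common to both scans is removed; what remains is an obstacle.
    dists = np.asarray(config_pcd.compute_point_cloud_distance(master_pcd))
    obstacles = config_pcd.select_by_index(np.where(dists > obstacle_dist)[0])
    return result.transformation, obstacles
```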
- the obstacles may include an umbrella, a device mount (such as a phone mount), a drink cup, a jacket, or an infant car seat.
- black, metal coffee mugs were often barely visible using a 3D camera (such as a Zivid™ camera), whereas a depth camera (such as a RealSense D455™ camera) provided better recognition of the black, metal coffee mug. It is understood that other types of suitable cameras may be implemented.
- two configuration scans can be combined when using the 3D camera.
- an offset may be applied to the configuration scans. For example, a 0.2 radian (~12 degree) and 0.1 m offset may be applied to the configuration scans.
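Applying such an offset can be sketched as composing a rigid transform and applying it to the second scan before the two scans are combined; the axis assignments below (rotation about z, translation along x) are assumptions for illustration:

```python
import numpy as np

def offset_transform(angle=0.2, translation=0.1):
    """4x4 rigid transform: rotate `angle` rad (~12 deg) about z, then
    translate `translation` meters along x."""
    c, s = np.cos(angle), np.sin(angle)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[0, 3] = translation
    return T

def apply_offset(scan_points, T):
    """Apply the transform to an (N, 3) scan before merging scans."""
    homogeneous = np.c_[scan_points, np.ones(len(scan_points))]
    return (homogeneous @ T.T)[:, :3]
```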
- a view of the door jamb of the vehicle may be collected. This may comprise collecting views of both front and rear doorjambs. Then, a plurality of views of the interior through the front door are collected. Without limitation, this may include collecting three views. For each of the plurality of views, the following process may take place: collecting a configuration scan with no obstacle; adjusting the seat and/or adding an obstacle; and collecting the configuration scan. If the seat is adjustable, a plurality of seat positions, such as three seat positions, may be collected. After the views of the interior through the front door are collected, a plurality of views of the interior through the rear door may be collected. As with the interior through the front door view collection, this may include collecting three views of the interior through the rear door.
- for each of the rear-door views, the same process may take place: collecting a configuration scan with no obstacle; adjusting the seat and/or adding an obstacle; and collecting the configuration scan. If the seat is adjustable, a plurality of seat positions, such as three seat positions, may be collected.
- inputs may include: for master scans, vehicle body without seat and steering wheel, seat bottom, and seat backrest; for configuration scans, front doorjamb, front interior (several views captured at various seat positions and with obstacles), rear interior (several views captured with different obstacles), and rear doorjamb (optional); for configuration parameters, alignment constraints and obstacle detection.
- FIG. 6 illustrates an image 600 of initial registration via implementation of an algorithm according to an exemplary embodiment.
- FIG. 6 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the algorithm may comprise an iterative closest point (ICP) algorithm.
- the portion 610 may represent a configuration before alignment, whereas the portion 620 may represent a configuration aligned with master data.
- FIG. 7 illustrates an image 700 of seat alignment according to an exemplary embodiment.
- FIG. 7 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the seat may be identified 710 in the configuration data and the pre-registration data (from the master data package) and is aligned to the observed seat in the configuration scan 720.
- FIG. 8 illustrates an image 800 of collision object detection according to an exemplary embodiment.
- FIG. 8 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- any unregistered features that do not have a match to the master data in the configuration data may be identified as obstructions.
- these obstructions may appear inside the vehicle as denoted 810.
- the obstructions 810 may be treated as collision objects for robot motion planning.
- tool paths may be clipped or truncated within the motion planning to omit tool path segments that are within a predetermined distance of a collision object, such as 1 cm to 4 cm depending on volumetric considerations of the region being processed, so as to avoid collision objects, as in the sketch below.
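One way to sketch this clipping, assuming obstacle geometry is available as a point cloud, is a nearest-neighbor query that drops waypoints inside the clearance and splits the survivors into contiguous segments; the KD-tree approach is an illustrative choice, not the claimed implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

def clip_toolpath(waypoints, obstacle_points, clearance=0.02):
    """Drop waypoints closer than `clearance` meters to any obstacle point,
    returning the surviving contiguous path segments."""
    waypoints = np.asarray(waypoints)
    dists, _ = cKDTree(obstacle_points).query(waypoints)  # nearest obstacle
    keep = dists >= clearance
    segments, run = [], []
    for wp, ok in zip(waypoints, keep):
        if ok:
            run.append(wp)
        elif run:                       # a clipped point ends the current run
            segments.append(np.array(run))
            run = []
    if run:
        segments.append(np.array(run))
    return segments
```

Returning segments rather than a single filtered list keeps each truncated run a valid continuous pass for the planner.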
- FIGs. 9A-9B illustrate an image 900 of collision object detection according to another exemplary embodiment.
- FIGs. 9A-9B may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- a vehicle seat is identified as a collision object 910.
- the collision object 910 may be located in the rear portion of the vehicle but is not limited to such positioning.
- the image 900 illustrates the actual physical vehicle seat 920 that is identified as the collision object 910. Tool paths that would normally be planned for execution underneath the vehicle seat may be altered, thereby planning an operation, such as the vacuum operation, only on the exposed rear seat up to the predetermined zone relative to the collision object, in this case the vehicle seat.
- while a vehicle seat 920 is depicted in FIG. 9B, it is understood that any number and type of other objects may be detected, and detection is therefore not limited to such.
- FIGs. 10A-10C illustrate feature acquisition according to an exemplary embodiment.
- FIGs. 10A-10C may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- coarse vehicle location is required prior to initiation of any configuration scan, as illustrated in FIG. 10A.
- this is implemented utilizing an overhead camera, such as the overhead camera described above, that is configured to identify the extents of the width of the vehicle and find the vehicle doors using image analysis algorithms based on pixel contrast.
- the configuration scan may be performed by a camera, such as a depth camera, that is mounted to a wrist portion of a robot manipulator. In this manner, the overhead camera may be configured to find the coarse alignment 1010.
- as illustrated in FIG. 10B, the initial image acquisition, such as a first image 1020, is performed with the robot backed away from the vehicle, clear of the doors, to capture the doorjamb.
- this may serve as the capture of the initial door jamb.
- a second image 1030 may be taken just outside of the opening of the vehicle doors but positioned by the robot within the envelope of the doors. In this manner, one or more interior features are acquired.
- Internal features acquired include, but are not limited to: seats (all interior seats, front and back, bottom and backrest, accounting for position), visible floor pans, console, dash, steering wheel, all windows, rearview mirror, seat belts that may obscure the door opening, and objects left in the vehicle, which may include, but are not limited to, jackets, umbrellas, car seats, and boxes (from tissue boxes to items similar to standard moving boxes). Objects in the configuration scan that do not match the master data will automatically be treated as collision objects. If they obscure tool paths, the tool paths will be truncated within the allowable collision avoidance tolerance defined for that object class/region. In some examples, a plurality of additional images may be required. For example, up to two additional images may be acquired to fully capture footwell and seat detail information.
- the intent is to optimize based on observation analysis as the images are acquired so as to reduce the necessity of multiple image acquisition, reduce the need for system storage, and also improve the processing efficiency of the system.
- the amount of time available on the front-of-line robots may be sufficient to tolerate the time to acquire images for processing, which may be on the order of seconds. While other cameras, such as a 3D camera, may be utilized for image acquisition, the depth camera is preferred due to improved field-of-view characteristics and the quantity of information obtained from a single shot.
- the systems and methods disclosed herein may be camera agnostic, however specifications for the particular camera may drive different behavior of the system, such as where the robot positions the camera for the image acquisition and/or how many images are acquired.
- prior to sending instructions to the robot for execution of one or more cleaning operations, a process plan is created. From the prior operation, the system may have tool paths aligned to the frame of the robot relative to the vehicle. However, motions that are coordinated with the motion of the vehicle on a conveyor, for example, are planned as well, along with approaches and departures for the different processes, any tool changes, and the transitions from free-space (joint space) motion to cartesian (tool path) motion.
- this process planning pipeline for the management of cleaning tasks inside vehicles may comprise sequencing of robot planning tasks for the completion of a series of cleaning toolpaths. For example, this may include: using planning frameworks; defining planning instructions using YAML files (a human-readable configuration format that may be updated via a user interface to allow tuning by technicians); sparse planning of cartesian trajectories; and implementing an algorithm, such as Dijkstra's shortest-path graph search algorithm, to plan robot trajectories for cartesian paths using a sampler-based approach to more quickly find the configurations for the various poses along the cartesian path, together with edge evaluators.
- the edge evaluator may account for tool speed limits; it is used by software tools to determine whether a move between two consecutive points exceeds the maximum allowed tool speed.
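A minimal sketch of such an edge evaluator, assuming waypoints are 3D tool positions and `dt` is the nominal traversal time for the edge (names and the default limit are illustrative):

```python
import numpy as np

def edge_is_valid(p_from, p_to, dt, max_tool_speed=0.5):
    """Accept the edge only if moving between the two consecutive points in
    time `dt` keeps the Cartesian tool speed under the limit (m/s)."""
    speed = np.linalg.norm(np.asarray(p_to) - np.asarray(p_from)) / dt
    return speed <= max_tool_speed
```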
- the planning instructions program, written in YAML file format, may be configured to define a sequence of free-space and cartesian toolpath planning instructions, thus supporting two kinds of planning instructions.
- the cartesian toolpath instructions may be configured to specify properties, such as tool standoff, approach and retreat distance, transformations, and coordinate frame of reference.
- each instruction may be configured to specify a profile for each step in the planning process.
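The following is a hypothetical planning-instruction file in the YAML style described above, parsed with PyYAML; the keys shown are guesses based on the properties named in the text (tool standoff, approach and retreat distance, frame of reference, per-step profile), not the actual schema:

```python
import yaml  # PyYAML

PLAN_YAML = """
instructions:
  - type: freespace
    profile: fast_approach
  - type: cartesian_toolpath
    toolpath: seat_bottom_vacuum
    frame: vehicle            # coordinate frame of reference
    tool_standoff: 0.02       # meters above the surface
    approach_distance: 0.10
    retreat_distance: 0.10
    profile: vacuum_pass
"""

plan = yaml.safe_load(PLAN_YAML)
for step in plan["instructions"]:
    print(step["type"], step.get("profile"))
```

Because the file is plain text, a technician can retune standoffs or profiles through a user interface without touching planner code.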
- Tool path planners may be configured to facilitate optimal configurations of the manipulators in the motion planning of cartesian tool paths. In some examples, other tool path planners may be used.
- the output of the software tools may be fed as a seed into an optimization pipeline to refine the manipulator free-space motion based on defined constraints. These constraints may be set by acceleration or torque limitations of the system 100, or may be adjusted based on technician and end-user end-of-line feedback on performance. Further performance enhancements may or may not be realized by the inclusion of a reinforcement learning actor as a seed for the optimization-based motion planning.
- Process paths for various processes of interest may have a unique transition to departure, that is coordinated with other paths for completion within a given time frame.
- the process paths may be configured to be completed within 80 seconds, which marks a departure from domains which are not time-constrained.
- the systems and methods described herein may be implemented in an existing car wash workflow, where a vehicle may be configured for conveyance along a belt in a car wash bay. Additionally, or alternatively, the systems and methods are applicable to stationary vehicles in which the various process steps can be moved into place and conducted in sequence.
- the interior of the vehicle may be cleaned according to the systems and methods described herein while the vehicle remains still, either on or off the conveyor belt.
- the systems and methods can be employed in a car wash workflow using a combination of moving and unmoving stations.
- the systems and methods described herein may be generalizable in that there exists the ability to train a model and perform one or more process paths with limited data that is collected.
- the systems and methods are configured to adapt and modify execution of these process paths based on the limited data acquired, which can avoid the need to create a master scan data package for each car configuration.
- FIG. 11 illustrates an image 1100 of motion planning according to an exemplary embodiment.
- FIG. 11 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the motion planning also incorporates the external axes, in this case a linear rail as well as a rotational compound rectangular prism structure to position the base of the robot nearest to the opening from above the vehicle. This concept moves the robots at the same rate as the vehicle moving on the conveyor; thereby, from the robot motion planning perspective, the vehicle appears stationary.
- Commands, such as one or more motion commands, along the rail may then be coordinated as plus or minus the velocity of the belt.
- the belt speed may range from 0 feet per second to 0.5 feet per second.
- the belt speed may comprise 0.3 feet per second.
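Coordinating rail commands with belt motion reduces to a simple velocity superposition; the interface and sign convention below are illustrative assumptions:

```python
FT_TO_M = 0.3048
BELT_SPEED = 0.3 * FT_TO_M   # example belt speed of 0.3 ft/s, in m/s

def rail_velocity_command(planned_velocity, belt_speed=BELT_SPEED, direction=1):
    """Offset the planned rail velocity by the belt velocity (plus or minus,
    depending on rail orientation relative to belt travel)."""
    return planned_velocity + direction * belt_speed
```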
- a motion planning algorithm may be configured to parse a planning program and toolpath files to then plan the corresponding robot trajectories. During operation, the motion planning algorithm may be configured to return a sequence of trajectories for the program files. The sequence of planned robot trajectories may be sent to a 3D visualizer for the robot operating system framework.
- FIG. 11 depicts the door skin spraying process which is launched after configuration scan and subsequent motion planning.
- the motion planning algorithm, in order to plan a new trajectory, may be configured to utilize toolpath processing; cartesian planning; optimization-based free-space planning; and post-processing.
- changes to the original tool path may be applied prior to planning. For example, this may include up-sampling or down-sampling waypoint count, etc.
- for cartesian planning, a planner may be configured to obtain valid robot positions for toolpath points as allowed by the specified tool constraints.
- for free-space planning, this may be used for planning a free-space move from the most recent robot position to the start of the cartesian robot trajectory.
- time values may be computed for trajectory waypoints so as to move the tool at a desired speed while at the same time adhering to the speed limits of the robot.
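A simplified stand-in for this post-processing step might assign waypoint times from the desired tool speed and then stretch any segment that would violate a per-joint velocity limit; the interface is an assumption for illustration:

```python
import numpy as np

def time_parameterize(tool_points, joint_points, tool_speed, joint_vel_limits):
    """Assign a time to each waypoint from the desired tool speed, stretched
    wherever a segment would violate a per-joint velocity limit."""
    times = [0.0]
    for i in range(1, len(tool_points)):
        seg = np.linalg.norm(tool_points[i] - tool_points[i - 1])
        dt = seg / tool_speed                       # time at desired tool speed
        joint_step = np.abs(joint_points[i] - joint_points[i - 1])
        dt = max(dt, float(np.max(joint_step / joint_vel_limits)))  # robot limits
        times.append(times[-1] + dt)
    return np.array(times)
```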
- a robot trajectory executor may be configured to: execute robot free-space and car-cleaning tool path trajectories in the proper order; monitor the vehicle position via a node (instantiated software code that performs a specific function) that tracks via external measuring or belt encoder monitoring, or a combination of both; actively command a robot linear rail and the manipulator when the car is within the working envelope (reach) of the robot; and send the robots back to the start/initial position when the robot has completed the provided trajectories.
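A rough sketch of such an executor loop, with hypothetical robot, rail, and encoder interfaces (none of these names come from the source):

```python
def execute_trajectories(robot, rail, belt_encoder, trajectories, reach=0.9):
    """Run free-space and tool path trajectories in order, tracking the belt
    so the car stays inside the robot's working envelope."""
    for trajectory in trajectories:
        while abs(belt_encoder.vehicle_offset()) > reach:
            rail.track_belt()            # keep pace until the car is in reach
        robot.execute(trajectory)
    robot.move_to_start()                # return to the initial position
```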
- the sequence of planned robot trajectories may be sent to the visualization on the user interface of the client device for viewing, along with the option to halt or approve and send to the industrial robot controller for execution.
- the display of the user interface of the client device may be configured to allow for further inspection of the trajectories, and the instructions for execution may be aborted or may be re-planned after operator intervention.
- the solution enables the storage of successfully executed trajectories, which enables further optimization as the solution matures, up to and including a reinforcement learning implementation for the optimization of industrial application trajectories.
- the database includes successful trajectories and the specific data set that informs the reinforcement learning implementation.
- a programmable logic controller may be configured to coordinate the call of the specific programs for execution and act as a master of the hardware components of the system.
- a device including but not limited to an industrialized personal computer, that contains the developed software may be configured to interact with the PLC to facilitate the coordination of the hardware assets.
- an operator can review the work performed and input via a touch screen of a device, including but not limited to a mobile device, information indicative of whether any rework was performed or assess if regions were skipped that possibly should not have been skipped.
- a skip may refer to a scenario where there was something within the vehicle that did not align between configuration data and master data. This results in collision geometry and may then remove the impacted/obstructed tool paths, which then results in the region not being processed. This data may be utilized for continuous improvement by adding annotations to the MSDP.
- FIG. 12 illustrates images for registration 1210 and obstacle detection 1220 in a vehicle according to an exemplary embodiment.
- FIG. 12 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above. In some examples, this process may be generalized to different trim types.
- FIG. 13 illustrates a decision matrix for a configuration scanner according to an exemplary embodiment.
- FIG. 13 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the Zivid Two™ camera or the Intel RealSense D435 may be used.
- the selection of a particular type of 3D sensor is based on consideration of a plurality of factors, including field of view, scan time, scan quality, performance relative to this application’s registration requirements, and impact on solution deployability at scale. The consideration of these factors was based on quantitative testing on representative surfaces and collected output aligned to a known condition.
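A decision matrix of this kind can be expressed as a weighted score over the named factors; the candidate names, weights, and scores below are illustrative placeholders, not the actual evaluation data:

```python
FACTORS = {"field_of_view": 0.25, "scan_time": 0.25, "scan_quality": 0.20,
           "registration_performance": 0.20, "deployability": 0.10}

CANDIDATES = {  # 1-5 scores per factor, purely illustrative
    "structured_light_3d": {"field_of_view": 3, "scan_time": 2, "scan_quality": 5,
                            "registration_performance": 4, "deployability": 3},
    "stereo_depth":        {"field_of_view": 5, "scan_time": 5, "scan_quality": 3,
                            "registration_performance": 4, "deployability": 5},
}

def weighted_score(scores):
    return sum(FACTORS[factor] * scores[factor] for factor in FACTORS)

best = max(CANDIDATES, key=lambda name: weighted_score(CANDIDATES[name]))
print(best, round(weighted_score(CANDIDATES[best]), 2))
```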
- FIG. 14 illustrates example master scan data collected according to an exemplary embodiment.
- FIG. 14 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- this data 1410, 1420, 1430, 1440, 1450, 1460, each depicted as master scan data, may be collected from a variety of types of vehicles, including but not limited to Hyundai, Nissan, Toyota, Chrysler, and Dodge cars, SUVs, or trucks. It is understood that master scan data may be collected from other vehicles and models, and as such collection is not limited to only these types of vehicles.
- FIG. 15 illustrates a method for seat registration according to an exemplary embodiment.
- the method may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the method may include acquiring configuration scans of a doorjamb and interior of the vehicle.
- the method may include loading seat master models.
- the method may include registering master seat bottom to configuration scans.
- the method may include registering master seat backrest to configuration scans.
- FIGs. 16A-16H illustrate images for pre-registered and registered interior vehicle portions according to an exemplary embodiment.
- FIGs. 16A-16H may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- FIGs. 16A-16B may be read in conjunction with block 1500 of FIG. 15.
- FIGs. 16C-16D may be read in conjunction with block 1510 of FIG. 15.
- FIGs. 16E-16F may be read in conjunction with block 1520 of FIG. 15.
- FIGs. 16G-16H may be read in conjunction with block 1530 of FIG. 15.
- the points 1612 within the circled region may correspond to a seat bottom of a vehicle.
- the points 1614 within the circled region may correspond to a seat backrest of the vehicle.
- the pre-registered master seat backrest 1616 may be illustrated.
- the preregistered master seat bottom 1618 may be illustrated.
- the pre-registered master seat bottom 1622 may be illustrated.
- the registered master seat bottom model 1624 may be illustrated. The overlap of points to mesh in the images may indicate similarity.
- the pre-registered master seat backrest 1626 may be illustrated in green, and the registered model 1628 in gray.
- the registered master seat backrest model 1632 may be illustrated.
- the overlap of points to mesh in the images may indicate similarity.
- FIG. 17 illustrates an image 1700 of a tooling attachment 1710 according to an exemplary embodiment.
- FIG. 17 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the tooling attachment may be coupled to any of the robotic arms 110.
- Tooling attachment(s) 1710 can be designed for facilitating cleaning in tight spots.
- tooling attachments 1710 can be generally longer than conventional attachments, as the wrist of a conventional robot may not fit into tight areas disposed at the floor of the car.
- making the tooling attachment 1710 too long may make it difficult to reach into smaller spaces. Therefore, a sufficient extension length and angle for a component, including but not limited to a vacuum, is needed so that it is able to fit in a space between the seat and the glovebox, for example, in the vehicle interior.
- FIG. 18 illustrates an image of a component against a portion of an interior of a vehicle according to an exemplary embodiment.
- FIG. 18 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- this image 1800 illustrates a vacuum against the back of a seat in a scanned car.
- FIG. 19 illustrates an image 1900 of a robot in motion according to an exemplary embodiment.
- FIG. 19 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- this image 1900 illustrates a robotic arm 110 with a scanned model depicting collision in a desired move.
- FIG. 20 illustrates a method 2000.
- the method 2000 may include an automated method for cleaning an interior of a vehicle.
- the method 2000 and FIG. 20 may reference and include any components and functions described above with respect to any of the figures.
- the method 2000 may include determining a position of the vehicle with respect to one or more robotic arms that are positioned exterior to the vehicle.
- the method 2000 may include scanning a configuration of the vehicle.
- this step of the method 2000 may yield a configuration scan.
- scanning the configuration of the vehicle may include capturing a plurality of images of the vehicle.
- the configuration scan may capture exterior images and interior images of the vehicle.
- the method 2000 may also include acquiring information about the vehicle by retrieving master data from a database.
- the method 2000 may also include aligning data from the configuration scan with the master data.
- the method 2000 may also include detecting obstacles based on discrepancies between the configuration scan and the aligned master data.
- the plurality of tool paths may be created to avoid the detected obstacles by a predetermined distance.
- the method 2000 may include identifying any number of surfaces to be cleaned. In some examples, these surfaces may include interior vehicle surfaces.
- the method 2000 may include creating a plurality of tool paths in the MSDP, in which robot trajectories are generated for execution of the MSDP-created tool paths by the one or more robotic arms to clean the identified surfaces.
- the tool paths may be created as part of the master data package.
- a tool path(s) may be applied to meshes with appropriate tool information, such as stand-off and allowable work and travel angles. This information may be included in the data scheme that is associated with each segmented region/component.
- creating the plurality of tool paths may include implementing an algorithm that is configured to plan trajectories for the one or more robotic arms using a sampler-based approach.
- the method 2000 may include creating optimized tool paths by running the created plurality of tool paths through a free space motion system and/or cartesian motion planning system.
- the method 2000 may include controlling the one or more robotic arms to move along the created tool paths and execute a cleaning operation in the interior of the vehicle.
- the one or more robotic arms may move along the motion planned tool paths to execute a cleaning operation relative to, including but not limited to, the identified surfaces.
- the tool paths may be clipped/truncated and take into account, for example, steering wheel position, etc. in motion planning or robot trajectory planning. Once the trajectories are planned, they may be sent for execution in association with the cleaning operation.
- the method 2000 may further include optimizing a sequencing of a plurality of tasks relative to a time permitted for each task of the plurality of tasks.
- FIG. 21 illustrates a method 2100.
- the method 2100 may include an automated method for cleaning an interior of a vehicle.
- the method 2100 and FIG. 21 may reference and include any components and functions described above with respect to any of the figures.
- the method 2100 may include storing a plurality of images for a plurality of vehicles in a database to create a master data package.
- the master data package may include vehicle make and model data, vehicle year data, vehicle class data, and/or any combination thereof.
- the method 2100 may include acquiring vehicle specific data from the master data package.
- the method 2100 may include scanning a configuration of the vehicle.
- the method 2100 may include aligning the acquired vehicle specific data with the scanned configuration.
- the method 2100 may include creating a process plan for execution of a cleaning operation.
- the method 2100 may include sending instructions to a robot to execute a plurality of cleaning operations based on the created process plan.
- the method 2100 may include executing the cleaning operations according to the instructions.
- a cycle time, which may be measured between a start of the scanning of the configuration of the vehicle and an end of the execution of the plurality of cleaning operations, may be less than a predetermined time duration.
- the cycle time may be less than five minutes.
- the method 2100 may also include providing feedback on a quality of the executed plurality of cleaning operations.
- FIG. 22 illustrates a diagram of vehicle zones for cleaning an interior of a vehicle.
- FIG. 22 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- a plurality of zones may be used.
- the plurality of zones may include Zone 2 denoted 2210, Zone 3 denoted 2220, and Zone 4 denoted 2230.
- Zone 2 2210 and Zone 4 2230 may each comprise human work zones relative to a vehicle, whereas Zone 3 2220 may comprise a robotic work and safe zone.
- the system 100 may utilize a conveyor belt 117 system which indexes forward to a fixed position for the robotic arms 110 to conduct their work in the vehicle, in this case a passenger car.
- the tasks assigned to the robotic arms 110 in Zone 3 2220 may include, but are not limited to, side window cleaning, door panel cleaning, door jamb and frame blowing & drying, and seat and floor vacuuming. To meet overall cycle time requirements, these tasks may be achieved by the robotic arms 110 in under two minutes, a substantial improvement over previous robotic automation designs. Moreover, this cycle time in Zone 3 2220 leaves ample opportunity for the human steps to be completed in Zones 2 and 4 2210, 2230 in a timely, cost-effective manner.
- FIG. 23A illustrates a diagram 2300 of robots for cleaning an interior of a vehicle according to an exemplary embodiment.
- the system 100 may include any number of robotic arms 110 that are part of one or more interconnected robot systems.
- the system 100 employs a networked configuration of robotic arms 110.
- a plurality of robotic arms 110 may be used by the system 100.
- eight robotic arms 110, each capable of handling more than 5-kilogram payloads, may be attached horizontally to a plurality of vertical cantilever structures 2310, including but not limited to four such structures.
- Each arm 110 will have 4 to 7 degrees of freedom, a horizontal reach of at least 900 mm, and a vertical reach of 1,650 mm.
- the rate of motion of the slowest arm linkage may be 300+ degrees per second.
- the 6-axis construction and small footprint may allow for easy entry into the vehicle through open doors without need for minimization.
- other configurations can use 4, 5, 6, or 7 robotic arms 110.
- Each pair of robotic arms 110 may be configured to employ a sliding cantilever structure 2310 allowing the arm-pair to be positioned horizontal to the floor or perpendicular to it, or controlled to any position in-between.
- Each cantilever structure 2310 is fastened to and supports a pair of robotic arms 110, and is placed next to the vehicle on either side, with two cantilever structures 2310 per side of the vehicle, next to a front and rear opening for a total of four robotic arm 110 pairs.
- the cantilever structures 2310 are attached to vertical tracks that allow the robotic arms 110 to rapidly move up and down to reach the optimal height for entering the vehicle and reaching interior surfaces of the vehicle interiors through both the front and back entry doors simultaneously.
- the cantilever structures 2310 are fastened to a deck on either side of the conveyor 117.
- the system 100 may employ a fifth and/or sixth cantilever structure 2310, each with a robotic arm pair 110, placed toward the rear of the vehicle on a linearly actuated movable platform, so that it can clean the rear portion of the vehicle, trunk and/or hatch, at the same time as the primary interior cleaning operations.
- FIG. 23B illustrates a diagram 2300 of a plan view of robots for cleaning an interior of a vehicle according to an exemplary embodiment.
- the cantilever structures 2310 are fastened to a deck on either side of the conveyor 117 (not shown).
- the cantilever structures 2310 can be attached to one or two pivot points below and/or above the end of a track system, controllably allowing the robotic arm pair 110 to swing freely in an arc of up to 90 degrees, 45 degrees to either side of the line perpendicular to the edge of the conveyor 117 (not shown).
- one, or both, decks on either side of the vehicle can be fastened to a linearly actuated movable platform, allowing the entire cantilever structure 2310 to move perpendicular to the conveyor belt 117 by up to 8 feet in both directions.
- FIGs. 23 A-23B may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- the interconnected robot system allows for rapid actuation and end effector placement of the robotic arms 110.
- the controllers 113, 115 may be configured for high-speed pick and place operations where speed and accuracy are paramount to success of cleaning the interior vehicle by system 100.
- Current processing speeds allow image-capture-to-robot-arm movements in as low as 500 milliseconds and control of 1,400 actuated work steps per hour up to 1 meter apart.
- Each pair of robotic arms 110 is joined in a horizontal position, with the two robotic arms 110 reaching in the same direction, one lying horizontally atop the other.
- the robotic arms 110 may work in tandem, each armed with a single- or double-tool end of arm effector.
- Duties for each robotic arm 110 can be programmed by the interconnected robot system and typically assign one robotic arm 110 to focus on upper portions of the vehicle interior and the other robotic arm 110 the lower portions of the vehicle interior.
- one robotic arm 110 may address window cleaning, followed by cleaning interior door panels, drying the door frame and cleaning the jamb - while the other robotic arm 110 may simultaneously focus on upholstery and floor cleaning.
- the tooling has been designed so that multiple cleaning effectors are mounted on the robotic arm 110 at once so that the end effectors do not need to be physically removed from the robotic arm 110 to go on to the next task but, instead, can be rotated into and out of place while attached to the robotic arm 110.
- Coordination and managing collision avoidance between the robotic arms 110, end effectors, and elements of the vehicle may be conducted by the interconnected robot system.
- Each interconnected robot system can work in parallel to the others, so that each system and the associated robotic arms 110 are working simultaneously to clean approximately one-quarter of the vehicle interior.
- the end effectors of the robotic arms 110 may be configured specifically for the primary robotic vehicle interior cleaning tasks and their throughput requirements. As the robotic arms 110 and end effectors carry out their functions, they may be configured to detect and collect data through a variety of sensors, including motion, force, vision, torque, and/or other sensors attached to them. The purpose of this data is not only to determine cleaning efficacy but also to refine the 3D maps of the interior for path planning optimization. These data are fed to the path planning process, which controls the movements of the robotic arms 110 and tools. In some examples, the tasking of tools may be first handled in the MSDP and tool path creation process, where a tool and its requirements are assigned to specific paths.
- the tasking and coordination at runtime is handled by a state machine, such as state machine 118 as previously explained above.
- the aggregation of data from multiple vehicles will allow the software and its algorithms to determine optimal motion pathways and enable faster responses to non-standard items within the vehicle (such as baby seats, aftermarket items, etc.), further reducing cycle times and speeding the process of cleaning vehicle interiors.
- FIG. 24 illustrates a schematic 2400 of a vehicle transiting through stages of interior cleaning according to an exemplary embodiment.
- FIG. 24 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above. While a pickup truck is depicted, it is understood that any type of vehicle may be applicable, and as such is not limited to this type of vehicle. A plurality of robotic arms, such as arms 110, may be included, as previously explained.
- the schematic 2400 is illustrated to depict the entry of the vehicle relative to, for example, the stages B and C of the system 100.
- FIG. 25 illustrates a schematic 2500 of a vehicle transiting through stages of interior cleaning according to another exemplary embodiment.
- FIG. 25 may incorporate and reference any of the components and functions as explained above with respect to any of the figures described above.
- a plurality of robotic arms, such as arms 110, may be included, as previously explained.
- the schematic 2500 is illustrated to depict the location of the vehicle relative to the stages B and C, and in particular a close view of the arms in stage C and partially stage B of the system 100.
- the workflow of the systems and methods described herein has been designed, configured, and optimized to minimize cost by retaining for humans those activities that are least cost-effective to automate - such as placement of the vehicle on the conveyor, opening doors, removing and replacing removable items out of and into the vehicle, handling one-off items, and conducting final quality control.
- human workers may be responsible for cleaning rear- and forward-facing window glass interior surfaces, cleaning the trunk area, and removing, cleaning, and replacing floor mats.
- some zones include human workers present on the conveyor system, whereas other zones include the robot systems and are designed to prevent human presence to reduce the risk of injury.
- the systems and methods described herein focus the robotic activities on those prone to poor quality, cumbersome to complete, and/or those posing the most risk to worker comfort, safety, and vehicle damage.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CA3235422A CA3235422A1 (en) | 2021-10-18 | 2022-10-18 | Systems and methods for controlled cleaning of vehicles |
| EP22884376.9A EP4419399A4 (en) | 2021-10-18 | 2022-10-18 | SYSTEMS AND METHODS FOR CONTROLLED CLEANING OF VEHICLES |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202163256763P | 2021-10-18 | 2021-10-18 | |
| US63/256,763 | 2021-10-18 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2023069462A2 true WO2023069462A2 (en) | 2023-04-27 |
| WO2023069462A3 WO2023069462A3 (en) | 2023-07-13 |
Family
ID=85981242
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2022/047054 Ceased WO2023069462A2 (en) | 2021-10-18 | 2022-10-18 | Systems and methods for controlled cleaning of vehicles |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20230123504A1 (en) |
| EP (1) | EP4419399A4 (en) |
| CA (1) | CA3235422A1 (en) |
| WO (1) | WO2023069462A2 (en) |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11504845B2 (en) * | 2019-08-14 | 2022-11-22 | Google Llc | Reconfigurable robotic manufacturing cells |
| EP4124417A1 (en) * | 2021-07-30 | 2023-02-01 | Siemens Aktiengesellschaft | Method for calibration of a robot |
| US20240139969A1 (en) * | 2022-10-31 | 2024-05-02 | Gm Cruise Holdings Llc | Robotic arm localization |
| KR20250031886A (en) * | 2023-08-29 | 2025-03-07 | 현대자동차주식회사 | 3d reconstruction system and 3d reconstruction method thereof |
| CN120395936B (en) * | 2025-07-03 | 2025-10-14 | 北京炎凌嘉业智能科技股份有限公司 | Intelligent spraying composite robot system based on large language model and method thereof |
Family Cites Families (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE10110373C2 (en) * | 2001-03-03 | 2003-03-06 | Wolfgang Daum | Method and device for cleaning the interior of automobiles |
| US20110155192A1 (en) * | 2008-02-27 | 2011-06-30 | Nadeem Ahmad | System and apparatus for automatic built-in vehicle washing and other operations |
| CN105809655B (en) * | 2014-12-30 | 2021-06-29 | 清华大学 | Vehicle inspection method and system |
| US20160335727A1 (en) * | 2015-05-12 | 2016-11-17 | Raymond Jimenez | System and method of real-time imaging and analysis of real-world objects |
| US10173647B2 (en) * | 2016-03-24 | 2019-01-08 | Ford Global Technologies, Llc | Systems and methods for efficient automatic vehicle washing |
| US10029654B2 (en) * | 2016-04-13 | 2018-07-24 | Ford Global Technologies, Llc | Enhanced vehicle cleaning |
| CN111212783A (en) * | 2017-11-15 | 2020-05-29 | 宝马股份公司 | Unmanned aerial vehicle, method and system for providing cleaning services for a vehicle |
| US11106927B2 (en) * | 2017-12-27 | 2021-08-31 | Direct Current Capital LLC | Method for monitoring an interior state of an autonomous vehicle |
| DE102018108343A1 (en) * | 2018-04-09 | 2019-10-10 | Washtec Holding Gmbh | Method for automatic recognition of a loading area |
| US11235471B2 (en) * | 2018-05-22 | 2022-02-01 | Uatc, Llc | Automated cleaning systems for autonomous vehicles |
| CN108773360A (en) * | 2018-07-25 | 2018-11-09 | 珠海格力智能装备有限公司 | Vehicle washing device and vehicle washing method |
| KR102569903B1 (en) * | 2018-12-13 | 2023-08-23 | 현대자동차주식회사 | Car wash apparatus and method of controlling the same |
| DE102018222640A1 (en) * | 2018-12-20 | 2020-06-25 | Volkswagen Aktiengesellschaft | Method for operating an autonomous fleet of vehicles and service module for an autonomous fleet vehicle |
| CN113427490A (en) * | 2021-06-11 | 2021-09-24 | 大连海事大学 | Visual long-range intelligent epidemic prevention disinfection robot of disinfection operation |
- 2022-10-18 US US17/968,709 patent/US20230123504A1/en active Pending
- 2022-10-18 EP EP22884376.9A patent/EP4419399A4/en active Pending
- 2022-10-18 WO PCT/US2022/047054 patent/WO2023069462A2/en not_active Ceased
- 2022-10-18 CA CA3235422A patent/CA3235422A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| CA3235422A1 (en) | 2023-04-27 |
| EP4419399A4 (en) | 2025-08-20 |
| EP4419399A2 (en) | 2024-08-28 |
| US20230123504A1 (en) | 2023-04-20 |
| WO2023069462A3 (en) | 2023-07-13 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22884376; Country of ref document: EP; Kind code of ref document: A2 |
| | WWE | Wipo information: entry into national phase | Ref document number: 3235422; Country of ref document: CA |
| | WWE | Wipo information: entry into national phase | Ref document number: 2022884376; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2022884376; Country of ref document: EP; Effective date: 20240521 |