
WO2025079150A1 - Control system, control method, and computer-readable recording medium - Google Patents

Control system, control method, and computer-readable recording medium

Info

Publication number
WO2025079150A1
WO2025079150A1 (PCT/JP2023/036789)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
control
work
information
working device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/036789
Other languages
English (en)
Japanese (ja)
Inventor
峰斗 佐藤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to PCT/JP2023/036789 priority Critical patent/WO2025079150A1/fr
Publication of WO2025079150A1 publication Critical patent/WO2025079150A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06Safety devices
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/20Drives; Control devices
    • EFIXED CONSTRUCTIONS
    • E02HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
    • E02FDREDGING; SOIL-SHIFTING
    • E02F9/00Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
    • E02F9/24Safety devices, e.g. for preventing overload
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40Control within particular dimensions
    • G05D1/43Control of position or course in two dimensions

Definitions

  • the second control unit 202 may be any unit that performs a predetermined control based on the detection result by the second detection unit 201, and may be installed, for example, at the location where the work device is working, or may be installed at a location different from the location where the work device is working.
  • the second control unit 202 may be, for example, a remote control unit 22 described below.
  • FIG. 2 is a diagram showing an example of the configuration of a control system.
  • a working apparatus 1 is connected to a remote control device 2 and a data storage unit D1 via a communication means N1.
  • the working apparatus 1 includes a controlled unit 11, a control unit 12, a virtual environment unit 14, and a detection unit 15.
  • the working apparatus 1 may further include an observation unit 13.
  • the observation unit 13 may be installed outside the working apparatus 1.
  • the observation unit 13 may be an observation device separate from the working apparatus 1.
  • the control unit 12 is an example of the first control unit 102 described above.
  • the detection unit 15 is an example of the first detection unit 101 described above.
  • the control unit 12 generates and outputs a control signal for the movable part (actuator) of the controlled unit 11 in order to control the position and posture of the controlled unit 11. If the actuator is controlled by an electrical control signal, the control unit 12 may directly output the control signal.
  • the control unit 12 outputs a predetermined signal, specifically a control target value, to a control unit (not shown) attached to the actuator, for example. If the actuator cannot be controlled electrically and is driven, for example, hydraulically, the control unit 12 outputs a predetermined signal to a control unit (not shown) that controls the hydraulics.
  • the control unit 12 may also output a signal by receiving information from a remote control unit 22, which will be described later.
  • the range of the work environment that can be observed is determined by conditions such as the installation position and installation direction (angle) of the observation unit 13, and the performance and parameters specific to the observation unit.
  • the installation of the observation unit 13 can be appropriately determined based on the type and performance of the observation unit 13, the specifications of the work device 1 to be observed (e.g., type, size, range of motion, etc.), the work content, and the surrounding environment, and is not limited to this embodiment. It is assumed that the installation position and direction of the observation unit 13 are known in the work device 1 coordinate system.
  • the type refers to the measurement method; examples include a camera, a video camera, a LiDAR, and a radar. Examples of performance include the field of view (FOV), the maximum measurement distance, and the resolution.
  • the virtual environment unit 14 is an environment that simulates the controlled unit 11 through calculations.
  • the virtual environment unit 14 is an environment that is reproduced by mathematically expressing, in other words, modeling, the dynamics of the controlled unit 11 and the surrounding real environment acquired by the observation unit 13.
  • a simulator or a digital twin can be used as the virtual environment unit 14.
  • the moving state of the actual controlled unit 11, that is, the movement (dynamics) is reproduced in real time.
  • reproducing the movement means that it is sufficient to express, through calculations, the same position and angle displacement as the real controlled unit 11; it is not necessary to reproduce the mechanism or internal structure of the moving parts (actuators) that constitute the controlled unit 11.
  • examples of the sensors include tilt sensors, gyro sensors, acceleration sensors, and external sensors such as encoders and hydraulic sensors.
  • the installation positions and number of sensors can be designed appropriately for each task of the target working device 1.
  • the position and orientation data corresponds to the temporal movement (dynamics or motion) of the controlled unit 11.
  • the electrical signal information as position and orientation data is information acquired in response to the movement of the controlled unit 11 within a certain range of error and delay time.
  • the temporal frequency (sampling rate) and spatial resolution (precision) of the electrical signal are not particularly limited, and can be determined appropriately depending on the size and characteristics of the working device 1, the work content, etc.
  • each processing unit shown in the following configuration does not need to be physically the same, i.e., incorporated in the same control system, but only needs to be connected using communication means, and may be set according to the environment and the target work device 1.
  • the remote control device 2 targets the work environment of the work device 1 and an area that at least includes the work device 1; this area is hereinafter referred to as the control environment.
  • the control environment represents a three-dimensional space that is the same as the work environment of the work device 1, or a wider one.
  • a coordinate system is set for this control environment at a certain reference point.
  • the coordinate system may be a coordinate system based on a specific position in the area where the work device 1 is located, or a specific geographic coordinate system (such as the UTM coordinate system), and may be selected appropriately according to the control environment and the work device 1.
  • the integration unit 21 acquires information about the reproduced controlled unit 11 and the work environment from the virtual environment unit 14 of the working device 1 via the communication means N1, and outputs integrated information generated by the integration process to the remote virtual environment unit 24.
  • the details of the integration process will be described later, but for example, it includes a coordinate conversion process that converts the information about the controlled unit 11 and the work environment acquired by the observation unit 13, which are obtained as information in the working device 1 coordinate system, and the obstacle candidate information generated by the virtual environment unit 14, into the on-site coordinate system.
  • this information is collectively referred to as integrated information.
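The coordinate conversion step of the integration process can be sketched as follows. This is a minimal Python illustration assuming a known rigid transform (rotation `R`, translation `t`) between a work-device coordinate system and the on-site coordinate system; the function name and numeric values are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def device_to_site(points_dev, R, t):
    """Convert points from a work-device coordinate system to the
    on-site coordinate system via a rigid transform: p_site = R @ p_dev + t."""
    return np.asarray(points_dev, dtype=float) @ R.T + t

# Illustrative setup: device frame rotated 90 degrees about the z-axis
# and offset by (10, 5, 0) within the on-site frame.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([10.0, 5.0, 0.0])

site_pts = device_to_site([[1.0, 0.0, 0.0]], R, t)  # approx. [10, 6, 0]
```

Once all information from the virtual environment unit is expressed in one on-site frame, the remaining integration steps can operate on a common coordinate system.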
  • the remote control unit 22 outputs information related to the control of the controlled unit 11, specifically, information related to future control plans and changes to the plans, to the control unit 12 of the working device 1 via the communication means N1.
  • the remote control unit 22 does not output information for directly controlling the actuators of the controlled unit 11, as the control unit 12 does, but outputs information for the control unit 12 to control.
  • the information output by the remote control unit 22 is input to the remote virtual environment unit 24.
  • the remote virtual environment unit 24 is an environment that simulates the controlled unit 11 through calculations, similar to the virtual environment unit 14. However, the information input is the information output by the integration unit 21 and the information output by the remote control unit 22.
  • like the virtual environment unit 14, the remote virtual environment unit 24 is an environment that reproduces the dynamics of the controlled unit 11 and the surrounding real environment acquired by the observation unit 13 by expressing them mathematically, that is, by modeling them; for example, a simulator or a digital twin can be used.
  • the method of configuring the remote virtual environment unit 24 may use the same means as the virtual environment unit 14, but as described above, it is different at least in that the state information of the controlled unit 11 is not directly input, but is information output by the integration unit 21, and that the control plan information of the controlled unit 11 output by the remote control unit 22 is input. Therefore, the remote virtual environment unit 24 outputs operation candidate information that reproduces the state of the controlled unit 11 and the working environment in the site coordinate system, and the state of the controlled unit 11 based on the future control plan and plan changes output by the remote control unit 22.
  • the remote virtual environment unit 24 may differ from the virtual environment unit 14 in the following two points regarding its configuration method and the method of reproducing the controlled unit 11 and the work environment.
  • the first point is about the reproduction of the shape of the controlled unit 11.
  • the controlled unit 11 reproduced by the remote virtual environment unit 24 may be a model in which the external shape of the controlled unit 11, that is, the size and three-dimensional shape, is reproduced to be the same as the actual controlled unit 11, or within a certain error range or scale.
  • This model of the controlled unit 11 can be constructed by a polygon or a set of polygons (i.e., a mesh) based on, for example, a design drawing or CAD data of the controlled unit 11, image data of the controlled unit 11, etc.
  • when the model of the controlled unit 11 is represented by polygons, it is approximated according to the shape, size, density, etc. of the polygons. However, the degree of approximation can be determined appropriately according to the size of the controlled unit 11 to be controlled.
  • when the model of the controlled unit 11 is represented by polygons, the model represents only the three-dimensional shape, so there is no need to reproduce the surface material, texture, pattern, etc.
  • the method of constructing the model of the controlled part 11 is not limited to the above-mentioned method.
  • the information required to reproduce the controlled part 11 may be stored in the data storage unit D1.
  • the second point is that, like the virtual environment unit 14, in addition to reproducing the movable state, i.e., movement (dynamics), of the controlled unit 11, as described above, the state of the controlled unit 11 based on the future control plan or changes to the plan output by the remote control unit 22 is reproduced.
  • multiple models of the controlled unit 11 may be set, and each may reproduce the current state of the controlled unit 11 and the state based on the control plan.
  • the state based on the control plan is not limited to one type or one time range, and for example, a simulation of the controlled unit 11 may be performed.
  • the remote detection unit 25 determines, by remote detection processing based on information output by the remote virtual environment unit 24, that the working device 1 has approached an object other than an object to which approach is permitted (an object that is prohibited from approach) in the working environment of the working device 1 and in a controlled environment that at least includes the working device 1, and outputs a determination value.
  • the determination value is at least a true/false value indicating whether or not a certain predetermined reference value is satisfied.
  • the control unit 12 controls the controlled unit 11 based on this determination value. Details of the detection processing and the differences from the remote detection unit 25 will be described later.
  • the data storage unit D1 stores information necessary to reproduce the controlled unit 11 and the work environment in the virtual environment unit 14 and the remote virtual environment unit 24. Specifically, in addition to information about the configuration, dimensions, and range of motion of the controlled unit 11, the data storage unit D1 may store information about the installation position and direction of the observation unit 13, its inherent performance and parameters such as observation range and resolution, and information about the work device 1 coordinate system and the site coordinate system.
  • the physical location and connection of the data storage unit D1 is not limited, and it may be attached to the work device 1, the remote control device 2, or both, for example.
  • the communication means N1 connects at least the work device 1, the remote control device 2, and the data storage unit D1 via a network.
  • a terminal connected by a worker (user) and a monitoring device may also be connected.
  • Specific examples of the network include, but are not limited to, a wireless LAN (local area network), a wired LAN, a WAN (wide area network), a public line network, a mobile data communication network (3G, LTE, 4G, 5G, local 5G, etc.), Wi-Fi (Wireless Fidelity), or a combination of these networks.
  • FIG. 3 is a flowchart showing an example of a control procedure performed by the control system.
  • the control of the control system 100 will be described below with reference to FIG. 3.
  • FIG. 2 will be referred to as appropriate.
  • a control method is implemented by operating the control system. Therefore, the description of the control method in the first embodiment will be replaced with the description of the operation of the control system below.
  • steps S101 to S106 correspond to the processing of the working device 1, and steps S201 to S206 correspond to the processing of the remote control device 2.
  • the virtual environment unit 14 acquires state information of the controlled unit 11 in the real environment and environmental information about the work environment acquired by the observation unit 13 (step S101).
  • the virtual environment unit 14 sets up a virtual environment based on the acquired information on the real environment, i.e., the state information and environmental information of the controlled unit 11, and the model information of the data storage unit (step S102). Specifically, in the coordinate system of the working device 1, based on the position and orientation information of the controlled unit 11 and the observation unit 13, the model that reproduces the movement of the controlled unit 11 and the environmental information are reflected in the virtual environment.
  • the virtual environment unit 14 performs a process to remove the information about the controlled unit 11 from the environmental information, and generates obstacle candidate information (step S103).
  • the space occupied (occluded) by the controlled unit 11 can be calculated based on the surface shape of the controlled unit 11.
  • the virtual environment unit 14 of the working device 1 reproduces only the geometric configuration of the actual controlled unit 11. The surface shape of the controlled unit 11 is therefore represented by shape information that approximates the geometric configuration and its constituent elements with basic shapes, such as the length, width, and height when approximated by a rectangular parallelepiped, or the diameter and height when approximated by a cylinder.
  • This information may be stored in advance in the data storage unit D1. From the above, the area of the environmental information occupied by the controlled unit 11 can be calculated, and obstacle candidate information can be obtained by subtracting this area from the environmental information.
  • the process of excluding specific regions from such three-dimensional information may use existing methods as appropriate. This may be the method of approximating the three-dimensional region occupied by the controlled unit 11 with a basic shape and subtracting it, as described above, or another method. For example, there is a method of expressing the three-dimensional region as a regular lattice (voxel), and expressing the presence or absence of an object at each lattice point with information of 0/1.
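The 0/1 voxel representation and the subtraction of the region occupied by the controlled unit can be sketched as follows. This is a minimal Python illustration; the grid size, voxel indices, and function name are illustrative assumptions rather than the patent's implementation.

```python
import numpy as np

def obstacle_candidates(env_occ, unit_occ):
    """Given a 0/1 voxel grid of the observed environment and a 0/1 grid
    of the region occupied by the controlled unit (e.g. a box
    approximation), remove the unit's voxels, leaving obstacle candidates."""
    return np.logical_and(env_occ, np.logical_not(unit_occ)).astype(np.uint8)

# Illustrative 10x10x10 occupancy grid.
env = np.zeros((10, 10, 10), dtype=np.uint8)
env[2:8, 2:8, 0:4] = 1           # region observed as occupied
unit = np.zeros_like(env)
unit[2:5, 2:5, 0:3] = 1          # box approximation of the controlled unit

cand = obstacle_candidates(env, unit)
```

Subtracting the unit's own occupancy in this way prevents the controlled unit from being detected as an obstacle to itself.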
  • the detection unit 15 acquires information on the reproduction of the controlled unit 11 and obstacle candidate information in the virtual environment reproduced by the virtual environment unit 14, and determines through detection processing that the controlled unit 11 has approached or entered (comes into) an obstacle area, and outputs the determination value (step S104).
  • the model of the controlled unit 11 in the remote virtual environment unit 24 may be modeled in terms of its external shape, i.e., its surface shape, in addition to the geometric configuration information. Therefore, although the nearest-neighbor distance above was calculated for a specific point of the controlled unit 11, any point on the surface of the model of the controlled unit 11 can be selected as a specific point.
  • similar to the determination process of the detection unit 15, the remote detection unit 25 can determine that there is no risk of contact if the nearest distance is greater than a certain set distance (threshold), i.e., if the criterion is met. In that case, the determination value satisfies the criterion, and the working device 1 can continue to operate (step S205). On the other hand, if the nearest distance is smaller than the threshold (if the criterion is not met), the remote detection unit 25 determines that there is a risk of contact, and outputs a warning (alert) and an instruction to the remote control unit 22 (step S206).
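The nearest-distance judgment above can be sketched as follows. This is a minimal Python illustration under stated assumptions: the specific points, obstacle points, threshold, and function name are hypothetical, not taken from the disclosure.

```python
import numpy as np

def approach_ok(unit_points, obstacle_points, threshold):
    """Return (criterion_met, nearest). nearest is the smallest distance
    between any specific point on the controlled-unit model and any
    obstacle-candidate point; the criterion is met (no contact risk)
    when nearest exceeds the threshold."""
    u = np.asarray(unit_points, dtype=float)
    o = np.asarray(obstacle_points, dtype=float)
    d = np.linalg.norm(u[:, None, :] - o[None, :, :], axis=2)
    nearest = float(d.min())
    return nearest > threshold, nearest

unit_pts = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]   # specific points (illustrative)
obs_pts = [[3.0, 0.0, 0.0], [5.0, 4.0, 0.0]]    # obstacle candidates
ok, nearest = approach_ok(unit_pts, obs_pts, threshold=1.5)
# nearest is 2.0 here, so the criterion is met and work may continue
```

A multi-stage judgment, as described below, would simply compare `nearest` against several thresholds instead of one.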
  • the instruction to the remote control unit 22 can be, for example, a temporary stop or emergency stop of the operation of the working device 1, or a slowdown of the operating speed, that is, a change in the control parameters, similar to the instruction to the control unit 12 in the case of processing by the detection unit 15. As a function unique to the remote control unit 22, however, an avoidance operation by changing the control plan is also possible.
  • the method of changing the control plan can be a method of changing the target value and re-planning, or a method of generating a plan to avoid the area using the position information of the obstacle as a potential, such as reactive control, but is not limited to these.
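As one concrete illustration of the reactive, potential-based plan generation mentioned above, the following Python sketch computes a single avoidance step: an attractive force toward the goal plus a repulsive force from obstacle points within an influence radius. The gains, radius, and function name are assumptions for illustration only.

```python
import numpy as np

def avoidance_step(pos, goal, obstacles, k_att=1.0, k_rep=1.0,
                   influence=2.0, step=0.1):
    """One step of a simple artificial-potential-field re-planner:
    attract toward the goal, repel from obstacles within `influence`."""
    pos = np.asarray(pos, dtype=float)
    force = k_att * (np.asarray(goal, dtype=float) - pos)
    for ob in np.asarray(obstacles, dtype=float):
        diff = pos - ob
        d = np.linalg.norm(diff)
        if 0.0 < d < influence:
            # Standard repulsive-potential gradient term.
            force += k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    # Move a fixed step length along the net force direction.
    return pos + step * force / (np.linalg.norm(force) + 1e-9)

# Obstacle slightly above the straight-line path to the goal,
# so the step drifts below the line while still advancing.
new_pos = avoidance_step([0.0, 0.0], goal=[5.0, 0.0],
                         obstacles=[[1.0, 0.5]])
```

More elaborate planners (e.g. re-planning to a new target value, as the text also mentions) would replace this per-step rule with a full trajectory search.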
  • the threshold value set for judgment is not limited to one, i.e., one-stage judgment, as with the detection unit 15. Therefore, by setting multiple threshold values, it is possible to perform multi-stage judgment.
  • the remote detection unit 25 differs from the detection unit 15 in that it can make a judgment before the actual controlled unit 11 starts operating, which can prevent a decrease in the work efficiency of the working device 1.
  • the judgment by the detection unit 15 of the working device 1 is made according to the current state of the controlled unit 11. Even if the action that can be taken after the judgment involves interrupting the work, such as a temporary stop or an emergency stop, or changing the control parameters, such as the control unit 12 slowing down the operating speed, it is not necessarily possible to eliminate the approach to the obstacle. In other words, even if safety can be maintained, work efficiency will decrease due to the interruption of the work.
  • the remote detection unit 25 of the remote control device 2 can make a judgment in advance before the start of the operation, that is, before the approach to the obstacle occurs, so that the interruption of the work can be prevented by changing the control plan.
  • as a result of the judgment by the remote detection unit 25, a state can be maintained in which detection by the detection unit 15 does not occur. Therefore, according to the present disclosure, it is possible to achieve both safety and work efficiency.
  • the program in this embodiment may be any program that causes a computer to execute steps S101 to S106 and S201 to S206 shown in Fig. 3.
  • the processor of the computer functions as the first detection unit 101, the second detection unit 201, the second control unit 202, the controlled unit 11, the control unit 12, the observation unit 13, the virtual environment unit 14, the detection unit 15, the integration unit 21, the remote control unit 22, the remote virtual environment unit 24, and the remote detection unit 25, and performs processing.
  • examples of the computer include a server computer and a general-purpose PC.
  • assume that a work device is mobile or portable and has limited power, i.e., it is powered by a battery or fuel. Two cases will be considered when detecting an obstacle with this work device.
  • if the environmental information is thinned out to keep the communication load low, i.e., if the spatial or temporal resolution is lowered, sufficient performance cannot be achieved. As a result, the reliability of detection may decrease, and safety may be affected. Therefore, when the work device is mobile or portable and has power limitations, it is difficult to achieve both safety and work efficiency in a configuration where the detection process is single, i.e., where one control system or processing means is responsible for obstacle detection.
  • the control system 100 differs from the comparative systems above in the following respects.
  • the first difference is the configuration, which is divided into the working device 1 and the remote control device 2, each of which is equipped with a detection unit 15 and a remote detection unit 25 for detecting obstacles. Due to this difference in configuration, for example, when the working device 1 is mobile or portable and has limited power, the power consumption of the control system mounted on the working device 1 can be reduced, and the possibility of performance degradation due to this effect can be compensated for by the processing of the remote control device 2.
  • the reduction in the power consumption of the working device 1 can be realized by making the model of the controlled unit 11 reproduced in the virtual environment unit 14 a simple geometric configuration, and by minimizing the number of specific points, set by the detection unit 15, on the model of the controlled unit 11 to be detected.
  • the point compensated for by the remote control device 2 is that the model of the controlled unit 11 reproduced in the remote virtual environment unit 24 can take its entire surface into account, and the number of specific points set by the remote detection unit 25 can be increased, thereby compensating for situations in which the judgment based on the specific points of the working device 1 is insufficient. Furthermore, if there are restrictions on the communication capacity of the communication means connecting the work device 1 and the remote control device 2, the communication load can be reduced by downsampling or abstracting the environmental information acquired by the observation unit 13 in the virtual environment unit 14. Therefore, according to this embodiment, safety and work efficiency are not reduced even if there are restrictions on power or communication.
  • the second difference is the configuration and processing method, in that the working device 1 and the remote control device 2 each have a virtual environment unit 14 and a remote virtual environment unit 24, and perform detection based on the virtual environment, i.e., a simulator or digital twin.
  • this embodiment does not require the area for detecting obstacles to be set in advance or the obstacle candidates to be learned in advance, and detection is performed using the controlled unit 11 itself reproduced in the virtual environment and the environmental information obtained by the observation unit 13. Therefore, according to this embodiment, it is possible to prevent a decrease in work efficiency due to excessive detection and a decrease in safety due to oversights or erroneous detection.
  • the third point is the difference in the processing method, that is, there is a time difference between the detection processing by the detection unit 15 and the remote detection unit 25 of the working device 1 and the remote control device 2, in other words, multiple detection processes using different information and processing methods for detection are executed independently or in parallel.
  • the detection unit 15 of the working device 1 performs detection based on the current state information of the controlled device 11, so it provides a function that reliably stops the device in a situation where approach or contact is imminent, that is, prioritizes safety.
  • the work device 1 and the remote control device 2 are equipped with a detection unit 15 and a remote detection unit 25, respectively.
  • the time difference between the detection processes allows the detection (determination) by the remote detection unit 25 to bring about a state in which detection by the detection unit 15 does not occur (is not triggered).
  • an effect can be achieved by one detection process affecting the other detection process, that is, by having the detection processes function as if they are not independent.
  • Such an effect cannot be achieved by a comparison means having only a single detection process, nor can it be achieved by simply combining multiple detection processes; it can only be achieved by combining detection processes with different characteristics, as in this embodiment.
  • FIG. 4 is a diagram showing an example of the configuration of the control system 200.
  • the control system 200 is configured to include a plurality of operating devices 1m (1A, 1B, ...: m is an arbitrary label).
  • the plurality of operating devices 1m are each connected to a communication means N1.
  • the operating device 1A shown in FIG. 4 includes a controlled unit 11A, a control unit 12A, an observation unit 13A, a virtual environment unit 14A, and a detection unit 15A, and the operating devices other than the operating device 1A also include independent devices and each processing unit.
  • the configuration and operation of each device and processing unit are the same as those of the control system 100 shown in FIG. 2, so that the description will be omitted below.
  • the remote control device 2 has the same configuration as that of the control system 100 shown in FIG. 2. Only the differences in operation will be described below.
  • the remote control device 2 acquires information from multiple work devices 1m, rather than from a single work device 1. In this respect, it differs from the control system 100 according to the first embodiment. As a result, the following two points are added to the processing of the integration unit 21.
  • the first point is the process of converting the information acquired by each work device into the on-site coordinate system of the remote control device 2, which is one of the integration processes.
  • each of the multiple work devices 1m has its own work device 1m coordinate system. For example, information acquired from the work device 1A needs to be coordinate-converted from the work device 1A coordinate system into the on-site coordinate system. The same applies to the other work devices.
  • information on the coordinate system of each of the multiple work devices 1m is stored, for example, in the data storage unit D1.
  • This process allows the remote control device 2 to manage the positions and obstacle candidate information of all work devices 1m under one on-site coordinate system. For processing after conversion to the on-site coordinate system, the same processing as that of the control system 100 can be applied.
  • the second point is the integration process of the obstacle candidate information acquired by the multiple work devices 1m.
  • the conversion of the coordinate system is as described above, but in this embodiment, it is necessary to integrate the environmental information acquired by the multiple observation units.
  • the subsequent processing of the remote control device 2 can be applied to the same processing as that of the control system 100, but there may be the following differences. First, if the obstacle candidate information from each of the multiple work devices 1m is simply superimposed in the site coordinate system, the amount of information increases.
  • the integration unit 21 of this embodiment may perform a process of deleting (filtering) data indicating the same area from multiple pieces of obstacle candidate information from multiple work devices 1m.
  • the remote control device 2 can aggregate the environmental information acquired by each of the multiple work devices 1m.
  • the environmental information acquired by each of the arbitrary work devices 1m depends on the performance of the observation unit mounted on the work device 1m and the position and attitude of the work device 1m. Therefore, by aggregating environmental information from multiple work devices 1m, environmental information about a work area that cannot be acquired by a single work device (specifically, an area that is blocked from a certain direction by a structure or another work device and is a blind spot) can be acquired by the other work devices.
  • the environmental information can be interpolated by multiple work devices 1m. Therefore, the integration unit 21 of this embodiment may have a function to conversely deploy such aggregated obstacle candidate information to each of the multiple work devices 1m.
  • the virtual environment unit 14m of the arbitrary work device 1m further has a function to input the obstacle candidate information acquired from the remote control device 2 via the communication means N1 and update the original obstacle candidate information, in addition to the function of the control system 100 according to the first embodiment.
  • the processing procedure performed by the control system 200 is similar to the processing procedure performed by the control system 100 shown in FIG. 3, and therefore a description thereof will be omitted.
  • the program in this embodiment may be any program that causes a computer to execute steps S101 to S106 and S201 to S206 shown in FIG. 3, similarly to the first embodiment.
  • This embodiment has the effect that increasing the number of work devices supplements the environmental information of any single device, because multiple pieces of environmental information are integrated and utilized. That is, this embodiment is based on two-way information exchange, in which information from the multiple work devices 1m is not only collected at the remote control device 2, but information processed by the remote control device 2 is also deployed back to each work device 1m.
  • This effect is realized by the configuration of this disclosure, which includes multiple work devices 1m and a remote control device, and is distinguished from the one-way information collection of Patent Document 1.
  • Fig. 5 is a diagram showing an example of the configuration of a control system.
  • the control system 300 is configured to further include a remote observation unit 23n (n is an arbitrary label) and a remote observation processing unit 26 in the configuration of the control system 200 shown in Fig. 3. Since the multiple work devices 1m and the remote control device 2 have the same configuration as the control system 200, only the differences will be described below.
  • the remote observation unit 23n (n is an arbitrary label) may be a remote observation device.
  • a remote observation unit 23n is provided that is connected to the communication means N1.
  • FIG. 5 shows one remote observation unit 23n, but multiple remote observation units 23n may be provided, with n being an arbitrary label.
  • n and m are independent labels; the number of remote observation units does not depend on the number of work devices.
  • the remote observation unit 23n may be a device with the same or equivalent performance as the observation unit 13m provided in each work device 1m, but it may also be a completely different observation means, and may be selected appropriately depending on the purpose described below.
  • the remote observation unit 23n acquires environmental information about the current work environment, similar to the observation unit 13m equipped in the work device 1m.
  • the observation unit 13m does not need to be physically built into each of the multiple work devices 1m.
  • the area in which each observation unit 13m can acquire environmental information, i.e., the work environment, depends on the position and orientation of the work device 1m, the installation position and direction of the observation unit 13m, and the performance of the observation unit 13m, such as its viewing angle.
  • the remote control device 2 targets an area that includes at least the work device 1m, i.e., the controlled environment.
  • the observation unit 13A is a device, such as a depth camera or LiDAR, capable of acquiring three-dimensional information, and is illustrated as mounted on the backhoe 411A, but the type, mounting location, and number of devices are not limited.
  • the control unit 12A, virtual environment unit 14A, and detection unit 15A are the same as in the first and second embodiments.
  • the remote control device 2 includes a remote display device 27. Note that other components and processes of the remote control device 2 are similar to those of the first and second embodiments, and therefore will not be described below.
  • the controlled unit 11 of the work device 40A is a backhoe 411A, and a task of excavating soil is described as an example.
  • the task content is an example and is not limited to the task content shown in this application example.
  • FIG. 6 shows an example of an observation area 413A observed by the observation unit 13A.
  • the observation area 413A may include obstacle candidates and a part of the backhoe 411A.
  • the task assumed in the first application example is a task of excavating soil 61 without contacting the obstacle 60 shown in FIG. 6.
  • In order for the backhoe 411A to excavate the soil 61, it is necessary to bring a part of the backhoe 411A, specifically the bucket at the end of the arm, close to the soil 61 and finally bring the bucket into contact with the soil 61. In other words, the backhoe 411A cannot complete the task unless it comes into contact with the soil 61. Thus, to execute an actual task, an area that allows approach or contact is required, even within the observation area 413A observed by the observation unit 13A.
  • For this purpose, a target area 62 that includes at least the soil 61 is set.
  • If a part of the backhoe 411A is within the target area 62, it is not detected as an obstacle even though the area is acquired by the observation unit 13A, and the backhoe can therefore contact the soil 61.
  • It is assumed that the position of the soil 61 is determined (known) in advance, before the backhoe 411A actually approaches the soil 61; otherwise, the backhoe cannot approach the position of the soil 61.
  • the position of the soil 61 to be excavated can be determined as part of the control plan of the remote control unit 22 of the remote control device 2 based on the information of the observation unit 13A, but the method of planning the excavation position is not limited in this embodiment. Therefore, the position and range of the target area 62 can be appropriately set based on the determined position information of the soil 61.
  • In this application example, the soil 61 that is the task target is at one location, and the target area 62 is likewise one area; however, these numbers are set depending on the environment and the task, so the setting method and number are not limited.
  • Likewise, only one obstacle or obstacle area 60 that is not permitted to approach or enter is shown, but the number and location are not limited to those shown.
  • the process of making the target area 62 an area that is not detected as an obstacle can be executed in the same manner as the process of excluding the three-dimensional area that includes the backhoe 411A, which is the controlled unit, from the environmental information observed by the observation unit 13A.
  • the obstacle candidate information here is information obtained by excluding the area that includes the backhoe 411A and the target area 62 from the environmental information observed by the observation unit 13A.
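  • The exclusion described here can be sketched as a simple point filter that discards observations falling inside the machine's own region or the target area 62; the 2D points and axis-aligned boxes are simplifying assumptions for illustration:

```python
def in_box(p, box):
    """Test whether 2D point p lies inside an axis-aligned box
    given as ((xmin, ymin), (xmax, ymax))."""
    (xmin, ymin), (xmax, ymax) = box
    x, y = p
    return xmin <= x <= xmax and ymin <= y <= ymax

def obstacle_candidates(observed, exclusion_boxes):
    """Keep only observed points lying outside every exclusion region."""
    return [p for p in observed
            if not any(in_box(p, b) for b in exclusion_boxes)]

self_region = ((0.0, 0.0), (3.0, 3.0))    # area occupied by the backhoe itself
target_area = ((8.0, 8.0), (10.0, 10.0))  # target area around the soil

observed = [(1.0, 1.0),   # part of the machine itself -> excluded
            (9.0, 9.0),   # soil inside the target area -> excluded
            (5.0, 5.0)]   # genuine obstacle candidate  -> kept
candidates = obstacle_candidates(observed, [self_region, target_area])
```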
  • position and orientation data may be obtained by a sensor attached to each moving part or the housing.
  • the sensors may be internal sensors such as an inclination sensor, a gyro sensor, an acceleration sensor, or an encoder.
  • the virtual environment unit 14A constructs a model that simulates the geometric structure and movement of the moving parts of the backhoe 411A based on this position and orientation data.
  • This model information may be stored in the data storage unit D1.
  • the real backhoe 411A and the model in the virtual environment unit 14A can be synchronized, that is, the position and orientation can be matched within a certain specified error range.
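  • Such a synchronization check could, for instance, compare corresponding joint values of the real machine and the model against a specified tolerance; the pose representation and tolerance value below are illustrative assumptions:

```python
def is_synchronized(real_pose, model_pose, tol=0.05):
    """Check that the model in the virtual environment unit matches the
    real machine within a specified error range (tol is an assumed value).

    Poses are given as tuples of joint/boom angles in radians.
    """
    return all(abs(r - m) <= tol for r, m in zip(real_pose, model_pose))

# Within tolerance -> synchronized:
assert is_synchronized((0.10, 1.20, -0.30), (0.12, 1.18, -0.30))
# One joint deviates by 0.2 rad -> not synchronized:
assert not is_synchronized((0.10, 1.20, -0.30), (0.30, 1.18, -0.30))
```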
  • the working device 40A coordinate system is set under a certain reference point.
  • the turning center position of the backhoe 411A is known under this working device 40A coordinate system. Therefore, the positions of the obstacle area 60 and the soil 61 are determined under this working device 40A coordinate system.
  • the reference point of the coordinate system of the working device 40A, preferably the turning center position of the backhoe 411A, is known.
  • the position can be known by attaching a positioning device such as a global navigation satellite system (GNSS) to the backhoe 411A.
  • Thereby, coordinate conversion by the integration unit 21 of the remote control device 2 becomes possible.
  • the subsequent detection process can be executed in the same manner as in the first and second embodiments.
  • Figure 7 shows a schematic diagram (left) of the virtual environment section 14A and an example of the display of the added remote display device 27 (right) as a diagram showing an example of the operation of the first application example.
  • the diagram on the left side of Figure 7 shows the work device 40A, backhoe 411A, and obstacle area 60.
  • the coordinate system of the work device 40A is illustrated two-dimensionally with the center of rotation of the backhoe 411A as the reference.
  • the first point is detection between work devices.
  • the status information of the movable parts based on the position and posture of each backhoe 411A and 411B, surface information, and actuator information reflects at least the current state. Therefore, for example, approach can be detected from the distance between the bucket of the backhoe 411A and the rear part of the backhoe 411B. In other words, approach and contact between multiple work devices can be prevented.
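  • A minimal sketch of this inter-device detection, representing machine parts as representative 2D points and using an assumed safety threshold:

```python
import math

def min_part_distance(parts_a, parts_b):
    """Smallest distance between any part of device A and any part of
    device B (parts approximated as representative 2D points)."""
    return min(math.dist(pa, pb) for pa in parts_a for pb in parts_b)

def approach_detected(parts_a, parts_b, threshold=2.0):
    """Flag an approach when two devices come closer than the
    (assumed) safety threshold in metres."""
    return min_part_distance(parts_a, parts_b) < threshold

bucket_411A = [(4.0, 0.0)]            # bucket tip of one backhoe
rear_411B = [(5.5, 0.0), (6.0, 1.0)]  # rear points of the other backhoe
```

Here the minimum distance is 1.5 m, so an approach is flagged at a 2.0 m threshold but not at 1.0 m.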
  • the second point is the use of a control plan. Furthermore, the control plan by the remote control unit 22 is illustrated for the backhoe 411A on the right side of Figure 7 (dotted line).
  • the third point is the integration of obstacle areas.
  • the obstacle area B is a blind spot of the observation unit and cannot be observed in the current position and posture of the backhoe 411B.
  • The backhoe 411A, however, can observe obstacle area B; it transmits this information to the remote control device 2, and as a result of the integration processing by the integration unit 21, both obstacle areas A and B are recognized as obstacle candidate information within the remote control device 2.
  • Thereby, the detection unit 15B of the work device 40B can also detect the approach of the backhoe 411B to obstacle area B.
  • FIG. 8 shows a timeline of an example of operation of the first application example.
  • the lower diagram is a schematic diagram of the virtual environment section 14A of the working device 40A
  • the upper diagram is a schematic diagram of the remote virtual environment section 24 of the remote control device 2.
  • the schematic diagram in the virtual environment section 14A of the working device 40A shows a backhoe 411A and an obstacle area 60.
  • the schematic diagram in the remote virtual environment section 24 of the remote control device 2 shows the corresponding backhoe 411A and obstacle area A.
  • the horizontal axis represents time, and respectively represents time t1 (left) before approaching the obstacle area, time t2 (center) detected by the remote control device 2 based on the control plan, and time t3 (right) detected by the working device 40A based on the current state.
  • the remote control unit 22 executes a predetermined process, such as a change in the control plan or a change in the control parameters, or a process such as a pause or an emergency stop, and if this is reflected in the actual operation of the working device 40A, the working device 40A will not approach the obstacle area at time t3.
  • However, the working device 40A may still approach the obstacle area, as at time t3, due to unexpected processing or communication delays, or due to a situation in which control cannot take effect in time even if the change is reflected in the operation of the working device 40A.
  • the working device 40A can avoid contact with the obstacle area by detecting based on the current state and executing a pause or emergency stop.
  • the remote control device 2 can also detect it, so that the control plan can be changed to move away from the obstacle area, for example, a rotation operation instruction can be given as shown in the upper part of FIG. 8.
  • Thereafter, the working device 40A can quickly return to work because the remote control device 2 detects the situation and changes the control plan.
  • A feature of the present disclosure is that the work device 40A and the remote control device 2 perform detection and control independently (in parallel). As a result, safety is ensured in parallel, and work interruption time can be kept to a minimum.
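  • The parallel detection described above might be sketched as two independent checks, one over the planned trajectory (remote control device side) and one over the current state (work device side); the thresholds, names, and return values are illustrative assumptions:

```python
import math

def remote_decision(planned_positions, obstacle, threshold=1.0):
    """Remote side: check the *planned* trajectory and request a plan
    change before the machine ever gets close to the obstacle."""
    if any(math.dist(p, obstacle) < threshold for p in planned_positions):
        return "replan"  # e.g. change the control plan or parameters
    return "continue"

def local_decision(current_position, obstacle, threshold=1.0):
    """Device side: check the *current* state and stop immediately,
    independently of the remote side."""
    if math.dist(current_position, obstacle) < threshold:
        return "emergency_stop"
    return "continue"

obstacle = (5.0, 0.0)
plan = [(3.0, 0.0), (4.2, 0.0), (4.6, 0.0)]  # planned path at time t2
```

Because both checks run in parallel, a plan-based detection (remote) and a current-state detection (local) can each trigger their own predetermined control without waiting for the other.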
  • The above describes the first application example, in which the working device 1 is a construction machine and the controlled units 11 are the backhoes 411A and 411B.
  • In the first application example, when the backhoes 411A and 411B approach an obstacle area or come close to each other, an instruction is given to limit their operating range or operating speed, or to stop them, thereby realizing precise control with high work efficiency and safety.
  • a backhoe is shown as an example of the controlled device 11 equipped in the working device 1, but it can be applied to other working devices that have moving parts, such as construction machines and civil engineering machines.
  • the technology described in the first application example can be suitably applied to working machines in which moving parts such as arms may enter an obstacle area.
  • This application example and the number and arrangement of the backhoes and obstacles shown in FIG. 7 are examples and are not limited to these.
  • the second application example is an example in which the working device 1 in the first and second embodiments is a robot having an arm, a so-called multi-joint robot arm.
  • Fig. 9 is a diagram showing an example of the configuration of a control system 500 of the second application example. As shown in Fig. 9, the second application example is composed of a working device 50A and a remote control device 2, and the working device 50A comprises at least a robot arm 511A, a control unit 12A that controls the robot arm 511A, and an observation unit 13A, a virtual environment unit 14A, and a detection unit 15A that are installed in the environment of the working device 50A.
  • the CPU 111 loads the program in the embodiment, which is composed of a group of codes stored in the storage device 113, into the main memory 112 and executes each code in a predetermined order to perform various calculations.
  • the main memory 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory).
  • examples of the storage device 113 include a hard disk drive and a semiconductor storage device such as a flash memory.
  • the input interface 114 mediates data transmission between the CPU 111 and input devices 118 such as a keyboard and a mouse.
  • the display controller 115 is connected to the display device 119 and controls the display on the display device 119.
  • the data reader/writer 116 mediates data transmission between the CPU 111 and the recording medium 120, reads programs from the recording medium 120, and writes the results of processing in the computer 110 to the recording medium 120.
  • the communication interface 117 mediates data transmission between the CPU 111 and other computers.
  • a control system comprising:
  • the first control unit limits or stops the operation of the working device, or changes a control parameter when it is detected that the working device has entered a prohibited area.
  • the second control unit changes the control plan or changes a control parameter when it is detected that the working device has approached an object that is prohibited from approaching. 3.
  • the first detection unit detects that the work device has entered a no-entry area based on an actual measurement value of a current situation related to the work device, an estimated value based on a model simulating the work device, and an actual measurement value of the work environment. 4.
  • the second detection unit detects that the work device has approached an approach prohibited object based on a planned value based on the control plan for the work device, an estimated value based on a model that simulates the work device, and an actual measurement value of the work environment. 5.
  • Appendix 7: A control method comprising: detecting, based on a current state of a working device in a working environment including at least a part of the working device, that the working device has entered a prohibited entry area, and executing a predetermined control when the entry is detected; and detecting, based on the current state of the working device or a control plan, that the working device has approached an approach-prohibited object in a controlled environment including the working environment and at least the working device, and executing a predetermined control when the approach is detected.
  • Appendix 10 detecting that the work device has entered a no-entry area based on an actual measurement value of a current situation related to the work device, an estimated value based on a model simulating the work device, and an actual measurement value of the work environment; 10.
  • Appendix 11 detecting that the working device has approached an object that is prohibited from approaching, based on a planned value based on the control plan for the working device, an estimated value based on a model that simulates the working device, and an actual measured value of the work environment; 11.
  • the control method according to any one of appendix 7 to 10.
  • Appendix 12 When two or more of the work devices are in operation, the work device detects that the work device has approached an object that is prohibited from approaching based on integrated information of actual measured values of the work environment acquired by each of the work devices. 12. The control method of any one of appendix 7 to 11.
  • the program causes the computer to limit or stop the operation of the working device, or change a control parameter when it is detected that the working device has entered a prohibited area.
  • the program causes the computer to change the control plan or change a control parameter when it is detected that the working device has approached an object that is prohibited from approaching.
  • 15. The computer-readable storage medium of claim 13 or 14.
  • Appendix 18 the program causes the computer to detect, when two or more of the work devices are operating, that the work devices have approached an object that is prohibited from approaching based on information that integrates actual measured values of the work environment acquired by each of the work devices; 18.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mining & Mineral Resources (AREA)
  • Structural Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Civil Engineering (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control system 100 comprises: a first detection unit 101 for detecting, based on the current state of a working device having a movable part in a working environment that includes at least a part of the working device, that the working device has entered a prohibited entry area; a first control unit 102 for executing a predetermined control when said entry is detected; a second detection unit 201 for detecting, based on the current state of the working device or a control plan, that the working device has approached an object that must not be approached in a controlled environment that includes at least the working environment and the working device; and a second control unit 202 for executing a predetermined control when said approach is detected.
PCT/JP2023/036789 2023-10-10 2023-10-10 Système de commande, procédé de commande et support d'enregistrement lisible par ordinateur Pending WO2025079150A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/036789 WO2025079150A1 (fr) 2023-10-10 2023-10-10 Système de commande, procédé de commande et support d'enregistrement lisible par ordinateur

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/036789 WO2025079150A1 (fr) 2023-10-10 2023-10-10 Système de commande, procédé de commande et support d'enregistrement lisible par ordinateur

Publications (1)

Publication Number Publication Date
WO2025079150A1 true WO2025079150A1 (fr) 2025-04-17

Family

ID=95395392

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/036789 Pending WO2025079150A1 (fr) 2023-10-10 2023-10-10 Système de commande, procédé de commande et support d'enregistrement lisible par ordinateur

Country Status (1)

Country Link
WO (1) WO2025079150A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000263489A (ja) * 1999-03-16 2000-09-26 Denso Corp Safety device for mobile robot
JP2008302496A (ja) * 2006-07-04 2008-12-18 Panasonic Corp Robot arm control device and control method, robot, and robot arm control program
JP2009093514A (ja) * 2007-10-11 2009-04-30 Panasonic Corp Self-propelled device and program
WO2016151724A1 (fr) * 2015-03-23 2016-09-29 Fuji Machine Mfg. Co., Ltd. Mobile body
JP2016220823A (ja) * 2015-05-28 2016-12-28 Sharp Corp Self-propelled vacuum cleaner
JP2019137992A (ja) * 2018-02-07 2019-08-22 Nippo Corp Emergency stop device and emergency stop method for work machine
JP2021059878A (ja) * 2019-10-04 2021-04-15 Hitachi Construction Machinery Co., Ltd. Intrusion monitoring control system and work machine
JP2021086217A (ja) * 2019-11-25 2021-06-03 Toyota Motor Corp Autonomous mobile body system, control program for autonomous mobile body, and control method for autonomous mobile body
JP2022033592A (ja) * 2020-08-17 2022-03-02 Taisei Corp Construction machine contact prevention system
JP2022064653A (ja) * 2020-10-14 2022-04-26 Global Connect Co., Ltd. Monitoring control device and monitoring control program
JP2022179081A (ja) * 2021-05-21 2022-12-02 Sumitomo Heavy Industries, Ltd. Remote operation support system and remote operation support device


Similar Documents

Publication Publication Date Title
Kim et al. Proximity prediction of mobile objects to prevent contact-driven accidents in co-robotic construction
JP7506876B2 Techniques for motion behavior estimation and dynamic behavior estimation in autonomous vehicles
ES2914630T3 System and method for the autonomous operation of heavy machinery
JP7353747B2 Information processing device, system, method, and program
US10006772B2 Map production method, mobile robot, and map production system
CN111622296B Excavator safety obstacle avoidance system and method
CA3216836A1 Autonomous control of heavy equipment and vehicles using task hierarchies
JP2009193240A Mobile robot and environment map generation method
EP4083336B1 Method and machine for detecting a construction terrain
CN111469127B Cost map updating method and device, robot, and storage medium
CN116352722A Multi-sensor fusion mine inspection and rescue robot and control method therefor
KR20250133915A Path planning method and device, and crane
US12393194B2 Managing conflicting interactions between a movable device and potential obstacles
KR102777443B1 Autonomous exploration robot and autonomous exploration method for underground facility surveying
US20240331369A1 Sensor fusion system and sensing method for construction equipment
WO2022230880A1 Control method, control system, and control device
WO2025079150A1 Control system, control method, and computer-readable recording medium
RS64904B1 Generating a model for route planning or positioning of a mobile object in an underground construction site
EP4006680A1 Systems and methods for controlling a robotic vehicle
Satoh Digital twin-based collision avoidance system for autonomous excavator with automatic 3d lidar sensor calibration
CN119370630A Hold-cleaning method for multi-machine cooperative operation
Fang et al. Cyber-physical systems (CPS) in intelligent crane operations
US12223837B2 Systems and methods for detecting false positives in collision notifications
Aalerud et al. Industrial environment mapping using distributed static 3d sensor nodes
CN118262071A Mixed-reality digital twin management and control method and system for construction machinery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23955402

Country of ref document: EP

Kind code of ref document: A1