US20250305251A1 - Work support system for excavator - Google Patents
- Publication number
- US20250305251A1
- Authority
- US
- United States
- Prior art keywords
- excavator
- information
- motion
- support system
- work support
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/2025—Particular purposes of control systems not otherwise provided for
- E02F9/205—Remotely operated machines, e.g. unmanned vehicles
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/20—Drives; Control devices
- E02F9/22—Hydraulic or pneumatic drives
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
- E02F9/262—Surveying the work-site to be treated with follow-up actions to control the work tool, e.g. controller
Definitions
- the present disclosure relates to a work support system for an excavator.
- a work support system for supporting excavation motions and the like of an excavator in order to increase the work efficiency of the excavator is known.
- a work support system of the related art is known which evaluates the work quality of a plurality of operators, learns the parameters of an excavator operation model based on the operation data of the best operator, and supports the operator based on the learned operation model.
- a work support system for an excavator including the excavator; an environment detection device configured to detect environment information of a work site of the excavator; and a simulation device configured to acquire the environment information detected by the environment detection device during a motion of the excavator, and to generate a three-dimensional virtual space model of the work site.
- FIG. 2 is a functional block diagram illustrating a configuration example of the work support system
- FIG. 3 is a diagram illustrating an example of a three-dimensional virtual space model generated by the virtual space generation part
- FIG. 4 is an explanatory diagram illustrating an excavation motion of a virtual excavator
- FIG. 5 is an explanatory diagram illustrating an earth and sand discharging motion of a virtual excavator
- FIG. 6 is an explanatory diagram illustrating control of an excavation motion of an excavator by the work support system according to the embodiment
- FIG. 7 is a diagram illustrating an example of a work support system according to another embodiment.
- FIG. 8 is a functional block diagram illustrating a configuration example of a work support system
- FIG. 10 is an explanatory diagram illustrating control of an excavation motion of an excavator by the work support system according to the modified example.
- even if the work support system sets the motion content of the excavator based on the earth and sand information, the actual excavator may not be able to perform excavation as scheduled.
- the present disclosure provides a work support system for an excavator capable of further improving the work efficiency of an excavator, by updating environment information in association with the motion of the excavator and setting motion contents based on the environment information.
- FIG. 1 is a diagram illustrating a work support system SYS 1 according to the embodiment.
- the lower traveling body 1 of the excavator 100 of the present embodiment has a pair of right and left crawlers 1 C.
- the crawlers 1 C are driven by a traveling hydraulic motor 2 M which is a traveling actuator mounted on the lower traveling body 1 .
- the upper turning body 3 is mounted on the lower traveling body 1 in a turnable manner via a turning mechanism 2 .
- the turning mechanism 2 is driven by a turning hydraulic motor 2 A which is a turning actuator mounted on the upper turning body 3 .
- the turning actuator may be an electric actuator (a turning electric generator).
- the arm 5 is rotatably supported with respect to the boom 4 .
- An arm angle sensor S 2 is attached to the arm 5 .
- the arm angle sensor S 2 detects an arm angle θ which is a rotation angle of the arm 5 .
- the arm angle θ is, for example, an opening angle from the fully closed state of the arm 5 . Therefore, the arm angle θ becomes maximum when the arm 5 is fully opened.
- Each of the boom angle sensor S 1 , the arm angle sensor S 2 and the bucket angle sensor S 3 may use only an acceleration sensor or may be a combination of an acceleration sensor and a gyro sensor.
- the boom angle sensor S 1 may be a stroke sensor attached to the boom cylinder 7 , a rotary encoder, a potentiometer, or an inertial measurement device. The same applies to the arm angle sensor S 2 and the bucket angle sensor S 3 .
- the space recognition device 70 is a device for recognizing a three-dimensional real space (environment information) around the excavator 100 .
- the space recognition device 70 is configured to measure the direction and distance to a recognized object from the space recognition device 70 or the excavator 100 .
- the space recognition device 70 includes, for example, an ultrasonic sensor, a millimeter wave radar, a monocular camera, a stereo camera, a LiDAR, a distance image sensor, an infrared sensor, or any combination thereof.
- the space recognition device 70 includes a front sensor 70 F attached to the front end of the upper surface of the cabin 10 , a rear sensor 70 B attached to the rear end of the upper surface of the upper turning body 3 , a left sensor 70 L attached to the left end of the upper surface of the upper turning body 3 , and a right sensor 70 R attached to the right end of the upper surface of the upper turning body 3 .
- the space recognition device 70 may have an upper sensor (not illustrated) attached to the excavator 100 that recognizes an object present in the space above the upper turning body 3 .
- the orientation detection device 71 detects information related to the relative relationship between the orientation of the upper turning body 3 and the orientation of the lower traveling body 1 .
- the orientation detection device 71 may be formed of, for example, a combination of a geomagnetic sensor attached to the lower traveling body 1 and a geomagnetic sensor attached to the upper turning body 3 .
- the orientation detection device 71 may be formed of a combination of a GNSS receiver attached to the lower traveling body 1 and a GNSS receiver attached to the upper turning body 3 .
- the orientation detection device 71 may be a rotary encoder, a rotary position sensor, or any combination thereof.
- the orientation detection device 71 may be formed of a resolver.
- the orientation detection device 71 may be attached to, for example, a center joint provided in relation to the turning mechanism 2 for implementing relative rotation between the lower traveling body 1 and the upper turning body 3 .
- the orientation detection device 71 may be formed of a camera attached to the upper turning body 3 .
- the orientation detection device 71 applies known image processing to the imaging information captured by the camera to extract an image of the lower traveling body 1 included in the imaging information. Then, the orientation detection device 71 identifies the longitudinal direction of the lower traveling body 1 from the image of the lower traveling body 1 , and derives an angle formed between the longitudinal direction of the upper turning body 3 and the longitudinal direction of the lower traveling body 1 .
- the longitudinal direction of the upper turning body 3 is derived from the mounting position of the camera.
- the orientation detection device 71 can identify the longitudinal direction of the lower traveling body 1 by detecting the image of the crawler 1 C.
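The angle derivation described above — the orientation of the upper turning body relative to the lower traveling body, whether obtained from paired geomagnetic sensors, paired GNSS receivers, or image processing — reduces to the signed difference of two headings. A minimal sketch, assuming headings in degrees and a signed wrap to (-180, 180]; the function name and conventions are illustrative, not taken from the disclosure:

```python
def relative_orientation_deg(lower_heading_deg: float,
                             upper_heading_deg: float) -> float:
    """Signed angle (degrees) of the upper turning body relative to the
    lower traveling body, wrapped to the range (-180, 180]."""
    diff = (upper_heading_deg - lower_heading_deg) % 360.0
    if diff > 180.0:
        diff -= 360.0
    return diff
```

The wrap step matters near north: headings of 350° and 10° differ by 20°, not 340°.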
- the orientation detection device 71 may be integrated into the controller 30 .
- the space recognition device 70 may also be used as the camera.
- the machine body inclination sensor S 4 detects the inclination of the upper turning body 3 with respect to a predetermined plane.
- the machine body inclination sensor S 4 is an acceleration sensor that detects the tilt angle of the upper turning body 3 around the longitudinal axis and around the lateral axis, with respect to the horizontal plane.
- the longitudinal axis and the lateral axis of the upper turning body 3 are orthogonal to each other and pass through an excavator center point which is a point on the turning axis of the excavator 100 .
- At least one of the boom angle sensor S 1 , the arm angle sensor S 2 , the bucket angle sensor S 3 , the machine body inclination sensor S 4 , or the turning angular velocity sensor S 5 is also referred to as a posture detection device.
- the posture of the excavation attachment AT is detected, for example, based on the respective outputs of the boom angle sensor S 1 , the arm angle sensor S 2 , and the bucket angle sensor S 3 .
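The outputs of the boom, arm, and bucket angle sensors can be combined into a bucket-tip position by planar forward kinematics. The sketch below assumes each sensor reading has already been converted to the absolute inclination of its link from the horizontal; the link lengths are placeholder values, not machine data from the disclosure:

```python
import math


def attachment_tip_position(boom_incl: float, arm_incl: float,
                            bucket_incl: float,
                            boom_len: float = 5.7, arm_len: float = 2.9,
                            bucket_len: float = 1.5):
    """(x, z) of the bucket tip relative to the boom foot pin, from the
    absolute inclination (radians) of each link. Lengths in metres are
    illustrative placeholders."""
    x = (boom_len * math.cos(boom_incl)
         + arm_len * math.cos(arm_incl)
         + bucket_len * math.cos(bucket_incl))
    z = (boom_len * math.sin(boom_incl)
         + arm_len * math.sin(arm_incl)
         + bucket_len * math.sin(bucket_incl))
    return x, z
```

With all links horizontal the tip sits at the summed link lengths along x, a quick sanity check for the sign conventions.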
- the operation device 21 has an information input device (for example, a right console box, a left console box) for an operator of the excavator 100 to input information to the controller 30 .
- the information input device may be, for example, a switch panel installed close to the display device of the output device 22 .
- the information input device may be a touch panel applied as a display device or a voice input device such as a microphone placed in the cabin 10 .
- the information input device may also be a communication device for obtaining information from the outside.
- the controller 30 is a control device for controlling the excavator 100 .
- the controller 30 is formed of a computer including one or more processors, a memory (volatile memory, non-volatile memory), and the like.
- the one or more processors read and execute a program corresponding to each function from the memory.
- each function includes a machine guidance function for guiding a manual operation of the excavator 100 by an operator, or a machine control function for automatically (or autonomously) operating the excavator 100 .
- the controller 30 may include a contact avoidance function for automatically operating or braking the excavator 100 to avoid contact between the excavator 100 and an object existing around the excavator 100 .
- FIG. 2 is a functional block diagram illustrating a configuration example of the work support system SYS 1 .
- the object included in the current imaging information may be extracted by comparing a plurality of pieces of past imaging information with the current imaging information. Then, the virtual space generation part 31 arranges the extracted topographic information and object information in a three-dimensional virtual space model, and reproduces virtual environment information surrounding the cabin 10 (operator) in the three-dimensional virtual space model.
- the object information extracted for reproducing the three-dimensional virtual space model includes, for example, soil to be excavated (including mounds, holes, walls, ditches, etc.), static objects other than excavated objects, the excavator 100 itself, work machines such as another excavator, vehicles, animals including people, plants, and the like.
- the virtual space generation part 31 can utilize information such as distance and direction between the excavator 100 and the object information measured by the ultrasonic sensor, millimeter wave radar, LiDAR, and the like.
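The topographic part of such a virtual space model can be sketched as a height map gridded from the measured 3-D points. This is a deliberately minimal stand-in — a real implementation would also handle outliers, occlusion, and object segmentation — and the cell size is an assumed parameter:

```python
def build_height_map(points, cell: float = 0.5):
    """Grid measured (x, y, z) points into a 2-D height map, keeping the
    highest z per cell — a minimal stand-in for the topographic layer of
    the three-dimensional virtual space model."""
    grid = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))  # cell index in the x-y plane
        if key not in grid or z > grid[key]:
            grid[key] = z
    return grid
```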
- FIG. 3 is a diagram illustrating an example of a three-dimensional virtual space model 50 generated by the virtual space generation part 31 .
- a virtual excavator 51 obtained by reproducing the excavator 100 in a real space, is arranged in the three-dimensional virtual space model 50 of the virtual space generation part 31 .
- the virtual excavator 51 corresponds to the shape, position, posture, etc., of the excavator 100 at an actual work site.
- the position, posture, and the like of the virtual excavator 51 in the three-dimensional virtual space model 50 are identified by an excavator state identification part 32 described below, and are determined, for example, based on the detection information of at least one of the space recognition device 70 or the orientation detection device 71 .
- the position, posture, and the like of the virtual excavator 51 may be determined or adjusted by using the detection information of a posture detection device, the positioning device 72 , and the like.
- the virtual space generation part 31 associates (adds) various kinds of additional information with the topographic information and object information reproduced in the three-dimensional virtual space model 50 .
- additional information of the excavator such as an identification number, a type, a working time, a type and a posture of the excavation attachment AT, and earth and sand information (additional information) such as a weight, a volume, and a density of the earth and sand put in the bucket 6 , are added to the virtual excavator 51 .
- the additional information of the excavator may be estimated based on the detection information of the environment detection device 79 , or information previously stored in the controller 30 may be used.
- the earth and sand information stored in the excavator 100 can be estimated based on information obtained by detecting the load applied when the excavator 100 is performing excavation, detected by a pressure sensor, a load sensor, or the like, which is a part of the environment detection device 79 .
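A static payload estimate of the kind described — weight inferred from the load the cylinders carry — can be sketched from the head- and rod-side pressures of the boom cylinder. The piston areas, lever-arm ratio, and tare mass below are illustrative placeholders; a real system calibrates them per machine and posture:

```python
def bucket_payload_kg(head_pressure_pa: float, rod_pressure_pa: float,
                      empty_mass_kg: float,
                      head_area_m2: float = 0.049,
                      rod_area_m2: float = 0.033,
                      lever_ratio: float = 1.8,
                      g: float = 9.81) -> float:
    """Rough static estimate of the earth-and-sand mass in the bucket
    from boom-cylinder pressures, minus the tare mass of the empty
    attachment. All geometry constants are assumed placeholders."""
    net_force = head_pressure_pa * head_area_m2 - rod_pressure_pa * rod_area_m2
    supported_kg = net_force * lever_ratio / g  # mass held by the cylinder
    return max(0.0, supported_kg - empty_mass_kg)
```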
- the environment detection device 79 is a device that detects information that affects the environment information constituting the three-dimensional virtual space model 50 inside or outside the excavator 100 .
- the environment detection device 79 may include various sensors such as a space recognition device 70 , an orientation detection device 71 , a positioning device 72 , a posture detection device (the boom angle sensor S 1 , the arm angle sensor S 2 , the bucket angle sensor S 3 , the machine body inclination sensor S 4 , and the turning angular velocity sensor S 5 ), a pressure sensor not illustrated, a load sensor, and an operation sensor of the operation device 21 .
- the environment detection device 79 may also include detection information of a device (the external space recognition device 300 , other work machines and vehicles (for example, a dump truck)) installed outside the excavator 100 .
- additional information of an excavation object such as the sediment accumulation amount, weight, density, hardness, and soil quality of the earth and sand is added to the excavation object 52 of the three-dimensional virtual space model 50 .
- the additional information of an excavation object is estimated by the excavation state estimation part 33 described below based on the detection information of the space recognition device 70 .
- the additional information of an excavation object may be estimated by the pressure and load applied to the bucket 6 of the excavator 100 when the excavator 100 operates, the weight of earth and sand recognized by the dump truck when the earth and sand is discharged to the dump truck, an image, or the like.
- a part of the additional information of the excavation object may be extracted from the design data of the work site stored in advance in the controller 30 .
- additional information of the discharge object such as identification number, type, size of the loading platform, state of the loading platform, and working time are added to the virtual dump truck 53 of the three-dimensional virtual space model 50 .
- additional information of the discharge object such as weight, volume, and density of earth and sand loaded on the loading platform of the virtual dump truck 53 (the discharge object) may be added to the loading platform of the virtual dump truck 53 .
- the additional information of the discharge object may be estimated based on object information extracted from the detection information of the space recognition device 70 , or information previously stored in the controller 30 may be used.
- the additional information to be added to the environment information may include various kinds of information other than the above.
- when another work machine, a vehicle, or the like is present around the excavator 100 , the object information thereof is arranged in the three-dimensional virtual space model 50 . Additional information of the relative distance to the excavator 100 can be added to the object information. Further, additional information such as identification number, type, and working time can be added to the information of other work machines.
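One possible shape for objects carrying such additional information is a small record type with a free-form attribute map; the class and field names below are assumptions for illustration, not structures named in the disclosure:

```python
from dataclasses import dataclass, field


@dataclass
class SceneObject:
    """An object in the virtual space model together with its additional
    information (identification number, weight, working time, ...)."""
    kind: str            # e.g. "excavator", "dump_truck", "soil"
    position: tuple      # (x, y) in the virtual space model
    extra: dict = field(default_factory=dict)


def add_relative_distance(obj: SceneObject, excavator_pos: tuple) -> None:
    """Attach the relative distance to the excavator as additional info."""
    dx = obj.position[0] - excavator_pos[0]
    dy = obj.position[1] - excavator_pos[1]
    obj.extra["distance_to_excavator"] = (dx * dx + dy * dy) ** 0.5
```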
- the virtual space generation part 31 sequentially updates the environment information of the three-dimensional virtual space model 50 based on the detection information of the space recognition device 70 , the orientation detection device 71 , and the positioning device 72 , information of the operation device 21 , and communication information from the external space recognition device 300 and the dump truck.
- the environment information of the three-dimensional virtual space model 50 may be changed based on the detection information at that time. Even if the earth and sand information (additional information) of the excavation object 52 is added in advance, the earth and sand information may be different when the excavator 100 has actually performed the excavation.
- the virtual space generation part 31 changes or corrects the earth and sand information based on the detection information detected when the excavator 100 has actually performed the excavation.
- the virtual space generation part 31 can generate the three-dimensional virtual space model 50 that is even closer to the work site environment.
- the excavation state estimation part 33 estimates additional information of the excavation object required for the work of the excavator 100 based on the detection information of the environment detection device 79 .
- the term “work” of the excavator 100 in the present specification means a concept including a series of movements such as an excavation motion with respect to the earth and sand that is the excavation object, conveyance of the earth and sand accompanying the lifting and turning motion of the excavator 100 , an earth and sand discharging motion with respect to the loading platform of the dump, and the return of the lowering and turning motion of the excavator 100 .
- examples include the hardness of the ground to be excavated, the density of the soil, the soil quality, and the like.
- the excavation state estimation part 33 can estimate the additional information of the excavation object based on the pressure detected by the pressure sensor provided in the hydraulic path of each cylinder of the excavation attachment AT or the load detected by the load sensor provided at an appropriate position of the excavation attachment AT. Besides the excavation with respect to the excavation object, the excavation state estimation part 33 may estimate the earth and sand information based on the detection information of various sensors while the earth and sand are held by the excavation attachment AT.
- the additional information of the excavation object estimated by the excavation state estimation part 33 is added to the earth and sand information of the virtual excavator 51 of the three-dimensional virtual space model 50 as described above, and is appropriately reflected in the additional information of the excavation object 52 excavated by the virtual excavator 51 .
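A coarse version of the hardness estimation described above can be sketched as thresholding the mean cylinder pressure observed while cutting. The thresholds and class labels are illustrative assumptions; a real estimator would be calibrated and would likely also use posture and load data:

```python
def classify_soil_hardness(dig_pressures_pa,
                           soft_max: float = 8e6,
                           medium_max: float = 15e6) -> str:
    """Coarse hardness class from the mean pressure (Pa) sampled while
    the bucket is cutting. Thresholds are placeholder assumptions."""
    mean_p = sum(dig_pressures_pa) / len(dig_pressures_pa)
    if mean_p <= soft_max:
        return "soft"
    if mean_p <= medium_max:
        return "medium"
    return "hard"
```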
- the determination part 36 determines, based on the detection information of the space recognition device 70 or the external space recognition device 300 , at least one of the position, posture, or motion content of another work machine or vehicle. Whether or not there is a matter to be reported may be determined based on the presence or absence of the same or similar situation in comparison with past cases. For example, if the determination part 36 recognizes the presence of a person near the excavator 100 (the motion range of the excavation attachment AT), the determination part 36 may determine that there is a matter to be reported to the operator.
- the determination part 36 may determine that there is a matter to be reported to the operator. At this time, the determination part 36 may determine the downslope based on the topographic information of the design data stored in advance in the controller 30 . Alternatively, if the determination part 36 detects the existence of an object (for example, electric wire) that may affect the work outside the range covered by the three-dimensional virtual space model 50 , the determination part 36 may determine that there is a matter to be reported to the operator.
- an object for example, electric wire
- the operation prediction part 37 is configured to predict the operation signal after a predetermined time based on the operation signal from the operation device 21 or the signal from the management apparatus 200 . This is to prevent degradation of operation responsiveness caused by processing overload and communication delay.
- the predetermined time is, for example, several milliseconds to several tens of milliseconds.
- the operation prediction part 37 predicts the operation signal after a predetermined time based on the transition of the operation signal (tilt angle of the operation lever of the operation device 21 ) during a predetermined time in the past.
- when the operation prediction part 37 detects that the tilt angle of the operation lever has tended to increase during a predetermined time in the past, the operation prediction part 37 predicts that the tilt angle after the predetermined time will be larger than the current tilt angle.
- the excavator 100 can move while reducing the delay of the operation signal.
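The prediction idea above — extrapolating the lever tilt a short horizon ahead from its recent trend — can be sketched with a simple linear fit over the last samples. The function name, sampling interval, and horizon are illustrative assumptions; the disclosure does not fix a particular estimator:

```python
def predict_lever_tilt(history, dt_ms: float, horizon_ms: float = 20.0) -> float:
    """Extrapolate the operation-lever tilt `horizon_ms` ahead from a list
    of recent samples spaced `dt_ms` apart (linear trend over the window)."""
    if not history:
        return 0.0
    if len(history) < 2:
        return history[-1]
    # average slope across the window, in tilt units per millisecond
    slope = (history[-1] - history[0]) / (dt_ms * (len(history) - 1))
    return history[-1] + slope * horizon_ms
```

An increasing window such as [0.1, 0.2, 0.3] yields a prediction above the current tilt, matching the behavior described in the text.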
- the operation intervention part 38 determines whether or not intervention in the operation is imperative. As an example, when the operation intervention part 38 detects the presence of a person on the left side of the excavator 100 and the start of the left turning operation (the operation of tilting the left operation lever to the left), the operation intervention part 38 determines that intervention in the operation is imperative. In this case, the operation intervention part 38 invalidates the operation signal generated based on the left turning operation to prevent the upper turning body 3 from turning left.
- the operation intervention part 38 may intervene in the operation upon determining whether or not the excavator 100 and the object are in contact with each other based on the detection information of the external space recognition device 300 .
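The intervention logic — invalidating a turning operation toward a side where a person is detected while passing other signals through — can be sketched as a filter on the operation signal. The dictionary format and the sign convention (negative = left turn) are assumptions for illustration:

```python
def filter_operation(signal: dict, person_left: bool, person_right: bool) -> dict:
    """Return a copy of the operation signal with the turning component
    zeroed when it would turn the upper body toward a detected person.
    `signal["turn"]` in [-1, 1], negative meaning a left turn (assumed)."""
    out = dict(signal)
    turn = out.get("turn", 0.0)
    if turn < 0 and person_left:
        out["turn"] = 0.0    # invalidate left-turn operation
    elif turn > 0 and person_right:
        out["turn"] = 0.0    # invalidate right-turn operation
    return out
```

Non-turning components (boom, arm, travel) are left untouched, so only the dangerous part of the operation is suppressed.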
- the motion simulator 39 generates a plurality of excavation trajectories of the virtual excavator 51 corresponding to the motion range of the excavation attachment 51 a and the plurality of excavation patterns.
- the example (a) in the upper right figure of FIG. 4 is a pattern in which the first excavation location is the upper part of the excavation object 52 , and then the surface of the excavation object 52 is excavated downward in order, and the inner part of the excavation object 52 is excavated after the surface.
- the example (b) in the upper right figure of FIG. 4 is a pattern in which the first excavation location is the halfway point of the excavation object 52 , and the surrounding area is excavated in detail.
- the pattern (a) is suitable when the earth and sand of the excavation object 52 are soft, and the pattern (b) is suitable when the earth and sand of the excavation object 52 are hard.
- during the excavation motion of the excavator 100 , the density, hardness, and soil quality of the excavation object in the real space may differ from the information acquired before the motion. Therefore, even if the additional information of the excavation object is linked, the motion simulator 39 can generate a plurality of excavation trajectories for cases in which the density, hardness, and soil quality of the excavation object 52 have changed.
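Generating candidates across several assumed soil conditions, as described above, amounts to enumerating the cross product of trajectory parameters and hardness cases. The parameters below (cut depth, speed, hardness labels) are illustrative placeholders, not values from the disclosure:

```python
import itertools


def candidate_excavations(depths=(0.2, 0.4), speeds=(0.3, 0.6),
                          hardness_cases=("soft", "hard")):
    """Enumerate excavation-trajectory candidates across assumed soil
    hardness cases, so a pre-computed alternative exists even when the
    real soil differs from the pre-motion information."""
    return [{"depth": d, "speed": s, "hardness": h}
            for d, s, h in itertools.product(depths, speeds, hardness_cases)]
```

If the measured hardness during the motion contradicts the pre-motion estimate, the plan for the matching hardness case can be switched in without recomputation.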
- when the work support system SYS 1 automatically controls the excavator 100 , the controller 30 provides this optimum excavation trajectory to the actuator driving part 34 as simulation motion information.
- the actuator driving part 34 can control the various solenoid valves 41 and the various actuators 42 to move the excavation attachment 51 a in the real space along the excavation trajectory (see the lower diagram in FIG. 4 ).
- the motion simulator 39 corrects the excavation trajectory based on the updated topographic information, object information, and additional information of the three-dimensional virtual space model 50 .
- the motion simulator 39 may re-select a plurality of excavation trajectories calculated previously, or may newly re-calculate excavation trajectories.
- the actuator driving part 34 switches to this excavation trajectory to move the excavation attachment AT in the real space.
- FIG. 5 is an explanatory diagram illustrating the earth and sand discharging motion of the virtual excavator 51 .
- when the earth and sand discharging motion is simulated, the motion simulator 39 generates a plurality of earth and sand discharging trajectories (simulation motion information) for moving the excavation attachment 51 a of the virtual excavator 51 with respect to the virtual dump truck 53 of the three-dimensional virtual space model 50 .
- the plurality of earth and sand discharging trajectories are generated based on the range in which the excavation attachment AT of the excavator 100 can actually move, determined from various sensors of the excavator 100 in the real space, and on the dump information of the virtual dump truck 53 .
- the motion simulator 39 sets the motion range (including the moving direction, moving distance, posture, and the like of the excavation attachment 51 a ) of the excavation attachment 51 a of the virtual excavator 51 in the earth and sand discharge, to the position above the loading platform of the virtual dump truck 53 . Further, the motion simulator 39 assumes a plurality of earth and sand discharge patterns in which the position of loading onto the loading platform of the virtual dump truck 53 in the motion range of the excavation attachment 51 a and the loading amount are appropriately changed. In the assumption of the plurality of patterns, the sediment amount, density, hardness, and soil quality of earth and sand in the earth and sand information of the excavator 100 and/or the dump truck are used.
- after generating the plurality of earth and sand discharging trajectories, the motion simulator 39 appropriately evaluates (simulates) the plurality of earth and sand discharging trajectories to select the optimum earth and sand discharging trajectory.
- the optimum earth and sand discharging trajectory may be evaluated by using constraints such as the motion speed of the excavation attachment 51 a, the work time, and the safety of work, for example, while applying an objective function to load the earth and sand evenly onto the loading platform of the virtual dump truck 53 .
- the motion simulator 39 can obtain the optimum earth and sand discharging trajectory according to the currently acquired earth and sand information.
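The selection step described above — an objective function for even loading, subject to constraints on motion speed, work time, and safety — can be sketched as constrained minimization over the candidate set. The candidate field names and limits are assumptions for illustration:

```python
def select_discharge_trajectory(candidates, max_time_s: float = 12.0,
                                max_speed: float = 1.5):
    """Among candidates meeting time and speed constraints, pick the one
    that loads the platform most evenly (lowest variance of per-cell
    load). Field names ('time_s', 'speed', 'cell_loads') are assumed."""
    def unevenness(loads):
        mean = sum(loads) / len(loads)
        return sum((v - mean) ** 2 for v in loads) / len(loads)

    feasible = [c for c in candidates
                if c["time_s"] <= max_time_s and c["speed"] <= max_speed]
    if not feasible:
        return None  # no trajectory satisfies the constraints
    return min(feasible, key=lambda c: unevenness(c["cell_loads"]))
```

Swapping the objective (e.g. minimum cycle time subject to an evenness bound) changes only the `key` and filter, which is why the text can treat constraints and objective separately.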
- when the excavator 100 is automatically controlled by the work support system SYS 1 , the controller 30 provides the actuator driving part 34 with this optimum earth and sand discharging trajectory.
- the actuator driving part 34 can control various solenoid valves 41 and various actuators 42 to move the excavation attachment AT in the real space along the earth and sand discharging trajectory (see the lower diagram in FIG. 5 ).
- the motion simulator 39 corrects the optimum earth and sand discharging trajectory based on the updated topographic information, object information, and additional information of the three-dimensional virtual space model 50 .
- the motion simulator 39 may re-select a plurality of previously calculated earth and sand discharging trajectories or re-calculate the earth and sand discharging trajectory.
- the actuator driving part 34 switches to this earth and sand discharging trajectory to move the excavation attachment AT in the real space.
- FIG. 6 is an explanatory diagram illustrating the control of the work of the excavator 100 by the work support system SYS 1 according to the embodiment.
- An example of automatic control (machine control function) of the excavator 100 by the controller 30 will be described below.
- the work support system SYS 1 is not limited thereto, and the work support system SYS 1 may operate as a machine guidance function for guiding the operator to the trajectory generated by the controller 30 . Further, the work support system SYS 1 may automatically control and guide the excavator 100 by the management apparatus 200 .
- the controller 30 starts automatic control of the work of the excavator 100 based on the operator turning on the automatic control button 23 of the excavator 100 .
- the virtual space generation part 31 generates the three-dimensional virtual space model 50 based on the stored environment information before the work of the excavator 100 (step S 1 ).
- the environment information stored before the work is stored in the controller 30 and may inherit the three-dimensional virtual space model 50 of the previous work (for example, the previous day).
- the motion simulator 39 simulates the work of the virtual excavator 51 in the three-dimensional virtual space model 50 (step S 2 ). That is, the motion simulator 39 generates each optimum trajectory in the work (excavation motion, lifting and turning motion, earth and sand discharging motion, lowering and turning motion) based on the environment information.
- the controller 30 can obtain the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory that are continuous on the time axis of the work.
- the actuator driving part 34 receives the information of the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory generated by the motion simulator 39 (step S 3 ). Thus, the actuator driving part 34 controls the various solenoid valves 41 and the various actuators 42 according to these trajectories in the excavator 100 in the real space (step S 4 ). That is, the excavator 100 in the real space automatically performs the excavation motion, the lifting and turning motion, the earth and sand discharging motion, and the lowering and turning motion as the actual work.
- The controller 30 acquires the information of the environment detection device 79 (excavation motion information including detection information from the space recognition device 70, the orientation detection device 71, the positioning device 72, and various sensors, or operation information from the operation device 21; and communication information from the external space recognition device 300 and the dump truck) at a predetermined timing.
- The predetermined timing may be set sequentially (every predetermined time) during the work of the excavator 100, or a setting may be made to acquire the log stored for each motion in association with the end of one motion.
- The virtual space generation part 31 updates the topographic information, object information, and additional information of the three-dimensional virtual space model 50 based on this information (step S5).
- The virtual space generation part 31 updates the additional information (ground hardness, soil density, soil quality, etc.) of the excavation object 52 of the three-dimensional virtual space model 50 to the information estimated by the excavation state estimation part 33. Further, the additional information and earth and sand information of the virtual excavator 51, and the additional information and earth and sand information of the virtual dump truck 53 (discharge object), are also updated as appropriate.
- By repeating the above motions during the work, the work support system SYS1 can appropriately move the excavator 100 according to the situations that change in real time at the real-space work site. As a result, the work support system SYS1 can perform the work by the excavator 100 efficiently and accurately.
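The overall cycle of steps S1 to S5 — build the virtual space model, simulate optimum trajectories, drive the real actuators, then update the model from fresh detections — can be summarized in a short sketch. Every name below is a hypothetical stand-in for the corresponding functional block, not an API from the patent:

```python
def run_work_cycle(environment, simulate, drive, sense, cycles=3):
    """Sketch of the SYS1 control cycle (steps S1-S5).

    Repeatedly: simulate optimum trajectories on the virtual model,
    drive the real actuators along them, then fold new detections
    back into the model so the next cycle reflects the changed site.
    """
    model = dict(environment)            # S1: three-dimensional virtual space model
    for _ in range(cycles):
        trajectories = simulate(model)   # S2: motion simulator generates trajectories
        drive(trajectories)              # S3/S4: actuator driving part executes them
        model.update(sense())            # S5: update topography/object/additional info
    return model

# Toy stand-ins for the simulator, the actuators, and the environment detection device.
log = []
final = run_work_cycle(
    {"ground_hardness": "unknown"},
    simulate=lambda m: ["excavate", "lift-turn", "discharge", "lower-turn"],
    drive=lambda t: log.append(len(t)),
    sense=lambda: {"ground_hardness": "soft"},
)
```

The point of the loop is that the model used for simulation is never stale by more than one cycle, which is what lets the system track a site that changes in real time.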
- FIG. 7 is a diagram illustrating an example of the work support system SYS2 according to the other embodiment.
- FIG. 8 is a functional block diagram illustrating an example of the configuration of the work support system SYS2.
- The work support system SYS2 differs from the work support system SYS1 according to the embodiment in that the excavator 100 is remotely operated from the remote operation room RC provided at a position away from the excavator 100.
- The configuration other than the remote operation of the excavator 100 is basically the same as that of the embodiment, and a detailed description thereof will be omitted.
- The sound output device A2 outputs sound; it is configured to replay the sound collected by a sound collecting device (not illustrated) attached to the excavator 100.
- The indoor imaging device C2 captures the interior of the remote operation room RC.
- The indoor imaging device C2 is a camera installed inside the remote operation room RC, and captures the operator OP sitting in the driver's seat DS.
- In the remote operation room RC, the surroundings of the driver's seat DS have a structure similar to that of the driver's seat installed in the cabin 10 of the excavator 100. Specifically, a left console box is arranged to the left of the driver's seat DS, and a right console box is arranged to the right of the driver's seat DS. A left operation lever is arranged at the top front end of the left console box, and a right operation lever is arranged at the top front end of the right console box. A travel lever and a travel pedal are arranged in front of the driver's seat DS. The left operation lever, the right operation lever, the travel lever, and the travel pedal constitute an operation device 21R of the remote operation room.
- The operation device 21R is provided with an operation sensor 29R for detecting the operation contents of the operation device 21R.
- The operation sensor 29R includes, for example, an inclination sensor for detecting the tilt angle of an operation lever, an angle sensor for detecting the oscillation angle of an operation lever around its oscillation axis, etc.
- The operation sensor 29R may be formed of other sensors, such as a pressure sensor, a current sensor, a voltage sensor, or a distance sensor.
- The operation sensor 29R outputs information about the detected operation contents of the operation device 21R to the remote controller 30R.
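As an illustration of how an operation sensor reading might become an operation signal, the sketch below maps a lever tilt angle to a normalized command with a small deadzone. The thresholds and the mapping itself are assumptions for illustration, not values from the patent:

```python
def lever_to_command(tilt_deg, max_deg=25.0, deadzone_deg=2.0):
    """Map an operation-lever tilt angle to a normalized command in [-1, 1].

    A small deadzone around neutral suppresses sensor noise so the
    controller does not emit spurious operation signals, and the
    command saturates at the lever's mechanical end stop.
    """
    if abs(tilt_deg) < deadzone_deg:
        return 0.0
    sign = 1.0 if tilt_deg > 0 else -1.0
    magnitude = min(abs(tilt_deg), max_deg) / max_deg
    return sign * magnitude

neutral = lever_to_command(1.5)   # inside the deadzone -> no command
full = lever_to_command(30.0)     # beyond max tilt -> saturated command
```

A pressure or voltage sensor could feed the same shaping function; only the input units and thresholds would change.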
- The remote controller 30R generates an operation signal based on the received information, and transmits the generated operation signal to the excavator 100.
- The remote controller 30R includes, as functional blocks, an operator state identification part 61, an image combining part 62, and an operation signal generation part 63.
- Although the operator state identification part 61, the image combining part 62, and the operation signal generation part 63 are distinguished for convenience of explanation, they do not have to be physically distinguished, and may be composed entirely or partially of common software components or hardware components.
- The operator state identification part 61 is configured to identify the state of an operator in the remote operation room RC.
- The operator state identification part 61 identifies, for example, the position of the operator's eyes (the operator's viewpoint) and the direction of the operator's gaze, based on the imaging information of the indoor imaging device C2.
- The operator state identification part 61 performs appropriate image processing on the image captured by the indoor imaging device C2 to identify the position of the operator's viewpoint and the direction of the operator's gaze in the operation room coordinate system.
- The image combined by the image combining part 62 may be a design surface image, which is an image generated based on design data.
- The image combining part 62 superimposes, on the environment information, a design surface image, that is, a figure such as computer graphics representing the position of the design surface, based on design data previously stored in the memory of the remote controller 30R.
- The image combining part 62 determines the position at which to superimpose the design surface image based on the position and orientation of the excavator 100 identified by the excavator state identification part 32 of the controller 30.
- The remote operation of the excavator 100 is not limited to the operation by the operator in the remote operation room RC, and may be performed through an application of a portable terminal 400 as illustrated in FIG. 7, for example. In this case, most of the motions of the excavator 100 may be set in advance to be automatically controlled, to simplify the operation by the portable terminal 400.
- The trajectories for work support can be generated in the controller 30 of the excavator 100 or in the remote controller 30R of the remote operation room RC. Therefore, the remote controller 30R may include a virtual space generation part 65, a determination part 66, an operation prediction part 67, an operation intervention part 68, and a motion simulator 69 having the same functions as the virtual space generation part 31, the determination part 36, the operation prediction part 37, the operation intervention part 38, and the motion simulator 39.
- The work support system SYS2 may also generate the trajectories for work support by the management apparatus 200.
- When the work support is performed by remote operation, the remote controller 30R, similar to the controller 30, generates a plurality of trajectories for each divided motion of the work and selects the optimum trajectory by evaluating each trajectory. Then, the remote controller 30R transmits information of the trajectories of each motion to the excavator 100, and the actuator driving part 34 of the controller 30 controls the motion of the excavator 100 along the trajectories of each motion.
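The generate-evaluate-select step can be pictured as scoring each candidate trajectory with a cost function and keeping the minimum. The candidates and the cost weights below are toy values, and the evaluation criteria are an assumption (the patent does not specify them):

```python
def select_optimum_trajectory(candidates, cost):
    """Evaluate each candidate trajectory and return the lowest-cost one."""
    return min(candidates, key=cost)

# Toy candidates as (name, duration_s, energy); the cost function here
# trades off cycle time against energy with an invented 0.5 weighting.
candidates = [("shallow", 12.0, 5.0), ("deep", 9.0, 9.0), ("medium", 10.0, 6.0)]
best = select_optimum_trajectory(candidates, cost=lambda c: c[1] + 0.5 * c[2])
```

With these numbers the costs are 14.5, 13.5, and 13.0, so the "medium" candidate is selected; in practice the cost terms would come from the simulated motion (time, fuel, load, clearance margins, and so on).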
- FIG. 9 is an explanatory diagram illustrating the control of the work of the excavator 100 by the work support system SYS2 according to the other embodiment.
- An example of automatic control (machine control function) of the excavator 100 by the remote controller 30R will be described below.
- The work support system SYS2 is not limited thereto, and may operate as a machine guidance function to guide the operator along the trajectory generated (simulated) by the remote controller 30R.
- The remote controller 30R automatically controls the work of the excavator 100 based on the operator operating the operation device 21R while viewing the imaging information captured by the space recognition device 70 of the excavator 100.
- The virtual space generation part 65 generates a three-dimensional virtual space model 50 based on the information of the environment detection device 79 (detection information from the space recognition device 70, the orientation detection device 71, the positioning device 72, and various sensors; operation information from the operation device 21; and communication information from the external space recognition device 300 and the dump truck) (step S11).
- The motion simulator 69 simulates the work of the virtual excavator 51 in the three-dimensional virtual space model 50 (step S12).
- The remote controller 30R can obtain the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory (simulation motion information) that are continuous on the time axis of the operation.
- The remote controller 30R transmits, to the controller 30, the operation instructions of the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory generated by the motion simulator 69 (step S13). Then, the actuator driving part 34 of the controller 30 controls the various solenoid valves 41 and the various actuators 42 of the excavator 100 in the real space according to these trajectories (step S14). The actuator driving part 34 may correct the motion (position, speed, acceleration, etc.) of the excavation attachment AT by feeding back the detection information of the various sensors of the excavator 100 in each motion of the work.
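The feedback correction mentioned above can be sketched as a simple proportional law: the position error between the planned trajectory and the sensed attachment position adjusts the commanded velocity. The gain and the numbers are illustrative assumptions, not values from the patent:

```python
def corrected_velocity(planned_pos, measured_pos, planned_vel, gain=0.8):
    """Proportional feedback on one axis of the attachment motion.

    The commanded velocity is nudged in proportion to the position
    error, so the attachment converges back onto the planned trajectory.
    """
    error = planned_pos - measured_pos
    return planned_vel + gain * error

# If the bucket lags 0.05 m behind the plan, command a little extra speed.
cmd = corrected_velocity(planned_pos=1.00, measured_pos=0.95, planned_vel=0.30)
```

A real controller would apply this per actuator (boom, arm, bucket, turning) and typically add integral or derivative terms; the proportional form is just the minimal instance of the feedback the text describes.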
- When the controller 30 acquires the detection information of the environment detection device 79 and the communication information from the dump truck during the actual work, the controller 30 transmits the acquired information to the remote controller 30R.
- The virtual space generation part 65 of the remote controller 30R updates the topographic information, object information, and additional information of the three-dimensional virtual space model 50 based on this information (step S15).
- The motion simulator 69 corrects the various trajectories (excavation trajectory, lifting and turning trajectory, earth and sand discharging trajectory, lowering and turning trajectory) of the work based on the updated information (step S16).
- The corrected trajectories are transmitted from the remote controller 30R to the controller 30, and are reflected in the control by the actuator driving part 34 of the controller 30 (step S17).
- By repeating the above-described motions during the work, the work support system SYS2 can appropriately move the excavator 100 according to the situation that changes in real time at the real-space work site. As a result, the work support system SYS2 can perform the work by the excavator 100 efficiently and accurately.
- FIG. 10 is an explanatory diagram illustrating the control of the work of the excavator 100 by the work support system SYS2 according to the modified example.
- The operator operates the operation device 21R while viewing the three-dimensional virtual space model 50 displayed on the display device RD in the remote operation room RC, and the operation signal of the operation sensor 29R is transmitted from the remote controller 30R to the controller 30.
- The virtual space generation part 65 of the remote controller 30R reproduces the environment information (topographic information and object information) in the three-dimensional virtual space model 50 based on the detection information of the environment detection device 79 transmitted from the controller 30.
- The operator can thus operate the excavation motion, the lifting and turning motion, the earth and sand discharging motion, and the lowering and turning motion of the work.
- The motion simulator 69 may generate the various trajectories of the work and guide the operator along the trajectories.
- The operation intervention part 68 may intervene in the operation and correct the operation signal of the operation sensor 29R when the operation by the operator deviates greatly from the generated trajectory.
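One plausible form of such an intervention is to pass the operator's signal through while the deviation from the generated trajectory is small, and blend it toward the trajectory-following command once a threshold is exceeded. The threshold and blend factor below are invented for illustration; the patent does not specify the correction law:

```python
def intervene(operator_cmd, trajectory_cmd, deviation, threshold=0.2, blend=0.7):
    """Correct the operator's signal only when the motion strays too far.

    Below the deviation threshold the operator keeps full authority;
    above it, the output is a weighted mix pulled toward the command
    that would follow the generated trajectory.
    """
    if deviation <= threshold:
        return operator_cmd
    return (1.0 - blend) * operator_cmd + blend * trajectory_cmd

passed = intervene(operator_cmd=1.0, trajectory_cmd=0.2, deviation=0.1)
blended = intervene(operator_cmd=1.0, trajectory_cmd=0.2, deviation=0.5)
```

A smooth ramp of the blend factor with deviation (rather than a hard threshold) would avoid abrupt handover, but the two-regime version shows the idea.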
- The controller 30 can control the motion of the excavator 100 in the real space based on the operation signal received from the remote controller 30R. At this time, the controller 30 performs feedback control of each motion based on the detection information detected by the various sensors of the excavator 100.
- The work support systems SYS1 and SYS2 of the excavator 100 are exemplary in all respects and are not restrictive.
- The embodiments can be modified and improved in various forms without departing from the scope and gist of the appended claims.
- The matters described in the above plurality of embodiments can take other configurations as long as there is no contradiction, and can be combined as long as there is no contradiction.
- The work efficiency of the excavator can be further improved by updating the environment information according to the motion of the excavator and setting the operation contents based on the environment information.
Abstract
A work support system for an excavator, including the excavator; an environment detection device configured to detect environment information of a work site of the excavator; and a simulation device configured to acquire the environment information detected by the environment detection device during a motion of the excavator, and to generate a three-dimensional virtual space model of the work site.
Description
- The present application is a continuation application of International Application No. PCT/JP2023/045244 filed on Dec. 18, 2023, which is based on and claims priority to Japanese Patent Application No. 2022-203642 filed on Dec. 20, 2022. The contents of these applications are incorporated herein by reference.
- The present disclosure relates to a work support system for an excavator.
- In the related art, a work support system for supporting excavation motions and the like of an excavator in order to increase the work efficiency of the excavator is known. For example, there is a related-art work support system that evaluates the work quality of a plurality of operators, learns the parameters of an excavator operation model based on the operation data of the best operator, and supports the operator based on the learned operation model.
- In this kind of excavator work support system, by investigating and recognizing in advance the earth and sand information, such as the soil quality of the location to be excavated (excavation object) by the excavator, the motion content of the excavator corresponding to the earth and sand information can be set for the excavation motion.
- According to an embodiment of the present invention, a work support system for an excavator is provided, the work support system including the excavator; an environment detection device configured to detect environment information of a work site of the excavator; and a simulation device configured to acquire the environment information detected by the environment detection device during a motion of the excavator, and to generate a three-dimensional virtual space model of the work site.
- FIG. 1 is a diagram illustrating an example of a work support system according to an embodiment;
- FIG. 2 is a functional block diagram illustrating a configuration example of the work support system;
- FIG. 3 is a diagram illustrating an example of a three-dimensional virtual space model generated by the virtual space generation part;
- FIG. 4 is an explanatory diagram illustrating an excavation motion of a virtual excavator;
- FIG. 5 is an explanatory diagram illustrating an earth and sand discharging motion of a virtual excavator;
- FIG. 6 is an explanatory diagram illustrating control of an excavation motion of an excavator by the work support system according to the embodiment;
- FIG. 7 is a diagram illustrating an example of a work support system according to another embodiment;
- FIG. 8 is a functional block diagram illustrating a configuration example of a work support system;
- FIG. 9 is an explanatory diagram illustrating control of an excavation motion of an excavator by the work support system according to the other embodiment; and
- FIG. 10 is an explanatory diagram illustrating control of an excavation motion of an excavator by the work support system according to the modified example.
- In the actual work site, even if the earth and sand information is held beforehand, the earth and sand information changes easily depending on the excavation site, excavation depth, etc. Therefore, even if the work support system sets the motion content of the excavator based on the earth and sand information, the actual excavator may not be able to perform excavation as scheduled.
- The present disclosure provides a work support system for an excavator capable of further improving the work efficiency of an excavator, by updating environment information in association with the motion of the excavator and setting motion contents based on the environment information.
- Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings. In each of the drawings, the same components are denoted by the same reference numerals and duplicate descriptions may be omitted.
- First, the work support system SYS1 according to the embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 is a diagram illustrating the work support system SYS1 according to the embodiment.
- An excavator 100 applied to the work support system SYS1 includes a lower traveling body 1, an upper turning body 3 mounted on the lower traveling body 1 so as to be able to turn through a turning mechanism 2, an excavation attachment AT, and a cabin 10.
- The lower traveling body 1 of the excavator 100 of the present embodiment has a pair of right and left crawlers 1C. The crawlers 1C are driven by a traveling hydraulic motor 2M which is a traveling actuator mounted on the lower traveling body 1.
- The upper turning body 3 is mounted on the lower traveling body 1 in a turnable manner via a turning mechanism 2. The turning mechanism 2 is driven by a turning hydraulic motor 2A which is a turning actuator mounted on the upper turning body 3. The turning actuator may be an electric actuator (a turning electric generator).
- A boom 4 is mounted on the upper turning body 3. An arm 5 is mounted on the tip of the boom 4, and a bucket 6 serving as an end attachment is mounted on the tip of the arm 5. The boom 4, the arm 5, and the bucket 6 form an excavation attachment AT serving as an example of an attachment. The boom 4 is driven by a boom cylinder 7, the arm 5 is driven by an arm cylinder 8, and the bucket 6 is driven by a bucket cylinder 9. The boom cylinder 7, the arm cylinder 8, and the bucket cylinder 9 form an attachment actuator. The end attachment may be a slope bucket.
- The boom 4 is supported to be vertically turnable with respect to the upper turning body 3. A boom angle sensor S1 is attached to the boom 4. The boom angle sensor S1 detects a boom angle α which is a turning angle of the boom 4. The boom angle α is, for example, a rising angle from a state in which the boom 4 is fully lowered. Therefore, the boom angle α becomes maximum when the boom 4 is fully raised.
- The arm 5 is rotatably supported with respect to the boom 4. An arm angle sensor S2 is attached to the arm 5. The arm angle sensor S2 detects an arm angle β which is a rotation angle of the arm 5. The arm angle β is, for example, an opening angle from the fully closed state of the arm 5. Therefore, the arm angle β becomes maximum when the arm 5 is fully opened.
- The bucket 6 is rotatably supported with respect to the arm 5. A bucket angle sensor S3 is attached to the bucket 6. The bucket angle sensor S3 detects the bucket angle γ which is the rotation angle of the bucket 6. The bucket angle γ is the opening angle from the state in which the bucket 6 is closed most. Therefore, the bucket angle γ is maximum when the bucket 6 is fully opened.
- Each of the boom angle sensor S1, the arm angle sensor S2 and the bucket angle sensor S3 may use only an acceleration sensor or may be a combination of an acceleration sensor and a gyro sensor. Alternatively, the boom angle sensor S1 may be a stroke sensor attached to the boom cylinder 7, a rotary encoder, a potentiometer, or an inertial measurement device. The same applies to the arm angle sensor S2 and the bucket angle sensor S3.
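Given the boom angle α, the arm angle β, and the bucket angle γ, the posture of the excavation attachment can be reduced to a tip position by planar forward kinematics. The sketch below simplifies by treating the three angles as absolute elevation angles from horizontal (the sensors described above report relative angles that would first be converted), and the link lengths are invented; neither the convention nor the lengths come from the patent:

```python
import math

def attachment_tip(alpha, beta, gamma, l_boom=5.7, l_arm=2.9, l_bucket=1.4):
    """Planar forward kinematics of the boom-arm-bucket chain.

    alpha, beta, gamma: absolute elevation angles of the boom, arm, and
    bucket links in degrees from horizontal (a simplifying assumption).
    Returns the (x, y) position of the bucket tip in the plane of the
    attachment, with the boom foot pin at the origin.
    """
    x = y = 0.0
    for angle, length in ((alpha, l_boom), (beta, l_arm), (gamma, l_bucket)):
        x += length * math.cos(math.radians(angle))
        y += length * math.sin(math.radians(angle))
    return x, y
```

With all three links horizontal the tip sits 10.0 m out at ground level; raising every link to vertical stacks the same 10.0 m straight up. A real implementation would also fold in the machine body inclination from the sensor S4 and the turning angle to place the tip in site coordinates.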
- The upper turning body 3 is provided with the cabin 10 serving as an operator cab, and a power source such as an engine 11 is mounted. The upper turning body 3 is provided with a space recognition device 70, an orientation detection device 71, and a positioning device 72, and is also provided with various sensors of the excavator 100, such as a machine body inclination sensor S4 and a turning angular velocity sensor S5. Further, an operation device 21, an output device 22, a controller 30 and the like are provided inside the cabin 10. In the present specification, for convenience of explanation, the side of the upper turning body 3 where the excavation attachment AT is attached is referred to as the front side, and the side where the counterweight is attached is referred to as the rear side.
- The space recognition device 70 is a device for recognizing a three-dimensional real space (environment information) around the excavator 100. The space recognition device 70 is configured to measure the direction and distance to a recognized object from the space recognition device 70 or the excavator 100. The space recognition device 70 includes, for example, an ultrasonic sensor, a millimeter wave radar, a monocular camera, a stereo camera, a LiDAR, a distance image sensor, an infrared sensor, or any combination thereof. In the present embodiment, the space recognition device 70 includes a front sensor 70F attached to the front end of the upper surface of the cabin 10, a rear sensor 70B attached to the rear end of the upper surface of the upper turning body 3, a left sensor 70L attached to the left end of the upper surface of the upper turning body 3, and a right sensor 70R attached to the right end of the upper surface of the upper turning body 3. The space recognition device 70 may have an upper sensor (not illustrated) attached to the excavator 100 that recognizes an object present in the space above the upper turning body 3.
- The orientation detection device 71 detects information related to the relative relationship between the orientation of the upper turning body 3 and the orientation of the lower traveling body 1. The orientation detection device 71 may be formed of, for example, a combination of a geomagnetic sensor attached to the lower traveling body 1 and a geomagnetic sensor attached to the upper turning body 3. Alternatively, the orientation detection device 71 may be formed of a combination of a GNSS receiver attached to the lower traveling body 1 and a GNSS receiver attached to the upper turning body 3. The orientation detection device 71 may be a rotary encoder, a rotary position sensor, or any combination thereof. In a configuration in which the upper turning body 3 is driven to turn by a turning electric generator, the orientation detection device 71 may be formed of a resolver. The orientation detection device 71 may be attached to, for example, a center joint provided in relation to the turning mechanism 2 for implementing relative rotation between the lower traveling body 1 and the upper turning body 3.
- The orientation detection device 71 may be formed of a camera attached to the upper turning body 3. In this case, the orientation detection device 71 applies known image processing to the imaging information captured by the camera to extract an image of the lower traveling body 1 included in the imaging information. Then, the orientation detection device 71 identifies the longitudinal direction of the lower traveling body 1 from the image of the lower traveling body 1, and derives the angle formed between the longitudinal direction of the upper turning body 3 and the longitudinal direction of the lower traveling body 1. The longitudinal direction of the upper turning body 3 is derived from the mounting position of the camera. In particular, because the crawler 1C protrudes from the upper turning body 3, the orientation detection device 71 can identify the longitudinal direction of the lower traveling body 1 by detecting the image of the crawler 1C. The orientation detection device 71 may be integrated into the controller 30. The space recognition device 70 may also be used as the camera.
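With one absolute heading measured on each body (for example, the paired geomagnetic sensors or GNSS receivers described above), the relative orientation of the upper turning body with respect to the lower traveling body reduces to a wrapped angle difference. A minimal sketch, with the wrapping convention chosen here as (-180, 180]:

```python
def relative_orientation(upper_heading_deg, lower_heading_deg):
    """Relative turning angle of the upper turning body with respect to
    the lower traveling body, from two absolute heading measurements.

    Wraps the raw difference into (-180, 180] so that, e.g., headings of
    350 deg and 10 deg give -20 deg rather than 340 deg.
    """
    diff = (upper_heading_deg - lower_heading_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff
```

The same wrapping is what a resolver- or encoder-based device would provide directly; the two-sensor form just computes it from absolute headings.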
- The positioning device 72 is configured to measure the position of the upper turning body 3. In the present embodiment, the positioning device 72 is a GNSS receiver, which detects the position of the upper turning body 3 and outputs the detected value to the controller 30. The positioning device 72 may be a GNSS compass. In this case, because the positioning device 72 can detect the position and the orientation of the upper turning body 3, the positioning device 72 also functions as an orientation detection device 71.
- The machine body inclination sensor S4 detects the inclination of the upper turning body 3 with respect to a predetermined plane. In the present embodiment, the machine body inclination sensor S4 is an acceleration sensor that detects the tilt angle of the upper turning body 3 around the longitudinal axis and around the lateral axis, with respect to the horizontal plane. The longitudinal axis and the lateral axis of the upper turning body 3, for example, are orthogonal to each other and pass through an excavator center point which is a point on the turning axis of the excavator 100.
- The turning angular velocity sensor S5 detects the turning angular velocity of the upper turning body 3. In the present embodiment, the turning angular velocity sensor S5 is a gyro sensor, but it may be a resolver, a rotary encoder, or any combination thereof. The turning angular velocity sensor S5 may detect the turning velocity. The turning velocity may be calculated from the turning angular velocity.
- Hereinafter, at least one of the boom angle sensor S1, the arm angle sensor S2, the bucket angle sensor S3, the machine body inclination sensor S4, or the turning angular velocity sensor S5 is also referred to as a posture detection device. The posture of the excavation attachment AT is detected, for example, based on the respective outputs of the boom angle sensor S1, the arm angle sensor S2, and the bucket angle sensor S3.
- The operation device 21 is a device provided in the cabin 10 for an operator to operate the excavator 100. For example, the operation device 21 has an operating lever and an operating pedal for controlling the drive of the actuator of the excavator 100. The actuator includes at least one of a hydraulic actuator or an electric actuator.
- Further, the operation device 21 has an information input device (for example, a right console box, a left console box) for an operator of the excavator 100 to input information to the controller 30. The information input device may be, for example, a switch panel installed close to the display device of the output device 22. Alternatively, the information input device may be a touch panel applied as a display device or a voice input device such as a microphone placed in the cabin 10. The information input device may also be a communication device for obtaining information from the outside.
- The output device 22 includes at least one of a display device and a sound output device. The display device is a liquid crystal display installed in the cabin 10. The display device may be a display of a portable terminal such as a smartphone. The sound output device includes at least one of a device for outputting sound to an operator in the cabin 10 or a device for outputting sound to an operator outside the cabin 10. The sound output device may be a speaker of a portable terminal.
- The controller 30 is a control device for controlling the excavator 100. In the present embodiment, the controller 30 is formed of a computer including one or more processors, a memory (volatile memory, non-volatile memory), and the like. The one or more processors read and execute a program corresponding to each function from the memory. For example, each function includes a machine guidance function for guiding a manual operation of the excavator 100 by an operator, or a machine control function for automatically (or autonomously) operating the excavator 100. The controller 30 may include a contact avoidance function for automatically operating or braking the excavator 100 to avoid contact between the excavator 100 and an object existing around the excavator 100.
- That is, the work support is an expression including automatically operating the excavator 100 on behalf of the operator, assisting the operation of the excavator 100 by the operator, and providing operation information to the operator of the excavator 100. For the work support of the excavator 100, the work support system SYS1 applies the controller 30 mounted on the excavator 100 and performing each function, and the environment detection device 79 (see FIG. 2) providing information to the controller 30. However, in addition to the work support system SYS1 being formed of the excavator 100 alone, a management apparatus 200 capable of communicating with the excavator 100 may be applied outside the excavator 100. By applying the management apparatus 200, the work support system SYS1 can support the motions of a plurality of excavators 100 (or other work machines) so as to be interlocked. Further, the work support system SYS1 may include an external space recognition device 300 having the same function as the space recognition device 70, outside the excavator 100.
- Next, an example of the work support system SYS1 according to the embodiment will be described with reference to FIG. 2. FIG. 2 is a functional block diagram illustrating a configuration example of the work support system SYS1.
- The excavator 100 includes a controller 30, a space recognition device 70, an orientation detection device 71, a positioning device 72, various solenoid valves 41, various actuators 42, and a communication device T1 as a configuration of the work support system SYS1. On the other hand, the management apparatus 200 includes a computer body 210 for performing various processes of the work support system SYS1 and a communication device T2.
- The external space recognition device 300 detects the state of the work site where the excavator 100 is located. The detection of the state of the work site includes, for example, measurement of distance, shape, and direction in addition to imaging the work site. For example, the external space recognition device 300 includes an ultrasonic sensor, a millimeter wave radar, a monocular camera, a stereo camera, a LiDAR, a distance image sensor, an infrared sensor, or any combination thereof installed in the work site. The external space recognition device 300 communicates, either wirelessly or via a wired connection, with at least one of the communication device T1 of the excavator 100 or the communication device T2 of the management apparatus 200, and successively transmits detection information of the detected state of the work site.
- The controller 30 of the excavator 100 includes a virtual space generation part 31, an excavator state identification part 32, an excavation state estimation part 33, an actuator driving part 34, a determination part 36, an operation prediction part 37, an operation intervention part 38, and a motion simulator 39 as functional blocks. Although the virtual space generation part 31, the excavator state identification part 32, the excavation state estimation part 33, the actuator driving part 34, the determination part 36, the operation prediction part 37, the operation intervention part 38, and the motion simulator 39 are distinguished for convenience of explanation, they do not need to be physically distinguished, and they may be composed entirely or partially of common software components or hardware components.
- The computer body 210 of the management apparatus 200 includes, as functional blocks, a determination part 211, an operation prediction part 212, an operation intervention part 213, and a motion simulator 214. Although the determination part 211, the operation prediction part 212, the operation intervention part 213, and the motion simulator 214 are distinguished for convenience of explanation, they do not need to be physically distinguished, and may be composed entirely or partially of common software components or hardware components. The determination part 211, the operation prediction part 212, the operation intervention part 213, and the motion simulator 214 have the same functions as the determination part 36, the operation prediction part 37, the operation intervention part 38, and the motion simulator 39 of the controller 30. The management apparatus 200 may include a virtual space generation part 215 (see the dotted line in
FIG. 2 ) having the same function as the virtual space generation part 31 of the controller 30. Hereinafter, each functional block of the controller 30 is described as a representative example, and descriptions of each functional block of the computer body 210 are omitted. It suffices for the work support system SYS1 to have the functions of the virtual space generation parts 31, 215, the determination parts 36, 211, the operation prediction parts 37, 212, the operation intervention parts 38, 213, and the motion simulators 39, 214 in at least one of the excavator 100 or the management apparatus 200. - The virtual space generation part 31 generates a three-dimensional virtual space model on the virtual three-dimensional coordinates in the virtual space generation part 31 based on the detection information of the space recognition device 70, the orientation detection device 71, and the positioning device 72, and the information of the operation device 21. The three-dimensional virtual space model is formed into a virtual rectangular parallelepiped, cube, sphere, or hemisphere according to the imaging range of the space recognition device 70. The three-dimensional virtual space model may be image information displayed on the display device of the output device 22. In this case, the three-dimensional virtual space model is a three-dimensional topographic image and is formed by computer graphics.
- Typically, the three-dimensional virtual space model is information having a plurality of layers in which object information is superimposed on topographic information indicating the topography around the excavator 100 visible to the operator seated in the cabin 10. Hereinafter, changeable information (parameters) applied to the three-dimensional virtual space model is referred to as environment information. The environment information includes topographic information and object information. The virtual space generation part 31 performs known image processing on the detection information captured by at least one of the front sensor 70F, the rear sensor 70B, the left sensor 70L or the right sensor 70R of the space recognition device 70, and extracts topographic information and object information included in the detection information. At this time, for example, the object included in the current imaging information may be extracted by comparing a plurality of pieces of past imaging information with the current imaging information. Then, the virtual space generation part 31 arranges the extracted topographic information and object information in a three-dimensional virtual space model, and reproduces virtual environment information surrounding the cabin 10 (operator) in the three-dimensional virtual space model.
- The object information extracted for reproducing the three-dimensional virtual space model includes, for example, soil to be excavated (including mounds, holes, walls, ditches, etc.), static objects other than excavated objects, the excavator 100 itself, work machines such as another excavator, vehicles, animals including people, plants, and the like. In the arrangement of the object information with respect to the three-dimensional virtual space model, the virtual space generation part 31 can utilize information such as distance and direction between the excavator 100 and the object information measured by the ultrasonic sensor, millimeter wave radar, LiDAR, and the like. Thus, various objects in the work site where the excavator 100 exists and coordinates of the objects are appropriately reproduced in the three-dimensional virtual space model.
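As a minimal illustrative sketch only (not part of the described embodiment), the layered environment information above, topographic information plus superimposed object information, each entry carrying additional information, could be held in a structure like the following. All class and field names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectInfo:
    # One object reproduced in the virtual space model: its kind, its
    # position on the virtual three-dimensional coordinates, and
    # free-form additional information (weight, hardness, id, etc.).
    kind: str
    position: tuple          # (x, y, z) in model coordinates
    extra: dict = field(default_factory=dict)

@dataclass
class VirtualSpaceModel:
    # Layered environment information: a topographic layer with
    # object information superimposed on it.
    topography: dict                      # e.g. grid cell -> surface height
    objects: list = field(default_factory=list)

    def add_object(self, obj: ObjectInfo):
        self.objects.append(obj)

    def objects_of_kind(self, kind: str):
        return [o for o in self.objects if o.kind == kind]

model = VirtualSpaceModel(topography={(0, 0): 0.0, (1, 0): 0.5})
model.add_object(ObjectInfo("excavator", (0.0, 0.0, 0.0), {"id": "EX-01"}))
model.add_object(ObjectInfo("mound", (4.0, 1.0, 0.0), {"hardness": "soft"}))
```

Coordinates measured by the ultrasonic sensor, millimeter wave radar, LiDAR, and the like would populate the `position` fields in such a structure.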
- Further, the virtual space generation part 31 may receive the detection information of the external space recognition device 300 and generate a three-dimensional virtual space model imaged by the external space recognition device 300. Alternatively, the virtual space generation part 31 may be configured to generate one three-dimensional virtual space model by integrating the detection information of the space recognition device 70 of the excavator 100 and the detection information of the external space recognition device 300.
-
FIG. 3 is a diagram illustrating an example of a three-dimensional virtual space model 50 generated by the virtual space generation part 31. For example, a virtual excavator 51 obtained by reproducing the excavator 100 in a real space is arranged in the three-dimensional virtual space model 50 of the virtual space generation part 31. The virtual excavator 51 corresponds to the shape, position, posture, etc., of the excavator 100 at an actual work site. The position, posture, and the like of the virtual excavator 51 in the three-dimensional virtual space model 50 are identified by an excavator state identification part 32 described below, and are determined, for example, based on the detection information of at least one of the space recognition device 70 or the orientation detection device 71. The position, posture, and the like of the virtual excavator 51 may be determined or adjusted by using the detection information of a posture detection device, the positioning device 72, and the like. - In the three-dimensional virtual space model 50, the excavation object 52 (mound in FIG. 3), which is the topographic information existing around the virtual excavator 51, and a virtual dump truck 53 (discharge object), which is the object information, are arranged together with their shapes. Although the illustration of other environment information is omitted in
FIG. 3 , the object information extracted from the detection information is appropriately arranged in the three-dimensional virtual space model 50. Further, the virtual space generation part 31 may generate the three-dimensional virtual space model 50 by leaving an image that is difficult to extract in the detection information as the background of the three-dimensional virtual space model 50. - Further, the virtual space generation part 31 according to the present embodiment associates (adds) various kinds of additional information with the topographic information and object information reproduced in the three-dimensional virtual space model 50.
- For example, in addition to the shape, position, or posture, additional information of the excavator such as an identification number, a type, a working time, a type and a posture of the excavation attachment AT, and earth and sand information (additional information) such as a weight, a volume, and a density of the earth and sand put in the bucket 6, are added to the virtual excavator 51. The additional information of the excavator may be estimated based on the detection information of the environment detection device 79, or information previously stored in the controller 30 may be used. Further, the earth and sand information of the earth and sand held by the excavator 100 can be estimated based on the load applied while the excavator 100 is performing excavation, detected by a pressure sensor, a load sensor, or the like, which is a part of the environment detection device 79.
- That is, the environment detection device 79 is a device that detects information that affects the environment information constituting the three-dimensional virtual space model 50 inside or outside the excavator 100. The environment detection device 79 may include various sensors such as a space recognition device 70, an orientation detection device 71, a positioning device 72, a posture detection device (the boom angle sensor S1, the arm angle sensor S2, the bucket angle sensor S3, the machine body inclination sensor S4, and the turning angular velocity sensor S5), a pressure sensor not illustrated, a load sensor, and an operation sensor of the operation device 21. The environment detection device 79 may also use detection information of a device (the external space recognition device 300, other work machines and vehicles (for example, a dump truck)) installed outside the excavator 100.
- Further, for example, additional information of an excavation object such as the sediment accumulation amount, weight, density, hardness, and soil quality of the earth and sand is added to the excavation object 52 of the three-dimensional virtual space model 50. The additional information of an excavation object is estimated by the excavation state estimation part 33 described below based on the detection information of the space recognition device 70. Alternatively, the additional information of an excavation object may be estimated by the pressure and load applied to the bucket 6 of the excavator 100 when the excavator 100 operates, the weight of earth and sand recognized by the dump truck when the earth and sand is discharged to the dump truck, an image, or the like. Alternatively, a part of the additional information of the excavation object may be extracted from the design data of the work site stored in advance in the controller 30.
- On the other hand, in addition to the shape, position, and posture, additional information of the discharge object such as identification number, type, size of the loading platform, state of the loading platform, and working time is added to the virtual dump truck 53 of the three-dimensional virtual space model 50. Further, earth and sand information (additional information of the discharge object) such as weight, volume, and density of earth and sand loaded on the loading platform of the virtual dump truck 53 (the discharge object) may be added to the loading platform of the virtual dump truck 53. The additional information of the discharge object may be estimated based on object information extracted from the detection information of the space recognition device 70, or information previously stored in the controller 30 may be used. The earth and sand information is added to the virtual dump truck 53 of the three-dimensional virtual space model 50 by detecting, at the actual dump truck, the weight, image, and the like of the earth and sand that changes when the earth and sand is discharged from the excavator 100, and receiving the information.
- The additional information to be added to the environment information may include various kinds of information other than the above. For example, when a person or another work machine exists around the excavator 100, the object information thereof is arranged in the three-dimensional virtual space model 50. Additional information of the relative distance to the excavator 100 can be added to the object information. Further, additional information such as identification number, type, and working time can be added to the information of other work machines.
- The virtual space generation part 31 may integrate the design data of the work site stored in the controller 30 into the three-dimensional virtual space model 50. The design data has the completed shape of the excavation of the work site, and the virtual space generation part 31 superimposes and displays, on the three-dimensional virtual space model 50, graphics such as computer graphics indicating the position of the completed shape. Further, the design data may include the state of earth and sand (position, shape, soil quality, hardness) of the excavation site investigated in advance, and this information may be added to the topographic information of the three-dimensional virtual space model 50.
- Then, the virtual space generation part 31 sequentially updates the environment information of the three-dimensional virtual space model 50 based on the detection information of the space recognition device 70, the orientation detection device 71, and the positioning device 72, information of the operation device 21, and communication information from the external space recognition device 300 and the dump truck. For example, when the excavator 100 excavates the excavation object of the work site, the environment information of the three-dimensional virtual space model 50 may be changed based on the detection information at that time. Even if the earth and sand information (additional information) of the excavation object 52 is added in advance, the earth and sand information may be different when the excavator 100 has actually performed the excavation. In such a case, the virtual space generation part 31 changes or corrects the earth and sand information based on the detection information detected when the excavator 100 has actually performed the excavation. Thus, the virtual space generation part 31 can generate the three-dimensional virtual space model 50 that is even closer to the work site environment.
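The correction of previously added earth and sand information against values detected during actual excavation could, in its simplest conceivable form, be a weighted blend of the stored value and the newly measured value. The function below is a hedged sketch; its name and the default weight are assumptions for illustration only:

```python
def correct_soil_estimate(prior: float, measured: float, weight: float = 0.5) -> float:
    """Blend a stored soil parameter (e.g. density in kg/m^3) with the
    value observed during actual excavation. weight=0.0 keeps the stored
    value; weight=1.0 fully trusts the new measurement."""
    return (1.0 - weight) * prior + weight * measured

# Stored density was 1600 kg/m^3, but actual excavation suggests 1800 kg/m^3.
corrected = correct_soil_estimate(1600.0, 1800.0)
```

With the default weight, the corrected density sits halfway between the stored and measured values; a real system would tune the weight to sensor reliability.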
- Referring back to
FIG. 2 , the excavator state identification part 32 is configured to identify the state of the excavator 100 (posture of the excavation attachment AT, etc.) including the position and orientation of the excavator 100. The position of the excavator 100 is, for example, the latitude, longitude, and altitude of the reference point of the excavator 100. The excavator state identification part 32 identifies the position of the excavator 100 based on the output of the positioning device 72, and identifies the orientation of the excavator 100 based on the output of the orientation detection device 71. The position and orientation of the excavator 100 identified by the excavator state identification part 32 are reflected on the virtual excavator 51 of the three-dimensional virtual space model 50. - The excavation state estimation part 33 estimates additional information of the excavation object required for the work of the excavator 100 based on the detection information of the environment detection device 79. The term "work" of the excavator 100 in the present specification means a concept including a series of movements such as an excavation motion with respect to the earth and sand that is the excavation object, conveyance of the earth and sand accompanying the lifting and turning motion of the excavator 100, an earth and sand discharging motion with respect to the loading platform of the dump truck, and the return of the lowering and turning motion of the excavator 100. Examples of the additional information of the excavation object include the hardness of the ground to be excavated, the density of the soil, and the soil quality. During the excavation motion with respect to the excavation object in the real space, an excavation reaction force is applied from the excavation object to the excavation attachment AT.
Therefore, the excavation state estimation part 33 can estimate the additional information of the excavation object based on the pressure detected by the pressure sensor provided in the hydraulic path of each cylinder of the excavation attachment AT or the load detected by the load sensor provided at an appropriate position of the excavation attachment AT. Besides the excavation with respect to the excavation object, the excavation state estimation part 33 may estimate the earth and sand information based on the detection information of various sensors while the earth and sand are held by the excavation attachment AT. The additional information of the excavation object estimated by the excavation state estimation part 33 is added to the earth and sand information of the virtual excavator 51 of the three-dimensional virtual space model 50 as described above, and is appropriately reflected in the additional information of the excavation object 52 excavated by the virtual excavator 51.
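In its most reduced form, estimating excavation-object hardness from the excavation reaction force might amount to thresholding the detected cylinder pressure. The sketch below assumes this simplification; the threshold values and the function name are illustrative assumptions, not values from the embodiment:

```python
def classify_ground_hardness(cylinder_pressure_mpa: float,
                             soft_limit: float = 15.0,
                             hard_limit: float = 25.0) -> str:
    """Classify ground hardness from the pressure detected in the
    hydraulic path of a cylinder of the excavation attachment during
    excavation. Thresholds are illustrative; a real system would
    calibrate them per attachment and soil type."""
    if cylinder_pressure_mpa < soft_limit:
        return "soft"
    if cylinder_pressure_mpa < hard_limit:
        return "medium"
    return "hard"
```

The returned label would then be added to the earth and sand information of the excavation object in the three-dimensional virtual space model.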
- The actuator driving part 34 is configured to drive various solenoid valves 41 and various actuators 42 mounted on the excavator 100. The actuator driving part 34 outputs an operation signal to the corresponding solenoid valve 41 or the corresponding actuator 42 based on the control signal processed by the controller 30 itself, in addition to the operation signal of the operation device 21. By driving the solenoid valve 41 and the actuator 42 which received the operation signal, the excavator 100 performs various motions (excavation motion, lifting and turning motion, earth and sand discharging motion, lowering and turning motion, etc.) in the work.
- On the other hand, the determination part 36 of the controller 30 is configured to determine whether there is any matter to be reported to the operator of the excavator 100 regarding the situation around the excavator 100. For example, the determination part 36 determines whether there is a matter to be reported to the operator of the excavator 100 based on at least one of the detection information of the environment detection device 79 provided in the excavator 100 or the set motion content of the excavator 100. The determination part 36 may also be configured to determine whether there is a matter to be reported to the operator of the excavator 100 based on the detection information of the external space recognition device 300. The external space recognition device 300 may be a sensor (camera, LiDAR, etc.) attached to another work machine or a sensor (camera, LiDAR, etc.) attached to a flying object such as a multicopter (drone) flying over the work site. In the determination, the determination part 36 may use the three-dimensional virtual space model 50 of the virtual space generation part 31 or the simulation result of the motion simulator 39 described below.
- For example, the determination part 36 determines, based on the detection information of the space recognition device 70 or the external space recognition device 300, at least one of the position, posture, or motion content of another work machine or vehicle. Whether or not there is a matter to be reported may be determined based on the presence or absence of the same or similar situation in comparison with past cases. For example, if the determination part 36 recognizes the presence of a person near the excavator 100 (the motion range of the excavation attachment AT), the determination part 36 may determine that there is a matter to be reported to the operator.
- Further, if the determination part 36 detects the presence of a downhill slope around the excavator 100, it may determine that there is a matter to be reported to the operator. At this time, the determination part 36 may determine the presence of the downhill slope based on the topographic information of the design data stored in advance in the controller 30. Alternatively, if the determination part 36 detects the existence of an object (for example, an electric wire) that may affect the work outside the range covered by the three-dimensional virtual space model 50, the determination part 36 may determine that there is a matter to be reported to the operator.
- If it is determined that there is a matter to be reported to the operator of the excavator 100, the determination part 36 performs processing to alert the operator. As an example, the determination part 36 transmits information about the report matter to the output device 22. Thus, the output device 22 can report, to the operator, the information about the report matter received from the determination part 36.
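One plausible sketch of the determination described above is a distance check between detected objects and the excavator, producing report messages for the output device 22. The danger radius, object labels, and function name below are assumptions for illustration:

```python
import math

def matters_to_report(excavator_pos, detected_objects, danger_radius_m=5.0):
    """Return report messages for people detected within a danger radius,
    a stand-in here for the motion range of the excavation attachment.
    detected_objects: list of (kind, (x, y)) pairs from the space
    recognition devices."""
    reports = []
    for kind, pos in detected_objects:
        distance = math.dist(excavator_pos, pos)
        if kind == "person" and distance <= danger_radius_m:
            reports.append(f"person detected {distance:.1f} m from excavator")
    return reports
```

Any non-empty result would be forwarded to the output device so the operator can be alerted.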
- The operation prediction part 37 is configured to predict the operation signal after a predetermined time based on the operation signal from the operation device 21 or the signal from the management apparatus 200. This is to prevent degradation of operation responsiveness caused by processing overload and communication delay. The predetermined time is, for example, several milliseconds to several tens of milliseconds. For example, the operation prediction part 37 predicts the operation signal after a predetermined time based on the transition of the operation signal (tilt angle of the operation lever of the operation device 21) during a predetermined time in the past. As an example, when the operation prediction part 37 detects that the tilt angle of the operation lever has tended to increase during a predetermined time in the past, the operation prediction part 37 predicts that the tilt angle after a predetermined time will be larger than the current tilt angle. Thus, the excavator 100 can move while reducing the delay of the operation signal.
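The trend-based prediction described above can be sketched as a linear extrapolation over recent operation samples. The sampling assumptions and the function name are hypothetical; a real controller would likely filter noise and clamp to the lever's physical range:

```python
def predict_tilt_angle(samples, horizon_steps=1):
    """Predict the operation-lever tilt angle a few control cycles ahead
    from its recent trend. samples: tilt angles at successive control
    cycles, oldest first."""
    if len(samples) < 2:
        return samples[-1]
    slope = samples[-1] - samples[-2]   # change per control cycle
    return samples[-1] + slope * horizon_steps

# The tilt angle has been increasing by 5 degrees per cycle,
# so two cycles ahead it is predicted to be larger than now.
predicted = predict_tilt_angle([0, 5, 10], horizon_steps=2)
```

This matches the behavior described above: an increasing trend yields a predicted tilt angle larger than the current one.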
- The operation intervention part 38 determines whether or not to intervene in the operation of the excavator 100 by the operator based on the detection information of the environment detection device 79, and intervenes in the operation as necessary. For example, the operation intervention part 38 functions as a contact avoidance function for avoiding contact between the excavator 100 and other objects by intervening in the operation of the operator. The operation intervention part 38 may use the three-dimensional virtual space model 50 of the virtual space generation part 31, or may use the simulation result of the motion simulator 39 described below in determining the intervention of the operation.
- When the operation intervention part 38 detects that there is a risk of contact between the excavator 100 and an object around the excavator 100, the operation intervention part 38 determines that intervention in the operation is imperative. As an example, when the operation intervention part 38 detects the presence of a person on the left side of the excavator 100 and the start of the left turning operation (the operation of tilting the left operation lever to the left), the operation intervention part 38 determines that intervention in the operation is imperative. In this case, the operation intervention part 38 invalidates the operation signal generated based on the left turning operation to prevent the upper turning body 3 from turning left. The operation intervention part 38 may intervene in the operation upon determining whether or not the excavator 100 and the object may come into contact with each other based on the detection information of the external space recognition device 300.
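Invalidating the left-turning operation signal when a person is detected on the left side could look like the following sketch; the dictionary representation of the operation signal and the key names are hypothetical:

```python
def filter_operation_signal(signal, person_on_left):
    """Return the operation signal with its left-turn component
    invalidated (set to zero) when a person is detected on the left
    side of the excavator; otherwise pass the signal through."""
    if person_on_left and signal.get("turn_left", 0.0) > 0.0:
        blocked = dict(signal)      # do not mutate the caller's signal
        blocked["turn_left"] = 0.0  # prevent the upper turning body from turning left
        return blocked
    return signal
```

The filtered signal, rather than the raw lever signal, would then be handed to the actuator driving part.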
- Further, the operation intervention part 38 may release the braking operation (stop, deceleration, etc.) applied at the time of intervention, based on a release condition being satisfied (for example, the operation lever is temporarily returned to the neutral position or a release button is pressed).
- The motion simulator 39 is configured to simulate the motion of the excavator 100 in the three-dimensional virtual space model 50. The term "motion of the excavator 100" refers to various classified operations in the work, etc., of the excavator 100. For example, the excavation work consists of a plurality of motions such as an excavation motion, a lifting and turning motion, an earth and sand discharging motion, and a lowering and turning motion. The motion simulator 39 simulates the motion of the excavator 100 in response to the ON operation of the automatic control button 23 (see
FIG. 6 ) for supporting the work of the excavator 100. For example, the automatic control button 23 is provided in the operation device 21 in the cabin 10. - Specifically, the motion simulator 39 simulates the work (excavation motion, lifting and turning motion, earth and sand discharging motion, lowering and turning motion, etc.) of the virtual excavator 51 by using the three-dimensional virtual space model 50 generated by the virtual space generation part 31. That is, the controller 30 of the excavator 100 functions as a simulation device of an embodiment of the present invention. The information simulated by the motion simulator 39 may be displayed on the display device of the output device 22. The topographic information of the three-dimensional virtual space model 50 may change according to the simulated motion of the virtual excavator 51.
-
FIG. 4 is an explanatory diagram illustrating the excavation motion of the virtual excavator 51. As illustrated in FIG. 4 , when the excavation motion is simulated, for example, the motion simulator 39 generates a plurality of excavation trajectories (simulation motion information) for moving the excavation attachment 51 a of the virtual excavator 51 with respect to the excavation object 52 in the three-dimensional virtual space model 50. The plurality of excavation trajectories are generated based on the range in which the excavation attachment AT of the excavator 100 in the real space can actually move, and the additional information of the excavation object 52, by using the detection information of the posture detection device of the excavator 100 in the real space. - For example, the motion simulator 39 sets the motion range (including the moving direction, moving distance, posture, and the like of the excavation attachment 51 a) of the excavation attachment 51 a of the virtual excavator 51, to the position of the excavation object 52. Further, the motion simulator 39 assumes a plurality of excavation patterns in which the order of the excavating locations of the excavation object 52 in the motion range of the excavation attachment 51 a and the excavation amount (or excavation depth) are appropriately changed. In the assumption of the plurality of patterns, the density, hardness, soil quality, and the like of the earth and sand in the earth and sand information of the excavation object 52 are used.
- Then, the motion simulator 39 generates a plurality of excavation trajectories of the virtual excavator 51 corresponding to the motion range of the excavation attachment 51 a and the plurality of excavation patterns. The example (a) in the upper right figure of
FIG. 4 is a pattern in which the first excavation location is the upper part of the excavation object 52, and then the surface of the excavation object 52 is excavated downward in order, and the inner part of the excavation object 52 is excavated after the surface. On the other hand, the example (b) in the upper right figure of FIG. 4 is a pattern in which the first excavation location is the halfway point of the excavation object 52, and the surrounding area is excavated in detail. For example, pattern (a) is suitable when the earth and sand of the excavation object 52 are soft, and pattern (b) is suitable when the earth and sand of the excavation object 52 are hard. However, as described above, the density, hardness, and soil quality of the excavation object in the real space, found during the excavation motion of the excavator 100, may differ from the information obtained before the motion. Therefore, even if the additional information of the excavation object is linked, the motion simulator 39 can generate a plurality of excavation trajectories when the density, hardness, and soil quality of the excavation object 52 are changed. - After generating a plurality of excavation trajectories, the motion simulator 39 appropriately evaluates (simulates) the plurality of excavation trajectories to select an optimum excavation trajectory. In the evaluation of the optimum excavation trajectory, for example, an objective function for efficiently excavating the excavation object 52 may be applied, and constraints such as the motion speed, the work time, and the safety of work of the excavation attachment 51 a may be used to evaluate the optimum excavation trajectory. Thus, the motion simulator 39 can obtain an optimum excavation trajectory corresponding to the shape, the position, and the additional information of the excavation object currently acquired.
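Evaluating a plurality of excavation trajectories with an objective function under constraints might, in a highly simplified form, look like the following sketch. The volume-per-time objective, the work-time constraint, and all candidate data are assumptions for illustration:

```python
def select_best_trajectory(candidates, max_time_s=60.0):
    """Select the candidate excavation trajectory that maximizes
    excavated volume per unit work time, discarding candidates that
    violate the work-time constraint. Returns the best candidate's
    name, or None if no candidate is feasible."""
    feasible = [c for c in candidates if c["time_s"] <= max_time_s]
    if not feasible:
        return None
    best = max(feasible, key=lambda c: c["volume_m3"] / c["time_s"])
    return best["name"]

candidates = [
    {"name": "top-down", "volume_m3": 12.0, "time_s": 50.0},
    {"name": "mid-first", "volume_m3": 15.0, "time_s": 70.0},  # violates time limit
    {"name": "shallow", "volume_m3": 8.0, "time_s": 40.0},
]
```

A real evaluator would also score motion speed and safety of work rather than time alone, but the constraint-then-objective structure would be the same.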
- When the work support system SYS1 automatically controls the excavator 100, the controller 30 provides this optimum excavation trajectory to the actuator driving part 34 as simulation motion information. Thus, the actuator driving part 34 can control the various solenoid valves 41 and the various actuators 42 to move the excavation attachment 51 a in the real space along the excavation trajectory (see the lower diagram in
FIG. 4 ). - When the excavation attachment AT in the real space is performing excavation along the excavation trajectory, the controller 30 acquires detection information from the environment detection device 79 (includes a pressure sensor and a load sensor) and communication information from the dump truck. Based on this information, the virtual space generation part 31 updates the shape, position, and posture of the topographic information and object information of the three-dimensional virtual space model 50, and the additional information added thereto.
- Further, the motion simulator 39 corrects the excavation trajectory based on the updated topographic information, object information, and additional information of the three-dimensional virtual space model 50. The motion simulator 39 may re-select a plurality of excavation trajectories calculated previously, or may newly re-calculate excavation trajectories. When the optimum excavation trajectory is newly provided from the motion simulator 39, the actuator driving part 34 switches to this excavation trajectory to move the excavation attachment AT in the real space.
-
FIG. 5 is an explanatory diagram illustrating the earth and sand discharging motion of the virtual excavator 51. As illustrated in FIG. 5 , when the earth and sand discharging motion is simulated, the motion simulator 39 generates a plurality of earth and sand discharging trajectories (simulation motion information) for moving the excavation attachment 51 a of the virtual excavator 51 with respect to the virtual dump truck 53 of the three-dimensional virtual space model 50. The plurality of earth and sand discharging trajectories are generated based on the range in which the excavation attachment AT of the excavator 100 in the real space can actually move, determined from various sensors of the excavator 100 in the real space, and based on the dump information of the virtual dump truck 53. - For example, the motion simulator 39 sets the motion range (including the moving direction, moving distance, posture, and the like of the excavation attachment 51 a) of the excavation attachment 51 a of the virtual excavator 51 in the earth and sand discharge, to the position above the loading platform of the virtual dump truck 53. Further, the motion simulator 39 assumes a plurality of earth and sand discharge patterns in which the position of loading onto the loading platform of the virtual dump truck 53 in the motion range of the excavation attachment 51 a and the loading amount are appropriately changed. In the assumption of the plurality of patterns, the sediment amount, density, hardness, and soil quality of earth and sand in the earth and sand information of the excavator 100 and/or the dump truck are used.
- Then, the motion simulator 39 generates a plurality of earth and sand discharging trajectories of the virtual excavator 51 corresponding to the motion range of the excavation attachment 51 a and the plurality of earth and sand discharge patterns. For example, the upper right diagram of FIG. 5 illustrates a trajectory in which the location of earth and sand discharge is positioned at the rear of the loading platform of the virtual dump truck 53, and the earth and sand on the loading platform is leveled forward by the excavation attachment 51 a after the earth and sand discharge is performed. - After generating the plurality of earth and sand discharging trajectories, the motion simulator 39 evaluates (simulates) the plurality of earth and sand discharging trajectories to select the optimum earth and sand discharging trajectory. The optimum earth and sand discharging trajectory may be evaluated by applying an objective function that loads the earth and sand evenly onto the loading platform of the virtual dump truck 53, under constraints such as the motion speed of the excavation attachment 51 a, the work time, and the safety of the work. Thus, the motion simulator 39 can obtain the optimum earth and sand discharging trajectory according to the currently acquired earth and sand information.
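- The selection logic described in this paragraph can be sketched in code. The following is a minimal illustrative sketch, not part of the patent disclosure: candidate discharge patterns vary the loading position and amount, a hypothetical evenness objective scores the resulting pile on the loading platform, and a work-time constraint filters candidates. All names, numeric values, and the scoring terms are assumptions.

```python
# Illustrative sketch (not the patented implementation) of generating
# candidate earth-and-sand discharge patterns and selecting the one that
# loads the platform most evenly, subject to a work-time constraint.
from dataclasses import dataclass

@dataclass
class DischargePattern:
    drop_position: float   # position along the loading platform (m, 0 = rear)
    load_amount: float     # bucket load for this pass (kg)
    work_time: float       # estimated time for the pass (s)

def evenness_penalty(platform_profile, pattern, bin_width=0.5):
    """Penalty = variance of the pile masses after adding this load."""
    profile = list(platform_profile)
    idx = min(int(pattern.drop_position / bin_width), len(profile) - 1)
    profile[idx] += pattern.load_amount
    mean = sum(profile) / len(profile)
    return sum((h - mean) ** 2 for h in profile) / len(profile)

def select_optimum(platform_profile, patterns, max_work_time=20.0):
    """Filter by the work-time constraint, then minimize the evenness penalty."""
    feasible = [p for p in patterns if p.work_time <= max_work_time]
    return min(feasible, key=lambda p: evenness_penalty(platform_profile, p))

# The rear of a 2 m platform already carries a pile (index 0); candidates
# drop at the rear, middle, or front.
profile = [300.0, 0.0, 0.0, 0.0]
candidates = [
    DischargePattern(0.2, 300.0, 12.0),   # rear again -> uneven pile
    DischargePattern(1.0, 300.0, 14.0),   # middle of the platform
    DischargePattern(1.8, 300.0, 30.0),   # front, but exceeds the time limit
]
best = select_optimum(profile, candidates)
```

With the rear of the platform already loaded, the middle-drop candidate minimizes the penalty; richer versions would add the motion-speed and safety constraints mentioned above as further filters or penalty terms.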
- When the excavator 100 is automatically controlled by the work support system SYS1, the controller 30 provides the actuator driving part 34 with this optimum earth and sand discharging trajectory. Thus, the actuator driving part 34 can control various solenoid valves 41 and various actuators 42 to move the excavation attachment AT in the real space along the earth and sand discharging trajectory (see the lower diagram in
FIG. 5 ). - When the excavation attachment AT in the real space is performing an earth and sand discharging motion along the earth and sand discharging trajectory, the controller 30 acquires detection information of the environment detection device 79. Based on this information, the virtual space generation part 31 updates the shape, position, and posture of the topographic information and object information of the three-dimensional virtual space model 50 and the additional information added thereto.
- The motion simulator 39 corrects the optimum earth and sand discharging trajectory based on the updated topographic information, object information, and additional information of the three-dimensional virtual space model 50. The motion simulator 39 may re-select from among the plurality of previously calculated earth and sand discharging trajectories, or may re-calculate the earth and sand discharging trajectory. When a new optimum earth and sand discharging trajectory is provided, the actuator driving part 34 switches to this earth and sand discharging trajectory to move the excavation attachment AT in the real space.
- The work support system SYS1 of the excavator 100 according to the embodiment is basically constructed as described above, and the operation thereof will be described below with reference to
FIG. 6. FIG. 6 is an explanatory diagram illustrating the control of the work of the excavator 100 by the work support system SYS1 according to the embodiment. An example of automatic control (machine control function) of the excavator 100 by the controller 30 will be described below. However, the work support system SYS1 is not limited thereto, and may operate as a machine guidance function for guiding the operator along the trajectory generated by the controller 30. Further, the work support system SYS1 may automatically control and guide the excavator 100 by the management apparatus 200. - The controller 30 starts automatic control of the work of the excavator 100 when the operator turns on the automatic control button 23 of the excavator 100. After the start, the virtual space generation part 31 generates the three-dimensional virtual space model 50 based on the environment information stored before the work of the excavator 100 (step S1). The environment information stored before the work is stored in the controller 30 and may inherit the three-dimensional virtual space model 50 of the previous work (for example, the previous day). As described above, various kinds of additional information are added to the topographic information and object information of the three-dimensional virtual space model 50, such as additional information of the virtual excavator 51, earth and sand information (additional information) of the virtual excavator 51, additional information of the excavation object 52, additional information of the dump (discharge object), and earth and sand information (additional information) of the virtual dump truck 53.
- Then, the motion simulator 39 simulates the work of the virtual excavator 51 in the three-dimensional virtual space model 50 (step S2). That is, the motion simulator 39 generates each optimum trajectory in the work (excavation motion, lifting and turning motion, earth and sand discharging motion, lowering and turning motion) based on the environment information. Thus, the controller 30 can obtain the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory that are continuous on the time axis of the work.
- The actuator driving part 34 receives the information of the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory generated by the motion simulator 39 (step S3). Thus, the actuator driving part 34 controls the various solenoid valves 41 and the various actuators 42 according to these trajectories in the excavator 100 in the real space (step S4). That is, the excavator 100 in the real space automatically performs the excavation motion, the lifting and turning motion, the earth and sand discharging motion, and the lowering and turning motion as the actual work.
- The actuator driving part 34 may correct the motion (position, speed, acceleration, etc.) of the excavation attachment AT by feeding back the detection information of the various sensors of the excavator 100 during the excavation motion to the actuator driving part 34. Further, the actuator driving part 34 may perform feedback control of each motion based on the detection information of the various sensors of the excavator 100 also in the lifting and turning motion, the earth and sand discharging motion, and the lowering and turning motion.
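- As a rough illustration of this feedback idea (a sketch under assumed names and gains, not the disclosed control law), the planned feedforward speed can be corrected in proportion to the position error reported by the sensors:

```python
# Hypothetical proportional correction of the attachment command; the gain
# kp and the signal names are illustrative assumptions.
def corrected_command(setpoint_pos, measured_pos, feedforward_speed, kp=2.0):
    """Blend the planned (feedforward) speed with a proportional position correction."""
    error = setpoint_pos - measured_pos          # trajectory minus sensed position
    return feedforward_speed + kp * error

# The attachment lags 0.05 m behind the trajectory, so the command speeds up.
cmd = corrected_command(setpoint_pos=1.25, measured_pos=1.20, feedforward_speed=0.4)
```

An actual controller would apply such corrections per actuator (boom, arm, bucket, swing) and typically with integral and derivative terms as well.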
- In the actual work, the controller 30 acquires the information of the environment detection device 79 (detection information from the space recognition device 70, the orientation detection device 71, the positioning device 72, and various sensors, or information from the operation device 21; communication information from the external space recognition device 300 and the dump truck) at a predetermined timing. The predetermined timing may be set sequentially (every predetermined time) during the work of the excavator 100, or a setting may be made such that the log stored during a motion is acquired at the end of that motion. The virtual space generation part 31 updates the topographic information, object information, and additional information of the three-dimensional virtual space model 50 based on this information (step S5). For example, the virtual space generation part 31 updates the additional information (ground hardness, soil density, soil quality, etc.) of the excavation object 52 of the three-dimensional virtual space model 50 to the information estimated by the excavation state estimation part 33. Further, the additional information and earth and sand information of the virtual excavator 51 and the additional information and earth and sand information of the virtual dump truck 53 (discharge object) are also updated as appropriate.
- Thus, the motion simulator 39 corrects various trajectories (excavation trajectory, lifting and turning trajectory, earth and sand discharging trajectory, and lowering and turning trajectory) of the work based on the updated information (step S6). The corrected trajectories are transmitted to the actuator driving part 34 and reflected in the control by the actuator driving part 34 (step S7).
- The work support system SYS1 can appropriately move the excavator 100 according to the situations that change in real time in the real space work site, by repeating the above motions during the work. As a result, the work support system SYS1 can efficiently and accurately perform the work by the excavator 100.
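- The repeated cycle of steps S1 to S7 can be sketched as a loop, with schematic stand-ins for the simulator, the actuator driving part, and the environment detection device (everything here is a hypothetical simplification, not the disclosed implementation):

```python
# Schematic sketch of the S1-S7 cycle: simulate trajectories in the virtual
# space, drive the real actuators, fold sensor detections back into the
# model, and re-correct the trajectories. All callables are toy stand-ins.
def work_cycle(model, simulate, drive, sense, update, max_iters=3):
    trajectories = simulate(model)           # S2-S3: optimum trajectories
    for _ in range(max_iters):
        drive(trajectories)                  # S4: actuator driving part
        detections = sense()                 # S5: environment detection device
        model = update(model, detections)    # S5: refresh the virtual model
        trajectories = simulate(model)       # S6-S7: corrected trajectories
    return trajectories

# Toy stand-ins: the "model" is a number, each detection adds 1 to it, and
# the "trajectory" is simply the model doubled.
log = []
final = work_cycle(0, simulate=lambda m: m * 2, drive=log.append,
                   sense=lambda: 1, update=lambda m, d: m + d)
```

The key point the loop captures is that simulation and execution alternate: each pass drives the machine on the latest trajectories, then re-simulates against the freshly updated virtual space model.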
- Note that the work support system SYS1 according to an embodiment of the present invention is not limited to the above-described embodiments, and various modified examples can be adopted. As an example, when the controller 30 of the work support system SYS1 recognizes that there is a person around the excavator 100 based on the environment information, a trajectory (motion information) of the virtual excavator 51 can be generated so as to avoid the person in the three-dimensional virtual space model 50. For example, when there is a person on the left of the virtual excavator 51, the lifting and turning trajectory and the lowering and turning trajectory of the virtual excavator 51 may be switched from a counterclockwise trajectory to a clockwise trajectory.
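- A minimal sketch of this person-avoidance rule might look as follows; the side encoding and function names are assumptions for illustration, not from the disclosure:

```python
# If the detected person stands on the side a turning trajectory would sweep
# through, switch the turn to the opposite direction. Here a counterclockwise
# turn is assumed to sweep the machine's left side.
def choose_turn_direction(default_direction, person_side):
    sweep_side = "left" if default_direction == "counterclockwise" else "right"
    if person_side == sweep_side:
        return "clockwise" if default_direction == "counterclockwise" else "counterclockwise"
    return default_direction

# Person on the left: the counterclockwise trajectory is switched.
turn = choose_turn_direction("counterclockwise", person_side="left")
```

A full implementation would of course re-simulate the reversed trajectory in the virtual space model before committing to it.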
- Next, a work support system SYS2 of the excavator 100 according to the other embodiment will be described with reference to
FIGS. 7 and 8. FIG. 7 is a diagram illustrating an example of the work support system SYS2 according to the other embodiment. FIG. 8 is a functional block diagram illustrating an example of the configuration of the work support system SYS2. - The work support system SYS2 according to the other embodiment differs from the work support system SYS1 according to the embodiment in that the excavator 100 is remotely operated from the remote operation room RC provided at a position away from the excavator 100. The configuration other than the remote operation of the excavator 100 is basically the same as that of the embodiment, and a detailed description thereof will be omitted.
- The remote operation room RC includes a remote controller 30R, a sound output device A2, an indoor imaging device C2, a display device RD, and a communication device T3. The remote operation room RC also includes a driver's seat DS where the operator who remotely operates the excavator 100 sits.
- The remote controller 30R is an arithmetic device that executes various arithmetic operations. The remote controller 30R, like the controller 30 of the excavator 100, is formed of a computer including one or more processors and a memory. Various functions of the remote controller 30R are implemented by the processor executing programs stored in the memory.
- The sound output device A2 outputs sound, and is configured to replay sound collected by a sound collecting device (not illustrated) attached to the excavator 100.
- The indoor imaging device C2 captures the interior of the remote operation room RC. For example, the indoor imaging device C2 is a camera installed inside the remote operation room RC and captures the operator OP sitting in the driver's seat DS.
- The display device RD displays information about the situation surrounding the excavator 100. For example, the display device RD is a multi-display consisting of nine monitors arranged in three rows and three columns, and is configured to display the conditions of the space at the front, at the left, and at the right of the excavator 100. Alternatively, the display device RD may be a head-mounted display that the operator can wear.
- The communication device T3 is configured to be able to communicate with the communication device T1 of the excavator 100, the communication device T2 of the management apparatus 200, the external space recognition device 300, and the like.
- In the remote operation room RC, the area around the driver's seat DS has a structure similar to that around the driver's seat installed in the cabin 10 of the excavator 100. Specifically, a left console box is arranged at the left of the driver's seat DS, and a right console box is arranged at the right of the driver's seat DS. A left operation lever is arranged at the top front end of the left console box, and a right operation lever is arranged at the top front end of the right console box. A travel lever and a travel pedal are arranged in front of the driver's seat DS. The left operation lever, the right operation lever, the travel lever, and the travel pedal constitute the operation device 21R of the remote operation room RC.
- The operation device 21R is provided with an operation sensor 29R for detecting the operation contents of the operation device 21R. The operation sensor 29R includes, for example, an inclination sensor for detecting the tilt angle of the operation lever, an angle sensor for detecting the oscillation angle of the operation lever around an oscillation axis, etc. The operation sensor 29R may be formed of other sensors such as a pressure sensor, a current sensor, a voltage sensor, or a distance sensor. The operation sensor 29R outputs information about the detected operation contents of the operation device 21R, to the remote controller 30R. The remote controller 30R generates an operation signal based on the received information, and transmits the generated operation signal to the excavator 100.
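- The lever-to-signal path described above can be illustrated with a small sketch; the dead zone, scaling, and function names are illustrative assumptions, not values from the disclosure:

```python
# Map a lever tilt angle reported by the operation sensor 29R to a
# normalized operation command in [-1.0, 1.0], with a small dead zone so
# that an unintended slight tilt produces no motion.
def lever_to_signal(tilt_deg, max_tilt_deg=25.0, dead_zone_deg=2.0):
    if abs(tilt_deg) < dead_zone_deg:
        return 0.0
    sign = 1.0 if tilt_deg > 0 else -1.0
    magnitude = min(abs(tilt_deg), max_tilt_deg) / max_tilt_deg
    return sign * magnitude

signal = lever_to_signal(12.5)   # half of the full forward tilt
```

The remote controller 30R would package such normalized commands into an operation signal and transmit it to the excavator 100 over the communication devices.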
- As illustrated in
FIG. 8 , the remote controller 30R includes, as functional blocks, an operator state identification part 61, an image combining part 62, and an operation signal generation part 63. Although the operator state identification part 61, the image combining part 62, and the operation signal generation part 63 are distinguished for convenience of explanation, they do not have to be physically distinguished, and may be composed entirely or partially of common software components or hardware components. - The operator state identification part 61 is configured to identify the state of an operator in the remote operation room RC. The operator state identification part 61 identifies, for example, the position of the operator's eye (operator's viewpoint) and the direction of the operator's gaze, based on the imaging information of the indoor imaging device C2. Specifically, the operator state identification part 61 performs appropriate image processing on the image captured by the indoor imaging device C2 to identify the position of the operator's viewpoint in the operation room coordinate system and the coordinates of the direction of the operator's gaze.
- The operator state identification part 61 may derive the position of the viewpoint and the direction of the gaze of the operator OP based on the output of devices other than the indoor imaging device C2, such as LiDAR installed in the remote operation room RC or an inertial measurement device attached to the head-mounted display serving as the display device RD. The inertial measurement device may include a positioning device. The operator state identification part 61 transmits information related to the position of the operator's viewpoint E1 and the direction of the gaze of the operator OP to the excavator 100 via the communication device T3.
- The image combining part 62 generates a composite image by combining the detection information (imaging information) of the space recognition device 70 received from the excavator 100, or the three-dimensional virtual space model 50, with another image. The space recognition device 70 of the excavator 100 converts the position of the operator's viewpoint and the direction of the gaze in the operation room coordinate system transmitted from the remote controller 30R into coordinates in the excavator coordinate system, captures the imaging information of each sensor accordingly, and transmits it to the remote controller 30R.
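- The coordinate conversion mentioned here is, in essence, a rigid transform. The following two-dimensional sketch (pose values and names are illustrative assumptions) maps a viewpoint from the operation room coordinate system into the excavator coordinate system:

```python
# Rigid 2-D transform: rotate a room-frame point by the excavator's yaw,
# then translate by the excavator's position in the site frame.
import math

def room_to_excavator(point_xy, excavator_yaw_rad, excavator_origin_xy):
    x, y = point_xy
    c, s = math.cos(excavator_yaw_rad), math.sin(excavator_yaw_rad)
    ox, oy = excavator_origin_xy
    return (c * x - s * y + ox, s * x + c * y + oy)

# Viewpoint 1 m ahead in the room frame; excavator yawed 90 degrees at (10, 5).
vx, vy = room_to_excavator((1.0, 0.0), math.pi / 2, (10.0, 5.0))
```

A real system would use a full three-dimensional pose (position plus orientation) for the conversion, but the structure of the computation is the same.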
- The other image may be a design surface image, which is an image generated based on design data. In the present embodiment, the image combining part 62 superimposes, on the environment information as a design surface image, a figure, such as computer graphics, representing the position of the design surface based on design data previously stored in the memory of the remote controller 30R. The image combining part 62 determines the position to superimpose the design surface image based on the position and orientation of the excavator 100 identified by the excavator state identification part 32 of the controller 30.
- The operation signal generation part 63 is configured to generate an operation signal to be transmitted to the excavator 100. The operation signal generation part 63 generates an operation signal based on the output of the operation sensor 29R of the remote operation room RC. Basically, the excavator 100 receives the operation signal generated when the operator of the remote operation room RC performs operation while viewing the image of the display device RD, and the excavator 100 performs a motion corresponding to the operation signal.
- Note that the remote operation of the excavator 100 is not limited to the operation by the operator in the remote operation room RC, and may be performed by an application of a portable terminal 400 as illustrated in
FIG. 7 , for example. In this case, most of the motions of the excavator 100 may be set to be automatically controlled in advance to simplify the operation by the portable terminal 400. - In the work support system SYS2 of the excavator 100 to which the remote operation room RC is applied, the work support trajectory can be generated in the controller 30 of the excavator 100 or in the remote controller 30R of the remote operation room RC. Therefore, the remote controller 30R may include the virtual space generation part 65, the determination part 66, the operation prediction part 67, the operation intervention part 68, and the motion simulator 69 having the same functions as the virtual space generation part 31, the determination part 36, the operation prediction part 37, the operation intervention part 38, and the motion simulator 39. Alternatively, the work support system SYS2 may generate trajectories for work support by the management apparatus 200.
- When the work support is performed by remote operation, the remote controller 30R generates a plurality of trajectories for each divided motion of the work, and selects the optimum trajectory by evaluating each trajectory, similar to the controller 30. Then, the remote controller 30R transmits information of the trajectories of each motion to the excavator 100, and the actuator driving part 34 of the controller 30 controls the motion of the excavator 100 along the trajectories of each motion.
- The work support system SYS2 according to the other embodiment is basically constructed as described above, and its operation will be described below with reference to
FIG. 9. FIG. 9 is an explanatory diagram illustrating the control of the work of the excavator 100 by the work support system SYS2 according to the other embodiment. An example of automatic control (machine control function) of the excavator 100 by the remote controller 30R will be described below. The work support system SYS2 is not limited thereto, and may operate as a machine guidance function to guide the operator along the trajectory generated (simulated) by the remote controller 30R. - The remote controller 30R automatically controls the work of the excavator 100 when the operator operates the operation device 21R while viewing the imaging information captured by the space recognition device 70 of the excavator 100. The virtual space generation part 65 generates the three-dimensional virtual space model 50 based on the information of the environment detection device 79 (detection information from the space recognition device 70, the orientation detection device 71, the positioning device 72, and various sensors, or information from the operation device 21, and communication information from the external space recognition device 300 and the dump truck) (step S11).
- Thereafter, the motion simulator 69 simulates the work of the virtual excavator 51 in the three-dimensional virtual space model 50 (step S12). Thus, the remote controller 30R can obtain the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory (simulation motion information) that are continuous on the time axis of the operation.
- The remote controller 30R transmits, to the controller 30, the operation instructions of the excavation trajectory, the lifting and turning trajectory, the earth and sand discharging trajectory, and the lowering and turning trajectory generated by the motion simulator 69 (step S13). Then, the actuator driving part 34 of the controller 30 controls the various solenoid valves 41 and the various actuators 42 according to these trajectories in the real space excavator 100 (step S14). The actuator driving part 34 may correct the motion (position, speed, acceleration, etc.) of the excavation attachment AT by feeding back, to the actuator driving part 34, the detection information of the various sensors of the excavator 100 in each motion of the work.
- When the controller 30 acquires the detection information of the environment detection device 79 and the communication information from the dump truck in the actual work, the controller 30 transmits the acquired information to the remote controller 30R. The virtual space generation part 65 of the remote controller 30R updates the topographic information, object information, and additional information of the three-dimensional virtual space model 50 based on this information (step S15). Thus, the motion simulator 69 corrects various trajectories (excavation trajectory, lifting and turning trajectory, earth and sand discharging trajectory, lowering and turning trajectory) of the work based on the updated information (step S16). The corrected trajectories are transmitted from the remote controller 30R to the controller 30, and are reflected in the control by the actuator driving part 34 of the controller 30 (step S17).
- As described above, even with the remote controller 30R, the work support system SYS2 can appropriately move the excavator 100 according to the situation that changes in real time in the real space work site by repeating the above-described motions during the work. As a result, the work support system SYS2 can efficiently and accurately perform the work by the excavator 100.
- The work support system SYS2 according to the other embodiment is not limited to the above-described configuration, and various modifications can be adopted. The operation of the work support system SYS2 according to a modified example will be described below with reference to
FIG. 10. FIG. 10 is an explanatory diagram illustrating the control of the work of the excavator 100 by the work support system SYS2 according to the modified example. In the work support system SYS2 according to the modified example, the operator operates the operation device 21R while viewing the three-dimensional virtual space model 50 displayed on the display device RD in the remote operation room RC, and the operation signal of the operation sensor 29R is transmitted from the remote controller 30R to the controller 30. - In this case, the virtual space generation part 65 of the remote controller 30R reproduces the environment information (topographic information and object information) in the three-dimensional virtual space model 50 based on the detection information of the environment detection device 79 transmitted from the controller 30. Based on the topographic information and object information of the three-dimensional virtual space model 50, the operator can perform the excavation motion, the lifting and turning motion, the earth and sand discharging motion, and the lowering and turning motion of the work. At this time, the motion simulator 69 may generate various trajectories of the work and guide the operator along the trajectories. Further, the operation intervention part 68 may intervene in the operation and correct the operation signal of the operation sensor 29R when the operation by the operator deviates greatly from the generated trajectory.
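- One hypothetical way to sketch such an intervention (the threshold and blend factor are assumed values, not from the disclosure) is to blend the operator command toward the trajectory-following command once the deviation grows too large:

```python
# Pass the operator's command through unchanged while the attachment stays
# near the simulated trajectory; beyond a deviation threshold, blend the
# command toward the trajectory-following command.
def intervene(operator_cmd, trajectory_cmd, deviation, threshold=0.3, blend=0.7):
    if deviation <= threshold:
        return operator_cmd
    return (1.0 - blend) * operator_cmd + blend * trajectory_cmd

cmd_ok = intervene(operator_cmd=0.8, trajectory_cmd=0.2, deviation=0.1)   # untouched
cmd_fix = intervene(operator_cmd=0.8, trajectory_cmd=0.2, deviation=0.5)  # corrected
```

Blending, rather than overriding outright, keeps the operator in the loop while still pulling the machine back toward the generated trajectory.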
- Thus, the controller 30 can control the operation of the excavator 100 in the real space based on the operation signal received from the remote controller 30R. At this time, the controller 30 performs feedback control of each motion based on detection information detected by various sensors of the excavator 100.
- Further, the controller 30 transmits, to the remote controller 30R, detection information of the environment detection device 79 and communication information of the dump truck at the time of the work. Thus, the virtual space generation part 65 of the remote controller 30R updates the environment information of the three-dimensional virtual space model 50 in the work, and can be used to generate the trajectory of the motion simulator 69 and for the operation by the operator.
- The work support systems SYS1 and SYS2 of the excavator 100 according to the embodiments disclosed herein are exemplary in all respects and are not restrictive. The embodiments can be modified and improved in various forms without departing from the scope and gist of the appended claims. The matters described in the above plurality of embodiments can have other configurations without contradiction, and can be combined without contradiction.
- According to one aspect, the work efficiency of the excavator can be further improved by updating environment information according to the motion of the excavator and setting operation contents based on the environment information.
Claims (19)
1. A work support system for an excavator, comprising:
the excavator;
an environment detection device configured to detect environment information of a work site of the excavator; and
a simulation device configured to acquire the environment information detected by the environment detection device during a motion of the excavator, and to generate a three-dimensional virtual space model of the work site.
2. The work support system for the excavator according to claim 1 , wherein the environment detection device acquires the environment information at a predetermined timing during the motion of the excavator.
3. The work support system for the excavator according to claim 1 , wherein the environment information includes motion information acquired during the motion of the excavator.
4. The work support system for the excavator according to claim 3 , wherein the motion information includes excavation motion information acquired in association with an excavation motion of the excavator.
5. The work support system for the excavator according to claim 4 , wherein the environment information includes earth and sand information estimated from the excavation motion information.
6. The work support system for the excavator according to claim 1 , wherein the simulation device provides the excavator with simulation motion information simulated by a virtual excavator in the three-dimensional virtual space model reproduced by the simulation device.
7. The work support system for the excavator according to claim 6 , wherein the excavator is controlled at the work site based on the simulation motion information.
8. The work support system for the excavator according to claim 6 , wherein the simulation device generates a plurality of trajectories of the virtual excavator with different conditions with respect to an excavation motion or an earth and sand discharging motion, and provides, as the simulation motion information, an optimum trajectory among the plurality of trajectories based on a result of simulating the plurality of trajectories.
9. The work support system for the excavator according to claim 6 , wherein the simulation device generates the simulation motion information for avoiding a person, upon recognizing that the person is present around the excavator based on the environment information.
10. The work support system for the excavator according to claim 1 , further comprising:
a remote operation room arranged at a location away from the excavator, wherein
the simulation device simulates a motion of a virtual excavator in the three-dimensional virtual space model reproduced by the simulation device, based on an operation instruction of an operation device of the remote operation room.
11. The work support system for the excavator according to claim 1 , wherein the simulation device is provided in a controller configured to control the excavator or in a management apparatus capable of communicating with the excavator, the management apparatus being outside the excavator.
12. The work support system for the excavator according to claim 1 , wherein
the environment information is stored in the simulation device before the motion of the excavator is performed, and
the simulation device generates the three-dimensional virtual space model based on the stored environment information, before the motion of the excavator is performed.
13. The work support system for the excavator according to claim 1 , wherein the simulation device extracts, as object information, an object included in the environment information and a position of the object, and arranges the extracted object information in the three-dimensional virtual space model.
14. The work support system for the excavator according to claim 13 , wherein the simulation device adds additional information corresponding to the object information, to the object information arranged in the three-dimensional virtual space model.
15. The work support system for the excavator according to claim 14 , wherein the object information includes at least one of information of a shape or a position of the excavator, information of a shape or a position of an excavation object to be excavated by the excavator, or information of a shape or a position of a discharge object to which the excavation object is to be discharged.
16. The work support system for the excavator according to claim 15 , wherein the additional information corresponding to the information of the shape or the position of the excavator includes at least one of a posture, an identification number, a type, a working time, or an attachment state of the excavator.
17. The work support system for the excavator according to claim 15 , wherein the additional information corresponding to the information of the shape or the position of the excavation object includes at least one of a sediment accumulation amount, a weight, a density, a hardness, or soil quality of the excavation object.
18. The work support system for the excavator according to claim 15 , wherein the additional information corresponding to the information of the shape or the position of the discharge object includes at least one of a posture of the discharge object; an identification number; a type; a size of a loading platform; a state of the loading platform; a working time; or a weight, a volume, or a density of the discharge object.
19. The work support system for the excavator according to claim 1 , wherein
the environment detection device includes a space recognition device configured to recognize topographic information of the work site and object information of an object at the work site, and
the space recognition device is arranged in the excavator or at a location away from the excavator.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022203642 | 2022-12-20 | | |
| JP2022-203642 | 2022-12-20 | | |
| PCT/JP2023/045244 WO2024135603A1 (en) | 2022-12-20 | 2023-12-18 | Construction assisting system for shovel |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/045244 (WO2024135603A1, continuation) | Construction assisting system for shovel | 2022-12-20 | 2023-12-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20250305251A1 (en) | 2025-10-02 |
Family
ID: 91588682
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US19/235,080 (US20250305251A1, pending) | Work support system for excavator | 2022-12-20 | 2025-06-11 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20250305251A1 (en) |
| JP (1) | JPWO2024135603A1 (en) |
| CN (1) | CN120359338A (en) |
| DE (1) | DE112023005294T5 (en) |
| WO (1) | WO2024135603A1 (en) |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003278159A (en) * | 2002-03-27 | 2003-10-02 | Port & Airport Research Institute | Remote-operated construction method and system |
| DE102016000353A1 (en) * | 2016-01-14 | 2017-07-20 | Liebherr-Components Biberach Gmbh | Crane, construction machine or industrial truck simulator |
| JP7310595B2 (en) * | 2019-12-24 | 2023-07-19 | コベルコ建機株式会社 | Work support server and work support system |
| JP7484401B2 (en) * | 2020-05-13 | 2024-05-16 | コベルコ建機株式会社 | Remote operation support system for work machines |
| JP7721305B2 (en) * | 2021-03-31 | 2025-08-12 | 住友重機械工業株式会社 | Construction machinery management system |
- 2023
  - 2023-12-18 JP JP2024566037A patent/JPWO2024135603A1/ja active Pending
  - 2023-12-18 DE DE112023005294.8T patent/DE112023005294T5/en active Pending
  - 2023-12-18 CN CN202380085865.XA patent/CN120359338A/en active Pending
  - 2023-12-18 WO PCT/JP2023/045244 patent/WO2024135603A1/en not_active Ceased
- 2025
  - 2025-06-11 US US19/235,080 patent/US20250305251A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2024135603A1 (en) | 2024-06-27 |
| WO2024135603A1 (en) | 2024-06-27 |
| CN120359338A (en) | 2025-07-22 |
| DE112023005294T5 (en) | 2025-11-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11959253B2 (en) | | Excavator and information processing apparatus |
| US12024851B2 (en) | | Loading machine control device and control method |
| JP7166108B2 (en) | | Image processing system, display device, image processing method, trained model generation method, and training data set |
| US20230071015A1 (en) | | Construction assist system for shovel |
| JP7597022B2 (en) | | Excavator |
| JP7372029B2 (en) | | Display control device, display control system, and display control method |
| KR20210075157A (en) | | A system comprising a working machine, a method executed by a computer, a method of manufacturing a trained localization model, and data for training |
| JP2016079677A (en) | | Area limited excavation control device and construction machine |
| JP2024052764A (en) | | Display control device and display method |
| US20150004566A1 (en) | | Camera Based Scene Recreator for Operator Coaching |
| US20210395980A1 (en) | | System and method for work machine |
| US20250305251A1 (en) | | Work support system for excavator |
| US20240191458A1 (en) | | Control device and control method for loading machine |
| JP2023150920A (en) | | Support equipment, working machines, programs |
| JP2023093109A (en) | | Construction machinery and information processing equipment |
| CN117999391A (en) | | Track generation system |
| US20250003197A1 (en) | | Supporting device, work machine, and program |
| US20250215668A1 (en) | | Work machine, remote operation assisting device, and assisting system |
| US20250283308A1 (en) | | Working machine, information processing device, and program |
| EP4621143A1 (en) | | Work machine and method for object detection including identifying and ignoring a moveable work implement |
| US20250019937A1 (en) | | Assistance device, work machine, and recording medium |
| JP2023183992A (en) | | Support systems, remote operation support devices, working machines, programs |
| JP2024076741A (en) | | Work machine, information processing device, and program |
| US20220002977A1 (en) | | System and method for work machine |
| WO2025115773A1 (en) | | Work machine, information processing device, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |