
WO2020179382A1 - Monitoring device and monitoring method - Google Patents

Monitoring device and monitoring method

Info

Publication number
WO2020179382A1
WO2020179382A1 (PCT/JP2020/005270)
Authority
WO
WIPO (PCT)
Prior art keywords
monitoring
marker
monitoring device
reflective material
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/005270
Other languages
English (en)
Japanese (ja)
Inventor
俊之 村松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konica Minolta Inc
Original Assignee
Konica Minolta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Inc filed Critical Konica Minolta Inc
Publication of WO2020179382A1 publication Critical patent/WO2020179382A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/42 - Simultaneous measurement of distance and other co-ordinates

Definitions

  • The present disclosure relates to a monitoring device and a monitoring method.
  • A monitoring device that detects the position of an object in an imaging area based on an image captured by a camera or the like is known.
  • This type of monitoring device is expected to be applied to accurately grasping the position of a person or a work machine (hereinafter collectively referred to as a "moving body") in a work environment such as a construction site or a factory, and to applications such as monitoring, analysis, and prediction of the behavior or movement of the moving body (hereinafter collectively referred to as "behavior analysis").
  • FIG. 1 is a diagram showing an example of a range image generated by a laser radar.
  • A laser radar determines the distance from its own position to an object by emitting a laser beam and measuring the time of flight (TOF), that is, the time until the beam is reflected by the object and returns. The laser radar then generates image data of a range image by repeating this measurement while scanning a predetermined range covering the monitored area. Since such a range image contains information on the three-dimensional position of a moving body, it is useful for recognizing the posture and movement of the moving body.
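  • For illustration, the time-of-flight distance relationship described above can be sketched as follows (a minimal example, not taken from the publication; names and values are hypothetical):

```python
# Illustrative sketch of the TOF distance relationship described above.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_to_distance(tof_seconds: float) -> float:
    """Convert a measured round-trip time of flight into a one-way distance in metres."""
    # The laser travels to the object and back, so the one-way distance is half the product.
    return SPEED_OF_LIGHT * tof_seconds / 2.0

# Example: a 200 ns round trip corresponds to roughly 30 m.
print(tof_to_distance(200e-9))  # ~29.98
```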
  • In this type of monitoring device, in order to analyze the behavior of a moving body, it is necessary to grasp the position and posture of the moving body as a whole. Therefore, when detecting the position of the moving body from the distance image, it is necessary to grasp the position or existence area of the moving body including portions that are not visible in the distance image.
  • However, when part of the object is shielded by another object, or when part of the object is made of a material that does not diffusely reflect the laser light, that part may not appear in the distance image, and the position and orientation of the entire object may not be accurately grasped.
  • Patent Document 1 discloses a technique in which, when the point cloud generated from the reflected light received by a laser radar has a defect, correction processing (estimation) is performed from the surrounding point cloud so as to accurately grasp the space occupied by the object, including portions that cannot be captured by the laser radar.
  • However, since the conventional technique of Patent Document 1 merely compensates ranging points at positions estimated from nearby points, it may not be able to grasp the accurate position of the entire object.
  • In particular, when there is a shield between the object to be recognized and the laser radar, the related art cannot estimate the area of the object hidden by the shield.
  • The present disclosure has been made in view of the above problems, and an object thereof is to provide a monitoring device and a monitoring method that can improve the accuracy of moving-object monitoring using a laser radar.
  • One aspect of the present disclosure is a monitoring device for monitoring the state of a monitoring target having a reflective material, the device including: an image acquisition unit that acquires image data of a range image generated by a laser radar monitoring a predetermined area; and an analysis unit that detects the pattern of the reflective material of the monitoring target shown in the range image and specifies the position and orientation of the monitoring target in the predetermined area based on the detected pattern of the reflective material and a reference pattern of the reflective material stored in advance in a database.
  • Another aspect of the present disclosure is a monitoring method for monitoring the state of a monitoring target having a reflective material, the method including: acquiring image data of a range image generated by a laser radar monitoring a predetermined area; and detecting the pattern of the reflective material of the monitoring target shown in the range image and specifying the position and orientation of the monitoring target in the predetermined area based on the detected pattern of the reflective material and a reference pattern of the reflective material stored in advance.
  • According to the monitoring device of the present disclosure, it is possible to improve the accuracy of moving-object monitoring using a laser radar.
  • A diagram showing an example of a range image generated by a laser radar; a diagram showing an example of the monitoring system according to the first embodiment.
  • A diagram showing an example of the configuration of the marker database; a diagram showing an example of the object information of the monitored object stored in association with the marker.
  • A diagram showing an example of analysis data generated by the analysis unit; a flowchart showing the operation of the monitoring device.
  • FIG. 2 is a diagram showing an example of the monitoring system U according to the present embodiment.
  • In the present embodiment, the monitoring system U is applied to monitoring the position and posture of a monitored object MT existing in a factory.
  • Although a vehicle is shown as an example of the monitoring target MT in FIG. 2, the monitoring target MT may be an arbitrary moving body such as a work machine, a robot, or a person.
  • The monitoring system U includes a monitoring device 100, a laser radar 200, and a speaker 300.
  • The laser radar 200 determines the distance from its own position to an object by emitting laser light and measuring the time of flight (TOF) until the light is reflected by the object and returns.
  • The laser radar 200 performs this processing while scanning a predetermined range covering the monitored area, thereby generating image data of a distance image (hereinafter simply referred to as a "distance image").
  • The laser radar 200 continuously generates distance images in units of frames, for example, and outputs image data arranged in time series (that is, moving-image data) to the monitoring device 100.
  • A distance image is an image in which each scanning position corresponds to a pixel, and the measurement data of the laser radar 200 (for example, distance and reflection intensity) is associated with each pixel as its pixel value (such data is also referred to as point cloud data).
  • The distance image represents, for example in a three-dimensional Cartesian coordinate system (X, Y, Z), the position of an object in the monitored area (that is, its position in the horizontal, vertical, and depth directions).
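  • As an illustration of how such a point-cloud pixel could be expressed, the sketch below converts one scan sample (azimuth, elevation, distance) into Cartesian coordinates; the axis convention and names are assumptions for illustration, not taken from the publication:

```python
import math

def scan_sample_to_xyz(azimuth_rad: float, elevation_rad: float, distance_m: float):
    """Convert one laser-radar scan sample into (X, Y, Z) with the sensor at the origin.

    Assumed convention: X is horizontal, Y is depth, Z is vertical.
    """
    x = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

# One pixel of the distance image can then carry both the measured position and
# the reflection intensity at that scanning position.
pixel = {"xyz": scan_sample_to_xyz(math.radians(10), math.radians(-2), 12.5),
         "intensity": 0.83}
print(pixel)
```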
  • The laser radar 200 is installed at an appropriate position near the monitored area so as to image the predetermined monitored area.
  • When there is a dangerous situation in which the monitored object MT may come into contact with another object (for example, a person), the speaker 300 notifies nearby people of the situation.
  • The speaker 300 is controlled by the monitoring device 100.
  • The monitoring device 100 monitors the position and orientation of the monitoring target MT by imaging the marker MK of the monitoring target MT with the laser radar 200. By monitoring the state of the monitoring target MT using the marker MK, the monitoring device 100 can recognize the existence area of the entire monitoring target MT even if, as shown in FIG. 2, a part of the monitoring target MT is shielded by the shield NB (details will be described later).
  • The marker MK is a highly reflective material arranged on an exposed surface of the monitored object MT, and generates strong reflected light in response to the laser light emitted from the laser radar 200.
  • The marker MK is made of, for example, reflective tape.
  • FIG. 2 shows a mode in which the marker MK is arranged on the exposed surface of the top plate at the front of the monitored object MT (here, a vehicle).
  • The marker MK shown in FIG. 2 is composed of three highly reflective pieces arranged apart from each other, which together form a triangular shape in plan view.
  • The marker MK is arranged on the exposed surface of the monitored object MT so as to form a predetermined shape when viewed from the outside, and serves as an index indicating the position and orientation of the monitored object MT. For example, from how the marker MK appears, it is possible to identify how far the monitored object MT is from the laser radar 200 and whether the monitored object MT is facing forward or backward as viewed from the laser radar 200.
  • The marker MK has a different shape for each monitored object MT (for each object type or for each individual), and the shape of the marker MK constitutes identification information for distinguishing one monitored object MT from another.
  • The shape of the marker MK of each monitored object MT is registered in the marker database Dm together with the object information of the monitored object MT (described later with reference to FIGS. 5 and 6).
  • FIG. 3 is a diagram showing a hardware configuration of the monitoring device 100 according to the present embodiment.
  • The monitoring device 100 is a computer including, as main components, a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, an external storage device (for example, a flash memory) 104, and a communication interface 105.
  • Each function of the monitoring device 100 described later is realized, for example, by the CPU 101 referring to a control program (for example, an image processing program) and various data stored in the ROM 102, the RAM 103, the external storage device 104, or the like.
  • Part or all of each function may be realized by processing by a DSP (Digital Signal Processor) instead of, or together with, processing by the CPU.
  • Likewise, part or all of each function may be realized by a dedicated hardware circuit (for example, an ASIC or FPGA) instead of, or together with, processing by software.
  • FIG. 4 is a diagram showing functional blocks of the monitoring device 100 according to this embodiment.
  • The arrows in FIG. 4 represent the flow of data.
  • The monitoring device 100 includes an image acquisition unit 10, a filter processing unit 20, an analysis unit 30, and a notification unit 40.
  • The image acquisition unit 10 acquires the range image generated by the laser radar 200.
  • The image acquisition unit 10 sequentially acquires distance images arranged in time series from the laser radar 200.
  • The filter processing unit 20 extracts from the range image only the pixel areas in which the reflection intensity of the light received by the laser radar 200 is equal to or greater than a threshold value, and sends the extracted range image to the analysis unit 30. As a result, a distance image from which reflected light from objects other than the marker MK has been eliminated is sent to the analysis unit 30, which makes the pattern detection processing of the marker MK in the analysis unit 30 easier.
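  • A minimal sketch of this intensity filtering, assuming the distance image is held as NumPy arrays (the array names and the threshold value are illustrative assumptions):

```python
import numpy as np

def filter_by_intensity(distance: np.ndarray, intensity: np.ndarray,
                        threshold: float = 0.5) -> np.ndarray:
    """Keep only pixels whose reflection intensity is at or above the threshold.

    Pixels below the threshold are zeroed out, leaving mainly the strong returns
    from the highly reflective marker MK.
    """
    mask = intensity >= threshold
    return np.where(mask, distance, 0.0)

# Example with a tiny 3x3 distance image.
distance = np.array([[5.0, 5.1, 9.0], [5.2, 0.0, 8.8], [7.5, 7.6, 7.7]])
intensity = np.array([[0.9, 0.8, 0.1], [0.7, 0.0, 0.2], [0.1, 0.1, 0.1]])
print(filter_by_intensity(distance, intensity))
```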
  • The analysis unit 30 detects the pattern of the marker MK shown in the distance image, and identifies the identification information, position, and posture of the monitoring target MT in the monitored area based on the detected pattern of the marker MK.
  • Specifically, the analysis unit 30 identifies the identification information, position, and posture of the monitored object MT by comparing the reference pattern of the marker MK stored in the marker database Dm with the pattern of the marker MK shown in the distance image.
  • FIG. 5 is a diagram showing an example of the configuration of the marker database Dm.
  • FIG. 6 is a diagram showing an example of the object information of the monitored object MT stored in association with the marker MK.
  • FIG. 7 is a diagram showing an example of the analysis data DL generated by the analysis unit 30.
  • The marker database Dm stores, for example, the “reference pattern of the marker MK” and the “object information of the monitored object MT” in association with each monitored object MT.
  • The “reference pattern of the marker MK” and the “object information of the monitored object MT” may be registered for each object type (for example, truck, person, or work machine) or for each individual (for example, truck No. 1, truck No. 2, or truck No. 3).
  • Alternatively, each of a plurality of parts of one object may be set as a monitoring target MT, and a monitoring target MT may be registered for each part of the object (for example, the arm portion or the leg portion of a given object).
  • The “reference pattern of the marker MK” is information related to the shape of the marker MK provided on the monitored object MT.
  • The “reference pattern of the marker MK” constitutes the identification information of the monitoring target MT and also serves as an index for specifying the position and orientation of the monitoring target MT in the monitored area.
  • The “reference pattern of the marker MK” also includes information on the size of the marker MK (for example, the length of each side) so that the size ratio with respect to the marker MK shown in the distance image can be determined.
  • In the “reference pattern of the marker MK”, not only the appearance of the marker MK when viewed from one reference position but also its appearance when viewed from various reference positions (for example, various directions) may be registered.
  • The “object information of the monitoring target MT” includes information such as the size of the monitoring target MT (for example, its length, width, and height), the area over which each part of the monitoring target MT extends relative to the position of the marker MK, and the facing direction of the monitoring target MT (for example, forward) relative to the facing direction of the marker MK.
  • When the monitored object MT is basically an object that does not move (for example, production equipment in a factory), it is desirable to also register information about the movable range of the monitored object MT in the “object information of the monitored object MT”.
  • This makes it possible to exclude from the search any monitoring target MT that cannot exist at the position where the marker MK is detected, thereby reducing recognition errors.
  • In the example of FIG. 5, a triangular shape composed of three highly reflective pieces is registered together with the length of each side of the triangle (A cm, B cm, B cm).
  • In the marker database Dm, information on the lengths over which the monitoring target MT extends in its front-rear direction, width direction, and height direction, taking the position of the marker MK as a reference, is registered in association with the reference pattern of the marker MK (see FIG. 6).
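  • One possible in-memory representation of such a database entry is sketched below; the field names and values are assumptions chosen to mirror FIGS. 5 and 6, not the actual data layout of the publication:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MarkerDatabaseEntry:
    """Illustrative entry of the marker database Dm (all field names are hypothetical)."""
    object_id: str                        # identification information of the monitoring target MT
    reference_pattern: dict               # e.g. {"shape": "triangle", "side_lengths_cm": (A, B, B)}
    object_size_m: tuple                  # (length, width, height) of the monitored object
    extent_from_marker_m: dict            # how far each part extends from the marker position
    movable_range_m: Optional[float] = None  # movable range, useful for mostly static equipment

marker_db = {
    "truck_no1": MarkerDatabaseEntry(
        object_id="truck_no1",
        reference_pattern={"shape": "triangle", "side_lengths_cm": (30.0, 50.0, 50.0)},
        object_size_m=(8.0, 2.5, 3.0),
        extent_from_marker_m={"front": 1.0, "rear": 7.0, "left": 1.25, "right": 1.25},
    ),
}
```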
  • The analysis unit 30 compares the reference patterns of the markers MK stored in the marker database Dm with the pattern of the marker MK shown in the distance image, for example by template matching. Specifically, the analysis unit 30 detects the pattern of the marker MK shown in the distance image (for example, the shape of the marker MK and the length of each side of that shape), and extracts from the marker database Dm a reference pattern whose relative positional relationship is similar to that of the detected pattern. Further, from the difference in facing direction between the pattern of the marker MK shown in the distance image and the reference pattern stored in the marker database Dm, the analysis unit 30 determines the orientation of the marker MK as viewed from the laser radar 200.
  • The analysis unit 30 identifies, from the list of markers MK stored in the marker database Dm, the marker MK whose reference pattern corresponds to the pattern of the marker MK shown in the distance image (that is, it specifies the identification information of the monitoring target MT).
  • Then, using the identification information of the marker MK, the identification results for its position and posture, and the object information of the monitoring target MT associated with that marker MK in the marker database Dm, the analysis unit 30 specifies the identification information, position, and posture of the monitoring target MT on which the marker MK is arranged. That is, the analysis unit 30 identifies the position and orientation of the monitoring target MT in the monitored area based on the appearance of the marker MK when imaged by the laser radar 200. At this time, the analysis unit 30 may also specify the existence area of the monitored object MT in the monitored area based on the object information of the monitored object MT.
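  • A simplified sketch of this comparison, assuming the detected marker pattern has been reduced to its side lengths and a facing direction (the scoring and the orientation formula are illustrative simplifications, not the publication's algorithm):

```python
def match_marker(detected_sides_cm, marker_db):
    """Return the ID of the database entry whose reference side lengths best match."""
    best_id, best_err = None, float("inf")
    for obj_id, ref_sides in marker_db.items():
        if len(ref_sides) != len(detected_sides_cm):
            continue
        err = sum(abs(r - d) for r, d in zip(sorted(ref_sides), sorted(detected_sides_cm)))
        if err < best_err:
            best_id, best_err = obj_id, err
    return best_id

def estimate_yaw_deg(detected_direction_deg: float, reference_direction_deg: float) -> float:
    """Orientation of the marker as seen from the radar, relative to its reference pose."""
    return (detected_direction_deg - reference_direction_deg) % 360.0

# Illustrative database: object ID -> reference side lengths of its marker (cm).
marker_db = {"truck_no1": (30.0, 50.0, 50.0), "robot_arm": (20.0, 20.0, 35.0)}
print(match_marker([29.0, 51.0, 50.0], marker_db))  # -> truck_no1
print(estimate_yaw_deg(95.0, 0.0))                  # -> 95.0
```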
  • The method by which the analysis unit 30 analyzes the pattern of the marker MK shown in the distance image may be any known method.
  • The analysis unit 30 specifies the identification information, position, and posture of the monitored object MT for each frame of the distance images arranged in time series, and outputs analysis data DL that tracks the position of the monitoring target MT.
  • The analysis data DL shown in FIG. 7 includes, for each frame, the position of the monitoring target MT (here, the three-dimensional coordinate position indicating the position of its center of gravity) and its posture (here, a vector indicating the facing direction of the monitoring target MT).
  • When a plurality of monitoring targets MT are present, the analysis unit 30 specifies the identification information, position, and posture of each of the monitoring targets MT.
  • The position of each of the plurality of monitoring targets MT is thus specified and tracked.
  • The analysis unit 30 may also perform tracking processing for objects to which no marker MK is attached.
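  • A minimal per-frame tracking loop in this spirit might look as follows (the helper functions passed in stand for the detection and matching steps sketched above and are assumptions):

```python
def track_targets(frames, detect_markers, identify_target):
    """Build analysis data DL: per frame, the position and posture of each identified target.

    `frames` is an iterable of distance images; `detect_markers` and `identify_target`
    are placeholders for the filtering/matching steps described earlier.
    """
    analysis_data = []
    for frame_index, frame in enumerate(frames):
        records = []
        for marker in detect_markers(frame):              # several markers may appear per frame
            target_id, position, posture = identify_target(marker)
            records.append({"id": target_id, "position": position, "posture": posture})
        analysis_data.append({"frame": frame_index, "targets": records})
    return analysis_data
```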
  • Based on the analysis data DL generated by the analysis unit 30, the notification unit 40 detects the occurrence of a state in which the monitored object MT and another object are close to each other.
  • When such a state is detected, the notification unit 40 uses the speaker 300 to issue a notification to that effect.
  • The other object detected by the notification unit 40 may be another monitored object MT or an object that is not a monitored object MT.
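  • The proximity check itself could be as simple as a distance threshold on the tracked centre-of-gravity positions; the sketch below is one assumed way to implement it, with the actual speaker output left abstract:

```python
import math

def check_proximity(targets, min_distance_m: float = 2.0):
    """Return pairs of tracked objects whose centres of gravity are closer than the threshold."""
    alerts = []
    for i in range(len(targets)):
        for j in range(i + 1, len(targets)):
            a, b = targets[i], targets[j]
            d = math.dist(a["position"], b["position"])
            if d < min_distance_m:
                alerts.append((a["id"], b["id"], d))
    return alerts

targets = [{"id": "truck_no1", "position": (2.0, 5.0, 0.0)},
           {"id": "person_a", "position": (3.0, 5.5, 0.0)}]
for id_a, id_b, d in check_proximity(targets):
    print(f"Warning: {id_a} and {id_b} are only {d:.1f} m apart")  # would drive the speaker 300
```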
  • FIG. 8 is a flowchart showing the operation of the monitoring device 100 according to this embodiment.
  • The flowchart illustrated in FIG. 8 is executed by the monitoring device 100 in accordance with a computer program, for example.
  • In step S1, the monitoring device 100 acquires image data of a range image from the laser radar 200.
  • In step S2, the monitoring device 100 extracts from the distance image the pixel areas in which the reflection intensity of the reflected light is equal to or greater than a threshold value.
  • In step S3, the monitoring device 100 compares the pattern of the extracted pixel areas (that is, the pattern of the marker MK shown in the distance image) with the reference patterns of all the markers MK registered in the marker database Dm, and determines whether or not there is a match in the list of reference patterns registered in the marker database Dm.
  • If a matching reference pattern is found (S3: YES), the process proceeds to step S4; if no match is found (S3: NO), the process proceeds to step S5.
  • When identifying the monitoring target MT in step S3, it is desirable for the monitoring device 100 to exclude from the search any monitoring target MT that cannot exist at the position where the marker MK is detected, based on the position of the marker MK detected in the distance image and the object information on the size and movable area of the monitoring target MT stored in advance in the marker database Dm.
  • In step S4, the monitoring device 100 calculates the size ratio and orientation of the pattern of the marker MK in the distance image relative to the reference pattern of the marker MK. Then, referring to this size ratio and orientation together with the object information of the monitoring target MT associated with that marker MK in the marker database Dm, the monitoring device 100 estimates the identification information, position, and orientation of the monitored object MT.
  • In step S5, the monitoring device 100 checks whether there is further image data to be processed. If other image data to be processed exists (S5: YES), the process returns to step S1 and the same processing is repeated; if not (S5: NO), the series of processing in this flowchart ends.
  • Through the above processing, the monitoring device 100 specifies the identification information, position, and posture of the monitored object MT in the monitored area for each of the plurality of distance images arranged in time series.
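  • Putting steps S1 to S5 together, the overall loop could be sketched as follows (all helper functions are hypothetical stand-ins for the processing described above):

```python
def monitoring_loop(acquire_frame, extract_high_intensity, find_matching_marker, estimate_pose):
    """Illustrative main loop corresponding to steps S1 to S5 of FIG. 8."""
    results = []
    while True:
        frame = acquire_frame()                    # S1: get a distance image from the laser radar
        if frame is None:                          # S5: no further image data to process
            break
        candidate = extract_high_intensity(frame)  # S2: keep pixels above the intensity threshold
        match = find_matching_marker(candidate)    # S3: compare with reference patterns in Dm
        if match is not None:                      # S3: YES
            results.append(estimate_pose(match))   # S4: estimate ID, position and orientation
    return results
```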
  • As described above, the monitoring device 100 according to the present embodiment includes the image acquisition unit 10, which acquires the image data of the range image generated by the laser radar 200, and the analysis unit 30, which detects the pattern of the marker MK made of a highly reflective material arranged on the exposed surface of the monitoring target MT shown in the range image, and which identifies the identification information, position, and posture of the monitored object MT based on the detected pattern of the marker MK and the previously stored reference pattern of the marker MK.
  • With the monitoring device 100, it is therefore possible to accurately identify the monitoring target MT, and the position and orientation of the monitored object MT can be accurately estimated even when the monitoring target MT is made of a material that does not diffusely reflect or when the reflected light from a part of the monitored object MT cannot be captured.
  • Moreover, since the monitoring device 100 according to the present embodiment can capture the pattern of the marker MK three-dimensionally, it can estimate the position and orientation of the monitored object MT with high accuracy, and accordingly can estimate three-dimensionally the area in which the monitoring target MT exists within the monitored area. This also enables behavior analysis of the movement of the monitored object MT (for example, whether a vehicle is moving forward or backward), prediction of its next movement (for example, prediction of the moving direction of an industrial machine), and more complex behavior analysis such as extracting only the movement path of a specific part of the monitored object MT.
  • FIGS. 9A, 9B, and 10 are diagrams for explaining the usefulness of the monitoring device 100 according to the present embodiment.
  • Object detection using the laser radar 200 is effective only in the region hit by the laser light transmitted from the laser radar 200. Therefore, as shown in FIG. 9A, when a part of the monitored object MT (here, a rotating arm) falls into a shadow as seen from the laser radar 200 as a result of its movement (for example, the region surrounded by the dotted line in FIG. 9A), it is difficult for a monitoring device according to the prior art to grasp the entire position and existence area of the monitored object MT.
  • In the monitoring device 100 according to the present embodiment, by contrast, the object information of the monitoring target MT (here, the rotating arm) is stored in advance in the marker database Dm in association with the reference pattern of the marker MK arranged on the monitoring target MT, and the entire position and existence area of the monitoring target MT are specified from the pattern of the marker MK shown in the distance image. Therefore, as shown in FIG. 9B, even if a part of the monitored object MT is hidden from the laser radar 200, the monitoring device 100 of the present embodiment can specify the entire position and existence area of the monitored object MT.
  • With the monitoring device 100 according to the present embodiment, as shown in FIG. 10, for example, it is also possible to specify the position and existence area on the far side of the monitored object MT as viewed from the laser radar 200. Further, simply by tracking the pattern of the marker MK arranged on the exposed surface of the monitoring target MT, it is possible to track not only the entire monitoring target MT but also a predetermined portion of it (for example, the portion surrounded by the dotted line in FIG. 9A).
  • FIG. 11 is a diagram showing a configuration of the marker MK according to a first modification.
  • In the first modification, two markers MKa and MKb carrying different identification information are arranged on the exposed surface of the monitored object MT (here, the exposed surface of the top plate of the vehicle).
  • One marker MKa is arranged in the front part of the monitored object MT, and the other marker MKb is arranged in the rear part of the monitored object MT.
  • Therefore, even when one of the two markers MKa and MKb is shielded by an obstacle and cannot be seen from the laser radar 200, it is possible to estimate the position and orientation of the monitored object MT.
  • One marker MKa has a triangular shape formed of one large highly reflective piece.
  • The other marker MKb is formed of two highly reflective pieces arranged in the shape of a colon symbol.
  • The marker MKb is composed of a plurality of highly reflective pieces MKb1 and MKb2 having different reflectances, which together form one pattern from which the orientation of the marker MKb can be identified.
  • In the marker database Dm, in addition to the shape of the marker MKb, information on the reflectance of each of the highly reflective pieces MKb1 and MKb2 is also registered as part of the “reference pattern of the marker MK”. In this way, by varying the reflectance in addition to the shape formed by the highly reflective material, more diverse identification information can be configured.
  • FIGS. 12A and 12B are diagrams showing the configuration of the marker MK according to a second modification.
  • In the second modification, two markers MK carrying different identification information are arranged on the rear exposed surface (see FIG. 12A) and the front exposed surface (see FIG. 12B) of the monitored object MT (here, a vehicle).
  • As a result, even when only one of the front side and the rear side of the monitored object MT is visible from the laser radar 200, the monitoring device 100 can estimate the identification information, position, and posture of the monitored object MT.
  • The shape of the marker MK can be modified in various ways other than the above.
  • The marker MK preferably has a polygonal shape in plan view, and more preferably a non-axisymmetric polygonal shape in plan view (for example, a triangle or a pentagon). This makes it easier for the monitoring device 100 to recognize the facing direction of the marker MK shown in the distance image.
  • The marker MK is not limited to a two-dimensional shape and may have a three-dimensional shape.
  • The marker MK may be composed of a plurality of highly reflective pieces arranged apart from each other, with the plurality of pieces together forming a predetermined shape as a whole. This makes it possible to secure a large overall size for the marker MK, and makes it easier for the monitoring device 100 to recognize the pattern of the marker MK shown in the distance image.
  • The marker MK may also be incorporated as a part of a member on the exposed surface of the monitored object MT, instead of being attached to that surface.
  • FIG. 13 is a diagram showing the configuration of the monitoring device 100 according to the second embodiment.
  • The monitoring device 100 according to the present embodiment differs from the first embodiment in that it has a data registration unit 50. The description of the configuration common to the first embodiment is omitted (the same applies to the other embodiments below).
  • The data registration unit 50 newly registers an unregistered monitoring target MT in the marker database Dm, associating the reference pattern of the marker MK of the monitoring target MT with the object information of the monitoring target MT.
  • The data registration unit 50 measures the overall shape (and size) of the monitoring target MT using the laser radar 200, for example, and extracts from the measurement data the marker MK measured together with the monitoring target MT. The data registration unit 50 then stores the shape and size of the marker MK obtained from the measurement data as the “reference pattern of the marker MK”, stores the shape and size of the monitoring target MT obtained from the measurement data as the “object information of the monitoring target MT”, and registers both in the marker database Dm, associated with each other by identification information.
  • The data registration unit 50 also determines the position of the marker MK on the monitoring target MT from the positional relationship between the monitoring target MT and the marker MK shown in the distance image, and stores it in the marker database Dm as part of the “object information of the monitoring target MT”. The data registration unit 50 attaches new identification information to the newly registered monitored object MT and stores it in the marker database Dm.
  • When the data registration unit 50 uses the laser radar 200 to measure the overall shape (and size) of the monitored object MT, the laser radar 200 typically images the monitored object MT from each of the upward direction (or downward direction), the left direction (or right direction), and the front direction (or rear direction). The data registration unit 50 then registers, as object information, the lengths of the monitored object MT in the vertical, horizontal, and front-rear directions obtained from these measurements.
  • FIG. 14 is a flowchart showing a registration process by the monitoring device 100 according to this embodiment.
  • The flowchart shown in FIG. 14 is executed by the monitoring device 100 in accordance with a computer program, for example, when a new registration command is issued by the user.
  • In step S11, the monitoring device 100 uses the laser radar 200 to measure the shape of the entire monitored object MT to be registered in the marker database Dm from each direction of the monitored object MT (for example, front and rear, left and right, and up and down). At this time, the monitoring device 100 may cause the laser radar 200 to measure the surface shape of the monitored object MT in each direction and combine the resulting distance images to recognize the shape of the monitored object MT three-dimensionally. Either the laser radar 200 may be moved relative to the monitoring target MT, or the monitoring target MT may be moved relative to the laser radar 200.
  • In step S12, the monitoring device 100 registers the size and shape of the monitoring target MT recognized from the measurement data of step S11 in the marker database Dm as the object information of the monitoring target MT.
  • In step S13, the monitoring device 100 extracts from the measurement data of step S11 the region in which the reflection intensity of the reflected light is equal to or greater than a threshold value (that is, the pixel region of the distance image in which the reflection intensity is equal to or greater than the threshold value), and detects the shape and size of the marker MK.
  • In step S14, the monitoring device 100 compares the shape of the marker MK identified in step S13 with the reference patterns of all the other markers MK registered in the marker database Dm, and calculates the comparison result as a degree of agreement.
  • In step S15, if the shape of the marker MK identified in step S13 does not match any of the reference patterns of the other markers MK registered in the marker database Dm (S15: NO), the monitoring device 100 proceeds to step S18. On the other hand, if the shape of the marker MK matches one of the reference patterns of the other markers MK registered in the marker database Dm (S15: YES), the process proceeds to step S16.
  • In step S16, the monitoring device 100 notifies the user that, because a marker MK with the same pattern has already been registered, the shape of the marker MK should be changed and the marker re-registered. The monitoring device 100 also deletes the object information of the monitoring target MT temporarily registered in step S12 from the marker database Dm, and the process proceeds to step S17.
  • In step S17, the monitoring device 100 waits for a re-registration command from the user (S17: NO); when a re-registration request is issued by the user (S17: YES), the process returns to step S11 and the same processing is executed again.
  • In step S18, the monitoring device 100 registers the shape and size of the marker MK identified in step S13 in the marker database Dm as the reference pattern of the marker MK, in association with the object information of the monitoring target MT registered in step S12.
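  • The uniqueness check of steps S14 and S15 could be sketched as follows; the degree-of-agreement measure shown here (a simple similarity on sorted side lengths) is an assumption for illustration only:

```python
def degree_of_agreement(sides_a, sides_b):
    """Crude similarity score between two marker shapes based on their side lengths (0..1)."""
    if len(sides_a) != len(sides_b):
        return 0.0
    diffs = [abs(a - b) / max(a, b) for a, b in zip(sorted(sides_a), sorted(sides_b))]
    return 1.0 - sum(diffs) / len(diffs)

def can_register(new_marker_sides, marker_db, threshold: float = 0.95):
    """Return True if the new marker differs enough from every registered one (step S15)."""
    return all(degree_of_agreement(new_marker_sides, ref) < threshold
               for ref in marker_db.values())

marker_db = {"truck_no1": (30.0, 50.0, 50.0)}
print(can_register((30.0, 50.0, 50.0), marker_db))  # False: same pattern exists -> step S16
print(can_register((20.0, 40.0, 60.0), marker_db))  # True: proceed to registration -> step S18
```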
  • As described above, with the monitoring device 100 according to the present embodiment, it is possible to easily register a new monitoring target MT in the marker database Dm. This makes it possible to sequentially update the marker database Dm according to the user's usage.
  • FIG. 15 is a diagram showing the configuration of the monitoring device 100 according to the third embodiment.
  • The monitoring device 100 according to the present embodiment differs from the first embodiment in that it is built into the laser radar 200. The image acquisition unit 10 of the monitoring device 100 therefore acquires image data directly from the measurement unit 210 of the laser radar 200 (that is, the imaging unit that generates the distance image).
  • This configuration of the monitoring device 100 is useful in that it is not necessary to prepare a computer separate from the laser radar 200.
  • In the above embodiments, the functions of the image acquisition unit 10, the filter processing unit 20, the analysis unit 30, and the notification unit 40 are described as being realized by one computer.
  • However, they may instead be realized by a plurality of computers.
  • Likewise, the programs and data read by the computer may be distributed and stored across a plurality of computers.
  • According to the monitoring device of the present disclosure, it is possible to improve the accuracy of moving-object monitoring using a laser radar.
  • U: monitoring system; 100: monitoring device; 101: CPU; 102: ROM; 103: RAM; 104: external storage device; 105: communication interface; 10: image acquisition unit; 20: filter processing unit; 30: analysis unit; 40: notification unit; 50: data registration unit; 200: laser radar; 210: measurement unit; 300: speaker; Dm: marker database; DL: analysis data; MK, MKa, MKb: marker (reflective material); MT: monitoring target; NB: shield

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A monitoring device (100) that monitors the state of a monitoring target (MT) having a reflective material (MK), comprising: an image acquisition unit (10) that acquires image data of a range image generated by a laser radar (200); and an analysis unit (30) that detects a pattern in the reflective material (MK) of the monitoring target (MT) shown in the range image and that specifies the position and posture of the monitoring target (MT) based on the detected reflective-material (MK) pattern and reference patterns for the reflective material (MK) stored in advance in a database (Dm).
PCT/JP2020/005270 2019-03-01 2020-02-12 Monitoring device and monitoring method Ceased WO2020179382A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-037590 2019-03-01
JP2019037590 2019-03-01

Publications (1)

Publication Number Publication Date
WO2020179382A1 (fr) 2020-09-10

Family

ID=72338288

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/005270 Ceased WO2020179382A1 (fr) 2019-03-01 2020-02-12 Monitoring device and monitoring method

Country Status (1)

Country Link
WO (1) WO2020179382A1 (fr)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004233173A (ja) * 2003-01-30 2004-08-19 Ricoh Co Ltd Angle and position detection device and universal joint incorporating the same
JP2007122507A (ja) * 2005-10-28 2007-05-17 Secom Co Ltd Intrusion detection device
WO2018230517A1 (fr) * 2017-06-13 2018-12-20 Kawasaki Heavy Industries, Ltd. Actuation system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023038693A (ja) * 2021-09-07 2023-03-17 Topcon Corp Survey data processing device, survey data processing method, and survey data processing program
CN114155245A (zh) * 2022-02-10 2022-03-08 CCTEG Coal Mining Research Institute Method and device for monitoring surrounding rock deformation in an underground coal mine based on a three-dimensional point cloud
CN114155245B (zh) * 2022-02-10 2022-05-03 CCTEG Coal Mining Research Institute Method and device for monitoring surrounding rock deformation in an underground coal mine based on a three-dimensional point cloud


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20766461

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20766461

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP