WO2019159409A1 - Transported object tracking device, transported object counting device, transported object tracking method, transported object counting method, transported object tracking system, and transported object counting system - Google Patents
- Publication number
- WO2019159409A1 (PCT/JP2018/034186)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tracking
- detection
- image
- images
- transported
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B65—CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
- B65G—TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
- B65G43/00—Control devices, e.g. for safety, warning or fault-correcting
- B65G43/08—Control devices operated by article or material being fed, conveyed or discharged
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06M—COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
- G06M7/00—Counting of objects carried by a conveyor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
Definitions
- the present invention relates to a conveyance tracking device, a conveyance counting device, a conveyance tracking method, a conveyance counting method, a conveyance tracking system, and a conveyance counting system.
- a product transported on a conveyor such as a belt conveyor may be imaged with a camera so that the product can be monitored.
- the present inventors have studied a device that acquires images of the transported object over time, detects the transported object in each image, and tracks the same transported object across the images (hereinafter also referred to as the "device of the reference example").
- in the device of the reference example, since detection is performed for every image, the detection and tracking processing takes time. This problem becomes particularly noticeable when, for example, the transported object is detected using a learning model created by machine learning or deep learning.
- an object of the present invention is to provide a transported object tracking device and a tracking method capable of reducing the processing time.
- the transported object tracking device of the present invention includes: image acquisition means for acquiring n images over time of a transported object being transported by a transport device; detection means for detecting the transported object in k detection target images selected from the n images; and tracking means for tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein k in the detection means is smaller than n.
- the transported object counting device of the present invention includes tracking means for tracking a transported object being transported by a transport device, and counting means for counting the tracked transported objects,
- wherein the tracking means is the transported object tracking device of the present invention.
- the transported object tracking method of the present invention includes: an image acquisition step of acquiring n images over time of a transported object being transported by a transport device; a detection step of detecting the transported object in k detection target images selected from the n images; and a tracking step of tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein in the detection step, k is smaller than n.
- the transported object counting method of the present invention includes a tracking step of tracking a transported object being transported by a transport device, and a counting step of counting the tracked transported objects,
- wherein the tracking step is performed by the transported object tracking method of the present invention.
- the program of the present invention causes a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; and a tracking process of tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein in the detection process, k is smaller than n.
- another program of the present invention causes a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; a tracking process of tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image; and a counting process of counting the tracked transported objects, wherein in the detection process, k is smaller than n.
- the transported object tracking system of the present invention includes a terminal and a server, the terminal and the server being connectable via a communication network outside the system,
- the terminal includes an imaging device, The imaging device captures n images over time for a transported object transported by a transport device,
- the server includes image acquisition means, detection means, and tracking means,
- the image acquisition means acquires n images over time for a transported object transported by the transport device,
- the detection means detects a conveyed object for k detection target images among n images,
- the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, and k in the detection means is smaller than n.
- the transported object counting system of the present invention includes a terminal and a server, The terminal and the server are connectable via a communication network outside the system,
- the terminal includes an imaging device, The imaging device captures n images over time for a transported object transported by a transport device,
- the server includes image acquisition means, detection means, tracking means, and counting means,
- the image acquisition means acquires n images over time for a transported object transported by the transport device,
- the detection means detects a conveyed object for k detection target images among n images,
- the tracking means tracks the conveyed object detected in each detection target image in the images acquired after each detection target image among the n images,
- the counting means counts the number of the tracked transported objects, and k in the detection means is smaller than n.
- according to the present invention, the processing time can be shortened.
- FIG. 1 is a block diagram illustrating a tracking apparatus according to the first embodiment.
- FIG. 2 is a block diagram illustrating an example of a hardware configuration of the tracking device according to the first embodiment.
- FIG. 3 is a flowchart illustrating the tracking method and program according to the first embodiment.
- FIG. 4 is a block diagram illustrating another example of the tracking device according to the first embodiment.
- FIG. 5 is a block diagram illustrating another example of the tracking device according to the first embodiment.
- FIG. 6 is a block diagram illustrating an example of the tracking apparatus according to the second embodiment.
- FIG. 7 is a flowchart illustrating the tracking method and program according to the second embodiment.
- FIG. 8 is a flowchart illustrating the tracking method and program according to the second embodiment.
- FIG. 9 is a diagram illustrating an image acquired by the tracking device.
- FIG. 10 is a diagram illustrating a timeline of a part of processing in the tracking method according to the second embodiment.
- FIG. 11 is a block diagram illustrating a counting device according to the third embodiment.
- FIG. 12 is a flowchart illustrating a counting method and program according to the third embodiment.
- FIG. 13 is a block diagram illustrating a sorting apparatus according to the fourth embodiment.
- FIG. 14 is a flowchart illustrating a sorting method and program according to the fourth embodiment.
- FIG. 15 is a block diagram illustrating a tracking system according to the fifth embodiment.
- FIG. 16 is a block diagram illustrating a sorting system according to the fifth embodiment.
- Embodiment 1 relates to a tracking device and a tracking method of the present invention.
- FIG. 1 shows a block diagram of the tracking device in the present embodiment.
- the tracking device 10 of this embodiment includes an image acquisition unit 111, a detection unit 112, and a tracking unit 113.
- the image acquisition unit 111, the detection unit 112, and the tracking unit 113 may be incorporated into the data processing unit (data processing apparatus) 11, which is hardware, or may be software, or hardware in which the software is incorporated.
- the data processing unit 11 may include a CPU or the like.
- the data processing unit 11 may include, for example, a ROM, a RAM, and the like which will be described later.
- FIG. 2 illustrates a block diagram of the hardware configuration of the tracking device 10.
- the tracking device 10 includes, for example, a CPU (Central Processing Unit) 201, a memory 202, a bus 203, a storage device 204, an input device 206, a display 207, a communication device 208, and the like. Each part of the tracking device 10 is connected via a bus 203 by a respective interface (I / F).
- the hardware configuration of the tracking device 10 can be employed as a hardware configuration of a counting device and a sorting device, which will be described later, for example.
- the CPU 201 operates in cooperation with other components by a controller (system controller, I / O controller, etc.), for example, and takes charge of overall control of the tracking device 10.
- the CPU 201 executes the program 205 of the present invention and other programs, and reads and writes various information.
- the CPU 201 functions as the image acquisition unit 111, the detection unit 112, and the tracking unit 113.
- although the tracking device 10 includes a CPU as an arithmetic device, the tracking device 10 may include another arithmetic device such as a GPU (Graphics Processing Unit) or an APU (Accelerated Processing Unit), or a combination of the CPU and these devices.
- the CPU 201 functions as each unit other than the storage unit in Embodiments 2 to 4 and Modifications 1 to 5 described later, for example.
- the memory 202 includes, for example, a main memory.
- the main memory is also called a main storage device.
- the memory 202 reads various operation programs such as the program 205 of the present invention stored in a storage device 204 (auxiliary storage device) described later.
- the CPU 201 reads data from the memory 202, decodes it, and executes the program.
- the main memory is, for example, a RAM (Random Access Memory).
- the memory 202 further includes a ROM (read only memory), for example.
- the bus 203 can be connected to an external device, for example.
- examples of the external device include an external storage device (an external database and the like), a printer, and the like.
- the tracking device 10 can be connected to a communication line network by a communication device 208 connected to the bus 203, and can also be connected to the external device via the communication line network.
- the tracking device 10 can also be connected to a terminal or the like via the communication device 208 and a communication line network.
- the storage device 204 is also referred to as a so-called auxiliary storage device for the main memory (main storage device), for example.
- the storage device 204 stores an operation program including the program 205 of the present invention.
- the storage device 204 includes, for example, a storage medium and a drive that reads from and writes to the storage medium.
- the storage medium is not particularly limited, and may be, for example, a built-in type or an external type; examples include an HD (hard disk), an FD (floppy (registered trademark) disk), a CD-ROM, a CD-R, a CD-RW, an MO, a DVD, a flash memory, and a memory card. The drive is not particularly limited either.
- the storage device 204 may be, for example, a hard disk drive (HDD) in which the storage medium and the drive are integrated.
- the tracking device 10 further includes an input device 206 and a display 207, for example.
- examples of the input device 206 include a pointing device such as a touch panel, a track pad, or a mouse; a keyboard; an imaging unit such as a camera or a scanner; a card reader such as an IC card reader or a magnetic card reader; and a voice input unit such as a microphone.
- Examples of the display 207 include display devices such as an LED display and a liquid crystal display.
- the input device 206 and the display 207 are configured separately, but the input device 206 and the display 207 may be configured as a single unit like a touch panel display.
- the image acquisition unit 111 is electrically connected to the detection unit 112 and the tracking unit 113.
- the image acquisition unit 111 acquires n images over time with respect to the conveyed item being conveyed by the conveying device.
- An example of the image acquisition unit 111 is a CPU.
- the image acquisition unit 111 acquires the n images over time from, for example, an imaging unit (imaging device) inside or outside the device, or a storage unit inside or outside the device.
- the image acquisition unit 111 acquires n images according to the order in which the images are captured.
- the image acquisition unit 111 may capture the n images over time.
- the image acquisition unit 111 is, for example, an imaging unit (imaging device) that captures the image.
- the imaging means is, for example, a still camera, a video camera, a mobile phone with a camera, a smartphone with a camera, a mobile terminal with a camera such as a tablet terminal with a camera, a computer with a webcam, a head-mounted display with a camera, etc.
- Examples of the storage means include random access memory (RAM), read only memory (ROM), flash memory, hard disk (HD), optical disk, floppy disk (FD), and the like.
- the storage means may be a built-in device or an external device such as an external storage device.
- the image can be acquired, for example, by capturing an image with the imaging unit when the transport device is transporting a transported object.
- the image may include, for example, one of an image including the transported object and an image not including the transported object, or may include both.
- the image is, for example, an image including all or part of the transport device.
- the image may be, for example, an image in which a certain portion of a transport device that transports the transported object is captured, or may be an image in which different locations are captured. That is, the image may be, for example, an image obtained by imaging a certain part of a route (conveyance route) along which the conveyed product is conveyed, or may be an image obtained by imaging a different part.
- each image is captured such that the captured regions partially overlap, for example.
- Examples of the transfer device include a contact type or non-contact type transfer device.
- Examples of the transport device include a conveyor such as a belt conveyor, a chute, and a carriage.
- the transported object is, for example, an object transported by the transport device.
- the transported object is not particularly limited and can be any object.
- Specific examples of the transported material include raw materials, work-in-process, semi-finished products, products, merchandise, and stored items.
- the tracked transported object may be all or a part of the transported object.
- the tracked transported object may be, for example, an object or a tracking object.
- when the transported object includes a target object, the target object may be of one type or a plurality of types. Examples of the target object include non-defective products and defective products.
- the frequency of acquiring the images is not particularly limited; the lower limit thereof is, for example, 3 FPS (Frames Per Second), preferably 12 FPS, more preferably 20 FPS, and the upper limit thereof is not particularly limited.
- the frequency ranges are, for example, 10 to 100 FPS, 10 to 20 FPS, and 60 to 100 FPS.
- n is an arbitrary positive integer of 2 or more, and the upper limit is not particularly limited.
- the “n sheets” is, for example, 2 sheets (frames) or more, preferably 3 sheets or more, and more preferably 5 sheets or more.
- the detection means 112 is electrically connected to the image acquisition means 111 and the tracking means 113.
- An example of the detection unit 112 is a CPU.
- the detection unit 112 detects the transported object for k detection target images selected from the n images acquired by the image acquisition unit 111.
- the detection unit 112 detects the transported object for the k detection target images in the order acquired by the image acquisition unit 111.
- the detection target image is an image for detecting the transported object, for example.
- the detection target image can be acquired by selecting from the n images. The method for selecting the detection target image will be described later.
- the detection target image may be used for tracking by the tracking unit 113 described later.
- the “k” only needs to be smaller than n; more specifically, k is any positive integer smaller than n. That is, k is an arbitrary integer that satisfies 1 ≤ k < n.
- the “k sheets” only needs to satisfy the numerical range described above, and is appropriately changed according to the selection method of the detection target image, for example.
- the tracking unit 113 is electrically connected to the image acquisition unit 111 and the detection unit 112.
- An example of the tracking unit 113 is a CPU.
- the tracking unit 113 tracks, in the images acquired after each detection target image among the n images acquired by the image acquisition unit 111, the transported object detected in that detection target image (hereinafter also referred to as the "detected transported object"). For example, the tracking unit 113 tracks the transported object in the n images in the order in which they were acquired by the image acquisition unit 111.
- FIG. 3 shows a flowchart of the tracking method in the present embodiment.
- the tracking method of this embodiment is implemented as follows, for example using the tracking apparatus 10 of FIG.
- the tracking method of the present embodiment includes an S1 step (image acquisition), an S2 step (detection), and an S3 step (tracking).
- steps S2 and S3 may be implemented in parallel or in series.
- in step S1, n images are acquired by the image acquisition unit 111 over time with respect to the transported object transported by the transport device.
- the imaging unit captures a transport route, and the image acquisition unit 111 acquires the captured image as the image.
- when the image acquisition unit 111 is an imaging unit, the image acquisition unit 111 captures the transport route and acquires the captured image as the image.
- alternatively, the image acquisition unit 111 may read and acquire an image stored in a data storage unit (not shown).
- next, in step S2, the detection unit 112 detects the transported object in the k detection target images selected from the n images. That is, in step S2, the transported object is detected in only some of the n images. Specifically, first, the detection target images are selected.
- the detection target image may be selected by, for example, the detection unit 112 or may be selected by another unit.
- the selection method of the detection target images is not particularly limited. For example, every predetermined number of images among the n images may be selected as a detection target image, or an image acquired every predetermined time may be selected as a detection target image. The predetermined number and the predetermined time can be set as appropriate according to, for example, the detection time required by the detection unit 112 for a detection target image.
- the detection unit 112 detects, for example, position information of each transported object in the detection target image, such as coordinates (for example, center coordinates) and sizes (for example, width and length, or vertical and horizontal sizes (areas)).
- the detection unit 112 may output these as detection results (detection information).
- the detection result may include, for example, the number of detections of each conveyed product.
- the certainty factor is, for example, the probability (possibility) that an object detected by the detection unit 112 is actually the transported object.
- the detection unit 112 may detect an object in the transported object.
- the detection unit 112 may detect the object by detecting a conveyance object other than the object in the conveyance object.
- the detection unit 112 can detect the transport object or the target object using, for example, color-based identification, contour extraction, template matching, a learning model that can detect the transport object or the target object, and the like.
- the learning model can be created, for example, by performing machine learning or deep learning on the transported object or object.
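- the following is a minimal illustrative sketch of such a detection step, assuming Python with OpenCV and using template matching as a simple stand-in for the learning model described above; the function name, template image, and threshold are illustrative assumptions and are not taken from the disclosure:

```python
# Illustrative sketch only: template matching stands in for the learning model.
# Assumes OpenCV (cv2) and NumPy; frame and template are BGR images.
import cv2
import numpy as np

def detect_objects(frame, template, threshold=0.8):
    """Return detections as (center_x, center_y, width, height, certainty)."""
    h, w = template.shape[:2]
    frame_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    tmpl_gray = cv2.cvtColor(template, cv2.COLOR_BGR2GRAY)
    scores = cv2.matchTemplate(frame_gray, tmpl_gray, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(scores >= threshold)
    detections = []
    for x, y in zip(xs, ys):
        # Center coordinates, size, and certainty, as in the detection result above.
        detections.append((x + w / 2.0, y + h / 2.0, float(w), float(h),
                           float(scores[y, x])))
    return detections
```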
- the detection unit 112 may classify, for example, to which target object the detected object corresponds.
- the detection means 112 may output the classification class of the detected target object as a detection result, for example.
- in step S3, the tracking unit 113 tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image. That is, in step S3, when the tracking unit 113 detects a transported object in the p-th detection target image among the k detection target images, the tracking unit 113 tracks, in the images acquired (captured) after the p-th detection target image among the n images, the transported object detected in the p-th detection target image. The tracking unit 113 tracks the transported object detected in the p-th detection target image in, for example, all or part of the images acquired after the p-th detection target image.
- the “p” is a positive integer equal to or less than k.
- the transported object can be tracked by optical flow estimation such as the Lucas-Kanade method or the Horn-Schunck method, for example.
- the tracking of the transported object can be performed, for example, by calculating, by optical flow estimation, position information such as the coordinates (for example, center coordinates) of the detected transported object in the image used for tracking, based on the image in which the transported object is tracked, the image acquired immediately before that image, and the position information such as the coordinates (for example, center coordinates) of the transported object detected in the image acquired immediately before.
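- a minimal illustrative sketch of this tracking calculation, assuming Python with OpenCV's pyramidal Lucas-Kanade optical flow (cv2.calcOpticalFlowPyrLK); the function and variable names are illustrative assumptions:

```python
# Illustrative sketch only: propagate detected center coordinates from the previous
# image into the current image by Lucas-Kanade optical flow estimation.
import cv2
import numpy as np

def track_points(prev_frame, curr_frame, prev_centers):
    """prev_centers: list of (x, y); returns the tracked centers in curr_frame."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    pts = np.float32(prev_centers).reshape(-1, 1, 2)
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    # Keep only the points whose flow was found.
    return [tuple(p.ravel()) for p, ok in zip(new_pts, status.ravel()) if ok == 1]
```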
- the tracking unit 113 may output the position information of the conveyed object detected in the image used for the tracking as a tracking result (tracking information). If the detection unit 112 outputs the detection result, in step S3, the tracking unit 113 may associate the detection result and the tracking result with respect to the corresponding transported object.
- the transported object is detected in k detection target images selected from the n images. That is, in the tracking device and the tracking method according to the first embodiment, the transported object is detected not in all the images of the n images but in a part of the images. For this reason, the tracking device and the tracking method of the first embodiment can shorten the processing time as compared with the device of the reference example that detects the transported object in all images.
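- a minimal illustrative sketch of this overall flow, reusing the detect_objects and track_points sketches above and assuming, purely for illustration, that every fifth frame is selected as a detection target image:

```python
# Illustrative sketch only: detect on every N-th frame (k detection target images,
# k < n) and track by optical flow on the frames in between.
import cv2

def run_tracking(video_path, template, detect_every=5):
    cap = cv2.VideoCapture(video_path)
    prev_frame, centers, frame_index = None, [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_index % detect_every == 0:
            # Detection target image: run detection (slow step).
            centers = [(x, y) for x, y, _w, _h, _c in detect_objects(frame, template)]
        elif prev_frame is not None and centers:
            # Non-detection image: only track the previously detected objects.
            centers = track_points(prev_frame, frame, centers)
        prev_frame, frame_index = frame, frame_index + 1
    cap.release()
```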
- the tracking device 10 is configured by the data processing means 11, but may include other configurations.
- the tracking device of the present invention may include data storage means.
- FIG. 4 is a block diagram illustrating another example of the tracking device according to the first embodiment.
- the tracking device 20 includes an image storage unit 121, a detection information storage unit 122, and a tracking information storage unit 123 in addition to the configuration of the tracking device 10.
- the image storage unit 121 is electrically connected to the image acquisition unit 111, the detection unit 112, and the tracking unit 113; the detection information storage unit 122 is electrically connected to the detection unit 112 and the tracking unit 113; and the tracking information storage unit 123 is electrically connected to the tracking unit 113.
- the image storage unit 121, the detection information storage unit 122, and the tracking information storage unit 123 may be incorporated in the data storage unit 12 that is hardware, for example, as illustrated in FIG.
- Examples of the data storage means 12 include the storage means described above, and specific examples include ROM, RAM, and the like.
- the tracking device 20 stores n images acquired by the image acquisition unit 111 in the image storage unit 121, and outputs the stored images to the detection unit 112 and the tracking unit 113. Further, the detection result of the detection unit 112 and the detection target image are stored in the detection information storage unit 122, and the stored detection result is output to the tracking unit 113. Furthermore, the tracking result of the tracking unit 113 is stored in the tracking information storage unit 123. Except for these points, the tracking device 20 has the same configuration as that of the tracking device 10, and the description thereof can be used.
- the tracking device of the present invention may include at least one of input means and output means.
- FIG. 5 is a block diagram illustrating another example of the tracking device according to the first embodiment.
- the tracking device 30 includes an input unit 13 and an output unit 14 in addition to the configuration of the tracking device 10.
- the input means 13 is electrically connected to the image acquisition means 111
- the output means 14 is electrically connected to the image acquisition means 111, the detection means 112, and the tracking means 113, respectively.
- the input unit 13 inputs information such as start and stop of image acquisition, for example.
- as the input means 13, for example, a normal input means provided in a portable terminal, such as a monitor (for example, a touch panel) or operation keys, a normal input means provided in a computer, such as a keyboard and a mouse, an input file, another computer, and the like can be used.
- the output unit 14 outputs, for example, n images acquired by the image acquisition unit 111, detection results of the detection unit 112, k detection target images, tracking results of the tracking unit 113, and the like.
- the output means 14 includes, for example, display means such as a monitor that outputs video (for example, various image display devices such as a liquid crystal display (LCD) and a cathode ray tube (CRT) display), a printer that outputs by printing, and a speaker that outputs sound.
- the output unit 14 may display the n images, the detection result, the detection target image, and the tracking result on the display unit.
- the input unit 13 and the output unit 14 may be electrically connected to the data processing unit 11 via, for example, an I / O interface.
- the tracking device of the present invention may further include, for example, a video codec and a controller (system controller, I / O controller, etc.). Except for these points, the tracking device 30 has the same configuration as that of the tracking device 10, and the description thereof can be used.
- Modification 1 relates to the tracking device and the tracking method of the present invention.
- for the tracking device and the tracking method of Modification 1, for example, the description of the tracking device and the tracking method of the first embodiment can be cited.
- the tracking device of Modification 1 includes detection processing determination means for determining, after the image acquisition means acquires the m-th image among the n images, whether the detection means is performing detection on the l-th detection target image among the k detection target images. In this case, for example, when the detection processing determination means determines that the detection means is not performing the detection, the detection means detects the transported object in the m-th image as the (l+1)-th detection target image.
- the “m” is a positive integer of n or less. That is, m is an integer that satisfies 1 ≤ m ≤ n.
- the “l” is a positive integer of k − 1 or less. That is, l is an integer that satisfies 1 ≤ l ≤ k − 1.
- the detection processing determination unit is electrically connected to, for example, the image acquisition unit and the detection unit.
- An example of the detection process determination means is a CPU.
- the tracking method of Modification 1 includes a detection process determination step of determining, after the m-th image among the n images is acquired in the image acquisition step, whether detection is being performed on the l-th detection target image among the k detection target images in the detection step.
- in this case, when it is determined in the detection process determination step that the detection is not being performed, the transported object is detected in the m-th image as the (l+1)-th detection target image in the detection step.
- the tracking method of Modification 1 can be implemented using the tracking device of Modification 1, for example. Specifically, first, the m-th image is acquired by the image acquisition means (image acquisition step). Next, the detection process determination means determines whether the detection means is performing detection on the l-th detection target image (detection process determination step). Specifically, the detection process determination means determines, for example, whether the process by the detection means is running by checking the operating status of the CPU that performs the detection process. When the detection process determination means determines that the detection means is not performing detection, the detection means selects the m-th image as the (l+1)-th detection target image and detects the transported object (detection step).
- on the other hand, when the detection processing determination means determines that the detection by the detection means is being performed, detection by the detection means is not performed on the m-th image.
- in this case, the detected transported object may be tracked in the m-th image by the tracking means.
- the tracking device and the tracking method of Modification 1 can track the detected transported object even for an image that is not subjected to the detection process, for example, so that the tracking accuracy can be further improved.
- the detection processing determination unit may, for example, also determine whether or not the association unit is performing association between the detection result of the l-th detection target image and the tracking result of the image next to the l-th detection target image among the n images. In this case, when the detection processing determination unit determines that neither the detection by the detection unit nor the association by the association unit is being performed, it is preferable that the detection unit detects the transported object in the m-th image as the (l+1)-th detection target image.
- similarly, the detection process determination step may, for example, determine whether or not the association step is performing association between the detection result of the l-th detection target image and the tracking result of the image next to the l-th detection target image among the n images. In this case, when it is determined in the detection process determination step that neither the detection by the detection step nor the association by the association step is being performed, it is preferable that the transported object is detected in the m-th image as the (l+1)-th detection target image in the detection step.
- in this way, in the tracking device and the tracking method of Modification 1, it is determined whether the detection process by the detection means (or the detection step) is being performed, and when the detection process is not being performed, the detection process can be performed on the newly acquired m-th image. For this reason, it is possible to prevent a newly acquired image from waiting without being subjected to the detection process. Therefore, according to the tracking device and the tracking method of Modification 1, the processing time can be further shortened.
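- a minimal illustrative sketch of this detection process determination, assuming Python threading with a flag that indicates whether detection is currently running; the names and the threading scheme are illustrative assumptions:

```python
# Illustrative sketch only: a newly acquired image becomes the next detection target
# image only when no detection process is currently running.
import threading

detection_busy = threading.Event()

def _detection_worker(frame, template, results):
    try:
        results.append(detect_objects(frame, template))  # sketch shown earlier
    finally:
        detection_busy.clear()

def maybe_start_detection(frame, template, results):
    """Return True if frame was taken as the (l+1)-th detection target image."""
    if detection_busy.is_set():
        return False  # detection still running: this frame is only tracked
    detection_busy.set()
    threading.Thread(target=_detection_worker, args=(frame, template, results),
                     daemon=True).start()
    return True
```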
- Modification 2 relates to the tracking device and the tracking method of the present invention.
- the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the second modification.
- in the tracking device of Modification 2, the detection means acquires position information of the transported object, and the tracking means tracks the transported object by calculating the position information of the transported object in the j-th image among the n images, based on the j-th image, the (j−1)-th or earlier image, and the position information of the transported object in the (j−1)-th or earlier image.
- j is a positive integer of 2 or more and n or less. That is, j is an integer that satisfies 2 ≤ j ≤ n.
- the position information of the conveyed object is acquired in the detection step.
- in the tracking step, the transported object is tracked by calculating the position information of the transported object in the j-th image among the n images, based on the j-th image, the (j−1)-th or earlier image, and the position information of the transported object in the (j−1)-th or earlier image.
- the tracking method of Modification 2 can be implemented by the tracking device of Modification 2, for example. Specifically, first, in the same manner as in the first embodiment, tracking is performed for the first to (j−1)-th images. At this time, in Modification 2, when the transported object is detected by the detection means, the position information of the transported object in the detection target image is acquired (detection step). Next, the j-th image is acquired in the same manner as in the first embodiment. When the j-th image is not a detection target image, detection by the detection means is not performed. On the other hand, when the j-th image is a detection target image, the j-th image is selected as the i-th detection target image, detection is performed by the detection means, and the position information of the transported object in the i-th detection target image is acquired (detection step).
- next, the detected transported object is tracked in the j-th image (tracking step). Specifically, the tracking means acquires the j-th image, the (j−1)-th or earlier image, and the position information of the transported object in the (j−1)-th or earlier image, and then calculates, for example by optical flow estimation, the position information in the j-th image of the transported object detected in the (j−1)-th or earlier image.
- it is preferable that the tracking means tracks the transported object by calculating the position information of the transported object in the j-th image based on the j-th image, the (i−1)-th or earlier detection target image, and the position information of the transported object in the (i−1)-th or earlier detection target image. It is also preferable that the (i−1)-th or earlier detection target image includes the (i−1)-th detection target image. Thereby, since a transported object newly detected in the (i−1)-th detection target image can also be tracked, omission of tracking of the transported object can be suppressed, and the tracking accuracy can be further improved.
- the position information of the conveyed product in the detection step and the tracking step is preferably coordinates or center coordinates of the conveyed product.
- i is an integer of 2 or more and k or less. That is, i is an integer that satisfies 2 ≤ i ≤ k.
- in the tracking device and the tracking method of Modification 2, the position information of the transported object in the j-th image is calculated based on the j-th image, the (j−1)-th or earlier image, and the position information of the transported object in the (j−1)-th or earlier image, so the tracking processing can be reduced. Therefore, according to the tracking device and the tracking method of Modification 2, the processing time can be further shortened.
- Modification 3 relates to the tracking device and the tracking method of the present invention.
- the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the third modification.
- the tracking device of Modification 3 acquires the position information of the transported object detected in the detection target image, the position information being obtained by the tracking means, and includes erroneous detection determination means for determining, based on that position information, whether the transported object is an erroneously detected object.
- the erroneous detection determination means is electrically connected to, for example, the detection means and the tracking means.
- An example of the erroneous detection determination means is a CPU.
- the position information of the detected transported object is preferably, for example, coordinates or center coordinates of the transported object.
- the position information of the conveyed product obtained by the tracking means is preferably the coordinates or center coordinates of the conveyed item.
- the tracking method of Modification 3 can be implemented by the tracking device of Modification 3, for example. Specifically, first, the detection step and the tracking step are performed in the same manner as in the first embodiment. At this time, in Modification 3, for the transported object detected in the detection target image, the position information of the transported object in the image used for tracking (for example, the j-th image in Modification 2) is obtained by calculation. Next, based on the position information of the transported object obtained in the tracking step, the erroneous detection determination means determines whether the transported object is an erroneously detected object (erroneous detection determination step).
- the determination of the erroneously detected object can be performed based on, for example, the amount of change in the position of the detected conveyed object between a plurality of images.
- for example, when the amount of change in the position (for example, coordinates or center coordinates) of the detected transported object over a predetermined number of images is equal to or less than a predetermined value, the detected transported object can be determined to be an erroneously detected object.
- alternatively, for example, when the difference between the average movement vector of the transported objects calculated from the images and the movement vector of a detected transported object between the images is large, the detected transported object may be determined to be an erroneously detected object.
- examples of the predetermined numerical value include 10 pixels, 5 pixels, and 1 pixel.
- the modified example 3 can detect the erroneously detected object with higher accuracy.
- the predetermined number of images is, for example, 1 to 10 frames, 1 to 5 frames, and 1 to 3 frames.
- the predetermined numerical value can be appropriately set according to, for example, the size of the transport device captured by the image and the length of the transport route. It is preferable that the transported object determined as the erroneously detected object and the information associated with the transported object are deleted from the detection result and the tracking result, for example.
- the deletion is performed, for example, prior to detection of a conveyed product in a new detection target image (next detection target image).
- when the deletion is not performed together with the erroneous detection determination by the erroneous detection determination means or the erroneous detection determination step, a deletion flag is assigned, in the erroneous detection determination means or the erroneous detection determination step, to the transported object in the detection result and the tracking result and to the information associated with that transported object.
- thereby, according to the tracking device and the tracking method of Modification 3, the processing time can be further shortened.
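- a minimal illustrative sketch of this erroneous detection determination, assuming Python and using, purely as examples, the 5-pixel and 3-frame values listed above; the names and thresholds are illustrative assumptions:

```python
# Illustrative sketch only: a detected object whose tracked center barely moves over
# several frames is treated as an erroneously detected object (deletion candidate).
import math

def is_false_detection(center_history, num_frames=3, min_motion_px=5.0):
    """center_history: list of (x, y) centers for one object, oldest first."""
    if len(center_history) < num_frames + 1:
        return False  # not enough history to judge yet
    recent = center_history[-(num_frames + 1):]
    total_motion = sum(math.dist(a, b) for a, b in zip(recent, recent[1:]))
    return total_motion <= min_motion_px
```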
- Modification 4 relates to the tracking device and the tracking method of the present invention.
- the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the fourth modification.
- the tracking device of Modification 4 includes association means for associating, as the same transported object, the detected transported object and the tracked transported object, based on the detection result of the transported object in the h-th detection target image among the k detection target images and the tracking result of the transported object in the image acquired next to the h-th detection target image among the n images.
- the association means is electrically connected to, for example, the detection means and the tracking means.
- An example of the association means is a CPU.
- h is a positive integer of k or less. That is, h is an integer that satisfies 1 ≤ h ≤ k.
- the tracking method of Modification 4 includes an association step of associating, as the same transported object, the detected transported object and the tracked transported object, based on the detection result of the transported object in the h-th detection target image among the k detection target images and the tracking result of the transported object in the image acquired next to the h-th detection target image among the n images.
- the tracking method of Modification 4 can be implemented by the tracking device of Modification 4, for example.
- the detection result of the conveyed product in the h-th detection target image is acquired by the association unit.
- a tracking result of the conveyed object in the image acquired next to the h-th detection target image among the n images is acquired by the association unit.
- next, the association means determines, for example, whether the same transported object exists by calculating the overlap between the region of the transported object in the detection result and the region of the transported object in the tracking result.
- when the same transported object exists, the association means can associate, for example, the transported object in the detection result and the transported object in the tracking result as the same transported object.
- the associating means associates as follows, for example. Specifically, first, the overlap rate of the regions is calculated for all combinations of a transported object in the detection result and a transported object in the tracking result. Where A is the region of the transported object in the tracking result and B is the region of the transported object in the detection result, the overlap rate is, for example, the ratio (A∩B / A) of the overlapping area within the region (A) of the tracking-result transported object, the ratio (A∩B / B) of the overlapping area within the region (B) of the detection-result transported object, or the ratio (A∩B / A∪B) of the overlapping area to the combined region, where A∪B denotes the union of the tracking-result transported object region and the detection-result transported object region, and A∩B denotes the area where the two regions overlap.
- alternatively, the association unit may determine whether the same transported object exists by calculating, for example, the distance between the coordinates (for example, center coordinates) of each transported object in the detection result and the coordinates (for example, center coordinates) of each transported object in the tracking result. In this case, for example, the association unit repeatedly performs association, associating pairs as the same transported object in order from the combination with the shortest distance.
- in this way, in the tracking device and the tracking method of Modification 4, the same transported object in the detection result and the tracking result can be associated with each other. For this reason, according to the tracking device and the tracking method of Modification 4, the same transported object need not be tracked as if it were a different transported object, and the processing time can be further shortened.
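- a minimal illustrative sketch of this association step, assuming Python, with intersection-over-union as one concrete form of the overlap rate and greedy shortest-distance matching for the coordinate-based association; the box format, distance limit, and names are illustrative assumptions:

```python
# Illustrative sketch only. Boxes are (center_x, center_y, width, height).
import math

def overlap_rate(box_a, box_b):
    """Intersection-over-union of two boxes (A ∩ B / A ∪ B)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0.0, min(ax + aw / 2, bx + bw / 2) - max(ax - aw / 2, bx - bw / 2))
    iy = max(0.0, min(ay + ah / 2, by + bh / 2) - max(ay - ah / 2, by - bh / 2))
    inter = ix * iy                      # A ∩ B
    union = aw * ah + bw * bh - inter    # A ∪ B
    return inter / union if union > 0 else 0.0

def associate_by_distance(tracked_centers, detected_centers, max_dist=50.0):
    """Greedy matching, shortest center distance first; returns (tracked_i, detected_j)."""
    candidates = sorted((math.dist(t, d), i, j)
                        for i, t in enumerate(tracked_centers)
                        for j, d in enumerate(detected_centers))
    used_t, used_d, matches = set(), set(), []
    for dist, i, j in candidates:
        if dist > max_dist:
            break
        if i not in used_t and j not in used_d:
            matches.append((i, j))
            used_t.add(i)
            used_d.add(j)
    return matches
```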
- Modification 5 relates to the tracking device and the tracking method of the present invention.
- the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the modification 5.
- the tracking device of Modification 5 includes, in addition to the configuration of the tracking device according to the first embodiment, transported object determination means for determining whether a transported object detected in a detection target image acquired before the current image exists, and when the transported object determination means determines that the detected transported object exists, the tracking means performs tracking.
- the transported object determining means is electrically connected to, for example, the detecting means and the tracking means.
- An example of the conveyed product determination means is a CPU.
- the tracking method of Modification 5 includes a transported object determination step of determining whether a transported object detected in a detection target image acquired before the current image exists, and when it is determined in the transported object determination step that the detected transported object exists, tracking is performed in the tracking step.
- the tracking method of Modification 5 can be implemented by the tracking device of Modification 5, for example.
- specifically, first, the transported object determination means acquires the detection result of the transported object detected in the detection target image acquired before the q-th image. Then, the transported object determination means determines whether a detected transported object exists in the acquired detection result. When a detected transported object exists, the tracking step by the tracking means is performed. Moreover, when the above-mentioned deletion flag has been assigned to the detection result of a transported object, the transported object determination means may, for example, treat the transported object to which the deletion flag is assigned as not detected. Thereby, for example, tracking of an erroneously detected object can be avoided, and the processing time can be further shortened.
- the “q” is a positive integer of n or less. That is, q is an integer that satisfies 1 ≤ q ≤ n.
- in the tracking device and the tracking method of Modification 5, it is determined whether or not a detected transported object exists. For this reason, according to the tracking device and the tracking method of Modification 5, when no detected transported object exists, that is, when there is no transported object to be tracked, the tracking process need not be performed, and the processing time can be shortened.
- Modifications 1 to 5 may each be used alone, but it is preferable to use a plurality of them in combination in order to shorten the processing time and improve the tracking accuracy. In the case of combining a plurality of them, the combination is not particularly limited and can be any combination.
- the second embodiment relates to the tracking device and the tracking method of the present invention.
- FIG. 6 shows a block diagram of the tracking device in the present embodiment.
- the tracking device 40 of the present embodiment includes, in addition to the configuration of the tracking device 10, a detection processing determination unit 114, a transported object determination unit 115, an erroneous detection determination unit 116, and an association unit 117.
- the image acquisition unit 111, the detection unit 112, the tracking unit 113, the detection process determination unit 114, the transported object determination unit 115, the erroneous detection determination unit 116, and the association unit 117 are, for example, hardware. It may be incorporated in a certain data processing means (data processing apparatus) 11 or may be software or hardware in which the software is incorporated.
- the data processing unit 11 may include a CPU or the like. Further, the data processing means 11 may include data storage means such as the aforementioned ROM and RAM, for example. In this case, each unit is electrically connected to a corresponding storage unit in the data storage unit, for example.
- the detection processing determination unit 114 is electrically connected to the image acquisition unit 111 and the detection unit 112, the transported object determination unit 115 is electrically connected to the detection unit 112 and the tracking unit 113, the erroneous detection determination unit 116 is electrically connected to the detection unit 112 and the tracking unit 113, and the association unit 117 is electrically connected to the detection unit 112 and the tracking unit 113.
- except for the above, the configuration of the tracking device 40 of the second embodiment is the same as the configuration of the tracking device 10 of the first embodiment, and the description thereof can be used. Further, the descriptions of Modifications 1 to 5 can be used for the descriptions of the detection processing determination unit 114, the transported object determination unit 115, the erroneous detection determination unit 116, and the association unit 117.
- FIGS. 7 and 8 show flowcharts of the tracking method in the present embodiment.
- the tracking method of this embodiment is implemented as follows using the tracking device 40 of FIG. 6, for example.
- the tracking method of the present embodiment includes a main thread composed of step S1 (image acquisition), step S4 (detection process determination), step S6 (tracking data update), step S2′ (detection/association), step S7 (transported object determination), step S3 (tracking), step S8 (addition of tracking data), and step S9 (false detection determination).
- step S2′ includes a sub-thread composed of step S2 (detection), step S10 (association and synchronization), and step S11 (detection result output).
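- a minimal illustrative sketch of this main-thread/sub-thread structure, assuming Python threading and reusing the detect_objects and track_points sketches above; the queue and function names are illustrative assumptions:

```python
# Illustrative sketch only: the main thread acquires frames and tracks, while the
# detection of step S2' runs in a sub-thread and its result is folded back in at the
# next tracking-data update (step S6).
import queue
import threading

detection_results = queue.Queue()

def detection_subthread(frame, template):
    # Step S2 (detection); association/output (S10, S11) would follow here.
    detection_results.put(detect_objects(frame, template))

def main_thread(frames, template):
    worker, prev_frame, centers = None, None, []
    for frame in frames:                                   # step S1
        if worker is None or not worker.is_alive():        # step S4: detection idle?
            while not detection_results.empty():           # step S6: update tracking data
                centers = [(x, y) for x, y, *_ in detection_results.get()]
            worker = threading.Thread(target=detection_subthread,
                                      args=(frame, template), daemon=True)
            worker.start()                                  # step S2': start sub-thread
        if prev_frame is not None and centers:              # steps S7/S3: track
            centers = track_points(prev_frame, frame, centers)
        prev_frame = frame
```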
- FIG. 9 is a diagram illustrating an image acquired by the tracking device 40.
- FIG. 10 is a diagram showing a timeline of processing of the S2 ′ step including the S2 step (detection) and the S10 step (association, synchronization) and the S3 step (tracking).
- first, in step S1, frame 1 (the first image) is acquired by the image acquisition unit 111.
- next, the detection process determination means 114 determines whether the detection process is being performed (S4).
- when the determination is No, the detection unit 112 updates the tracking data in which the detection result and the tracking result are associated (S6). For frame 1, since there is no detection result yet, the tracking data is not updated, and the process proceeds to step S2′ to start a sub-thread.
- in step S2, the detection unit 112 selects frame 1 as the detection target image, and detects the first object 1a and the second object 2a in frame 1.
- the detection means 112 detects the first object 1a and the second object 2a using a learning model that can detect the first object 1 and the second object 2.
- by this detection, for the first object 1a, position information such as its center coordinates and vertical and horizontal sizes (occupied area) and its certainty (the possibility of being the first object 1) are output as a detection result associated with frame 1. Similarly, for the second object 2a, position information such as its center coordinates and vertical and horizontal sizes (occupied area) and its certainty (the possibility of being the second object 2) are output as a detection result associated with frame 1.
- next, in step S10, the association unit 117 associates the same transported object between the first object 1a and the second object 2a detected in frame 1 and the objects in the tracking result of the image next to frame 1. For frame 1, since there is no tracking data that can be compared with the detected first object 1a and second object 2a, no association is performed, and the process proceeds to step S11.
- in step S11, the detection results of the first object 1a and the second object 2a and the frame 1 are output. Then, the sub-thread is terminated.
- step S7 is performed in parallel with steps S2, S10, and S11 in step S2′.
- in step S7, the transported object determination means 115 determines whether a detected transported object exists. In frame 1, since there is no transported object detected before frame 1, the determination is No, and the process proceeds to step S1.
- in step S1, frame 2 is acquired by the image acquisition unit 111. Then, the detection process determination means 114 determines whether the detection process is being performed (S4). For frame 2, as shown in FIG. 10, the detection process for frame 1 has been completed and no detection process is running, so the determination is No. Next, for example, the tracking data is updated by the detection means 112 (S6). In frame 1, the first object 1a and the second object 2a were detected and their detection results were output; therefore, in step S6, these are treated as detected objects, and the position information and the certainty of the first object 1a and the second object 2a in frame 1 are recorded as tracking data.
- at this time, for the first object 1a, the number of detections as the first object 1 is recorded as one, and for the second object 2a, the number of detections as the second object 2 is recorded as one. The certainty factor and the number of detections are listed and held for each type of object. Then, the process proceeds to step S2′ to start a sub-thread.
- in step S2, the detection unit 112 selects frame 2 as the detection target image and detects, in frame 2, the first object 1b in addition to the first object 1a and the second object 2a detected in frame 1.
- by this detection, for each of the first objects 1a and 1b, position information such as the center coordinates and vertical and horizontal sizes (occupied area) and the certainty (the possibility of being the first object 1) are output as a detection result associated with frame 2. For the second object 2a, its center coordinates, position information such as vertical and horizontal sizes (occupied area), and certainty (the possibility of being the second object 2) are output as a detection result associated with frame 2.
- next, in step S10, the association unit 117 associates, as the same object, the detected first objects 1a and 1b and second object 2a with the objects in the tracking result of the image next to frame 2, that is, frame 3 described later. Specifically, the overlap of the regions of the first objects 1a and 1b and the second object 2a is calculated for all combinations of the objects, and each region is calculated from the center coordinates and the vertical and horizontal sizes of each object.
- in step S11, the detection results of the first objects 1a and 1b and the second object 2a, the correspondence of the first object 1a and the second object 2a between frames 2 and 3, and the frame 2 are output. Then, the sub-thread is terminated.
- Step S7 is again performed in parallel with steps S2, S10, and S11 of the sub-thread started in step S2'.
- In step S7, the transported object determination unit 115 determines whether a detected transported object exists. Since the detected first object 1a and second object 2a exist, the determination is Yes and step S3 is performed.
- In step S3, the tracking unit 113 tracks the first object 1a and the second object 2a based on frame 1, which is the previous detection target image.
- Specifically, the center coordinates of the first object 1a and the second object 2a in frame 2 are each calculated by optical flow estimation, based on their center coordinates in frame 1 and on frames 1 and 2.
- In step S8, the tracking unit 113 then adds the calculated center coordinates of the first object 1a and the second object 2a, as their position information in frame 2, to the tracking data of the first object 1a and the second object 2a (see the sketch below).
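- Optical flow estimation is named in the text, but no particular algorithm or parameters are specified; a minimal sketch using sparse Lucas-Kanade flow from OpenCV is shown below, with the window size and pyramid depth chosen only for illustration.

```python
import cv2
import numpy as np

def track_centers_by_optical_flow(prev_frame, curr_frame, centers):
    """Estimate, in curr_frame, the new center coordinates of objects whose
    centers in prev_frame are given, using sparse Lucas-Kanade optical flow.
    Window size and pyramid depth are illustrative choices."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    p0 = np.array(centers, dtype=np.float32).reshape(-1, 1, 2)
    p1, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, p0, None, winSize=(21, 21), maxLevel=3
    )
    new_centers = []
    for new_pt, ok, old_pt in zip(p1.reshape(-1, 2), status.reshape(-1), p0.reshape(-1, 2)):
        # fall back to the previous coordinate when the flow could not be estimated
        new_centers.append(tuple(new_pt) if ok else tuple(old_pt))
    return new_centers
```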
- Next, in step S9, the erroneous detection determination unit 116 determines whether the first object 1a and the second object 2a are erroneously detected objects. Specifically, for each of the first object 1a and the second object 2a, it is determined whether the amount of change in the center coordinates over a predetermined number of frames is equal to or less than a predetermined value.
- If the amount of change is equal to or less than the predetermined value, the erroneous detection determination unit 116 assigns a deletion flag to the tracking data of the corresponding first object 1a or second object 2a.
- The tracking data to which the deletion flag has been assigned is deleted at the next step S6. In the present embodiment, the first object 1a and the second object 2a do not satisfy this condition, so no deletion flag is assigned.
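- A minimal version of this check might look as follows; the window length and the movement threshold are assumptions, since the text only specifies "a predetermined number of frames" and "a predetermined numerical value".

```python
import math
from typing import List, Tuple

def is_probably_false_detection(
    centers: List[Tuple[float, float]],
    window: int = 5,               # "predetermined number of frames" (assumed value)
    max_total_shift: float = 3.0,  # "predetermined numerical value" in pixels (assumed)
) -> bool:
    """Return True when the center has barely moved over the last `window` frames.
    On a running conveyor a genuine item keeps moving, so a near-static track is
    treated as a likely false detection and would receive the deletion flag."""
    if len(centers) < window:
        return False
    recent = centers[-window:]
    total_shift = sum(
        math.hypot(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(recent, recent[1:])
    )
    return total_shift <= max_total_shift
```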
- After step S9, the process proceeds to step S1, and frame 3 is acquired by the image acquisition unit 111. The detection process determination unit 114 then determines whether the detection process is being performed (S4). For frame 3, the determination is Yes.
- Steps S7, S3, S8, and S9 are then performed on frame 3 in the same manner as for frame 2, except that frame 2, the immediately preceding image, is used in place of the previous detection target image.
- After completion of step S9 for frame 3, the process proceeds to step S1, and frame 4 is acquired by the image acquisition unit 111. Steps S4, S7, S3, S8, and S9 are then performed on frame 4 in the same manner as for frame 3, after which the process again proceeds to step S1.
- Thereafter, tracking is performed in the same manner as for frames 2 to 4 until image acquisition by the image acquisition unit 111 is completed. In this way, the transported objects, including the target objects, can be tracked.
- When the determination in step S7 is No and/or after step S9, it may further be determined whether a new image has been acquired. If Yes, that is, if a new image has been acquired, the process proceeds to step S1; if No, that is, if no new image has been acquired, the tracking method of the present embodiment is terminated.
- As described above, in the second embodiment the detection process and the tracking process are executed in parallel, so the tracking process does not have to wait while the detection process runs. The tracking device and tracking method of the second embodiment can therefore reduce the processing time even further than a device combining the first to fifth modifications. A sketch of this main-thread/sub-thread split follows.
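- The following Python sketch illustrates one way to realize the split described above: detection runs in a sub-thread, a new detection is started only when the previous one has finished, and tracking of already-detected objects continues in the main thread for every frame. The detect_fn and track_fn callbacks are placeholders, not the patent's implementation, and folding the finished detection back into the tracking data is simplified here (the actual device merges via the association means).

```python
import threading

class ParallelDetectTrack:
    """Main-thread / sub-thread split: detection (and association) runs in a
    sub-thread; tracking of already-detected objects runs on every frame in
    the main thread. detect_fn and track_fn are placeholder callbacks."""

    def __init__(self, detect_fn, track_fn):
        self.detect_fn = detect_fn   # frame -> list of detections
        self.track_fn = track_fn     # (prev_frame, frame, tracks) -> updated tracks
        self.detect_thread = None
        self.pending_detection = None
        self.tracks = []

    def _detect_worker(self, frame):
        # corresponds to the sub-thread started in step S2'
        self.pending_detection = self.detect_fn(frame)

    def process(self, prev_frame, frame):
        # S4: only start a new detection if the previous one has finished
        if self.detect_thread is None or not self.detect_thread.is_alive():
            # S6: fold the finished detection result into the tracking data
            if self.pending_detection is not None:
                self.tracks = self.pending_detection
                self.pending_detection = None
            self.detect_thread = threading.Thread(target=self._detect_worker, args=(frame,))
            self.detect_thread.start()
        # S7/S3/S8: tracking runs in the main thread, in parallel with detection
        if self.tracks:
            self.tracks = self.track_fn(prev_frame, frame, self.tracks)
        return self.tracks
```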
- Embodiment 3 relates to the counting device and counting method of the present invention.
- the description of the tracking device and the tracking method can be used for the counting device and the counting method of the present embodiment.
- FIG. 11 shows a block diagram of the counting device in the present embodiment.
- the counting device 50 of the present embodiment includes a counting unit 118 in addition to the tracking device 10 of the first embodiment.
- The image acquisition unit 111, the detection unit 112, the tracking unit 113, and the counting unit 118 may be incorporated in, for example, a data processing unit (data processing apparatus) 11 that is hardware, or hardware in which the corresponding software is incorporated may be used.
- the data processing unit 11 may include a CPU or the like.
- the data processing means 11 may include data storage means such as the aforementioned ROM and RAM, for example. In this case, each unit is electrically connected to a corresponding storage unit in the data storage unit, for example.
- the counting means 118 is electrically connected to the tracking means 113.
- An example of the counting means 118 is a CPU.
- the counting means 118 counts the tracked transported object. Specifically, when the tracking unit 113 acquires the position information of the detected transported object, the counting unit 118 counts the transported object based on the position information of the transported object, for example.
- the position information of the conveyed object used for counting includes, for example, coordinates of the conveyed object (for example, center coordinates) in one or more images, and a trajectory connecting the coordinates of the conveyed objects in two or more images.
- FIG. 12 shows a flowchart of the counting method in the present embodiment.
- the counting method of this embodiment is implemented as follows using the counting device 50 of FIG. 11, for example.
- The counting method of the present embodiment performs step S12 (counting) in addition to step S1 (image acquisition), step S2 (detection), and step S3 (tracking) of the tracking method of Embodiment 1:
- Step S1: image acquisition
- Step S2: detection
- Step S3: tracking
- Step S12: counting
- Steps S2 and S3 may be performed in parallel or in series.
- Steps S1 to S3 are performed in the same manner as in the tracking method of Embodiment 1.
- In step S12, the tracked transported object is counted by the counting unit 118.
- Specifically, in step S12, the counting unit 118 determines whether the tracked position information of a transported object satisfies a counting condition, that is, a condition for counting the transported object. Examples of the counting condition include the transported object having moved out of the image, the transported object having moved into a predetermined region, and the length of the transported object's trajectory exceeding a predetermined distance. The counting unit 118 then counts the transported objects determined to satisfy the counting condition. In step S12, the detection result and tracking result of a counted transported object may be deleted after counting.
- When the method includes a tracking data update step, as in the tracking method of Embodiment 2, a deletion flag may instead be assigned to the detection result and tracking result of the counted transported object so that they are removed in the tracking data update step. This prevents the same transported object from being counted more than once, as in the sketch below.
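- The following sketch combines two of the example counting conditions (leaving the image, trajectory length) with the deletion flag used to avoid double counting; the numeric thresholds and the track dictionary layout are assumptions.

```python
import math

def satisfies_counting_condition(track, image_width, min_trajectory=400.0):
    """Two of the example conditions from the text: the item has left the image,
    or the length of its trajectory exceeds a threshold (values are assumed)."""
    last_x, _last_y = track["centers"][-1]
    if last_x < 0 or last_x >= image_width:
        return True
    trajectory = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(track["centers"], track["centers"][1:])
    )
    return trajectory > min_trajectory

def count_tracks(tracks, image_width):
    """Count each track at most once; the deletion flag marks already-counted
    tracks so the update step can remove them and they are never recounted."""
    counted = 0
    for track in tracks:
        if track.get("deleted"):
            continue
        if satisfies_counting_condition(track, image_width):
            counted += 1
            track["deleted"] = True
    return counted
```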
- In step S12, when the transported objects include target objects, only the target objects may be counted, or both the target objects and the transported objects other than the target objects may be counted. When the transported objects include a plurality of types of target objects, counting may be performed for each type of target object.
- Counting for each type of object can be performed based on, for example, the number of detections as each object recorded in the tracking method of Embodiment 2. Suppose, for example, that the object 1a in FIG. 9 has been detected seven times as the first object 1 and three times as the second object 2, so that the detection counts differ between the object types. In this case, in step S12, the object 1a is counted as the first object 1.
- Suppose instead that the object 1a in FIG. 9 has been detected five times as the first object 1 and five times as the second object 2, so that the detection counts are the same for both object types. In this case, the highest certainty factor recorded in the tracking data for the first object 1 and the highest certainty factor recorded for the second object 2 are compared to determine which is higher.
- If the certainty factor of the first object 1 is higher than that of the second object 2, the object 1a is counted as the first object 1; otherwise, the object 1a is counted as the second object 2.
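- This decision rule (majority of detection counts, tie broken by the highest recorded certainty factor) can be written compactly as follows; the record layout mirrors the earlier tracking-data sketch and is an assumption.

```python
def decide_counted_type(track):
    """Counted object type = the type detected most often; when the detection
    counts tie, the type with the highest recorded certainty factor wins."""
    counts = track["detection_counts"]            # e.g. {"object1": 5, "object2": 5}
    best_confidence = {
        label: max(values) for label, values in track["confidences"].items()
    }
    top_count = max(counts.values())
    candidates = [label for label, n in counts.items() if n == top_count]
    if len(candidates) == 1:
        return candidates[0]
    return max(candidates, key=lambda label: best_confidence.get(label, 0.0))

# Example: five detections as each type, but object 1 was seen with higher certainty.
example = {
    "detection_counts": {"object1": 5, "object2": 5},
    "confidences": {"object1": [0.91, 0.88], "object2": [0.72, 0.70]},
}
assert decide_counted_type(example) == "object1"
```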
- When the counting device of the present embodiment includes an erroneous detection determination unit, the counting unit is preferably electrically connected to the erroneous detection determination unit.
- Likewise, when the counting method includes an erroneous detection determination step, the counting step is preferably performed after the erroneous detection determination step.
- The counting device and counting method of the third embodiment include the tracking device and tracking method of the first embodiment, respectively.
- For this reason, the processing time can be shortened compared with a counting device and counting method that incorporate the device of the reference example, which detects the transported object in every image.
- The counting device of the present embodiment may further include, for example, detection count measurement means for measuring the number of times the same transported object has been detected; when the number of detections of a transported object is equal to or less than a predetermined number, the counting means need not count that transported object. Similarly, the counting method of the present embodiment may include, for example, a detection count measurement step of measuring the number of detections of the same transported object, and in the counting step, transported objects whose detection count is equal to or less than the predetermined number are not counted.
- This prevents objects erroneously detected as transported objects by the detection unit 112 from being counted, so the transported objects can be counted more accurately. The detection count measurement means or detection count measurement step is preferably combined with the erroneous detection determination means or erroneous detection determination step described above; this more effectively prevents erroneously detected objects from being counted, so the transported objects can be counted still more accurately.
- Embodiment 4 relates to a sorting apparatus and a sorting method of the present invention.
- the description of the tracking device and the tracking method can be used for the sorting device and the sorting method of the present embodiment.
- FIG. 13 shows a block diagram of the sorting apparatus in the present embodiment.
- the sorting device 60 of this embodiment includes a sorting unit 119 in addition to the tracking device 10 of the first embodiment.
- The image acquisition unit 111, the detection unit 112, the tracking unit 113, and the sorting unit 119 may be incorporated in, for example, a data processing unit (data processing apparatus) 11 that is hardware, or hardware in which the corresponding software is incorporated may be used.
- the data processing unit 11 may include a CPU or the like.
- the data processing means 11 may include data storage means such as the aforementioned ROM and RAM, for example. In this case, each unit is electrically connected to a corresponding storage unit in the data storage unit, for example.
- the sorting unit 119 is electrically connected to the tracking unit 113.
- An example of the sorting unit 119 is a CPU.
- The sorting unit 119 sorts the target objects in the tracked transported objects, either directly or indirectly. In the case of indirect sorting, when the tracking unit 113 has acquired the position information of a detected target object, the sorting unit 119 causes a sorting device inside or outside the apparatus to sort the target object based on, for example, that position information.
- the position information of the conveyed object used for the selection includes, for example, coordinates of the conveyed object (for example, center coordinates) in one or more images and a trajectory connecting the coordinates of the conveyed object in two or more images.
- Examples of the sorting device include a sorter capable of sorting a plurality of types of items, a robot arm, and the like.
- In the case of direct sorting, the sorting unit 119 itself sorts the target objects in the tracked transported objects based on, for example, the position information of the target objects. In this case, an example of the sorting unit 119 is the above-described sorting device.
- FIG. 14 shows a flowchart of the selection method in the present embodiment.
- the sorting method of the present embodiment is performed as follows, for example, using the sorting device 60 of FIG.
- The sorting method of this embodiment performs step S13 (sorting) in addition to step S1 (image acquisition), step S2 (detection), and step S3 (tracking) of the tracking method of Embodiment 1:
- Step S1: image acquisition
- Step S2: detection
- Step S3: tracking
- Step S13: sorting
- Steps S2 and S3 may be performed in parallel or in series.
- Steps S1 to S3 are performed in the same manner as in the tracking method of Embodiment 1.
- In step S13, the sorting unit 119 sorts the target objects in the tracked transported objects, directly or indirectly, as outlined in the sketch below. Specifically, in step S13 the sorting unit 119 acquires the position information of the tracked target objects and/or of the transported objects other than the target objects. When the acquired position information is that of a target object, the sorting unit 119 sorts the target object, for example by separating it out. When the acquired position information is that of a transported object other than a target object, the sorting unit 119 sorts the target object, for example, by removing the other transported object.
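- A minimal sketch of the indirect case, in which the latest tracked position of each target object is forwarded to an external sorting device, is shown below; send_to_sorter stands for whatever interface the sorting device exposes and is purely an assumption.

```python
from typing import Callable, Iterable, Tuple

def dispatch_targets_to_sorter(
    tracks: Iterable[dict],
    target_types: set,
    send_to_sorter: Callable[[str, Tuple[float, float]], None],
) -> None:
    """Indirect sorting: for every tracked item whose decided type is a target
    object, forward its latest center coordinates to the external sorting device."""
    for track in tracks:
        label = track.get("label")
        if label in target_types and track["centers"]:
            send_to_sorter(label, track["centers"][-1])

# Example usage: print instead of driving a real robot arm or sorter.
dispatch_targets_to_sorter(
    [{"label": "object1", "centers": [(120.0, 80.0), (180.0, 82.0)]}],
    target_types={"object1"},
    send_to_sorter=lambda label, xy: print(f"sort {label} at {xy}"),
)
```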
- When the transported objects include a plurality of types of target objects, the plurality of types of target objects and the transported objects other than the target objects may be sorted, or each of the plurality of types of target objects may be sorted separately.
- When the sorting device of the present embodiment includes an erroneous detection determination unit, the sorting unit is preferably electrically connected to the erroneous detection determination unit.
- Likewise, when the sorting method includes an erroneous detection determination step, the sorting step is preferably performed after the erroneous detection determination step.
- The sorting device and sorting method of the fourth embodiment include the tracking device and tracking method of the first embodiment, respectively. For this reason, according to the sorting device and sorting method of the fourth embodiment, the processing time can be shortened compared with a sorting device and sorting method that incorporate the device of the reference example, which detects the transported object in every image.
- The program of this embodiment (Embodiment 5) is a program capable of causing a computer to execute the tracking method, counting method, or sorting method described above. The program of this embodiment may also be recorded on, for example, a computer-readable recording medium.
- the recording medium is, for example, a non-transitory computer-readable storage medium.
- the recording medium is not particularly limited, and examples thereof include a random access memory (RAM), a read-only memory (ROM), a hard disk (HD), an optical disk, and a floppy (registered trademark) disk (FD).
- Embodiment 6 relates to the tracking system of the present invention.
- The description of the tracking device and the tracking method can be used for the tracking system of the present invention.
- FIG. 15 shows an example of the configuration of a tracking system using the tracking device of the present invention.
- The tracking system of this embodiment includes imaging devices 31a, 31b, and 31c, communication interfaces 32a, 32b, and 32c, and a server 34.
- the imaging device 31a is connected to the communication interface 32a.
- the imaging device 31a and the communication interface 32a are installed at the place X.
- the imaging device 31b is connected to the communication interface 32b.
- the imaging device 31b and the communication interface 32b are installed in the place Y.
- the imaging device 31c is connected to the communication interface 32c.
- the imaging device 31c and the communication interface 32c are installed in the place Z.
- The communication interfaces 32a, 32b, and 32c and the server 34 are connected via a communication network 33.
- The image acquisition means, the detection means, and the tracking means are stored on the server 34 side.
- In this tracking system, for example, the n images acquired using the imaging device 31a at place X are transmitted to the server 34, and the transported object is tracked on the server 34 side.
- The tracking system of this embodiment may also correspond to a combination of the above-described embodiments and modifications, and may be adapted to, for example, cloud computing. Furthermore, in the tracking system of the present embodiment, the communication interfaces 32a, 32b, and 32c and the server 34 may be connected by a wireless communication line.
- According to the tracking system of this embodiment, an imaging device can be installed on site, a server or the like can be installed at another location, and the transported object can be tracked online, for example by uploading frames as sketched below. The equipment therefore takes up little space on site and is easy to maintain, and even when the installation locations are far apart, centralized management and remote operation can be performed from a single location.
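- The system description does not specify a transport protocol; the following client-side sketch simply encodes each captured frame as JPEG and posts it to a tracking server over HTTP. The endpoint URL and the response format are assumptions for illustration only.

```python
import cv2
import requests

SERVER_URL = "http://tracking-server.example/frames"  # hypothetical endpoint

def send_frame_to_server(frame):
    """Encode one captured frame as JPEG and send it to the server-side
    image acquisition means; returns the server's (assumed JSON) reply."""
    ok, buf = cv2.imencode(".jpg", frame)
    if not ok:
        raise ValueError("frame could not be encoded")
    response = requests.post(
        SERVER_URL,
        data=buf.tobytes(),
        headers={"Content-Type": "image/jpeg"},
        timeout=5.0,
    )
    response.raise_for_status()
    return response.json()  # e.g. tracking or counting results computed server-side
```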
- Embodiment 7 relates to the counting system of the present invention.
- the description of the tracking device, the tracking method, the counting device, the counting method, and the tracking system can be used for the counting system of the present invention.
- In the counting system of the present embodiment, the server 34 of the tracking system of Embodiment 6 further includes counting means.
- In this counting system, for example, the n images acquired at place X using the imaging device 31a are transmitted to the server 34, and the transported objects are counted on the server 34 side. Except for these points, the description of the tracking system of Embodiment 6 applies to the counting system of the seventh embodiment.
- According to the counting system of the present embodiment, the processing time at the server when tracking the transported objects can be shortened; as a result, the transported objects can be counted in a shorter time.
- Furthermore, an imaging device can be installed on site, a server or the like can be installed at another location, and the transported objects can be counted online. The equipment therefore takes up little space on site and is easy to maintain, and even when the installation locations are far apart, centralized management and remote operation can be performed from a single location.
- Embodiment 8 relates to the sorting system of the present invention.
- the description of the tracking device, the tracking method, the sorting device, the sorting method, and the tracking system can be used for the sorting system of the present invention.
- FIG. 16 shows a configuration of an example of a sorting system using the sorting apparatus of the present invention.
- The sorting system includes imaging devices 31a, 31b, and 31c, sorting devices 35a, 35b, and 35c, communication interfaces 32a, 32b, and 32c, and a server 34.
- the imaging device 31a and the sorting device 35a are connected to the communication interface 32a.
- the imaging device 31a, the sorting device 35a, and the communication interface 32a are installed at the place X.
- the imaging device 31b and the sorting device 35b are connected to the communication interface 32b.
- the imaging device 31b, the sorting device 35b, and the communication interface 32b are installed at the place Y.
- the imaging device 31c and the sorting device 35c are connected to the communication interface 32c.
- the imaging device 31c, the sorting device 35c, and the communication interface 32c are installed in the place Z.
- The communication interfaces 32a, 32b, and 32c and the server 34 are connected via a communication network 33.
- The image acquisition means, the detection means, the tracking means, and the sorting means are stored on the server 34 side.
- In this sorting system, for example, the n images acquired using the imaging device 31a at place X are transmitted to the server 34, and the server 34 tracks the transported objects and sorts the target objects among them.
- Specifically, the server 34 transmits, for example, the position information of a target object to be sorted to the sorting device 35a, and the sorting device 35a sorts that target object.
- Except for these points, the description of the tracking system of Embodiment 6 applies to the sorting system of the eighth embodiment.
- According to the sorting system of the present embodiment, the processing time at the server when tracking the transported objects can be shortened, so the target objects in the transported objects can be sorted in a shorter time. Furthermore, the imaging device and the sorting device can be installed on site while the server or the like is installed at another location, and the target objects can be sorted online. The equipment therefore takes up little space on site and is easy to maintain, and even when the installation locations are far apart, centralized management and remote operation can be performed from a single location.
- (Appendix 1) A transported object tracking device comprising: image acquisition means for acquiring n images over time of a transported object being transported by a transport device; detection means for detecting the transported object in k detection target images selected from the n images; and tracking means for tracking, in images acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image, wherein k in the detection means is smaller than n.
- (Appendix 2) The tracking device according to Appendix 1, further comprising detection process determination means for determining, after acquisition of the m-th image among the n images by the image acquisition means, whether the detection means is performing detection on the l-th detection target image among the k detection target images, wherein, when the detection process determination means determines that detection by the detection means is not being performed, the detection means detects the transported object using the m-th image as the (l+1)-th detection target image.
- (Appendix 3) The tracking device according to Appendix 1 or 2, wherein the detection means obtains position information of the transported object, and the tracking means tracks the transported object by calculating position information of the transported object in the j-th image among the n images, based on the j-th image, the image preceding the j-th image, and the position information of the transported object in that preceding image.
- (Appendix 4) The tracking device according to Appendix 3, wherein, when the j-th image is the i-th detection target image, the tracking means tracks the transported object by calculating position information of the transported object in the j-th image based on the j-th image, the (i−1)-th detection target image, and the position information of the transported object in the (i−1)-th detection target image.
- (Appendix 5) The tracking device according to any one of Appendices 1 to 4, wherein the tracking means obtains position information of the transported object detected in the detection target image, the tracking device further comprising erroneous detection determination means for determining whether the transported object is an erroneously detected object based on the position information of the transported object obtained by the tracking means.
- (Appendix 6) The tracking device according to Appendix 5, wherein the erroneous detection determination means determines whether the transported object is an erroneously detected object based on the amount of change in the position of the detected transported object between a plurality of images.
- (Appendix 7) The tracking device according to any one of Appendices 1 to 6, further comprising association means for associating the detected transported object and the tracked transported object with each other based on the detection result of the transported object in the h-th detection target image among the k detection target images and the tracking result of the transported object in the image acquired next after the h-th detection target image among the n images.
- (Appendix 8) The tracking device according to any one of Appendices 1 to 7, further comprising transported object determination means for determining whether a transported object detected in a detection target image acquired before the current image exists, wherein the tracking means performs tracking when the transported object determination means determines that the detected transported object exists.
- (Appendix 9) The tracking device according to any one of Appendices 1 to 8, wherein detection by the detection means and tracking by the tracking means are performed in parallel.
- (Appendix 13) The counting device according to Appendix 11 or 12, further comprising detection count measurement means for measuring the number of detections of the same transported object, wherein, when the number of detections of a transported object is equal to or less than a predetermined number, the counting means does not count the transported object whose number of detections is equal to or less than the predetermined number.
- (Appendix 14) A transported object sorting device comprising: tracking means for tracking a transported object, including a target object, being transported by a transport device; and sorting means for sorting the target object in the tracked transported object, wherein the tracking means is the transported object tracking device according to any one of Appendices 1 to 10.
- (Appendix 16) The tracking method according to Appendix 15, comprising a detection process determination step of determining, after the m-th image among the n images is acquired in the image acquisition step, whether detection is being performed on the l-th detection target image among the k detection target images in the detection step, wherein, when it is determined that detection is not being performed, the detection step detects the transported object using the m-th image as the (l+1)-th detection target image.
- (Appendix 17) The tracking method according to Appendix 15 or 16, wherein, in the detection step, position information of the transported object is acquired, and, in the tracking step, the transported object is tracked by calculating position information of the transported object in the j-th image among the n images, based on the j-th image, the image preceding the j-th image, and the position information of the transported object in that preceding image.
- (Appendix 19) The tracking method according to any one of Appendices 15 to 18, wherein, in the tracking step, position information of the transported object detected in the detection target image is obtained, the method further comprising an erroneous detection determination step of determining whether the transported object is an erroneously detected object based on the position information of the transported object obtained in the tracking step.
- (Appendix 20) The tracking method according to Appendix 19, wherein, in the erroneous detection determination step, whether the transported object is an erroneously detected object is determined based on the amount of change in the position of the detected transported object between a plurality of images.
- (Appendix 21) The tracking method according to any one of Appendices 15 to 20, further comprising an association step of associating the detected transported object and the tracked transported object with each other based on the detection result of the transported object in the h-th detection target image among the k detection target images and the tracking result of the transported object in the image acquired next after the h-th detection target image among the n images.
- (Appendix 22) The tracking method according to any one of Appendices 15 to 21, further comprising a transported object determination step of determining whether a transported object detected in a detection target image acquired before the current image exists, wherein tracking in the tracking step is performed when it is determined in the transported object determination step that the detected transported object exists.
- (Appendix 23) The tracking method according to any one of Appendices 15 to 22, wherein the detection step and the tracking step are performed in parallel.
- (Appendix 24) The tracking method according to any one of Appendices 15 to 23, wherein, in the detection step, the transported object is detected using a learning model capable of detecting the transported object.
- (Appendix 25) A transported object counting method comprising: a tracking step of tracking a transported object being transported by a transport device; and a counting step of counting the tracked transported object, wherein the tracking step is the transported object tracking method according to any one of Appendices 15 to 24.
- (Appendix 26) The counting method according to Appendix 25, wherein the tracking step acquires position information of the transported object detected in the detection target image, and the counting step counts the transported object based on the position information of the transported object.
- (Appendix 27) The counting method according to Appendix 25 or 26, further comprising a detection count measurement step of measuring the number of detections of the same transported object, wherein, when the number of detections of a transported object is equal to or less than a predetermined number, the counting step does not count the transported object whose number of detections is equal to or less than the predetermined number.
- (Appendix 28) A transported object sorting method comprising: a tracking step of tracking a transported object, including a target object, being transported by a transport device; and a sorting step of sorting the target object in the tracked transported object, wherein the tracking step is the transported object tracking method according to any one of Appendices 15 to 24.
- (Appendix 29) A program capable of causing a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; and a tracking process of tracking, in images acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image, wherein k in the detection process is smaller than n.
- (Appendix 30) A program capable of causing a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; a tracking process of tracking, in images acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image; and a counting process of counting the number of the tracked transported objects, wherein k in the detection process is smaller than n.
- (Appendix 31) A program capable of causing a computer to execute: an image acquisition process of acquiring n images over time of a transported object, including a target object, being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; a tracking process of tracking, in images acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image; and a sorting process of sorting the target object in the tracked transported object, wherein k in the detection process is smaller than n.
- (Appendix 32) A computer-readable recording medium on which the program according to any one of Appendices 29 to 31 is recorded.
- (Appendix 33) A transported object tracking system comprising a terminal and a server, wherein
- the terminal and the server are connectable via a communication network outside the system,
- the terminal includes an imaging device,
- the imaging device captures n images over time of a transported object being transported by a transport device,
- the server includes image acquisition means, detection means, and tracking means,
- the image acquisition means acquires the n images captured over time of the transported object being transported by the transport device,
- the detection means detects the transported object in k detection target images among the n images,
- the tracking means tracks the detected transported object across the n images, and
- k in the detection means is smaller than n.
- (Appendix 34) A transported object counting system comprising a terminal and a server, wherein
- the terminal and the server are connectable via a communication network outside the system,
- the terminal includes an imaging device,
- the imaging device captures n images over time of a transported object being transported by a transport device,
- the server includes image acquisition means, detection means, tracking means, and counting means,
- the image acquisition means acquires the n images captured over time of the transported object being transported by the transport device,
- the detection means detects the transported object in k detection target images among the n images,
- the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in each detection target image,
- the counting means counts the number of the tracked transported objects, and
- k in the detection means is smaller than n.
- (Appendix 35) A transported object sorting system comprising a terminal and a server, wherein
- the terminal and the server are connectable via a communication network outside the system,
- the terminal includes an imaging device and a sorting device,
- the imaging device captures n images over time of a transported object, including a target object, being transported by a transport device,
- the sorting device sorts the target object in the tracked transported object,
- the server includes image acquisition means, detection means, tracking means, and sorting means,
- the image acquisition means acquires the n images captured over time of the transported object, including the target object, being transported by the transport device,
- the detection means detects the transported object in k detection target images among the n images,
- the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in each detection target image, and
- k in the detection means is smaller than n.
- According to the present invention, the processing time can be shortened.
- Therefore, for example, a product or the like can be tracked in real time in a factory or the like.
- For this reason, the present invention is extremely useful in the manufacturing industry and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Control Of Conveyors (AREA)
Abstract
The present invention relates to a transported object tracking method and tracking device with which the processing time can be shortened. This transported object tracking device is characterized in that it comprises: image acquisition means for acquiring n images over time of a transported object being transported by a transport device; detection means for detecting the transported object in k detection target images selected from the n images; and tracking means for tracking, in images among the n images acquired after the k-th detection target image, the transported object detected in the k-th detection target image; and in that k in the detection means is smaller than n.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020500259A JP6989178B2 (ja) | 2018-02-14 | 2018-09-14 | 搬送物の追跡装置、搬送物の計数装置、搬送物の追跡方法、搬送物の計数方法、搬送物の追跡システム、および搬送物の計数システム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018024150 | 2018-02-14 | ||
| JP2018-024150 | 2018-11-02 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019159409A1 (fr) | 2019-08-22 |
Family
ID=67619900
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/034186 Ceased WO2019159409A1 (fr) | 2018-02-14 | 2018-09-14 | Dispositif de suivi de marchandises, compteur de marchandises, procédé de suivi de marchandises, procédé de comptage de marchandises, système de suivi de marchandises, système de comptage de marchandises |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6989178B2 (fr) |
| WO (1) | WO2019159409A1 (fr) |
- 2018-09-14 WO PCT/JP2018/034186 patent/WO2019159409A1/fr not_active Ceased
- 2018-09-14 JP JP2020500259A patent/JP6989178B2/ja active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015066227A (ja) * | 2013-09-30 | 2015-04-13 | 富士フイルム株式会社 | 薬剤計数装置及び方法 |
| JP2017109161A (ja) * | 2015-12-15 | 2017-06-22 | ウエノテックス株式会社 | 廃棄物選別システム及びその選別方法 |
| JP2017186106A (ja) * | 2016-04-01 | 2017-10-12 | 株式会社東芝 | 配達支援装置、配達支援システム、及び配達支援プログラム |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7136303B1 (ja) | 2021-09-30 | 2022-09-13 | 日本電気株式会社 | 計数装置、計数方法およびコンピュータプログラム |
| WO2023053473A1 (fr) * | 2021-09-30 | 2023-04-06 | 日本電気株式会社 | Dispositif de comptage, procédé de comptage et support d'enregistrement |
| JP2023050231A (ja) * | 2021-09-30 | 2023-04-11 | 日本電気株式会社 | 計数装置、計数方法およびコンピュータプログラム |
| WO2024013934A1 (fr) * | 2022-07-14 | 2024-01-18 | 株式会社Fuji | Dispositif de transport de substrat et procédé de détection de substrat |
| JP2024105954A (ja) * | 2023-01-26 | 2024-08-07 | ヤンマーホールディングス株式会社 | 品質管理方法、品質管理システム、及び品質管理プログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2019159409A1 (ja) | 2020-12-03 |
| JP6989178B2 (ja) | 2022-01-05 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18906684; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2020500259; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18906684; Country of ref document: EP; Kind code of ref document: A1 |