
WO2019159409A1 - Goods tracker, goods counter, goods-tracking method, goods-counting method, goods-tracking system, and goods-counting system - Google Patents


Info

Publication number
WO2019159409A1
Authority
WO
WIPO (PCT)
Prior art keywords
tracking
detection
image
images
transported
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/034186
Other languages
French (fr)
Japanese (ja)
Inventor
Yuichiro Tajima (田島 裕一郎)
Seiki Sato (佐藤 精基)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Solution Innovators Ltd
Original Assignee
NEC Solution Innovators Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Solution Innovators Ltd filed Critical NEC Solution Innovators Ltd
Priority to JP2020500259A priority Critical patent/JP6989178B2/en
Publication of WO2019159409A1 publication Critical patent/WO2019159409A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G43/00Control devices, e.g. for safety, warning or fault-correcting
    • B65G43/08Control devices operated by article or material being fed, conveyed or discharged
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06MCOUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
    • G06M7/00Counting of objects carried by a conveyor
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments

Definitions

  • the present invention relates to a conveyance tracking device, a conveyance counting device, a conveyance tracking method, a conveyance counting method, a conveyance tracking system, and a conveyance counting system.
  • a product transported on a conveyor such as a belt conveyor is imaged with a camera, and the product is monitored.
  • the present inventors studied a device that acquires images of a transported object over time, detects the transported object in each image, and, by detecting the same transported object across the images, can detect and track the transported object (hereinafter also referred to as the "device of the reference example").
  • in the device of the reference example, since detection is performed for each image, there is a problem that the detection and tracking processing takes time. This problem becomes particularly noticeable when, for example, the transported object is detected using a learning model created by machine learning or deep learning.
  • an object of the present invention is to provide a transported object tracking device and a tracking method capable of reducing the processing time.
  • the transported object tracking device of the present invention includes: image acquisition means for acquiring n images over time of a transported object being transported by a transport device; detection means for detecting the transported object in k detection target images selected from the n images; and tracking means for tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein k in the detection means is smaller than n.
  • the transported object counting device of the present invention includes tracking means for tracking a transported object being transported by a transport device, and counting means for counting the tracked transported objects,
  • the tracking means is the transported object tracking device of the present invention.
  • the transported object tracking method of the present invention includes: an image acquisition step of acquiring n images over time of a transported object being transported by a transport device; a detection step of detecting the transported object in k detection target images selected from the n images; and a tracking step of tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein k in the detection step is smaller than n.
  • the transported object counting method of the present invention includes a tracking step of tracking a transported object being transported by a transport device, and a counting step of counting the tracked transported objects,
  • the tracking step is the transported object tracking method of the present invention.
  • the program of the present invention causes a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; and a tracking process of tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein k in the detection process is smaller than n.
  • the program of the present invention causes a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; a tracking process of tracking, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image; and a counting process of counting the tracked transported objects, wherein k in the detection process is smaller than n.
  • the transported object tracking system of the present invention includes a terminal and a server, and the terminal and the server are connectable via a communication network outside the system,
  • the terminal includes an imaging device, and the imaging device captures n images over time of a transported object transported by a transport device,
  • the server includes image acquisition means, detection means, and tracking means,
  • the image acquisition means acquires n images over time for a transported object transported by the transport device,
  • the detection means detects the transported object in k detection target images selected from the n images,
  • the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, and k in the detection means is smaller than n.
  • the transported object counting system of the present invention includes a terminal and a server, and the terminal and the server are connectable via a communication network outside the system,
  • the terminal includes an imaging device, and the imaging device captures n images over time of a transported object transported by a transport device,
  • the server includes image acquisition means, detection means, tracking means, and counting means,
  • the image acquisition means acquires n images over time for a transported object transported by the transport device,
  • the detection means detects the transported object in k detection target images selected from the n images,
  • the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image,
  • the counting means counts the number of the tracked transported objects, and k in the detection means is smaller than n.
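As an illustration of the counting means described above, the following sketch counts tracked transported objects. The function name and the (track_id, x, y) tuple format for per-image tracking results are assumptions made for this example, not details taken from the specification:

```python
def count_transported(tracking_results):
    """Count tracked transported objects by their assigned track IDs.

    Each per-image result is a list of (track_id, x, y) tuples; an
    object is counted once, however many images it appears in.
    """
    seen = set()
    for frame in tracking_results:
        for track_id, _x, _y in frame:
            seen.add(track_id)
    return len(seen)

# Hypothetical tracking results over three images:
frames = [
    [(1, 10, 5)],                # object 1 detected
    [(1, 11, 5), (2, 30, 5)],    # object 1 tracked, object 2 detected
    [(1, 12, 5), (2, 31, 5)],    # both tracked
]
# count_transported(frames) -> 2
```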
  • the processing time can be shortened.
  • FIG. 1 is a block diagram illustrating a tracking apparatus according to the first embodiment.
  • FIG. 2 is a block diagram illustrating an example of a hardware configuration of the tracking device according to the first embodiment.
  • FIG. 3 is a flowchart illustrating the tracking method and program according to the first embodiment.
  • FIG. 4 is a block diagram illustrating another example of the tracking device according to the first embodiment.
  • FIG. 5 is a block diagram illustrating another example of the tracking device according to the first embodiment.
  • FIG. 6 is a block diagram illustrating another example of the tracking apparatus according to the second embodiment.
  • FIG. 7 is a flowchart illustrating the tracking method and program according to the second embodiment.
  • FIG. 8 is a flowchart illustrating the tracking method and program according to the second embodiment.
  • FIG. 9 is a diagram illustrating an image acquired by the tracking device.
  • FIG. 10 is a diagram illustrating a timeline of a part of processing in the tracking method according to the second embodiment.
  • FIG. 11 is a block diagram illustrating a counting device according to the third embodiment.
  • FIG. 12 is a flowchart illustrating a counting method and program according to the third embodiment.
  • FIG. 13 is a block diagram illustrating a sorting apparatus according to the fourth embodiment.
  • FIG. 14 is a flowchart illustrating a selection method and program according to the fourth embodiment.
  • FIG. 15 is a block diagram illustrating a tracking system according to the fifth embodiment.
  • FIG. 16 is a block diagram illustrating a sorting system according to the fifth embodiment.
  • Embodiment 1 relates to a tracking device and a tracking method of the present invention.
  • FIG. 1 shows a block diagram of the tracking device in the present embodiment.
  • the tracking device 10 of this embodiment includes an image acquisition unit 111, a detection unit 112, and a tracking unit 113.
  • the image acquisition unit 111, the detection unit 112, and the tracking unit 113 may be, for example, incorporated in a data processing unit (data processing apparatus) 11 that is hardware, or may be hardware in which software for these units is incorporated.
  • the data processing unit 11 may include a CPU or the like.
  • the data processing unit 11 may include, for example, a ROM, a RAM, and the like which will be described later.
  • FIG. 2 illustrates a block diagram of the hardware configuration of the tracking device 10.
  • the tracking device 10 includes, for example, a CPU (Central Processing Unit) 201, a memory 202, a bus 203, a storage device 204, an input device 206, a display 207, a communication device 208, and the like. Each part of the tracking device 10 is connected via a bus 203 by a respective interface (I / F).
  • the hardware configuration of the tracking device 10 can be employed as a hardware configuration of a counting device and a sorting device, which will be described later, for example.
  • the CPU 201 operates in cooperation with other components by a controller (system controller, I / O controller, etc.), for example, and takes charge of overall control of the tracking device 10.
  • the program 205 and other programs of the present invention are executed by the CPU 201, and various information is read and written.
  • the CPU 201 functions as the image acquisition unit 111, the detection unit 112, and the tracking unit 113.
  • although the tracking device 10 includes a CPU as an arithmetic device, the tracking device 10 may include another arithmetic device such as a GPU (Graphics Processing Unit) or an APU (Accelerated Processing Unit), or a combination of the CPU and these.
  • the CPU 201 functions as each unit other than the storage unit in Embodiments 2 to 4 and Modifications 1 to 5 described later, for example.
  • the memory 202 includes, for example, a main memory.
  • the main memory is also called a main storage device.
  • the memory 202 reads various operation programs such as the program 205 of the present invention stored in a storage device 204 (auxiliary storage device) described later.
  • the CPU 201 reads data from the memory 202, decodes it, and executes the program.
  • the main memory is, for example, a RAM (Random Access Memory).
  • the memory 202 further includes a ROM (read only memory), for example.
  • the bus 203 can be connected to an external device, for example.
  • examples of the external device include an external storage device (such as an external database), a printer, and the like.
  • the tracking device 10 can be connected to a communication line network by the communication device 208 connected to the bus 203, and can also be connected to the external device via the communication line network.
  • the tracking device 10 can also be connected to a terminal or the like via the communication device 208 and a communication line network.
  • the storage device 204 is also referred to as a so-called auxiliary storage device for the main memory (main storage device), for example.
  • the storage device 204 stores an operation program including the program 205 of the present invention.
  • the storage device 204 includes, for example, a storage medium and a drive that reads from and writes to the storage medium.
  • the storage medium is not particularly limited, and may be, for example, a built-in type or an external type. Examples of the storage medium include an HD (hard disk), an FD (floppy (registered trademark) disk), a CD-ROM, a CD-R, a CD-RW, an MO, a DVD, a flash memory, and a memory card. The drive is not particularly limited.
  • the storage device 204 may be, for example, a hard disk drive (HDD) in which the storage medium and the drive are integrated.
  • the tracking device 10 further includes an input device 206 and a display 207, for example.
  • examples of the input device 206 include pointing devices such as a touch panel, a track pad, and a mouse; a keyboard; imaging means such as a camera and a scanner; card readers such as an IC card reader and a magnetic card reader; and voice input means such as a microphone.
  • Examples of the display 207 include display devices such as an LED display and a liquid crystal display.
  • the input device 206 and the display 207 are configured separately, but the input device 206 and the display 207 may be configured as a single unit like a touch panel display.
  • the image acquisition unit 111 is electrically connected to the detection unit 112 and the tracking unit 113.
  • the image acquisition unit 111 acquires n images over time with respect to the conveyed item being conveyed by the conveying device.
  • An example of the image acquisition unit 111 is a CPU.
  • the image acquisition unit 111 acquires the n images over time from, for example, an imaging unit (imaging device) inside or outside the device, or a storage unit inside or outside the device.
  • the image acquisition unit 111 acquires n images according to the order in which the images are captured.
  • the image acquisition unit 111 may capture the n images over time.
  • the image acquisition unit 111 is, for example, an imaging unit (imaging device) that captures the image.
  • examples of the imaging means include a still camera, a video camera, a camera-equipped mobile phone, a camera-equipped smartphone, a camera-equipped mobile terminal such as a camera-equipped tablet terminal, a computer with a webcam, and a head-mounted display with a camera.
  • Examples of the storage means include random access memory (RAM), read only memory (ROM), flash memory, hard disk (HD), optical disk, floppy disk (FD), and the like.
  • the storage means may be a built-in device or an external device such as an external storage device.
  • the image can be acquired, for example, by capturing an image with the imaging unit when the transport device is transporting a transported object.
  • the images may include, for example, only images including the transported object, only images not including the transported object, or both.
  • the image is, for example, an image including all or part of the transport device.
  • the image may be, for example, an image in which a certain portion of a transport device that transports the transported object is captured, or may be an image in which different locations are captured. That is, the image may be, for example, an image obtained by imaging a certain part of a route (conveyance route) along which the conveyed product is conveyed, or may be an image obtained by imaging a different part.
  • each image is captured such that the captured regions partially overlap, for example.
  • Examples of the transfer device include a contact type or non-contact type transfer device.
  • Examples of the transport device include a conveyor such as a belt conveyor, a chute, and a carriage.
  • the transported object is, for example, an object transported by the transport device.
  • the transported object is not particularly limited and can be any object.
  • Specific examples of the transported material include raw materials, work-in-process, semi-finished products, products, merchandise, and stored items.
  • the tracked transported object may be all or a part of the transported object.
  • the tracked transported object may be, for example, a target object (an object to be tracked).
  • when the transported object includes a target object, the target object may be of one type or a plurality of types. Examples of the target object include non-defective products and defective products.
  • the frequency of acquiring the images is not particularly limited; the lower limit is, for example, 3 FPS (Frames Per Second), preferably 12 FPS, more preferably 20 FPS, and the upper limit is not particularly limited.
  • the frequency ranges are, for example, 10 to 100 FPS, 10 to 20 FPS, and 60 to 100 FPS.
  • n is an arbitrary positive integer of 2 or more, and the upper limit is not particularly limited.
  • the “n images” is, for example, 2 images (frames) or more, preferably 3 images or more, and more preferably 5 images or more.
  • the detection means 112 is electrically connected to the image acquisition means 111 and the tracking means 113.
  • An example of the detection unit 112 is a CPU.
  • the detection unit 112 detects the transported object for k detection target images selected from the n images acquired by the image acquisition unit 111.
  • the detection unit 112 detects the transported object for the k detection target images in the order acquired by the image acquisition unit 111.
  • the detection target image is an image for detecting the transported object, for example.
  • the detection target image can be acquired by selecting from the n images. The method for selecting the detection target image will be described later.
  • the detection target image may be used for tracking by the tracking unit 113 described later.
  • the “k” only needs to be smaller than n; more specifically, k is any positive integer smaller than n. That is, k is an arbitrary integer that satisfies 1 ≤ k < n.
  • the “k images” only need to satisfy the numerical range described above, and are appropriately changed according to the selection method of the detection target images, for example.
  • the tracking unit 113 is electrically connected to the image acquisition unit 111 and the detection unit 112.
  • An example of the tracking unit 113 is a CPU.
  • the tracking unit 113 tracks, in the images acquired after each detection target image among the n images acquired by the image acquisition unit 111, the transported object detected in that detection target image (hereinafter also referred to as the “detected transported object”). For example, the tracking unit 113 tracks the detected transported object in the images in the order in which they were acquired by the image acquisition unit 111.
  • FIG. 3 shows a flowchart of the tracking method in the present embodiment.
  • the tracking method of this embodiment is implemented as follows, for example using the tracking apparatus 10 of FIG.
  • the tracking method of the present embodiment includes an S1 step (image acquisition), an S2 step (detection), and an S3 step (tracking).
  • steps S2 and S3 may be performed in parallel or in series.
  • n images are acquired by the image acquisition unit 111 over time with respect to the transported object transported by the transport device.
  • the imaging unit captures a transport route, and the image acquisition unit 111 acquires the captured image as the image.
  • when the image acquisition unit 111 is an imaging unit, the image acquisition unit 111 captures the transport route and acquires the captured image as the image.
  • alternatively, the image acquisition unit 111 may read and acquire the image stored in a data storage unit (not shown).
  • the detection unit 112 detects the transported object for k detection target images selected from the n images. That is, in step S2, the transported object is detected for some of the n images. Specifically, first, in step S2, the detection target image is selected.
  • the detection target image may be selected by, for example, the detection unit 112 or may be selected by another unit.
  • the method of selecting the detection target images is not particularly limited. For example, among the n images, every predetermined number of images may be selected as a detection target image, or an image at every predetermined time may be selected as a detection target image. The predetermined number and the predetermined time can be set appropriately according to, for example, the detection time the detection unit 112 requires per detection target image.
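The "every predetermined number of images" selection rule just described can be sketched as follows; the function name and the interval value are illustrative assumptions, not taken from the specification:

```python
def select_detection_targets(n, interval):
    """Select detection target images from n images by taking every
    `interval`-th image (a 'predetermined number' selection rule).

    Returns the 0-based indices of the k detection target images;
    k is smaller than n whenever interval > 1 and n > 1.
    """
    return list(range(0, n, interval))

# Example: n = 10 images, detect on every 3rd image.
targets = select_detection_targets(10, 3)
# targets -> [0, 3, 6, 9], i.e. k = 4 < n = 10
```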
  • the detection unit 112 detects, for example, position information of each transported object in the detection target image, such as its coordinates (for example, center coordinates) and its size (for example, width and length, or area).
  • the detection unit 112 may output the detected position information of the transported object as a detection result (detection information).
  • the detection result may include, for example, the number of detected transported objects and the certainty factor of each detection.
  • the certainty factor is, for example, the probability (possibility) that the transported object detected by the detection unit 112 is a transported object.
  • the detection unit 112 may detect an object in the transported object.
  • the detection unit 112 may detect the object by detecting a conveyance object other than the object in the conveyance object.
  • the detection unit 112 can detect the transport object or the target object using, for example, color-based identification, contour extraction, template matching, a learning model that can detect the transport object or the target object, and the like.
  • the learning model can be created, for example, by performing machine learning or deep learning on the transported object or object.
  • the detection unit 112 may classify, for example, to which class the detected target object corresponds.
  • the detection means 112 may output the classification class of the detected target object as a detection result, for example.
  • in step S3, the tracking unit 113 tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image. That is, when the detection unit has detected a transported object in the p-th detection target image among the k detection target images, the tracking unit 113 tracks, in the images acquired (captured) after the p-th detection target image among the n images, the transported object detected in the p-th detection target image. The tracking unit 113 tracks the transported object detected in the p-th detection target image, for example, in all or part of the images acquired after the p-th detection target image.
  • the “p” is a positive integer equal to or less than k.
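The flow of steps S2 and S3 (detect on selected images, track on the rest) can be sketched as follows. The `detect` and `track` helpers, and the constant-velocity toy tracker standing in for a real tracking computation, are hypothetical names introduced only for this illustration:

```python
def run_tracking(images, is_detection_target, detect, track):
    """Detect transported objects only on detection target images (step S2)
    and track them through every other image (step S3)."""
    tracked = []   # current positions of detected transported objects
    results = []   # per-image positions
    for i, image in enumerate(images):
        if is_detection_target(i):
            tracked = detect(image)                           # full detection (slow)
        else:
            tracked = [track(image, pos) for pos in tracked]  # tracking only (fast)
        results.append(list(tracked))
    return results

# Toy example: each "image" is just a frame index; one object moves +1 px/frame.
detect = lambda image: [(10 + image, 5)]           # hypothetical detector
track = lambda image, pos: (pos[0] + 1, pos[1])    # constant-velocity stand-in
out = run_tracking(range(6), lambda i: i % 3 == 0, detect, track)
# detection runs only on images 0 and 3; images 1, 2, 4, 5 are tracking-only
```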
  • the transported object can be tracked by optical flow estimation such as the Lucas-Kanade method or the Horn-Schunck method.
  • the tracking of the transported object can be performed, for example, by calculating, by optical flow estimation, position information such as the coordinates (for example, center coordinates) of the detected transported object in the image used for tracking, from position information such as the coordinates of the transported object detected in the image acquired immediately before the image used for tracking.
  • the tracking unit 113 may output the position information of the conveyed object detected in the image used for the tracking as a tracking result (tracking information). If the detection unit 112 outputs the detection result, in step S3, the tracking unit 113 may associate the detection result and the tracking result with respect to the corresponding transported object.
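The Lucas-Kanade and Horn-Schunck methods named above require image gradients. As a self-contained illustration of the same role (updating a detected object's coordinates between consecutive images without re-running detection), the following sketch estimates the object's displacement by exhaustive block matching; this is a simplification introduced for the example, not the method of the specification:

```python
def match_displacement(prev, curr, top, left, h, w, search=3):
    """Estimate the (dy, dx) displacement of the h-by-w block at
    (top, left) in `prev` by exhaustive search in `curr`.

    `prev` and `curr` are 2D lists of pixel intensities; the best
    match minimizes the sum of absolute differences.
    """
    block = [row[left:left + w] for row in prev[top:top + h]]
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            t, l = top + dy, left + dx
            if t < 0 or l < 0 or t + h > len(curr) or l + w > len(curr[0]):
                continue  # candidate block would fall outside the image
            cand = [row[l:l + w] for row in curr[t:t + h]]
            cost = sum(abs(a - b) for br, cr in zip(block, cand)
                       for a, b in zip(br, cr))
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best

# A 2x2 bright block moves one pixel to the right between frames.
prev = [[0] * 6 for _ in range(6)]
curr = [[0] * 6 for _ in range(6)]
for r in (2, 3):
    prev[r][2] = prev[r][3] = 9
    curr[r][3] = curr[r][4] = 9
# match_displacement(prev, curr, 2, 2, 2, 2) -> (0, 1)
```

Adding the estimated displacement to the previously detected coordinates yields the object's position in the new image, which is the update the tracking unit needs per frame.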
  • the transported object is detected in k detection target images selected from the n images. That is, in the tracking device and the tracking method according to the first embodiment, the transported object is detected not in all the images of the n images but in a part of the images. For this reason, the tracking device and the tracking method of the first embodiment can shorten the processing time as compared with the device of the reference example that detects the transported object in all images.
  • in FIG. 1, the tracking device 10 is configured by the data processing means 11 alone, but the tracking device may include other components.
  • the tracking device of the present invention may include data storage means.
  • FIG. 4 is a block diagram illustrating another example of the tracking device according to the first embodiment.
  • the tracking device 20 includes an image storage unit 121, a detection information storage unit 122, and a tracking information storage unit 123 in addition to the configuration of the tracking device 10.
  • the image storage unit 121 is electrically connected to the image acquisition unit 111, the detection unit 112, and the tracking unit 113,
  • the detection information storage unit 122 is electrically connected to the detection unit 112 and the tracking unit 113, and
  • the tracking information storage unit 123 is electrically connected to the tracking unit 113.
  • the image storage unit 121, the detection information storage unit 122, and the tracking information storage unit 123 may be incorporated in the data storage unit 12 that is hardware, for example, as illustrated in FIG.
  • Examples of the data storage means 12 include the storage means described above, and specific examples include ROM, RAM, and the like.
  • the tracking device 20 stores n images acquired by the image acquisition unit 111 in the image storage unit 121, and outputs the stored images to the detection unit 112 and the tracking unit 113. Further, the detection result of the detection unit 112 and the detection target image are stored in the detection information storage unit 122, and the stored detection result is output to the tracking unit 113. Furthermore, the tracking result of the tracking unit 113 is stored in the tracking information storage unit 123. Except for these points, the tracking device 20 has the same configuration as that of the tracking device 10, and the description thereof can be used.
  • the tracking device of the present invention may include at least one of input means and output means.
  • FIG. 5 is a block diagram illustrating another example of the tracking device according to the first embodiment.
  • the tracking device 30 includes an input unit 13 and an output unit 14 in addition to the configuration of the tracking device 10.
  • the input means 13 is electrically connected to the image acquisition means 111
  • the output means 14 is electrically connected to the image acquisition means 111, the detection means 112, and the tracking means 113, respectively.
  • the input unit 13 inputs information such as start and stop of image acquisition, for example.
  • as the input means 13, for example, ordinary input means provided in a portable terminal, such as a touch-panel monitor and operation keys, ordinary input means provided in a computer, such as a keyboard and a mouse, an input file, another computer, and the like can be used.
  • the output unit 14 outputs, for example, n images acquired by the image acquisition unit 111, detection results of the detection unit 112, k detection target images, tracking results of the tracking unit 113, and the like.
  • examples of the output means 14 include display means such as a monitor that outputs video (for example, various image display devices such as a liquid crystal display (LCD) and a cathode ray tube (CRT) display), a printer that outputs by printing, and a speaker that outputs sound.
  • the output unit 14 may display the n images, the detection result, the detection target image, and the tracking result on the display unit.
  • the input unit 13 and the output unit 14 may be electrically connected to the data processing unit 11 via, for example, an I / O interface.
  • the tracking device of the present invention may further include, for example, a video codec and a controller (system controller, I / O controller, etc.). Except for these points, the tracking device 30 has the same configuration as that of the tracking device 10, and the description thereof can be used.
  • Modification 1 relates to the tracking device and the tracking method of the present invention.
  • for the tracking device and the tracking method according to Modification 1, the description of the tracking device and the tracking method according to the first embodiment can be referred to, for example.
  • the tracking device of Modification 1 includes a detection processing determination unit that determines, after the image acquisition unit acquires the m-th image among the n images, whether the detection unit is performing detection on the l-th detection target image among the k detection target images. In this case, for example, when the detection processing determination unit determines that the detection unit is not performing detection, the detection unit detects the transported object using the m-th image as the (l+1)-th detection target image.
  • the “m” is a positive integer equal to or less than n. That is, m is an integer that satisfies 1 ≤ m ≤ n.
  • the “l” is a positive integer equal to or less than k - 1. That is, l is an integer that satisfies 1 ≤ l ≤ k - 1.
  • the detection processing determination unit is electrically connected to, for example, the image acquisition unit and the detection unit.
  • An example of the detection process determination means is a CPU.
  • the tracking method of Modification 1 includes a detection process determination step of determining, after the acquisition of the m-th image among the n images in the image acquisition step, whether detection is being performed on the l-th detection target image among the k detection target images in the detection step.
  • when it is determined in the detection process determination step that detection is not being performed, the transported object is detected using the m-th image as the (l+1)-th detection target image.
  • The tracking method of Modification 1 can be implemented using the tracking device of Modification 1, for example. Specifically, first, the m-th image is acquired by the image acquisition means (image acquisition step). Next, the detection process determination means determines whether the detection means is performing detection on the l-th detection target image (detection process determination step), for example by checking the operating status of the CPU that performs the detection process. When the detection process determination means determines that the detection means is not performing detection, the detection means selects the m-th image as the (l+1)-th detection target image and detects the transported object (detection step).
  • When the detection process determination means determines that detection by the detection means is being performed, detection by the detection means is not performed on the m-th image.
  • In this case, the already detected transported object may be tracked in the m-th image by the tracking unit.
  • the tracking device and the tracking method of Modification 1 can track the detected transported object even for an image that is not subjected to the detection process, for example, so that the tracking accuracy can be further improved.
  • The detection processing determination unit may further determine whether the association unit is performing association between the detection result of the l-th detection target image and the tracking result of the image that, among the n images, follows the l-th detection target image. In this case, when the detection processing determination unit determines that neither detection by the detection unit nor association by the association unit is being performed, the detection unit preferably selects the m-th image as the (l+1)-th detection target image and detects the transported object.
  • Similarly, the detection process determination step may determine whether the association step is performing association between the detection result of the l-th detection target image and the tracking result of the image that, among the n images, follows the l-th detection target image. When the detection process determination step determines that neither detection by the detection step nor association by the association step is being performed, the detection step preferably selects the m-th image as the (l+1)-th detection target image and detects the transported object.
  • In the tracking device and the tracking method of Modification 1, it is determined whether the detection process (or the association process) is being performed; if it is not, the detection process can be performed on the newly acquired m-th image. This prevents a newly acquired image from waiting idle without being subjected to the detection process. Therefore, according to the tracking device and the tracking method of Modification 1, the processing time can be further shortened.
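  • The gating described in Modification 1, where a newly acquired image becomes a detection target only while no detection is in progress, can be sketched as follows. This is a minimal, illustrative Python sketch rather than the patented implementation; the class and method names are assumptions.

```python
import threading

class DetectionGate:
    """Sketch of the detection processing determination unit of Modification 1.

    A newly acquired image is selected as the next detection target image
    only when no detection is currently in progress; otherwise it is passed
    on to tracking without detection.
    """

    def __init__(self):
        self._busy = threading.Event()  # set while detection is running
        self.detection_targets = []     # indices of images chosen for detection

    def on_image_acquired(self, m):
        # detection process determination step: is detection running?
        if self._busy.is_set():
            return False                # detection is skipped for image m
        self._busy.set()                # mark detection as in progress
        self.detection_targets.append(m)
        return True                     # image m is the (l+1)-th detection target

    def on_detection_finished(self):
        self._busy.clear()

gate = DetectionGate()
gate.on_image_acquired(1)            # image 1 starts detection
skipped = gate.on_image_acquired(2)  # detection still running, so skipped
gate.on_detection_finished()
gate.on_image_acquired(3)            # detection idle again, new target
print(gate.detection_targets)        # [1, 3]
```

  • In a real device the detection itself would run in a separate thread; here only the determination logic is shown.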
  • Modification 2 relates to the tracking device and the tracking method of the present invention.
  • the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the second modification.
  • In the tracking device of Modification 2, the detection unit acquires the position information of the transported object, and the tracking unit tracks the transported object by calculating its position information in the j-th of the n images based on the j-th image, the images up to the (j-1)-th, and the position information of the transported object in the images up to the (j-1)-th.
  • The "j" is a positive integer of 2 or more and n or less; that is, j is an integer satisfying 2 ≤ j ≤ n.
  • In the tracking method of Modification 2, the position information of the transported object is acquired in the detection step, and in the tracking step the transported object is tracked by calculating its position information in the j-th of the n images based on the j-th image, the images up to the (j-1)-th, and the position information of the transported object in the images up to the (j-1)-th.
  • The tracking method of Modification 2 can be implemented by the tracking device of Modification 2, for example. Specifically, first, the first to (j-1)-th images are processed in the same manner as in Embodiment 1. In Modification 2, when the detection unit detects the transported object, the position information of the transported object in the detection target image is acquired (detection step). Next, the j-th image is acquired in the same manner as in Embodiment 1. When the j-th image is not a detection target image, detection by the detection unit is not performed. On the other hand, when the j-th image is a detection target image, it is selected as the i-th detection target image, detection is performed by the detection unit, and the position information of the transported object in the i-th detection target image is acquired (detection step).
  • Next, the detected transported object is tracked in the j-th image (tracking step). Specifically, the tracking unit acquires the j-th image, the images up to the (j-1)-th, and the position information of the transported object in the images up to the (j-1)-th, and calculates the position information of the transported object in the j-th image based on them, for example by optical flow estimation.
  • When the j-th image is the i-th detection target image, the tracking unit preferably tracks the transported object by calculating its position information in the j-th image based on the j-th image, the detection target images up to the (i-1)-th, and the position information of the transported object in those detection target images. The detection target images up to the (i-1)-th preferably include the (i-1)-th detection target image. Thereby, a transported object newly detected in the (i-1)-th detection target image can also be tracked, so missed tracking can be suppressed and the tracking accuracy can be further improved.
  • the position information of the conveyed product in the detection step and the tracking step is preferably coordinates or center coordinates of the conveyed product.
  • The "i" is an integer of 2 or more and k or less; that is, i is an integer satisfying 2 ≤ i ≤ k.
  • In Modification 2, the position information of the transported object in the j-th image is calculated based on the j-th image, the images up to the (j-1)-th, and the position information of the transported object in the images up to the (j-1)-th, so the tracking processing load can be reduced. Therefore, according to the tracking device and the tracking method of Modification 2, the processing time can be further shortened.
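  • As an illustration of Modification 2, the sketch below estimates the position of a transported object in the j-th image from the (j-1)-th image and the object's previous position. Plain block matching over 2-D lists of pixel intensities is used here as a dependency-free stand-in for the optical flow estimation mentioned above; the function name, patch size, and search radius are assumptions.

```python
def track_position(prev_frame, cur_frame, prev_xy, patch=1, search=2):
    """Estimate the object's center in cur_frame from its center in
    prev_frame by matching the surrounding patch within a small search
    window (a crude stand-in for optical flow estimation)."""
    px, py = prev_xy
    h, w = len(prev_frame), len(prev_frame[0])

    def sad(cx, cy):
        # sum of absolute differences between the patch around (px, py)
        # in prev_frame and the patch around (cx, cy) in cur_frame
        total = 0
        for dy in range(-patch, patch + 1):
            for dx in range(-patch, patch + 1):
                total += abs(prev_frame[py + dy][px + dx]
                             - cur_frame[cy + dy][cx + dx])
        return total

    best, best_xy = None, prev_xy
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cx, cy = px + dx, py + dy
            if patch <= cx < w - patch and patch <= cy < h - patch:
                cost = sad(cx, cy)
                if best is None or cost < best:
                    best, best_xy = cost, (cx, cy)
    return best_xy

# A bright 3x3 blob centered at (3, 3) moves one pixel to the right.
prev = [[0] * 8 for _ in range(8)]
cur = [[0] * 8 for _ in range(8)]
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        prev[3 + dy][3 + dx] = 9
        cur[3 + dy][4 + dx] = 9

print(track_position(prev, cur, (3, 3)))  # -> (4, 3)
```

  • In practice a pyramidal optical flow routine would replace the exhaustive search, but the data flow (previous image, current image, previous position in, current position out) is the same.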
  • Modification 3 relates to the tracking device and the tracking method of the present invention.
  • the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the third modification.
  • The tracking device of Modification 3 further includes an erroneous detection determination means for determining whether the transported object is an erroneously detected object, based on the position information of the transported object detected in the detection target image and the position information of the transported object obtained by the tracking unit.
  • the erroneous detection determination means is electrically connected to, for example, the detection means and the tracking means.
  • An example of the erroneous detection determination means is a CPU.
  • the position information of the detected transported object is preferably, for example, coordinates or center coordinates of the transported object.
  • the position information of the conveyed product obtained by the tracking means is preferably the coordinates or center coordinates of the conveyed item.
  • The tracking method of Modification 3 can be implemented by the tracking device of Modification 3, for example. Specifically, first, the detection step and the tracking step are performed in the same manner as in Embodiment 1. In Modification 3, for the transported object detected in the detection target image, the position information of the transported object in the image used for tracking (for example, the j-th image in Modification 2) is obtained by calculation. Next, based on the position information of the transported object obtained in the tracking step, the erroneous detection determination means determines whether the transported object is an erroneously detected object (erroneous detection determination step).
  • the determination of the erroneously detected object can be performed based on, for example, the amount of change in the position of the detected conveyed object between a plurality of images.
  • When the amount of change in the position (for example, the coordinates or center coordinates) of the detected transported object over a predetermined number of images is equal to or less than a predetermined numerical value, the detected transported object can be determined to be an erroneously detected object.
  • Alternatively, when the difference between the average movement vector of the transported objects calculated from the respective images and the movement vector of a detected transported object between the images is equal to or greater than a predetermined value, that detected transported object may be determined to be an erroneously detected object.
  • Examples of the predetermined numerical value include 10 pixels, 5 pixels, and 1 pixel.
  • Thereby, Modification 3 can detect the erroneously detected object with high accuracy.
  • the predetermined number of images is, for example, 1 to 10 frames, 1 to 5 frames, and 1 to 3 frames.
  • The predetermined numerical value can be set as appropriate according to, for example, the size of the transported object captured in the image and the length of the transport route. The transported object determined to be an erroneously detected object, and the information associated with it, are preferably deleted from the detection result and the tracking result.
  • the deletion is performed, for example, prior to detection of a conveyed product in a new detection target image (next detection target image).
  • When the deletion is not performed together with the erroneous detection determination unit or the erroneous detection determination step, a deletion flag may instead be assigned, in the erroneous detection determination unit or the erroneous detection determination step, to the transported object in the detection result and the tracking result and to the information associated with it.
  • the processing time can be further shortened.
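  • The erroneous-detection rule of Modification 3, flagging an object whose position barely changes over a predetermined number of images, can be sketched as follows. The threshold of 5 pixels over 3 frames merely picks one of the example values given above; the function names are illustrative.

```python
def change_amount(trajectory):
    """Largest per-frame displacement (in pixels, Chebyshev distance)
    between consecutive center coordinates in the trajectory."""
    return max(
        (max(abs(x2 - x1), abs(y2 - y1))
         for (x1, y1), (x2, y2) in zip(trajectory, trajectory[1:])),
        default=0,
    )

def is_false_detection(trajectory, frames=3, threshold=5):
    """Sketch of the erroneous detection determination unit: flag an
    object whose center moved at most `threshold` pixels per frame over
    the last `frames` transitions (e.g. a stain on the conveyor belt
    misdetected as a transported object)."""
    if len(trajectory) < frames + 1:
        return False                      # not enough history yet
    return change_amount(trajectory[-(frames + 1):]) <= threshold

moving = [(10, 50), (40, 50), (70, 50), (100, 50)]     # real item on the belt
static = [(200, 80), (201, 80), (200, 81), (200, 80)]  # stain: flagged

print(is_false_detection(moving), is_false_detection(static))  # False True
```

  • An object flagged in this way would receive the deletion flag and be removed from the detection result and the tracking result at the next update.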
  • Modification 4 relates to the tracking device and the tracking method of the present invention.
  • the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the fourth modification.
  • The tracking device of Modification 4 further includes an association means for associating, as the same transported object, the detected transported object and the tracked transported object, based on the detection result of the transported object in the h-th of the k detection target images and the tracking result of the transported object in the image acquired, among the n images, next after the h-th detection target image.
  • the association means is electrically connected to, for example, the detection means and the tracking means.
  • An example of the association means is a CPU.
  • The "h" is a positive integer of k or less; that is, h is an integer satisfying 1 ≤ h ≤ k.
  • The tracking method of Modification 4 includes an association step of associating, as the same transported object, the detected transported object and the tracked transported object, based on the detection result of the transported object in the h-th of the k detection target images and the tracking result of the transported object in the image acquired, among the n images, next after the h-th detection target image.
  • the tracking method of Modification 4 can be implemented by the tracking device of Modification 4, for example.
  • the detection result of the conveyed product in the h-th detection target image is acquired by the association unit.
  • a tracking result of the conveyed object in the image acquired next to the h-th detection target image among the n images is acquired by the association unit.
  • The association means determines whether the same transported object exists, for example, by calculating the overlap of the regions of the transported objects in the detection result and the tracking result.
  • The association means can then associate the transported objects determined to be the same.
  • Specifically, the association means first calculates the overlap rate of the regions for all combinations of a transported object in the detection result and a transported object in the tracking result. Where A is the region of the transported object in the tracking result and B is the region of the transported object in the detection result, the overlap rate is, for example, the ratio (A∩B/A) of the overlapping area within the region A of the tracking result, the ratio (A∩B/B) of the overlapping area within the region B of the detection result, or the ratio (A∩B/A∪B) of the overlapping area to the combined region, where:
  • A∪B: the combined region of the tracking-result transported object region and the detection-result transported object region
  • A∩B: the area where the tracking-result transported object region and the detection-result transported object region overlap
  • the association unit calculates, for example, a distance between the coordinates (for example, center coordinates) of each transported object in the detection result and the coordinates (for example, center coordinates) of each transported object in the tracking result. It may be determined whether the same transported object exists. In this case, for example, the association unit repeatedly performs association that associates the same transport object in order from the combination with the shortest distance.
  • In this way, the same transported object in the detection result and the tracking result can be associated. For this reason, according to the tracking device and the tracking method of Modification 4, the same transported object is not tracked as a different transported object, and the processing time can be further shortened.
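  • A minimal sketch of the association of Modification 4 follows, computing the overlap rates discussed above for regions given as center coordinates plus vertical and horizontal sizes, and greedily pairing each detection with the most-overlapping tracked object. The box format, the 0.5 threshold, and all names are assumptions for illustration.

```python
def box_edges(cx, cy, w, h):
    # convert center coordinates plus width/height to edge coordinates
    return cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2

def overlap_rates(track_box, det_box):
    """Overlap of a tracking-result region A and a detection-result
    region B, as the ratios (A∩B)/A, (A∩B)/B and (A∩B)/(A∪B)."""
    ax1, ay1, ax2, ay2 = box_edges(*track_box)
    bx1, by1, bx2, by2 = box_edges(*det_box)
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    a = (ax2 - ax1) * (ay2 - ay1)
    b = (bx2 - bx1) * (by2 - by1)
    union = a + b - inter
    return inter / a, inter / b, inter / union

def associate(tracked, detected, min_rate=0.5):
    """Greedily pair each detection with the tracked object whose region
    overlaps it most (by (A∩B)/(A∪B)); unmatched detections are treated
    as newly appearing transported objects."""
    pairs, used = {}, set()
    for d_id, d_box in detected.items():
        best_id, best = None, min_rate
        for t_id, t_box in tracked.items():
            if t_id in used:
                continue
            rate = overlap_rates(t_box, d_box)[2]
            if rate > best:
                best_id, best = t_id, rate
        if best_id is not None:
            pairs[d_id] = best_id
            used.add(best_id)
    return pairs

tracked = {"1a": (50, 50, 20, 20), "2a": (120, 40, 20, 20)}
detected = {"d0": (52, 50, 20, 20),   # same object as 1a, shifted 2 px
            "d1": (200, 90, 20, 20)}  # new object entering the belt
print(associate(tracked, detected))   # {'d0': '1a'}
```

  • The distance-based variant mentioned below would simply replace the overlap rate with the distance between center coordinates and pick the shortest distance first.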
  • Modification 5 relates to the tracking device and the tracking method of the present invention.
  • the description of the tracking device and the tracking method of the first embodiment can be cited as the tracking device and the tracking method of the modification 5.
  • The tracking device of Modification 5 includes, in addition to the configuration of the tracking device of Embodiment 1, a transported object determination unit that determines whether a transported object detected in a detection target image acquired before the current image exists. When the transported object determination unit determines that a detected transported object exists, the tracking means performs tracking.
  • the transported object determining means is electrically connected to, for example, the detecting means and the tracking means.
  • An example of the conveyed product determination means is a CPU.
  • The tracking method of Modification 5 includes a transported object determination step of determining whether a transported object detected in a detection target image acquired before the current image exists. When it is determined in the transported object determination step that a detected transported object exists, tracking is performed in the tracking step.
  • the tracking method of Modification 5 can be implemented by the tracking device of Modification 5, for example.
  • Specifically, the transported object determination means acquires the detection result of the transported object detected in the detection target images acquired before the q-th image. The transported object determination means then determines whether a detected transported object exists in the acquired detection result. When a detected transported object exists, the tracking step by the tracking unit is performed. When the aforementioned deletion flag has been assigned to the detection result of a transported object, the transported object determination means may, for example, determine that the flagged transported object has not been detected. Thereby, tracking of an erroneously detected object can be avoided, and the processing time can be further shortened.
  • The "q" is a positive integer of n or less; that is, q is an integer satisfying 1 ≤ q ≤ n.
  • In the tracking device and the tracking method of Modification 5, it is determined whether a detected transported object exists. For this reason, when no detected transported object exists, that is, when there is no transported object to be tracked, the tracking process need not be performed, and the processing time can be shortened.
  • Modifications 1 to 5 may each be used alone, but a plurality of them are preferably used in combination in order to shorten the processing time and improve the tracking accuracy. In the case of combining a plurality of them, the combination is not particularly limited and can be any combination.
  • the second embodiment relates to the tracking device and the tracking method of the present invention.
  • FIG. 6 shows a block diagram of the tracking device in the present embodiment.
  • As shown in FIG. 6, the tracking device 40 of the present embodiment includes, in addition to the configuration of the tracking device 10 of Embodiment 1, a detection processing determination unit 114, a transported object determination unit 115, an erroneous detection determination unit 116, and an association unit 117.
  • the image acquisition unit 111, the detection unit 112, the tracking unit 113, the detection process determination unit 114, the transported object determination unit 115, the erroneous detection determination unit 116, and the association unit 117 are, for example, hardware. It may be incorporated in a certain data processing means (data processing apparatus) 11 or may be software or hardware in which the software is incorporated.
  • the data processing unit 11 may include a CPU or the like. Further, the data processing means 11 may include data storage means such as the aforementioned ROM and RAM, for example. In this case, each unit is electrically connected to a corresponding storage unit in the data storage unit, for example.
  • The detection processing determination unit 114 is electrically connected to the image acquisition unit 111 and the detection unit 112; the transported object determination unit 115 is electrically connected to the detection unit 112 and the tracking unit 113; the erroneous detection determination unit 116 is electrically connected to the detection unit 112 and the tracking unit 113; and the association unit 117 is electrically connected to the detection unit 112 and the tracking unit 113.
  • the configuration of the tracking device 40 of the second embodiment is the same as the configuration of the tracking device 10 of the first embodiment, and the description thereof can be used. Further, the descriptions of the modified examples 1 to 5 can be used for the description of the detection processing determination unit 114, the conveyed product determination unit 115, the erroneous detection determination unit 116, and the association unit 117.
  • FIGS. 7 and 8 show flowcharts of the tracking method in the present embodiment.
  • the tracking method of this embodiment is implemented as follows using the tracking device 40 of FIG. 6, for example.
  • the tracking method of the present embodiment includes S1 step (image acquisition), S4 step (detection process determination), S6 step (tracking data update), S2 ′ step (detection / association), S7. It includes a main thread composed of steps (conveyed object determination), S3 step (tracking), S8 step (addition of tracking data), and S9 step (false detection determination).
  • the S2 ′ step includes a sub thread composed of an S2 step (detection), an S10 step (association, synchronization), and an S11 step (detection result output).
  • FIG. 9 is a diagram illustrating an image acquired by the tracking device 40.
  • FIG. 10 is a diagram showing a timeline of processing of the S2 ′ step including the S2 step (detection) and the S10 step (association, synchronization) and the S3 step (tracking).
  • First, in step S1, frame 1 (the first image) is acquired by the image acquisition unit 111.
  • Next, in step S4, the detection process determination means 114 determines whether the detection process is being performed.
  • When the detection process is not being performed, the detection unit 112 updates the tracking data associated with the detection result and the tracking result (S6).
  • For frame 1, since there is no detection result yet, the tracking data is not updated, and the process proceeds to step S2′ to start a sub thread.
  • In step S2, the detection unit 112 selects frame 1 as the detection target image and detects the first object 1a and the second object 2a in frame 1. Specifically, the detection means 112 detects the first object 1a and the second object 2a using a learning model that can detect the first object 1 and the second object 2.
  • By the detection, for the first object 1a, position information such as its center coordinates and vertical and horizontal sizes (occupied region), together with its certainty (the possibility of being the first object 1), is output as a detection result associated with frame 1. Likewise, for the second object 2a, position information such as its center coordinates and vertical and horizontal sizes (occupied region), together with its certainty (the possibility of being the second object 2), is output as a detection result associated with frame 1.
  • Next, in step S10, the association unit 117 associates, as the same transported object, the detected first object 1a and second object 2a with the objects in the tracking result of the image next after frame 1. For frame 1, since there is no tracking data that can be compared with the detected first object 1a and second object 2a, no association is performed, and the process proceeds to step S11.
  • In step S11, the detection results of the first object 1a and the second object 2a are output in association with frame 1. Then, the sub thread is terminated.
  • step S7 is performed in parallel with the step S2, step S10 and step S11 in the step S2 '.
  • In step S7, the conveyed product determination means 115 determines whether a detected transported object exists. For frame 1, since there is no transported object detected before frame 1, the determination is No, and the process proceeds to step S1.
  • Next, in step S1, frame 2 is acquired by the image acquisition unit 111. Then, the detection process determination means 114 determines whether the detection process is being performed (S4). For frame 2, as shown in FIG. 10, the detection process for frame 1 has been completed and no detection process is being performed, so the determination is No. Next, the tracking data is updated by, for example, the detection means 112 (S6). In frame 1, the first object 1a and the second object 2a were detected, and their detection results were output; therefore, in step S6, these become the detected objects, and the position information and certainty of the first object 1a and the second object 2a in frame 1 are recorded as tracking data.
  • At this time, the number of detections as the first object 1 is recorded as one for the first object 1a, and the number of detections as the second object 2 is recorded as one for the second object 2a.
  • The certainty and the number of detections are listed and held for each type of object. Then, the process proceeds to step S2′ to start a sub thread.
  • In step S2, the detection unit 112 selects frame 2 as the detection target image, and in frame 2 a new first object 1b is detected in addition to the first object 1a and the second object 2a detected in frame 1.
  • By the detection by the detection unit 112, for the first objects 1a and 1b, position information such as their center coordinates and vertical and horizontal sizes (occupied regions), together with their certainty (the possibility of being the first object 1), is output as detection results associated with frame 2. Likewise, for the second object 2a, position information such as its center coordinates and vertical and horizontal sizes (occupied region), together with its certainty (the possibility of being the second object 2), is output as a detection result associated with frame 2.
  • Next, in step S10, the association unit 117 associates, as the same objects, the detected first objects 1a and 1b and second object 2a with the objects in the tracking result of the image next after frame 2, that is, of frame 3 described later.
  • Specifically, the overlap rate of the regions of the first objects 1a and 1b and the second object 2a is calculated for all combinations of the objects. Each region is calculated from the center coordinates and the vertical and horizontal sizes of each object.
  • In step S11, the detection results of the first objects 1a and 1b and the second object 2a, and the association of the first object 1a and the second object 2a between frames 2 and 3, are output together with frame 2. Then, the sub thread is terminated.
  • step S7 is performed in parallel with the step S2, step S10 and step S11 in the step S2 '.
  • In step S7, the conveyed product determination means 115 determines whether a detected transported object exists. Since the detected first object 1a and second object 2a exist, the determination is Yes, and step S3 is performed.
  • In step S3, the tracking unit 113 tracks the first object 1a and the second object 2a based on frame 1, which is the previous detection target image.
  • Specifically, in step S3, the center coordinates of the first object 1a and the second object 2a in frame 2 are each calculated by optical flow estimation, based on their center coordinates in frame 1 and on frames 1 and 2.
  • Next, in step S8, the tracking unit 113, for example, adds the calculated center coordinates of the first object 1a and the second object 2a to their position information, as the position information of the objects 1a and 2a in frame 2.
  • Next, in step S9, the erroneous detection determination means 116 determines whether the first object 1a and the second object 2a are erroneously detected objects. Specifically, for each of the first object 1a and the second object 2a, it is determined whether the amount of change in its center coordinates over a predetermined number of frames is equal to or less than a predetermined numerical value.
  • When the amount of change is equal to or less than the predetermined numerical value, the erroneous detection determination means 116 assigns a deletion flag to the tracking data of the corresponding first object 1a and second object 2a.
  • The tracking data to which the deletion flag has been assigned is deleted at the next step S6. In the present embodiment, the deletion flag is not assigned to the first object 1a and the second object 2a because they do not satisfy the condition.
  • After step S9, the process proceeds to step S1, and frame 3 is acquired by the image acquisition unit 111. Then, the detection process determination means 114 determines whether the detection process is being performed (S4). For frame 3, since the detection process for frame 2 is being performed, the determination is Yes.
  • In that case, steps S7, S3, S8, and S9 are performed on frame 3 in the same manner as for frame 2, except that frame 2, which is the immediately preceding image, is used instead of the previous detection target image.
  • After completion of step S9 for frame 3, the process proceeds to step S1, and frame 4 is acquired by the image acquisition unit 111. Then, steps S4, S7, S3, S8, and S9 are performed on frame 4 in the same manner as for frame 3. After step S9, the process proceeds to step S1.
  • Thereafter, tracking is performed in the same manner as for frames 2 to 4 until the image acquisition by the image acquisition unit 111 is completed. Thereby, the transported objects including the target objects can be tracked.
  • When the determination in step S7 is No, and/or after step S9, it may be determined whether a new image has been acquired. If Yes, that is, if a new image has been acquired, the process proceeds to step S1. If No, that is, if a new image has not been acquired, the tracking method of the present embodiment is terminated.
  • In the tracking device and the tracking method of Embodiment 2, the detection process and the tracking process are processed in parallel. For this reason, the tracking process need not wait while the detection process is performed. The tracking device and the tracking method of Embodiment 2 can therefore further reduce the processing time compared with a device combining Modifications 1 to 5.
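  • The interplay of the main thread and the sub thread in Embodiment 2 can be illustrated with a deterministic, single-threaded simulation of the timeline in FIG. 10. The assumption that each detection occupies two frame periods is ours, made so that the skipping of detection targets is visible; all names are illustrative.

```python
class PipelineSim:
    """Single-threaded simulation of the Embodiment 2 timeline (cf. FIG. 10):
    tracking (S3) runs in the main thread for every frame, while detection
    (S2) runs in a sub thread; each detection is assumed here to occupy two
    frame periods, so every other frame is skipped as a detection target."""

    def __init__(self, detect_period=2):
        self.detect_period = detect_period
        self.busy_until = 0   # frame number at which the sub thread goes idle
        self.detected = []    # frames selected as detection target images
        self.tracked = []     # frames on which tracking ran

    def step(self, frame):
        # S4: detection process determination
        if frame >= self.busy_until:          # sub thread idle
            # S6 (tracking data update) would run here, then S2' starts
            self.detected.append(frame)
            self.busy_until = frame + self.detect_period
        # S7: detected objects exist once the first detection has started,
        # so S3 (tracking) runs on every later frame
        if self.detected and frame > self.detected[0]:
            self.tracked.append(frame)

sim = PipelineSim()
for f in range(1, 7):
    sim.step(f)
print(sim.detected)  # [1, 3, 5]
print(sim.tracked)   # [2, 3, 4, 5, 6]
```

  • Note that tracking runs on every frame even while detection is busy, which is exactly why the tracking process never waits for the detection process.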
  • Embodiment 3 relates to the counting device and counting method of the present invention.
  • the description of the tracking device and the tracking method can be used for the counting device and the counting method of the present embodiment.
  • FIG. 11 shows a block diagram of the counting device in the present embodiment.
  • the counting device 50 of the present embodiment includes a counting unit 118 in addition to the tracking device 10 of the first embodiment.
  • the image acquisition unit 111, the detection unit 112, the tracking unit 113, and the counting unit 118 may be incorporated in, for example, a data processing unit (data processing apparatus) 11 that is hardware, Hardware in which the software is incorporated may be used.
  • the data processing unit 11 may include a CPU or the like.
  • the data processing means 11 may include data storage means such as the aforementioned ROM and RAM, for example. In this case, each unit is electrically connected to a corresponding storage unit in the data storage unit, for example.
  • the counting means 118 is electrically connected to the tracking means 113.
  • An example of the counting means 118 is a CPU.
  • the counting means 118 counts the tracked transported object. Specifically, when the tracking unit 113 acquires the position information of the detected transported object, the counting unit 118 counts the transported object based on the position information of the transported object, for example.
  • the position information of the conveyed object used for counting includes, for example, coordinates of the conveyed object (for example, center coordinates) in one or more images, and a trajectory connecting the coordinates of the conveyed objects in two or more images.
  • FIG. 12 shows a flowchart of the counting method in the present embodiment.
  • the counting method of this embodiment is implemented as follows using the counting device 50 of FIG. 11, for example.
  • The counting method of the present embodiment includes, in addition to the S1 step (image acquisition), S2 step (detection), and S3 step (tracking) of the tracking method of Embodiment 1, an S12 step (counting).
  • S1 step: image acquisition
  • S2 step: detection
  • S3 step: tracking
  • The S2 step and the S3 step may be performed in parallel or in series.
  • the steps S1-S3 are performed in the same manner as the tracking method of the first embodiment.
  • In step S12, the tracked transported objects are counted by the counting means 118.
  • Specifically, the counting unit 118 determines whether the position information of the tracked transported object satisfies a counting condition, that is, a condition for counting the transported object. Examples of the counting condition include the transported object moving outside the image, the transported object moving into a predetermined area, and the length of the trajectory of the transported object exceeding a predetermined distance. The counting unit 118 then counts the transported objects determined to satisfy the counting condition. In step S12, the detection result and the tracking result of the counted transported object may be deleted after counting.
  • When the method includes a step of updating the tracking data as in the tracking method of the second embodiment, the tracking data may be updated in that step by assigning a deletion flag to the detection result and tracking result of the counted transported object. This prevents the same transported object from being counted multiple times.
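The counting conditions described for step S12 can be illustrated with a minimal sketch. The function names, the coordinate convention (tracked center coordinates in image pixels), and the rectangle encoding of the predetermined area are assumptions for illustration, not part of the embodiment.

```python
import math


def trajectory_length(track):
    """Total distance along a trajectory given as a list of (x, y) center coordinates."""
    return sum(math.dist(a, b) for a, b in zip(track, track[1:]))


def satisfies_counting_condition(track, frame_w, frame_h,
                                 count_region=None, min_distance=None):
    """Return True if a tracked object meets any counting condition:
    it has moved outside the image, moved into a predetermined area,
    or its trajectory exceeds a predetermined distance."""
    x, y = track[-1]  # latest tracked center coordinate
    if not (0 <= x < frame_w and 0 <= y < frame_h):
        return True  # moved outside the image
    if count_region is not None:
        rx0, ry0, rx1, ry1 = count_region
        if rx0 <= x <= rx1 and ry0 <= y <= ry1:
            return True  # moved into the predetermined area
    if min_distance is not None and trajectory_length(track) > min_distance:
        return True  # trajectory distance exceeded the predetermined distance
    return False
```

For example, a trajectory whose last center lies at x = 200 in a 160-pixel-wide frame satisfies the out-of-image condition. An object counted this way would then have its detection and tracking results flagged for deletion, as described above.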
  • In step S12, when the transported objects include a target object, only the target object may be counted, or both the target object and transported objects other than the target object may be counted. Moreover, when the transported objects include a plurality of types of target objects, counting may be performed for each type of object.
  • The count for each type of object can be performed based on, for example, the number of detections as each object in the tracking method of the second embodiment. For example, suppose the first object 1a in FIG. 9 is detected as the first object 1 seven times and as the second object 2 three times, so that the number of detections differs between the object types. In this case, in step S12, the first object 1a is counted as the first object 1.
  • On the other hand, suppose the first object 1a in FIG. 9 is detected as the first object 1 five times and as the second object 2 five times, so that the number of detections is the same for both object types. In this case, the certainty factors in the tracking data are referred to, and it is determined which of the certainty factor of the first object 1 and the certainty factor of the second object 2 is higher. When the certainty factor of the first object 1 is higher than that of the second object 2, the first object 1a is counted as the first object 1; otherwise, the first object 1a is counted as the second object 2.
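The per-object decision described above (majority of detections, with the certainty factor as a tie-breaker) can be sketched as follows. The dictionary-based data layout and the function name are hypothetical; only the decision rule follows the description.

```python
def classify_for_count(detection_counts, certainties):
    """Decide which object type a tracked item is counted as.

    detection_counts: {type_label: number of times detected as that type}
    certainties:      {type_label: certainty factor from the tracking data}
    The type with the most detections wins; a tie is broken by the
    higher certainty factor.
    """
    best = max(detection_counts.values())
    candidates = [t for t, c in detection_counts.items() if c == best]
    if len(candidates) == 1:
        return candidates[0]  # detection counts differ: majority decides
    return max(candidates, key=lambda t: certainties[t])  # tie: higher certainty
```

With seven detections as object 1 and three as object 2, the item is counted as object 1; with a 5-5 tie, the higher certainty factor decides.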
  • When the counting device of the present embodiment includes an erroneous detection determination unit, it is preferable that the counting unit is electrically connected to the erroneous detection determination unit.
  • the counting step is preferably performed after the erroneous detection determination step.
  • the counting device and the counting method of the third embodiment include the tracking device and the tracking method of the first embodiment, respectively.
  • For this reason, the processing time can be shortened compared with a counting device and a counting method that include the device of the reference example, which detects the transported object in all images.
  • The counting device of the present embodiment may further include, for example, detection frequency measurement means for measuring the number of times the same transported object is detected; when the number of detections of a transported object is equal to or less than a predetermined number, the counting means need not count that transported object. Likewise, the counting method of the present embodiment may include, for example, a detection frequency measurement step of measuring the number of times the same transported object is detected, and a transported object whose number of detections is equal to or less than the predetermined number is not counted in the counting step.
  • This can prevent an object that the detection unit 112 erroneously detected as a transported object from being counted, so the transported objects can be counted more accurately. It is also preferable to combine the detection frequency measurement means or the detection frequency measurement step with the aforementioned erroneous detection determination means or erroneous detection determination step. As a result, erroneously detected objects can be excluded from the count even more effectively, so the transported objects can be counted still more accurately.
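The detection-frequency filter just described can be sketched as a simple threshold over per-track detection counts. The data layout (a mapping from track identifier to detection count) is an assumption for illustration.

```python
def count_transported(detection_counts, min_detections=2):
    """Count tracked items, skipping those detected a predetermined
    number of times or fewer (likely false detections).

    detection_counts: {track_id: number of times the item was detected}
    min_detections:   the predetermined number; items with counts at or
                      below it are not counted
    """
    return sum(1 for n in detection_counts.values() if n > min_detections)
```

A track seen only once is likely noise and is excluded, while tracks detected repeatedly across the k detection target images are counted.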
  • Embodiment 4 relates to a sorting apparatus and a sorting method of the present invention.
  • the description of the tracking device and the tracking method can be used for the sorting device and the sorting method of the present embodiment.
  • FIG. 13 shows a block diagram of the sorting apparatus in the present embodiment.
  • the sorting device 60 of this embodiment includes a sorting unit 119 in addition to the tracking device 10 of the first embodiment.
  • the image acquisition unit 111, the detection unit 112, the tracking unit 113, and the sorting unit 119 may be incorporated into, for example, a data processing unit (data processing apparatus) 11 that is hardware, or may be software that is incorporated into hardware.
  • the data processing unit 11 may include a CPU or the like.
  • the data processing unit 11 may include data storage means such as the aforementioned ROM and RAM, for example. In this case, each unit is electrically connected to a corresponding storage unit in the data storage means, for example.
  • the sorting unit 119 is electrically connected to the tracking unit 113.
  • An example of the sorting unit 119 is a CPU.
  • The sorting means 119 sorts the objects in the tracked transported objects directly or indirectly. In the case of indirect sorting, when the tracking unit 113 has acquired the position information of the detected object, the sorting unit 119 sorts the object based on the position information of the object, for example, via a sorting device inside or outside the apparatus.
  • the position information of the conveyed object used for the selection includes, for example, coordinates of the conveyed object (for example, center coordinates) in one or more images and a trajectory connecting the coordinates of the conveyed object in two or more images.
  • Examples of the sorting device include a sorter capable of sorting a plurality of types of items, a robot arm, and the like.
  • Alternatively, the sorting unit 119 may directly sort the objects in the tracked transported objects based on the position information of the objects. In this case, an example of the sorting unit 119 is the above-described sorting device.
  • FIG. 14 shows a flowchart of the sorting method in the present embodiment.
  • the sorting method of the present embodiment is performed as follows, for example, using the sorting device 60 of FIG.
  • The sorting method of this embodiment includes step S13 (sorting) in addition to step S1 (image acquisition), step S2 (detection), and step S3 (tracking) of the tracking method of Embodiment 1.
  • Step S1: image acquisition
  • Step S2: detection
  • Step S3: tracking
  • Step S13: sorting
  • Step S2 and step S3 may be performed in parallel or in series.
  • the steps S1-S3 are performed in the same manner as the tracking method of the first embodiment.
  • In step S13, the sorting means 119 sorts the objects in the tracked transported objects directly or indirectly. Specifically, in step S13, the sorting unit 119 acquires the position information of the tracked target object and/or of the transported objects other than the target object. When the position information is that of the target object, in step S13 the sorting unit 119 sorts the object, for example, by picking out the object. On the other hand, when the position information is that of the transported objects other than the target object, in step S13 the sorting unit 119 sorts the object, for example, by removing the transported objects other than the target object.
  • When the transported objects include a plurality of types of target objects, the plurality of types of objects may be sorted from the transported objects other than the target objects, and further, each of the plurality of types of objects may be sorted separately.
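Indirect sorting based on tracked position information can be sketched as follows. Grouping pick commands per object type, so that a sorter or robot arm can act on each type separately, follows the description above; the function name, data layout, and rectangle encoding of the region served by the sorting device are assumptions.

```python
def sort_tracked_items(tracked_items, pick_region):
    """Group tracked target objects inside the sorting device's region
    into per-type pick commands.

    tracked_items: list of (type_label, (x, y)) tracked center coordinates
    pick_region:   (x0, y0, x1, y1) area reachable by the sorting device
    Returns {type_label: [coordinates]} so the sorting device can be
    driven separately for each object type.
    """
    commands = {}
    for label, (x, y) in tracked_items:
        x0, y0, x1, y1 = pick_region
        if x0 <= x <= x1 and y0 <= y <= y1:
            commands.setdefault(label, []).append((x, y))  # within reach: pick
    return commands
```

Objects outside the region are simply left to continue along the transport device; conversely, the same structure could list non-target items to be removed.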
  • When the sorting device of the present embodiment includes an erroneous detection determination unit, it is preferable that the sorting unit is electrically connected to the erroneous detection determination unit.
  • the selection step is preferably performed after the erroneous detection determination step.
  • the sorting device and the sorting method of the fourth embodiment include the tracking device and the tracking method of the first embodiment, respectively. For this reason, according to the sorting apparatus and the sorting method of the fourth embodiment, the processing time can be shortened as compared with the sorting apparatus and the sorting method including the apparatus of the reference example that detects the conveyed product in all images.
  • The program of this embodiment (Embodiment 5) is a program that causes a computer to execute the tracking method, the counting method, or the sorting method described above. The program of this embodiment may be recorded on, for example, a computer-readable recording medium.
  • the recording medium is, for example, a non-transitory computer-readable storage medium.
  • the recording medium is not particularly limited, and examples thereof include a random access memory (RAM), a read-only memory (ROM), a hard disk (HD), an optical disk, and a floppy (registered trademark) disk (FD).
  • Embodiment 6 relates to the tracking system of the present invention.
  • the tracking system of the present invention can use, for example, the description of the tracking device and the tracking method.
  • FIG. 15 shows an example of the configuration of a tracking system using the tracking device of the present invention.
  • The tracking system of this embodiment includes imaging devices 31a, 31b, and 31c, communication interfaces 32a, 32b, and 32c, and a server 34.
  • the imaging device 31a is connected to the communication interface 32a.
  • the imaging device 31a and the communication interface 32a are installed at the place X.
  • the imaging device 31b is connected to the communication interface 32b.
  • the imaging device 31b and the communication interface 32b are installed in the place Y.
  • the imaging device 31c is connected to the communication interface 32c.
  • the imaging device 31c and the communication interface 32c are installed in the place Z.
  • The communication interfaces 32a, 32b, and 32c and the server 34 are connected via a communication network 33.
  • Image acquisition means, detection means, and tracking means are provided on the server 34 side.
  • the tracking system transmits, for example, n images acquired by using the imaging device 31a at the place X to the server 34, and tracks the transported object on the server 34 side.
  • The tracking system of this embodiment is also applicable to, for example, combinations of the above-described embodiments and modifications. The tracking system of this embodiment may also support, for example, cloud computing. Furthermore, in the tracking system of the present embodiment, the communication interfaces 32a, 32b, and 32c and the server 34 may be connected by a wireless communication line.
  • According to the tracking system of this embodiment, an imaging device can be installed on site, and a server or the like can be installed at another location, so transported objects can be tracked online. The apparatus therefore takes up little space at the installation site, and maintenance is easy. Furthermore, even when the installation locations are far apart, centralized management and remote operation can be performed from a single location.
  • Embodiment 7 relates to the counting system of the present invention.
  • the description of the tracking device, the tracking method, the counting device, the counting method, and the tracking system can be used for the counting system of the present invention.
  • the counting system according to the present embodiment further includes counting means in the server 34 in the tracking system according to the sixth embodiment.
  • The counting system transmits, for example, n images acquired at the place X using the imaging device 31a to the server 34, and counts the transported objects on the server 34 side. Except for these points, the counting system of the seventh embodiment can use the description of the tracking system of the sixth embodiment.
  • According to the counting system of the present embodiment, for example, the processing time at the server when tracking transported objects can be shortened. Therefore, according to the counting system of this embodiment, transported objects can be counted in a shorter time, for example.
  • In addition, an imaging device can be installed on site and a server or the like can be installed at another location, so the transported objects can be counted online. The apparatus therefore takes up little space at the installation site, and maintenance is easy. Furthermore, even when the installation locations are far apart, centralized management and remote operation can be performed from a single location.
  • Embodiment 8 relates to the sorting system of the present invention.
  • the description of the tracking device, the tracking method, the sorting device, the sorting method, and the tracking system can be used for the sorting system of the present invention.
  • FIG. 16 shows a configuration of an example of a sorting system using the sorting apparatus of the present invention.
  • The sorting system includes imaging devices 31a, 31b, and 31c, sorting devices 35a, 35b, and 35c, communication interfaces 32a, 32b, and 32c, and a server 34.
  • the imaging device 31a and the sorting device 35a are connected to the communication interface 32a.
  • the imaging device 31a, the sorting device 35a, and the communication interface 32a are installed at the place X.
  • the imaging device 31b and the sorting device 35b are connected to the communication interface 32b.
  • the imaging device 31b, the sorting device 35b, and the communication interface 32b are installed at the place Y.
  • the imaging device 31c and the sorting device 35c are connected to the communication interface 32c.
  • the imaging device 31c, the sorting device 35c, and the communication interface 32c are installed in the place Z.
  • The communication interfaces 32a, 32b, and 32c and the server 34 are connected via a communication network 33.
  • Image acquisition means, detection means, tracking means, and sorting means are provided on the server 34 side.
  • the sorting system transmits, for example, n images acquired using the imaging device 31a at the place X to the server 34, and the server 34 side tracks and sorts the object in the transported object.
  • the server 34 transmits, for example, position information of an object to be sorted to the sorting device 35a, and the sorting device 35a sorts the object to be sorted.
  • the sorting system of the eighth embodiment can use the description of the tracking system of the sixth embodiment.
  • According to the sorting system of the present embodiment, for example, the processing time at the server when tracking transported objects can be shortened. Therefore, according to the sorting system of this embodiment, the objects in the transported objects can be sorted in a shorter time, for example. Further, according to the sorting system of the present embodiment, the imaging device and the sorting device can be installed on site and the server or the like can be installed at another location, so the objects can be sorted online. The apparatus therefore takes up little space at the installation site, and maintenance is easy. Furthermore, even when the installation locations are far apart, centralized management and remote operation can be performed from a single location.
  • (Appendix 1) A transported object tracking device comprising: image acquisition means for acquiring n images over time of a transported object being transported by a transport device; detection means for detecting the transported object in k detection target images selected from the n images; and tracking means for tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein k in the detection means is smaller than n.
  • (Appendix 2) The tracking device according to Appendix 1, further comprising detection processing determination means for determining whether detection by the detection means is being performed on the l-th detection target image among the k detection target images after the image acquisition means has acquired the m-th image among the n images, wherein, when the detection processing determination means determines that detection by the detection means is not being performed, the detection means detects the transported object using the m-th image as the (l+1)-th detection target image.
  • (Appendix 3) The tracking device according to Appendix 1 or 2, wherein the detection means acquires position information of the transported object, and the tracking means tracks the transported object by calculating position information of the transported object in the j-th image among the n images based on the j-th image, an image up to the (j-1)-th image, and the position information of the transported object in that image.
  • (Appendix 4) The tracking device according to Appendix 3, wherein, when the j-th image is the i-th detection target image, the tracking means tracks the transported object by calculating position information of the transported object in the j-th image based on the j-th image, the (i-1)-th detection target image, and the position information of the transported object in the (i-1)-th detection target image.
  • (Appendix 5) The tracking device according to any one of Appendices 1 to 4, wherein the tracking means acquires position information of the transported object detected in the detection target image, the tracking device further comprising erroneous detection determination means for determining whether the transported object is an erroneously detected object based on the position information of the transported object acquired by the tracking means.
  • (Appendix 6) The tracking device according to Appendix 5, wherein the erroneous detection determination means determines whether the transported object is an erroneously detected object based on the amount of change in the position of the detected transported object between a plurality of images.
  • (Appendix 7) The tracking device according to any one of Appendices 1 to 6, further comprising association means for associating the detected transported object and the tracked transported object with each other based on the detection result of the transported object in the h-th detection target image among the k detection target images and the tracking result of the transported object in the image acquired next after the h-th detection target image among the n images.
  • (Appendix 8) The tracking device according to any one of Appendices 1 to 7, further comprising transported object determination means for determining whether a transported object detected in a detection target image acquired before the image to be processed exists, wherein the tracking means performs tracking when the transported object determination means determines that the detected transported object exists.
  • (Appendix 9) The tracking device according to any one of Appendices 1 to 8, wherein detection by the detection means and tracking by the tracking means are performed in parallel.
  • (Appendix 13) The counting device according to Appendix 11 or 12, further comprising detection frequency measurement means for measuring the number of detections of the same transported object, wherein, when the number of detections of a transported object is equal to or less than a predetermined number, the counting means does not count that transported object.
  • (Appendix 14) A transported object sorting device comprising: tracking means for tracking a transported object including a target object being transported by a transport device; and sorting means for sorting objects in the tracked transported object, wherein the tracking means is the transported object tracking device according to any one of Appendices 1 to 10.
  • (Appendix 16) The tracking method according to Appendix 15, including a detection processing determination step of determining whether detection is being performed on the l-th detection target image among the k detection target images after the m-th image among the n images has been acquired in the image acquisition step, wherein, when it is determined in the detection processing determination step that the detection is not being performed, the transported object is detected in the detection step using the m-th image as the (l+1)-th detection target image.
  • (Appendix 17) The tracking method according to Appendix 15 or 16, wherein position information of the transported object is acquired in the detection step, and the transported object is tracked in the tracking step by calculating position information of the transported object in the j-th image among the n images based on the j-th image, an image up to the (j-1)-th image, and the position information of the transported object in that image.
  • (Appendix 19) The tracking method according to any one of Appendices 15 to 18, wherein position information of the transported object detected in the detection target image is acquired in the tracking step, the tracking method including an erroneous detection determination step of determining whether the transported object is an erroneously detected object based on the position information of the transported object acquired in the tracking step.
  • (Appendix 20) The tracking method according to Appendix 19, wherein, in the erroneous detection determination step, whether the transported object is an erroneously detected object is determined based on the amount of change in the position of the detected transported object between a plurality of images.
  • (Appendix 21) The tracking method according to any one of Appendices 15 to 20, including an association step of associating the detected transported object and the tracked transported object with each other based on the detection result of the transported object in the h-th detection target image among the k detection target images and the tracking result of the transported object in the image acquired next after the h-th detection target image among the n images.
  • (Appendix 22) The tracking method according to any one of Appendices 15 to 21, including a transported object determination step of determining whether a transported object detected in a detection target image acquired before the image to be processed exists, wherein tracking is performed in the tracking step when it is determined in the transported object determination step that the detected transported object exists.
  • (Appendix 23) The tracking method according to any one of Appendices 15 to 22, wherein the detection step and the tracking step are performed in parallel.
  • (Appendix 24) The tracking method according to any one of Appendices 15 to 23, wherein, in the detection step, the transported object is detected using a learning model capable of detecting the transported object.
  • (Appendix 25) A transported object counting method comprising: a tracking step of tracking a transported object being transported by a transport device; and a counting step of counting the tracked transported object, wherein the tracking step is the transported object tracking method according to any one of Appendices 15 to 24.
  • (Appendix 26) The counting method according to Appendix 25, wherein position information of the transported object detected in the detection target image is acquired in the tracking step, and the transported object is counted in the counting step based on the position information of the transported object.
  • (Appendix 27) The counting method according to Appendix 25 or 26, including a detection frequency measurement step of measuring the number of detections of the same transported object, wherein, when the number of detections of a transported object is equal to or less than a predetermined number, that transported object is not counted in the counting step.
  • (Appendix 28) A transported object sorting method comprising: a tracking step of tracking a transported object including a target object being transported by a transport device; and a sorting step of sorting objects in the tracked transported object, wherein the tracking step is the transported object tracking method according to any one of Appendices 15 to 24.
  • (Appendix 29) A program causing a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; and a tracking process of tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein k in the detection process is smaller than n.
  • (Appendix 30) A program causing a computer to execute: an image acquisition process of acquiring n images over time of a transported object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; a tracking process of tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image; and a counting process of counting the tracked transported objects, wherein k in the detection process is smaller than n.
  • (Appendix 31) A program causing a computer to execute: an image acquisition process of acquiring n images over time of a transported object including a target object being transported by a transport device; a detection process of detecting the transported object in k detection target images selected from the n images; a tracking process of tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image; and a sorting process of sorting objects in the tracked transported object, wherein k in the detection process is smaller than n.
  • (Appendix 32) A computer-readable recording medium on which the program according to any one of Appendices 29 to 31 is recorded.
  • A transported object tracking system comprising a terminal and a server, wherein:
  • the terminal and the server are connectable via a communication network outside the system;
  • the terminal includes an imaging device;
  • the imaging device captures n images over time of a transported object being transported by a transport device;
  • the server includes image acquisition means, detection means, and tracking means;
  • the image acquisition means acquires the n images captured over time of the transported object being transported by the transport device;
  • the detection means detects the transported object in k detection target images among the n images;
  • the tracking means tracks the detected transported object in the n images; and
  • k in the detection means is smaller than n.
  • A transported object counting system comprising a terminal and a server, wherein:
  • the terminal and the server are connectable via a communication network outside the system;
  • the terminal includes an imaging device;
  • the imaging device captures n images over time of a transported object being transported by a transport device;
  • the server includes image acquisition means, detection means, tracking means, and counting means;
  • the image acquisition means acquires the n images captured over time of the transported object being transported by the transport device;
  • the detection means detects the transported object in k detection target images among the n images;
  • the tracking means tracks, in images acquired after each detection target image among the n images, the transported object detected in that detection target image;
  • the counting means counts the tracked transported objects; and
  • k in the detection means is smaller than n.
  • A transported object sorting system comprising a terminal and a server, wherein:
  • the terminal and the server are connectable via a communication network outside the system;
  • the terminal includes an imaging device and a sorting device;
  • the imaging device captures n images over time of a transported object including a target object being transported by a transport device;
  • the sorting device sorts the objects to be sorted in the tracked transported object;
  • the server includes image acquisition means, detection means, tracking means, and sorting means;
  • the image acquisition means acquires the n images captured over time of the transported object including the target object being transported by the transport device;
  • the detection means detects the transported object in k detection target images among the n images;
  • the tracking means tracks, in images acquired after each detection target image among the n images, the transported object detected in that detection target image; and
  • k in the detection means is smaller than n.
  • According to the present invention, the processing time can be shortened.
  • Therefore, a product or the like can be tracked in real time in a factory or the like.
  • Thus, the present invention is extremely useful in the manufacturing industry and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Control Of Conveyors (AREA)

Abstract

Provided are a goods tracker and tracking method with which it is possible to shorten a processing time. This goods tracker is characterized by including an image acquisition means for acquiring n images over time with regard to goods being conveyed by a conveying device, a detection means for detecting the goods concerning k images subject to detection that are selected from the n images, and a tracking means for tracking the goods detected in the k'th image subject to detection in images among the n images that have been acquired after the k'th image subject to detection, and in that k in the detection means is smaller than n.

Description

Conveyed object tracking device, conveyed object counting device, conveyed object tracking method, conveyed object counting method, conveyed object tracking system, and conveyed object counting system

The present invention relates to a conveyance tracking device, a conveyance counting device, a conveyance tracking method, a conveyance counting method, a conveyance tracking system, and a conveyance counting system.

In a factory or the like, a product that is transported on a conveyor such as a belt conveyor is imaged with a camera and the product is monitored.

As an apparatus for tracking a conveyed object conveyed on a conveying device, the present inventors developed a device capable of detecting and tracking the conveyed object (hereinafter also referred to as the "device of the reference example") by acquiring images of the conveyed object over time, detecting the conveyed object in each image, and associating the same conveyed object across the images. However, in the device of the reference example, since detection is performed on every image, the detection and tracking processing takes time. In addition, this problem becomes particularly noticeable when, for example, the conveyed object is detected using a learning model created by machine learning or deep learning.

Therefore, an object of the present invention is to provide a transported object tracking device and a tracking method capable of reducing the processing time.

In order to achieve the above object, the transported object tracking device of the present invention (hereinafter also referred to as the "tracking device") includes: image acquisition means for acquiring n images over time of a transported object being transported by a transport device; detection means for detecting the transported object in k detection target images selected from the n images; and tracking means for tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image, wherein k in the detection means is smaller than n.

 本発明の搬送物の計数装置(以下、「計数装置」ともいう)は、搬送装置により搬送されている搬送物について追跡する追跡手段と、
前記追跡された搬送物をカウントする計数手段とを含み、
前記追跡手段は、前記本発明の搬送物の追跡装置であることを特徴とする。
The transported object counting device of the present invention (hereinafter also referred to as the “counting device”) includes:
tracking means for tracking a transported object being transported by a transport device; and
counting means for counting the tracked transported objects,
wherein the tracking means is the transported object tracking device of the present invention.

 本発明の搬送物の追跡方法(以下、「追跡方法」ともいう)は、搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する画像取得工程と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出工程と、
前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡する追跡工程とを含み、
前記検出工程におけるkは、nより小さいことを特徴とする。
The transported object tracking method of the present invention (hereinafter also referred to as the “tracking method”) includes:
an image acquisition step of acquiring n images over time of a transported object being transported by a transport device;
a detection step of detecting the transported object in k detection target images selected from the n images; and
a tracking step of tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image,
wherein k in the detection step is smaller than n.

 本発明の搬送物の計数方法(以下、「計数方法」ともいう)は、搬送装置により搬送されている搬送物について追跡する追跡工程と、
前記追跡された搬送物をカウントする計数工程とを含み、
前記追跡工程は、前記本発明の搬送物の追跡方法であることを特徴とする。
The transported object counting method of the present invention (hereinafter also referred to as the “counting method”) includes:
a tracking step of tracking a transported object being transported by a transport device; and
a counting step of counting the tracked transported objects,
wherein the tracking step is performed by the transported object tracking method of the present invention.

 本発明のプログラムは、搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する画像取得処理と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出処理と、
前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡する追跡処理とをコンピュータ上で実行可能であり、
前記検出処理におけるkは、nより小さいことを特徴とする。
A program of the present invention causes a computer to execute:
an image acquisition process of acquiring n images over time of a transported object being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images; and
a tracking process of tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image,
wherein k in the detection process is smaller than n.

 本発明のプログラムは、搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する画像取得処理と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出処理と、
前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡する追跡処理と、
前記追跡された搬送物の数をカウントする計数処理とをコンピュータ上で実行可能であり、
前記検出処理におけるkは、nより小さいことを特徴とする。
A program of the present invention causes a computer to execute:
an image acquisition process of acquiring n images over time of a transported object being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images;
a tracking process of tracking, in images acquired after each detection target image among the n images, the transported object detected in that detection target image; and
a counting process of counting the number of the tracked transported objects,
wherein k in the detection process is smaller than n.

 本発明の搬送物の追跡システム(以下、「追跡システム」ともいう)は、端末とサーバとを含み、
前記端末と前記サーバとは、システム外の通信回線網を介して、接続可能であり、
前記端末は、撮像装置を含み、
前記撮像装置は、搬送装置により搬送されている搬送物について、経時的にn枚の画像を撮像し、
前記サーバは、画像取得手段、検出手段、および追跡手段を含み、
前記画像取得手段は、前記搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得し、
前記検出手段は、n枚の画像のうち、k枚の検出対象画像について、搬送物を検出し、
前記追跡手段は、前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡し、
前記検出手段におけるkは、nより小さいことを特徴とする。
The transported object tracking system of the present invention (hereinafter also referred to as the “tracking system”) includes a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system;
the terminal includes an imaging device;
the imaging device captures n images over time of a transported object being transported by a transport device;
the server includes image acquisition means, detection means, and tracking means;
the image acquisition means acquires the n images over time of the transported object being transported by the transport device;
the detection means detects the transported object in k detection target images among the n images;
the tracking means tracks, in images acquired after each detection target image among the n images, the transported object detected in that detection target image; and
k in the detection means is smaller than n.

 本発明の搬送物の計数システム(以下、「計数システム」ともいう)は、端末とサーバとを含み、
前記端末と前記サーバとは、システム外の通信回線網を介して、接続可能であり、
前記端末は、撮像装置を含み、
前記撮像装置は、搬送装置により搬送されている搬送物について、経時的にn枚の画像を撮像し、
前記サーバは、画像取得手段、検出手段、追跡手段、および計数手段を含み、
前記画像取得手段は、前記搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得し、
前記検出手段は、n枚の画像のうち、k枚の検出対象画像について、搬送物を検出し、
前記追跡手段は、前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡し、
前記計数手段は、前記追跡された搬送物の数を計数し、
前記検出手段におけるkは、nより小さいことを特徴とする。
The transported object counting system of the present invention (hereinafter also referred to as the “counting system”) includes a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system;
the terminal includes an imaging device;
the imaging device captures n images over time of a transported object being transported by a transport device;
the server includes image acquisition means, detection means, tracking means, and counting means;
the image acquisition means acquires the n images over time of the transported object being transported by the transport device;
the detection means detects the transported object in k detection target images among the n images;
the tracking means tracks, in images acquired after each detection target image among the n images, the transported object detected in that detection target image;
the counting means counts the number of the tracked transported objects; and
k in the detection means is smaller than n.

 本発明によれば、処理時間を短縮できる。 According to the present invention, the processing time can be shortened.

図1は、実施形態1の追跡装置を示すブロック図である。FIG. 1 is a block diagram illustrating a tracking device according to the first embodiment.
図2は、実施形態1の追跡装置のハードウェア構成の一例を示すブロック図である。FIG. 2 is a block diagram illustrating an example of a hardware configuration of the tracking device according to the first embodiment.
図3は、実施形態1の追跡方法およびプログラムを示すフローチャートである。FIG. 3 is a flowchart illustrating the tracking method and program according to the first embodiment.
図4は、実施形態1の追跡装置の他の例を示すブロック図である。FIG. 4 is a block diagram illustrating another example of the tracking device according to the first embodiment.
図5は、実施形態1の追跡装置の他の例を示すブロック図である。FIG. 5 is a block diagram illustrating another example of the tracking device according to the first embodiment.
図6は、実施形態2の追跡装置の他の例を示すブロック図である。FIG. 6 is a block diagram illustrating another example of the tracking device according to the second embodiment.
図7は、実施形態2の追跡方法およびプログラムを示すフローチャートである。FIG. 7 is a flowchart illustrating the tracking method and program according to the second embodiment.
図8は、実施形態2の追跡方法およびプログラムを示すフローチャートである。FIG. 8 is a flowchart illustrating the tracking method and program according to the second embodiment.
図9は、追跡装置により取得される画像を示す図である。FIG. 9 is a diagram illustrating an image acquired by the tracking device.
図10は、実施形態2の追跡方法において、一部の処理のタイムラインを示す図である。FIG. 10 is a diagram illustrating a timeline of a part of the processing in the tracking method according to the second embodiment.
図11は、実施形態3の計数装置を示すブロック図である。FIG. 11 is a block diagram illustrating a counting device according to the third embodiment.
図12は、実施形態3の計数方法およびプログラムを示すフローチャートである。FIG. 12 is a flowchart illustrating a counting method and program according to the third embodiment.
図13は、実施形態4の選別装置を示すブロック図である。FIG. 13 is a block diagram illustrating a sorting device according to the fourth embodiment.
図14は、実施形態4の選別方法およびプログラムを示すフローチャートである。FIG. 14 is a flowchart illustrating a sorting method and program according to the fourth embodiment.
図15は、実施形態5の追跡システムを示すブロック図である。FIG. 15 is a block diagram illustrating a tracking system according to the fifth embodiment.
図16は、実施形態5の選別システムを示すブロック図である。FIG. 16 is a block diagram illustrating a sorting system according to the fifth embodiment.

 本発明の実施形態について説明する。なお、本発明は、下記の実施形態によって何ら限定および制限されない。なお、以下の図1から図16において、同一部分には、同一符号を付している。また、各実施形態および変形例の説明は、特に言及がない限り、互いの説明を援用できる。さらに、各実施形態および変形例の構成は、特に言及がない限り、互いに組合せ可能である。 Embodiments of the present invention will be described. Note that the present invention is in no way limited or restricted by the following embodiments. In FIGS. 1 to 16 below, the same reference numerals are given to the same parts. Unless otherwise noted, the description of each embodiment and modification may be applied to the others. Furthermore, the configurations of the embodiments and modifications may be combined with one another unless otherwise specified.

[実施形態1]
 実施形態1は、本発明の追跡装置および追跡方法に関する。
[Embodiment 1]
Embodiment 1 relates to a tracking device and a tracking method of the present invention.

 図1に、本実施形態における追跡装置のブロック図を示す。図1に示すように、本実施形態の追跡装置10は、画像取得手段111、検出手段112、および追跡手段113を含む。図1に示すように、画像取得手段111、検出手段112、および追跡手段113は、ハードウェアであるデータ処理手段（データ処理装置）11に組み込まれてもよく、ソフトウェアまたは前記ソフトウェアが組み込まれたハードウェアでもよい。データ処理手段11は、CPU等を備えてもよい。また、データ処理手段11は、例えば、後述のROM、RAM等を備えてもよい。 FIG. 1 shows a block diagram of the tracking device in the present embodiment. As shown in FIG. 1, the tracking device 10 of this embodiment includes an image acquisition unit 111, a detection unit 112, and a tracking unit 113. As shown in FIG. 1, the image acquisition unit 111, the detection unit 112, and the tracking unit 113 may be incorporated in the data processing unit (data processing apparatus) 11, which is hardware, or they may be software, or hardware in which such software is incorporated. The data processing unit 11 may include a CPU or the like. The data processing unit 11 may also include, for example, a ROM, a RAM, and the like, which will be described later.

 図2に、追跡装置10のハードウェア構成のブロック図を例示する。追跡装置10は、例えば、CPU(中央処理装置)201、メモリ202、バス203、記憶装置204、入力装置206、ディスプレイ207、通信デバイス208等を有する。追跡装置10の各部は、それぞれのインタフェース(I/F)により、バス203を介して接続されている。追跡装置10のハードウェア構成は、例えば、後述の計数装置および選別装置のハードウェア構成としても採用できる。 FIG. 2 illustrates a block diagram of the hardware configuration of the tracking device 10. The tracking device 10 includes, for example, a CPU (Central Processing Unit) 201, a memory 202, a bus 203, a storage device 204, an input device 206, a display 207, a communication device 208, and the like. Each part of the tracking device 10 is connected via a bus 203 by a respective interface (I / F). The hardware configuration of the tracking device 10 can be employed as a hardware configuration of a counting device and a sorting device, which will be described later, for example.

 CPU201は、例えば、コントローラ（システムコントローラ、I/Oコントローラ等）等により、他の構成と連携動作し、追跡装置10の全体の制御を担う。追跡装置10において、CPU201により、例えば、本発明のプログラム205やその他のプログラムが実行され、また、各種情報の読み込みや書き込みが行われる。具体的には、例えば、CPU201が、画像取得手段111、検出手段112、および追跡手段113として機能する。追跡装置10は、演算装置として、CPUを備えるが、GPU（Graphics Processing Unit）、APU（Accelerated Processing Unit）等の他の演算装置を備えてもよいし、CPUとこれらとの組合せを備えてもよい。なお、CPU201は、例えば、後述する実施形態2～4および変形例1～5における記憶手段以外の各手段として機能する。 The CPU 201 operates in cooperation with the other components via, for example, a controller (system controller, I/O controller, etc.), and is responsible for the overall control of the tracking device 10. In the tracking device 10, the CPU 201 executes, for example, the program 205 of the present invention and other programs, and reads and writes various kinds of information. Specifically, for example, the CPU 201 functions as the image acquisition unit 111, the detection unit 112, and the tracking unit 113. Although the tracking device 10 includes a CPU as its arithmetic device, it may include another arithmetic device such as a GPU (Graphics Processing Unit) or an APU (Accelerated Processing Unit), or a combination of the CPU and these. Note that the CPU 201 also functions as each unit other than the storage units in Embodiments 2 to 4 and Modifications 1 to 5 described later, for example.

 メモリ202は、例えば、メインメモリを含む。前記メインメモリは、主記憶装置ともいう。CPU201が処理を行う際には、例えば、後述する記憶装置204(補助記憶装置)に記憶されている本発明のプログラム205等の種々の動作プログラムを、メモリ202が読み込む。そして、CPU201は、メモリ202からデータを読み出し、解読し、前記プログラムを実行する。前記メインメモリは、例えば、RAM(ランダムアクセスメモリ)である。メモリ202は、例えば、さらに、ROM(読み出し専用メモリ)を含む。 The memory 202 includes, for example, a main memory. The main memory is also called a main storage device. When the CPU 201 performs processing, the memory 202 reads various operation programs such as the program 205 of the present invention stored in a storage device 204 (auxiliary storage device) described later. The CPU 201 reads data from the memory 202, decodes it, and executes the program. The main memory is, for example, a RAM (Random Access Memory). The memory 202 further includes a ROM (read only memory), for example.

 バス203は、例えば、外部機器とも接続できる。前記外部機器は、例えば、外部記憶装置（外部データベース等）、プリンター等があげられる。追跡装置10は、例えば、バス203に接続された通信デバイス208により、通信回線網に接続でき、通信回線網を介して、前記外部機器と接続することもできる。また、追跡装置10は、通信デバイス208および通信回線網を介して、端末等にも接続できる。 The bus 203 can also be connected to external devices, for example. Examples of the external devices include an external storage device (such as an external database) and a printer. The tracking device 10 can be connected to a communication network by, for example, the communication device 208 connected to the bus 203, and can also be connected to the external devices via the communication network. The tracking device 10 can also be connected to a terminal or the like via the communication device 208 and the communication network.

 記憶装置204は、例えば、前記メインメモリ（主記憶装置）に対して、いわゆる補助記憶装置ともいう。前述のように、記憶装置204には、本発明のプログラム205を含む動作プログラムが格納されている。記憶装置204は、例えば、記憶媒体と、前記記憶媒体に読み書きするドライブとを含む。前記記憶媒体は、特に制限されず、例えば、内蔵型でも外付け型でもよく、HD（ハードディスク）、FD（フロッピー（登録商標）ディスク）、CD-ROM、CD-R、CD-RW、MO、DVD、フラッシュメモリー、メモリーカード等があげられ、前記ドライブは、特に制限されない。記憶装置204は、例えば、前記記憶媒体と前記ドライブとが一体化されたハードディスクドライブ（HDD）であってもよい。 The storage device 204 is, for example, a so-called auxiliary storage device relative to the main memory (main storage device). As described above, the storage device 204 stores operation programs including the program 205 of the present invention. The storage device 204 includes, for example, a storage medium and a drive that reads from and writes to the storage medium. The storage medium is not particularly limited; it may be, for example, a built-in type or an external type, and examples include an HD (hard disk), an FD (floppy (registered trademark) disk), a CD-ROM, a CD-R, a CD-RW, an MO, a DVD, a flash memory, and a memory card. The drive is not particularly limited. The storage device 204 may be, for example, a hard disk drive (HDD) in which the storage medium and the drive are integrated.

 追跡装置10は、例えば、さらに、入力装置206、ディスプレイ207を有する。入力装置206は、例えば、タッチパネル、トラックパッド、マウス等のポインティングデバイス；キーボード；カメラ、スキャナ等の撮像手段；ICカードリーダ、磁気カードリーダ等のカードリーダ；マイク等の音声入力手段；等があげられる。ディスプレイ207は、例えば、LEDディスプレイ、液晶ディスプレイ等の表示装置があげられる。本実施形態1において、入力装置206とディスプレイ207とは、別個に構成されているが、入力装置206とディスプレイ207とは、タッチパネルディスプレイのように、一体として構成されてもよい。 The tracking device 10 further includes, for example, an input device 206 and a display 207. Examples of the input device 206 include pointing devices such as a touch panel, a track pad, and a mouse; a keyboard; imaging means such as a camera and a scanner; card readers such as an IC card reader and a magnetic card reader; and voice input means such as a microphone. Examples of the display 207 include display devices such as an LED display and a liquid crystal display. In the first embodiment, the input device 206 and the display 207 are configured separately, but they may be configured as a single unit, such as a touch panel display.

 画像取得手段111は、検出手段112および追跡手段113と電気的に接続されている。画像取得手段111は、搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する。画像取得手段111は、例えば、CPUがあげられる。画像取得手段111がCPUの場合、画像取得手段111は、例えば、装置内または装置外の撮像手段（撮像装置）、装置内または装置外の記憶手段等から前記n枚の画像を経時的に取得する。画像取得手段111は、例えば、n枚の画像について、撮像された順序に従って取得する。画像取得手段111は、例えば、前記n枚の画像を経時的に撮像してもよい。この場合、画像取得手段111は、例えば、前記画像を撮像する撮像手段（撮像装置）があげられる。前記撮像手段は、例えば、スチルカメラ、ビデオカメラ、カメラ付の携帯電話、カメラ付のスマートフォン、カメラ付のタブレット端末等のカメラ付の携帯端末、ウェブカメラ付きのコンピュータ、カメラ付きのヘッドマウントディスプレイ等があげられる。前記記憶手段は、例えば、ランダムアクセスメモリ（RAM）、読み出し専用メモリ（ROM）、フラッシュメモリー、ハードディスク（HD）、光ディスク、フロッピー（登録商標）ディスク（FD）等があげられる。前記記憶手段は、装置内蔵型でもよいし、外部記憶装置のような外付け型でもよい。 The image acquisition unit 111 is electrically connected to the detection unit 112 and the tracking unit 113. The image acquisition unit 111 acquires n images over time of a transported object being transported by the transport device. The image acquisition unit 111 is, for example, a CPU. When the image acquisition unit 111 is a CPU, it acquires the n images over time from, for example, an imaging unit (imaging device) inside or outside the device, or a storage unit inside or outside the device. The image acquisition unit 111 acquires the n images, for example, in the order in which they were captured. The image acquisition unit 111 may itself, for example, capture the n images over time; in this case, the image acquisition unit 111 is, for example, an imaging unit (imaging device) that captures the images. Examples of the imaging unit include a still camera, a video camera, camera-equipped mobile terminals such as a camera-equipped mobile phone, a camera-equipped smartphone, and a camera-equipped tablet terminal, a computer with a web camera, and a head-mounted display with a camera. Examples of the storage unit include a random access memory (RAM), a read-only memory (ROM), a flash memory, a hard disk (HD), an optical disk, and a floppy (registered trademark) disk (FD). The storage unit may be built into the device or may be an external type such as an external storage device.

 前記画像は、例えば、前記搬送装置が搬送物を搬送している際に、前記撮像手段で撮像することにより取得できる。このため、前記画像は、例えば、前記搬送物を含む画像および前記搬送物を含まない画像の一方を含んでもよいし、両方を含んでもよい。前記画像は、例えば、前記搬送装置の全体または一部を含む画像である。前記画像は、例えば、前記搬送物が搬送される搬送装置の一定箇所が撮像された画像でもよいし、異なる箇所が撮像された画像でもよい。すなわち、前記画像は、例えば、前記搬送物が搬送されるルート(搬送ルート)の一定箇所が撮像された画像でもよいし、異なる箇所が撮像された画像でもよい。前記画像が異なる箇所が撮像された画像である場合、各画像は、例えば、撮像された領域が部分的に重複するように撮像される。 The image can be acquired, for example, by capturing an image with the imaging unit when the transport device is transporting a transported object. For this reason, the image may include, for example, one of an image including the transported object and an image not including the transported object, or may include both. The image is, for example, an image including all or part of the transport device. The image may be, for example, an image in which a certain portion of a transport device that transports the transported object is captured, or may be an image in which different locations are captured. That is, the image may be, for example, an image obtained by imaging a certain part of a route (conveyance route) along which the conveyed product is conveyed, or may be an image obtained by imaging a different part. When the image is an image in which a different part is captured, each image is captured such that the captured regions partially overlap, for example.

 前記搬送装置は、例えば、接触型または非接触型の搬送装置があげられる。前記搬送装置は、例えば、ベルトコンベヤ等のコンベヤ、シューター、台車等があげられる。 The transport device is, for example, a contact-type or non-contact-type transport device. Examples of the transport device include a conveyor such as a belt conveyor, a chute, and a cart.

 前記搬送物は、例えば、前記搬送装置により搬送される物である。前記搬送物は、特に制限されず、任意の物とできる。具体例として、前記搬送物は、例えば、原材料、仕掛品、半製品、製品、商品、貯蔵品等があげられる。本発明において、追跡される搬送物は、前記搬送物の全部でもよいし、一部でもよい。前記搬送物の一部を追跡する場合、前記追跡される搬送物は、例えば、対象物または追跡対象物ということもできる。前記搬送物が対象物を含む場合、前記対象物は、1種類でもよいし、複数種類でもよい。前記対象物は、例えば、良品、不良品等があげられる。 The transported object is, for example, an object transported by the transport device. The transported object is not particularly limited and can be any object. Specific examples of the transported material include raw materials, work-in-process, semi-finished products, products, merchandise, and stored items. In the present invention, the tracked transported object may be all or a part of the transported object. When tracking a part of the transported object, the tracked transported object may be, for example, an object or a tracking object. When the transported object includes an object, the object may be one type or a plurality of types. Examples of the object include non-defective products and defective products.

 前記画像を取得する頻度は、特に制限されず、その下限は、例えば、3FPS（Frames Per Second）であり、好ましくは、12FPSであり、より好ましくは、20FPSであり、その上限は、特に制限されない。前記頻度の範囲は、例えば、10～100FPS、10～20FPS、60～100FPSである。 The frequency at which the images are acquired is not particularly limited. Its lower limit is, for example, 3 FPS (frames per second), preferably 12 FPS, and more preferably 20 FPS, and its upper limit is not particularly limited. The frequency is, for example, in a range of 10 to 100 FPS, 10 to 20 FPS, or 60 to 100 FPS.

 前記「n」は、2以上の任意の正の整数であり、その上限は、特に制限されない。前記「n枚」は、例えば、2枚（フレーム）以上であり、好ましくは、3枚以上であり、より好ましくは、5枚以上である。 The “n” is an arbitrary integer of 2 or more, and its upper limit is not particularly limited. The “n images” are, for example, 2 or more images (frames), preferably 3 or more, and more preferably 5 or more.

 検出手段112は、画像取得手段111および追跡手段113と電気的に接続されている。検出手段112は、例えば、CPUがあげられる。検出手段112は、画像取得手段111により取得されたn枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する。検出手段112は、例えば、画像取得手段111により取得された順序に従って、前記k枚の検出対象画像について、前記搬送物を検出する。 The detection means 112 is electrically connected to the image acquisition means 111 and the tracking means 113. An example of the detection unit 112 is a CPU. The detection unit 112 detects the transported object for k detection target images selected from the n images acquired by the image acquisition unit 111. For example, the detection unit 112 detects the transported object for the k detection target images in the order acquired by the image acquisition unit 111.

 前記検出対象画像は、例えば、前記搬送物を検出するための画像である。前記検出対象画像は、前記n枚の画像から選択することにより、取得することができる。前記検出対象画像の選択方法は、後述する。前記検出対象画像は、例えば、後述の追跡手段113による追跡に用いてもよい。 The detection target image is an image for detecting the transported object, for example. The detection target image can be acquired by selecting from the n images. The method for selecting the detection target image will be described later. For example, the detection target image may be used for tracking by the tracking unit 113 described later.

 前記「k」は、nより小さければよく、より具体的には、nより小さい任意の正の整数である。すなわち、kは、1≦k<nを満たす任意の整数である。前記「k枚」は、kが前述の数値範囲を満たせばよく、例えば、前記検出対象画像の選択方法に応じて適宜変更される。 The “k” only needs to be smaller than n; more specifically, it is an arbitrary positive integer smaller than n. That is, k is an arbitrary integer satisfying 1 ≦ k < n. The number of “k images” only needs to satisfy this numerical range, and is changed as appropriate according to, for example, the method of selecting the detection target images.
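As a purely illustrative aid (not part of the disclosed embodiments), the relationship 1 ≦ k < n can be realized by, for example, selecting every m-th acquired image as a detection target image. The sketch below shows one such selection rule; the function name and the `interval` parameter are assumptions introduced for illustration only.

```python
def select_detection_targets(n, interval):
    """Return 0-based indices of the k detection target images,
    taking every `interval`-th image out of the n acquired images.
    The remaining n - k images are handled by the cheaper tracking step."""
    if n < 2 or interval < 2:
        raise ValueError("need n >= 2 and interval >= 2 so that 1 <= k < n")
    return list(range(0, n, interval))

# Example: n = 10 acquired images, detection on every 3rd image.
targets = select_detection_targets(10, 3)
k = len(targets)                      # k = 4 detection target images
assert targets == [0, 3, 6, 9] and 1 <= k < 10
```

With `interval >= 2`, the number of selected images is the ceiling of n / interval, which is always strictly smaller than n, so the condition k < n holds by construction.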

 追跡手段113は、画像取得手段111および検出手段112と電気的に接続されている。追跡手段113は、例えば、CPUがあげられる。追跡手段113は、画像取得手段111により取得されたn枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物（以下、「検出された搬送物」ともいう）を追跡する。追跡手段113は、例えば、画像取得手段111により取得された順序に従って、前記搬送物を追跡する。 The tracking unit 113 is electrically connected to the image acquisition unit 111 and the detection unit 112. The tracking unit 113 is, for example, a CPU. The tracking unit 113 tracks, in images acquired after each detection target image among the n images acquired by the image acquisition unit 111, the transported object detected in that detection target image (hereinafter also referred to as the “detected transported object”). The tracking unit 113, for example, tracks the transported object in the order in which the images were acquired by the image acquisition unit 111.

 つぎに、図3に、本実施形態における追跡方法のフローチャートを示す。本実施形態の追跡方法は、例えば、図1の追跡装置10を用いて、つぎのように実施する。図3に示すように、本実施形態の追跡方法は、S1ステップ（画像取得）、S2ステップ（検出）、およびS3ステップ（追跡）を含む。本実施形態において、S2ステップおよびS3ステップは、並列に実施されてもよいし、直列に実施されてもよい。 Next, FIG. 3 shows a flowchart of the tracking method in the present embodiment. The tracking method of this embodiment is performed, for example, using the tracking device 10 of FIG. 1, as follows. As shown in FIG. 3, the tracking method of the present embodiment includes step S1 (image acquisition), step S2 (detection), and step S3 (tracking). In this embodiment, steps S2 and S3 may be performed in parallel or in series.
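For illustration only, the overall flow of steps S1 to S3 can be sketched as below. The `detect` and `track` callbacks stand in for the detection unit 112 and the tracking unit 113; all names are hypothetical, and the association of objects re-detected in later detection target images is omitted for brevity.

```python
def run_tracking(frames, detect, track, interval=3):
    """S1: `frames` are the n images acquired over time.
    S2: the expensive `detect` runs only on every `interval`-th frame.
    S3: known positions are propagated into every frame by `track`."""
    tracks = {}        # object id -> [(frame index, position), ...]
    last_pos = {}      # object id -> most recent position
    next_id = 0
    for i, frame in enumerate(frames):
        # S3: propagate every already-detected object into this frame
        for oid in list(last_pos):
            last_pos[oid] = track(frame, last_pos[oid])
            tracks[oid].append((i, last_pos[oid]))
        # S2: detection only on the k detection target frames
        if i % interval == 0:
            for pos in detect(frame):
                tracks[next_id] = [(i, pos)]
                last_pos[next_id] = pos
                next_id += 1
    return tracks

# Toy example: one object appears at position 10.0 in the first frame
# and moves 2.0 per frame; only frames 0 and 3 run detection.
trajectory = run_tracking(
    frames=list(range(6)),
    detect=lambda f: [10.0] if f == 0 else [],
    track=lambda f, p: p + 2.0,
    interval=3,
)
```

In the toy run, the single object is detected once and then followed through all six frames, yielding the trajectory 10.0, 12.0, 14.0, 16.0, 18.0, 20.0, even though detection ran on only two of the six frames.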

 まず、S1ステップでは、画像取得手段111により、前記搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する。前記撮像手段を用いて前記画像を取得する場合、前記撮像手段により搬送ルートを撮像し、画像取得手段111は、撮像された画像を前記画像として取得する。画像取得手段111が撮像手段である場合、画像取得手段111は、前記搬送ルートを撮像し、前記画像として取得する。前記データ記憶手段を用いて前記画像を取得する場合、画像取得手段111は、前記データ記憶手段(図示せず)に記憶された前記画像を読み出し、取得する。 First, in step S1, n images are acquired by the image acquisition unit 111 over time with respect to the transported object transported by the transport device. When acquiring the image using the imaging unit, the imaging unit captures a transport route, and the image acquisition unit 111 acquires the captured image as the image. When the image acquisition unit 111 is an imaging unit, the image acquisition unit 111 captures the transport route and acquires it as the image. When acquiring the image using the data storage unit, the image acquisition unit 111 reads and acquires the image stored in the data storage unit (not shown).

 つぎに、S2ステップでは、検出手段112により、前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する。すなわち、S2ステップでは、前記n枚の画像のうち、一部の画像について、前記搬送物の検出を行なう。具体的には、まず、S2ステップでは、前記検出対象画像を選択する。前記検出対象画像は、例えば、検出手段112により、選択されてもよいし、他の手段により選択されてもよい。前記検出対象画像の選択方法は、特に制限されず、例えば、n枚の画像において、所定枚数毎の画像を前記検出対象画像として選択してもよいし、所定時間毎の画像を前記検出対象画像として選択してもよい。前記所定枚数および前記所定時間は、例えば、検出手段112の前記検出対象画像の検出時間に応じて、適宜設定できる。つぎに、S2ステップでは、検出手段112により、例えば、前記検出対象画像における各搬送物の座標（例えば、中心座標）、大きさ（例えば、幅および長さ、または縦横サイズ（占める領域））等の搬送物の位置情報を検出する。そして、S2ステップでは、例えば、得られた搬送物の位置情報、確信度等の検出結果（検出情報）が、前記搬送物毎に対応する検出対象画像と関連付けられて出力される。前記検出結果は、例えば、各搬送物の検出回数を含んでもよい。前記確信度は、例えば、検出手段112により検出された搬送物が搬送物であることの確からしさ（可能性）である。また、S2ステップでは、検出手段112により、前記搬送物における対象物を検出してもよい。この場合、検出手段112は、例えば、前記搬送物において、前記対象物以外の搬送物を検出することにより、前記対象物を検出してもよい。S2ステップにおいて、検出手段112は、例えば、色に基づく識別、輪郭の抽出、テンプレートマッチング、前記搬送物または対象物を検出可能な学習モデル等を用いて、前記搬送物または対象物を検出できる。前記学習モデルは、例えば、前記搬送物または対象物について、機械学習または深層学習を実施することにより作製できる。また、前記対象物が2種類以上の場合、検出手段112は、例えば、いずれの対象物に該当するか分類してもよい。この場合、検出手段112は、例えば、検出された対象物の分類クラスを検出結果として出力してもよい。 Next, in step S2, the detection unit 112 detects the transported object in the k detection target images selected from the n images. That is, in step S2, detection of the transported object is performed on only some of the n images. Specifically, the detection target images are first selected. The detection target images may be selected by, for example, the detection unit 112, or by another unit. The selection method is not particularly limited; for example, every predetermined number of images among the n images may be selected as the detection target images, or images may be selected at predetermined time intervals. The predetermined number and the predetermined time can be set as appropriate according to, for example, the time the detection unit 112 requires to process a detection target image. Next, in step S2, the detection unit 112 detects position information of each transported object in the detection target image, for example, the coordinates (for example, center coordinates) and the size (for example, width and length, or vertical and horizontal size (occupied region)) of the transported object. Then, in step S2, detection results (detection information) such as the obtained position information and a certainty factor are output in association with the corresponding detection target image for each transported object. The detection results may include, for example, the number of times each transported object has been detected. The certainty factor is, for example, the likelihood (probability) that what the detection unit 112 has detected is actually a transported object. In step S2, the detection unit 112 may also detect a target object among the transported objects. In this case, the detection unit 112 may, for example, detect the target object by detecting transported objects other than the target object. In step S2, the detection unit 112 can detect the transported object or the target object using, for example, color-based identification, contour extraction, template matching, a learning model capable of detecting the transported object or the target object, or the like. The learning model can be created, for example, by performing machine learning or deep learning on the transported object or the target object. When there are two or more types of target objects, the detection unit 112 may, for example, classify which target object a detection corresponds to; in this case, the detection unit 112 may output the classification class of the detected target object as a detection result.
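As a purely illustrative stand-in for the detection techniques listed above (color-based identification, template matching, or a learned model), the sketch below detects a single bright region in a grayscale image given as a list of rows and returns the kind of detection information described in step S2: center coordinates, size, and a crude certainty factor. The function name, the thresholding rule, and the certainty definition are all assumptions for illustration only.

```python
def detect_bright_object(image, threshold=128):
    """Toy brightness-threshold detector. Returns center coordinates,
    size (width, height), and a crude certainty factor (the fraction
    of the bounding box that is above threshold), or None if nothing
    in the image exceeds the threshold."""
    hits = [(x, y)
            for y, row in enumerate(image)
            for x, value in enumerate(row)
            if value >= threshold]
    if not hits:
        return None
    xs = [x for x, _ in hits]
    ys = [y for _, y in hits]
    width = max(xs) - min(xs) + 1
    height = max(ys) - min(ys) + 1
    return {
        "center": ((min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2),
        "size": (width, height),
        "certainty": len(hits) / (width * height),
    }
```

For example, a 4x5 image containing a solid 2x2 block of value 255 yields a center of (2.5, 1.5), a size of (2, 2), and a certainty of 1.0, the sort of per-object record that step S2 associates with its detection target image.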

 S3ステップでは、追跡手段113により、前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡する。すなわち、S3ステップでは、追跡手段113が、前記k枚の検出対象画像のうち、p枚目の検出対象画像で搬送物が検出された場合、前記n枚の画像のうち、前記p枚目の検出対象画像より後に取得（撮像）された画像において、前記p枚目の検出対象画像で検出された搬送物を追跡する。追跡手段113は、例えば、前記p枚目の検出対象画像より後に取得された画像の全部または一部において、前記p枚目の検出対象画像で検出された搬送物を追跡する。前記「p」は、k以下の正の整数である。すなわち、pは、1≦p≦kを満たす整数である。前記搬送物の追跡は、例えば、LucasKanade法、Horn-Schunk法等のオプティカルフロー推定により、実施できる。具体的には、前記搬送物の追跡は、例えば、前記搬送物を追跡する画像と、前記追跡する画像の1つ前に取得された画像と、前記1つ前に取得された画像において検出された搬送物の座標（例えば、中心座標）等の位置情報とに基づき、オプティカルフロー推定により、前記追跡に用いる画像における前記検出された搬送物の座標（例えば、中心座標）等の位置情報を算出することで実施できる。この場合、追跡手段113は、例えば、前記追跡に用いる画像において検出された搬送物の位置情報を追跡結果（追跡情報）として出力してもよい。また、検出手段112が前記検出結果を出力している場合、S3ステップでは、追跡手段113は、前記対応する搬送物について、前記検出結果および前記追跡結果を関連付けてもよい。 In step S3, the tracking unit 113 tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image. That is, in step S3, when a transported object is detected in the p-th detection target image among the k detection target images, the tracking unit 113 tracks the transported object detected in the p-th detection target image in the images acquired (captured) after the p-th detection target image among the n images. The tracking unit 113 tracks the detected transported object in, for example, all or some of the images acquired after the p-th detection target image. The “p” is a positive integer of k or less; that is, p is an integer satisfying 1 ≦ p ≦ k. The tracking of the transported object can be performed, for example, by optical flow estimation such as the Lucas-Kanade method or the Horn-Schunck method. Specifically, the tracking can be performed by calculating, by optical flow estimation, position information such as the coordinates (for example, center coordinates) of the detected transported object in the image used for tracking, based on the image in which the transported object is being tracked, the image acquired immediately before that image, and position information such as the coordinates (for example, center coordinates) of the transported object detected in that immediately preceding image. In this case, the tracking unit 113 may output the position information of the transported object in the image used for tracking as a tracking result (tracking information). When the detection unit 112 outputs the detection result, in step S3 the tracking unit 113 may associate the detection result and the tracking result for the corresponding transported object.
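As an illustration of the idea behind step S3 only (not an implementation of the Lucas-Kanade or Horn-Schunck methods themselves), the sketch below estimates the displacement of a detected object between two consecutive 1-D signals by exhaustive block matching, which is conceptually the simplest relative of optical flow estimation: the previous image, the current image, and the previously known position together determine the new position. Function and parameter names are hypothetical, and the object is assumed to lie at least `patch` samples from the signal edge.

```python
def track_position(prev_frame, cur_frame, prev_x, patch=2, search=3):
    """Match the patch around the object's previous position against
    the current frame within a +/- `search` window; the displacement
    with the smallest sum of squared differences gives the object's
    new position in the current frame."""
    lo, hi = prev_x - patch, prev_x + patch + 1
    template = prev_frame[lo:hi]
    best_d, best_cost = 0, float("inf")
    for d in range(-search, search + 1):
        if lo + d < 0 or hi + d > len(cur_frame):
            continue  # this displacement would leave the frame
        window = cur_frame[lo + d:hi + d]
        cost = sum((a - b) ** 2 for a, b in zip(template, window))
        if cost < best_cost:
            best_cost, best_d = cost, d
    return prev_x + best_d
```

For an object whose intensity bump sits at index 5 in the previous frame and at index 7 in the current frame, the search finds a zero-cost match at displacement +2 and returns 7, the updated position that step S3 would record as the tracking result.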

 以上説明したように、実施形態1の追跡装置および追跡方法によれば、前記n枚の画像から選択されたk枚の検出対象画像において、前記搬送物を検出する。すなわち、実施形態1の追跡装置および追跡方法では、前記n枚の画像の全ての画像ではなく、一部の画像において、前記搬送物を検出する。このため、実施形態1の追跡装置および追跡方法は、全ての画像において、前記搬送物を検出する参考例の装置と比較して、処理時間を短縮できる。 As described above, according to the tracking device and the tracking method of the first embodiment, the transported object is detected in k detection target images selected from the n images. That is, in the tracking device and the tracking method according to the first embodiment, the transported object is detected not in all the images of the n images but in a part of the images. For this reason, the tracking device and the tracking method of the first embodiment can shorten the processing time as compared with the device of the reference example that detects the transported object in all images.
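As a purely hypothetical numerical illustration of this effect (the per-image costs below are assumptions, not measurements from this disclosure): if detection takes 50 ms per image and tracking 5 ms per image, detecting in all n = 100 images costs far more than detecting in only k = 20 images and tracking in the rest.

```python
DETECT_MS, TRACK_MS = 50, 5  # hypothetical per-image processing costs
n, k = 100, 20

reference = n * DETECT_MS                        # detect in every image
proposed = k * DETECT_MS + (n - k) * TRACK_MS    # detect in k, track the rest

assert (reference, proposed) == (5000, 1400)     # 5000 ms vs 1400 ms
```

Under these assumed figures, restricting detection to k of the n images cuts the processing cost to under a third, which is the effect the embodiment aims for whenever tracking is cheaper than detection.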

 In the present embodiment, the tracking device 10 consists of the data processing means 11, but it may include other components.

 As shown in FIG. 4, the tracking device of the present invention may include a data storage means. FIG. 4 is a block diagram showing another example of the tracking device of Embodiment 1. As shown in FIG. 4, the tracking device 20 includes, in addition to the configuration of the tracking device 10, an image storage unit 121, a detection information storage unit 122, and a tracking information storage unit 123. The image storage unit 121 is electrically connected to the image acquisition unit 111, the detection unit 112, and the tracking unit 113; the detection information storage unit 122 is electrically connected to the detection unit 112 and the tracking unit 113; and the tracking information storage unit 123 is electrically connected to the tracking unit 113. The image storage unit 121, the detection information storage unit 122, and the tracking information storage unit 123 may be incorporated, for example, in the data storage means 12, which is hardware, as shown in FIG. 4. Examples of the data storage means 12 include the storage means described above; specific examples include a ROM and a RAM. The tracking device 20, for example, stores the n images acquired by the image acquisition unit 111 in the image storage unit 121 and outputs the stored images to the detection unit 112 and the tracking unit 113. It also stores the detection result of the detection unit 112 and the detection target images in the detection information storage unit 122 and outputs the stored detection result to the tracking unit 113, and it stores the tracking result of the tracking unit 113 in the tracking information storage unit 123. Except for these points, the tracking device 20 has the same configuration as the tracking device 10, and the description of the latter can be applied to it.

 Next, as shown in FIG. 5, the tracking device of the present invention may include at least one of an input means and an output means. FIG. 5 is a block diagram showing another example of the tracking device of Embodiment 1. As shown in FIG. 5, the tracking device 30 includes, in addition to the configuration of the tracking device 10, an input means 13 and an output means 14. The input means 13 is electrically connected to the image acquisition unit 111, and the output means 14 is electrically connected to the image acquisition unit 111, the detection unit 112, and the tracking unit 113. The input means 13 is used to input information such as instructions to start or stop image acquisition. As the input means 13, for example, the usual input means of a portable terminal, such as a monitor (for example, a touch panel) and operation keys; the usual input means of a computer, such as a keyboard and a mouse; an input file; another computer; and the like can be used. The output means 14 outputs, for example, the n images acquired by the image acquisition unit 111, the detection result of the detection unit 112, the k detection target images, and the tracking result of the tracking unit 113. Examples of the output means 14 include a display means such as a monitor that outputs video (for example, various image display devices such as a liquid crystal display (LCD) and a cathode ray tube (CRT) display), a printer that outputs by printing, and a speaker that outputs by sound. When the output means 14 is such a display means, the output means 14 may display the n images, the detection result, the detection target images, and the tracking result on the display means. The input means 13 and the output means 14 may be electrically connected to the data processing means 11 via, for example, an I/O interface. The tracking device of the present invention may further include, for example, a video codec and controllers (a system controller, an I/O controller, and the like). Except for these points, the tracking device 30 has the same configuration as the tracking device 10, and the description of the latter can be applied to it.

[Modification 1]
 Modification 1 relates to the tracking device and tracking method of the present invention. For the tracking device and tracking method of Modification 1, the description of the tracking device and tracking method of Embodiment 1 can be applied, for example.

 In addition to the configuration of the tracking device of Embodiment 1, the tracking device of Modification 1 includes a detection processing determination means that determines whether, after the image acquisition means has acquired the m-th of the n images, the detection means is performing detection on the l-th of the k detection target images. In this case, in the tracking device of Modification 1, when the detection processing determination means determines that detection by the detection means is not being performed, the detection means detects the transported objects using the m-th image as the (l+1)-th detection target image, for example.

 Here, "m" is a positive integer not greater than n; that is, m is an integer satisfying 1 ≤ m ≤ n. Likewise, "l" is a positive integer not greater than k−1; that is, l is an integer satisfying 1 ≤ l ≤ k−1.

 The detection processing determination means is electrically connected to, for example, the image acquisition means and the detection means. An example of the detection processing determination means is a CPU.

 In addition to the steps of the tracking method of Embodiment 1, the tracking method of Modification 1 includes a detection processing determination step of determining whether, after the m-th of the n images has been acquired in the image acquisition step, detection on the l-th of the k detection target images is being performed in the detection step. In this case, in the tracking method of Modification 1, when it is determined in the detection processing determination step that detection in the detection step is not being performed, the transported objects are detected in the detection step using the m-th image as the (l+1)-th detection target image, for example.

 The tracking method of Modification 1 can be performed, for example, using the tracking device of Modification 1. Specifically, first, the m-th image is acquired by the image acquisition means (image acquisition step). Next, the detection processing determination means determines whether the detection means is performing detection on the l-th detection target image (detection processing determination step). Specifically, the detection processing determination means determines, for example, whether processing by the detection means is running by checking the operating status of the CPU that performs the detection processing. When the detection processing determination means determines that detection by the detection means is not being performed, the detection means selects the m-th image as the (l+1)-th detection target image and detects the transported objects (detection step). On the other hand, when the detection processing determination means determines that detection by the detection means is being performed, detection is not carried out on the m-th image. In this case, the tracking means may track the already detected transported objects in the m-th image. In this way, the tracking device and tracking method of Modification 1 can track the detected transported objects even in images on which the detection processing is not performed, for example, so the tracking accuracy can be further improved.

 When the tracking device of Modification 1 includes the association means described later, the detection processing determination means may, for example, also determine whether the association means is performing the association between the detection result of the l-th detection target image and the tracking result of the image, among the n images, acquired next after the l-th detection target image. In this case, when the detection processing determination means determines that neither the detection by the detection means nor the association by the association means is being performed, the detection means preferably detects the transported objects using the m-th image as the (l+1)-th detection target image. Likewise, when the tracking method of Modification 1 includes the association step described later, the detection processing determination step may, for example, also determine whether the association step is performing the association between the detection result of the l-th detection target image and the tracking result of the image, among the n images, acquired next after the l-th detection target image. In this case, when it is determined in the detection processing determination step that neither the detection in the detection step nor the association in the association step is being performed, the transported objects are preferably detected in the detection step using the m-th image as the (l+1)-th detection target image.

 As described above, according to the tracking device and tracking method of Modification 1, whether the detection processing of the detection means or detection step is being performed is determined, and when it is not being performed, the detection processing can be carried out on the newly acquired m-th image. This prevents a newly acquired image from sitting idle while waiting to be processed for detection. Therefore, according to the tracking device and tracking method of Modification 1, the processing time can be further shortened.
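The scheduling rule of Modification 1 can be sketched as follows, assuming a detector whose processing spans several frame intervals. The tick-counter simulation of detector load and the `DETECT_LATENCY` constant are illustrative assumptions; a real system would instead inspect the operating status of the CPU or worker performing the detection, as the text describes.

```python
# Sketch of the Modification-1 rule: when the m-th image arrives, check whether
# the detector is still busy with the l-th detection target image; only if it
# is idle does the new image become the (l+1)-th detection target image.
# Detector latency is simulated with a frame counter (an assumption).

DETECT_LATENCY = 3  # assumed: one detection takes 3 frame intervals to finish

def schedule(n_frames):
    busy_until = -1           # frame index at which the running detection finishes
    detection_targets = []    # indices of images chosen as detection target images
    for m in range(n_frames):
        detector_idle = m >= busy_until
        if detector_idle:
            detection_targets.append(m)      # m-th image becomes the next target
            busy_until = m + DETECT_LATENCY
        # else: the m-th image is only tracked, never queued for detection,
        # so no backlog of images waiting for the detector can build up.
    return detection_targets
```

With this rule the spacing between detection target images adapts automatically to detector speed, rather than being a fixed stride.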

[Modification 2]
 Modification 2 relates to the tracking device and tracking method of the present invention. For the tracking device and tracking method of Modification 2, the description of the tracking device and tracking method of Embodiment 1 can be applied, for example.

 In addition to the configuration of the tracking device of Embodiment 1, in the tracking device of Modification 2 the detection means acquires position information of the transported objects, and the tracking means tracks the transported objects by calculating their position information in the j-th of the n images based on the j-th image, the images up to and including the (j−1)-th image, and the position information of the transported objects in those earlier images.

 Here, "j" is a positive integer of 2 or more and n or less; that is, j is an integer satisfying 2 ≤ j ≤ n.

 In addition to the steps of the tracking method of Embodiment 1, in the tracking method of Modification 2 the position information of the transported objects is acquired in the detection step, and in the tracking step the transported objects are tracked by calculating their position information in the j-th of the n images based on the j-th image, the images up to and including the (j−1)-th image, and the position information of the transported objects in those earlier images.

 The tracking method of Modification 2 can be performed, for example, by the tracking device of Modification 2. Specifically, first, tracking is performed on the first to (j−1)-th images in the same manner as in Embodiment 1. Here, in Modification 2, when the detection means detects a transported object in a detection target image, the position information of the transported object is acquired (detection step). Next, the j-th image is acquired in the same manner as in Embodiment 1. When the j-th image is not a detection target image, detection by the detection means is not performed. On the other hand, when the j-th image is a detection target image, the j-th image is selected as the i-th detection target image, detection by the detection means is performed, and the position information of the transported objects in the i-th detection target image is acquired (detection step).

 Next, the detected transported objects are tracked in the j-th image (tracking step). Specifically, the tracking means acquires the j-th image, the images up to and including the (j−1)-th image, and the position information of the transported objects in those earlier images. Then, based on these, the tracking means calculates the position information of the detected transported objects in the j-th image, for example by the optical flow estimation described above.

 When the j-th image is the i-th detection target image, in the tracking step the tracking means preferably tracks the positions of the transported objects by calculating their position information in the j-th image based on the j-th image, the detection target images up to and including the (i−1)-th detection target image, and the position information of the transported objects in those detection target images. The detection target images up to the (i−1)-th preferably include the (i−1)-th detection target image itself. In this way, a transported object newly detected in the (i−1)-th detection target image can also be tracked, so omissions in tracking the transported objects can be suppressed and the tracking accuracy can be further improved. The position information of the transported objects in the detection step and the tracking step is preferably the coordinates or center coordinates of the transported objects.

 Here, "i" is an integer of 2 or more and k or less; that is, i is an integer satisfying 2 ≤ i ≤ k.

 As described above, according to the tracking device and tracking method of Modification 2, the position information of the transported objects in the j-th image is calculated based on the j-th image, the images up to and including the (j−1)-th image, and the position information of the transported objects in those earlier images, so the processing required for tracking can be reduced. Therefore, according to the tracking device and tracking method of Modification 2, the processing time can be further shortened.
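The text carries a position from the (j−1)-th image to the j-th by optical flow estimation (Lucas-Kanade, Horn-Schunck). As a self-contained illustration of the same idea only, the following toy estimator finds the displacement of a small patch by exhaustive block matching (sum of squared differences) rather than by a real optical-flow algorithm; images are plain lists of lists, and the position must lie at least `patch + search` pixels inside the image border. Everything here is an assumption for illustration, not the patented method.

```python
def patch_ssd(prev_img, cur_img, x, y, dx, dy, patch):
    # Sum of squared differences between the patch around (x, y) in prev_img
    # and the patch around (x + dx, y + dy) in cur_img.
    total = 0
    for j in range(-patch, patch + 1):
        for i in range(-patch, patch + 1):
            diff = prev_img[y + j][x + i] - cur_img[y + dy + j][x + dx + i]
            total += diff * diff
    return total

def track_position(prev_img, cur_img, pos, patch=1, search=2):
    # Return the position in cur_img of the object whose coordinates in
    # prev_img are `pos`, by exhaustive search over small displacements.
    # Caller must keep pos at least patch + search pixels from the border.
    x, y = pos
    best_cost, best_shift = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cost = patch_ssd(prev_img, cur_img, x, y, dx, dy, patch)
            if best_cost is None or cost < best_cost:
                best_cost, best_shift = cost, (dx, dy)
    return (x + best_shift[0], y + best_shift[1])
```

Only the previous position and the two images are needed to obtain the new position, which mirrors why the per-frame tracking work in Modification 2 stays small.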

[Modification 3]
 Modification 3 relates to the tracking device and tracking method of the present invention. For the tracking device and tracking method of Modification 3, the description of the tracking device and tracking method of Embodiment 1 can be applied, for example.

 In addition to the configuration of the tracking device of Embodiment 1, in the tracking device of Modification 3 the tracking means acquires position information of the transported objects detected in the detection target images, and the device includes an erroneous detection determination means that determines, based on the position information of a transported object obtained by the tracking means, whether that transported object is an erroneously detected object.

 The erroneous detection determination means is electrically connected to, for example, the detection means and the tracking means. An example of the erroneous detection determination means is a CPU.

 The position information of the detected transported objects is preferably, for example, the coordinates or center coordinates of the transported objects. Likewise, the position information of the transported objects obtained by the tracking means is preferably the coordinates or center coordinates of the transported objects.

 In addition to the steps of the tracking method of Embodiment 1, the tracking method of Modification 3 includes an erroneous detection determination step in which the position information of the transported objects detected in the detection target images is acquired in the tracking step, and whether a transported object is an erroneously detected object is determined based on the position information obtained in the tracking step.

 The tracking method of Modification 3 can be performed, for example, by the tracking device of Modification 3. Specifically, first, the detection step and the tracking step are performed in the same manner as in Embodiment 1. Here, in Modification 3, for each transported object detected in a detection target image, its position information in the image used for tracking (for example, the j-th image in Modification 2) is acquired by calculation. Next, the erroneous detection determination means determines, based on the position information of the transported object obtained in the tracking step, whether the transported object is an erroneously detected object (erroneous detection determination step). Specifically, the determination can be performed, for example, based on the amount of change in the position of the detected transported object across a plurality of images. As a specific example, when the amount of change in the position (for example, the coordinates or center coordinates) of the detected transported object over a predetermined number of images is equal to or less than a predetermined value, the detected transported object can be determined to be an erroneously detected object. Alternatively, when the deviation between the average movement vector of the transported objects calculated from each image and the movement vector of a detected transported object between images (for example, between the (j−1)-th image and the j-th image) is equal to or larger than the standard deviation of that average movement vector, the detected transported object may be determined to be an erroneously detected object. Examples of the predetermined value include 10 pixels, 5 pixels, and 1 pixel; by setting such conditions, Modification 3 can detect erroneously detected objects with higher accuracy. The predetermined number of images is, for example, 1 to 10 frames, 1 to 5 frames, or 1 to 3 frames. The predetermined value can be set as appropriate according to, for example, the size of the transport device captured in the images and the length of the transport route. A transported object determined to be an erroneously detected object, and the information associated with it, are preferably deleted from the detection result and the tracking result, for example. The deletion is performed, for example, prior to the detection of transported objects in a new detection target image (the next detection target image). When the deletion is not performed together with the erroneous detection determination means or erroneous detection determination step, the erroneous detection determination means or erroneous detection determination step assigns a deletion flag to the transported object and its associated information in the detection result and the tracking result.

 As described above, according to the tracking device and tracking method of Modification 3, it can be determined whether something detected as a transported object is in fact an erroneously detected object, so unnecessary tracking processing can be reduced. Therefore, according to the tracking device and tracking method of Modification 3, the processing time can be further shortened.
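The position-change criterion of Modification 3 can be sketched as follows. The 1-pixel threshold and 3-frame window are example values taken from the ranges mentioned above, and the list-of-coordinates track representation is an assumption for illustration.

```python
# Sketch of the Modification-3 false-detection test: an object whose tracked
# position changes by no more than THRESHOLD pixels over WINDOW consecutive
# frames is judged to be an erroneously detected object (e.g. a stationary
# mark on the conveyor mistaken for a transported object by the detector).

THRESHOLD = 1.0   # pixels (example value from the text)
WINDOW = 3        # number of frames (example value from the text)

def is_false_detection(track):
    """track: chronological list of (x, y) positions of one detected object."""
    if len(track) < WINDOW:
        return False  # not enough history to judge yet
    recent = track[-WINDOW:]
    x0, y0 = recent[0]
    max_move = max(((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 for x, y in recent)
    return max_move <= THRESHOLD
```

An object flagged this way would then be deleted from (or delete-flagged in) the detection and tracking results before the next detection target image is processed, as described above.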

[Modification 4]
 Modification 4 relates to the tracking device and tracking method of the present invention. For the tracking device and tracking method of Modification 4, the description of the tracking device and tracking method of Embodiment 1 can be applied, for example.

 In addition to the configuration of the tracking device of Embodiment 1, the tracking device of Modification 4 includes an association means that, based on the detection result of the transported objects in the h-th of the k detection target images and the tracking result of the transported objects in the image, among the n images, acquired next after the h-th detection target image, associates those of the detected transported objects and the tracked transported objects that are the same transported object.

 The association means is electrically connected to, for example, the detection means and the tracking means. An example of the association means is a CPU.

 Here, "h" is a positive integer not greater than k; that is, h is an integer satisfying 1 ≤ h ≤ k.

 In addition to the steps of the tracking method of Embodiment 1, the tracking method of Modification 4 includes an association step of associating, based on the detection result of the transported objects in the h-th of the k detection target images and the tracking result of the transported objects in the image, among the n images, acquired next after the h-th detection target image, those of the detected transported objects and the tracked transported objects that are the same transported object.

 The tracking method of Modification 4 can be performed, for example, by the tracking device of Modification 4. Specifically, the association means acquires the detection result of the transported objects in the h-th detection target image. Next, the association means acquires the tracking result of the transported objects in the image, among the n images, acquired next after the h-th detection target image. The association means then determines whether the same transported object exists, for example, by calculating the overlap between the region of each transported object in the detection result and the region of each transported object in the tracking result. In this way, the association means can match the same transported object across images more accurately than when, for example, the determination is based on the distances between the transported objects. When the regions overlap, the transported object in the detection result and the transported object in the tracking result for which the overlap occurs are determined to be the same transported object and are associated as a single transported object. When a plurality of transported objects exist in the detection result and the tracking result, the association means performs the association, for example, as follows. First, the region overlap ratio is calculated for every combination of a transported object in the detection result and a transported object in the tracking result. Where A is the region of a transported object in the tracking result and B is the region of a transported object in the detection result, examples of the overlap ratio include: the proportion of the tracking-result region (A) overlapped by the detection-result object (A∩B/A); the proportion of the detection-result region (B) overlapped by the tracking-result object (A∩B/B); and the proportion of the union of the two regions (A∪B) occupied by their intersection (A∩B), that is, A∩B/(A∪B). Then, using the calculated overlap ratios, the combinations are repeatedly associated as the same transported object in descending order of overlap ratio. When the association in the association step is performed based on the overlap ratio, for example, combinations whose overlap ratio is equal to or less than a predetermined proportion need not be associated. Alternatively, the association means may determine whether the same transported object exists by calculating, for example, the distance between the coordinates (for example, the center coordinates) of each transported object in the detection result and the coordinates (for example, the center coordinates) of each transported object in the tracking result. In this case, the association means repeatedly associates combinations as the same transported object in ascending order of distance, for example.

As described above, according to the tracking device and the tracking method of Modification 4, the same transported object in the detection result and in the tracking result can be associated. Therefore, according to the tracking device and the tracking method of Modification 4, it is no longer necessary to track the same transported object twice as if it were two different transported objects, and the processing time can be further shortened.

[Modification 5]
Modification 5 relates to the tracking device and the tracking method of the present invention. The description of the tracking device and the tracking method of Embodiment 1 can, for example, be applied to the tracking device and the tracking method of Modification 5.

In addition to the components of the tracking device of Embodiment 1, the tracking device of Modification 5 includes a transported-object determination unit that determines whether a transported object detected in a detection target image acquired before the current image exists. When the transported-object determination unit determines that such a detected transported object exists, the tracking unit performs the tracking.

The transported-object determination unit is electrically connected to, for example, the detection unit and the tracking unit. An example of the transported-object determination unit is a CPU.

In addition to the steps of the tracking method of Embodiment 1, the tracking method of Modification 5 includes a transported-object determination step of determining whether a transported object detected in a detection target image acquired before the current image exists. When it is determined in the transported-object determination step that such a detected transported object exists, the tracking is performed in the tracking step.

The tracking method of Modification 5 can be implemented by, for example, the tracking device of Modification 5. Specifically, the transported-object determination unit acquires the detection result of the transported objects detected in the detection target image acquired before the q-th image. The transported-object determination unit then determines whether a detected transported object exists in the acquired detection result. When a detected transported object exists, the tracking step by the tracking unit is performed. When the above-described deletion flag has been assigned to the detection result of a transported object, the transported-object determination unit may, for example, determine that the transported object to which the deletion flag has been assigned has not been detected. This avoids, for example, tracking of erroneously detected objects, so the processing time can be further shortened. The above "q" is a positive integer of n or less; that is, q is an integer satisfying 1 ≦ q ≦ n.
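A minimal sketch of this determination step follows (an illustration, not part of the disclosure; the detection-record layout and the `deleted` flag name are assumptions):

```python
def has_trackable_objects(detections):
    # A detection record is treated as absent when the deletion flag
    # has been assigned to it, so flagged erroneous detections are
    # never handed to the tracking step.
    return any(not d.get("deleted", False) for d in detections)

# The tracking step runs only when a previously detected object remains.
detections = [{"id": 1, "deleted": True}, {"id": 2, "deleted": False}]
if has_trackable_objects(detections):
    pass  # perform the tracking step here
```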

As described above, according to the tracking device and the tracking method of Modification 5, whether a detected transported object exists is determined. Therefore, according to the tracking device and the tracking method of Modification 5, when no detected transported object exists, or when no transported object to be tracked exists, the tracking step need not be performed, so the processing time can be further shortened.

Modifications 1 to 5 may each be used alone; however, since using several of them further shortens the processing time and further improves the tracking accuracy, a plurality of them are preferably used in combination, and more preferably all of them are used in combination. When a plurality of them are combined, the combination is not particularly limited and may be any combination.

[Embodiment 2]
Embodiment 2 relates to the tracking device and the tracking method of the present invention.

FIG. 6 shows a block diagram of the tracking device according to the present embodiment. As shown in FIG. 6, in addition to the components of the tracking device 10 of Embodiment 1, the tracking device 40 of the present embodiment includes a detection-processing determination unit 114, a transported-object determination unit 115, an erroneous-detection determination unit 116, and an association unit 117. As shown in FIG. 6, the image acquisition unit 111, the detection unit 112, the tracking unit 113, the detection-processing determination unit 114, the transported-object determination unit 115, the erroneous-detection determination unit 116, and the association unit 117 may be incorporated in, for example, a data processing unit (data processing apparatus) 11 that is hardware, or may be software, or hardware in which the software is incorporated. The data processing unit 11 may include a CPU or the like. The data processing unit 11 may also include data storage units such as the aforementioned ROM and RAM. In this case, each unit is electrically connected to, for example, the corresponding storage section of the data storage unit.

In the present embodiment, the detection-processing determination unit 114 is electrically connected to the image acquisition unit 111 and the detection unit 112; the transported-object determination unit 115, to the detection unit 112 and the tracking unit 113; the erroneous-detection determination unit 116, to the detection unit 112 and the tracking unit 113; and the association unit 117, to the detection unit 112 and the tracking unit 113. Except for these points, the configuration of the tracking device 40 of Embodiment 2 is the same as that of the tracking device 10 of Embodiment 1, and the description thereof applies here. The descriptions of Modifications 1 to 5 apply to the detection-processing determination unit 114, the transported-object determination unit 115, the erroneous-detection determination unit 116, and the association unit 117.

Next, FIGS. 7 and 8 show flowcharts of the tracking method according to the present embodiment. The tracking method of the present embodiment is implemented, for example, using the tracking device 40 of FIG. 6 as follows. As shown in FIG. 7, the tracking method of the present embodiment includes a main thread composed of step S1 (image acquisition), step S4 (detection-processing determination), step S6 (tracking-data update), step S2' (detection/association), step S7 (transported-object determination), step S3 (tracking), step S8 (tracking-data addition), and step S9 (erroneous-detection determination). As shown in FIG. 8, step S2' includes a sub-thread composed of step S2 (detection), step S10 (association, synchronization), and step S11 (detection-result output).

The tracking method of the present embodiment will now be described more specifically with reference to FIGS. 9 and 10. FIG. 9 shows images acquired by the tracking device 40. FIG. 10 shows a timeline of the processing of step S2', which includes step S2 (detection) and step S10 (association, synchronization), and of step S3 (tracking).

(Frame 1)
First, in step S1, frame 1 (the first image) is acquired by the image acquisition unit 111. Next, the detection-processing determination unit 114 determines whether detection processing is in progress (S4). For frame 1, no detection processing is in progress, so the flow proceeds along the No branch. Next, the detection unit 112, for example, updates the tracking data in which the detection results and the tracking results are associated (S6). For frame 1, no detection result and no tracking result exist yet, so the flow proceeds to step S2' without updating the tracking data, and the sub-thread is started.

After the sub-thread starts, in step S2, the detection unit 112 selects frame 1 as the detection target image and detects the object 1a and the object 2a in frame 1. The detection unit 112 detects the first object 1a and the second object 2a using a learning model capable of detecting the first object 1 and the second object 2. Through this detection by the detection unit 112, the center coordinates, the vertical and horizontal sizes (occupied region), and the confidence (the likelihood of being the first object 1) of the first object 1a are output as a detection result associated with frame 1. Likewise, for the second object 2a, position information such as its center coordinates and vertical and horizontal sizes (occupied region), and the confidence (the likelihood of being the second object 2), are output as a detection result associated with frame 1.

Next, in step S10, the association unit 117 associates, among the detected first object 1a and second object 2a and the objects in the tracking result of the image following frame 1, those that are the same transported object. For frame 1, no tracking data comparable with the detected first object 1a and second object 2a exists, so no association is performed and the flow proceeds to step S11. In step S11, the detection results of the first object 1a and the second object 2a and frame 1 are output. The sub-thread then terminates.

Meanwhile, step S7 is performed in parallel with steps S2, S10, and S11 of step S2'. In step S7, the transported-object determination unit 115 determines whether a detected transported object exists. For frame 1, no transported object detected before frame 1 exists, so the flow proceeds along the No branch, that is, to step S1.

(Frames 2-4)
Next, in step S1, frame 2 is acquired by the image acquisition unit 111. The detection-processing determination unit 114 then determines whether detection processing is in progress (S4). For frame 2, as shown in FIG. 10, the detection processing for frame 1 has finished and no detection processing is in progress, so the flow proceeds along the No branch. Next, the detection unit 112, for example, updates the tracking data (S6). In frame 1, the first object 1a and the second object 2a were detected, and their detection results have been output. In step S6, these are therefore treated as the detected objects, and the position information and the confidences of the first object 1a and the second object 2a in frame 1 are recorded as tracking data. For the first object 1a, the number of detections as the first object 1 is recorded as one; for the second object 2a, the number of detections as the second object 2 is recorded as one. The confidences and the numbers of detections are listed and held for each type of object. The flow then proceeds to step S2', and the sub-thread is started.

After the sub-thread starts, first, in step S2, the detection unit 112 selects frame 2 as the detection target image and detects, in addition to the first object 1a and the second object 2a detected in frame 1, the first object 1b in frame 2. Through this detection by the detection unit 112, the center coordinates, the vertical and horizontal sizes (occupied regions), and the confidences (the likelihood of being the object 1) of the first objects 1a and 1b are output as detection results associated with frame 2. Likewise, for the second object 2a, position information such as its center coordinates and vertical and horizontal sizes (occupied region), and the confidence (the likelihood of being the object 2), are output as a detection result associated with frame 2.

Next, in step S10, the association unit 117 associates, among the detected first objects 1a and 1b and second object 2a and the objects in the tracking result of the image following frame 2 (that is, frame 3 described later), the same objects with each other as identical objects. Specifically, the overlap rate between each region of the first object 1a and the second object 2a indicated by solid or broken lines in frame 2 and each region of the first object 1a and the second object 2a indicated by solid or broken lines in frame 3 is calculated for every combination of objects. Each region is calculated from the center coordinates and the vertical and horizontal sizes of the corresponding object. Then, association as the same object is performed repeatedly, in descending order of the calculated overlap rate. First, the overlap rate between the second object 2a in frame 2 and the second object 2a in frame 3 is higher than the overlap rates of the other combinations of objects, so these are associated with each other as the same object. Similarly, the first object 1a in frame 2 is associated with the first object 1a in frame 3.

In step S11, the detection results of the first objects 1a and 1b and the second object 2a, the correspondence between the first object 1a and the second object 2a in frames 2 and 3, and frame 2 are output. The sub-thread then terminates.

Meanwhile, step S7 is performed in parallel with steps S2, S10, and S11 of step S2'. In step S7, the transported-object determination unit 115 determines whether a detected transported object exists. Since the detected first object 1a and second object 2a exist, the flow proceeds along the Yes branch.

Next, step S3 is performed. Here, since frame 2 is a detection target image, the tracking unit 113 tracks the first object 1a and the second object 2a based on frame 1, which is the immediately preceding detection target image. Specifically, in step S3, the center coordinates of the first object 1a and the second object 2a in frame 2 are each calculated by optical flow estimation, based on the center coordinates of the first object 1a and the second object 2a in frame 1 and on frames 1 and 2. Then, in step S8, the tracking unit 113, for example, adds the calculated center coordinates of the first object 1a and the second object 2a, as the position information of the objects 1a and 2a in frame 2, to the position information of the first object 1a and the second object 2a.
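To make the idea of propagating center coordinates between frames concrete, the following is a deliberately crude stand-in for optical flow estimation (an illustration only, not the patent's method; the patch size and search radius are assumptions, and a real system would typically use a pyramidal Lucas-Kanade implementation). It finds the small displacement whose image patch best matches the patch around the previous center.

```python
def estimate_shift(prev, curr, cx, cy, half=1, search=2):
    # Find the displacement (dx, dy) whose patch in `curr` best matches
    # (minimum sum of absolute differences) the patch centered at
    # (cx, cy) in `prev`; return the updated center coordinates.
    def patch(img, x, y):
        return [img[y + j][x + i]
                for j in range(-half, half + 1)
                for i in range(-half, half + 1)]
    ref = patch(prev, cx, cy)
    best, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = patch(curr, cx + dx, cy + dy)
            cost = sum(abs(a - b) for a, b in zip(ref, cand))
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return cx + best[0], cy + best[1]
```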

In step S9, the erroneous-detection determination unit 116 determines whether the first object 1a and the second object 2a are erroneously detected objects. Specifically, for each of the first object 1a and the second object 2a, it is determined whether the amount of change of the center coordinates over a predetermined number of frames is equal to or less than a predetermined value. When the first object 1a or the second object 2a satisfies this condition, the erroneous-detection determination unit 116 assigns a deletion flag to the tracking data of the corresponding object. The tracking data to which the deletion flag has been assigned is deleted at the next step S6. In the present embodiment, the first object 1a and the second object 2a do not satisfy the condition, so no deletion flag is assigned.
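This stationarity check can be sketched as follows (illustrative only; the window length and threshold are assumptions — the rationale being that a detection that stays still on a moving conveyor is likely a misdetected background pattern):

```python
import math

def is_false_detection(centers, window=5, max_move=2.0):
    # centers: list of (x, y) center coordinates, one per frame.
    # Flag the object when its displacement over the last `window`
    # frames is at or below `max_move` (i.e. it has not moved).
    if len(centers) < window:
        return False
    recent = centers[-window:]
    return math.dist(recent[0], recent[-1]) <= max_move
```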

After step S9, the flow proceeds to step S1, and frame 3 is acquired by the image acquisition unit 111. The detection-processing determination unit 114 then determines whether detection processing is in progress (S4). For frame 3, as shown in FIG. 10, the detection processing for frame 2 has not yet finished, so detection processing is in progress and the flow proceeds along the Yes branch. Then, steps S7, S3, S8, and S9 are performed on frame 3 in the same manner as for frame 2, except that in step S3, frame 2, the immediately preceding image, is used instead of the immediately preceding detection target image.

After step S9 for frame 3, the flow proceeds to step S1, and frame 4 is acquired by the image acquisition unit 111. Steps S4, S7, S3, S8, and S9 are then performed on frame 4 in the same manner as for frame 3. After step S9, the flow proceeds to step S1.

The images from frame 5 onward are also tracked in the same manner as the tracking method for frames 2-4, and this is repeated until the acquisition of images by the image acquisition unit 111 ends. In this way, the transported objects including the target objects can be tracked.

In the tracking method of the present embodiment, whether a new image has been acquired may be determined when the result of step S7 is No and/or after step S9. If Yes, that is, if a new image has been acquired, the flow proceeds to step S1. If No, that is, if no new image has been acquired, the tracking method of the present embodiment ends.

As described above, according to the tracking device and the tracking method of Embodiment 2, the detection processing and the tracking processing are processed in parallel. Therefore, in the tracking device and the tracking method of Embodiment 2, the tracking processing does not need to wait while the detection processing is being performed. The tracking device and the tracking method of Embodiment 2 can thus shorten the processing time even further than a device combining Modifications 1 to 5.
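The main-thread/sub-thread split of this embodiment can be sketched as follows (a simplified illustration outside the patent disclosure; the function names are assumptions). The main loop tracks every frame, and launches the slow detection in a sub-thread only when no detection is currently in progress, mirroring the check of step S4:

```python
import threading

def run_pipeline(frames, detect, track):
    # detect(frame) -> detection result (slow, runs in a sub-thread)
    # track(frame)  -> tracking result  (fast, runs for every frame)
    results = []
    detector = None  # the currently running detection sub-thread, if any

    def detect_job(frame):
        results.append(("detection", detect(frame)))

    for frame in frames:
        # Step S4: start a new detection only when none is in progress,
        # so the tracking never waits for the detection processing.
        if detector is None or not detector.is_alive():
            detector = threading.Thread(target=detect_job, args=(frame,))
            detector.start()
        results.append(("tracking", track(frame)))  # steps S3/S8
    if detector is not None:
        detector.join()
    return results
```

Note that with this scheme some frames are never selected as detection target images; the association of step S10 is what reconciles the (possibly stale) detection results with the per-frame tracking results.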

[Embodiment 3]
Embodiment 3 relates to the counting device and the counting method of the present invention. The descriptions of the tracking device and the tracking method above can, for example, be applied to the counting device and the counting method of the present embodiment.

FIG. 11 shows a block diagram of the counting device according to the present embodiment. As shown in FIG. 11, the counting device 50 of the present embodiment includes a counting unit 118 in addition to the components of the tracking device 10 of Embodiment 1. As shown in FIG. 11, the image acquisition unit 111, the detection unit 112, the tracking unit 113, and the counting unit 118 may be incorporated in, for example, a data processing unit (data processing apparatus) 11 that is hardware, or may be software, or hardware in which the software is incorporated. The data processing unit 11 may include a CPU or the like. The data processing unit 11 may also include data storage units such as the aforementioned ROM and RAM. In this case, each unit is electrically connected to, for example, the corresponding storage section of the data storage unit.

The counting unit 118 is electrically connected to the tracking unit 113. An example of the counting unit 118 is a CPU. The counting unit 118 counts the tracked transported objects. Specifically, when the tracking unit 113 has acquired the position information of a detected transported object, the counting unit 118 counts the transported object based on, for example, that position information. The position information used for counting may be, for example, the coordinates (for example, the center coordinates) of the transported object in one or more images, or a trajectory connecting the coordinates of the transported object in two or more images.

Next, FIG. 12 shows a flowchart of the counting method according to the present embodiment. The counting method of the present embodiment is implemented, for example, using the counting device 50 of FIG. 11 as follows. As shown in FIG. 12, the counting method of the present embodiment includes step S12 (counting) in addition to step S1 (image acquisition), step S2 (detection), and step S3 (tracking) of the tracking method of Embodiment 1. In the present embodiment, steps S2 and S3 may be performed in parallel or in series.

First, steps S1 to S3 are performed in the same manner as the tracking method of Embodiment 1.

Next, in step S12, the counting unit 118 counts the tracked transported objects. Specifically, in step S12, the counting unit 118 determines whether the position information of a tracked transported object satisfies a counting condition, that is, a condition for counting the transported object. Examples of the counting condition include the transported object moving out of the image, the transported object moving into a predetermined region, and the distance along the trajectory of the transported object exceeding a predetermined distance. The counting unit 118 then counts each transported object determined to satisfy the counting condition. In step S12, after the counting, the detection result and the tracking result of the counted transported object may be deleted. Alternatively, when the method has a tracking-data update step as in the tracking method of Embodiment 2, a deletion flag may be assigned to the detection result and the tracking result of the counted transported object so that they are deleted in the tracking-data update step. This prevents the same transported object from being counted more than once.
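One of the counting conditions above — the trajectory distance exceeding a predetermined distance — together with the once-only counting can be sketched as follows (illustrative only; the threshold value and record layout are assumptions):

```python
import math

def trajectory_length(coords):
    # Total distance along the trajectory connecting consecutive
    # center coordinates of a tracked transported object.
    return sum(math.dist(a, b) for a, b in zip(coords, coords[1:]))

def count_objects(tracks, min_dist=10.0):
    # Count each tracked object once when its trajectory exceeds the
    # predetermined distance, then mark it so it is not counted again
    # (the role of the deletion flag in the tracking-data update step).
    total = 0
    for t in tracks:
        if not t["counted"] and trajectory_length(t["coords"]) > min_dist:
            t["counted"] = True
            total += 1
    return total
```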

In step S12, when the transported objects include target objects, only the target objects may be counted, or both the target objects and the transported objects other than the target objects may be counted. When the transported objects include a plurality of types of target object, the counting may be performed per type of target object. The per-type counting can be performed based on, for example, the number of detections as each target object in the tracking method of Embodiment 2. For example, when the first object 1a of FIG. 9 has been detected seven times as the first object 1 and three times as the second object 2, so that the numbers of detections differ between the types, the first object 1a is counted as the first object 1 in step S12. When, for example, the first object 1a of FIG. 9 has been detected five times as the first object 1 and five times as the second object 2, so that the numbers of detections are the same, the determination is made further based on, for example, the confidences in the tracking data. Specifically, among the confidences recorded in the tracking data of the first object 1a, the maximum confidence as the first object 1 and the maximum confidence as the second object 2 are found and compared. When the confidence as the first object 1 is higher than the confidence as the second object 2, the first object 1a is counted as the first object 1. Conversely, when the confidence as the second object 2 is higher than the confidence as the first object 1, the first object 1a is counted as the second object 2. When the counting device of the present embodiment includes an erroneous-detection determination unit, the counting unit is preferably electrically connected to the erroneous-detection determination unit. When the counting method of the present embodiment includes an erroneous-detection determination step, the counting step is preferably performed after the erroneous-detection determination step.
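The class decision described above — majority of per-type detection counts, with ties broken by the maximum recorded confidence — can be sketched as follows (an illustration only; the record layout is an assumption):

```python
def decide_class(track):
    # track: {"counts": {class: n_detections},
    #         "conf":   {class: [recorded confidences]}}
    counts = track["counts"]
    best = max(counts.values())
    tied = [c for c, n in counts.items() if n == best]
    if len(tied) == 1:
        return tied[0]
    # Tie: compare the maximum confidence recorded for each tied class.
    return max(tied, key=lambda c: max(track["conf"][c]))
```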

As described above, the counting device and the counting method of Embodiment 3 include the tracking device and the tracking method of Embodiment 1, respectively. Therefore, according to the counting device and the counting method of Embodiment 3, the processing time can be shortened compared with a counting device and a counting method including the device of the reference example, which detects the transported objects in every image.

The counting device of the present embodiment may include, for example, a detection-count measurement unit that measures the number of detections of the same transported object; when the number of detections of a transported object is equal to or less than a predetermined number, the counting unit need not count that transported object. Likewise, the counting method of the present embodiment may include, for example, a detection-count measurement step of measuring the number of detections of the same transported object; when the number of detections of a transported object is equal to or less than the predetermined number, that transported object is not counted in the counting step. Adopting such a configuration prevents, for example, the counting of objects erroneously detected as transported objects by the detection unit 112 in the detection step, so the transported objects can be counted more accurately. The detection-count measurement unit or detection-count measurement step is preferably combined with the aforementioned erroneous-detection determination unit or erroneous-detection determination step. This prevents the counting of erroneously detected objects more effectively, so the transported objects can be counted even more accurately.

[実施形態4]
 実施形態4は、本発明の選別装置および選別方法に関する。本実施形態の選別装置および選別方法は、例えば、前記追跡装置および追跡方法の説明を援用できる。
[Embodiment 4]
Embodiment 4 relates to a sorting apparatus and a sorting method of the present invention. For example, the description of the tracking device and the tracking method can be used for the sorting device and the sorting method of the present embodiment.

 図13に、本実施形態における選別装置のブロック図を示す。図13に示すように、本実施形態の選別装置60は、実施形態1の追跡装置10に加え、選別手段119を含む。図13に示すように、画像取得手段111、検出手段112、追跡手段113、および選別手段119は、例えば、ハードウェアであるデータ処理手段(データ処理装置)11に組み込まれてもよく、ソフトウェアまたは前記ソフトウェアが組み込まれたハードウェアでもよい。データ処理手段11は、CPU等を備えてもよい。また、データ処理手段11は、例えば、前述のROM、RAM等のデータ記憶手段を備えてもよい。この場合、各手段が、例えば、前記データ記憶手段における対応する記憶部と電気的に接続されている。 FIG. 13 shows a block diagram of the sorting device in the present embodiment. As shown in FIG. 13, the sorting device 60 of this embodiment includes a sorting unit 119 in addition to the tracking device 10 of the first embodiment. As shown in FIG. 13, the image acquisition unit 111, the detection unit 112, the tracking unit 113, and the sorting unit 119 may be incorporated in, for example, the data processing means (data processing apparatus) 11, which is hardware, or may each be software, or hardware in which the software is incorporated. The data processing unit 11 may include a CPU or the like. The data processing means 11 may also include, for example, data storage means such as the aforementioned ROM and RAM. In this case, each unit is electrically connected to, for example, a corresponding storage section in the data storage means.

 選別手段119は、追跡手段113と電気的に接続されている。選別手段119は、例えば、CPUがあげられる。選別手段119は、前記追跡された搬送物における対象物を直接または間接的に選別する。前記間接的な選別であり、追跡手段113が前記検出された対象物の位置情報を取得している場合、選別手段119は、例えば、前記対象物の位置情報に基づき、装置内または装置外の選別装置により、前記対象物を選別する。前記選別に利用する搬送物の位置情報は、例えば、1以上の画像における搬送物の座標(例えば、中心座標)、2以上の画像における搬送物の座標を連結した軌跡等があげられる。前記選別装置は、例えば、複数種類の物を仕分け可能な仕分け装置、ロボットアーム等があげられる。前記直接的な選別であり、追跡手段113が前記検出された対象物の位置情報を取得している場合、選別手段119は、例えば、前記対象物の位置情報に基づき、前記追跡された搬送物における対象物を直接的に選別する。この場合、選別手段119は、例えば、前述の選別装置があげられる。 The sorting unit 119 is electrically connected to the tracking unit 113. An example of the sorting unit 119 is a CPU. The sorting unit 119 sorts the target objects in the tracked transported objects either directly or indirectly. In the case of indirect sorting, where the tracking unit 113 has acquired position information of the detected target object, the sorting unit 119 sorts the target object, for example, by means of a sorting device inside or outside the apparatus, based on the position information of the target object. The position information of the transported object used for the sorting includes, for example, the coordinates of the transported object in one or more images (for example, center coordinates) and a trajectory connecting the coordinates of the transported object in two or more images. Examples of the sorting device include a sorting machine capable of sorting a plurality of types of objects, a robot arm, and the like. In the case of direct sorting, where the tracking unit 113 has acquired position information of the detected target object, the sorting unit 119 directly sorts the target object in the tracked transported objects, for example, based on the position information of the target object. In this case, the sorting unit 119 may be, for example, the above-described sorting device.
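As a minimal sketch of the indirect sorting path described above, the sorting means might forward each tracked target's latest center coordinate to an external sorting device. The `RobotArmStub` interface and its `pick` method are hypothetical stand-ins, not a specified API.

```python
class RobotArmStub:
    """Hypothetical stand-in for an external sorting device such as a robot arm."""
    def __init__(self):
        self.commands = []

    def pick(self, x, y):
        # A real device would move to (x, y) and pick the object up;
        # here we only record the command for illustration.
        self.commands.append((x, y))

def indirect_sort(trajectories, arm):
    """Forward the most recent (x, y) center coordinate of each tracked
    target object to the sorting device, as in the indirect sorting above."""
    for trajectory in trajectories:
        x, y = trajectory[-1]   # latest position along the tracked trajectory
        arm.pick(x, y)

arm = RobotArmStub()
# Two targets, each tracked over two frames (coordinates are made up).
indirect_sort([[(10, 40), (60, 42)], [(15, 80), (70, 78)]], arm)
print(arm.commands)  # [(60, 42), (70, 78)]
```

The trajectory input here corresponds to the "coordinates connected across two or more images" mentioned in the text; a single-image center coordinate would simply be a one-element trajectory.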

 つぎに、図14に、本実施形態における選別方法のフローチャートを示す。本実施形態の選別方法は、例えば、図13の選別装置60を用いて、つぎのように実施する。図14に示すように、本実施形態の選別方法は、実施形態1の追跡方法であるS1ステップ(画像取得)、S2ステップ(検出)、およびS3ステップ(追跡)に加え、S13ステップ(選別)を含む。本実施形態において、S2ステップおよびS3ステップは、並列に実施されてもよいし、直列に実施されてもよい。 Next, FIG. 14 shows a flowchart of the sorting method in the present embodiment. The sorting method of the present embodiment is performed as follows, for example, using the sorting device 60 of FIG. 13. As shown in FIG. 14, the sorting method of this embodiment includes step S13 (sorting) in addition to step S1 (image acquisition), step S2 (detection), and step S3 (tracking) of the tracking method of Embodiment 1. In this embodiment, steps S2 and S3 may be performed either in parallel or in series.

 まず、実施形態1の追跡方法と同様にして、S1-S3ステップを実施する。 First, the steps S1-S3 are performed in the same manner as the tracking method of the first embodiment.

 つぎに、S13ステップでは、選別手段119により、追跡された搬送物における対象物を直接または間接的に選別する。具体的には、S13ステップでは、選別手段119により、追跡された対象物および対象物以外の搬送物少なくとも一方の位置情報を取得する。そして、前記位置情報が前記対象物の位置情報の場合、S13ステップでは、選別手段119により、例えば、前記対象物をより分けることにより、前記対象物を選別する。他方、前記位置情報が前記対象物以外の搬送物の位置情報の場合、S13ステップでは、選別手段119により、例えば、前記対象物以外の搬送物を除去することにより、前記対象物を選別する。前記搬送物が複数種類の対象物を含む場合、複数種類の対象物と、前記対象物以外の搬送物とを選別してもよいし、さらに、複数種類の対象物について、それぞれ、選別してもよい。本実施形態の選別装置が誤検出判定手段を含む場合、前記選別手段は、前記誤検出判定手段と電気的に接続されていることが好ましい。また、本実施形態の選別方法が誤検出判定工程を含む場合、前記選別工程は、前記誤検出判定工程の後に実施することが好ましい。 Next, in step S13, the sorting unit 119 sorts the target objects in the tracked transported objects either directly or indirectly. Specifically, in step S13, the sorting unit 119 acquires position information of at least one of the tracked target objects and the transported objects other than the target objects. When the position information is that of the target object, in step S13, the sorting unit 119 sorts the target object, for example, by picking out the target object. On the other hand, when the position information is that of a transported object other than the target object, in step S13, the sorting unit 119 sorts the target object, for example, by removing the transported object other than the target object. When the transported objects include a plurality of types of target objects, the plurality of types of target objects may be separated from the transported objects other than the target objects, and the plurality of types of target objects may further be sorted by type. When the sorting device of this embodiment includes the erroneous detection determination means, the sorting means is preferably electrically connected to the erroneous detection determination means. Likewise, when the sorting method of this embodiment includes the erroneous detection determination step, the sorting step is preferably performed after the erroneous detection determination step.
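Step S13's separation of target objects from other transported objects can be sketched as follows. The track IDs, class labels, and label-based representation are illustrative assumptions; the specification only requires that targets and non-targets be distinguishable.

```python
def sort_step(tracked, target_labels):
    """Sketch of step S13: split tracked objects into target objects to keep
    and other transported objects to remove. `tracked` maps a track ID to a
    class label assigned at detection time; all names are illustrative."""
    targets = {tid: lab for tid, lab in tracked.items() if lab in target_labels}
    others = {tid: lab for tid, lab in tracked.items() if lab not in target_labels}
    return targets, others

tracked = {1: "apple", 2: "leaf", 3: "apple", 4: "stone"}
targets, others = sort_step(tracked, {"apple"})
print(sorted(targets))  # [1, 3]
print(sorted(others))   # [2, 4]

# With several target types, the kept objects can further be grouped by type,
# corresponding to sorting each of the plural types of target objects.
by_type = {}
for tid, lab in sort_step(tracked, {"apple", "leaf"})[0].items():
    by_type.setdefault(lab, []).append(tid)
print(by_type)  # {'apple': [1, 3], 'leaf': [2]}
```

Whether the physical action is "picking out" the targets or "removing" the non-targets, the same split yields the set of positions to act on.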

 以上説明したように、実施形態4の選別装置および選別方法は、それぞれ、実施形態1の追跡装置および追跡方法を含む。このため、実施形態4の選別装置および選別方法によれば、全ての画像において、前記搬送物を検出する参考例の装置を含む選別装置および選別方法と比較して、処理時間を短縮できる。 As described above, the sorting device and the sorting method of the fourth embodiment include the tracking device and the tracking method of the first embodiment, respectively. Therefore, according to the sorting device and the sorting method of the fourth embodiment, the processing time can be shortened compared with a sorting device and a sorting method that include the reference-example device, which detects the conveyed object in every image.

[実施形態5]
 本実施形態のプログラムは、前述の追跡方法、計数方法、または選別方法を、コンピュータ上で実行可能なプログラムである。または、本実施形態のプログラムは、例えば、コンピュータ読み取り可能な記録媒体に記録されてもよい。前記記録媒体は、例えば、非一時的なコンピュータ可読記録媒体(non-transitory computer-readable storage medium)である。前記記録媒体は、特に制限されず、例えば、ランダムアクセスメモリ(RAM)、読み出し専用メモリ(ROM)、ハードディスク(HD)、光ディスク、フロッピー(登録商標)ディスク(FD)等があげられる。
[Embodiment 5]
The program of this embodiment is a program that can execute the tracking method, counting method, or sorting method described above on a computer. Or the program of this embodiment may be recorded on a computer-readable recording medium, for example. The recording medium is, for example, a non-transitory computer-readable storage medium. The recording medium is not particularly limited, and examples thereof include a random access memory (RAM), a read-only memory (ROM), a hard disk (HD), an optical disk, and a floppy (registered trademark) disk (FD).

[実施形態6]
 実施形態6は、本発明の追跡システムに関する。本発明の追跡システムは、例えば、前記追跡装置および追跡方法等の説明を援用できる。
[Embodiment 6]
Embodiment 6 relates to the tracking system of the present invention. The tracking system of the present invention can use, for example, the description of the tracking device and the tracking method.

 図15に、本発明の追跡装置を用いた追跡システムの一例の構成を示す。図15に示すように、本実施形態の追跡システムは、撮像装置31a、31b、31cと、通信インターフェイス32a、32b、32cと、サーバ34とを備える。撮像装置31aは、通信インターフェイス32aに接続されている。撮像装置31aおよび通信インターフェイス32aは、場所Xに設置されている。撮像装置31bは、通信インターフェイス32bに接続されている。撮像装置31bおよび通信インターフェイス32bは、場所Yに設置されている。撮像装置31cは、通信インターフェイス32cに接続されている。撮像装置31cおよび通信インターフェイス32cは、場所Zに設置されている。そして、通信インターフェイス32a、32b、32cと、サーバ34とが、通信回線網33を介して接続されている。 FIG. 15 shows an example of the configuration of a tracking system using the tracking device of the present invention. As shown in FIG. 15, the tracking system of this embodiment includes imaging devices 31 a, 31 b, and 31 c, communication interfaces 32 a, 32 b, and 32 c, and a server 34. The imaging device 31a is connected to the communication interface 32a. The imaging device 31a and the communication interface 32a are installed at the place X. The imaging device 31b is connected to the communication interface 32b. The imaging device 31b and the communication interface 32b are installed in the place Y. The imaging device 31c is connected to the communication interface 32c. The imaging device 31c and the communication interface 32c are installed in the place Z. Communication interfaces 32 a, 32 b, 32 c and the server 34 are connected via a communication network 33.

 この追跡システムでは、サーバ34側に、画像取得手段、検出手段、および追跡手段が格納される。また、前記追跡システムは、例えば、場所Xで撮像装置31aを用いて取得されたn枚の画像を、サーバ34に送信し、サーバ34側で、搬送物を追跡する。 In this tracking system, image acquisition means, detection means, and tracking means are stored on the server 34 side. In addition, the tracking system transmits, for example, n images acquired by using the imaging device 31a at the place X to the server 34, and tracks the transported object on the server 34 side.
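The server-side division of labor above — detection on only k of the n received images, tracking on every image — can be sketched as follows. The fixed detection interval and the toy detector/tracker are assumptions for illustration; the specification does not fix how the k detection target images are chosen.

```python
def process_stream(images, detect, track, detect_every=5):
    """Run the detection means on only a subset of frames (every
    `detect_every`-th image here, so k < n) and the cheaper tracking
    means on the remaining frames."""
    tracks = []
    detections_run = 0
    for idx, image in enumerate(images):
        if idx % detect_every == 0:   # this frame is a detection target image
            tracks = detect(image)
            detections_run += 1
        else:                         # otherwise only update existing tracks
            tracks = track(image, tracks)
    return tracks, detections_run

# Toy stand-ins: an "object" is just an x coordinate that drifts rightward.
detect = lambda image: list(image)                     # full (expensive) detection
track = lambda image, tracks: [x + 1 for x in tracks]  # cheap extrapolation

frames = [[0]] * 6                                     # n = 6 images from site X
final_tracks, runs = process_stream(frames, detect, track, detect_every=5)
print(runs)  # 2 (detection ran on k = 2 of the n = 6 frames)
```

Because the expensive detector runs on only k of the n frames while the lightweight tracker covers the rest, total server processing time drops roughly in proportion to k/n, which is the shortening effect the embodiments claim.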

 なお、本実施形態の追跡システムは、前述の実施形態および変形例の組合せに対応したものであってもよい。また、本実施形態の追跡システムは、例えば、クラウドコンピューティングに対応したものであってもよい。さらに、本実施形態の追跡システムは、通信インターフェイス32a、32b、32cと、サーバ34とが、無線通信回線により接続されていてもよい。 Note that the tracking system of this embodiment may correspond to a combination of the above-described embodiments and modifications. Moreover, the tracking system of this embodiment may be compatible with, for example, cloud computing. Furthermore, in the tracking system of the present embodiment, the communication interfaces 32a, 32b, and 32c and the server 34 may be connected by a wireless communication line.

 本実施形態の追跡システムによれば、搬送物を追跡する際のサーバでの処理時間を短縮できる。本実施形態の追跡システムによれば、撮像装置を現場に設置し、サーバ等は他の場所に設置して、オンラインにより搬送物を追跡できる。そのため、装置の設置に場所を取ることがなく、メンテナンスも容易である。また、各設置場所が離れている場合であっても、一箇所での集中管理や遠隔操作が可能となる。 According to the tracking system of this embodiment, the processing time at the server when tracking transported objects can be shortened. According to the tracking system of this embodiment, the imaging devices can be installed on site while the server and the like are installed elsewhere, so that transported objects can be tracked online. Therefore, the apparatus takes up little space at the installation site, and maintenance is easy. Furthermore, even when the installation sites are far apart, centralized management and remote operation from a single location become possible.

[実施形態7]
 実施形態7は、本発明の計数システムに関する。本発明の計数システムは、例えば、前記追跡装置、追跡方法、計数装置、計数方法、および追跡システム等の説明を援用できる。
[Embodiment 7]
Embodiment 7 relates to the counting system of the present invention. For example, the description of the tracking device, the tracking method, the counting device, the counting method, and the tracking system can be used for the counting system of the present invention.

 本実施形態の計数システムは、実施形態6の追跡システムにおいて、サーバ34内に、さらに計数手段が格納されている。そして、前記計数システムは、例えば、場所Xで計測画像取得手段311aを用いて取得されたn枚の計測画像を、サーバ34に送信し、サーバ34側で、搬送物をカウントする。これらの点を除き、実施形態7の計数システムは、実施形態6の追跡システムの説明を援用できる。 In the counting system of the present embodiment, counting means is further stored in the server 34 of the tracking system of Embodiment 6. The counting system transmits, for example, n measurement images acquired at the place X using the measurement image acquisition unit 311a to the server 34, and the transported objects are counted on the server 34 side. Except for these points, the description of the tracking system of Embodiment 6 applies to the counting system of Embodiment 7.

 本実施形態の計数システムによれば、例えば、搬送物を追跡する際のサーバでの処理時間を短縮できる。このため、本実施形態の計数システムによれば、例えば、より短時間に搬送物を計数できる。本実施形態の計数システムによれば、撮像装置を現場に設置し、サーバ等は他の場所に設置して、オンラインにより搬送物をカウントできる。そのため、装置の設置に場所を取ることがなく、メンテナンスも容易である。また、各設置場所が離れている場合であっても、一箇所での集中管理や遠隔操作が可能となる。 According to the counting system of this embodiment, for example, the processing time at the server when tracking transported objects can be shortened. Therefore, according to the counting system of this embodiment, transported objects can be counted in a shorter time. According to the counting system of this embodiment, the imaging devices can be installed on site while the server and the like are installed elsewhere, so that transported objects can be counted online. Therefore, the apparatus takes up little space at the installation site, and maintenance is easy. Furthermore, even when the installation sites are far apart, centralized management and remote operation from a single location become possible.

[実施形態8]
 実施形態8は、本発明の選別システムに関する。本発明の選別システムは、例えば、前記追跡装置、追跡方法、選別装置、選別方法、および追跡システム等の説明を援用できる。
[Embodiment 8]
Embodiment 8 relates to the sorting system of the present invention. For example, the description of the tracking device, the tracking method, the sorting device, the sorting method, and the tracking system can be used for the sorting system of the present invention.

 図16に、本発明の選別装置を用いた選別システムの一例の構成を示す。図16に示すように、本実施形態の選別システムは、撮像装置31a、31b、31cと、選別装置35a、35b、35cと、通信インターフェイス32a、32b、32cと、サーバ34とを備える。撮像装置31aおよび選別装置35aは、通信インターフェイス32aに接続されている。撮像装置31a、選別装置35a、および通信インターフェイス32aは、場所Xに設置されている。撮像装置31bおよび選別装置35bは、通信インターフェイス32bに接続されている。撮像装置31b、選別装置35b、および通信インターフェイス32bは、場所Yに設置されている。撮像装置31cおよび選別装置35cは、通信インターフェイス32cに接続されている。撮像装置31c、選別装置35c、および通信インターフェイス32cは、場所Zに設置されている。そして、通信インターフェイス32a、32b、32cと、サーバ34とが、通信回線網33を介して接続されている。 FIG. 16 shows a configuration of an example of a sorting system using the sorting apparatus of the present invention. As illustrated in FIG. 16, the sorting system according to the present embodiment includes imaging devices 31 a, 31 b, and 31 c, sorting devices 35 a, 35 b, and 35 c, communication interfaces 32 a, 32 b, and 32 c, and a server 34. The imaging device 31a and the sorting device 35a are connected to the communication interface 32a. The imaging device 31a, the sorting device 35a, and the communication interface 32a are installed at the place X. The imaging device 31b and the sorting device 35b are connected to the communication interface 32b. The imaging device 31b, the sorting device 35b, and the communication interface 32b are installed at the place Y. The imaging device 31c and the sorting device 35c are connected to the communication interface 32c. The imaging device 31c, the sorting device 35c, and the communication interface 32c are installed in the place Z. Communication interfaces 32 a, 32 b, 32 c and the server 34 are connected via a communication network 33.

 この選別システムでは、サーバ34側に、画像取得手段、検出手段、追跡手段、および選別手段が格納される。また、前記選別システムは、例えば、場所Xで撮像装置31aを用いて取得されたn枚の画像を、サーバ34に送信し、サーバ34側で、搬送物における対象物を追跡および選別する。また、サーバ34が、例えば、選別される対象物の位置情報等を選別装置35aに送信し、選別装置35aが、前記選別される対象物を選別する。これらの点を除き、実施形態8の選別システムは、実施形態6の追跡システムの説明を援用できる。 In this sorting system, image acquisition means, detection means, tracking means, and sorting means are stored on the server 34 side. In addition, the sorting system transmits, for example, n images acquired using the imaging device 31a at the place X to the server 34, and the server 34 side tracks and sorts the object in the transported object. Further, the server 34 transmits, for example, position information of an object to be sorted to the sorting device 35a, and the sorting device 35a sorts the object to be sorted. Except for these points, the sorting system of the eighth embodiment can use the description of the tracking system of the sixth embodiment.
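The server-to-site message that carries the position of an object to be sorted might, under the assumption of a simple JSON protocol, look like the sketch below. The field names and the message format are invented for illustration; the specification does not define a wire protocol.

```python
import json

def make_sort_message(track_id, x, y, site="X"):
    """Build the server -> sorting-device message described above; the JSON
    schema here is an illustrative assumption, not a specified protocol."""
    return json.dumps({"site": site, "track_id": track_id, "x": x, "y": y})

def handle_sort_message(raw, arm_pick):
    """On the site side, decode the message and drive the sorting device
    through the given pick callback."""
    msg = json.loads(raw)
    arm_pick(msg["x"], msg["y"])
    return msg["track_id"]

picked = []
tid = handle_sort_message(make_sort_message(7, 120, 45),
                          lambda x, y: picked.append((x, y)))
print(tid, picked)  # 7 [(120, 45)]
```

In the configuration of FIG. 16, such messages would travel from the server 34 over the communication network 33 and the site's communication interface to the sorting device 35a, which performs the physical sorting.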

 本実施形態の選別システムによれば、例えば、搬送物を追跡する際のサーバでの処理時間を短縮できる。このため、本実施形態の選別システムによれば、例えば、より短時間に搬送物における対象物を選別できる。また、本実施形態の選別システムによれば、撮像装置および選別装置を現場に設置し、サーバ等は他の場所に設置して、オンラインにより対象物を選別できる。そのため、装置の設置に場所を取ることがなく、メンテナンスも容易である。また、各設置場所が離れている場合であっても、一箇所での集中管理や遠隔操作が可能となる。 According to the sorting system of this embodiment, for example, the processing time at the server when tracking transported objects can be shortened. Therefore, according to the sorting system of this embodiment, the target objects in the transported objects can be sorted in a shorter time. Furthermore, according to the sorting system of this embodiment, the imaging devices and sorting devices can be installed on site while the server and the like are installed elsewhere, so that target objects can be sorted online. Therefore, the apparatus takes up little space at the installation site, and maintenance is easy. Furthermore, even when the installation sites are far apart, centralized management and remote operation from a single location become possible.

 以上、実施形態を参照して本発明を説明したが、本発明は、上記実施形態に限定されるものではない。本発明の構成や詳細には、本発明のスコープ内で当業者が理解しうる様々な変更をできる。 Although the present invention has been described above with reference to embodiments, the present invention is not limited to the above embodiments. Various changes that can be understood by those skilled in the art can be made to the configuration and details of the present invention within the scope of the present invention.

<付記>
 上記の実施形態および実施例の一部または全部は、以下の付記のように記載されうるが、以下には限られない。
(付記1)
搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する画像取得手段と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出手段と、
前記n枚の画像のうち、k枚目の検出対象画像より後に取得された画像において、前記k枚目の検出対象画像で検出された搬送物を追跡する追跡手段とを含み、
前記検出手段におけるkは、nより小さいことを特徴とする、搬送物の追跡装置。
(付記2)
前記画像取得手段により、前記n枚の画像のうち、m枚目の画像を取得以後、前記k枚の検出対象画像のうち、l枚目の検出対象画像に対する検出を前記検出手段が実施しているかを判定する検出処理判定手段を含み、
前記検出処理判定手段により、前記検出手段による検出が実施されていないと判定された場合、
前記検出手段は、前記m枚目の画像をl+1枚目の検出対象画像として、前記搬送物を検出する、付記1記載の追跡装置。
(付記3)
前記検出手段は、前記搬送物の位置情報を取得し、
前記追跡手段は、前記n枚の画像のうち、j枚目の画像と、j-1枚目以前の画像と、前記j-1枚目以前の画像における搬送物の位置情報とに基づき、前記j枚目の画像における搬送物の位置情報を算出することにより、前記搬送物を追跡する、付記1または2記載の追跡装置。
(付記4)
前記j枚目の画像がi枚目の検出対象画像の場合、前記追跡手段は、j枚目の画像と、i-1枚目以前の検出対象画像と、前記i-1枚目以前の検出対象画像における搬送物の位置情報とに基づき、前記j枚目の画像における搬送物の位置情報を算出することにより、前記搬送物の位置を追跡する、付記3記載の追跡装置。
(付記5)
前記追跡手段は、前記検出対象画像で検出された搬送物の位置情報を取得し、
前記追跡手段で得られた搬送物の位置情報に基づき、前記搬送物が誤検出物であるかを判定する誤検出判定手段を含む、付記1から4のいずれかに記載の追跡装置。
(付記6)
前記誤判定検出手段は、複数の画像間における前記検出された搬送物の位置の変化量に基づき、前記搬送物が誤検出物であるかを判定する、付記5記載の追跡装置。
(付記7)
前記k枚の検出対象画像のうち、h枚目の検出対象画像における搬送物の検出結果と、前記n枚の画像のうち、前記h枚目の検出対象画像のつぎに取得された画像における搬送物の追跡結果に基づき、前記検出された搬送物と前記追跡された搬送物とのうち、同じ搬送物を対応づける対応付け手段を含む、付記1から6のいずれかに記載の追跡装置。
(付記8)
前記画像より前に取得された検出対象画像で検出された搬送物が存在するかを判定する搬送物判定手段を含み、
前記搬送物判定手段が前記検出された搬送物が存在すると判定した場合、前記追跡手段は、追跡する、付記1から7のいずれかに記載の追跡装置。
(付記9)
前記検出手段による検出と、前記追跡手段による追跡とが、並列して実施される、付記1から8のいずれか一項に記載の追跡装置。
(付記10)
前記検出手段は、前記搬送物を検出可能な学習モデルを用いて、前記搬送物を検出する、付記1から9のいずれかに記載の追跡装置。
(付記11)
搬送装置により搬送されている搬送物について追跡する追跡手段と、
前記追跡された搬送物をカウントする計数手段とを含み、
前記追跡手段は、付記1から10のいずれかに記載の搬送物の追跡装置であることを特徴とする、搬送物の計数装置。
(付記12)
前記追跡手段は、前記検出対象画像で検出された搬送物の位置情報を取得し、
前記計数手段は、前記搬送物の位置情報に基づき、前記搬送物をカウントする、付記11記載の計数装置。
(付記13)
同じ搬送物の検出回数を測定する検出回数測定手段を含み、
前記搬送物の検出回数が所定回数以下の場合、前記計数手段は、前記検出回数が所定回数以下の搬送物をカウントしない、付記11または12記載の計数装置。
(付記14)
搬送装置により搬送されている、対象物を含む搬送物について追跡する追跡手段と、
前記追跡された搬送物における対象物を選別する選別手段とを含み、
前記追跡手段は、付記1から10のいずれかに記載の搬送物の追跡装置であることを特徴とする、搬送物の選別装置。
(付記15)
搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する画像取得工程と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出工程と、
前記n枚の画像のうち、k枚目の検出対象画像より後に取得された画像において、前記k枚目の検出対象画像で検出された搬送物を追跡する追跡工程とを含み、
前記検出工程におけるkは、nより小さいことを特徴とする、搬送物の追跡方法。
(付記16)
前記画像取得工程において、前記n枚の画像のうち、m枚目の画像を取得以後、前記k枚の検出対象画像のうち、l枚目の検出対象画像に対する検出を前記検出工程が実施しているかを判定する検出処理判定工程を含み、
前記検出処理判定工程において、前記検出工程による検出が実施されていないと判定された場合、
前記検出工程において、前記m枚目の画像をl+1枚目の検出対象画像として、前記搬送物を検出する、付記15記載の追跡方法。
(付記17)
前記検出工程において、前記搬送物の位置情報を取得し、
前記追跡工程において、前記n枚の画像のうち、j枚目の画像と、j-1枚目以前の画像と、前記j-1枚目以前の画像における搬送物の位置情報とに基づき、前記j枚目の画像における搬送物の位置情報を算出することにより、前記搬送物を追跡する、付記15または16記載の追跡方法。
(付記18)
前記j枚目の画像がi枚目の検出対象画像の場合、前記追跡工程において、j枚目の画像と、i-1枚目以前の検出対象画像と、前記i-1枚目以前の検出対象画像における搬送物の位置情報とに基づき、前記j枚目の画像における搬送物の位置情報を算出することにより、前記搬送物の位置を追跡する、付記17記載の追跡方法。
(付記19)
前記追跡工程において、前記検出対象画像で検出された搬送物の位置情報を取得し、
前記追跡工程で得られた搬送物の位置情報に基づき、前記搬送物が誤検出物であるかを判定する誤検出判定工程を含む、付記15から18のいずれかに記載の追跡方法。
(付記20)
前記誤判定検出工程において、複数の画像間における前記検出された搬送物の位置の変化量に基づき、前記搬送物が誤検出物であるかを判定する、付記19記載の追跡方法。
(付記21)
前記k枚の検出対象画像のうち、h枚目の検出対象画像における搬送物の検出結果と、前記n枚の画像のうち、前記h枚目の検出対象画像のつぎに取得された画像における搬送物の追跡結果に基づき、前記検出された搬送物と前記追跡された搬送物とのうち、同じ搬送物を対応づける対応付け工程を含む、付記15から20のいずれかに記載の追跡方法。
(付記22)
前記画像より前に取得された検出対象画像で検出された搬送物が存在するかを判定する搬送物判定工程を含み、
前記搬送物判定工程において前記検出された搬送物が存在すると判定された場合、前記追跡工程は、追跡する、付記15から21のいずれかに記載の追跡方法。
(付記23)
前記検出工程と、前記追跡工程とが、並列して実施される、付記15から22のいずれかに記載の追跡方法。
(付記24)
前記検出工程において、前記搬送物を検出可能な学習モデルを用いて、前記搬送物を検出する、付記15から23のいずれかに記載の追跡方法。
(付記25)
搬送装置により搬送されている搬送物について追跡する追跡工程と、
前記追跡された搬送物をカウントする計数工程とを含み、
前記追跡工程は、付記15から24のいずれかに記載の搬送物の追跡方法であることを特徴とする、搬送物の計数方法。
(付記26)
前記追跡工程は、前記検出対象画像で検出された搬送物の位置情報を取得し、
前記計数工程は、前記搬送物の位置情報に基づき、前記搬送物をカウントする、付記25記載の計数方法。
(付記27)
同じ搬送物の検出回数を測定する検出回数測定工程を含み、
前記搬送物の検出回数が所定回数以下の場合、前記計数工程は、前記検出回数が所定回数以下の搬送物をカウントしない、付記25または26記載の計数方法。
(付記28)
搬送装置により搬送されている、対象物を含む搬送物について追跡する追跡工程と、
前記追跡された搬送物における対象物を選別する選別工程とを含み、
前記追跡工程は、付記15から24のいずれかに記載の搬送物の追跡方法であることを特徴とする、搬送物の選別方法。
(付記29)
搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する画像取得処理と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出処理と、
前記n枚の画像のうち、k枚目の検出対象画像より後に取得された画像において、前記k枚目の検出対象画像で検出された搬送物を追跡する追跡処理とをコンピュータ上で実行可能であり、
前記検出処理におけるkは、nより小さいことを特徴とする、プログラム。
(付記30)
搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得する画像取得処理と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出処理と、
前記n枚の画像のうち、k枚目の検出対象画像より後に取得された画像において、前記k枚目の検出対象画像で検出された搬送物を追跡する追跡処理と、
前記追跡された搬送物の数をカウントする計数処理とをコンピュータ上で実行可能であり、
前記検出処理におけるkは、nより小さいことを特徴とする、プログラム。
(付記31)
搬送装置により搬送されている、対象物を含む搬送物について経時的にn枚の画像を取得する画像取得処理と、
前記n枚の画像から選択されたk枚の検出対象画像について、前記搬送物を検出する検出処理と、
前記n枚の画像のうち、k枚目の検出対象画像より後に取得された画像において、前記k枚目の検出対象画像で検出された搬送物を追跡する追跡処理と、
前記追跡された搬送物における対象物を選別する選別処理とをコンピュータ上で実行可能であり、
前記検出処理におけるkは、nより小さいことを特徴とする、プログラム。
(付記32)
付記29から31のいずれかに記載のプログラムを記録していることを特徴とする、コンピュータ読み取り可能な記録媒体。
(付記33)
端末とサーバとを含み、
前記端末と前記サーバとは、システム外の通信回線網を介して、接続可能であり、
前記端末は、撮像装置を含み、
前記撮像装置は、搬送装置により搬送されている搬送物について、経時的にn枚の画像を撮像し、
前記サーバは、画像取得手段、検出手段、および追跡手段を含み、
前記画像取得手段は、前記搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得し、
前記検出手段は、n枚の画像のうち、k枚の検出対象画像について、搬送物を検出し、
前記追跡手段は、n枚の画像について、検出された搬送物を追跡し、
前記検出手段におけるkは、nより小さいことを特徴とする、搬送物の追跡システム。
(付記34)
端末とサーバとを含み、
前記端末と前記サーバとは、システム外の通信回線網を介して、接続可能であり、
前記端末は、撮像装置を含み、
前記撮像装置は、搬送装置により搬送されている搬送物について、経時的にn枚の画像を撮像し、
前記サーバは、画像取得手段、検出手段、追跡手段、および計数手段を含み、
前記画像取得手段は、前記搬送装置により搬送されている搬送物について、経時的にn枚の画像を取得し、
前記検出手段は、n枚の画像のうち、k枚の検出対象画像について、搬送物を検出し、
前記追跡手段は、前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡し、
前記計数手段は、前記追跡された搬送物の数を計数し、
前記検出手段におけるkは、nより小さいことを特徴とする、搬送物の計数システム。
(付記35)
端末とサーバとを含み、
前記端末と前記サーバとは、システム外の通信回線網を介して、接続可能であり、
前記端末は、撮像装置と選別装置とを含み、
前記撮像装置は、搬送装置により搬送されている、対象物を含む搬送物について、経時的にn枚の画像を撮像し、
前記選別手段は、追跡された搬送物における選別対象物を選別し、
前記サーバは、画像取得手段、検出手段、追跡手段、および選別手段を含み、
前記画像取得手段は、前記搬送装置により搬送されている、対象物を含む搬送物について、経時的にn枚の画像を取得し、
前記検出手段は、n枚の画像のうち、k枚の検出対象画像について、搬送物を検出し、
前記追跡手段は、前記n枚の画像のうち、各検出対象画像より後に取得された画像において、前記各検出対象画像で検出された搬送物を追跡し、
前記検出手段におけるkは、nより小さいことを特徴とする、搬送物の選別システム。
<Appendix>
Some or all of the above embodiments and examples can be described as the following supplementary notes, but are not limited thereto.
(Appendix 1)
Image acquisition means for acquiring n images over time with respect to the conveyed object being conveyed by the conveying device;
Detecting means for detecting the conveyed object for k detection target images selected from the n images;
A tracking means for tracking a conveyed object detected in the kth detection target image in an image acquired after the kth detection target image among the n images,
A transported object tracking apparatus, wherein k in the detection means is smaller than n.
(Appendix 2)
Detection processing determination means for determining whether, after the m-th image among the n images has been acquired by the image acquisition means, the detection means is performing detection on the l-th detection target image among the k detection target images,
When it is determined by the detection processing determination means that the detection by the detection means is not performed,
The tracking device according to appendix 1, wherein the detection unit detects the conveyed object using the m-th image as an (l + 1) -th detection target image.
(Appendix 3)
The detection means obtains position information of the transported object,
The tracking device according to appendix 1 or 2, wherein the tracking means tracks the transported object by calculating position information of the transported object in the j-th image, based on the j-th image among the n images, the (j−1)-th or earlier image, and the position information of the transported object in the (j−1)-th or earlier image.
(Appendix 4)
The tracking device according to appendix 3, wherein, when the j-th image is the i-th detection target image, the tracking means tracks the position of the transported object by calculating the position information of the transported object in the j-th image, based on the j-th image, the (i−1)-th or earlier detection target image, and the position information of the transported object in the (i−1)-th or earlier detection target image.
(Appendix 5)
The tracking unit obtains position information of the conveyed object detected in the detection target image;
5. The tracking device according to any one of appendices 1 to 4, further comprising an erroneous detection determination unit that determines whether the transported object is an erroneously detected object based on position information of the conveyed object obtained by the tracking unit.
(Appendix 6)
The tracking device according to appendix 5, wherein the erroneous determination detection means determines whether the conveyed object is an erroneously detected object based on a change amount of the position of the detected conveyed object between a plurality of images.
(Appendix 7)
The tracking device according to any one of appendices 1 to 6, further including association means for associating the same transported object between the detected transported object and the tracked transported object, based on a detection result of the transported object in the h-th detection target image among the k detection target images and a tracking result of the transported object in the image acquired next after the h-th detection target image among the n images.
(Appendix 8)
A transport object determination means for determining whether or not a transport object detected in the detection target image acquired before the image exists,
The tracking device according to any one of appendices 1 to 7, wherein the tracking means performs the tracking when the transported object determination means determines that the detected transported object exists.
(Appendix 9)
9. The tracking device according to any one of appendices 1 to 8, wherein detection by the detection unit and tracking by the tracking unit are performed in parallel.
(Appendix 10)
The tracking device according to any one of appendices 1 to 9, wherein the detection unit detects the transported object using a learning model capable of detecting the transported object.
(Appendix 11)
A tracking means for tracking the transported object being transported by the transport device;
Counting means for counting the tracked transported goods,
A transported object counting apparatus, wherein the tracking means is the transported object tracking apparatus according to any one of appendices 1 to 10.
(Appendix 12)
The tracking unit obtains position information of the conveyed object detected in the detection target image;
The counting device according to appendix 11, wherein the counting means counts the conveyed object based on position information of the conveyed object.
(Appendix 13)
Including a detection frequency measuring means for measuring the detection frequency of the same transported object
The counting apparatus according to appendix 11 or 12, wherein when the number of detections of the transported object is equal to or less than a predetermined number, the counting unit does not count a transported object whose detection frequency is equal to or less than the predetermined number.
(Appendix 14)
A tracking means for tracking a transported object including the object being transported by the transport device;
Sorting means for sorting objects in the tracked transported object,
A transported object sorting apparatus, wherein the tracking means is the transported object tracking apparatus according to any one of appendices 1 to 10.
(Appendix 15)
An image acquisition step of acquiring n images over time for a conveyed item being conveyed by a conveying device;
A detection step of detecting the conveyed object for k detection target images selected from the n images;
A tracking step of tracking a conveyed object detected in the kth detection target image in an image obtained after the kth detection target image among the n images,
The method of tracking a conveyed product, wherein k in the detection step is smaller than n.
(Appendix 16)
Including a detection processing determination step of determining whether, after the m-th image among the n images has been acquired in the image acquisition step, the detection step is performing detection on the l-th detection target image among the k detection target images,
In the detection process determination step, when it is determined that the detection by the detection step is not performed,
16. The tracking method according to appendix 15, wherein in the detection step, the transported object is detected using the m-th image as an l + 1th detection target image.
(Appendix 17)
In the detection step, the position information of the conveyed product is acquired,
The tracking method according to appendix 15 or 16, wherein, in the tracking step, the transported object is tracked by calculating position information of the transported object in the j-th image, based on the j-th image among the n images, the (j−1)-th or earlier image, and the position information of the transported object in the (j−1)-th or earlier image.
(Appendix 18)
The tracking method according to appendix 17, wherein, when the j-th image is the i-th detection target image, in the tracking step, the position of the transported object is tracked by calculating the position information of the transported object in the j-th image, based on the j-th image, the (i−1)-th or earlier detection target image, and the position information of the transported object in the (i−1)-th or earlier detection target image.
(Appendix 19)
In the tracking step, obtain positional information of the conveyed object detected in the detection target image,
The tracking method according to any one of appendices 15 to 18, including an erroneous detection determination step of determining whether the transported object is an erroneously detected object based on position information of the conveyed object obtained in the tracking process.
(Appendix 20)
The tracking method according to appendix 19, wherein, in the erroneous determination detection step, it is determined whether the conveyed object is an erroneously detected object based on an amount of change in the position of the detected conveyed object between a plurality of images.
(Appendix 21)
The tracking method according to any one of appendices 15 to 20, further including an association step of associating the same transported object between the detected transported object and the tracked transported object, based on a detection result of the transported object in the h-th detection target image among the k detection target images and a tracking result of the transported object in the image acquired next after the h-th detection target image among the n images.
(Appendix 22)
A transport object determination step for determining whether a transport object detected in the detection target image acquired before the image exists,
The tracking method according to any one of appendices 15 to 21, wherein the tracking is performed in the tracking step when it is determined in the transported object determination step that the detected transported object exists.
(Appendix 23)
The tracking method according to any one of appendices 15 to 22, wherein the detection step and the tracking step are performed in parallel.
(Appendix 24)
The tracking method according to any one of appendices 15 to 23, wherein in the detection step, the transported object is detected using a learning model capable of detecting the transported object.
(Appendix 25)
A method for counting transported objects, including:
a tracking step of tracking a transported object being transported by a transport device; and
a counting step of counting the tracked transported objects,
wherein the tracking step is the transported object tracking method according to any one of appendices 15 to 24.
(Appendix 26)
The counting method according to appendix 25, wherein the tracking step acquires position information of the transported object detected in the detection target image, and
the counting step counts the transported object based on the position information of the transported object.
(Appendix 27)
The counting method according to appendix 25 or 26, including a detection count measuring step of measuring the number of times the same transported object is detected,
wherein the counting step does not count a transported object whose detection count is equal to or less than a predetermined number.
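The count thresholding of appendix 27 — ignoring objects detected only a few times, which are likely spurious — can be sketched in a few lines. The input representation (a flat list of per-detection track ids) and the threshold value are assumptions for illustration.

```python
from collections import Counter

def count_transported(detected_ids, min_detections=2):
    """Count distinct transported objects, skipping any object whose
    detection count is equal to or less than `min_detections`
    (illustrative sketch of the appendix-27 rule).
    detected_ids: one track id per detection event, e.g. ["a", "a", "b"]."""
    seen = Counter(detected_ids)
    return sum(1 for c in seen.values() if c > min_detections)
```

For example, with detections `["a", "a", "a", "b", "c", "c", "c"]` and a threshold of 2, object "b" (seen once) is discarded and the count is 2.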
(Appendix 28)
A method for sorting transported objects, including:
a tracking step of tracking a transported object, including a target object, being transported by a transport device; and
a sorting step of sorting the target object from the tracked transported objects,
wherein the tracking step is the transported object tracking method according to any one of appendices 15 to 24.
(Appendix 29)
A program causing a computer to execute:
an image acquisition process of acquiring n images over time of a transported object being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images; and
a tracking process of tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image,
wherein k in the detection process is smaller than n.
(Appendix 30)
A program causing a computer to execute:
an image acquisition process of acquiring n images over time of a transported object being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images;
a tracking process of tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image; and
a counting process of counting the number of the tracked transported objects,
wherein k in the detection process is smaller than n.
(Appendix 31)
A program causing a computer to execute:
an image acquisition process of acquiring n images over time of a transported object, including a target object, being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images;
a tracking process of tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image; and
a sorting process of sorting the target object from the tracked transported objects,
wherein k in the detection process is smaller than n.
(Appendix 32)
A computer-readable recording medium in which the program according to any one of appendices 29 to 31 is recorded.
(Appendix 33)
A transported object tracking system including a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system,
the terminal includes an imaging device,
the imaging device captures n images over time of a transported object being transported by a transport device,
the server includes image acquisition means, detection means, and tracking means,
the image acquisition means acquires the n images of the transported object being transported by the transport device,
the detection means detects the transported object in k detection target images among the n images,
the tracking means tracks the detected transported object across the n images, and
k in the detection means is smaller than n.
(Appendix 34)
A transported object counting system including a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system,
the terminal includes an imaging device,
the imaging device captures n images over time of a transported object being transported by a transport device,
the server includes image acquisition means, detection means, tracking means, and counting means,
the image acquisition means acquires the n images of the transported object being transported by the transport device,
the detection means detects the transported object in k detection target images among the n images,
the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image,
the counting means counts the number of the tracked transported objects, and
k in the detection means is smaller than n.
(Appendix 35)
A transported object sorting system including a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system,
the terminal includes an imaging device and a sorting device,
the imaging device captures n images over time of a transported object, including a target object, being transported by a transport device,
the sorting device sorts the target object from among the tracked transported objects,
the server includes image acquisition means, detection means, tracking means, and sorting means,
the image acquisition means acquires the n images of the transported object, including the target object, being transported by the transport device,
the detection means detects the transported object in k detection target images among the n images,
the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, and
k in the detection means is smaller than n.

As described above, according to the present invention, the processing time can be shortened. For this reason, the present invention makes it possible, for example, to track products and the like in real time in a factory or the like, and is therefore extremely useful in the manufacturing industry and similar fields.
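The processing-time saving claimed above comes from running the (expensive) detector on only k of the n captured images and bridging the intermediate frames with lightweight tracking. The loop below is an illustrative sketch of that idea only, not the patented implementation: the stride-based selection of detection target images, the constant-velocity extrapolation, and the single-object simplification are all assumptions, and before the second detection no velocity estimate exists, so early tracked frames simply hold the last detected position.

```python
def track_sparse(detect, n, stride):
    """Estimate per-frame (x, y) positions of one transported object.
    detect(j) is invoked only on the detection target frames
    (k = ceil(n / stride) of the n frames, so k < n); the frames in
    between are tracked by extrapolating the per-frame velocity
    measured between the last two detections."""
    estimates = []
    pos = None
    vel = (0.0, 0.0)
    prev_meas, prev_j = None, None
    for j in range(n):
        if j % stride == 0:                       # detection target image
            meas = detect(j)
            if prev_meas is not None:             # update per-frame velocity
                dj = j - prev_j
                vel = ((meas[0] - prev_meas[0]) / dj,
                       (meas[1] - prev_meas[1]) / dj)
            pos = meas
            prev_meas, prev_j = meas, j
        else:                                     # tracking step only
            pos = (pos[0] + vel[0], pos[1] + vel[1])
        estimates.append(pos)
    return estimates
```

With a simulated conveyor moving 2 px per frame, `track_sparse(lambda j: (2.0 * j, 5.0), n=10, stride=5)` calls the detector only at frames 0 and 5 yet yields a position estimate for all 10 frames.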

This application claims priority based on Japanese Patent Application No. 2018-024150 filed on February 14, 2018, the entire disclosure of which is incorporated herein.

1, 1a, 1b, 1c, 1d, 1e  First object
2, 2a, 2b  Second object
10, 20, 30, 40  Tracking device
11  Data processing means
111  Image acquisition means
112  Detection means
113  Tracking means
114  Detection processing determination means
115  Transported object determination means
116  Erroneous detection determination means
117  Association means
118  Counting means
119  Sorting means
12  Data storage means
121  Image storage unit
122  Detection information storage unit
123  Tracking information storage unit
13  Input means
14  Output means
201  CPU
202  Memory
203  Bus
204  Storage device
205  Program
206  Input device
207  Display
208  Communication device
31a, 31b, 31c  Imaging device
32a, 32b, 32c  Communication interface
33  Communication network
34  Server
35a, 35b, 35c  Sorting device
50  Counting device
60  Sorting device

Claims (35)

1. A transported object tracking device comprising:
image acquisition means for acquiring n images over time of a transported object being transported by a transport device;
detection means for detecting the transported object in k detection target images selected from the n images; and
tracking means for tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image,
wherein k in the detection means is smaller than n.

2. The tracking device according to claim 1, comprising detection processing determination means for determining whether the detection means is performing detection on the l-th detection target image among the k detection target images after the image acquisition means has acquired the m-th image among the n images,
wherein, when the detection processing determination means determines that detection by the detection means is not being performed, the detection means detects the transported object using the m-th image as the (l+1)-th detection target image.

3. The tracking device according to claim 1 or 2, wherein the detection means acquires position information of the transported object, and
the tracking means tracks the transported object by calculating position information of the transported object in the j-th image among the n images, based on the j-th image, the images up to the (j−1)-th, and the position information of the transported object in the images up to the (j−1)-th.

4. The tracking device according to claim 3, wherein, when the j-th image is the i-th detection target image, the tracking means tracks the position of the transported object by calculating the position information of the transported object in the j-th image based on the j-th image, the detection target images up to the (i−1)-th, and the position information of the transported object in the detection target images up to the (i−1)-th.

5. The tracking device according to any one of claims 1 to 4, wherein the tracking means acquires position information of the transported object detected in the detection target image, and
the device comprises erroneous detection determination means for determining, based on the position information of the transported object obtained by the tracking means, whether the transported object is an erroneously detected object.

6. The tracking device according to claim 5, wherein the erroneous detection determination means determines whether the transported object is an erroneously detected object based on an amount of change in the position of the detected transported object between a plurality of images.

7. The tracking device according to any one of claims 1 to 6, comprising association means for associating the same transported object between the detected transported object and the tracked transported object, based on a detection result of the transported object in the h-th detection target image among the k detection target images and a tracking result of the transported object in the image acquired immediately after the h-th detection target image among the n images.

8. The tracking device according to any one of claims 1 to 7, comprising transported object determination means for determining whether a transported object detected in a detection target image acquired before the current image exists,
wherein the tracking means performs tracking when the transported object determination means determines that the detected transported object exists.

9. The tracking device according to any one of claims 1 to 8, wherein detection by the detection means and tracking by the tracking means are performed in parallel.

10. The tracking device according to any one of claims 1 to 9, wherein the detection means detects the transported object using a learning model capable of detecting the transported object.

11. A transported object counting device comprising:
tracking means for tracking a transported object being transported by a transport device; and
counting means for counting the tracked transported objects,
wherein the tracking means is the transported object tracking device according to any one of claims 1 to 10.

12. The counting device according to claim 11, wherein the tracking means acquires position information of the transported object detected in the detection target image, and
the counting means counts the transported object based on the position information of the transported object.

13. The counting device according to claim 11 or 12, comprising detection count measuring means for measuring the number of times the same transported object is detected,
wherein the counting means does not count a transported object whose detection count is equal to or less than a predetermined number.

14. A transported object sorting device comprising:
tracking means for tracking a transported object, including a target object, being transported by a transport device; and
sorting means for sorting the target object from the tracked transported objects,
wherein the tracking means is the transported object tracking device according to any one of claims 1 to 10.

15. A transported object tracking method comprising:
an image acquisition step of acquiring n images over time of a transported object being transported by a transport device;
a detection step of detecting the transported object in k detection target images selected from the n images; and
a tracking step of tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image,
wherein k in the detection step is smaller than n.

16. The tracking method according to claim 15, comprising a detection processing determination step of determining whether the detection step is performing detection on the l-th detection target image among the k detection target images after the m-th image among the n images has been acquired in the image acquisition step,
wherein, when the detection processing determination step determines that detection by the detection step is not being performed, the detection step detects the transported object using the m-th image as the (l+1)-th detection target image.

17. The tracking method according to claim 15 or 16, wherein position information of the transported object is acquired in the detection step, and
in the tracking step, the transported object is tracked by calculating position information of the transported object in the j-th image among the n images, based on the j-th image, the images up to the (j−1)-th, and the position information of the transported object in the images up to the (j−1)-th.

18. The tracking method according to claim 17, wherein, when the j-th image is the i-th detection target image, the tracking step tracks the position of the transported object by calculating the position information of the transported object in the j-th image based on the j-th image, the detection target images up to the (i−1)-th, and the position information of the transported object in the detection target images up to the (i−1)-th.

19. The tracking method according to any one of claims 15 to 18, wherein position information of the transported object detected in the detection target image is acquired in the tracking step, and
the method comprises an erroneous detection determination step of determining, based on the position information of the transported object obtained in the tracking step, whether the transported object is an erroneously detected object.

20. The tracking method according to claim 19, wherein, in the erroneous detection determination step, it is determined whether the transported object is an erroneously detected object based on an amount of change in the position of the detected transported object between a plurality of images.

21. The tracking method according to any one of claims 15 to 20, comprising an associating step of associating the same transported object between the detected transported object and the tracked transported object, based on a detection result of the transported object in the h-th detection target image among the k detection target images and a tracking result of the transported object in the image acquired immediately after the h-th detection target image among the n images.

22. The tracking method according to any one of claims 15 to 21, comprising a transported object determination step of determining whether a transported object detected in a detection target image acquired before the current image exists,
wherein the tracking step performs tracking when the transported object determination step determines that the detected transported object exists.

23. The tracking method according to any one of claims 15 to 22, wherein the detection step and the tracking step are performed in parallel.

24. The tracking method according to any one of claims 15 to 23, wherein, in the detection step, the transported object is detected using a learning model capable of detecting the transported object.

25. A transported object counting method comprising:
a tracking step of tracking a transported object being transported by a transport device; and
a counting step of counting the tracked transported objects,
wherein the tracking step is the transported object tracking method according to any one of claims 15 to 24.

26. The counting method according to claim 25, wherein the tracking step acquires position information of the transported object detected in the detection target image, and
the counting step counts the transported object based on the position information of the transported object.

27. The counting method according to claim 25 or 26, comprising a detection count measuring step of measuring the number of times the same transported object is detected,
wherein the counting step does not count a transported object whose detection count is equal to or less than a predetermined number.

28. A transported object sorting method comprising:
a tracking step of tracking a transported object, including a target object, being transported by a transport device; and
a sorting step of sorting the target object from the tracked transported objects,
wherein the tracking step is the transported object tracking method according to any one of claims 15 to 24.

29. A program causing a computer to execute:
an image acquisition process of acquiring n images over time of a transported object being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images; and
a tracking process of tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image,
wherein k in the detection process is smaller than n.

30. A program causing a computer to execute:
an image acquisition process of acquiring n images over time of a transported object being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images;
a tracking process of tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image; and
a counting process of counting the number of the tracked transported objects,
wherein k in the detection process is smaller than n.

31. A program causing a computer to execute:
an image acquisition process of acquiring n images over time of a transported object, including a target object, being transported by a transport device;
a detection process of detecting the transported object in k detection target images selected from the n images;
a tracking process of tracking, in an image acquired after the k-th detection target image among the n images, the transported object detected in the k-th detection target image; and
a sorting process of sorting the target object from the tracked transported objects,
wherein k in the detection process is smaller than n.

32. A computer-readable recording medium on which the program according to any one of claims 29 to 31 is recorded.

33. A transported object tracking system including a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system,
the terminal includes an imaging device,
the imaging device captures n images over time of a transported object being transported by a transport device,
the server includes image acquisition means, detection means, and tracking means,
the image acquisition means acquires the n images of the transported object being transported by the transport device,
the detection means detects the transported object in k detection target images among the n images,
the tracking means tracks the detected transported object across the n images, and
k in the detection means is smaller than n.

34. A transported object counting system including a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system,
the terminal includes an imaging device,
the imaging device captures n images over time of a transported object being transported by a transport device,
the server includes image acquisition means, detection means, tracking means, and counting means,
the image acquisition means acquires the n images of the transported object being transported by the transport device,
the detection means detects the transported object in k detection target images among the n images,
the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image,
the counting means counts the number of the tracked transported objects, and
k in the detection means is smaller than n.

35. A transported object sorting system including a terminal and a server,
wherein the terminal and the server are connectable via a communication network outside the system,
the terminal includes an imaging device and a sorting device,
the imaging device captures n images over time of a transported object, including a target object, being transported by a transport device,
the sorting device sorts the target object from among the tracked transported objects,
the server includes image acquisition means, detection means, tracking means, and sorting means,
the image acquisition means acquires the n images of the transported object, including the target object, being transported by the transport device,
the detection means detects the transported object in k detection target images among the n images,
the tracking means tracks, in the images acquired after each detection target image among the n images, the transported object detected in that detection target image, and
k in the detection means is smaller than n.
PCT/JP2018/034186 2018-02-14 2018-09-14 Goods tracker, goods counter, goods-tracking method, goods-counting method, goods-tracking system, and goods-counting system Ceased WO2019159409A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020500259A JP6989178B2 (en) 2018-02-14 2018-09-14 Transport item tracking device, transport item counting device, transport item tracking method, transport item counting method, transport item tracking system, and transport item counting system.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018024150 2018-02-14
JP2018-024150 2018-11-02

Publications (1)

Publication Number Publication Date
WO2019159409A1 true WO2019159409A1 (en) 2019-08-22

Family

ID=67619900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/034186 Ceased WO2019159409A1 (en) 2018-02-14 2018-09-14 Goods tracker, goods counter, goods-tracking method, goods-counting method, goods-tracking system, and goods-counting system

Country Status (2)

Country Link
JP (1) JP6989178B2 (en)
WO (1) WO2019159409A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015066227A (en) * 2013-09-30 2015-04-13 富士フイルム株式会社 Drug counting apparatus and method
JP2017109161A (en) * 2015-12-15 2017-06-22 ウエノテックス株式会社 Waste screening system and screening method therefor
JP2017186106A (en) * 2016-04-01 2017-10-12 株式会社東芝 Delivery support device, delivery support system, and delivery support program


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7136303B1 (en) 2021-09-30 2022-09-13 日本電気株式会社 Counting device, counting method and computer program
WO2023053473A1 (en) * 2021-09-30 2023-04-06 日本電気株式会社 Counting device, counting method, and recording medium
JP2023050231A (en) * 2021-09-30 2023-04-11 日本電気株式会社 Counting device, counting method and computer program
WO2024013934A1 (en) * 2022-07-14 2024-01-18 株式会社Fuji Substrate conveyance device and substrate detection method
JP2024105954A (en) * 2023-01-26 2024-08-07 ヤンマーホールディングス株式会社 Quality control method, quality control system, and quality control program

Also Published As

Publication number Publication date
JP6989178B2 (en) 2022-01-05
JPWO2019159409A1 (en) 2020-12-03

Similar Documents

Publication Publication Date Title
CN113632099B (en) Distributed product defect analysis system, method and computer readable storage medium
US11378522B2 (en) Information processing apparatus related to machine learning for detecting target from image, method for controlling the same, and storage medium
CN109829397B (en) Video annotation method and system based on image clustering and electronic equipment
US8135172B2 (en) Image processing apparatus and method thereof
EP3001270A2 (en) Work management system and work management method
US20150193698A1 (en) Data processing device
JP7316731B2 (en) Systems and methods for detecting and classifying patterns in images in vision systems
CN113111844B (en) Operation posture evaluation method and device, local terminal and readable storage medium
JP7134331B2 (en) Counting system, counting device, machine learning device, counting method, parts arrangement method, and program
WO2019159409A1 (en) Goods tracker, goods counter, goods-tracking method, goods-counting method, goods-tracking system, and goods-counting system
JP2019106119A (en) Detection system, information processing apparatus, evaluation method, and program
TW202242390A (en) Defect inspection device, defect inspection method, and manufacturing method
US20190114785A1 (en) System for real-time moving target detection using vision based image segmentation
WO2021233058A1 (en) Method for monitoring articles on shop shelf, computer and system
WO2023221770A1 (en) Dynamic target analysis method and apparatus, device, and storage medium
WO2021049119A1 (en) Learning device, learning method, and non-transitory computer-readable medium in which learning program has been stored
CN113255651A (en) Package security check method, device and system, node equipment and storage device
CN109743497B (en) Data set acquisition method and system and electronic device
US20230169452A1 (en) System Configuration for Learning and Recognizing Packaging Associated with a Product
CN113902939A (en) Large defect detection method for industrial products based on twin network
Dong et al. Connecting finger defects in flexible touch screen inspected with machine vision based on YOLOv8n
EP3647236A1 (en) Projection instruction device, baggage sorting system, and projection instruction method
CN116385426A (en) Textile surface defect detection method and related equipment
US20230229119A1 (en) Robotic process automation (rpa)-based data labelling
CN114596576A Image processing method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 18906684
    Country of ref document: EP
    Kind code of ref document: A1
ENP Entry into the national phase
    Ref document number: 2020500259
    Country of ref document: JP
    Kind code of ref document: A
NENP Non-entry into the national phase
    Ref country code: DE
122 EP: PCT application non-entry in European phase
    Ref document number: 18906684
    Country of ref document: EP
    Kind code of ref document: A1