WO2014125882A1 - Information processing system, information processing method, and program - Google Patents
- Publication number
- WO2014125882A1 (PCT/JP2014/051214)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- time
- moving body
- person
- information processing
- video camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
Definitions
- Some aspects according to the present invention relate to an information processing system, an information processing method, and a program.
- In Patent Document 1, a person is tracked by determining whether or not the person photographed by each video camera is the same person, based on the similarity between the feature quantity of the person photographed by the video camera and the feature quantity of the person being tracked.
- In this technique, the tracking process for a person is terminated once a specified predetermined time has elapsed.
- Consequently, if the tracked person does not appear within the shooting range of any video camera by that predetermined time, tracking of the person cannot be continued.
- For example, if the tracked person carries out some criminal activity for a long time outside the shooting range of the video cameras (in a blind spot), tracking cannot be continued, and the system cannot fully fulfill its role of person tracking.
- Some aspects of the present invention have been made in view of the above problems, and one of their purposes is to provide an information processing system, an information processing method, and a program capable of suitably performing person tracking using the videos of a plurality of video cameras.
- The information processing system according to the present invention includes: means for calculating the time from when a first moving body frames out of the image of a first video camera until a second moving body frames into the image of a second video camera; and determination means for determining whether the first moving body and the second moving body are the same moving body, based on an attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
- The information processing method according to the present invention includes: a step of calculating the time from when a first moving body frames out of the video of a first video camera until a second moving body frames into the video of a second video camera; and a step in which the information processing system determines whether the first moving body and the second moving body are the same moving body, based on an attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
- The program according to the present invention causes a computer to execute: a process of calculating the time from when a first moving body frames out of the video of a first video camera until a second moving body frames into the video of a second video camera; and a process of determining whether the first moving body and the second moving body are the same moving body, based on an attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
- In this document, “part”, “means”, “apparatus”, and “system” do not simply mean physical means; they also cover cases where the functions of the “part”, “means”, “apparatus”, or “system” are realized by software. Further, the functions of one “part”, “means”, “apparatus”, or “system” may be realized by two or more physical means or devices, and conversely the functions of two or more “parts”, “means”, “apparatuses”, or “systems” may be realized by a single physical means or device.
- According to the present invention, it is possible to provide an information processing system, an information processing method, and a program that can suitably perform person tracking using images from a plurality of video cameras.
- FIG. 1 is a block diagram showing a system configuration of the monitoring system 1.
- The monitoring system 1 is broadly composed of an information processing server 100 and a plurality of video cameras 200 (video cameras 200A to 200N, collectively referred to as the video cameras 200) that capture video (moving images).
- the monitoring system 1 will be described as a system for monitoring (tracking) a person photographed by the video camera 200 as a photographing apparatus, but the monitoring target is not limited to this.
- For example, the system may be applied to various moving objects such as cars, bicycles, and motorcycles.
- The video camera 200 captures video (moving images), determines whether or not a person is present in the captured video, and transmits to the information processing server 100, together with the captured video, a person detection result that includes information such as the person's position (including the movement trajectory within the video) and feature quantity.
- the video camera 200 can also track a person in the video (within the angle of view) by comparing the shot video between frames.
- Processes such as person detection, feature amount extraction, and in-camera person tracking may be performed not on the video camera 200 but on, for example, the information processing server 100 or another information processing apparatus (not shown).
- the information processing server 100 analyzes the video shot by the video camera 200 to determine whether or not the person shot by each video camera 200 is the same person. By repeating this operation, each person is tracked.
- More specifically, the information processing server 100 first specifies the attributes of the photographed person, and then determines the identity of the person based on those attributes, the time from framing out of the video of one video camera 200 until framing into the video of the next video camera 200, and the similarity of the feature quantities.
- Here, the attribute of a person refers to an attribute that causes variation in moving speed (that is, variation in travel time). Examples include gait features (such as wandering or a staggering gait) and personal belongings (such as carrying a cane or luggage).
- The video processed by the information processing server 100 is not limited to real-time video shot by the video camera 200; it is also conceivable to track (analyze) video that was shot by the video camera 200 and then stored in a storage device (for example, an HDD (Hard Disk Drive) or a VCR (Video Cassette Recorder)).
- Furthermore, it is also conceivable to play back the video stored in the storage device in reverse order (reverse playback) and perform tracking on it.
- Normally, when a person takes a suspicious action, it is necessary to find out by what route the person reached that point and what actions the person took beforehand, so being able to track by reverse playback is extremely useful.
- During person monitoring (person tracking), the information processing server 100 outputs various display screens, such as a monitoring screen, to the display device 300, and receives operation signals for various operation inputs related to person monitoring from the input device 400.
- More specifically, for example, the monitoring screen displayed on the display device 300 shows a plurality of videos input from the video cameras 200, so that the user acting as a monitor can grasp where the person to be monitored is now.
- The display device 300 is, for example, a display that shows images using liquid crystal, organic EL (Electro Luminescence), or the like.
- the display device 300 displays the monitoring screen output from the information processing server 100.
- the input device 400 is a device for a user (monitor) to input various information.
- various pointing devices such as a mouse, a touch pad, and a touch panel, a keyboard, and the like correspond to the input device 400.
- Processing such as registration of the person to be monitored is performed based on a user operation on the input device 400.
- the configurations of the information processing server 100, the display device 300, and the input device 400 can be variously changed.
- For example, the display device 300 and the input device 400 may be realized as a single client, or the functions of the information processing server 100, the display device 300, and the input device 400 may be realized by three or more information processing apparatuses.
- the client may have some functions of the information processing server 100 according to the present embodiment.
- Although the route from the shooting range a1 to the shooting range a2 is not branched here, it may be branched, and there may be points other than the shooting range a2 to which the person P1 can move from the shooting range a1.
- If the speed V1 of the person P1 and the distance D between the shooting range a1 and the shooting range a2 are known, the time until the person P1 reaches the shooting range a2 after framing out of the video of the camera C1 can be predicted to be in the vicinity of D / V1.
- Since the distance between the shooting range a1 and the shooting range a2 is known in advance when the video cameras C1 and C2 are installed, the distance may be registered in a database (corresponding to the inter-camera information 451 described later) so that it can be read out.
- the moving speed V1 of the person P1 can be specified by analyzing the video of the video camera C1.
- Accordingly, the score for the time at which the person P1 who framed out of the video of the video camera C1 frames into the video of the video camera C2 (herein referred to as the appearance time score) is highest in the vicinity of D / V1 after the frame-out time, and becomes lower as the time moves further from that point.
- The appearance time score can therefore be used as a score for determining whether or not a person newly framed into the video of the video camera C2 is the same person as the person P1 who framed out of the video of the video camera C1.
- This technique is based on the premise that the moving speed of the person P1 from the shooting range a1 to the shooting range a2 remains substantially constant at the moving speed V1 observed in the video of the video camera C1.
- However, the moving speed of a person is not necessarily constant: depending on each person's attributes, there are cases where the moving speed can be estimated as substantially constant and cases where the variation in moving speed is large.
- Examples of such attributes include being a visitor who is visiting the place for the first time, or having gaze features such as glancing around or looking at the camera, from which the person can be presumed to be suspicious (for example, a blacklisted person, a person whose past travel times differ significantly from the average, a person who avoids others, etc.).
- The information processing server 100 therefore compares the feature quantity of the person shown in the video of the video camera C1 with that of the person shown in the video of the video camera C2 and calculates the similarity between them; the higher the similarity, the higher the possibility that the two are the same person. Based on this similarity and the appearance time score, it determines whether the person who framed out of the video of the video camera C1 and the person who framed into the video of the video camera C2 are the same person.
- The same-person determination is made using the appearance time score. Specifically, a score for determining whether or not two persons are the same can be calculated by the following formula.
- α is a parameter for deciding whether the same-person determination emphasizes the appearance time score or the feature amount similarity.
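The patent text names the parameter α but the formula itself is not reproduced in this extraction, so the linear weighting below is an assumption: a minimal sketch of how α could trade off the appearance time score against the feature amount similarity.

```python
def integrated_score(appearance_time_score: float,
                     feature_similarity: float,
                     alpha: float = 0.5) -> float:
    """Combine the appearance time score and the feature amount similarity.

    The linear form is assumed: the text only states that alpha decides
    which of the two terms is emphasized. Both inputs are assumed to be
    normalized to the range [0, 1].
    """
    return alpha * appearance_time_score + (1.0 - alpha) * feature_similarity
```

With α close to 0 the determination relies mainly on the feature amount similarity, which matches the suggestion above for persons whose attributes imply a widely varying travel time.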
- Let T0 be the time at which the person P1 is predicted to reach the video of the video camera C2 based on the moving speed V1 of the person P1 (T0 corresponds to the elapse of D / V1 after the person P1 frames out of the video of the video camera C1).
- For example, the time zone can be set from (1 − β)T0 to (1 + 2β)T0. The reason the time (2βT0) from the predicted appearance time T0 to the upper limit of the time zone, (1 + 2β)T0, is made longer than the time (βT0) from the lower limit, (1 − β)T0, to T0 is that the variation in a person's moving speed is considered wider toward delay than toward haste.
- β is a parameter for setting the width of the time zone in which the same-person determination is performed. The width of this time zone should be determined by the magnitude of the variation in the person's appearance time: β may be set larger for a person having an attribute thought to produce a large variation in travel time, and smaller for a person having an attribute thought to produce a small variation.
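The window described above can be sketched directly; the window form [(1 − β)T0, (1 + 2β)T0] follows the text, while the numeric values of β per attribute are illustrative assumptions.

```python
def determination_window(t0: float, beta: float) -> tuple[float, float]:
    """Time zone in which same-person candidates are considered.

    t0 is the predicted appearance time (about D / V1 after frame-out).
    The margin above t0 (2 * beta * t0) is twice the margin below it
    (beta * t0), reflecting that delays are considered more likely than
    early arrivals.
    """
    return (1.0 - beta) * t0, (1.0 + 2.0 * beta) * t0

# Illustrative values: predicted arrival 10 s after frame-out.
# A person with a cane (large travel-time variation) gets a larger beta.
window_default = determination_window(10.0, 0.1)    # roughly (9.0, 12.0)
window_with_cane = determination_window(10.0, 0.3)  # roughly (7.0, 16.0)
```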
- Alternatively, for a person having an attribute that affects moving speed, the determination may be based mainly on the feature amount similarity.
- Of course, the appearance time score and the feature amount similarity may also be calculated and combined. The following description focuses on the method of combining the appearance time score and the feature amount similarity.
- FIG. 4 is a functional block diagram showing a functional configuration of the information processing server 100 according to the present embodiment.
- The information processing server 100 includes an image information acquisition unit 410, a person extraction unit 420, a person feature amount similarity calculation unit 430, an appearance time score calculation unit 440, a database (DB) 450, a person attribute acquisition unit 460, an integrated score calculation unit 470, and a person association unit 480.
- the image information acquisition unit 410 acquires the video directly from the video camera 200 or the video captured by the video camera 200 from the storage medium.
- the person extraction unit 420 specifies a person image area from each frame of the video captured by the video camera 200 and calculates a feature amount of each person image.
- The person feature quantity similarity calculation unit 430 calculates the similarity of the person feature quantities extracted by the person extraction unit 420. In general, persons whose feature quantities have a high similarity can be considered highly likely to be the same person. At this time, the similarity may be calculated only for persons within the time width (illustrated in FIG. 3) determined based on the person attribute specified by the person attribute acquisition unit 460.
- The appearance time score calculation unit 440 calculates a score (the appearance time score) indicating the likelihood that a person shown in the video shot by one video camera 200 and a person shown in the video shot by another video camera 200 are the same person.
- As described above, the score for determining whether a person newly framed into the video of the video camera C2 is the same person as the person P1 who framed out of the video of the video camera C1 can be obtained as the appearance time score, using as its variable the time elapsed since the person P1 framed out of the video of the video camera C1.
- The function for obtaining the appearance time score can be expressed using, for example, the step-like function shown in FIG. 5(a) or the mountain-shaped function shown in FIG. 5(b).
- When the mountain-shaped function of FIG. 5(b) is used, as described above, the appearance time score with which the person P1 who framed out of the video of the video camera C1 frames into the video of the video camera C2 is highest in the vicinity of D / V1 after the frame-out time, and becomes lower as the time moves further from that point.
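The two function shapes can be sketched as follows; the exact curves in FIG. 5 are not reproduced in this extraction, so a rectangular step and a piecewise-linear peak over the window [(1 − β)T0, (1 + 2β)T0] are assumed (both assume β > 0 and T0 > 0).

```python
def appearance_time_score_step(t: float, t0: float, beta: float) -> float:
    """Step-like function (cf. FIG. 5(a), shape assumed): full score inside
    the window [(1 - beta) * t0, (1 + 2 * beta) * t0], zero outside."""
    return 1.0 if (1.0 - beta) * t0 <= t <= (1.0 + 2.0 * beta) * t0 else 0.0


def appearance_time_score_peak(t: float, t0: float, beta: float) -> float:
    """Mountain-shaped function (cf. FIG. 5(b), shape assumed): the score
    peaks at the predicted appearance time t0 (about D / V1 after frame-out)
    and falls off linearly, reaching zero at the window edges. The falling
    edge is gentler because the upper margin is twice as wide."""
    if t <= t0:
        lo = (1.0 - beta) * t0
        return max(0.0, (t - lo) / (t0 - lo))
    hi = (1.0 + 2.0 * beta) * t0
    return max(0.0, (hi - t) / (hi - t0))
```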
- the appearance time score calculation unit 440 refers to the inter-camera information 451 stored in the DB 450 when calculating the appearance time score.
- The inter-camera information 451 includes information such as the distances and positional relationships between the video cameras 200 (that is, from the shooting range of which video camera 200 a person can move to the shooting range of which other video camera 200).
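The structure of the inter-camera information 451 is not specified in detail, so the mapping below is a hypothetical sketch of how the distance between directly reachable camera pairs could be stored and used to predict the arrival time D / V1.

```python
# Hypothetical layout of the inter-camera information 451: for each ordered
# pair of cameras whose shooting ranges are directly reachable, the distance
# between the ranges in meters.
INTER_CAMERA_INFO = {
    ("C1", "C2"): {"distance_m": 20.0},
}


def predicted_arrival_time(cam_from: str, cam_to: str, speed_m_s: float) -> float:
    """Predicted time T0 (in seconds) until frame-in, i.e. D / V1."""
    distance = INTER_CAMERA_INFO[(cam_from, cam_to)]["distance_m"]
    return distance / speed_m_s
```

For a person leaving C1 at 2 m/s, for example, the predicted appearance at C2 would be 10 seconds after frame-out.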
- the person attribute acquisition unit 460 identifies the attribute of the person extracted by the person extraction unit 420.
- the attribute specified here is an attribute that affects the movement speed of the person as described above.
- Examples of such attributes include having gait features such as wandering or a staggering gait, possessing belongings such as a cane or luggage, being a visitor who is visiting the place for the first time, or having gaze features such as glancing around or looking at the camera, from which the person can be presumed to be suspicious (for example, a blacklisted person, a person whose past travel times differ significantly from the average, a person who avoids others, etc.).
- Gait features such as wandering or a staggering gait can be determined from the trajectory of the foot positions of the person in the video, and possessions such as a cane or luggage can likewise be determined from the person image in the video.
- Whether the person is glancing at the camera can be determined from the line-of-sight direction, which is obtained by extracting the face region and face orientation from the image through comparison with pre-registered facial features of persons in various face orientations, estimating the positions of the eyes in the face region based on a model, and determining the positions of the pupils within the eyes from the density distribution of the eye regions.
- Based on the feature quantity similarity calculated by the person feature amount similarity calculation unit 430 and the appearance time score calculated by the appearance time score calculation unit 440, the integrated score calculation unit 470 calculates a score for determining whether a person shown in the video of one video camera 200 and a person shown in the video of another video camera 200 are the same person. As a method of calculating this score, for example, the method shown in “1.2.1” above can be used.
- the person associating unit 480 associates persons determined to be the same person among the persons photographed by the respective video cameras 200 based on the score calculated by the integrated score calculating unit 470. At this time, whether or not they are the same person can be determined based on, for example, whether or not the score exceeds a certain threshold. If it is known which person shown in each video camera 200 is the same person, the movement history of each person can be specified, that is, the person can be tracked.
- FIG. 6 is a flowchart showing a processing flow of the information processing server 100 according to the present embodiment.
- Each processing step described below can be executed in an arbitrary order or in parallel as long as no contradiction arises in the processing contents, and another step may be added between processing steps. Further, a step described as a single step for convenience may be executed divided into a plurality of steps, and steps described as divided for convenience may be executed as a single step.
- the person extraction unit 420 extracts a person region from the image input from the image information acquisition unit 410 (S501), and extracts a feature amount from the extracted person region (person image) (S503).
- Next, the person feature quantity similarity calculation unit 430 calculates the similarity between a person shown in the video of a certain video camera 200 and a person shown in the video of a video camera 200 located where movement from the shooting range of the former camera is possible (S505). At this time, as described with reference to FIG. 3, a time width may be applied to the person extraction.
- The appearance time score calculation unit 440 calculates an appearance time score based on the difference between the time at which a person framed out of the video of one video camera 200 and the time at which a person framed into the video of another video camera 200 (S507).
- the person attribute acquisition unit 460 identifies the attribute of each person shown on the video camera 200 (S509). Since a specific example of this specifying method has been described in “1.3”, description thereof is omitted here.
- the integrated score calculation unit 470 includes the person attributes specified by the person attribute acquisition unit 460, the feature amount similarity calculated by the person feature amount similarity calculation unit 430, and the appearance time score calculated by the appearance time score calculation unit 440. Based on the above, a score for determining whether or not the person shown in each video camera 200 is the same person is calculated (S511), and the person association unit 480 determines the correspondence between the persons based on the score. Determine (S513).
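The flow S505–S513 for a single framed-out person can be sketched end to end; all function names, data shapes, weights, and the threshold below are illustrative assumptions, not the patent's implementation.

```python
from typing import Optional


def associate(person_out: dict, candidates: list,
              threshold: float = 0.6) -> Optional[dict]:
    """Pick the best-scoring frame-in candidate for one framed-out person.

    Each candidate dict carries the feature amount similarity (S505) and
    the appearance time score (S507) already computed against person_out.
    """
    best, best_score = None, threshold
    for cand in candidates:
        # S509/S511: an attribute affecting moving speed lowers the weight
        # of the appearance time score (the weights 0.2 / 0.5 are assumed).
        alpha = 0.2 if person_out["speed_attribute"] else 0.5
        score = alpha * cand["time_score"] + (1.0 - alpha) * cand["similarity"]
        if score >= best_score:
            best, best_score = cand, score
    # S513: associate only when some candidate clears the threshold.
    return best
```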
- The information processing server 100 includes a processor 601, a memory 603, a storage device 605, an input interface (I/F) 607, a data I/F 609, a communication I/F 611, and a display device 613.
- The processor 601 controls various processes in the information processing server 100 by executing a program stored in the memory 603. For example, the processing of the image information acquisition unit 410, the person extraction unit 420, the person feature amount similarity calculation unit 430, the appearance time score calculation unit 440, the person attribute acquisition unit 460, the integrated score calculation unit 470, and the person association unit 480 described with reference to FIG. 4 can be realized as a program that is temporarily stored in the memory 603 and then runs mainly on the processor 601.
- the memory 603 is a storage medium such as a RAM (Random Access Memory).
- the memory 603 temporarily stores a program code of a program executed by the processor 601 and data necessary for executing the program. For example, in the storage area of the memory 603, a stack area necessary for program execution is secured.
- the storage device 605 is a non-volatile storage medium such as a hard disk or a flash memory.
- The storage device 605 stores an operating system, the various programs for realizing the image information acquisition unit 410, the person extraction unit 420, the person feature amount similarity calculation unit 430, the appearance time score calculation unit 440, the person attribute acquisition unit 460, the integrated score calculation unit 470, and the person association unit 480, and various data such as the inter-camera information 451 included in the DB 450. Programs and data stored in the storage device 605 are loaded into the memory 603 as necessary and referred to by the processor 601.
- the input I / F 607 is a device for receiving input from the user.
- the input device 400 described in FIG. 1 can also be realized by the input I / F 607.
- Specific examples of the input I / F 607 include a keyboard, a mouse, a touch panel, and various sensors.
- the input I / F 607 may be connected to the information processing server 100 via an interface such as USB (Universal Serial Bus).
- the data I / F 609 is a device for inputting data from outside the information processing server 100.
- a specific example of the data I / F 609 includes a drive device for reading data stored in various storage devices.
- the data I / F 609 may be provided outside the information processing server 100. In this case, the data I / F 609 is connected to the information processing server 100 via an interface such as a USB.
- the communication I / F 611 is a device for performing data communication with an external device of the information processing server 100, such as a video camera 200, by wire or wireless.
- the communication I / F 611 may be provided outside the information processing server 100. In that case, the communication I / F 611 is connected to the information processing server 100 via an interface such as a USB.
- the display device 613 is a device for displaying various information.
- the display device 300 described in FIG. 1 can also be realized by the display device 613.
- Specific examples of the display device 613 include a liquid crystal display and an organic EL (Electro-Luminescence) display.
- the display device 613 may be provided outside the information processing server 100. In that case, the display device 613 is connected to the information processing server 100 via, for example, a display cable.
- As described above, in the monitoring system 1 according to the present embodiment, the identity determination method is changed according to whether or not the person has an attribute that affects the moving speed. Specifically, for example, for a person having an attribute that is likely to affect the moving speed, the weight of the identity determination based on the appearance time score is reduced, or the time zone for identity determination is widened. As a result, person tracking can be performed suitably.
- FIG. 8 is a block diagram illustrating a functional configuration of the monitoring apparatus 700 that is an information processing system. As illustrated in FIG. 8, the monitoring apparatus 700 includes a similarity calculation unit 710, a time calculation unit 720, an attribute specifying unit 730, and a determination unit 740.
- The similarity calculation unit 710 calculates the degree of similarity between a moving body (herein referred to as the first moving body) shown in the video of a certain video camera (herein referred to as the first video camera) among a plurality of video cameras (not shown) and a moving body (herein referred to as the second moving body) shown in the video of another video camera (herein referred to as the second video camera).
- Examples of the moving body include a person, a car, a bicycle, and a motorcycle.
- The time calculation unit 720 calculates the time from when the first moving body frames out of the video of the first video camera until the second moving body frames into the video of the second video camera.
- the attribute specifying unit 730 specifies the attribute of the first moving object reflected on the first video camera.
- Here, the attribute of the moving body refers to, for example, an attribute that affects its moving speed. For example, if the moving body is a person, examples include having gait features such as wandering or a staggering gait, possessing personal belongings such as a cane or luggage, being a visitor who is visiting the place for the first time, or having gaze features such as glancing around or looking at the camera, from which the person can be presumed to be suspicious (for example, a blacklisted person, a person whose past travel times differ significantly from the average, a person who avoids others, etc.).
- Based on the attribute, the similarity, and the calculated time, the determination unit 740 determines whether the first moving body and the second moving body are the same moving body.
- (Appendix 1) An information processing system comprising: calculation means for calculating a time from when a first moving body frames out of the video of a first video camera until a second moving body frames into the video of a second video camera; and determination means for determining whether or not the first moving body and the second moving body are the same moving body, based on an attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
- Appendix 5 The information processing system according to appendix 4, wherein the degree of influence of the similarity and the time on the determination result differs according to the attribute of the first moving object.
- The determination means performs the same-moving-body determination on a second moving body whose time falls within a time width set according to the attribute of the first moving body.
- The time width is set to straddle the average moving time from the shooting range of the first video camera to the shooting range of the second video camera, and the time from the upper limit of the time width to the average moving time is longer than the time from the lower limit of the time width to the average moving time.
- Appendix 8 The information processing system according to any one of appendix 1 to appendix 7, wherein the attribute has a correlation with a moving speed of the moving object.
- Appendix 10 The information processing method according to appendix 9, further comprising a step of specifying an attribute of the first moving object.
- The time width is set to straddle the average moving time from the shooting range of the first video camera to the shooting range of the second video camera, and the time from the upper limit of the time width to the average moving time is longer than the time from the lower limit of the time width to the average moving time.
- (Appendix 17) A program causing a computer to execute: a process of calculating a time from when a first moving body frames out of the video of a first video camera until a second moving body frames into the video of a second video camera; and a process of determining whether the first moving body and the second moving body are the same moving body, based on an attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
- Appendix 18 The program according to appendix 17, further comprising a process of specifying an attribute of the first moving object.
- Appendix 19 The program according to appendix 17 or appendix 18, further comprising a process of calculating the similarity between the first moving body shown in the video of the first video camera and the second moving body shown in the video of the second video camera.
- Appendix 20 The program according to any one of appendix 17 to appendix 19, wherein the influence of the time on the determination result differs according to the attribute of the first moving object.
- Appendix 21 The program according to appendix 20, wherein the degree of influence of the similarity and the time on the determination result differs according to the attribute of the first moving object.
- (Appendix 23) The program according to appendix 22, wherein the time width is set across the average moving time from the shooting range of the first video camera to the shooting range of the second video camera, and the time from the upper limit of the time width to the average moving time is longer than the time from the lower limit of the time width to the average moving time.
- (Appendix 24) The program according to any one of appendices 17 to 23, wherein the attribute has a correlation with the moving speed of the moving body.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
- Alarm Systems (AREA)
Abstract
Description
FIGS. 1 to 7 illustrate the first embodiment. This embodiment is described below with reference to these figures, in the following order. Section 1.1 gives an overview of the system configuration, and Section 1.2 outlines its operation. Section 1.3 then describes the functional configuration of the system, Section 1.4 the processing flow, and Section 1.5 a concrete example of a hardware configuration capable of implementing the system. Finally, Section 1.6 and the sections that follow describe the effects of this embodiment.
The system configuration of the monitoring system 1, which is the information processing system according to this embodiment, is described with reference to FIG. 1. FIG. 1 is a block diagram showing the system configuration of the monitoring system 1.
(1.2.1 Method of controlling the integration ratio of the appearance-time score and the feature value)
First, the person tracking method according to this embodiment is described with reference to FIG. 2. In FIG. 2, video camera C1 captures shooting range a1 and video camera C2 captures shooting range a2, and the tracked person P1 in shooting range a1 is assumed to be moving toward shooting range a2 at moving speed V1.
Specifically, a score for determining whether two persons are the same person can be calculated by the following formula.
It is also possible to set a time window within which the identity determination is performed. For example, let T0 be the time at which person P1 is predicted to reach the image of video camera C2 based on the moving speed V1 (T0 corresponds to the point D/V1 after person P1 frames out of the image of video camera C1). The time T during which a person is tested for identity with person P1 can then be set as follows. The description below assumes γ = 2β (see FIG. 3).
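The time-window setting above can be sketched in code. This is an illustrative sketch, not part of the patent: the function name and the concrete values of D, V1, β, and the frame-out time are assumptions; only the relations T0 = frame-out time + D/V1 and γ = 2β come from the text.

```python
def identity_check_window(frame_out_time, distance, speed, beta):
    """Return the (start, end) of the time window in which a person
    appearing on camera C2 is tested for identity with person P1.

    T0 is the predicted arrival time: the frame-out time plus D / V1.
    The window is asymmetric around T0 (gamma = 2 * beta), since a
    person is more likely to arrive later than predicted (e.g. after
    stopping on the way) than earlier.
    """
    t0 = frame_out_time + distance / speed  # predicted arrival time T0
    gamma = 2 * beta                        # gamma = 2*beta, per the example
    return (t0 - beta, t0 + gamma)

# Hypothetical values: P1 frames out at t = 10 s, D = 20 m, V1 = 1.0 m/s,
# beta = 5 s, so T0 = 30 s and the window is (25.0, 40.0).
start, end = identity_check_window(10.0, 20.0, 1.0, 5.0)
```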
Next, the functional configuration of the information processing server 100 according to this embodiment is described with reference to FIG. 4. FIG. 4 is a functional block diagram showing the functional configuration of the information processing server 100 according to this embodiment.
The processing flow of the information processing server 100 is described below with reference to FIG. 6. FIG. 6 is a flowchart showing the processing flow of the information processing server 100 according to this embodiment.
An example of a hardware configuration for implementing the information processing server 100 described above on a computer is described with reference to FIG. 7. As noted above, the functions of the information processing server 100 may also be implemented by a plurality of information processing devices.
As described above, the monitoring system 1 according to this embodiment changes the identity determination method depending on whether a person has an attribute that affects moving speed. Specifically, for a person with an attribute likely to affect moving speed, the weight given to the appearance-time score in the identity determination is lowered, or the time window for the identity determination is widened. This enables suitable person tracking.
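The attribute-dependent integration described in this section can be sketched as follows. This is a hypothetical illustration: the weight values, the example attribute, and the function name are assumptions, not taken from the patent; it only mirrors the idea that the appearance-time score is weighted less for persons whose attribute makes their arrival time hard to predict.

```python
def identity_score(feature_similarity, appearance_time_score, has_speed_attribute):
    """Combine the feature similarity and the appearance-time score.

    For a person with an attribute likely to affect moving speed
    (e.g. carrying large luggage), the arrival-time prediction is less
    reliable, so the appearance-time score is given a smaller weight.
    """
    w = 0.2 if has_speed_attribute else 0.5  # assumed example weights
    return (1 - w) * feature_similarity + w * appearance_time_score

# The same pair of observations is scored differently per attribute:
normal = identity_score(0.8, 0.4, has_speed_attribute=False)
slowed = identity_score(0.8, 0.4, has_speed_attribute=True)
```

With these assumed weights, a low appearance-time score penalizes the match less when the attribute is present, because the feature similarity dominates.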
The second embodiment is described below with reference to FIG. 8. FIG. 8 is a block diagram showing the functional configuration of the monitoring device 700, which is an information processing system. As shown in FIG. 8, the monitoring device 700 includes a similarity calculation unit 710, a time calculation unit 720, an attribute identification unit 730, and a determination unit 740.
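A minimal sketch of the four units of the monitoring device 700 follows. The internal logic of each unit is an assumption for illustration (the patent does not specify it); only the division into units 710 to 740 and their roles come from the text.

```python
class MonitoringDevice:
    """Sketch of monitoring device 700: similarity calculation unit (710),
    time calculation unit (720), attribute identification unit (730),
    and determination unit (740)."""

    def calc_similarity(self, obs1, obs2):  # unit 710
        # Placeholder: compare appearance features (e.g. colour histograms).
        return 1.0 if obs1["feature"] == obs2["feature"] else 0.0

    def calc_time(self, obs1, obs2):  # unit 720
        # Time from frame-out on camera 1 to frame-in on camera 2.
        return obs2["frame_in"] - obs1["frame_out"]

    def identify_attribute(self, obs1):  # unit 730
        return obs1.get("attribute")

    def determine(self, obs1, obs2, expected_time=10.0, tolerance=5.0):  # unit 740
        # Widen the acceptable time range for a speed-affecting attribute
        # (the attribute name and numbers are assumed for illustration).
        if self.identify_attribute(obs1) == "large_luggage":
            tolerance *= 2
        in_window = abs(self.calc_time(obs1, obs2) - expected_time) <= tolerance
        return in_window and self.calc_similarity(obs1, obs2) >= 0.5

device = MonitoringDevice()
p1 = {"feature": "red", "frame_out": 0.0, "attribute": "large_luggage"}
p2 = {"feature": "red", "frame_in": 18.0}
# With the attribute the widened window accepts the 18 s transit;
# without it, 18 s falls outside the default window and is rejected.
```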
Implemented in this way, the monitoring device 700 according to this embodiment can suitably track a person using the images of a plurality of video cameras.
The configurations of the foregoing embodiments may be combined, or some of their components may be replaced. Moreover, the configuration of the present invention is not limited to the foregoing embodiments, and various modifications may be made without departing from the gist of the present invention.
(Appendix 1) An information processing system comprising: means for calculating the time from when a first moving body frames out of the image of a first video camera until a second moving body frames into the image of a second video camera; and determination means for determining whether the first moving body and the second moving body are the same moving body, based on the attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
(Appendix 2) The information processing system according to appendix 1, further comprising means for specifying the attribute of the first moving body.
(Appendix 3) The information processing system according to appendix 1 or appendix 2, further comprising means for calculating the similarity between the first moving body shown in the image of the first video camera and the second moving body shown in the image of the second video camera.
(Appendix 4) The information processing system according to any one of appendices 1 to 3, wherein the influence of the time on the determination result differs according to the attribute of the first moving body.
(Appendix 5) The information processing system according to appendix 4, wherein the magnitudes of the influence of the similarity and the time on the determination result differ according to the attribute of the first moving body.
(Appendix 6) The information processing system according to any one of appendices 1 to 3, wherein the determination means determines, based on the similarity, whether a second moving body whose time falls within a time width set according to the attribute of the first moving body is the same moving body.
(Appendix 7) The information processing system according to appendix 6, wherein the time width is set across the average moving time from the shooting range of the first video camera to the shooting range of the second video camera, and the time from the upper limit of the time width to the average moving time is longer than the time from the lower limit of the time width to the average moving time.
(Appendix 8) The information processing system according to any one of appendices 1 to 7, wherein the attribute has a correlation with the moving speed of the moving body.
(Appendix 9) An information processing method performed by an information processing system, comprising: a step of calculating the time from when a first moving body frames out of the image of a first video camera until a second moving body frames into the image of a second video camera; and a step of determining whether the first moving body and the second moving body are the same moving body, based on the attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
(Appendix 10) The information processing method according to appendix 9, further comprising a step of specifying the attribute of the first moving body.
(Appendix 11) The information processing method according to appendix 9 or appendix 10, further comprising a step of calculating the similarity between the first moving body shown in the image of the first video camera and the second moving body shown in the image of the second video camera.
(Appendix 12) The information processing method according to any one of appendices 9 to 11, wherein the influence of the time on the determination result differs according to the attribute of the first moving body.
(Appendix 13) The information processing method according to appendix 12, wherein the magnitudes of the influence of the similarity and the time on the determination result differ according to the attribute of the first moving body.
(Appendix 14) The information processing method according to any one of appendices 9 to 11, wherein, for a second moving body whose time falls within a time width set according to the attribute of the first moving body, whether it is the same moving body is determined based on the similarity.
(Appendix 15) The information processing method according to appendix 14, wherein the time width is set across the average moving time from the shooting range of the first video camera to the shooting range of the second video camera, and the time from the upper limit of the time width to the average moving time is longer than the time from the lower limit of the time width to the average moving time.
(Appendix 16) The information processing method according to any one of appendices 9 to 15, wherein the attribute has a correlation with the moving speed of the moving body.
(Appendix 17) A program causing a computer to execute: a process of calculating the time from when a first moving body frames out of the image of a first video camera until a second moving body frames into the image of a second video camera; and a process of determining whether the first moving body and the second moving body are the same moving body, based on the attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
(Appendix 18) The program according to appendix 17, further comprising a process of specifying the attribute of the first moving body.
(Appendix 19) The program according to appendix 17 or appendix 18, further comprising a process of calculating the similarity between the first moving body shown in the image of the first video camera and the second moving body shown in the image of the second video camera.
(Appendix 20) The program according to any one of appendices 17 to 19, wherein the influence of the time on the determination result differs according to the attribute of the first moving body.
(Appendix 21) The program according to appendix 20, wherein the magnitudes of the influence of the similarity and the time on the determination result differ according to the attribute of the first moving body.
(Appendix 22) The program according to any one of appendices 17 to 19, wherein, for a second moving body whose time falls within a time width set according to the attribute of the first moving body, whether it is the same moving body is determined based on the similarity.
(Appendix 23) The program according to appendix 22, wherein the time width is set across the average moving time from the shooting range of the first video camera to the shooting range of the second video camera, and the time from the upper limit of the time width to the average moving time is longer than the time from the lower limit of the time width to the average moving time.
(Appendix 24) The program according to any one of appendices 17 to 23, wherein the attribute has a correlation with the moving speed of the moving body.
Claims (10)
- An information processing system comprising: means for calculating the time from when a first moving body frames out of the image of a first video camera until a second moving body frames into the image of a second video camera; and determination means for determining whether the first moving body and the second moving body are the same moving body, based on the attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
- The information processing system according to claim 1, further comprising means for specifying the attribute of the first moving body.
- The information processing system according to claim 1 or claim 2, further comprising means for calculating the similarity between the first moving body shown in the image of the first video camera and the second moving body shown in the image of the second video camera.
- The information processing system according to any one of claims 1 to 3, wherein the influence of the time on the determination result differs according to the attribute of the first moving body.
- The information processing system according to claim 4, wherein the magnitudes of the influence of the similarity and the time on the determination result differ according to the attribute of the first moving body.
- The information processing system according to any one of claims 1 to 3, wherein the determination means determines, based on the similarity, whether a second moving body whose time falls within a time width set according to the attribute of the first moving body is the same moving body.
- The information processing system according to claim 6, wherein the time width is set across the average moving time from the shooting range of the first video camera to the shooting range of the second video camera, and the time from the upper limit of the time width to the average moving time is longer than the time from the lower limit of the time width to the average moving time.
- The information processing system according to any one of claims 1 to 7, wherein the attribute has a correlation with the moving speed of the moving body.
- An information processing method performed by an information processing system, comprising: a step of calculating the time from when a first moving body frames out of the image of a first video camera until a second moving body frames into the image of a second video camera; and a step of determining whether the first moving body and the second moving body are the same moving body, based on the attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
- A program causing a computer to execute: a process of calculating the time from when a first moving body frames out of the image of a first video camera until a second moving body frames into the image of a second video camera; and a process of determining whether the first moving body and the second moving body are the same moving body, based on the attribute of the first moving body, the similarity between the first moving body and the second moving body, and the time.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015500168A JP6406241B2 (ja) | 2013-02-15 | 2014-01-22 | Information processing system, information processing method, and program |
| US14/765,449 US9697420B2 (en) | 2013-02-15 | 2014-01-22 | Information processing system, information processing method, and computer-readable recording medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013-027796 | 2013-02-15 | ||
| JP2013027796 | 2013-02-15 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014125882A1 true WO2014125882A1 (ja) | 2014-08-21 |
Family
ID=51353892
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2014/051214 Ceased WO2014125882A1 (ja) | 2013-02-15 | 2014-01-22 | Information processing system, information processing method, and program |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US9697420B2 (ja) |
| JP (1) | JP6406241B2 (ja) |
| WO (1) | WO2014125882A1 (ja) |
Cited By (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016119625A (ja) * | 2014-12-22 | 2016-06-30 | セコム株式会社 | Monitoring system |
| JP2017111587A (ja) * | 2015-12-16 | 2017-06-22 | 日本電気株式会社 | Travel time storage system, travel time storage method, and travel time storage program |
| JP2018157287A (ja) * | 2017-03-16 | 2018-10-04 | 三菱電機インフォメーションネットワーク株式会社 | Video playback device, video playback method, video playback program, and video playback system |
| JP2019087841A (ja) * | 2017-11-06 | 2019-06-06 | 京セラドキュメントソリューションズ株式会社 | Monitoring system |
| US10445887B2 (en) | 2013-03-27 | 2019-10-15 | Panasonic Intellectual Property Management Co., Ltd. | Tracking processing device and tracking processing system provided with same, and tracking processing method |
| JP2020064675A (ja) * | 2015-12-16 | 2020-04-23 | 日本電気株式会社 | Travel time storage system, travel time storage method, and travel time storage program |
| JP2020095564A (ja) * | 2018-12-14 | 2020-06-18 | トヨタ自動車株式会社 | Information processing system, program, and information processing method |
| WO2020137536A1 (ja) * | 2018-12-28 | 2020-07-02 | 日本電気株式会社 | Person authentication device, control method, and program |
| JP2020107354A (ja) * | 2016-03-30 | 2020-07-09 | 日本電気株式会社 | Analysis device, analysis method, and program |
| WO2020166129A1 (ja) * | 2019-02-14 | 2020-08-20 | 株式会社日立製作所 | State recognition device |
| JP2020177678A (ja) * | 2020-07-06 | 2020-10-29 | 日本電気株式会社 | Analysis device, control method, and program |
| JP2021073618A (ja) * | 2016-03-18 | 2021-05-13 | 日本電気株式会社 | Video monitoring system, video monitoring method, and program |
| JP2021168216A (ja) * | 2020-01-09 | 2021-10-21 | 日本電気株式会社 | Information processing system, method, and program |
| JP2022030832A (ja) * | 2020-08-07 | 2022-02-18 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | Information processing device, information processing method, and program |
| US11527091B2 (en) | 2019-03-28 | 2022-12-13 | Nec Corporation | Analyzing apparatus, control method, and program |
| JP2024071823A (ja) * | 2022-11-15 | 2024-05-27 | キヤノン株式会社 | Information processing device, information processing method, and computer program |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106296724B (zh) * | 2015-05-12 | 2020-04-03 | 杭州海康威视数字技术股份有限公司 | Method, system, and processing server for determining trajectory information of a target person |
| US9984309B2 (en) * | 2015-09-30 | 2018-05-29 | International Business Machines Corporation | Classifying and grouping electronic images |
| US10397528B2 (en) | 2016-02-26 | 2019-08-27 | Amazon Technologies, Inc. | Providing status information for secondary devices with video footage from audio/video recording and communication devices |
| KR102459633B1 (ko) | 2016-02-26 | 2022-10-28 | 아마존 테크놀로지스, 인크. | Sharing of video scenes from audio/video recording and communication devices |
| US10748414B2 (en) | 2016-02-26 | 2020-08-18 | A9.Com, Inc. | Augmenting and sharing data from audio/video recording and communication devices |
| US10489453B2 (en) | 2016-02-26 | 2019-11-26 | Amazon Technologies, Inc. | Searching shared video footage from audio/video recording and communication devices |
| US9965934B2 (en) | 2016-02-26 | 2018-05-08 | Ring Inc. | Sharing video footage from audio/video recording and communication devices for parcel theft deterrence |
| US11393108B1 (en) | 2016-02-26 | 2022-07-19 | Amazon Technologies, Inc. | Neighborhood alert mode for triggering multi-device recording, multi-camera locating, and multi-camera event stitching for audio/video recording and communication devices |
| US10841542B2 (en) | 2016-02-26 | 2020-11-17 | A9.Com, Inc. | Locating a person of interest using shared video footage from audio/video recording and communication devices |
| US11138745B2 (en) | 2018-04-30 | 2021-10-05 | Uatc, Llc | Object association for autonomous vehicles |
| US10621444B1 (en) * | 2019-10-25 | 2020-04-14 | 7-Eleven, Inc. | Action detection during image tracking |
| US11714882B2 (en) * | 2020-10-09 | 2023-08-01 | Dragonfruit Ai, Inc. | Management of attributes associated with objects in video data |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006221355A (ja) * | 2005-02-09 | 2006-08-24 | Hitachi Ltd | Monitoring device and monitoring system |
| JP2008219570A (ja) * | 2007-03-06 | 2008-09-18 | Matsushita Electric Ind Co Ltd | Inter-camera connection relation information generation device |
| JP2009017416A (ja) * | 2007-07-09 | 2009-01-22 | Mitsubishi Electric Corp | Monitoring device, monitoring method, and program |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4706535B2 (ja) | 2006-03-30 | 2011-06-22 | 株式会社日立製作所 | Moving object monitoring device using multiple cameras |
| KR100904846B1 (ko) * | 2007-08-27 | 2009-06-25 | 아주대학교산학협력단 | Apparatus and method for inferring the network configuration of image acquisition devices by tracking movement |
| EP2418849B1 (en) * | 2009-04-10 | 2013-10-02 | Omron Corporation | Monitoring system, and monitoring terminal |
| JP6233624B2 (ja) * | 2013-02-13 | 2017-11-22 | 日本電気株式会社 | Information processing system, information processing method, and program |
| WO2014175356A1 (ja) * | 2013-04-26 | 2014-10-30 | 日本電気株式会社 | Information processing system, information processing method, and program |
| US20160042621A1 (en) * | 2014-06-13 | 2016-02-11 | William Daylesford Hogg | Video Motion Detection Method and Alert Management |
-
2014
- 2014-01-22 US US14/765,449 patent/US9697420B2/en active Active
- 2014-01-22 WO PCT/JP2014/051214 patent/WO2014125882A1/ja not_active Ceased
- 2014-01-22 JP JP2015500168A patent/JP6406241B2/ja active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006221355A (ja) * | 2005-02-09 | 2006-08-24 | Hitachi Ltd | Monitoring device and monitoring system |
| JP2008219570A (ja) * | 2007-03-06 | 2008-09-18 | Matsushita Electric Ind Co Ltd | Inter-camera connection relation information generation device |
| JP2009017416A (ja) * | 2007-07-09 | 2009-01-22 | Mitsubishi Electric Corp | Monitoring device, monitoring method, and program |
Cited By (31)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2529943B (en) * | 2013-03-27 | 2019-11-06 | Panasonic Ip Man Co Ltd | Tracking processing device and tracking processing system provided with same, and tracking processing method |
| US10445887B2 (en) | 2013-03-27 | 2019-10-15 | Panasonic Intellectual Property Management Co., Ltd. | Tracking processing device and tracking processing system provided with same, and tracking processing method |
| JP2016119625A (ja) * | 2014-12-22 | 2016-06-30 | セコム株式会社 | Monitoring system |
| JP2017111587A (ja) * | 2015-12-16 | 2017-06-22 | 日本電気株式会社 | Travel time storage system, travel time storage method, and travel time storage program |
| JP2020064675A (ja) * | 2015-12-16 | 2020-04-23 | 日本電気株式会社 | Travel time storage system, travel time storage method, and travel time storage program |
| US12175687B2 (en) | 2016-03-18 | 2024-12-24 | Nec Corporation | Information processing apparatus, control method, and program |
| JP7047944B2 (ja) | 2016-03-18 | 2022-04-05 | 日本電気株式会社 | Video monitoring system, video monitoring method, and program |
| US12165339B2 (en) | 2016-03-18 | 2024-12-10 | Nec Corporation | Information processing apparatus, control method, and program |
| US11823398B2 (en) | 2016-03-18 | 2023-11-21 | Nec Corporation | Information processing apparatus, control method, and program |
| JP2021073618A (ja) * | 2016-03-18 | 2021-05-13 | 日本電気株式会社 | Video monitoring system, video monitoring method, and program |
| US11361452B2 (en) | 2016-03-18 | 2022-06-14 | Nec Corporation | Information processing apparatus, control method, and program |
| JP2020107354A (ja) * | 2016-03-30 | 2020-07-09 | 日本電気株式会社 | Analysis device, analysis method, and program |
| US11176698B2 (en) | 2016-03-30 | 2021-11-16 | Nec Corporation | Analysis apparatus, analysis method, and storage medium |
| JP2018157287A (ja) * | 2017-03-16 | 2018-10-04 | 三菱電機インフォメーションネットワーク株式会社 | Video playback device, video playback method, video playback program, and video playback system |
| JP2019087841A (ja) * | 2017-11-06 | 2019-06-06 | 京セラドキュメントソリューションズ株式会社 | Monitoring system |
| JP2020095564A (ja) * | 2018-12-14 | 2020-06-18 | トヨタ自動車株式会社 | Information processing system, program, and information processing method |
| JP7151449B2 (ja) | 2018-12-14 | 2022-10-12 | トヨタ自動車株式会社 | Information processing system, program, and information processing method |
| JPWO2020137536A1 (ja) | 2018-12-28 | 2021-10-28 | 日本電気株式会社 | Person authentication device, control method, and program |
| JP7314959B2 (ja) | 2018-12-28 | 2023-07-26 | 日本電気株式会社 | Person authentication device, control method, and program |
| WO2020137536A1 (ja) * | 2018-12-28 | 2020-07-02 | 日本電気株式会社 | Person authentication device, control method, and program |
| US12020510B2 (en) | 2018-12-28 | 2024-06-25 | Nec Corporation | Person authentication apparatus, control method, and non-transitory storage medium |
| WO2020166129A1 (ja) * | 2019-02-14 | 2020-08-20 | 株式会社日立製作所 | State recognition device |
| US11527091B2 (en) | 2019-03-28 | 2022-12-13 | Nec Corporation | Analyzing apparatus, control method, and program |
| JP7218778B2 (ja) | 2020-01-09 | 2023-02-07 | 日本電気株式会社 | Information processing system, method, and program |
| JP2023052590A (ja) * | 2020-01-09 | 2023-04-11 | 日本電気株式会社 | Information processing system, method, and program |
| JP7567948B2 (ja) | 2020-01-09 | 2024-10-16 | 日本電気株式会社 | Information processing system, method, and program |
| JP2021168216A (ja) * | 2020-01-09 | 2021-10-21 | 日本電気株式会社 | Information processing system, method, and program |
| JP2020177678A (ja) * | 2020-07-06 | 2020-10-29 | 日本電気株式会社 | Analysis device, control method, and program |
| JP7479987B2 (ja) | 2020-08-07 | 2024-05-09 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | Information processing device, information processing method, and program |
| JP2022030832A (ja) * | 2020-08-07 | 2022-02-18 | エヌ・ティ・ティ・コミュニケーションズ株式会社 | Information processing device, information processing method, and program |
| JP2024071823A (ja) * | 2022-11-15 | 2024-05-27 | キヤノン株式会社 | Information processing device, information processing method, and computer program |
Also Published As
| Publication number | Publication date |
|---|---|
| US20150363638A1 (en) | 2015-12-17 |
| JPWO2014125882A1 (ja) | 2017-02-02 |
| JP6406241B2 (ja) | 2018-10-17 |
| US9697420B2 (en) | 2017-07-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6406241B2 (ja) | Information processing system, information processing method, and program | |
| JP6741130B2 (ja) | Information processing system, information processing method, and program | |
| KR102465532B1 (ko) | Object recognition method and apparatus | |
| US9729865B1 (en) | Object detection and tracking | |
| US10027883B1 (en) | Primary user selection for head tracking | |
| JP6233624B2 (ja) | Information processing system, information processing method, and program | |
| US20180260630A1 (en) | Activity recognition method and system | |
| JP7131599B2 (ja) | Information processing system, information processing method, and program | |
| KR102384299B1 (ko) | CCTV recording device with assault detection function and assault detection method based on CCTV video | |
| EP3573022A1 (en) | Method for tracking pedestrian and electronic device | |
| WO2018179202A1 (ja) | Information processing device, control method, and program | |
| WO2014061342A1 (ja) | Information processing system, information processing method, and program | |
| Phankokkruad et al. | An evaluation of technical study and performance for real-time face detection using web real-time communication | |
| Zaidi et al. | Video anomaly detection and classification for human activity recognition | |
| CN109508657B (zh) | Crowd gathering analysis method and system, computer-readable storage medium, and device | |
| KR102561308B1 (ko) | Traffic information provision method and device, and computer program stored in a medium for executing the method | |
| US20210241013A1 (en) | Spoof detection by estimating subject motion from captured image frames | |
| KR20230166840A (ko) | Method for confirming an object movement path using artificial intelligence | |
| US10740623B2 (en) | Representative image generation device and representative image generation method | |
| KR101648786B1 (ko) | Object recognition method | |
| Yoon et al. | Tracking System for mobile user Based on CCTV | |
| JP6798609B2 (ja) | Video analysis device, video analysis method, and program | |
| KR101564760B1 (ko) | Image processing apparatus and method for predicting criminal incidents | |
| CN111046788A (zh) | Method, device, and system for detecting staying persons | |
| WO2020022362A1 (ja) | 動き検出装置、特性検出装置、流体検出装置、動き検出システム、動き検出方法、プログラム、および、記録媒体 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14750987 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 14765449 Country of ref document: US |
|
| ENP | Entry into the national phase |
Ref document number: 2015500168 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 14750987 Country of ref document: EP Kind code of ref document: A1 |