
WO2012124852A1 - Stereo camera device capable of tracking the path of an object in a monitored zone, and monitoring system and method using same - Google Patents

Stereo camera device capable of tracking the path of an object in a monitored zone, and monitoring system and method using same

Info

Publication number
WO2012124852A1
WO2012124852A1 (application PCT/KR2011/002143)
Authority
WO
WIPO (PCT)
Prior art keywords
zone
monitoring
pixel
image
stereo
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2011/002143
Other languages
English (en)
Korean (ko)
Inventor
강인배
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ITXSECURITY CO Ltd
Original Assignee
ITXSECURITY CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ITXSECURITY CO Ltd filed Critical ITXSECURITY CO Ltd
Publication of WO2012124852A1 publication Critical patent/WO2012124852A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19639 Details of the system layout
    • G08B13/19652 Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608 Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Definitions

  • The present invention relates to a stereo camera apparatus that recognizes an object on the basis of 3D depth map data obtained using two cameras, and more particularly to a stereo camera apparatus that tracks the location of the recognized object.
  • Known methods of acquiring three-dimensional distance information include using a stereo camera, laser scanning, and time-of-flight (TOF) measurement.
  • Stereo matching with a stereo camera is a hardware implementation of the process by which two eyes perceive a stereoscopic object: a pair of images of the same subject is captured with two cameras, and information about depth (or distance) in the space is extracted by analyzing that pair.
  • The binocular disparity along the same epipolar line of the images obtained from the two cameras is calculated.
  • The binocular disparity carries distance information, and the geometric quantity calculated from it is the depth.
  • If the disparity value is calculated in real time from the input images, three-dimensional distance information of the observed space can be measured.
  • Examples of stereo matching algorithms include the "image matching method using a plurality of image lines" of Korean Patent No. 0517876 and the "binocular disparity estimation method for three-dimensional object recognition" of Korean Patent No. 0601958.
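The cited algorithms produce a disparity map; the step from disparity to distance is not spelled out in the text, but under the usual pinhole-stereo model it is Z = f * B / d. A minimal sketch under that assumption (the focal length and baseline values below are illustrative, not from the patent):

```python
# Converting a stereo disparity to a depth (distance) with the standard
# pinhole-stereo relation Z = f * B / d. Parameter values are assumptions.

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth in meters for one pixel; None when disparity is zero (no match)."""
    if disparity_px <= 0:
        return None                      # no correspondence found
    return focal_px * baseline_m / disparity_px

# f = 800 px, baseline = 0.12 m: a disparity of 8 px puts the point at 12 m
d_far = disparity_to_depth(8.0, 800.0, 0.12)     # 12 m
d_near = disparity_to_depth(32.0, 800.0, 0.12)   # 3 m
```

Larger disparities map to nearer points, which is why the per-pixel disparity map doubles as the 3D depth map used throughout the description.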
  • Using such a stereo matching algorithm, the Applicant has already invented an image recognition apparatus capable of distinguishing and recognizing objects in a space, in particular objects to be monitored by a manager, for which Korean Patent Application Nos. 10-2010-0039302 and 10-2010-0039366 are pending.
  • Movement information in the monitor's surveillance zone can be divided into sensing information about points and lines, acquired through detectors, and sensing information about a space.
  • Detection of points or lines means, for example, detecting the opening of a door or window, or detecting an object crossing an infrared line with an infrared sensor.
  • Detection of a space means detecting specific changes in the space, for example by sensing temperature changes through infrared; such methods merely detect whether an object moves in the space and cannot monitor its detailed movement within the space.
  • An object of the present invention is to provide a stereo camera device that recognizes an object on the basis of 3D depth map data acquired using two cameras while simultaneously monitoring and tracking the position of the recognized object in space, and a monitoring system and method using the device.
  • To achieve the above object, the stereo camera device of the present invention comprises: a stereo camera having a first camera and a second camera that photograph the same surveillance zone to generate a pair of stereo digital images; and an image processor that extracts an object moving in the surveillance zone while calculating distance information for each pixel through image processing of the stereo digital images output from the stereo camera.
  • The image processor sets at least one monitoring zone defined by a pixel area in the image and a preset distance range for each pixel of that area, and includes a monitoring zone-alarm unit that, when the extracted object is determined to be located in a monitoring zone on the basis of the calculated per-pixel distance information, generates a monitoring zone entry alarm and outputs it to an external monitoring device.
  • The monitoring zone entry alarm preferably includes an identifier of the monitoring zone in which the extracted object is located.
  • The image processor may include: a distance information calculator that calculates 3D depth map data using the stereo digital images; an object extractor that extracts the area of a moving object by comparing one of the stereo digital images with a reference background image; and an object recognizer that calculates the area or representative length of the extracted object and recognizes the object as a monitoring target when the calculated area or representative length falls within a preset range.
  • The monitoring system of the present invention comprises the stereo camera device and a monitoring device that receives alarms from the stereo camera device and displays them to the administrator in the form of a digital map.
  • The monitoring method of the stereo camera device includes: generating a pair of stereo digital images using two cameras photographing the same surveillance zone; extracting an object moving in the surveillance zone while calculating distance information for each pixel through image processing of the stereo digital images output from the stereo camera; setting at least one monitoring zone defined by a pixel area in the image and a preset distance range for each pixel of that area; and, when the extracted object is determined to be located in a monitoring zone on the basis of the calculated per-pixel distance information, generating a monitoring zone entry alarm and outputting it to an external monitoring apparatus.
  • In addition to a simple alarm on recognition of an object, the stereo camera device can provide location information of the object in the space.
  • An alarm can be generated by detecting an object moving through a specific monitoring zone, and the object's location and movement path across the monitoring zones can be tracked.
  • Depending on the device coupled to the stereo camera device of the present invention, this feature enables a stereo camera to provide intelligent surveillance beyond the function of a surveillance sensor, and to give the manager stereoscopic information beyond a simple alarm.
  • Because the position information on the surveillance space generated by the present invention is provided in units of detailed monitoring zones, it can be displayed on a digital drawing even by a device with light system resources, such as a portable monitoring apparatus.
  • FIG. 1 is a block diagram of a surveillance system including a stereo camera device according to an embodiment of the present invention
  • FIG. 3 is a view showing the surveillance zone S and the monitoring zones L1, L2, and L3 according to an embodiment of the present invention
  • FIG. 4 is an example of an image photographing the surveillance zone S of FIG. 3;
  • FIG. 6 is a view provided to explain a method of extracting a central axis of an object.
  • The surveillance system 100 of the present invention includes a stereo camera device 130 and a monitoring device 150 connected through a predetermined network 110; it monitors a moving object in a three-dimensional space, can output an alarm, and can display the object's location on a digital drawing.
  • This monitoring system 100 can be used not only for security purposes but also for any application that needs to track and use the location of a specific object entering a specific space.
  • the stereo camera device 130 is installed in a specific surveillance zone for crime prevention and other purposes, and the monitoring device 150 is preferably located in a manager area away from the surveillance zone.
  • The network 110 may be an internal dedicated communication network, or a commercial public network such as the Internet, a mobile communication network, or a public switched telephone network (PSTN), and may be wired as well as wireless.
  • The stereo camera device 130 and the monitoring device 150 should therefore include interface means for connecting to the network 110.
  • the stereo camera device 130 includes a stereo camera 131 and an image processor 140, and may recognize an object moving on the surveillance area to determine whether the object is a specific monitoring target.
  • the stereo camera 131 includes a first camera 133, a second camera 135, and an image receiver 137.
  • the first camera 133 and the second camera 135 are a pair of cameras spaced apart from each other to photograph the same surveillance zone, and are called a stereo camera.
  • the first camera 133 and the second camera 135 output an analog (or digital) stereo video signal of the surveillance zone to the image receiver 137.
  • The image receiver 137 converts the video signals (images) of successive frames input from the first camera 133 and the second camera 135 into digital images, and provides them to the image processor 140 in frame synchronization.
  • The image processor 140 extracts the area of an object moving in the photographed area (surveillance zone) from the pair of digital image frames output from the image receiver 137 and determines whether the object is a monitoring target.
  • This determination may be performed in real time on every frame of the video continuously input from the stereo camera 131.
  • the image processor 140 includes a distance information calculator 141, an object extractor 143, an object recognizer 145, an object tracker 147, and a monitoring zone-alarm 149.
  • Operations of the distance information calculator 141, the object extractor 143, the object recognizer 145, the object tracker 147, and the monitoring zone-alarm unit 149 will be described with reference to FIGS. 2 to 4.
  • the first camera 133 and the second camera 135 are arranged to photograph a specific surveillance zone.
  • The image receiver 137 converts the analog image signals into digital image signals and then provides them to the image processor 140 in frame synchronization (step S201).
  • the distance information calculator 141 calculates 3D depth map data including distance information of each pixel from a pair of digital images received in real time from the image receiver 137.
  • the object extractor 143 and the object recognizer 145 extract a region of a moving object from at least one image of a pair of digital images input through the image receiver 137.
  • The object extractor 143 first extracts the moving object using a conventionally known image processing technique: a difference image is obtained by subtracting the reference background image from a newly input image.
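The difference-image step can be sketched as follows; here the images are plain nested lists of grayscale values, and the threshold of 25 is an assumed parameter, not a value from the patent:

```python
# Background subtraction: mark pixels whose absolute difference from the
# reference background exceeds a threshold as "moving". Pure-Python sketch.

def extract_moving_mask(frame, background, threshold=25):
    """Boolean mask: True where |frame - background| exceeds the threshold."""
    return [[abs(f - b) > threshold for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

background = [[10] * 4 for _ in range(4)]       # static reference scene
frame = [row[:] for row in background]
frame[1][1] = frame[1][2] = 200                 # a bright object appears
mask = extract_moving_mask(frame, background)
moving_pixels = sum(sum(row) for row in mask)   # 2 changed pixels
```

The connected True region of the mask is what the object extractor then treats as the candidate object whose outline is traced.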
  • The object extractor 143 and the object recognizer 145 determine whether the extracted object is of a type monitored by the manager. For example, it is determined whether the object is a person, a car, or an animal, or, if it is a person, whether the person is taller than a certain height. If the monitoring target is a person, the object extractor 143 and the object recognizer 145 specifically extract and recognize an object determined to be a person.
  • the object extraction unit 143 detects the outline of the object from the difference image.
  • The object recognition unit 145 calculates the area or the representative length of the object by using the depth map data calculated by the distance information calculator 141 and the outline of the object extracted by the object extractor 143.
  • the object recognition unit 145 may determine whether the extracted object is an object of interest by determining whether the calculated area or representative length of the object falls within a preset area or length range of the object of interest.
  • The object tracking unit 147 tracks the movement of the recognized object of interest and provides its location information to the monitoring zone-alarm unit 149.
  • Image processing techniques for such motion tracking are already known in the art, and a suitable one can be used.
  • While the monitoring zone-alarm unit 149 performs the operations described below, the distance information calculator 141, the object extractor 143, and the object recognizer 145 may repeat the operations described above on the stereo images input in real time from the image receiver 137.
  • The monitoring zone-alarm unit 149 generates an alarm and outputs it to the monitoring device 150 once the object of interest appears in the image.
  • This alarm may be an alarm in the plain sense, including information on the surveillance zone in which the stereo camera device 130 is installed and information indicating that a monitored object has appeared in that zone.
  • FIG. 3 shows a surveillance zone S in which the stereo camera device 130 of the present invention is installed, and a plurality of surveillance zones L1, L2, and L3 existing in the surveillance zone S.
  • FIG. 4 is an example of an image P photographing the surveillance zone S of FIG. 3.
  • The monitoring zone-alarm unit 149 generates an alarm when the target object appears at m1 in the image of FIG. 4.
  • When the monitoring zone-alarm unit 149 determines, based on the information provided by the object tracker 147, that the tracked object has entered a specific monitoring zone, it generates a monitoring zone entry alarm and outputs it, together with the location information, to the monitoring device 150.
  • The surveillance zone S is determined by the installation position and the angle of view of the stereo camera device 130, whereas the monitoring zones L1, L2, and L3 are set by the administrator within the surveillance zone S.
  • The monitoring zones L1, L2, and L3 are each specified by a pixel range (L1-1, L2-2, L3-3) indicating the corresponding zone in the image and by a distance range from the stereo camera device 130 to the zone.
  • For example, the monitoring zone L1 is specified by the pixel range L1-1 shown in FIG. 4 and the distance range d1 to d2.
  • The pixel ranges L2-1 and L2-2 of the second monitoring zone overlap somewhat, but their distance ranges differ, so the zones can still be distinguished from each other.
  • The monitoring zone-alarm unit 149 can obtain the distance to the object by reading the per-pixel distance information from the depth map data provided by the distance information calculator 141.
  • The monitoring zone-alarm unit 149 determines that the object is located at m1 when the object tracked by the object tracker 147 is within the pixel range L1-1 and the distance to the object is within the range d1 to d2.
  • In that case, a monitoring zone entry alarm is generated and output to the monitoring device 150.
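The zone test just described, a pixel range combined with a distance range, can be sketched as follows; the zone names, box coordinates, and distance values are illustrative assumptions:

```python
# A monitoring zone is a pixel rectangle in the image plus a depth interval.
# Two zones may share pixels yet be told apart by their depth intervals.
from dataclasses import dataclass

@dataclass
class MonitoringZone:
    zone_id: str
    pixel_box: tuple      # (x0, y0, x1, y1), inclusive pixel range in the image
    dist_range: tuple     # (d1, d2), meters from the camera

def zone_of(x, y, depth, zones):
    """Return the id of the first zone containing pixel (x, y) at that depth."""
    for z in zones:
        x0, y0, x1, y1 = z.pixel_box
        d1, d2 = z.dist_range
        if x0 <= x <= x1 and y0 <= y <= y1 and d1 <= depth <= d2:
            return z.zone_id
    return None

zones = [MonitoringZone("L1", (10, 10, 50, 50), (2.0, 4.0)),
         MonitoringZone("L2", (30, 10, 80, 50), (5.0, 7.0))]  # pixel overlap, disjoint depths
near_zone = zone_of(40, 20, 3.0, zones)   # "L1": depth selects the nearer zone
far_zone = zone_of(40, 20, 6.0, zones)    # "L2": same pixel, greater depth
```

This mirrors how overlapping pixel ranges are disambiguated purely by the per-pixel distance read from the depth map.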
  • The monitoring zone entry alarm may basically include information identifying which of the plurality of monitoring zones the object entered (e.g., a monitoring zone identifier) and detection time information.
  • The monitoring zone entry information may also include at least one image frame capturing the moment of detection.
  • The monitoring zone-alarm unit 149 also outputs the monitoring zone entry information to the monitoring device 150.
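As an illustration of the alert payload just described (the field names below are assumptions for illustration, not the patent's):

```python
# A minimal alert record carrying the zone identifier, the detection time,
# and optionally the captured frame, as the description lists.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ZoneEntryAlert:
    zone_id: str                        # identifier of the entered monitoring zone
    detected_at: str                    # detection time, e.g. ISO-8601 text
    frame_jpeg: Optional[bytes] = None  # optional frame capturing the moment

alert = ZoneEntryAlert(zone_id="L1", detected_at="2011-03-29T10:15:00")
```

A payload of this size is what makes transmission to a lightweight portable monitoring device practical.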
  • Object tracking and monitoring by the stereo camera device 130 of the present invention are performed in the above manner.
  • the monitoring device 150 of the present invention may be connected to the stereo camera device 130 through the network 110, and may receive various alarms from the stereo camera device 130.
  • The monitoring device 150 may correspond not only to a general computer but also to a mobile phone, a PDA, a smart phone, or another dedicated terminal carried by an individual.
  • The monitoring device 150 may include a display unit so that, on receiving a monitoring zone entry alarm, the manager can visually check whether the object has entered a specific monitoring zone within the surveillance zone.
  • The monitoring device 150 may visually display the movement of the object in the surveillance area to the manager using the alarms provided by the stereo camera device 130.
  • For example, the monitoring device 150 may store a digital drawing, like FIG. 3, on which the surveillance zone S and the monitoring zones L1, L2, and L3 are displayed, and may use the alarms provided by the monitoring zone-alarm unit 149 to display the object's location on that drawing to the user.
  • Information at the level of detail of such a drawing is useful because it is small enough to be transmitted to the portable monitoring device 150 by wire or wirelessly, and to be processed and displayed visually by the portable monitoring device 150.
  • The monitoring apparatus 150 may further include a voice guidance processing unit (not shown); when the object approaches a specific monitoring zone, a predetermined guidance message (e.g., "step back from ooo") can be output through a speaker (not shown) installed in the surveillance zone S.
  • the monitoring device 150 may retain the function of the monitoring zone-alarm unit 149 of the stereo camera device 130.
  • the monitoring apparatus 150 may further include a monitoring zone unit (not shown) that determines whether the object enters a specific monitoring zone by using information calculated and provided by the object tracking unit 147.
  • The area of the object is calculated by obtaining the actual area per pixel (hereinafter, the 'unit area' of a pixel) at the distance do at which the object was extracted in operation S203, and then multiplying it by the number of pixels included in the outline of the object.
  • FIG. 5 shows, based on the existing background image, the actual area M corresponding to the entire frame at the maximum depth D and the actual area m(do) corresponding to the entire frame at the position do of the extracted object.
  • The actual area m(do) corresponding to the entire frame at the distance do where the object is located can be obtained as in Equation 1 below.
  • Here, M is the actual area corresponding to the entire frame (e.g., 720 × 640 pixels) at the maximum distance D, based on the existing background image.
  • In Equation 2, the unit area m_p(do) of a pixel is obtained by dividing m(do) by the total number of pixels. According to Equation 2, m_p(do) depends on the distance do to the object, which is confirmed from the distance information of the 3D depth map data.
  • The area of the object can then be obtained, as in Equation 3, by multiplying the unit area m_p(do) of a pixel by the number qc of pixels included in the outline.
  • Here, qc is the number of pixels included in the object.
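Equations 1 to 3 themselves are not reproduced in this text, but the surrounding definitions imply that the frame's real-world area scales with the square of the distance under pinhole geometry. A sketch under that assumption (the numeric values are illustrative):

```python
# Object area from per-pixel "unit area": frame area scales quadratically
# with depth (assumed from pinhole geometry; Eqs. 1-3 are not in the text).

def object_area(M, D, do, total_pixels, qc):
    """Real-world area of an object at depth do.

    M: area covered by the full frame at maximum depth D,
    total_pixels: e.g. 720 * 640, qc: pixels inside the object outline.
    """
    m_do = M * (do / D) ** 2      # frame area at the object's depth (Eq. 1)
    m_p = m_do / total_pixels     # unit area of one pixel (Eq. 2)
    return m_p * qc               # object area (Eq. 3)

# Frame covers 80 m^2 at D = 10 m; object at 5 m occupying 3000 outline pixels
area = object_area(80.0, 10.0, 5.0, 720 * 640, 3000)   # about 0.13 m^2
```

The computed area is then compared against the preset area range of the object of interest to decide whether the object is a monitoring target.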
  • The object recognizer 145 extracts the medial axis of the object, one pixel wide, by applying a skeletonization or thinning algorithm to the object extracted by the object extractor 143.
  • For example, a Medial Axis Transform (MAT) algorithm or the Zhang-Suen algorithm can be applied.
  • As shown in FIG. 6, the central axis (a) of the object is the set of points having a plurality of boundary points among the points (or pixels) in the object R.
  • A boundary point of a point in the object is the point on the outline B closest to it; for example, the points b1 and b2 on the outline are the boundary points of the point P1 in the object R. The central-axis algorithm is therefore a process of extracting the points having a plurality of boundary points, and can be expressed as in Equation 4 below.
  • Equation 4: P_ma = {x ∈ R | b_min(x) > 1}, where P_ma is the central axis represented as a set of points x, x is a point in the object R, and b_min(x) is the number of boundary points of the point x.
  • That is, the central axis is the set of points x whose number of boundary points is greater than one.
  • The structure of the skeleton may vary somewhat depending on how the distance from an internal point x to a pixel on the outline is measured (for example, 4-distance, 8-distance, or Euclidean distance).
  • Alternatively, the center line may be extracted by taking the peaks of a Gaussian value computed over the object.
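The boundary-point definition of Equation 4 can be sketched directly on a pixel set; the 4-neighborhood outline test and Euclidean distance below are assumptions (the text itself notes that the skeleton varies with the distance metric chosen):

```python
# Medial axis as the set of object points with more than one nearest outline
# pixel (b_min(x) > 1). Object pixels are (row, col) tuples in a set.
from math import hypot

def medial_axis(obj):
    def is_outline(p):
        r, c = p
        return any(q not in obj for q in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)))
    outline = {p for p in obj if is_outline(p)}
    axis = set()
    for p in obj - outline:
        dists = sorted(hypot(p[0] - q[0], p[1] - q[1]) for q in outline)
        # count boundary points tied at the minimum distance
        nearest = sum(1 for d in dists if abs(d - dists[0]) < 1e-9)
        if nearest > 1:
            axis.add(p)
    return axis

bar = {(r, c) for r in range(3) for c in range(7)}   # a 3 x 7 rectangle
axis = medial_axis(bar)                              # the center row survives
```

For the 3-pixel-wide bar, every interior pixel of the middle row is equidistant from the top and bottom outlines, so the extracted axis is exactly that one-pixel-wide center line.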
  • The representative length of the object is also obtained using the depth map data.
  • The representative length of the object is the actual length of a feature chosen to represent the object, calculated from the image; it may correspond to the actual length of the central axis, the actual width of the object, or the actual height of the object. The representative length is, however, affected by the camera position, the shooting angle, and the characteristics of the photographed area.
  • The actual length of an object is calculated by obtaining the actual length per pixel (hereinafter, the 'unit length' of a pixel) at the distance do where the object is located, and then multiplying it by the number of pixels representing the object.
  • The number of pixels representing the object may be the number of pixels forming the central axis, or the number of pixels spanning the width or height of the object.
  • The width or height of the object, in pixels, can be obtained from the range of x-axis or y-axis coordinates of the object area, and the length of the central axis can be obtained, for example, by counting the pixels included in the central axis.
  • The unit length of a particular pixel varies from pixel to pixel (precisely, with the depth of the pixel) and can be obtained as follows, with reference to FIG. 5.
  • Assume the size of the image frame is 720 × 640 pixels.
  • In FIG. 5, the actual length L(do) corresponding to the frame at the depth do is indicated.
  • The actual length L(do) corresponding to the vertical axis (or horizontal axis) of the entire frame at the depth do where the object is located can be obtained as in Equation 5 below.
  • Here, L(do) is the actual length corresponding to the vertical axis (or horizontal axis) of the entire frame at the depth do, and Lmax is the actual length corresponding to the vertical axis (or horizontal axis) of the entire frame at the maximum depth D, based on the existing background image.
  • L_p(do) is the unit length of a pixel included in the object region located at the depth do, and Qy is the number of pixels along the vertical axis of the entire frame.
  • Finally, the object recognizer 145 obtains the representative length of the object.
  • The representative length of the object can be calculated, as in Equation 7, by multiplying the unit length L_p(do) of a pixel by the number qo of pixels representing the object.
  • Here, qo is the number of pixels representing the object.
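Equations 5 to 7 are likewise not reproduced in this text, but the definitions imply a linear scaling of the frame's real-world span with depth under pinhole geometry. A sketch under that assumption (numeric values are illustrative):

```python
# Representative length from per-pixel "unit length": frame span scales
# linearly with depth (assumed from pinhole geometry; Eqs. 5-7 not in text).

def representative_length(L_max, D, do, Qy, qo):
    """Real-world length of the feature representing an object at depth do.

    L_max: length spanned by the frame's vertical (or horizontal) axis at
    maximum depth D; Qy: pixels along that axis (e.g. 640); qo: pixels
    representing the object (central-axis length, width, or height).
    """
    L_do = L_max * do / D    # frame span at the object's depth (Eq. 5)
    L_p = L_do / Qy          # unit length of one pixel (Eq. 6)
    return L_p * qo          # representative length (Eq. 7)

# Frame spans 8 m vertically at D = 10 m; object at 5 m, 280 pixels tall
height_m = representative_length(8.0, 10.0, 5.0, 640, 280)   # 1.75 m
```

A computed height of 1.75 m falling inside the preset "person" length range is what lets the object recognizer classify the object as a person to be monitored.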

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a stereo camera device capable of tracking the path of an object in a monitored zone, and to an associated method. The stereo camera device of the present invention extracts a moving object to be monitored from a stereo image captured within a specific monitored zone, then generates and provides an alarm to a monitoring device when the extracted object enters a specifically monitored zone, while tracking the movement of the corresponding object.
PCT/KR2011/002143 2011-03-14 2011-03-29 Stereo camera device capable of tracking the path of an object in a monitored zone, and monitoring system and method using same Ceased WO2012124852A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020110022280A KR20120104711A (ko) 2011-03-14 2011-03-14 Stereo camera device capable of tracking the path of an object in a surveillance zone, and surveillance system and method using the same
KR10-2011-0022280 2011-03-14

Publications (1)

Publication Number Publication Date
WO2012124852A1 true WO2012124852A1 (fr) 2012-09-20

Family

ID=46830903

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/002143 Ceased WO2012124852A1 (fr) 2011-03-14 2011-03-29 Stereo camera device capable of tracking the path of an object in a monitored zone, and monitoring system and method using same

Country Status (2)

Country Link
KR (1) KR20120104711A (fr)
WO (1) WO2012124852A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101640527B1 (ko) 2012-10-09 2016-07-18 에스케이 텔레콤주식회사 Video surveillance apparatus and method for estimating the size of a single object
KR101519261B1 (ko) 2013-12-17 2015-05-11 현대자동차주식회사 Vehicle monitoring method and automatic braking apparatus
KR101400169B1 (ko) * 2014-02-06 2014-05-28 (주)라이드소프트 Visual patrol system and method using virtual-reality techniques for security control
KR101593187B1 (ko) * 2014-07-22 2016-02-11 주식회사 에스원 Apparatus and method for monitoring abnormal behavior using three-dimensional image information
EP3026653A1 (fr) * 2014-11-27 2016-06-01 Kapsch TrafficCom AB Method for controlling a traffic surveillance system
KR101645451B1 (ko) * 2015-04-14 2016-08-12 공간정보기술 주식회사 System for detecting moving objects in a detection area using a stereo camera
KR102076531B1 (ko) 2015-10-27 2020-02-12 한국전자통신연구원 Multi-sensor-based position tracking system and method
KR101748780B1 (ko) * 2016-12-02 2017-06-19 (주) 비전에스티 Method and apparatus for recognizing road objects using a stereo camera

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003052034A (ja) * 2001-08-06 2003-02-21 Sumitomo Osaka Cement Co Ltd Monitoring system using stereo images
JP2003246268A (ja) * 2002-02-22 2003-09-02 East Japan Railway Co Method and apparatus for detecting a person fallen from a platform
KR20090027410A (ko) * 2007-09-12 2009-03-17 Korea Railroad Research Institute Stereo-image-based platform monitoring system and method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2851880A1 (fr) * 2013-09-19 2015-03-25 Canon Kabushiki Kaisha Control method in an image capture system, control apparatus, and storage medium, in particular for an entrance gate
US10218899B2 (en) 2013-09-19 2019-02-26 Canon Kabushiki Kaisha Control method in image capture system, control apparatus and a non-transitory computer-readable storage medium
WO2015083875A1 (fr) * 2013-12-03 2015-06-11 Korea Electronics Technology Institute Method and mobile system for estimating camera location through particle generation and selection
WO2017071084A1 (fr) * 2015-10-28 2017-05-04 Xiaomi Inc. Alarm method and device
US10147288B2 (en) 2015-10-28 2018-12-04 Xiaomi Inc. Alarm method and device
CN108898617A (zh) * 2018-05-24 2018-11-27 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Target object tracking method and device
CN110942578A (zh) * 2019-11-29 2020-03-31 Weida Information Technology (Shenzhen) Co., Ltd. Intelligent analysis anti-theft alarm system

Also Published As

Publication number Publication date
KR20120104711A (ko) 2012-09-24

Similar Documents

Publication Publication Date Title
WO2012124852A1 (fr) Stereo camera device capable of tracking the path of an object in a monitored zone, and monitoring system and method using same
CN103168467B (zh) Security camera tracking and monitoring system and method using thermal image coordinates
CN105915846B (zh) Intruder monitoring method and system multiplexing monocular and binocular vision
WO2014073841A1 (fr) Image-based indoor location detection method and mobile terminal using the method
WO2016072625A1 (fr) Vehicle location monitoring system for a parking lot using an imaging technique, and control method therefor
WO2012005387A1 (fr) Method and system for tracking a moving object over a wide area using multiple cameras and an object-tracking algorithm
WO2011136407A1 (fr) Apparatus and method for image recognition using a stereoscopic camera
WO2016099084A1 (fr) Security service providing system and method using a beacon signal
WO2016107230A1 (fr) System and method for reproducing objects in a three-dimensional (3D) scene
JP2020182146A (ja) Monitoring apparatus and monitoring method
WO2014051262A1 (fr) Method for establishing event rules and event monitoring apparatus using same
WO2018135906A1 (fr) Camera and image processing method of a camera
CN111601011A (zh) Automatic alarm method and system based on video stream images
WO2012091326A2 (fr) Three-dimensional real-time street view system using distinct identification information
WO2023074995A1 (fr) System for detecting and indicating abnormal temperature at an industrial site using a thermal imaging camera and generating an alarm to report it, and operating method thereof
WO2017142311A1 (fr) Multi-object tracking system and multi-object tracking method using same
KR101446422B1 (ko) Video surveillance system and method
WO2021020866A1 (fr) Image analysis system and method for remote monitoring
WO2018139847A1 (fr) Personal identification method using facial comparison
WO2021025242A1 (fr) Electronic device and method thereof for identifying a virtual image of an object reflected in an indoor environment
WO2012137994A1 (fr) Image recognition device and image monitoring method thereof
WO2020111353A1 (fr) Method and apparatus for detecting privacy-invading equipment, and associated system
WO2018097384A1 (fr) Attendance notification apparatus and method
KR102374357B1 (ko) Video surveillance apparatus for crowd control
WO2015026002A1 (fr) Image matching apparatus and image matching method using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11860792

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11860792

Country of ref document: EP

Kind code of ref document: A1