
WO2011025085A1 - Method and system for combined audio-visual surveillance cross-reference to related applications - Google Patents


Info

Publication number
WO2011025085A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
video
input
triggering
location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2009/005366
Other languages
French (fr)
Inventor
Jung-Jae Park
Yong-Soo Jang
Hyoung-Jin Park
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Axium Technologies Inc
Original Assignee
Axium Technologies Inc
Application filed by Axium Technologies Inc
Publication of WO2011025085A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00: Burglar, theft or intruder alarms
    • G08B 13/18: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196: Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02: Alarms for ensuring the safety of persons
    • G08B 21/0202: Child monitoring systems using a transmitter-receiver system carried by the parent and the child
    • G08B 21/0269: System arrangements wherein the object is to detect the exact location of child or item using a navigation satellite system, e.g. GPS
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/01: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium
    • G08B 25/08: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium using communication transmission lines
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast


Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Emergency Management (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

A surveillance system and a method for monitoring a location are disclosed. An audio module connected to an acoustic transducer receives an input audio signal, and detects triggering events based upon matching the same to a signature. Additionally, a video module connected to a video camera receives an input video stream, and detects triggering events with video analytics. An alarm is generated when both the audio module and the video module detect the triggering event.

Description

METHOD AND SYSTEM FOR COMBINED AUDIO-VISUAL SURVEILLANCE CROSS-REFERENCE TO RELATED APPLICATIONS
The present invention relates generally to security devices, and more particularly to networked surveillance camera systems. The present invention also relates to alarm and camera systems triggered with a combination of video and audio analysis.
As an integral part of a security plan, surveillance systems are utilized for crime prevention, detection, and prosecution purposes. Common locations where surveillance systems are utilized include banks, airports, casinos, parking lots and garages, parks, corporate facilities and the like, though increased availability has brought about their deployment in more private settings such as personal residences. Surveillance systems typically employ closed circuit television (CCTV) cameras that are connected to a central observation station with one or more video monitors. Thus, multiple locations may be monitored simultaneously from a single location by a minimal number of operators. By design, CCTV systems are intended for security personnel to constantly observe the monitors and respond to any incidents in real-time. CCTV cameras may also be connected to video recording devices that archive footage for subsequent viewing, analysis, and other related uses.
Earlier analog CCTV systems were deficient in a number of different respects. The distance between each individual camera and the central monitoring station was limited because of transmission distance restrictions associated with coaxial cables. These coaxial cables were bulky and therefore challenging to route, and were also costly to acquire and maintain.
In order for video footage to be archived, it was necessary to record to magnetic storage devices such as Video Home System (VHS) tapes with Videocassette Recorders (VCRs). Because videotape is a sequential access device, random access to relevant footage is challenging at best. Furthermore, archiving footage for potential future needs was problematic. In Standard Play (SP) mode, depending on the length of tape in the cassette, anywhere from 2 to 3 hours of footage could be recorded. In other play modes such as Extended Play (EP) and Long Play (LP), up to 8 hours of footage or 12 hours of footage, respectively, could be recorded. As such, numerous videocassette tapes were necessary along with an appropriate system to manage the rotation of tapes. The number of videotapes increased exponentially where many video cameras were deployed, as each camera typically required its own VCR and tape. Although multiple cameras could be aggregated into a single monitor and a single VCR/tape, large CCTV installations were nevertheless deficient.
With developments in and widespread deployment of networking technology, CCTV surveillance systems are increasingly being replaced with Internet Protocol (IP) network camera systems. Like analog CCTV systems, cameras are installed in multiple locations, with the footage being viewable from the central monitoring station. The cameras have digital sensors, however, in which photons of light from each image or frame of video are converted to data representative of the same. This data is transmitted over a conventional data transfer link such as Ethernet. A minimalist video server may be incorporated into each of the cameras, and a remote client software application may communicate with each of the video servers to request video data for display. Because the networking protocols for IP network camera systems are the same as those utilized in standard computer networks, surveillance systems are able to use existing network infrastructure. Unlike analog CCTV systems, IP network camera systems do not have distance limitations, and because a large volume of data can be stored with relative ease on hard disk drives, optical discs, and other such media, the burdens previously associated with access and management of surveillance footage are greatly reduced.
With ever-growing data storage capacities, various applications and needs that exploit those improvements continually arise. For example, owners of department stores and malls generally have a duty of care to customers to eliminate any potential hazards on the premises, and to aid anyone that may have become injured. Due to the prevailing litigious atmosphere, it is likely that the owner of the premises may be sued weeks, or even months after the injury.
Without conclusive video footage of accidents and the responses thereto, a victim-plaintiff prepared with convincing medical reports and professional expert witnesses may quickly gain the upper hand in litigation and settlement negotiations. Accordingly, it is common for surveillance footage in such commercial/retail environments to be retained for as long as two months, and even longer.
Technical Problem
Regardless of the scope of deployment, surveillance cameras largely remain a tool for after-the-fact reconstruction of an incident for investigation and prosecution purposes rather than a proactive prevention of the same. While the visibility of cameras and the widespread knowledge of the existence of ongoing surveillance may have some deterrent effect against criminal or otherwise legally actionable activity, early detection and response thereto may not be possible. As another example, despite the installation of 4.2 million cameras in the United Kingdom, the 2005 London subway bombings could not be prevented, resulting in 56 deaths and over 700 injuries.
The primary limitation in this regard is the inability of human beings to sustain the necessary concentration levels for extended periods of time. Studies have shown that after twenty minutes of watching live surveillance footage, up to 90% of the information being shown will be missed. Monitoring abilities further deteriorate as the number of cameras is increased.
A potential solution is the use of video analytics to perform the tedious task of monitoring surveillance footage, and only alert a human being when certain conditions are detected. Although refinements to the rules-based analytical algorithms aim to screen out unwarranted alarms, false triggering nevertheless remains problematic with conventional video analytics. It is acknowledged that the widespread deployment of immature video analytic technology has reduced confidence in its capabilities, thereby reducing its effectiveness.
Accordingly, there is a need in the art for a method and system for combined audio-visual surveillance that reduces false alarms and increases detection of incidents.
Technical Solution
In accordance with one embodiment of the present invention, a surveillance system for monitoring a location is contemplated. The system may include an audio module having an audio input connected to an acoustic transducer. The acoustic transducer may be receptive to environmental sound signals from the monitored location. Additionally, the surveillance system may include a video module with a video input connected to an imaging sensor. The imaging sensor may be receptive to visual events from the monitored location. There may also be a central processor that is connected to the audio module and the video module. The central processor may be programmable to generate an alert signal based upon a concurrent detection of an environmental sound signal and a visual event. This environmental sound signal and visual event may be associated with a triggering incident.
According to another embodiment of the present invention, a method of surveillance of a location for triggering events is contemplated. The method may include receiving an input audio signal of the location, and receiving an input video stream of the location. The method may also include detecting a specific one of the triggering events based upon the input audio signal matching a predefined sonic signature. Additionally, the detecting of the specific one of the triggering events may be based upon the input video stream matching a predefined image sequence. The predefined sonic signature and the predefined image sequence may be associated with the specific one of the triggering events. The method may further include generating an alarm in response to the detection of the specific one of the triggering events. Furthermore, the method may include transmitting the input audio signal, the input video stream, and the alarm to a remote monitor.
The present invention will be best understood by reference to the following detailed description when read in conjunction with the accompanying drawings.
Advantageous Effects
Accordingly, the surveillance system and the method for monitoring a location have the advantage of generating an alarm only when both the audio module and the video module detect the triggering event.
Description of Drawings
These and other features and advantages of the various embodiments disclosed herein will be better understood with respect to the following description and drawings, in which:
FIG. 1 is an illustration of an exemplary environment in which a surveillance system of the present invention is deployed;
FIG. 2 is a block diagram of the various components of one embodiment of the surveillance system, including multiple surveillance camera units, networking components, and a remote monitoring station;
FIG. 3 is a block diagram of the components of the surveillance camera unit;
FIG. 4 is a flowchart illustrating the steps for one embodiment of a method of surveillance;
FIG. 5 is a flowchart illustrating the steps involved in receiving the input audio and detecting the triggering event therefrom; and
FIG. 6 is a flowchart showing an execution flow following the detection of the triggering event in accordance with various embodiments of the present invention.
Common reference numerals are used throughout the drawings and the detailed description to indicate the same elements.
Mode for Invention
The detailed description set forth below in connection with the appended drawings is intended as a description of the presently preferred embodiment of the invention, and is not intended to represent the only form in which the present invention may be constructed or utilized. The description sets forth the functions of the invention in connection with the illustrated embodiment. It is to be understood, however, that the same or equivalent functions may be accomplished by different embodiments that are also intended to be encompassed within the scope of the invention. It is further understood that the use of relational terms such as first and second, top and bottom, and the like are used solely to distinguish one from another entity or step without necessarily requiring or implying any actual such relationship or order between such entities or steps.
The illustration of FIG. 1 shows an exemplary environment 10 in which a surveillance system of the present invention may be installed and deployed. By way of example, the environment 10 includes a building 12 that is located on a street corner. There is an entrance 14 that faces one of the streets, and a gated side entrance 16 on another one of the streets. Along the first street, there is an entryway into a parking lot 18. As part of its security arrangements, these various locations are monitored by surveillance camera units 20. In further detail, the parking lot 18 is monitored by a first camera unit 20a in order to monitor, for example, potential carjackings, assault, robberies, vehicle vandalism, and so forth. The entrance 14 is monitored by a pair of second and third camera units 20b, 20c, respectively, to track the people entering and exiting the building 12. The side entrance 16 is monitored by a fourth camera unit 20d to track vehicles and their license plates approaching the gate for access.
Notwithstanding the foregoing installation specificities, it will be appreciated by those having ordinary skill in the art that the surveillance camera units 20 may be installed in different places in the environment 10 for a variety of purposes. As utilized herein, the term location is understood to refer generally to the environment 10 that is being monitored with the surveillance system of the present invention. Furthermore, the term location is also understood to refer to specific segments of the environment 10 that is being monitored by a specific one of the surveillance camera units 20. Along these lines, for purposes of explaining the various features of the surveillance system, different surveillance camera units 20 may also be referred to by the particular installation location, e.g., the first surveillance camera unit 20a that covers the parking lot 18 may also be referred to as the parking lot camera unit.
With reference to the block diagram of FIG. 2, further details of the surveillance system 1 as contemplated by various embodiments of the present invention will be described. As noted above, the surveillance system 1 includes one or more surveillance camera units 20 that are positioned throughout various locations in the environment 10 to record video footage therefrom. The parking lot camera unit 20a, the left front door camera unit 20b, the right front door camera unit 20c, and the side entrance camera unit 20d are connected to an internal network 22.
The surveillance footage recorded by the surveillance camera units 20 may be transmitted to and displayed on a remote monitoring station 24 that is manned by security personnel. The remote monitoring station 24 is likewise connected to the internal network 22.
The actual location of the remote monitoring station 24 need not be limited to within the building 12. Monitoring services performed by the security personnel may be outsourced to third party providers otherwise unaffiliated with the management of the building 12, and may also be remotely located from the same.
As will be described in further detail below, the internal network 22 is contemplated to be a Transmission Control Protocol/Internet Protocol (TCP/IP) network that can be interconnected to other systems on the Internet. Data traffic from each of the surveillance camera units 20 may be aggregated by a hub or switch 26 that similarly complies with the TCP/IP standards of the internal network 22. The use of the TCP/IP network in this context is by way of example only, as many existing networked camera devices are compliant therewith. Other networking standards may, however, be substituted without departing from the present invention.
The remote monitoring station 24 may be a conventional desktop computer having a central processing unit, memory, and input and output devices connected thereto such as keyboards, mice, and display units 28. The remote monitoring station 24 is understood to have software instructions loaded thereon that, when executed, perform various functions involved with accessing and displaying footage from the surveillance camera units 20. As will be described in further detail below, the surveillance camera unit 20 functions as a server, as the term is understood in relation to the TCP/IP internal network 22. The remote monitoring station 24 thus functions as a client requesting data from the server. With a communications link established between the surveillance camera unit 20 and the remote monitoring station 24, however, upon the automated detection of certain events, the surveillance camera unit 20 may notify the remote monitoring station.
In one contemplated variation, the remote monitoring station 24 includes a web browser application such as Internet Explorer from Microsoft Corporation of Redmond, Washington, or Firefox from the Mozilla Foundation. The surveillance camera units 20 are understood to have basic versions of a HyperText Transfer Protocol (HTTP) server and a video streaming server. Via plug-in modules supplementing the functionality of the web browser application with media playback features, data from the video streaming server is processed and displayed on the remote monitoring station.
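By way of a non-limiting sketch of how such a camera-side streaming server might be structured, the following Python fragment serves frames over HTTP using the common multipart/x-mixed-replace (MJPEG) framing that browser plug-ins and media clients can consume. The frame source, port number, and endpoint path are illustrative assumptions for this sketch and are not taken from the embodiment described above.

```python
# Illustrative only: a minimal HTTP streaming endpoint of the kind a
# surveillance camera unit might expose.  Frame capture is mocked; the
# name get_jpeg_frame is a hypothetical placeholder.
from http.server import BaseHTTPRequestHandler, HTTPServer
import time

def get_jpeg_frame() -> bytes:
    """Placeholder for a real camera capture; returns dummy JPEG bytes."""
    return b"\xff\xd8...dummy jpeg payload...\xff\xd9"

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/stream":
            self.send_error(404)
            return
        # multipart/x-mixed-replace is the conventional MJPEG-over-HTTP framing
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=frame")
        self.end_headers()
        try:
            while True:
                frame = get_jpeg_frame()
                self.wfile.write(b"--frame\r\n")
                self.wfile.write(b"Content-Type: image/jpeg\r\n")
                self.wfile.write(f"Content-Length: {len(frame)}\r\n\r\n".encode())
                self.wfile.write(frame + b"\r\n")
                time.sleep(1 / 15)          # roughly 15 frames per second
        except (BrokenPipeError, ConnectionResetError):
            pass                            # client disconnected

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), StreamHandler).serve_forever()
```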
In another contemplated variation, the remote monitoring station 24 is loaded with a dedicated video feed display application such as Maximum® from Axium Technologies, Inc. of Irwindale, California. The video streaming servers of the surveillance camera units 20 communicate directly with such a display application to deliver the recorded surveillance footage.
As best illustrated in FIG. 2, the display 28 may be segregated into four subsections 28a-d, corresponding to each of the surveillance camera units 20a-d in the system 1. A variety of other layouts that conveniently show the different feeds are also envisioned, along with the interactive features that may direct the operation of the surveillance camera units 20.
In further detail shown in FIG. 3, the surveillance camera unit 20 in accordance with one embodiment of the present invention includes an audio module 30, a video module 32, and a central processor 34. The audio module 30 is connected to an acoustic transducer 36 or microphone, which generates an analog electrical signal of the sound from the monitored location. The analog electrical signal is then converted to a digital representation by an analog-to-digital converter (ADC) 38. In some embodiments, the ADC 38 may be incorporated into the audio module 30. Alternatively, the ADC 38 may be a separate, standalone component as shown in the block diagram of FIG. 3. The video module 32 is connected to a video camera 40, which in its most basic form includes a sensor that converts photons of light striking it into a representative video signal. The photons of light are understood to be reflections from the pertinent scene of the monitored location. Any suitable video camera having various lenses, adjustable apertures, and sensor types and resolutions may be utilized.
Referring to the flowchart of FIG. 4, the present invention also contemplates a method of surveillance. The method begins with a step 200 of receiving an input audio signal of the specific location in the environment 10. There is also a subsequent step 202 of receiving an input video signal of the specific location in the environment 10. It is understood that the step 200 of receiving the input audio signal and the step 202 of receiving the input video stream may occur simultaneously, as the operation of the microphone 36 and the audio module 30 are not exclusive of the operation of the video camera 40 and the video module 32. After the input audio signal and the input video stream are received, the method continues with a step 204 of detecting a triggering event based upon such signals. The processing of the audio signal and of the video stream will be described in turn, below.
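As a minimal illustration of how steps 200 through 208 of FIG. 4 might be arranged in software, the sketch below assumes hypothetical audio, video, and central-processor objects whose detection and alarm logic are filled in later; none of these names come from the patent itself.

```python
# A minimal sketch of the FIG. 4 flow; all objects and method names here are
# hypothetical placeholders for the modules described in the text.
def surveillance_cycle(audio_source, video_source,
                       audio_module, video_module, central_processor):
    audio_signal = audio_source.read()      # step 200: receive input audio signal
    video_frames = video_source.read()      # step 202: receive input video stream
    # step 204: each module independently looks for a triggering event
    audio_event = audio_module.detect(audio_signal)
    video_event = video_module.detect(video_frames)
    # steps 206/208: the central processor decides whether to raise and
    # transmit an alarm based on the two notifications
    central_processor.handle(audio_event, video_event,
                             audio_signal, video_frames)
```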
The flowchart of FIG. 5 best illustrates the further detailed steps involved with receiving the audio signal and detecting the triggering event. As noted previously, the analog audio signal is acquired from the monitored environment 10 by the acoustic transducer 36, indicated as step 300. This step is understood to correspond generally to step 200 above. Then, in step 302, the ADC 38 converts the analog signal to a digital representation. The converted digital representation is then fed to the audio module 30 in accordance with step 304, and is analyzed to determine whether the recorded sound signal matches any predefined sonic signature in step 306.
If there is determined to be a match in decision branch 308, then an appropriate event signal indicating the same is generated to the central processor 34 according to step 310. Otherwise, the camera system 20 continues to monitor the environment. It is understood that the foregoing operations on the analog audio signal are continuously performed on a real-time basis.
In further detail, the predefined sonic signature is understood to be a reference sample of a sound associated with the triggering event. A variety of triggering events are contemplated, including vehicle collisions, firearm discharge, graffiti vandalism, assault, robbery, burglary and the like. By way of example, vehicle collisions may have a number of corresponding sounds such as breaking glass, skidding rubber, and crumpling sheet metal. A firearm discharge may include a sound of the explosion of gunpowder and the crack of the bullet reaching supersonic speeds. Graffiti vandalism may have a spray paint can discharge sound, as well as a sound associated with the agitator rolling around within the can. Assault and robbery victims typically scream for help, depending on the perpetrator's ability to silence them. Like vehicle collisions, burglary or other crimes necessitating the destruction of entry barriers are typically accompanied by sounds of breaking glass and similar impacts.
Each of the foregoing sounds has a particular characteristic that may be compared to the input audio signals. So that the surveillance system 1 is able to respond to a variety of situations, the predefined sonic signature for each of aforementioned trigger events may be stored for access by the audio module 30. Referring again to the block diagram of FIG. 3, the surveillance camera unit 20 may include a memory module 42 for this purpose. Because the predefined sonic signatures may be stored for subsequent retrieval even when the surveillance camera unit 20 is powered off, the use of a non-volatile memory device such as Flash is envisioned. Further, because updates of the predefined sonic signatures can be provided, the memory module 42 may be removable, such as a Secure Digital (SD) card.
The input audio signals are from a live environment, so there are many superfluous sounds that may be mixed in with the sounds of interest. In further detail, the input audio signal is understood to have a triggering event component that is the sound of interest, and a background noise component that, to increase accuracy, must be minimized. Thus, the level of the background noise component is normalized to that of earlier recorded background noise components that, in hindsight, did not include the triggering event component. A large sampling of the earlier recorded background noise components may be utilized to build an accurate representation of the noise characteristics for the particular location being monitored in the environment 10. Because different points throughout the day, different days of the week, and different months may have different noise characteristics, each such time division may have its own noise normalization levels. This profiling of noise is therefore understood to be intelligent and self-educational.
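One possible realization of this self-educating noise profile is sketched below. It assumes the digitized audio arrives as fixed-length sample blocks and that a running RMS level keyed by weekday and hour is an adequate summary of the background; both the bucketing scheme and the smoothing factor are choices made for this sketch, not requirements of the system.

```python
# Sketch of a self-updating background-noise profile per time-of-day bucket.
import math
import time
from collections import defaultdict

class NoiseProfile:
    def __init__(self, smoothing: float = 0.05):
        self.smoothing = smoothing
        self.levels = defaultdict(lambda: None)   # (weekday, hour) -> RMS level

    @staticmethod
    def _rms(block):
        return math.sqrt(sum(s * s for s in block) / len(block))

    @staticmethod
    def _bucket(t=None):
        lt = time.localtime(t)
        return (lt.tm_wday, lt.tm_hour)

    def update(self, block, t=None):
        """Fold a block that contained no triggering event into the profile."""
        key = self._bucket(t)
        level = self._rms(block)
        prev = self.levels[key]
        self.levels[key] = level if prev is None else (
            (1 - self.smoothing) * prev + self.smoothing * level)

    def normalize(self, block, t=None):
        """Scale a block so its background level matches the learned profile."""
        ref = self.levels[self._bucket(t)]
        level = self._rms(block)
        if not ref or level == 0:
            return list(block)
        return [s * (ref / level) for s in block]
```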
As indicated above, the audio module 30 compares the input audio signal and the predefined sound signature to determine whether the triggering event occurred. With the input audio signal being in digital form, various digital signal processing (DSP) algorithms may be utilized to determine the degree of similarity. Along these lines, the audio module 30 may be a dedicated DSP microprocessor such as the DaVinci® line of devices, including the DM6446 integrated circuit, from Texas Instruments of Dallas, Texas. It is understood that these DSP devices have architectures that are specially designed for signal processing applications, such as fast multiply-accumulate (MAC) operations, single instruction multiple data (SIMD) operations, and so forth. Those having ordinary skill in the art will be able to readily ascertain an appropriate substitute device.
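As an illustration of one such similarity measure, the sketch below uses a sliding normalized cross-correlation between the digitized input block and a stored signature. The particular metric and the 0.7 threshold are assumptions made for this example; on a DaVinci-class DSP the inner products would simply map onto its MAC/SIMD units, but plain NumPy stands in here.

```python
# Illustrative signature comparison by normalized cross-correlation.
import numpy as np

def matches_signature(audio: np.ndarray, signature: np.ndarray,
                      threshold: float = 0.7) -> bool:
    """Return True when the signature appears somewhere in the audio block."""
    audio = audio.astype(np.float64)
    signature = signature.astype(np.float64)
    sig_norm = np.linalg.norm(signature)
    if sig_norm == 0 or len(audio) < len(signature):
        return False
    # Sliding dot products of the signature against every window of the block
    corr = np.correlate(audio, signature, mode="valid")
    # Energy of each window, used to normalize the correlation scores
    window_energy = np.sqrt(np.convolve(audio ** 2,
                                        np.ones(len(signature)), mode="valid"))
    window_energy[window_energy == 0] = np.inf
    score = np.max(corr / (window_energy * sig_norm))
    return score >= threshold
```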
Turning now to the video analytics feature of the surveillance camera unit 20, as noted above, the method of surveillance according to one embodiment of the present invention includes the step 204 of detecting the triggering event. The footage captured by the video camera 40 is evaluated by the video module 32 for particular events that may be unfolding. The evaluation procedure is understood to be built upon several basic image processing algorithms that involve an analysis of a sequence of image frames of the input video stream, so a DSP device may be utilized. It is possible to utilize two independent devices for the audio module 30 and the video module 32, but it is also contemplated that a single device may perform both functions. The exact circuit implementation is not intended to be limiting.
According to one embodiment, the video module 32 may utilize the OnBoard™ Application Programming Interface (API) from ObjectVideo, Inc. of Reston, VA, though any other video analytics libraries may be substituted. From the footage, the video module 32 is capable of differentiating between different objects that may appear, including people, vehicles, and other items such as luggage. When such objects cross over a predefined boundary, a tripwire event notification may be generated. Further sophisticated analyses are possible with a second predefined boundary, and various rules relative to the first boundary may be defined.
Additionally, when objects enter or exit an area of interest, another event notification may be generated. Similar to the enter or exit event, the video module 32 is capable of detecting when an object appears or disappears from an area of interest without first appearing or subsequently disappearing, respectively, from the periphery. When objects are taken away or left behind, another event notification may be generated. In order to reduce the possibility of false positives, the video module 32 may include the ability to filter out objects that are too small or too large, as well as objects that change size or shape too rapidly.
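The following sketch illustrates tripwire and area-of-interest rules of the kind described above in generic form. It assumes an upstream detector and tracker already supply per-frame object centroids and bounding-box areas; it does not reproduce the OnBoard API, whose actual interfaces are not described here, and the size bounds are arbitrary examples.

```python
# Generic tripwire / area-of-interest rule sketch over tracked centroids.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    obj_id: int
    cx: float        # centroid x
    cy: float        # centroid y
    area: float      # bounding-box area in pixels

def _side(p, a, b):
    """Sign of point p relative to the directed line a -> b (cross product)."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def crossed_tripwire(prev: TrackedObject, curr: TrackedObject,
                     wire_a, wire_b) -> bool:
    """True when the centroid moved from one side of the line through the wire
    to the other; a full implementation would also verify that the crossing
    point lies within the wire segment itself."""
    return _side((prev.cx, prev.cy), wire_a, wire_b) * \
           _side((curr.cx, curr.cy), wire_a, wire_b) < 0

def plausible_size(obj: TrackedObject, min_area=200, max_area=50_000) -> bool:
    """Filter out objects that are too small or too large to be of interest."""
    return min_area <= obj.area <= max_area

def in_area_of_interest(obj: TrackedObject, x0, y0, x1, y1) -> bool:
    """True when the object centroid lies inside the rectangular area of interest."""
    return x0 <= obj.cx <= x1 and y0 <= obj.cy <= y1
```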
Again, like the audio analytics, a particular sequence of events detected from the surveillance footage by the video module 32 is understood to be representative of a specific triggering event such as a vehicle crash, a theft, an assault, a robbery, and the like. For example, if a vehicle is observed crossing a second tripwire before a first tripwire when the normal flow of traffic should be the opposite, then it can be determined that the vehicle is travelling in the wrong direction. As another example, when an object is left behind, there is a possibility that it could contain dangerous explosives with the potential to cause serious harm, whereas when an object normally within the area of interest suddenly disappears, it may have been stolen. Prior to committing a crime, people tend to loiter in a location to conduct reconnaissance and/or to pick a suitable victim, though it is just as likely for people to loiter when waiting for someone to arrive.
When the triggering event is detected by either the audio module 30 or the video module 32, a notification to that effect is provided to the central processor 34. Various embodiments of the present invention contemplate different ways to proceed based upon the sequence in which the notifications from the audio module 30 and the video module 32 are received. It is envisioned that such functionality reduces the need for constant human monitoring; human monitoring and action are necessary only when potential events are identified.
FIG. 6 is a flowchart illustrating one possible execution flow. Beginning with step 400a, which generally corresponds to step 200 above, the input audio signal is received. Further, in step 400b, which generally corresponds to step 202 above, the input video stream is received. In steps 410a and 410b, a triggering event is detected by the respective one of the audio module 30 and the video module 32. These two steps are understood to correspond to step 204 above. Here, the triggering events detected by the audio module 30 and the video module 32 may be based off the same occurrence in the monitored location, and if so, the two will generate their respective event notifications to the central processor 34 at the same time. If the event notifications are not based off the same occurrence, the notifications will generally be received at different times. This evaluation is made in decision branch 420.
The method of surveillance shown in the flowchart of FIG. 4 continues with a step 206 of generating an alarm in response to the received event notifications. Along these lines, and referring back to the flowchart of FIG. 6, only if the notifications are concurrently received does the central processor 34 generate an alarm signal according to corresponding step 430.
Optionally, prior to generating the alarm signal, specific location information in the form of Global Positioning Satellite (GPS) coordinates may be generated in step 428. As best shown in FIG. 3, the surveillance camera unit 20 includes a GPS receiver 44 that is connected to the central processor 34. The acquisition of GPS coordinates is well known in the art, and a further description of the same will be omitted. Once generated, the GPS coordinates are incorporated into the alarm signal according to step 429.
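Purely as an illustration of incorporating the coordinates into the alarm signal, the following Python sketch assembles a hypothetical alarm payload. The JSON layout, field names, and camera identifier are assumptions for the example only and are not drawn from the embodiment.

import json
import time

def build_alarm(camera_id: str, lat: float, lon: float) -> bytes:
    """Assemble a simple alarm message that carries the GPS coordinates."""
    payload = {
        "type": "ALARM",
        "camera_id": camera_id,
        "timestamp": time.time(),
        "gps": {"lat": lat, "lon": lon},  # coordinates obtained from the GPS receiver 44
    }
    return json.dumps(payload).encode("utf-8")

print(build_alarm("unit-20", 37.5665, 126.9780))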
With or without the GPS coordinates, the central processor 34 then transmits the alarm signal to the remote monitoring station 24 per step 432. The surveillance camera unit 20 includes a network communications module 46 that establishes a data transfer link to the remote monitoring station 24 over the internal network 22. As noted above, the internal network 22 is a TCP/IP network, with the physical cabling being Ethernet. Therefore, the network communications module 46 is understood to include ports to which Ethernet cables can be connected. Alternative network communication modalities such as WiFi may also be utilized, in which case the network communications module 46 would include a wireless transceiver.
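A minimal sketch of such a transmission over a TCP/IP link is shown below in Python. The host address, port, and message framing are placeholders and do not reflect any particular deployment of the network communications module 46.

import socket

def send_alarm(message: bytes, host: str = "192.0.2.10", port: int = 9000) -> None:
    """Open a TCP connection to the monitoring station, send the alarm bytes, and close."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(message)

# Example (requires a listening station at the placeholder address):
# send_alarm(b'{"type": "ALARM", "camera_id": "unit-20"}')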
Following the transmission of the alarm signal, the recorded input video stream and the input audio signal may also be transmitted to the remote monitoring station 24 in accordance with step 434. Similar to the transmitted alarm signal, this data is transmitted by the network communications module 46. Together with the transmission of the alarm signal in step 432, step 434 generally corresponds to a step 208 of transmitting the input audio signal, the input video stream, and the alarm signal to the remote monitoring station 24 as shown in the flowchart of FIG. 4.
In conjunction with transmitting the input audio signal and the input video stream to the remote monitoring station 24, the data can be stored in the memory module 42 for backup purposes according to step 436. As noted above, the memory module 42 may be a portable device that can be removed from the surveillance camera unit 20.
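For illustration, the short Python sketch below appends captured audio and video bytes to files on a removable medium. The mount path and file naming are assumptions standing in for the memory module 42.

from pathlib import Path

BACKUP_ROOT = Path("/mnt/memory_module")  # hypothetical mount point of the memory module 42

def backup(stream_name: str, data: bytes) -> Path:
    """Append raw bytes for the named stream to the backup medium."""
    BACKUP_ROOT.mkdir(parents=True, exist_ok=True)
    target = BACKUP_ROOT / f"{stream_name}.bin"
    with target.open("ab") as fh:
        fh.write(data)
    return target

# Example usage with placeholder data:
# backup("video", b"\x00\x01\x02")
# backup("audio", b"\x03\x04\x05")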
Additionally, and also optionally, devices attached to a peripheral port 48 may be activated in step 438 after the alarm signal is generated, that is, when the triggering event is detected by both the audio module 30 and the video module 32. Exemplary devices that may be connected to the peripheral port 48 include floodlights or strobe lights, as well as alarm sound generators. Such devices may have a startling effect on a perpetrator and direct the attention of nearby security personnel. It will be appreciated that any other suitable device may be so triggered by the central processor 34.
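As a simple illustration, the Python sketch below models the activation of such devices through a hypothetical PeripheralPort interface; an actual unit would drive a relay, GPIO pin, or similar hardware rather than printing messages.

class PeripheralPort:
    """Hypothetical stand-in for devices wired to the peripheral port 48."""

    def __init__(self) -> None:
        self._active: set = set()

    def activate(self, device: str) -> None:
        self._active.add(device)
        print(f"{device} ON")   # placeholder for the actual hardware write

    def deactivate(self, device: str) -> None:
        self._active.discard(device)
        print(f"{device} OFF")

port = PeripheralPort()
for device in ("floodlight", "strobe", "siren"):
    port.activate(device)  # triggered only after both modules confirm the event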
If, on the other hand, the event notifications are not concurrently received as determined in decision branch 420, then the central processor 34 simply generates an event signal per step 440 and transmits the same in step 442. The remote monitoring station 24 may record that there was a possible detection of a triggering event from either the input audio signal or the input video stream, and the display 28 may indicate as much.
The particulars shown herein are by way of example and for purposes of illustrative discussion of the embodiments of the present invention only and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the present invention. In this regard, no attempt is made to show structural details of the present invention in more detail than is necessary for the fundamental understanding of the present invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the present invention may be embodied in practice.

Claims (22)

1. A surveillance system for monitoring a location, comprising:
an audio module having an audio input connected to an acoustic transducer receptive to environmental sound signals from the monitored location;
a video module having a video input connected to an imaging sensor receptive to visual events from the monitored location; and
a central processor connected to the audio module and the video module, the central processor being programmable to generate an alert signal based upon a concurrent detection of an environmental sound signal and a visual event associated with a triggering incident.
2. The system of Claim 1, further comprising:
a memory module connected to the central processor, audio data from the audio module and video data from the video module being stored in the memory module.
3. The system of Claim 1, further comprising:
a global positioning satellite (GPS) receiver connected to the central processor and generating coordinates of the monitored location, the coordinates being included in the alert signal.
4. The system of Claim 1, further comprising:
a remote monitoring station in network communication with the central processor;
wherein audio data generated by the audio module, video data generated by the video module, and the alert signal are transmitted to the remote monitoring station.
5. The system of Claim 4, wherein the remote monitoring station is in communication with other central processors each connected to a separate audio module and a separate video module monitoring different segments of the monitored location.
6. The system of Claim 4, further comprising:
a network communications module connected to the central processor, the network communications module establishing a link to the remote monitoring station over a network.
7. The system of Claim 6, wherein the network link conforms to the Transmission Control Protocol/Internet Protocol (TCP/IP) networking standard.
8. The system of Claim 6, wherein the network link is wired Ethernet.
9. The system of Claim 1, wherein:
the audio module includes an analog to digital converter (ADC) for converting the environmental sound signals to a representative audio data stream provided to the central processor.
10. The system of Claim 1, wherein the triggering incident is defined at least in part by a sonic signature to which the received environmental sound signal is compared.
11. The system of Claim 10, wherein the sonic signature includes a reference sample of sound associated with the triggering incident.
12. The system of Claim 1, wherein the central processor is programmable to generate an event notification based upon a detection of solely the environmental sound signal associated with the triggering incident.
13. The system of Claim 1, wherein the central processor is programmable to generate an event notification based upon a detection of solely the visual event associated with the triggering incident.
14. A method of surveillance of a location for triggering events, the method comprising:
receiving an input audio signal of the location;
receiving an input video stream of the location;
detecting a specific one of the triggering events based upon the input audio signal matching a predefined sonic signature and the input video stream matching a predefined image sequence, the predefined sonic signature and the predefined image sequence being associated with the specific one of the triggering events;
generating an alarm in response to the detection of the specific one of the triggering events; and
transmitting the input audio signal, the input video stream, and the alarm to a remote monitor.
15. The method of Claim 14, wherein the predefined sonic signature is a reference sample of a sound associated with the specific one of the triggering events.
16. The method of Claim 15, wherein the input audio signal has a background noise component and a triggering event component.
17. The method of Claim 16, wherein matching the input audio signal to the predefined sonic signature includes a comparison of the triggering event component to the reference sample.
18. The method of Claim 16, further comprising:
normalizing the level of the background noise component of the input audio signal to the level of the background noise component of an earlier recorded input audio signal of the location.
19. The method of Claim 15, further comprising:
storing an other predefined sonic signature of a different triggering event, the other predefined sonic signature including a reference sample of a sound associated with the different triggering event.
20. The method of Claim 14, wherein subsequent to receiving the input audio signal, the method further includes converting the input audio signal to a digital representation.
21. The method of Claim 14, further comprising:
storing the input audio signal and the input video stream to a memory module.
22. The method of Claim 14, further comprising:
receiving a set of coordinates of the location from a GPS receiver; and
transmitting the coordinates to the remote monitor.
PCT/KR2009/005366 2009-08-25 2009-09-21 Method and system for combined audio-visual surveillance cross-reference to related applications Ceased WO2011025085A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US54742109A 2009-08-25 2009-08-25
US12/547,421 2009-08-25

Publications (1)

Publication Number Publication Date
WO2011025085A1 true WO2011025085A1 (en) 2011-03-03

Family

ID=43628164

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2009/005366 Ceased WO2011025085A1 (en) 2009-08-25 2009-09-21 Method and system for combined audio-visual surveillance cross-reference to related applications

Country Status (2)

Country Link
KR (1) KR20110025886A (en)
WO (1) WO2011025085A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101976443B1 (en) 2012-11-08 2019-05-09 한화테크윈 주식회사 System and method for detecting audio data
KR101969504B1 (en) 2017-05-02 2019-04-16 서강대학교산학협력단 Sound event detection method using deep neural network and device using the method
KR20210157105A (en) 2020-06-19 2021-12-28 주식회사 케이티 Device, method and computer program for detecting movement of object

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001256578A (en) * 2000-03-09 2001-09-21 Nippon Avionics Co Ltd Remote security system
JP2002042272A (en) * 2000-07-26 2002-02-08 Minolta Co Ltd Monitor, monitoring method, and record medium readable of computer recording monitor program
JP2002245572A (en) * 2001-02-16 2002-08-30 Takuto:Kk Remote monitoring type dynamic image security system
JP2007188156A (en) * 2006-01-11 2007-07-26 Jcreation Co Ltd Security device, security method, and security program
JP2007188406A (en) * 2006-01-16 2007-07-26 Jcreation Co Ltd Security system, security method, and security program

Cited By (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9318007B2 (en) 2011-12-06 2016-04-19 Southern Imperial, Inc. Signal emitting retail device
US8629772B2 (en) 2011-12-06 2014-01-14 Southern Imperial, Inc. Signal emitting retail device
US8803687B2 (en) 2011-12-06 2014-08-12 Southern Imperial, Inc. Retail system signal receiver unit for recognizing a preset audible alarm tone
WO2013086088A1 (en) * 2011-12-06 2013-06-13 Southern Imperial, Inc. Retail system signal receiver unit
US9318008B2 (en) 2011-12-06 2016-04-19 Southern Imperial, Inc. Signal emitting retail device
US8884761B2 (en) 2012-08-21 2014-11-11 Southern Imperial, Inc. Theft detection device and method for controlling
US9324220B2 (en) 2012-08-21 2016-04-26 Southern Imperial, Inc. Theft detection device and method for controlling same
US9451214B2 (en) 2012-08-27 2016-09-20 Korea University Research And Business Foundation Indoor surveillance system and indoor surveillance method
CN104183096A (en) * 2013-05-22 2014-12-03 张平 Patient nursing system and method
US10922935B2 (en) 2014-06-13 2021-02-16 Vivint, Inc. Detecting a premise condition using audio analytics
US20170223314A1 (en) * 2016-01-29 2017-08-03 John K. Collings, III Limited Access Community Surveillance System
WO2017211206A1 (en) * 2016-06-08 2017-12-14 中兴通讯股份有限公司 Video marking method and device, and video monitoring method and system
CN107483879A (en) * 2016-06-08 2017-12-15 中兴通讯股份有限公司 Video marker method, apparatus and video frequency monitoring method and system
CN107483879B (en) * 2016-06-08 2020-06-09 中兴通讯股份有限公司 Video marking method and device and video monitoring method and system
US10997839B2 (en) 2017-01-23 2021-05-04 Fasteners For Retail, Inc. Retail merchandise hook with radio transmission
US11295591B2 (en) 2017-01-23 2022-04-05 Fasteners For Retail, Inc. Anti-theft retail merchandise hook with radio transmission
US11663893B2 (en) 2017-01-23 2023-05-30 Fasteners For Retail, Inc. Anti-theft retail merchandise hook with radio transmission
US10121341B2 (en) 2017-01-23 2018-11-06 Southern Imperial Llc Retail merchandise hook with radio transmission
US11990013B2 (en) 2017-01-23 2024-05-21 Fasteners for Retails, Inc. Anti-theft retail merchandise hook with radio transmission
US10720035B2 (en) 2017-01-23 2020-07-21 Fasteners For Retail, Inc. Anti-theft retail merchandise hook with radio transmission
US12437620B2 (en) 2017-01-23 2025-10-07 Fasteners For Retail, Inc. Anti-theft retail merchandise hook with radio transmission
US12260140B2 (en) 2017-03-31 2025-03-25 Honeywell International Inc. Providing a comfort dashboard
US12393385B2 (en) 2017-03-31 2025-08-19 Honeywell International Inc. Providing a comfort dashboard
CN107770244A (en) * 2017-09-07 2018-03-06 深圳市盛路物联通讯技术有限公司 Data transfer control method and Related product
US12307865B2 (en) 2018-03-21 2025-05-20 Fasteners For Retail, Inc. Anti-theft device with remote alarm feature
US11317738B2 (en) 2018-03-21 2022-05-03 Fasteners For Retail, Inc. Anti-theft retail merchandise pusher with remote alarm feature
US11737579B2 (en) 2018-03-21 2023-08-29 Fasteners For Retail, Inc. Anti-theft retail merchandise pusher with remote alarm feature
US10993550B2 (en) 2018-03-21 2021-05-04 Fasteners For Retail, Inc. Anti-theft retail merchandise pusher with remote alarm feature
US10885753B2 (en) 2018-03-21 2021-01-05 Fasteners For Retail, Inc. Anti-theft device with remote alarm feature
US12144438B2 (en) 2018-03-21 2024-11-19 Fasteners For Retail, Inc. Anti-theft retail merchandise pusher with remote alarm feature
US11605276B2 (en) 2018-03-21 2023-03-14 Fasteners For Retail, Inc. Anti-theft device with remote alarm feature
US11626004B2 (en) 2018-09-05 2023-04-11 Honeywell International, Inc. Methods and systems for improving infection control in a facility
US11288945B2 (en) 2018-09-05 2022-03-29 Honeywell International Inc. Methods and systems for improving infection control in a facility
US10978199B2 (en) 2019-01-11 2021-04-13 Honeywell International Inc. Methods and systems for improving infection control in a building
US12131821B2 (en) 2019-01-11 2024-10-29 Honeywell International Inc. Methods and systems for improving infection control in a building
US11887722B2 (en) 2019-01-11 2024-01-30 Honeywell International Inc. Methods and systems for improving infection control in a building
US12183453B2 (en) 2019-01-11 2024-12-31 Honeywell International Inc. Methods and systems for improving infection control in a building
US11363894B2 (en) 2019-04-05 2022-06-21 Fasteners For Retail, Inc. Anti-theft pusher with incremental distance detection
US11707141B2 (en) 2019-04-05 2023-07-25 Fasteners For Retail, Inc. Anti-theft pusher with incremental distance detection
US12137819B2 (en) 2019-04-05 2024-11-12 Fasteners For Retail, Inc. Anti-theft pusher with incremental distance detection
US12150564B2 (en) 2019-09-30 2024-11-26 Fasteners For Retail, Inc. Anti-sweeping hook with integrated loss prevention functionality
US11727773B2 (en) 2020-04-02 2023-08-15 Fasteners For Retail, Inc. Anti-theft device with cable attachment
US11087601B1 (en) 2020-04-02 2021-08-10 Fasteners For Retail, Inc Anti-theft device with cable attachment
US12148275B2 (en) 2020-04-02 2024-11-19 Fasteners For Retail, Inc. Anti-theft device with cable attachment
USD1019446S1 (en) 2020-04-16 2024-03-26 Fasteners For Retail, Inc. Security tag holder
USD1019444S1 (en) 2020-04-16 2024-03-26 Fasteners For Retail, Inc. Security tag holder
USD1019445S1 (en) 2020-04-16 2024-03-26 Fasteners For Retail, Inc. Security tag holder
US11620594B2 (en) 2020-06-12 2023-04-04 Honeywell International Inc. Space utilization patterns for building optimization
US12210986B2 (en) 2020-06-12 2025-01-28 Honeywell International Inc. Space utilization patterns for building optimization
US11783658B2 (en) 2020-06-15 2023-10-10 Honeywell International Inc. Methods and systems for maintaining a healthy building
US11783652B2 (en) 2020-06-15 2023-10-10 Honeywell International Inc. Occupant health monitoring for buildings
US12437597B2 (en) 2020-06-15 2025-10-07 Honeywell International Inc. Methods and systems for maintaining a healthy building
US12406218B2 (en) 2020-06-15 2025-09-02 Honeywell International Inc. Dashboard for multi site management system
US11914336B2 (en) 2020-06-15 2024-02-27 Honeywell International Inc. Platform agnostic systems and methods for building management systems
US12282975B2 (en) 2020-06-19 2025-04-22 Honeywell International Inc. Systems and methods for reducing risk of pathogen exposure within a space
US11823295B2 (en) 2020-06-19 2023-11-21 Honeywell International, Inc. Systems and methods for reducing risk of pathogen exposure within a space
US11778423B2 (en) 2020-06-19 2023-10-03 Honeywell International Inc. Using smart occupancy detection and control in buildings to reduce disease transmission
US11184739B1 (en) 2020-06-19 2021-11-23 Honeywell International Inc. Using smart occupancy detection and control in buildings to reduce disease transmission
US12131828B2 (en) 2020-06-22 2024-10-29 Honeywell International Inc. Devices, systems, and methods for assessing facility compliance with infectious disease guidance
US12142385B2 (en) 2020-06-22 2024-11-12 Honeywell International Inc. Methods and systems for reducing a risk of spread of disease among people in a space
US11619414B2 (en) 2020-07-07 2023-04-04 Honeywell International Inc. System to profile, measure, enable and monitor building air quality
US12135137B2 (en) 2020-08-04 2024-11-05 Honeywell International Inc. Methods and systems for evaluating energy conservation and guest satisfaction in hotels
US11402113B2 (en) 2020-08-04 2022-08-02 Honeywell International Inc. Methods and systems for evaluating energy conservation and guest satisfaction in hotels
US12424329B2 (en) 2020-09-30 2025-09-23 Honeywell International Inc. Dashboard for tracking healthy building performance
US11894145B2 (en) 2020-09-30 2024-02-06 Honeywell International Inc. Dashboard for tracking healthy building performance
US11662115B2 (en) 2021-02-26 2023-05-30 Honeywell International Inc. Hierarchy model builder for building a hierarchical model of control assets
US11372383B1 (en) 2021-02-26 2022-06-28 Honeywell International Inc. Healthy building dashboard facilitated by hierarchical model of building control assets
US11599075B2 (en) 2021-02-26 2023-03-07 Honeywell International Inc. Healthy building dashboard facilitated by hierarchical model of building control assets
US12111624B2 (en) 2021-02-26 2024-10-08 Honeywell International Inc. Healthy building dashboard facilitated by hierarchical model of building control assets
US11815865B2 (en) 2021-02-26 2023-11-14 Honeywell International, Inc. Healthy building dashboard facilitated by hierarchical model of building control assets
US12142382B2 (en) 2021-03-01 2024-11-12 Honeywell International Inc. Airborne infection early warning system
US11474489B1 (en) 2021-03-29 2022-10-18 Honeywell International Inc. Methods and systems for improving building performance
CN113573019A (en) * 2021-07-13 2021-10-29 广东晋华建设工程有限公司 Safety management and control system for construction based on constructional engineering
US11722763B2 (en) 2021-08-06 2023-08-08 Motorola Solutions, Inc. System and method for audio tagging of an object of interest
US12437262B2 (en) 2021-08-23 2025-10-07 Fasteners For Retail, Inc. Anti-sweeping hook with integrated inventory monitoring and/or loss prevention functionality
US12038187B2 (en) 2021-09-28 2024-07-16 Honeywell International Inc. Multi-sensor platform for a building
US12261448B2 (en) 2022-06-07 2025-03-25 Honeywell International Inc. Low power sensor with energy harvesting
GB2620594A (en) * 2022-07-12 2024-01-17 Ava Video Security Ltd Computer-implemented method, security system, video-surveillance camera, and server
GB2620594B (en) * 2022-07-12 2024-09-25 Ava Video Security Ltd Computer-implemented method, security system, video-surveillance camera, and server
USD1051753S1 (en) 2022-12-21 2024-11-19 Fasteners For Retail, Inc. Security tag housing
US12367871B2 (en) 2022-12-29 2025-07-22 Motorola Solutions, Inc. Automated detection and tracking of conversations of interest in crowded areas
US12431621B2 (en) 2023-01-26 2025-09-30 Honeywell International Inc. Compact dual band antenna

Also Published As

Publication number Publication date
KR20110025886A (en) 2011-03-14

Similar Documents

Publication Publication Date Title
WO2011025085A1 (en) Method and system for combined audio-visual surveillance cross-reference to related applications
KR100882890B1 (en) Surveillance system and method
US20090121861A1 (en) Detecting, deterring security system
JP3974038B2 (en) Intruder detection using trajectory analysis in surveillance and reconnaissance systems
US7015943B2 (en) Premises entry security system
JP4101655B2 (en) Apparatus and method for resolving entry / exit conflict in security monitoring system
CN1256694C (en) Method and apparatus for reducing false alarms at exit/entry locations for residential security monitoring
US20070019077A1 (en) Portable surveillance camera and personal surveillance system using the same
KR20150041939A (en) A door monitoring system using real-time event detection and a method thereof
KR100857073B1 (en) Real time remote monitoring system
WO2012137994A1 (en) Image recognition device and image-monitoring method therefor
JP2002344953A (en) Image pickup device for crime-prevention
RU2120139C1 (en) Guarding and monitoring system
JP2007128390A (en) Monitor system
CN208506918U (en) A kind of intelligent safety and defence system
KR101046819B1 (en) Intrusion monitoring method and intrusion monitoring system by software fence
JP2002183845A (en) Security equipment
KR101741312B1 (en) Real-time monitoring system for home
WO2017099262A1 (en) Method for managing cctv system having image screen comparison units
JP2003317168A (en) How to collect information on illegal activities, illegal activities, etc.
CN114830620A (en) Tamper detection on camera
JP2000132774A (en) Intruder monitoring system
JPH0869579A (en) Monitor device
JP2005250634A (en) Automatic monitoring method and device
JP3039188U (en) Surveillance recorder

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 20097020699

Country of ref document: KR

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09848788

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC

122 Ep: pct application non-entry in european phase

Ref document number: 09848788

Country of ref document: EP

Kind code of ref document: A1