
EP4459585A1 - Surveillance system using 3d-cameras

Info

Publication number
EP4459585A1
EP4459585A1 (application EP23171021.1A)
Authority
EP
European Patent Office
Prior art keywords
person
tracked
gate unit
analytics module
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23171021.1A
Other languages
German (de)
French (fr)
Inventor
Omar Tello
Dominik Laubach
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sensalytics GmbH
Original Assignee
Sensalytics GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sensalytics GmbH filed Critical Sensalytics GmbH
Priority to EP23171021.1A priority Critical patent/EP4459585A1/en
Publication of EP4459585A1 publication Critical patent/EP4459585A1/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 - Burglar, theft or intruder alarms
    • G08B 13/18 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 - Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19602 - Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B 13/19608 - Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • G08B 13/19613 - Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B 13/19665 - Details related to the storage of video surveillance data
    • G08B 13/19671 - Addition of non-video data, i.e. metadata, to video stream

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Library & Information Science (AREA)
  • Alarm Systems (AREA)

Abstract

A surveillance system (1) using 3D-cameras for tracking persons (6) within a given area is suggested, the system (1) comprising: a sensor arrangement comprising at least one 3D-camera (2) configured to track a person within said area (8), wherein the tracking is performed without determining the identity of the person (6), wherein the system (1) is further configured to assign a unique identifier (ID) to each tracked person (6) and to generate data concerning the position or a change of position of a tracked person (6); a data analytics module (12) configured to analyze data received from the at least one 3D-camera (2) and to determine based thereon, if a tracked person (6) shows a particular behavior, and a gate unit (3) arranged at an exit of the area (8). The data analytics module (12) is further configured to: determine if a person (6) enters an exit zone (4); and to determine if a person (6) entering the exit zone (4) is a person (6) who has shown a particular behavior before, and if so, to transmit a control signal to the gate unit (3).

Description

    TECHNICAL FIELD
  • The present invention relates to a surveillance system using 3D-cameras for tracking persons within a given area.
  • BACKGROUND
  • In order to prevent customers from taking articles outside the store without paying, many shops are equipped with surveillance systems for theft prevention. Known surveillance systems may include a security tag that is removably attached to an item and exit gates having sensors for detecting tags on unpaid items. The exit gates may be automatically closed if a person tries to pass through the gate with an unpaid item. Simpler gates just trigger an optical or acoustical alarm when they detect an unpaid article.
  • The surveillance systems described above require all items to be tagged and unpaid items to be detected at the exit of the shop. Other known surveillance systems use video cameras and image recognition software in order to monitor persons in shops or other environments. However, video surveillance using regular video cameras and image recognition software is relatively complex and expensive.
  • An object of the present invention is to provide a surveillance system using 3D cameras that is less complex and less expensive.
  • SUMMARY
  • According to an aspect of the present disclosure, a surveillance system using 3D-cameras for tracking persons within a given area is provided. The system comprises: a sensor arrangement comprising at least one 3D-camera configured to continuously track a person within said area, wherein the tracking is preferably performed without determining the identity of the person. The system is further configured to assign a unique identifier (ID) to each tracked person and to generate metadata concerning the position or a change of position of a tracked person from video data recorded by the at least one 3D camera. The system further comprises a data analytics module configured to analyze data received from the at least one 3D-camera and to determine based thereon, if a tracked person shows a particular behavior, and a gate unit arranged at an exit of the area. The data analytics module is further configured to: determine if a person enters an exit zone and to determine if a person entering the exit zone is a person who has shown a particular behavior and if so, to send a control signal to the gate unit. The data analytics module preferably performs the tracking and data analysis in real time.
  • The above-mentioned unique identifier (ID) is used to more easily differentiate one person from the other persons present in the monitored area. Such an ID may for example be an image, a number, a matrix of numbers, a vector of numbers or a combination thereof. In one embodiment the unique identifier (ID) is created based on three coordinates: one x-component, one y-component and one z-component. The z-component may represent the height of the person. In a preferred embodiment, the unique identifier is not coupled to the identity of the person being tracked. As an example, each tracked person can appear on a screen as a dot that is assigned a specific number (ID).
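  • Purely as an editorial illustration (not part of the patent disclosure or claims), the following Python sketch shows one way an anonymous track identifier could be derived: a simple counter is paired with the three coordinates mentioned above. All names and the counter-based scheme are assumptions.

```python
import itertools
from dataclasses import dataclass

# Monotonically increasing counter used as the source of anonymous IDs;
# the ID carries no personal information, only a number.
_id_counter = itertools.count(1)

@dataclass
class Track:
    """An anonymous track: a numeric ID plus coarse geometric features."""
    track_id: int
    x: float   # floor-plane position, x-component (metres)
    y: float   # floor-plane position, y-component (metres)
    z: float   # z-component, e.g. the tracked person's height (metres)

def new_track(x: float, y: float, z: float) -> Track:
    """Create a new anonymous track for a person first detected at (x, y, z)."""
    return Track(track_id=next(_id_counter), x=x, y=y, z=z)

if __name__ == "__main__":
    t = new_track(12.4, 3.1, 1.82)   # person first detected near the entrance
    print(f"dot #{t.track_id} at ({t.x}, {t.y}), height {t.z} m")
```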
  • By assigning a unique identifier (ID) to the person, tracking of the person may be performed without identifying the identity of the person (i.e. without a high-resolution image or the person's facial attributes, contact information, name etc.). Information such as height, color of clothing and similar features is sufficient to assign a unique identifier (ID) and track the person across the monitored area.
  • In a preferred embodiment the tracking is performed using a sensor arrangement that does not use, for example, facial recognition algorithms to identify the person. Identifying a person using facial recognition and similar identification methods may cause problems with personal integrity and with storing such information. Furthermore, methods using facial recognition require complex and power-consuming systems. The present invention, which tracks a person without identifying the identity of the person, thus eliminates problems around personal integrity and provides a less complex tracking system.
  • According to the present disclosure, the surveillance system uses 3D-cameras or stereoscopic cameras, respectively, for tracking persons. A stereo camera has two or more lenses, each with a separate image sensor, which gives it the ability to capture three-dimensional images. 3D cameras which can be used for the surveillance system according to the present disclosure are for instance cameras from Hella Aglaia, XOVIS or Intenta.
  • In an embodiment of the present disclosure the surveillance system is configured to generate metadata from video data recorded by the at least one 3D camera, the metadata (also referred to as "non-video data") comprising one or a combination of the following: a unique identifier (ID) for each tracked person; the position or a change of position of a tracked person; the head and/or foot position of a tracked person; or other geometrical data related to the body of the person being tracked. In this specification "metadata" or "non-video data" is information extracted from the video data recorded by the at least one 3D camera.
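  • As an illustrative sketch only (the patent does not prescribe any particular data structure), such a per-frame metadata record could be modelled as follows; all field names are assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class PersonMetadata:
    """Non-video data extracted from one 3D-camera frame for one tracked person."""
    track_id: int                          # unique identifier (ID)
    timestamp: float                       # seconds since start of tracking
    position: Point3D                      # position in the global coordinate system
    head_position: Optional[Point3D] = None
    foot_position: Optional[Point3D] = None

# Example record: person #7 standing in an aisle at t = 42.5 s
sample = PersonMetadata(
    track_id=7,
    timestamp=42.5,
    position=(8.2, 5.6, 0.0),
    head_position=(8.2, 5.6, 1.78),
    foot_position=(8.2, 5.6, 0.02),
)
```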
  • According to one aspect, the data analytics module may be configured to determine, based on the metadata, whether a tracked person shows a particular behavior or not. Alternatively, the data analytics module may additionally be configured to determine, based on the (raw) video data recorded by the at least one 3D camera, whether a tracked person shows a particular behavior.
  • According to one aspect of the present disclosure, the data analytics module may be realized by any number of hardware and/or software components configured to perform the specified functions. The components may all be in one location, for instance in a computer, or distributed across different units of the system. The same also applies to the 3D cameras and other elements of the surveillance system according to the invention.
  • In a specific embodiment of the surveillance system, the above-mentioned control signal is used to trigger an alarm at the gate unit, to block or close the gate unit or to cause the gate unit to function in any desired manner.
  • In one embodiment, the data analytics module is configured to detect a particular behavior of a tracked person by extracting behavioral information from video data recorded by the at least one 3D camera and/or from metadata extracted from the video data. In a specific embodiment, the data analytics module is configured to detect the behavior of a tracked person by analyzing metadata such as the tracked person's position, time, velocity of movement and/or direction of view.
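  • For illustration only, the quantities named above (velocity of movement, direction of movement) can be estimated from two successive position samples of the metadata stream; the sketch below assumes (x, y, z) tuples and is not part of the patent text.

```python
import math
from typing import Sequence, Tuple

Point3D = Tuple[float, float, float]

def velocity_and_heading(positions: Sequence[Point3D],
                         timestamps: Sequence[float]) -> Tuple[float, float]:
    """Estimate speed (m/s) and heading (radians) of a tracked person
    from the last two position samples of its metadata stream."""
    (x0, y0, _), (x1, y1, _) = positions[-2], positions[-1]
    dt = timestamps[-1] - timestamps[-2]
    if dt <= 0:
        return 0.0, 0.0
    dx, dy = x1 - x0, y1 - y0
    return math.hypot(dx, dy) / dt, math.atan2(dy, dx)

# Example: two samples 0.5 s apart, person moved 1 m in x -> speed 2.0 m/s, heading 0 rad
print(velocity_and_heading([(5.0, 2.0, 0.0), (6.0, 2.0, 0.0)], [10.0, 10.5]))
```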
  • Generally speaking, behavior of a person may be classified as action or inaction. An action may include any one or a combination of looking in a direction, reaching for an item of merchandise, traveling along a path at the premises, visiting an aisle or a location at the premises, walking or running at a certain speed, and spending an amount of time at the premises or a specific location. A typical behavior that is indicative of a possible theft may be for instance that a person enters a shop, goes directly to a shelf, takes a product and then quickly leaves the shop without paying.
  • Inaction may include failing to reach for an object wherein an object is dropped or positioned and the individual does not retrieve the dropped object. Inaction may also include failing to walk to a particular location or failure to perform a particular task.
  • Parameters for defining or assessing behavior may include one or more of: a position of a tracked person, direction of movement, direction of view, velocity or acceleration of movement, time, etc. A skilled person may define a particular behavior, provide and/or define characteristics of the particular behavior as desired, and configure the system so as to identify such behavior. 3D cameras usually cannot recognize whether a person has taken a product from a shelf, but the system can determine from various other parameters - such as the person's position, speed of movement, time and travelling path - whether a person should be considered suspicious or not.
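  • As one possible, purely illustrative way of configuring such a behavior definition, the parameters could be combined into a simple rule object; all thresholds and zone names below are invented for the example and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class SuspicionRule:
    """A configurable rule over tracking parameters (thresholds are illustrative)."""
    max_dwell_time_s: float = 60.0         # left the shop very quickly
    min_peak_speed_mps: float = 1.8        # moved unusually fast at some point
    required_zones: tuple = ("shelf",)     # visited a shelf zone ...
    skipped_zones: tuple = ("checkout",)   # ... but never the checkout

    def matches(self, dwell_time_s: float, peak_speed_mps: float,
                visited_zones: Iterable[str]) -> bool:
        visited = set(visited_zones)
        return (
            dwell_time_s <= self.max_dwell_time_s
            and peak_speed_mps >= self.min_peak_speed_mps
            and all(z in visited for z in self.required_zones)
            and not any(z in visited for z in self.skipped_zones)
        )

rule = SuspicionRule()
print(rule.matches(dwell_time_s=45, peak_speed_mps=2.1,
                   visited_zones=["entrance", "shelf", "exit"]))  # True
```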
  • In a preferred embodiment, the sensor arrangement is configured to start tracking a person as soon as the person enters the monitored area. The monitored area may be essentially the total area of a shop or a part thereof, such as a checkout area. In one embodiment, the sensor arrangement continually tracks the person through at least the checkout area.
  • According to another aspect of the present disclosure, a surveillance system using 3D-cameras for tracking persons within a given area is provided that comprises: a sensor arrangement comprising at least one 3D-camera configured to continuously track a person within said area. The system is further configured to assign a unique identifier (ID) to each tracked person and to generate metadata at least concerning the position or a change of position of a tracked person from video data recorded by the at least one 3D camera, as described above. The system comprises a data analytics module configured to analyze the metadata received from the at least one 3D-camera and to determine based thereon, if a tracked person shows a particular behavior. The metadata that is evaluated to determine an individual's behavior may comprise one or a combination of the following parameters: the position or a change of position of a tracked person, the head and/or foot position of a tracked person, time, velocity of movement, etc. In case the system identifies a person that has shown a particular behavior, any desired reaction may be triggered, such as generating an alarm signal or any desired command for a device.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the invention will be described in the following, reference being made to the appended drawings, which illustrate non-limiting examples of how the inventive concept can be reduced to practice.
  • Fig. 1
    shows a top view of a store equipped with a surveillance system according to an embodiment of the present invention;
    Fig. 2
    shows a side view of the store of fig. 1; and
    Fig. 3
    shows a schematic illustration of various components of the surveillance system used for tracking, according to an embodiment of the present disclosure.
    DETAILED DESCRIPTION
  • Particular embodiments of the present disclosure are described hereinbelow with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the disclosure, which may be embodied in various forms. Well-known functions or constructions are not described in detail to avoid obscuring the present disclosure in unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure.
  • Additionally, the present disclosure may be described herein in terms of functional block components and various processing steps. It should be appreciated that such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions.
  • With reference to FIG. 1, a surveillance system according to an embodiment of this disclosure is shown as 1. The system 1 comprises several 3D video cameras 2 mounted on the ceiling of a shop 9 at positions which allow monitoring of basically the entire area of the shop 9 including an exit zone 4. The shop 9 is equipped with a gate unit 3 which may be arranged close to the exit 10. The exit zone 4 may be confined by lines 5.
  • In one embodiment of the invention, the gate unit 3 may be a simple sensor gate comprising one or more sensor units that detect unpaid items and trigger an alarm when a person tries to leave the shop 9 with an unpaid item. As an example, a gate unit 3 according to the present invention may comprise a sensor unit of a well-known Electronic Article Surveillance System (EAS System). In another embodiment, the gate unit 3 may be an automatic passage gate having a barrier that may block the exit 10 and thus prevent a person from leaving the shop 9. The gate unit 3 may comprise a horizontally extending bar or arm which may be opened or closed. The gate unit 3 may also be an ordinary automatic door such as a sliding door.
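  • Since the gate unit 3 may take several forms (alarm-only sensor gate, barrier gate, sliding door), a common software interface is one conceivable way to address them uniformly. The sketch below is an editorial illustration only; the class and method names are assumptions, not part of the patent.

```python
from abc import ABC, abstractmethod

class GateUnit(ABC):
    """Common interface for the gate variants described above (illustrative)."""

    @abstractmethod
    def on_control_signal(self) -> None:
        """React to the control signal sent by the data analytics module."""

class AlarmGate(GateUnit):
    """Simple sensor gate that only raises an acoustic/optical alarm."""
    def on_control_signal(self) -> None:
        print("acoustic/optical alarm triggered at the gate")

class BarrierGate(GateUnit):
    """Passage gate with a barrier (bar, arm or sliding door) that can block the exit."""
    def __init__(self) -> None:
        self.closed = False

    def on_control_signal(self) -> None:
        self.closed = True               # lower the bar / close the sliding door
        print("barrier closed, exit blocked")
```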
  • In operation, the 3D cameras 2 continuously record stereoscopic pictures as a video stream and track a person 6 within the shop 9, wherein the tracking is performed without determining the identity of the person. The system 1 assigns a unique identifier (ID) to each tracked person 6 and generates other metadata concerning at least the position or a change of position of a tracked person 6. A data analytics module 12 shown in fig. 3 is configured to analyze the data received from the 3D-cameras 2 and to determine based thereon, if a tracked person 6 behaves in a certain pattern which is considered suspicious. If such a behavior pattern has been recognized by the system 1 for a person 6, and this person 6 enters the exit zone 4, the system 1 sends a control signal to the gate unit 3 in order to close it or at least to trigger an alarm. To this end, the gate unit 3 may include an acoustical or optical alarm device.
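  • A minimal sketch of this exit-zone check is given below for illustration only. It assumes a rectangular exit zone 4, track objects like those in the earlier sketches (with track_id and position attributes), and a gate object exposing on_control_signal(); the real zone geometry, names and interfaces are not specified by the patent.

```python
from typing import Iterable, Set

# Exit zone 4 approximated as an axis-aligned rectangle in floor coordinates
# (xmin, ymin, xmax, ymax); the real zone may be confined by arbitrary lines 5.
EXIT_ZONE = (0.0, 0.0, 3.0, 2.0)

def in_exit_zone(position, zone=EXIT_ZONE) -> bool:
    """Return True if the (x, y, ...) position lies inside the exit zone."""
    x, y = position[0], position[1]
    xmin, ymin, xmax, ymax = zone
    return xmin <= x <= xmax and ymin <= y <= ymax

def handle_frame(tracks: Iterable, flagged_ids: Set[int], gate) -> None:
    """Send the control signal when a person previously flagged as showing a
    particular behavior enters the exit zone."""
    for track in tracks:
        if track.track_id in flagged_ids and in_exit_zone(track.position):
            gate.on_control_signal()
```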
  • The above-mentioned unique identifier (ID) is used to more easily differentiate one person from the other persons present in the monitored area. Such a value may for example be an image to which a number is assigned. In one embodiment shown in fig. 1, the unique identifier is displayed on a screen as a dot (6) with a specific number (ID) which represents one specific person 6. Fig. 1 shows a person 6 having a specific ID at a first point in time, a movement path as a dashed line and the position of the person 6 at a second point in time.
  • As mentioned above, the data analytics module 12 is configured to analyze the data received from the 3D-cameras 2 and to determine based thereon, if a tracked person 6 behaves in a way which is considered suspicious. To this end, the data analytics module 12 may consider one or more of the following parameters: the position or a change of position of a tracked person, the head position or a direction of view of a tracked person; a velocity or acceleration of movement, time and other suitable parameters by which a person's behavior can be described and assessed.
  • Generally speaking, behavior of a person 6 may be classified as action or inaction. An action may include any one or a combination of looking in a direction, reaching for an item of merchandise, traveling along a path at the premises, visiting an aisle or a location at the premises or spending an amount of time at the premises or a specific location. An action may further include picking up an object that has been placed or left at a particular location, or moving a particular object, such as opening a door, drawer or compartment. Furthermore, an action may include moving to a particular position, a first individual engaging a second individual and/or moving a hand, arm, leg and/or foot in a particular motion. An action may also include positioning a head in a particular direction, such as, for example, looking directly at security personnel or a security camera. Since 3D cameras cannot see a person's details, it may be desirable for certain applications to integrate conventional video cameras into the surveillance system in order to improve the quality of behavioral detection; conventional video cameras capture more detail and allow a person's behavior to be analyzed more closely.
  • Inaction may include failing to reach for an object wherein an object is dropped or positioned and the individual does not retrieve the dropped object. Inaction may also include failing to walk to a particular location or failure to perform a particular task.
  • A skilled person may define any particular user behavior and provide and/or define characteristics of the particular person's behavior as desired and configure the system so as to identify such behavior. To this end, the skilled person can use any known technology such as matching algorithms or AI.
  • Basically, the data analytics module 12 can be configured to detect a particular behavior of a tracked person 6 based on either raw video data recorded by the 3D cameras 2, or based on metadata extracted from the video data. In a specific embodiment, the data analytics module 12 is configured to detect the behavior of a tracked person by analyzing metadata, such as the position, velocity of movement and/or direction of view of a tracked person.
  • Fig. 2 shows the monitored shop 9 from the side. The 3D cameras 2 are mounted on the ceiling and monitor the shop 9 from above. A field of view is indicated by reference number 8. As can be seen, the field of view 8 of the 3D cameras 2 basically covers the entire shop area.
  • Fig. 3 shows components of the surveillance system 1 according to an embodiment of the present invention in a block diagram. The system comprises a computer 11 which is connected to a number of 3D cameras 2 and receives metadata therefrom. The 3D cameras are also linked together, producing a global coordinate system and a unique ID for each person being tracked. The computer 11 includes a data analytics module 12 comprising software for detecting certain user behavior which is considered relevant to recognize possible theft or other criminal activity. To this end, the data analytics module 12 may include a matching algorithm or matching module, such as a comparator, that compares predefined characteristics and/or a predefined model of user behavior with user behavior in the metadata. Indication of a match generates a control signal which is then sent to the gate unit 3. In another embodiment, the data analytics module 12 may also use AI in order to recognize a certain user behavior. In any case, the data analytics module 12 is preferably configured to recognize abnormal or unexpected patterns of behavior and to trigger an alarm at the gate unit 3 in order to alert security or investigators of potentially abnormal scenarios, events or conditions. The data analytics module 12 is preferably configured for real-time analytics.
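  • For illustration only, such a matching module could register predefined behavior models and report which ones the observed metadata features match; the sketch below is an assumed, simplified comparator, not the actual implementation (which, per the text above, could equally use AI).

```python
from typing import Callable, Dict, List

# A behavior model is represented here as a named predicate over a
# per-person feature dictionary; feature names are invented for the example.
BehaviorModel = Callable[[Dict[str, float]], bool]

class MatchingModule:
    """Compares predefined behavior models against observed metadata features."""

    def __init__(self) -> None:
        self._models: Dict[str, BehaviorModel] = {}

    def register(self, name: str, model: BehaviorModel) -> None:
        self._models[name] = model

    def match(self, features: Dict[str, float]) -> List[str]:
        """Return the names of all models the observed behavior matches."""
        return [name for name, model in self._models.items() if model(features)]

matcher = MatchingModule()
matcher.register("grab_and_run",
                 lambda f: f["dwell_time_s"] < 60 and f["peak_speed_mps"] > 1.8)
print(matcher.match({"dwell_time_s": 40, "peak_speed_mps": 2.2}))  # ['grab_and_run']
```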
  • For example, surveillance system 1 includes the data analytics module 12, which may be configured to track a person who moves to a particular location, then stops at that position and looks in a certain direction for a certain time. The data analytics module 12 may also be configured to track abnormal velocity of patrons and/or individuals arriving at or departing from a particular location. A typical arrival and/or departure velocity may be preset or derived algorithmically from previous individuals who arrived at or departed from a particular location over a preset or variable amount of time. Deviation from the typical arrival and/or departure velocity, or any other abnormal behavior, may be used to tag a person 6 as "potentially suspicious". When a person 6 so tagged enters the exit zone 4, an alarm can be triggered at the gate unit 3.
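  • A minimal, purely illustrative sketch of such a learned velocity baseline is given below; the rolling window size and the 3-sigma deviation threshold are assumptions chosen for the example, not values taken from the patent.

```python
import statistics
from collections import deque

class VelocityBaseline:
    """Rolling baseline of arrival/departure speeds observed at one location."""

    def __init__(self, window: int = 200, sigmas: float = 3.0) -> None:
        self._speeds = deque(maxlen=window)   # speeds of previous individuals (m/s)
        self._sigmas = sigmas                 # deviation threshold in standard deviations

    def observe(self, speed_mps: float) -> None:
        """Record the speed of one individual arriving at / departing from the location."""
        self._speeds.append(speed_mps)

    def is_abnormal(self, speed_mps: float) -> bool:
        """True if the speed deviates strongly from the baseline built so far."""
        if len(self._speeds) < 10:            # not enough history yet
            return False
        mean = statistics.fmean(self._speeds)
        stdev = statistics.pstdev(self._speeds)
        return stdev > 0 and abs(speed_mps - mean) > self._sigmas * stdev

baseline = VelocityBaseline()
for s in (1.1, 1.3, 1.2, 1.0, 1.25, 1.15, 1.3, 1.2, 1.1, 1.2):
    baseline.observe(s)
print(baseline.is_abnormal(3.0))   # True: tag person 6 as "potentially suspicious"
```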

Claims (5)

  1. A surveillance system (1) using 3D-cameras for tracking persons (6) within a given area, the system (1) comprising:
    - a sensor arrangement comprising at least one 3D-camera (2) configured to track a person within said area (8), wherein the tracking is performed without determining the identity of the person (6), wherein the system (1) is further configured to assign a unique identifier (ID) to each tracked person (6) and to generate data concerning the position or a change of position of a tracked person (6);
    - a data analytics module (12) configured to analyze data received from the at least one 3D-camera (2) and to determine based thereon, if a tracked person (6) shows a particular behavior, and
    - a gate unit (3) arranged at an exit of the area (8);
    wherein the data analytics module (12) is further configured to:
    - determine if a person (6) enters an exit zone (4); and to
    - determine if a person (6) entering the exit zone (4) is a person (6) who has shown a particular behavior before, and if so, to transmit a control signal to the gate unit (3).
  2. The surveillance system (1) of claim 1, wherein it is configured to generate metadata from video data, the metadata comprising one or a combination of the following data: a unique identifier (ID) for each tracked person (6); the position or a change of position of a tracked person (6), the head position or a foot position or other geometrical data related to the body of the person being tracked; and wherein the data analytics module (12) is configured to determine based on the metadata received, if a tracked person (6) shows a particular behavior.
  3. The surveillance system (1) of claim 1, wherein the control signal triggers an alarm at the gate unit (3), closes the gate unit or causes the gate unit (3) to function in a desired manner.
  4. The surveillance system (1) of claim 1, wherein the data analytics module (12) is configured to detect a particular behavior of a person (6) by extracting behavioral information from any one or a combination of video data recorded by the at least one 3D camera or metadata extracted from the video data.
  5. The surveillance system (1) of claim 1, wherein the data analytics module (12) is configured to trigger an acoustical or optical alarm at the gate unit (3).
EP23171021.1A 2023-05-02 2023-05-02 Surveillance system using 3d-cameras Pending EP4459585A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP23171021.1A EP4459585A1 (en) 2023-05-02 2023-05-02 Surveillance system using 3d-cameras

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP23171021.1A EP4459585A1 (en) 2023-05-02 2023-05-02 Surveillance system using 3d-cameras

Publications (1)

Publication Number Publication Date
EP4459585A1 2024-11-06

Family

ID=86328513

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23171021.1A Pending EP4459585A1 (en) 2023-05-02 2023-05-02 Surveillance system using 3d-cameras

Country Status (1)

Country Link
EP (1) EP4459585A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018147160A (en) * 2017-03-03 2018-09-20 株式会社東芝 Information processing apparatus, information processing method, and program
JP6646176B1 (en) * 2018-07-16 2020-02-14 アクセル ロボティクス コーポレーションAccel Robotics Corp. Autonomous store tracking system
WO2021186751A1 (en) * 2020-03-18 2021-09-23 株式会社 テクノミライ Digital auto-filing security system, method, and program

Similar Documents

Publication Publication Date Title
US9646228B2 (en) Role-based tracking and surveillance
AU2003290998B2 (en) Event driven video tracking system
US7280673B2 (en) System and method for searching for changes in surveillance video
US9881216B2 (en) Object tracking and alerts
EP1371039B1 (en) Automatic system for monitoring persons entering and leaving a changing room
US20080074496A1 (en) Video analytics for banking business process monitoring
US20250225851A1 (en) Monitoring device, suspicious object detecting method, and recording medium
US7295106B1 (en) Systems and methods for classifying objects within a monitored zone using multiple surveillance devices
US20080018738A1 (en) Video analytics for retail business process monitoring
JP2005501351A (en) Vision-based method and apparatus for detecting fraud events in a retail environment
US12406503B2 (en) Method and apparatus for the detection of behaviours in a retail environment
JP2018093283A (en) Monitoring information collection system
JP5594879B2 (en) Image monitoring device
Patil et al. Suspicious movement detection and tracking based on color histogram
US20120169872A1 (en) Method and system for detecting duress
WO2021186149A1 (en) Security system
EP4459585A1 (en) Surveillance system using 3d-cameras
JP2003169320A (en) Monitoring method and system thereof
EP1405279A1 (en) Vision based method and apparatus for detecting an event requiring assistance or documentation
Flinchbaugh et al. Autonomous video surveillance
KR102760265B1 (en) Gate systeme having artificial intelligence
JP7054075B2 (en) Information processing system, information processing method, and program
Senior et al. Visual person searches for retail loss detection: Application and evaluation
Chang et al. Event detection and target tracking based on co-operative multi-camera system
Surya et al. A SYSTEM FOR OBJECT MOTION DETECTION

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20250304