
US20170046891A1 - Systems and methods for location identification and tracking using a camera - Google Patents

Systems and methods for location identification and tracking using a camera

Info

Publication number
US20170046891A1
Authority
US
United States
Prior art keywords
person
vehicle
enter
facility
portable camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/234,254
Inventor
Steve E. Trivelpiece
David Gathright
Steven J. Raynesford
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sensormatic Electronics LLC
Original Assignee
Tyco Fire and Security GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tyco Fire and Security GmbH
Priority to US15/234,254
Assigned to TYCO FIRE & SECURITY GMBH. Assignment of assignors interest (see document for details). Assignors: GATHRIGHT, DAVID; RAYNESFORD, STEVEN J.; TRIVELPIECE, STEVE E.
Publication of US20170046891A1
Assigned to Sensormatic Electronics, LLC. Assignment of assignors interest (see document for details). Assignors: TYCO FIRE & SECURITY GMBH
Priority to US17/388,640 (published as US11544984B2)
Legal status: Abandoned

Classifications

    • G07C9/00079
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/20 Individual registration on entry or exit involving the use of a pass
    • G07C9/22 Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G07C9/25 Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G07C9/253 Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition visually
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/00174 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/00174 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • G07C9/00309 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated with bidirectional data transmission between data carrier and locks
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/20 Individual registration on entry or exit involving the use of a pass
    • G07C9/28 Individual registration on entry or exit involving the use of a pass the pass enabling tracking or indicating presence
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/00174 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • G07C2009/00753 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated by active electrical keys
    • G07C2009/00769 Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated by active electrical keys with data transmission performed by wireless means
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C2209/00 Indexing scheme relating to groups G07C9/00 - G07C9/38
    • G07C2209/04 Access control involving a hierarchy in access rights

Definitions

  • the camera can be a three dimensional (“3D”) camera comprising a portable standard camera mounted on any mobile object, person or vehicle. The camera is operative to work in any environment and to provide facing direction.
  • the visual, locational fingerprints (i.e., images recorded by the camera that characterize known locations) can be previously stored, auto-learned, or result from post-processing of a route. Real-time processing could be added to provide immediate feedback for a self-contained system.
  • the camera based technique described herein does not require significant data storage, additional infrastructure, or external systems for operability.
  • images can be digitally rotated and distances can be determined between a person and an object of interest.
  • the camera also allows automated detection of milestones or achievement of waypoints for fulfillment of procedures.
  • the camera based technique(s) described herein overcome(s) certain limitations of conventional techniques by providing location tracking, environment monitoring, individual perspective, and automatic verification of route and procedures. Additionally, the camera based technique(s) described herein can be used to provide access control, security zone control, visual evidence of security and individual involvement, detection of unexpected or unauthorized items in an environment, and identity verification. Furthermore, the camera based technique(s) described herein provide(s) solutions that are self-contained, use standard equipment, and do not require existing infrastructure.
  • referring now to FIG. 1, there is provided an exemplary system 100 configured for location identification and tracking of persons, objects and/or vehicles using a portable camera 108.
  • the portable camera 108 is attached to a person 106 (or alternatively to an object or vehicle).
  • Portable cameras are well known in the art, and therefore will not be described herein.
  • the portable camera 108 can comprise any known or to be known camera.
  • the portable camera 108 is a 3D camera capable of generating 3D measurements of a space.
  • the portable camera 108 is an LG 360 CAM spherical camera having part number LGR105.AUSTATS available from LG Electronics Inc. of South Korea, a Theta S model camera available from Ricoh Company, Ltd. of Japan, or a GoPro action camera available from GoPro, Inc. of San Mateo, Calif.
  • the 3D measurements can be used to verify a person's location and/or position within a space.
  • the present solution is not limited to the particulars of these scenarios.
  • the portable device 108 can additionally or alternatively comprise a LIDAR device, a structured light system or other wide-field camera system.
  • the portable camera 108 may have additional circuitry added thereto as an accessory 120 .
  • the additional circuitry can include, but is not limited to, a GPS circuit, an inertial navigation circuit, a beacon reader circuit, a processing unit and/or a data store.
  • Each of the listed electronic circuits/devices is well known in the art, and therefore will not be described herein. Any known or to be known GPS circuit, inertial navigation circuit, beacon reader circuit, processing unit and/or data store can be used herein without limitation.
  • the accessory 120 can be coupled to the portable camera via any suitable coupling means, such as an adhesive or mechanical coupler (e.g., screws, clamps, Velcro, straps, etc.).
  • the portable camera 108 captures video of a surrounding environment as the person 106 travels to, from and/or through facilities 102 , 104 .
  • the capturing is achieved (in some scenarios) by converting analog video signals generated by the camera into a digital format.
  • the captured video may be processed and/or stored by the portable camera 108 .
  • the video is communicated from the portable camera 108 to a remote computing system 112 via a wireless communications link 114 for processing thereat.
  • the video data (which may be in an analog or digital form) is converted or transformed into a Radio Frequency (“RF”) form for transmission over the wireless communications link 114.
  • the computing system 112 comprises at least one computing device and/or at least one data store (e.g., internal memory (e.g., a RAM, ROM, etc.) and/or external memory (e.g., a database)).
  • An exemplary architecture for the computing system 112 is described below in relation to FIG. 3 .
  • the present solution is not limited to wireless communications capabilities.
  • the video may alternatively be communicated to the remote computing system 112 via a wired connection.
  • in some scenarios, a plurality of kiosks is disposed at a facility, and the video is downloaded from the camera when the camera is inserted into a kiosk.
  • the processing can involve: comparing video image content with pre-stored image content to identify locations internal or external to the facilities 102, 104 (e.g., via pattern recognition or detection of known targets (symbols, signs, markers, statues, trees, doorways, or other landmarks, objects or items) strategically placed internal/external to facilities 102, 104); using results of the comparison operations to determine at least one geographic location at which the image(s) was(were) captured and/or a path of travel for the person 106; and analyzing the identified geographic locations and/or determined path of travel to verify that the person 106 is traveling along a correct path, traveling towards a facility 102 for which the person has authorization to enter, traveling towards a zone 1-3 within a facility for which the person has authorization to enter, traveling towards a secured area 110 for which the person has authorization to enter, and/or is in or is about to enter a facility, secured area and/or zone for which the person has authorization to enter.
  • the correct path is selected from a plurality of pre-defined paths stored in a data store.
  • the correct path is dynamically generated or determined during operation of system 100 .
  • the correct path can be selected and/or dynamically generated/determined based on a current determined path of travel of the person, a predicted facility/secured area/zone to which the person is traveling, historical paths of travel associated with the person, a security/access level of the person, tasks assigned to the person, tasks previously performed by the person, tasks being performed by the person, and/or time of day/week/month/year.
  • a determination as to whether or not the person is traveling along the correct path is made based on results of a comparison of the correct path of travel to the path of travel determined for the person based on the video data captured by the camera 108 .
  • the video image content comparison can involve a percentage matching and/or a probability percentage indicating how confidently the processor has determined where the camera 108 is in 3D space.
  • the percentage and/or probability percentage are selected based on a particular application, and may be set in a system as threshold values. For example, a match is determined to exist if greater than fifty percent (50%) of a video image's content (within the entire image or a select portion of the image) is the same as a pre-stored image's content (within the entire image or a select portion of the image). In this example, the system uses a threshold value of fifty (50) to determine whether a match does indeed exist between the two (2) images: a match is determined to exist when the amount of similar image content exceeds fifty (50), and a match does not exist when the amount of similar image content is equal to or less than fifty (50).
  • the comparison operations involve comparing images of the video to pre-stored images to identify matches. For example, as a result of the comparison operations, a percentage of a first video image's content is determined to match a first pre-stored image's content.
  • Location information is stored in a data store so as to be associated with the first pre-stored image. The location information specifies a known location of a landmark shown in the first pre-stored image. This known location is obtained and used for the camera's location at the time the first video image was captured thereby.
  • the present solution is not limited to the particulars of this example. Other techniques for determining camera locations based on results of image processing can be used herein.
  • Such other techniques can involve identifying specific visual elements and/or encoded information embedded within an image or series of images. Additionally, multiple iterations of this process may be performed for matching image sets. Accordingly, a plurality of camera locations may be determined, which can be analyzed and/or mapped to define a path of travel.
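  • By way of illustration only, the following sketch shows one way the image-matching step described above could be realized, assuming OpenCV ORB features as the comparison technique. The reference_db layout, the fifty-percent ratio threshold, and the function name locate_frame are assumptions for this example; the patent does not prescribe a particular image processing library or matching metric.

```python
# Hedged sketch: match a video frame against pre-stored reference images and
# return the known location associated with the best match above a threshold.
import cv2

ORB = cv2.ORB_create(nfeatures=500)
MATCH_RATIO_THRESHOLD = 0.50  # the "greater than fifty percent" rule above

def locate_frame(frame_gray, reference_db):
    """reference_db: list of (descriptors, location) pairs, where location is
    the known position of the landmark shown in the pre-stored image."""
    keypoints, des = ORB.detectAndCompute(frame_gray, None)
    if des is None:
        return None  # featureless frame; no location can be identified
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    best_location, best_ratio = None, MATCH_RATIO_THRESHOLD
    for ref_des, location in reference_db:
        matches = matcher.match(des, ref_des)
        ratio = len(matches) / len(des)  # fraction of frame content matched
        if ratio > best_ratio:
            best_location, best_ratio = location, ratio
    return best_location  # None when no reference exceeds the threshold
```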
  • various measures can be taken for access control purposes. For example, if (a) the results indicate that the person 106 is traveling towards a particular facility 102, 104, zone 1, 2, 3, or secured area 110 b and (b) the person 106 has the appropriate access/clearance level to enter the same, then access control operations are performed for allowing the person to enter the same.
  • the person's access/clearance level is determined by accessing a data store using a unique identifier assigned to the person (e.g., the camera's unique identifier or other sequence of symbols).
  • the access control operations can involve commanding a door lock of an access point 118 to unlock (thereby eliminating the need for a Common Access Card (“CAC”)) or notifying a security guard that the person 106 is authorized to access the facility, zone or secured area.
  • the command and/or notification can be generated by and sent from the portable camera 108 and/or the computing system 112 to an electronic device 122 located at the access point 118 or in the security guard's possession. Any known or to be known wired or wireless communication technique can be used without limitation for the stated purpose.
  • a notification is provided to the person 106 (via camera 108, accessory 120 or other communication device (e.g., a mobile phone 124)) and/or to security personnel (e.g., via a computing device or communication device (e.g., a mobile phone)) indicating that the person 106 is attempting to access a facility, area and/or zone for which the person does not have the appropriate access/clearance level to enter.
  • security personnel can take measures to prevent the person from entering a facility, zone and/or secured area for which (s)he does not have permission to access.
  • the portable camera 108 has a unique identifier associated therewith.
  • the unique identifier may be used (as mentioned above) to facilitate the identification of the person for which the video is being captured.
  • his(her) access/clearance level can be retrieved from a data store local to the portable camera 108 or remote from the portable camera 108 .
  • the required access/clearance level for the facilities 102, 104, zones 1-3, and secured area 110 can also be stored in the data store local to the camera 108 or remote from the camera (such as in the computing system 112). In this case, the access/clearance level retrieved for the person is compared to the required access/clearance level for a given facility, zone or secured area.
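  • A minimal sketch of the clearance comparison just described is given below. The dictionary-based data store, identifiers and numeric levels are assumptions for illustration; the patent does not specify a storage schema.

```python
# Hedged sketch: resolve a camera's unique identifier to a person, then
# compare the person's access/clearance level against the level required
# for a given facility, zone or secured area.
CAMERA_TO_PERSON = {"CAM-108": "person-106"}   # camera unique ID -> person ID
PERSON_CLEARANCE = {"person-106": 3}           # person ID -> clearance level
REQUIRED_CLEARANCE = {"facility-102": 2, "zone-1": 3, "secured-area-110": 4}

def is_authorized(camera_id: str, area_id: str) -> bool:
    person = CAMERA_TO_PERSON.get(camera_id)
    if person is None:
        return False  # unknown camera: authentication fails outright
    required = REQUIRED_CLEARANCE.get(area_id, float("inf"))  # deny unknown areas
    return PERSON_CLEARANCE.get(person, 0) >= required

assert is_authorized("CAM-108", "facility-102")          # level 3 >= required 2
assert not is_authorized("CAM-108", "secured-area-110")  # level 3 < required 4
```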
  • the unique identifier also facilitates authentication of the person in possession of the camera, so as to prevent others from using the camera assigned to that person.
  • Techniques for authenticating a person's identity are well known in the art. Any known or to be known authentication technique can be employed herein without limitation.
  • the unique identifier can further be used to facilitate keyless access control into a facility, secured area and/or zone.
  • the camera's unique identifier is used at access or entry points (e.g., doors) 118 to allow authentication and access control.
  • the camera's unique identifier is used as the person's unique identifier for authentication purposes.
  • the camera's unique identifier is compared to a plurality of stored identifiers to determine if a match exists.
  • the camera's unique identifier is used as an electronic key for access control purposes such as unlocking a lock 132 .
  • the electronic key may be communicated from the camera 108 or other communication device 124 in the person's possession to the lock 132 via Radio Frequency Identification (“RFID”) technology, Bluetooth technology or other Short Range Communication (“SRC”) technology.
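  • The sketch below illustrates one way the camera's unique identifier could serve as such an electronic key. The message format, the shared secret, and the stubbed send_src transport are hypothetical; the patent only states that RFID, Bluetooth or another SRC technology carries the key.

```python
# Hedged sketch: wrap the camera's unique identifier in an authenticated
# unlock message for lock 132. An HMAC is one reasonable hardening choice so
# that a plain identifier cannot be trivially forged (replay protection, e.g.
# a nonce or timestamp, would still be needed in practice).
import hashlib
import hmac

LOCK_SHARED_SECRET = b"provisioned-per-lock-secret"  # assumed provisioning step

def build_unlock_message(camera_id: str) -> bytes:
    tag = hmac.new(LOCK_SHARED_SECRET, camera_id.encode(), hashlib.sha256).hexdigest()
    return f"UNLOCK:{camera_id}:{tag}".encode()

def send_src(payload: bytes) -> None:
    """Placeholder for the RFID/Bluetooth/SRC transmission to the lock."""
    raise NotImplementedError("transport is hardware-specific")
```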
  • GPS data, inertial navigation data, beacon data and/or other sensor data (e.g., accelerometer data, gyroscope data, etc.) is used in conjunction with the video data to detect the person's path of travel.
  • the listed types of data are well known in the art, and therefore are not described in detail herein. Techniques for generating and/or acquiring such data are also well known in the art. Any known or to be known technique for generating and/or acquiring such data can be used herein without limitation.
  • the GPS data, inertial navigation data, beacon data and/or other sensor data can be used to interpolate positions between known visual waypoints (e.g., to determine that the person turned left or right), i.e., to determine a position between two known fixed reference points in physical space (e.g., two landmarks).
  • the GPS data, inertial navigation data, beacon data and/or other sensor data may alternatively or additionally be used to verify that the results of a video data analysis are correct, i.e., that the person's position determined using the video data is correct. This verification can generally involve comparing location/position results of the video analysis with locations/positions specified by the GPS data, inertial navigation data, beacon data and/or other sensor data generated or acquired at the same time as the respective video image.
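  • The two auxiliary-data uses just described can be sketched as follows: linear interpolation between known visual waypoints, and a tolerance check between the video-derived position and a simultaneous GPS fix. The planar coordinates and the 5-metre tolerance are illustrative assumptions.

```python
# Hedged sketch: estimate position between two known fixed reference points
# and cross-check a video-derived position against auxiliary sensor data.
def interpolate(waypoint_a, t_a, waypoint_b, t_b, t):
    """Linear position estimate at time t, given that two known landmarks
    (waypoint_a, waypoint_b) were recognized at times t_a and t_b."""
    frac = (t - t_a) / (t_b - t_a)
    return tuple(a + frac * (b - a) for a, b in zip(waypoint_a, waypoint_b))

def positions_agree(video_xy, gps_xy, tolerance_m=5.0):
    """Treat the video-analysis result as verified when it falls within a
    small distance of the position acquired at the same time from GPS."""
    dx, dy = video_xy[0] - gps_xy[0], video_xy[1] - gps_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance_m

# e.g. halfway in time between landmarks at (0, 0) and (10, 0) -> (5.0, 0.0)
assert interpolate((0, 0), 0.0, (10, 0), 10.0, 5.0) == (5.0, 0.0)
```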
  • the beacon data can be obtained via wireless communications between the camera 108 and beacons 130 (which are strategically placed in system 100 at indoor and/or outdoor locations). Beacons are well known in the art, and therefore will not be described herein. Any known or to be known beacon can be employed herein without limitation.
  • the beacons include iBeacons® available from Apple Inc. of Cupertino, Calif.
  • facial recognition can be used at access points (e.g., doors) 118 to verify that the person possessing the camera 108 is the correct person.
  • the facial recognition is achieved using mirrors 116 placed at the access points 118 so that the camera 108 can take pictures of the person's reflection shown in the mirrors.
  • Facial recognition techniques are well known in the art, and therefore will not be described herein. Any known or to be known facial recognition technique can be used herein without limitation.
  • the facial recognition technique can be implemented by camera 108 and/or computing system 112 .
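  • As a concrete illustration of the mirror-based identity check, the sketch below uses the open-source face_recognition package as a stand-in for "any known or to be known facial recognition technique"; the enrolled-image path and function name are assumptions.

```python
# Hedged sketch: compare the face enrolled for the camera's assigned person
# against the reflection the camera captured in mirror 116 at access point 118.
import face_recognition

def person_matches(enrolled_image_path: str, mirror_frame) -> bool:
    enrolled = face_recognition.load_image_file(enrolled_image_path)
    enrolled_encodings = face_recognition.face_encodings(enrolled)
    probe_encodings = face_recognition.face_encodings(mirror_frame)
    if not enrolled_encodings or not probe_encodings:
        return False  # no detectable face in one of the images
    return bool(face_recognition.compare_faces(
        [enrolled_encodings[0]], probe_encodings[0])[0])
```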
  • the present solution may be used in scenarios in which the person's activities are to be kept secret or confidential. Accordingly, cryptography may be employed to protect information communicated within system 100 .
  • information communicated between the camera 108 and the computing system 112 may be encrypted in accordance with a chaotic, random or pseudo-random number sequence generation algorithm.
  • Such cryptographic algorithms are well known in the art, and will not be described herein. Any known or to be known cryptographic algorithm can be used herein without limitation.
  • the seed value of the algorithm can include, but is not limited to, the camera's unique identifier, the person's unique identifier, a unique identifier of a given location (e.g., a facility, secured area or zone the person was in, is currently in, or is traveling towards), a unique identifier of a correct path, and/or a unique identifier of a task (or mission) that was, is being or is to be performed by the person.
  • all or a portion of the data processing that is performed by system 100 can be done using plaintext data and/or encrypted data depending on a particular application.
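  • One concrete (and deliberately simple) reading of the seeded-cryptography idea above is sketched below, using the cryptography package's Fernet construction with a key derived by hashing concatenated identifiers. This is an illustrative stand-in, not the patent's algorithm; in practice the seed would be combined with secret material, since identifiers alone are guessable.

```python
# Hedged sketch: derive a symmetric key from the seed values listed above and
# use it to protect traffic between camera 108 and computing system 112.
import base64
import hashlib
from cryptography.fernet import Fernet

def session_cipher(camera_id: str, person_id: str, path_id: str) -> Fernet:
    seed = f"{camera_id}|{person_id}|{path_id}".encode()
    key = base64.urlsafe_b64encode(hashlib.sha256(seed).digest())  # 32-byte key
    return Fernet(key)

cipher = session_cipher("CAM-108", "person-106", "path-A")
token = cipher.encrypt(b"location report: zone 2")
assert cipher.decrypt(token) == b"location report: zone 2"
```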
  • Method 200 begins with step 202 and continues with step 204 where a portable camera (e.g., portable camera 108 of FIG. 1 ) is disposed on a person (e.g., person 106 of FIG. 1 ), object or vehicle.
  • a unique identifier of the portable camera is used in step 206 to facilitate authentication of the person, object or vehicle on which the camera is disposed.
  • the unique identifier can include, but is not limited to, a sequence of numbers, a sequence of letters, a sequence of letters and numbers, and/or a sequence of other symbols.
  • the unique identifier can be compared to a plurality of unique identifiers stored in a data store for authentication purposes. For example, the person's authentication is at least partially obtained if the unique identifier acquired from the camera matches a unique identifier stored in the data store.
  • information other than or in addition to the camera's unique identifier is used for authentication purposes in step 206 .
  • This information can include, but is not limited to, a user name and/or password associated with the person.
  • the portable camera performs operations to record video of a surrounding environment.
  • the video comprises a plurality of video images.
  • the video images are compared to pre-stored images in step 210 to (a) detect a path of travel of the person, object or vehicle through the surrounding environment and/or (b) identify at least one geographic location at which the image(s) was(were) captured.
  • Image processing is well known in the art. Any known or to be known image processing technique can be used herein without limitation.
  • the image processing generally involves: comparing a captured image with pre-stored images to identify which one of the pre-stored images contains some or all of the first captured image's content; and obtaining location information stored in a data store so as to be associated with the identified pre-stored image. These image processing steps are iteratively or simultaneously performed for all or some of the captured images. Thereafter, the location information may be used to define a path of travel for the person, object or vehicle. Once the path of travel is defined, the path may be plotted on a multi-dimensional map. The multi-dimensional map may be displayed to relevant persons (e.g., security guards). The path of travel may also be stored for later use as historical travel information for the person, object or vehicle.
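  • A minimal sketch of assembling those per-image locations into a path and plotting it on a two-dimensional map follows. It reuses the hypothetical locate_frame routine sketched earlier, and matplotlib is an assumed visualization choice.

```python
# Hedged sketch: turn per-frame location lookups into an ordered path of
# travel and plot it for display to relevant persons (e.g., security guards).
import matplotlib.pyplot as plt

def build_path(frames, reference_db):
    """Ordered (x, y) positions for frames that matched a pre-stored image;
    frames with no sufficiently similar reference are skipped."""
    path = []
    for frame in frames:
        location = locate_frame(frame, reference_db)
        if location is not None:
            path.append(location[:2])
    return path

def plot_path(path):
    xs, ys = zip(*path)
    plt.plot(xs, ys, marker="o")  # travelled route, one point per matched frame
    plt.title("Detected path of travel")
    plt.show()
```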
  • the path of travel and/or identified geographic location(s) is(are) analyzed in step 212 to verify that the person, object or vehicle is (1) traveling along a correct path, (2) traveling towards a facility for which the person, object or vehicle has authorization to enter, and/or (3) traveling towards a zone or secured area internal or external to the facility for which the person, object or vehicle has authorization to enter, and/or (4) is located within or in proximity to a facility, secured area or zone for which the person, object or vehicle has authorization to enter.
  • the path of travel and/or identified geographic location(s) may additionally or alternatively be analyzed to determine: if the correct path is being traveled or was followed; and/or if the correct path is being traveled or was followed in a prescribed timely manner.
  • step 212 can involve: using a unique identifier of the portable camera to determine an access or clearance level of the person, object or vehicle; and/or using at least one of GPS data, inertial navigation data, beacon data and sensor data in conjunction with the video to verify that the identified geographic location(s) and/or path of travel is(are) correct.
  • items (1), (2) and/or (3) is(are) verified by (A) comparing the identified geographic location(s) to pre-defined geographic location(s) stored in a data store and/or (B) comparing the detected path of travel to at least one pre-defined path of travel stored in the data store.
  • a destination of the person/vehicle/object is predicted based on the identified geographic location(s), detected path of travel, the direction of travel, and/or historical travel information associated with the person/vehicle/object. The predicted destination is then compared to a plurality of destinations stored in a data store.
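  • The path check in step 212 could be realized as below: every detected position must lie near some point of the pre-defined path retrieved from the data store. The 3-metre tolerance and planar coordinates are illustrative assumptions.

```python
# Hedged sketch: verify a detected path of travel against a pre-defined path.
def follows_path(detected, expected, tolerance_m=3.0):
    """True when every detected (x, y) point is within tolerance_m of the
    nearest point on the expected path."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    return all(min(dist(p, q) for q in expected) <= tolerance_m for p in detected)

# e.g. a detour more than 3 m from the expected corridor fails the check
assert follows_path([(0, 0), (5, 0.5)], [(0, 0), (5, 0), (10, 0)])
assert not follows_path([(0, 0), (5, 6)], [(0, 0), (5, 0), (10, 0)])
```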
  • GPS data specifies a position of the camera at a specified time as determined by a global navigation satellite.
  • Inertial navigation data specifies the position, orientation and velocity (direction and speed of movement) of the moving camera which were determined using a dead reckoning technique.
  • the inertial navigation data is obtained using a processor, motion sensors (e.g., accelerometers) and rotation sensors (e.g., gyroscopes).
  • Beacon data includes, but is not limited to, unique beacon identifiers which can be used to obtain known locations of the beacons (e.g., iBeacons®).
  • the sensor data can include, but is not limited to, beam break sensor data indicating detected motion of the person to which the camera is attached.
  • iBeacons® and beam break sensors are well known in the art, and therefore will not be described herein. Any known or to be known iBeacon® and/or beam break sensor can be used herein without limitation.
  • a door lock may be commanded to unlock so that the person, object or vehicle can enter the facility (e.g., facility 102 or 104 of FIG. 1 ) or secured area (e.g., secured area 110 of FIG. 1 ).
  • the unique identifier of the portable camera can be used to generate the door lock command so as to facilitate keyless access control into the facility or secured area.
  • the portable camera may be used to capture an image of a mirror reflection of the person, object or vehicle to facilitate access control.
  • in step 218, security personnel are notified that the person, object or vehicle is authorized to enter or remain in the facility, zone or secured area.
  • notification can be electronically provided by wirelessly communicating a notification message to a communication device in the possession of the security personnel.
  • the communication device can include, but is not limited to, a personal computer, a smart phone, a portable computer, a personal digital assistant, and/or a smart watch.
  • an auditory, visual and/or tactile alarm may be output from the communication device of the security personnel and/or another computing device (e.g., a security system in proximity to the security personnel) in addition to and/or in response to the notification message's reception.
  • step 224 is performed where method 200 ends or other processing is performed.
  • steps 220 - 222 are performed. These steps involve: taking measures to prevent the person, object or vehicle from entering or remaining in the facility, zone or secured area; and optionally providing notification to security personnel.
  • the measures comprise locking a door lock.
  • the unique identifier of the portable camera may be used as an electronic key for locking the door lock.
  • the notification can be electronically provided by wirelessly communicating a notification message to a communication device in the possession of the security personnel.
  • the communication device can include, but is not limited to, a personal computer, a smart phone, a portable computer, a personal digital assistant, and/or a smart watch.
  • an auditory, visual and/or tactile alarm may be output from the communication device of the security personnel and/or another computing device (e.g., a security system in proximity to the security personnel) in addition to and/or in response to the notification message's reception.
  • step 224 is performed where method 200 ends or other processing is performed.
  • the systems and methods described herein have many novel features.
  • the systems use any standard camera mounted on a person, vehicle or object to facilitate locating and/or tracking movement thereof; continually compare active images to a stored map to determine where in the map the moving object is located; perform exception handling on any image that does not fit into its stored mapping; use known visual or Infrared (“IR”) markers to determine an immediate location of the person, vehicle or object (note: this could be used to resync position or to use limited capability hardware which might be required by cheap systems); integrate location into a coordinate system using changes in image to indicate relative distance; store an image with the map for future reference or comparison; augment position or motion data with additional sensors; use a 3D data collection to provide additional detailed mapping information; and define security zones within the mapped area to provide conditional access or alarming.
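  • The "changes in image to indicate relative distance" feature above follows from the pinhole-camera relation that apparent size is inversely proportional to distance; a worked sketch, assuming a one-time baseline calibration of a known landmark, is shown below.

```python
# Hedged sketch: estimate relative distance to a landmark from the change in
# its apparent pixel width between a calibrated view and the current frame.
def estimate_distance(known_distance_m, known_pixel_width, observed_pixel_width):
    """Pinhole model: width_px ~ 1 / distance, so distance scales with the
    inverse ratio of apparent widths."""
    return known_distance_m * (known_pixel_width / observed_pixel_width)

# e.g. a doorway spanning 200 px at 5 m that now spans 400 px is ~2.5 m away
assert estimate_distance(5.0, 200, 400) == 2.5
```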
  • referring now to FIG. 3, there is shown a hardware block diagram comprising an exemplary computer system 300.
  • the machine can include a set of instructions which are used to cause the computer system to perform any one or more of the methodologies discussed herein.
  • the machine can function as a server or a router.
  • the exemplary computer system 300 can correspond to the computing system 112 of FIG. 1 and/or the computing elements of camera 108 of FIG. 1 .
  • system 300 would include imaging components in addition to the computing elements shown in FIG. 3 .
  • the imaging components can include, but are not limited to, an image capturing device. Image capturing devices are well known in the art, and therefore will not be described herein. Any known or to be known image capturing device can be used herein without limitation.
  • the computer system 300 can operate independently as a standalone device. However, the present solution is not limited in this regard and in other scenarios the computer system can be operatively connected (networked) to other machines in a distributed environment to facilitate certain operations described herein. Accordingly, while only a single machine is illustrated in FIG. 3 , it should be understood that the present solution can be taken to involve any collection of machines that individually or jointly execute one or more sets of instructions as described herein.
  • the computer system 300 is comprised of a processor 302 (e.g., a Central Processing Unit (“CPU”)), a main memory 304 , a static memory 306 , a drive unit 308 for mass data storage and comprised of machine readable media 320 , input/output devices 310 , a display unit 312 (e.g., a Liquid Crystal Display (“LCD”), a solid state display, or a Cathode Ray Tube (“CRT”)), and a network interface device 314 . Communications among these various components can be facilitated by means of a data bus 318 .
  • One or more sets of instructions 324 can be stored completely or partially in one or more of the main memory 304 , static memory 306 , and drive unit 308 .
  • the instructions can also reside within the processor 302 during execution thereof by the computer system.
  • the input/output devices 310 can include a keyboard, a keypad, a mouse, buttons, a multi-touch surface (e.g., a touchscreen), a speaker, a microphone, an imaging capturing device, and so on.
  • the network interface device 314 can be comprised of hardware components and software or firmware to facilitate wired or wireless network data communications in accordance with a network communication protocol utilized by a data network (e.g., a Local Area Network (“LAN”) and/or a Wide Area Network (“WAN”)).
  • the drive unit 308 can comprise a machine readable medium 320 on which is stored one or more sets of instructions 324 (e.g., software) which are used to facilitate one or more of the methodologies and functions described herein.
  • the term “machine-readable medium” shall be understood to include any tangible medium that is capable of storing instructions or data structures which facilitate any one or more of the methodologies of the present disclosure.
  • Exemplary machine-readable media can include magnetic media, solid-state memories, optical media and so on. More particularly, tangible media as described herein can include: magnetic disks; magneto-optical disks; CD-ROM disks and DVD-ROM disks; semiconductor memory devices; Electrically Erasable Programmable Read-Only Memory (“EEPROM”); and flash memory devices.
  • a tangible medium as described herein is one that is non-transitory insofar as it does not involve a propagating signal.
  • Computer system 300 should be understood to be one possible example of a computer system which can be used in connection with the various implementations. However, the present solution is not limited in this regard and any other suitable computer system architecture can also be used without limitation.
  • Dedicated hardware implementations including, but not limited to, application-specific integrated circuits, programmable logic arrays, and other hardware devices can likewise be constructed to implement the methods described herein.
  • Applications that can include the apparatus and systems of various implementations broadly include a variety of electronic and computer systems. Some implementations may implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit.
  • the exemplary system is applicable to software, firmware, and hardware implementations.
  • the present solution can take the form of a computer program product on a tangible computer-usable storage medium (for example, a hard disk or a CD-ROM).
  • the computer-usable storage medium can have computer-usable program code embodied in the medium.
  • the term computer program product, as used herein, refers to a device comprised of all the features enabling the implementation of the methods described herein.
  • Computer program, software application, computer software routine, and/or other variants of these terms mean any expression, in any language, code, or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code, or notation; or b) reproduction in a different material form.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)
  • Alarm Systems (AREA)

Abstract

Systems and methods for location identification and tracking of a person, object and/or vehicle. The methods involve: obtaining, by a computing system, a video of a surrounding environment which was captured by a portable camera coupled to the person, object or vehicle; comparing, by the computing system, first images of the video to pre-stored second images to identify geographic locations where the first images were captured by the portable camera; analyzing, by the computing system, the identified geographic locations to verify that the person, object or vehicle is (1) traveling along a correct path, (2) traveling towards a facility for which the person, object or vehicle has authorization to enter, or (3) traveling towards a zone or secured area internal or external to the facility for which the person, object or vehicle has authorization to enter; and transmitting a notification from the computing system indicating the results of the analyzing.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefit of U.S. Provisional Patent Application No. 62/204,138 filed on Aug. 12, 2015. The contents of this Provisional Patent Application are hereby incorporated by reference.
  • FIELD OF THE INVENTION
  • This document relates generally to systems and methods for location identification and tracking. More particularly, this document relates to systems and methods for location identification and tracking using a camera.
  • BACKGROUND
  • There are numerous systems used for location tracking. These systems include Global Positioning Systems (“GPSs”), Radio Frequency (“RF”) based indoor positioning systems, fixed camera systems, and mobile camera systems. The GPSs require an external system with additional requirements that cannot always be met (e.g., visibility of satellites). GPSs do not work indoors, do not provide direction, and are limited to outside locations under specific circumstances. The RF based indoor positioning systems require installed infrastructure with specific vendor technology. The fixed camera systems require an extensive system of installed infrastructure with specific vendor technology. The fixed camera systems have limited ability to indicate an individual's facing direction and path. The mobile camera systems simply record the camera's perspective and do not provide locational information.
  • The tracking of objects through a space is a common need in areas (such as security, traffic management and advertising) for the purposes of path verification, path optimization, security monitoring, etc. Object tracking has also traditionally been solved using static video cameras or attached video cameras.
  • Human security personnel routinely monitor facilities by walking, driving or otherwise monitoring a patrol route. However, verification of the performance of those duties requires additional management personnel, easily circumvented auxiliary verification measures, or complex techniques to adequately certify the execution of security monitoring duties.
  • Even in the instances where the path can be verified, the procedure and environment cannot. The individual's perspective is lost using current methods, which fail to provide details about what was done, where it was done, and how it was done.
  • Other techniques (such as GPS) require external systems with additional system requirements that cannot always be met (e.g., visibility of satellites). Although such technologies may record a path, they cannot provide the perspective of the individual. Recorded video surveillance at a location and recorded perspective video from a body-mounted camera require personnel to view and potentially detect irregularities in procedure or environment, but cannot automatically demonstrate that correct procedures were followed nor immediately detect variations in procedure or environment.
  • SUMMARY
  • The present disclosure concerns implementing systems and methods for location identification and tracking of a person, object and/or vehicle. The methods comprise: obtaining, by a computing system, a video of a surrounding environment which was captured by a portable camera disposed on, coupled to and/or attached to the person, object or vehicle; comparing, by the computing system, first images of the video to pre-stored second images to identify geographic locations where the first images were captured by the portable camera; analyzing, by the computing system, the identified geographic locations to verify that the person, object or vehicle is (1) traveling along a correct path, (2) traveling towards a facility for which the person, object or vehicle has authorization to enter, (3) traveling towards a zone or secured area internal or external to the facility for which the person, object or vehicle has authorization to enter, or (4) located within or in proximity to the facility, zone or secured area for which the person, object or vehicle has authorization to enter; and transmitting a notification from the computing system indicating the results of the analyzing. The computing system may comprise at least one of the portable camera and a computing device remote from the portable camera. The computing system may also analyze the determined geographic location to additionally or alternatively determine: if the correct path is being traveled or was followed; and/or if the correct path is being traveled or was followed in a prescribed timely manner.
  • In some scenarios, the person, object or vehicle is allowed to enter the facility, zone or secured area when it is verified that the person, object or vehicle has authorization to enter the same. For example, a lock may be commanded to unlock so as to allow entry into the facility, zone or secured area. A unique identifier for the portable camera may be used as an electronic key for unlocking the lock. Security personnel may also be notified that the person, object or vehicle is authorized to enter the facility, zone or secured area.
  • Security personnel may additionally or alternatively be notified when it is not verified that the person, object or vehicle has authorization to enter the facility, zone or secured area. Measures may be taken to prevent the person, object or vehicle from entering the facility, zone or secured area when it is not verified that the person, object or vehicle has authorization to enter the same. The measures can comprise locking a lock. The unique identifier of the portable camera may be used as an electronic key for locking the lock. The unique identifier of the portable camera may also be used to: determine an access or clearance level of the person, object or vehicle; facilitate authentication of the person, object or vehicle on which the portable camera is disposed; and/or facilitate keyless access control into or out of the facility or secured area.
  • In those or other scenarios, the identified geographic locations are used to generate a path of travel of the person, object or vehicle through the surrounding environment. Global Positioning System (“GPS”) data, inertial navigation data, beacon data and/or sensor data may be used in conjunction with the video to identify the geographic locations, to verify that the identified geographic locations are correct, to detect a path of travel, or to verify that the path of travel is correct. Additionally or alternatively, the portable camera may be used to capture an image of a mirror reflection of the person, object or vehicle to facilitate access control.
  • DESCRIPTION OF THE DRAWINGS
  • The present solution will be described with reference to the following drawing figures, in which like numerals represent like items throughout the figures.
  • FIG. 1 is a schematic illustration of an exemplary system.
  • FIG. 2 is a flow diagram of an exemplary method for locating and/or tracking a person, object or vehicle.
  • FIG. 3 provides an illustration of an exemplary computing device that can be used to implement the present solution.
  • DETAILED DESCRIPTION
  • It will be readily understood that the components of the present solution as generally described herein and illustrated in the appended figures could be arranged and designed in a wide variety of different configurations. Thus, the following more detailed description of the present solution, as represented in the figures, is not intended to limit the scope of the present disclosure, but is merely representative of various implementations of the present solution. While the various aspects of the present solution are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
  • The present solution may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the present solution is, therefore, indicated by the appended claims rather than by this detailed description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
  • Reference throughout this specification to features, advantages, or similar language does not imply that all of the features and advantages that may be realized with the present solution should be or are in any single embodiment of the present solution. Rather, language referring to the features and advantages is understood to mean that a specific feature, advantage, or characteristic described in connection with an embodiment is included in at least one embodiment of the present solution. Thus, discussions of the features and advantages, and similar language, throughout the specification may, but do not necessarily, refer to the same embodiment.
  • Furthermore, the described features, advantages and characteristics of the present solution may be combined in any suitable manner in one or more embodiments. One skilled in the relevant art will recognize, in light of the description herein, that the present solution can be practiced without one or more of the specific features or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments of the present solution.
  • Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the indicated embodiment is included in at least one embodiment of the present solution. Thus, the phrases “in one embodiment”, “in an embodiment”, and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
  • As used in this document, the singular form “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise. Unless defined otherwise, all technical and scientific terms used herein have the same meanings as commonly understood by one of ordinary skill in the art. As used in this document, the term “comprising” means “including, but not limited to”.
  • The present disclosure concerns implementing systems and methods for location identification and tracking using a camera and/or other environment sensing/recording devices (e.g., Light Detection And Ranging (“LIDAR”) devices, plenoptic cameras, and/or structured light sensors). Each of the listed devices is well known in the art, and therefore will not be described in detail herein. Any known or to be known camera and/or environment sensing/recording device can be used herein without limitation. The systems and methods can be employed in various applications. For example, the systems and methods can be used for (1) access control into buildings/areas based on a person's security clearance level, (2) determining whether an employee performed his(her) required tasks (e.g., completed an assignment, went to the correct place(s), followed correct procedures, traveled the correct path through a facility, etc.), and/or (3) determining where a person is currently located within a facility. Data captured by the camera and/or environment sensing/recording device (e.g., video and/or images captured by the camera) can be used for verification, legal issue resolution, and/or future security review to determine who was present within the building/area at any given time.
  • Notably, the present solution is discussed herein primarily in relation to camera and/or video capture scenarios for ease of explanation. The present solution is not limited in this regard as evident from the immediately preceding two (2) paragraphs.
  • As known in the art, video capture is the process of converting an analog video signal (such as that produced by a camera) to digital video. The resulting digital data are computer files referred to herein as a digital video stream, a video stream, a captured video, and/or a video. The video may be stored in a compressed or uncompressed format. Methods for data compression are well known in the art. Any known or to be known data compression technique can be used herein without limitation.
  • The methods generally involve: recording images by the camera to create visual, locational fingerprints; identifying subsequent visits to known locations using images captured by the camera and a plurality of known location images which were pre-stored in a data store; analyzing differences from expected images; verifying paths using the visual, locational fingerprints and the known location images; identifying waypoints using the visual, locational fingerprints; logging position with a coordinate system; providing location and direction; optionally augmenting the position with the addition of GPS data, inertial navigation data, beacon information, or other sensor data (e.g., beam break sensor data); optionally augmenting tracking using sensor data; and/or providing additional spatial information using a three dimensional (“3D”) camera.
  • The 3D camera can comprise a portable standard camera mounted on any mobile object, person or vehicle. The camera is operative to work in any environment and to provide facing direction. The visual, locational fingerprints can be previously stored, auto-learned or result from post processing of a route. Real-time processing could be added to provide immediate feedback for a self-contained system.
  • Notably, the camera based technique described herein does not require significant data storage, additional infrastructure for operability, or external systems for operability. By adding a 3D camera, images can be digitally rotated and distances can be determined between a person and an object of interest. The camera also allows automated detection of milestones or achievement of waypoints for fulfillment of procedures.
  • The camera based technique(s) described herein overcome(s) certain limitations of conventional techniques by providing location tracking, environment monitoring, individual perspective, and automatic verification of route and procedures. Additionally, the camera based technique(s) described herein can be used to provide access control, security zone control, visual evidence of security and individual involvement, detection of unexpected or unauthorized items in an environment, and identity verification. Furthermore, the camera based technique(s) described herein provide(s) solutions that are self-contained, use standard equipment, and do not require existing infrastructure.
  • Referring now to FIG. 1, there is provided an exemplary system 100 configured for location identification and tracking of persons, objects and/or vehicles using a portable camera 108. The portable camera 108 is attached to a person 106 (or alternatively to an object or vehicle). Portable cameras are well known in the art, and therefore will not be described herein. The portable camera 108 can comprise any known or to be known camera.
  • In some scenarios, the portable camera 108 is a 3D camera capable of generating 3D measurements of a space. For example, the portable camera 108 is an LG 360 CAM spherical camera having part number LGR105.AUSTATS available from LG Electronics Inc. of South Korea, a Theta S model camera available from Ricoh Company, Ltd. of Japan, or a GoPro action camera available from GoPro, Inc. of San Mateo, Calif. The 3D measurements can be used to verify a person's location and/or position within a space. The present solution is not limited to the particulars of these scenarios. As noted above, the portable device 108 can additionally or alternatively comprise a LIDAR device, a structured light system or other wide-field camera system.
  • In those or other scenarios, the portable camera 108 may have additional circuitry added thereto as an accessory 120. The additional circuitry can include, but is not limited to, a GPS circuit, an inertial navigation circuit, a beacon reader circuit, a processing unit and/or a data store. Each of the listed electronic circuits/devices is well known in the art, and therefore will not be described herein. Any known or to be known GPS circuit, inertial navigation circuit, beacon reader circuit, processing unit and/or data store can be used herein without limitation. The accessory 120 can be coupled to the portable camera via any suitable coupling means, such as an adhesive or mechanical coupler (e.g., screws, clamps, Velcro, straps, etc.).
  • During operation, the portable camera 108 captures video of a surrounding environment as the person 106 travels to, from and/or through facilities 102, 104. As should be understood, the capturing is achieved (in some scenarios) by converting analog video signals generated by the camera into a digital format. The captured video may be processed and/or stored by the portable camera 108. Additionally or alternatively, the video is communicated from the portable camera 108 to a remote computing system 112 via a wireless communications link 114 for processing thereat. As should be understood, the video data (which may be in an analog or digital form) is converted or transformed into a Radio Frequency (“RF”) form for transmission over the wireless communications link 114. The computing system 112 comprises at least one computing device and/or at least one data store (e.g., internal memory (e.g., a RAM, ROM, etc.) and/or external memory (e.g., a database)). An exemplary architecture for the computing system 112 is described below in relation to FIG. 3.
  • The present solution is not limited to wireless communications capabilities. The video may alternatively be communicated to the remote computing system 112 via a wired connection. For example, a plurality of kiosks may be disposed at a facility, and the camera downloads the video when it is inserted into a kiosk.
  • In all cases, the processing can involve: comparing video image content with pre-stored image content to identify locations internal to or external to the facilities 102, 104 (e.g., via pattern recognition or detection of known targets (symbols, signs, markers, statues, trees, doorways, or other landmarks, objects or items) strategically placed internal/external to facilities 102, 104); using results of the comparison operations to determine at least one geographic location at which the image(s) was(were) captured and/or a path of travel for the person 106; and analyzing the identified geographic locations and/or determined path of travel to verify that the person 106 is traveling along a correct path, traveling towards a facility 102 for which the person has authorization to enter, traveling towards a zone 1-3 within a facility for which the person has authorization to enter, traveling towards a secured area 110 for which the person has authorization to enter, and/or is in or is about to enter a facility, secured area and/or zone for which the person has authorization to enter.
  • In some scenarios, the correct path is selected from a plurality of pre-defined paths stored in a data store. Alternatively or additionally, the correct path is dynamically generated or determined during operation of system 100. The correct path can be selected and/or dynamically generated/determined based on a current determined path of travel of the person, a predicted facility/secured area/zone to which the person is traveling, historical paths of travel associated with the person, a security/access level of the person, tasks assigned to the person, tasks previously performed by the person, tasks being performed by the person, and/or time of day/week/month/year. A determination as to whether or not the person is traveling along the correct path is made based on results of a comparison of the correct path of travel to the path of travel determined for the person based on the video data captured by the camera 108.
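  • By way of illustration only, the path-correctness check just described can be reduced to a subsequence test when both the correct path and the determined path are expressed as ordered lists of waypoint identifiers. The sketch below assumes that representation; it is one plausible realization, not the patented method itself.

```python
# Minimal sketch: the person follows the correct path if every waypoint of
# the correct path appears, in order, within the path determined from the
# camera's video. Waypoint identifiers are assumed placeholders.
def follows_correct_path(determined: list, correct: list) -> bool:
    it = iter(determined)
    return all(waypoint in it for waypoint in correct)

# e.g., follows_correct_path(["A", "X", "B", "C"], ["A", "B", "C"]) -> True
```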
  • Notably, the video image content comparison can involve a percentage match and/or a probability percentage reflecting how confidently the processor knows where the camera 108 is in 3D space. The percentage and/or probability percentage are selected based on a particular application, and may be set in the system as threshold values. For example, a match is determined to exist if greater than fifty percent (50%) of a video image's content (within the entire image or a select portion thereof) is the same as a pre-stored image's content (within the entire image or a select portion thereof). In this example, once the amount of similar image content is determined, it is compared to the threshold value of fifty percent (50%): a match exists when the amount of similar image content exceeds the threshold, and no match exists when the amount of similar image content is equal to or less than the threshold.
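  • A minimal sketch of this percentage-match test follows. A pixel-level agreement count over same-sized grayscale arrays stands in for whatever similarity metric a deployment actually uses, and the 50% threshold and intensity tolerance are the example figures, not mandated values.

```python
import numpy as np

MATCH_THRESHOLD = 50.0  # percent, per the example above; application-dependent

def percent_match(frame: np.ndarray, reference: np.ndarray, tol: int = 10) -> float:
    """Percentage of pixel intensities that agree within tol."""
    agree = np.abs(frame.astype(int) - reference.astype(int)) <= tol
    return 100.0 * np.count_nonzero(agree) / agree.size

def is_match(frame: np.ndarray, reference: np.ndarray) -> bool:
    # A match exists only when similar content strictly exceeds the threshold.
    return percent_match(frame, reference) > MATCH_THRESHOLD
```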
  • In some scenarios, the comparison operations involve comparing images of the video to pre-stored images to identify matches. For example, as a result of the comparison operations, a percentage of a first video image's content is determined to match a first pre-stored image's content. Location information is stored in a data store so as to be associated with the first pre-stored image. The location information specifies a known location of a landmark shown in the first pre-stored image. This known location is obtained and used as the camera's location at the time the first video image was captured. The present solution is not limited to the particulars of this example. Other techniques for determining camera locations based on results of image processing can be used herein. Such other techniques can involve identifying specific visual elements and/or encoded information embedded within an image or series of images. Additionally, multiple iterations of this process may be performed over matching image sets. Accordingly, a plurality of camera locations may be determined which can be analyzed and/or mapped to define a path of travel.
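  • Continuing the previous sketch, the landmark lookup can be outlined as below; percent_match() and MATCH_THRESHOLD come from the sketch above, and the image names and coordinates are illustrative placeholders rather than values from the disclosure.

```python
from typing import Optional, Tuple

# Data store associating pre-stored landmark images with known locations.
LANDMARK_LOCATIONS = {
    "lobby_statue.png": (38.8895, -77.0353),
    "zone2_doorway.png": (38.8897, -77.0350),
}

def locate_frame(frame, stored_images: dict) -> Optional[Tuple[float, float]]:
    """Known location of the best-matching pre-stored image, if any match."""
    best_name, best_score = None, MATCH_THRESHOLD
    for name, image in stored_images.items():
        score = percent_match(frame, image)
        if score > best_score:
            best_name, best_score = name, score
    return LANDMARK_LOCATIONS.get(best_name)

def path_of_travel(frames, stored_images) -> list:
    # One camera location per matched frame; mapping these defines the path.
    return [loc for f in frames
            if (loc := locate_frame(f, stored_images)) is not None]
```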
  • Based on results of the analysis, various measures can be taken for access control purposes. For example, if (a) the results indicate that the person 106 is traveling towards a particular facility 102, 104, zone 1, 2, 3, or secured area 110b and (b) the person 106 has the appropriate access/clearance level to enter the same, then access control operations are performed for allowing the person to enter the same. In some scenarios, the person's access/clearance level is determined by accessing a data store. A unique identifier assigned to the person (e.g., the camera's unique identifier or other sequence of symbols) is stored in the data store so as to be associated with a list of locations for which the person has the appropriate clearance/access level to enter. The access control operations can involve commanding a door lock of an access point 118 to unlock (thereby eliminating the need for a Common Access Card (“CAC”)) or notifying a security guard that the person 106 is authorized to access the facility, zone or secured area. The command and/or notification can be generated by and sent from the portable camera 108 and/or the computing system 112 to an electronic device 122 located at the access point 118 or in the security guard's possession. Any known or to be known wired or wireless communication technique can be used without limitation for the stated purpose.
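  • The access-control decision itself can be sketched as follows; the clearance table, identifiers, and the stub actions for commanding the lock at access point 118 and notifying a guard are all assumptions made for illustration.

```python
# Hypothetical data store keyed by the camera's unique identifier,
# mapping to the locations the person is cleared to enter.
CLEARANCES = {
    "CAM-0042": {"facility-102", "zone-1", "secured-area-110"},
}

def unlock_door(destination: str) -> None:
    print(f"unlock command sent to access point for {destination}")

def notify_guard(camera_id: str, destination: str) -> None:
    print(f"alert: {camera_id} is not cleared for {destination}")

def handle_approach(camera_id: str, destination: str) -> None:
    # The camera's unique identifier keys the person's permitted locations.
    if destination in CLEARANCES.get(camera_id, set()):
        unlock_door(destination)
    else:
        notify_guard(camera_id, destination)
```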
  • In contrast, if the results indicate that the person does not have the appropriate access/clearance level to enter the particular facility 102, 104, zone 1, 2, 3, or secured area 110b indicated by the determined path of travel, then a notification is provided to the person 106 (via camera 108, accessory 120 or other communication device (e.g., a mobile phone 124)) and/or to security personnel (e.g., via a computing device or communication device (e.g., a mobile phone)) indicating that the person 106 is attempting to access a facility, area and/or zone for which the person does not have the appropriate access/clearance level to enter. The security personnel can then take measures to prevent the person from entering a facility, zone and/or secured area for which (s)he does not have permission to access.
  • In some scenarios, the portable camera 108 has a unique identifier associated therewith. The unique identifier may be used (as mentioned above) to facilitate the identification of the person for which the video is being captured. Once the person has been identified, his(her) access/clearance level can be retrieved from a data store local to the portable camera 108 or remote from the portable camera 108. The required access/clearance level for the facilities 102, 104, zones 1-3, and secured area 110 can also be stored in the data store local to the camera 108 or remote from the camera (such as in the computing system 112). In this case, the access/clearance level retrieved for the person is compared to the required access/clearance level for a given facility, zone or secured area.
  • The unique identifier also facilitates authentication of the person in possession of the camera so as to prevent others from using the camera assigned to that person. Techniques for authenticating a person's identity are well known in the art. Any known or to be known authentication technique can be employed herein without limitation.
  • The unique identifier can further be used to facilitate keyless access control into a facility, secured area and/or zone. In this case, the camera's unique identifier is used at access or entry points (e.g., doors) 118 to allow authentication and access control. For example, the camera's unique identifier is used as the person's unique identifier for authentication purposes. In this case, the camera's unique identifier is compared to a plurality of stored identifiers to determine if a match exists. Alternatively or additionally, the camera's unique identifier is used as an electronic key for access control purposes such as unlocking a lock 132. The electronic key may be communicated from the camera 108 or other communication device 124 in the person's possession to the lock 132 via Radio Frequency Identification (“RFID”) technology, Bluetooth technology or other Short Range Communication (“SRC”) technology.
  • In those or other scenarios, GPS data, inertial navigation data, beacon data and/or other sensor data (e.g., accelerometer data, gyroscope data, etc.) is used in conjunction with the video data to detect the person's path of travel. The listed types of data are well known in the art, and therefore are not described in detail herein. Techniques for generating and/or acquiring such data are also well known in the art. Any known or to be known technique for generating and/or acquiring such data can be used herein without limitation.
  • In some scenarios, the GPS data, inertial navigation data, beacon data and/or other sensor data can be used to interpolate positions between known visual waypoints (e.g., to determine that the person turned left or right), i.e., to determine a position between two known fixed reference points in physical space (e.g., two landmarks). Methods for interpolating positions using waypoints are well known in the art. Any known or to be known interpolation methods can be used herein without limitation.
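  • For instance, a position between two recognized visual waypoints can be estimated by linear interpolation over time, as in the sketch below; real systems would refine this with the inertial or beacon data mentioned above, and the planar (x, y) coordinates and timestamps are assumptions.

```python
def interpolate_position(p0, t0, p1, t1, t):
    """Linearly estimate (x, y) at time t between fixes p0 at t0 and p1 at t1."""
    if t1 == t0:
        return p0
    alpha = (t - t0) / (t1 - t0)
    return (p0[0] + alpha * (p1[0] - p0[0]),
            p0[1] + alpha * (p1[1] - p0[1]))

# e.g., halfway between two landmark sightings:
# interpolate_position((0.0, 0.0), 10.0, (4.0, 2.0), 20.0, 15.0) -> (2.0, 1.0)
```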
  • The GPS data, inertial navigation data, beacon data and/or other sensor data may alternatively or additionally be used to verify that the results of a video data analysis are correct, i.e., that the person's position determined using the video data is correct. This verification can generally involve comparing location/position results of the video analysis with locations/positions specified by the GPS data, inertial navigation data, beacon data and/or other sensor data generated or acquired at the same time as the respective video image. The beacon data can be obtained via wireless communications between the camera 108 and beacons 130 (which are strategically placed in system 100 at indoor and/or outdoor locations). Beacons are well known in the art, and therefore will not be described herein. Any known or to be known beacon can be employed herein without limitation. For example, the beacons include iBeacons® available from Apple Inc. of Cupertino, Calif.
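  • One way to sketch this cross-check: accept the video-derived position only when it lies within a tolerance of the sensor fix time-stamped closest to the frame. Local planar coordinates in meters and the 5-meter tolerance are assumptions made for the sketch.

```python
import math

def positions_agree(video_fix, sensor_fix, tolerance_m: float = 5.0) -> bool:
    """True when two (x, y) fixes, in meters, differ by at most tolerance_m."""
    return math.hypot(video_fix[0] - sensor_fix[0],
                      video_fix[1] - sensor_fix[1]) <= tolerance_m
```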
  • In yet other scenarios, facial recognition can be used at access points (e.g., doors) 118 to verify that the person possessing the camera 108 is the correct person. The facial recognition is achieved using mirrors 116 placed at the access points 118 so that the camera 108 can take pictures of the person's reflection shown in the mirrors. Facial recognition techniques are well known in the art, and therefore will not be described herein. Any known or to be known facial recognition technique can be used herein without limitation. The facial recognition technique can be implemented by camera 108 and/or computing system 112.
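  • As one concrete possibility (the disclosure does not name a library), the open-source face_recognition package can compare the enrollment photo on file against the camera's shot of the person's reflection in mirror 116. The file paths are illustrative, and the enrollment photo is assumed to contain exactly one face.

```python
import face_recognition

# Encode the face from the enrollment photo on file (assumes one face).
enrolled = face_recognition.load_image_file("enrolled/person_106.jpg")
enrolled_encoding = face_recognition.face_encodings(enrolled)[0]

# Frame captured by camera 108 of the reflection in mirror 116.
reflection = face_recognition.load_image_file("captures/mirror_frame.jpg")

# True if any face found in the mirror reflection matches the enrollee.
is_correct_person = any(
    face_recognition.compare_faces([enrolled_encoding], e, tolerance=0.6)[0]
    for e in face_recognition.face_encodings(reflection)
)
```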
  • The present solution may be used in scenarios in which the person's activities are to be kept secret or confidential. Accordingly, cryptography may be employed to protect information communicated within system 100. For example, information communicated between the camera 108 and the computing system 112 may be encrypted in accordance with a chaotic, random or pseudo-random number sequence generation algorithm. Such cryptographic algorithms are well known in the art, and will not be described herein. Any known or to be known cryptographic algorithm can be used herein without limitation. In all scenarios, the seed value of the algorithm can include, but is not limited to, the camera's unique identifier, the person's unique identifier, a unique identifier of a given location (e.g., a facility, secured area or zone the person was in, is currently in, or is traveling towards), a unique identifier of a correct path, and/or a unique identifier of a task (or mission) that was, is being or is to be performed by the person. Notably, all or a portion of the data processing that is performed by system 100 can be done using plaintext data and/or encrypted data depending on a particular application.
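  • A sketch of seeded encryption for the camera-to-system link is shown below, with AES-GCM from the Python cryptography package standing in for the unspecified algorithm. The seed string is illustrative; deriving key material from bare identifiers would be weak in practice and is shown only to mirror the seeding idea above.

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Seed built from the identifiers listed above (all values illustrative).
seed = b"camera:CAM-0042|person:P-106|path:PATH-7"
key = hashlib.sha256(seed).digest()  # 32 bytes -> AES-256 key

aesgcm = AESGCM(key)
nonce = os.urandom(12)  # must be unique per message
ciphertext = aesgcm.encrypt(nonce, b"<video frame bytes>", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"<video frame bytes>"
```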
  • Referring now to FIG. 2, there is provided a flow diagram of an exemplary method 200 for location identification and tracking of persons, objects and/or vehicles. Method 200 begins with step 202 and continues with step 204 where a portable camera (e.g., portable camera 108 of FIG. 1) is disposed on a person (e.g., person 106 of FIG. 1), object or vehicle.
  • A unique identifier of the portable camera is used in step 206 to facilitate authentication of the person, object or vehicle on which the camera is disposed. The unique identifier can include, but is not limited to, a sequence of numbers, a sequence of letters, a sequence of letters and numbers, and/or a sequence of other symbols. The unique identifier can be compared to a plurality of unique identifiers stored in a data store for authentication purposes. For example, the person is at least partially authenticated if the unique identifier acquired from the camera matches a unique identifier stored in the data store.
  • In some scenarios, information other than or in addition to the camera's unique identifier is used for authentication purposes in step 206. This information can include, but is not limited to, a user name and/or password associated with the person.
  • Thereafter in step 208, the portable camera performs operations to record video of a surrounding environment. The video comprises a plurality of video images. The video images are compared to pre-stored images in step 210 (a) to detect a path of travel of the person, object or vehicle through the surrounding environment and/or (b) to identify at least one geographic location at which the image(s) was(were) captured. Image processing is well known in the art. Any known or to be known image processing technique can be used herein without limitation.
  • In some scenarios, the image processing generally involves: comparing a captured image with pre-stored images to identify which one of the pre-stored images contains some or all of the captured image's content; and obtaining location information stored in a data store so as to be associated with the identified pre-stored image. These image processing steps are iteratively or simultaneously performed for all or some of the captured images. Thereafter, the location information may be used to define a path of travel for the person, object or vehicle. Once the path of travel is defined, the path may be plotted on a multi-dimensional map. The multi-dimensional map may be displayed to relevant persons (e.g., security guards). The path of travel may also be stored for later use as historical travel information for the person, object or vehicle.
  • The path of travel and/or identified geographic location(s) is(are) analyzed in step 212 to verify that the person, object or vehicle is (1) traveling along a correct path, (2) traveling towards a facility for which the person, object or vehicle has authorization to enter, (3) traveling towards a zone or secured area internal or external to the facility for which the person, object or vehicle has authorization to enter, and/or (4) located within or in proximity to a facility, secured area or zone for which the person, object or vehicle has authorization to enter. The path of travel and/or identified geographic location(s) may additionally or alternatively be analyzed to determine: if the correct path is being traveled or was followed; and/or if the correct path is being traveled or was followed in a prescribed timely manner.
  • The analysis of step 212 can involve: using a unique identifier of the portable camera to determine an access or clearance level of the person, object or vehicle; and/or using at least one of GPS data, inertial navigation data, beacon data and sensor data in conjunction with the video to verify that the identified geographic location(s) and/or path of travel is(are) correct. In some scenarios, items (1), (2) and/or (3) is(are) verified by (A) comparing the identified geographic location(s) to pre-defined geographic location(s) stored in a data store and/or (B) comparing the detected path of travel to at least one pre-defined path of travel stored in the data store. Additionally or alternatively, a destination of the person/vehicle/object is predicted based on the identified geographic location(s), detected path of travel, the direction of travel, and/or historical travel information associated with the person/vehicle/object. The predicted destination is then compared to a plurality of destinations stored in a data store.
  • As should be readily understood, GPS data specifies a position of the camera at a specified time as determined by a global navigation satellite. Inertial navigation data specifies the position, orientation and velocity (direction and speed of movement) of the moving camera which were determined using a dead reckoning technique. The inertial navigation data is obtained using a processor, motion sensors (e.g., accelerometers) and rotation sensors (e.g., gyroscopes). Beacon data includes, but is not limited to, unique beacon identifiers which can be used to obtain known locations of the beacons (e.g., iBeacons®). The sensor data can include, but is not limited to, beam break sensor data indicating detected motion of the person to which the camera is attached. iBeacons® and beam break sensors are well known in the art, and therefore will not be described herein. Any known or to be known iBeacon® and/or beam break sensor can be used herein without limitation.
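  • Dead reckoning of the kind just described can be reduced to a per-step pose update, as sketched below; the 2-D pose, the fixed time step, and the sensor values are illustrative simplifications of what the motion and rotation sensors provide.

```python
import math

def dead_reckon(x, y, heading, speed, yaw_rate, dt):
    """Advance a 2-D pose by one time step of dt seconds."""
    heading += yaw_rate * dt             # gyroscope: turn rate in rad/s
    x += speed * dt * math.cos(heading)  # speed from integrated accelerometer
    y += speed * dt * math.sin(heading)
    return x, y, heading

# e.g., walking 1.4 m/s straight ahead for one second with no turn:
# dead_reckon(0.0, 0.0, 0.0, 1.4, 0.0, 1.0) -> (1.4, 0.0, 0.0)
```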
  • The person, object or vehicle is allowed to enter or remain in the facility, zone or secured area when it is verified that the person, object or vehicle has authorization to enter the same, as shown by steps 214-216. In this regard, a door lock may be commanded to unlock so that the person, object or vehicle can enter the facility (e.g., facility 102 or 104 of FIG. 1) or secured area (e.g., secured area 110 of FIG. 1). The unique identifier of the portable camera can be used to generate the door lock command so as to facilitate keyless access control into the facility or secured area. Additionally or alternatively, the portable camera may be used to capture an image of a mirror reflection of the person, object or vehicle to facilitate access control.
  • In optional step 218, security personnel are notified that the person, object or vehicle is authorized to enter or remain in the facility, zone or secured area. Such notification can be electronically provided by wirelessly communicating a notification message to a communication device in the possession of the security personnel. The communication device can include, but is not limited to, a personal computer, a smart phone, a portable computer, a personal digital assistant, and/or a smart watch. Additionally or alternatively, an auditory, visual and/or tactile alarm may be output from the communication device of the security personnel and/or another computing device (e.g., a security system in proximity to the security personnel) in addition to and/or in response to the notification message's reception. Subsequently, step 224 is performed where method 200 ends or other processing is performed.
  • When it is not verified that the person, object or vehicle has authorization to enter the facility, zone or secured area, steps 220-222 are performed. These steps involve: taking measures to prevent the person, object or vehicle from entering or remaining in the facility, zone or secured area; and optionally providing notification to security personnel. In some scenarios, the measures comprise locking a door lock. The unique identifier of the portable camera may be used as an electronic key for locking the door lock. The notification can be electronically provided by wirelessly communicating a notification message to a communication device in the possession of the security personnel. The communication device can include, but is not limited to, a personal computer, a smart phone, a portable computer, a personal digital assistant, and/or a smart watch. Additionally or alternatively, an auditory, visual and/or tactile alarm may be output from the communication device of the security personnel and/or another computing device (e.g., a security system in proximity to the security personnel) in addition to and/or in response to the notification message's reception. Upon completing step 220 or 222, step 224 is performed where method 200 ends or other processing is performed.
  • In view of the foregoing, the systems and methods described herein have many novel features. For example, the systems: use any standard camera mounted on a person, vehicle or object to facilitate locating and/or tracking movement thereof; continually compare active images to a stored map to determine where in the map the moving object is located; perform exception handling on any image that does not fit into its stored mapping; use known visual or Infrared (“IR”) markers to determine an immediate location of the person, vehicle or object (note: this could be used to resync position or to use limited capability hardware which might be required by cheap systems); integrate location into a coordinate system using changes in images to indicate relative distance; store an image with the map for future reference or comparison; augment position or motion data with additional sensors; use a 3D data collection to provide additional detailed mapping information; and define security zones within the mapped area to provide conditional access or alarming.
  • Referring now to FIG. 3, there is shown a hardware block diagram comprising an exemplary computer system 300. The machine can include a set of instructions which are used to cause the computer system to perform any one or more of the methodologies discussed herein. In a networked deployment, the machine can function as a server or a router. In one or more scenarios, the exemplary computer system 300 can correspond to the computing system 112 of FIG. 1 and/or the computing elements of camera 108 of FIG. 1. In the camera scenarios, system 300 would include imaging components in addition to the computing elements shown in FIG. 3. The imaging components can include, but are not limited to, an image capturing device. Image capturing devices are well known in the art, and therefore will not be described herein. Any known or to be known image capturing device can be used herein without limitation.
  • The computer system 300 can operate independently as a standalone device. However, the present solution is not limited in this regard and in other scenarios the computer system can be operatively connected (networked) to other machines in a distributed environment to facilitate certain operations described herein. Accordingly, while only a single machine is illustrated in FIG. 3, it should be understood that the present solution can be taken to involve any collection of machines that individually or jointly execute one or more sets of instructions as described herein.
  • The computer system 300 is comprised of a processor 302 (e.g., a Central Processing Unit (“CPU”)), a main memory 304, a static memory 306, a drive unit 308 for mass data storage and comprised of machine readable media 320, input/output devices 310, a display unit 312 (e.g., a Liquid Crystal Display (“LCD”), a solid state display, or a Cathode Ray Tube (“CRT”)), and a network interface device 314. Communications among these various components can be facilitated by means of a data bus 318. One or more sets of instructions 324 can be stored completely or partially in one or more of the main memory 304, static memory 306, and drive unit 308. The instructions can also reside within the processor 302 during execution thereof by the computer system. The input/output devices 310 can include a keyboard, a keypad, a mouse, buttons, a multi-touch surface (e.g., a touchscreen), a speaker, a microphone, an image capturing device, and so on. The network interface device 314 can be comprised of hardware components and software or firmware to facilitate wired or wireless network data communications in accordance with a network communication protocol utilized by a data network (e.g., a Local Area Network (“LAN”) and/or a Wide Area Network (“WAN”)).
  • The drive unit 308 can comprise a machine readable medium 320 on which is stored one or more sets of instructions 324 (e.g., software) which are used to facilitate one or more of the methodologies and functions described herein. The term “machine-readable medium” shall be understood to include any tangible medium that is capable of storing instructions or data structures which facilitate any one or more of the methodologies of the present disclosure. Exemplary machine-readable media can include magnetic media, solid-state memories, optical media and so on. More particularly, tangible media as described herein can include: magnetic disks; magneto-optical disks; CD-ROM disks and DVD-ROM disks; semiconductor memory devices; Electrically Erasable Programmable Read-Only Memory (“EEPROM”); and flash memory devices. A tangible medium as described herein is one that is non-transitory insofar as it does not involve a propagating signal.
  • Computer system 300 should be understood to be one possible example of a computer system which can be used in connection with the various implementations. However, the present solution is not limited in this regard and any other suitable computer system architecture can also be used without limitation. Dedicated hardware implementations including, but not limited to, application-specific integrated circuits, programmable logic arrays, and other hardware devices can likewise be constructed to implement the methods described herein. Applications that can include the apparatus and systems of various implementations broadly include a variety of electronic and computer systems. Some implementations may implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the exemplary system is applicable to software, firmware, and hardware implementations.
  • Further, it should be understood that the present solution can take the form of a computer program product on a tangible computer-usable storage medium (for example, a hard disk or a CD-ROM). The computer-usable storage medium can have computer-usable program code embodied in the medium. The term computer program product, as used herein, refers to a device comprised of all the features enabling the implementation of the methods described herein. Computer program, software application, computer software routine, and/or other variants of these terms, in the present context, mean any expression, in any language, code, or notation, of a set of instructions intended to cause a system having an information processing capability to perform a particular function either directly or after either or both of the following: a) conversion to another language, code, or notation; or b) reproduction in a different material form.
  • All of the apparatus, methods, and algorithms disclosed and claimed herein can be made and executed without undue experimentation in light of the present disclosure. While the present solution has been described in terms of preferred embodiments, it will be apparent to those having ordinary skill in the art that variations may be applied to the apparatus, methods and sequence of steps of the method without departing from the concept, spirit and scope of the present solution. More specifically, it will be apparent that certain components may be added to, combined with, or substituted for the components described herein while the same or similar results would be achieved. All such similar substitutes and modifications apparent to those having ordinary skill in the art are deemed to be within the spirit, scope and concept of the present solution as defined.
  • The features and functions disclosed above, as well as alternatives, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations or improvements may be made by those skilled in the art, each of which is also intended to be encompassed by the disclosed solutions.

Claims (33)

We claim:
1. A method for location identification and tracking of a person, object and/or vehicle, comprising:
obtaining, by a computing system, a video of a surrounding environment which was captured by a portable camera coupled to the person, object or vehicle;
comparing, by the computing system, first images of the video to pre-stored second images to identify geographic locations where the first images were captured by the portable camera;
analyzing, by the computing system, the identified geographic locations to verify that the person, object or vehicle is (1) traveling along a correct path, (2) traveling towards a facility for which the person, object or vehicle has authorization to enter, (3) traveling towards a zone or secured area internal or external to the facility for which the person, object or vehicle has authorization to enter, or (4) located in or in proximity to a facility, zone or secured area for which the person, object or vehicle has authorization to enter; and
transmitting a notification from the computing system indicating the results of the analyzing.
2. The method according to claim 1, further comprising allowing the person, object or vehicle to enter the facility, zone or secured area when it is verified that the person, object or vehicle has authorization to enter the same.
3. The method according to claim 1, further comprising commanding a lock to unlock when it is verified that the person, object or vehicle has authorization to enter the facility or secured area.
4. The method according to claim 3, wherein a unique identifier for the portable camera is used as an electronic key for unlocking the lock.
5. The method according to claim 1, further comprising notifying security personnel that the person, object or vehicle is authorized to enter the facility, zone or secured area when it is verified that the person, object or vehicle has authorization to enter the same.
6. The method according to claim 1, further comprising providing notification to security personnel when it is not verified that the person, object or vehicle has authorization to enter the facility, zone or secured area.
7. The method according to claim 1, further comprising taking measures to prevent the person, object or vehicle from entering the facility, zone or secured area when it is not verified that the person, object or vehicle has authorization to enter the same.
8. The method according to claim 7, wherein the measures comprise locking a lock.
9. The method according to claim 8, wherein a unique identifier of the portable camera is used as an electronic key for locking the lock.
10. The method according to claim 1, further comprising using a unique identifier of the portable camera to determine an access or clearance level of the person, object or vehicle.
11. The method according to claim 1, further comprising using a unique identifier of the portable camera to facilitate authentication of the person, object or vehicle to which the portable camera is coupled.
12. The method according to claim 1, further comprising using a unique identifier of the portable camera to facilitate keyless access control into or out of the facility or secured area.
13. The method according to claim 1, further comprising using at least one of Global Positioning System (“GPS”) data, inertial navigation data, beacon data and sensor data in conjunction with the video to identify the geographic locations, to verify that the identified geographic locations are correct, to detect a path of travel, or to verify that the path of travel is correct.
14. The method according to claim 1, further comprising using the portable camera to capture an image of a mirror reflection of the person, object or vehicle to facilitate access control.
15. The method according to claim 1, wherein the computing system comprises at least one of the portable camera and a computing device remote from the portable camera.
16. The method according to claim 1, further comprising using the identified geographic locations to generate a path of travel of the person, object or vehicle through the surrounding environment.
17. A system, comprising:
a computing system having hardware and software configured to
obtain a video of a surrounding environment which was captured by a portable camera coupled to a person, object or vehicle,
compare first images of the video to pre-stored second images to identify geographic locations where the first images were captured by the portable camera,
analyze the identified geographic locations to verify that the person, object or vehicle is (1) traveling along a correct path, (2) traveling towards a facility for which the person, object or vehicle has authorization to enter, (3) traveling towards a zone or secured area internal or external to the facility for which the person, object or vehicle has authorization to enter, or (4) located in or in proximity to a facility, zone or secured area for which the person, object or vehicle has authorization to enter, and
transmit a notification indicating the results of the analyzing.
18. The system according to claim 17, wherein the person, object or vehicle is allowed to enter the facility, zone or secured area when it is verified that the person, object or vehicle has authorization to enter the same.
19. The system according to claim 17, wherein the computing system further causes a lock to unlock when it is verified that the person, object or vehicle has authorization to enter the facility or secured area.
20. The system according to claim 19, wherein a unique identifier for the portable camera is used as an electronic key for unlocking the lock.
21. The system according to claim 17, wherein the computing system further provides a notification to security personnel that the person, object or vehicle is authorized to enter the facility, zone or secured area when it is verified that the person, object or vehicle has authorization to enter the same.
22. The system according to claim 17, wherein the computing system further provides a notification to security personnel when it is not verified that the person, object or vehicle has authorization to enter the facility, zone or secured area.
23. The system according to claim 17, wherein the computing system causes measures to be taken to prevent the person, object or vehicle from entering the facility, zone or secured area when it is not verified that the person, object or vehicle has authorization to enter the same.
24. The system according to claim 23, wherein the measures comprise locking a lock.
25. The system according to claim 24, wherein a unique identifier of the portable camera is used as an electronic key for locking the lock.
26. The system according to claim 17, wherein a unique identifier of the portable camera is used to determine an access or clearance level of the person, object or vehicle.
27. The system according to claim 17, wherein a unique identifier of the portable camera is used to facilitate authentication of the person, object or vehicle to which the portable camera is coupled.
28. The system according to claim 17, wherein a unique identifier of the portable camera is used to facilitate keyless access control into or out of the facility or secured area.
29. The system according to claim 17, wherein at least one of Global Positioning System (“GPS”) data, inertial navigation data, beacon data and sensor data is used in conjunction with the video to identify the geographic locations, to verify that the identified geographic locations are correct, to detect a path of travel, or to verify that the path of travel is correct.
30. The system according to claim 17, wherein the portable camera is used to capture an image of a mirror reflection of the person, object or vehicle to facilitate access control.
31. The system according to claim 17, wherein the computing system comprises at least one of the portable camera and a computing device remote from the portable camera.
32. The system according to claim 17, wherein the identified geographic locations are used to generate a path of travel of the person, object or vehicle through the surrounding environment.
33. The system according to claim 17, wherein at least one of the video and notification is encrypted prior to being communicated between devices in the system.
US15/234,254 2015-08-12 2016-08-11 Systems and methods for location identification and tracking using a camera Abandoned US20170046891A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/234,254 US20170046891A1 (en) 2015-08-12 2016-08-11 Systems and methods for location identification and tracking using a camera
US17/388,640 US11544984B2 (en) 2015-08-12 2021-07-29 Systems and methods for location identification and tracking using a camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562204138P 2015-08-12 2015-08-12
US15/234,254 US20170046891A1 (en) 2015-08-12 2016-08-11 Systems and methods for location identification and tracking using a camera

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/388,640 Continuation US11544984B2 (en) 2015-08-12 2021-07-29 Systems and methods for location identification and tracking using a camera

Publications (1)

Publication Number Publication Date
US20170046891A1 true US20170046891A1 (en) 2017-02-16

Family

ID=57994285

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/234,254 Abandoned US20170046891A1 (en) 2015-08-12 2016-08-11 Systems and methods for location identification and tracking using a camera
US17/388,640 Active US11544984B2 (en) 2015-08-12 2021-07-29 Systems and methods for location identification and tracking using a camera

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/388,640 Active US11544984B2 (en) 2015-08-12 2021-07-29 Systems and methods for location identification and tracking using a camera

Country Status (1)

Country Link
US (2) US20170046891A1 (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170186253A1 (en) * 2015-12-29 2017-06-29 Skidata Ag Method for monitoring access authorizations by an access monitoring system
US20170249793A1 (en) * 2016-02-25 2017-08-31 Dean Drako Unattended physical delivery access method and control system
CN108650485A (en) * 2018-03-21 2018-10-12 江苏同袍信息科技有限公司 Personnel monitoring's method and system based on image detection and wireless probe
US20180314901A1 (en) * 2017-04-28 2018-11-01 Toyota Jidosha Kabushiki Kaisha Non-transitory storage medium storing image transmission program, image transmission device, and image transmission method
WO2019014334A1 (en) * 2017-07-14 2019-01-17 Carrier Corporation Intent driven building occupant path and system interaction optimization
US10209771B2 (en) * 2016-09-30 2019-02-19 Sony Interactive Entertainment Inc. Predictive RF beamforming for head mounted display
US10447864B1 (en) * 2015-12-28 2019-10-15 Amazon Technologies, Inc. Remote access control
WO2019209799A1 (en) * 2018-04-25 2019-10-31 Carrier Corporation Systems and methods for trajectory prediction to enable seamless access using mobile devices
US10531225B2 (en) * 2016-12-14 2020-01-07 Alibaba Group Holding Limited Method and apparatus for verifying entity information
US10549198B1 (en) * 2018-10-30 2020-02-04 Niantic, Inc. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
US10623680B1 (en) * 2017-07-11 2020-04-14 Equinix, Inc. Data center viewing system
WO2020092460A1 (en) * 2018-11-01 2020-05-07 Carrier Corporation Integrate body cameras with hotel key box
US10665096B2 (en) 2017-04-28 2020-05-26 Toyota Jidosha Kabushiki Kaisha Non-transitory storage medium storing image transmission program, image transmission device, and image transmission method
US10726723B1 (en) * 2017-08-25 2020-07-28 Objectvideo Labs, Llc Parking lot use monitoring for small businesses
EP3662659A4 (en) * 2017-08-02 2020-07-29 Objectvideo Labs, LLC MONITORING ACCESS TO PROPERTY WITH A PORTABLE CAMERA
US20210034927A1 (en) * 2019-07-29 2021-02-04 Robert Bosch Gmbh Apparatus and method for person detection, tracking, and identification utilizing wireless signals and images
US11048950B2 (en) * 2016-11-25 2021-06-29 Hangzhou Hikvision Digital Technology Co., Ltd. Method and device for processing images of vehicles
CN113538759A (en) * 2021-07-08 2021-10-22 深圳创维-Rgb电子有限公司 Access control management method, device, device and storage medium based on display device
US11170627B2 (en) * 2019-03-22 2021-11-09 Eaton Intelligent Power Limited Process monitoring
CN113678178A (en) * 2019-03-29 2021-11-19 日本电气株式会社 Monitoring system, monitoring device, monitoring method, and non-transitory computer-readable medium
US20220130174A1 (en) * 2019-03-07 2022-04-28 Nec Corporation Image processing apparatus, control method, and non-transitory storage medium
US20220164581A1 (en) * 2020-11-24 2022-05-26 Boe Technology Group Co., Ltd. Region management and control method, device, apparatus and storage medium
US20220351136A1 (en) * 2016-09-30 2022-11-03 Runbuggy Omi, Inc. Predictive analytics for transport services
US12499409B2 (en) 2022-07-11 2025-12-16 Runbuggy Omi, Inc. Predictive analytics for transport services

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11594138B1 (en) * 2021-08-19 2023-02-28 Beta Air, Llc Systems and methods for optimizing a controlled flight plan

Family Cites Families (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7629899B2 (en) * 1997-10-22 2009-12-08 Intelligent Technologies International, Inc. Vehicular communication arrangement and method
US5844505A (en) * 1997-04-01 1998-12-01 Sony Corporation Automobile navigation system
US9177476B2 (en) * 1997-10-22 2015-11-03 American Vehicular Sciences Llc Method and system for guiding a person to a location
US7796081B2 (en) * 1997-10-22 2010-09-14 Intelligent Technologies International, Inc. Combined imaging and distance monitoring for vehicular applications
US8831970B2 (en) * 2000-08-24 2014-09-09 Martin Herman Weik, III Virtual attendant system and parking management system
US8001054B1 (en) * 2001-07-10 2011-08-16 American Express Travel Related Services Company, Inc. System and method for generating an unpredictable number using a seeded algorithm
US20060244830A1 (en) * 2002-06-04 2006-11-02 Davenport David M System and method of navigation with captured images
US20110285842A1 (en) * 2002-06-04 2011-11-24 General Electric Company Mobile device positioning system and method
WO2005024595A2 (en) * 2003-09-03 2005-03-17 Visible Tech-Knowledgy, Inc. Electronically updateable label and display
US7707239B2 (en) * 2004-11-01 2010-04-27 Scenera Technologies, Llc Using local networks for location information and image tagging
US7321305B2 (en) * 2005-07-05 2008-01-22 Pinc Solutions Systems and methods for determining a location of an object
US20060224647A1 (en) * 2005-03-30 2006-10-05 Impinj, Inc. RFID tag using updatable seed values for generating a random number
WO2007056620A1 (en) * 2005-11-14 2007-05-18 Massachusetts Institute Of Technology Enhanced security protocol for radio frequency systems
US8653946B2 (en) * 2006-05-10 2014-02-18 Electronics And Telecommunications Research Institute Passive RFID reader and operation control method therefor
JP4677364B2 (en) * 2006-05-23 2011-04-27 株式会社村上開明堂 Vehicle monitoring device
US8437954B1 (en) * 2006-06-02 2013-05-07 Intelligent Design Labs, L.L.C. Real time travel director
KR100912076B1 (en) * 2006-07-26 2009-08-12 한국전자통신연구원 Apparatus and method for Integrated Reader and Tag
DE102006062061B4 (en) * 2006-12-29 2010-06-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus, method and computer program for determining a position based on a camera image from a camera
US20100092093A1 (en) * 2007-02-13 2010-04-15 Olympus Corporation Feature matching method
US7739034B2 (en) * 2007-04-17 2010-06-15 Itt Manufacturing Enterprises, Inc. Landmark navigation for vehicles using blinking optical beacons
IL183006A0 (en) * 2007-05-06 2007-12-03 Wave Group Ltd A bilateral robotic omni-directional situational awarness system having a smart throw able transportaion case
US8373581B2 (en) * 2007-06-19 2013-02-12 Magna Electronics, Inc. Mobile control node system and method for vehicles
US8009013B1 (en) * 2007-09-21 2011-08-30 Precision Control Systems of Chicago, Inc. Access control system and method using user location information for controlling access to a restricted area
US20090315686A1 (en) * 2007-10-16 2009-12-24 Rcd Technology, Inc. Rfid tag using encrypted value
US20090096574A1 (en) * 2007-10-16 2009-04-16 Rcd Technology, Inc. Rfid tag using encrypted password protection
US20090160610A1 (en) * 2007-12-19 2009-06-25 Doddamane Krishna S Pseudorandom number generator
WO2009094591A2 (en) * 2008-01-24 2009-07-30 Micropower Appliance Video delivery systems using wireless cameras
US8131118B1 (en) * 2008-01-31 2012-03-06 Google Inc. Inferring locations from an image
US8204299B2 (en) * 2008-06-12 2012-06-19 Microsoft Corporation 3D content aggregation built into devices
DE102010006702A1 (en) * 2009-02-26 2010-09-02 Navigon Ag Method and device for calculating alternative routes in a navigation system
US9465129B1 (en) * 2009-03-06 2016-10-11 See Scan, Inc. Image-based mapping locating system
US9571625B2 (en) * 2009-08-11 2017-02-14 Lg Electronics Inc. Electronic device and control method thereof
US9560324B2 (en) * 2010-02-12 2017-01-31 Drew Incorporated Tactical vision system
US8509488B1 (en) * 2010-02-24 2013-08-13 Qualcomm Incorporated Image-aided positioning and navigation system
US8412450B1 (en) * 2010-03-17 2013-04-02 The United States Of America As Represented By The Secretary Of The Navy Method for navigating in GPS denied environments
KR101744723B1 (en) * 2010-12-20 2017-06-20 한국전자통신연구원 Indoor location position system and method for recognizing indoor location position using the same
US9311586B2 (en) * 2011-03-22 2016-04-12 Jamie Robinette Apparatus and method for locating, tracking, controlling and recognizing tagged objects using active RFID technology
US8625853B2 (en) * 2011-08-11 2014-01-07 International Business Machines Corporation Parking lot information system using image technology for identifying available parking spaces
US8743051B1 (en) * 2011-09-20 2014-06-03 Amazon Technologies, Inc. Mirror detection-based device functionality
US8624725B1 (en) * 2011-09-22 2014-01-07 Amazon Technologies, Inc. Enhanced guidance for electronic devices having multiple tracking modes
US9253607B2 (en) * 2011-12-16 2016-02-02 Maxlinear, Inc. Method and system for location determination and navigation using textual information
US9182748B2 (en) * 2012-02-08 2015-11-10 Identive Group, Inc. RFID access control reader with enhancements
US9489644B2 (en) * 2012-02-23 2016-11-08 Ford Global Technologies, Llc Vehicle drive matching system and method
JP2013250141A (en) * 2012-05-31 2013-12-12 Toshiba Corp Route search device and route search method
US9681103B2 (en) * 2012-11-13 2017-06-13 International Business Machines Corporation Distributed control of a heterogeneous video surveillance network
GB201223520D0 (en) * 2012-12-31 2013-02-13 Tomtom Dev Germany Gmbh Method and apparatus for route comparison
US9959674B2 (en) * 2013-02-26 2018-05-01 Qualcomm Incorporated Directional and X-ray view techniques for navigation using a mobile device
US9679414B2 (en) * 2013-03-01 2017-06-13 Apple Inc. Federated mobile device positioning
US9928652B2 (en) * 2013-03-01 2018-03-27 Apple Inc. Registration between actual mobile device position and environmental model
US20160021512A1 (en) * 2013-03-13 2016-01-21 Retail Optimization International Inc. Systems and methods for indoor location services
WO2015017036A1 (en) * 2013-07-29 2015-02-05 Lenel Systems International, Inc. Systems and methods for integrated security access control for video and audio streaming
GB201316013D0 (en) * 2013-09-09 2013-10-23 Tomtom Dev Germany Gmbh Methods and systems for generating alternative routes
US20150119076A1 (en) * 2013-10-31 2015-04-30 Nant Holdings Ip, Llc Self-calibrating mobile indoor location estimations, systems and methods
CN105917329B (en) * 2014-02-18 2019-08-30 麦克赛尔株式会社 Information display device and information display program
US10430832B2 (en) * 2014-02-21 2019-10-01 Alexey Filatoff Facility mapping and interactive tracking
KR101830249B1 (en) * 2014-03-20 2018-03-29 한국전자통신연구원 Position recognition apparatus and method for a mobile object
EP3164811B1 (en) * 2014-07-04 2019-04-24 Mapillary AB Method for adding images for navigating through a set of images
WO2016061471A1 (en) * 2014-10-17 2016-04-21 Hitachi High Technologies America, Inc. Interactive laboratory robotic system
US9648579B2 (en) * 2014-11-21 2017-05-09 Calamp Corp. Systems and methods for driver and vehicle tracking
US20180025632A1 (en) * 2014-12-15 2018-01-25 Intelligent Technologies International, Inc. Mapping Techniques Using Probe Vehicles
US9912901B2 (en) * 2014-12-18 2018-03-06 S4 Worldwide, Llc Body camera
US9601022B2 (en) * 2015-01-29 2017-03-21 Qualcomm Incorporated Systems and methods for restricting drone airspace access
US9791573B2 (en) * 2015-06-30 2017-10-17 International Business Machines Corporation Intelligent global positioning system service
US10395126B2 (en) * 2015-08-11 2019-08-27 Honda Motor Co., Ltd. Sign based localization
US20170227470A1 (en) * 2016-02-04 2017-08-10 Proxy Technologies, Inc. Autonomous vehicle, system and method for structural object assessment and manufacture thereof
US10129698B2 (en) * 2016-07-14 2018-11-13 United Parcel Service Of America, Inc. Internal location address and automatic routing of intra-facility movement

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10447864B1 (en) * 2015-12-28 2019-10-15 Amazon Technologies, Inc. Remote access control
US9892574B2 (en) * 2015-12-29 2018-02-13 Skidata Ag Method for monitoring access authorizations by an access monitoring system
US20170186253A1 (en) * 2015-12-29 2017-06-29 Skidata Ag Method for monitoring access authorizations by an access monitoring system
US20170249793A1 (en) * 2016-02-25 2017-08-31 Dean Drako Unattended physical delivery access method and control system
US20220351136A1 (en) * 2016-09-30 2022-11-03 Runbuggy Omi, Inc. Predictive analytics for transport services
US10514754B2 (en) * 2016-09-30 2019-12-24 Sony Interactive Entertainment Inc. RF beamforming for head mounted display
US10209771B2 (en) * 2016-09-30 2019-02-19 Sony Interactive Entertainment Inc. Predictive RF beamforming for head mounted display
US20190138087A1 (en) * 2016-09-30 2019-05-09 Sony Interactive Entertainment Inc. RF Beamforming for Head Mounted Display
US11048950B2 (en) * 2016-11-25 2021-06-29 Hangzhou Hikvision Digital Technology Co., Ltd. Method and device for processing images of vehicles
US10531225B2 (en) * 2016-12-14 2020-01-07 Alibaba Group Holding Limited Method and apparatus for verifying entity information
US11212641B2 (en) 2016-12-14 2021-12-28 Advanced New Technologies Co., Ltd. Method and apparatus for verifying entity information
US10681492B1 (en) * 2016-12-14 2020-06-09 Alibaba Group Holding Limited Method and apparatus for verifying entity information
US10966052B2 (en) 2016-12-14 2021-03-30 Advanced New Technologies Co., Ltd. Method and apparatus for verifying entity information
US11062154B2 (en) * 2017-04-28 2021-07-13 Toyota Jidosha Kabushiki Kaisha Non-transitory storage medium storing image transmission program, image transmission device, and image transmission method
CN108806244A (en) * 2017-04-28 2018-11-13 丰田自动车株式会社 Image transfer apparatus, method and non-transient storage media
US20180314901A1 (en) * 2017-04-28 2018-11-01 Toyota Jidosha Kabushiki Kaisha Non-transitory storage medium storing image transmission program, image transmission device, and image transmission method
US10621449B2 (en) * 2017-04-28 2020-04-14 Toyota Jidosha Kabushiki Kaisha Non-transitory storage medium storing image transmission program, image transmission device, and image transmission method
US10665096B2 (en) 2017-04-28 2020-05-26 Toyota Jidosha Kabushiki Kaisha Non-transitory storage medium storing image transmission program, image transmission device, and image transmission method
US10623680B1 (en) * 2017-07-11 2020-04-14 Equinix, Inc. Data center viewing system
CN110869984A (en) * 2017-07-14 2020-03-06 开利公司 Intent-driven optimization of building occupancy paths and system interactions
WO2019014334A1 (en) * 2017-07-14 2019-01-17 Carrier Corporation Intent driven building occupant path and system interaction optimization
US10832508B2 (en) 2017-07-14 2020-11-10 Carrier Corporation Intent driven building occupant path and system interaction optimization
EP3662659A4 (en) * 2017-08-02 2020-07-29 Objectvideo Labs, LLC Monitoring access to property with a portable camera
AU2018312581B2 (en) * 2017-08-02 2023-05-18 Objectvideo Labs, Llc Supervising property access with portable camera
US11373495B2 (en) 2017-08-02 2022-06-28 Objectvideo Labs, Llc Supervising property access with portable camera
US10726723B1 (en) * 2017-08-25 2020-07-28 Objectvideo Labs, Llc Parking lot use monitoring for small businesses
US11227496B1 (en) 2017-08-25 2022-01-18 Objectvideo Labs, Llc Parking lot use monitoring for small businesses
CN108650485 (en) * 2018-03-21 2018-10-12 江苏同袍信息科技有限公司 Personnel monitoring method and system based on image detection and wireless probes
WO2019209799A1 (en) * 2018-04-25 2019-10-31 Carrier Corporation Systems and methods for trajectory prediction to enable seamless access using mobile devices
JP2022024169A (en) * 2018-10-30 2022-02-08 ナイアンティック, インコーポレイテッド Verification of the player's real-world location using landmark image data corresponding to the verification path
KR20220080212A (en) * 2018-10-30 2022-06-14 나이앤틱, 인크. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
US11771996B2 (en) * 2018-10-30 2023-10-03 Niantic, Inc. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
KR102571387B1 (en) 2018-10-30 2023-08-28 나이앤틱, 인크. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
US10828569B2 (en) 2018-10-30 2020-11-10 Niantic, Inc. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
JP2022506396A (en) * 2018-10-30 2022-01-17 ナイアンティック, インコーポレイテッド Verification of the player's real-world location using landmark image data corresponding to the verification path
US10549198B1 (en) * 2018-10-30 2020-02-04 Niantic, Inc. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
JP7239668B2 (en) 2018-10-30 2023-03-14 ナイアンティック, インコーポレイテッド Verification of the player's real-world location using image data of landmarks corresponding to the verification path
US20220226733A1 (en) * 2018-10-30 2022-07-21 Niantic, Inc. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
US11325042B2 (en) 2018-10-30 2022-05-10 Niantic, Inc. Verifying a player's real world location using image data of a landmark corresponding to a verification pathway
US11877096B2 (en) * 2018-11-01 2024-01-16 Carrier Corporation Integrate body cameras with hotel key box
WO2020092460A1 (en) * 2018-11-01 2020-05-07 Carrier Corporation Integrate body cameras with hotel key box
US20210274132A1 (en) * 2018-11-01 2021-09-02 Carrier Corporation Integrate body cameras with hotel key box
US20220130174A1 (en) * 2019-03-07 2022-04-28 Nec Corporation Image processing apparatus, control method, and non-transitory storage medium
US12094250B2 (en) * 2019-03-07 2024-09-17 Nec Corporation Image processing apparatus, control method, and non-transitory storage medium
US11170627B2 (en) * 2019-03-22 2021-11-09 Eaton Intelligent Power Limited Process monitoring
CN113678178A (en) * 2019-03-29 2021-11-19 日本电气株式会社 Monitoring system, monitoring device, monitoring method, and non-transitory computer-readable medium
US20210034927A1 (en) * 2019-07-29 2021-02-04 Robert Bosch Gmbh Apparatus and method for person detection, tracking, and identification utilizing wireless signals and images
US10997474B2 (en) * 2019-07-29 2021-05-04 Robert Bosch Gmbh Apparatus and method for person detection, tracking, and identification utilizing wireless signals and images
US20220164581A1 (en) * 2020-11-24 2022-05-26 Boe Technology Group Co., Ltd. Region management and control method, device, apparatus and storage medium
US12217510B2 (en) * 2020-11-24 2025-02-04 Boe Technology Group Co., Ltd. Region management and control method, device, apparatus and storage medium
CN113538759 (en) * 2021-07-08 2021-10-22 深圳创维-RGB电子有限公司 Access control management method, apparatus, device, and storage medium based on display device
US12499409B2 (en) 2022-07-11 2025-12-16 Runbuggy Omi, Inc. Predictive analytics for transport services

Also Published As

Publication number Publication date
US11544984B2 (en) 2023-01-03
US20210358241A1 (en) 2021-11-18

Similar Documents

Publication Publication Date Title
US11544984B2 (en) Systems and methods for location identification and tracking using a camera
US10937263B1 (en) Smart credentials for protecting personal information
US11582711B2 (en) Systems, devices, methods, and program products enhancing structure walkthroughs
US10915777B2 (en) Communication terminal, communication system, and image processing method
JP6968134B2 (en) Data utilization device, data utilization program and data storage device
US10891814B2 (en) Mobile credential management system for vehicle key box access control
US20130102335A1 (en) Mobile device, information processing device, location information acquisition method, location information acquisition system, and program
KR101775650B1 (en) A facial recognition management system using portable terminal
US10591576B1 (en) Automated system for vehicle tracking
US20200168015A1 (en) Systems, devices, methods, and program products enhancing structure walkthroughs
US20210374387A1 (en) Mobile device-assisted facial recognition
CN110402590A (en) Right discriminating system at least partly autonomous vehicle
WO2017041864A1 (en) A trusted geolocation beacon and a method for operating a trusted geolocation beacon
JP4958600B2 (en) Watch system and masking method
US9489537B2 (en) Personal information protection for navigation systems
US20200357251A1 (en) Security Apparatus and Control Method Thereof
HK1243529A1 (en) Portable identification and data display device and system and method of using same
KR101783377B1 (en) A security management method using a face recognition algorithm
JP2018018367A (en) Authentication system, authentication method, and authentication program
US12422514B2 (en) Information processing system and program
CN116347350A (en) Information processing system, information processing method, and non-transitory computer-readable medium
JPWO2013146582A1 (en) Position determination system, position determination method, computer program, and position determination apparatus
JP7786469B2 (en) Mobile body management device, mobile body management method, and mobile body management program
Sambana et al. An Artificial Intelligence approach to Intelligent Vehicle Control and Monitoring System
US20200043193A1 (en) Position determination methods and systems for vehicle

Legal Events

Date Code Title Description
AS Assignment

Owner name: TYCO FIRE & SECURITY GMBH, SWITZERLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TRIVELPIECE, STEVE E.;GATHRIGHT, DAVID;RAYNESFORD, STEVEN J.;REEL/FRAME:039408/0352

Effective date: 20160810

AS Assignment

Owner name: SENSORMATIC ELECTRONICS, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TYCO FIRE & SECURITY GMBH;REEL/FRAME:047182/0674

Effective date: 20180927

AS Assignment

Owner name: SENSORMATIC ELECTRONICS, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TYCO FIRE & SECURITY GMBH;REEL/FRAME:047188/0715

Effective date: 20180927

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION