US11836984B2 - Electronic device and method for counting objects - Google Patents
Electronic device and method for counting objects
- Publication number
- US11836984B2 (application US17/253,293; priority application US201817253293A)
- Authority
- US
- United States
- Prior art keywords
- electronic device
- sensor
- depth
- objects
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52085—Details related to the ultrasound signal acquisition, e.g. scan sequences
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
- G06V10/803—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N23/23—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from thermal infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10132—Ultrasound image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30242—Counting objects in image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
Definitions
- the invention relates in general to an electronic device and a method for counting objects and, in particular, to a system, a device and a method for counting live objects based on the fusion of depth and thermal sensors.
- Object counting technology can provide valuable data for different purposes. For example, information on the number of people may help professionals make decisions on personnel scheduling, resource allocation, business strategy, and security monitoring.
- Video people counting technology can provide a real-time, accurate and non-intrusive method for counting passengers. It quickly became popular and turned into a hot topic for research and development. Although video people counting technology has advanced rapidly and made great progress, it suffers from a number of drawbacks.
- an object of the present invention is to provide a system, a device and a method for counting people based on depth and thermal sensors or cameras.
- the depth sensor identifies an object by collating a plurality of points having depth distances of a similar depth level into object groups.
- the similar depth level is determined by a reference depth distance and a tolerant range value.
- a point is collated to an object group if the depth distance of the point is within the tolerant range value of the reference depth distance.
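The grouping rule above can be sketched in a few lines of Python. The function name, the point format and the 0.15 m tolerance are illustrative assumptions, not values taken from the patent; the reference depth of each group is taken to be the depth of its first point.

```python
def collate_by_depth(points, tolerance=0.15):
    """Group (x, y, depth) points whose depth falls within `tolerance`
    metres of a group's reference depth (the depth of its first point)."""
    groups = []  # each group: {"ref": reference depth, "points": [...]}
    for p in sorted(points, key=lambda p: p[2]):
        for g in groups:
            # collate the point if it lies within the tolerant range
            if abs(p[2] - g["ref"]) <= tolerance:
                g["points"].append(p)
                break
        else:
            # no group matched: start a new object group
            groups.append({"ref": p[2], "points": [p]})
    return groups

# two objects at roughly 1.2 m and 2.0 m from a ceiling-mounted sensor
pts = [(0, 0, 1.20), (1, 0, 1.25), (5, 5, 2.00), (5, 6, 2.05)]
groups = collate_by_depth(pts, tolerance=0.15)
```

Sorting by depth first keeps each group anchored to its shallowest point, so the tolerant range is applied consistently.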
- the depth sensor is a time of flight sensor for detecting an electromagnetic wave signal reflected back from the object.
- the electromagnetic signal is modulated at frequencies of up to 100 MHz.
- the counting device further comprises an illumination unit for emitting electromagnetic wave signals.
- the illumination unit comprises one or more light emitting diodes for emitting electromagnetic wave signal.
- the illumination unit is adapted to emit electromagnetic wave signals of a plurality of wavelengths.
- the depth sensor comprises an ultrasound sensor and an ultrasound emitter.
- the thermal sensor comprises an infrared photodetector.
- the infrared photodetector is a quantum well infrared photodetector.
- the infrared photodetector is a thermal infrared sensor.
- the counting device further comprises a processing unit for determining whether an object belongs to a type of object by comparing the temperatures of an object group with one or more temperature pattern of the type of object.
- the type of object belongs to the Homo sapiens class.
- the counting device further comprises a memory module for storing the spatial thermal information of a plurality of consecutive time intervals.
- the number stored in the register is increased when an object is identified in the spatial thermal information of the current time interval but was not identified in the spatial thermal information of the previous time interval.
- a method of counting a type of object within an area, comprising the steps of:
- FIG. 1A is a schematic diagram of a counting device of an embodiment of the present invention.
- FIG. 1B is a schematic diagram of a counting device of an alternative embodiment of the present invention.
- FIG. 1C is a flow diagram showing the image fusion procedure.
- FIG. 2 is a schematic diagram of a counting device installed in a confined area.
- FIG. 3 is a sample image generated by a depth sensor of the counting device of FIG. 1.
- FIG. 4 is a sample image generated by segmenting the image of FIG. 3 according to different depth levels.
- FIG. 5 is an image generated by the thermal camera of the counting device of FIG. 1A, with identified high-temperature objects labelled with rectangles.
- a video counting system may be based on a video camera that captures the scene of an area.
- this kind of system usually captures images using an optical or digital camera with a CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) sensor. Such cameras are designed mainly for capturing the colours of the visible light spectrum, so the recorded images are directly affected by environmental changes such as ambient light and shadows. Further, in a crowded scene it is very difficult to effectively split clumps of passengers and estimate head counts accurately. The effectiveness of this method depends heavily on the pattern recognition algorithm and the computational power available. Because the accuracy of video counting relies on the algorithm and the sheer computing power of the machine running it, the cost of this type of technology is typically high and its adoption rate correspondingly low.
- a head counting device may perform a method for counting people heads.
- the method comprises the steps of capturing moving images of a predetermined counting view; detecting a motion region in the moving images of the predetermined counting view; and calculating a motion region speed value indicating a speed of movement of the motion region.
- a contribution zone is repeatedly defined based on a predetermined counting boundary, the motion region speed value, and a contribution time period.
- a sub area value representing the size of the area of the motion region contained in the defined contribution zone is repeatedly retrieved and registered.
- a total area value is generated by adding a plurality of registered sub area values, and the number of objects that have passed the counting boundary is estimated based on a reference object area value.
- a people counter may be used for estimating the number of people crossing a counting boundary.
- the people counter had an image sensor arranged to capture images of a predetermined counting view; a motion region detector arranged to, in the captured images, detect motion region areas passing the counting boundary; an area accumulator arranged to accumulate a total motion region by integrating the motion region areas detected by the motion region detector, wherein integrating comprised summing area slices of motion regions from a plurality of the captured images.
- the people counter also had an object counting unit arranged to estimate the number of objects that have passed the counting boundary by dividing the total motion region area by a reference area.
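The accumulate-and-divide estimate described above can be illustrated as follows. The function and the pixel-area values are hypothetical, chosen only to show how the total motion region area is divided by a reference single-object area:

```python
def count_by_area(slice_areas, reference_area):
    """Estimate how many objects crossed the counting boundary by summing
    per-frame motion-region area slices and dividing by the area a single
    reference object is expected to occupy."""
    total_area = sum(slice_areas)       # accumulate area slices over frames
    return round(total_area / reference_area)

# area slices (in pixels) of the motion region inside the contribution
# zone, one per captured frame
slices = [300, 450, 400, 350, 500]
n = count_by_area(slices, reference_area=1000)
```

With 2000 accumulated pixels and a 1000-pixel reference object, the estimate is two crossings.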
- such a people counting system must disclose the facial appearances of the people.
- the present invention provides an electronic device 10 for counting objects of similar type in an area.
- the electronic device 10 has a register for storing a number of counts of the objects in the area, a depth sensor 12 for measuring depth distances between a plurality of points of an object and the depth sensor; and a thermal sensor 14 for measuring temperatures of the points of the object.
- the electronic device 10 may be used as a counting device and may be installed over a confined area as shown in FIG. 2 .
- the counting device may combine the distance data collected by the depth sensor and the temperatures data collected by the thermal sensor to derive spatial thermal information of the object for determining an increase in the number stored in the register.
- the type of object is the Homo sapiens class, i.e. people.
- the counting device 10 may be used for counting the number of people in an area without disclosing/recording facial appearances of the people.
- the counting device is adapted to carry out a method of counting a type of object within an area, comprising the steps of
- a thermal camera may be useful for identifying people by detecting the temperature difference between humans and the environment; however, when the ambient temperature rises to a certain value, the camera may not effectively differentiate animate from inanimate objects.
- the depth sensor or camera captures the scene and passes the data through a foreground extraction module to clean up the foreground signals; the data is then passed to a depth object identification module that identifies the objects in the premises.
- a depth sensor may detect electromagnetic wave signals reflected from an object.
- the electromagnetic wave signal is in the infrared light spectrum.
- the electromagnetic signal is modulated at frequencies of up to 100 MHz.
- the reflected light is collected by a sensor capable of measuring the signal's phase (φ), amplitude (A), and offset (B). These are measured indirectly, from a series of intensity measurements.
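As an illustration of this indirect measurement, the widely used four-bucket demodulation scheme recovers phase, amplitude and offset from four intensity samples taken at 90° phase steps, and converts the phase to a depth. This is a generic time-of-flight sketch under a 20 MHz modulation assumption, not necessarily the exact scheme used by the device:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_depth(i0, i1, i2, i3, f_mod=20e6):
    """Recover phase, amplitude, offset and depth from four intensity
    samples of the reflected signal taken at 0/90/180/270 degree offsets."""
    phase = math.atan2(i3 - i1, i0 - i2) % (2 * math.pi)
    amplitude = math.sqrt((i3 - i1) ** 2 + (i0 - i2) ** 2) / 2
    offset = (i0 + i1 + i2 + i3) / 4
    depth = C * phase / (4 * math.pi * f_mod)  # phase -> round-trip distance
    return phase, amplitude, offset, depth

# simulate the four samples for a target at 3.0 m (A=40, B=100, f=20 MHz)
f = 20e6
phi_true = 4 * math.pi * f * 3.0 / C
samples = [100 + 40 * math.cos(phi_true + k * math.pi / 2) for k in range(4)]
phase, amplitude, offset, depth = tof_depth(*samples, f_mod=f)
```

At 20 MHz the unambiguous range is c/(2f) = 7.5 m, so a 3 m target is recovered without phase wrapping.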
- the counting device 10 is provided with an illumination unit for emitting electromagnetic wave signals or infrared light.
- the illumination unit comprises one or more light emitting diodes for emitting electromagnetic wave signal.
- the illumination unit is adapted to emit electromagnetic wave signals of a plurality of wavelengths.
- the depth sensor 12 is adapted to identify an object by collating a plurality of points having depth distances of a similar depth level into object groups. By segmenting objects at various depth levels, individual objects can be isolated and separately identified, as shown in FIG. 3 and FIG. 4.
- the points of an object are considered to be of the same object group if their distances are within the tolerance range of a reference distance.
- the depth sensor is a time of flight sensor for detecting an electromagnetic wave signal reflected back from the object.
- the system can derive the object-sensor distance (D), also known as the depth.
- the system can further calculate the 3D coordinate for every point scanned as the depth sensor's focal length (f) can be adjusted by the system and is known.
- the system can then correct any radial distortion by converting the distorted values (index d) to the undistorted values (index u).
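A minimal sketch of these two steps, back-projecting each pixel with the known focal length and applying a first-order radial correction from the distorted values (index d) to the undistorted values (index u). The single distortion coefficient k1 and the camera parameters below are illustrative assumptions:

```python
def undistort(xd, yd, k1):
    """First-order radial correction mapping distorted normalized
    coordinates (index d) to approximately undistorted ones (index u)."""
    r2 = xd * xd + yd * yd
    factor = 1 + k1 * r2
    return xd * factor, yd * factor

def backproject(u, v, depth, f, cx, cy, k1=0.0):
    """Pinhole back-projection of pixel (u, v) with measured depth D,
    assuming depth is taken along the optical axis."""
    xn, yn = (u - cx) / f, (v - cy) / f   # normalized image coordinates
    xu, yu = undistort(xn, yn, k1)        # radial distortion correction
    return xu * depth, yu * depth, depth  # 3D coordinate (X, Y, Z)

X, Y, Z = backproject(u=420, v=240, depth=2.0, f=525.0, cx=320.0, cy=240.0)
```

With the principal point at (320, 240), a pixel 100 columns to the right of centre at 2 m depth back-projects to X = 2·100/525 m.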
- the depth sensor comprises an ultrasound sensor and an ultrasound emitter.
- a stereo camera or a 3D camera may be used to capture images including the depth information.
- the thermal sensor or camera captures the heat signature on the scene and then passes the data to a thermal object identification module for identifying the object with a distinct heat signature.
- Different objects emit different amounts of black-body radiation as a function of their temperature.
- the radiation emitted by a living organism is quite different from that of a non-living object.
- the heat radiated by Homo sapiens differs from that of other animals.
- the thermal sensor 14 is an infrared photodetector consisting of one or more elements that measure this radiation. These sensors may be photosensitive in the far infrared range. Accordingly, silicon or germanium lenses may be used, as glass does not transmit these wavelengths.
- the system may be implemented with a quantum well infrared photodetector which is also known as a cooled infrared detector.
- an uncooled infrared detector may be used instead, as it can be more compact and cost-effective.
- both the thermopile array and microbolometer type of detector are used to ensure an accurate result.
- with a thermopile sensor, the heat radiated from an object is absorbed by a small membrane of the sensor, raising its temperature. The temperature difference between the membrane and a thermal mass induces a change in the electric potential between the two, known as the Seebeck effect. The system records the change in voltage and derives an absolute temperature measurement of the object.
- a microbolometer has a similar structure, but instead of detecting an induced voltage it detects a change in resistance via the temperature coefficient of resistance of its sensing element.
- a thermopile array is more compact and cost-efficient.
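The voltage-to-temperature derivation for a thermopile can be sketched with an idealized Stefan-Boltzmann response, V = S·(Tobj⁴ − Tamb⁴). The sensitivity constant S below is hypothetical; real sensors require factory calibration and compensate for emissivity and self-heating:

```python
def object_temperature(v_out, t_ambient_k, s=6.0e-12):
    """Invert the idealized thermopile response V = S * (Tobj^4 - Tamb^4)
    to recover the object's absolute temperature in kelvin.
    `s` (V/K^4) is a hypothetical sensitivity constant."""
    return (v_out / s + t_ambient_k ** 4) ** 0.25

# forward-simulate the output voltage for a 310 K (body-temperature)
# object seen against a 295 K ambient, then invert it
s = 6.0e-12
v = s * (310.0 ** 4 - 295.0 ** 4)
t = object_temperature(v, 295.0, s)
```

The round trip recovers the 310 K object temperature from the recorded voltage and the known ambient temperature.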
- the counting device 10 further comprises a processing unit for determining whether an object belongs to a type of object by comparing the temperatures of an object group with one or more temperature patterns of the type of object.
- the identification data from the depth identification module and the identification data from the thermal identification module are fused together.
- the counting device of an embodiment of the present invention derives spatial thermal information for a particular time interval. This spatial thermal information is stored in a memory module of the counting device 10.
- the counting device 10 may be arranged to count all the objects within a detected area at a particular time. For example, if there are seven human like objects in the premises at time T, the counting system 10 will output seven. Similarly, at time T+1, if there are three human like objects, the counting system will output three.
- the counting system may be arranged to process the information in the images captured by each of the thermal and depth cameras.
- images captured by the other camera may provide supplementary information which may assist in determining the objects more accurately.
- the thermal image may provide additional information to the analytical engine of the depth image so that small objects can now be identified.
- the depth and thermal cameras may be arranged to perform a calibration process so as to obtain the intrinsic and extrinsic camera parameters.
- both the depth and the thermal cameras are calibrated with reference to the feature points of the calibration target.
- the feature points should be detectable and identifiable by both the depth and thermal cameras.
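Given such corresponding feature points expressed in each camera's coordinate frame, the extrinsic parameters (rotation R and translation t between the two cameras) can be estimated by a least-squares rigid alignment, for example the Kabsch algorithm. This is a generic sketch, not the patent's stated procedure, and the synthetic points are illustrative:

```python
import numpy as np

def estimate_extrinsics(pts_depth, pts_thermal):
    """Least-squares rigid transform (R, t) mapping calibration feature
    points seen by the depth camera onto the same points seen by the
    thermal camera (Kabsch algorithm). Inputs: (N, 3) corresponding points."""
    cd, ct = pts_depth.mean(0), pts_thermal.mean(0)
    H = (pts_depth - cd).T @ (pts_thermal - ct)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cd
    return R, t

# synthetic check: rotate known target points 30 degrees about z and shift
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([0.05, -0.02, 0.10])
P = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 2], [1, 1, 3.0]])
Q = P @ R_true.T + t_true
R, t = estimate_extrinsics(P, Q)
```

Four non-coplanar feature points are already enough to pin down the transform uniquely.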
- any human-like object can be identified in the premises.
- the system then thresholds the image from the thermal sensor or camera so that the head positions of human targets can be located and the human targets identified. This process is typically carried out by the processing unit of the counting device.
- the images or spatial thermal information from different frames or time intervals are then compared, and only newly identified objects are counted by the people counting system.
- the number stored in the register is increased when an object is identified in the spatial thermal information of the current time interval but was not identified in the spatial thermal information of the previous time interval.
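The register update rule amounts to a set difference between the object identities of consecutive intervals. The identifiers below are hypothetical tracking labels:

```python
def update_register(register, current_ids, previous_ids):
    """Increase the count only for objects present in the current interval's
    spatial thermal information but absent from the previous interval's,
    i.e. for newly appeared objects."""
    new_objects = set(current_ids) - set(previous_ids)
    return register + len(new_objects)

# "c" appears for the first time in the current interval, so the
# register goes from 5 to 6
count = update_register(register=5,
                        current_ids={"a", "b", "c"},
                        previous_ids={"a", "b"})
```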
- the system is adapted to obtain a 3D thermogram as a means to detect objects.
- the depth sensor data and thermal sensor data are fused.
- this process comprises the step of ascertaining the focal length of the depth sensor and calculating the 3D coordinate of every pixel using the distance measurement.
- each 3D point's temperature measurement can be derived by bilinearly interpolating between its four nearest neighbours in the thermal image. Fusing and analysing images from both sensors can yield a more accurate people counting methodology.
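The four-neighbour step is standard bilinear interpolation on the thermal image at the (generally non-integer) coordinates where a 3D point projects. The 2×2 image below is illustrative:

```python
def bilinear_temp(thermal, x, y):
    """Temperature at non-integer thermal-image coordinates (x, y),
    bilinearly interpolated between the four nearest pixels.
    `thermal` is indexed as thermal[row][col]."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    t00, t10 = thermal[y0][x0], thermal[y0][x0 + 1]
    t01, t11 = thermal[y0 + 1][x0], thermal[y0 + 1][x0 + 1]
    return (t00 * (1 - dx) * (1 - dy) + t10 * dx * (1 - dy)
            + t01 * (1 - dx) * dy + t11 * dx * dy)

img = [[20.0, 22.0],
       [24.0, 26.0]]
t = bilinear_temp(img, 0.5, 0.5)  # centre of the four pixels
```

At the exact centre the result is simply the mean of the four neighbours, 23.0.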
- the identified objects will be tracked in multiple consecutive fused frames.
- the counting device 10 has a memory module installed for storing the spatial thermal information of consecutive time intervals.
- the invention may be advantageous in that no video camera is installed or used in the system.
- the depth sensor and the thermal sensor do not directly record the appearance of an object, and hence no facial or colour images are recorded. As both sensors reveal no detailed facial appearance of the object being detected, privacy is entirely protected.
- the recording is therefore suitable for submission as evidence in law enforcement.
- any appropriate computing system architecture may be utilised. This will include standalone computers, network computers and dedicated hardware devices.
- where the terms "computing system" and "computing device" are used, they are intended to cover any appropriate arrangement of computer hardware capable of implementing the functions described.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Toxicology (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Image Processing (AREA)
Abstract
Description
-
- a register for storing a number of counts of the objects in the area;
- a depth sensor for measuring depth distances between a plurality of points of an object and the depth sensor;
- a thermal sensor for measuring temperatures of the points of the object;
- wherein the distances and the temperatures are used to derive spatial thermal information of the object for determining an increase in the number stored in the register.
-
- collecting distance data of one or more object in the area with a depth sensor, such that an object in the area is identified;
- collecting temperature data of the object in the area;
- combining the distance data and temperature data to derive a spatial thermal information for determining the object belongs to a type of objects;
- increasing a number stored in a register in the event that the spatial thermal information of the object is not found in the spatial information of objects in a previous time interval.
-
- (i) collecting distance data of one or more object in the area with a depth sensor, such that an object in the area is identified;
- (ii) collecting temperature data of the object in the area;
- (iii) combining the distance data and temperature data to derive a spatial thermal information for determining the object belongs to a type of objects;
- (iv) increasing a number stored in a register in the event that the spatial thermal information of the object is not found in the spatial information of objects in a previous time interval.
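Steps (i)-(iv) can be combined into a single per-interval update. The temperature range, object identifiers and data shapes below are illustrative assumptions, not values from the claims; in particular, the human temperature threshold stands in for whatever temperature pattern of the object type the processing unit actually uses:

```python
HUMAN_TEMP_RANGE = (303.0, 312.0)  # kelvin; illustrative pattern only

def count_step(register, depth_objects, temps, previous_ids):
    """One counting interval.
    depth_objects: {object_id: [(x, y, depth), ...]} identified by the
    depth sensor (steps i-ii have already grouped the points).
    temps: {object_id: mean surface temperature} from the thermal sensor."""
    lo, hi = HUMAN_TEMP_RANGE
    # step (iii): fuse depth groups with temperatures and keep only
    # objects matching the target type's temperature pattern
    current_ids = {oid for oid in depth_objects
                   if lo <= temps.get(oid, 0.0) <= hi}
    # step (iv): count only objects absent from the previous interval
    register += len(current_ids - previous_ids)
    return register, current_ids

reg, seen = count_step(
    register=0,
    depth_objects={"p1": [(0, 0, 1.2)], "p2": [(4, 4, 2.0)],
                   "cart": [(7, 7, 1.5)]},
    temps={"p1": 310.0, "p2": 309.5, "cart": 295.0},
    previous_ids={"p1"},
)
```

Here the cold "cart" object is rejected at the fusion step, and only the newly appeared person "p2" increments the register.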
Claims (19)
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2018/093676 WO2020000368A1 (en) | 2018-06-29 | 2018-06-29 | Electronic device and method for counting objects |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210271895A1 US20210271895A1 (en) | 2021-09-02 |
| US11836984B2 true US11836984B2 (en) | 2023-12-05 |
Family
ID=68984581
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/253,293 Active 2039-07-09 US11836984B2 (en) | 2018-06-29 | 2018-06-29 | Electronic device and method for counting objects |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US11836984B2 (en) |
| CN (1) | CN112771534A (en) |
| WO (1) | WO2020000368A1 (en) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP4010782A4 (en) * | 2019-08-05 | 2023-07-19 | Tellus You Care, Inc. | Non-contact identification of multi-person presence for elderly care |
| US12067052B2 (en) * | 2020-01-03 | 2024-08-20 | AlgoLook, Inc. | Air particulate classification |
| CN116028962B (en) * | 2023-03-27 | 2023-06-13 | 联通(四川)产业互联网有限公司 | Real-time online data security compliance supervision method, device and storage medium |
Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5877688A (en) * | 1995-04-12 | 1999-03-02 | Matsushita Electric Industrial Co., Ltd. | Thermal object measuring apparatus |
| US20110116055A1 (en) * | 2009-11-18 | 2011-05-19 | Seiko Epson Corporation | Image forming apparatus |
| CN102622588A (en) | 2012-03-08 | 2012-08-01 | 无锡数字奥森科技有限公司 | Dual-certification face anti-counterfeit method and device |
| CN102881239A (en) | 2011-07-15 | 2013-01-16 | 鼎亿数码科技(上海)有限公司 | Advertisement playing system and method based on image identification |
| US20130088422A1 (en) * | 2011-10-05 | 2013-04-11 | Sony Corporation | Input apparatus and input recognition method |
| US20130182905A1 (en) * | 2012-01-17 | 2013-07-18 | Objectvideo, Inc. | System and method for building automation using video content analysis with depth sensing |
| US20140368615A1 (en) * | 2013-06-12 | 2014-12-18 | Disney Enterprises, Inc. | Sensor fusion for depth estimation |
| US20160191327A1 (en) | 2013-08-13 | 2016-06-30 | Zte Corporation | Method, Device, System for Detecting Data Link, Controller and Gateway |
| WO2016139203A1 (en) | 2015-03-04 | 2016-09-09 | Thyssenkrupp Elevator Ag | Multi camera load estimation |
| US20170156673A1 (en) * | 2015-12-07 | 2017-06-08 | Panasonic Corporation | Living body information measurement device, living body information measurement method, and storage medium storing program |
| CN106934752A (en) | 2017-03-07 | 2017-07-07 | 高剑 | A kind of KXG based on bus |
| US20180106598A1 (en) * | 2016-10-14 | 2018-04-19 | Fujitsu Limited | Distance measuring apparatus and distance measuring method |
| US20190362507A1 (en) * | 2017-02-24 | 2019-11-28 | Flir Systems, Inc. | Real-time detection of periodic motion systems and methods |
| US10514256B1 (en) * | 2013-05-06 | 2019-12-24 | Amazon Technologies, Inc. | Single source multi camera vision system |
| US20200256734A1 (en) * | 2017-07-20 | 2020-08-13 | Goertek Inc. | Method and Device for Measuring Body Temperature and Smart Apparatus |
| US10982871B2 (en) * | 2016-06-03 | 2021-04-20 | Mitsubishi Electric Corporation | Equipment control device and method, utilizing basal metabolism data calculated from estimated characteristics of a person based on detected visible light image data and corresponding thermal image data |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101985674B1 (en) * | 2012-09-18 | 2019-06-04 | 삼성전자 주식회사 | Method of recognizing contactless user interface motion and System there-of |
-
2018
- 2018-06-29 CN CN201880095219.0A patent/CN112771534A/en active Pending
- 2018-06-29 US US17/253,293 patent/US11836984B2/en active Active
- 2018-06-29 WO PCT/CN2018/093676 patent/WO2020000368A1/en not_active Ceased
Patent Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5877688A (en) * | 1995-04-12 | 1999-03-02 | Matsushita Electric Industrial Co., Ltd. | Thermal object measuring apparatus |
| US20110116055A1 (en) * | 2009-11-18 | 2011-05-19 | Seiko Epson Corporation | Image forming apparatus |
| CN102881239A (en) | 2011-07-15 | 2013-01-16 | 鼎亿数码科技(上海)有限公司 | Advertisement playing system and method based on image identification |
| US20130088422A1 (en) * | 2011-10-05 | 2013-04-11 | Sony Corporation | Input apparatus and input recognition method |
| US20130182905A1 (en) * | 2012-01-17 | 2013-07-18 | Objectvideo, Inc. | System and method for building automation using video content analysis with depth sensing |
| CN102622588A (en) | 2012-03-08 | 2012-08-01 | 无锡数字奥森科技有限公司 | Dual-certification face anti-counterfeit method and device |
| US10514256B1 (en) * | 2013-05-06 | 2019-12-24 | Amazon Technologies, Inc. | Single source multi camera vision system |
| US20140368615A1 (en) * | 2013-06-12 | 2014-12-18 | Disney Enterprises, Inc. | Sensor fusion for depth estimation |
| US20160191327A1 (en) | 2013-08-13 | 2016-06-30 | Zte Corporation | Method, Device, System for Detecting Data Link, Controller and Gateway |
| WO2016139203A1 (en) | 2015-03-04 | 2016-09-09 | Thyssenkrupp Elevator Ag | Multi camera load estimation |
| US20170156673A1 (en) * | 2015-12-07 | 2017-06-08 | Panasonic Corporation | Living body information measurement device, living body information measurement method, and storage medium storing program |
| US10982871B2 (en) * | 2016-06-03 | 2021-04-20 | Mitsubishi Electric Corporation | Equipment control device and method, utilizing basal metabolism data calculated from estimated characteristics of a person based on detected visible light image data and corresponding thermal image data |
| US20180106598A1 (en) * | 2016-10-14 | 2018-04-19 | Fujitsu Limited | Distance measuring apparatus and distance measuring method |
| US20190362507A1 (en) * | 2017-02-24 | 2019-11-28 | Flir Systems, Inc. | Real-time detection of periodic motion systems and methods |
| CN106934752A (en) | 2017-03-07 | 2017-07-07 | 高剑 | A kind of KXG based on bus |
| US20200256734A1 (en) * | 2017-07-20 | 2020-08-13 | Goertek Inc. | Method and Device for Measuring Body Temperature and Smart Apparatus |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2020000368A1 (en) | 2020-01-02 |
| CN112771534A (en) | 2021-05-07 |
| US20210271895A1 (en) | 2021-09-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9489565B2 (en) | Image processing apparatus, image processing method, program, and image processing system | |
| US9224278B2 (en) | Automated method and system for detecting the presence of a lit cigarette | |
| US8170278B2 (en) | System and method for detecting and tracking an object of interest in spatio-temporal space | |
| Abuarafah et al. | Real-time crowd monitoring using infrared thermal video sequences | |
| US11836984B2 (en) | Electronic device and method for counting objects | |
| Günter et al. | Privacy-preserving people detection enabled by solid state LiDAR | |
| WO2020144367A1 (en) | Detection and identification systems for humans or objects | |
| WO2011054971A2 (en) | Method and system for detecting the movement of objects | |
| CN112541403B (en) | Indoor personnel falling detection method by utilizing infrared camera | |
| CN112654897B (en) | Multi-sensor theft/threat detection system for crowd pre-screening | |
| US20070090295A1 (en) | Image processing system | |
| Oppliger et al. | Sensor fusion of 3d time-of-flight and thermal infrared camera for presence detection of living beings | |
| CN118258399A (en) | A method for indoor human detection based on multi-sensor fusion | |
| CN107274396B (en) | Device for counting number of people | |
| US11734834B2 (en) | Systems and methods for detecting movement of at least one non-line-of-sight object | |
| CN112347920A (en) | A kind of intelligent people flow statistics collection system in exhibition hall area based on neural network | |
| HK40046212A (en) | Electronic device and method for counting objects | |
| CN111414967A (en) | Method for improving robustness of temperature measurement system and monitoring system | |
| Zhang et al. | Fast crowd density estimation in surveillance videos without training | |
| JPH07209309A (en) | Method and apparatus for detecting heat source moving speed and heat source distribution | |
| Armitage et al. | Measuring pedestrian trajectories with low cost infrared detectors: preliminary results | |
| Jędrasiak et al. | The comparison of capabilities of low light camera, thermal imaging camera and depth map camera for night time surveillance applications | |
| Morinaka et al. | Human information sensor | |
| Logu et al. | Real‐Time Mild and Moderate COVID‐19 Human Body Temperature Detection Using Artificial Intelligence | |
| Wang et al. | Pedestrian Flow Monitoring System for Commercial Areas Based on Deep Learning and QT |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| AS | Assignment |
Owner name: LOGISTICS AND SUPPLY CHAIN MULTITECH R&D CENTRE LIMITED, HONG KONG Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TONG, CHI HUNG;REEL/FRAME:065431/0804 Effective date: 20231019 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |