
WO2025106961A1 - Impression-based monitoring of an individual using non-invasive and contactless sensors - Google Patents


Info

Publication number
WO2025106961A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
impression
sensors
storage medium
readable storage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/056365
Other languages
English (en)
Inventor
Kelly W. BOLES
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mayo Foundation for Medical Education and Research
Mayo Clinic in Florida
Original Assignee
Mayo Foundation for Medical Education and Research
Mayo Clinic in Florida
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mayo Foundation for Medical Education and Research and Mayo Clinic in Florida
Publication of WO2025106961A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022Monitoring a patient using a global network, e.g. telephone networks, internet
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/746Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms

Definitions

  • Existing wellness monitoring systems are typically employed to determine whether an adverse or emergent condition is present and do not necessarily convey normal states of wellness. They often embody one or more limitations, including: direct-contact sensors such as "wearables" or pendants; being functionally invasive, including explicit identification; preferring adherence to a rote schedule; inefficient power management; the use of large data payloads; a technically complex user experience that introduces challenges for subjects with diminished cognitive ability; and the requirement for the subject to engage with the monitoring system. This last scenario forces the subject to continually confront the premise for being monitored, which can lead to psychological and emotional distress.
  • the impression-based data include measurements of the environment that indicate an impression of a monitored subject within the environment.
  • the impression-based data may be highly compressed, such as by using a binary coding that may include using a Cayenne Low-Power Payload encoding protocol.
  • the method also includes generating, by the processor, feature data by analyzing the impression-based data to detect patterns within the impression-based data.
  • the method also includes generating, by the processor, a report indicating monitoring of the monitored subject based on the feature data.
  • the method includes acquiring impression-based data from an environment with at least one sensor, where the impression-based data comprise measurements of the environment that indicate an impression of a monitored subject within the environment.
  • the impression-based data are compacted and/or compressed by a processor of the at least one sensor, where the impression-based data are compressed using a binary coding.
  • the compressed impression-based data are then transmitted from the at least one sensor to a computer system using a wireless protocol.
  • a report indicating monitoring of the monitored subject based on the compressed impression-based data is then generated by the computer system.
  • FIG. 1 shows an example impression-based monitoring system in accordance with some examples described in the present disclosure.
  • FIG. 2 is a block diagram illustrating a dataflow for an example impression-based monitoring system.
  • FIG. 3 is a flow chart setting forth the steps of an example method for monitoring a subject using an impression-based monitoring system.
  • FIG. 4 illustrates an example report that can be generated and presented to a user.
  • the report can be presented via a graphical user interface (GUI) as a dashboard that indicates various measurements from the impression-based data in addition to feature data indicating the identification of patterns or other deterministic outcomes in the impression-based data.
  • FIG. 5 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes CO2 measurement data, temperature measurement data, and humidity measurement data.
  • FIG. 6 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes pulse measurement data and respiratory measurement data of the subject.
  • FIG. 7 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes various data sources, including pulse measurement data, respiratory measurement data, environmental temperature data, environmental humidity data, and location data.
  • FIG. 8 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes visible + IR / visible light data, gas resistance data, and barometric pressure data.
  • FIG. 9 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes CO2 data, including a histogram of CO2 measurements, motion data, and presence data.
  • FIG. 10 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes various different datatypes.
  • FIG. 11 illustrates the temperature and humidity data from the dashboard shown in FIG. 10.
  • the measured temperature and humidity data indicate a heat event anomaly that occurred in the environment.
  • FIG. 12 illustrates an example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data include temporally correlated sensor data.
  • FIG. 13 is a block diagram of an example impression-based monitoring system.
  • FIG. 14 is a block diagram of example components that can implement the system of FIG. 13.
  • Described here are systems and methods for continuously and unobtrusively monitoring subjects of concern, such as elderly persons or any person with diminished physical and/or cognitive states, for example during rehabilitation and recovery.
  • Smart sensors are used to measure environmental and/or biological impressions (e.g., temperature, CO2 and other gaseous levels, humidity level, time-of-flight presence recognition, electrostatic fields, etc.) of the subject's environment, as well as vital signs without any contact (e.g., respiration and heart rates).
  • a baseline of normal daily activities for the subject is established, and then the impression-based data are analyzed to flag anomalous patterns.
  • thresholds associated with anomalous event triggers can be manually or dynamically configured.
  • the objective with these anomalous event triggers is the assignment of appropriate sensitivity levels to detect the potential anomalous events.
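  • As a concrete illustration only (the disclosure does not prescribe a specific algorithm), an anomalous event trigger with a configurable sensitivity level might be sketched in Python as follows; the window length and the multiplier k are hypothetical configuration parameters.

```python
from collections import deque

class AnomalyTrigger:
    """Rolling-baseline anomaly trigger with a configurable sensitivity level.

    Hypothetical sketch: flags a reading as anomalous when it deviates from
    the recent baseline mean by more than k standard deviations.
    """

    def __init__(self, window: int = 288, k: float = 3.0):
        self.readings = deque(maxlen=window)  # e.g., 24 h of 5-minute samples
        self.k = k                            # sensitivity (manually or dynamically set)

    def update(self, value: float) -> bool:
        history = list(self.readings)
        self.readings.append(value)
        if len(history) < 30:                 # wait for a usable baseline
            return False
        mean = sum(history) / len(history)
        var = sum((x - mean) ** 2 for x in history) / len(history)
        std = var ** 0.5 or 1e-9
        return abs(value - mean) > self.k * std


# Example: a sudden CO2 spike against a roughly 400 ppm baseline
trigger = AnomalyTrigger(window=288, k=3.0)
for ppm in [400, 410, 405, 398, 402] * 10 + [1500]:
    if trigger.update(ppm):
        print(f"anomalous reading: {ppm} ppm")
```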
  • No user input from the monitored person is typically required.
  • the acquired data may be pre-processed at the level of the sensors themselves (i.e., using edge computing), and may be highly compressed, resulting in very long battery life for battery-powered sensors and the need for only low-bandwidth data transfer networks.
  • Some sensors may be bidirectional, enabling them to remotely control elements within the monitored subject’s home.
  • the systems and methods are flexible and robust with respect to sensor use, malfunction, and upgrades, including over the air (OTA) updates, where applicable.
  • the system can distinguish and track several subjects (e.g., people, animals) in the same residence through the use of a singularization process, which is an enhanced form of instance-level recognition, to discern one or more subjects of interest from multiple different subject instances.
  • singularization includes discerning and/or emphasizing something (e.g., one subject) as distinct from the surrounding elements or group (e.g., the environment, other subjects).
  • the acquired data may be inherently de-identified, such that the data can be transported in an inherently de-identified manner.
  • data may be re-identified by a retrieving application, or the like.
  • the sensors used by the system to acquire impression-based data allow for a high compaction and/or compression rate of data, coupled with a low-power data transfer protocol, thereby enabling very long battery life (e.g., many years).
  • pulse and respiration data may be compressed down to as few as 8 data bytes.
  • This compression helps to democratize the disclosed systems and methods, since with the higher levels of data compression there is a significantly reduced need for an expensive, consumption-based bandwidth.
  • such compression can be achieved by using binary coding instead of traditional JSON objects.
  • the compression can be achieved by using a Cayenne Low-Power Payload encoding protocol.
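  • For illustration, a Cayenne LPP frame packs each measurement as a one-byte channel, a one-byte type code, and a short fixed-width value. The sketch below encodes pulse and respiration as two LPP analog-input fields (type 0x02, two bytes at 0.01 resolution), yielding an 8-byte payload consistent with the figure quoted above; mapping vital signs onto analog-input channels is an assumption made for this example, not a mapping specified in the disclosure.

```python
import struct

ANALOG_INPUT = 0x02  # Cayenne LPP "analog input": 2 bytes, signed, 0.01 resolution

def lpp_analog(channel: int, value: float) -> bytes:
    """Encode one analog-input field: [channel][type][int16 of value*100, big-endian]."""
    return struct.pack(">BBh", channel, ANALOG_INPUT, round(value * 100))

# Pulse on channel 1, respiration on channel 2 -> 8 bytes total
payload = lpp_analog(1, 72.0) + lpp_analog(2, 16.0)
print(payload.hex(), len(payload))  # '01021c2002020640' 8
```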
  • the disclosed system also enables very efficient battery utilization.
  • the sensors could be installed in the residence by an untrained person, or in some cases may be self-provisioned.
  • the disclosed system makes use of a connected network of sensors, the system is robust to one or more sensors malfunctioning or otherwise coming offline, since additional sensors in the sensor network can be utilized to still collect impression-based data. Additionally or alternatively, the multiple sensors can be utilized to escalate the collection of impression-based data when an anomalous pattern is detected in the impression-based data. For instance, when an anomalous pattern is detected in the impression-based data, one or more typically dormant sensors can be brought online to acquire additional data. As still another advantage of using a network of sensors, multiple different subjects can be independently monitored and distinguished in the same environment by combining and analyzing the measurements of different sensors.
  • the systems and methods described in the present disclosure improve upon, and address capability gaps associated with, traditional monitoring systems for elderly and other subjects, including panic button lanyards, smart wearables, smart devices, home assistants, and video-based approaches.
  • the disclosed systems and methods determine human viability and/or wellness through precise, non-invasive biological and environmental measures that recognize impressions (e.g., biological signatures): the signals of viable life that a person or associated objects inherently introduce into their surroundings as a normal state of being.
  • Biological impressions can recognize and/or convey the need for medical care, or serve to supplement traditional primary and acute care methodologies.
  • the system may serve to augment, or otherwise replace, traditional in-person wellness checks conducted by law enforcement, or comparable agencies.
  • An adverse incident by definition signifies that an abnormal condition has occurred; however, abnormal should be referenced to a baseline of normalcy.
  • the disclosed systems and methods are capable of determining and reporting normal states of biological wellness, which should represent the prevalent state of a subject.
  • the disclosed systems and methods are able to recognize and produce normalcy patterns, for instance, by applying Interpretive Phenomenological Analysis to sensory data, along with modeling premised upon establishing cloud-resident digital twins that support real-time analysis.
  • the systems and methods described in the present disclosure are advantageous for monitoring elderly persons for elder care purposes. Additionally or alternatively, the disclosed systems and methods may be used to monitor subjects who have diminished physical and/or cognitive abilities, such as for in-home treatments, rehabilitation, and recovery. The ability for a patient to receive medical treatments and/or recover at home while maintaining normal daily living could be facilitated by having effective monitoring in place. Bilateral benefit can be achieved as fewer clinical appointments and/or earlier hospital discharges can emerge as viable options.
  • the impression-based monitoring system 100 uses non-intrusive intelligent sensors 102 that can be transparent to the monitored subject and convey the wellness state of a person based upon biological and environmental impressions with a high degree of confidence.
  • the monitored subject does not need to own or be able to operate a smart device, charge device batteries, wear a device, calibrate a device, switch devices on/off, link smart devices, be responsive to or assessed against a fixed schedule, and/or interact with devices (not required, but interactive sensors could be optionally incorporated).
  • the impression-based monitoring system minimizes the requirements placed on the monitored subject and increases reliability while minimizing intrusion and any perceived loss of privacy, dignity, or autonomy.
  • the monitored subject will not be explicitly identified; that is, the data collected by the sensors 102 will be de-identified.
  • the impression-based monitoring system 100 can function using any conventional cellular or broadband Internet Service Provider (ISP).
  • an efficient low-power, long-range networking approach may alternatively be utilized, such as LoRaWAN (long-range, low-power wide area network) and/or NB-IoT (narrowband Internet of Things).
  • LoRaWAN is based on the LoRa modulation protocol and associated infrastructure.
  • the sensors 102 include sensors and devices used in relation to the operation of sensors.
  • the sensors 102 can include environmental sensors, physiological sensors, optical sensors, particulate matter sensors, gaseous sensors, electrostatic sensors, radar sensors, LIDAR sensors, spectroscopic sensors, infrared sensors, chemical sensors (including those that measure volatile organic compounds (VOCs)), and so on.
  • Each sensor 102 may be the same sensor or may be different sensors. Accordingly, each sensor 102 may be configured to measure one or more specific parameters (e.g., temperature, humidity, subject motion, subject presence, etc.) that individually or collectively provide impression-based data of a monitored subject and/or their environment.
  • the sensors 102 illustrated in the impression-based monitoring system 100 are representative examples.
  • the data acquired with the sensors 102 may be inherently de-identified.
  • the resolution of data acquired by the sensors 102 may be constrained so as not to provide enough detail to identify the monitored subject, thereby rendering the data inherently de-identified.
  • machine learning can be implemented for pattern learning and inferencing at the edge for selected sensors 102 to enable data localization restrictions and/or data privacy.
  • the sensors 102 may include environmental sensors, physiological sensors, optical sensors (e.g., cameras, video cameras), electrostatic sensors, chemical sensors, and so on.
  • environmental sensors may include sensors that measure temperature, humidity, carbon dioxide (CO2), and so on, of the environment.
  • the environmental impressions generated by human activity and objects can be accurately discerned and patterned using such environmental sensors.
  • the fresh air concentration of CO2 is approximately 400 parts per million (ppm), or 0.04% CO2 in air by volume.
  • An example of a discernable object would be a gas range and/or oven, as elevated levels of humidity and CO2 are introduced into the environment when these appliances are used.
  • Physiological sensors may include sensors that measure pulse, respiration, presence, motion, position, falls, and so on, of the subject.
  • the physiological sensors may include frequency modulated continuous wave (FMCW) radar sensors that can capture pulse, respiration, presence, motion, position, falls, and so on.
  • the sensors 102 may include thermal imaging sensors that can detect the presence, motion, and physical contours of objects in an environment.
  • the sensors can measure black-body radiation and/or gray-body radiation.
  • the sensors 102 may also include cameras and/or time-of-flight (ToF) sensors that capture images, video, subject contours, or the like.
  • artificial intelligence (AI) and/or machine learning (ML) enhanced edge entity recognition can be implemented based on data recorded by such sensors.
  • edge AI/ML equipped sensors can be used to locally analyze video captures and distinguish and classify, in a de-identified manner, entities within the environment, such as "human," "cat," "dog," and the like. Recognizing the presence of an entity without explicitly identifying an entity may be referred to as an instance-level recognition mode. Presence recognition can be used within the singularization process.
  • some ToF sensors may provide a plurality of different zones of ToF sensor distance measurements as feature data. By exposing the supporting histogram for each of these zones, AI/ML algorithms may be applied directly on compact normalized histograms from the ToF sensors to develop tailored feature data sets.
  • LiDAR or other ToF sensors may employ edge inferencing (e.g., by using a locally stored machine learning model) and, for example, can return zonal measurements in millimeters.
  • a LiDAR or other ToF sensor may expose raw data captures in the form of compact normalized histograms, which support tailored, customized inferencing of feature data. This capability further improves the ability to achieve de-identified singularization described above.
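  • A hedged sketch of how a compact normalized histogram from one ToF zone might drive lightweight edge inferencing is shown below; the 16-bin layout and the nearest-centroid classifier are illustrative assumptions, not the sensor's documented interface.

```python
import numpy as np

def normalize_histogram(raw_bins: np.ndarray) -> np.ndarray:
    """Convert raw per-zone photon counts into a compact normalized histogram."""
    total = raw_bins.sum()
    return raw_bins / total if total else raw_bins

# Hypothetical per-class centroids learned offline, one normalized 16-bin histogram each.
CENTROIDS = {
    "empty":  np.full(16, 1 / 16),
    "person": np.r_[np.zeros(6), np.array([0.1, 0.3, 0.4, 0.2]), np.zeros(6)],
}

def classify_zone(raw_bins: np.ndarray) -> str:
    """Nearest-centroid inference applied directly on the normalized histogram."""
    h = normalize_histogram(raw_bins)
    return min(CENTROIDS, key=lambda label: np.linalg.norm(h - CENTROIDS[label]))

zone = np.array([0, 0, 0, 0, 0, 0, 12, 35, 44, 20, 0, 0, 0, 0, 0, 0], dtype=float)
print(classify_zone(zone))  # -> "person"
```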
  • Electrostatic sensors may include charge variation (QVAR) electrostatic sensors that are capable of measuring human and environmental electrostatic fields to determine presence, motion, and touch.
  • the sensors 102 may also include ToF ranging sensors.
  • ToF ranging sensors may be used for human presence detection, obstacle detection, gesture recognition, laser liquid-level recognition, liquid intake detection, and/or urine output detection.
  • sensors such as FMCW sensors, QVAR sensors, and ToF sensors can measure and detect the presence of a monitored person without motion (i.e., the presence of the monitored person can be detected even if they are not moving).
  • the sensors 102 may include optical spectroscopy sensors that measure the subject’s pulse, respiration, and/or hydration.
  • Air flow sensors may also be used to ascertain proper ventilation and/or to capture adverse events such as doors and/or windows being open during inclement weather conditions.
  • volatile organic compound sensors may be utilized by the system to detect the presence of urine that may be associated with human or animal incontinence, or other air-quality wellness related concerns present in the environment.
  • urine with an elevated odor of ammonia can convey potential dehydration of the monitored subject. Dehydration could be further confirmed in these instances using spectroscopic sensors.
  • the sensors 102 may include gaseous sensors that are capable of detecting gases or gaseous compounds in the environment. Such sensors may be capable of detecting carbon monoxide, carbon dioxide, ammonia, radon, VOCs, formaldehyde, nitrogen dioxide, sulfur dioxide, methane, ozone, dimethyl sulfide, and the like.
  • the sensors 102 may include particulate matter sensors to detect allergens or other factors that could contribute to respiratory concerns or adverse environmental conditions.
  • particulate matter sensors may detect and/or characterize solid particles and/or liquid droplets suspended in the air of the environment.
  • the particulate matter sensors may sense PM10 particulate matter, PM2.5 particulate matter, or the like.
  • Such sensors may be capable of sensing and/or characterizing smoke (e.g., tobacco smoke, smoke from a wood-burning fireplace, candle smoke, incense), cooking emissions (e.g., cooking activities with an open-flame gas stove), household cleaning products (e.g., aerosol sprays), pesticides, insecticides, pet dander, mold spores, dust mites, furniture and/or building materials (e.g., formaldehyde and other chemicals released from particleboard or pressed wood products; asbestos; etc.), and the like.
  • the sensors 102 may include one or more dimethyl sulfide detectors.
  • Dimethyl sulfide detectors can be used to detect off-gassing produced by a decomposing body, which may provide an indication that a monitored subject is deceased. These sensors facilitate rapid discovery of a deceased individual, which can minimize psychological trauma.
  • sensors that may be incorporated into the impression-based monitoring system include pulse oximeter sensors, infrared sensors (e.g., to measure body temperature, subject motion), accelerometers (e.g., to measure motion, direction), global navigation satellite system (GNSS) receivers configured to receive signals from GNSS satellites (e.g., GPS data) to provide location data and timestamps for temporally indexed impression-based data, intelligent microphones, and so on.
  • the sensors 102 may include flow sensors, moisture sensors (e.g., to measure soiling of bedding, plant watering (signifying human activity), etc.), light sensors (to monitor room occupancy), traditional security sensors (e.g., door contacts, glass break detectors, pressure pads, etc.), and so on.
  • the sensors 102 may include wireless sensors that are connected over the network 108, or connected together to form a mesh network of sensors 102, for example.
  • the sensors 102 may also be configured for edge computing.
  • one or more of the sensors 102 may include a sensor electronic control assembly having a sensor electronic processor, a sensor memory, and a transceiver (also referred to as a sensor transceiver).
  • the sensor electronic processor may receive impression-based data recorded by the sensor 102, and may also receive (e.g., via the sensor transceiver) impressionbased data from other sensors 102.
  • the sensor electronic processor may thus be configured to receive impression-based data recorded by the sensor or transmitted by other sensors 102, store the impression-based data in the sensor memory, process impression-based data, and/or transmit impression-based data or processed impression-based data (e.g., via the sensor transceiver).
  • the sensor electronic processor and sensor memory may collectively form a sensor electronic controller that is configured to perform certain methods described herein (e.g., the process of FIG. 3).
  • the sensors 102 may have increased computational capabilities and can be equipped with microprocessors that allow them to perform complex calculations and analysis on the data they collect. This enables sensors to perform advanced signal processing, and pattern recognition, as well as to implement machine learning and/or other artificial intelligence algorithms, allowing them to extract valuable insights from the data they collect. In this way, edge processing can be employed on the sensors 102 to minimize the size of data payloads and the frequency at which data transfers occur.
  • the sensors 102 can be equipped with faster and more efficient processors, allowing them to process data much more quickly.
  • the sensors 102 can collect and analyze large amounts of data in realtime, enabling more responsive and precise monitoring and control.
  • Payload encoding approaches such as Cayenne Low Power Payload (LPP), or other similar encoding methodologies, can be used to further amplify the efficiency of the sensors 102 as sensory data can be effectively transferred using just a few bytes versus conventional JSON objects commonly used for data exchange.
  • the external device 104 is in communication with the sensors 102 and is configured to receive impression-based data (e.g., environmental data, physiological data, subject motion data, subject presence data, etc.) generated by the sensors 102.
  • the external device 104 can receive the impression-based data by retrieving the impression-based data from a database or other storage device or medium.
  • the external device 104 can retrieve the impression-based data from a server (e.g., server 106) and/or server application.
  • the external device 104 may include, for example, a smartphone, a tablet computer, a cellular phone, a laptop computer, a smart watch, a heads-up display, virtual reality ("VR") goggles, augmented reality ("AR") goggles, and the like.
  • the external device 104 may include a gateway or hub device.
  • the impression-based data generated by a sensor 102 can be detected by or shared with other sensors 102.
  • the impression-based data can also be additionally or alternatively shared with the external device 104.
  • the sensors 102 communicate with the external device 104, for example, to transmit at least a portion of the impression-based data stored on or collected by the sensors 102.
  • the external device 104 may include a long-range transceiver to communicate with the server 106 and/or a short-range transceiver to communicate with other external devices or with the sensors 102 via, for example, a long-range communication protocol such as LoRa®, or shorter-range protocols such as Bluetooth® or Wi-Fi®.
  • the external device 104 bridges the communication between the sensors 102 and the server 106. For example, the sensors 102 may transmit data to the external device 104, and the external device 104 may forward the data from the sensors 102 to the server 106 over the network 108.
  • the external device 104 may include a device electronic control assembly having a device electronic processor, a device memory, and the aforementioned transceiver (also referred to as a device transceiver).
  • the device electronic processor may be coupled to detectors configured to receive the noted types of wireless communications that the sensors 102 may transmit.
  • the device electronic processor may be configured to receive the communications transmitted by the sensors 102 via the detector(s), process the communications, store the communications in the device memory, and/or transmit the communications (e.g., via the device transceiver).
  • the device electronic processor and device memory may collectively form a device electronic controller that is configured to perform certain methods described herein (e.g., the process of FIG. 3).
  • the server 106 includes a server electronic control assembly having a server electronic processor, a server memory, and a transceiver.
  • the transceiver allows the server 106 to communicate with the external device 104.
  • the server electronic processor receives impression-based data and/or processed impression-based data from the sensors 102 (e.g., via the external device 104, directly from one or more of the sensors 102), and stores the received data in the server memory.
  • the server 106 may maintain a database (e.g., on the server memory) for containing impression-based data, trained machine learning controls (e.g., trained machine learning model and/or algorithms), artificial intelligence controls (e.g., rules and/or other control logic implemented in an artificial intelligence model and/or algorithm), and the like.
  • the server electronic processor and server memory may collectively form a server electronic controller that is configured to perform certain methods described herein (e.g., the process of FIG. 3).
  • the server 106 may be a distributed device in which the server electronic processor and server memory are distributed among two or more units that are communicatively coupled (e.g., via the network 108).
  • the network 108 may be a long-range wireless network such as a broadband ISP network, a cellular network, a local area network (LAN), a wide area network (WAN), a personal area network (PAN), a low-power wide area network (LPWAN), or a combination thereof.
  • the network 108 may be a short-range wireless communication network, and in yet other embodiments, the network 108 may be a wired network using, for example, MODBUS remote terminal unit (RTU) differential-pair cabling and/or USB cables.
  • the network 108 may include a combination of long-range, short-range, and/or wired connections. In some embodiments, the network 108 may include both wired and wireless devices and connections.
  • cellular-based NB-IoT may be used as a radio access technology (RAT) to facilitate communications between end devices and an NB-IoT radio access network (RAN), connecting eventually to a cellular core network.
  • data encoding methods defined by the 3rd Generation Partnership Project (3GPP) may be used in conjunction with NB-IoT.
  • the sensors 102 communicate using a LoRa network (e.g., a LoRaWAN network).
  • LoRa is a low-power, long-range wireless communication protocol designed for Internet of Things (IoT) applications. Global frequency allocations are covered in the LoRa Alliance's Regional Parameters documentation to enable communications between end devices and a gateway or base station, allowing for long-range communication with low power consumption.
  • the LoRa protocol utilizes a chirp spread spectrum (CSS) modulation technique to achieve long-range communication with low power consumption.
  • in CSS modulation, the data are modulated onto a chirp waveform whose frequency increases or decreases linearly over time.
  • LoRa parameterizes CSS with a spreading factor (SF), which controls the rate at which the chirp frequency changes over time.
  • a higher SF value results in a slower chirp rate and longer transmission time, which increases the range and robustness of the communication.
  • it also decreases the data rate and increases the power consumption.
  • because LoRa uses CSS modulation and supports higher SF values, it is able to achieve long-range, low-power communication suitable for applications such as IoT devices, smart cities, and industrial automation.
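  • The range versus data-rate tradeoff can be made concrete with the standard LoRa symbol-time relation Tsym = 2^SF / BW. The short sketch below compares symbol time and approximate raw bit rate across spreading factors at a 125 kHz bandwidth (a common LoRaWAN setting); the coding-rate factor is omitted for simplicity.

```python
BW_HZ = 125_000  # typical LoRaWAN channel bandwidth

for sf in range(7, 13):
    t_sym = (2 ** sf) / BW_HZ            # symbol duration in seconds
    bit_rate = sf * (BW_HZ / (2 ** sf))  # raw bit rate (bits/s), ignoring coding rate
    print(f"SF{sf}: symbol time {t_sym * 1e3:6.2f} ms, raw rate {bit_rate:7.0f} bit/s")
# SF7 yields ~1.02 ms symbols at ~6836 bit/s; SF12 yields ~32.77 ms symbols at ~366 bit/s,
# illustrating why higher SF extends range but lowers data rate and raises airtime.
```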
  • LoRa-compatible technology is capable of transmitting data over long distances, typically a few kilometers in urban areas and up to 10 kilometers or more in rural areas. Communication distances can be influenced by factors such as attenuating structures and/or vegetation. This makes it suitable for IoT applications that require long-range communication. LoRa-compatible devices also have low power requirements, allowing them to operate for years on a single battery charge. This makes it ideal for IoT applications that require long-term, low-maintenance operation. Additionally, LoRa-compatible technology is optimized for low data rate applications, typically in the range of a few hundred bytes per message. This makes it suitable for IoT applications that do not require high-bandwidth data transmission.
  • LoRa also supports both uplink and downlink communication between devices (e.g., sensors 102) and gateways, allowing for two-way communication. This makes it suitable for applications that require bidirectional communication, such as remote device control. As yet another advantage, LoRa-compatible technology is relatively low-cost compared to other IoT communication technologies, making it accessible to a wider range of applications and industries.
  • Management dashboards exist for every capability element represented in the LoRaWAN architectural stack (or more generally, the LoRa architectural stack) including sensory devices, gateways, backhaul carriers, network servers, and application servers. There are three different classes of LoRa-compatible devices, each with different capabilities and power consumption profiles.
  • Class A devices are the most common type of LoRa-compatible devices. They are designed to conserve power and operate on a strict duty cycle. Class A devices have two communication windows: one immediately after the device transmits a message, and another at a predetermined time after it receives a downlink message from the network server. This makes them suitable for applications that require infrequent transmission of small amounts of data, such as sensors that need to report data at regular intervals.
  • the impression-based monitoring system 100 can make use of Class A sensory devices to transmit data on a periodic basis and also report by exception.
  • the edge IoT sensor sampling rate will often be higher than the reporting rate.
  • a cloud-resident digital twin will be able to apply interpolation and infer data points between fixed reporting intervals when no intervening exceptions have been recorded.
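  • The following sketch illustrates, under assumed thresholds, how report-by-exception and digital-twin interpolation could work together: the sensor transmits only when a reading moves beyond a delta, and the cloud-resident twin infers intermediate points between fixed reporting intervals.

```python
def should_report(last_reported: float, current: float, delta: float = 0.5) -> bool:
    """Report by exception: transmit only when the value has moved by >= delta."""
    return abs(current - last_reported) >= delta


def twin_interpolate(t0: float, v0: float, t1: float, v1: float, t: float) -> float:
    """Digital-twin side: linearly infer a data point between two reported samples."""
    frac = (t - t0) / (t1 - t0)
    return v0 + frac * (v1 - v0)


# The sensor samples every minute but reported only at t=0 and t=15 minutes
# (no exceptions in between); the cloud-resident twin infers the value at t=7.
print(should_report(21.0, 21.2))               # False -> no uplink needed
print(twin_interpolate(0, 21.0, 15, 21.6, 7))  # 21.28
```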
  • Class B devices are designed for applications that require a higher level of responsiveness. They have the same communication windows as Class A devices, but they also have additional, scheduled receive windows that allow them to receive data from the network server at specific times. This makes them suitable for applications that require periodic communication with the network, such as smart meters or asset tracking devices.
  • Class C devices are the most power-hungry of the three LoRa-compatible device classes, as they remain in a receive mode almost continuously. This means they can receive data from the network at any time, making them suitable for applications that require immediate communication, such as security systems or remote controls. Class C devices have a lower battery life than Class A and B devices, but they are ideal for applications that require high responsiveness.
  • the impression-based monitoring system 100 can make use of Class C IoT devices when bidirectional actions are desired, such as switching/controlling devices in a monitored subject's residence.
  • There are two types of LoRaWAN networks: public and private.
  • Public LoRaWAN networks are owned and operated by third-party service providers who offer connectivity services to end-users.
  • Public LoRaWAN networks are typically designed to cover large geographical areas, such as entire cities or regions, and are intended to support a wide range of IoT applications. End-users can connect their devices to these networks by subscribing to the services provided by the network operator.
  • Private LoRaWAN networks are owned and operated by organizations, such as companies, municipalities, or universities, and are designed to support specific IoT applications within a restricted area.
  • Private LoRaWAN networks are typically deployed in areas such as industrial plants, warehouses, or campuses, where connectivity requirements are specific to the environment.
  • Private LoRaWAN networks can be further classified into two types: enterprise and community LoRaWAN networks.
  • Enterprise LoRaWAN networks are owned and operated by organizations for their own use. They are typically deployed to support specific business needs and provide connectivity for a limited number of devices.
  • Community LoRaWAN networks are deployed by businesses, groups of organizations, or individuals in a community to provide connectivity for IoT devices. They are typically used to support community-based IoT projects and are open for use by anyone in the community.
  • a non-limiting example of a community network can include Amazon Sidewalk (Amazon.com, Inc.; Seattle, Washington).
  • the impression-based monitoring system 100 can utilize public and/or private LoRaWAN networks for data backhaul transport.
  • a cellular network can be used for a primary backhaul, secondary backhaul, or the like.
  • a satellite communication network may be used for primary and/or secondary backhauling, including potentially geosynchronous equatorial orbit (GEO) or other geosynchronous orbit satellites, and/or low Earth orbit (LEO) satellites.
  • Individual LoRaWAN networks (or other LoRa-based networks) can be established in the absence of an enterprise or private (e.g., community) LoRaWAN network.
  • the impression-based monitoring system 100 may include one or more LoRaWAN gateways 110, or other LoRa-based gateway devices.
  • LoRaWAN gateways 110 are devices that act as a bridge between LoRaWAN devices and the internet.
  • the LoRaWAN gateways 110 can bridge communications between sensors 102 and an external device 104 and/or a server 106.
  • the LoRaWAN gateways 110 receive signals from LoRa-compatible devices (e.g., sensors 102) and forward them to a central network server (e.g., server 106), which can then process and analyze the data.
  • LoRa-compatible devices (e.g., sensors 102) transmit data using radio waves at a specific frequency and modulation.
  • LoRaWAN gateways 110 have one or more antennas that can receive these signals. The antennas are typically positioned at a high location to maximize coverage area.
  • the gateway 110 receives the LoRa signals and then forwards the LoRaWAN packets to the central network server (e.g., server 106).
  • the central network server (e.g., server 106) receives the LoRaWAN packets and decodes them into payloads using LoRa protocols.
  • the central network server (e.g., server 106) may filter out invalid or duplicate packets to reduce network traffic and improve network efficiency.
  • the signals from a single sensor 102 can be received by multiple LoRaWAN gateways 110.
  • This functionality supports resiliency, and can also be used to ascertain the location and/or positioning of a sensor 102 via triangulation premised upon received signal strength (RSSI).
  • Preferred gateways 110 can be selected for bi-directional communications based upon received signal strength.
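  • One simple way to approximate such positioning (an assumption for illustration; the disclosure does not specify the method) is an RSSI-weighted centroid over the known gateway coordinates:

```python
def rssi_weighted_centroid(observations):
    """Estimate (x, y) of a sensor from per-gateway RSSI readings.

    observations: list of (gateway_x, gateway_y, rssi_dbm) tuples.
    Stronger (less negative) RSSI suggests a closer gateway, so it gets more weight.
    """
    # Convert dBm to linear-scale weights so nearby gateways dominate the estimate.
    weighted = [(x, y, 10 ** (rssi / 10)) for x, y, rssi in observations]
    total = sum(w for _, _, w in weighted)
    x_est = sum(x * w for x, _, w in weighted) / total
    y_est = sum(y * w for _, y, w in weighted) / total
    return x_est, y_est

# Three gateways hear the same uplink at different strengths.
print(rssi_weighted_centroid([(0, 0, -60), (100, 0, -80), (0, 100, -90)]))
```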
  • a LoRaWAN gateway 110 can also leverage Ethernet or bundled cellular carrier services for data backhaul.
  • FIG. 2 illustrates an architecture of an example impression-based monitoring system 100.
  • Flow is generally from left to right, except when bi-directional commands are issued.
  • LoRa sensors backhaul data via LoRaWAN gateways or Amazon Sidewalk bridges using wireless, managed LTE (cellular), or Ethernet.
  • a LoRaWAN network server, such as The Things Network, provisions and manages LoRa devices.
  • the application server(s) may include, as non-limiting examples, Datacake (data visualization), MQTT (IoT pub/sub broker), Node-RED (flow-based conditional IoT programming), Azure IoT (IoT digital twin ecosystem), Google Cloud Platform (advanced AI/ML IoT analytics), and/or AWS IoT (Sidewalk ecosystem), or other IoT device cloud platforms.
  • Referring to FIG. 3, a flowchart is illustrated setting forth the steps of an example method for monitoring a subject using an impression-based monitoring system.
  • the method includes receiving impression-based data with a processor, as indicated at step 302.
  • the impression-based data may include sensor data acquired using one or more sensors arranged within a subject’s residence, or within another environment.
  • the sensors may include environmental sensors, physiological sensors, optical sensors, electrostatic sensors, chemical sensors, and so on.
  • the sensors include at least one smart sensor.
  • the at least one smart sensor may include a LoRa-compatible sensor.
  • the sensors are capable of detecting the presence and/or movement of the subject based on one or more impressions left on the environment. In this way, the subject may be monitored without requiring contact between the subject and the sensor(s).
  • the impression-based data may be time series data, such that each sample of the impression-based data can have an associated time stamp.
  • the impression-based data may be received by a processor embedded on one of the sensors. Additionally or alternatively, the impression-based data may be received by a processor of an external device (e.g., a smart phone, a tablet device, a laptop, a computer, etc.), a server, or the like. As described above, in some examples the impression-based data may be received by the processor of the external device, server, or the like retrieving the impression-based data from a data storage device or medium, which may include retrieving the impression-based data from a server (e.g., server 106).
  • the impression-based data may be received by a processor of one of the sensors and the impression-based data may be compressed and/or compacted using a binary encoding (e.g., a Cayenne Low-Power Payload encoding protocol, etc.).
  • the compressed impression-based data may then be transmitted to an external device and/or server from the sensor.
  • the compressed impression-based data may be transmitted using a LoRa protocol.
  • the impression-based data are then processed by the processor (e.g., processor of an external device, processor of a server, or the like) to generate feature data that indicate the presence and/or status of the subject in the residence or other environment, as indicated at step 304.
  • the feature data can include a prediction or detection of an anomaly in the impression-based data that indicates a change in status of the subject.
  • the impression-based data may be compared to baseline data indicative of a normal status of the subject in their environment. Deviation from normal in one or more of the measured impression-based datatypes may indicate an anomaly that warrants checking in on the subject.
  • the impression-based data may also be processed to discern the presence and/or status of a person.
  • the sensor processor may process the impression-based data (i.e., provide edge computing, such as edge machine learning and pattern analysis). Processing the impression-based data may include processing audio data in the impression-based data using voice recognition, cough detection, or the like. Natural language processing and/or key term discernment can be used to identify whether the subject is in need of assistance (e.g., by detecting words that indicate the need for help).
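  • As a minimal sketch of key term discernment (simple phrase matching standing in for full natural language processing; the term list is hypothetical):

```python
# Hypothetical key terms indicating a request for assistance.
HELP_TERMS = {"help", "fallen", "can't get up", "emergency", "call 911"}

def needs_assistance(transcript: str) -> bool:
    """Key-term discernment on locally transcribed audio (edge processing).

    A deployed system might apply full NLP; this sketch does phrase matching only.
    """
    text = transcript.lower()
    return any(term in text for term in HELP_TERMS)

print(needs_assistance("I've fallen and I can't get up"))  # -> True
```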
  • the compressed impression-based data may in some instances first be decompressed and/or decompacted.
  • the decompressed impression-based data may then be processed as described above.
  • the compressed impression-based data may be processed to generate feature data, such as by using machine learning and/or artificial intelligence techniques to extract or otherwise generate the feature data from the impression-based data.
  • Impression-based data from more than one sensor can be received in step 302 and the data may be aggregated.
  • the sensors can function independently, such that the loss of one sensor will not render the impression-based monitoring system unusable.
  • the biometric impressions of different subjects within the environment can be distinguished when more than a single subject (e.g., persons and/or animals) is present within a given environment.
  • the ability to distinguish multiple subjects within an environment can use impression-based datatypes such as LiDAR, FMCW radar, and thermal imaging.
  • Device-level machine learning that supports pattern recognition can be utilized to facilitate the counting and singularization of multiple subjects.
  • a human has a typical body temperature of 37°C, and walks erect on two legs.
  • Dogs and cats have an average body temperature between 38.1°C and 39.2°C, and move about on four legs with a predictable cadence. Both dogs and cats have a higher heart rate than humans.
  • Sensor synthesis can be used to identify the variances in body temperature and subject stature when generating deterministic results. In this way, sensor fusion or synthesis of impression-based data can be used to help differentiate between different subjects within the same environment.
  • the generated feature data may then include data labels identifying the different subjects in the environment.
  • mice have an average body temperature of 26-28°C and a heart rate of 500-700 beats per minute. These characteristics are substantially different from human attributes and, therefore, can be used to readily identify rodents that may be present in the environment.
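  • A rule-based sketch of this kind of singularization, using the body-temperature and heart-rate ranges quoted above, is shown below; the decision boundaries are illustrative assumptions rather than values specified in the disclosure.

```python
def classify_subject(body_temp_c: float, heart_rate_bpm: float) -> str:
    """Rule-based singularization sketch over fused thermal and FMCW radar features.

    The boundaries loosely follow the ranges quoted above and are illustrative
    only; a deployed system would learn such boundaries from data.
    """
    if heart_rate_bpm >= 300:
        return "rodent"      # e.g., mouse: 500-700 bpm
    if body_temp_c >= 38.0 and heart_rate_bpm > 120:
        return "dog/cat"     # 38.1-39.2 C, heart rate higher than humans
    if 35.5 <= body_temp_c <= 37.8:
        return "human"       # ~37 C typical
    return "unknown"

print(classify_subject(37.0, 70))   # -> "human"
print(classify_subject(38.6, 140))  # -> "dog/cat"
```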
  • combining impression-based data received from multiple disparate sources of information can help improve the accuracy of deterministic outcomes when gauged against the limited results produced by individual sensors.
  • a more comprehensive understanding of an environment can occur by applying sensor fusion that will facilitate decision making, especially in complex or dynamic situations.
  • Sensor synthesis can also facilitate selecting an appropriate complement of sensors. For example, sensors can be optimized during the sensor synthesis process through the configuration of associated parameters, noise filtering, and/or by applying consideration to physical placement and orientation of the individual sensors.
  • Sensor synthesis for some sensors may include the inferencing of relative versus absolute data. For instance, a subject's contour measurements will remain consistent irrespective of the distance of the subject from a LiDAR sensor. Gaseous measures, on the other hand, will be gauged using a relative measurement. When aggregating, fusing, or otherwise synthesizing these data types, baselines for gaseous and/or particulate concentrations can first be developed by leveraging edge inferencing of feature data.
  • the impression-based data may be received by a processor of an external device (e.g., external device 104) and/or a server (e.g., server 106).
  • the impression-based data may include data from a single sensor, multiple sensors of the same type, or multiple sensors of different types.
  • the impression-based data, including aggregated heterogeneous data (i.e., impression-based data from different sensor types), can then be analyzed by the processor to create feature data that are indicative of deterministic outcomes.
  • the impression-based data may be received by a processor embedded on one of the sensors (e.g., one of the sensors 102) and processed by the embedded processor using a machine learning model that is remotely trained and then sent to the embedded processor for inferencing.
  • the impression-based data may include data from the sensor on which the processor is embedded, or additionally or alternatively may include receiving impression-based data from one or more other sensors.
  • the processor may receive impression-based data of the same type as the sensor on which the processor is embedded, or impression-based data from different sensor types.
  • the processor embedded on the sensor may also receive a trained machine learning algorithm, such as by retrieving or otherwise receiving the trained machine learning model from a server (e.g., server 106).
  • the machine learning model is trained on the server and then the trained machine learning model is sent by the server to the embedded processor.
  • the trained machine learning model may include model parameters (e.g., weights, biases, or both), which have been computed or otherwise estimated by training the machine learning model on training data.
  • the trained machine learning model may also include the particular model architecture to be implemented. For instance, data pertaining to the layers in a neural network architecture (e.g., number of layers, type of layers, ordering of layers, connections between layers, hyperparameters for layers) may be sent as part of the trained machine learning model.
  • the model architecture may be stored locally on the embedded processor and sending the trained machine learning model may include sending updated parameters (e.g., weights, biases, hyperparameters) to the embedded processor.
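  • A minimal PyTorch-style sketch of this parameter-only update path, assuming the model architecture is stored locally on the embedded processor and only a state dict is transferred:

```python
import io
import torch
import torch.nn as nn

# Architecture assumed to be stored locally on the embedded processor.
def build_local_model() -> nn.Module:
    return nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# --- server side: train, then serialize only the parameters ---
server_model = build_local_model()
buffer = io.BytesIO()
torch.save(server_model.state_dict(), buffer)  # weights/biases only, no architecture

# --- sensor side: receive the bytes and load them into the local architecture ---
edge_model = build_local_model()
buffer.seek(0)
edge_model.load_state_dict(torch.load(buffer))
edge_model.eval()  # inference-only on the edge device
```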
  • the impression-based data may be received by a processor embedded on one of the sensors (e.g., one of the sensors 102) and processed by the embedded processor using a machine learning model that is trained and/or updated locally by the embedded processor.
  • the impression-based data may include data from the sensor on which the processor is embedded, or additionally or alternatively may include receiving impression-based data from one or more other sensors.
  • the processor may receive impression-based data of the same type as the sensor on which the processor is embedded, or impression-based data from different sensor types.
  • the embedded processor may also receive a trained machine learning algorithm, such as by retrieving or otherwise receiving the trained machine learning model from a memory or other data storage device or medium of the sensor on which the processor is embedded.
  • the machine learning model is trained locally on the sensor on impression-based data measured by the sensor or on impression-based data measured by other sensors and sent to the embedded processor.
  • the trained machine learning model may include model parameters (e.g., weights, biases, or both), which have been computed or otherwise estimated by training or otherwise updating the machine learning model on training data.
  • the trained machine learning model also includes the particular model architecture to be implemented.
  • data pertaining to the layers in a neural network architecture may be stored as part of the trained machine learning model.
  • the machine learning model may include a convolutional neural network (CNN) that is trained to recognize actions or events extracted from impression-based data.
  • a CNN may be trained to recognize actions or events extracted from sequential LiDAR data captures, which can extend the CNN into temporal dimensioning. Temporal LiDAR data captures can be processed in a manner approaching that of low-resolution images.
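  • A minimal PyTorch sketch of such a CNN, stacking T sequential low-resolution LiDAR zone maps as input channels, is shown below; the 8x8 zone grid, frame count, and event classes are assumptions for illustration.

```python
import torch
import torch.nn as nn

class TemporalLidarCNN(nn.Module):
    """Illustrative CNN over a short sequence of 8x8 LiDAR zone maps.

    The T temporal captures are stacked as input channels, so the 2D convolutions
    mix spatial and temporal structure; sizes and classes are assumptions.
    """

    def __init__(self, t_frames: int = 8, n_classes: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(t_frames, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, n_classes),  # e.g., "no event", "walking", "fall"
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = TemporalLidarCNN()
frames = torch.randn(1, 8, 8, 8)  # batch of one: 8 captures of an 8x8 zone grid
print(model(frames).shape)        # -> torch.Size([1, 3])
```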
  • the impression-based monitoring system 100 is capable of employing self-training premised on object detection and/or anomaly detection models that are trained, retrained, or otherwise updated based on the incremental acquisition and/or analysis of additional impression-based data.
  • the performance and value of the impression-based monitoring system 100 can increase as it becomes more personalized through a process of environmental acclimation that continuously updates the on-device machine learning model(s).
  • Training machine learning models directly on edge devices can be challenging due to limited resources; however, it is an advantage of the systems and methods described in the present disclosure that the impression-based monitoring system 100 receives and analyzes low-fidelity data, which allows for improved computational efficiency for on-device training.
  • Edge computing and federated learning can additionally be implemented to help offload training tasks to more capable devices (e.g., an external device 104, other local computing devices, etc.).
  • inferencing is more feasible on edge devices like a processor embedded on a sensor, especially with optimizations for performance and efficiency.
  • the feature data may then be received by the processor and operated on to generate a report that indicates the monitoring of the subject, as indicated at step 306.
  • the report may include an alert or alarm, such as an auditory alert, a visual alert, or a text alert (e.g., an SMS text message, email, or other message alert) that may be sent to one or more users (e.g., family members, caretakers, etc.).
  • the feature data may indicate a deviation from the subject's normal daily activities.
  • the report may include an alert that is sent to a family member, a caretaker, or the like, indicating that the subject has deviated in some significant way from their normal daily activities.
  • the impression-based monitoring system 100 may be in communication with the subject’s electronic health record (EHR).
  • the impression-based data, feature data, and/or report can be exported or otherwise sent to the subject’s EHR to be used as a part of the subject’s continuum of care.
  • anomaly detection / escalation is anticipated. Certain sensors can be left normally dormant and automatically activated to provide additional detail if/when an anomaly in the environment is detected. Additional sensors could be remotely activated if system faults or malfunctions are detected. The backhaul frequency may be increased or otherwise updated during anomalous event episodes.
  • the collection of impression-based data can include the positioning of sensors where daily life activities are more focused. For example, concentrated measurements can be obtained from restrooms, bedrooms, and kitchens. Suitable measurements can also be collected from macro-environments, such as main and/or open living areas.
  • FIG. 4 illustrates an example report that can be generated and presented to a user.
  • the report can be presented via a graphical user interface (GUI) as a dashboard that indicates various measurements from the impression-based data in addition to feature data indicating the identification of patterns or other deterministic outcomes in the impression-based data.
  • the report indicates the sensing of a main living area of a monitored subject.
  • the impression-based data includes CO2 measurement data, temperature measurement data, and humidity measurement data.
  • FIG. 5 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the report indicates the sensing of a bedroom of a monitored subject.
  • the impression-based data includes CO2 measurement data, temperature measurement data, and humidity measurement data.
  • FIG. 6 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes pulse measurement data and respiratory measurement data of the subject.
  • FIG. 7 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes various data sources, including pulse measurement data, respiratory measurement data, environmental temperature data, environmental humidity data, and location data.
  • the presented data also include histograms of one or more of the impression-based data types.
  • FIG. 8 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes visible + IR / visible light data, gas resistance data, and barometric pressure data.
  • FIG. 9 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes CO2 data, including a histogram of CO2 measurements, motion data, and presence data.
  • FIG. 10 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes various different data types.
  • FIG. 11 illustrates the temperature and humidity data from the dashboard shown in FIG. 10.
  • the measured temperature and humidity data indicate a heat event anomaly that occurred in the environment.
  • the generated feature data will include an indication of this anomaly in the temperature and/or humidity data.
  • These feature data may be processed to generate an alert and/or alarm as described above.
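  • a minimal sketch of flagging such a heat event follows, using a rolling z-score over temperature samples; the window size and threshold are illustrative choices rather than values from the disclosure:

```python
# Sketch of heat-event detection: each sample is compared against the
# mean and spread of the preceding window of readings.
import numpy as np

def heat_anomalies(temps_c, window=24, threshold=3.0):
    temps = np.asarray(temps_c, dtype=float)
    flags = np.zeros(temps.size, dtype=bool)
    for i in range(window, temps.size):
        ref = temps[i - window:i]
        sigma = ref.std()
        if sigma and abs(temps[i] - ref.mean()) / sigma > threshold:
            flags[i] = True
    return flags

# A day of steady hourly readings followed by a sudden heat spike
readings = [21.0] * 24 + [21.2, 21.1, 29.5, 30.1, 21.3]
print(np.flatnonzero(heat_anomalies(readings)))  # indices of the spike
```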
  • FIG. 12 illustrates another example report that can be generated and presented to a user as a dashboard via a GUI.
  • the impression-based data includes pulse, respiration, carbon dioxide, and luminance measurements. The measurements are correlated based upon the time series data and serve to convey signals of life in a deterministic manner.
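  • a minimal sketch of such time-series correlation follows, using numpy; the synthetic signals and the 0.5 correlation cut-off are illustrative assumptions:

```python
# Sketch of a "signals of life" check: physiological channels that share
# a common rhythm correlate with one another, while ambient luminance
# does not.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)  # one minute of samples at 10 Hz
rhythm = np.sin(2 * np.pi * t / 20)

pulse = 70 + 5 * rhythm + rng.normal(0, 0.5, t.size)
respiration = 14 + 2 * rhythm + rng.normal(0, 0.3, t.size)
co2 = 600 + 30 * rhythm + rng.normal(0, 3, t.size)
luminance = 120 + rng.normal(0, 1, t.size)  # uncorrelated background

corr = np.corrcoef(np.vstack([pulse, respiration, co2, luminance]))
life_signals_present = corr[0, 1] > 0.5 and corr[0, 2] > 0.5
```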
  • FIG. 13 shows an example of a system 1300 for monitoring a subject in accordance with some embodiments of the systems and methods described in the present disclosure.
  • a computing device 1350 can receive one or more types of data (e.g., impression-based data) from data source 1302.
  • computing device 1350 can execute at least a portion of an impression-based monitoring system 1304 to monitor a subject based on data received from the data source 1302 (e.g., one or more sensors, such as LoRa-compatible sensors).
  • the computing device 1350 can communicate information about data received from the data source 1302 to a server 1352 over a communication network 1354, which can execute at least a portion of the impression-based monitoring system 1304.
  • the server 1352 can return information to the computing device 1350 (and/or any other suitable computing device) indicative of an output of the impression-based monitoring system 1304.
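  • a minimal sketch of this device-to-server round trip follows, using only the Python standard library; the endpoint URL and the JSON field names are hypothetical:

```python
# Sketch of a computing device posting impression-based data to a server
# that executes a portion of the monitoring system.
import json
import urllib.request

payload = json.dumps({
    "sensor_id": "livingroom-co2-01",  # hypothetical identifier
    "co2_ppm": 612,
    "temperature_c": 21.4,
    "humidity_pct": 38.0,
}).encode("utf-8")

request = urllib.request.Request(
    "https://example.invalid/impression/ingest",  # placeholder endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once a real endpoint exists:
# with urllib.request.urlopen(request) as response:
#     result = json.load(response)  # server's monitoring output
```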
  • computing device 1350 and/or server 1352 can be any suitable computing device or combination of devices, such as a desktop computer, a laptop computer, a smartphone, a tablet computer, a wearable computer, a server computer, a virtual machine being executed by a physical computing device, and so on.
  • the computing device 1350 and/or server 1352 can also reconstruct a de-identified likeness of the subject from the data.
  • the computing device 1350 and/or server 1352 could be used, where appropriate, to resolve the identity of a subject from de-identified data sets, such as when data are being ported into a subject’s EHR.
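  • a minimal sketch of keyed pseudonymization of this kind follows; the key handling shown is an illustrative assumption (in practice the key would be held in managed key storage):

```python
# Sketch of de-identification for EHR export: a stable token is derived
# from the subject identifier, and only a holder of the key can re-link
# the token to the subject.
import hashlib
import hmac

SECRET_KEY = b"replace-with-managed-key"  # assumption: kept in a key vault

def pseudonymize(subject_id: str) -> str:
    return hmac.new(SECRET_KEY, subject_id.encode(), hashlib.sha256).hexdigest()

token = pseudonymize("subject-0042")
# Sensor records carry the token; the mapping back to "subject-0042" is
# resolvable only by systems holding SECRET_KEY, e.g., at EHR export time.
```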
  • data source 1302 can be any suitable source of data (e.g., impression-based data, other sensor data), such as one or more smart sensors (e.g., LoRa- compatible sensors), another computing device (e.g., a server storing impression-based data and/or other sensor data), and so on.
  • data source 1302 can be local to computing device 1350.
  • data source 1302 can be incorporated with computing device 1350 (e.g., computing device 1350 can be configured as part of a device for measuring, recording, estimating, acquiring, or otherwise collecting or storing data).
  • data source 1302 can be connected to computing device 1350 by a cable, a direct wireless link, and so on.
  • data source 1302 can be located locally and/or remotely from computing device 1350, and can communicate data to computing device 1350 (and/or server 1352) via a communication network (e.g., communication network 1354).
  • communication network 1354 can be any suitable communication network or combination of communication networks.
  • communication network 1354 can include a Wi-Fi network (which can include one or more wireless routers, one or more switches, etc.), a peer-to-peer network (e.g., a Bluetooth network), a LoRa network (e.g., a LoRaWAN network), a cellular network (e.g., a 3G network, a 4G network, a 5G network, NB-IoT, etc., complying with any suitable standard, such as CDMA, GSM, LTE, LTE Advanced, WiMAX, etc.), other types of wireless networks, a wired network, and so on.
  • communication network 1354 can be a local area network, a wide area network, a public network (e.g., the Internet), a private or semi-private network (e.g., a corporate or university intranet), any other suitable type of network, or any suitable combination of networks.
  • Communications links shown in FIG. 13 can each be any suitable communications link or combination of communications links, such as wired links, fiber optic links, Wi-Fi links, Bluetooth links, LoRa links, cellular links, and so on.
  • referring to FIG. 14, an example of hardware 1400 that can be used to implement data source 1302, computing device 1350, and server 1352 in accordance with some embodiments of the systems and methods described in the present disclosure is shown.
  • computing device 1350 can include a processor 1402, a display 1404, one or more inputs 1406, one or more communication systems 1408, and/or memory 1410.
  • processor 1402 can be any suitable hardware processor or combination of processors, such as a central processing unit (CPU), a graphics processing unit (GPU), and so on.
  • the processor 1402 may be implemented as a co-processor or other co-processing unit.
  • display 1404 can include any suitable display devices, such as a liquid crystal display (LCD) screen, a light-emitting diode (LED) display, an organic LED (OLED) display, an electrophoretic display (e.g., an “e-ink” display), a computer monitor, a touchscreen, a television, and so on.
  • inputs 1406 can include any suitable input devices and/or sensors that can be used to receive user input, such as a keyboard, a mouse, a touchscreen, a microphone, and so on.
  • communications systems 1408 can include any suitable hardware, firmware, and/or software for communicating information over communication network 1354 and/or any other suitable communication networks.
  • communications systems 1408 can include one or more transceivers, one or more communication chips and/or chip sets, and so on.
  • communications systems 1408 can include hardware, firmware, and/or software that can be used to establish a Wi-Fi connection, a Bluetooth connection, a LoRa connection, a cellular connection, an Ethernet connection, and so on.
  • memory 1410 can include any suitable storage device or devices that can be used to store instructions, values, data, or the like, that can be used, for example, by processor 1402 to present content using display 1404, to communicate with server 1352 via communications system(s) 1408, and so on.
  • Memory 1410 can include any suitable volatile memory, non-volatile memory, storage, or any suitable combination thereof.
  • memory 1410 can include random-access memory (RAM), read-only memory (ROM), electrically programmable ROM (EPROM), electrically erasable ROM (EEPROM), other forms of volatile memory, other forms of non-volatile memory, one or more forms of semi-volatile memory, one or more flash drives, one or more hard disks, one or more solid state drives, one or more optical drives, and so on.
  • memory 1410 can have encoded thereon, or otherwise stored therein, a computer program for controlling operation of computing device 1350.
  • processor 1402 can execute at least a portion of the computer program to present content (e.g., images, user interfaces, graphics, tables), receive content from server 1352, transmit information to server 1352, and so on.
  • the processor 1402 and the memory 1410 can be configured to perform the methods described herein (e.g., the method of FIG. 3).
  • server 1352 can include a processor 1412, a display 1414, one or more inputs 1416, one or more communications systems 1418, and/or memory 1420.
  • processor 1412 can be any suitable hardware processor or combination of processors, such as a CPU, a GPU, and so on.
  • display 1414 can include any suitable display devices, such as an LCD screen, an LED display, an OLED display, an electrophoretic display, a computer monitor, a touchscreen, a television, and so on.
  • inputs 1416 can include any suitable input devices and/or sensors that can be used to receive user input, such as a keyboard, a mouse, a touchscreen, a microphone, and so on.
  • communications systems 1418 can include any suitable hardware, firmware, and/or software for communicating information over communication network 1354 and/or any other suitable communication networks.
  • communications systems 1418 can include one or more transceivers, one or more communication chips and/or chip sets, and so on.
  • communications systems 1418 can include hardware, firmware, and/or software that can be used to establish a Wi-Fi connection, a Bluetooth connection, a LoRa connection, a cellular connection, an Ethernet connection, and so on.
  • memory 1420 can include any suitable storage device or devices that can be used to store instructions, values, data, or the like, that can be used, for example, by processor 1412 to present content using display 1414, to communicate with one or more computing devices 1350, and so on.
  • Memory 1420 can include any suitable volatile memory, non-volatile memory, storage, or any suitable combination thereof.
  • memory 1420 can include RAM, ROM, EPROM, EEPROM, other types of volatile memory, other types of non-volatile memory, one or more types of semi-volatile memory, one or more flash drives, one or more hard disks, one or more solid state drives, one or more optical drives, and so on.
  • memory 1420 can have encoded thereon a server program for controlling operation of server 1352.
  • processor 1412 can execute at least a portion of the server program to transmit information and/or content (e.g., data, images, a user interface) to one or more computing devices 1350, receive information and/or content from one or more computing devices 1350, receive instructions from one or more devices (e.g., a personal computer, a laptop computer, a tablet computer, a smartphone), and so on.
  • the server 1352 is configured to perform the methods described in the present disclosure.
  • the processor 1412 and memory 1420 can be configured to perform the methods described herein (e.g., the method of FIG. 3).
  • data source 1302 can include a processor 1422, one or more data acquisition systems 1424, one or more communications systems 1426, and/or memory 1428.
  • processor 1422 can be any suitable hardware processor or combination of processors, such as a CPU, a GPU, and so on.
  • the one or more data acquisition systems 1424 are generally configured to acquire data, images, or both, and can include one or more sensors, which in some instances may be smart sensors such as LoRa-compatible sensors as described above. Additionally or alternatively, in some embodiments, the one or more data acquisition systems 1424 can include any suitable hardware, firmware, and/or software for coupling to and/or controlling operations of one or more sensors. In some embodiments, one or more portions of the data acquisition system(s) 1424 can be removable and/or replaceable.
  • data source 1302 can include any suitable inputs and/or outputs.
  • data source 1302 can include input devices and/or sensors that can be used to receive user input, such as a keyboard, a mouse, a touchscreen, a microphone, a trackpad, a trackball, and so on.
  • data source 1302 can include any suitable display devices, such as an LCD screen, an LED display, an OLED display, an electrophoretic display, a computer monitor, a touchscreen, a television, etc., one or more speakers, and so on.
  • communications systems 1426 can include any suitable hardware, firmware, and/or software for communicating information to computing device 1350 (and, in some embodiments, over communication network 1354 and/or any other suitable communication networks).
  • communications systems 1426 can include one or more transceivers, one or more communication chips and/or chip sets, and so on.
  • communications systems 1426 can include hardware, firmware, and/or software that can be used to establish a wired connection using any suitable port and/or communication standard (e.g., HDMI, VGA, DVI video, USB, RS-232, etc.), Wi-Fi connection, a Bluetooth connection, a LoRa connection, a cellular connection, an Ethernet connection, and so on.
  • memory 1428 can include any suitable storage device or devices that can be used to store instructions, values, data, or the like, that can be used, for example, by processor 1422 to control the one or more data acquisition systems 1424, and/or receive data from the one or more data acquisition systems 1424; to generate representations from data; present content (e.g., data, images, a user interface) using a display; communicate with one or more computing devices 1350; and so on.
  • Memory 1428 can include any suitable volatile memory, non-volatile memory, storage, or any suitable combination thereof.
  • memory 1428 can include RAM, ROM, EPROM, EEPROM, and other forms of volatile or non-volatile memory, as described above for memory 1410 and memory 1420.
  • memory 1428 can have encoded thereon, or otherwise stored therein, a program for controlling operation of data source 1302.
  • processor 1422 can execute at least a portion of the program to generate images, transmit information and/or content (e.g., data, images, a user interface) to one or more computing devices 1350, receive information and/or content from one or more computing devices 1350, receive instructions from one or more devices (e.g., a personal computer, a laptop computer, a tablet computer, a smartphone, etc.), and so on.
  • any suitable computer-readable media can be used for storing instructions for performing the functions and/or processes described herein.
  • computer-readable media can be transitory or non-transitory.
  • non-transitory computer-readable media can include media such as magnetic media (e.g., hard disks, floppy disks), optical media (e.g., compact discs, digital video discs, Blu-ray discs), semiconductor media (e.g., RAM, flash memory, EPROM, EEPROM), any suitable media that is not fleeting or devoid of any semblance of permanence during transmission, and/or any suitable tangible media.
  • transitory computer-readable media can include signals on networks, in wires, conductors, optical fibers, circuits, or any suitable media that is fleeting and devoid of any semblance of permanence during transmission, and/or any suitable intangible media.
  • a component may be, but is not limited to being, a processor device, a process being executed (or executable) by a processor device, an object, an executable, a thread of execution, a computer program, or a computer.
  • an application running on a computer and the computer can be a component.
  • One or more components may reside within a process or thread of execution, may be localized on one computer, may be distributed between two or more computers or other processor devices, or may be included within another component (or system, module, and so on).
  • devices or systems disclosed herein can be utilized or installed using methods embodying aspects of the disclosure.
  • description herein of particular features, capabilities, or intended purposes of a device or system is generally intended to inherently include disclosure of a method of using such features for the intended purposes, a method of implementing such capabilities, and a method of installing disclosed (or otherwise known) components to support these purposes or capabilities.
  • discussion herein of any method of manufacturing or using a particular device or system, including installing the device or system, is intended to inherently include disclosure, as embodiments of the disclosure, of the utilized features and implemented capabilities of such device or system.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Impression-based data are acquired using one or more sensors in an environment. The impression-based data include measurements of the environment that indicate an impression of, or otherwise convey attributes of, a subject being monitored in the environment. These impression-based data can be acquired without the monitored subject having to interact with the sensors. Feature data are generated by analyzing the impression-based data to detect patterns in the impression-based data. A report indicating the relative well-being of the monitored subject can be generated based on the feature data.
PCT/US2024/056365 2023-11-17 2024-11-18 Impression-based monitoring of an individual using non-invasive and contactless sensors Pending WO2025106961A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363600448P 2023-11-17 2023-11-17
US63/600,448 2023-11-17

Publications (1)

Publication Number Publication Date
WO2025106961A1 (fr)

Family

ID=93796836

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/056365 Pending WO2025106961A1 (fr) 2023-11-17 2024-11-18 Impression-based monitoring of an individual using non-invasive and contactless sensors

Country Status (1)

Country Link
WO (1) WO2025106961A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170053068A1 (en) * 2014-02-28 2017-02-23 Delos Living Llc Methods for enhancing wellness associated with habitable environments
US20180292523A1 (en) * 2015-05-31 2018-10-11 Sens4Care Remote monitoring system of human activity
US20190209022A1 (en) * 2018-01-05 2019-07-11 CareBand Inc. Wearable electronic device and system for tracking location and identifying changes in salient indicators of patient health
US20210000347A1 (en) * 2014-07-29 2021-01-07 Sempulse Corporation Enhanced physiological monitoring devices and computer-implemented systems and methods of remote physiological monitoring of subjects
US20220138294A1 (en) * 2020-11-03 2022-05-05 Kpn Innovations, Llc. Systems and methods for a configurable device environment

Similar Documents

Publication Publication Date Title
JP7549843B2 (ja) Methods, systems, kits, and apparatus for the monitoring and management of industrial environments
Al Mamun et al. Sensors and systems for wearable environmental monitoring toward IoT-enabled applications: A review
US11405268B2 (en) Fine grained network management to edge device features
US20210057101A1 (en) In-home remote monitoring systems and methods for predicting health status decline
US7616110B2 (en) Mobile wireless customizable health and condition monitor
EP3531350B1 (fr) Système de sécurité basé sur un réseau neuronal à apprentissage profond et son procédé de commande
Dimitrievski et al. Rural healthcare IoT architecture based on low-energy LoRa
US8618930B2 (en) Mobile wireless customizable health and condition monitor
Gupta et al. Detecting anomalous user behavior in remote patient monitoring
CN110197732B (zh) Multi-sensor-based remote health monitoring system, method, and device
US20220293278A1 (en) Connected contact tracing
Jacoby et al. Whisper: wireless home identification and sensing platform for energy reduction
Della Mea et al. A communication infrastructure for the health and social care internet of things: Proof-of-concept study
KR20120138313A (ko) Emergency notification method and u-health device using the same
CN107958434B (zh) Intelligent care method and apparatus, electronic device, and storage medium
Carlson et al. Smart watch RSSI localization and refinement for behavioral classification using laser-SLAM for mapping and fingerprinting
US12198526B2 (en) Airborne pathogen detection through networked biosensors
WO2025106961A1 (fr) Impression-based monitoring of an individual using non-invasive and contactless sensors
Foko et al. An integrated smart system for ambient-assisted living
Lai et al. A remote monitoring system for rodent infestation based on LoRaWAN
Khan et al. Applications of lpwans
Edoh et al. Evaluation of a multi-tier heterogeneous sensor network for patient monitoring: The case of benin
Pnevmatikakis Recognising daily functioning activities in smart homes
Sangeetha et al. Pervasive healthcare system based on environmental monitoring
Hema et al. Smart and wearable sensors used in numerous modern applications and their significance

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24817816

Country of ref document: EP

Kind code of ref document: A1