
WO2013056335A1 - Emergency detection and response system and method - Google Patents

Emergency detection and response system and method

Info

Publication number
WO2013056335A1
Authority
WO
WIPO (PCT)
Prior art keywords
emergency
local
response
detection
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CA2011/001168
Other languages
English (en)
Inventor
Alex Mihailidis
Jesse Hoey
David Giesbrecht
Tracy Lee
Vicky Young
Melinda Helene HAMILL
Jennifer Boger
John Paul Lobos
Babak Taati
Yani A. IOANNOU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University Health Network
Original Assignee
University Health Network
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University Health Network filed Critical University Health Network
Priority to EP11874403.6A (EP2769368A4)
Priority to PCT/CA2011/001168 (WO2013056335A1)
Priority to CA2879204A (CA2879204A1)
Priority to US13/655,920 (US20130100268A1)
Priority to CA2792621A (CA2792621A1)
Publication of WO2013056335A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/043 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting an emergency event, e.g. a fall
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 Sensor means for detecting
    • G08B21/0492 Sensor dual technology, i.e. two or more technologies collaborate to extract unsafe condition, e.g. video tracking and RFID tracking

Definitions

  • the present disclosure relates to emergency detection systems and, in particular, to automated emergency detection and response systems and methods.
  • An object of the invention is to provide an emergency detection and response system and method that overcomes at least some of the drawbacks of known techniques, or at least, provides a useful alternative thereto.
  • a system for detecting and responding to an emergency event occurring in or near a predetermined area comprising: two or more detection modules each adapted for operative coupling to one or more sensors distinctly disposed in or near the area to detect the event, and each configured to generate a respective data set representative of the detected event; and a controller operatively coupled to said detection modules and configured to compare each said respective data set and implement a response protocol as a result thereof as a function of at least one said respective data set.
  • a method for responding to an emergency event occurring in or near a predetermined area comprising: monitoring the area with two or more sensors distinctly disposed in or near the area; generating respective data sets representative of activity within the area via said two or more sensors; comparing said respective data sets; automatically detecting the event from at least one of said respective data sets; selecting a response protocol as a function of said comparison and at least one of said respective data sets; and implementing said selected response protocol.
  • a system for detecting and responding to emergency events occurring in a predetermined local area comprising a plurality of local emergency detection and response units positioned in the local area.
  • Each local emergency detection and response unit includes one or more local sensing agents and a local detection manager.
  • the local detection manager includes a local processor and each local sensing agent is operable to detect a change in a given emergency factor in the local area and to convey data representative of the change in the emergency factor.
  • the detection manager is operable to receive the data and to assign a value to the emergency factor according to the data.
  • a local interface agent is provided for issuing, on a data path, one or more event status signals including the assigned value for the emergency factor.
  • a central location controller unit is also provided and includes a central processor and a central interface agent in communication with the local interface agent on the data path to receive the event status messages therefrom.
  • Each local emergency detection and response unit includes a local emergency event response agent for responding to at least one person in or near the local area.
  • the central location controller unit and/or the local emergency detection and response unit are operable for classifying the assigned value of the emergency factor to form an assigned value classification and initiating the local emergency event response agent to implement a response protocol according to the assigned value classification, which includes dispatching a message to the person and recording a response therefrom.
  • the central processor includes a ranking agent for ranking status signals being received from more than one local emergency detection and response unit in the same local area.
  • the ranking agent is operable to rank each of the emergency detection and response units according to one or more ranking criteria.
  • the central processor is also operable to select one of the emergency detection and response units according to the ranking as an active local emergency detection and response unit to initiate the emergency event response protocol.
  • the central processor is operable to receive status signals over regular or periodic time intervals and to change the selection of the active emergency detection and response unit according to changes in the status signals.
  • the at least one emergency factor is selected from a plurality of video variables and/or thresholds, a plurality of audio variables and/or thresholds, and a plurality of environmental variables and/or thresholds.
  • the environmental variables and/or thresholds include smoke concentration, carbon monoxide concentration, oxygen concentration, and/or environmental pollutant concentration.
  • the ranking agent is operable to access and compare a plurality of emergency factors received from the plurality of reporting local emergency detection and response units.
  • the emergency factor includes a video image, the variables including size, shape, and motion of an object being tracked.
  • the emergency factor includes an audio signal, the variables including amplitude and type of the audio signal.
  • the ranking agent assigns, as active, the active local emergency detection and response unit possessing the highest ranking for a given emergency factor.
  • the local processor or the central processor includes a voice processing agent for issuing voice signals to the person or receiving and interpreting voice signals from the person.
  • one of the sensing functions includes a fixed or steerable digital camera.
  • one of the sensing functions includes a microphone.
  • one of the sensing functions includes an environmental detector.
  • a system for detecting and responding to emergency events occurring in a predetermined local area comprises a plurality of local emergency detection and response units positioned in the local area.
  • Each local emergency detection and response unit includes one or more local sensing agents and a local detection manager.
  • the local detection manager includes a local processor.
  • Each local sensing agent is operable to detect emergency events by a change in a given emergency factor in the local area and to convey data representative of the change in the emergency factor.
  • the detection manager is operable to receive the data and to assign a value to the emergency factor according to the data.
  • a local interface agent is provided for issuing, on a data path, one or more event status messages including the assigned value for the emergency factor.
  • a central location controller unit is also provided which includes a central processor and a central interface agent in communication with the local interface agent on the data path to receive the event status messages therefrom.
  • Each local emergency detection and response unit includes a local emergency event response agent for responding to at least one person in or near the predetermined area.
  • the central location controller unit and/or the local emergency detection and response unit are operable for classifying the assigned value of the emergency factor to form an assigned value classification and for initiating the local emergency event response agent to implement a response protocol according to the assigned value classification.
  • the central processor is operable to receive regular or periodic messages from each emergency detection and response unit after detecting activity in the local area. The central processor may then record the activity to track a subject between multiple emergency detection and response units as the subject moves through the local area.
  • a system for detecting and responding to emergency events occurring in a predetermined local area comprises a plurality of local emergency detection and response units to be positioned in the local area.
  • Each local emergency detection and response unit includes one or more local sensing agents and a local detection manager.
  • the local detection manager includes a local processor.
  • Each local sensing agent is operable to detect a change in a given emergency factor in the local area and to convey data representative of the change in the emergency factor.
  • the detection manager is operable to receive the data and to assign a value to the emergency factor according to the data.
  • a local interface agent is also provided for issuing, on a data path, one or more event status signals including the assigned value for the emergency factor.
  • Each local emergency detection and response unit includes a local emergency event response agent for responding to at least one person in or near the local area, the local emergency detection and response unit being operable for classifying the assigned value of the emergency factor to form an assigned value classification and initiating the local emergency event response agent to implement a response protocol according to the assigned value classification.
  • the response protocol includes dispatching a message to the person and recording a response therefrom.
  • One or more of the local detection and response units are responsive to data received on the data path for ranking status signals received from more than one local emergency detection and response unit in the same local area.
  • One or more of the local emergency detection and response units are operable to select one of the emergency detection and response units according to the ranking as an active local emergency detection and response unit to initiate the emergency event response protocol.
  • the local emergency detection and response units are each operable for exchanging the assigned values with one another to form an assigned value group.
  • one or more of the local emergency detection and response units are operable to select an active emergency detection and response unit according to a ranking of individual values in the value group.
  • the central location controller unit and/or the local emergency detection and response unit is operable for classifying the assigned value of the emergency factor to form an assigned value classification and initiating the local emergency event response agent to implement a response protocol according to the assigned value classification.
  • a method for detecting and responding to emergency events occurring in a predetermined local area comprises providing a plurality of local emergency detection and response units positioned in the local area, providing each local emergency detection and response unit with one or more local sensing agents and a local detection manager with a local processor, arranging the local sensing agents to detect a change in a given emergency factor in the local area and to convey first data representative of the change in the emergency factor, arranging the detection manager to assign a value to the emergency factor according to the first data, providing a local interface agent for issuing, on a data path, one or more event status signals including the assigned value for the emergency factor, providing a central location controller unit including a central processor and a central interface agent in communication with the local interface agent on the data path to receive the event status messages therefrom, providing in each local emergency detection and response unit a local emergency event response agent for responding to at least one person in or near the predetermined area, and arranging the central location controller unit and/or the local emergency detection and response unit to classify the assigned value of the emergency factor to form an assigned value classification and to initiate the local emergency event response agent to implement a response protocol according to the assigned value classification.
  • a method for detecting and responding to emergency events occurring in a predetermined local area comprises providing a plurality of local emergency detection and response sites in the local area, providing each local emergency detection and response site with one or more local sensing agents, sensing at each of the sites a change in a given emergency factor in the local area, providing a detection manager to receive from each site a first data representative of the change in the emergency factor, arranging the detection manager to assign a value to the emergency factor according to the first data, dispatching one or more event status signals including the assigned value for the emergency factor, providing a central location controller to receive the event status messages therefrom, arranging the emergency detection and response site and/or the central location controller to classify the assigned value of the emergency factor to form an assigned value classification, arranging the central location controller to rank status signals received from more than one local emergency detection and response site in the same predetermined local area, according to one or more ranking criteria, and to select one of the emergency detection and response sites according to the ranking as an active local emergency detection and response site to initiate the emergency event response protocol.
  • a method for detecting and responding to emergency events occurring in a predetermined local area comprises: providing a local emergency detection and response unit in the local area, providing the local emergency detection and response unit with one or more local sensing agents and a local detection manager with a local processor, configuring the local sensing agents to detect an emergency event by a change in a corresponding emergency factor in the local area, and to generate data representative of the change in the emergency factor, arranging the detection manager to assign a value to the emergency factor according to the first data, providing a local interface agent for issuing, on a data path, one or more event status signals including the assigned value for the emergency factor, providing a local emergency event response agent for responding to at least one person in or near the local area, and configuring the local interface agent to receive an initiation signal for initiating the local emergency event response agent to implement a response protocol according to an assigned value.
  • a method for detecting and responding to emergency events occurring in a predetermined local area comprises: providing a plurality of local emergency detection and response sites in the local area; providing each local emergency detection and response site with one or more local sensing agents; configuring the sensing agents to sense a change in a predetermined emergency factor in the local area; providing a detection manager at each site to receive, from each sensing local sensing agent, data representative of the change in the emergency factor; arranging the detection manager to assign a value to the emergency factor according to the data; dispatching, over a data path, one or more event status messages including the assigned value for the emergency factor; providing a central location controller to receive, in a first operative phase, the event status messages over the data path; and configuring the central location controller, in a second operative phase, to: classify the assigned value of the emergency factor to form an assigned value classification; rank the status messages according to one or more ranking criteria; and select one of the emergency detection and response sites according to the ranking as an active local emergency detection and response site to initiate the emergency event response protocol.
  • Some exemplary embodiments further comprise configuring the central location controller to: operate in the first operative phase to receive the event status messages; and operate in the second operative phase, concurrently with the first operative phase, to: select a new active local emergency detection and response site according to changes in the assigned values of the emergency factors; and instruct the new active local emergency detection and response site to initiate the emergency event response protocol.
  • Figure 1 is a schematic diagram of an emergency detection and response system, in accordance with one embodiment of the invention.
  • Figure 2 is a schematic diagram of the system of Figure 1, further comprising respective response modules for each detection module, in accordance with one embodiment of the invention.
  • Figures 3A and 3B are operational perspective views of an emergency detection and response system, in accordance with an exemplary embodiment of the invention.
  • Figures 4 and 5 are schematic views of portions of the system of Figure 3.
  • Figure 6 is a flow diagram showing an operative mode of the system of Figure 3A, in accordance with an exemplary embodiment of the invention.
  • Figures 7A and 7B are raw and processed video images, respectively, captured by an emergency detection and response system in tracking a user in a predetermined area and in identifying regular activity, in accordance with one embodiment of the invention.
  • Figures 8A and 8B are raw and processed video images, respectively, captured by the emergency detection and response system of Figures 7A and 7B, identifying a possible emergency event, in accordance with one embodiment of the invention.
  • each data set may be processed locally and/or centrally to identify an emergency event, referred to hereinbelow in various examples as a local emergency factor, or again be processed in defining a globally assessed emergency event or factor.
  • the system can be configured to implement an appropriate response protocol.
  • an appropriate response may be established based on a single data set upon this data set identifying an emergency event of concern, whereas in other embodiments, two or more data sets, for example as generated from distinctly located sensors, may be used to improve emergency identification and response.
  • respective data sets may be compared and ranked, whereby a highest ranking data set (or emergency factor) may be used to further identify the emergency event and/or implement an appropriate response protocol.
  • the audio signals captured by each of these sensors may be ranked based on one or more ranking criteria (e.g. signal strength, background noise, proximity as defined by complementary sensory means, speech recognition reliability, etc.), and a response protocol implemented as a function of this highest ranking signal.
  • this protocol may involve the dispatch of an emergency communication to an external party based, at least in part, on a content of the highest ranking audio signal.
  • this protocol may rather or also involve prompting a user in or near the predetermined area via a prompting device associated with a highest ranking sensor (i.e. the sensor having produced the highest ranking data set), whereby a user response captured by the highest ranking sensor and/or other sensors/input devices associated therewith (e.g. commonly located) may be used to further ascertain the situation in implementing the response protocol.
  • respective data sets may rather or also be compared and merged based on overlap identified therebetween, thereby allowing the system to produce a more accurate rendition of the emergency event taking place, and thus leading to the rendition of a more accurate or informative emergency factor.
  • where imaging signals are captured by two or more distinctly disposed imaging sensors (e.g. video, infrared (IR), multispectral imaging, heat sensor, range sensor, microphone array, etc.), data sets representative thereof may be compared to identify overlap therebetween (e.g. via computer vision techniques, 2D/3D mapping, etc.) to enhance emergency identification and thus improve response thereto.
  • a merged data set may provide for a more accurate depiction or imaging of the event, for example, where different imaging sensors may be used to piece together a more complete image of a user in distress located in partial view of these sensors.
  • merged data sets may be used to improve locational techniques for accurately identifying a location of such a user, for example, using various 2D/3D mapping techniques.
  • both data set ranking and data set merging may be implemented in concert to increase system accuracy, or may be implemented independently depending on the application at hand, or again exclusively depending on the intended application for the system.
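  • purely by way of illustration, the Python sketch below shows one way such ranking and merging might be combined; the per-set reliability score, the bounding-box representation and the overlap test are assumptions made for this sketch, not details of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class SensorDataSet:
    sensor_id: str
    reliability: float  # e.g. signal strength or tracking confidence (assumed metric)
    bbox: tuple         # (x, y, w, h) of the tracked object in a shared coordinate frame

def overlap_area(a, b):
    """Intersection area of two (x, y, w, h) boxes."""
    w = min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0])
    h = min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1])
    return max(0, w) * max(0, h)

def rank_and_merge(data_sets):
    """Rank data sets by reliability, then merge those overlapping the best view."""
    ranked = sorted(data_sets, key=lambda d: d.reliability, reverse=True)
    best = ranked[0]
    merged = [best.bbox] + [d.bbox for d in ranked[1:]
                            if overlap_area(best.bbox, d.bbox) > 0]
    return best, merged

best, merged = rank_and_merge([
    SensorDataSet("cam_A", 0.91, (40, 30, 80, 120)),
    SensorDataSet("cam_B", 0.78, (70, 40, 90, 110)),
])
print(best.sensor_id, merged)  # cam_A's view, pieced together with cam_B's overlap
```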
  • the system 100 generally comprises two or more detection modules 102A, 102B each adapted for operative coupling to one or more sensors, as in sensors 104A and 104B, respectively, which are distinctly disposed in or near a predetermined area 106 to detect emergency events occurring in or near this area (e.g. at location A).
  • the sensors 104A, 104B, in concert with detection modules 102A, 102B, are generally configured to generate respective data sets representative of a detected event.
  • while distinct detection modules are depicted in this example to schematically represent the generation of distinct data sets representative of distinct data perspectives, viewpoints and/or origins as prescribed by the respectively located sensors from which they originate, various data/signal processing architectures and platforms may be considered herein without departing from the general scope and nature of the present disclosure.
  • distinctly located sensors may themselves be configured to generate respective data sets via integral detection modules, or again be configured to communicate raw or pre-processed data to a common detection module for further processing, albeit retaining some original signature in identifying the sensor from which such data originated.
  • the system 100 further comprises one or more controllers 108 operatively coupled to each detection module 102A, 102B and configured to compare each of the respective data sets (e.g. via ranking and/or merging module 109) and implement a response protocol as a result of this comparison and as a function of at least one of these respective data sets.
  • the controller 108 may be configured to process locally derived emergency factors provided by the respective data sets of each detection module to select and implement an appropriate response protocol.
  • the controller may derive each local emergency factor from raw and/or processed data in making this comparison.
  • the controller may rather derive a global emergency factor from distinctly provided data sets, for example where such data sets are compared for ranking and/or merging purposes.
  • various techniques may be employed to achieve intended results, such as machine learning techniques, artificial intelligence and/or computer vision techniques implemented on the local and/or global data set(s).
  • the controller 108 may be configured to dispatch an emergency message or warning to an external user, such as a friend, family member, neighbour, medical practitioner, security unit, external emergency response unit and the like, via one or more wired and/or wireless communication networks, this functionality commonly depicted in this illustrative embodiment by antenna 110.
  • Examples of such communication links may include, but are not limited to, a landline phone link, a cellular phone or data link, a residential Wi-Fi network in concert with an associated communication platform (e.g. home computer network for dispatching an e-mail, text message or the like), and the like, as will be readily appreciated by the skilled artisan.
  • a given response protocol may include the dispatch of related sensory data, be it raw, pre-processed and/or processed data, for example in qualifying the detected emergency.
  • Dispatched data may include, but is not limited to, audio data, for example as recorded in identifying the emergency event and/or as a response to an emergency initiated prompt (discussed below), video/imaging data depicting the user's emergency/condition, positional data to identify a location of the user within the area or amongst plural areas commonly monitored by the system, environmental data, etc.
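  • as a minimal sketch of such a dispatch, assuming a simple dictionary payload and hypothetical field names (none of which are specified in the disclosure):

```python
def build_emergency_dispatch(event, include_audio=True, include_video=False):
    """Assemble an emergency message qualifying the detected event (illustrative)."""
    payload = {
        "event_type": event.get("type"),          # e.g. "fall"
        "location": event.get("location"),        # position within the monitored area(s)
        "environment": event.get("environment"),  # e.g. smoke/CO readings
    }
    if include_audio:
        payload["audio"] = event.get("audio")     # recorded response to a prompt
    if include_video:
        payload["video"] = event.get("video")     # imaging depicting the user's condition
    return payload
```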
  • while controller 108 is depicted in this example to interface with respective detection modules 102A, 102B, alternative processing architectures may also be contemplated without departing from the general scope of the present disclosure.
  • the computational and/or communicative functionalities of the controller 108 as described in the context of this example may be singularly implemented within a same global control module, wherefrom emergency detection and response protocols may be commonly initiated in respect of a given area, but also in respect of plural areas commonly monitored by an expanded embodiment of system 100.
  • each detection unit may comprise a respective sensor(s), processor, data storage device and network communication interface to implement various computational and communicative functionalities of the system 100 locally.
  • the controller may be described as comprising a merging module and/or a ranking module, as introduced above and further described below, for the purpose of managing multiple data sets.
  • processing modules may be implemented independently within the context of a common controller or control module, or again implemented distributively within the context of two or more central and/or distributed controllers.
  • modularity of the processing techniques contemplated herein may be more or less defined in different embodiments, whereby data may be processed in parallel, in sequence and/or cooperatively depending on the intended outcome of the system's implementation.
  • the system 200 again generally comprises two or more detection modules 202A, 202B each adapted for operative coupling to one or more sensors, as in sensors 204A and 204B, respectively, which are distinctly disposed in or near a predetermined area 206 to detect emergency events occurring in or near this area (e.g. in respect of user B).
  • the sensors 204A, 204B, in concert with detection modules 202A, 202B, are generally configured to generate respective data sets representative of a detected event.
  • the system 200 further comprises one or more controllers 208 operatively coupled to each detection module 202A, 202B and configured to compare each of the respective data sets (e.g. via ranking and/or merging module 209) and implement a response protocol as a result of this comparison and as a function of at least one of these respective data sets.
  • a respective response module 212A, 212B is associated with each of the detection modules 202A, 202B and adapted for operative coupling to a respective prompting device 214A, 214B and input device, which in this embodiment, is commonly operated via sensors 204A, 204B.
  • a response protocol may be selected and implemented in which the user B is first prompted via an appropriate response module and prompting device to provide a user input in return, which user input can then be relayed to the controller (or local response module) to refine, adjust and/or terminate the response protocol.
  • a highest ranking data set may be used to identify a most reliable sensory location (e.g. from which a user response may have the greatest likelihood of being understood or captured by the system 200). Based on this ranking, a response module and prompting device associated with this location (e.g. associated with a highest ranking detection module/data set) may be operated by the controller to proceed with the next step of the response protocol.
  • a prompt is provided via a selected prompting device, and a user input is recorded in response thereto.
  • where the sensors 204A, 204B include an audio sensor (e.g. microphone, microphone array), this audio sensor may double as an input device for recording the user's response.
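  • a minimal sketch of this prompt-and-refine step, assuming hypothetical speak()/listen() methods on the highest ranking unit's response module (both names are invented for illustration):

```python
def run_prompt_step(units, best_unit_id, timeout_s=15):
    """Prompt the user via the highest ranking unit and refine the protocol."""
    unit = units[best_unit_id]
    unit.speak("Do you need help?")          # prompting device at the best location
    reply = unit.listen(timeout_s)           # audio sensor doubles as the input device
    if reply is None:
        return "escalate"                    # no reply within the timeout
    reply = reply.lower()
    if "no" in reply or "fine" in reply:     # a real system would use the ASR grammar
        return "stand_down"                  # terminate the response protocol
    return "escalate"                        # adjust/escalate, e.g. dispatch a message
```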
  • a combined detection and response unit encompassing the functionality of both detection and response modules may be considered, as can a self-contained processing unit further encompassing at least some of the processing functionalities of the controller, to name a few.
  • an additional or alternative prompting device such as a visual prompting device for the hearing impaired (e.g. communal or dedicated screen) may be used by respective response modules or shared between them in prompting a user response.
  • the controller 208 may be configured to dispatch an emergency message or warning to an external user (e.g. via antenna 210), based on the processed sensory data and/or user responses.
  • the system 10 is generally provided for detecting and responding to emergency events occurring in a predetermined local area 12.
  • the system includes a plurality of local emergency detection and response units (hereinafter referred to as EDR units) 14 positioned in the local area 12.
  • Each EDR unit 14 includes one or more local sensing agents or sensors 16 and a local detection manager 18 (i.e. detection module).
  • the local detection manager includes a local processor 20 with a control module 22 which communicates with the local sensing agents 16, including a local video sensing agent 24, a local audio sensing agent 26 and, in this example, an environmental sensing agent 28.
  • Each local sensing agent is operable to detect a change in a given emergency factor in the local area and to report to the control module accordingly.
  • Each local sensing agent conveys data representative of the change in the emergency factor to the control module 22.
  • the local video sensing agent 24 includes a video camera 30 monitored by a video processing module 32.
  • the video processing module 32 is thus operable to detect a change in a given emergency factor in the local area 12, in this case, for example, by detecting the presence of a person, subject, user or other object in the local area 12.
  • the person may include, among others, a patient in a healthcare facility, or a resident of a residential or other facility, with or without disabilities, or others who may be prone to falling or to disorientation or who suffer from another condition worthy of monitoring and response, in the manner to be described.
  • the audio sensing agent 26 includes a microphone 34 monitored by a speech dialog module 36.
  • the speech dialog module 36 is thus operable to detect a change in a given emergency factor in the local area 12, in this case, for example, by detecting a verbal message from a person in the local area, which may be in response to automated questions being asked by the EDR units 14 to determine the severity of the emergency, as will be described.
  • the video processing module 32 and the speech dialog module 36 convey data to the control module 22 for further processing.
  • the control module 22, the video processing module 32, and the speech dialog module 36 may be applications running on the local processor 20 or be one or more distinct processors, such as by way of video cards and the like.
  • the control module 22 is, in turn, operable to assign a value to the emergency factor according to the data received from one or more of the video processing module, the speech dialog module and the environmental monitor 28, as the case may be.
  • a local interface agent 38 is provided for issuing, on a data path, one or more event status signals including the assigned value for the emergency factor.
  • a central location controller unit 40 is provided in the form of a central server, with one or more central processors 42 communicating with a central interface agent 44 to receive event status messages therefrom on a communication channel shown at 45.
  • the central and local interface agents may include wired or wireless network cards, RFID tags, Bluetooth, or other forms of transmitter receivers as the case may be.
  • the system 10 is operable in a communication network which, in this example, is computer implemented and may be provided in a number of forms, by way of one or more software programs configured to run on one or more general purpose computers, such as a personal computer, or on a single custom built computer, such as a programmable logic controller (PLC) which is dedicated to the function of the system alone.
  • a system controlling such a communication network may, alternatively, be executed on a more substantial computer mainframe.
  • the general purpose computer may work within a network involving several general purpose computers, for example those sold under the trade names APPLE or IBM, or clones thereof, which are programmed with operating systems known by the trade names WINDOWS, LINUX or other equivalents of these.
  • the system may involve pre-programmed software using a number of possible languages or a custom designed version of a programming software.
  • the computer network may include a wired local area network, or a wide area network such as the Internet, or a combination of the two, with or without added security, authentication protocols, or under "peer-to-peer" or "client-server" or other networking architectures.
  • the network may also be a wireless network or a combination of wired and wireless networks.
  • the wireless network may operate under frequencies such as those referred to as "radio frequency" or "RF", using protocols such as 802.11, TCP/IP, BLUETOOTH and the like, or other wireless, satellite or cell packet protocols.
  • Each EDR unit 14 further includes a local emergency event response agent or module 46 for responding to at least one person in or near the predetermined local area (e.g. via a dedicated or shared prompting device and input device).
  • the emergency event response agent is provided by the speech dialog module 36 and a loudspeaker 48.
  • each local EDR unit 14 includes a housing 14a containing the local processor 20, the control module 22, the video camera 30, the video processing module 32, the speech dialog module 36, the microphone 34 and the loudspeaker 48.
  • the surface of the housing 14a may be paintable, allowing for custom colouring of the housing according to the decor of a monitored location.
  • the housing may also be provided in varying shapes and styles creating different options for the product's appearance, as desired.
  • each EDR unit may be provided with a backup battery.
  • the central location controller unit 40 and/or the EDR units 14 are operable for classifying the assigned value of the emergency factor to form an assigned value classification and for initiating the local emergency event response agent 46 to implement a response protocol according to the assigned value classification.
  • the central processor 42 includes a ranking agent 50 for ranking status signals being received from more than one local EDR unit 14 in the same predetermined local area 12.
  • the ranking agent 50 is operable to rank each of the EDR units 14 according to one or more ranking criteria.
  • the central processor is thus operable to select one of the EDR units 14 according to the ranking as an "active" EDR unit to initiate the emergency event response protocol.
  • one or more of the EDR units themselves may be configured to select an "active" EDR unit.
  • the data paths 14b may be configured to form a local emergency detection and response network, in which the local emergency detection and response units are each operable for exchanging the assigned values with one another to form an assigned value group.
  • one or more of the local emergency detection and response units are operable to select an active emergency detection and response unit according to a ranking of individual values in the value group.
  • At least one emergency factor may include a plurality of video variables and/or thresholds, a plurality of audio variables and/or thresholds, and a plurality of environmental variables and/or thresholds.
  • the environmental variables and/or thresholds may, in this case, include temperature, atmospheric pressure, humidity, smoke concentration, carbon monoxide concentration, oxygen concentration, and/or environmental pollutant concentration.
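  • a short sketch of such environmental threshold checks follows; the numeric limits below are placeholders for illustration only, not values from the disclosure:

```python
ENV_THRESHOLDS = {
    "smoke_ppm": 150.0,   # placeholder limit
    "co_ppm": 35.0,       # placeholder limit
    "o2_pct_min": 19.5,   # placeholder minimum
}

def environmental_alerts(readings):
    """Return the environmental emergency factors whose thresholds are exceeded."""
    alerts = []
    if readings.get("smoke_ppm", 0.0) > ENV_THRESHOLDS["smoke_ppm"]:
        alerts.append("smoke")
    if readings.get("co_ppm", 0.0) > ENV_THRESHOLDS["co_ppm"]:
        alerts.append("carbon_monoxide")
    if readings.get("o2_pct", 20.9) < ENV_THRESHOLDS["o2_pct_min"]:
        alerts.append("low_oxygen")
    return alerts

print(environmental_alerts({"smoke_ppm": 10.0, "co_ppm": 60.0}))  # ['carbon_monoxide']
```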
  • the ranking agent 50 may access and compare a plurality of emergency factors received from the plurality of reporting local emergency detection and response units 14.
  • the emergency factor may include, in this case, a video image, the variables including size, shape, and motion of an object being tracked.
  • the emergency factor may include an audio signal, the variables including amplitude and type of the audio signal.
  • Figures 3 to 5 show a general schematic of the various parts of a single unit.
  • Figure 6 is a flow diagram outlining an exemplary decision-making process performed by the central server, to communicate with multiple EDR units simultaneously.
  • the system 10 is configured so that the EDR units 14 may be located throughout a person's living space.
  • the central server 40 makes overall decisions about which EDR unit 14 is actively monitoring or communicating with the human user at a given point in time. In the event of an emergency, the central server 40 may also facilitate communications with the outside world (e.g. contact a neighbour, relative or 911), by way of an external interface unit 52, for example.
  • Each EDR unit 14 may thus include one or several hardware components which may be installed in a common housing, such as one or more cameras (e.g. webcam or 'steerable' camera, infrared or multispectral imaging device, heat sensor, range sensor, etc.), one or more small loudspeakers, a single, multiple or small array of microphones, a computer processor, or an environmental monitor, such as a smoke and/or carbon monoxide detector.
  • Each EDR unit 14 may be portable or mobile, such as on a movable robot, or it may be stationary and installed in an appropriate location in a user's house or long-term care facility, such as on the ceiling of the living room or bedroom.
  • the EDR unit, or a component thereof, may be mounted on the user. This might include a blood pressure or heart rate monitor, or the like.
  • the EDR unit 14 may use the camera(s) or microphone(s) to monitor the living environment of a human subject in real-time.
  • the camera may be fixed within the housing, with a static field of view, or 'steerable' allowing it to follow the movement of a given subject.
  • the local processor 20 in this case performs real-time analysis of the video and audio inputs to determine if an emergency event, such as a fall, has occurred.
  • the EDR unit 14 can communicate with the subject via the microphone 34 and loudspeaker 48 and initiate a dialog using speech recognition software. Communicating with the subject in this way allows the system to determine the level of assistance required. If external assistance is required, the local processor 20 can relay this information to the central server 40 located at a convenient location in the house or other facility. Communication between the local processor 20 and the central server 40 can occur via either a standard wired or wireless (Wi-Fi) network.
  • the server may send information about an emergency event to the outside world via a variety of possible communications methods (e.g. landline or cell phone network, text messaging, email), via the external interface 52.
  • the system 10 may ensure that the privacy of the subject is maintained at all times by configuring the local processor 20 to relay only computer vision and speech recognition results, as well as information about a possible emergency event, to the central server, but not any of the original video or audio information.
  • original video or audio information may be relayed to the central server for further processing, as well as other or alternative types of data such as blob information, feature vectors, etc.
  • the environmental sensing agent 28 or Environmental Monitor may include sub-components such as a smoke detector and a carbon monoxide detector.
  • the Environmental Monitor may also relay this information to the appropriate emergency services via the central server 40.
  • the video processing module 32 takes real-time video input from the video camera 30, and performs computer vision algorithms to determine if an emergency has occurred.
  • the employed computer vision algorithms may include object extraction and tracking techniques such as adaptive background subtraction, color analysis, image gradient estimation, and connected component analysis. These techniques allow the system to isolate a human subject from various static and dynamic unwanted features of the video scene, including the static background, dynamic cast shadows and varying light conditions. As such, characteristics of the subject's movement, posture and behaviour may be monitored in real-time to determine if an emergency (e.g. a fall) has occurred.
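  • as a minimal sketch of this pipeline, assuming OpenCV as the implementation library (an assumption; the disclosure does not name one):

```python
import cv2

# MOG2 performs adaptive background subtraction; detectShadows=True marks cast
# shadows separately so the subject's silhouette can be isolated from them.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)

def extract_subject(frame, min_area=2000):
    """Return the bounding box (x, y, w, h) of the largest moving object, or None."""
    mask = subtractor.apply(frame)
    # MOG2 labels shadow pixels as 127; keep only confident foreground (255).
    _, fg = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    n, _, stats, _ = cv2.connectedComponentsWithStats(fg, connectivity=8)
    best = None
    for i in range(1, n):  # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area and (best is None or area > best[4]):
            best = (x, y, w, h, area)
    return best[:4] if best else None
```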
  • the video processing module 32 may relay information of this emergency event to the control module 22.
  • Figures 7A and 7B illustrate tracking results for a subject walking, with an original "webcam" image and the extracted silhouette of the subject (light blue, or white as shown in the black and white version presented herein) and their shadow (dark blue, or grey in the black and white version presented herein).
  • while the subject is upright and being tracked, the tracking box shows "green" (in this case indicated by the chain-dotted lines).
  • upon a possible fall, the tracking box may then change state, such as to the colour "red" (as shown by the solid lines), and the silhouette becomes elongated with very little shadow.
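  • the figures suggest a simple posture test; a sketch of one possible heuristic follows (the aspect-ratio threshold and shadow test are assumptions, not disclosed values):

```python
def looks_like_fall(bbox, shadow_area, lying_aspect=1.5):
    """Flag a possible fall: a wide, low silhouette with very little cast shadow."""
    x, y, w, h = bbox
    lying = w / max(h, 1) > lying_aspect         # the "red" box state of Figs. 8A/8B
    little_shadow = shadow_area < 0.1 * (w * h)  # silhouette with very little shadow
    return lying and little_shadow
```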
  • the control module 22 may instruct the speech dialog module 36 to initiate a conversation with the subject using speech recognition software, such as a small vocabulary, speaker-independent automatic speech recognition (ASR) software.
  • This ASR software may be specially trained on population- specific and/or situation-specific voice data (e.g. older adult voices, voices under distress, atypical voice patterns caused by affliction or emergency events). The system may also learn its users' specific voice patterns in an offline or on-line manner during the lifetime of its usage to maximize speech recognition results.
  • Other audio processing techniques such as real-time background noise suppression, and the recognition of environmental sounds (e.g. falling objects, slam sounds, etc.) may also be employed to ensure robust system performance.
  • the speech dialog module 36 may communicate directly with the subject by outputting speech prompts via the loudspeaker(s) 48 and listening to audio input via the microphone(s) 34, to determine the level of assistance that the subject may require.
  • the outcome of this speech dialog can be sent to the control module 22 and, if further assistance is required, the control module 22 can relay this to the central server 40 via the communications network 45.
  • visual prompts may be used to prompt the hearing impaired (e.g. via images, text and/or speech-to-text displays in or near the area), and one or more physical input interfaces (e.g. a specialized keypad or touchscreen) may be used by those unable to respond orally to reply to such prompts.
  • speech synthesis technology may be used in addressing occupants in rooms.
  • the voice pattern of the speech synthesis system may be customized or trained with the voice patterns of a particular person, e.g. a familiar and trusted voice, for instance to allow people with afflictions such as Alzheimer's or dementia to more easily co-operate with it.
  • An alternative example implementation of the system 10 may allow for emergency situations to be detected by either the video processing module 32 or the speech dialog module 36, which operate simultaneously.
  • the microphone 34 may be on at all times, allowing the speech dialog module 36 to listen for key emergency words or audio events (e.g. a cry for "help!", a loud crash), or again to detect distressed and/or atypical speech.
  • This implementation may be particularly useful if the video processing module 32 is unable to detect a given emergency situation (e.g. if the subject is outside the field of view of the camera(s), or during low light conditions such as nighttime).
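  • an always-on listener of this kind might be sketched as follows; the keyword list and loudness threshold are illustrative assumptions:

```python
EMERGENCY_WORDS = {"help", "emergency", "fall", "hurt"}  # illustrative vocabulary

def classify_audio_event(transcript, loudness_db, crash_db=85.0):
    """Classify an always-on audio observation (sketch)."""
    if transcript and EMERGENCY_WORDS & set(transcript.lower().split()):
        return "keyword"          # e.g. a cry for "help!"
    if loudness_db >= crash_db:
        return "loud_crash"       # e.g. a falling object or slam sound
    return None                   # nothing of interest

print(classify_audio_event("please help me", 55.0))  # keyword
```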
  • the EDR unit 14 may also include a motion sensor 54, such as an infrared sensor.
  • the video camera 30 may also be equipped for "night vision". This may add additional functionality to the system, such as the ability for a given unit to automatically turn on when significant motion is detected (e.g. when a person enters a room), or for more robust vision tracking in low light conditions (e.g. at nighttime). This functionality may allow the system to also operate in an "away" mode, thereby to detect in-home disturbances or intrusions when the person is not home. Therefore, an additional application for the system may be to act as a home security system or to augment existing home security systems.
  • a light bulb may also be fitted in each EDR unit, so as to be activated by a light switch, for example on a neighbouring wall. If desired, the light on the EDR unit may operate in the same fashion as a conventional ceiling-mounted light fixture, enabling a user to replace or augment existing ceiling mounted light fixtures with functions of the device 10.
  • the central server 40 is able to handle simultaneous communications with one or multiple EDR units 14, allowing for multiple EDR units 14 to be installed in different rooms of a house, assisted living facility or long-term care facility. Therefore, at a given point in time, the central server may analyze the information simultaneously received from multiple EDR units 14 and determine which EDR unit is currently "active" (i.e., which camera currently has the subject of interest within its field of view). This may be accomplished by comparing the audio and/or computer vision tracking and/or audio processing results from each local processor 20. For example, the EDR unit currently tagged as "active" may be the one currently tracking the object with the largest size or with significant movement. This methodology allows for the central server 40 to track a subject between the cameras of multiple EDR units installed in the same room, or throughout various rooms of a house, ensuring that an emergency event is detected robustly.
  • the system may be configured to select as "active" the EDR unit whose generated data achieves the highest ranking, e.g. a highest reliability measure.
  • respective data sets processed from multiple EDRs can be compared to identify overlap therebetween, whereby multiple data sets may be merged based on this identified overlap to achieve greater monitoring, emergency detection and/or response.
  • the system may be configured to actively process data from each active EDR to generate a merged data set. Merged data sets may, for example, provide greater locational data with respect to the user (e.g. 2D/3D mapping) as well as greater depiction of the user's status, particularly where only partial images can be rendered by each EDR, for example.
  • Each EDR may also be adapted for automatic global localization (e.g. real-world coordinates such as longitude/latitude or address), for example via GPS and/or network geolocation, with a possible manual override.
  • Such global localization may prove useful in initiating emergency responses and dispatching an exact location of the emergency event to external responders.
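  • a small sketch of such localization with manual override (the fallback ordering here is an assumption):

```python
def resolve_unit_location(manual=None, gps_fix=None, network_fix=None):
    """Resolve an EDR unit's real-world location for dispatch to responders."""
    if manual is not None:       # installer-entered address overrides everything
        return manual
    if gps_fix is not None:      # (latitude, longitude) from a GPS receiver
        return gps_fix
    return network_fix           # network geolocation as a last resort
```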
  • a given embodiment may include an event interface, whereby a central location controller unit may include an externally addressable interface allowing authorized third party clients to receive event data, such as emergency factors and other information gathered by the local processors, and act on it.
  • the event interface may allow authorized users to send event information to external notification methods, publish emergency events, import contact data and data storage, interface with web services and social networks, and provide GPS/locational information for emergency response and positional information for tracking the position of people in their environment (e.g. used in applications such as extracting, analyzing, or using patterns of living).
  • data may be correlated with other sensors in the environment such as pressure sensors, motion sensors, "smart-home" sensors, etc., for instance for improving the accuracy of emergency detection and 3rd party applications.
  • smart-home devices may also be used in the context of the herein described system to ensure user safety. For example, ensuring lights are on in the rooms people are present in or about to enter to prevent falls, or ensuring that the stove is turned off when a person leaves it unattended, e.g. when the user is detected leaving the kitchen, may contribute to improved user safety.
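  • such rules might be sketched as follows, assuming a hypothetical smart-home controller exposing lights_on() and stove_off() (both invented names for illustration):

```python
def apply_safety_rules(tracking, home):
    """Drive smart-home devices from tracking data to improve user safety."""
    for room in tracking.get("entering", []):
        home.lights_on(room)          # light rooms being entered to prevent falls
    if tracking.get("stove_on") and "kitchen" not in tracking.get("occupied", []):
        home.stove_off()              # user detected leaving the kitchen unattended
```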
  • the central controller may also implement the functionality of a central "smart-home" controller, allowing user interfaces to smart home devices, for example.
  • the central location controller unit and/or the local units may be interfaced by users and/or administrators over a local or remotely accessible interface, such as a database-backed or configuration-file based webpage configuration. This interface may allow the users to set preferences on event notification, account settings for 3rd party interfaces, etc.
  • an administrator configuration may include advanced fine-tuning options for the emergency factor along with debugging information.
  • online learning may also be implemented to adjust parameters of the system in response to learned user preferences, such as a user's preference for emergency response criteria (e.g. detection sensitivity, timeout on user prompts for automatic emergency calls, etc.). For example, some users may have slower responses to voice prompts than others, and the system may learn to wait longer for a response. Some users may be more or less independent than others, allowing the system to learn a user's tolerance preferences for help/emergency response in order to preserve dignity and a sense of independence, for example.
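  • one simple realisation of such online learning, using an exponential moving average of observed response latencies (the learning rule and constants are assumptions for this sketch):

```python
class PromptTimeout:
    """Learn how long to wait for a user's reply before auto-escalating."""
    def __init__(self, initial_s=15.0, alpha=0.1, margin=2.0):
        self.typical = initial_s / margin  # running estimate of response latency
        self.alpha = alpha                 # learning rate
        self.margin = margin               # wait this multiple of the typical latency

    def observe(self, latency_s):
        """Update the latency estimate after each answered prompt."""
        self.typical = (1 - self.alpha) * self.typical + self.alpha * latency_s

    @property
    def timeout_s(self):
        return self.margin * self.typical

t = PromptTimeout()
t.observe(12.0)               # a slower-than-average user
print(round(t.timeout_s, 1))  # the system learns to wait longer
```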
  • Figure 6 is a flow diagram outlining this decision-making process performed by the central server 40, which may communicate with multiple EDR units 14 simultaneously.
  • the central server 40 receives emergency monitoring results from the EDR units 14, as shown in Figure 3A, with the three EDR units 14 on the left side of the space 12 reporting messages on data paths 14b.
  • This connection 45 may be a wireless network connection (Wi-Fi) or some other type of communications connection.
  • the emergency monitoring information received by the central server 40 may include complete or partial results from the analysis of video input and/or audio input from each EDR unit 14.
  • video analysis results may include information such as: the presence or absence of an object of interest (e.g. human user) in the camera's field of view, size and movement of the object, etc.
  • audio analysis results may include: the presence or absence of a significant audio event of interest (e.g. speech, loud crash noise), the detected loudness of the audio event, etc.
  • Other important video/audio monitoring results may also be received at this step depending on the nature of the video/audio analysis performed by the EDR units.
  • the monitoring results received from all EDR units 14 are compared, and the central server 40 decides which EDR unit 14 is currently in a fully "active" state.
  • the central server 40, via ranking agent 50, ranks all the received emergency monitoring results according to a set of ranking criteria (e.g. digital imaging, video and/or audio analysis criteria and thresholds).
  • analysis criteria and thresholds may be fixed or dynamic depending on a given video or audio scenario.
  • Such ranking criteria may include video analysis metrics such as the current size of the object being tracked, if present, by each video camera 30, and audio analysis metrics such as the current amplitude of audio or speech, if present, captured by each microphone 34.
  • the EDR unit 14 with the highest rank may then be chosen to be the currently "active" EDR unit 14. Using these ranking criteria may ensure that the chosen active EDR unit will be the unit that is currently best suited for monitoring the human user and detecting and responding to possible emergency events. In this case, the system will continue to receive EDR data on a regular basis as the subject progresses through a particular monitoring time period, either in a single monitored location or in a plurality of monitored locations. This may involve several ceiling-mounted EDR units mounted throughout the location(s) and one central controller that supervises and coordinates the monitoring operations of the EDR units.
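  • as a sketch of this ranking step (the weights and normalising constants are assumptions; the criteria, tracked-object size and audio amplitude, come from the description above):

```python
def choose_active_unit(reports, w_video=0.6, w_audio=0.4):
    """Rank EDR monitoring reports and return the id of the 'active' unit."""
    def score(r):
        video = r.get("object_area", 0) / 10000.0    # size of the tracked object
        audio = r.get("audio_amplitude", 0) / 100.0  # loudness of speech/sound
        return w_video * video + w_audio * audio
    return max(reports, key=score)["unit_id"]

reports = [
    {"unit_id": "bedroom", "object_area": 8200, "audio_amplitude": 10},
    {"unit_id": "hallway", "object_area": 1500, "audio_amplitude": 65},
]
print(choose_active_unit(reports))  # "bedroom": largest tracked object wins
```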
  • the central controller may be configured to employ predictive modeling to determine a predicted location of an occupant.
  • the central controller may activate (power-on) tracking units that are located in areas that are proximal to and currently containing the occupant(s). Areas that are not in use by the occupant(s) may be kept in a standby (power-save) mode. Predictive algorithms may be employed to determine the areas the occupant is most likely to move to, activating tracking units that are located in the areas along the occupant(s)' likely path of travel.
  • the central controller may be used to place all units on standby except those monitoring areas of critical interest. Examples of this include monitoring only a bedroom location, while the occupant is asleep or only the entrance area if the occupant has left his/her home.
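  • one simple way to realize such prediction — a sketch only, with invented room names and transition probabilities that would in practice be learned from occupancy history — is a first-order Markov model over rooms:

    # Hypothetical first-order transition model: probability of the occupant's
    # next room given the current room (values invented for illustration).
    TRANSITIONS = {
        "bedroom":  {"hallway": 0.7, "bedroom": 0.3},
        "hallway":  {"kitchen": 0.4, "bathroom": 0.4, "hallway": 0.2},
        "kitchen":  {"hallway": 0.6, "kitchen": 0.4},
        "bathroom": {"hallway": 0.8, "bathroom": 0.2},
    }

    def units_to_wake(current_room, threshold=0.3):
        """Power on units in the current room plus any room the occupant is
        likely (probability >= threshold) to move to; leave the rest in
        standby (power-save) mode."""
        likely_next = {room for room, p in TRANSITIONS[current_room].items()
                       if p >= threshold}
        return {current_room} | likely_next

    print(units_to_wake("hallway"))   # {'hallway', 'kitchen', 'bathroom'}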
  • the central server 40 may notify all EDR units 14 of this decision, and/or notify only the currently active unit to continue with the full detection of and response to an emergency event, the latter case shown by path 14c in Figure 3B.
  • This decision-making framework by the server may prevent multiple EDR units from detecting the same emergency event and confusing the user by all starting a dialog with the user simultaneously.
  • the EDR units are operable to sense data either continuously, regularly or periodically, thus dispatching signals to the central controller continuously, regularly or periodically and then being responsive to the central controller to be selected for implementing dialog with a subject. Therefore, the communication with the subject occurs following (and not before) selection of an active EDR, according to prevailing emergency factor data.
  • a train of EDR units may be activated and then deactivated, in succession, as the subject moves.
  • the active EDR unit 14 is then operable to monitor the user and detect if an emergency occurs (e.g. a fall). If such an event does occur, this information may be relayed back to the server, which may in turn instruct the active EDR unit 14 to initiate a dialog with the subject using speech recognition software to determine what level of assistance may be required (steps 304 and 307).
  • if the active EDR unit determines that the user does require assistance, this information may be relayed to the central server 40 (step 308).
  • the central server 40 may then initiate the appropriate communication about the detected emergency situation to the outside world (step 309).
  • the system could notify a neighbour, relative, medical staff (e.g. family doctor), emergency services (e.g. 911), or other appropriate personnel, via the external interface unit 52.
  • this communication may occur via a variety of possible communications protocols, such as landline phone, cell phone, email, text message (SMS), or some other communication system; a dispatch sketch follows below.
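  • a dispatch step of this kind might be sketched as follows; the contact list, the severity levels, and the send callback are all illustrative assumptions rather than details of this disclosure:

    # Hypothetical escalation table: each contact handles emergencies up to a
    # given severity; higher severities fall through to emergency services.
    CONTACTS = [
        {"name": "neighbour", "channel": "sms",   "address": "+1-555-000-1111", "max_severity": 1},
        {"name": "relative",  "channel": "phone", "address": "+1-555-000-2222", "max_severity": 2},
        {"name": "emergency", "channel": "phone", "address": "911",             "max_severity": 3},
    ]

    def notify(severity, message, send):
        """Contact the first respondent able to handle this severity.
        send(channel, address, message) abstracts the actual protocol
        (landline phone, cell phone, email, SMS, ...)."""
        for contact in CONTACTS:
            if severity <= contact["max_severity"]:
                send(contact["channel"], contact["address"], message)
                return contact["name"]
        return None

    chosen = notify(2, "Fall detected; user requests help",
                    lambda ch, addr, msg: print(f"[{ch}] -> {addr}: {msg}"))
    print(chosen)   # 'relative': the first contact covering severity 2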
  • information about the current monitored behaviour of the user may be used to update a learned model of the user's daily "normal" behaviour and routines.
  • This model may contain information about time of day and duration spent in certain locations of a house, etc.
  • the EDR unit or central server may employ artificial intelligence algorithms (e.g. Markov Decision Process methods) to enable the system to gradually learn the expected daily behaviours of a given subject. This may allow the overall system to detect emergency situations more robustly by further characterizing "normal” versus "abnormal" behaviour for a given subject, which is done at steps 305 and 306.
  • the central server 40 may still tell the active EDR unit 14 to initiate a dialog with the user (step 307) that is adaptable and appropriate for the particular situation to determine if any assistance is required. For example, the system could ask the user if everything is ok because it appears they have been in the bedroom all afternoon.
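  • one very simple realization of such a routine check — far simpler than the Markov Decision Process methods mentioned above, and with invented thresholds and data — keeps a per-location duration profile and flags large deviations, such as the "in the bedroom all afternoon" case:

    from statistics import mean, stdev

    class RoutineModel:
        """Hypothetical sketch: learn how long the user normally spends in
        each location and flag observations far outside that profile."""

        def __init__(self, z_threshold=3.0):
            self.history = {}             # location -> list of daily minutes
            self.z_threshold = z_threshold

        def update(self, location, minutes):
            self.history.setdefault(location, []).append(minutes)

        def is_abnormal(self, location, minutes):
            past = self.history.get(location, [])
            if len(past) < 5:             # not enough data to judge yet
                return False
            mu, sigma = mean(past), stdev(past)
            return sigma > 0 and abs(minutes - mu) / sigma > self.z_threshold

    model = RoutineModel()
    for minutes in [95, 100, 110, 90, 105, 98]:   # typical bedroom afternoons
        model.update("bedroom", minutes)
    print(model.is_abnormal("bedroom", 300))      # True -> ask "is everything ok?"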
  • the active EDR unit 14 could perform these tasks by employing adaptable speech dialog technologies such as Text-To-Speech (TTS) software in conjunction with automatic speech recognition (ASR) software.
  • EDR units 14 may have a scaled-down local processor or may even have no processor at all.
  • EDR units may not perform full video processing or speech/audio processing, but may send video and audio data over the communications network 45 in real-time to either be processed by central server 40 or by a nearby EDR unit that does contain a fully functional local processor 20.
  • This type of network configuration may be advantageous in some scenarios due to the centralization of video and audio processing for the system 10, which may make the system more affordable by reducing hardware and software costs of the EDR units.
  • EDR units without a fully functional local processor 20 may be constructed using a smaller housing 14a, which may make them easier to install throughout a home as well as more aesthetically pleasing to the user.
  • rather than assigning a value to the emergency factor according to the data received from the video processing module and/or the speech dialog module, control module 22 may instead transmit the raw video image or images to the central controller; the central controller may then combine these with an image or images from the control unit of a neighbouring EDR unit to assemble a full image of the combined field of view, so that the emergency factor may be valued based on this merged data set (a toy sketch follows below).
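  • as a toy illustration of this thin-client path — with horizontal concatenation standing in for whatever image registration the central controller would actually perform, and frame shapes assumed to match — the merge might look like:

    def merge_views(frames):
        """Combine raw frames forwarded by neighbouring EDR units into one
        wide view. frames: list of 2D grids (rows of pixel values), assumed
        here to share the same height and to abut left-to-right."""
        merged = []
        for rows in zip(*frames):         # walk the units' rows in lockstep
            merged.append([px for row in rows for px in row])
        return merged

    left  = [[0, 0], [1, 1]]              # toy 2x2 frames from two units
    right = [[2, 2], [3, 3]]
    print(merge_views([left, right]))     # [[0, 0, 2, 2], [1, 1, 3, 3]]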
  • the device 10 may be operated as follows.
  • the units that are closest to the occupant may "view” and “listen” for input (e.g., commands and responses) from the occupant.
  • the input received by each EDR unit (hereinafter referred to as a ceiling-mounted unit (CMU)) may then be assigned a confidence factor (CF), which may indicate the likelihood that the input from the occupant, as perceived by the system, matches the occupant's actual verbal input.
  • the CF may be influenced by aspects such as (but not limited to) CMU proximity to the occupant, ambient noise levels, and word recognition error rate (WRER, which is estimated by the speech recognition software), and may be calculated using a function similar to the one described in Equation 1:
  • CF = 1/[β1(proximity) + β2(ambient noise level) + β3(1 − WRER)]   (1)
  • β1, β2, and β3 are constants determined through a combination of within-lab experimentation and calibration once the system is installed in an environment. Post-installation calibration may be necessary, as the layout of the environment may influence the acoustics, to determine each CMU's "hearing" range.
  • the CMU that rates the occupant's response with the highest CF may then dictate action by the central control unit.
  • This unit may not necessarily be the CMU closest to the occupant (as demonstrated in the scenarios below).
  • a situation may occur where an occupant has an adverse event in a visual monitoring area covered by two or more CMUs. As the images from the units may be stitched together, an entire image of the occupant may be available to the system.
  • the system may also be designed so that the occupant may activate the system using a keyword (e.g., "Help!"), to enable the occupant to procure assistance regardless of whether s/he is visible to the system. In this instance, the system may determine the occupant's most likely responses using verbal input only.
  • the unit that has the highest CF may be the one relaying responses, although since the proximity will be unknown, Equation 1 may be altered to reflect this (such as by using a '0' value for the proximity term).
  • the constants β1, β2, and β3 for a particular application may be established through empirical testing.
  • the actual values for β1, β2, and β3 will depend on the specific configuration of the device 10, the physical characteristics of the location being monitored, as well as the general health of the subject. For instance, for an application in which a subject is only able to whisper, the constant β2 may be set to a higher value than for the same location with a subject who is able to speak at a normal speaking volume.
  • SCENARIO 1: TELEVISION ON
  • the television is on at a volume of 70dB when the adverse event occurs, creating a source of interference.
  • although CMU #2 is further from the occupant than CMU #1, it is also further from the competing noise source and thus experiences less competing ambient noise (for example, 50dB for CMU #2 versus 65dB for CMU #1).
  • while the television and the occupant are approximately equidistant from CMU #1, the occupant is closer to CMU #2 than the television is.
  • using Equation 1, the CF for each CMU might then be calculated; for CMU #2, for example:
  • CF2 = 1/[0.2(8) + 0.6(50) + 0.3(1 − 0.05)] ≈ 0.031
  • the CF for CMU #2 is higher than the CF for CMU #1 (i.e., CF1 < CF2).
  • the verbal input received by CMU #2 may then be used by the central control unit to determine the appropriate action(s) to take.
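  • the scenario's CF2 figure can be reproduced directly from Equation 1; note that CMU #1's proximity reading is not given in the text, so the value used below is a hypothetical stand-in chosen only to make CF1 < CF2 concrete:

    def confidence_factor(proximity, ambient_db, wrer, b1=0.2, b2=0.6, b3=0.3):
        """Equation 1: CF = 1/[b1*(proximity) + b2*(ambient noise) + b3*(1 - WRER)].
        Passing proximity=None models keyword-only activation, where the
        occupant is not visible and the proximity term is zeroed."""
        p = 0.0 if proximity is None else proximity
        return 1.0 / (b1 * p + b2 * ambient_db + b3 * (1.0 - wrer))

    cf2 = confidence_factor(8, 50, 0.05)   # values from the scenario above
    cf1 = confidence_factor(9, 65, 0.05)   # proximity 9 is a hypothetical stand-in
    print(round(cf2, 3), round(cf1, 3), cf1 < cf2)   # 0.031 0.024 True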
  • an exemplary embodiment operates through one or more CMUs and a central control unit, where the CMUs are mounted to the ceiling, resulting in improved monitoring coverage while ensuring that the EDR units are unobtrusive and discreet.
  • Each CMU includes a vision sensor, in this case a non-recording video camera.
  • the CMU also includes a microphone, one or more speakers, a processor, and a smoke alarm.
  • Multiple CMUs may thus be used and networked together to ensure that designated areas of a monitored facility, such as a subject's home, may be monitored.
  • One central controller may thus be installed in each home and be responsible for coordinating the CMUs and relaying communications to the outside world.
  • the system 10 may use computer vision techniques to track a subject as the subject moves about the subject's home and to detect acute emergency events, such as a fall or a fire.
  • the vision sensor does not record, transmit, or store collected images.
  • the system does not provide continuous data about the subject's status and activities to an external source, thus preserving the subject's privacy.
  • when an acute event is detected, one or more CMUs sensing the event dispatch event data to the central controller, which then selects either the closest CMU, or another CMU with better sensing results for the event, to be the active CMU; the active CMU employs speech recognition algorithms to hold a dialogue with the subject in distress and determine if, and what type of, assistance is required.
  • connections to 911 and the monitoring provider's live switchboard may be made available as needed, as well as to one or more respondents predefined by the subject (e.g. a neighbour, a family member, and/or a friend).
  • by verbally answering a short series of simple "yes"/"no" questions, or other simple verbal phrases, the subject may select which respondent s/he would like to contact (a minimal dialog sketch follows below).
  • if needed, the system 10 may connect to a live operator at the provider's switchboard to assess the situation and provide appropriate assistance.
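  • a minimal sketch of that respondent-selection dialog — with the question wording, respondent list, and answer callback all invented for illustration — might be:

    RESPONDENTS = ["a neighbour", "a family member", "the provider's operator"]

    def choose_respondent(answer_fn):
        """Walk a short chain of yes/no questions; answer_fn(question) returns
        True/False and stands in for ASR of the subject's spoken "yes"/"no"."""
        for person in RESPONDENTS:
            if answer_fn(f"Would you like me to contact {person}?"):
                return person
        return "911"                      # fall through to emergency services

    scripted = iter([False, True])        # the subject says "no", then "yes"
    print(choose_respondent(lambda q: next(scripted)))   # 'a family member'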
  • the subject may thus remain fully in control of the situation and his/her health, promoting feelings of dignity, autonomy, and independence without compromising safety.
  • the system 10 provides, in one example, an automated emergency detection and response system that uses computer vision to detect an emergency event, such as a fall, and speech recognition and artificial intelligence to then determine the level of assistance that is required by an individual.
  • These devices may be networked together with a central server 40 to provide more comprehensive monitoring throughout an entire living environment, without the necessity of having an older adult wear an alert device or call button.

Landscapes

  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Alarm Systems (AREA)

Abstract

Various embodiments of the invention provide automated emergency detection and response systems and methods. In some embodiments, respective data sets generated in relation to distinctly located sensors are compared, and an appropriate response protocol is selected based on this comparison and as a function of at least one of the data sets.
PCT/CA2011/001168 2008-05-27 2011-10-21 Système et procédé de détection d'urgence et de réponse Ceased WO2013056335A1 (fr)

Priority Applications (5)

Application Number Priority Date Filing Date Title
EP11874403.6A EP2769368A4 (fr) 2011-10-21 2011-10-21 Système et procédé de détection d'urgence et de réponse
PCT/CA2011/001168 WO2013056335A1 (fr) 2011-10-21 2011-10-21 Système et procédé de détection d'urgence et de réponse
CA2879204A CA2879204A1 (fr) 2011-10-21 2011-10-21 Systeme et procede de detection d'urgence et de reponse
US13/655,920 US20130100268A1 (en) 2008-05-27 2012-10-19 Emergency detection and response system and method
CA2792621A CA2792621A1 (fr) 2011-10-21 2012-10-19 Methode et systeme de detection et de reponse aux urgences

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CA2011/001168 WO2013056335A1 (fr) 2011-10-21 2011-10-21 Système et procédé de détection d'urgence et de réponse

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/471,213 Continuation-In-Part US8063764B1 (en) 2008-05-27 2009-05-22 Automated emergency detection and response

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/655,920 Continuation-In-Part US20130100268A1 (en) 2008-05-27 2012-10-19 Emergency detection and response system and method

Publications (1)

Publication Number Publication Date
WO2013056335A1 true WO2013056335A1 (fr) 2013-04-25

Family

ID=48140250

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2011/001168 Ceased WO2013056335A1 (fr) 2008-05-27 2011-10-21 Système et procédé de détection d'urgence et de réponse

Country Status (3)

Country Link
EP (1) EP2769368A4 (fr)
CA (1) CA2879204A1 (fr)
WO (1) WO2013056335A1 (fr)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040030531A1 (en) * 2002-03-28 2004-02-12 Honeywell International Inc. System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor
US20070132597A1 (en) * 2005-12-09 2007-06-14 Valence Broadband, Inc. Methods and systems for monitoring patient support exiting and initiating response

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7671718B2 (en) * 2004-01-27 2010-03-02 Turner Richard H Method and apparatus for detection and tracking of objects within a defined area
US7978076B2 (en) * 2004-02-04 2011-07-12 Contigo Solutions, Inc. System for, and method of, monitoring the movement of mobile items
US20090295572A1 (en) * 2008-05-30 2009-12-03 International Business Machines Corporation System and Method for Detecting and Broadcasting a Critical Event

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2769368A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ITUB20153405A1 (it) * 2015-09-04 2017-03-04 Trilogis S R L Sistema di controllo degli utenti
CN113273930A (zh) * 2021-06-04 2021-08-20 李侃 一种集成智能救援功能的扫地机器人及其控制方法
WO2024056937A1 (fr) * 2022-09-14 2024-03-21 Marielectronics Oy Capteur et système de surveillance

Also Published As

Publication number Publication date
EP2769368A1 (fr) 2014-08-27
CA2879204A1 (fr) 2013-04-25
EP2769368A4 (fr) 2015-07-08

Similar Documents

Publication Publication Date Title
US20130100268A1 (en) Emergency detection and response system and method
US8063764B1 (en) Automated emergency detection and response
EP3410935B1 (fr) Système de surveillance de personnes et d'assistance personnelle, en particulier destiné aux personnes âgées et aux personnes présentant des besoins spéciaux et cognitifs
Kon et al. Evolution of smart homes for the elderly
EP3588455B1 (fr) Identification de l'emplacement d'une personne
EP2353153B1 (fr) Système pour suivre la présence de personnes dans un bâtiment, et procédé et produit de programme informatique
US10380875B1 (en) Immersive virtual reality detection and alerting technology
EP2953104B1 (fr) Système de commande de domotique
US12207171B2 (en) Smart speakerphone emergency monitoring
WO2020253162A1 (fr) Robot et son procédé de commande, et système domotique intelligent
US20150194034A1 (en) Systems and methods for detecting and/or responding to incapacitated person using video motion analytics
US11875571B2 (en) Smart hearing assistance in monitored property
US20190228628A1 (en) Audio monitoring system
AU2020391477B2 (en) Accessibility features for monitoring systems
CA2792621A1 (fr) Methode et systeme de detection et de reponse aux urgences
WO2021002293A1 (fr) Procédé de réponse d'urgence, système de confirmation de sécurité, dispositif de gestion, section d'espace et procédé de commande de dispositif de gestion
CN120029457A (zh) 一种融合多模态感知的协同监测与人机交互系统及方法
WO2013056335A1 (fr) Système et procédé de détection d'urgence et de réponse
CN119948543A (zh) 用于监测的传感器和系统
US11521384B1 (en) Monitoring system integration with augmented reality devices
KR20200047083A (ko) 음성인식을 통한 노약자 위급 상황 관리시스템
Adlam et al. Implementing monitoring and technological interventions in smart homes for people with dementia-case studies
TWI689899B (zh) 人員意識預警系統及其方法
KR20240094061A (ko) 엣지디바이스를 이용한 위급상황 감지 시스템 및 방법
FI124369B (fi) Menetelmä ja järjestely aktiivisuuden ja toimintakyvyn arvioimiseksi vuorovaikutuksen fysiologisten signaalien perusteella

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11874403

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2011874403

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2011874403

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2879204

Country of ref document: CA